Neural Data Transformer 3: A Foundation Model for Motor Cortical Decoding

Joel Ye, Carnegie Mellon University

Motor neuroscience has long quantified the relationship between intracortical activity and behavior using individual datasets. This constrained scope yields models that are often not robust across experiments and that sometimes underfit even within an experiment due to insufficient data. These limits can be surpassed by transfer learning, particularly in deep networks, where models are not fit from scratch but adapted from models previously trained (pretrained) on data from other experimental sessions, subjects, and tasks. In non-neuroscientific domains, similar narratives of gains from pretraining and transfer have culminated in the pursuit of generalist models that train and perform well across broad swaths of data. Past a certain scale, these generalists functionally serve as foundation models for whole subfields, accelerating modeling for nearly all downstream datasets.
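
To make the pretrain-then-adapt recipe concrete, here is a minimal sketch of transfer learning for neural decoding in PyTorch. Everything here is a hypothetical stand-in, not NDT3's actual architecture or API: a small recurrent trunk plays the role of the pretrained model, and a fresh linear readout is fine-tuned alongside it on a new session.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# All names, dimensions, and architectures below are illustrative
# assumptions, not NDT3's actual API.

class SpikeEncoder(nn.Module):
    """Toy stand-in for a pretrained trunk over binned spike counts."""
    def __init__(self, n_channels: int, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(n_channels, hidden_dim)
        self.mixer = nn.GRU(hidden_dim, hidden_dim, batch_first=True)

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        # spikes: (batch, time, channels) -> (batch, time, hidden_dim)
        out, _ = self.mixer(self.proj(spikes.float()))
        return out

class SessionDecoder(nn.Module):
    """Pretrained trunk plus a fresh per-session readout head."""
    def __init__(self, encoder: nn.Module, hidden_dim: int, covariate_dim: int):
        super().__init__()
        self.encoder = encoder
        self.readout = nn.Linear(hidden_dim, covariate_dim)

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        return self.readout(self.encoder(spikes))

# In practice the trunk's weights would come from a pretraining run, e.g.
# encoder.load_state_dict(torch.load("pretrained_encoder.pt"))  # hypothetical
encoder = SpikeEncoder(n_channels=96, hidden_dim=128)
model = SessionDecoder(encoder, hidden_dim=128, covariate_dim=2)

# Synthetic "new session": binned spikes paired with 2D cursor velocity.
spikes = torch.poisson(torch.ones(64, 50, 96))   # (trials, time, channels)
behavior = torch.randn(64, 50, 2)                # (trials, time, covariates)
loader = DataLoader(TensorDataset(spikes, behavior), batch_size=16)

# Fine-tune the whole model at a small learning rate, not from scratch.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()
for x, y in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

The point of the recipe is that the trunk starts from pretrained weights rather than random initialization, so a small learning rate and little new data can suffice.
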
We accordingly train Neural Data Transformer 3 (NDT3) as an intracortical motor foundation model, using 2000 hours of paired population activity and motor covariates collected from over 30 monkeys and humans across 10 labs (the human participants were enrolled in clinical trials for sensorimotor BCIs). These data include behaviors spanning reaching, grasping, and finger motion, under both BCI and native limb control. We test the hypothesis that NDT3's diverse pretraining confers competence across varied BCI applications, evaluating performance both on subjects and tasks seen during pretraining and on novel subjects performing novel tasks in different labs. NDT3 is a competent generalist across motor tasks: with minutes to a few hours of data from a new task, NDT3 far outperforms specifically prepared multi-session, single-task models. For even larger datasets, NDT3 matches the performance of single-task experts. These properties position NDT3 as a candidate for the first widely available model to improve arbitrary motor decoding from intracortical data.
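
As a concrete picture of what "paired population activity and motor covariates" means, below is a hedged sketch of how such a pair might be assembled for one trial: spike times binned into counts at a fixed resolution, with behavior resampled onto the same grid. The 20 ms bins, 96 channels, and 2D velocity covariates are illustrative assumptions, not the paper's exact preprocessing.

```python
import numpy as np

# Illustrative preprocessing; the bin width and covariate choice are
# assumptions, not NDT3's actual pipeline.
BIN_MS = 20

def bin_spikes(spike_times_per_channel, duration_ms):
    """Convert per-channel spike times (ms) into (time_bins, channels) counts."""
    n_bins = duration_ms // BIN_MS
    edges = np.arange(n_bins + 1) * BIN_MS
    return np.stack(
        [np.histogram(times, bins=edges)[0] for times in spike_times_per_channel],
        axis=1,
    )  # (n_bins, n_channels)

def align_covariates(kinematics, kin_hz, duration_ms):
    """Resample behavior (e.g., 2D hand velocity) onto the same bin grid."""
    n_bins = duration_ms // BIN_MS
    bin_centers_s = (np.arange(n_bins) * BIN_MS + BIN_MS / 2) / 1000.0
    t_kin = np.arange(len(kinematics)) / kin_hz
    return np.stack(
        [np.interp(bin_centers_s, t_kin, kinematics[:, d])
         for d in range(kinematics.shape[1])],
        axis=1,
    )  # (n_bins, n_covariates)

# Toy trial: 96 channels, 1 s of data, 100 Hz hand velocity.
rng = np.random.default_rng(0)
spike_times = [np.sort(rng.uniform(0, 1000, rng.poisson(20))) for _ in range(96)]
velocity = rng.standard_normal((100, 2)).cumsum(axis=0)

spikes = bin_spikes(spike_times, duration_ms=1000)   # (50, 96)
covariates = align_covariates(velocity, 100, 1000)   # (50, 2)
assert len(spikes) == len(covariates)  # one covariate vector per neural bin
```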