Abstract

Proprioceptive feedback is a critical component of voluntary movement planning and execution. Neuroprosthetic technologies aiming to restore movement must interact with this feedback to restore accurate motor control. The design and optimization of such technologies depend on quantitative insights into the neural dynamics of proprioceptive afferents during functional movements. However, recording proprioceptive neural activity during unconstrained movements in clinically relevant animal models presents formidable challenges. In this work, we developed a computational framework to estimate the spatiotemporal patterns of proprioceptive inputs to the cervical spinal cord during three-dimensional arm movements in monkeys. We extended a biomechanical model of the monkey arm with ex vivo measurements and combined it with models of mammalian group Ia, Ib, and II afferent fibers. We then used experimental recordings of arm kinematics and muscle activity from two monkeys performing a reaching and grasping task to estimate muscle stretches and forces with computational biomechanics. Finally, we projected the simulated proprioceptive firing rates onto the cervical spinal roots, obtaining spatiotemporal maps of spinal proprioceptive inputs during voluntary movements. The estimated maps show complex and markedly distinct patterns of neural activity for each fiber population along the rostro-caudal extent of the spinal cord. Our results indicate that reproducing the flow of proprioceptive information to the cervical spinal cord requires complex spatiotemporal modulation of each spinal root. Our model can support the design of neuroprosthetic technologies as well as in silico investigations of the primate sensorimotor system.
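
To illustrate the pipeline summarized above, the sketch below shows how per-muscle afferent firing rates could be computed from simulated muscle stretch, stretch velocity, EMG, and force, and then pooled onto cervical spinal roots to form a spatiotemporal map. This is a minimal sketch, not the authors' implementation: the Prochazka-type functional forms, all coefficients, the muscle names, and the root-innervation fractions are placeholder assumptions used for illustration only.

import numpy as np

def ia_rate(stretch, stretch_vel, emg, k_v=4.3, k_dl=2.0, k_emg=50.0, r0=80.0):
    # Group Ia (spindle primary): velocity-dominated, power-law nonlinearity (spikes/s).
    return np.maximum(0.0, k_v * np.sign(stretch_vel) * np.abs(stretch_vel) ** 0.6
                      + k_dl * stretch + k_emg * emg + r0)

def ii_rate(stretch, emg, k_dl=13.5, k_emg=20.0, r0=80.0):
    # Group II (spindle secondary): length-dominated, little velocity sensitivity.
    return np.maximum(0.0, k_dl * stretch + k_emg * emg + r0)

def ib_rate(force, k_f=1.0, r0=10.0):
    # Group Ib (Golgi tendon organ): roughly proportional to musculotendon force.
    return np.maximum(0.0, k_f * force + r0)

# Hypothetical innervation table: fraction of each muscle's afferents entering each
# cervical root (rows sum to 1); real values would come from anatomical data.
MUSCLES = ["deltoid", "biceps", "triceps"]
ROOTS = ["C5", "C6", "C7", "C8", "T1"]
INNERVATION = np.array([
    [0.5, 0.5, 0.0, 0.0, 0.0],   # deltoid  (assumed C5/C6)
    [0.3, 0.7, 0.0, 0.0, 0.0],   # biceps   (assumed C5/C6)
    [0.0, 0.1, 0.5, 0.4, 0.0],   # triceps  (assumed C6-C8)
])

def spatiotemporal_map(per_muscle_rates):
    # Pool per-muscle rates (n_muscles, n_samples) onto roots -> (n_roots, n_samples).
    return INNERVATION.T @ per_muscle_rates

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 200)                               # 1 s of movement
    stretch = np.stack([np.sin(2 * np.pi * (t + p)) for p in (0.0, 0.2, 0.4)])
    stretch_vel = np.gradient(stretch, t, axis=1)
    emg = np.clip(np.stack([np.cos(2 * np.pi * (t + p)) for p in (0.0, 0.2, 0.4)]),
                  0.0, None)
    force = 10.0 * emg                                           # toy force proxy
    ia_map = spatiotemporal_map(ia_rate(stretch, stretch_vel, emg))
    ii_map = spatiotemporal_map(ii_rate(stretch, emg))
    ib_map = spatiotemporal_map(ib_rate(force))
    print(ia_map.shape, ii_map.shape, ib_map.shape)              # each (5 roots, 200 samples)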
