Olivo-cerebellar loops, in which anatomical patches of the cerebellar cortex and inferior olive project reciprocally onto each other, form an anatomical unit of cerebellar computation. Here, we investigated how successive computational steps map onto olivo-cerebellar loops. Lobules IX-X of the cerebellar vermis, i.e. the nodulus and uvula, implement an internal model of the inner ear’s graviceptors, the otolith organs. We have previously identified two populations of Purkinje cells that participate in this computation: tilt-selective cells transform egocentric rotation signals into allocentric tilt velocity signals, to track head motion relative to gravity, and translation-selective cells encode otolith prediction error. Here we show that, despite very distinct simple spike response properties, both types of Purkinje cells emit complex spikes that are proportional to sensory prediction error. This indicates that both cell populations form a single olivo-cerebellar loop, in which only translation-selective cells project to the inferior olive. We propose a neural network model where sensory prediction errors computed by translation-selective cells are used as a teaching signal for both populations, and demonstrate that this network can learn to implement an internal model of the otoliths.
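The core idea of the proposed learning scheme, a single shared prediction-error signal teaching two cell populations at once, can be sketched with a toy example. This is an illustrative reduction under our own assumptions, not the authors' network: each "population" is collapsed to one weight, the integrated rotation signal stands in for the tilt-selective pathway, and a constant baseline stands in for a second pathway; the same scalar error (analogous to the complex spikes both populations receive) updates both weights.

```python
import numpy as np

# Hypothetical toy sketch, not the authors' model.
rng = np.random.default_rng(0)
dt, T = 0.01, 5000
t = np.arange(T) * dt
omega = np.sin(np.pi * t)                        # egocentric rotation velocity
tilt = np.cumsum(omega) * dt                     # true head tilt re: gravity
otolith = tilt + 0.1 * rng.standard_normal(T)    # noisy graviceptor readout

# Two "populations", each reduced to one weight for clarity:
# w[0] scales the integrated rotation (tilt-selective pathway),
# w[1] scales a constant baseline (stand-in for a second pathway).
w = np.zeros(2)
eta = 0.05
for k in range(T):
    x = np.array([tilt[k], 1.0])   # inputs to the two pathways
    pred = w @ x                   # predicted otolith signal
    err = otolith[k] - pred        # sensory prediction error
    w += eta * err * x             # one shared error teaches both weights

print(w)   # w[0] -> ~1 (correct tilt gain), w[1] -> ~0
```

Despite receiving only a single teaching signal, the two weights converge to the values that make the internal prediction match the otolith input, which is the essence of the shared-error proposal.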
Path integration is a sensorimotor computation that can be used to infer latent dynamical states by integrating self-motion cues. We studied the influence of sensory observation (visual/vestibular) and latent control dynamics (velocity/acceleration) on human path integration using a novel motion-cueing algorithm. Sensory modality and control dynamics were both varied randomly across trials, as participants controlled a joystick to steer to a memorized target location in virtual reality. Visual and vestibular steering cues yielded comparable accuracy only when participants controlled their acceleration, suggesting that vestibular signals, on their own, fail to support accurate path integration in the absence of sustained acceleration. Nevertheless, performance in all conditions reflected a failure to fully adapt to changes in the underlying control dynamics, a result that was well explained by a bias in the estimation of those dynamics. This work demonstrates how an incorrect internal model of control dynamics affects navigation in volatile environments despite continuous sensory feedback.
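The distinction between the two latent control dynamics can be made concrete with a minimal simulation, written under our own illustrative assumptions rather than taken from the study's task code: the same joystick input either sets velocity directly or sets acceleration, so an identical input profile produces very different integrated paths, which is why a biased estimate of the dynamics leads to systematic undershooting or overshooting.

```python
import numpy as np

def simulate(u, dt=0.01, mode="velocity", gain=1.0):
    """Path-integrate joystick input u under the given control dynamics.

    Illustrative sketch: 'velocity' means v = gain * u;
    'acceleration' means dv/dt = gain * u (Euler-integrated).
    """
    if mode == "velocity":
        v = gain * u                  # joystick deflection sets velocity
    else:
        v = np.zeros_like(u)
        for k in range(1, len(u)):    # joystick deflection sets acceleration
            v[k] = v[k - 1] + gain * u[k - 1] * dt
    return np.cumsum(v) * dt          # integrated position along the path

u = np.ones(500)                              # constant full deflection, 5 s
print(simulate(u, mode="velocity")[-1])       # ~5 (linear growth)
print(simulate(u, mode="acceleration")[-1])   # ~12.5 (quadratic growth)
```

A navigator who assumes velocity control while the true dynamics are acceleration control (or vice versa) will mispredict the distance covered by a given steering input, consistent with the biased dynamics estimation described above.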
The hippocampal formation is linked to spatial navigation, but there is little corroboration from freely moving primates with concurrent monitoring of three-dimensional head and gaze positions. We recorded neurons and local field potentials across hippocampal regions in rhesus macaques during free foraging in an open environment while tracking their head and eyes. Theta band activity was intermittently present at movement onset and modulated by saccades. Many cells were phase-locked to theta, with few showing theta phase precession. Most hippocampal neurons encoded a mixture of spatial variables beyond place fields, and a negligible number showed prominent grid tuning. Spatial representations were dominated by facing location and allocentric direction, mostly in head, rather than gaze, coordinates. Importantly, eye movements strongly modulated neural activity in all regions. These findings reveal that the macaque hippocampal formation represents three-dimensional space using a multiplexed code, with head orientation and eye movement properties dominating over simple place and grid coding during free exploration.