Although new advances in neuroscience allow the study of vocal communication in awake animals, much of what is known about the neural processing of vocalizations comes from anaesthetized preparations. Understanding how anaesthetics affect neuronal responses is therefore of paramount importance. Here, we used electrophysiological recordings and computational modelling to study how the auditory cortex of bats responds to vocalizations under ketamine anaesthesia and in wakefulness. We found that multifunctional neurons processing both echolocation and communication sounds were affected by ketamine anaesthesia in ways that could not be predicted from known anaesthetic effects. In wakefulness, acoustic contexts (preceding echolocation or communication sequences) produced stimulus-specific suppression of lagging sounds, accentuating neuronal responses to sound transitions. Under anaesthesia, however, communication contexts (but not echolocation contexts) produced a global suppression of responses to lagging sounds. This asymmetric effect depended on the frequency composition of the contexts, not on their temporal patterns. We then constructed a neuron model that replicated the data obtained in vivo. In the model, anaesthesia modulates spiking activity in a channel-specific manner, decreasing the responses of cortical inputs tuned to high-frequency sounds and increasing adaptation in the corresponding cortical synapses. Together, our in vivo and in silico findings reveal that ketamine anaesthesia does not uniformly reduce neuronal responsiveness to low- and high-frequency sounds. Instead, its effects arise from combined mechanisms that unbalance cortical inputs and ultimately shape how auditory cortex neurons respond to natural sounds in anaesthetized preparations.
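The channel-specific mechanism proposed above can be illustrated with a toy simulation. This is a minimal sketch, not the authors' fitted model: a leaky integrate-and-fire neuron receives a low-frequency and a high-frequency input channel with short-term synaptic depression, and "anaesthesia" is modelled as a gain reduction on the high-frequency channel plus stronger depression. All parameter values are invented for illustration.

```python
# Hypothetical sketch of the abstract's mechanism: anaesthesia scales
# down the high-frequency input channel and strengthens synaptic
# adaptation. Parameters are illustrative, not fitted to bat data.

def simulate(drive_low, drive_high, anaesthetized=False,
             dt=0.001, t_max=1.0):
    """Return the output spike count of a leaky integrate-and-fire
    neuron driven by two depressing input channels over t_max seconds."""
    tau_m, v_thresh, v_reset = 0.02, 1.0, 0.0   # membrane parameters
    g_low = 1.0                                  # low-freq channel gain
    g_high = 0.4 if anaesthetized else 1.0       # high-freq gain suppressed
    tau_rec = 0.5                                # resource recovery (s)
    use = 0.05 if anaesthetized else 0.02        # stronger adaptation
    r_low = r_high = 1.0                         # synaptic resources
    v, spikes = 0.0, 0
    for _ in range(int(t_max / dt)):
        # synaptic resources recover toward 1
        r_low += dt * (1.0 - r_low) / tau_rec
        r_high += dt * (1.0 - r_high) / tau_rec
        # effective input after channel gain and depression
        i_syn = g_low * r_low * drive_low + g_high * r_high * drive_high
        # ongoing drive depletes resources (short-term depression)
        r_low -= dt * use * drive_low * r_low
        r_high -= dt * use * drive_high * r_high
        # leaky integration and threshold crossing
        v += dt * (-v / tau_m + i_syn)
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes
```

With these illustrative parameters, a purely high-frequency drive (`simulate(0, 100)`) loses spikes under `anaesthetized=True`, while a low-frequency drive is affected only through the increased adaptation, reproducing the asymmetry described in the abstract in schematic form.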
Summary The auditory midbrain (inferior colliculus, IC) plays an important role in sound processing, acting as a hub for acoustic information extraction and for the implementation of fast audio-motor behaviors. IC neurons are topographically organized according to their sound-frequency preference: dorsal IC regions encode low frequencies while ventral areas respond best to high frequencies, a type of sensory map known as tonotopy. Tonotopic maps have been studied extensively with artificial stimuli (pure tones), but little is known about how these maps represent sequences of natural, spectro-temporally rich sounds. We addressed this question by conducting simultaneous extracellular recordings across IC depths in awake bats (Carollia perspicillata) that listened to sequences of natural communication and echolocation sounds. We hypothesized that information about these two types of sound streams is represented at different IC depths, since they differ markedly in spectral composition: echolocation covers the high-frequency portion of the bat soundscape (> 45 kHz), whereas communication sounds are broadband and carry most power at low frequencies (20-25 kHz). Our results showed that mutual information between neuronal responses and acoustic stimuli, as well as response redundancy in pairs of simultaneously recorded neurons, increases exponentially with IC depth, regardless of the sound type presented to the bats (echolocation or communication). Taken together, these results indicate the existence of mutual information and redundancy maps at the midbrain level whose structure cannot be predicted from the frequency composition of natural sounds and classic neuronal tuning curves.
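The mutual-information analysis described above can be sketched with a plug-in estimator over paired stimulus/response samples. This is a generic illustration, not the authors' pipeline; the toy trials below are invented, and real analyses would need bias correction for limited sampling.

```python
# Hedged sketch: plug-in estimate of mutual information I(S;R) in bits
# between stimulus identity and a discretized neuronal response
# (e.g. spike counts per trial). Toy data only, not recordings.
from collections import Counter
from math import log2

def mutual_information(stimuli, responses):
    """I(S;R) from paired samples, using empirical probabilities."""
    n = len(stimuli)
    count_s = Counter(stimuli)                  # marginal over stimuli
    count_r = Counter(responses)                # marginal over responses
    count_sr = Counter(zip(stimuli, responses)) # joint distribution
    mi = 0.0
    for (s, r), c in count_sr.items():
        p_joint = c / n
        # p_joint * log2(p_joint / (p_s * p_r)), with p_s = count_s/n etc.
        mi += p_joint * log2(p_joint * n * n / (count_s[s] * count_r[r]))
    return mi

# Toy example: a response that perfectly separates two stimulus classes
stims = ["echo", "comm", "echo", "comm"]
resps = [5, 1, 5, 1]                            # spike counts per trial
print(mutual_information(stims, resps))         # 1 bit: perfect discrimination
```

A response that perfectly discriminates two equiprobable stimulus classes carries exactly 1 bit, the upper bound for a binary stimulus set; noisier or overlapping responses yield values between 0 and 1.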