The brains of C57BL/6 ("black 6") mice (Mus musculus) and Seba’s short-tailed bats (Carollia perspicillata) weigh roughly the same and share the mammalian neocortical laminar architecture. Bats have highly developed sonar calls and social communication and are an excellent neuroethological animal model for auditory research. Mice are olfactory and somatosensory specialists that are used frequently in auditory neuroscience because of their standardization and wide genetic toolkit. This study presents an analytical approach to overcome the challenge of inter-species comparison using existing data. In both data sets, we recorded with linear multichannel electrodes down the depth of the primary auditory cortex (A1) while presenting repetitive stimulus trains at ~5 and ~40 Hz to awake bats and mice. We found that while there are similarities between the cortical response profiles of the two species, bats showed a better signal-to-noise ratio under these conditions, which allowed for a clearer following response to the stimulus trains. Model fit analysis supported this, showing that bats had stronger response amplitude suppression to consecutive stimuli. Additionally, continuous wavelet transform revealed that bats had significantly stronger power and phase coherence during the stimulus response, whereas mice had stronger power in the background. The better signal-to-noise ratio and lower inter-trial phase variability in bats could represent a specialization for faster and more accurate temporal processing at lower metabolic cost. Our findings point to potentially different general auditory processing principles; investigating such differences may deepen our understanding of how the ecological needs of a species shape the development and function of its nervous system.
The brains of C57BL/6 ("black 6") mice (Mus musculus) and Seba’s short-tailed bats (Carollia perspicillata) weigh roughly the same and share the mammalian neocortical laminar architecture. Bats have highly developed sonar calls and social communication and are an excellent neuroethological animal model for auditory research. Mice are olfactory and somatosensory specialists and are used frequently in auditory neuroscience, particularly because of their standardization and genetic tools. Investigating their potentially different general auditory processing principles would advance our understanding of how the ecological needs of a species shape the development and function of the mammalian nervous system. We compared two existing datasets, each recorded with linear multichannel electrodes down the depth of the primary auditory cortex (A1) of awake animals while repetitive stimulus trains of different rates (∼5 and ∼40 Hz) were presented. We found that while there are similarities between the cortical response profiles of bats and mice, bats showed a better signal-to-noise ratio under these conditions, which allowed for a clearer following response to the stimulus trains. This was most evident for the higher-rate trains, where bats had stronger response amplitude suppression to consecutive stimuli. Phase coherence during the stimulus response was far stronger in bats, indicating less phase variability across individual trials. These results show that although both species share the cortical laminar organization, there are structural differences in the relative depth of the layers. The better signal-to-noise ratio in bats could represent a specialization for faster temporal processing shaped by the species’ respective ecological niches.
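For orientation, the sketch below shows how the inter-trial phase coherence mentioned in the two abstracts above might be computed from trial-aligned LFP traces. It is a minimal illustration, not the study's actual pipeline: the Morlet wavelet parameters, sampling rate, and the synthetic trial array are all assumptions.

import numpy as np

def morlet_wavelet(freq, fs, n_cycles=7):
    """Complex Morlet wavelet centred at `freq` Hz."""
    sigma_t = n_cycles / (2 * np.pi * freq)           # temporal std of the Gaussian envelope
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
    return np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))

def itpc(trials, freq, fs):
    """Inter-trial phase coherence for an array of shape (n_trials, n_samples)."""
    w = morlet_wavelet(freq, fs)
    # Complex analytic signal per trial at the frequency of interest
    analytic = np.array([np.convolve(tr, w, mode="same") for tr in trials])
    phases = np.angle(analytic)
    # Length of the mean resultant vector across trials (0 = random phases, 1 = perfectly locked)
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Hypothetical usage: 50 LFP trials sampled at 1 kHz, probing coherence at the ~40 Hz train rate
fs = 1000.0
rng = np.random.default_rng(0)
trials = rng.standard_normal((50, 2000))
print(itpc(trials, freq=40.0, fs=fs).shape)           # (2000,) ITPC time course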
Although new advances in neuroscience allow the study of vocal communication in awake animals, much of our understanding of vocalization processing comes from anaesthetized preparations. Thus, understanding how anaesthetics affect neuronal responses is of paramount importance. Here, we used electrophysiological recordings and computational modelling to study how the auditory cortex of bats responds to vocalizations under anaesthesia and in wakefulness. We found that multifunctional neurons that process echolocation and communication sounds were affected by ketamine anaesthesia in a manner that could not be predicted from known anaesthetic effects. In wakefulness, acoustic contexts (preceding echolocation or communication sequences) led to stimulus-specific suppression of lagging sounds, accentuating neuronal responses to sound transitions. Under anaesthesia, however, communication contexts (but not echolocation contexts) led to a global suppression of responses to lagging sounds. This asymmetric effect depended on the frequency composition of the contexts and not on their temporal patterns. We constructed a neuron model that could replicate the data obtained in vivo. In the model, anaesthesia modulates spiking activity in a channel-specific manner, decreasing the responses of cortical inputs tuned to high-frequency sounds and increasing adaptation in the corresponding cortical synapses. Combined, our in vivo and in silico findings reveal that ketamine anaesthesia does not uniformly reduce neuronal responsiveness to low- and high-frequency sounds. This effect depends on combined mechanisms that unbalance cortical inputs and ultimately affect how auditory cortex neurons respond to natural sounds in anaesthetized preparations.
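As a rough illustration of the modelling approach described above (not the authors' model), the following sketch implements a generic leaky integrate-and-fire neuron driven by two Poisson input channels through short-term depressing synapses. All parameter names and values are assumptions, and the "anaesthesia" variant simply weakens the high-frequency channel and slows synaptic recovery.

import numpy as np

def lif_with_depression(rate_low, rate_high, w=(8.0, 8.0), u=0.05, tau_rec=0.2,
                        dt=1e-3, T=1.0, tau_m=0.02, v_th=1.0, v_reset=0.0, seed=0):
    """Leaky integrate-and-fire neuron driven by two Poisson input channels
    (low- and high-frequency tuned) through short-term depressing synapses."""
    rng = np.random.default_rng(seed)
    rates = np.array([rate_low, rate_high], dtype=float)
    weights = np.array(w, dtype=float)
    x = np.ones(2)                                  # available synaptic resources per channel
    v = 0.0
    spikes = []
    for step in range(int(T / dt)):
        inp = rng.random(2) < rates * dt            # Poisson input spikes this time step
        current = np.sum(weights * u * x * inp)     # depressing synaptic drive
        x += dt * (1.0 - x) / tau_rec               # resources recover ...
        x -= u * x * inp                            # ... and are depleted by each input spike
        v += dt * (-v / tau_m) + current            # leaky membrane integration
        if v >= v_th:                               # threshold crossing -> output spike
            spikes.append(step * dt)
            v = v_reset
    return np.array(spikes)

# Illustrative comparison only: weakening the high-frequency channel and slowing
# synaptic recovery (a crude stand-in for the anaesthesia effect described above)
# lowers the output spike count.
awake = lif_with_depression(100.0, 100.0)
anaesth = lif_with_depression(100.0, 100.0, w=(8.0, 3.0), tau_rec=0.5)
print(len(awake), len(anaesth))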
The mammalian frontal and auditory cortices are important for vocal behaviour. Here, using local field potential recordings, we demonstrate for the first time that the timing and spatial pattern of oscillations in the fronto-auditory cortical network of vocalizing bats (Carollia perspicillata) predict the purpose of vocalization: echolocation or communication. Transfer entropy analyses revealed predominantly top-down (frontal-to-auditory cortex) information flow during spontaneous activity and pre-vocal periods. The dynamics of information flow depended on the behavioural role of the vocalization and on the timing relative to vocal onset. Remarkably, we observed the emergence of predominantly bottom-up (auditory-to-frontal cortex) information transfer patterns specific to echolocation production, which leads to self-directed acoustic feedback. Electrical stimulation of frontal areas selectively enhanced responses to echolocation sounds in the auditory cortex. These results reveal unique changes in information flow across sensory and frontal cortices, potentially driven by the purpose of the vocalization, in a highly vocal mammalian model.
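The transfer entropy analysis named above could, in a simplified form, look like the plug-in estimator below, which uses a history length of 1 on discretized signals. The bin count, history length, and toy signals are assumptions; real analyses typically use longer histories and bias corrections.

import numpy as np
from collections import Counter

def transfer_entropy(x, y, n_bins=4):
    """Binned transfer entropy TE(X -> Y) in bits, history length 1:
    TE = sum p(y[t+1], y[t], x[t]) * log2( p(y[t+1] | y[t], x[t]) / p(y[t+1] | y[t]) )."""
    # Discretize both signals into roughly equally populated bins
    xb = np.digitize(x, np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1]))
    yb = np.digitize(y, np.quantile(y, np.linspace(0, 1, n_bins + 1)[1:-1]))
    triplets = Counter(zip(yb[1:], yb[:-1], xb[:-1]))      # (y_future, y_past, x_past)
    pairs_yx = Counter(zip(yb[:-1], xb[:-1]))
    pairs_yy = Counter(zip(yb[1:], yb[:-1]))
    singles_y = Counter(yb[:-1])
    n = len(yb) - 1
    te = 0.0
    for (yf, yp, xp), c in triplets.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]               # p(y_future | y_past, x_past)
        p_cond_self = pairs_yy[(yf, yp)] / singles_y[yp]   # p(y_future | y_past)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Hypothetical check: y is a noisy, delayed copy of x, so TE(x -> y) should exceed TE(y -> x)
rng = np.random.default_rng(1)
x = rng.standard_normal(20000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(20000)
print(transfer_entropy(x, y), transfer_entropy(y, x))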
The ability to vocalize is ubiquitous in vertebrates, but the neural networks leading to vocalization production remain poorly understood. Here we performed simultaneous, large-scale neuronal recordings in the frontal cortex and dorsal striatum (caudate nucleus) during the production of echolocation and non-echolocation calls in bats. This approach allows us to assess both the general aspects underlying vocalization production in mammals and the unique evolutionary adaptations of bat echolocation. Our findings show that distinct intra-areal brain rhythms in the beta (12-30 Hz) and gamma (30-80 Hz) bands of the local field potential can be used to predict the bats’ vocal output and that phase locking between spikes and field potentials occurs prior to vocalization production. Moreover, the fronto-striatal network is differentially coupled in the theta band during the production of echolocation and non-echolocation calls. Overall, our results provide evidence for a role of fronto-striatal network oscillations in motor action prediction in mammals.
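The spike-to-field phase locking reported above is commonly quantified with a phase-locking value (mean resultant length of spike phases). The sketch below shows one such estimate for a beta-band (12-30 Hz) LFP; the sampling rate, filter settings, and synthetic data are chosen for illustration only.

import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def spike_lfp_plv(lfp, spike_times, fs, band=(12.0, 30.0)):
    """Phase-locking value between spike times (in seconds) and the LFP filtered
    in `band` (e.g. beta, 12-30 Hz). Returns a value in [0, 1]."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    phase = np.angle(hilbert(sosfiltfilt(sos, lfp)))                  # instantaneous LFP phase
    idx = np.clip(np.rint(np.asarray(spike_times) * fs).astype(int), 0, len(lfp) - 1)
    return np.abs(np.mean(np.exp(1j * phase[idx])))                   # mean resultant vector length

# Hypothetical usage: spikes emitted near the trough of a 20 Hz LFP oscillation
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
lfp = np.sin(2 * np.pi * 20 * t) + 0.3 * np.random.default_rng(2).standard_normal(t.size)
spike_times = t[np.sin(2 * np.pi * 20 * t) < -0.95][::5]              # crude phase-biased spikes
print(spike_lfp_plv(lfp, spike_times, fs))                            # near 1 -> strong locking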
Most mammals rely on the extraction of acoustic information from the environment in order to survive. However, the mechanisms that support sound representation in auditory neural networks involving sensory and association brain areas remain underexplored. In this study, we address the functional connectivity between an auditory region of the frontal cortex (the frontal auditory field, FAF) and the auditory cortex (AC) in the bat Carollia perspicillata. The AC is a classic sensory area central to the processing of acoustic information. The FAF, on the other hand, belongs to the frontal lobe, a brain region involved in the integration of sensory inputs, the modulation of cognitive states, and the coordination of behavioural outputs. The FAF-AC network was examined in terms of oscillatory coherence (local-field potentials, LFPs) and within an information-theoretical framework linking FAF and AC spiking activity. We show that, in the absence of acoustic stimulation, simultaneously recorded LFPs from FAF and AC are coherent at low frequencies (1-12 Hz). This “default” coupling was strongest in deep AC layers and was unaltered by acoustic stimulation. However, presenting auditory stimuli did trigger the emergence of coherent auditory-evoked gamma-band activity (>25 Hz) between FAF and AC. In terms of spiking, our results suggest that FAF and AC engage in distinct coding strategies for representing artificial and natural sounds. Taken together, our findings shed light on the neuronal coding strategies and functional coupling mechanisms that enable sound representation at the network level in the mammalian brain.
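The low-frequency LFP coherence described above can be estimated, for instance, with Welch-based magnitude-squared coherence. The following sketch uses placeholder signals and a 1 kHz sampling rate, which are assumptions rather than the study's parameters.

import numpy as np
from scipy.signal import coherence

# Hypothetical simultaneously recorded LFP channels (1 kHz sampling): a shared slow
# 4 Hz component plus independent noise, standing in for FAF and AC traces.
fs = 1000.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(3)
shared = np.sin(2 * np.pi * 4 * t)
lfp_faf = shared + rng.standard_normal(t.size)
lfp_ac = shared + rng.standard_normal(t.size)

# Magnitude-squared coherence via Welch's method; 2 s windows give 0.5 Hz resolution.
# The shared slow component yields elevated coherence in the low-frequency band.
f, cxy = coherence(lfp_faf, lfp_ac, fs=fs, nperseg=int(2 * fs))
print(f"mean coherence 1-12 Hz:  {cxy[(f >= 1) & (f <= 12)].mean():.3f}")
print(f"mean coherence 25-100 Hz: {cxy[(f >= 25) & (f <= 100)].mean():.3f}")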
Communication sounds are ubiquitous in the animal kingdom, where they play a role in advertising physiological states and/or socio-contextual scenarios. Distress sounds, for example, are typically uttered in distressing situations such as agonistic interactions. Here, we report on the occurrence of superfast temporal periodicities in distress calls emitted by bats (species Carollia perspicillata). Distress vocalizations uttered by this bat species are temporally modulated at frequencies close to 1.7 kHz, that is, ∼17 times faster than the modulation rates observed in human screams. These fast temporal periodicities are represented in the bats’ brain by means of frequency-following responses, and temporally periodic sounds are more effective in boosting the heart rate of awake bats than their demodulated versions. Altogether, our data suggest that bats, an animal group classically regarded as ultrasonic, can exploit the low-frequency portion of the soundscape during distress calling to create spectro-temporally complex, arousing sounds.
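One way to recover the ~1.7 kHz periodicity described above is to extract the amplitude envelope with a Hilbert transform and inspect its spectrum. The synthetic call below (23 kHz carrier, 1.7 kHz modulation, 192 kHz sampling) is a stand-in for real recordings; all of its parameters are assumptions.

import numpy as np
from scipy.signal import hilbert

# Synthetic stand-in for a distress syllable: a 23 kHz carrier amplitude-modulated at 1.7 kHz
fs = 192000.0
t = np.arange(0, 0.05, 1 / fs)
call = (1 + 0.8 * np.sin(2 * np.pi * 1700 * t)) * np.sin(2 * np.pi * 23000 * t)

# Amplitude envelope via the analytic signal; its spectrum reveals the modulation rate
envelope = np.abs(hilbert(call))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, d=1 / fs)
print(f"dominant envelope periodicity: {freqs[np.argmax(spectrum)]:.0f} Hz")   # ~1700 Hz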
Neural oscillations are at the core of important computations in the mammalian brain. Interactions between oscillatory activities in different frequency bands, such as delta (1-4 Hz), theta (4-8 Hz), or gamma (>30 Hz), are a powerful mechanism for binding fundamentally distinct spatiotemporal scales of neural processing. Phase-amplitude coupling (PAC) is one such plausible and well-described interaction, but much remains to be uncovered about how PAC dynamics contribute to sensory representations. In particular, although PAC appears to have a major role in audition, the characteristics of coupling profiles in sensory and integration (i.e. frontal) cortical areas remain obscure. Here, we address this question by studying PAC dynamics in the frontal auditory field (FAF; an auditory area in the bat frontal cortex) and the auditory cortex (AC) of the bat Carollia perspicillata. Using simultaneous electrophysiological recordings of local-field potentials (LFPs) in frontal and auditory cortices, we show that the amplitude of gamma-band activity couples with the phase of low-frequency LFPs in both structures. Our results demonstrate that the coupling in FAF occurs most prominently between delta and high-gamma frequencies (1-4/75-100 Hz), whereas in the AC the coupling is strongest in the theta/low-gamma (2-8/25-55 Hz) range. We argue that these distinct PAC profiles may represent different mechanisms for neuronal processing in frontal and auditory cortices and might complement oscillatory interactions for sensory processing in the frontal-auditory cortex network.
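A common estimator for the phase-amplitude coupling described above is the mean vector length, in which high-frequency amplitude is projected onto low-frequency phase. The band edges, filter choices, and simulated LFP below are illustrative assumptions, not the study's settings.

import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(sig, fs, lo, hi, order=4):
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, sig)

def pac_mvl(lfp, fs, phase_band=(1, 4), amp_band=(75, 100)):
    """Mean-vector-length estimate of phase-amplitude coupling: the amplitude of the
    high-frequency band is projected onto the phase of the low-frequency band."""
    phase = np.angle(hilbert(bandpass(lfp, fs, *phase_band)))
    amp = np.abs(hilbert(bandpass(lfp, fs, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

# Hypothetical usage: gamma bursts riding on the peaks of a 3 Hz rhythm show coupling,
# while a shuffled surrogate of the same signal should not.
fs = 1000.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(4)
slow = np.sin(2 * np.pi * 3 * t)
gamma = (slow > 0.5) * np.sin(2 * np.pi * 85 * t)
lfp = slow + 0.5 * gamma + 0.2 * rng.standard_normal(t.size)
print(pac_mvl(lfp, fs), pac_mvl(rng.permutation(lfp), fs))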
The auditory midbrain (inferior colliculus, IC) plays an important role in sound processing, acting as a hub for acoustic information extraction and for the implementation of fast audio-motor behaviors. IC neurons are topographically organized according to their sound frequency preference: dorsal IC regions encode low frequencies while ventral areas respond best to high frequencies, a type of sensory map defined as tonotopy. Tonotopic maps have been studied extensively using artificial stimuli (pure tones), but our knowledge of how these maps represent information about sequences of natural, spectro-temporally rich sounds is sparse. We studied this question by conducting simultaneous extracellular recordings across IC depths in awake bats (Carollia perspicillata) that listened to sequences of natural communication and echolocation sounds. The hypothesis was that information about these two types of sound streams is represented at different IC depths, since they exhibit large differences in spectral composition: echolocation covers the high-frequency portion of the bat soundscape (> 45 kHz), while communication sounds are broadband and carry most power at low frequencies (20–25 kHz). Our results showed that mutual information between neuronal responses and acoustic stimuli, as well as response redundancy in pairs of simultaneously recorded neurons, increases exponentially with IC depth. The latter occurs regardless of the sound type presented to the bats (echolocation or communication). Taken together, our results indicate the existence of mutual information and redundancy maps at the midbrain level which cannot be predicted from the frequency composition of natural sounds and classic neuronal tuning curves.
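The mutual information between stimuli and responses mentioned above can be illustrated with a simple plug-in estimate over a joint histogram of stimulus labels and spike counts. The simulated "deep" and "shallow" units below are hypothetical, and this sketch omits the bias corrections typically used in real analyses.

import numpy as np

def mutual_information(stimuli, responses):
    """Plug-in mutual information (bits) between discrete stimulus labels and
    discrete responses (e.g. spike counts per trial)."""
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((s_vals.size, r_vals.size))
    np.add.at(joint, (s_idx, r_idx), 1)                    # joint histogram of (stimulus, response)
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)                  # marginal over stimuli
    pr = joint.sum(axis=0, keepdims=True)                  # marginal over responses
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])))

# Hypothetical usage: a "deep" unit whose spike counts depend on the stimulus class
# carries more information than a "shallow" unit that does not.
rng = np.random.default_rng(5)
stim = rng.integers(0, 2, 1000)                            # 0 = communication, 1 = echolocation
deep = rng.poisson(2 + 6 * stim)                           # stimulus-dependent spike counts
shallow = rng.poisson(4, size=stim.size)                   # stimulus-independent spike counts
print(mutual_information(stim, deep), mutual_information(stim, shallow))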