The auditory midbrain (inferior colliculus, IC) plays an important role in sound processing, acting as a hub for acoustic information extraction and for the implementation of fast audio-motor behaviors. IC neurons are topographically organized according to their sound frequency preference: dorsal IC regions encode low frequencies while ventral areas respond best to high frequencies, a type of sensory map known as tonotopy. Tonotopic maps have been studied extensively using artificial stimuli (pure tones), but our knowledge of how these maps represent information about sequences of natural, spectro-temporally rich sounds is sparse. We studied this question by conducting simultaneous extracellular recordings across IC depths in awake bats (Carollia perspicillata) that listened to sequences of natural communication and echolocation sounds. The hypothesis was that information about these two types of sound streams is represented at different IC depths, since they exhibit large differences in spectral composition: echolocation covers the high-frequency portion of the bat soundscape (> 45 kHz), while communication sounds are broadband and carry most power at low frequencies (20–25 kHz). Our results showed that both mutual information between neuronal responses and acoustic stimuli, and response redundancy in pairs of simultaneously recorded neurons, increase exponentially with IC depth. The latter occurs regardless of the sound type presented to the bats (echolocation or communication). Taken together, our results indicate the existence of mutual information and redundancy maps at the midbrain level whose structure cannot be predicted from the frequency composition of natural sounds and classic neuronal tuning curves.
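The mutual information analysis mentioned above can be illustrated with a minimal plug-in estimator for discrete stimulus/response pairs. This is only a sketch of the general technique: the study's actual binning, bias correction, and stimulus set are not specified here, and the toy data below are purely illustrative.

```python
import numpy as np

def mutual_information(stimuli, responses):
    """Plug-in estimate of I(S; R) in bits from paired discrete samples."""
    stimuli = np.asarray(stimuli)
    responses = np.asarray(responses)
    mi = 0.0
    for s in np.unique(stimuli):
        p_s = np.mean(stimuli == s)
        for r in np.unique(responses):
            p_r = np.mean(responses == r)
            p_sr = np.mean((stimuli == s) & (responses == r))
            if p_sr > 0:
                mi += p_sr * np.log2(p_sr / (p_s * p_r))
    return mi

# Perfectly informative responses: one bit for two equiprobable stimuli.
s = np.array([0, 1, 0, 1, 0, 1, 0, 1])
r = s.copy()
print(mutual_information(s, r))  # → 1.0
```

With real spike trains one would first discretize responses (e.g. spike counts per window); the plug-in estimator is upward-biased for small samples, which is why published analyses typically add a bias correction.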
The mammalian frontal and auditory cortices are important for vocal behavior. Here, using local-field potential recordings, we demonstrate that the timing and spatial patterns of oscillations in the fronto-auditory network of vocalizing bats (Carollia perspicillata) predict the purpose of vocalization: echolocation or communication. Transfer entropy analyses revealed predominant top-down (frontal-to-auditory cortex) information flow during spontaneous activity and pre-vocal periods. The dynamics of information flow depend on the behavioral role of the vocalization and on the timing relative to vocal onset. We observed the emergence of predominant bottom-up (auditory-to-frontal) information transfer during the post-vocal period specific to echolocation pulse emission, leading to self-directed acoustic feedback. Electrical stimulation of frontal areas selectively enhanced responses to sounds in auditory cortex. These results reveal unique changes in information flow across sensory and frontal cortices, potentially driven by the purpose of the vocalization in a highly vocal mammalian model.
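Transfer entropy, the directed-information measure used above, can be sketched for discrete sequences with a history length of one. This is a simplified stand-in for the analysis described (which was applied to LFPs and involves embedding and binning choices not reproduced here); the toy sequences below are illustrative.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(x -> y) in bits, history length 1."""
    x, y = np.asarray(x), np.asarray(y)
    yn, yp, xp = y[1:], y[:-1], x[:-1]   # y future, y past, x past
    n = len(yn)
    c3 = Counter(zip(yn, yp, xp))
    c_yx = Counter(zip(yp, xp))
    c_yy = Counter(zip(yn, yp))
    c_y = Counter(yp)
    te = 0.0
    for (a, b, c), k in c3.items():
        # p(yn|yp,xp) / p(yn|yp) expressed with raw counts
        te += (k / n) * np.log2((k * c_y[b]) / (c_yx[(b, c)] * c_yy[(a, b)]))
    return te

# y copies x with a one-step lag, so x's past fully predicts y's future.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000)
y = np.roll(x, 1)
print(transfer_entropy(x, y))  # close to 1 bit
print(transfer_entropy(y, x))  # close to 0 bits
```

The asymmetry of the two calls is what makes transfer entropy useful for assessing directional (top-down vs. bottom-up) information flow.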
Echolocation behavior, a navigation strategy based on acoustic signals, allows scientists to explore neural processing of behaviorally relevant stimuli. For the purpose of orientation, bats broadcast echolocation calls and extract spatial information from the returning echoes. Because bats control call emission, and thus the availability of spatial information, the behavioral relevance of these signals is indisputable. While most past neurophysiological studies used synthesized acoustic stimuli that mimic portions of the echolocation signals, recent progress has been made in understanding how naturalistic echolocation signals are encoded in the bat brain. Here, we review how stimulus history affects neural processing, how spatial information from multiple objects is represented, and how echolocation signals embedded in a naturalistic, noisy environment are processed in the bat brain. We end our review by discussing the great potential that state-of-the-art recording techniques offer for gaining a more complete picture of the neuroethology of echolocation behavior.
Frontal areas of the mammalian cortex are thought to be important for cognitive control and complex behaviour. These areas have been studied mostly in humans, non-human primates and rodents. In this article, we present a quantitative characterization of the response properties of a sound-responsive frontal area in the brain of Carollia perspicillata, the frontal auditory field (FAF). Bats are highly vocal animals, and they constitute an important experimental model for studying the auditory system. We combined electrophysiology experiments and computational simulations to compare the response properties of auditory neurons found in the bat FAF and auditory cortex (AC) to simple sounds (pure tones). Anatomical studies have shown that the latter provides feedforward inputs to the former. Our results show that bat FAF neurons are responsive to sounds; however, compared to AC neurons, they exhibited sparser, less precise spiking and longer-lasting responses. Based on the results of an integrate-and-fire neuronal model, we suggest that slow, subthreshold synaptic dynamics can account for the activity pattern of neurons in the FAF. These properties reflect the general function of the frontal cortex and likely result from its connections with multiple brain regions, including cortico-cortical projections from the AC to the FAF.
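The kind of integrate-and-fire model invoked above can be sketched as follows: a leaky integrate-and-fire neuron driven by a synaptic current with a deliberately slow decay, which produces late, long-lasting spiking after a single input. All parameters (150 ms synaptic time constant, weights, thresholds) are illustrative assumptions, not fitted values from the study.

```python
import numpy as np

def lif_spikes(input_time, w=2.0, tau_m=0.02, tau_syn=0.15,
               v_th=1.0, dt=1e-4, t_max=0.6):
    """Leaky integrate-and-fire neuron driven by a slowly decaying
    synaptic current (tau_syn = 150 ms, an illustrative choice)."""
    kick = int(round(input_time / dt))
    v, i_syn, spikes = 0.0, 0.0, []
    for k in range(int(t_max / dt)):
        if k == kick:
            i_syn += w                    # synaptic input at "sound onset"
        i_syn -= dt / tau_syn * i_syn     # slow synaptic decay
        v += dt / tau_m * (i_syn - v)     # leaky membrane integration
        if v >= v_th:
            spikes.append(k * dt)
            v = 0.0                       # reset after a spike
    return spikes

spikes = lif_spikes(0.1)  # one input at 100 ms
# spiking begins after the input and outlasts it by tens of milliseconds
```

The slow synaptic time constant, rather than the membrane time constant, is what stretches the response in time; shortening `tau_syn` toward 5–10 ms would recover brief, AC-like responses.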
Neural oscillations are at the core of important computations in the mammalian brain. Interactions between oscillatory activities in different frequency bands, such as delta (1–4 Hz), theta (4–8 Hz) or gamma (>30 Hz), are a powerful mechanism for binding fundamentally distinct spatiotemporal scales of neural processing. Phase-amplitude coupling (PAC) is one such plausible and well-described interaction, but much is yet to be uncovered regarding how PAC dynamics contribute to sensory representations. In particular, although PAC appears to have a major role in audition, the characteristics of coupling profiles in sensory and integration (i.e. frontal) cortical areas remain obscure. Here, we address this question by studying PAC dynamics in the frontal-auditory field (FAF; an auditory area in the bat frontal cortex) and the auditory cortex (AC) of the bat Carollia perspicillata. By means of simultaneous electrophysiological recordings in frontal and auditory cortices examining local-field potentials (LFPs), we show that the amplitude of gamma-band activity couples with the phase of low-frequency LFPs in both structures. Our results demonstrate that the coupling in FAF occurs most prominently in delta/high-gamma frequencies (1-4/75-100 Hz), whereas in the AC the coupling is strongest in the delta-theta/low-gamma (2-8/25-55 Hz) range. We argue that distinct PAC profiles may represent different mechanisms for neuronal processing in frontal and auditory cortices, and might complement oscillatory interactions for sensory processing in the frontal-auditory cortex network.
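One common way to quantify phase-amplitude coupling of the kind described above is the mean-vector-length modulation index (phase from a low-frequency band, amplitude envelope from a high-frequency band, both via the Hilbert transform). The sketch below uses synthetic data; the bands, signal parameters, and estimator choice are illustrative and not necessarily those of the study.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def pac_mvl(lfp, fs, phase_band, amp_band):
    """Mean-vector-length estimate of phase-amplitude coupling."""
    def band(x, lo, hi):
        sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        return sosfiltfilt(sos, x)
    phase = np.angle(hilbert(band(lfp, *phase_band)))
    amp = np.abs(hilbert(band(lfp, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

# Synthetic "LFP": 80 Hz amplitude waxes and wanes with the 3 Hz phase.
fs = 1000
t = np.arange(0, 10, 1 / fs)
slow = np.sin(2 * np.pi * 3 * t)
coupled = (1 + slow) * np.sin(2 * np.pi * 80 * t) + slow
uncoupled = np.sin(2 * np.pi * 80 * t) + slow

print(pac_mvl(coupled, fs, (1, 4), (70, 90)))    # clearly above zero
print(pac_mvl(uncoupled, fs, (1, 4), (70, 90)))  # near zero
```

Sweeping `phase_band` and `amp_band` over a grid of frequency pairs yields the comodulogram-style coupling profiles contrasted between FAF and AC in the abstract.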
Communication sounds are ubiquitous in the animal kingdom, where they play a role in advertising physiological states and/or socio-contextual scenarios. Human screams, for example, are typically uttered in fearful contexts and carry a distinctive feature termed “roughness”, which describes amplitude fluctuations at rates from 30–150 Hz. In this article, we report that the occurrence of fast acoustic periodicities in harsh-sounding vocalizations is not unique to humans. A roughness-like structure is also present in vocalizations emitted by bats (species Carollia perspicillata) in distressful contexts. We report that 47.7% of distress calls produced by bats carry amplitude fluctuations at rates of ~1.7 kHz (>10 times faster than the temporal modulations found in human screams). In bats, rough-like vocalizations entrain brain potentials and are more effective at accelerating the bats’ heart rate than slowly amplitude-modulated sounds. Our results are consistent with a putative role of fast amplitude modulations (roughness, in humans) in grabbing the listener’s attention in situations in which the emitter is in distressful, potentially dangerous contexts.
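To make the notion of a ~1.7 kHz amplitude fluctuation concrete, a sinusoidally amplitude-modulated tone can be synthesized as below. The 60 kHz carrier, sampling rate, and duration are illustrative choices, not values taken from the bat recordings; sinusoidal AM simply places sidebands at the carrier ± the modulation rate.

```python
import numpy as np

fs = 192_000                      # sampling rate high enough for ultrasound
n = 9600                          # 50 ms of signal
t = np.arange(n) / fs
carrier = np.sin(2 * np.pi * 60_000 * t)            # illustrative 60 kHz tone
envelope = (1 + np.sin(2 * np.pi * 1_700 * t)) / 2  # ~1.7 kHz "roughness"
rough = envelope * carrier

# AM places sidebands at carrier +/- modulation rate (60 +/- 1.7 kHz).
spectrum = np.abs(np.fft.rfft(rough))
freqs = np.fft.rfftfreq(n, 1 / fs)
```

Inspecting `spectrum` shows exactly three lines (58.3, 60.0 and 61.7 kHz), the spectral signature of fast amplitude modulation.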
Most mammals rely on the extraction of acoustic information from the environment in order to survive. However, the mechanisms that support sound representation in auditory neural networks involving sensory and association brain areas remain underexplored. In this study, we address the functional connectivity between an auditory region in frontal cortex (the frontal auditory field, FAF) and the auditory cortex (AC) in the bat Carollia perspicillata. The AC is a classic sensory area central to the processing of acoustic information. On the other hand, the FAF belongs to the frontal lobe, a brain region involved in the integration of sensory inputs, the modulation of cognitive states, and the coordination of behavioral outputs. The FAF-AC network was examined in terms of oscillatory coherence (local-field potentials, LFPs), and within an information-theoretical framework linking FAF and AC spiking activity. We show that in the absence of acoustic stimulation, simultaneously recorded LFPs from FAF and AC are coherent in low frequencies (1–12 Hz). This “default” coupling was strongest in deep AC layers and was unaltered by acoustic stimulation. However, presenting auditory stimuli did trigger the emergence of coherent auditory-evoked gamma-band activity (>25 Hz) between the FAF and AC. In terms of spiking, our results suggest that FAF and AC engage in distinct coding strategies for representing artificial and natural sounds. Taken together, our findings shed light on the neuronal coding strategies and functional coupling mechanisms that enable sound representation at the network level in the mammalian brain.
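Band-limited coherence between two simultaneously recorded LFP channels, as used above, can be estimated with Welch-averaged magnitude-squared coherence. The synthetic channels below (a shared 6 Hz rhythm plus independent noise) are a stand-in for real FAF/AC recordings; all names and numbers are illustrative.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000
rng = np.random.default_rng(1)
t = np.arange(0, 20, 1 / fs)
shared = np.sin(2 * np.pi * 6 * t)                 # shared low-freq rhythm
faf_lfp = shared + 0.5 * rng.standard_normal(t.size)
ac_lfp = shared + 0.5 * rng.standard_normal(t.size)

f, cxy = coherence(faf_lfp, ac_lfp, fs=fs, nperseg=2048)
low = cxy[(f >= 4) & (f <= 8)].mean()     # around the shared rhythm
high = cxy[(f >= 60) & (f <= 80)].mean()  # independent noise only
print(low > high)  # → True
```

The same machinery, applied per recording depth and per stimulation condition, yields the kind of frequency-resolved coupling profile the abstract describes.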
The ability to vocalize is ubiquitous in vertebrates, but neural networks underlying vocal control remain poorly understood. Here, we performed simultaneous neuronal recordings in the frontal cortex and dorsal striatum (caudate nucleus, CN) during the production of echolocation pulses and communication calls in bats. This approach allowed us to assess the general aspects underlying vocal production in mammals and the unique evolutionary adaptations of bat echolocation. Our data indicate that before vocalization, a distinctive change in high-gamma and beta oscillations (50–80 Hz and 12–30 Hz, respectively) takes place in the bat frontal cortex and dorsal striatum. Such precise fine-tuning of neural oscillations could allow animals to selectively activate motor programs required for the production of either echolocation or communication vocalizations. Moreover, the functional coupling between frontal and striatal areas, occurring in the theta oscillatory band (4–8 Hz), differs markedly at the millisecond level, depending on whether the animals are in a navigational mode (that is, emitting echolocation pulses) or in a social communication mode (emitting communication calls). Overall, this study indicates that fronto-striatal oscillations could provide a neural correlate for vocal control in bats.
Experimental evidence indicates that cortical oscillations represent the multiscale temporal modulations present in natural stimuli, yet little is known about how these multiple timescales are processed at the neuronal level. Here, using extracellular recordings from the auditory cortex (AC) of awake bats (Carollia perspicillata), we show the existence of three neuronal types that represent different levels of the temporal structure of conspecific vocalizations, and that therefore constitute direct evidence of multiscale temporal processing of naturalistic stimuli by neurons in the AC. These neuronal subpopulations synchronize differently to local-field potentials, particularly in the theta and high-frequency bands, and are informative to different degrees in terms of their spike rates. Interestingly, we also observed that both low- and high-frequency cortical oscillations can be highly informative about the calls heard. Our results suggest that multiscale neuronal processing allows for a precise and non-redundant representation of natural vocalizations in the AC.
Low-frequency spike-field coherence is a fingerprint of periodicity coding in the auditory cortex
(2018)
The extraction of temporal information from sensory input streams is of paramount importance in the auditory system. In this study, amplitude-modulated sounds were used as stimuli to drive auditory cortex (AC) neurons of the bat species Carollia perspicillata, to assess the interactions between cortical spikes and local-field potentials (LFPs) for the processing of temporal acoustic cues. We observed that neurons in the AC capable of eliciting synchronized spiking to periodic acoustic envelopes were significantly more coherent to theta- and alpha-band LFPs than their non-synchronized counterparts. These differences occurred independently of the modulation rate tested and could not be explained by power or phase modulations of the field potentials. We argue that the coupling between neuronal spiking and the phase of low-frequency LFPs might be important for orchestrating the coding of temporal acoustic structures in the AC.
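Spike-field coherence of the sort contrasted above can be sketched by treating the spike train as a binary signal and computing its coherence with the LFP. Below, a hypothetical phase-locked unit (spike probability follows a 5 Hz field rhythm) is compared against a non-synchronized unit with the same mean rate; all rates and frequencies are illustrative, not the study's values.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000
rng = np.random.default_rng(2)
t = np.arange(0, 30, 1 / fs)
lfp = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.standard_normal(t.size)

p_locked = 0.02 * (1 + np.sin(2 * np.pi * 5 * t))     # rate follows phase
locked = (rng.random(t.size) < p_locked).astype(float)
unlocked = (rng.random(t.size) < 0.02).astype(float)  # same mean rate

f, c_locked = coherence(locked, lfp, fs=fs, nperseg=2048)
f, c_unlocked = coherence(unlocked, lfp, fs=fs, nperseg=2048)
band = (f >= 4) & (f <= 6)
print(c_locked[band].mean() > c_unlocked[band].mean())  # → True
```

Only the phase-locked unit shows elevated coherence in the low-frequency band, mirroring the distinction between synchronized and non-synchronized AC neurons drawn in the abstract.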