Neural oscillations are at the core of important computations in the mammalian brain. Interactions between oscillatory activities in different frequency bands, such as delta (1-4 Hz), theta (4-8 Hz), or gamma (>30 Hz), are a powerful mechanism for binding fundamentally distinct spatiotemporal scales of neural processing. Phase-amplitude coupling (PAC) is one such plausible and well-described interaction, but much is yet to be uncovered regarding how PAC dynamics contribute to sensory representations. In particular, although PAC appears to have a major role in audition, the characteristics of coupling profiles in sensory and integration (i.e. frontal) cortical areas remain obscure. Here, we address this question by studying PAC dynamics in the frontal-auditory field (FAF; an auditory area in the bat frontal cortex) and the auditory cortex (AC) of the bat Carollia perspicillata. By means of simultaneous electrophysiological recordings in frontal and auditory cortices examining local-field potentials (LFPs), we show that the amplitude of gamma-band activity couples with the phase of low-frequency LFPs in both structures. Our results demonstrate that the coupling in FAF occurs most prominently in delta/high-gamma frequencies (1-4/75-100 Hz), whereas in the AC the coupling is strongest in the theta/low-gamma (2-8/25-55 Hz) range. We argue that distinct PAC profiles may represent different mechanisms for neuronal processing in frontal and auditory cortices, and might complement oscillatory interactions for sensory processing in the frontal-auditory cortex network.
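The phase-amplitude coupling described above can be illustrated with a minimal sketch (not the authors' analysis pipeline): a Tort-style modulation index that measures how non-uniformly the amplitude of a fast band is distributed across the phase of a slow band. The FFT-based filters, the synthetic LFP, and all parameter values below are assumptions for illustration only.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based Hilbert transform (analytic signal), NumPy only."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(X * h)

def bandpass(x, fs, lo, hi):
    """Crude FFT-domain band-pass filter (illustrative, not zero-phase FIR)."""
    n = len(x)
    freqs = np.fft.fftfreq(n, d=1.0 / fs)
    X = np.fft.fft(x)
    X[(np.abs(freqs) < lo) | (np.abs(freqs) > hi)] = 0
    return np.real(np.fft.ifft(X))

def modulation_index(lfp, fs, phase_band, amp_band, n_bins=18):
    """Tort-style PAC index: normalized KL divergence between the
    phase-binned amplitude distribution and a uniform distribution."""
    phase = np.angle(analytic_signal(bandpass(lfp, fs, *phase_band)))
    amp = np.abs(analytic_signal(bandpass(lfp, fs, *amp_band)))
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    mean_amp = np.array([amp[(phase >= edges[k]) & (phase < edges[k + 1])].mean()
                         for k in range(n_bins)])
    p = mean_amp / mean_amp.sum()
    return np.sum(p * np.log(p * n_bins)) / np.log(n_bins)

# Synthetic LFP: delta phase (3 Hz) modulating high-gamma (90 Hz) amplitude,
# mimicking the delta/high-gamma coupling reported for FAF.
fs = 1000
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
delta = np.sin(2 * np.pi * 3 * t)
lfp = (delta + (1 + 0.8 * delta) * np.sin(2 * np.pi * 90 * t)
       + 0.1 * rng.standard_normal(t.size))
print(modulation_index(lfp, fs, (1, 4), (75, 100)))   # coupled pair
print(modulation_index(lfp, fs, (4, 8), (25, 55)))    # uncoupled pair
```

The coupled band pair yields a clearly larger index than the uncoupled pair; in real data, the index would be computed per channel and compared against surrogate distributions.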
Although new advances in neuroscience allow the study of vocal communication in awake animals, substantial progress in understanding the processing of vocalizations has been made in anaesthetized preparations. Thus, understanding how anaesthetics affect neuronal responses is of paramount importance. Here, we used electrophysiological recordings and computational modelling to study how the auditory cortex of bats responds to vocalizations under anaesthesia and in wakefulness. We found that multifunctional neurons that process echolocation and communication sounds were affected by ketamine anaesthesia in a manner that could not be predicted from known anaesthetic effects. In wakefulness, acoustic contexts (preceding echolocation or communication sequences) led to stimulus-specific suppression of lagging sounds, accentuating neuronal responses to sound transitions. Under anaesthesia, however, communication contexts (but not echolocation contexts) led to a global suppression of responses to lagging sounds. This asymmetric effect depended on the frequency composition of the contexts, not on their temporal patterns. We constructed a neuron model that could replicate the data obtained in vivo. In the model, anaesthesia modulates spiking activity in a channel-specific manner, decreasing the responses of cortical inputs tuned to high-frequency sounds and increasing adaptation in the respective cortical synapses. Combined, our in vivo and in silico findings reveal that ketamine anaesthesia does not uniformly reduce neurons’ responsiveness to low- and high-frequency sounds. The effect depends on combined mechanisms that unbalance cortical inputs and ultimately shape how auditory cortex neurons respond to natural sounds in anaesthetized preparations.
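The modelling logic summarized above (a channel-specific gain reduction plus stronger synaptic adaptation under anaesthesia) can be sketched with a toy leaky integrate-and-fire neuron. This is a hypothetical illustration with made-up parameters, not the authors' model: two Poisson input channels stand in for low- and high-frequency tuned cortical inputs, each with short-term synaptic depression.

```python
import numpy as np

def lif_two_channel(rates, gains, tau_d, T=2.0, dt=1e-3, seed=0):
    """Toy leaky integrate-and-fire neuron driven by two Poisson input
    channels (low- and high-frequency tuned) with short-term synaptic
    depression. 'gains' scales each channel; 'tau_d' (s) sets how slowly
    each synapse recovers, i.e. its adaptation strength."""
    rng = np.random.default_rng(seed)
    v, v_th, v_reset, tau_m = 0.0, 1.0, 0.0, 0.05
    x = np.ones(2)          # available synaptic resources per channel
    u = 0.3                 # fraction of resources used per input spike
    spikes = 0
    for _ in range(int(T / dt)):
        drive = 0.0
        for c in range(2):
            if rng.random() < rates[c] * dt:      # presynaptic spike
                drive += gains[c] * u * x[c]
                x[c] -= u * x[c]                  # depress the synapse
            x[c] += (1 - x[c]) * dt / tau_d[c]    # resource recovery
        v += -v / tau_m * dt + drive
        if v >= v_th:
            v, spikes = v_reset, spikes + 1
    return spikes

# Hypothetical parameter choice: "anaesthesia" halves the gain of the
# high-frequency channel (index 1) and slows its recovery (more adaptation).
awake = lif_two_channel(rates=(50, 50), gains=(4.0, 4.0), tau_d=(0.2, 0.2))
anaesth = lif_two_channel(rates=(50, 50), gains=(4.0, 2.0), tau_d=(0.2, 0.8))
print(awake, anaesth)
```

Because the same seed generates identical input spike trains in both runs, the difference in output spike counts isolates the effect of the channel-specific manipulation: weakening and adapting one input channel suppresses the response without touching the other channel.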
The auditory midbrain (inferior colliculus, IC) plays an important role in sound processing, acting as a hub for acoustic information extraction and for the implementation of fast audio-motor behaviors. IC neurons are topographically organized according to their sound-frequency preference: dorsal IC regions encode low frequencies while ventral areas respond best to high frequencies, a type of sensory map known as tonotopy. Tonotopic maps have been studied extensively using artificial stimuli (pure tones), but our knowledge of how these maps represent information about sequences of natural, spectro-temporally rich sounds is sparse. We studied this question by conducting simultaneous extracellular recordings across IC depths in awake bats (Carollia perspicillata) that listened to sequences of natural communication and echolocation sounds. The hypothesis was that information about these two types of sound streams is represented at different IC depths, since they exhibit large differences in spectral composition: echolocation covers the high-frequency portion of the bat soundscape (> 45 kHz), while communication sounds are broadband and carry most power at low frequencies (20-25 kHz). Our results showed that mutual information between neuronal responses and acoustic stimuli, as well as response redundancy in pairs of simultaneously recorded neurons, increases exponentially with IC depth. The latter occurs regardless of the sound type presented to the bats (echolocation or communication). Taken together, our results indicate the existence of mutual information and redundancy maps at the midbrain level whose properties cannot be predicted from the frequency composition of natural sounds and classic neuronal tuning curves.
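The two quantities at the core of this abstract, mutual information between stimuli and responses and response redundancy in neuron pairs, can be sketched with a plug-in (direct) estimator on discretized spike counts. This is a generic illustration of the definitions, not the authors' estimator or bias-correction procedure; the toy data and variable names are assumptions.

```python
import numpy as np
from collections import Counter

def mutual_information(stimuli, responses):
    """Plug-in estimate of I(S;R) in bits from paired samples of discrete
    stimulus labels and discretized responses (e.g. spike counts)."""
    n = len(stimuli)
    p_s = Counter(stimuli)
    p_r = Counter(responses)
    p_sr = Counter(zip(stimuli, responses))
    mi = 0.0
    for (s, r), c in p_sr.items():
        p_joint = c / n
        mi += p_joint * np.log2(p_joint * n * n / (p_s[s] * p_r[r]))
    return mi

def redundancy(stimuli, r1, r2):
    """Redundancy of a neuron pair: I(S;R1) + I(S;R2) - I(S;R1,R2).
    Positive values mean the two neurons carry overlapping information."""
    joint = list(zip(r1, r2))
    return (mutual_information(stimuli, r1)
            + mutual_information(stimuli, r2)
            - mutual_information(stimuli, joint))

# Toy example: two stimulus classes (echolocation vs. communication),
# responses as trial spike counts; neuron 2 duplicates neuron 1, so the
# pair is fully redundant and redundancy equals the single-neuron MI.
rng = np.random.default_rng(1)
stim = rng.integers(0, 2, 2000)
r1 = rng.poisson(2 + 6 * stim)   # fires more to stimulus class 1
r2 = r1.copy()                   # identical second neuron
print(round(mutual_information(stim, r1), 3))
print(round(redundancy(stim, r1, r2), 3))
```

On real recordings, spike counts would come from simultaneously recorded pairs at each IC depth, and the plug-in estimate would need bias correction (e.g. shuffling) before comparing across depths.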
Animals extract behaviorally relevant signals from “noisy” environments. Echolocation provides a rich model system for investigating signal extraction: for orientation, bats broadcast calls and must assign each echo to the corresponding call. When orienting in acoustically enriched environments or when approaching targets, bats change their spectro-temporal call design. Thus, to assess call adjustments that are exclusively meant to facilitate signal extraction in “noisy” environments, it is necessary to control for distance-dependent call changes. By swinging bats in a pendulum, we tested the influence of acoustic playback on the echolocation behavior of Carollia perspicillata. This paradigm evokes reproducible orientation behavior and allows a precise definition of the influence of the acoustic context. Our results show that bats dynamically switch between different adaptations to cope with sound-based navigation in acoustically contaminated environments. These dynamics of echolocation behavior may explain the large variety of adaptations reported in the bat literature.
Echolocating bats exhibit remarkable auditory behaviors, enabled by adaptations within and outside their auditory system. Yet, research in echolocating bats has focused mostly on brain areas that belong to the classic ascending auditory pathway. This study provides direct evidence linking the cerebellum, an evolutionarily ancient and non-classic auditory structure, to vocalization and hearing. We report that in the fruit-eating bat Carollia perspicillata, external sounds can evoke cerebellar responses with latencies below 20 ms. Such fast responses are indicative of early inputs to the bat cerebellum. In vocalizing bats, distinct spike-train patterns allow predicting, with over 85% accuracy, the sound the animal is about to produce, or has just produced, i.e., communication calls or echolocation pulses. Taken together, our findings provide evidence of specializations for vocalization and hearing in the cerebellum of an auditory specialist.
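The kind of decoding behind the >85% prediction accuracy can be sketched with a simple classifier on spike-count vectors. The nearest-centroid decoder below is a generic stand-in, not the authors' method, and the toy firing-rate data are invented for illustration.

```python
import numpy as np

def nearest_centroid_decode(train_x, train_y, test_x):
    """Toy spike-train decoder: classify each trial's spike-count vector
    by the nearest class-mean (centroid) in Euclidean distance."""
    classes = np.unique(train_y)
    centroids = np.array([train_x[train_y == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(test_x[:, None, :] - centroids[None, :, :], axis=2)
    return classes[dists.argmin(axis=1)]

# Toy data: spike-count vectors over 10 hypothetical cerebellar units,
# recorded before echolocation pulses (class 0) vs. communication
# calls (class 1); class 1 trials have higher mean counts.
rng = np.random.default_rng(2)
y = rng.integers(0, 2, 400)
x = rng.poisson(3 + 4 * y[:, None], size=(400, 10)).astype(float)
pred = nearest_centroid_decode(x[:200], y[:200], x[200:])
accuracy = (pred == y[200:]).mean()
print(accuracy)
```

With well-separated class means the decoder easily exceeds chance; real spike trains would also need careful cross-validation and alignment to vocalization onset.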