Human behaviour is inextricably linked to the interaction of emotion and cognition. For decades, emotion and cognition were perceived as separable processes, yet with mutual interactions. Recently, this differentiation has been challenged by more integrative approaches, but without addressing the exact neurophysiological basis of their interaction. Here, we aimed to uncover neurophysiological mechanisms of emotion-cognition interaction. We used an emotional Flanker task paired with EEG/FEM beamforming in a large cohort (N=121) of healthy human participants, obtaining high temporal and fMRI-equivalent spatial resolution. Spatially, emotion and cognition processing overlapped in the right inferior frontal gyrus (rIFG), specifically in pars triangularis. Temporally, emotion and cognition processing overlapped during the transition from emotional to cognitive processing, with a stronger interaction in β-band power leading to worse behavioral performance. Despite functionally segregated subdivisions in rIFG, frequency-specific information flowed extensively within IFG and top-down to visual areas (V2, Precuneus) – explaining the behavioral interference effect. Thus, we show here for the first time the neural mechanisms of emotion-cognition interaction in space, time, frequency and information transfer with high temporal and spatial resolution, revealing a central role for β-band activity in rIFG. Our results support the idea that rIFG plays a broad role in both inhibitory control and emotional interference inhibition, as it is a site of convergence for both processes. Furthermore, our results have potential clinical implications for understanding dysfunctional emotion-cognition interaction and emotional interference inhibition in psychiatric disorders, e.g. major depression and substance use disorder, in which patients have difficulties in regulating emotions and executing inhibitory control.
The mammalian frontal and auditory cortices are important for vocal behaviour. Here, using local field potential recordings, we demonstrate for the first time that the timing and spatial pattern of oscillations in the fronto-auditory cortical network of vocalizing bats (Carollia perspicillata) predict the purpose of vocalization: echolocation or communication. Transfer entropy analyses revealed predominantly top-down (frontal-to-auditory cortex) information flow during spontaneous activity and pre-vocal periods. The dynamics of information flow depended on the behavioural role of the vocalization and on the timing relative to vocal onset. Remarkably, we observed the emergence of predominantly bottom-up (auditory-to-frontal cortex) information transfer patterns specific to echolocation production, leading to self-directed acoustic feedback. Electrical stimulation of frontal areas selectively enhanced responses to echolocation sounds in auditory cortex. These results reveal unique changes in information flow across sensory and frontal cortices, potentially driven by the purpose of the vocalization in a highly vocal mammalian model.
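The abstract above relies on transfer entropy to assign a direction (top-down vs. bottom-up) to information flow between the two cortices, but does not spell the measure out. As a rough illustration only — a minimal discrete estimator with history length 1, not the authors' actual analysis pipeline (which operates on continuous LFP data) — the quantity can be sketched as:

```python
import numpy as np

def transfer_entropy(x, y, n_states=2):
    """TE(X -> Y) in bits for discrete integer sequences (states 0..n_states-1),
    using a history length of 1:

        TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]

    A positive value means the past of X helps predict Y beyond Y's own past.
    """
    y1, y0, x0 = y[1:], y[:-1], x[:-1]
    te = 0.0
    for a in range(n_states):          # state of y at t+1
        for b in range(n_states):      # state of y at t
            p_y1y0 = np.mean((y1 == a) & (y0 == b))
            p_y0 = np.mean(y0 == b)
            for c in range(n_states):  # state of x at t
                p_joint = np.mean((y1 == a) & (y0 == b) & (x0 == c))
                p_y0x0 = np.mean((y0 == b) & (x0 == c))
                if min(p_joint, p_y0x0, p_y1y0, p_y0) > 0:
                    te += p_joint * np.log2((p_joint / p_y0x0) / (p_y1y0 / p_y0))
    return te

# Toy check: x is a random binary sequence and y copies x with a one-step
# delay, so information should flow x -> y but not y -> x.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)
te_xy = transfer_entropy(x, y)  # close to 1 bit
te_yx = transfer_entropy(y, x)  # close to 0
```

An asymmetry between `te_xy` and `te_yx`, as in this toy example, is what licenses directional statements such as "predominantly top-down information flow".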
Most mammals rely on the extraction of acoustic information from the environment in order to survive. However, the mechanisms that support sound representation in auditory neural networks involving sensory and association brain areas remain underexplored. In this study, we address the functional connectivity between an auditory region in frontal cortex (the frontal auditory field, FAF) and the auditory cortex (AC) in the bat Carollia perspicillata. The AC is a classic sensory area central for the processing of acoustic information. On the other hand, the FAF belongs to the frontal lobe, a brain region involved in the integration of sensory inputs, modulation of cognitive states, and in the coordination of behavioural outputs. The FAF-AC network was examined in terms of oscillatory coherence (local-field potentials, LFPs), and within an information theoretical framework linking FAF and AC spiking activity. We show that in the absence of acoustic stimulation, simultaneously recorded LFPs from FAF and AC are coherent in low frequencies (1-12 Hz). This “default” coupling was strongest in deep AC layers and was unaltered by acoustic stimulation. However, presenting auditory stimuli did trigger the emergence of coherent auditory-evoked gamma-band activity (>25 Hz) between the FAF and AC. In terms of spiking, our results suggest that FAF and AC engage in distinct coding strategies for representing artificial and natural sounds. Taken together, our findings shed light on the neuronal coding strategies and functional coupling mechanisms that enable sound representation at the network level in the mammalian brain.
Neural oscillations are at the core of important computations in the mammalian brain. Interactions between oscillatory activities in different frequency bands, such as delta (1-4 Hz), theta (4-8 Hz), or gamma (>30 Hz), are a powerful mechanism for binding fundamentally distinct spatiotemporal scales of neural processing. Phase-amplitude coupling (PAC) is one such plausible and well-described interaction, but much is yet to be uncovered regarding how PAC dynamics contribute to sensory representations. In particular, although PAC appears to have a major role in audition, the characteristics of coupling profiles in sensory and integration (i.e. frontal) cortical areas remain obscure. Here, we address this question by studying PAC dynamics in the frontal-auditory field (FAF; an auditory area in the bat frontal cortex) and the auditory cortex (AC) of the bat Carollia perspicillata. By means of simultaneous electrophysiological recordings in frontal and auditory cortices examining local-field potentials (LFPs), we show that the amplitude of gamma-band activity couples with the phase of low-frequency LFPs in both structures. Our results demonstrate that the coupling in FAF occurs most prominently in delta/high-gamma frequencies (1-4/75-100 Hz), whereas in the AC the coupling is strongest in the delta-theta/low-gamma (2-8/25-55 Hz) range. We argue that distinct PAC profiles may represent different mechanisms for neuronal processing in frontal and auditory cortices, and might complement oscillatory interactions for sensory processing in the frontal-auditory cortex network.
Neural oscillations are at the core of important computations in the mammalian brain. Interactions between oscillatory activities in different frequency bands, such as delta (1–4 Hz), theta (4–8 Hz) or gamma (>30 Hz), are a powerful mechanism for binding fundamentally distinct spatiotemporal scales of neural processing. Phase-amplitude coupling (PAC) is one such plausible and well-described interaction, but much is yet to be uncovered regarding how PAC dynamics contribute to sensory representations. In particular, although PAC appears to have a major role in audition, the characteristics of coupling profiles in sensory and integration (i.e. frontal) cortical areas remain obscure. Here, we address this question by studying PAC dynamics in the frontal-auditory field (FAF; an auditory area in the bat frontal cortex) and the auditory cortex (AC) of the bat Carollia perspicillata. By means of simultaneous electrophysiological recordings in frontal and auditory cortices examining local-field potentials (LFPs), we show that the amplitude of gamma-band activity couples with the phase of low-frequency LFPs in both structures. Our results demonstrate that the coupling in FAF occurs most prominently in delta/high-gamma frequencies (1-4/75-100 Hz), whereas in the AC the coupling is strongest in the delta-theta/low-gamma (2-8/25-55 Hz) range. We argue that distinct PAC profiles may represent different mechanisms for neuronal processing in frontal and auditory cortices, and might complement oscillatory interactions for sensory processing in the frontal-auditory cortex network.
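Phase-amplitude coupling, the central measure in the abstract above, quantifies how strongly the envelope of a fast rhythm follows the phase of a slow one. The sketch below uses the mean-vector-length approach on a synthetic signal; it is a generic illustration under assumed band limits, not the authors' exact PAC pipeline (several PAC estimators exist, e.g. Tort's modulation index):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def pac_mvl(lfp, fs, phase_band, amp_band):
    """Phase-amplitude coupling via the normalized mean vector length.

    Band-passes the signal in a low-frequency (phase) and a high-frequency
    (amplitude) band, then measures how strongly the high-band envelope
    clusters around a preferred low-band phase (0 = none, up to 1).
    """
    def bandpass(sig, lo, hi):
        sos = butter(3, [lo, hi], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, sig)

    phase = np.angle(hilbert(bandpass(lfp, *phase_band)))
    amp = np.abs(hilbert(bandpass(lfp, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

# Synthetic LFP: an 80 Hz carrier whose amplitude rides on a 4 Hz rhythm
# (coupled), versus the same components without the modulation (uncoupled).
fs = 1000
t = np.arange(0, 10, 1 / fs)
slow = np.sin(2 * np.pi * 4 * t)
coupled = (1 + slow) * np.sin(2 * np.pi * 80 * t) + slow
uncoupled = np.sin(2 * np.pi * 80 * t) + slow

mi_coupled = pac_mvl(coupled, fs, (2, 6), (60, 100))      # clearly above zero
mi_uncoupled = pac_mvl(uncoupled, fs, (2, 6), (60, 100))  # near zero
```

Band-pair labels such as "delta/high-gamma" in the abstract correspond to particular choices of `phase_band` and `amp_band` in this scheme.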
Most mammals rely on the extraction of acoustic information from the environment in order to survive. However, the mechanisms that support sound representation in auditory neural networks involving sensory and association brain areas remain underexplored. In this study, we address the functional connectivity between an auditory region in frontal cortex (the frontal auditory field, FAF) and the auditory cortex (AC) in the bat Carollia perspicillata. The AC is a classic sensory area central for the processing of acoustic information. On the other hand, the FAF belongs to the frontal lobe, a brain region involved in the integration of sensory inputs, modulation of cognitive states, and in the coordination of behavioral outputs. The FAF-AC network was examined in terms of oscillatory coherence (local-field potentials, LFPs), and within an information theoretical framework linking FAF and AC spiking activity. We show that in the absence of acoustic stimulation, simultaneously recorded LFPs from FAF and AC are coherent in low frequencies (1–12 Hz). This “default” coupling was strongest in deep AC layers and was unaltered by acoustic stimulation. However, presenting auditory stimuli did trigger the emergence of coherent auditory-evoked gamma-band activity (>25 Hz) between the FAF and AC. In terms of spiking, our results suggest that FAF and AC engage in distinct coding strategies for representing artificial and natural sounds. Taken together, our findings shed light on the neuronal coding strategies and functional coupling mechanisms that enable sound representation at the network level in the mammalian brain.
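The low-frequency LFP coherence reported between FAF and AC can be illustrated with a standard magnitude-squared coherence computation. The example below uses synthetic channels that merely stand in for simultaneously recorded field potentials; it shows the generic method, not the recorded data or the authors' exact parameters:

```python
import numpy as np
from scipy.signal import coherence

fs = 1000
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(1)

# Two synthetic "channels" sharing a 6 Hz rhythm plus independent noise,
# standing in for simultaneously recorded frontal and auditory LFPs.
shared = np.sin(2 * np.pi * 6 * t)
lfp_a = shared + 0.5 * rng.standard_normal(t.size)
lfp_b = shared + 0.5 * rng.standard_normal(t.size)

# Welch-based magnitude-squared coherence: 1 where the channels share a
# consistent phase/amplitude relationship, near 0 where they do not.
f, coh = coherence(lfp_a, lfp_b, fs=fs, nperseg=2000)
i_6hz = np.argmin(np.abs(f - 6))  # coherence peaks at the shared frequency
```

A flat, near-zero coherence spectrum with a peak only in the low-frequency range is the kind of signature summarized in the abstract as "coherent in low frequencies (1–12 Hz)".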
Low-frequency spike-field coherence is a fingerprint of periodicity coding in the auditory cortex
(2018)
The extraction of temporal information from sensory input streams is of paramount importance in the auditory system. In this study, amplitude-modulated sounds were used as stimuli to drive auditory cortex (AC) neurons of the bat species Carollia perspicillata, to assess the interactions between cortical spikes and local-field potentials (LFPs) for the processing of temporal acoustic cues. We observed that neurons in the AC capable of eliciting synchronized spiking to periodic acoustic envelopes were significantly more coherent to theta- and alpha-band LFPs than their non-synchronized counterparts. These differences occurred independently of the modulation rate tested and could not be explained by power or phase modulations of the field potentials. We argue that the coupling between neuronal spiking and the phase of low-frequency LFPs might be important for orchestrating the coding of temporal acoustic structures in the AC.
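Spike-LFP coherence of the kind described above is often summarized by how tightly spikes cluster at a preferred LFP phase. The sketch below computes a simple vector-strength measure on toy data; the function name, bands, and synthetic spike trains are illustrative assumptions, not the study's actual analysis:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def spike_field_locking(spike_times, lfp, fs, band):
    """Vector strength of spiking relative to the LFP phase in `band`.

    Band-passes the LFP, extracts the instantaneous phase via the Hilbert
    transform, reads out the phase at each spike time, and returns the
    mean resultant vector length: 0 for spikes spread uniformly over the
    oscillation cycle, 1 for spikes locked to a single phase.
    """
    sos = butter(3, list(band), btype="bandpass", fs=fs, output="sos")
    phase = np.angle(hilbert(sosfiltfilt(sos, lfp)))
    idx = (np.asarray(spike_times) * fs).astype(int)
    return np.abs(np.mean(np.exp(1j * phase[idx])))

# Toy data: a noisy 6 Hz field, one unit firing at a fixed phase of every
# cycle ("synchronized") and another firing at random times.
fs = 1000
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
lfp = np.sin(2 * np.pi * 6 * t) + 0.2 * rng.standard_normal(t.size)
locked_spikes = np.arange(0.1, 9.9, 1 / 6)   # one spike per cycle
random_spikes = rng.uniform(0, 10, 200)      # phase-indifferent unit

plv_locked = spike_field_locking(locked_spikes, lfp, fs, (4, 8))
plv_random = spike_field_locking(random_spikes, lfp, fs, (4, 8))
```

The contrast between `plv_locked` and `plv_random` mirrors the reported difference between synchronized neurons, which cohere with theta/alpha LFPs, and their non-synchronized counterparts.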
Experimental evidence supports that cortical oscillations represent multiscale temporal modulations existent in natural stimuli, yet little is known about the processing of these multiple timescales at a neuronal level. Here, using extracellular recordings from the auditory cortex (AC) of awake bats (Carollia perspicillata), we show the existence of three neuronal types that represent different levels of the temporal structure of conspecific vocalizations, and therefore constitute direct evidence of multiscale temporal processing of naturalistic stimuli by neurons in the AC. These neuronal subpopulations synchronize differently to local-field potentials, particularly in theta- and high-frequency bands, and are informative to a different degree in terms of their spike rate. Interestingly, we also observed that both low- and high-frequency cortical oscillations can be highly informative about the calls heard. Our results suggest that multiscale neuronal processing allows for the precise and non-redundant representation of natural vocalizations in the AC.
The mammalian frontal and auditory cortices are important for vocal behavior. Here, using local-field potential recordings, we demonstrate that the timing and spatial patterns of oscillations in the fronto-auditory network of vocalizing bats (Carollia perspicillata) predict the purpose of vocalization: echolocation or communication. Transfer entropy analyses revealed predominant top-down (frontal-to-auditory cortex) information flow during spontaneous activity and pre-vocal periods. The dynamics of information flow depend on the behavioral role of the vocalization and on the timing relative to vocal onset. We observed the emergence of predominant bottom-up (auditory-to-frontal) information transfer during the post-vocal period specific to echolocation pulse emission, leading to self-directed acoustic feedback. Electrical stimulation of frontal areas selectively enhanced responses to sounds in auditory cortex. These results reveal unique changes in information flow across sensory and frontal cortices, potentially driven by the purpose of the vocalization in a highly vocal mammalian model.