Cognition requires the dynamic modulation of effective connectivity, i.e., the modulation of the postsynaptic neuronal response to a given input. If postsynaptic neurons are rhythmically active, this might entail rhythmic gain modulation, such that inputs synchronized to phases of high gain benefit from enhanced effective connectivity. We show that visually induced gamma-band activity in awake macaque area V4 rhythmically modulates responses to unpredictable stimulus events. This modulation exceeded a simple additive superposition of a constant response onto ongoing gamma-rhythmic firing, demonstrating the modulation of multiplicative gain. Gamma phases leading to the strongest neuronal responses also led to the shortest behavioral reaction times, suggesting functional relevance of the effect. Furthermore, we find that constant optogenetic stimulation of anesthetized cat area 21a produces gamma-band activity entailing a similar gain modulation. As the gamma rhythm in area 21a did not spread backward to area 17, this suggests that postsynaptic gamma is sufficient for gain modulation.
Abstract: Trial-to-trial variability and spontaneous activity of cortical recordings have been suggested to reflect intrinsic noise. This view is currently challenged by mounting evidence for structure in these phenomena: trial-to-trial variability decreases following stimulus onset and can be predicted by preceding spontaneous activity. This spontaneous activity is similar in magnitude and structure to evoked activity and can predict decisions. All of the neuronal properties described above can be accounted for, at an abstract computational level, by the sampling hypothesis, according to which response variability reflects stimulus uncertainty. However, a mechanistic explanation at the level of neural circuit dynamics is still missing.
In this study, we demonstrate that all of these phenomena can be accounted for by a noise-free self-organizing recurrent neural network model (SORN). It combines spike-timing-dependent plasticity (STDP) and homeostatic mechanisms in a deterministic network of excitatory and inhibitory McCulloch-Pitts neurons. The network self-organizes in response to spatio-temporally varying input sequences.
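The model class described above can be sketched in a few lines. The sketch below combines binary McCulloch-Pitts units with the three mechanisms named in the abstract (STDP, synaptic normalization, and an intrinsic threshold rule) in a fully deterministic update; network sizes, learning rates, and the exact rule forms are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sorn_step(x, y, W_ee, W_ei, W_ie, T_e, T_i, u,
              eta_stdp=0.001, eta_ip=0.001, target_rate=0.1):
    """One deterministic update of a SORN-style network of binary
    excitatory (x) and inhibitory (y) McCulloch-Pitts units."""
    # Threshold dynamics: no noise term anywhere in the update.
    x_new = ((W_ee @ x - W_ei @ y - T_e + u) > 0).astype(float)
    y_new = ((W_ie @ x_new - T_i) > 0).astype(float)
    # STDP: potentiate pre-before-post pairs, depress the reverse order.
    W_ee = np.clip(W_ee + eta_stdp * (np.outer(x_new, x) - np.outer(x, x_new)),
                   0.0, None)
    # Synaptic normalization: incoming excitatory weights sum to one.
    rows = W_ee.sum(axis=1, keepdims=True)
    W_ee = np.where(rows > 0, W_ee / rows, W_ee)
    # Intrinsic plasticity: thresholds drift toward a target firing rate.
    T_e = T_e + eta_ip * (x_new - target_rate)
    return x_new, y_new, W_ee, T_e

# Drive the network with a spatio-temporally varying input sequence
# (randomness is used only to generate initial conditions and inputs;
# the network dynamics themselves are deterministic).
rng = np.random.default_rng(0)
n_e, n_i = 40, 10
W_ee = rng.random((n_e, n_e))
np.fill_diagonal(W_ee, 0.0)
W_ee /= W_ee.sum(axis=1, keepdims=True)
W_ei = 0.5 * rng.random((n_e, n_i))
W_ie = 0.5 * rng.random((n_i, n_e))
T_e, T_i = rng.random(n_e), 0.5 * rng.random(n_i)
x, y = (rng.random(n_e) < 0.1).astype(float), np.zeros(n_i)
inputs = (rng.random((200, n_e)) < 0.05).astype(float)
for u in inputs:
    x, y, W_ee, T_e = sorn_step(x, y, W_ee, W_ei, W_ie, T_e, T_i, u)
```

Despite the absence of any noise source inside `sorn_step`, the interplay of recurrence and the three plasticity rules is what, per the abstract, produces variability that looks stochastic from the outside.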
We find that the key properties of neural variability mentioned above develop in this model as the network learns to perform sampling-like inference. Importantly, the model shows high trial-to-trial variability although it is fully deterministic. This suggests that the trial-to-trial variability in neural recordings may not reflect intrinsic noise. Rather, it may reflect a deterministic approximation of sampling-like learning and inference. The simplicity of the model suggests that these correlates of the sampling theory are canonical properties of recurrent networks that learn with a combination of STDP and homeostatic plasticity mechanisms.
Author Summary: Neural recordings seem very noisy. If the exact same stimulus is shown to an animal multiple times, the neural response will vary. In fact, the activity of a single neuron shows many features of a stochastic process. Furthermore, in the absence of a sensory stimulus, cortical spontaneous activity has a magnitude comparable to the activity observed during stimulus presentation. These findings have led to a widespread belief that neural activity is indeed very noisy. However, recent evidence indicates that individual neurons can operate very reliably and that the spontaneous activity in the brain is highly structured, suggesting that much of the noise may in fact be signal. One hypothesis regarding this putative signal is that it reflects a form of probabilistic inference through sampling. Here we show that the key features of neural variability can be accounted for in a completely deterministic network model through self-organization. As the network learns a model of its sensory inputs, the deterministic dynamics give rise to sampling-like inference. Our findings show that the notorious variability in neural recordings does not need to be seen as evidence for a noisy brain. Instead, it may reflect sampling-like inference emerging from a self-organized learning process.
Cross-frequency coupling (CFC) has been proposed to coordinate neural dynamics across spatial and temporal scales. Despite its potential relevance for understanding healthy and pathological brain function, the standard CFC analysis and physiological interpretation come with fundamental problems. For example, apparent CFC can appear because of spectral correlations due to common non-stationarities that may arise in the total absence of interactions between neural frequency components. To provide a road map towards an improved mechanistic understanding of CFC, we organize the available and potential novel statistical/modeling approaches according to their biophysical interpretability. While we do not provide solutions for all the problems described, we provide a list of practical recommendations to avoid common errors and to enhance the interpretability of CFC analysis.
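For concreteness, the "standard CFC analysis" referred to here is typically a phase-amplitude coupling measure along the following lines (a sketch, assuming a normalized mean-vector-length metric computed from Hilbert phases and envelopes; the frequency bands and the synthetic signal are illustrative). This is exactly the kind of metric the listed caveats apply to, since anything that correlates the two bands' spectra can raise it without a genuine interaction.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(sig, fs, lo, hi, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, sig)

def pac_mvl(sig, fs, phase_band, amp_band):
    """Normalized mean-vector-length estimate of phase-amplitude
    coupling: project the amplitude envelope of the fast band onto
    the phase of the slow band and measure the resultant vector."""
    phase = np.angle(hilbert(bandpass(sig, fs, *phase_band)))
    amp = np.abs(hilbert(bandpass(sig, fs, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

# Synthetic check: a 60 Hz carrier whose amplitude is modulated by the
# 6 Hz phase, versus the same components without any coupling.
fs = 1000.0
t = np.arange(0.0, 10.0, 1.0 / fs)
slow = np.sin(2 * np.pi * 6 * t)
fast = np.sin(2 * np.pi * 60 * t)
coupled = slow + 0.3 * (1 + 0.8 * slow) * fast
uncoupled = slow + 0.3 * fast
mvl_c = pac_mvl(coupled, fs, (4, 8), (50, 70))
mvl_u = pac_mvl(uncoupled, fs, (4, 8), (50, 70))
```

The metric separates these two clean synthetic cases; the article's point is that on real, non-stationary data the same number can rise for reasons unrelated to coupling, which is why the biophysical interpretability of such measures needs the scrutiny outlined here.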
Intrinsic covariation of brain activity has been studied across many levels of brain organization. Between visual areas, neuronal activity covaries primarily among portions with similar retinotopic selectivity. We hypothesized that spontaneous inter-areal co-activation is subserved by neuronal synchronization. We performed simultaneous high-density electrocorticographic recordings across several visual areas in awake monkeys to investigate spatial patterns of local and inter-areal synchronization. We show that stimulation-induced patterns of inter-areal co-activation were reactivated in the absence of stimulation. Reactivation occurred through both inter-areal co-fluctuation of local activity and inter-areal phase synchronization. Furthermore, the trial-by-trial covariance of the induced responses recapitulated the pattern of inter-areal coupling observed during stimulation, i.e. the signal correlation. Reactivation-related synchronization showed distinct peaks in the theta, alpha and gamma frequency bands. During passive states, this rhythmic reactivation was augmented by specific patterns of arrhythmic correspondence. These results suggest that networks of intrinsic covariation observed at multiple levels and with several recording techniques are related to synchronization and that behavioral state may affect the structure of intrinsic dynamics.
Spike count correlations (SCCs) are ubiquitous in sensory cortices, are characterized by rich structure and arise from structured internal interactions. Yet, most theories of visual perception focus exclusively on the mean responses of individual neurons. Here, we argue that feedback interactions in primary visual cortex (V1) establish the context in which individual neurons process complex stimuli and that changes in visual context give rise to stimulus-dependent SCCs. Measuring V1 population responses to natural scenes in behaving macaques, we show that the fine structure of SCCs is stimulus-specific and variations in response correlations across stimuli are independent of variations in response means. Moreover, we demonstrate that stimulus-specificity of SCCs in V1 can be directly manipulated by controlling the high-order structure of synthetic stimuli. We propose that stimulus-specificity of SCCs is a natural consequence of hierarchical inference where inferences on the presence of high-level image features modulate inferences on the presence of low-level features.
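The basic quantity here, the spike count correlation matrix for one stimulus, can be sketched as follows (assuming a trials-by-neurons count matrix per stimulus; the synthetic data only illustrate the abstract's point that correlation structure can change across stimuli while mean responses stay fixed, and are not the paper's data):

```python
import numpy as np

def spike_count_correlations(counts):
    """counts: (n_trials, n_neurons) spike counts from repeated
    presentations of one stimulus. Returns the n_neurons x n_neurons
    matrix of trial-to-trial (noise) correlations for that stimulus."""
    return np.corrcoef(counts, rowvar=False)

# Two synthetic "stimuli" with identical mean responses but opposite
# pairwise correlation between neurons 0 and 1.
rng = np.random.default_rng(1)
mean = np.array([10.0, 10.0, 10.0])
cov_a = np.array([[4.0, 3.0, 0.0], [3.0, 4.0, 0.0], [0.0, 0.0, 4.0]])
cov_b = np.array([[4.0, -3.0, 0.0], [-3.0, 4.0, 0.0], [0.0, 0.0, 4.0]])
counts_a = rng.multivariate_normal(mean, cov_a, size=2000)
counts_b = rng.multivariate_normal(mean, cov_b, size=2000)
scc_a = spike_count_correlations(counts_a)
scc_b = spike_count_correlations(counts_b)
```

Comparing `scc_a` and `scc_b` entry-wise across stimuli, while separately comparing the column means of the count matrices, is the kind of analysis that lets one dissociate correlation changes from mean-rate changes.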
Rhythmic neural spiking and attentional sampling arising from cortical receptive field interactions
(2018)
Summary: Growing evidence suggests that distributed spatial attention may invoke theta (3-9 Hz) rhythmic sampling processes. The neuronal basis of such attentional sampling is, however, not fully understood. Here we show, using array recordings in visual cortical area V4 of two awake macaques, that presenting separate visual stimuli to the excitatory center and suppressive surround of neuronal receptive fields elicits rhythmic multi-unit activity (MUA) at 3-6 Hz. This neuronal rhythm did not depend on small fixational eye movements. In the context of a distributed spatial attention task, during which the monkeys detected a spatially and temporally uncertain target, reaction times (RT) exhibited similar rhythmic fluctuations. RTs were fast or slow depending on whether the target occurred during high or low MUA, resulting in rhythmic MUA-RT cross-correlations at theta frequencies. These findings suggest that theta-rhythmic neuronal activity arises from competitive receptive field interactions and that this rhythm may subserve attentional sampling.
Highlights:
* Center-surround interactions induce theta-rhythmic MUA of visual cortex neurons
* The MUA rhythm does not depend on small fixational eye movements
* Reaction time fluctuations lock to the neuronal rhythm under distributed attention
Reducing neuronal size results in less cell membrane and therefore lower input conductance. Smaller neurons are thus more excitable, as seen in their voltage responses to current injections in the soma. However, the impact of a neuron’s size and shape on its voltage responses to synaptic activation in dendrites is much less understood. Here we use analytical cable theory to predict voltage responses to distributed synaptic inputs and show that these are entirely independent of dendritic length. For a given synaptic density, a neuron’s response depends only on the average dendritic diameter and its intrinsic conductivity. These results remain true for the entire range of possible dendritic morphologies irrespective of any particular arborisation complexity. Also, spiking models result in morphology-invariant numbers of action potentials that encode the percentage of active synapses. Interestingly, in contrast to spike rate, spike times do depend on dendrite morphology. In summary, a neuron’s excitability in response to synaptic inputs is not affected by total dendrite length. Rather, it provides a homeostatic input-output relation that specialised synapse distributions, local non-linearities in the dendrites and synaptic plasticity can modulate. Our work reveals a new fundamental principle of dendritic constancy that has consequences for the overall computation in neural circuits.
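A toy compartmental version of the length-invariance claim can be checked numerically (this is an illustration under simplified assumptions, not the paper's analytical derivation; conductance values are arbitrary): with synaptic conductances at a fixed density along a passive cable with sealed ends, the steady-state depolarization is spatially uniform and does not change when the cable is made longer, because synaptic drive and membrane leak grow together with membrane area.

```python
import numpy as np

def steady_state_voltage(n_comp, g_leak=0.1, g_syn=0.05, g_axial=10.0,
                         E_leak=-70.0, E_syn=0.0):
    """Steady-state voltages of a passive cable with sealed ends and
    the same synaptic conductance in every compartment (i.e. a fixed
    synaptic density). Conductances are per compartment, units arbitrary."""
    G = np.zeros((n_comp, n_comp))
    rhs = np.full(n_comp, g_leak * E_leak + g_syn * E_syn)
    for i in range(n_comp):
        G[i, i] = g_leak + g_syn
        for j in (i - 1, i + 1):          # axial coupling to neighbours
            if 0 <= j < n_comp:
                G[i, i] += g_axial
                G[i, j] -= g_axial
    return np.linalg.solve(G, rhs)

v_short = steady_state_voltage(20)    # short dendrite
v_long = steady_state_voltage(200)    # ten times longer, same density
```

Because the uniform voltage `(g_leak*E_leak + g_syn*E_syn) / (g_leak + g_syn)` solves the cable equation exactly (all axial terms cancel), both cables settle at the same depolarization regardless of length.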
Excess neuronal branching allows for innervation of specific dendritic compartments in cortex
(2019)
The connectivity of cortical microcircuits is a major determinant of brain function; defining how activity propagates between different cell types is key to scaling our understanding of individual neuronal behaviour to encompass functional networks. Furthermore, the integration of synaptic currents within a dendrite depends on the spatial organisation of inputs, both excitatory and inhibitory. We identify a simple equation to estimate the number of potential anatomical contacts between neurons, finding a linear increase in potential connectivity with cable length and maximum spine length, and a decrease with overlapping volume. This enables us to predict the mean number of candidate synapses for reconstructed cells, including those realistically arranged. We identify an excess of putative connections in cortical data, with neurite densities higher than necessary to reliably ensure the possible implementation of any given connection. We show that potential contacts allow the particular implementation of connectivity at a subcellular level.
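The abstract does not reproduce the equation itself. The sketch below implements the classic potential-synapse estimate that has exactly the stated scaling (linear in each cable length and in spine reach, inversely proportional to the shared volume); treat the constant factor and the formula as an assumption standing in for the paper's equation, not as its result.

```python
def expected_potential_contacts(axon_length_um, dendrite_length_um,
                                spine_reach_um, shared_volume_um3):
    """Expected number of potential (anatomical) contacts between one
    axon and one dendrite whose arbors share a volume, assuming both
    cables are distributed independently and roughly isotropically in
    that volume. Classic 2*s*La*Ld/V scaling: linear in cable length
    and spine reach, inversely proportional to overlap volume."""
    return (2.0 * spine_reach_um * axon_length_um * dendrite_length_um
            / shared_volume_um3)

# Illustrative numbers (hypothetical, for the scaling only):
n = expected_potential_contacts(4000.0, 3000.0, 2.0, 1e7)
```

The "excess" finding then corresponds to observed neurite densities giving expected counts well above one for the pairs that need to be connectable.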
Sholl analysis has been an important technique in dendritic anatomy for more than 60 years. The Sholl intersection profile is obtained by counting the number of dendritic branches at a given distance from the soma and is a key measure of dendritic complexity; it has applications from evaluating the changes in structure induced by pathologies to estimating the expected number of anatomical synaptic contacts. We find that the Sholl intersection profiles of most neurons can be reproduced from three basic, functional measures: the domain spanned by the dendritic arbor, the total length of the dendrite, and the angular distribution of how far dendritic segments deviate from a direct path to the soma (i.e., the root angle distribution). The first two measures are determined by axon location and hence microcircuit structure; the third arises from optimal wiring and represents a branching statistic estimating the need for conduction speed in a neuron.
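The intersection count described here is straightforward to sketch (a minimal version, assuming the morphology is given as straight segments between reconstruction points; a segment that crosses a sphere twice between its endpoints is missed, which is a reasonable approximation for short segments):

```python
import numpy as np

def sholl_profile(segments, soma, radii):
    """Sholl intersection profile: for each radius, count dendritic
    segments that cross the sphere of that radius centred on the soma.
    segments: (n, 2, 3) array of straight-segment endpoints.
    A segment is counted when its endpoints lie on opposite sides of
    the sphere."""
    soma = np.asarray(soma, dtype=float)
    d0 = np.linalg.norm(segments[:, 0] - soma, axis=1)
    d1 = np.linalg.norm(segments[:, 1] - soma, axis=1)
    lo, hi = np.minimum(d0, d1), np.maximum(d0, d1)
    return np.array([int(np.sum((lo < r) & (hi >= r))) for r in radii])
```

For example, a straight radial path gives one intersection per shell, and each side branch adds one more at the shells it crosses, so the profile directly reflects how much cable deviates from a direct path to the soma.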