Two generic mechanisms for emergence of direction selectivity coexist in recurrent neural networks
(2013)
Poster presentation: Twenty Second Annual Computational Neuroscience Meeting: CNS*2013. Paris, France. 13-18 July 2013.
In the mammalian visual cortex, the time-averaged response of many neurons is maximal for stimuli moving in a particular direction. Such a direction selective response is not found in the LGN, upstream in the visual processing pathway, suggesting that cortical networks play a strong role in the generation of direction selectivity. Here we investigate the mechanisms for the emergence of direction selectivity in recurrent networks of nonlinear firing rate neurons in layer 4 of V1 receiving input from the LGN. In the model, the LGN inputs are characterized by different receptive field positions, and their relative temporal phase shifts are reversed for stimuli moving in opposite directions. We propose that two distinct mechanisms produce direction selective neuronal responses in these recurrent networks. The first is the result of nonlinear feed-forward summation of several time-shifted inputs. The second is based on competition between neurons for firing in a winner-take-all regime. Both mechanisms rely on inhibitory interactions in the matrix of lateral connections, but the second one involves inhibitory loops. Typically, the first mechanism results in lower selectivity values than the second, but the time course of acquiring a direction selective response is faster for the first mechanism. Importantly, the two mechanisms have different input frequency tuning. The first mechanism, based on nonlinear summation, results in a relatively narrow tuning curve around the preferred frequency of a moving grating stimulus. In contrast, the direction selectivity arising from the second mechanism depends only weakly on the input frequency, i.e. it has a broader tuning curve. These differences allow us to provide a recipe for identifying experimentally which of the two mechanisms is used by a given direction selective neuron.
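The feed-forward summation mechanism can be illustrated with a minimal toy sketch (our construction for illustration, not the network model of the abstract; all parameters are arbitrary): inputs at positions x_i carry intrinsic delays matched to one stimulus speed, so motion in the preferred direction aligns all input phases while opposite motion scatters them, and an expansive output nonlinearity turns this into a direction-dependent time-averaged response.

```python
import numpy as np

# Toy illustration (not the abstract's model): direction selectivity from
# nonlinear summation of time-shifted inputs. Input i samples a drifting
# grating at position x_i with an intrinsic delay d_i = x_i / v. Motion in
# the preferred direction aligns all input phases; opposite motion spreads
# them over a full cycle, so they cancel.
omega = 2 * np.pi * 2.0          # temporal frequency: 2 Hz (assumed)
k = 2 * np.pi                    # spatial frequency (assumed)
v = omega / k                    # stimulus speed the delays are matched to
x = np.arange(8) / 8.0           # 8 input positions
t = np.linspace(0, 1, 1000)      # two stimulus cycles

def mean_response(direction):
    """Time-averaged response for direction = +1 (preferred) or -1 (null)."""
    drive = sum(np.cos(omega * (t - xi / v) + direction * k * xi) for xi in x)
    return np.mean(np.maximum(drive, 0.0) ** 2)   # expansive nonlinearity

r_pref, r_null = mean_response(+1), mean_response(-1)
dsi = (r_pref - r_null) / (r_pref + r_null)       # direction selectivity index
```

With phases aligned, the preferred-direction drive is 8 cos(ωt), while in the null direction the eight equally spaced phases cancel, so the index dsi approaches 1.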
We then analyze how the statistics of the connections in random recurrent networks affect the relative contributions of these two mechanisms and determine the distributions of the direction selectivity values. We identify the motifs in the connectivity matrix that are required for each mechanism and show that the minimal conditions for both mechanisms are met in a very broad set of random recurrent networks with sufficiently strong inhibitory connections. Thus, we propose that these mechanisms coexist in generic recurrent networks with inhibition. Our results may account for the recent experimental observations that direction selectivity is present in dark-reared mice and ferrets [1,2]. They can also explain the emergence of direction selectivity in species lacking a spatially organized direction selectivity map.
TRENTOOL: an open source toolbox to estimate neural directed interactions with transfer entropy
(2011)
To investigate directed interactions in neural networks we often use Norbert Wiener's famous definition of observational causality. Wiener's definition states that an improvement in the prediction of the future of a time series X from its own past, obtained by incorporating information from the past of a second time series Y, is an indication of a causal interaction from Y to X. Early implementations of Wiener's principle – such as Granger causality – modelled interacting systems by linear autoregressive processes, and the interactions themselves were also assumed to be linear. However, in complex systems – such as the brain – nonlinear behaviour of its parts and nonlinear interactions between them have to be expected. In fact, nonlinear power-to-power or phase-to-power interactions between frequencies are reported frequently. To cover all types of nonlinear interactions in the brain, and thereby to fully chart the neural networks of interest, it is useful to implement Wiener's principle in a way that is free of a model of the interaction [1]. Indeed, it is possible to reformulate Wiener's principle in terms of information theoretic quantities to obtain the desired model-freeness. The resulting measure was originally formulated by Schreiber [2] and termed transfer entropy (TE). Shortly after its publication, transfer entropy found applications to neurophysiological data. With the introduction of new, data efficient estimators (e.g. [3]), TE has experienced a rapid surge of interest (e.g. [4]). Applications of TE in neuroscience range from recordings in cultured neuronal populations to functional magnetic resonance imaging (fMRI) signals. Despite widespread interest in TE, no publicly available toolbox exists that guides the user through the difficulties of this powerful technique.
TRENTOOL (the TRansfer ENtropy TOOLbox) fills this gap for the neurosciences by bundling data efficient estimation algorithms with the necessary parameter estimation routines and nonparametric statistical testing procedures for comparison to surrogate data or between experimental conditions. TRENTOOL is an open source MATLAB toolbox based on the Fieldtrip data format. ...
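To make the quantity concrete, here is a minimal plug-in transfer entropy estimator for discrete data with history length 1 (our sketch for illustration only; TRENTOOL itself uses data-efficient nearest-neighbour estimators for continuous data). It uses the entropy decomposition TE(Y→X) = H(X', X) + H(X, Y) − H(X', X, Y) − H(X), with X' the one-step future of X.

```python
import numpy as np

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(Y->X) in bits for discrete series,
    with history length 1: TE = I(x_{t+1}; y_t | x_t)."""
    xf, xp, yp = x[1:], x[:-1], y[:-1]   # future of X, past of X, past of Y

    def H(*cols):
        # joint Shannon entropy (bits) of the given discrete columns
        joint = np.stack(cols, axis=1)
        _, counts = np.unique(joint, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    return H(xf, xp) + H(xp, yp) - H(xf, xp, yp) - H(xp)

# Toy coupled system: X copies Y with a one-step delay, Y is i.i.d. binary.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10_000)
x = np.empty_like(y)
x[0] = 0
x[1:] = y[:-1]

te_yx = transfer_entropy(x, y)   # information flows Y -> X: about 1 bit
te_xy = transfer_entropy(y, x)   # no flow X -> Y: about 0 bits
```

The asymmetry te_yx ≫ te_xy recovers the direction of coupling, which is exactly the model-free Wiener-causality question TE is designed to answer.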
Quenched QCD at zero baryonic chemical potential undergoes a first-order deconfinement phase transition at a critical temperature Tc, which is related to the spontaneous breaking of the global center symmetry. Including heavy, dynamical quarks breaks the center symmetry explicitly and weakens the first-order phase transition. For decreasing quark masses the first-order phase transition turns into a smooth crossover at a Z2-critical point. The critical quark mass corresponding to this point has been examined with Nf=2 Wilson fermions for several Nτ in a recent study within our group. For comparison, we also locate the critical point with Nf=2 staggered fermions on Nτ=8 lattices. For this purpose we perform Monte Carlo simulations for several quark mass values and various aspect ratios in order to extrapolate to the thermodynamic limit. The critical mass is obtained by fitting the kurtosis of the Polyakov loop to a finite-size scaling formula. Our results indicate large discretization effects, requiring simulations on lattices with Nτ>8.
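The kurtosis used in this finite-size scaling analysis is the Binder-cumulant-style ratio B4 = ⟨δP⁴⟩/⟨δP²⟩², which tends to 3 for a Gaussian (crossover-like) distribution and to 1 for a double-peaked (first-order-like) distribution, with the Z2 universal value ≈ 1.604 at the critical point. A sketch with synthetic stand-in samples (no lattice data involved):

```python
import numpy as np

def binder_kurtosis(samples):
    """B4 = <dP^4> / <dP^2>^2 with dP = P - <P>, for (Polyakov-loop) samples."""
    d = samples - samples.mean()
    return np.mean(d**4) / np.mean(d**2) ** 2

rng = np.random.default_rng(1)
# Synthetic stand-ins for the two regimes (illustration only):
crossover_like = rng.normal(size=100_000)             # single peak: B4 -> 3
first_order_like = np.concatenate([                   # two narrow peaks: B4 -> 1
    rng.normal(-1.0, 0.05, 50_000),
    rng.normal(+1.0, 0.05, 50_000),
])
```

In the lattice analysis one computes B4 per quark mass and volume and locates the critical mass where the curves cross near the universal Z2 value.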
The Gribov mode in hot QCD
(2017)
Poster presentation at the Twenty Third Annual Computational Neuroscience Meeting: CNS*2014, Québec City, Canada, 26-31 July 2014: We study random, strongly heterogeneous recurrent networks of firing rate neurons, introducing the notion of cohorts: groups of co-active neurons that compete for firing with one another and whose presence depends sensitively on the structure of the input. The identities of neurons recruited to and dropped from an active cohort change smoothly with varying input features. We search for network parameter regimes in which the activation of cohorts is robust yet easily switchable by the external input and which exhibit large repertoires of different cohorts. We apply these networks to model the emergence of orientation and direction selectivity in visual cortex. We feed these random networks with a set of harmonic inputs that vary across neurons only in their temporal phase, mimicking the feedforward drive due to a moving grating stimulus. The relationship between the phases, which carries the information about the orientation of the stimulus, determines which cohort of neurons is activated. As a result, the individual neurons acquire non-monotonic orientation tuning curves characterized by high orientation and direction selectivity. This mechanism for the emergence of direction selectivity differs from the classical motion detector scheme, which is based on the nonlinear summation of time-shifted inputs. In our model these two mechanisms coexist in the same network, but can be distinguished by their different frequency and contrast dependences. In general, the mechanism studied here converts a temporal phase sequence into population activity and could therefore also be used to extract and represent various other relevant stimulus features.
The STAR experiment has provided a powerful machinery for studying strange matter for more than two decades. Recently, we developed an express procedure that allows online monitoring of the collected physics data. The high quality of express calibration and reconstruction provides a unique possibility to run the express production and observe strange particles, including mesons, hyperons, resonances and even hypernuclei, almost in real time.
The STAR Beam Energy Scan II program, including fixed-target Au+Au collisions recorded in 2018–2021, is particularly suited to the study of hypernuclei. Light hypernuclei are expected to be abundantly produced in low energy heavy-ion collisions. Measurements of hypernuclei production and their properties will provide information on the hyperon-nucleon interaction, an essential ingredient for understanding the nuclear matter equation of state at high net-baryon densities, such as inside neutron stars.
With the heavy fragment trigger introduced for the 2021 data taking, we were able to run the express production at the STAR High Level Trigger farm. The collected data were sufficient to observe the decay process ⁵ΛHe → ⁴He + p + π⁻ with more than 11σ significance, measure the binding energy as a function of hypernucleus mass, and study hypernuclei decay properties with the Dalitz plot technique.
As microscopic transport models usually have difficulties dealing with in-medium effects in heavy-ion collisions, we present an alternative approach that uses coarse-grained output from transport calculations with the UrQMD model to determine thermal dilepton emission rates. A four-dimensional space-time grid is set up to extract local baryon and energy densities and, from these, temperature and baryon chemical potential. The lepton pair emission is then calculated for each cell of the grid using thermal equilibrium rates. In the current investigation we include the medium-modified ρ spectral function by Eletsky et al., as well as contributions from the QGP and four-pion interactions for high collision energies. First dielectron invariant mass spectra for Au+Au collisions at 1.25 AGeV and for dimuons from In+In at 158 AGeV are shown. At 1.25 AGeV a clear enhancement of the total dilepton yield as compared to a pure transport result is observed. In the latter case, we compare our outcome with the NA60 dimuon excess data. Here good agreement is achieved, but the yield in the low-mass tail is underestimated. In general the results show that the coarse-graining approach gives reasonable results and can cover a broad range of collision energies.
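The coarse-graining step itself amounts to depositing particle energy and baryon number from many transport events onto fixed cells and event-averaging. A one-dimensional toy sketch (invented random data, not UrQMD output; cell size and distributions are arbitrary):

```python
import numpy as np

# 1-D toy of the coarse-graining step (invented data, not UrQMD output):
# accumulate energy and net-baryon number of particles from many events
# into fixed spatial cells, then average over events.
cell_width = 1.0                           # cell size in fm (assumed)
edges = np.arange(-10.0, 11.0, cell_width) # 20 cells covering [-10, 10) fm
n_cells = len(edges) - 1
n_events = 200

rng = np.random.default_rng(2)
e_sum = np.zeros(n_cells)
b_sum = np.zeros(n_cells)
for _ in range(n_events):
    n_part = rng.poisson(100)
    pos = rng.normal(0.0, 3.0, n_part)     # toy particle positions (fm)
    energy = rng.exponential(0.5, n_part)  # toy particle energies (GeV)
    baryon = rng.integers(0, 2, n_part)    # toy net-baryon number per particle
    e_sum += np.histogram(pos, bins=edges, weights=energy)[0]
    b_sum += np.histogram(pos, bins=edges, weights=baryon.astype(float))[0]

# Event-averaged densities per cell; in the full scheme, local T and mu_B
# are then read off from an equation of state at these densities.
e_density = e_sum / (n_events * cell_width)
b_density = b_sum / (n_events * cell_width)
```

In the actual approach the grid is four-dimensional (space-time), and each cell's densities feed the thermal equilibrium dilepton rates.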
The results of microscopic transport calculations of antiproton-nucleus interactions within the GiBUU model are presented. The dominant mechanism of hyperon production is the strangeness exchange processes → γπ and → ΞK. The calculated rapidity spectra of Ξ hyperons are significantly shifted to forward rapidities with respect to the spectra of S = −1 hyperons. We argue that this shift should be a sensitive test of possible exotic mechanisms of antiproton-nucleus annihilation. The production of double-Λ hypernuclei by Ξ⁻ interactions with a secondary target is calculated.
To investigate the formation and propagation of relativistic shock waves in viscous gluon matter we solve the relativistic Riemann problem using a microscopic parton cascade. We demonstrate the transition from ideal to viscous shock waves by varying the shear viscosity to entropy density ratio η/s. Furthermore, we compare our results with those obtained by solving the relativistic causal dissipative fluid equations of Israel and Stewart (IS), in order to probe the validity of IS hydrodynamics. Employing the parton cascade we also investigate the formation of Mach shocks induced by a high-energy gluon traversing viscous gluon matter. For η/s = 0.08 a Mach cone structure is observed, whereas the signal smears out for η/s ≥ 0.32.
Neuronal dynamics differs between wakefulness and sleep stages, as does the cognitive state. In contrast, a single attractor state, the self-organized critical (SOC) state, has been proposed to govern human brain dynamics because of its optimal information coding and processing capabilities. Here we address two open questions. First, does the human brain always operate in this computationally optimal state, even during deep sleep? Second, previous evidence for SOC was based on activity within single brain areas; the interaction between brain areas, however, may be organized differently. Here we asked whether the interaction between brain areas is SOC. ...
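The SOC hypothesis is usually tested through avalanche statistics, for which a critical branching process (branching ratio 1) is the standard toy model: at criticality the avalanche-size distribution follows the hallmark power law P(s) ∝ s^(−3/2). A minimal sketch of this benchmark (our illustration, unrelated to the authors' data):

```python
import numpy as np

def avalanche_size(rng, p=0.5, n_children=2, max_size=100_000):
    """Size of one avalanche in a branching process where each active unit
    tries to activate n_children units with probability p each; the
    branching ratio is p * n_children = 1 here, i.e. critical."""
    active, size = 1, 0
    while active and size < max_size:
        size += active
        active = rng.binomial(active * n_children, p)
    return size

rng = np.random.default_rng(3)
sizes = np.array([avalanche_size(rng) for _ in range(20_000)])
# At criticality: P(size = 1) = (1 - p)^2 = 0.25, and the size distribution
# has a heavy s^(-3/2) tail, so very large avalanches occur.
```

Subcritical dynamics (branching ratio < 1) would instead produce an exponentially truncated size distribution, which is one way departures from SOC across sleep stages could show up.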