Frankfurt Institute for Advanced Studies (FIAS)
ϕ-meson production in In–In collisions at Elab=158A GeV: Evidence for relics of a thermal phase
(2010)
Yields and transverse mass distributions of ϕ mesons reconstructed in the ϕ→μ+μ− channel in In+In collisions at Elab=158A GeV are calculated within an integrated Boltzmann+hydrodynamics hybrid approach based on the Ultrarelativistic Quantum Molecular Dynamics (UrQMD) transport model with an intermediate hydrodynamic stage. The analysis is performed for various centralities, and a comparison with the corresponding NA60 data in the muon channel is presented. We find that the hybrid model, which embeds an intermediate locally equilibrated phase that is subsequently mapped back onto the transport dynamics according to thermal phase-space distributions, gives a good description of the experimental data, in both yield and slope. In contrast, the pure transport calculation fails to capture the general properties of ϕ-meson production: both the yield and the slope of the mT spectra compare poorly with the experimental observations at top SPS energies.
Recent lattice QCD results, compared with a hadron resonance gas model, have shown that hundreds of particles are needed in hadronic models. These extra particles influence both the equation of state and the hadronic interactions within hadron transport models. Here, we introduce the PDG21+ particle list, which contains the most up-to-date database of particles and their properties. We then convert all particle decays into two-body decays so that they are compatible with SMASH, in order to produce a more consistent description of a heavy-ion collision.
Hadron lists based on experimental studies summarized by the Particle Data Group (PDG) are a crucial input for the equation of state and thermal models used in the study of strongly-interacting matter produced in heavy-ion collisions. Modeling of these strongly-interacting systems is carried out via hydrodynamical simulations, which are followed by hadronic transport codes that also require a hadronic list as input. To remain consistent throughout the different stages of modeling of a heavy-ion collision, the same hadron list with its corresponding decays must be used at each step. It has been shown that even the most uncertain states listed in the PDG from 2016 are required to reproduce partial pressures and susceptibilities from Lattice Quantum Chromodynamics with the hadronic list known as the PDG2016+. Here, we update the hadronic list for use in heavy-ion collision modeling by including the latest experimental information for all states listed in the Particle Data Booklet in 2021. We then compare our new list, called PDG2021+, to Lattice Quantum Chromodynamics results and find that it achieves even better agreement with the first principles calculations than the PDG2016+ list. Furthermore, we develop a novel scheme based on intermediate decay channels that allows for only binary decays, such that PDG2021+ will be compatible with the hadronic transport framework SMASH. Finally, we use these results to make comparisons to experimental data and discuss the impact on particle yields and spectra.
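The binary-decay scheme described above can be illustrated with a toy decomposition: an N-body decay is rewritten as a chain of two-body decays through fictitious intermediate states. This is a minimal sketch in Python; the function name, the string-based particle labels, and the way intermediates are formed are illustrative assumptions, not the actual PDG2021+/SMASH machinery.

```python
def to_binary_chain(parent, daughters):
    """Rewrite parent -> d1 d2 ... dN as a chain of two-body decays.

    Each step peels off one daughter and bundles the remaining ones
    into a fictitious intermediate state (here labelled only by its
    particle content).
    """
    chain = []
    remaining = list(daughters)
    current = parent
    while len(remaining) > 2:
        first = remaining.pop(0)
        intermediate = "(" + "+".join(remaining) + ")"
        chain.append((current, [first, intermediate]))
        current = intermediate
    chain.append((current, remaining))
    return chain

# Example: a hypothetical three-body decay N* -> pi pi N becomes two
# sequential two-body steps via the intermediate (pi+N) state.
for mother, products in to_binary_chain("N*", ["pi", "pi", "N"]):
    print(mother, "->", " ".join(products))
```

Every entry in the resulting chain has exactly two decay products, which is the property a strictly binary transport framework requires.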
Various optimality principles have been proposed to explain the characteristics of coordinated eye and head movements during visual orienting behavior. At the same time, researchers have proposed several neural models to underlie the generation of saccades, but these do not include online learning as a mechanism of optimization. Here, we suggest an open-loop neural controller with a local adaptation mechanism that minimizes a proposed cost function. Simulations show that the characteristics of coordinated eye and head movements generated by this model match the experimental data in many respects, including the relationship between amplitude, duration and peak velocity in head-restrained conditions, and the relative contribution of eye and head to the total gaze shift in head-free conditions. Our model is a first step towards bringing together an optimality principle and an incremental local learning mechanism into a unified control scheme for coordinated eye and head movements.
Dendritic spines are crucial for excitatory synaptic transmission, as the size of a spine head correlates with the strength of its synapse. The distribution of spine head sizes follows a lognormal-like distribution, with more small spines than large ones. We analysed the impact of synaptic activity and plasticity on the spine size distribution in adult-born hippocampal granule cells from rats with induced homo- and heterosynaptic long-term plasticity in vivo, and in CA1 pyramidal cells from Munc13-1/Munc13-2 knockout mice with completely blocked synaptic transmission. Neither the induction of extrinsic synaptic plasticity nor the blockage of presynaptic activity degrades the lognormal-like distribution, but each changes its mean, variance and skewness. The skewed distribution develops early in the life of the neuron. Our findings and their computational modelling support the idea that intrinsic synaptic plasticity is sufficient to generate the lognormal-like distribution of spine sizes, while a combination of intrinsic and extrinsic synaptic plasticity maintains it.
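The claim that intrinsic plasticity alone can generate a lognormal-like size distribution can be illustrated with a toy multiplicative model: if each spine's size is repeatedly multiplied by small random factors, its log-size performs a random walk, so the size distribution approaches a lognormal. This is a minimal sketch assuming Gaussian multiplicative noise, not the paper's actual computational model.

```python
import math
import random

def simulate_spine_sizes(n_spines=10000, n_steps=200, noise=0.05, seed=1):
    """Toy intrinsic-plasticity model: each step multiplies every spine
    size by exp(g), g ~ N(0, noise). By the central limit theorem the
    log-sizes become approximately normal, i.e. sizes become lognormal.
    """
    random.seed(seed)
    sizes = [1.0] * n_spines
    for _ in range(n_steps):
        sizes = [s * math.exp(random.gauss(0.0, noise)) for s in sizes]
    return sizes

sizes = simulate_spine_sizes()
# A lognormal is right-skewed: its mean exceeds its median.
mean_size = sum(sizes) / len(sizes)
median_size = sorted(sizes)[len(sizes) // 2]
print(mean_size > median_size)
```

The right skew (mean above median, more small spines than large ones) emerges purely from the multiplicative dynamics, without any extrinsic input.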
We investigate the effect of large magnetic fields on the (2+1)-dimensional reduced-magnetohydrodynamical expansion of hot and dense nuclear matter produced in √sNN = 200 GeV Au+Au collisions. For the sake of simplicity, we consider the case where the magnetic field points in the direction perpendicular to the reaction plane. We also consider this field to be external, with an energy density parametrized as a two-dimensional Gaussian. The width of the Gaussian along the directions orthogonal to the beam axis varies with the centrality of the collision. The dependence of the magnetic field on proper time (τ) for the case of zero electrical conductivity of the QGP is parametrized following Deng et al. [Phys. Rev. C 85, 044907 (2012)], and for finite electrical conductivity following Tuchin [Phys. Rev. C 88, 024911 (2013)]. We solve the equations of motion of ideal hydrodynamics for such an external magnetic field. For collisions with nonzero impact parameter we observe considerable changes in the evolution of the momentum eccentricities of the fireball when comparing the case in which the magnetic field decays in a conducting QGP medium with the case in which no magnetic field is present. The elliptic-flow coefficient v2 of π− is shown to increase in the presence of an external magnetic field, and the increase in v2 is found to depend on the evolution and the initial magnitude of the magnetic field.
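The external-field profile described above (a two-dimensional Gaussian in the transverse plane, damped in proper time) can be sketched as follows. The default numbers and the 1/(1 + τ²/τ_B²) damping are illustrative assumptions, not the actual parametrizations of Deng et al. or Tuchin.

```python
import math

def magnetic_field(tau, x, y, B0=5.0, sigma_x=2.0, sigma_y=3.0, tau_B=1.9):
    """Toy external magnetic field: a two-dimensional Gaussian in the
    transverse plane (x, y), damped in proper time tau.

    All parameter values are placeholders; sigma_x and sigma_y would
    vary with collision centrality, and the temporal damping would
    follow the conductivity-dependent parametrizations cited above.
    """
    transverse = math.exp(-x**2 / (2 * sigma_x**2) - y**2 / (2 * sigma_y**2))
    temporal = 1.0 / (1.0 + (tau / tau_B) ** 2)
    return B0 * transverse * temporal

# The field peaks at the center at tau = 0 and falls off in x, y and tau.
print(magnetic_field(0.0, 0.0, 0.0))  # -> 5.0
```

In a hydrodynamic code, a profile like this would enter as a source term in the equations of motion rather than as a dynamical field.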
The intrinsic complexity of the brain can lead one to set aside issues related to its relationship with the body, but the field of embodied cognition emphasizes that understanding brain function at the system level requires one to address the role of the brain-body interface. It has only recently been appreciated that this interface performs huge amounts of computation that does not have to be repeated by the brain, and thus affords the brain great simplifications in its representations. In effect, the brain's abstract states can refer to coded representations of the world created by the body. But even if the brain can communicate with the world through abstractions, the severe speed limitations of its neural circuitry mean that vast amounts of indexing must be performed during development so that appropriate behavioral responses can be rapidly accessed. One way this could happen is if the brain used a decomposition whereby behavioral primitives could be quickly accessed and combined. This realization motivates our study of independent sensorimotor task solvers, which we call modules, in directing behavior. The issue we focus on here is how an embodied agent can learn to calibrate such individual visuomotor modules while pursuing multiple goals. The biologically plausible standard for module programming is reinforcement given during exploration of the environment. However, this formulation contains a substantial issue when sensorimotor modules are used in combination: the credit for their overall performance must be divided amongst them. We show that this problem can be solved, and that diverse task combinations are beneficial for learning rather than a complication, as is usually assumed. Our simulations show that fast algorithms are available that allot credit correctly and are insensitive to measurement noise.
We estimate the temperature dependence of the bulk viscosity in a relativistic hadron gas. Employing the Green–Kubo formalism in the SMASH (Simulating Many Accelerated Strongly-interacting Hadrons) transport approach, we study different hadronic systems in increasing order of complexity. We analyze the (in)validity of the single-exponential relaxation ansatz for the bulk-channel correlation function and the strong influence of the resonances and their lifetimes. We discuss the difference between the inclusive bulk viscosity of an equilibrated, long-lived system and the effective bulk viscosity of a short-lived mixture like the hadronic phase of relativistic heavy-ion collisions, where processes whose relaxation times exceed the fireball lifetime are excluded from the analysis. This clarifies the differences between previous approaches, which computed the bulk viscosity including or excluding the very slow processes in the hadron gas. We compare our final results with previous hadron-gas calculations and confirm a decreasing trend of the inclusive bulk viscosity over entropy density as temperature increases, whereas the effective bulk viscosity to entropy density ratio, while lower than the inclusive one, shows no strong dependence on temperature.
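The Green–Kubo relation behind this analysis, ζ = (V/T) ∫₀^∞ dt C(t) with C(t) the bulk-channel correlation function, can be checked on a synthetic single-exponential correlator, for which the integral is C(0)·τ_relax analytically. This is a schematic sketch with illustrative numbers; the actual SMASH analysis is considerably more involved.

```python
import math

def green_kubo_bulk_viscosity(correlator, dt, volume, temperature):
    """Green-Kubo estimate zeta = (V/T) * integral_0^inf C(t) dt,
    with the time integral done by the trapezoidal rule over the
    sampled correlator values.
    """
    integral = dt * (0.5 * correlator[0]
                     + sum(correlator[1:-1])
                     + 0.5 * correlator[-1])
    return volume / temperature * integral

# Single-exponential relaxation ansatz: C(t) = C0 * exp(-t / tau_relax),
# whose time integral is C0 * tau_relax analytically.
C0, tau_relax, dt = 0.02, 1.5, 0.01
corr = [C0 * math.exp(-i * dt / tau_relax) for i in range(5000)]
zeta = green_kubo_bulk_viscosity(corr, dt, volume=100.0, temperature=0.15)
print(zeta)  # close to (100/0.15) * C0 * tau_relax = 20.0
```

Truncating the correlator at a finite time, as done here, is exactly where the inclusive/effective distinction enters: cutting the integral at the fireball lifetime removes the slow processes from the estimate.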
ALICE (A Large Ion Collider Experiment) is one of the four large-scale experiments at the Large Hadron Collider (LHC) at CERN. The High Level Trigger (HLT) is an online computing farm which reconstructs events recorded by the ALICE detector in real time. The most computing-intensive task is the reconstruction of the particle trajectories. The main tracking devices in ALICE are the Time Projection Chamber (TPC) and the Inner Tracking System (ITS). The HLT uses a fast GPU-accelerated algorithm for the TPC tracking based on the Cellular Automaton principle and the Kalman filter. ALICE employs gaseous subdetectors, among them the TPC, which are sensitive to environmental conditions such as ambient pressure and temperature; a precise reconstruction of particle trajectories therefore requires the calibration of these detectors. As our first topic, we present some recent optimizations to our GPU-based TPC tracking using the new GPU models we employ for the ongoing and upcoming data-taking periods at the LHC. We also show our new approach to fast ITS standalone tracking. As our second topic, we present improvements to the HLT that facilitate online reconstruction, including a new flat data model and a new data-flow chain. The calibration output is fed back to the reconstruction components of the HLT via a feedback loop. We conclude with an analysis of a first online calibration test under real conditions during the Pb-Pb run in November 2015, which was based on these new features.
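The Kalman-filter idea underlying such track fits can be illustrated with a scalar toy version: each new detector hit is blended with the current estimate, weighted by their variances, which monotonically shrinks the uncertainty on the fitted quantity. This is a schematic sketch of the filtering principle only, not the HLT's actual GPU implementation.

```python
def kalman_update(state, cov, measurement, meas_var):
    """One scalar Kalman-filter update: blend the current estimate with
    a new measurement, weighted inversely by their variances.
    """
    gain = cov / (cov + meas_var)
    new_state = state + gain * (measurement - state)
    new_cov = (1.0 - gain) * cov
    return new_state, new_cov

# Feeding in successive hits along a (toy, static) track parameter
# pulls the estimate toward the hits and shrinks its variance.
state, cov = 0.0, 1.0
for hit in [1.2, 0.9, 1.1, 1.0]:
    state, cov = kalman_update(state, cov, hit, meas_var=0.5)
print(round(state, 3), round(cov, 3))
```

In the real tracker the state is a multi-component track parameter vector and the update involves matrix algebra, but the variance-weighted blending is the same.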
The influence of visual tasks on short- and long-term memory for visual features was investigated using a change-detection paradigm. Subjects completed two tasks: (a) describing objects in natural images, reporting a specific property of each object when a crosshair appeared above it, and (b) viewing a modified version of each scene and detecting which of the previously described objects had changed. When tested over short delays (seconds), no task effects were found. Over longer delays (minutes), we found that the describing task influenced which types of changes were detected in a variety of explicit and incidental memory experiments. Furthermore, we found surprisingly high performance in the incidental memory experiment, suggesting that simple tasks are sufficient to instill long-lasting visual memories.
Keywords: visual working memory, natural scenes, natural tasks, change detection