15 conference proceedings, published in 2019, in English, with full text available (Institute: Physik).
Keywords: Diffusion; Heavy Ion Collisions; Multiple Charge Conservation; Transport Theory; baryon stopping; hadron transport; string fragmentation
ALICE is one of the four major LHC experiments at CERN. When the accelerator enters the Run 3 data-taking period, starting in 2021, ALICE expects almost 100 times more central Pb-Pb collisions than it records today, resulting in a large increase in data throughput. To cope with this new challenge, the collaboration has had to extensively rethink the whole data-processing chain, with a tighter integration between the online and offline computing worlds. This system, code-named ALICE O2, is being developed in collaboration with the FAIR experiments at GSI. It is based on the ALFA framework, which provides a generalized implementation of the ALICE High Level Trigger approach, designed around distributed software entities that coordinate and communicate via message passing.
We highlight our efforts to integrate ALFA within the ALICE O2 environment. We analyze the challenges arising from the different running environments for production and development, and derive requirements for a flexible and modular software framework. In particular, we present the ALICE O2 Data Processing Layer, which addresses ALICE-specific requirements in terms of the data model. The main goal is to reduce the complexity of developing algorithms and managing a distributed system, thereby significantly simplifying the work of the large majority of ALICE users.
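The message-passing design described above can be illustrated in miniature. The following sketch is purely illustrative and assumes nothing about the real framework's API: it models independent "devices" that coordinate only by exchanging messages through queues, whereas the actual ALFA/O2 system uses separate processes and network transports rather than Python threads.

```python
# Toy pipeline of independent "devices" coordinating purely via message
# passing, loosely in the spirit of the ALFA/O2 approach. All names and
# data shapes here are invented for illustration.
import threading
import queue

STOP = object()  # sentinel marking end-of-stream


def source(out_q, n_events):
    """Emit n_events dummy messages downstream, then a stop marker."""
    for i in range(n_events):
        out_q.put({"id": i, "payload": i * 10})
    out_q.put(STOP)


def processor(in_q, out_q):
    """Transform each message independently; forward the stop marker."""
    while True:
        msg = in_q.get()
        if msg is STOP:
            out_q.put(STOP)
            return
        msg["payload"] += 1  # stand-in for a reconstruction step
        out_q.put(msg)


def sink(in_q, results):
    """Collect processed payloads in arrival order."""
    while True:
        msg = in_q.get()
        if msg is STOP:
            return
        results.append(msg["payload"])


def run_pipeline(n_events=5):
    q1, q2, results = queue.Queue(), queue.Queue(), []
    stages = [
        threading.Thread(target=source, args=(q1, n_events)),
        threading.Thread(target=processor, args=(q1, q2)),
        threading.Thread(target=sink, args=(q2, results)),
    ]
    for t in stages:
        t.start()
    for t in stages:
        t.join()
    return results
```

The point of the pattern is that each stage knows nothing about the others except the message format, which is what allows such entities to be distributed across processes or machines.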
The GSI High Energy Beam Transfer lines (HEST) link the SIS18 synchrotron with two storage rings (the Experimental Storage Ring and Cryring) and six experimental caves. Recent upgrades to the HEST beam instrumentation enable precise measurements of beam properties along the lines and allow for faster and more accurate beam setup on targets. Preliminary results of measurements performed during the 2018 and 2019 runs are presented here, with a focus on response matrix measurements and quadrupole scans performed on the HADES beam line. Measurement errors and future improvements are discussed.
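A response matrix measurement of the kind mentioned above can be sketched as follows. This is a hypothetical toy: the "machine" is a fixed linear model with invented numbers standing in for real BPM readings, and the measurement proceeds by kicking one steerer at a time and recording the orbit shift at each BPM.

```python
# Finite-difference measurement of an orbit response matrix against a
# fake linear machine model. All optics numbers are invented.

# invented true optics: orbit_shift[bpm] = sum_j R_TRUE[bpm][j] * kick[j]
R_TRUE = [[2.0, -0.5, 0.1],
          [0.3,  1.5, -0.2]]
REF_ORBIT = [0.4, -0.1]  # reference orbit at the 2 BPMs (mm)


def read_bpms(kicks):
    """Simulated BPM readings (mm) for a given set of steerer kicks (mrad)."""
    return [ref + sum(r * k for r, k in zip(row, kicks))
            for ref, row in zip(REF_ORBIT, R_TRUE)]


def measure_response_matrix(n_correctors, delta=0.1):
    """Estimate R[i][j] = d(orbit_i)/d(kick_j) by kicking one corrector at a time."""
    baseline = read_bpms([0.0] * n_correctors)
    columns = []
    for j in range(n_correctors):
        kicks = [0.0] * n_correctors
        kicks[j] = delta
        shifted = read_bpms(kicks)
        columns.append([(s - b) / delta for s, b in zip(shifted, baseline)])
    # transpose so rows are BPMs, columns are correctors
    return [list(row) for row in zip(*columns)]
```

For a linear machine the finite difference recovers the model matrix exactly (up to floating-point noise); on a real line the same procedure averages over noise and nonlinearity.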
The present study focuses on optimization of the beam line from the heavy-ion synchrotron SIS18 to the HADES experiment. BOBYQA (Bound Optimization BY Quadratic Approximation) solves bound-constrained optimization problems without using derivatives of the objective function. Bayesian optimization is another strategy for global optimization of costly, noisy functions that does not require derivatives. A Python programming interface to MADX allows the use of Python implementations of BOBYQA and the Bayesian method. This made it possible to use tracking simulations with MADX to determine the loss budget for each lattice setting during the optimization and to compare the two optimization methods.
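The essential shape of such a derivative-free, bound-constrained optimization loop can be sketched with a toy example. Note the heavy assumptions: the study used BOBYQA and Bayesian optimization with MADX tracking as the objective, whereas here an invented quadratic stands in for the loss budget and a naive random search stands in for the solver, purely to show the interface (objective, bounds, no gradients).

```python
# Minimal derivative-free, bound-constrained optimization sketch.
# The objective and the solver are both invented stand-ins.
import random


def loss_budget(quad_strengths):
    """Toy stand-in for 'particles lost' as a function of two quad settings."""
    k1, k2 = quad_strengths
    return (k1 - 0.3) ** 2 + (k2 + 0.1) ** 2


def random_search(objective, bounds, n_iter=2000, seed=42):
    """Sample uniformly inside the bounds; keep the best point seen."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f


bounds = [(-1.0, 1.0), (-1.0, 1.0)]  # allowed quadrupole strength range
best_x, best_f = random_search(loss_budget, bounds)
```

A real BOBYQA or Bayesian optimizer would replace `random_search` with a model-based sampling strategy, which matters when each objective evaluation is an expensive tracking run.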
Over the last 20 years, modern heuristic algorithms and machine learning have been used increasingly for many purposes in accelerator technology and physics. Since computing power has become less and less of a limiting factor, these tools have become part of the physics community's standard toolkit [1-5]. This paper describes the construction of an algorithm that generates an optimized lattice design for transfer lines while taking into account the restrictions that usually limit design options in practice. The developed algorithm has been applied to the existing SIS18-to-HADES transfer line at GSI.
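One common family of heuristics for such constrained design problems is evolutionary search. The sketch below is not the paper's algorithm: the figure of merit, the design restriction, and every parameter are invented to illustrate the general pattern of evolving candidate lattice settings under a hard constraint, where a real application would score each candidate with an optics code.

```python
# Toy evolutionary search over lattice parameters with a hard constraint.
# Objective, constraint, and all numbers are invented for illustration.
import random


def figure_of_merit(params):
    """Invented objective to be minimized (optimum at 0.5 per parameter)."""
    return sum((p - 0.5) ** 2 for p in params)


def feasible(params):
    """Invented design restriction, e.g. a total integrated-strength limit."""
    return sum(abs(p) for p in params) <= 3.0


def evolve(n_params=4, pop_size=30, n_gen=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(n_gen):
        # rank feasible candidates and keep the better half as parents
        scored = sorted((figure_of_merit(p), p) for p in pop if feasible(p))
        parents = [p for _, p in scored[: pop_size // 2]] or pop
        pop = []
        while len(pop) < pop_size:
            a, b = (rng.sample(parents, 2) if len(parents) > 1
                    else (parents[0], parents[0]))
            # crossover by averaging, plus Gaussian mutation
            child = [(x + y) / 2 + rng.gauss(0, 0.05) for x, y in zip(a, b)]
            pop.append(child)
    best = min((p for p in pop if feasible(p)),
               key=figure_of_merit, default=pop[0])
    return best, figure_of_merit(best)
```

The design choice worth noting is that infeasible candidates are simply excluded from reproduction, which is the crudest way to honor hard restrictions; penalty terms or repair operators are common refinements.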
We discuss the diffusion currents occurring in a dilute system and show that the charge currents depend not only on gradients of the corresponding charge density but also on the other conserved charges in the system: the diffusion currents are therefore coupled, and gradients in one charge generate dissipative currents in another. In this approach, we model the Navier-Stokes term of the generated currents with a diffusion coefficient matrix, in which the diagonal entries are the usual diffusion coefficients and the off-diagonal entries correspond to the coupling between different diffusion currents. We evaluate the complete diffusion matrix for a specific hadron gas and for a simplified quark-gluon gas, including baryon, electric, and strangeness charge. We find that the off-diagonal entries can be of the same order of magnitude as the diagonal ones.
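The coupled structure described above can be written schematically as follows; the notation here is chosen for illustration and may differ from the paper's conventions.

```latex
% Schematic coupled Navier-Stokes diffusion currents: each conserved
% charge q, q' in {B, Q, S} contributes through the diffusion matrix.
\begin{equation}
  j_q^{\mu}
  = \sum_{q' \in \{B,\,Q,\,S\}} \kappa_{qq'}\,
    \nabla^{\mu} \frac{\mu_{q'}}{T},
  \qquad q \in \{B,\,Q,\,S\}.
\end{equation}
% Diagonal entries kappa_{qq} are the usual diffusion coefficients;
% off-diagonal entries kappa_{qq'} (q != q') couple gradients of one
% charge to dissipative currents of another.
```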
The changing shape of the net-proton rapidity spectrum over the SPS energy range still lacks a theoretical explanation. In this work, a model for string excitation and string fragmentation is implemented for the description of high-energy collisions within a hadronic transport approach. The free parameters of the string model are tuned to reproduce the experimentally measured particle production in proton-proton collisions. With the parameters fixed, we advance to calculations for heavy-ion collisions, where the measured proton rapidity spectrum changes from a single-peak to a double-peak structure with increasing beam energy. We present calculations of proton rapidity spectra at several SPS energies in heavy-ion collisions and obtain good qualitative agreement with the experimental findings. In future work, the formation process of string fragments will be studied in detail with the aim of reproducing the measurements quantitatively.
The differences between contemporary Monte Carlo generators of high energy hadronic interactions are discussed and their impact on the interpretation of experimental data on ultra-high energy cosmic rays (UHECRs) is studied. Key directions for further model improvements are outlined. The prospect for a coherent interpretation of the data in terms of the UHECR composition is investigated.
The Projectile Spectator Detector (PSD) of the CBM experiment at the future FAIR facility is a compensating lead-scintillator calorimeter designed to measure the energy distribution of forward-going projectile nucleons and nuclear fragments (reaction spectators) produced close to beam rapidity. The detector performance for centrality and reaction-plane determination is reviewed based on Monte Carlo simulations of gold-gold collisions with four different heavy-ion event generators. The PSD energy resolution and the linearity of the response, measured at the CERN PS for a PSD supermodule consisting of 9 modules, are presented. Predictions for the radiation conditions of the calorimeter at CBM and response measurements of one PSD module equipped with neutron-irradiated MPPCs for the light readout are discussed.