ϒ production in p–Pb interactions is studied at the centre-of-mass energy per nucleon–nucleon collision √sNN = 8.16 TeV with the ALICE detector at the CERN LHC. The measurement is performed by reconstructing bottomonium resonances via their dimuon decay channel, in the centre-of-mass rapidity intervals 2.03 < ycms < 3.53 and −4.46 < ycms < −2.96, down to zero transverse momentum. In this work, results on the ϒ(1S) production cross section as a function of rapidity and transverse momentum are presented. The corresponding nuclear modification factor shows a suppression of the ϒ(1S) yields with respect to pp collisions, both at forward and backward rapidity. This suppression is stronger in the low transverse momentum region and shows no significant dependence on the centrality of the interactions. Furthermore, the ϒ(2S) nuclear modification factor is evaluated, suggesting a suppression similar to that of the ϒ(1S). A first measurement of the ϒ(3S) has also been performed. Finally, results are compared with previous ALICE measurements in p–Pb collisions at √sNN = 5.02 TeV and with theoretical calculations.
Measurements of K∗(892)0 and φ(1020) resonance production in Pb–Pb and pp collisions at √sNN = 5.02 TeV with the ALICE detector at the Large Hadron Collider are reported. The resonances are measured at midrapidity (|y| < 0.5) via their hadronic decay channels and the transverse momentum (pT) distributions are obtained for various collision centrality classes up to pT = 20 GeV/c. The pT-integrated yield ratio K∗(892)0/K in Pb–Pb collisions shows significant suppression relative to pp collisions and decreases towards more central collisions. In contrast, the φ(1020)/K ratio does not show any suppression. Furthermore, the measured K∗(892)0/K ratio in central Pb–Pb collisions is significantly suppressed with respect to the expectations based on a thermal model calculation, while the φ(1020)/K ratio agrees with the model prediction. These measurements are an experimental demonstration of rescattering of K∗(892)0 decay products in the hadronic phase of the collisions. The K∗(892)0/K yield ratios in Pb–Pb and pp collisions are used to estimate the time duration between chemical and kinetic freeze-out, which is found to be ∼ 4–7 fm/c for central collisions. The pT-differential ratios of K∗(892)0/K, φ(1020)/K, K∗(892)0/π, φ(1020)/π, p/K∗(892)0 and p/φ(1020) are also presented for Pb–Pb and pp collisions at √sNN = 5.02 TeV. These ratios show that the rescattering effect is predominantly a low-pT phenomenon.
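The freeze-out duration estimate quoted above rests on the exponential decay of the short-lived K∗(892)0 during the hadronic phase: resonances decaying before kinetic freeze-out have their daughters rescattered and are lost from the reconstructed yield. A minimal sketch of the arithmetic, using illustrative ratio values (not the measured ones) and the K∗(892)0 lifetime:

```python
import math

# If rescattering suppresses the K*(892)0/K ratio between pp and central
# Pb-Pb collisions, the ratio falls roughly exponentially with the duration
# of the hadronic phase (in the resonance rest frame):
#   R_PbPb / R_pp ~ exp(-dt / tau),  so  dt ~ -tau * ln(R_PbPb / R_pp)

TAU_KSTAR = 4.16  # fm/c; hbar*c / Gamma with Gamma(K*(892)0) ~ 47.4 MeV

def freezeout_duration(ratio_pbpb, ratio_pp, tau=TAU_KSTAR):
    """Estimate the chemical-to-kinetic freeze-out duration from yield ratios."""
    return -tau * math.log(ratio_pbpb / ratio_pp)

# Hypothetical ratio values, chosen only to illustrate the arithmetic:
dt = freezeout_duration(ratio_pbpb=0.13, ratio_pp=0.33)
print(f"estimated duration: {dt:.1f} fm/c")  # -> ~3.9 fm/c for these inputs
```

This simple exponential picture neglects regeneration of K∗(892)0 in the hadronic phase, so it gives a lower limit on the duration; the inputs above are placeholders, not the published ratios.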
Significant reductions in stratospheric ozone occur inside the polar vortices each spring when chlorine radicals produced by heterogeneous reactions on cold particle surfaces in winter destroy ozone mainly in two catalytic cycles, the ClO dimer cycle and the ClO/BrO cycle. Chlorofluorocarbons (CFCs), which are responsible for most of the chlorine currently present in the stratosphere, have been banned by the Montreal Protocol and its amendments, and the ozone layer is predicted to recover to 1980 levels within the next few decades. During the same period, however, climate change is expected to alter the temperature, circulation patterns and chemical composition in the stratosphere, and possible geo-engineering ventures to mitigate climate change may lead to additional changes. Realistically predicting the response of the ozone layer to such influences requires the correct representation of all relevant processes. The European project RECONCILE has comprehensively addressed remaining questions in the context of polar ozone depletion, with the objective of quantifying the rates of some of the most relevant, yet still uncertain, physical and chemical processes. To this end, RECONCILE used a broad approach of laboratory experiments, two field missions in the Arctic winter 2009/10 employing the high altitude research aircraft M55-Geophysica and an extensive match ozone sonde campaign, as well as microphysical and chemical transport modelling and data assimilation.
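The two catalytic cycles named above can be written out explicitly. The ClO dimer cycle, for instance, destroys two ozone molecules per pass while regenerating the chlorine radicals:

```latex
\begin{align*}
\mathrm{ClO} + \mathrm{ClO} + \mathrm{M} &\rightarrow \mathrm{ClOOCl} + \mathrm{M} \\
\mathrm{ClOOCl} + h\nu &\rightarrow \mathrm{Cl} + \mathrm{ClOO} \\
\mathrm{ClOO} + \mathrm{M} &\rightarrow \mathrm{Cl} + \mathrm{O_2} + \mathrm{M} \\
2\,(\mathrm{Cl} + \mathrm{O_3} &\rightarrow \mathrm{ClO} + \mathrm{O_2}) \\[2pt]
\text{net:}\qquad 2\,\mathrm{O_3} &\rightarrow 3\,\mathrm{O_2}
\end{align*}
```

Because the cycle requires sunlight for the ClOOCl photolysis step, ozone loss peaks in spring, once chlorine activated on cold particle surfaces during the dark winter is exposed to returning sunlight.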
Some of the main outcomes of RECONCILE are as follows: (1) Vortex meteorology: the 2009/10 Arctic winter was unusually cold at stratospheric levels during the six-week period from mid-December 2009 until the end of January 2010, with reduced transport and mixing across the polar vortex edge; the stable polar vortex, influenced by dynamic processes in the troposphere, led to unprecedented, synoptic-scale stratospheric regions with temperatures below the frost point; in these regions stratospheric ice clouds have been observed, extending over more than 10⁶ km² for more than three weeks. (2) Particle microphysics: heterogeneous nucleation of nitric acid trihydrate (NAT) particles in the absence of ice has been unambiguously demonstrated; conversely, the synoptic-scale ice clouds also appear to nucleate heterogeneously; a variety of possible heterogeneous nuclei has been characterised by chemical analysis of the non-volatile fraction of the background aerosol; substantial formation of solid particles and denitrification via their sedimentation has been observed, and model parameterizations have been improved. (3) Chemistry: strong evidence has been found for significant chlorine activation not only on polar stratospheric clouds (PSCs) but also on cold binary aerosol; laboratory experiments and field data on the ClOOCl photolysis rate and other kinetic parameters have been shown to be consistent with an adequate degree of certainty; no evidence has been found that would support the existence of yet unknown chemical mechanisms making a significant contribution to polar ozone loss. (4) Global modelling: results from process studies have been implemented in a prognostic chemistry climate model (CCM); simulations with improved parameterisations of processes relevant for polar ozone depletion are evaluated against satellite data and other long-term records using data assimilation and detrended fluctuation analysis.
Finally, measurements and process studies within RECONCILE were also applied to the winter 2010/11, when special meteorological conditions led to the highest chemical ozone loss ever observed in the Arctic. In addition to quantifying the 2010/11 ozone loss and understanding its causes, including possible connections to climate change, its impacts were addressed, such as changes in surface ultraviolet (UV) radiation in the densely populated northern mid-latitudes.
The international research project RECONCILE has addressed central questions regarding polar ozone depletion, with the objective of quantifying some of the most relevant yet still uncertain physical and chemical processes and thereby improving prognostic modelling capabilities to realistically predict the response of the ozone layer to climate change. This overview paper outlines the scope and the general approach of RECONCILE, and it provides a summary of observations and modelling in 2010 and 2011 that have generated a dataset that is in many respects unprecedented for studying processes in the Arctic winter stratosphere. Principally, it summarises important outcomes of RECONCILE including (i) better constraints on, and enhanced consistency of, the set of parameters governing catalytic ozone destruction cycles, (ii) a better understanding of the role of cold binary aerosols in heterogeneous chlorine activation, (iii) an improved scheme of polar stratospheric cloud (PSC) processes that includes heterogeneous nucleation of nitric acid trihydrate (NAT) and ice on non-volatile background aerosol, leading to better model parameterisations with respect to denitrification, and (iv) long transient simulations with a chemistry-climate model (CCM) updated based on the results of RECONCILE that better reproduce past ozone trends in Antarctica and are deemed to produce more reliable predictions of future ozone trends. The process studies and the global simulations conducted in RECONCILE show that in the Arctic, uncertainties in the chemical and microphysical processes governing ozone depletion are now clearly smaller than the sensitivity to dynamic variability.
Non-standard errors
(2021)
In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in sample estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors. To study them, we let 164 teams test six hypotheses on the same sample. We find that non-standard errors are sizeable, on par with standard errors. Their size (i) co-varies only weakly with team merits, reproducibility, or peer rating, (ii) declines significantly after peer feedback, and (iii) is underestimated by participants.
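The distinction between the two error sources can be illustrated with a toy simulation (not the paper's data): each hypothetical team analyses the same sample, but its pipeline choices shift its point estimate, so the dispersion of estimates across teams, the "non-standard error", can rival the conventional within-team standard error.

```python
import random
import statistics

# Toy illustration of non-standard errors: 164 hypothetical teams test the
# same hypothesis on the same sample. Analysis choices (the EGP) scatter
# their point estimates around the true effect.
random.seed(7)

true_effect = 0.5
n_teams = 164
egp_sd = 0.10  # assumed spread induced by differing analysis choices

# Each team reports one point estimate; its own (sampling-based) standard
# error is taken here to be of comparable size for illustration.
estimates = [random.gauss(true_effect, egp_sd) for _ in range(n_teams)]
standard_error = 0.10

# The non-standard error is the cross-team dispersion of point estimates.
non_standard_error = statistics.stdev(estimates)
print(f"non-standard error: {non_standard_error:.3f} "
      f"(vs standard error {standard_error:.3f})")
```

All numbers here (`true_effect`, `egp_sd`, the common `standard_error`) are hypothetical; the sketch only shows how cross-team dispersion is a distinct, measurable quantity on the same scale as a standard error.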
Investigators in the cognitive neurosciences have turned to Big Data to address persistent replication and reliability issues by increasing sample sizes, statistical power, and representativeness of data. While there is tremendous potential to advance science through open data sharing, these efforts unveil a host of new questions about how to integrate data arising from distinct sources and instruments. We focus on the most frequently assessed area of cognition, memory testing, and demonstrate a process for reliable data harmonization across three common measures. We aggregated raw data from 53 studies from around the world which measured at least one of three distinct verbal learning tasks, totaling N = 10,505 healthy and brain-injured individuals. A mega-analysis was conducted using empirical Bayes harmonization to isolate and remove site effects, followed by linear models which adjusted for common covariates. After corrections, a continuous item response theory (IRT) model estimated each individual subject's latent verbal learning ability while accounting for item difficulties. Harmonization significantly reduced inter-site variance by 37% while preserving covariate effects. The effects of age, sex, and education on scores were found to be highly consistent across memory tests. IRT methods for equating scores across auditory verbal learning tests (AVLTs) agreed with held-out data of dually-administered tests, and these tools are made available for free online. This work demonstrates that large-scale data sharing and harmonization initiatives can offer opportunities to address reproducibility and integration challenges across the behavioral sciences.
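The core idea of empirical Bayes harmonization can be sketched in a few lines: estimate each site's additive offset from the grand mean, shrink it according to how much evidence the site provides, then subtract it. This is a simplified illustration in the spirit of ComBat-style harmonization, not the pipeline used in the study:

```python
import statistics

def harmonize(scores_by_site):
    """Remove shrunken per-site mean offsets from raw scores (sketch only)."""
    all_scores = [s for scores in scores_by_site.values() for s in scores]
    grand_mean = statistics.mean(all_scores)
    resid_var = statistics.pvariance(all_scores)  # crude noise-variance proxy
    # Prior variance of site offsets, estimated from the observed site means:
    site_means = {k: statistics.mean(v) for k, v in scores_by_site.items()}
    tau2 = statistics.pvariance(list(site_means.values())) or 1e-9

    out = {}
    for site, scores in scores_by_site.items():
        n = len(scores)
        offset = site_means[site] - grand_mean
        # Empirical Bayes shrinkage: small or noisy sites get their offset
        # pulled toward zero; large, consistent sites keep most of it.
        shrink = tau2 / (tau2 + resid_var / n)
        out[site] = [s - shrink * offset for s in scores]
    return out

# Hypothetical two-site example: site B scores systematically higher.
sites = {"A": [10.0, 12.0, 11.0, 13.0], "B": [15.0, 17.0, 16.0, 18.0]}
harmonized = harmonize(sites)
```

After harmonization the site means move toward each other while within-site ordering is preserved, which is the behaviour the abstract's "reduced inter-site variance while preserving covariate effects" refers to. A real pipeline would model covariates before estimating site effects and harmonize variances as well as means.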
The formation of acquired drug resistance is a major reason for the failure of anti-cancer therapies after initial response. Here, we introduce a novel model of acquired oxaliplatin resistance, a sub-line of the non-MYCN-amplified neuroblastoma cell line SK-N-AS that was adapted to growth in the presence of 4000 ng/mL oxaliplatin (SK-N-ASrOXALI4000). SK-N-ASrOXALI4000 cells displayed enhanced chromosomal aberrations compared to SK-N-AS, as indicated by 24-chromosome fluorescence in situ hybridisation. Moreover, SK-N-ASrOXALI4000 cells were resistant not only to oxaliplatin but also to the two other commonly used anti-cancer platinum agents cisplatin and carboplatin. SK-N-ASrOXALI4000 cells exhibited a stable resistance phenotype that was not affected by culturing the cells for 10 weeks in the absence of oxaliplatin. Interestingly, SK-N-ASrOXALI4000 cells showed no cross-resistance to gemcitabine and increased sensitivity to doxorubicin and UVC radiation, alternative treatments that, like platinum drugs, target DNA integrity. Notably, UVC-induced DNA damage is thought to be predominantly repaired by nucleotide excision repair, which has also been described as the main repair system for oxaliplatin-induced DNA damage. SK-N-ASrOXALI4000 cells were also more sensitive to lysis by influenza A virus, a candidate for oncolytic therapy, than SK-N-AS cells. In conclusion, we introduce a novel oxaliplatin resistance model. The oxaliplatin resistance mechanisms in SK-N-ASrOXALI4000 cells appear to be complex and not directly dependent on enhanced DNA repair capacity. Models of oxaliplatin resistance are of particular relevance since research on platinum drugs has so far predominantly focused on cisplatin and carboplatin.
The polarization of inclusive J/ψ and ϒ(1S) produced in Pb–Pb collisions at √sNN = 5.02 TeV at the LHC is measured with the ALICE detector. The study is carried out by reconstructing the quarkonium through its decay to muon pairs in the rapidity region 2.5 < y < 4 and measuring the polar and azimuthal angular distributions of the muons. The polarization parameters λθ, λφ and λθφ are measured in the helicity and Collins-Soper reference frames, in the transverse momentum intervals 2 < pT < 10 GeV/c and pT < 15 GeV/c for the J/ψ and ϒ(1S), respectively. The polarization parameters for the J/ψ are found to be compatible with zero, within a maximum of about two standard deviations at low pT, for both reference frames and over the whole pT range. The values are compared with the corresponding results obtained for pp collisions at √s = 7 and 8 TeV in a similar kinematic region by the ALICE and LHCb experiments. Although with much larger uncertainties, the polarization parameters for ϒ(1S) production in Pb–Pb collisions are also consistent with zero.
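The polarization parameters quoted here are defined through the standard angular distribution of the decay muons with respect to the chosen reference frame (helicity or Collins-Soper):

```latex
\frac{\mathrm{d}^2 N}{\mathrm{d}\cos\theta\,\mathrm{d}\varphi} \propto
\frac{1}{3+\lambda_\theta}
\left(1 + \lambda_\theta \cos^2\theta
        + \lambda_\varphi \sin^2\theta \cos 2\varphi
        + \lambda_{\theta\varphi} \sin 2\theta \cos\varphi\right)
```

Here θ and φ are the polar and azimuthal emission angles of the positive muon in the quarkonium rest frame; λθ = λφ = λθφ = 0 corresponds to an unpolarized state, consistent with the results reported above.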
The elliptic and triangular flow coefficients v2 and v3 of prompt D0, D+, and D∗+ mesons were measured at midrapidity (|y| < 0.8) in Pb–Pb collisions at the centre-of-mass energy per nucleon pair of √sNN = 5.02 TeV with the ALICE detector at the LHC. The D mesons were reconstructed via their hadronic decays in the transverse momentum interval 1 < pT < 36 GeV/c in central (0–10%) and semi-central (30–50%) collisions. Compared to pions, protons, and J/ψ mesons, the average D-meson vn harmonics are compatible within uncertainties with a mass hierarchy for pT ≲ 3 GeV/c, and are similar to those of charged pions for higher pT. The coupling of the charm quark to the light quarks in the underlying medium is further investigated with the application of the event-shape engineering (ESE) technique to the D-meson v2 and pT-differential yields. The D-meson v2 is correlated with the average bulk elliptic flow in both central and semi-central collisions. Within the current precision, the ratios of per-event D-meson yields in the ESE-selected and unbiased samples are found to be compatible with unity. All the measurements are found to be reasonably well described by theoretical calculations including the effects of charm-quark transport and the recombination of charm quarks with light quarks in a hydrodynamically expanding medium.
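The flow coefficients vn measured here are the Fourier coefficients of the azimuthal distribution of produced particles relative to the collision symmetry planes:

```latex
\frac{\mathrm{d}N}{\mathrm{d}\varphi} \propto
1 + 2\sum_{n=1}^{\infty} v_n \cos\!\big[n(\varphi - \Psi_n)\big],
\qquad
v_n = \big\langle \cos\!\big[n(\varphi - \Psi_n)\big] \big\rangle
```

with φ the particle azimuthal angle and Ψn the n-th harmonic symmetry-plane angle; v2 (elliptic) and v3 (triangular) are the second and third coefficients of this expansion.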
Multiplicity dependence of inclusive J/ψ production at midrapidity in pp collisions at √s = 13 TeV
(2020)
Measurements of the inclusive J/ψ yield as a function of charged-particle pseudorapidity density dNch/dη in pp collisions at √s = 13 TeV with ALICE at the LHC are reported. The J/ψ meson yield is measured at midrapidity (|y| < 0.9) in the dielectron channel, for events selected based on the charged-particle multiplicity at midrapidity (|η| < 1) and at forward rapidity (−3.7 < η < −1.7 and 2.8 < η < 5.1); both observables are normalized to their corresponding averages in minimum bias events. The increase of the normalized J/ψ yield with normalized dNch/dη is significantly stronger than linear and dependent on the transverse momentum. The data are compared to theoretical predictions, which describe the observed trends well, albeit not always quantitatively.