Precise knowledge of the material budget is fundamental for measurements of direct photon production using the photon conversion method, as it directly impacts the total systematic uncertainty. Moreover, it influences many aspects of the charged-particle reconstruction performance. In this article, two procedures to determine data-driven corrections to the material-budget description in ALICE simulation software are developed. One is based on the precise knowledge of the gas composition in the Time Projection Chamber. The other is based on the robustness of the ratio between the produced number of photons and charged particles, which holds to a large extent due to the approximate isospin symmetry in the number of produced neutral and charged pions. Both methods are applied to ALICE data, allowing for a reduction of the overall material-budget systematic uncertainty from 4.5% down to 2.5%. Using these methods, a locally correct material budget is also achieved. The two proposed methods are generic and can be applied to any experiment in a similar fashion.
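As a toy illustration of the second method (this is not ALICE code, and all yields below are invented), the correction can be viewed as a double ratio: reconstructed conversion photons per charged particle in data, divided by the same quantity in simulation.

```python
# Illustrative sketch of the photon-to-charged-particle ratio method
# (hypothetical numbers, not ALICE software): since isospin symmetry
# fixes the produced photon / charged-particle ratio, a data/MC
# difference in the *reconstructed* ratio can be attributed to a
# mis-modelled material budget.

def material_budget_scale(n_gamma_data, n_ch_data, n_gamma_mc, n_ch_mc):
    """Return the multiplicative correction to the simulated material
    budget, assuming the conversion probability scales linearly with
    the traversed material (thin-converter approximation)."""
    ratio_data = n_gamma_data / n_ch_data  # conversions per charged particle, data
    ratio_mc = n_gamma_mc / n_ch_mc        # same quantity in simulation
    return ratio_data / ratio_mc

# Invented yields, for illustration only:
scale = material_budget_scale(1.02e5, 2.0e6, 1.00e5, 2.0e6)
```

A scale above 1 would indicate that the simulation underestimates the material budget locally, and vice versa.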
Long- and short-range correlations for pairs of charged particles are studied via two-particle angular correlations in pp collisions at √s = 13 TeV and p–Pb collisions at √sNN = 5.02 TeV. The correlation functions are measured as a function of relative azimuthal angle ∆φ and pseudorapidity separation ∆η for pairs of primary charged particles within the pseudorapidity interval |η| < 0.9 and the transverse-momentum interval 1 < pT < 4 GeV/c. Flow coefficients are extracted for the long-range correlations (1.6 < |∆η| < 1.8) in various high-multiplicity event classes using the low-multiplicity template fit method. The method is used to subtract the enhanced yield of away-side jet fragments in high-multiplicity events. These results show decreasing flow signals toward lower multiplicity events. Furthermore, the flow coefficients for events with hard probes, such as jets or leading particles, do not exhibit any significant changes compared to those obtained from high-multiplicity events without any specific event selection criteria. The results are compared with hydrodynamic-model calculations, and it is found that a better understanding of the initial conditions is necessary to describe the results, particularly for low-multiplicity events.
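The template-fit subtraction can be sketched in a few lines. The following toy (not the ALICE analysis code; all inputs are synthetic and invented) models the high-multiplicity correlation as a scaled low-multiplicity template, which carries the jet peaks, plus a flow-modulated pedestal, Y_HM(∆φ) = F·Y_LM(∆φ) + G·(1 + 2 v2² cos 2∆φ), and solves for the parameters by least squares.

```python
import numpy as np

# Toy sketch of the low-multiplicity template fit (synthetic inputs,
# not ALICE code). The model:
#   Y_HM(dphi) = F * Y_LM(dphi) + G * (1 + 2 * v2**2 * cos(2 * dphi))

dphi = np.linspace(-np.pi / 2, 3 * np.pi / 2, 48, endpoint=False)

# Synthetic low-multiplicity template: near-side and away-side jet peaks
# on top of a flat pedestal.
y_lm = (1.0 + 0.5 * np.exp(-0.5 * (dphi / 0.4) ** 2)
            + 0.3 * np.exp(-0.5 * ((dphi - np.pi) / 0.8) ** 2))

# Build a synthetic high-multiplicity correlation with known parameters.
F_true, G_true, v2_true = 0.8, 2.0, 0.06
y_hm = F_true * y_lm + G_true * (1.0 + 2.0 * v2_true**2 * np.cos(2 * dphi))

# The model is linear in (F, G, 2*G*v2**2), so ordinary least squares
# recovers all three; the flow coefficient follows from the cosine amplitude.
A = np.column_stack([y_lm, np.ones_like(dphi), np.cos(2 * dphi)])
F_fit, G_fit, amp = np.linalg.lstsq(A, y_hm, rcond=None)[0]
v2_fit = np.sqrt(amp / (2.0 * G_fit))  # extracted flow coefficient
```

The jet peaks appear only in the template column, which is what makes F identifiable and lets the fit separate the jet contribution from the flow modulation.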
Periodontal furcation lesions: a survey of diagnosis and management by general dental practitioners
(2021)
Aim: The aim of this study was to explore general dental practitioners' (GDPs) attitudes toward periodontal furcation involvement (FI). Materials and methods: An online survey focused on diagnosis and management of periodontal FI was circulated to GDPs in seven different countries. Results: A total of 400 responses were collected. Nearly a fifth of participants reported rarely or never taking 6-point pocket charts; 65.8% of participants had access to a Nabers probe in their practice. When shown clinical pictures and radiographs of FI-involved molars, the majority of participants correctly diagnosed it. Although 47.1% of participants were very/extremely confident in detecting FI, only 8.9% felt very/extremely confident at treating it. Differences in responses were detected according to country and year of qualification, with a trend towards less interest in periodontal diagnosis and treatment in younger generations. Lack of knowledge of management/referral pathways (reported by 22.8%) and lack of correct equipment were considered the biggest barriers to FI management. Most participants (80.9%) were interested in learning more about FI, ideally face to face, followed by online tutorials. Conclusions: Plans should be put in place to improve general dentists' knowledge and ability to manage FI, as this can have a significant impact on public health.
A wide variety of enzymatic pathways that produce specialized metabolites in bacteria, fungi and plants are known to be encoded in biosynthetic gene clusters. Information about these clusters, pathways and metabolites is currently dispersed throughout the literature, making it difficult to exploit. To facilitate consistent and systematic deposition and retrieval of data on biosynthetic gene clusters, we propose the Minimum Information about a Biosynthetic Gene cluster (MIBiG) data standard.
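To make the idea of a minimum-information standard concrete, here is a hypothetical, heavily simplified record of the kind such a standard prescribes. All field names and values below are invented for illustration and are not the actual MIBiG schema.

```python
# Hypothetical, simplified illustration of a minimum-information record
# for a biosynthetic gene cluster. Field names and values are invented;
# this is NOT the real MIBiG data model.
cluster_record = {
    "accession": "EXAMPLE0000001",        # stable identifier (hypothetical)
    "organism": "Streptomyces sp.",       # producing organism
    "biosynthetic_class": "polyketide",   # broad pathway class
    "compounds": ["examplemycin"],        # metabolite(s) produced (invented)
    "locus": {"start": 0, "end": 42000},  # genomic coordinates of the cluster
}

def is_minimally_annotated(record, required=("accession", "organism",
                                             "biosynthetic_class", "compounds")):
    """Check that all required minimum-information fields are present
    and non-empty, the core promise of a 'minimum information' standard."""
    return all(record.get(field) for field in required)
```

The point of such a standard is exactly this kind of machine-checkable completeness: a deposition either carries the agreed minimum fields or is rejected.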
Investigators in the cognitive neurosciences have turned to Big Data to address persistent replication and reliability issues by increasing sample sizes, statistical power, and representativeness of data. While there is tremendous potential to advance science through open data sharing, these efforts unveil a host of new questions about how to integrate data arising from distinct sources and instruments. We focus on the most frequently assessed area of cognition - memory testing - and demonstrate a process for reliable data harmonization across three common measures. We aggregated raw data from 53 studies from around the world which measured at least one of three distinct verbal learning tasks, totaling N = 10,505 healthy and brain-injured individuals. A mega-analysis was conducted using empirical Bayes harmonization to isolate and remove site effects, followed by linear models which adjusted for common covariates. After corrections, a continuous item response theory (IRT) model estimated each individual subject's latent verbal learning ability while accounting for item difficulties. Harmonization significantly reduced inter-site variance by 37% while preserving covariate effects. The effects of age, sex, and education on scores were found to be highly consistent across memory tests. IRT methods for equating scores across auditory verbal learning tests (AVLTs) agreed with held-out data of dually-administered tests, and these tools are made available for free online. This work demonstrates that large-scale data sharing and harmonization initiatives can offer opportunities to address reproducibility and integration challenges across the behavioral sciences.
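A heavily simplified sketch of the location-scale part of such site harmonization (the empirical Bayes shrinkage of site parameters and the covariate adjustment are omitted for brevity; this is not the authors' pipeline) could look like:

```python
import numpy as np

# Simplified location-scale site harmonization in the spirit of
# ComBat-style empirical Bayes methods. The shrinkage step is omitted:
# each site's additive and multiplicative effects are estimated directly
# and mapped onto the pooled mean and standard deviation.

def harmonize(scores, sites):
    """Remove per-site location and scale effects from `scores`,
    returning values on a common pooled scale."""
    scores = np.asarray(scores, dtype=float)
    sites = np.asarray(sites)
    grand_mean, grand_sd = scores.mean(), scores.std()
    out = np.empty_like(scores)
    for site in np.unique(sites):
        mask = sites == site
        site_mean, site_sd = scores[mask].mean(), scores[mask].std()
        out[mask] = (scores[mask] - site_mean) / site_sd * grand_sd + grand_mean
    return out
```

After this transform every site has the same mean and spread, so residual between-site variance in a downstream model reflects sampling noise rather than instrument or scanner effects.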
Non-standard errors
(2021)
In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in sample estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors. To study them, we let 164 teams test six hypotheses on the same sample. We find that non-standard errors are sizeable, on par with standard errors. Their size (i) co-varies only weakly with team merits, reproducibility, or peer rating, (ii) declines significantly after peer feedback, and (iii) is underestimated by participants.
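As a toy illustration of the comparison at the heart of the paper (all numbers below are invented, not the study's data): the non-standard error is the dispersion of point estimates across teams, which can then be set against a typical within-team standard error.

```python
import statistics

# Toy illustration of non-standard errors (invented numbers):
# each team analyses the same sample and reports a point estimate
# plus a conventional standard error for the same hypothesis.
team_estimates = [0.21, 0.35, 0.18, 0.42, 0.27, 0.31]   # hypothetical
team_std_errors = [0.05, 0.06, 0.05, 0.07, 0.05, 0.06]  # hypothetical

# Non-standard error: spread of estimates *across* teams, driven by
# evidence-generating-process choices rather than sampling noise.
non_standard_error = statistics.stdev(team_estimates)

# Typical within-team (standard) error, for comparison.
typical_standard_error = statistics.median(team_std_errors)
```

In this invented example the cross-team spread exceeds the typical standard error, mirroring the paper's finding that the two are of comparable size.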