Refine
Year of publication
- 2021 (25)
Document Type
- Article (25)
Language
- English (25)
Has Fulltext
- yes (25)
Is part of the Bibliography
- no (25)
Keywords
- Adaptive control (1)
- Automatic (1)
- CBM detector (1)
- Computational geometry (1)
- Hadron-Hadron scattering (experiments) (1)
- Heavy Ion Experiments (1)
- Line reconstruction (1)
- Noisy point clouds (1)
- PointNet (1)
- Z-inspection (1)
Institute
- Informatik (25)
The ongoing digitalization of educational resources and the use of the internet lead to a steady increase in potentially available learning media. However, many of the media used for educational purposes have not been designed specifically for teaching and learning. Usually, linguistic criteria of readability and comprehensibility as well as content-related criteria are used independently to assess and compare the quality of educational media. This also holds true for educational media used in economics. This article aims to improve the analysis of textual learning media used in economic education by drawing on threshold concepts. Threshold concepts are key terms in knowledge acquisition within a domain. From a linguistic perspective, however, threshold concepts are instances of specialized vocabularies, exhibiting particular linguistic features. In three kinds of (German) resources, namely textbooks, newspapers, and Wikipedia, we investigate the distributive profiles of 63 threshold concepts identified in economics education (collected from threshold concept research). We examined the threshold concepts' frequency distribution, their compound distribution, and their network structure within the three kinds of resources. Our analysis yields two main findings: first, the three kinds of resources can indeed be distinguished in terms of their threshold-concept profiles; second, Wikipedia shows markedly stronger associative connections between economic threshold concepts than the other sources. We discuss the findings in relation to adequate media use for teaching and learning, not only in economic education.
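The frequency and co-occurrence analyses described above can be sketched in miniature. The corpus, concept list, and counts below are hypothetical toy stand-ins for the study's German resources and its 63 threshold concepts:

```python
from collections import Counter
from itertools import combinations

# Hypothetical miniature corpus; the study used German textbooks,
# newspapers, and Wikipedia articles with 63 economics threshold concepts.
concepts = {"inflation", "market", "demand", "supply"}
sentences = [
    ["inflation", "raises", "prices", "and", "demand", "falls"],
    ["supply", "and", "demand", "set", "the", "market", "price"],
    ["the", "market", "reacts", "to", "inflation"],
]

# Frequency profile: how often each threshold concept occurs.
freq = Counter(w for s in sentences for w in s if w in concepts)

# Associative network: concepts co-occurring in the same sentence become
# linked; the edge weight counts the number of shared sentences.
edges = Counter()
for s in sentences:
    present = sorted(set(s) & concepts)
    for a, b in combinations(present, 2):
        edges[(a, b)] += 1
```

Comparing such frequency and edge-weight profiles across resource types is one simple way to operationalize the "distributive profiles" the article investigates.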
Co-design of a trustworthy AI system in healthcare: deep learning based skin lesion classifier
(2021)
This paper documents how an ethically aligned co-design methodology ensures trustworthiness in the early design phase of an artificial intelligence (AI) system component for healthcare. The system explains decisions made by deep learning networks analyzing images of skin lesions. The co-design of trustworthy AI developed here used a holistic approach rather than a static ethical checklist and required a multidisciplinary team of experts working with the AI designers and their managers. Ethical, legal, and technical issues potentially arising from the future use of the AI system were investigated. This paper is a first report on co-designing in the early design phase. Our results can also serve as guidance for the early-phase development of similar AI tools.
In this talk we presented a novel technique, based on Deep Learning, to determine the impact parameter of nuclear collisions at the CBM experiment. PointNet based Deep Learning models are trained on UrQMD followed by CBMRoot simulations of Au+Au collisions at 10 AGeV to reconstruct the impact parameter of collisions from raw experimental data such as hits of the particles in the detector planes, tracks reconstructed from the hits or their combinations. The PointNet models can perform fast, accurate, event-by-event impact parameter determination in heavy ion collision experiments. They are shown to outperform a simple model which maps the track multiplicity to the impact parameter. While conventional methods for centrality classification merely provide an expected impact parameter distribution for a given centrality class, the PointNet models predict the impact parameter from 2–14 fm on an event-by-event basis with a mean error of −0.33 to 0.22 fm.
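The key property exploited by PointNet models is that a shared per-point network followed by a symmetric pooling operation makes the prediction independent of the ordering of the detector hits. A minimal numpy sketch of that idea follows; the layer sizes and random untrained weights are toy assumptions (the actual CBM models are trained on UrQMD/CBMRoot simulations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared-MLP weights; real PointNet models are far larger.
W1 = rng.normal(size=(3, 16))   # per-point features: (x, y, z) hit -> 16-d
W2 = rng.normal(size=(16, 32))
W_head = rng.normal(size=(32,))

def pointnet_regress(points):
    """Minimal PointNet-style regression: shared per-point MLP,
    symmetric max-pooling over points, then a linear head producing
    one scalar (standing in for the impact parameter in fm)."""
    h = np.maximum(points @ W1, 0)   # shared MLP layer 1 (ReLU)
    h = np.maximum(h @ W2, 0)        # shared MLP layer 2 (ReLU)
    g = h.max(axis=0)                # max-pool over points: order-invariant
    return float(g @ W_head)         # regression head

hits = rng.normal(size=(200, 3))     # fake detector hits
b_pred = pointnet_regress(hits)
# Permuting the hits must not change the prediction (symmetric function).
b_perm = pointnet_regress(hits[rng.permutation(200)])
```

This permutation invariance is what allows the models to consume raw, unordered collections of hits or tracks on an event-by-event basis.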
The ALICE Collaboration reports the first fully-corrected measurements of the N-subjettiness observable for track-based jets in heavy-ion collisions. This study is performed using data recorded in pp and Pb-Pb collisions at centre-of-mass energies of √s = 7 TeV and √sNN = 2.76 TeV, respectively. In particular the ratio of 2-subjettiness to 1-subjettiness, τ2/τ1, which is sensitive to the rate of two-pronged jet substructure, is presented. Energy loss of jets traversing the strongly interacting medium in heavy-ion collisions is expected to change the rate of two-pronged substructure relative to vacuum. The results are presented for jets with a resolution parameter of R = 0.4 and charged jet transverse momentum of 40 ≤ pT,jet ≤ 60 GeV/c, which constitute a larger jet resolution and lower jet transverse momentum interval than previous measurements in heavy-ion collisions. This has been achieved by utilising a semi-inclusive hadron-jet coincidence technique to suppress the larger jet combinatorial background in this kinematic region. No significant modification of the τ2/τ1 observable for track-based jets in Pb-Pb collisions is observed relative to vacuum PYTHIA6 and PYTHIA8 references at the same collision energy. The measurements of τ2/τ1, together with the splitting aperture angle ΔR, are also performed in pp collisions at √s = 7 TeV for inclusive jets. These results are compared with PYTHIA calculations at √s = 7 TeV, in order to validate the model as a vacuum reference for the Pb-Pb centre-of-mass energy. The PYTHIA references for τ2/τ1 are shifted to larger values compared to the measurement in pp collisions. This hints at a reduction in the rate of two-pronged jets in Pb-Pb collisions compared to pp collisions.
Point-based geometry representations have become widely used in numerous contexts, ranging from particle-based simulations, over stereo image matching, to depth sensing via light detection and ranging. Our application focus is on the reconstruction of curved line structures in noisy 3D point cloud data. Respective algorithms operating on such point clouds often rely on the notion of a local neighborhood. Regarding the latter, our approach employs multi-scale neighborhoods, for which weighted covariance measures of local points are determined. Curved line structures are reconstructed via vector field tracing, using a bidirectional piecewise streamline integration. We also introduce an automatic selection of optimal starting points via multi-scale geometric measures. The pipeline development and choice of parameters were driven by an extensive, automated initial analysis process on over a million prototype test cases. The behavior of our approach is controlled by several parameters — the majority being set automatically, leaving only three to be controlled by a user. In an extensive, automated final evaluation, we cover over one hundred thousand parameter sets, including 3D test geometries with varying curvature, sharp corners, intersections, data holes, and systematically applied varying types of noise. Further, we analyzed different choices for the point of reference in the covariance computation; using a weighted mean performed best in most cases. In addition, we compared our method to current, publicly available line reconstruction frameworks. In some cases, up to thirty times faster execution was achieved at comparable error measures. Finally, we demonstrate an exemplary application on four real-world 3D light detection and ranging datasets, extracting power line cables.
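The local direction estimate that feeds the vector field tracing can be illustrated for a single neighborhood scale. The function name, Gaussian weighting, and parameter values below are illustrative assumptions, not the paper's exact pipeline (which combines several scales and an automated parameter selection):

```python
import numpy as np

def weighted_covariance_direction(points, center, scale):
    """Gaussian-weighted covariance of a local neighborhood around
    `center`; the dominant eigenvector approximates the local line
    direction. Sketch of a single scale only; the full pipeline
    combines multiple neighborhood scales."""
    d2 = ((points - center) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * scale ** 2))
    # Weighted mean as the point of reference for the covariance,
    # the choice reported to perform best in most cases.
    mean = (w[:, None] * points).sum(axis=0) / w.sum()
    diff = points - mean
    cov = (w[:, None, None] *
           (diff[:, :, None] * diff[:, None, :])).sum(axis=0) / w.sum()
    evals, evecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    return evecs[:, -1]                      # eigenvector of largest eigenvalue

# Noisy points along the x-axis: dominant direction should be ~(+/-1, 0, 0).
rng = np.random.default_rng(1)
pts = np.column_stack([np.linspace(0, 1, 100),
                       rng.normal(0, 0.01, 100),
                       rng.normal(0, 0.01, 100)])
direction = weighted_covariance_direction(pts, pts.mean(axis=0), scale=0.5)
```

Tracing then follows this direction field with bidirectional piecewise streamline integration, re-estimating the direction at each step.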
The pT-differential production cross sections of prompt and non-prompt (produced in beauty-hadron decays) D mesons were measured by the ALICE experiment at midrapidity (|y| < 0.5) in proton-proton collisions at √s = 5.02 TeV. The data sample used in the analysis corresponds to an integrated luminosity of (19.3 ± 0.4) nb−1. D mesons were reconstructed from their decays D0 → K−π+, D+ → K−π+π+, and D+s → φπ+ → K−K+π+ and their charge conjugates. Compared to previous measurements in the same rapidity region, the cross sections of prompt D+ and D+s mesons have an extended pT coverage and total uncertainties reduced by a factor ranging from 1.05 to 1.6, depending on pT, allowing for a more precise determination of their pT-integrated cross sections. The results are well described by perturbative QCD calculations. The fragmentation fraction of heavy quarks to strange mesons divided by the one to non-strange mesons, fs/(fu + fd), is compatible for charm and beauty quarks and with previous measurements at different centre-of-mass energies and collision systems. The bb̄ production cross section per rapidity unit at midrapidity, estimated from non-prompt D-meson measurements, is dσ(bb̄)/dy||y|<0.5 = 34.5 ± 2.4 (stat) +4.7/−2.9 (tot. syst) μb. It is compatible with previous measurements at the same centre-of-mass energy and with the cross section predicted by perturbative QCD calculations.
Future operation of the CBM detector requires ultra-fast analysis of the continuous stream of data from all subdetector systems. Determining the inter-system time shifts among the individual detector systems of the existing prototype experiment mCBM is an essential step for data processing and, in particular, for stable data taking. Based on the raw measurements from all detector systems, the corresponding time correlations can be obtained at the digital level by evaluating the differences in time stamps. If the relevant systems are stable during data taking and sufficient digital measurements are available, the distribution of time differences displays a clear peak. Up to now, the processed time differences have been stored in histograms and the maximum peak has been determined only after all timeslices of a run have been evaluated, leading to significant run times. The results presented here demonstrate the stability of the synchronicity of the mCBM systems. Furthermore, they show that relatively small amounts of raw measurements suffice to evaluate the time correlations among individual mCBM detectors, enabling fast online monitoring in future online data processing.
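The time-stamp-difference approach can be sketched as follows. The digi times, jitter model, and binning are hypothetical stand-ins for real mCBM data, where time differences would be formed between digis of two systems within a timeslice:

```python
import numpy as np

# Hypothetical digi time stamps (ns) from two detector systems sharing a
# constant inter-system shift; real mCBM digis carry far more metadata.
rng = np.random.default_rng(2)
true_shift = 42.0
t_sys_a = np.sort(rng.uniform(0, 1e6, 5000))
t_sys_b = t_sys_a + true_shift + rng.normal(0, 0.5, 5000)  # jittered copies

# Histogram the time differences; if both systems are stable and enough
# digis are available, the distribution shows one clear peak at the shift.
diffs = t_sys_b - t_sys_a
counts, edges = np.histogram(diffs, bins=200, range=(0, 100))
peak_bin = counts.argmax()
estimated_shift = 0.5 * (edges[peak_bin] + edges[peak_bin + 1])
```

Because the peak emerges already from a few thousand measurements, the same estimate can be refreshed continuously for online monitoring rather than only after all timeslices of a run.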
Production of pions, kaons, (anti-)protons and φ mesons in Xe–Xe collisions at √sNN = 5.44 TeV
(2021)
The first measurement of the production of pions, kaons, (anti-)protons and φ mesons at midrapidity in Xe–Xe collisions at √sNN = 5.44 TeV is presented. Transverse momentum (pT) spectra and pT-integrated yields are extracted in several centrality intervals bridging from p–Pb to mid-central Pb–Pb collisions in terms of final-state multiplicity. The study of Xe–Xe and Pb–Pb collisions allows systems at similar charged-particle multiplicities but with different initial geometrical eccentricities to be investigated. A detailed comparison of the spectral shapes in the two systems reveals an opposite behaviour for radial and elliptic flow. In particular, this study shows that the radial flow does not depend on the colliding system when compared at similar charged-particle multiplicity. In terms of hadron chemistry, the previously observed smooth evolution of particle ratios with multiplicity from small to large collision systems is also found to hold in Xe–Xe. In addition, our results confirm that two remarkable features of particle production at LHC energies are also valid in the collision of medium-sized nuclei: the lower proton-to-pion ratio with respect to the thermal model expectations and the increase of the φ-to-pion ratio with increasing final-state multiplicity.
The inclusive production of the J/ψ and ψ(2S) charmonium states is studied as a function of centrality in p-Pb collisions at a centre-of-mass energy per nucleon pair √sNN = 8.16 TeV at the LHC. The measurement is performed in the dimuon decay channel with the ALICE apparatus in the centre-of-mass rapidity intervals −4.46 < ycms < −2.96 (Pb-going direction) and 2.03 < ycms < 3.53 (p-going direction), down to zero transverse momentum (pT). The J/ψ and ψ(2S) production cross sections are evaluated as a function of the collision centrality, estimated through the energy deposited in the zero degree calorimeter located in the Pb-going direction. The pT-differential J/ψ production cross section is measured at backward and forward rapidity for several centrality classes, together with the corresponding average ⟨pT⟩ and ⟨pT²⟩ values. The nuclear effects affecting the production of both charmonium states are studied using the nuclear modification factor. In the p-going direction, a suppression of the production of both charmonium states is observed, which seems to increase from peripheral to central collisions. In the Pb-going direction, however, the centrality dependence is different for the two states: the nuclear modification factor of the J/ψ increases from below unity in peripheral collisions to above unity in central collisions, while for the ψ(2S) it stays below or consistent with unity for all centralities with no significant centrality dependence. The results are compared with measurements in p-Pb collisions at √sNN = 5.02 TeV and no significant dependence on the energy of the collision is observed. Finally, the results are compared with theoretical models implementing various nuclear matter effects.
Jet fragmentation transverse momentum distributions in pp and p-Pb collisions at √s, √sNN = 5.02 TeV
(2021)
Jet fragmentation transverse momentum (jT) distributions are measured in proton-proton (pp) and proton-lead (p-Pb) collisions at √sNN = 5.02 TeV with the ALICE experiment at the LHC. Jets are reconstructed with the ALICE tracking detectors and electromagnetic calorimeter using the anti-kT algorithm with resolution parameter R = 0.4 in the pseudorapidity range |η| < 0.25. The jT values are calculated for charged particles inside a fixed cone with a radius R = 0.4 around the reconstructed jet axis. The measured jT distributions are compared with a variety of parton-shower models. Herwig and PYTHIA 8 based models describe the data well for the higher jT region, while they underestimate the lower jT region. The jT distributions are further characterised by fitting them with a function composed of an inverse gamma function for higher jT values (called the “wide component”), related to the perturbative component of the fragmentation process, and with a Gaussian for lower jT values (called the “narrow component”), predominantly connected to the hadronisation process. The width of the Gaussian has only a weak dependence on jet transverse momentum, while that of the inverse gamma function increases with increasing jet transverse momentum. For the narrow component, the measured trends are successfully described by all models except for Herwig. For the wide component, Herwig and PYTHIA 8 based models slightly underestimate the data for the higher jet transverse momentum region. These measurements set constraints on models of jet fragmentation and hadronisation.
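The two-component fit function described above can be written down directly. The parameter values below are illustrative stand-ins, not the published fit results, and the zero-centred half-Gaussian form is an assumption for the sketch:

```python
import numpy as np
from math import gamma

def inverse_gamma_pdf(x, alpha, beta):
    """Inverse-gamma density used for the 'wide' (perturbative) component."""
    return (beta ** alpha / gamma(alpha)) * x ** (-alpha - 1) * np.exp(-beta / x)

def gaussian(x, sigma):
    """Zero-centred half-Gaussian used for the 'narrow' (hadronisation)
    component, normalised on x > 0."""
    return np.sqrt(2 / np.pi) / sigma * np.exp(-x ** 2 / (2 * sigma ** 2))

def jt_model(x, f_wide, alpha, beta, sigma):
    """Two-component jT model: f_wide weights the inverse-gamma tail,
    (1 - f_wide) the Gaussian core. All parameter values here are
    illustrative, not the measured ones."""
    return f_wide * inverse_gamma_pdf(x, alpha, beta) + (1 - f_wide) * gaussian(x, sigma)

jt = np.linspace(0.05, 4.0, 80)
y = jt_model(jt, f_wide=0.4, alpha=3.0, beta=1.5, sigma=0.3)
```

In a real analysis the four parameters would be fitted to the measured jT spectrum in each jet transverse momentum interval, and the widths of the two components tracked as a function of jet pT.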