The interaction of Λ and Σ hyperons (Y) with nucleons (N) is strongly influenced by the coupled-channel dynamics. Due to the small mass difference of the NΛ and NΣ systems, the sizable coupling strength of the NΣ ↔ NΛ processes constitutes a crucial element in the determination of the NΛ interaction. In this letter we present the most precise measurements of the interaction of pΛ pairs, from zero relative momentum up to the opening of the NΣ channel. The correlation function in the relative momentum space for pΛ ⊕ p̄Λ̄ pairs measured in high-multiplicity triggered pp collisions at √s = 13 TeV at the LHC is reported. The opening of the inelastic NΣ channels is visible in the extracted correlation function as a cusp-like structure occurring at relative momentum k∗ = 289 MeV/c. This represents the first direct experimental observation of the NΣ ↔ NΛ coupled channel in the pΛ system. The correlation function is compared with recent chiral effective field theory calculations, based on different strengths of the NΣ ↔ NΛ transition potential. A weaker coupling, as possibly supported by the present measurement, would require a more repulsive three-body NNΛ interaction for a proper description of the in-medium properties of the Λ, which has implications for the nuclear equation of state and for the presence of hyperons inside neutron stars.
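Kinematic note (ours, for orientation only): in the pΛ centre-of-mass frame the inelastic NΣ channels open at the relative momentum k∗ satisfying √(m_p² + k∗²) + √(m_Λ² + k∗²) = m_N + m_Σ, which is the kinematic origin of the cusp-like structure described above.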
Static analysis of different non-strict functional programming languages makes use of set constants like Top, Inf, and Bot, denoting all expressions, all lists without a final Nil as tail, and all non-terminating programs, respectively. We use a set language that permits union, constructors and recursive definition of set constants with a greatest fixpoint semantics. This paper proves decidability, in particular EXPTIME-completeness, of the subset relationship between co-inductively defined sets by using algorithms and results from tree automata. This shows decidability of the test for set inclusion, which is required by certain strictness analysis algorithms for lazy functional programming languages.
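As a rough illustration (our own sketch with hypothetical names, not the paper's formalism), such a set language can be modelled by a small Haskell datatype in which named set constants are defined by recursive equations that are read co-inductively, i.e. with a greatest fixpoint semantics:

    -- Hypothetical sketch of the set language: union, constructor application,
    -- and references to named set constants defined by recursive equations.
    data SetExp
      = Top                    -- the set of all expressions
      | Bot                    -- the set of all non-terminating expressions
      | Union SetExp SetExp    -- set union
      | ConsS String [SetExp]  -- a constructor applied to set arguments
      | Ref String             -- reference to a named set constant

    -- Definitions are read with a greatest fixpoint, so the equation below
    -- denotes the lists that never reach a final Nil, i.e. the constant Inf.
    defs :: [(String, SetExp)]
    defs = [ ("Inf", ConsS "Cons" [Top, Ref "Inf"]) ]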
This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective method of strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully considers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda-calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf, denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences; the main measure is the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda-calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
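For orientation, the following minimal Haskell example (our own illustration, not taken from the paper) recalls what strictness analysis establishes: a function is strict in an argument when applying it to a non-terminating argument yields a non-terminating result (f ⊥ = ⊥), so a compiler may evaluate that argument eagerly without changing the observable behaviour.

    -- Illustrative example: a lazily accumulating sum. sumAcc is strict in
    -- both arguments: if either the accumulator or the list is undefined,
    -- so is the result. A strictness analyser that proves this allows the
    -- compiler to force acc at each call, avoiding a chain of (+) thunks.
    sumAcc :: Int -> [Int] -> Int
    sumAcc acc []       = acc
    sumAcc acc (x : xs) = sumAcc (acc + x) xs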
This paper describes context analysis, an extension of strictness analysis for lazy functional languages. In particular, it extends Wadler's four-point domain and permits infinitely many abstract values. A calculus based on abstract reduction is presented which, given the abstract value of the result, automatically finds the abstract values of the arguments. The results of the analysis are useful for verification purposes and can also be used in compilers which require strictness information.
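To make the starting point concrete, here is a hedged Haskell sketch of a four-point list domain in the style of Wadler's analysis (the constructor names and informal readings are our own, and the context analysis described above goes beyond these four values):

    -- A four-point abstract domain for lists; the names are illustrative.
    data ListAbs
      = BotL    -- the undefined list
      | InfL    -- infinite and partial lists (no final Nil)
      | BotMem  -- finite lists that may contain undefined elements
      | TopL    -- all lists
      deriving (Eq, Ord, Enum, Bounded, Show)

    -- Least upper bound in the linear order BotL <= InfL <= BotMem <= TopL.
    lubL :: ListAbs -> ListAbs -> ListAbs
    lubL = max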
The extraction of strictness information is an indispensable element of the efficient compilation of lazy functional languages like Haskell. Based on the method of abstract reduction, we have developed an efficient strictness analyser for a core language of Haskell. It is completely written in Haskell and compares favourably with known implementations. The implementation is based on the G#-machine, an extension of the G-machine that has been adapted to the needs of abstract reduction.
This paper proves the correctness of Nöcker's method of strictness analysis, implemented for Clean, which is an effective method of strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt, which addresses the correctness of the abstract reduction rules. Our method also addresses the cycle detection rules, which are the main strength of Nöcker's strictness analysis. We reformulate Nöcker's strictness analysis algorithm in a higher-order lambda-calculus with case, constructors, letrec, and a nondeterministic choice operator used as a union operator. Furthermore, the calculus is expressive enough to represent abstract constants like Top or Inf. The operational semantics is a small-step semantics, and equality of expressions is defined by a contextual semantics that observes termination of expressions. The correctness of several reductions is proved using a context lemma and complete sets of forking and commuting diagrams. The proof is based mainly on an exact analysis of the lengths of normal order reductions. However, there remains a small gap: currently, the proof of correctness of the strictness analysis requires the conjecture that our behavioral preorder is contained in the contextual preorder. The proof is valid without referring to the conjecture if no abstract constants are used in the analysis.
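Purely as an illustrative sketch (our own constructor names, not the paper's formal calculus), the syntax described above can be captured by a small Haskell datatype in which the choice operator plays the role of set union:

    -- Hypothetical syntax of the calculus: lambda abstraction, application,
    -- constructors, case, letrec, and a nondeterministic choice operator
    -- read as set union, so that constants like Top or Inf are expressible.
    data Expr
      = Var String
      | Lam String Expr
      | App Expr Expr
      | ConsE String [Expr]                    -- constructor application
      | CaseE Expr [(String, [String], Expr)]  -- alternatives: (constructor, variables, body)
      | Letrec [(String, Expr)] Expr           -- mutually recursive bindings
      | Choice Expr Expr                       -- nondeterministic choice / union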
The implementation of strictness analysis carried out in the course of this work provides an efficient approximation of abstract reduction with path analysis. The G#-machine, a new machine model based on the G-machine, presents the method used in a systematic way. The close similarity to the G-machine, which could be retained in our implementation, shows how naturally the method fits the way reduction works in functional programming languages. Although the implementation puts more emphasis on traceability than on efficiency, it demonstrates that abstract reduction with path analysis is entirely practical even in a functional implementation, and that it finds strictness information that implementations of other methods do not find. There is room for optimisation, among other things in the program parts that are executed for every simulated G#-machine instruction. A cautious estimate suggests that the runtime could be halved with reasonable effort.
Multiplicity dependence of light (anti-)nuclei production in p–Pb collisions at √sNN = 5.02 TeV
(2020)
The measurement of deuteron and anti-deuteron production in the rapidity range −1 < y < 0 as a function of transverse momentum and event multiplicity in p–Pb collisions at √sNN = 5.02 TeV is presented. (Anti-)deuterons are identified via their specific energy loss dE/dx and via their time-of-flight. Their production in p–Pb collisions is compared to pp and Pb–Pb collisions and is discussed within the context of thermal and coalescence models. The ratio of integrated yields of deuterons to protons (d/p) shows a significant increase as a function of the charged-particle multiplicity of the event, starting from values similar to those observed in pp collisions at low multiplicities and approaching those observed in Pb–Pb collisions at high multiplicities. The mean transverse momenta are extracted from the deuteron spectra, and the values are similar to those obtained for p and Λ particles. Thus, deuteron spectra do not follow mass ordering. This behaviour is in contrast to the trend observed for non-composite particles in p–Pb collisions. In addition, the production of the rare 3He and anti-3He nuclei has been studied. The spectrum corresponding to all non-single-diffractive p–Pb collisions is obtained in the rapidity window −1 < y < 0 and the pT-integrated yield dN/dy is extracted. It is found that the yields of protons, deuterons, and 3He, normalised by the spin degeneracy factor, follow an exponential decrease with mass number.
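In other words (our own rephrasing, with b an illustrative fit parameter): the last observation can be written as (1/(2J+1)) dN/dy ∝ exp(−bA), where A is the mass number and J the spin of the (anti-)nucleus, so each additional nucleon suppresses the yield by a roughly constant penalty factor exp(−b).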