We investigate the role of the Pauli Exclusion Principle (PEP) for light nuclei, using the examples of 12C and 16O. We show that ignoring the PEP leads not only to a spectrum that is too dense at low energy but also to a wrong grouping into bands. Using a geometrical mapping, a triangular structure for 12C and a tetrahedral structure for 16O in the ground state are obtained by exploiting the indistinguishability of the α-particles.
Based on the positive results of the 0.63 m unmodulated 325 MHz Ladder-RFQ prototype from 2013 to 2016 [1, 2], a modulated 3.3 m Ladder-RFQ (see Fig. 1) has been designed and built for the acceleration of up to 100 mA of protons from 95 keV to 3.0 MeV at the FAIR p-Linac [3, 4]. In this paper, we present the results of manufacturing as well as low-level RF measurements of the Ladder-RFQ, including flatness and frequency tuning.
One-photon and multi-photon absorption, spontaneous and stimulated photon emission, resonance Raman scattering and electron transfer are important molecular processes that commonly involve combined vibrational-electronic (vibronic) transitions. The corresponding vibronic transition profiles in the energy domain are usually determined by Franck-Condon factors (FCFs), the squared norm of overlap integrals between vibrational wavefunctions of different electronic states. FC profiles are typically highly congested for large molecular systems, and the spectra usually become poorly resolvable at elevated temperatures. The theoretical analysis of such spectra is even more difficult when vibrational mode-mixing (Duschinsky) effects are significant, because contributions from different modes are in general not separable, even within the harmonic approximation. A few decades ago, Doktorov, Malkin and Man'ko [1979 J. Mol. Spectrosc. 77, 178] developed a coherent-state-based generating function approach and exploited the dynamical symmetry of vibrational Hamiltonians under the Duschinsky relation to describe FC transitions at zero Kelvin. Recently, the present authors extended the method to incorporate thermal, single vibronic level, non-Condon and multi-photon effects in the energy, time and probability density domains for the efficient calculation and interpretation of vibronic spectra. Herein, recent developments and corresponding generating functions are presented for single vibronic levels related to fluorescence, resonance Raman scattering and anharmonic transitions.
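As a reminder of the quantities involved (the notation here is illustrative, not taken from the cited works), the Franck-Condon factor for a vibronic transition and the Duschinsky relation between the normal coordinates of the two electronic states can be written as:

```latex
% Franck-Condon factor: squared norm of the overlap integral between
% vibrational wavefunctions |\chi'_{v'}\rangle and |\chi''_{v''}\rangle
% of the two electronic states
\mathrm{FCF}_{v'v''} = \left| \langle \chi'_{v'} \mid \chi''_{v''} \rangle \right|^{2}

% Duschinsky relation: the normal coordinates Q' of one electronic state
% are a mode-mixed (rotation matrix S) and displaced (vector d) version
% of the coordinates Q'' of the other state
\mathbf{Q}' = \mathbf{S}\,\mathbf{Q}'' + \mathbf{d}
```

When the mode-mixing matrix S is far from diagonal, a single mode of one state projects onto many modes of the other, which is why the contributions of different modes stop being separable.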
The influence of an ac current of arbitrary amplitude and frequency on the mixed-state dc-voltage-ac-drive tilting-ratchet response of a superconducting film with a uniaxial cosine pinning potential at finite temperature is theoretically investigated. The results are obtained in the single-vortex approximation, within the framework of an exact solution of the Langevin equation for non-interacting vortices. Both experimentally accessible quantities, the dc ratchet response and the absorbed ac power, are predicted to demonstrate a pronounced filter-like behavior at microwave frequencies. Based on our findings, we propose a cut-off filter and discuss its operating curves as functions of the driving parameters, i.e., ac amplitude, frequency, and dc bias. The predicted results can be examined, e.g., on superconducting films with a washboard pinning potential landscape.
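A minimal single-vortex model of the kind described (an illustrative form; the coefficients and notation are assumptions, not taken from the paper) is an overdamped Langevin equation in a tilted washboard potential:

```latex
% Overdamped single-vortex Langevin equation: viscosity \eta, cosine
% (washboard) pinning potential of depth U_p and period a, dc and ac
% driving forces, and Gaussian thermal noise \xi(t) at temperature T
\eta \dot{x} = -\frac{dU(x)}{dx} + F^{\mathrm{dc}} + F^{\mathrm{ac}}\cos(\omega t) + \xi(t),
\qquad
U(x) = -\frac{U_p}{2}\cos\!\left(\frac{2\pi x}{a}\right),
\qquad
\langle \xi(t)\,\xi(t')\rangle = 2\eta k_B T\,\delta(t - t').
```

The ratchet (rectified dc) response arises once the symmetry of this equation is broken by the dc bias tilting the cosine potential.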
ALICE is one of the four major LHC experiments at CERN. When the accelerator enters the Run 3 data-taking period, starting in 2021, ALICE expects almost 100 times more central Pb-Pb collisions than now, resulting in a large increase in data throughput. In order to cope with this new challenge, the collaboration had to extensively rethink the whole data processing chain, with a tighter integration between the online and offline computing worlds. Such a system, code-named ALICE O2, is being developed in collaboration with the FAIR experiments at GSI. It is based on the ALFA framework, which provides a generalized implementation of the ALICE High Level Trigger approach, designed around distributed software entities coordinating and communicating via message passing.
We will highlight our efforts to integrate ALFA within the ALICE O2 environment. We analyze the challenges arising from the different running environments for production and development, and derive requirements for a flexible and modular software framework. In particular, we will present the ALICE O2 Data Processing Layer, which deals with ALICE-specific requirements in terms of the data model. The main goal is to reduce the complexity of developing algorithms and managing a distributed system, thereby leading to a significant simplification for the large majority of ALICE users.
Insulin resistance and working memory: exploring the role of blood glucose levels and lifestyle
(2023)
Introduction: Type 2 diabetes mellitus and dementia are among the leading causes of reduced quality of life and life expectancy worldwide and often occur comorbidly. Both diseases are linked by altered insulin signaling. Lifestyle factors and blood glucose monitoring play an essential role in the prevention and treatment of type 2 diabetes. So far, a relationship between blood glucose levels, lifestyle, and cognitive performance – a main symptom of dementia – has mainly been established in laboratory settings, which reduces its ecological validity.
Objectives: This study uses ambulatory assessment and continuous glucose monitoring to explore the link between blood glucose levels, lifestyle, and working memory in an ecological setting. First, we hypothesize that glycemic variations affect working memory performance in daily life. Second, we hypothesize that high variance in blood glucose levels has a greater impact on working memory in insulin-resistant participants. With this study, we aim to extend the knowledge on the relationship between insulin resistance and cognitive performance from the laboratory setting to everyday life.
Methods: This prospective, exploratory study will include 80 subjects with insulin resistance and 80 healthy controls. At baseline, blood indicators of insulin resistance will be measured to determine group assignment. Our ambulatory assessment includes smartphone-based sampling and sensor-based assessment. To this end, cognitive performance will be recorded over three consecutive days using a smartphone. Four times a day, a numerical working memory task is prompted by signal-based alarms on the smartphone. Blood glucose levels are recorded in parallel by continuous glucose monitoring. In addition, lifestyle factors such as diet and physical activity are examined. Diet is assessed by 24-h dietary protocols and movement acceleration by accelerometry.
Multilevel modelling will be used to map the relationship between blood glucose levels and working memory at the within- and between-person level. Diet and exercise are included in the analyses as additional predictors.
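The within- and between-person decomposition that such a multilevel analysis builds on can be sketched in plain Python (a hypothetical helper for illustration only; the study's actual analysis would use dedicated multilevel-modelling software):

```python
def center_within_between(data):
    """Split repeated measurements per person into the two predictors
    typically entered in a multilevel model:

    - 'between': the person's mean relative to the grand mean
      (stable person-level differences, e.g. habitual glucose level)
    - 'within':  each observation's deviation from the person's own
      mean (momentary fluctuations around that habitual level)

    data: dict mapping person_id -> list of repeated measurements.
    """
    all_vals = [v for vals in data.values() for v in vals]
    grand_mean = sum(all_vals) / len(all_vals)
    out = {}
    for pid, vals in data.items():
        person_mean = sum(vals) / len(vals)
        out[pid] = {
            "between": person_mean - grand_mean,
            "within": [v - person_mean for v in vals],
        }
    return out
```

Entering both parts as separate predictors is what allows the model to estimate opposite-signed effects at the two levels, as reported in the preliminary results.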
Results: Data collection started in March 2021 and is ongoing. Up to now, 40 insulin-resistant participants and 36 healthy controls have been measured. Our preliminary results indicate a positive association between blood glucose levels and working memory performance at the within-person level (estimate = .48, 95% CI [.07, .89], p = 0.022). At the between-person level, the analysis revealed an inverse association between blood glucose levels and working memory performance (estimate = -.45, 95% CI [-.86, -.05], p = 0.029).
Conclusion: Our preliminary results are in line with studies showing that an acute rise in blood glucose levels leads to short-term improvements, while stable glucose profiles are beneficial in the long term. This might expand the understanding of the impact of insulin resistance on working memory and represent a target for early interventions. Our preliminary analysis needs to be repeated in our final dataset to confirm our results.
We study threshold testing, an elementary probing model with the goal of choosing a large value out of n i.i.d. random variables. An algorithm can test each variable X_i once against some threshold t_i, and the test returns binary feedback on whether X_i ≥ t_i or not. Thresholds can be chosen adaptively or non-adaptively by the algorithm. Given the test results for each variable, we then select the variable with the highest conditional expectation. We compare the expected value obtained by the testing algorithm with the expected maximum of the variables. Threshold testing is a semi-online variant of the gambler's problem and prophet inequalities. Indeed, the optimal performance of non-adaptive algorithms for threshold testing is governed by the standard i.i.d. prophet inequality of approximately 0.745 + o(1) as n → ∞. We show how adaptive algorithms can significantly improve upon this ratio. Our adaptive testing strategy guarantees a competitive ratio of at least 0.869 - o(1). Moreover, we show that there are distributions that admit only a constant ratio c < 1, even when n → ∞. Finally, when each box can be tested multiple times (with n tests in total), we design an algorithm that achieves a ratio of 1 - o(1).
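The probing model can be illustrated with a small Monte Carlo sketch for Uniform(0,1) variables and a single non-adaptive threshold (an illustrative toy, not the paper's algorithm; the 0.745 and 0.869 guarantees concern worst-case distributions, and the uniform case is far easier):

```python
import random

def threshold_test_value(n, t, trials=50_000, seed=0):
    """Expected value obtained by testing each of n i.i.d. Uniform(0,1)
    variables once against the same threshold t. All variables that pass
    (X_i >= t) share the conditional mean (1+t)/2 and are exchangeable,
    so we select the first passer; if none passes we must fall back to a
    failed variable (conditional mean t/2 given that all failed)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        passed = [x for x in xs if x >= t]
        # We receive the realized value of the selected variable.
        total += passed[0] if passed else xs[0]
    return total / trials

def prophet_value(n, trials=50_000, seed=1):
    """Monte Carlo estimate of the prophet's benchmark
    E[max of n i.i.d. Uniform(0,1)] = n / (n + 1)."""
    rng = random.Random(seed)
    return sum(max(rng.random() for _ in range(n))
               for _ in range(trials)) / trials
```

For n = 5 and t = 0.6, the closed form (1 - t^n)(1 + t)/2 + t^n · t/2 ≈ 0.761, against E[max] = 5/6 ≈ 0.833, i.e. a ratio of about 0.91 for this easy distribution.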