University Publications
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation, and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (non-deterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- and must-convergence, which is appropriate for non-deterministic computations. We show that the fairness condition can be dropped for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop different proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the ordering <=c is also analysed: Ω is not a least element, and <=c already implies contextual equivalence w.r.t. may-convergence.
I present a new business cycle model in which decision making follows a simple mental process motivated by neuroeconomics. Decision makers first compute the value of two different options and then choose the option that offers the highest value, but with errors. The resulting model is highly tractable and intuitive. A demand function in level replaces the traditional Euler equation. As a result, even liquid consumers can have a large marginal propensity to consume. The interest rate affects consumption through the cost of borrowing and not through intertemporal substitution. I discuss the implications for stimulus policies.
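The two-step mental process described above can be sketched in code. This is a minimal illustration, not the paper's calibration: the "choice with errors" is modeled here as a logit rule with an assumed noise parameter `sigma`, so the probability of picking an option rises smoothly with its value advantage.

```python
import math

def choice_prob(v_a, v_b, sigma=1.0):
    """Probability of choosing option A over option B.

    Step 1 of the mental process: the values v_a and v_b are given.
    Step 2: the higher-valued option is chosen, but with errors --
    modeled here (an illustrative assumption) as logit noise with
    scale sigma. As sigma -> 0, choice becomes error-free.
    """
    return 1.0 / (1.0 + math.exp(-(v_a - v_b) / sigma))
```

With equal values the choice is a coin flip (`choice_prob(1.0, 1.0)` is 0.5), and the error rate shrinks as the value gap grows or `sigma` falls.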
The most frequently used boundary-layer turbulence parameterizations in numerical weather prediction (NWP) models are turbulence kinetic energy (TKE)-based schemes. However, these parameterizations suffer from a potential weakness, namely the strong dependence on an ad hoc quantity, the so-called turbulence length scale. The physical interpretation of the turbulence length scale is difficult and hence it cannot be directly related to measurements or large eddy simulation (LES) data. Consequently, formulations for the turbulence length scale in basically all TKE schemes are based on simplified assumptions and are model-dependent. A good reference for the independent evaluation of the turbulence length scale expression for NWP modeling is missing. Here we propose a new turbulence length scale diagnostic which can be used in the gray zone of turbulence without modifying the underlying TKE turbulence scheme. The new diagnostic is based on the TKE budget: The core idea is to encapsulate the sum of the molecular dissipation and the cross-scale TKE transfer into an effective dissipation, and associate it with the new turbulence length scale. This effective dissipation can then be calculated as a residuum in the TKE budget equation (for horizontal sub-domains of different sizes) using LES data. Estimation of the scale dependence of the diagnosed turbulence length scale using this novel method is presented for several idealized cases.
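The core idea can be sketched numerically. This is a minimal illustration under assumed names and a common closure, not the paper's exact formulation: the effective dissipation is taken as the residual of a horizontally averaged TKE budget, and the length scale follows from the standard closure eps = c_eps * TKE^(3/2) / L.

```python
def effective_dissipation(dtke_dt, shear, buoyancy, transport):
    """Effective dissipation as the residual of the TKE budget,
    d(TKE)/dt = shear + buoyancy + transport - eps_eff,
    with all terms horizontally averaged over an LES sub-domain.
    Argument names are illustrative placeholders for LES diagnostics.
    """
    return shear + buoyancy + transport - dtke_dt

def length_scale(tke, eps_eff, c_eps=0.845):
    """Turbulence length scale implied by the common closure
    eps = c_eps * TKE^(3/2) / L. The value of c_eps is an assumed
    constant; actual schemes use model-specific values.
    """
    return c_eps * tke**1.5 / eps_eff
```

Repeating the budget residual for sub-domains of different horizontal sizes then yields the scale dependence of the diagnosed length scale in the gray zone.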
In May 2008, Cyclone Nargis devastated Myanmar/Burma, killing 140,000 people. The autocratically governed country, however, rejected disaster relief as interference in its internal affairs and refused the import of medicines and food. In view of this situation, French Foreign Minister Kouchner urged the UN to act on the basis of the Responsibility to Protect (R2P).
This act of securitization, however, stands in contrast to the media coverage, as Gabi Schlag examines in this paper. The visual material from the disaster area in particular tells a different story. The photos from BBC.com's reporting on the topic form a visual narrative that suggests not helplessness but a controlled, level-headed response by local forces. This contrast points to the proverbial power of images, which pre-structure the respective conditions for possible action.
Consumers purchase energy in many forms. Sometimes energy goods are consumed directly, for instance, in the form of gasoline used to operate a vehicle, electricity to light a home, or natural gas to heat a home. At other times, the cost of energy is embodied in the prices of goods and services that consumers buy, say, when purchasing an airline ticket or when ordering plastic garden furniture online. Previous research has focused on quantifying the pass-through of the price of crude oil or the price of motor gasoline to U.S. inflation. Neither approach accounts for the fact that percent changes in refined product prices need not be proportionate to the percent change in the price of oil, that not all energy is derived from oil, and that the correlation of price shocks across energy markets is far from one. This paper develops a vector autoregressive model that quantifies the joint impact of shocks to several energy prices on headline and core CPI inflation. Our analysis confirms that focusing on gasoline price shocks alone will underestimate the inflationary pressures emanating from the energy sector, but not enough to overturn the conclusion that much of the observed increase in headline inflation in 2021 and 2022 reflected non-energy price shocks.
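The mechanics of such a vector autoregression can be sketched on toy data. This is a minimal illustration, not the paper's model: a bivariate VAR(1) on simulated series (an energy price variable and inflation), estimated by OLS, with the impulse response of inflation to a one-unit energy price shock traced by iterating the estimated coefficient matrix.

```python
import numpy as np

# Hypothetical data-generating process: energy prices feed into inflation.
rng = np.random.default_rng(0)
T = 500
A_true = np.array([[0.5, 0.0],
                   [0.2, 0.6]])          # row 2: inflation equation
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Estimate the VAR(1) by OLS: regress y_t on y_{t-1} (intercept omitted).
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Impulse response of inflation (variable 2) to a unit energy shock (variable 1).
shock = np.array([1.0, 0.0])
irf = [(np.linalg.matrix_power(A_hat, h) @ shock)[1] for h in range(6)]
```

The full paper-style exercise would add more energy price series, lags, and an identification scheme for the structural shocks; the iteration of `A_hat` above is the reduced-form core of that computation.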
We tested 6–7-year-olds, 18–22-year-olds, and 67–74-year-olds on an associative memory task that consisted of knowledge-congruent and knowledge-incongruent object–scene pairs that were highly familiar to all age groups. We compared the three age groups on their memory congruency effect (i.e., better memory for knowledge-congruent associations) and on a schema bias score, which measures the participants’ tendency to commit knowledge-congruent memory errors. We found that prior knowledge similarly benefited memory for items encoded in a congruent context in all age groups. However, for associative memory, older adults and, to a lesser extent, children overrelied on their prior knowledge, as indicated by both an enhanced congruency effect and schema bias. Functional magnetic resonance imaging (fMRI) performed during memory encoding revealed an age-independent memory × congruency interaction in the ventromedial prefrontal cortex (vmPFC). Furthermore, the magnitude of vmPFC recruitment correlated positively with the schema bias. These findings suggest that older adults are most prone to rely on their prior knowledge for episodic memory decisions, but that children can also rely heavily on prior knowledge that they are well acquainted with. Furthermore, the fMRI results suggest that the vmPFC plays a key role in the assimilation of new information into existing knowledge structures across the entire lifespan. vmPFC recruitment leads to better memory for knowledge-congruent information but also to a heightened susceptibility to commit knowledge-congruent memory errors, in particular in children and older adults.
Knowledge discovery in biomedical data using supervised methods assumes that the data contain structure relevant to the class structure if a classifier can be trained to assign a case to the correct class better than by guessing. In this setting, acceptance or rejection of a scientific hypothesis may depend critically on the ability to classify cases better than randomly, without high classification performance being the primary goal. Random forests are often chosen for knowledge-discovery tasks because they are considered a powerful classifier that does not require sophisticated data transformation or hyperparameter tuning and can be regarded as a reference classifier for tabular numerical data. Here, we report a case where the failure of random forests using the default hyperparameter settings in the standard implementations of R and Python would have led to the rejection of the hypothesis that the data contained structure relevant to the class structure. After tuning the hyperparameters, classification performance increased from 56% to 65% balanced accuracy in R, and from 55% to 67% balanced accuracy in Python. More importantly, the 95% confidence intervals in the tuned versions were to the right of the value of 50% that characterizes guessing-level classification. Thus, tuning provided the desired evidence that the data structure supported the class structure of the data set. In this case, tuning made more than a quantitative difference in the form of slightly better classification accuracy; it changed the interpretation of the data set. This matters especially when classification performance is low and a small improvement lifts the balanced accuracy above the 50% guessing level.
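The comparison described above can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the study's biomedical data set, and the tuning grid is illustrative only: a random forest with default hyperparameters versus one tuned by cross-validated grid search, both scored by balanced accuracy.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for a weakly structured biomedical data set:
# few informative features and substantial label noise.
X, y = make_classification(n_samples=400, n_features=20, n_informative=3,
                           flip_y=0.3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Default hyperparameters, as a naive analysis would use them.
default_rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
acc_default = balanced_accuracy_score(y_te, default_rf.predict(X_te))

# Cross-validated tuning of two influential hyperparameters.
grid = {"max_features": [1, 5, "sqrt"], "min_samples_leaf": [1, 5, 20]}
tuned = GridSearchCV(RandomForestClassifier(random_state=0), grid,
                     scoring="balanced_accuracy", cv=5).fit(X_tr, y_tr)
acc_tuned = balanced_accuracy_score(y_te, tuned.predict(X_te))
```

In the knowledge-discovery setting of the abstract, the decisive quantity is not `acc_tuned` itself but whether its confidence interval clears the 50% guessing level.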
This policy letter provides an overview of the strengths, weaknesses, risks and opportunities of the upcoming comprehensive risk assessment, a euro area-wide evaluation of bank balance sheets and business models. If carried out properly, the 2014 comprehensive assessment will lead the euro area into a new era of banking supervision. Policy makers in euro area countries are now under severe pressure to define a credible backstop framework for banks. This framework, as the author argues, needs to be a broad, quasi-European system of mutually reinforcing backstops.
Child maltreatment remains a major health threat globally that requires the understanding of socioeconomic and cultural contexts to craft effective interventions. However, little is known about research agendas globally and the development of knowledge-producing networks in this field of study. This study provides a bibliometric overview of child maltreatment publications to trace their growth from 1916 to 2018. Data from the Web of Science Core Collection were collected in May 2018. Only research articles and reviews written in the English language were included, with no restrictions by publication date. We analyzed publication years, number of papers, journals, authors, keywords and countries, and present a country-collaboration and keyword co-occurrence analysis. From 1916 to 2018, 47,090 papers (53.0% in 2010–2018) were published in 9442 journals. Child Abuse & Neglect (2576 papers; 5.5%), Children and Youth Services Review (1130 papers; 2.4%) and Pediatrics (793 papers, 1.7%) published the most papers. The most common research areas were Psychology (16,049 papers, 34.1%), Family Studies (8225 papers, 17.5%), and Social Work (7367 papers, 15.6%). Among 192 countries with research publications, the most prolific countries were the United States (26,367 papers), England (4676 papers), Canada (3282 papers) and Australia (2664 papers). We identified 17 authors who had more than 60 scientific items. The most cited papers (with at least 600 citations) were published in 29 journals, headed by the Journal of the American Medical Association (JAMA) (7 papers) and the Lancet (5 papers). This overview of global research in child maltreatment indicates an increasing trend in this topic, with the world’s leading centers located in Western countries led by the United States.
We call for interdisciplinary research approaches to evaluating and intervening in child maltreatment, with a focus on low- and middle-income country (LMIC) settings and specific contexts.
A Bayesian framework to estimate diversification rates and their variation through time and space
(2011)
Background: Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification.
Results: We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinids) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification.
Conclusions: Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling.
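The Bayesian estimation idea can be sketched on a toy version of the problem. This is a minimal illustration, not the paper's framework: a pure-birth (Yule) model with a constant speciation rate `lam`, simulated inter-speciation waiting times, and a Metropolis-Hastings sampler for the posterior of `lam` under an assumed Exp(1) prior.

```python
import math
import random

# Simulate a Yule process: with k lineages, the waiting time to the
# next speciation is Exponential(k * lam_true).
random.seed(1)
lam_true, n_tips = 2.0, 50
waits = [random.expovariate(k * lam_true) for k in range(1, n_tips)]

def log_post(lam):
    """Log-posterior of the speciation rate: Yule likelihood of the
    waiting times plus an Exp(1) prior (an illustrative choice)."""
    if lam <= 0:
        return -math.inf
    loglik = sum(math.log(k * lam) - k * lam * w
                 for k, w in enumerate(waits, start=1))
    return loglik - lam

# Metropolis-Hastings with a symmetric Gaussian random walk proposal.
lam, samples = 1.0, []
for i in range(20000):
    prop = lam + random.gauss(0.0, 0.2)
    if math.log(random.random()) < log_post(prop) - log_post(lam):
        lam = prop
    if i >= 5000:                      # discard burn-in
        samples.append(lam)
post_mean = sum(samples) / len(samples)
```

The full framework additionally averages over a distribution of trees, allows birth-death and variable-rate models, corrects for non-random taxon sampling, and compares models via Bayes factors; the sampler above is only the constant-rate core of that machinery.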