The growth of aerosol due to the aqueous phase oxidation of sulfur dioxide by ozone was measured in laboratory-generated clouds created in the Cosmics Leaving OUtdoor Droplets (CLOUD) chamber at the European Organization for Nuclear Research (CERN). Experiments were performed at 10 and −10 °C, on acidic (sulfuric acid) and on partially to fully neutralised (ammonium sulfate) seed aerosol. Clouds were generated by performing an adiabatic expansion – pressurising the chamber to 220 hPa above atmospheric pressure, and then rapidly releasing the excess pressure, resulting in a cooling, condensation of water on the aerosol and a cloud lifetime of approximately 6 min. A model was developed to compare the observed aerosol growth with that predicted using oxidation rate constants previously measured in bulk solutions. The model captured the measured aerosol growth very well for experiments performed at 10 and −10 °C, indicating that, in contrast to some previous studies, the oxidation rates of SO2 in a dispersed aqueous system can be well represented by using accepted rate constants, based on bulk measurements. To the best of our knowledge, these are the first laboratory-based measurements of aqueous phase oxidation in a dispersed, super-cooled population of droplets. The measurements are therefore important in confirming that the extrapolation of currently accepted reaction rate constants to temperatures below 0 °C is correct.
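The kind of bulk-kinetics comparison described above can be sketched in a few lines: the aqueous oxidation of S(IV) by ozone is second-order overall, and the sulfate produced over the cloud lifetime follows from the rate law. All numerical values below (rate constant, dissolved concentrations) are illustrative assumptions, not the measured data from this study.

```python
# Minimal sketch of the bulk-rate-constant comparison, under assumed values.

def sulfate_production_rate(k, s_iv, o3_aq):
    """Aqueous-phase oxidation rate R = k * [S(IV)] * [O3] in M/s."""
    return k * s_iv * o3_aq

# Illustrative bulk rate constant and aqueous concentrations (assumptions):
k = 1.0e9        # M^-1 s^-1, order-of-magnitude value (assumed)
s_iv = 1.0e-6    # M, dissolved S(IV) (assumed)
o3_aq = 1.0e-9   # M, dissolved ozone (assumed)

rate = sulfate_production_rate(k, s_iv, o3_aq)   # M/s

# Integrate over the ~6 min cloud lifetime quoted above, crudely assuming
# constant concentrations, to estimate the sulfate added per droplet volume:
cloud_lifetime_s = 6 * 60
added_sulfate = rate * cloud_lifetime_s          # M
```

A full model would additionally track Henry's-law partitioning of SO2 and O3 into the droplets and the pH dependence of the S(IV) speciation; the sketch only shows the core rate expression being extrapolated from bulk measurements.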
Video and image data are regularly used in the field of benthic ecology to document biodiversity. However, their use is subject to a number of challenges, principally the identification of taxa within the images without associated physical specimens. The challenge of applying traditional taxonomic keys to the identification of fauna from images has led to the development of personal, group, or institution level reference image catalogues of operational taxonomic units (OTUs) or morphospecies. Lack of standardisation among these reference catalogues has led to problems with observer bias and the inability to combine datasets across studies. In addition, lack of a common reference standard is stifling efforts in the application of artificial intelligence to taxon identification. Using the North Atlantic deep sea as a case study, we propose a database structure to facilitate standardisation of morphospecies image catalogues between research groups and support future use in multiple front-end applications. We also propose a framework for coordination of international efforts to develop reference guides for the identification of marine species from images. The proposed structure maps to the Darwin Core standard to allow integration with existing databases. We suggest a management framework where high-level taxonomic groups are curated by a regional team, consisting of both end users and taxonomic experts. We identify a mechanism by which overall quality of data within a common reference guide could be raised over the next decade. Finally, we discuss the role of a common reference standard in advancing marine ecology and supporting sustainable use of this ecosystem.
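Because the proposed structure maps to the Darwin Core standard, a single morphospecies (OTU) catalogue entry can be pictured as a record keyed by Darwin Core terms. The field values, identifier format, and helper function below are illustrative assumptions, not the actual proposed schema.

```python
# Hedged sketch: one OTU catalogue entry expressed with Darwin Core terms.
# Values and the OTU-NA-0001 identifier format are invented for illustration.

otu_record = {
    "taxonID": "OTU-NA-0001",                    # catalogue-local ID (assumed format)
    "scientificName": "Actiniaria",              # lowest rank reliably identifiable from imagery
    "taxonRank": "order",
    "identificationQualifier": "morphospecies msp-1",
    "associatedMedia": "https://example.org/images/otu-na-0001.jpg",  # placeholder URL
    "identifiedBy": "Regional curation team",
}

def is_darwin_core_like(record):
    """Check that a record carries the minimal Darwin Core terms needed
    for integration with existing databases."""
    required = {"taxonID", "scientificName", "taxonRank"}
    return required.issubset(record)
```

Keying records on standard terms such as `taxonID` and `identificationQualifier` is what would let independently curated catalogues be merged across research groups and consumed by multiple front-end applications.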
Although direct-acting antiviral (DAA) medications effectively cure hepatitis C in most patients, treatment sometimes selects for resistant viruses, rendering antiviral drugs ineffective or only partially effective. Multidrug resistance is common in patients for whom DAA treatment fails. Older patients and patients with advanced liver disease are more likely to select for drug-resistant viruses. Collective efforts from international communities and governments are needed to develop an optimal approach to managing drug resistance and preventing the transmission of resistant viruses.
Readers of Hannah Arendt’s now classic formulation of the statelessness problem in her 1951 book The Origins of Totalitarianism abound at a moment when the number of stateless peoples worldwide continues to rise exponentially. Along with statelessness, few concepts in Arendt scholarship have spawned such a volume of literature, and perhaps none have provoked as much interest outside of the field of philosophy, as ‘the right to have rights.’ Interpreting this enigmatic term exposes the heart of our beliefs about the nature of the political and has important consequences for how we practice politics on a global scale because it implicitly takes plural human beings, and not the citizen, as its subjects. Arendt’s conceptualization of this problem remains unsurpassed in its diagnosis of the political situation of statelessness, as well as its intimate description of the human cost of what she refers to as ‘world loss,’ a phenomenon that the prevailing human rights and global justice discourse does not take into account. And yet, as an alternative framework for thinking about global politics, the right to have rights resists easy interpretation, let alone practical application.
Management Summary: Conducted within the project “Economic Implications of New Models for Information Supply for Science and Research in Germany”, the Houghton Report for Germany provides a general cost and benefit analysis for scientific communication in Germany comparing different scenarios according to their specific costs and explicitly including the German National License Program (NLP).
Based on the scholarly lifecycle process model outlined by Björk (2007), the study compared the following scenarios according to their accounted costs:
- Traditional subscription publishing,
- Open access publishing (Gold Open Access; refers primarily to journal publishing where access is free of charge to readers, while the authors or funding organisations pay for publication)
- Open Access self-archiving (authors deposit their work in online open access institutional or subject-based repositories, making it freely available to anyone with Internet access; further divided into (i) ‘Green Open Access’ self-archiving operating in parallel with subscription publishing; and (ii) the ‘overlay services’ model in which self-archiving provides the foundation for overlay services (e.g. peer review, branding and quality control services))
- the NLP.
Within all scenarios, five core activity elements (fund research and research communication; perform research and communicate the results; publish scientific and scholarly works; facilitate dissemination, retrieval and preservation; study publications and apply the knowledge) were modelled and priced together with all their constituent activities.
Modelling the impacts of an increase in accessibility and efficiency resulting from more open access on returns to R&D over a 20-year period, and then comparing costs and benefits, we find that the benefits of open access publishing models are likely to substantially outweigh the costs and that, while smaller, the benefits of the German NLP also exceed its costs.
This analysis of the potential benefits of more open access to research findings suggests that different publishing models can make a material difference to the benefits realised, as well as the costs faced. It seems likely that more Open Access would have substantial net benefits in the longer term and, while net benefits may be lower during a transitional period, they are likely to be positive for both ‘author-pays’ Open Access publishing and the ‘overlay journals’ alternatives (‘Gold Open Access’), and for parallel subscription publishing and self-archiving (‘Green Open Access’). The NLP returns substantial benefits and savings at a modest cost, returning one of the highest benefit/cost ratios available from unilateral national policies during a transitional period (second to that of ‘Green Open Access’ self-archiving). Whether ‘Green Open Access’ self-archiving in parallel with subscriptions is a sustainable model over the longer term is debatable, and what impact the NLP may have on the take-up of Open Access alternatives is also an important consideration. So too is the potential for developments in Open Access or other scholarly publishing business models to significantly change the relative cost-benefit of the NLP over time.
The results are comparable to those of previous studies from the UK and the Netherlands. Green Open Access in parallel with the traditional model yields the best benefit/cost ratio. Besides its benefit/cost ratio, the NLP is attractive because of its enforceability. The true cost of toll-access publishing (beside the “buyback” of information) is that it bars society from access to research and knowledge.
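The 20-year benefit/cost comparison described above can be sketched as a simple discounted-cash-flow calculation per scenario. The discount rate and all benefit and cost figures below are invented placeholders chosen only to mirror the qualitative ranking reported; they are not the report's numbers.

```python
# Illustrative sketch of a Houghton-style benefit/cost comparison.
# All figures and the 3.5% discount rate are assumptions, not report data.

def npv(annual_amount, years=20, discount_rate=0.035):
    """Net present value of a constant annual amount over `years` years."""
    return sum(annual_amount / (1 + discount_rate) ** t
               for t in range(1, years + 1))

def benefit_cost_ratio(annual_benefit, annual_cost):
    """Ratio of discounted benefits to discounted costs."""
    return npv(annual_benefit) / npv(annual_cost)

# Hypothetical scenarios as (annual benefit, annual cost), units arbitrary:
scenarios = {
    "Gold OA": (120.0, 40.0),
    "Green OA (parallel)": (90.0, 10.0),
    "NLP": (30.0, 5.0),
}

ratios = {name: benefit_cost_ratio(b, c)
          for name, (b, c) in scenarios.items()}
```

With constant annual flows the discount factors cancel, so the ratio reduces to annual benefit over annual cost; the real model's benefit stream grows over time as accessibility and efficiency gains compound through returns to R&D, which is where the 20-year horizon matters.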
Presentation given at the symposium “Economy and Acceptance of Open Access Strategies”, held by the Frankfurt am Main University Library in cooperation with the 2011 Frankfurt Book Fair, on 14 October 2011.
It is now accepted that heart failure (HF) is a complex multifunctional disease rather than simply a hemodynamic dysfunction. Despite its complexity, stressed cardiomyocytes often follow conserved patterns of structural remodelling in order to adapt, survive, and regenerate. When cardiac adaptations cannot cope with mechanical, ischemic, and metabolic loads efficiently or become chronically activated, as, for example, after infection, then the ongoing structural remodelling and dedifferentiation often lead to compromised pump function and patient death. It is, therefore, of major importance to understand key events in the progression from a compensatory left ventricular (LV) systolic dysfunction to a decompensatory LV systolic dysfunction and HF. To achieve this, various animal models in combination with an “omics” toolbox can be used. These approaches will ultimately lead to the identification of an arsenal of biomarkers and therapeutic targets which have the potential to shape the medicine of the future.
Bipolar disorder (BD) is a genetically complex mental illness characterized by severe oscillations of mood and behavior. Genome-wide association studies (GWAS) have identified several risk loci that together account for a small portion of the heritability. To identify additional risk loci, we performed a two-stage meta-analysis of >9 million genetic variants in 9,784 bipolar disorder patients and 30,471 controls, the largest GWAS of BD to date. In this study, to increase power we used ~2,000 lithium-treated cases with a long-term diagnosis of BD from the Consortium on Lithium Genetics, excess controls, and analytic methods optimized for markers on the X chromosome. In addition to four known loci, results revealed genome-wide significant associations at two novel loci: an intergenic region on 9p21.3 (rs12553324, p = 5.87×10⁻⁹; odds ratio = 1.12) and markers within ERBB2 (rs2517959, p = 4.53×10⁻⁹; odds ratio = 1.13). No significant X-chromosome associations were detected and X-linked markers explained very little BD heritability. The results add to a growing list of common autosomal variants involved in BD and illustrate the power of comparing well-characterized cases to an excess of controls in GWAS.
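The association statistics quoted above (odds ratio with a genome-wide significance p-value) come from comparing risk-allele frequencies in cases versus controls. A minimal sketch of how an odds ratio and its Wald test statistic are computed from a 2×2 table of allele counts is below; the counts are invented for illustration and are not data from this study.

```python
import math

# Hedged sketch: odds ratio and Wald z for a risk allele from a 2x2 table
# of allele counts (cases vs. controls). Counts are invented placeholders.

def odds_ratio(case_risk, case_other, control_risk, control_other):
    """OR = (a/b) / (c/d) for allele counts a, b in cases and c, d in controls."""
    return (case_risk / case_other) / (control_risk / control_other)

def log_or_z(case_risk, case_other, control_risk, control_other):
    """Wald z-statistic for ln(OR), using the standard error
    sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_value = odds_ratio(case_risk, case_other, control_risk, control_other)
    se = math.sqrt(1 / case_risk + 1 / case_other
                   + 1 / control_risk + 1 / control_other)
    return math.log(or_value) / se

# Invented allele counts chosen to give an OR near the 1.12 quoted above:
a, b = 5600, 14000    # risk / non-risk alleles in cases (assumed)
c, d = 15750, 44100   # risk / non-risk alleles in controls (assumed)
or_val = odds_ratio(a, b, c, d)
z = log_or_z(a, b, c, d)
```

A GWAS meta-analysis repeats this kind of per-marker test across millions of variants and then combines effect estimates across stages, which is why the genome-wide significance threshold (conventionally p < 5×10⁻⁸) is so stringent.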