Investigators in the cognitive neurosciences have turned to Big Data to address persistent replication and reliability issues by increasing sample sizes, statistical power, and the representativeness of data. While there is tremendous potential to advance science through open data sharing, these efforts unveil a host of new questions about how to integrate data arising from distinct sources and instruments. We focus on the most frequently assessed area of cognition, memory testing, and demonstrate a process for reliable data harmonization across three common measures. We aggregated raw data from 53 studies worldwide that measured at least one of three distinct auditory verbal learning tasks (AVLTs), totaling N = 10,505 healthy and brain-injured individuals. A mega-analysis was conducted using empirical Bayes harmonization to isolate and remove site effects, followed by linear models adjusting for common covariates. After corrections, a continuous item response theory (IRT) model estimated each individual subject's latent verbal learning ability while accounting for item difficulties. Harmonization significantly reduced inter-site variance, by 37%, while preserving covariate effects. The effects of age, sex, and education on scores were highly consistent across memory tests. IRT methods for equating scores across AVLTs agreed with held-out data from dually administered tests, and these tools are made freely available online. This work demonstrates that large-scale data sharing and harmonization initiatives can offer opportunities to address reproducibility and integration challenges across the behavioral sciences.
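The site-correction step described above can be illustrated with a minimal location-scale sketch. This is a simplified stand-in for the empirical Bayes (ComBat-style) harmonization the abstract describes, which additionally shrinks per-site estimates toward a common prior; the site names and scores below are hypothetical.

```python
# A minimal sketch of location-scale site harmonization: z-score within site,
# then rescale to the pooled distribution. Simplified stand-in for empirical
# Bayes (ComBat-style) harmonization; all data here are hypothetical.
from statistics import mean, stdev

def harmonize(scores_by_site):
    """Remove additive and multiplicative site effects from raw scores."""
    pooled = [s for scores in scores_by_site.values() for s in scores]
    g_mean, g_sd = mean(pooled), stdev(pooled)
    out = {}
    for site, scores in scores_by_site.items():
        m, sd = mean(scores), stdev(scores)
        # standardize within site, then map onto the pooled scale
        out[site] = [g_mean + g_sd * (s - m) / sd for s in scores]
    return out

raw = {
    "site_A": [42.0, 48.0, 51.0, 45.0],   # systematically high-scoring site
    "site_B": [30.0, 36.0, 33.0, 27.0],   # systematically low-scoring site
}
harmonized = harmonize(raw)
# After harmonization both sites share the pooled mean, so the site effect
# (difference in site means) is removed while within-site ordering is kept.
```

Real empirical Bayes harmonization would pool information across sites when estimating each site's location and scale, which matters when some sites contribute few subjects.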
Plants, fungi and algae are important components of global biodiversity and are fundamental to all ecosystems. They are the basis for human well-being, providing food, materials and medicines. Specimens of all three groups of organisms are accommodated in herbaria, where they are commonly referred to as botanical specimens. The large number of specimens in herbaria provides an ample, permanent and continuously improving knowledge base on these organisms and an indispensable source for analysing the distribution of species in space and time, which is critical for current and future research on global biodiversity. To make full use of this resource, a research infrastructure has to be built that grants comprehensive and free access to the information in herbaria and botanical collections in general. This can be achieved through digitization of the botanical objects and their associated data. The botanical research community can count on a long-standing tradition of collaboration among institutions and individuals. It agreed on data standards and standard services even before the advent of computerization and information networking, one example being the Index Herbariorum, a global registry of herbaria that helps to uniquely identify specimens cited in the literature. In the spirit of this collaborative history, 51 representatives from 30 institutions advocate starting the digitization of botanical collections with the complete, wall-to-wall digitization of the flat objects stored in German herbaria. According to a national survey carried out in 2019, Germany has 70 herbaria holding almost 23 million specimens, 87% of which are not yet digitized.
Experience in other countries such as France, the Netherlands, Finland, the US and Australia shows that herbaria can be comprehensively and cost-efficiently digitized in a relatively short time, thanks to established workflows and protocols for the high-throughput digitization of flat objects. Most of the herbaria are part of a university (34); fewer belong to municipal museums (10) or state museums (8); six belong to institutions also supported by federal funds, such as Leibniz institutes; and four belong to non-governmental organizations. A common data infrastructure must therefore integrate different kinds of institutions. Making full use of the data gained by digitization requires setting up a digital infrastructure for storage, archiving, content indexing and networking, as well as standardized access for the scientific use of digital objects. A standards-based portfolio of technical components has already been developed and successfully tested by the biodiversity informatics community over the last two decades, comprising, among others, access protocols, collection databases, portals, tools for semantic enrichment and annotation, international networking, and storage and archiving in accordance with international standards. This was achieved through funding by national and international programs and initiatives, which also paved the road for the German contribution to the Global Biodiversity Information Facility (GBIF). Herbaria constitute a large part of the German botanical collections, which also comprise living collections in botanical gardens and seed banks, DNA and tissue samples, specimens preserved in fluids or on microscope slides, and more. Once the herbaria are digitized, these resources can be integrated, adding to the value of the overall research infrastructure.
The community has agreed on tasks that are shared between the herbaria, as the German GBIF model already successfully demonstrates. We have compiled nine scientific use cases of immediate societal relevance for an integrated infrastructure of botanical collections. They address accelerated biodiversity discovery and research, biomonitoring and conservation planning, biodiversity modelling, the generation of trait information, automated image recognition by artificial intelligence, automated pathogen detection, contextualization by interlinking objects, enabling provenance research, as well as education, outreach and citizen science. We propose to start this initiative now in order to valorize German botanical collections as a vital part of a worldwide biodiversity data pool.
New neutron cross section measurements of minor actinides have been performed recently in order to reduce the uncertainties in the evaluated data, which is important for the design of advanced nuclear reactors and, in particular, for determining their performance in the transmutation of nuclear waste. We have measured the 241Am(n,γ) cross section at the n_TOF facility between 0.2 eV and 10 keV with a BaF2 Total Absorption Calorimeter, and the analysis of the measurement has recently been concluded. Below 20 eV, our results are in reasonable agreement with those published by C. Lampoudis et al. in 2013, who reported a capture cross section up to 110 eV that is 22% larger than experimental and evaluated data published before. Our results also indicate that the 241Am(n,γ) cross section is underestimated in the present evaluated libraries between 20 eV and 2 keV by 25% on average, and by up to 35% for certain evaluations and energy ranges.
Is there a modern religion? : Jürgen Habermas and the idea of the "post-secular society"
(2009)
The relationship between religion and modernity has recently again become a heated source of conflict. The dispute over the Society of St. Pius X is, at its core, about whether a religious tradition can maintain the continuity and binding force of its heritage while at the same time connecting to essential insights and normative principles of modernity. The traditionalist critics of the Second Vatican Council claim that religious institutions such as the Catholic Church lose their identity to the degree that they develop an affirmative and constructive relationship to modern society. The Council's recognition of human rights and of the ideas of the French Revolution, that is, the acceptance of the principle of liberty in the form of religious freedom, of equality as the equal standing and equal worth of all religions, and of fraternity in the sense of a common and solidary responsibility for the world shared by "all people of good will", constitutes for the reactionary critics the real scandal of the Church's opening to secular modernity. The clumsy attempts, outrageous to many, to resocialize the Pius brothers are ultimately incomplete and imperfect ways of responding to a cultural and social constellation for which Jürgen Habermas has coined the apt term "post-secular society". According to Habermas, this situation is characterized by the fact that religious communities establish themselves durably within a modern lifeworld and persist in it. We must, Habermas argues, take leave of the notion of a linear historical process that inevitably leads to the withering away of religion. At the same time, societal secularization, in the sense of a differentiation of social systems and a pluralization of worldviews, continues to advance.
Neuroscientists call for a disillusioned handling of concepts such as free will and consciousness. Philosophers openly criticize the theses of brain researchers. Are these positions irreconcilably opposed? Where are there possibilities for rapprochement, or even cooperation? The philosopher of religion Prof. Dr. Thomas M. Schmidt and the biologist Stefan Kieß survey the situation in Frankfurt; their interlocutors are the brain researcher Prof. Dr. Wolf Singer (left), Director at the Max Planck Institute for Brain Research, and Prof. Dr. Marcus Willaschek (right), philosopher at the University of Frankfurt.
Background: The potential anti-cancer effects of mammalian target of rapamycin (mTOR) inhibitors are being intensively studied. To date, however, few randomised clinical trials (RCTs) have been performed to demonstrate anti-neoplastic effects in the pure oncology setting, and at present no oncology endpoint-directed RCT has been reported in the high-malignancy-risk population of immunosuppressed transplant recipients. Interestingly, since mTOR inhibitors have both immunosuppressive and anti-cancer effects, they have the potential to simultaneously protect against immunologic graft loss and tumour development. We therefore designed a prospective RCT to determine whether the mTOR inhibitor sirolimus can improve hepatocellular carcinoma (HCC)-free patient survival in liver transplant (LT) recipients with a pre-transplant diagnosis of HCC. Methods: The study is an open-label RCT comparing sirolimus-containing versus mTOR-inhibitor-free immunosuppression in patients undergoing LT for HCC. Patients with a histologically confirmed HCC diagnosis are randomised into two groups within 4-6 weeks after LT; one arm is maintained on a centre-specific mTOR-inhibitor-free immunosuppressive protocol throughout, while the second arm follows the same centre-specific mTOR-inhibitor-free protocol only for the first 4-6 weeks, at which time sirolimus is initiated. A 3-year recruitment phase is planned with a 5-year follow-up, testing HCC-free survival as the primary endpoint. Our hypothesis is that sirolimus use in the second arm of the study will improve HCC-free survival. The study is a non-commercial investigator-initiated trial (IIT) sponsored by the University Hospital Regensburg and is endorsed by the European Liver and Intestine Transplant Association; 13 countries within Europe, Canada and Australia are participating.
Discussion: If our hypothesis is correct that mTOR inhibition can reduce HCC tumour growth while simultaneously providing immunosuppression to protect the liver allograft from rejection, patients should experience fewer post-transplant problems with HCC recurrence and could therefore expect a longer life and a better quality of life. A positive outcome would likely change the standard of post-transplant immunosuppressive care for LT patients with HCC. (Trial registered at www.clinicaltrials.gov: NCT00355862; EudraCT number: 2005-005362-36)
The radiative capture cross section of 238U is very important for the development of new reactor technologies and for the safety of existing ones. Here, preliminary results of the 238U(n,γ) cross section measurement performed at n_TOF with C6D6 scintillation detectors are presented, paying particular attention to data reduction and background subtraction.
We have measured the radiative neutron-capture cross section and the total neutron-induced cross section of 25Mg, one of the most important isotopes for the s process. The measurements were carried out at the neutron time-of-flight facilities n_TOF at CERN (Switzerland) and GELINA at the EC-JRC-IRMM (Belgium). The cross sections have been measured as a function of neutron energy up to approximately 300 keV, covering the energy region of interest for the s process. The data analysis is ongoing, and preliminary results show the potential relevance for the s process.
Above 1 MeV of incident neutron energy, the fission fragment angular distribution (FFAD) generally shows a strongly anisotropic behavior due to the combination of the incident orbital angular momentum and the intrinsic spin of the fissioning nucleus. This effect has to be taken into account in the efficiency estimation of devices used for fission cross section measurements. In addition, it carries information on the spin deposition mechanism and on the structure of transitional states. We designed and constructed a detection device based on Parallel Plate Avalanche Counters (PPACs) for measuring the fission fragment angular distributions of several isotopes, in particular 232Th. The measurement was performed at n_TOF at CERN, taking advantage of the very broad energy spectrum of the neutron beam. Fission events were identified by back-to-back detection in coincidence in two position-sensitive detectors surrounding the targets. The detection efficiency, which depends mostly on the stopping of fission fragments in backings and electrodes, was computed with a Geant4 simulation and validated by comparison with the measured case of 235U below 3 keV, where the emission is isotropic. In the case of 232Th, the result is in good agreement with previous data below 10 MeV, with a good reproduction of the structures associated with vibrational states and with the opening of second-chance fission. In the 14 MeV region, our data are much more accurate than previous ones, which are widely scattered.
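The anisotropy mentioned above is conventionally described by an expansion of the angular distribution in even Legendre polynomials. The toy sketch below (not the paper's analysis code) evaluates a second-order expansion W(θ) ≈ 1 + A2·P2(cos θ); the coefficient A2 = 0.4 is an arbitrary example value, not a measured one.

```python
# Toy illustration of the Legendre expansion commonly used for fission
# fragment angular distributions, truncated at second order:
#   W(theta) ~ 1 + A2 * P2(cos theta)
# A2 = 0.4 below is an arbitrary example coefficient, not measured data.
import math

def p2(x):
    """Second Legendre polynomial P2(x) = (3x^2 - 1)/2."""
    return 0.5 * (3.0 * x * x - 1.0)

def w(theta_deg, a2):
    """Angular distribution W(theta), second-order Legendre expansion."""
    return 1.0 + a2 * p2(math.cos(math.radians(theta_deg)))

a2 = 0.4
# Anisotropy ratio often quoted as W(0 deg)/W(90 deg); values above 1
# indicate forward/backward-peaked fragment emission.
anisotropy = w(0.0, a2) / w(90.0, a2)
# For a2 = 0, W(theta) = 1 everywhere: the isotropic case used for the
# 235U efficiency validation mentioned in the abstract.
```

Higher even orders (P4, P6, ...) enter when higher-K transitional states contribute, which is why measured distributions are usually fitted with more terms.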
Activation of TRPC6 channels is essential for lung ischaemia–reperfusion induced oedema in mice
(2012)
Lung ischaemia–reperfusion-induced oedema (LIRE) is a life-threatening condition in which endothelial dysfunction causes pulmonary oedema. Here we show that lungs from mice lacking nicotinamide adenine dinucleotide phosphate (NADPH) oxidase (Nox2y/−) or the classical transient receptor potential channel 6 (TRPC6−/−) are protected from LIRE. Generation of chimeric mice by bone marrow cell transplantation and endothelial-specific Nox2 deletion showed that endothelial Nox2, but not leukocytic Nox2 or TRPC6, is responsible for LIRE. Lung endothelial cells from Nox2- or TRPC6-deficient mice showed attenuated ischaemia-induced Ca2+ influx and cellular shape changes, and impaired barrier function. Production of reactive oxygen species was completely abolished in Nox2y/− cells. A novel mechanistic model comprising endothelial Nox2-derived production of superoxide, activation of phospholipase C-γ, inhibition of diacylglycerol (DAG) kinase, DAG-mediated activation of TRPC6 and the ensuing LIRE is supported by pharmacological and molecular evidence. This mechanism highlights novel pharmacological targets for the treatment of LIRE.
The current crisis of democracy becomes particularly visible in the "symbolic dimension of political representation". This is the view Paula Diehl advances in her essay Demokratische Repräsentation und ihre Krise. "In images, stagings and discourses, both democratizing and anti-democratic concepts are 'tested'. If they resonate with the public and the population, the situation can develop in one direction or the other. For symbols activate conceptions of the political order, of representatives, of citizens, of the state, and also of how political institutions are supposed to function." So writes Paula Diehl in the essay in question, which bundles reflections that she developed systematically in her study Das Symbolische, das Imaginäre und die Demokratie. In that work, Diehl analyzes the connection between the normative structures and the symbolic practice of a democratically constituted political community. The two condition each other: the normative structure of a society finds the ground of its validity and its stability in symbolic practice, while symbolic practice must in turn be understood as an expression of the principles and rules of the basic normative structure. A crisis of the political is thus to be understood as the result and expression of a disturbance in this reciprocal relation between the normative structure and the symbolic practice of the political community. ...
What place does faith hold within the range of human convictions? How do knowledge and certainty relate to each other? Modern science is often equated with factual knowledge, yet science is inconceivable without reflection on the world. Such reflection takes place, for example, in the philosophy of religion.
Introduction: We examined whether a combination of proliferation markers and estrogen receptor (ER) activity could predict early versus late relapses in ER-positive breast cancer and inform the choice and length of adjuvant endocrine therapy.
Methods: Baseline Affymetrix gene-expression profiles were analyzed from ER-positive patients who received no systemic therapy (n = 559) or adjuvant tamoxifen for 5 years (cohort 1: n = 683; cohort 2: n = 282), and from 58 patients treated with neoadjuvant letrozole for 3 months (gene expression available at baseline and at 14 and 90 days). A proliferation score based on the expression of mitotic kinases (MKS) and an ER-related score (ERS) adopted from Oncotype DX® were calculated. The same analysis was performed using the Genomic Grade Index as the proliferation marker and the luminal gene score from the PAM50 classifier as the measure of estrogen-related genes. Median values were used to define low and high marker groups, and four combinations were created. Relapses were grouped into time cohorts of 0-2.5, 0-5 and 5-10 years.
Results: Over the full 10-year period, the proportional hazards assumption was violated for several biomarker groups, indicating time-dependent effects. In tamoxifen-treated patients, Low-MKS/Low-ERS cancers had a continuously increasing risk of relapse that was higher after 5 years than that of Low-MKS/High-ERS cancers [0 to 10 years, HR 3.36; p = 0.013]. High-MKS/High-ERS cancers had a low risk of early relapse [0-2.5 years, HR 0.13; p = 0.0006] but a high risk of late relapse, which was higher than in the High-MKS/Low-ERS group [after 5 years, HR 3.86; p = 0.007]. The High-MKS/Low-ERS subset had most of the early relapses [0 to 2.5 years, HR 6.53; p < 0.0001], especially in node-negative tumors, and showed minimal response to neoadjuvant letrozole. These findings were qualitatively confirmed in a smaller independent cohort of tamoxifen-treated patients. Using different biomarkers provided similar results.
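For readers unfamiliar with hazard ratios, a crude incidence-rate-ratio approximation conveys the idea behind the time-window comparisons above. All event counts and person-years below are invented for illustration; they are not the study's data, and a real analysis would use a Cox model with time-dependent effects.

```python
# Illustrative only: a crude hazard-ratio approximation from event counts
# and person-years in two hypothetical biomarker groups. Numbers are made
# up; the study itself fitted Cox models with time-dependent effects.

def rate(events, person_years):
    """Incidence rate: events per person-year of follow-up."""
    return events / person_years

def hazard_ratio(events_a, py_a, events_b, py_b):
    """Ratio of incidence rates, a rough stand-in for a Cox-model HR
    when hazards are roughly constant within the time window."""
    return rate(events_a, py_a) / rate(events_b, py_b)

# Hypothetical early window (0-2.5 y): many relapses in group A, few in B.
early_hr = hazard_ratio(40, 500.0, 8, 520.0)   # well above 1
# Hypothetical late window (5-10 y): the pattern reverses.
late_hr = hazard_ratio(6, 400.0, 25, 430.0)    # below 1
```

A hazard ratio that is large in one window and small in another is exactly the kind of time-dependence that violates the proportional hazards assumption mentioned above.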
Conclusions: The risk of early relapse is highest in highly proliferative/low-ERS cancers, in particular in node-negative tumors. Relapses occurring after 5 years of adjuvant tamoxifen are most frequent among highly proliferative/high-ERS tumors, although their risk of recurrence is modest during the first 5 years on tamoxifen. These tumors could be the best candidates for extended endocrine therapy.
Endothelial tip cells are essential for VEGF-induced angiogenesis, but underlying mechanisms are elusive. The Ena/VASP protein family, consisting of EVL, VASP, and Mena, plays a pivotal role in axon guidance. Given that axonal growth cones and endothelial tip cells share many common features, from the morphological to the molecular level, we investigated the role of Ena/VASP proteins in angiogenesis. EVL and VASP, but not Mena, are expressed in endothelial cells of the postnatal mouse retina. Global deletion of EVL (but not VASP) compromises the radial sprouting of the vascular plexus in mice. Similarly, endothelial-specific EVL deletion compromises the radial sprouting of the vascular plexus and reduces the endothelial tip cell density and filopodia formation. Gene sets involved in blood vessel development and angiogenesis are down-regulated in EVL-deficient P5-retinal endothelial cells. Consistently, EVL deletion impairs VEGF-induced endothelial cell proliferation and sprouting, and reduces the internalization and phosphorylation of VEGF receptor 2 and its downstream signaling via the MAPK/ERK pathway. Together, we show that endothelial EVL regulates sprouting angiogenesis via VEGF receptor-2 internalization and signaling.
Calibration of TCCON column-averaged CO₂: the first aircraft campaign over European TCCON sites
(2011)
The Total Carbon Column Observing Network (TCCON) is a ground-based network of Fourier Transform Spectrometer (FTS) sites around the globe, where the column abundances of CO2, CH4, N2O, CO and O2 are measured. CO2 is constrained with a precision better than 0.25% (1-σ). To achieve a similarly high accuracy, calibration to World Meteorological Organization (WMO) standards is required. This paper introduces the first aircraft calibration campaign of five European TCCON sites and a mobile FTS instrument. A series of in-situ profiles traceable to WMO standards were obtained over European TCCON sites via aircraft and compared with retrievals of CO2 column amounts from the TCCON instruments. The results of the campaign show that the FTS measurements are consistently biased 1.1% ± 0.2% low with respect to WMO standards, in agreement with previous TCCON calibration campaigns. The standard a priori profile for the TCCON FTS retrievals is shown not to add a bias: the same calibration factor is obtained using the aircraft profiles as a priori as with the TCCON standard a priori. With a calibration to WMO standards, the highly precise TCCON CO2 measurements of total column concentrations provide a suitable database for the calibration and validation of nadir-viewing satellites.
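The calibration described above boils down to a single multiplicative factor tying FTS retrievals to the WMO scale. A schematic sketch, assuming the factor is taken as the mean ratio of FTS-retrieved to aircraft-integrated columns; all column values below are invented for illustration, not campaign data.

```python
# Schematic derivation of a single calibration factor as the mean ratio of
# FTS-retrieved to aircraft-integrated CO2 columns. Values are made up for
# illustration (column-averaged CO2 in ppm), not actual campaign data.
fts_xco2      = [383.1, 384.0, 382.5, 385.2, 383.8]   # FTS retrievals
aircraft_xco2 = [387.4, 388.2, 386.8, 389.5, 388.1]   # WMO-traceable profiles

ratios = [f / a for f, a in zip(fts_xco2, aircraft_xco2)]
cal_factor = sum(ratios) / len(ratios)
bias_percent = (cal_factor - 1.0) * 100.0
# A factor near 0.989 corresponds to the ~1.1% low bias reported above;
# dividing the FTS retrievals by cal_factor ties them to the WMO scale.
corrected = [f / cal_factor for f in fts_xco2]
```

In practice the factor is derived from a fit over many coincident profiles with uncertainty propagation, but the principle is this ratio.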
The accuracy of the neutron capture cross sections of fissile isotopes must be improved for the design of future nuclear systems such as Gen-IV reactors and Accelerator Driven Systems. The High Priority Request List of the Nuclear Energy Agency, which lists the most important nuclear data requirements, also includes the neutron capture cross sections of fissile isotopes such as 233,235U and 239,241Pu. A dedicated experimental setup has been used at the CERN n_TOF facility to measure the neutron capture cross section of 235U, with a set of MicroMegas fission detectors placed inside a segmented BaF2 Total Absorption Calorimeter.
The study of neutron-induced reactions is of high relevance in a wide variety of fields, ranging from stellar nucleosynthesis and fundamental nuclear physics to applications of nuclear technology. In nuclear energy, high-accuracy neutron data are needed for the development of Generation IV fast reactors and accelerator-driven systems, the latter aimed specifically at nuclear waste incineration, as well as for research on innovative fuel cycles. In this context, the high-luminosity neutron time-of-flight facility n_TOF has been operating at CERN for more than a decade, with the aim of providing new, high-accuracy and high-resolution neutron cross sections. Thanks to the features of the neutron beam, a rich experimental program relevant to nuclear technology has been carried out so far. The program will be further expanded in the near future, in particular thanks to a new high-flux experimental area, now under construction.
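The time-of-flight technique underlying the facility converts a measured flight time over a known path into a neutron kinetic energy. A non-relativistic sketch (adequate well below 1 MeV); the 185 m flight path is quoted here as an approximate, assumed value.

```python
# Non-relativistic time-of-flight energy reconstruction: E = (1/2) m v^2
# with v = L / t. The 185 m flight path is an approximate, assumed value.
M_N = 1.674927e-27      # neutron mass, kg (CODATA value)
EV  = 1.602177e-19      # joules per electronvolt

def neutron_energy_ev(flight_path_m, time_s):
    """Kinetic energy (eV) from flight path and measured time of flight."""
    v = flight_path_m / time_s
    return 0.5 * M_N * v * v / EV

# A 1 eV neutron travels at roughly 13.8 km/s, so over 185 m it needs
# about 13.4 ms; higher energies correspond to shorter flight times.
t = 185.0 / 13.83e3
e_ev = neutron_energy_ev(185.0, t)   # close to 1 eV
```

Above roughly 1 MeV a relativistic correction becomes noticeable, which is why facility analyses use the exact relativistic relation there.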