Language corpora are today indispensable sources of information available to linguists for the study and comparison of languages, and efficient linguistic research and data collection can hardly do without them. Corpora can, however, also be used for the effective teaching of any language. The rapid spread of computer technology and fast access to information have strongly influenced the immense development in the field of language processing and have also brought about a rapid development of corpus linguistics. Against this background, the insufficient teaching of the relevant knowledge and the lack of practical skills in handling corpus-linguistic tools among students of philological disciplines appear as major deficits. The publication, conceived as a textbook, has a good chance of filling this gap in the range of teaching materials...
The colloquium on lexicography and dictionary research was first held in 2000 by Herbert Ernst Wiegand and Pavel Petkov. Since then, lexicographers have met every two years to discuss current topics in lexicography and dictionary research. The colloquium offers an international forum for the exchange of experience and the presentation of research results...
Biodiversity is unevenly distributed on Earth and hotspots of biodiversity are often associated with areas that have undergone orogenic activity during recent geological history (i.e. tens of millions of years). Understanding the underlying processes that have driven the accumulation of species in some areas and not in others may help guide prioritization in conservation and may facilitate forecasts on ecosystem services under future climate conditions. Consequently, the study of the origin and evolution of biodiversity in mountain systems has motivated growing scientific interest. Despite an increasing number of studies, the origin and evolution of diversity hotspots associated with the Qinghai-Tibetan Plateau (QTP) remains poorly understood. We review literature related to the diversification of organisms linked to the uplift of the QTP. To promote hypothesis-based research, we provide a geological and palaeoclimatic scenario for the region of the QTP and argue that further studies would benefit from providing a complete set of complementary analyses (molecular dating, biogeographic, and diversification rates analyses) to test for a link between organismic diversification and past geological and climatic changes in this region. In general, we found that the contribution of biological interchange between the QTP and other hotspots of biodiversity has not been sufficiently studied to date. Finally, we suggest that the biological consequences of the uplift of the QTP would be best understood using a meta-analysis approach, encompassing studies on a variety of organisms (plants and animals) from diverse habitats (forests, meadows, rivers), and thermal belts (montane, subalpine, alpine, nival). Since the species diversity in the QTP region is better documented for some organismic groups than for others, we suggest that baseline taxonomic work should be promoted.
Objective: To compare breech outcomes when mothers delivering vaginally are upright, on their back, or planning cesareans. Methods: A retrospective cohort study was undertaken of all women who presented for singleton breech delivery at a center in Frankfurt, Germany, between January 2004 and June 2011. Results: Of 750 women with term breech delivery, 315 (42.0%) planned and received a cesarean. Of 269 successful vaginal deliveries of neonates, 229 in the upright position were compared with 40 in the dorsal position. Upright deliveries were associated with significantly fewer delivery maneuvers (OR 0.45, 95% CI 0.31–0.68) and neonatal birth injuries (OR 0.08, 95% CI 0.01–0.58), second stages that were on average shorter (1 vs 1.75 hours), and nonsignificantly decreased serious perineal lacerations (OR 0.34, 95% CI 0.05–3.99). When upright position was used almost exclusively, the cesarean rate decreased. Serious fetal and neonatal morbidity potentially related to birth mode was low, and similar for upright vaginal deliveries compared with planned cesareans (OR 1.37, 95% CI 0.10–19.11). Three neonates died; all had lethal birth defects. Forceps were never required. Conclusion: Upright vaginal breech delivery was associated with reductions in duration of the second stage of labor, maneuvers required, maternal/neonatal injuries, and cesarean rate when compared with vaginal delivery in the dorsal position.
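The breech study above reports its findings as odds ratios with 95% confidence intervals (e.g. OR 0.45, 95% CI 0.31–0.68 for delivery maneuvers). As a minimal sketch of how such figures are typically derived from a 2×2 table, the helper below computes an odds ratio with a Wald confidence interval; the counts used are purely hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table [[a, b], [c, d]] with a Wald 95% CI.
    a/b: outcome present/absent in group 1; c/d: the same in group 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(10, 20, 30, 40)
```

A CI straddling 1 (as for the perineal-laceration result quoted above) is what marks an association as nonsignificant at the 5% level.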
The definitions that attempt to capture the survival paradigm as a pattern of thought shaping our view of the world are manifold, and the aspects that each observer singles out as dominant differ. Nevertheless, the modern understanding of 'survival' is ultimately rooted in the evolutionist discourse of the second half of the nineteenth century. By evolutionist discourse is meant not merely Darwin's works, but rather the constellation of authors, discourses, corrections, suggestions and additions that revolve around Darwin's theory of evolution and bear the name Darwinism. In other words: our understanding of the concept of survival is entangled in this discourse and cannot entirely free itself from it. This becomes most evident in reflections on the survival of cultural artefacts, which, by analogy with the specimens of existing species, are interpreted as the result of a 'natural' selection. Among the countless examples of a transfer of the law of selection from biological to cultural evolution, the reflection of Hans Blumenberg may be presented here, for it offers far more than one of the purely evolutionist models explaining cultural survival. In my view, no other post-war author has undertaken such an ambitious attempt to rehabilitate the Darwinian law of evolution beyond the fallacies of Social Darwinism and to make it fruitful for cultural and aesthetic anthropology. Moreover, with his theoretical correction Blumenberg ultimately achieved the description of a more humane model of cultural production, whose ethical dimension is explored in what follows.
Drawing on some observations on Primo Levi's narrative experience then serves to distil the fundamental ethical problem that remaining within this, albeit corrected, survival paradigm implies with regard to historical memory.
Since 2009, the central Nigerian Nok Culture – until then primarily known for its highly artistic terracotta figurines and early evidence of iron working in the first millennium BCE – has been the focus of a research project at Goethe University Frankfurt/Main, Germany. The analysis of Nok sculptures has so far been almost entirely restricted to their stylistic features, which show such great similarities that one hypothesis of the Frankfurt project has been the possible central production of these artfully crafted figurines.
This volume, written within the scope of a dissertation project completed in 2015, challenges this hypothesis by using scientific materials analysis. Combining the results of the mineralogical and geochemical analyses as well as geographic and geological observations, an alternative model for the organisation and procedure of the manufacture of the famous Nok terracottas is suggested.
Like the domestic pottery used for comparison and differentiation in this study, they were manufactured from locally available raw materials (clay and temper), but in different manufacturing sequences with regard to temper and clay composition. The terracottas' clay was evidently reserved for their production alone, demonstrating – beyond the stylistic similarities – the value these figurines held during the Nok Culture.
Invasive non-native species are key components of human-induced global environmental change and lead to a loss of biodiversity, alterations of species interactions and changes of ecosystem services. Freshwater ecosystems in particular are strongly affected by biological invasions, since they are spatially restricted environments and often already heavily impacted by anthropogenic activities. Recent human-induced species invasions are often characterized by long-distance dispersal, with many species having extended their native distribution range within a very short time frame. However, a long term view into the past shows that biological invasions are common phenomena in nature—representing the arrival of a species into a location in which it did not originally evolve—as a result of climatic changes, geotectonic activity or other natural events. Once a species arrives in a new habitat, it may experience an array of novel selection pressures resulting from abiotic and biotic environmental factors and simultaneously act as a novel selective agent on the native fauna. Consequences of species invasions are manifold. My thesis, which combines seven studies on different aspects of biological invasions, aims to explore the influence of abiotic stressors and biotic interactions during species introductions and range expansions, as well as the consequences of biological invasions on evolutionary and ecosystem processes.
The first part of my thesis examines human-induced biological invasions, dealing with basic ecological characteristics of invaded ecosystems, novel predator-prey interactions, functional consequences of species invasions and certain behavioral traits that may contribute to the invasiveness of some species. The second part of my thesis examines distribution patterns and phenotypic trait divergence in species that historically invaded new geographical areas. I investigated variation of abiotic and biotic selection factors along a stream gradient as well as ecological and evolutionary consequences of species invasions to extreme habitats. The results highlight the importance of simultaneously considering processes involved in natural invasions and during human-induced invasions to understand the success of invading species.
We often lack detailed information on the impacts of historical biological invasions. Also, we are currently lacking crucial knowledge about the time scales during which different mechanisms (behavioral flexibility, plastic phenotypic changes, and genetic adaptation) play a role during biological invasions and affect species exchange and establishment. Comparative analyses of historical, natural invasion and recent (man-made) invasions can provide insights into the relative importance of the processes governing adaptation to abiotic stressors and selection resulting from biotic interactions. Beyond their negative effects, the establishment of invasive species and the subsequent range expansion represent “natural experiments” to investigate fundamental questions in ecology and evolution. My comparison of natural and human-induced biological invasions revealed that in many cases preadaptation to altered abiotic conditions plays a key role during early stages of invasions and range expansions. Considering the evolutionary history of invasive species and the evolutionary history of the recipient native fauna might therefore help predict the consequences of biological invasions for the ecosystem under consideration and the future success of the invading species. This knowledge can also be implemented when formulating conservation strategies, including methods to mitigate and manage human-induced biological invasions.
The organic-rich Livello Bonarelli formed as a result of oxygen deficiency and carbonate dissolution in the oceans during the Cenomanian/Turonian (C/T) transition. During this Oceanic Anoxic Event 2 (OAE2), a combination of factors caused increased productivity, incomplete decomposition of organic matter and widespread deposition of black shales. Although these sediments are extensively studied, the exact extent, cause, timing and duration of oceanic anoxia are debated (Sinton and Duncan, 1997; Mitchell et al., 2008). Contrasting causal mechanisms have been suggested, including stratification of the water column (Lanci et al., 2010) versus intensification of the hydrological cycle driving a dynamic ocean circulation (Trabucho-Alexandre et al., 2010). Studies on trace-elemental and (radiogenic) isotope compositions of Cenomanian marine successions have suggested a volcanic origin of OAE2, by delivering nutrients to the semi-enclosed proto-North Atlantic (Zheng et al., 2013, and references therein; Du Vivier et al., 2014). Deciphering the importance of volcanic and oceanographic processes requires tight constraints on their relative timing. Regularly occurring black cherts and shales below the Livello Bonarelli demonstrate that oceanic conditions in the Umbria-Marche Basin were punctuated by episodes of regional anoxia from the mid-Cenomanian onwards. Their hierarchical stacking pattern suggests an orbital control on the deposition of organic-rich horizons (Mitchell et al., 2008; Lanci et al., 2010). Stable carbon isotope data reveal that long-term variations in eccentricity paced the carbon cycle (Sprovieri et al., 2013) and sea level changes (Voigt et al., 2006) of the Late Cretaceous. Here we investigate the role of orbital forcing on climate and the carbon cycle, and, specifically, on organic-rich sedimentation prior to, during, and after OAE2.
We also explore the potential for establishing an anchored astrochronology for the C/T interval in Europe. Recent improvements in the astronomical solution (La2011; Laskar et al., 2011b) and in the intercalibration of radiometric and astronomical dating techniques (Kuiper et al., 2008; Renne et al., 2013) allow the extension of the astronomical time scale into the Cretaceous. The C/T boundary in the Western Interior (USA) has been dated at 93.90 ± 0.15 Ma by intercalibration of radio-isotopic and astrochronologic time scales (Meyers et al., 2012b). Also, reinterpretation of proxy records spanning the C/T interval seems to resolve discrepancies in reported durations of the OAE2 (Sageman et al., 2006; Meyers et al., 2012a). The well-documented Italian rhythmic successions, reference sections for climatic processes in the Tethyan realm, need to be tied in with the absolute time scale. Biostratigraphic correlation to radioisotopically-dated ash beds in the Western Interior is complicated by the provinciality of faunas and floras. However, δ13C stratigraphy provides a reliable correlation tool (Gale et al., 2005) and we present a new 40Ar/39Ar age for the Thatcher bentonite from the Western Interior occurring within the mid-Cenomanian δ13C event (MCE). This study integrates the well-developed cyclostratigraphy from the Umbria-Marche Basin with radioisotopic ages from the Western Interior and derives a numerical timescale for this critical interval in Earth’s history.
The oceans at the time of the Cenomanian–Turonian transition were abruptly perturbed by a period of bottom-water anoxia. This led to the brief but widespread deposition of black organic-rich shales, such as the Livello Bonarelli in the Umbria–Marche Basin (Italy). Despite intensive studies, the origin and exact timing of this event are still debated. In this study, we assess leading hypotheses about the inception of oceanic anoxia in the Late Cretaceous greenhouse world by providing a 6 Myr long astronomically tuned timescale across the Cenomanian–Turonian boundary. We procure insights into the relationship between orbital forcing and the Late Cretaceous carbon cycle by deciphering the imprint of astronomical cycles on lithologic, physical properties, and stable isotope records, obtained from the Bottaccione, Contessa and Furlo sections in the Umbria–Marche Basin. The deposition of black shales and cherts, as well as the onset of oceanic anoxia, is related to maxima in the 405 kyr cycle of eccentricity-modulated precession. Correlation to radioisotopic ages from the Western Interior (USA) provides unprecedented age control for the studied Italian successions. The most likely tuned age for the base of the Livello Bonarelli is 94.17 ± 0.15 Ma (tuning 1); however, a 405 kyr older age cannot be excluded (tuning 2) due to uncertainties in stratigraphic correlation, radioisotopic dating, and orbital configuration. Our cyclostratigraphic framework suggests that the exact timing of major carbon cycle perturbations during the Cretaceous may be linked to increased variability in seasonality (i.e. a 405 kyr eccentricity maximum) after the prolonged avoidance of seasonal extremes (i.e. a 2.4 Myr eccentricity minimum). Volcanism is probably the ultimate driver of oceanic anoxia, but orbital periodicities determine the exact timing of carbon cycle perturbations in the Late Cretaceous. 
This unites two leading hypotheses about the inception of oceanic anoxia in the Late Cretaceous greenhouse world.
The LPJ-GUESS dynamic vegetation model uniquely combines an individual- and patch-based representation of vegetation dynamics with ecosystem biogeochemical cycling from regional to global scales. We present an updated version that includes plant and soil N dynamics, analysing the implications of accounting for C–N interactions on predictions and performance of the model. Stand structural dynamics and allometric scaling of tree growth suggested by global databases of forest stand structure and development were well reproduced by the model in comparison to an earlier multi-model study. Accounting for N cycle dynamics improved the goodness of fit for broadleaved forests. N limitation associated with low N-mineralisation rates reduces productivity of cold-climate and dry-climate ecosystems relative to mesic temperate and tropical ecosystems. In a model experiment emulating free-air CO2 enrichment (FACE) treatment for forests globally, N limitation associated with low N-mineralisation rates of colder soils reduces CO2 enhancement of net primary production (NPP) for boreal forests, while some temperate and tropical forests exhibit increased NPP enhancement. Under a business-as-usual future climate and emissions scenario, ecosystem C storage globally was projected to increase by ca. 10%; additional N requirements to match this increasing ecosystem C were within the high N supply limit estimated on stoichiometric grounds in an earlier study. Our results highlight the importance of accounting for C–N interactions in studies of global terrestrial N cycling, and as a basis for understanding mechanisms on local scales and in different regional contexts.
In this thesis, the production of charged kaons and Φ mesons in Au+Au collisions at √s_NN = 2.4 GeV is studied. At this energy, all particles carrying open and hidden strangeness are produced below their respective free nucleon–nucleon threshold, with the corresponding so-called excess energies: √s_exc(K+) = −0.15 GeV, √s_exc(K−) = −0.46 GeV, √s_exc(Φ) = −0.49 GeV. As a consequence, the production cross sections are very sensitive to medium effects such as momentum distributions, two- or multi-step collisions, and modification of the in-medium spectral distribution of the produced states [1]. K+ and K− mesons exhibit different properties in baryon-dominated matter, since only K− can be resonantly absorbed by nucleons. Although strangeness exchange reactions have been proposed as the dominant channel for K− production in the analyzed energy regime, the production yield and kinematic distributions could also be explained in smaller systems by statistical hadronization model fits to the measured particle yields, including a canonical strangeness suppression radius RC and taking the Φ feed-down to kaons into account [2, 3]. For the first time in central Au+Au collisions at such low energies, it is possible to reconstruct K− and Φ mesons and to perform a multi-differential analysis. In principle, this should be the ideal environment for strangeness exchange reactions to occur, as the particles are produced deeply sub-threshold in a large and long-lived system. It is therefore the ultimate test to differentiate between the different sources of K− production in heavy-ion collisions.
In total, 7.3×10⁹ of the 40% most central Au(1.23 GeV per nucleon)+Au collisions are analyzed. The data were recorded with the High Acceptance DiElectron Spectrometer (HADES), located at the Helmholtzzentrum für Schwerionenforschung (GSI), in April/May 2012. A substantially improved reconstruction method has been employed to reconstruct the hadrons with high purity over a wide phase-space region.
The estimated particle multiplicities follow a clear hierarchy of the excess energy: per event, 41.5 ± 2.1|sys protons at mid-rapidity per unit of rapidity, 11.1 ± 0.6|sys ± 0.4|extrapol π−, (3.01 ± 0.03|stat ± 0.15|sys ± 0.30|extrapol)×10⁻² K+, (1.94 ± 0.09|stat ± 0.10|sys ± 0.10|extrapol)×10⁻⁴ K− and (0.99 ± 0.24|stat ± 0.10|sys ± 0.05|extrapol)×10⁻⁴ Φ. The multiplicities of the strange hadrons increase more than linearly with the mean number of participating nucleons ⟨Apart⟩, supporting the assumption that the energy necessary to overcome the elementary production threshold is accumulated in multi-particle interactions. Transport models predict such an increase, but overestimate the measured particle yields and do not describe the kinematic distributions of K+ mesons perfectly. The best description is given by the IQMD model with a density-dependent kaon–nucleon potential of 40 MeV at nuclear ground-state density.
The K−/K+ multiplicity ratio is constant as a function of centrality and, with a value of (6.45 ± 0.77)×10⁻³, follows the trend of an increase with beam energy indicated by previous experiments [4]. The effective temperature of K−, Teff(K−) = (84 ± 6) MeV, is found to be systematically lower than that of K+, Teff(K+) = (104 ± 1) MeV, as has also been observed by other experiments.
The Φ/K− ratio, with a value of 0.52 ± 0.16, is higher than the ratios obtained at higher center-of-mass energies and in smaller systems. This behavior is predicted by a tuned version of the UrQMD transport model [5] when including higher-mass baryonic resonances that can decay into Φ mesons, and by statistical hadronization models when suppressing open strangeness canonically. The ratio is constant as a function of centrality and, combined with the branching ratio of 48.9%, implies that ~25% of all measured K− originate from Φ feed-down decays. A two-component PLUTO simulation, consisting of a purely thermal contribution and a K− contribution originating from Φ decays, can fully explain the lower effective temperature observed in comparison to K+ as well as the shape of the measured K− rapidity distribution. As a result, we find no indication of strangeness exchange reactions being the dominant mechanism for K− production in the SIS18 energy regime once the contribution from Φ feed-down decays is taken into account.
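The ~25% feed-down fraction quoted above follows directly from the two numbers given: the Φ/K− yield ratio and the Φ → K+K− branching ratio. A minimal sketch of that arithmetic, with a naive error propagation from the ratio's uncertainty alone (the thesis's full uncertainty treatment is not reproduced here):

```python
# Fraction of measured K- stemming from phi feed-down:
# f = (phi/K- yield ratio) * BR(phi -> K+ K-), using the values quoted above.
phi_to_kminus = 0.52      # yield ratio, +/- 0.16
br_phi_kk = 0.489         # branching ratio phi -> K+ K-
feeddown_fraction = phi_to_kminus * br_phi_kk   # ~0.25, i.e. ~25%

# Naive propagation of the yield ratio's relative uncertainty only:
feeddown_err = feeddown_fraction * (0.16 / 0.52)
```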
The hadron yields for the 20% most central collisions can be described by a statistical hadronization model fit with a chemical freeze-out temperature of Tchem = (68 ± 2) MeV and a baryochemical potential of μB = (883 ± 25) MeV, which is higher than expected from previous parameterizations. The analysis of the transverse-mass spectra of protons indicates a kinetic freeze-out temperature of Tkin = (70 ± 4) MeV and a radial flow velocity of βr = 0.43 ± 0.01, which is in agreement with the parameters obtained from the linear dependence of the effective temperatures on the particle mass, Tkin = (71.5 ± 4.2) MeV and βr = 0.28 ± 0.09.
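The "linear dependence of the effective temperatures on the particle mass" mentioned above is commonly parameterized as Teff ≈ Tkin + m·βr² (a blast-wave-style approximation; the exact fit form used in the thesis is not stated in this abstract, so this is an assumption). A short sketch evaluating it with the quoted parameters:

```python
def t_eff(mass_mev, t_kin=71.5, beta_r=0.28):
    """Effective temperature from the assumed linear mass ordering
    T_eff = T_kin + m * beta_r**2, with masses and temperatures in MeV."""
    return t_kin + mass_mev * beta_r**2

# K+ mass ~493.7 MeV: predicted T_eff ~110 MeV, in the vicinity of the
# measured (104 +/- 1) MeV quoted earlier in this abstract.
tk = t_eff(493.7)
```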
Strong seasonal variability of hygric and thermal soil conditions is a defining environmental feature in northern Australia. However, how such changes affect the soil–atmosphere exchange of nitrous oxide (N2O), nitric oxide (NO) and dinitrogen (N2) is still not well explored. By incubating intact soil cores from four sites (three savanna, one pasture) under controlled soil temperatures (ST) and soil moisture (SM) we investigated the release of the trace gas fluxes of N2O, NO and carbon dioxide (CO2). Furthermore, the release of N2 due to denitrification was measured using the helium gas flow soil core technique. Under dry pre-incubation conditions NO and N2O emissions were very low (<7.0 ± 5.0 μg NO-N m−2 h−1; <0.0 ± 1.4 μg N2O-N m−2 h−1) or in the case of N2O, even a net soil uptake was observed. Substantial NO (max: 306.5 μg N m−2 h−1) and relatively small N2O pulse emissions (max: 5.8 ± 5.0 μg N m−2 h−1) were recorded following soil wetting, but these pulses were short lived, lasting only up to 3 days. The total atmospheric loss of nitrogen was generally dominated by N2 emissions (82.4–99.3% of total N lost), although NO emissions contributed almost 43.2% to the total atmospheric nitrogen loss at 50% SM and 30 °C ST incubation settings (the contribution of N2 at these soil conditions was only 53.2%). N2O emissions were systematically higher for 3 of 12 sample locations, which indicates substantial spatial variability at site level, but on average soils acted as weak N2O sources or even sinks. By using a conservative upscale approach we estimate total annual emissions from savanna soils to average 0.12 kg N ha−1 yr−1 (N2O), 0.68 kg N ha−1 yr−1 (NO) and 6.65 kg N ha−1 yr−1 (N2). The analysis of long-term SM and ST records makes it clear that extreme soil saturation that can lead to high N2O and N2 emissions only occurs a few days per year and thus has little impact on the annual total.
The potential contribution of nitrogen released due to pulse events compared to the total annual emissions was found to be of importance for NO emissions (contribution to total: 5–22%), but not for N2O emissions. Our results indicate that the total gaseous release of nitrogen from these soils is low and clearly dominated by loss in the form of inert nitrogen. Effects of seasonally varying soil temperature and moisture were detected, but were found to be low due to the small amounts of available nitrogen in the soils (total nitrogen <0.1%).
Human bodies and spaces are mutually related and subject to very similar societal conditions of constitution. Nevertheless, the body and its significance for the construction and appropriation of spaces have so far hardly been addressed in geography. This contribution examines the dynamic interrelation of bodies and spaces from a feminist-poststructuralist perspective. Particular emphasis is placed on the significance of perception by others and self-perception for diverse strategies of spatial appropriation. At the same time, a theoretical approach is put forward for discussion that opens up new levels of observation and analysis.
Drawing on the example of a research project on the extension of the margins of the global agricultural market through the workings of agribusiness in Ghana, this paper explores what contribution ethnographic approaches can make to the study of quotidian market constructions in organizational settings. It demonstrates how ethnographies of marketization can be grasped conceptually, epistemologically and methodologically, as well as what practical and methodological challenges such a practice-oriented approach towards the everyday organization of markets might encounter. By doing so, the paper offers a methodological contribution to the interdisciplinary field of marketization studies. Moreover, this paper urges economic geographers to further harness the epistemological potential of ethnographic approaches.
Space and spatial relations are constructs. Ways of seeing, thinking and interpreting, shaped by current and historically evolved social conditions, influence the production of realities that are, not least, linguistically constructed. This contribution sketches the basic features of poststructuralism and postmodernism and explains the significance of the present issue for (human) geography. Furthermore, it introduces the individual contributions of the issue.
In the current globalization debate, metropolitan regions are understood as centers of decision-making, control and coordination of international significance. They "bundle" corresponding nodes (hubs), whose location, functional significance and regional reach determine the role and the development path of the metropolitan region. Frankfurt/Rhine-Main has grown into this role only over the last two decades. Today, node functions exist in three thematic fields: the innovation center, the financial center and the market(-information) place. The development path of the metropolitan region shows that it first had to attain national significance before it could gain international significance. However, the node functions of the Frankfurt/Rhine-Main metropolitan region in the three fields mentioned are "unsecured". It must therefore remain open whether the rise of Frankfurt/Rhine-Main to a European metropolitan region will affect the wider system of metropolitan regions in Europe.
The Standard Model is one of the greatest successes of modern theoretical physics. It describes the physics of elementary particles by means of three forces: the electromagnetic, the weak and the strong interactions. The electromagnetic and the weak interactions are rather well understood in comparison to the strong interaction.
The latter is as fundamental as the others: it is responsible for the formation of all hadrons, which are classified into mesons and baryons. A well-known example of the former is the pion; examples of the latter are the proton and the neutron, which form the nucleus of every atom. This fundamental force is believed to be described by Quantum Chromodynamics (QCD). According to this theory, hadrons are not elementary particles but are composed of quarks and gluons. The latter are the vector particles of the force and are thus bosons of spin 1; the former constitute matter and are fermions of spin 1/2. To describe the interaction, a new quantum number had to be introduced: the color charge, which exists in three different types (blue, green and red). The name was not chosen arbitrarily, as states built from three quarks of different colors are colorless, in the same way that mixing the three primary colors yields white. However, no colored structure has ever been observed experimentally; the quarks and gluons appear to be confined in colorless hadrons. This property of QCD is called confinement and results from a large coupling constant at low energy (or large distance). At high energy (or small distance), the perturbative analysis of QCD shows that the coupling constant is small and quarks and gluons are almost free. This property is called asymptotic freedom. That QCD describes both behaviors is one of its remarkable characteristics. However, neither phenomenon is fully understood, and one needs a method to study both the perturbative and the confining regime.
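Asymptotic freedom can be stated quantitatively through the standard one-loop running of the strong coupling, a textbook result quoted here only for illustration (it is not derived in this work):

```latex
\alpha_s(Q^2) \;=\; \frac{12\pi}{(33 - 2 n_f)\,\ln\!\left(Q^2/\Lambda_{\mathrm{QCD}}^2\right)}
```

For any realistic number of quark flavors $n_f \le 16$ the denominator is positive, so $\alpha_s$ decreases logarithmically as the momentum transfer $Q$ grows (asymptotic freedom) and becomes large as $Q$ approaches the scale $\Lambda_{\mathrm{QCD}}$, where perturbation theory breaks down and confinement sets in.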
The only known method which fulfills the above criteria is lattice QCD and, more generally, lattice quantum field theory (LQFT). It consists of a discretization of spacetime and a formulation of QCD on a four-dimensional Euclidean spacetime grid of spacing a. In this way, the theory is naturally regularized and mathematically well defined. Moreover, the path integral formalism allows the theory to be treated as a statistical mechanics system, which can be evaluated via a Markov chain Monte Carlo algorithm. This method was first suggested by Wilson in 1974 [1], and shortly afterwards Creutz performed the first numerical simulations of Yang-Mills theory [2] using a heat-bath Monte Carlo algorithm. The method is extremely demanding in computational power. In its early days it was criticized because the only feasible simulations involved non-physical values such as extremely large quark masses, large lattice spacing a, and no dynamical quarks. With the progress of computing hardware and the advent of supercomputers, studies have come close to the physical point, but one still needs to deal with discrete spacetime and finite volume. Several techniques have been developed to estimate the infinite-volume and continuum limits; the smaller the lattice spacing and the larger the volume, the better the extrapolation. Simulations remain very expensive, and at present a typical box size is L ≈ 4 fm with a ≈ 0.08 fm. However, it has been observed in simulations of pure Yang-Mills theory and of lower-dimensional models that the topology freezes at small a [3]; this was also observed recently in full QCD simulations [4,5].
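The Markov chain Monte Carlo idea mentioned above can be illustrated on a toy model. The sketch below runs a Metropolis update for a one-dimensional lattice scalar field rather than QCD itself; the action, step size, and couplings are illustrative assumptions, chosen only to show how configurations are sampled with probability proportional to exp(-S):

```python
import math
import random

def action(phi, m2=1.0, lam=0.1):
    """Euclidean lattice action of a 1-D scalar field with periodic
    boundary conditions (a toy stand-in for the QCD action)."""
    N = len(phi)
    S = 0.0
    for i in range(N):
        dphi = phi[(i + 1) % N] - phi[i]
        S += 0.5 * dphi**2 + 0.5 * m2 * phi[i]**2 + lam * phi[i]**4
    return S

def metropolis_sweep(phi, step=0.5, rng=random):
    """One Metropolis sweep: propose a local change at every site and
    accept it with probability min(1, exp(-dS)). Returns acceptance rate."""
    N = len(phi)
    accepted = 0
    for i in range(N):
        old = phi[i]
        new = old + rng.uniform(-step, step)
        left, right = phi[(i - 1) % N], phi[(i + 1) % N]
        # only the terms touching site i change (m2=1.0, lam=0.1 as above)
        def local(x):
            return (0.5 * (right - x)**2 + 0.5 * (x - left)**2
                    + 0.5 * x**2 + 0.1 * x**4)
        dS = local(new) - local(old)
        if dS <= 0 or rng.random() < math.exp(-dS):
            phi[i] = new
            accepted += 1
    return accepted / N

rng = random.Random(42)
phi = [0.0] * 32            # cold start
accs = [metropolis_sweep(phi, rng=rng) for _ in range(200)]
mean_acc = sum(accs) / len(accs)
```

Real lattice QCD replaces the scalar field by SU(3) gauge links and fermion determinants, but the accept/reject logic driven by the local change in the Euclidean action is the same.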
The typical lattice spacing at which this problem appears in QCD is a ≈ 0.05 fm, but this value depends on the quark mass used and on the algorithm. The freezing of topology leads to results which differ from physical results, and solving this issue is important for the future of LQCD [6]. Recently, several methods to overcome the problem have been suggested; one of the most popular is the use of open boundary conditions [7], but this promising method still has issues of its own, mainly the breaking of translation invariance.
After the reunification of the two German states in 1990, the largely nationalized land in the new federal states was returned to private ownership. Since restitution for National Socialist injustice had largely failed to take place in the GDR, the restitution rules were extended to cover expropriations dating back to 1933. The little-researched "Aryanization" of land ownership during National Socialism thus gained renewed relevance. As part of the National Socialist persecution of the Jews, "Aryanization" meant the complete displacement of Jews from the economy and thus also from the real estate sector. The result of "Aryanization" was "one of the largest redistributions of property in modern times". The aim of this article is to provide an overview of the legal regulations and the course of the "Aryanization" of land ownership and to illustrate this process with a case study of a specific quarter in the eastern part of Berlin.
Background: Despite a recent statutory ruling stating the binding nature of advance directives (ADs), only a minority of the population has signed one. Yet, a majority deem it of utmost importance to ensure their wishes are followed through in case they are no longer able to decide. The reasons for this discrepancy have not yet been investigated sufficiently.
Patients and methods: This article is based on a survey of patients using a well-established structured questionnaire. First, patients were asked about their attitudes with respect to six therapeutic options at the end of life: intravenous fluids, artificial feeding, antibiotics, analgesia, chemotherapy/dialysis, and artificial ventilation. Second, they were asked about negative effects associated with the idea of ADs, surveying three apprehensions: coercion to fulfill an AD, a dictatorial reading of what had been laid down, and abuse of ADs.
Results: A total of 1,260 interviewees completed the questionnaires. A significant percentage of interviewees were indecisive with respect to therapeutic options, ranging from 25% (analgesia) to 45% (artificial feeding). There was no connection to health status. Apprehensions about unwanted effects of ADs were widespread, at 51%, 35%, and 43% for coercion, dictatorial reading, and abuse, respectively.
Conclusion: A significant percentage of interviewees were unable to anticipate decisions about treatment options at the end of life. Apprehensions about negative adverse effects of ADs are widespread.
This essay proposes combining and extending analyses of (neoliberal) government with non-subject-centered and affect-theoretical approaches. Using an analysis of how social policy and social work deal with homeless people, it traces the gains that can result from combining governmental and affect-theoretical perspectives. From a governmental perspective, it first traces how affects and emotions become the object of caring intervention in spaces of assisted living for the homeless. In assisted living, micro-techniques are employed that work toward a "balanced" emotional attachment to living spaces and their inventory. Assisted living is pervaded by problematizations that interpret homelessness as an emotional attitude of restlessness and unrest, as a lack of attachment to places and things. At the same time, residents are often also accused of an excessive affective attachment to things, which supposedly prevents so-called "hoarders" and "messies" from keeping a socially inconspicuous household. A governmental analysis can make visible the therapeutic rationality underlying these problematizations. A governmental analysis alone, however, offers no way to develop alternative narratives about the significance of affective relationships for dwelling. Drawing on various affect-theoretical approaches, the essay therefore also pursues the question of how dwelling and the significance of attachments to places and things can be thought about beyond therapeutic perspectives. Non-subject-centered concepts of affectivity enable such alternative narratives and open up new lines of flight for critique: dwelling becomes visible as always already "assisted", embedded in a network of intersubjective and interobjective relations.
Background: Although the risk of developing colorectal cancer (CRC) is 2-4 times higher in case of a positive family history, risk-adapted screening programs for family members of CRC patients do not exist in the German health care system. CRC screening recommendations for persons under 55 years of age who have a family predisposition have been published in several guidelines.
The primary aim of this study is to determine the frequency of positive family history of CRC (1st degree relatives with CRC) among 40–54 year old persons in a general practitioner (GP) setting in Germany. Secondary aims are to detect the frequency of occurrence of colorectal neoplasms (CRC and advanced adenomas) in 1st degree relatives of CRC patients and to identify the variables (e.g. demographic, genetic, epigenetic and proteomic characteristics) that are associated with it. This study also explores whether evidence-based information contributes to informed decisions and how screening participation correlates with anxiety and (anticipated) regret.
Methods/Design: Prior to the beginning of the study, the GP team (GP and one health care assistant) in around 50 practices will be trained, and about 8,750 persons that are registered with them will be asked to complete the “Network against colorectal cancer” questionnaire. The 10 % who are expected to have a positive family history will then be invited to give their informed consent to participate in the study. All individuals with positive family history will be provided with evidence-based information and prevention strategies. We plan to examine each participant’s family history of CRC in detail and to collect information on further variables (e.g. demographics) associated with increased risk. Additional stool and blood samples will be collected from study-participants who decide to undergo a colonoscopy (n ~ 350) and then analyzed at the German Cancer Research Center (DKFZ) Heidelberg to see whether further relevant variables are associated with an increased risk of CRC. One screening list and four questionnaires will be used to collect the data, and a detailed statistical analysis plan will be provided before the database is closed (expected to be June 30, 2015).
Discussion: It is anticipated that when persons with a family history of colorectal cancer have been provided with professional advice by the practice team, there will be an increase in the availability of valid information on the frequency of affected individuals and an increase in the number of persons making informed decisions. We also expect to identify further variables that are associated with colorectal cancer. This study therefore has translational relevance from lab to practice.
Trial registration: German Clinical Trials Register DRKS00006277
Under neoliberalism, the scope for political action has narrowed for the creative subject condemned to self-management and self-valorization, and in the city run as an enterprise, too, political processes are increasingly subordinated to market logics and "constraints". Using the example of the disputes over the planning of a KulturCampus in Frankfurt am Main, and drawing on recent theories of the political, this article examines current forms of dissensus against hegemonic forms of entrepreneurial politics and explores new possibilities for political subjectivities in the creative city, as currently being tested in the context of the right-to-the-city movement and in performance studies, among others. In doing so, it pursues the question of the extent to which these new forms of resistance are capable of making the market-driven, post-democratic rules of politics themselves a topic, of making new subject positions articulable, and of making the city politically negotiable again.
At the 8th meeting of the young researchers' network "Stadt, Raum, Architektur", held on 9 and 10 November 2012 at the Institutes of Human Geography, Cultural Anthropology and European Ethnology of Goethe University Frankfurt am Main, scholars from the social sciences, humanities and spatial sciences discussed the thematic field of "diversity and plurality". Against the background of current debates on the conceptualization of, and practical engagement with, socio-cultural diversity, a productive exchange took place between the perspectives of urban planning, architectural studies, and social- and cultural-science urban and spatial research. This conference report discusses the results of this interdisciplinary engagement with the global discursive shift from "multiculturalism" to "diversity", and with the adoption of corresponding strategies in politics, business and society, drawing on theoretical approaches to "super-diversity", cosmopolitanism and transnationalism. Empirically, it takes up questions of place marketing, integration policies and the spatialization of diversity, as well as concrete practices of segregation, marginalization and the negotiation of difference. Finally, it discusses the conflicts and potentials of a "new diversity" from urban-planning, decolonial and poststructuralist perspectives.
In his lectures on governmentality, Foucault sketches the way in which the modern state governs "at a distance". This article presents that approach, grounds it in materialist terms, and on this basis discusses the concepts of risk and securitization. The usefulness of this approach is illustrated by the EU's current border and migration policy, and the spaces produced in this context are sketched.
Phylogenetic relationships of the primarily wingless insects are still considered unresolved. Even the most comprehensive phylogenomic studies that addressed this question did not yield congruent results. In order to get a grip on these problems, we here analyzed the sources of incongruence in these phylogenomic studies using an extended transcriptome dataset. Our analyses showed that unevenly distributed missing data can be severely misleading by inflating node support despite the absence of phylogenetic signal. In consequence, only decisive datasets should be used, which exclusively comprise data blocks containing all taxa whose relationships are addressed. Additionally, we employed Four-cluster Likelihood Mapping (FcLM) to measure the degree of congruence among genes of a dataset, as a measure of support alternative to bootstrap. FcLM showed incongruent signal among genes, which in our case is correlated neither with the functional class assignment of these genes nor with model misspecification due to unpartitioned analyses. The dataset analyzed here is currently the largest covering the primarily wingless insects, but it failed to elucidate their interordinal phylogenetic relationships. While this is unsatisfying from a phylogenetic perspective, we show that analyses of structure and signal within phylogenomic data can protect us from biased phylogenetic inferences due to analytical artefacts.
Recently, the conserved intracellular digestion mechanism 'autophagy' has been considered to be involved in early tumorigenesis, and its blockade has been proposed as an alternative treatment approach. However, there is an ongoing debate about whether blocking autophagy has positive or negative effects in tumor cells. Since there are only scarce data about the clinico-pathological relevance of autophagy in gliomas in vivo, we first established a cell-culture-based platform for the in vivo detection of the autophago-lysosomal components. We then investigated key autophagosomal (LC3B, p62, BAG3, Beclin1) and lysosomal (CTSB, LAMP2) molecules in 350 gliomas using immunohistochemistry, immunofluorescence, immunoblotting and qPCR. Autophagy was induced pharmacologically or by altering oxygen and nutrient levels. Our results show that autophagy is enhanced in astrocytomas as compared to normal CNS tissue, but largely independent of the WHO grade and patient survival. A strong upregulation of LC3B, p62, LAMP2 and CTSB was detected in perinecrotic areas in glioblastomas, suggesting micro-environmental changes as a driver of autophagy induction in gliomas. Furthermore, glucose restriction induced autophagy in a concentration-dependent manner, while hypoxia or amino acid starvation had considerably lesser effects. Apoptosis and autophagy were separately induced in glioma cells both in vitro and in vivo. In conclusion, our findings indicate that autophagy in gliomas is driven by micro-environmental changes rather than by primary glioma-intrinsic features, thus challenging the concept of exploiting the autophago-lysosomal network (ALN) as a treatment approach in gliomas.
Questions about how human-environment relations can be conceptualized in a non-dualistic way have been intensively discussed over the last decades. The majority of established realist and constructivist perspectives aim at explaining a given situation by analytically dissecting it. Unfortunately, such an interactionist perspective systematically reproduces the dualistic division between humans, environment and nature.
In contrast, this paper offers a transactive perspective originating in classical pragmatism and discusses its meta-theoretical consequences for human-environment research. A transactionist perspective interprets the world as a flow of unique and entangled events. Instead of ontologically separating humans and environment, it advocates looking at their relations as part of a "connatural world". Such a point of view raises new ethical and political questions for geographical human-environment research, argues for a renaissance of idiographic methodologies, and points toward a fruitful unity of geographical inquiry.
Peptidyl arginine deiminase 4 (PAD4) is a nuclear enzyme that converts arginine residues to citrulline. Although increasingly implicated in inflammatory disease and cancer, the mechanism of action of PAD4 and its functionally relevant pathways remains unclear. E2F transcription factors are a family of master regulators that coordinate gene expression during cellular proliferation and diverse cell fates. We show that E2F-1 is citrullinated by PAD4 in inflammatory cells. Citrullination of E2F-1 assists its chromatin association, specifically to cytokine genes in granulocyte cells. Mechanistically, citrullination augments binding of the BET (bromodomain and extra-terminal domain) family bromodomain reader BRD4 (bromodomain-containing protein 4) to an acetylated domain in E2F-1, and PAD4 and BRD4 coexist with E2F-1 on cytokine gene promoters. Accordingly, the combined inhibition of PAD4 and BRD4 disrupts the chromatin-bound complex and suppresses cytokine gene expression. In the murine collagen-induced arthritis model, chromatin-bound E2F-1 in inflammatory cells and consequent cytokine expression are diminished upon small-molecule inhibition of PAD4 and BRD4, and the combined treatment is clinically efficacious in preventing disease progression. Our results shed light on a new transcription-based mechanism that mediates the inflammatory effect of PAD4 and establish the interplay between citrullination and acetylation in the control of E2F-1 as a regulatory interface for driving inflammatory gene expression.
Fire is the primary disturbance factor in many terrestrial ecosystems. Wildfire alters vegetation structure and composition, affects carbon storage and biogeochemical cycling, and results in the release of climatically relevant trace gases including CO2, CO, CH4, NOx, and aerosols. One way of assessing the impacts of global wildfire on centennial to multi-millennial timescales is to use process-based fire models linked to dynamic global vegetation models (DGVMs). Here we present an update to the LPJ-DGVM and a new fire module based on SPITFIRE that includes several improvements to the way in which fire occurrence, behaviour, and the effects of fire on vegetation are simulated. The new LPJ-LMfire model includes explicit calculation of natural ignitions, the representation of multi-day burning and coalescence of fires, and the calculation of rates of spread in different vegetation types. We describe a new representation of anthropogenic biomass burning under preindustrial conditions that distinguishes the different relationships between humans and fire among hunter-gatherers, pastoralists, and farmers. We evaluate our model simulations against remote-sensing-based estimates of burned area at regional and global scale. While wildfire in much of the modern world is largely influenced by anthropogenic suppression and ignitions, in those parts of the world where natural fire is still the dominant process (e.g. in remote areas of the boreal forest and subarctic), our results demonstrate a significant improvement in simulated burned area over the original SPITFIRE. The new fire model we present here is particularly suited for the investigation of climate–human–fire relationships on multi-millennial timescales prior to the Industrial Revolution.
Fire is the primary disturbance factor in many terrestrial ecosystems. Wildfire alters vegetation structure and composition, affects carbon storage and biogeochemical cycling, and results in the release of climatically relevant trace gases, including CO2, CO, CH4, NO and aerosols. Assessing the impacts of global wildfire on centennial to multimillennial timescales requires the linkage of process-based fire modeling with vegetation modeling using dynamic global vegetation models (DGVMs). Here we present a new fire module, SPITFIRE-2, and an update to the LPJ-DGVM that include major improvements to the way in which fire occurrence, behavior, and the effects of fire on vegetation are simulated. The new fire module includes explicit calculation of natural ignitions, the representation of multi-day burning and coalescence of fires, and the calculation of rates of spread in different vegetation types, as well as a simple scheme to model crown fires. We describe a new representation of anthropogenic biomass burning under preindustrial conditions that distinguishes the different relationships between humans and fire among hunter-gatherers, obligate pastoralists, and farmers. Where and when available, we evaluate our model simulations against remote-sensing-based estimates of burned area. While wildfire in much of the modern world is largely influenced by anthropogenic suppression and ignitions, in those parts of the world where natural fire is still the dominant process, e.g. in remote areas of the boreal forest, our results demonstrate a significant improvement in simulated burned area over previous models. With its unique ability to simulate preindustrial fire, the new module we present here is particularly well suited for the investigation of climate-human-fire relationships on multi-millennial timescales.
We investigate complexes of two paramagnetic metal ions, Gd3+ and Mn2+, to serve as polarizing agents for solid-state dynamic nuclear polarization (DNP) of 1H, 13C, and 15N at magnetic fields of 5, 9.4, and 14.1 T. Both ions are half-integer high-spin systems with a zero-field splitting and therefore exhibit a broadening of the mS = −1/2 ↔ +1/2 central transition which scales inversely with the external field strength. We investigate experimentally the influence of the chelator molecule, strong hyperfine coupling to the metal nucleus, and deuteration of the bulk matrix on DNP properties. At small Gd-DOTA concentrations the narrow central transition allows us to polarize nuclei with small gyromagnetic ratio, such as 13C and even 15N, via the solid effect. We demonstrate that the observed enhancements are limited by the available microwave power and that large enhancement factors of >100 (for 1H) and on the order of 1000 (for 13C) can be achieved in the saturation limit even at 80 K. At larger Gd(III) concentrations (≥10 mM), where dipolar couplings between two neighboring Gd3+ complexes become substantial, a transition towards the cross effect as the dominating DNP mechanism is observed. Furthermore, the slow spin diffusion among 13C and among 15N nuclei allows for temporally resolved observation of enhanced polarization spreading from nuclei close to the paramagnetic ion towards nuclei further removed. Subsequently, we present preliminary DNP experiments on ubiquitin by site-directed spin-labeling with Gd3+ chelator tags. The results hold promise for applications of such paramagnetically labeled proteins in DNP for biophysical chemistry and/or structural biology.
We present the first 3-D model of seismic P and S velocities in the crust and uppermost mantle beneath the Gulf of Aqaba and surrounding areas based on the results of passive travel time tomography. The tomographic inversion was performed based on travel time data from ∼ 9000 regional earthquakes provided by the Egyptian National Seismological Network (ENSN), and this was complemented with data from the International Seismological Centre (ISC). The resulting P and S velocity patterns were generally consistent with each other at all depths. Beneath the northern part of the Red Sea, we observed a strong high-velocity anomaly with abrupt limits that coincide with the coastal lines. This finding may indicate the oceanic nature of the crust in the Red Sea, and it does not support the concept of gradual stretching of the continental crust. According to our results, in the middle and lower crust, the seismic anomalies beneath the Gulf of Aqaba seem to delineate a sinistral shift (∼ 100 km) in the opposite flanks of the fault zone, which is consistent with other estimates of the left-lateral displacement in the southern part of the Dead Sea Transform fault. However, no displacement structures were visible in the uppermost lithospheric mantle.
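The travel-time tomography described in the abstract above rests on a linearized inverse problem: observed travel times are modeled as ray-path lengths times cell slownesses, and the slowness model is recovered by (damped) least squares. The sketch below is a deliberately tiny illustration with a hypothetical 2x2 grid and straight rays; real crustal tomography uses curved ray tracing and many thousands of cells:

```python
import numpy as np

# Toy linearized travel-time inversion: t = G s, where s holds the
# slowness (1/velocity) of each cell and G[i, j] is the length of
# ray i inside cell j.  Grid, ray geometry and values are invented
# purely for illustration.
true_s = np.array([0.50, 0.40, 0.45, 0.55])      # slowness, s/km
G = np.array([
    [1.0, 1.0, 0.0, 0.0],                        # ray through top row
    [0.0, 0.0, 1.0, 1.0],                        # ray through bottom row
    [1.0, 0.0, 1.0, 0.0],                        # ray through left column
    [0.0, 1.0, 0.0, 1.0],                        # ray through right column
    [np.sqrt(2), 0.0, 0.0, np.sqrt(2)],          # diagonal ray
])
t_obs = G @ true_s                               # synthetic "observed" times

# Damped least squares: minimize ||G s - t||^2 + eps^2 ||s||^2.
# The damping stabilizes the inversion when rays sample cells unevenly.
eps = 1e-3
s_est = np.linalg.solve(G.T @ G + eps**2 * np.eye(4), G.T @ t_obs)
```

The diagonal ray is what makes the system full rank here; with only the four row/column rays, the four slownesses would not be uniquely determined, mirroring the resolution limits discussed in tomographic studies.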
Funded by the German Ministry for Education and Research (BMBF), a major research project called MiKlip (Mittelfristige Klimaprognose, Decadal Climate Prediction) was launched, and global as well as regional predictive ensemble hindcasts have been generated. The aim of the project is to demonstrate, for past climate change, whether predictive models are capable of predicting climate on time scales of decades. This includes the development of a decadal forecast system, on the one hand to support decision making in economy, politics and society on decadal time spans; on the other hand, the scientific aspect is to explore the feasibility and prospects of global and regional forecasts on decadal time scales. The focus of this paper lies on the description of the regional hindcast ensemble for Europe generated by COSMO-CLM and on the assessment of decadal variability and predictability against observations. To measure decadal variability, we remove the long-term bias as well as the long-term linear trend from the data. Furthermore, we applied low-pass filters to the original data to separate the decadal climate signal from high-frequency noise. The decadal variability and predictability assessment is applied to temperature and precipitation data for the summer and winter half-year averages/sums. The best results were found for the prediction of decadal temperature anomalies, i.e. we detected distinct predictive skill and reasonable reliability. Hence it is possible to predict regional temperature variability on decadal timescales. However, the situation is less satisfactory for precipitation: here we found regions showing good predictability, but also regions without any predictive skill.
The prediction of climate on time scales of years to decades is attracting the interest of both climate researchers and stakeholders. The German Ministry for Education and Research (BMBF) has launched a major research programme on decadal climate prediction called MiKlip (Mittelfristige Klimaprognosen, Decadal Climate Prediction) in order to investigate the prediction potential of global and regional climate models (RCMs). In this paper we describe a regional predictive hindcast ensemble, its validation, and the added value of regional downscaling. Global predictions are obtained from an ensemble of simulations by the MPI-ESM-LR model (baseline 0 runs), which were downscaled for Europe using the COSMO-CLM regional model. Decadal hindcasts were produced for the five decades starting between 1961 and 2001. Observations were taken from the E-OBS data set. To identify decadal variability and predictability, we removed the long-term mean, as well as the long-term linear trend, from the data. We split the resulting anomaly time series into two parts, the first including lead times of 1–5 years, reflecting the skill which originates mainly from the initialisation, and the second including lead times of 6–10 years, which are more related to the representation of low-frequency climate variability and the effects of external forcing. We investigated temperature averages and precipitation sums for the summer and winter half-year. Skill assessment was based on the correlation coefficient and reliability. We found that regional downscaling preserves, but mostly does not improve, the skill and the reliability of the global predictions for summer half-year temperature anomalies. In contrast, regionalisation improves global decadal predictions of half-year precipitation sums in most parts of Europe. The added value results from an increased predictive skill on a grid-point basis together with an improvement of the ensemble spread, i.e. the reliability.
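The preprocessing common to the two hindcast studies above (removing the long-term mean and linear trend, then low-pass filtering to isolate the decadal signal) can be sketched as follows. The 11-year moving-average window and the synthetic series are illustrative assumptions, not the filter or data actually used in MiKlip:

```python
import numpy as np

def decadal_anomaly(series, window=11):
    """Remove the long-term mean and linear trend from an annual time
    series, then apply a moving-average low-pass filter to isolate
    decadal-scale variability.  Window length is an assumed example."""
    t = np.arange(len(series), dtype=float)
    # least-squares linear fit captures the long-term trend and mean
    slope, intercept = np.polyfit(t, series, 1)
    anomaly = series - (slope * t + intercept)
    # centred moving average as a simple low-pass filter
    kernel = np.ones(window) / window
    smooth = np.convolve(anomaly, kernel, mode="same")
    return anomaly, smooth

# synthetic 50-year temperature-like series: warming trend plus noise
rng = np.random.default_rng(0)
t = np.arange(50, dtype=float)
raw = 14.0 + 0.02 * t + rng.normal(0.0, 0.3, size=50)
anom, low = decadal_anomaly(raw)
```

After detrending, the residual anomaly has (numerically) zero mean, and the filtered series has reduced variance because the high-frequency noise has been averaged out, which is exactly the separation of decadal signal from noise described in the abstracts.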
Biomass burning impacts vegetation dynamics, biogeochemical cycling, atmospheric chemistry, and climate, with sometimes deleterious socio-economic impacts. Under future climate projections it is often expected that the risk of wildfires will increase. Our ability to predict the magnitude and geographic pattern of future fire impacts rests on our ability to model fire regimes, using either well-founded empirical relationships or process-based models with good predictive skill. While a large variety of models exist today, it is still unclear which type of model or degree of complexity is required to model fire adequately at regional to global scales. This is the central question underpinning the creation of the Fire Model Intercomparison Project (FireMIP), an international initiative to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions. In this paper we review how fires have been represented in fire-enabled dynamic global vegetation models (DGVMs) and give an overview of the current state of the art in fire-regime modelling. We indicate which challenges still remain in global fire modelling and stress the need for a comprehensive model evaluation and outline what lessons may be learned from FireMIP.