University Publications
The diagnosis that we are living in a world risk society, formulated by Ulrich Beck 20 years ago (Beck, Kölner Z Soziol Sozialpsychol 36:119–147, 1996), has lost nothing of its power, especially against the background of the Anthropocene debate. “Global risks” have been identified which are caused by human activities, technology, and modernization processes. Microplastics are a by-product of exactly these modernization processes, distributed globally by physical processes such as ocean currents and causing effects far from their place of origin. In recent years, the topic has gained great prominence, as microplastics have been discovered nearly everywhere in the environment, raising questions about the impacts on food for human consumption. But are microplastics really a new phenomenon, or rather a symptom of an old problem? And exactly what risks are involved? The phenomenon seems to have accelerated political action (the USA passed the Microbead-Free Waters Act in 2015), and industries have pledged to phase out the use of microbeads in their cosmetic products. At first sight, this looks like a success for environmentalists and for the protection of our planet. But is it?
This chapter deals with these questions by adopting a social-ecological perspective, discussing microplastics as a global risk. Taking four main characteristics of global risks, we develop four arguments to discuss (a) the everyday production of risk by societies, (b) scientific risk evaluation of microplastics, (c) social responses, and (d) problems of risk management. To illustrate these four issues, we draw on different aspects of the current scientific and public debate. In doing so, we contribute to a comprehensive understanding of the social-ecological implications of microplastics.
The ubiquitous detection of microplastics in aquatic ecosystems raises concern about adverse impacts on freshwater ecosystems. The wide variety of material types, sizes, shapes, and physicochemical properties makes interactions with biota via multiple pathways probable.
So far, our knowledge about the uptake and biological effects of microplastics comes from laboratory studies applying simplified exposure regimes (e.g., one polymer and size, spherical shape, high concentrations), often with limited environmental relevance. Nevertheless, the available data illustrate species- and material-related interactions and highlight that microplastics represent a multifaceted stressor. Particle-related toxicity is driven by polymer type, size, and shape; chemical toxicity is driven by the adsorption-desorption kinetics of additives and pollutants. In addition, microbial colonization, the formation of hetero-aggregates, and the evolutionary adaptations of the biological receptor further increase the complexity of microplastics as stressors. The aim of this chapter is therefore to synthesize and critically revisit these aspects based on the state of the science in freshwater research; where freshwater data are unavailable, we supplement them with data on marine biota. This provides an insight into the direction of future research.
In this regard, the challenge is to understand the complex interactions of biota and plastic materials and to identify the toxicologically most relevant characteristics of the plethora of microplastics. Importantly, as the direct biological impacts of natural particles may be similar, future research needs to benchmark synthetic against natural materials. Finally, given the scale of the research question, we need a multidisciplinary approach to understand the role of microplastics in a multiple-particle world.
The paper investigates the interpretation of the Romanian subjunctive B (subjB) mood when it is embedded under the propositional attitude verb crede (believe). SubjB is analyzed as a single package of three distinct presuppositions: temporal de se, dissociation and propositional de se. I show that subjB is the temporal analogue of null PRO in the individual domain: it allows only for a de se reading. Dissociation enables us to show that subjB always takes scope over a negation embedded in a belief report. Propositional de se derives this empirical generalization. The introduction of centered propositions (generalizing centered worlds), together with propositional de se, dissociation and the belief 'introspection' principles, derives the fact that subjB belief reports (unlike their indicative counterparts) are infelicitous with embedded probabil.
High-grade gliomas, including anaplastic glioma WHO grade III and glioblastoma WHO grade IV (GBM), carry a dismal prognosis. Taking all currently available therapeutic options for GBM into consideration, including radiation, chemotherapy and surgery, the prognosis after initial diagnosis is about 12 months. Despite this poor prognosis, researchers have gained tremendous insight into the molecular and genetic signatures of low- and high-grade gliomas. Several subtypes of GBM have been demonstrated with respect to their genetic background. These genetic alterations include p53 mutation in secondary GBMs and EGFR amplification in primary GBMs, respectively. Very recently, great excitement was raised by the discovery of IDH1 mutation in low-grade gliomas and secondary GBMs. This discovery is of great significance, since it allows further categorization of GBMs and helps to distinguish low-grade gliomas from non-neoplastic adjacent brain tissue. Despite all this progress, there is an urgent need for additional therapeutic strategies. Beyond the identification of novel therapeutic regimens, it is of utmost importance to understand the molecular mechanisms by which GBMs manage to evade almost any anti-cancer treatment regimen. In experimental models of glioblastoma, a number of novel therapeutic regimens have exhibited promising results. These include, but are not limited to, apoptosis-based therapeutics (tumor necrosis factor alpha-related apoptosis-inducing ligand, TRAIL), tyrosine kinase inhibitors, heat-shock protein 90 (HSP90) inhibitors, polyphenols, novel drug combinations and strategies based on intracranial application. This chapter primarily reviews molecular mechanisms of resistance in GBM and emerging therapeutic avenues for high-grade gliomas.
High-grade gliomas are a group of heterogeneous primary tumors, of which glioblastoma (World Health Organization grade IV, GBM) is the most common. Once the diagnosis of GBM is made, the average survival time is approximately 12–15 months (Hegi, Diserens et al., 2005). Treatment usually consists of temozolomide (TMZ, the chemotherapeutic drug most commonly used for GBM), radiation (either alone or in combination with chemotherapeutics) and surgery (Hegi, Diserens et al., 2005)...
Application of liposuction techniques and principles to specific body areas and pathologies
(2011)
The buttocks have been a symbol of attraction, sexuality and eroticism since ancient times, and they therefore play an important role in defining the posterior body contour. More and more people are talking about, and coming to understand, the role that the buttocks play in body modeling and physical beauty. Three-dimensional gluteoplasty (3-DGP) is an innovative technique that allows us to change volume, shape and firmness, not only in the buttocks but also in adjacent regions such as the thighs and trochanters, making it an ideal tool to address the frequent concerns our patients raise about this particular area of the body: ...
Modelling protein structure is a challenging enterprise because the number of structure parameters required ordinarily exceeds the amount of independent data points available from experimental observations. Expressing the predominant conformation of a protein in terms of a geometry model, a polypeptide chain consisting of N atoms would require 3N − 6 Cartesian coordinates to be fixed. Even for small proteins, this becomes a daunting number. Fortunately, so-called holonomic constraints limit the number of variables, leaving substantially fewer, truly relevant parameters for folding the polypeptide chain into its native tertiary structure. For example, bond lengths and the many angles between the covalent bonds connecting the atoms are of little concern, and appropriate standard values can be inserted from tabulated data (Pople & Gordon, 1967; Engh & Huber, 1991, 2006). Table 1 exemplifies for the 147-residue protein Desulfovibrio vulgaris flavodoxin how the number of truly independent internal rotational degrees of freedom amounts to less than one-tenth of the Cartesian coordinate set size...
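The bookkeeping above can be sketched numerically. The atom count per residue and the average number of side-chain torsions below are assumed round figures for illustration only, not the flavodoxin values of Table 1.

```python
def cartesian_dof(n_atoms):
    """Full Cartesian description: 3N coordinates minus 6 rigid-body degrees of freedom."""
    return 3 * n_atoms - 6

def torsional_dof(n_residues, mean_side_chain_torsions=1.5):
    """Internal rotations only: phi and psi per residue (omega held planar)
    plus an assumed average number of side-chain chi angles per residue."""
    backbone = 2 * n_residues
    side_chains = n_residues * mean_side_chain_torsions
    return int(backbone + side_chains)

# A 147-residue protein with an assumed average of 8 heavy atoms per residue:
print(cartesian_dof(147 * 8))  # size of the Cartesian parameter set
print(torsional_dof(147))      # truly independent rotational parameters
```

Even with these crude averages, the torsional count is an order of magnitude below the Cartesian one, which is the point the holonomic-constraints argument makes.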
Quantum theory is the most successful physical theory ever. About one third of the gross national product in the developed countries results from its applications. These applications range from nuclear power to most of the high-tech tools for computing, laser, solar cells and so on. No limit for its range of validity has been found up to now...
There are many tools available to evaluate a radiotherapy treatment plan, such as isodose distribution charts, dose-volume histograms (DVH), the maximum, minimum and mean doses of the dose distributions, and DVH point dose constraints. All of these evaluation tools are purely dosimetric and do not take into account the radiobiological characteristics of tumors or organs at risk (OARs). It has been demonstrated that competing treatment plans with similar mean, maximum or minimum doses may nevertheless have significantly different clinical outcomes (Mavroidis et al. 2001). To perform a more complete treatment plan evaluation and comparison, the complication-free tumor control probability (P+) and the biologically effective uniform dose (D̄) have been proposed (Källman et al. 1992a, Mavroidis et al. 2000). The D̄ concept denotes that any two dose distributions within a target or OAR are equivalent if they produce the same probability of tumor control or normal tissue complication, respectively (Mavroidis et al. 2001)...
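The equivalence idea, mapping a heterogeneous dose distribution to the single uniform dose that yields the same control probability, can be sketched as follows. The logistic dose-response curve and its parameters (d50, k) are illustrative assumptions, not the Poisson-based P+ formalism of the cited papers.

```python
import math

def response(dose, d50=50.0, k=4.0):
    """Illustrative sigmoidal dose-response curve (logistic in dose);
    d50 and k are assumed parameters, not fitted clinical values."""
    if dose <= 0:
        return 0.0
    return 1.0 / (1.0 + (d50 / dose) ** k)

def tcp(doses):
    """Control probability for equally weighted voxels: every clonogen
    compartment must be controlled, hence the product."""
    p = 1.0
    for d in doses:
        p *= response(d) ** (1.0 / len(doses))
    return p

def equivalent_uniform_dose(doses, lo=0.0, hi=200.0, tol=1e-6):
    """Bisect for the uniform dose giving the same control probability
    as the heterogeneous distribution (the core of the equivalence concept)."""
    target = tcp(doses)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if response(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A uniform distribution maps back to its own dose, while a plan with a cold spot maps to an equivalent dose noticeably below its arithmetic mean, reflecting that cold spots dominate tumor control.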
1. Introduction: The autosomal dominant cerebellar ataxias (ADCA) are a clinically, pathologically and genetically heterogeneous group of neurodegenerative disorders caused by degeneration of the cerebellum and its afferent and efferent connections. The degenerative process may additionally involve the ponto-medullary systems, pyramidal tracts, basal ganglia, cerebral cortex, peripheral nerves (ADCA I) and the retina (ADCA II), or can be limited to the cerebellum (ADCA III) (Harding et al., 1993). The most common of these dominantly inherited ataxias, ADCA I, includes many spinocerebellar ataxia (SCA) subtypes, some of which are caused by pathological CAG trinucleotide repeat expansion in the coding region of the mutated gene. Such is the case for SCA1, SCA2, SCA3/MJD, SCA6, SCA7, SCA17 and dentatorubral-pallidoluysian atrophy (DRPLA) (Matilla et al., 2006). Among the almost 30 SCAs, the variant SCA2 is the second most prevalent subtype worldwide, surpassed only by SCA3 (Schöls et al., 2004; Matilla et al., 2006; Auburger, 2011)...
Temporal regularity allows predicting the temporal locus of future information, thereby potentially facilitating cognitive processing. We applied event-related brain potentials (ERPs) to investigate how temporal regularity impacts pre-attentive and attentive processing of deviance in the auditory modality. Participants listened to sequences of sinusoidal tones differing exclusively in pitch. The inter-stimulus interval (ISI) in these sequences was manipulated to convey either an isochronous or a random temporal structure. In the pre-attentive session, deviance processing was unaffected by the regularity manipulation, as evidenced in three ERPs: mismatch negativity (MMN), P3a, and reorienting negativity (RON). In the attentive session, the P3b was smaller for deviant tones embedded in irregular temporal structure, while the N2b component remained unaffected. These findings confirm that temporal regularity can reinforce cognitive mechanisms associated with the attentive processing of deviance. Furthermore, they provide evidence for the dynamic allocation of attention in time and for dissociable pre-attentive and attention-dependent temporal processing mechanisms.
An analyst who works in Germany is more likely to publish a high (low) price target for a DAX30 stock if other Germany-based analysts are also optimistic (pessimistic) about the same stock. This finding is not biased by the fact that DAX30 companies are headquartered in Germany. In bull markets, the price targets of analysts who regularly exchange opinions are more highly correlated than those of other analysts. This effect vanishes in a bearish market environment. This suggests that communication among analysts indeed plays an important role. However, analysts’ incentives induce them not to deviate too far from the overall average during an economic downturn.
With this paper, I propose a simple asset pricing model that accounts for the influence of social interaction. Investors are assumed to make up their mind about an asset's price based on a forecasting strategy and its past profitability, as well as on the contemporaneous expectations of other market participants. Empirically analysing stocks in the DAX30 index, I provide evidence that social interaction tends to destabilise financial markets; at the very least, it does not have a stabilising effect.
In this paper, I analyse the reciprocal social influence on investment decisions within an international group of roughly 2,000 mutual fund managers who invested in companies in the DAX30. Using a robust estimation procedure, I provide empirical evidence that the average fund manager puts 0.69% more portfolio weight on a particular stock if his peers, on average, assign a weight to the corresponding position that is 1% higher compared to other stocks in the portfolio. The dynamics of this influence on the choice of portfolio weights suggest that fund managers adjust their behaviour according to the prevailing market situation and are more strongly influenced by others in times of an economic downturn. Analysing the working locations of the fund managers, I conclude that more than 90% of the magnitude of influence stems from social learning. While this form of influence varies considerably over time, the magnitude of influence resulting from the exchange of opinion is more or less constant.
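On entirely synthetic data, the kind of relationship estimated above can be illustrated with a plain least-squares slope. The study itself uses a robust estimator, and the coefficient 0.69 below is simply built into the simulation to mirror the reported estimate; nothing here is real fund data.

```python
import random

random.seed(0)

# Synthetic cross-section: each observation pairs a manager's excess
# portfolio weight on a stock with the average excess weight of peers.
n = 2000
peer = [random.gauss(0.0, 1.0) for _ in range(n)]
own = [0.69 * p + random.gauss(0.0, 0.3) for p in peer]

# Plain OLS slope of own weight on peer weight:
mp = sum(peer) / n
mo = sum(own) / n
beta = (sum((p - mp) * (o - mo) for p, o in zip(peer, own))
        / sum((p - mp) ** 2 for p in peer))
```

With 2,000 observations the estimated slope recovers the built-in coefficient to within a few hundredths, which is the sense in which a 1% higher peer weight maps to roughly 0.69% more own weight.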
Structural biology and life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges in the post-genomic era call for virtual research platforms that provide the worldwide research community with user-friendly tools, platforms for data analysis and exchange, and an underlying e-Infrastructure. WeNMR, a three-year European Commission co-funded project started in November 2010, groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-Infrastructure and its steadily growing virtual organisation, which is currently the second largest VO in the area of life sciences. WeNMR provides an e-Infrastructure platform and Science Gateway for structural biology. It involves researchers from around the world and will build bridges to other areas of structural biology.
This article investigates the phenomenon of increasing integration of customers and users into the organizational creation of value, focusing primarily on the dissolving boundaries between production and consumption. Concepts such as "prosuming", the "working customer", "produsing" and "interactive value creation" have been used to describe this phenomenon. Within the framework of a research project at the Goethe-University Frankfurt/Main, this debate was investigated theoretically as well as empirically in three case studies. The research question is as follows: Why do customers participate in "new types of prosuming" or "interactive value creation" and how are these processes coordinated by the firms? The results show a considerable range of motives and forms of coordination: The customers’ primary motives to voluntarily assume tasks and activities were both intrinsic and extrinsic in nature. The organizational models identified range from strategies of rationalization to prosuming as a basic business model to the collaborative and interactive value creation between the company and the web-community.
Communication in the Web 2.0 context mainly works through images. The online video platform YouTube uses this form of visual communication and makes art forms of Western societies visible through its online videos. YouTube, as cultural reservoir and visual archive of moving images, accommodates the whole range of visualising creative processes – from artistic finger exercises to fine arts. A general characteristic of YouTube is the publishing of small everyday gestures of the ‘big ones’ (politicians, stars), such as small incidents and their clumsiness in everyday actions, e.g. Beyoncé’s fall from the stage or Tom Cruise’s demonic pro-Scientology interview. Through their viral distribution on different platforms, these incidents will never be covered up or disappear from public view. At the same time, big gestures and star images are replicated and sometimes reinterpreted by the ‘small people’ who present themselves in the poses and attitudes of the stars. Generally, a coexistence of different perspectives is possible. YouTube allows polysemic and polyvalent views on everyday and media phenomena. This article relies on YouTube research that started in 2006 at the New Media Department of the Goethe University of Frankfurt. The results of this research have already identified representative forms and basic patterns, that is to say, categories for the clips appearing there. These kinds of clips, recurring over the observation period, shape the basic representation of art or artistic expression within moving images on this platform. Methodologically, the focus leads to an investigation, adequate to the specifics of the medium (‘media adequate’), of new visual structures and forms which can create – consciously or unconsciously – an art form.
After focusing on the media structures, it will be discussed whether any, and if so which, ‘authentic’ new forms were developed solely on YouTube, and whether these forms are innovative and can be characterised as avant-garde. This article first takes a small step in evaluating how to get from general visual communication in Web 2.0 – often an endless stream of chatty, cheesy visual noise – to the special quality of a consciously created aesthetic. From where do innovative aesthetic forms emerge, in relation to their media structures? Are they the products of ‘media amateurs’, or do we have to find new specifications and descriptions for the producers? The term ‘media amateur’ describes technically interested private individuals who acquire and develop technology before any commercial use of the technology is even recognisable. Just as artists develop their own techniques, according to Dieter Daniels, media amateurs are autodidacts who invent techniques rather than just acquire knowledge about them (see for example the demoscene, machinima, brickfilm producers, as well as many areas of computer gaming in general). The media amateur directly intervenes in the production processes of the medium and does not simply use it. What is fascinating is the media amateur’s process of self-education – not the result – and the direct impact on the internal structure and the control of the medium. Media amateurs open up a previously culturally unformed space of experience. This only partially applies to most of the YouTube clips in the realm of the visual arts; here it is most important to look at the visual content. This article discusses all these concepts and introduces new descriptions for the different forms of production: the technically oriented media master, the do-it-yourselfer, the tinkerer, the amateur handicraftsman and the inventor.
The article outlines a basic research project on ‘visual media culture’ (a triangulation of research on media structure and iconography) of the online video platform presented. It is a product of the analysis of clips focusing on media structure, analysing the creative handling of images and the deviations from, and differences to, pre-set media formats and stereotypes.
Indonesia is a multicultural and multireligious nation whose heterogeneity is codified in the state doctrine, the Pancasila. Yet the relations between the various social, ethnic, and religious groups have been problematic down to the present day, and national unity has remained fragile. In several respects, Christians have a precarious role in the struggle for shaping the nation. They are a small minority (about 9% of the population) in a country predominantly inhabited by Muslims; in the past they were interconnected in manifold ways with the Dutch colonial government; they exert great influence in economy and the military, and constitute the majority of the population in some parts of the so-called Outer Islands (such as Flores, Sumba, and Timor), which are characterized by an attitude fraught with ambivalence towards the state apparatus perceived as ‘Javanese’ and ‘Muslim’. In the aftermath of the former president Suharto’s resignation and in the course of the ensuing political changes – in particular the independence of East Timor – Christians were repeatedly discredited for allegedly posing a threat to Indonesian unity, and have been involved both as victims and perpetrators in violent regional clashes with Muslims that claimed thousands of lives. Since the beginning of the new millennium the violent conflicts have lessened, yet the pressure exerted on Christians by Islamic fundamentalists still continues undiminished in the Muslim-majority regions. The future of the Christians in Indonesia remains uncertain, and pluralist society is still on trial. For this reason the situation of Christians in Indonesia is an important issue that goes far beyond research on a minority, touching on general issues relating to the formation of the nation-state.
From the very outset of European expansion, scholars have been preoccupied with the impact of proselytization and colonization on non-European societies. Anthropologists such as Margaret Mead and Bronislaw Malinowski, who witnessed these processes at the beginning of the twentieth century while at the same time benefitting from the colonial structure, were convinced that the autochthonous societies could not possibly withstand the onslaught of the dominant European cultures, and thus were doomed to vanish in the near future. The fear of losing their object of research, which had just recently been discovered, had hung over the heads of scholars like a sword of Damocles since the establishment of anthropology as a discipline. They hastened to document what seemed to be crumbling away. Behind these fears there was the notion that the indigenous cultures were comparatively static entities that had existed untouched by any external influences for many centuries, or even millennia, and were unable to change. This idea was shared by proponents of other disciplines; in religious studies, for example, up to the late 1980s the view prevailed that contact between the great world religions and the belief systems of small, autochthonous societies doomed the latter to extinction. However, more recent studies have shown that this assumption, according to which indigenous peoples have not undergone any changes in the course of history, is untenable. It became apparent that groups supposedly living in isolation have extensive contact networks, and that migration, trade, and conquest are not privileges of modern times. Myths and oral traditions bore witness to journeys to faraway regions, new settlements founded in unknown territories, or the arrival of victorious foreigners who introduced new ways and customs and laid claim to a place of their own within society.
The Internet, the biggest human library ever assembled, keeps on growing. Although all kinds of information carriers (e.g. audio/video/hybrid file formats) are available, text-based documents dominate. It is estimated that about 80% of all information stored electronically worldwide exists in (or can be converted into) text form. More and more, documents of all kinds are generated by means of a text processing system and are therefore available electronically. Nowadays, many printed journals are also published online and may soon cease to appear in print at all. This development has many convincing advantages: the documents are available faster (cf. prepress services) and cheaper, they can be searched more easily, their physical storage needs only a fraction of the space previously necessary, and the medium does not age. For most people, fast and easy access is the most interesting feature of the new age; computer-aided search for specific documents or Web pages becomes the basic tool for information-oriented work. But this tool has problems. The current keyword-based search engines available on the Internet are not really appropriate for such a task: either far too many documents matching the specified keywords are returned, or none at all. The problem lies in the fact that it is often very difficult to choose appropriate terms describing the desired topic in the first place. This contribution discusses the current state-of-the-art techniques in content-based searching (along with common visualization/browsing approaches) and proposes a particular adaptive solution for intuitive Internet document navigation, which not only enables the user to provide full texts instead of manually selected keywords (if available), but also allows him/her to explore the whole database.
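A minimal sketch of content-based ranking, in which a full-text query is matched against documents by TF-IDF cosine similarity rather than by exact keywords: this is a textbook baseline, not the adaptive navigation system proposed in the contribution, and the tiny corpus below is invented.

```python
import math
from collections import Counter

def tfidf_vectors(texts):
    """TF-IDF vectors over a small corpus (minimal sketch, no stemming)."""
    tokenised = [t.lower().split() for t in texts]
    df = Counter(term for toks in tokenised for term in set(toks))
    n = len(texts)
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(toks).items()}
            for toks in tokenised]

def cosine(a, b):
    dot = sum(w * b[t] for t, w in a.items() if t in b)
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query, docs):
    """Treat the full-text query as a pseudo-document and rank the corpus."""
    vecs = tfidf_vectors(docs + [query])
    scores = [cosine(vecs[-1], v) for v in vecs[:-1]]
    return sorted(range(len(docs)), key=lambda i: -scores[i]), scores

docs = ["ocean plastic pollution",
        "stock market herding",
        "plastic waste in the ocean"]
order, scores = search("plastic ocean", docs)
```

The ranking rewards documents that share the query's rarer terms and penalises long documents diluted with unrelated vocabulary, which is exactly what plain keyword matching fails to do.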
In intensive care units, physicians face a high lethality rate among septic shock patients. In this contribution we present typical problems and results of a retrospective, data-driven analysis based on two neural network methods applied to the data of two clinical studies. Our approach includes the necessary steps of data mining, i.e. building up a database, cleaning and preprocessing the data, and finally choosing an adequate analysis for the medical patient data. We chose two architectures based on supervised neural networks. The patient data are classified into two classes (survived and deceased) by a diagnosis based either on the black-box approach of a growing RBF network or on a second network which can explain its diagnosis by human-understandable diagnostic rules. The advantages and drawbacks of these classification methods for an early warning system are discussed.
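A much-reduced stand-in for the RBF-based diagnosis can illustrate the idea: Gaussian prototype activations assign a record to the class of its most strongly activated prototype. The centres, width, and feature values below are invented toy numbers, not the growing networks or patient data of the studies.

```python
import math

def rbf_activations(x, centres, width=1.0):
    """Gaussian activations of the hidden layer for one input vector."""
    return [math.exp(-sum((a - c) ** 2 for a, c in zip(x, ctr))
                     / (2.0 * width ** 2))
            for ctr in centres]

def classify(x, centres, labels, width=1.0):
    """Assign the label of the most strongly activated prototype."""
    acts = rbf_activations(x, centres, width)
    return labels[max(range(len(acts)), key=acts.__getitem__)]

# Two invented prototypes in a 2-D feature space (toy values only):
centres = [(0.0, 0.0), (5.0, 5.0)]
labels = ["survived", "deceased"]
```

A growing RBF network would additionally insert new prototypes where classification errors accumulate; the localised Gaussian responses are also what makes rule extraction (each prototype as a readable "if close to this profile" rule) feasible in the second architecture.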
The encoding of images by semantic entities is still an unresolved task. This paper proposes encoding images by only a few important components, or image primitives. Classically, this can be done by Principal Component Analysis (PCA). Recently, Independent Component Analysis (ICA) has attracted strong interest in the signal processing and neural network communities. Using ICA components as pattern primitives, we aim for source patterns with the highest occurrence probability or highest information. For the example of a synthetic image composed of characters, this idea selects the salient ones. For natural images it does not lead to an acceptable reproduction error, since no a priori probabilities can be computed. Combining the traditional principal component criteria of PCA with the independence property of ICA, we obtain a better encoding. It turns out that the Independent Principal Components (IPC), in contrast to the Principal Independent Components (PIC), implement the classical demand of Shannon’s rate distortion theory.
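As a minimal illustration of the PCA side of this comparison, power iteration on the covariance matrix of toy 2-D "patches" recovers the leading principal direction, i.e. the first image primitive under the variance criterion. The data are synthetic, and the ICA step and the IPC/PIC combination are beyond this sketch.

```python
import random

random.seed(1)

# Toy data: 2-D points stretched along the diagonal, a stand-in for
# image patches whose variance concentrates in a few directions.
data = [(t + random.gauss(0, 0.1), t + random.gauss(0, 0.1))
        for t in [random.gauss(0, 1) for _ in range(500)]]

def covariance(pts):
    """Entries of the 2x2 sample covariance matrix."""
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    cxx = sum((p[0] - mx) ** 2 for p in pts) / len(pts)
    cyy = sum((p[1] - my) ** 2 for p in pts) / len(pts)
    cxy = sum((p[0] - mx) * (p[1] - my) for p in pts) / len(pts)
    return cxx, cxy, cyy

def first_pc(pts, iters=100):
    """Power iteration on the covariance matrix -> leading eigenvector."""
    cxx, cxy, cyy = covariance(pts)
    vx, vy = 1.0, 0.0
    for _ in range(iters):
        wx, wy = cxx * vx + cxy * vy, cxy * vx + cyy * vy
        norm = (wx * wx + wy * wy) ** 0.5
        vx, vy = wx / norm, wy / norm
    return vx, vy

vx, vy = first_pc(data)
```

For data concentrated along the diagonal, the recovered unit vector is close to (1, 1)/√2; projecting onto it and discarding the orthogonal direction is the one-component PCA encoding.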
One of the most interesting application domains of feedforward networks is the processing of sensor signals. Some networks extract most of the information by implementing the maximum entropy principle for Gaussian sources. This is done by transforming input patterns to the basis of eigenvectors of the input autocorrelation matrix with the biggest eigenvalues. The basic building block of these networks is the linear neuron, learning with the Oja learning rule. Nevertheless, some researchers in pattern recognition theory argue that pattern recognition and classification require clustering transformations which reduce the intra-class entropy. This leads to stable, reliable features and is implemented for Gaussian sources by a linear transformation using the eigenvectors with the smallest eigenvalues. In another paper (Brause 1992) it is shown that the basic building block for such a transformation can be implemented by a linear neuron using an Anti-Hebb rule and restricted weights. This paper shows the analog VLSI design for such a building block, using standard modules of multiplication and addition. The most tedious problem in this VLSI application is the design of an analog vector normalization circuit. It can be shown that the standard approaches of weight summation will not give convergence to the eigenvectors needed for a proper feature transformation. To avoid this problem, our design differs significantly from the standard approaches by computing the true Euclidean norm. Keywords: minimum entropy, principal component analysis, VLSI, neural networks, surface approximation, cluster transformation, weight normalization circuit.
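In software, the Oja building block mentioned above can be simulated directly; the Anti-Hebb minor-component variant discussed in the paper would flip the sign of the Hebbian term, which is not shown here. The input statistics and learning rate are arbitrary illustrative choices.

```python
import random

random.seed(2)

def oja_step(w, x, lr=0.01):
    """One Oja update: Hebbian term y*x with an implicit weight
    normalisation, dw = lr * y * (x - y * w), where y = w . x."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]

# Zero-mean inputs with more variance along the first axis: the neuron
# should converge to the leading eigenvector (+-1, 0) of the input
# autocorrelation matrix, with unit Euclidean norm.
w = [0.5, 0.5]
for _ in range(5000):
    x = (random.gauss(0.0, 2.0), random.gauss(0.0, 0.5))
    w = oja_step(w, x)
```

The self-normalising term y²·w is exactly what the analog circuit must realise in hardware; the paper's point is that approximating the Euclidean norm by simple weight summation destroys this convergence.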
An interior delta in the lower course of the Ntem River near the sub-prefecture Ma’an was identified through the interpretation of satellite images, topographical maps of SW Cameroon, geological and hydrological references, and a reconnaissance field trip to the study area. Here, neotectonic processes have initiated the establishment of a ‘sediment trap’ (step fault), which, in combination with environmental changes, strongly shaped the fluvial morphology. This transiently led to lacustrine and palustrine conditions in parts of this river section. Inside the interior delta an anastomosing, multi-branched river system has developed, which contains ‘stillwater locations’, periodically inundated sections, islands and rapids. Following geomorphological, physiogeographical and sedimentological research approaches, the alluvial plain has been prospected and studied extensively. 91 hand-corings, including three NE–SW transects, were carried out on river benches, levees, cut-off and periodical branches, islands as well as terraces throughout the entire alluvial plain, and revealed multi-layered, sandy to clayey alluvia reaching down to 440 cm depth. At many locations, fossil organic horizons and palaeosurfaces were discovered, containing valuable palaeoenvironmental proxy data. At these sites, additional detailed stratigraphical analysis (close-meshed hand-coring and exposure digging) provided a comprehensive insight into the stratification (lamination) of the alluvia, clarifying the processes and conditions that prevailed in the catchment area during the period of their deposition. 32 radiocarbon dates of macro-remains (leaves, wood), charcoal and organic sediment sampled from these horizons provided ages between 48,230 ± 6,411 and 217 ± 46 years BP (uncalibrated).
This underlines the importance of the alluvia as an additional, innovative palaeoarchive of proxy data contributing to the reconstruction of the palaeoenvironment and palaeoclimate of western Equatorial Africa. Further examination of the alluvia will provide additional information not only on the dynamics of vegetation, climate and hydrology (esp. fluvial morphology) in SW Cameroon since the ‘First Millennium BC Crisis’ (around 3,000 years BP), the main focus of the DFG research project, but also on the conditions prevailing since the Late Pleistocene, during the Last Glacial Maximum (~18,000 years BP), the Younger Dryas impact (~11,000 years BP) and the ‘African Humid Period’ (~9,000–6,000 years BP). δ13C values (−31.4 to −26.4‰) indicate that rain forest prevailed at the particular drilling sites during the corresponding time period (rain forest refuge theory). The sampled macro-remains all indicate rain-forest-dominated ecosystems, which were able to persist in fluvial habitats even during arid periods.