University Publications
From the very outset of European expansion, scholars have been preoccupied with the impact of proselytization and colonization on non-European societies. Anthropologists such as Margaret Mead and Bronislaw Malinowski, who witnessed these processes at the beginning of the twentieth century while at the same time benefitting from the colonial structure, were convinced that autochthonous societies could not possibly withstand the onslaught of the dominant European cultures and were thus doomed to vanish in the near future. Ever since the establishment of anthropology as a discipline, the fear of losing their recently discovered object of research had hung over scholars' heads like a sword of Damocles. They felt pressed to document what seemed to be crumbling away. Behind these fears lay the notion that indigenous cultures were comparatively static entities that had existed untouched by external influences for centuries, or even millennia, and were unable to change. This idea was shared by proponents of other disciplines; in religious studies, for example, the view prevailed up to the late 1980s that contact between the great world religions and the belief systems of small, autochthonous societies doomed the latter to extinction. More recent studies, however, have shown that this assumption, according to which indigenous peoples have not undergone any changes in the course of history, is untenable. It became apparent that groups supposedly living in isolation have extensive contact networks, and that migration, trade, and conquest are not privileges of modern times. Myths and oral traditions bear witness to journeys to faraway regions, new settlements founded in unknown territories, and the arrival of victorious foreigners who introduced new ways and customs and laid claim to a place of their own within society.
The Internet, the biggest human library ever assembled, keeps on growing. Although all kinds of information carriers (e.g. audio/video/hybrid file formats) are available, text-based documents dominate. It is estimated that about 80% of all information stored electronically worldwide exists in (or can be converted into) text form. Increasingly, documents of all kinds are generated with text processing systems and are therefore available electronically. Nowadays, many printed journals are also published online and may soon cease to appear in print at all. This development has many convincing advantages: the documents are available both faster (cf. prepress services) and cheaper, they can be searched more easily, their physical storage needs only a fraction of the space previously necessary, and the medium does not age. For most people, fast and easy access is the most interesting feature of the new age; computer-aided search for specific documents or Web pages is becoming the basic tool for information-oriented work. But this tool has problems. The current keyword-based search engines available on the Internet are not really appropriate for such a task: either far too many documents matching the specified keywords are presented, or none at all. The problem lies in the fact that it is often very difficult to choose appropriate terms describing the desired topic in the first place. This contribution discusses current state-of-the-art techniques in content-based searching (along with common visualization/browsing approaches) and proposes a particular adaptive solution for intuitive Internet document navigation, which not only enables users to provide full texts (if available) instead of manually selected keywords, but also allows them to explore the whole database.
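The keyword problem described above — far too many hits or none at all — is what similarity-based retrieval tries to avoid. The following minimal sketch illustrates the underlying idea of using a full text as the query and ranking documents by TF-IDF weighted cosine similarity; the corpus, the query text, and all names are invented for illustration, and the adaptive neural approach proposed in the contribution itself is not shown here.

```python
import math
from collections import Counter

# Toy corpus standing in for an Internet document collection (invented data).
docs = {
    "d1": "neural networks learn features from text documents",
    "d2": "keyword search engines match query terms against documents",
    "d3": "the river delta contains islands and rapids",
}

def build_index(corpus):
    """Tokenize the corpus and count, per term, in how many documents it occurs."""
    tokenized = {d: text.split() for d, text in corpus.items()}
    df = Counter()
    for tokens in tokenized.values():
        df.update(set(tokens))
    return tokenized, df

def tf_idf(tokens, df, n):
    """TF-IDF weights: terms frequent in this text but rare in the corpus get
    high weight; terms unseen in the corpus are skipped."""
    tf = Counter(tokens)
    return {t: (c / len(tokens)) * math.log(n / df[t])
            for t, c in tf.items() if df[t]}

def cosine(u, v):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm = (math.sqrt(sum(w * w for w in u.values()))
            * math.sqrt(sum(w * w for w in v.values())))
    return dot / norm if norm else 0.0

tokenized, df = build_index(docs)
vectors = {d: tf_idf(tokens, df, len(docs)) for d, tokens in tokenized.items()}

# A full text, not a hand-picked keyword list, serves as the query:
query = tf_idf("search engines rank documents by query terms".split(),
               df, len(docs))
ranking = sorted(docs, key=lambda d: cosine(query, vectors[d]), reverse=True)
```

Unlike boolean keyword matching, every document receives a graded score, so the result list can always be ordered from most to least similar instead of being empty or overwhelming.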
In intensive care units, physicians face a high mortality rate among septic shock patients. In this contribution we present typical problems and results of a retrospective, data-driven analysis based on two neural network methods applied to the data of two clinical studies. Our approach includes the necessary steps of data mining, i.e. building a database, cleaning and preprocessing the data, and finally choosing an adequate analysis for the medical patient data. We chose two architectures based on supervised neural networks. The patient data is classified into two classes (survived and deceased) either by the black-box approach of a growing RBF network or by a second network that can explain its diagnosis through human-understandable diagnostic rules. The advantages and drawbacks of these classification methods for an early warning system are discussed.
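As a rough illustration of the RBF classification idea — not the growing-network algorithm of the study, and with invented two-feature "patient" vectors in place of real clinical variables — a two-class decision can be sketched like this:

```python
import math

# Hypothetical, scaled two-feature patient records (illustrative assumptions;
# the study used multi-variable clinical data). Labels: 0 = survived, 1 = deceased.
train = [((0.2, 0.3), 0), ((0.1, 0.4), 0),
         ((0.9, 0.8), 1), ((0.8, 0.9), 1)]

def rbf(x, center, width=0.5):
    """Gaussian radial basis function: activation decays with distance
    from the prototype center."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-d2 / (2 * width ** 2))

def classify(x):
    """Sum the RBF activations per class and predict the class with the
    larger sum. Here every training pattern acts as a prototype; a growing
    network would instead insert prototypes only where they are needed."""
    scores = {0: 0.0, 1: 0.0}
    for center, label in train:
        scores[label] += rbf(x, center)
    return max(scores, key=scores.get)
```

A new patient vector close to the "survived" prototypes is then assigned class 0, one close to the "deceased" prototypes class 1; the rule-extracting second network of the abstract would additionally justify such a decision.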
The encoding of images by semantic entities is still an unresolved task. This paper proposes encoding images by only a few important components or image primitives. Classically, this can be done by Principal Component Analysis (PCA). Recently, Independent Component Analysis (ICA) has attracted strong interest in the signal processing and neural network communities. Using these as pattern primitives, we aim for source patterns with the highest occurrence probability or highest information. For the example of a synthetic image composed of characters, this idea selects the salient ones. For natural images it does not lead to an acceptable reproduction error, since no a-priori probabilities can be computed. Combining the traditional principal component criteria of PCA with the independence property of ICA, we obtain a better encoding. It turns out that the Independent Principal Components (IPC), in contrast to the Principal Independent Components (PIC), implement the classical demand of Shannon's rate distortion theory.
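The PCA side of this encoding idea can be sketched on synthetic data: patches generated from a few hidden source patterns are compressed to a few coefficients and reconstructed from the top principal components. All numbers and names below are illustrative assumptions; the paper's IPC/ICA combination is not reproduced here.

```python
import numpy as np

# Synthetic "image patches": 200 vectors of length 64, each a random mixture
# of three hidden source patterns (invented data for illustration).
rng = np.random.default_rng(0)
basis = rng.normal(size=(3, 64))      # three hidden source patterns
coeffs = rng.normal(size=(200, 3))
patches = coeffs @ basis              # each patch mixes the sources

# PCA: project onto the eigenvectors of the data covariance with the largest
# eigenvalues, obtained here via SVD of the centered data matrix.
X = patches - patches.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
components = Vt[:3]                   # top-3 principal components

codes = X @ components.T              # encode: 64 numbers -> 3 numbers per patch
reconstruction = codes @ components   # decode from the few primitives
err = np.linalg.norm(X - reconstruction) / np.linalg.norm(X)
```

Because the toy data truly has only three underlying sources, three principal components reconstruct it almost perfectly; for natural images the residual error at such strong compression is what motivates combining PCA with ICA's independence criterion.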
One of the most interesting domains for feedforward networks is the processing of sensor signals. Some existing networks extract most of the information by implementing the maximum entropy principle for Gaussian sources. This is done by transforming input patterns to the basis of those eigenvectors of the input autocorrelation matrix with the largest eigenvalues. The basic building block of these networks is the linear neuron, learning with the Oja learning rule. Nevertheless, some researchers in pattern recognition theory argue that pattern recognition and classification require clustering transformations which reduce the intra-class entropy. This leads to stable, reliable features and is implemented for Gaussian sources by a linear transformation using the eigenvectors with the smallest eigenvalues. In another paper (Brause 1992) it is shown that the basic building block for such a transformation can be implemented by a linear neuron using an Anti-Hebb rule and restricted weights. This paper presents the analog VLSI design for such a building block, using standard modules for multiplication and addition. The most tedious problem in this VLSI application is the design of an analog vector normalization circuit. It can be shown that the standard approaches of weight summation will not converge to the eigenvectors required for a proper feature transformation. To avoid this problem, our design differs significantly from the standard approaches by computing the true Euclidean norm. Keywords: minimum entropy, principal component analysis, VLSI, neural networks, surface approximation, cluster transformation, weight normalization circuit.
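The Oja rule mentioned above — a Hebbian update with an implicit weight-decay term that keeps the weight vector normalized — can be sketched numerically. The toy covariance, learning rate and seed below are assumptions for illustration; this models only the learning behaviour of the linear neuron, not the analog VLSI circuit.

```python
import numpy as np

rng = np.random.default_rng(1)
# Zero-mean 2-D inputs whose autocorrelation matrix C has its largest
# eigenvalue (5) along the direction (1, 1) (invented toy statistics).
C = np.array([[3.0, 2.0], [2.0, 3.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = rng.normal(size=2)                  # random initial weights
eta = 0.01                              # learning rate
for x in X:
    y = w @ x                           # linear neuron output
    w += eta * y * (x - y * w)          # Oja's rule: Hebb term - y^2 decay

w_unit = w / np.linalg.norm(w)
# Alignment with the principal eigenvector (1, 1)/sqrt(2), up to sign:
alignment = abs(w_unit @ np.array([1.0, 1.0])) / np.sqrt(2)
```

The decay term `-eta * y**2 * w` replaces an explicit normalization step, so the weight norm settles near 1 while the direction converges (up to sign) to the eigenvector with the largest eigenvalue — exactly the normalization behaviour that the analog circuit must reproduce in hardware.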
An interior delta in the lower course of the Ntem River near the sub-prefecture Ma'an was identified through interpretation of satellite images, topographical maps of SW Cameroon, geological and hydrological references, and a reconnaissance field trip to the study area. Here, neotectonic processes initiated a 'sediment trap' (step fault) which, in combination with environmental changes, strongly shaped the fluvial morphology. This led temporarily to lacustrine and palustrine conditions in parts of this river section. Inside the interior delta an anastomosing, multi-branched river system has developed, which contains 'stillwater locations', periodically inundated sections, islands and rapids. Following geomorphological, physiogeographical and sedimentological research approaches, the alluvial plain has been prospected and studied extensively. 91 hand-corings, including three NE–SW transects, were carried out on river benches, levees, cut-off and periodical branches, islands and terraces throughout the entire alluvial plain and revealed multi-layered, sandy to clayey alluvia reaching depths of up to 440 cm. At many locations, fossil organic horizons and palaeosurfaces containing valuable palaeoenvironmental proxy data were discovered. At these sites, additional detailed stratigraphical analysis (close-meshed hand-coring and exposure digging) provided a comprehensive insight into the stratification (lamination) of the alluvia, clarifying the processes and conditions that prevailed in the catchment area during the period of their deposition. 32 radiocarbon dates of macro-remains (leaves, wood), charcoal and organic sediment sampled from these horizons yielded ages between 48,230 ± 6,411 and 217 ± 46 years BP (uncalibrated).
This underlines the importance of the alluvia as an additional, innovative palaeoarchive of proxy data contributing to the reconstruction of the palaeoenvironment and palaeoclimate in western Equatorial Africa. Further examination of the alluvia will provide additional information not only on the dynamics of vegetation, climate and hydrology (esp. fluvial morphology) in SW Cameroon since the 'First Millennium BC Crisis' (around 3,000 years BP), the main focus of the DFG research project, but also on conditions prevailing since the Late Pleistocene, during the Last Glacial Maximum (~18,000 years BP), the Younger Dryas impact (~11,000 years BP) and the 'African Humid Period' (~9,000–6,000 years BP). δ13C values (−31.4 to −26.4‰) indicate that rain forest prevailed at the particular drilling sites during the corresponding time period (rain forest refuge theory). The sampled macro-remains all indicate rain-forest-dominated ecosystems, which were able to persist in fluvial habitats even during arid periods.