University Publications
Temporal regularity allows the temporal locus of future information to be predicted, thereby potentially facilitating cognitive processing. We used event-related brain potentials (ERPs) to investigate how temporal regularity affects pre-attentive and attentive processing of deviance in the auditory modality. Participants listened to sequences of sinusoidal tones differing exclusively in pitch. The inter-stimulus interval (ISI) in these sequences was manipulated to convey either an isochronous or a random temporal structure. In the pre-attentive session, deviance processing was unaffected by the regularity manipulation, as evidenced by three ERP components: the mismatch negativity (MMN), the P3a, and the reorienting negativity (RON). In the attentive session, the P3b was smaller for deviant tones embedded in an irregular temporal structure, while the N2b component remained unaffected. These findings confirm that temporal regularity can reinforce cognitive mechanisms associated with the attentive processing of deviance. Furthermore, they provide evidence for the dynamic allocation of attention in time and for dissociable pre-attentive and attention-dependent temporal processing mechanisms.
An analyst who works in Germany is more likely to publish a high (low) price target for a DAX30 stock if other Germany-based analysts are also optimistic (pessimistic) about the same stock. This finding is not biased by the fact that DAX30 companies are headquartered in Germany. In bull markets, the price targets of analysts who regularly exchange opinions are more strongly correlated than those of other analysts; this effect vanishes in a bearish market environment. This suggests that communication among analysts indeed plays an important role. However, analysts' incentives induce them not to deviate too far from the overall average during an economic downturn.
In this paper, I propose a simple asset pricing model that accounts for the influence of social interaction. Investors are assumed to form their expectations about an asset's price based on a forecasting strategy and its past profitability, as well as on the contemporaneous expectations of other market participants. Empirically analysing stocks in the DAX30 index, I provide evidence that social interaction tends to destabilise financial markets; at the very least, it has no stabilising effect.
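The abstract does not spell out the model's equations, so the following Python sketch is purely illustrative: a population of agents whose expectations mix a private mean-reversion forecast with the contemporaneous average expectation of their peers, with the social weight adapting to each rule's past profitability. All parameter names and update rules here are assumptions, not the paper's specification.

    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, T = 100, 500
    fundamental, price = 100.0, 100.0
    g = rng.uniform(0.0, 0.5, n_agents)   # strength of each private mean-reversion rule
    w_social = np.full(n_agents, 0.3)     # assumed initial weight on peer expectations
    eta = 0.05                            # assumed market-maker adjustment speed

    for t in range(T):
        private = price + g * (fundamental - price)       # private forecasts
        consensus = private.mean()                        # peers' average expectation
        expectations = (1 - w_social) * private + w_social * consensus
        new_price = price + eta * (expectations.mean() - price) + rng.normal(0, 0.1)
        err = np.abs(private - new_price)                 # past profitability proxy
        # agents whose private rule erred lean more on peers in the next round
        w_social = np.clip(0.9 * w_social + 0.05 * err / (err.mean() + 1e-9), 0.0, 1.0)
        price = new_price

Raising the feedback coefficient on w_social makes expectations herd on the consensus and lets the price drift away from the fundamental, which is one simple way a destabilising effect of social interaction can be explored in such a toy model.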
In this paper, I analyse the reciprocal social influence on investment decisions within an international group of roughly 2,000 mutual fund managers who invested in DAX30 companies. Using a robust estimation procedure, I provide empirical evidence that the average fund manager puts 0.69% more portfolio weight on a particular stock if his peers, on average, assign a weight to the corresponding position that is 1% higher than for other stocks in the portfolio. The dynamics of this influence on the choice of portfolio weights suggest that fund managers adjust their behaviour to the prevailing market situation and are more strongly influenced by others in times of an economic downturn. Analysing the fund managers' working locations, I conclude that more than 90% of the magnitude of influence stems from social learning. While this form of influence varies considerably over time, the magnitude of influence resulting from the exchange of opinions is more or less constant.
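In reduced form, a peer effect of this kind can be estimated by regressing each manager's excess weight in a stock on the average excess weight his peers assign to it, plus controls. The sketch below uses synthetic data and plain OLS; the paper's actual robust estimation procedure and variable definitions are not reproduced, and all names here are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000                                # synthetic manager-stock observations
    peer_avg = rng.normal(0.0, 1.0, n)      # peers' average excess weight (%)
    controls = rng.normal(size=(n, 2))      # stand-ins for fund/stock controls
    # generate outcomes with the 0.69 coefficient reported in the abstract
    own_weight = 0.69 * peer_avg + controls @ np.array([0.3, -0.2]) + rng.normal(0, 1, n)

    X = np.column_stack([np.ones(n), peer_avg, controls])
    beta, *_ = np.linalg.lstsq(X, own_weight, rcond=None)
    print(f"estimated peer effect: {beta[1]:.2f}% own weight per 1% peer weight")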
Structural biology and the life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges of the post-genomic era call for virtual research platforms that provide the worldwide research community with user-friendly tools and platforms for data analysis and exchange, as well as an underlying e-Infrastructure. WeNMR, a three-year European Commission co-funded project started in November 2010, groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-Infrastructure and its steadily growing virtual organisation (VO), currently the second largest VO in the life sciences. WeNMR provides an e-Infrastructure platform and Science Gateway for structural biology. It involves researchers from around the world and will build bridges to other areas of structural biology.
This article investigates the increasing integration of customers and users into the organizational creation of value, focusing primarily on the dissolving boundaries between production and consumption. Concepts such as "prosuming", the "working customer", "produsing" and "interactive value creation" have been used to describe this phenomenon. Within the framework of a research project at Goethe University Frankfurt/Main, this debate was investigated theoretically as well as empirically in three case studies. The research question is: why do customers participate in "new types of prosuming" or "interactive value creation", and how are these processes coordinated by the firms? The results show a considerable range of motives and forms of coordination. The customers' primary motives for voluntarily assuming tasks and activities were both intrinsic and extrinsic in nature. The organizational models identified range from strategies of rationalization, to prosuming as a basic business model, to collaborative and interactive value creation between the company and the web community.
Communication in the Web 2.0 context works mainly through images. The online video platform YouTube uses this form of visual communication and makes the art forms of Western societies visible through its online videos. YouTube, as a cultural reservoir and visual archive of moving images, accommodates the whole range of visualising creative processes, from artistic finger exercises to fine arts. A general characteristic of YouTube is the publication of the small everyday gestures of the 'big ones' (politicians, stars), such as minor incidents and their clumsiness in everyday actions, e.g. Beyoncé's fall from the stage or Tom Cruise's demonic pro-Scientology interview. Through their viral distribution on different platforms, these incidents will never be covered up or disappear from public view. At the same time, big gestures and star images are replicated and sometimes reinterpreted by the 'small people', who present themselves in the poses and attitudes of the stars. In general, a coexistence of different perspectives is possible: YouTube allows polysemic and polyvalent views of everyday and media phenomena.

This article relies on YouTube research that started in 2006 at the New Media Department of the Goethe University of Frankfurt. The results of this research have already identified representative forms and basic patterns, that is to say, categories for the clips appearing there. These kinds of clips, recurring throughout the observation period, shape the basic representation of art or artistic expression within moving images on this platform. Methodologically, the focus is on an investigation, adequate to the specifics of the medium ('media adequate'), of new visual structures and forms which can create, consciously or unconsciously, an art form. After focusing on the media structures, it is discussed whether any, and if so which, 'authentic' new forms were developed solely on YouTube, and whether these forms are innovative and can be characterised as avant-garde.

This article first takes a small step in evaluating how to get from general communication through visual means in Web 2.0, often an endless stream of chatty, cheesy visual noise, to the special quality of a consciously created aesthetic. Where do innovative aesthetic forms emerge from, in relation to their media structures? Are they the products of 'media amateurs', or do we have to find new specifications and descriptions for the producers? The term 'media amateur' describes technically interested private individuals who acquire and develop technology before any commercial use of the technology is even recognisable. Just as artists develop their own techniques, according to Dieter Daniels, media amateurs are autodidacts who invent techniques rather than just acquire knowledge about them (see, for example, the demoscene, machinima and brickfilm producers, as well as many areas of computer gaming in general). The media amateur intervenes directly in the production processes of the medium and does not simply use it. What is fascinating is the media amateur's process of self-education, not the result, and the direct impact on the internal structure and the control of the medium. Media amateurs open up a previously culturally unformed space of experience. This only partially applies to most YouTube clips in the realm of the visual arts; here it is most important to look at the visual content.
This article discusses all these concepts and introduces new descriptions for the different forms of production: the technically oriented media master, the do-it-yourselfer, the tinkerer, the amateur handicraftsman and the inventor. It outlines a basic research project on 'visual media culture' (a triangulation of research on media structure and iconography) for the presented online video platform. It is the product of an analysis of clips focusing on the media structure, examining the creative handling of images and the deviations from pre-set media formats and stereotypes.
Indonesia is a multicultural and multireligious nation whose heterogeneity is codified in the state doctrine, the Pancasila. Yet the relations between the various social, ethnic, and religious groups have been problematic down to the present day, and national unity has remained fragile. In several respects, Christians play a precarious role in the struggle over shaping the nation. They are a small minority (about 9% of the population) in a country predominantly inhabited by Muslims; in the past they were interconnected in manifold ways with the Dutch colonial government; they exert great influence in the economy and the military; and they constitute the majority of the population in some parts of the so-called Outer Islands (such as Flores, Sumba, and Timor), which are characterized by an ambivalent attitude towards a state apparatus perceived as 'Javanese' and 'Muslim'. In the aftermath of former President Suharto's resignation and in the course of the ensuing political changes, in particular the independence of East Timor, Christians were repeatedly discredited for allegedly posing a threat to Indonesian unity, and were involved both as victims and as perpetrators in violent regional clashes with Muslims that claimed thousands of lives. Since the beginning of the new millennium the violent conflicts have abated, yet the pressure exerted on Christians by Islamic fundamentalists continues undiminished in the Muslim-majority regions. The future of Christians in Indonesia remains uncertain, and pluralist society is still on trial. For this reason the situation of Christians in Indonesia is an important issue that goes far beyond research on a minority, touching on general questions of nation-state formation.
From the very outset of European expansion, scholars have been preoccupied with the impact of proselytization and colonization on non-European societies. Anthropologists such as Margaret Mead and Bronislaw Malinowski, who witnessed these processes at the beginning of the twentieth century while at the same time benefitting from colonial structures, were convinced that autochthonous societies could not possibly withstand the onslaught of the dominant European cultures and were thus doomed to vanish in the near future. Ever since the establishment of anthropology as a discipline, the fear of losing its object of research, which had only recently been discovered, had hung over scholars' heads like a sword of Damocles. They felt rushed to document what seemed to be crumbling away. Behind these fears lay the notion that indigenous cultures were comparatively static entities that had existed untouched by any external influences for centuries, or even millennia, and were unable to change. This idea was shared by proponents of other disciplines; in religious studies, for example, up to the late 1980s the prevailing view was that contact between the great world religions and the belief systems of small, autochthonous societies doomed the latter to extinction. More recent studies, however, have shown that this assumption, according to which indigenous peoples have not undergone any changes in the course of history, is untenable. It became apparent that groups supposedly living in isolation maintain extensive contact networks, and that migration, trade, and conquest are not privileges of modern times. Myths and oral traditions bore witness to journeys to faraway regions, to new settlements founded in unknown territories, and to the arrival of victorious foreigners who introduced new ways and customs and laid claim to a place of their own within society.
The Internet, the biggest human library ever assembled, keeps on growing. Although all kinds of information carriers (e.g. audio, video and hybrid file formats) are available, text-based documents dominate. It is estimated that about 80% of all information stored electronically worldwide exists in (or can be converted into) text form. More and more documents of all kinds are generated by means of text processing systems and are therefore available electronically. Nowadays, many printed journals are also published online and may even cease to appear in print tomorrow. This development has many convincing advantages: the documents are available faster (cf. prepress services) and cheaper, they can be searched more easily, their physical storage needs only a fraction of the space previously necessary, and the medium does not age. For most people, fast and easy access is the most interesting feature of the new age; computer-aided search for specific documents or Web pages is becoming the basic tool for information-oriented work. But this tool has problems. The current keyword-based search engines available on the Internet are not really appropriate for such a task: either (far) too many documents matching the specified keywords are presented, or none at all. The problem lies in the fact that it is often very difficult to choose appropriate terms describing the desired topic in the first place. This contribution discusses the current state-of-the-art techniques in content-based searching (along with common visualization/browsing approaches) and proposes a particular adaptive solution for intuitive Internet document navigation, which not only enables users to provide full texts instead of manually selected keywords (if available), but also allows them to explore the whole database.
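The adaptive navigation system proposed in the contribution is not specified in this abstract; as a minimal, generic illustration of content-based retrieval (not the paper's method), a whole document can serve as the query instead of hand-picked keywords, e.g. via TF-IDF vectors and cosine similarity. The corpus and query text below are hypothetical placeholders.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # hypothetical corpus; in practice, the indexed document collection
    docs = [
        "neural networks for medical diagnosis",
        "content-based retrieval of text documents",
        "hydrology of tropical river systems",
    ]
    query_text = "searching by content"   # a full text would be supplied here

    vec = TfidfVectorizer(stop_words="english")
    doc_matrix = vec.fit_transform(docs)  # TF-IDF vectors of the corpus
    scores = cosine_similarity(vec.transform([query_text]), doc_matrix).ravel()
    for i in scores.argsort()[::-1]:      # rank documents by similarity to the query
        print(f"{scores[i]:.3f}  {docs[i]}")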
In intensive care units, physicians are confronted with a high lethality rate among septic shock patients. In this contribution we present typical problems and results of a retrospective, data-driven analysis based on two neural network methods applied to the data of two clinical studies. Our approach includes the necessary steps of data mining, i.e. building up a database, cleaning and preprocessing the data, and finally choosing an adequate analysis for the medical patient data. We chose two architectures based on supervised neural networks. The patient data is classified into two classes (survived and deceased), either by the black-box approach of a growing RBF network or by a second network which can explain its diagnosis through human-understandable diagnostic rules. The advantages and drawbacks of these classification methods for an early warning system are discussed.
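The growing RBF architecture from the study is not reproduced here; the following sketch trains a fixed-size RBF classifier on synthetic two-class data to illustrate the basic black-box approach. All dimensions, widths and the center-selection strategy are assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    # synthetic stand-in for patient features, two classes (0 = survived, 1 = deceased)
    X = np.vstack([rng.normal(0.0, 1.0, (100, 5)), rng.normal(1.5, 1.0, (100, 5))])
    y = np.array([0] * 100 + [1] * 100)

    centers = X[rng.choice(len(X), 20, replace=False)]   # fixed centers, not grown
    width = 1.0

    def rbf_features(data):
        d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2 * width ** 2))            # Gaussian basis activations

    Phi = np.column_stack([rbf_features(X), np.ones(len(X))])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # linear readout by least squares
    accuracy = ((Phi @ w > 0.5).astype(int) == y).mean()
    print("training accuracy:", accuracy)

A growing variant would instead insert new centers where the network's error remains large, which is what makes the cited architecture adaptive to the patient data.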
The encoding of images by semantic entities is still an unresolved task. This paper proposes encoding images by only a few important components, or image primitives. Classically, this can be done by Principal Component Analysis (PCA). Recently, Independent Component Analysis (ICA) has attracted strong interest in the signal processing and neural network communities. Using independent components as pattern primitives, we aim for source patterns with the highest occurrence probability or highest information. For the example of a synthetic image composed of characters, this idea selects the salient ones. For natural images it does not lead to an acceptable reproduction error, since no a priori probabilities can be computed. Combining the traditional principal component criteria of PCA with the independence property of ICA, we obtain a better encoding. It turns out that the Independent Principal Components (IPC), in contrast to the Principal Independent Components (PIC), implement the classical demand of Shannon's rate-distortion theory.
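The paper's IPC/PIC constructions are not reproduced here; the sketch below merely contrasts the two classical building blocks, PCA and ICA, on synthetic 'image patches' generated as linear mixtures of sparse sources (all sizes and data are placeholders).

    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(3)
    sources = rng.laplace(size=(1000, 16))          # sparse, non-Gaussian sources
    mixing = rng.normal(size=(16, 64))
    patches = sources @ mixing                      # stand-in for 8x8 image patches

    pca = PCA(n_components=16).fit(patches)         # minimises reconstruction error
    recon = pca.inverse_transform(pca.transform(patches))
    print("PCA reconstruction MSE:", ((patches - recon) ** 2).mean())

    ica = FastICA(n_components=16, random_state=0)  # maximises statistical independence
    codes = ica.fit_transform(patches)              # recovered source activations

An IPC-style encoding, roughly speaking, would first restrict the data to the leading principal subspace and then rotate it towards independence, combining the low reconstruction error of PCA with the independence property of ICA.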
One of the most interesting domains for feedforward networks is the processing of sensor signals. There are networks which extract most of the information by implementing the maximum entropy principle for Gaussian sources. This is done by transforming input patterns to the basis of the eigenvectors of the input autocorrelation matrix with the largest eigenvalues. The basic building block of these networks is the linear neuron, learning with the Oja rule. Nevertheless, some researchers in pattern recognition theory argue that pattern recognition and classification require clustering transformations which reduce the intra-class entropy. This leads to stable, reliable features and is implemented for Gaussian sources by a linear transformation using the eigenvectors with the smallest eigenvalues. In an earlier paper (Brause 1992) it is shown that the basic building block for such a transformation can be implemented by a linear neuron using an anti-Hebb rule and restricted weights. This paper presents the analog VLSI design for such a building block, using standard modules for multiplication and addition. The most tedious problem in this VLSI application is the design of an analog vector normalization circuit. It can be shown that the standard approaches of weight summation will not converge to the eigenvectors required for a proper feature transformation. To avoid this problem, our design differs significantly from the standard approaches by computing the true Euclidean norm.
Keywords: minimum entropy, principal component analysis, VLSI, neural networks, surface approximation, cluster transformation, weight normalization circuit.
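Numerically, the two learning rules behave as the abstract describes: Oja's rule drives a linear neuron's weight vector towards the eigenvector with the largest eigenvalue, while an anti-Hebbian update with an explicit Euclidean-norm constraint (the role of the normalization circuitry) converges to the one with the smallest. The sketch below demonstrates this on correlated Gaussian data; learning rates and sample counts are assumptions.

    import numpy as np

    rng = np.random.default_rng(4)
    C = np.array([[3.0, 1.0],
                  [1.0, 2.0]])                     # known input autocorrelation matrix
    X = rng.normal(size=(20000, 2)) @ np.linalg.cholesky(C).T

    w_max = rng.normal(size=2); w_max /= np.linalg.norm(w_max)
    w_min = rng.normal(size=2); w_min /= np.linalg.norm(w_min)
    lr = 0.001

    for x in X:
        y = w_max @ x
        w_max += lr * y * (x - y * w_max)          # Oja rule: implicit normalization
        y = w_min @ x
        w_min -= lr * y * x                        # anti-Hebb step
        w_min /= np.linalg.norm(w_min)             # explicit Euclidean norm constraint

    vals, vecs = np.linalg.eigh(C)
    print("alignment with largest eigenvector: ", abs(w_max @ vecs[:, -1]))
    print("alignment with smallest eigenvector:", abs(w_min @ vecs[:, 0]))

Without the explicit renormalization, the anti-Hebbian weight simply decays to zero, which loosely mirrors the abstract's point that naive weight-summation schemes fail to converge to the required eigenvectors.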
An interior delta in the lower course of the Ntem River near the sub-prefecture Ma'an was identified through the interpretation of satellite images, topographical maps of SW Cameroon, geological and hydrological references, and a reconnaissance field trip to the study area. Here, neotectonic processes have created a 'sediment trap' (step fault), which, in combination with environmental changes, strongly shaped the fluvial morphology. This transitionally led to temporary lacustrine and palustrine conditions in parts of this river section. Inside the interior delta an anastomosing, multi-branched river system has developed, which contains 'stillwater locations', periodically inundated sections, islands and rapids. Following geomorphological, physiogeographical and sedimentological research approaches, the alluvial plain has been prospected and studied extensively. 91 hand-corings, including three NE–SW transects, were carried out on river benches, levees, cut-off and periodical branches, islands and terraces throughout the entire alluvial plain and have revealed multi-layered, sandy to clayey alluvia reaching depths of up to 440 cm. At many locations, fossil organic horizons and palaeosurfaces were discovered, containing valuable palaeoenvironmental proxy data. At these sites, additional detailed stratigraphical analysis (close-meshed hand-coring and exposure digging) provided a comprehensive insight into the stratification (lamination) of the alluvia, clarifying the processes and conditions that prevailed in the catchment area during the period of their deposition. 32 radiocarbon dates of macro-remains (leaves, wood), charcoal and organic sediment sampled from these horizons provided ages between 48,230 ± 6,411 and 217 ± 46 years BP (uncalibrated). This underlines the importance of the alluvia as an additional, innovative palaeoarchive of proxy data contributing to the reconstruction of the palaeoenvironment and palaeoclimate of western Equatorial Africa. The further examination of the alluvia will provide additional information not only on the dynamics of vegetation, climate and hydrology (esp. fluvial morphology) in SW Cameroon since the 'First Millennium BC Crisis' (around 3,000 years BP), the main focus of the DFG research project, but also on the conditions prevailing since the Late Pleistocene, during the Last Glacial Maximum (~18,000 years BP), the Younger Dryas impact (~11,000 years BP) and the 'African Humid Period' (~9,000–6,000 years BP). δ13C values (−31.4 to −26.4‰) indicate that rain forest prevailed at the respective drilling sites during the corresponding periods (rain forest refuge theory). The sampled macro-remains all indicate rain-forest-dominated ecosystems, which were able to persist in fluvial habitats even during arid periods.