University Publications
This paper proposes a new approach for encoding images by only a few important components. Classically, this is done by Principal Component Analysis (PCA). Recently, Independent Component Analysis (ICA) has attracted strong interest in the neural network community. Applied to images, we aim for the most important source patterns, those with the highest occurrence probability or highest information, called principal independent components (PIC). For the example of a synthetic image composed of characters, this idea selects the salient ones. For natural images it does not lead to an acceptable reproduction error, since no a-priori probabilities can be computed. Combining the traditional principal component criteria of PCA with the independence property of ICA, we obtain a better encoding. It turns out that this definition of PIC implements the classical demand of Shannon's rate-distortion theory.
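As a point of reference for the PCA baseline discussed above, the following minimal sketch encodes toy image patches with the top-k principal components obtained from an SVD. The random data, patch size, and choice of k are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image patches": 200 samples of 8x8 patches, flattened to 64-dim vectors.
patches = rng.normal(size=(200, 64))

# Center the data, then take the principal components via SVD.
mean = patches.mean(axis=0)
centered = patches - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)

k = 8                               # number of retained components (assumed)
components = vt[:k]                 # (k, 64) orthonormal basis
codes = centered @ components.T     # low-dimensional encoding
reconstruction = codes @ components + mean

# Mean squared reconstruction error; it shrinks as k grows.
err = np.mean((patches - reconstruction) ** 2)
```

The PIC idea described in the abstract replaces the pure variance criterion used here with a combination of variance and statistical independence of the components.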
Low energy beam transport (LEBT) for a future heavy ion driven inertial fusion (HIDIF [1]) facility is a crucial point, using a Bi+ beam of 40 mA at 156 keV. High space charge forces (generalised perveance K = 3.6e-3) restrict the use of electrostatic focussing systems. On the other hand, magnetic lenses using space charge compensation suffer from the low particle velocity. Additionally, the emittance requirements are very demanding in order to avoid particle losses in the linac and at ring injection [2]. Furthermore, source noise and the rise time of space charge compensation [3] might enhance particle losses and emittance growth. Gabor lenses [4], which use a continuous space charge cloud for focussing, could be a serious alternative to conventional LEBT systems. They combine strong cylindrically symmetric focussing with partial space charge compensation and low emittance growth due to weaker non-linear fields. A high tolerance against source noise and current fluctuations and reduced investment costs are further possible advantages. The proof of principle has already been shown [5, 6]. To broaden this experience, an experimental program was started. The first experimental results using a double Gabor lens (DGPL, see fig. 1) LEBT system for transporting a high-perveance Xe+ beam will be presented, and the results of numerical simulations will be shown.
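The space charge argument above rests on the generalized perveance. A minimal sketch of its computation for the quoted beam parameters follows; the particular definition K = qI / (2*pi*eps0*m*(beta*gamma*c)^3) and the non-relativistic beta are assumptions, and conventions in the literature differ by numerical factors, so the result is an order-of-magnitude figure rather than a reproduction of the quoted K = 3.6e-3.

```python
import math

# Physical constants (SI units).
E_CHARGE = 1.602176634e-19   # elementary charge [C]
AMU = 1.66053906660e-27      # atomic mass unit [kg]
EPS0 = 8.8541878128e-12      # vacuum permittivity [F/m]
C = 2.99792458e8             # speed of light [m/s]

def generalized_perveance(current_a, energy_ev, mass_amu, charge_state=1):
    """Generalized perveance K = q*I / (2*pi*eps0*m*(beta*gamma*c)**3).

    Uses a non-relativistic beta, adequate for keV-range ion beams.
    Note that definitions in the literature differ by factors of order 2.
    """
    m = mass_amu * AMU
    q = charge_state * E_CHARGE
    beta = math.sqrt(2.0 * energy_ev * E_CHARGE / m) / C  # v/c, non-relativistic
    gamma = 1.0                                           # ~1 at these energies
    return q * current_a / (2.0 * math.pi * EPS0 * m * (beta * gamma * C) ** 3)

# 40 mA Bi+ (A = 209) at 156 keV, the HIDIF LEBT scenario above.
K = generalized_perveance(0.04, 156e3, 209)
```

With these inputs K comes out at a few times 10^-3, consistent in magnitude with the value quoted in the abstract.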
In its first part, this contribution briefly reviews the application of neural network methods to medical problems and characterizes their advantages and difficulties in the context of the medical background. Successful application examples show that human diagnostic capabilities are significantly worse than those of the neural diagnostic systems. Then the paradigm of neural networks is briefly introduced, and the main problems of medical databases and the basic approaches for training and testing a network with medical data are described. Additionally, the problem of interfacing the network and interpreting its results is addressed, and the neuro-fuzzy approach is presented. Finally, as a case study of neural rule-based diagnosis, septic shock diagnosis is described, on the one hand by a growing neural network and on the other hand by a rule-based system. Keywords: Statistical Classification, Adaptive Prediction, Neural Networks, Neurofuzzy, Medical Systems
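The train-and-test methodology for medical data mentioned above can be illustrated with a minimal sketch. The synthetic two-feature data set, the labels, and the plain logistic-regression classifier below are all invented for illustration; they are unrelated to the paper's growing neural network or its actual data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a medical data set: two vital-sign features,
# label 1.0 = critical outcome. Real clinical data would additionally
# need handling of missing values and class imbalance.
n = 400
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n) > 0).astype(float)

# Train/test split: evaluating on held-out patients is essential.
X_train, X_test = X[:300], X[300:]
y_train, y_test = y[:300], y[300:]

# Logistic regression trained by plain gradient descent.
w = np.zeros(2)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w + b)))
    w -= 0.5 * (X_train.T @ (p - y_train)) / len(y_train)
    b -= 0.5 * np.mean(p - y_train)

pred = 1.0 / (1.0 + np.exp(-(X_test @ w + b))) > 0.5
accuracy = np.mean(pred == (y_test > 0.5))
```

The held-out accuracy, not the training accuracy, is the figure that matters for a diagnostic system; the same protocol applies unchanged to the neural and neuro-fuzzy models the contribution surveys.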
The early prediction of mortality is one of the unresolved tasks in intensive care medicine. This contribution models medical symptoms as observations caused by transitions between hidden Markov states. Learning the underlying state transition probabilities results in a prediction success probability of about 91%. The results are discussed and put in relation to the model used. Finally, the rationale for using the model is reflected upon: are there states in the septic shock data?
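The likelihood of a symptom sequence under such a hidden Markov model can be sketched with the standard forward algorithm. The two-state model and all probabilities below are illustrative assumptions, not parameters from the study.

```python
import numpy as np

# Assumed toy model: 2 hidden states ("stable", "critical") and
# 3 discrete symptom severity levels.
A = np.array([[0.9, 0.1],       # state transition probabilities
              [0.3, 0.7]])
B = np.array([[0.7, 0.2, 0.1],  # emission probabilities per state
              [0.1, 0.3, 0.6]])
pi = np.array([0.8, 0.2])       # initial state distribution

def forward(obs):
    """Forward algorithm: P(observation sequence) under the HMM."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# Likelihood of an observed symptom trajectory (severity codes 0..2).
likelihood = forward([0, 0, 1, 2, 2])
```

In the setting above, the transition matrix A would be learned from patient trajectories (e.g. via Baum-Welch), and the prediction question becomes which hidden state a patient's symptom sequence most likely ends in.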
Rating agencies state that they take a rating action only when it is unlikely to be reversed shortly afterwards. Based on a formal representation of the rating process, I show that such a policy provides a good explanation for the puzzling empirical evidence: Rating changes occur relatively seldom, exhibit serial dependence, and lag changes in the issuers’ default risk. In terms of informational losses, avoiding rating reversals can be more harmful than monitoring credit quality only twice per year.
Over-allotment arrangements are nowadays part of almost every initial public offering. The underwriting banks borrow shares from the previous shareholders to issue more than the initially announced number of shares. This is combined with the option to cover this short position at the issue price. We present empirical evidence on the value of these arrangements to the underwriters of initial public offerings on the Neuer Markt. The over-allotment arrangement is regarded as a portfolio of a long call option and a short position in a forward contract on the stock, which differs from other approaches presented in the literature.
Given the economically substantial values of these option-like claims, we try to identify benefits to previous shareholders or new investors when a company uses this instrument in the process of going public. Although we carefully control for potential endogeneity problems, we find virtually no evidence of a reduction in underpricing for firms using over-allotment arrangements. Furthermore, we do not find evidence of more pronounced price stabilization activities or better aftermarket performance for firms granting an over-allotment arrangement to the underwriting banks.
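The valuation view described above, a long call plus a short forward, both struck at the issue price, can be sketched with textbook Black-Scholes pricing. All numerical inputs below are illustrative assumptions, not Neuer Markt data.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes value of a European call (no dividends)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def overallotment_value(S, K, T, r, sigma):
    """Long call plus short forward, both struck at the issue price K."""
    short_forward = -(S - K * math.exp(-r * T))
    return bs_call(S, K, T, r, sigma) + short_forward

# Illustrative inputs: at the money, 30 calendar days, 60% volatility.
value = overallotment_value(S=25.0, K=25.0, T=30 / 365, r=0.03, sigma=0.60)
```

By put-call parity, this long-call/short-forward portfolio has the same value as a put struck at the issue price, so under these assumptions its value to the underwriters is always non-negative.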
EFM Classification: 230, 410
Teaching information literacy: substance and process
This presentation explores the concept of information literacy within the broader context of higher education. It argues that, certain assertions in the library literature notwithstanding, the concepts associated with information literacy are not new, but rather closely resemble the qualities traditionally considered to characterize a well-educated person. The presentation also considers the extent to which the higher education system does indeed foster the attributes commonly associated with information literacy. The term information literacy has achieved the immediacy it currently enjoys within the library community with the advent of the so-called "information age". The information age is commonly touted in the literature, both popular and professional, as constituting nothing short of a revolution. Academic librarians and other educators have of course felt called upon to make their teaching reflect both the growing proliferation of information formats and the major transformations affecting the process of information seeking. Faced with so much novelty and uncertainty, it is no surprise that many have felt these changes call for a revolution in teaching. It is within this context that the concept of information literacy has flourished. It is argued in this presentation, however, that by treating information literacy as an essentially new specialty that owes much of its importance to the plethora of electronic information, we risk obscuring some of the most fundamental and enduring educational values we should be imparting to our students. Much of the literature on information literacy assumes - rather than argues - that recent changes in the way we approach education are indications of progress. Indeed, much of the self-narrative that institutions produce (in bulletins, mission statements, web sites, etc.) endorses an approach to education that will result in lifelong learners who are critical consumers of information. After critically examining the degree to which such statements of educational approach reflect reality, this presentation concludes by considering the effects of certain changes in the culture of higher education. It considers particularly the transformation - at least in North America - of the traditional model of higher education from a public good to a market-driven business model. It poses the question of whether a change of this significance might in fact detract from, rather than promote, the development of information-literate students.
Using faculty-librarian partnerships to ensure that students become information fluent in the 21st century
In the 21st century, educators in partnership with librarians must prepare students effectively for the productive use of information, especially in higher education. Students will need to graduate from universities with appropriate information and technology skills to enable them to become productive citizens in the workplace and in society. Technology is having a major impact on society: in economics, e-business is moving to the forefront; in communication, e-mail, the Internet and cellular telephones have transformed how people communicate; in the work environment, computers and web applications are emphasized; and in education, virtual learning and teaching are becoming more important. These few examples indicate how the 21st-century information environment requires future members of the workforce to be information fluent, so that they have the ability to locate information efficiently, evaluate information for specific needs, organize information to address issues, apply information skillfully to solve problems, use information to communicate effectively, and use information responsibly to ensure a productive work environment. Individuals can achieve information fluency by acquiring cultural, visual, computer, technology, research and information management skills that enable them to think critically.