Information literacy is a mosaic of attitudes, understandings, capabilities and knowledge about which there are three myths. The first myth is that it is about the ability to use ICTs to access a wealth of information. The second is that students entering higher education are information literate because student-centred, resource-based, and ICT-focused learning are now pervasive in secondary education. The third myth is that information literacy development can be addressed by library-centric generic approaches. This paper addresses those myths and emphasises the need for information literacy to be recognised as a critical whole-of-education and societal issue, fundamental to an information-enabled and better world. In formal education, information literacy can only be developed by infusion into curriculum design, pedagogies, and assessment.
In the last decade, much effort has gone into the design of robust third-person pronominal anaphor resolution algorithms. Typical approaches are reported to achieve an accuracy of 60-85%. Recent research addresses the question of how to deal with the remaining difficult-to-resolve anaphors. Lappin (2004) proposes a sequenced model of anaphor resolution according to which a cascade of processing modules employing knowledge and inferencing techniques of increasing complexity should be applied. The individual modules should only deal with, and hence recognize, the subset of anaphors for which they are competent. It will be shown that the problem of focusing on the competence cases is equivalent to the problem of giving precision precedence over recall. Three systems for high-precision, robust, knowledge-poor anaphor resolution will be designed and compared: a ruleset-based approach, a salience threshold approach, and a machine-learning-based approach. According to corpus-based evaluation, there is no unique best approach. Which approach scores highest depends upon the type of pronominal anaphor as well as upon the text genre.
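The precision-over-recall idea behind a salience threshold approach can be illustrated with a minimal sketch (the `Candidate` class and `resolve` function below are hypothetical, not the paper's actual implementation): the resolver commits to an antecedent only when its best candidate clears a salience threshold, and otherwise abstains, leaving the anaphor to a later, more capable module in the cascade.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A hypothetical antecedent candidate with a salience score."""
    mention: str
    salience: float

def resolve(candidates, threshold=0.8):
    """Return the most salient candidate's mention, or None (abstain)
    if its salience does not reach the threshold. Abstaining on
    uncertain cases trades recall for precision."""
    if not candidates:
        return None
    best = max(candidates, key=lambda c: c.salience)
    return best.mention if best.salience >= threshold else None

# The resolver answers only when it is confident:
confident = [Candidate("the manager", 0.9), Candidate("the report", 0.3)]
uncertain = [Candidate("the manager", 0.55), Candidate("the report", 0.5)]
print(resolve(confident))   # "the manager"
print(resolve(uncertain))   # None -- deferred to a later module
```

Raising the threshold shrinks the set of anaphors the module handles (lower recall) while making its answers more reliable (higher precision), which is exactly the competence-restriction described above.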
The key hypothesis is that the IT industry lures us into the IT world with a promise to solve our information problems. If we sign the contract, we will recognise that the IT industry cannot keep that promise. One reason: it has lost sight of its own game, and therefore has to invent new tools continuously. LIS professionals should not leave the field to IT professionals. Rather, LIS professionals should emphasise the differences in the value chain between data, information, and knowledge. Information and knowledge are brainware; they are not produced by hardware and software in the sense of the IT philosophy. Against the background of Jean-François Lyotard's notion of the language game, the author explains the information and knowledge society as a language game invented by the IT industry. Furthermore, his views on postmodern LIS professionals and the resulting consequences for LIS training will be presented.
The research performed in the DeepThought project aims at demonstrating the potential of deep linguistic processing when combined with shallow methods for robustness. Classical information retrieval is extended by high-precision concept indexing and relation detection. On the basis of this approach, the feasibility of three ambitious applications will be demonstrated, namely: precise information extraction for business intelligence; email response management for customer relationship management; and creativity support for document production and collective brainstorming. Common to these applications, and the basis for their development, is the XML-based, RMRS-enabled core architecture framework that is described in detail in this paper. The framework is not limited to the applications envisaged in the DeepThought project, but can also be employed, e.g., to generate and make use of XML standoff annotation of documents and linguistic corpora, and in general for a wide range of NLP-based applications and research purposes.
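To illustrate what XML standoff annotation means in general (a hypothetical sketch; the element and attribute names below are invented and are not the DeepThought framework's actual schema): annotations are stored separately from the source text and point into it by character offsets, so the text itself remains unmodified and multiple, possibly overlapping annotation layers can coexist.

```python
import xml.etree.ElementTree as ET

# Standoff annotation: the source text stays untouched, and each
# annotation references a span of it by character offsets.
text = "DeepThought combines deep and shallow processing."

root = ET.Element("standoff")
ann = ET.SubElement(root, "annotation",
                    type="named-entity", start="0", end="11")
ann.text = "PROJECT"

# Resolving an annotation back to its surface string:
start, end = int(ann.get("start")), int(ann.get("end"))
print(text[start:end])  # "DeepThought"
```

Because the offsets live outside the text, inline markup conflicts (crossing element boundaries) never arise, which is one reason standoff annotation is common for linguistic corpora.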
We present a detailed study of chemical freeze-out in nucleus-nucleus collisions at beam energies of 11.6, 30, 40, 80 and 158A GeV. By analyzing hadronic multiplicities within the statistical hadronization approach, we have studied the chemical equilibration of the system as a function of center-of-mass energy and of the parameters of the source. Additionally, we have tested and compared different versions of the statistical model, with special emphasis on possible explanations of the observed under-saturation of the strangeness hadronic phase space.
Background: The detection of the new coronavirus (CoV) that is the causative agent of the severe acute respiratory syndrome (SARS) remains a critical step in the prevention of secondary hospital infections. In this respect, PCR is the fastest and most sensitive method for SARS diagnostics, and protocols were published by different groups very early after the description of the new pathogen. To evaluate the quality and sensitivity of the SARS PCR performed in diagnostic laboratories all over the world, an external quality assurance (EQA) for SARS PCR was initiated by the WHO, the European Network for Diagnostics of "Imported" Viral Diseases (ENIVD) and the Robert Koch-Institut.
Methods: Ten samples of inactivated SARS CoV strains isolated in Frankfurt and Hong Kong, in different dilutions, together with negative controls, were prepared. The freeze-dried samples were sent by mail to 62 different laboratories in 37 countries: Europe and Israel (35), Asia (11), the Americas (11), Australia and New Zealand (4) and Africa (1). The results were returned by email or fax one week (13), two weeks (14), three weeks (6) or later (29) after receipt of the material, which does not at all reflect the possible speed of this fast method; however, turnaround time was not considered in the evaluation of this first SARS EQA.
Results: 44 laboratories showed good or excellent results (26 with 100%, 18 with 90% correct results), and even the 14 laboratories which achieved only 80% (10) or 70% (4) correct results were mostly lacking only sensitivity. The results of the other 4 laboratories show basic problems with regard to sensitivity, specificity and consistency of results, which must be overcome as soon as possible. 4 laboratories seem to have problems with specificity, finding a positive signal in negative samples. The different methods used by the participating laboratories for preparation of the SARS CoV genome and for the diagnostic PCR test procedure will be discussed in more detail in the presentation.
Conclusion: In contrast to previous EQAs for Ebola, Lassa and Orthopoxviruses, the quality of the PCR results was rather good, which might be explained by the early publication and distribution of well-developed PCR methods. An EQA for the evaluation of SARS-specific serology is still ongoing; first results will be available at the beginning of April 2004.
Background and Aim: In Germany, the discharge medication is usually reported to the general practitioner (GP) first by an initial short report (SR)/notification (handed over to the patient) and later by a more detailed discharge letter (DL) from the hospital.
Material and Method: We asked N=536 GPs (from Frankfurt/Main and Luebeck) about the typical reporting format used by the local hospitals for their patients' discharge medication. The questionnaire contained 26 items covering (1) the designation of the medication (brand name, generic name) in the SR and DL, (2) further specifications, e.g. possibilities of generic substitution or supervision of sensitive medications, (3) reasons why GPs do not follow the hospitals' recommendations, and (4) possibilities for improving the medication-related communication between GPs and hospitals.
Results: 39% of the GPs responded sufficiently to the questionnaire. The majority of the GPs (82%) stated that in the SR only brand names are given (often or always) and that neither the generic name nor any further information on generic substitution is available (seldom or never). 65% of the responders stated that even in the DL only brand names are given. Only 41% of the responders stated that further treatment-relevant specifications are given (often or always). 95% responded that new medications or changes to the customary medication are seldom or never explained in the DL and that GPs were not explicitly informed about relevant medication changes. 58% of the responders cited economic reasons, e.g. generic substitution, for re-adjusting the discharge medication. The majority of responders (83%) favoured (useful or very useful) pre-discharge information (e.g. via fax) about the medication, and 54% favoured a hotline to a relevant contact person in the hospital for when treatment problems emerge. 67% of the responders were in favour of regular meetings between GPs and hospital doctors on current pharmacotherapy.
Conclusion: Our survey pointed to marked deficiencies in the reporting of discharge medication to GPs.
Conflict of interest: None
The volume is a collection of papers given at the conference "sub8 -- Sinn und Bedeutung", the eighth annual conference of the Gesellschaft für Semantik, held at the Johann-Wolfgang-Goethe-Universität, Frankfurt (Germany) in September 2003. During this conference, experts presented and discussed various aspects of semantics. The very different topics included in this book provide insight into fields of ongoing research in semantics.
Using faculty-librarian partnerships to ensure that students become information fluent in the 21st century
In the 21st century, educators in partnership with librarians must prepare students effectively for the productive use of information, especially in higher education. Students will need to graduate from universities with appropriate information and technology skills to enable them to become productive citizens in the workplace and in society. Technology is having a major impact on society: in economics, e-business is moving to the forefront; in communication, e-mail, the Internet and cellular telephones have reformed how people communicate; in the work environment, the use of computers and the web is emphasized; and in education, virtual learning and teaching are becoming more important. These few examples indicate how the 21st century information environment requires future members of the workforce to be information fluent, so that they have the ability to locate information efficiently, evaluate information for specific needs, organize information to address issues, apply information skillfully to solve problems, use information to communicate effectively, and use information responsibly to ensure a productive work environment. Individuals can achieve information fluency by acquiring cultural, visual, computer, technology, research and information management skills that enable them to think critically.