Course management software: supporting the university’s teaching with technology initiatives
(2004)
An increasingly important element of the teaching with technology activities at Northwestern University is the course management system, a web-based class communication and administration environment. The usage growth of the system is substantial and amplifies the need for integration with other web services and resources. Integration is particularly important in the area of library services. This presentation contains a case study of Northwestern University's implementation of its course management system software and highlights examples of how the system is being used to enhance teaching and learning. A description of the integration efforts with library resources is provided. The goal of the presentation is to equip librarians with the basic knowledge required to engage with their colleagues in conversations surrounding the integration of these systems within the teaching and learning landscapes of their home institutions.
The key hypothesis is that the IT industry lures us into the IT world with the promise of solving our information problems. Once we sign the contract, we recognise that the IT industry cannot keep that promise. One reason: they themselves have lost sight of their own game and therefore have to invent new tools continuously. LIS professionals should not leave the field to IT professionals. They should rather stress the differences along the value chain from data to information to knowledge. Information and knowledge are brainware and are not produced by hardware and software in the sense of the IT philosophy. Against the background of the language games of Jean-François Lyotard, the author explains the information and knowledge society as a language game invented by the IT industry. Furthermore, his views on postmodern LIS professionals and the resulting consequences for LIS training are presented.
We present a detailed study of chemical freeze-out in nucleus-nucleus collisions at beam energies of 11.6, 30, 40, 80 and 158A GeV. By analyzing hadronic multiplicities within the statistical hadronization approach, we have studied the chemical equilibration of the system as a function of center of mass energy and of the parameters of the source. Additionally, we have tested and compared different versions of the statistical model, with special emphasis on possible explanations of the observed under-saturation of the strangeness hadronic phase space.
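Yields in such statistical-hadronization fits are commonly computed from a grand-canonical formula of the following form (a generic textbook expression, not necessarily the exact parametrization used in this study):

```latex
% Primary hadron density in the Boltzmann approximation:
% g_i spin degeneracy, m_i mass, T temperature, K_2 modified Bessel function,
% mu_B, mu_S baryon and strangeness chemical potentials, B_i, S_i the hadron's
% baryon number and strangeness, gamma_s the strangeness saturation factor
% (|s_i| = number of strange valence quarks in hadron i).
n_i = \frac{g_i}{2\pi^2}\, T\, m_i^2\, K_2\!\left(\frac{m_i}{T}\right)
      \exp\!\left(\frac{\mu_B B_i + \mu_S S_i}{T}\right)\gamma_s^{|s_i|}
```

The under-saturation mentioned above corresponds to fitted values of the strangeness factor below one.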
The volume is a collection of papers given at the conference “sub8 -- Sinn und Bedeutung”, the eighth annual conference of the Gesellschaft für Semantik, held at the Johann-Wolfgang-Goethe-Universität, Frankfurt (Germany) in September 2003. During this conference, experts presented and discussed various aspects of semantics. The wide range of topics included in this book provides insight into the fields of ongoing semantics research.
While the sortal constraints associated with Japanese numeral classifiers are well studied, less attention has been paid to the details of their syntax. We describe an analysis implemented within a broad-coverage HPSG that handles an intricate set of numeral classifier construction types and compositionally relates each to an appropriate semantic representation, using Minimal Recursion Semantics.
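The sortal constraints mentioned above can be illustrated with a toy sketch (the classifier inventory, sort names, and the flat output structure are simplified assumptions for illustration, not the paper's HPSG implementation):

```python
# Japanese numeral classifiers select for the semantic sort of the noun
# they count; composing a numeral + classifier with a noun of the wrong
# sort is ill-formed.
CLASSIFIER_SORTS = {
    "hon": "long-thin-object",   # pens, bottles, umbrellas
    "mai": "flat-object",        # sheets of paper, plates
    "nin": "human",              # people
    "satsu": "bound-volume",     # books
}

NOUN_SORTS = {
    "enpitsu": "long-thin-object",  # 'pencil'
    "kami": "flat-object",          # 'paper'
    "gakusei": "human",             # 'student'
}

def classify(numeral: int, classifier: str, noun: str) -> dict:
    """Compose numeral + classifier + noun, enforcing the sortal constraint."""
    want = CLASSIFIER_SORTS[classifier]
    have = NOUN_SORTS[noun]
    if want != have:
        raise ValueError(f"classifier '{classifier}' selects {want}, "
                         f"but '{noun}' denotes {have}")
    # Flat stand-in for an MRS-style predication: the classifier contributes
    # a counting relation that takes the noun's referent as its argument.
    return {"pred": "card", "count": numeral, "arg": noun}

print(classify(2, "nin", "gakusei"))   # well-formed: 'gakusei futa-ri', two students
```

A real MRS representation would of course use labelled elementary predications and handle scope; the sketch only shows the sortal check.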
Hybrid robust deep and shallow semantic processing for creativity support in document production
(2004)
The research performed in the DeepThought project (http://www.project-deepthought.net) aims at demonstrating the potential of deep linguistic processing when added to existing shallow methods that ensure robustness. Classical information retrieval is extended by high-precision concept indexing and relation detection. We use this approach to demonstrate the feasibility of three ambitious applications, one of which is a tool for creativity support in document production and collective brainstorming. This application is described in detail in this paper. Common to all three applications, and the basis for their development, is a platform for integrated linguistic processing. This platform is based on a generic software architecture that combines multiple NLP components and on Robust Minimal Recursion Semantics (RMRS) as a uniform representation language.
The research performed in the DeepThought project aims at demonstrating the potential of deep linguistic processing when combined with shallow methods for robustness. Classical information retrieval is extended by high-precision concept indexing and relation detection. On the basis of this approach, the feasibility of three ambitious applications will be demonstrated, namely: precise information extraction for business intelligence; email response management for customer relationship management; and creativity support for document production and collective brainstorming. Common to these applications, and the basis for their development, is the XML-based, RMRS-enabled core architecture framework that will be described in detail in this paper. The framework is not limited to the applications envisaged in the DeepThought project, but can also be employed, e.g., to generate and make use of XML standoff annotation of documents and linguistic corpora, and in general for a wide range of NLP-based applications and research purposes.
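The XML standoff annotation mentioned above keeps the source text untouched and lets annotations point into it by character offsets. A minimal sketch (element and attribute names are invented for illustration, not the DeepThought schema):

```python
import xml.etree.ElementTree as ET

text = "DeepThought combines deep and shallow processing."

# Standoff annotations: each <ann> references a character span (cfrom, cto)
# in the untouched source text rather than wrapping the text itself.
root = ET.Element("standoff", source="doc1.txt")
for start, end, label in [(0, 11, "project"), (21, 25, "depth"), (30, 37, "depth")]:
    ET.SubElement(root, "ann", type=label,
                  cfrom=str(start), cto=str(end)).text = text[start:end]

print(ET.tostring(root, encoding="unicode"))
```

Because the annotations live outside the text, multiple tools can layer overlapping analyses over the same corpus without conflicting markup.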
In the last decade, much effort went into the design of robust third-person pronominal anaphor resolution algorithms. Typical approaches are reported to achieve an accuracy of 60-85%. Recent research addresses the question of how to deal with the remaining difficult-to-resolve anaphors. Lappin (2004) proposes a sequenced model of anaphor resolution according to which a cascade of processing modules employing knowledge and inferencing techniques of increasing complexity should be applied. The individual modules should only deal with, and hence recognize, the subset of anaphors for which they are competent. It will be shown that the problem of focusing on the competence cases is equivalent to the problem of giving precision precedence over recall. Three systems for high-precision, robust, knowledge-poor anaphor resolution will be designed and compared: a ruleset-based approach, a salience threshold approach, and a machine-learning-based approach. According to corpus-based evaluation, there is no unique best approach. Which approach scores highest depends upon the type of pronominal anaphor as well as upon the text genre.
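The salience threshold idea can be sketched as follows (the scores and features are invented toy values; a real system derives salience from parsed text): the resolver answers only when the best compatible candidate beats a threshold, trading recall for precision by abstaining on unclear cases.

```python
def resolve(pronoun_gender, candidates, threshold=3.0):
    """candidates: list of (mention, gender, salience) tuples.
    Returns the most salient gender-compatible antecedent, or None
    if the resolver is not confident enough to answer."""
    compatible = [(m, s) for m, g, s in candidates if g == pronoun_gender]
    if not compatible:
        return None
    best, score = max(compatible, key=lambda c: c[1])
    return best if score >= threshold else None  # abstain below threshold

# 'He' with a salient masculine subject: the resolver answers.
print(resolve("masc", [("John", "masc", 4.5), ("the report", "neut", 2.0)]))
# 'He' with only a weakly salient candidate: the resolver abstains.
print(resolve("masc", [("a man", "masc", 1.5)]))
```

Raising the threshold shrinks the set of anaphors the module claims competence for, which is exactly the precision-over-recall trade-off described above.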
Background and Aim: In Germany, the discharge medication is usually reported to the general practitioner (GP) first by an initial short report (SR)/notification (handed over to the patient) and later by a more detailed discharge letter (DL) from the hospital.
Material and Method: We asked N=536 GPs (from Frankfurt/Main and Luebeck) about the typical format in which their patients' discharge medication is reported by the local hospitals. The questionnaire comprised 26 items covering (1) the designation of the medication (brand name, generic name) in SR and DL, (2) further specifications, e.g. possibilities of generic substitution or the monitoring of sensitive medications, (3) reasons why GPs do not follow the hospitals' recommendations, and (4) possibilities for improving the medication-related communication between GPs and hospitals.
Results: 39% of the GPs responded sufficiently to the questionnaire. The majority of the GPs (82%) reported that the SR gives only brand names (often or always) and that neither the generic name nor any further information on generic substitution is available (seldom or never). 65% of the responders reported that even the DL gives only brand names. Only 41% of the responders reported that further treatment-relevant specifications are given (often or always). 95% responded that new medications or changes to the existing medication are seldom or never explained in the DL and that GPs were not explicitly informed about relevant medication changes. 58% of the responders cited economic reasons, e.g. generic substitution, for re-adjusting the discharge medication. The majority of responders (83%) rated a pre-discharge notification (e.g. via fax) about the medication as useful or very useful, and 54% said the same of a hotline to a relevant contact person in the hospital for emerging treatment problems. 67% of the responders were in favour of regular meetings between GPs and hospital doctors on current pharmacotherapy.
Conclusion: Our survey points to marked deficiencies in the reporting of discharge medication to GPs.
Conflict of interest: None
Background: The detection of the new coronavirus (CoV), the causative agent of the severe acute respiratory syndrome (SARS), for diagnostic purposes is still a critical step in the prevention of secondary hospital infections. PCR is the fastest and most sensitive method for SARS diagnostics and was published by different groups very soon after the description of the new pathogen. To evaluate the quality and sensitivity of the SARS PCR performed in diagnostic laboratories all over the world, an external quality assurance (EQA) for SARS PCR was initiated by the WHO, the European Network for Diagnostics of "Imported" Viral Diseases (ENIVD) and the Robert Koch-Institut. Methods: Ten samples of inactivated SARS CoV strains isolated in Frankfurt and Hong Kong, in different dilutions, and negative controls were prepared. The freeze-dried samples were sent by mail to 62 different laboratories in 37 countries: Europe and Israel (35), Asia (11), the Americas (11), Australia and New Zealand (4) and Africa (1). The results were returned by email or fax 1 week (13), 2 weeks (14), 3 weeks (6) or later (29) after receipt of the material, which does not reflect the speed possible with this fast method; this was, however, not considered in the evaluation of this first SARS EQA. Results: 44 laboratories showed good or excellent results (26 = 100%, 18 = 90%), and even the 14 laboratories which achieved only 80% (10) or 70% (4) correct results were mostly lacking only in sensitivity. The results of the other 4 laboratories show basic problems with regard to sensitivity, specificity and consistency of results, which must be overcome as soon as possible. 4 laboratories appear to have problems with specificity, finding a positive signal in negative samples. The different methods used by the participating laboratories for the preparation of the SARS CoV genome and for the diagnostic PCR test procedure will be discussed in more detail in the presentation.
Conclusion: In contrast to previous EQAs for Ebola, Lassa and orthopoxviruses, the quality of the PCR results was rather good, which may be attributable to the early publication and distribution of well-developed PCR methods. An EQA for the evaluation of SARS-specific serology is still ongoing; first results will be available at the beginning of April 2004.
Despite a legal framework being in place for several years, the market share of qualified electronic signatures is disappointingly low. Mobile signatures provide a new and promising opportunity for the deployment of an infrastructure for qualified electronic signatures. We argue that SIM-based signatures are the most secure and convenient solution. However, using the SIM card as a secure signature creation device (SSCD) raises new challenges, because it would contain the user’s private key as well as the subscriber identification. Combining both functions on one card raises the question of who will control the keys and certificates. We propose a protocol called Certification on Demand (COD) that separates certification services from subscriber identification information and allows consumers to choose certification services and service providers appropriate to their needs. This infrastructure could be used to enable secure mobile brokerage services that eliminate the need for TAN lists and therefore allow a better integration of information and transaction services.
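The separation COD calls for can be modelled in a minimal sketch (all names, identifiers and the data layout are illustrative assumptions, not the paper's protocol messages): the SIM carries a pre-generated signature key, while the certificate is obtained later from a certification service the user chooses, independently of the operator's subscriber data.

```python
from dataclasses import dataclass

@dataclass
class SimCard:
    subscriber_id: str        # operator-controlled subscriber identification
    signature_key_id: str     # pre-generated signature key pair, not yet certified

@dataclass
class Certificate:
    key_id: str
    holder_name: str
    issuer: str               # certification service freely chosen by the user

def certify_on_demand(sim: SimCard, holder_name: str, ca: str) -> Certificate:
    """The user picks a certification service after obtaining the card;
    only the key reference is certified -- the operator's subscriber
    identification plays no role in the certification step."""
    return Certificate(key_id=sim.signature_key_id,
                       holder_name=holder_name, issuer=ca)

sim = SimCard(subscriber_id="262-01-0000000001", signature_key_id="key-42")
cert = certify_on_demand(sim, "Alice Example", "Example Trust Center")
print(cert)
```

The point of the model is the decoupling: the certificate binds the key to the holder and the chosen issuer, not to the mobile subscription.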
The hadronic final state of central Pb+Pb collisions at 20, 30, 40, 80, and 158 AGeV has been measured by the CERN NA49 collaboration. The mean transverse mass of pions and kaons at midrapidity stays nearly constant in this energy range, whereas at lower energies, at the AGS, a steep increase with beam energy was measured. Compared to p+p collisions as well as to model calculations, anomalies in the energy dependence of pion and kaon production at lower SPS energies are observed. These findings can be explained by assuming that the energy density reached in central A+A collisions at lower SPS energies is sufficient to transform the hot and dense nuclear matter into a deconfined phase.
Occurrence of hepatitis B virus (HBV) reactivation following kidney transplantation
(2004)
Japanese is often taken to be strictly head-final in its syntax. In our work on a broad-coverage, precision implemented HPSG for Japanese, we have found that while this is generally true, there are nonetheless a few minor exceptions to the broad trend. In this paper, we describe the grammar engineering project, present the exceptions we have found, and conclude that this kind of phenomenon motivates, on the one hand, the HPSG type-hierarchy approach, which allows the statement of both broad generalizations and exceptions to those generalizations, and demonstrates, on the other, the usefulness of grammar engineering as a means of testing linguistic hypotheses.
Focus expressions in Yom
(2005)