Hybrid robust deep and shallow semantic processing for creativity support in document production
(2004)
The research performed in the DeepThought project (http://www.project-deepthought.net) aims at demonstrating the potential of deep linguistic processing when added to existing shallow methods that ensure robustness. Classical information retrieval is extended by high-precision concept indexing and relation detection. We use this approach to demonstrate the feasibility of three ambitious applications, one of which is a tool for creativity support in document production and collective brainstorming. This application is described in detail in this paper. Common to all three applications, and the basis for their development, is a platform for integrated linguistic processing. This platform is based on a generic software architecture that combines multiple NLP components and on Robust Minimal Recursion Semantics (RMRS) as a uniform representation language.
In the last decade, much effort went into the design of robust third-person pronominal anaphor resolution algorithms. Typical approaches are reported to achieve an accuracy of 60-85%. Recent research addresses the question of how to deal with the remaining difficult-to-resolve anaphors. Lappin (2004) proposes a sequenced model of anaphor resolution, according to which a cascade of processing modules employing knowledge and inferencing techniques of increasing complexity should be applied. The individual modules should only deal with, and hence recognize, the subset of anaphors for which they are competent. It will be shown that the problem of focusing on the competence cases is equivalent to the problem of giving precision precedence over recall. Three systems for high-precision, robust, knowledge-poor anaphor resolution will be designed and compared: a ruleset-based approach, a salience threshold approach, and a machine-learning-based approach. According to corpus-based evaluation, there is no unique best approach. Which approach scores highest depends upon the type of pronominal anaphor as well as upon text genre.
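The precision-over-recall cascade described above can be sketched as follows. This is a hedged illustration only: the module names, the toy gender feature, and the abstention behaviour are invented for the sketch and are not the actual systems evaluated in the paper.

```python
# Sketch of a "sequenced" high-precision anaphor-resolution cascade:
# each module answers only on the subset of anaphors it is competent for
# (returning None otherwise), so precision takes precedence over recall.
from typing import Callable, Optional

Resolver = Callable[[str, list], Optional[str]]

def syntactic_filter(pronoun: str, candidates: list) -> Optional[str]:
    """Toy high-precision rule: answer only when exactly one candidate
    agrees with the pronoun in (hard-coded) gender; abstain otherwise."""
    gender = {"he": "m", "she": "f", "it": "n"}.get(pronoun)
    matching = [c for c, g in candidates if g == gender]
    return matching[0] if len(matching) == 1 else None

def salience_fallback(pronoun: str, candidates: list) -> Optional[str]:
    """Toy salience module: here simply the most recent candidate; a real
    threshold module would also abstain on low-salience cases."""
    return candidates[-1][0] if candidates else None

def resolve_cascade(pronoun: str, candidates: list, modules: list) -> Optional[str]:
    """Run the modules in order; the first non-abstaining answer wins."""
    for module in modules:
        answer = module(pronoun, candidates)
        if answer is not None:
            return answer
    return None  # all modules abstained: leave the anaphor unresolved

# candidates are (mention, gender) pairs in order of occurrence
mentions = [("Mary", "f"), ("John", "m")]
cascade = [syntactic_filter, salience_fallback]
print(resolve_cascade("she", mentions, cascade))  # -> Mary
```

The design point is that early modules buy precision by abstaining, and recall is recovered only as far as later, less reliable modules permit.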
Japanese is often taken to be strictly head-final in its syntax. In our work on a broad-coverage, precision implemented HPSG for Japanese, we have found that while this is generally true, there are nonetheless a few minor exceptions to the broad trend. In this paper, we describe the grammar engineering project, present the exceptions we have found, and conclude that this kind of phenomenon motivates on the one hand the HPSG type hierarchical approach which allows for the statement of both broad generalizations and exceptions to those generalizations and on the other hand the usefulness of grammar engineering as a means of testing linguistic hypotheses.
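The interplay of broad generalizations and exceptions that a type hierarchy affords can be illustrated with a minimal sketch. The class names and the single `head_final` feature are invented for illustration; a real HPSG type hierarchy carries far richer constraints.

```python
# Minimal sketch of stating a broad generalization (Japanese phrases are
# head-final) high in a type hierarchy, while a small subtype overrides it
# for the exceptional constructions.
class Phrase:
    head_final = True          # broad generalization for Japanese

class HeadFinalPhrase(Phrase):
    pass                       # inherits the generalization unchanged

class ExceptionalHeadInitialPhrase(Phrase):
    head_final = False         # lexically restricted exception overrides it

print(Phrase.head_final, ExceptionalHeadInitialPhrase.head_final)  # True False
```

Inheritance with override mirrors how the grammar keeps the generalization cheap to state while still accommodating the handful of counterexamples.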
Research on dialectal varieties was for a long time concentrated on phonetic aspects of language. While much work was done on segmental aspects, suprasegmentals remained unexplored until the last few years, despite the fact that prosody has been noted as a salient aspect of dialectal variants by linguists and naive speakers alike. Current research on dialectal prosody in the German-speaking area often employs discourse-analytic methods, correlating intonation curves with communicative functions (P. Auer et al. 2000, P. Gilles & R. Schrambke 2000, R. Kehrein & S. Rabanus 2001). The project I present here has another focus. It looks at general prosodic aspects, abstracted from actual situations. These global structures are modelled and integrated into a speech synthesis system. Today, mostly intonation is being investigated; rhythm, the temporal organisation of speech, is not at the core of current research on prosody. But there is evidence that temporal organisation is one of the main structuring elements of speech (B. Zellner 1998, B. Zellner Keller 2002). Following this approach developed for speech synthesis, I will present the modelling of the timing of two Swiss German dialects (Bernese and Zurich dialect) that are considered quite different on the prosodic level. These models are part of the project on the "development of basic knowledge for research on Swiss German prosody by means of speech synthesis modelling" funded by the Swiss National Science Foundation.
Course management software : supporting the university’s teaching with technology initiatives
(2004)
An increasingly important element of the teaching-with-technology activities at Northwestern University is the course management system, a web-based class communication and administration environment. The usage growth of the system is substantial and amplifies the need for integration with other web services and resources. Integration is particularly important in the area of library services. This presentation contains a case study of Northwestern University's implementation of its course management system software and highlights examples of how the system is being used to enhance teaching and learning. A description of the integration efforts with library resources is provided. The goal of the presentation is to equip librarians with the basic knowledge required to engage with their colleagues in conversations surrounding the integration of these systems within the teaching and learning landscapes of their home institutions.
Despite a legal framework being in place for several years, the market share of qualified electronic signatures is disappointingly low. Mobile signatures provide a new and promising opportunity for the deployment of an infrastructure for qualified electronic signatures. We argue that SIM-based signatures are the most secure and convenient solution. However, using the SIM card as a secure signature creation device (SSCD) raises new challenges, because it would contain the user's private key as well as the subscriber identification. Combining both functions in one card raises the question of who will have control over the keys and certificates. We propose a protocol called Certification on Demand (COD) that separates certification services from subscriber identification information and allows consumers to choose appropriate certification services and service providers based on their needs. This infrastructure could be used to enable secure mobile brokerage services that omit the necessity of TAN lists and therefore allow a better integration of information and transaction services.
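The separation that COD aims at can be sketched minimally as follows. All names and data structures here are invented for illustration; the actual protocol involves cryptographic identification and signing steps that are not shown.

```python
# Illustrative sketch (not the paper's specification) of the core COD idea:
# the SIM carries the signature key pair from the start, but the qualified
# certificate is issued later, by a certification service the consumer
# chooses, independently of the operator-controlled subscriber identity.
from dataclasses import dataclass, field

@dataclass
class SIMCard:
    subscriber_id: str              # operator-controlled identification
    signature_pubkey: str           # public half of pre-generated signing keys
    certificates: list = field(default_factory=list)

@dataclass
class CertificationService:
    name: str
    def certify(self, holder_name: str, pubkey: str) -> dict:
        # a real CA would verify identity and cryptographically sign this
        return {"issuer": self.name, "subject": holder_name, "key": pubkey}

def certification_on_demand(sim: SIMCard, ca: CertificationService, holder: str):
    """The consumer picks a CA of their choice; the operator's
    subscriber_id plays no role in the certification step."""
    sim.certificates.append(ca.certify(holder, sim.signature_pubkey))

sim = SIMCard(subscriber_id="IMSI-placeholder", signature_pubkey="PK-abc")
certification_on_demand(sim, CertificationService("TrustCenter-X"), "Alice")
print(sim.certificates[0]["issuer"])  # -> TrustCenter-X
```

The point of the sketch is only the decoupling: the certificate references the signing key and the holder, never the subscriber identification.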
The key hypothesis is that the IT industry lures us into the IT world with a promise to solve our information problems. Once we sign the contract, we recognise that the IT industry cannot keep that promise. One reason: it has lost sight of its own game, and therefore has to invent new tools continuously. LIS professionals should not leave the field to IT professionals. They should rather emphasize the difference in the value chain between data, information and knowledge. Information and knowledge are brainware and are not produced by hardware and software in the sense of the IT philosophy. Against the background of Jean-François Lyotard's notion of the language game, the author explains the information and knowledge society as a language game invented by the IT industry. Furthermore, his views on postmodern LIS professionals and the consequences involved for LIS training will be presented.
Using faculty-librarian partnerships to ensure that students become information fluent in the 21st century
In the 21st century, educators in partnership with librarians must prepare students effectively for the productive use of information, especially in higher education. Students will need to graduate from universities with appropriate information and technology skills to enable them to become productive citizens in the workplace and in society. Technology is having a major impact on society: in economics, e-business is moving to the forefront; in communication, e-mail, the Internet and cellular telephones have transformed how people communicate; in the work environment, computers and web applications are emphasized; and in education, virtual learning and teaching are becoming more important. These few examples indicate how the 21st-century information environment requires future members of the workforce to be information fluent, so they will have the ability to locate information efficiently, evaluate information for specific needs, organize information to address issues, apply information skillfully to solve problems, use information to communicate effectively, and use information responsibly to ensure a productive work environment. Individuals can achieve information fluency by acquiring cultural, visual, computer, technology, research and information management skills that enable them to think critically.
Navigating information, facilitating knowledge: the library, the academy, and student learning
(2004)
Understanding the nature and complementarity of the phenomena of information and knowledge not only lends epistemological clarity to their relationship, but also reaffirms the place of the library in the academic mission of knowledge transfer, acquisition, interpretation, and creation. This in turn reasserts the legitimacy of the academic library as a necessary participant in the teaching enterprise of colleges and universities. Such legitimacy induces an obligation to teach, and that obligation needs to be explored and implemented with adequate vigor and reach. Librarians and the academy must, however, concede that the scope of the task calls for a solution that goes beyond shared responsibilities. Academic libraries should assume a full teaching function even as they continue their exploration and design of activities and programs aimed at reinforcing information literacy in the various disciplines on campus. All must concede that the need for collaboration cannot provide grounds for questioning the desirability of autonomous teaching status for the academic library in information literacy education.
Background: The detection of the new coronavirus (CoV), the causative agent of the severe acute respiratory syndrome (SARS), for diagnostic purposes is still a critical step in the prevention of secondary hospital infections. In this respect, PCR for SARS diagnostics is the fastest and most sensitive method and was published by different groups very early after the description of the new pathogen. To evaluate the quality and sensitivity of the SARS PCR performed in diagnostic laboratories all over the world, an external quality assurance (EQA) for SARS PCR was initiated by the WHO, the European Network for Diagnostics of "Imported" Viral Diseases (ENIVD) and the Robert Koch-Institut. Methods: Ten samples of inactivated SARS CoV strains isolated in Frankfurt and Hong Kong, in different dilutions, plus negative controls were prepared. The freeze-dried samples were sent by mail to 62 different laboratories in 37 countries in Europe and Israel (35), Asia (11), the Americas (11), Australia and New Zealand (4) and Africa (1). The results were returned by email or fax 1 week (13), 2 weeks (14), 3 weeks (6) and later (29) after receiving the material, which does not at all reflect the possible speed of this fast method; however, this was not considered in the evaluation of this first SARS EQA. Results: 44 laboratories showed good or excellent results (26 = 100%, 18 = 90%), and even the 14 laboratories which achieved only 80% (10) or 70% (4) correct results were mostly lacking only in sensitivity. The results of the remaining 4 laboratories show basic problems with regard to sensitivity, specificity and consistency of results, and these must be overcome as soon as possible. 4 laboratories seem to have problems with specificity, finding a positive signal in negative samples. The different methods used for the preparation of the SARS CoV genome and the diagnostic PCR test procedures used by the participating laboratories will be discussed in more detail in the presentation.
Conclusion: In contrast to previous EQAs for Ebola, Lassa and Orthopoxviruses, however, the quality of the PCR results was rather good, which might be due to the early publication and distribution of well-developed PCR methods. An EQA for the evaluation of SARS-specific serology is still ongoing; first results will be available at the beginning of April 2004.
We present a detailed study of chemical freeze-out in nucleus-nucleus collisions at beam energies of 11.6, 30, 40, 80 and 158A GeV. By analyzing hadronic multiplicities within the statistical hadronization approach, we have studied the chemical equilibration of the system as a function of center-of-mass energy and of the parameters of the source. Additionally, we have tested and compared different versions of the statistical model, with special emphasis on possible explanations of the observed under-saturation of the strangeness phase space.
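In statistical hadronization models of this kind, strangeness under-saturation is commonly parameterized by an extra saturation factor γ_s. As a hedged illustration (the exact model variants compared in the study may differ), the primary multiplicity of hadron species i in the Boltzmann approximation takes the form

\[
\langle N_i \rangle \;=\; \gamma_s^{\,s_i}\,\frac{g_i V}{2\pi^2}\; T\, m_i^2\, K_2\!\left(\frac{m_i}{T}\right) e^{\mu_i/T},
\]

where \(g_i\) is the spin degeneracy, \(V\) the source volume, \(T\) the chemical freeze-out temperature, \(m_i\) the hadron mass, \(\mu_i\) its chemical potential, \(K_2\) a modified Bessel function, and \(s_i\) the number of valence strange quarks; fitted values \(\gamma_s < 1\) correspond to the observed under-saturation.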
A pragmatic explanation of the stage level/individual level contrast in combination with locatives
(2004)
One important difference between stage level predicates (SLPs) and individual level predicates (ILPs) is their behavior with respect to locative modifiers. It is commonly assumed that SLPs but not ILPs combine with locatives. The present study argues against a semantic account of this behavior (as advanced by e.g. Kratzer 1995, Chierchia 1995) and proposes a genuinely pragmatic explanation of the observed stage level/individual level contrast instead. The proposal is spelled out using Blutner's (1998, 2000) optimality theoretic version of the Gricean maxims. Building on the observation that the respective locatives are not event-related but frame-setting modifiers, the preference for main predicates that express temporary properties is explained as a side-effect of "synchronizing" the main predicate with the locative frame in the course of finding an optimal interpretation. By emphasizing the division of labor between grammar and pragmatics, the proposed solution takes a considerable load off semantics.
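The optimality-theoretic reasoning can be made concrete with a toy implementation of strong bidirectional optimization in the spirit of Blutner's framework: a form-meaning pair wins only if no cheaper form exists for that meaning and no cheaper meaning exists for that form. The cost table below is invented purely for illustration and does not reproduce the paper's actual constraint rankings.

```python
# Toy strong bidirectional OT: a (form, meaning) pair is optimal iff it is
# the cheapest form for its meaning AND the cheapest meaning for its form.
def optimal_pairs(forms, meanings, cost):
    return {
        (f, m)
        for f in forms for m in meanings
        if all(cost[(f2, m)] >= cost[(f, m)] for f2 in forms)      # best form
        and all(cost[(f, m2)] >= cost[(f, m)] for m2 in meanings)  # best meaning
    }

forms = ["SLP+locative", "ILP+locative"]
meanings = ["temporary property", "permanent property"]
# lower cost = more harmonic; numbers invented so that the "synchronized"
# pairing of locative frame and temporary property is preferred
cost = {
    ("SLP+locative", "temporary property"): 0,
    ("SLP+locative", "permanent property"): 2,
    ("ILP+locative", "temporary property"): 2,
    ("ILP+locative", "permanent property"): 1,
}
print(optimal_pairs(forms, meanings, cost))
```

With these invented costs, bidirectional optimization pairs each form with exactly one meaning, illustrating how a pragmatic mechanism rather than a semantic ban can produce the contrast.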
Background and Aim: In Germany, the discharge medication is usually reported to the general practitioner (GP) first by an initial short report (SR)/notification (handed over to the patient) and later by a more detailed discharge letter (DL) from the hospital.
Material and Method: We asked N=536 GPs (from Frankfurt/Main and Luebeck) about the typical format in which their patients' discharge medication is reported by the local hospitals. The questionnaire asked for 26 items covering (1) the designation of the medication (brand name, generic name) in the SR and DL, (2) further specifications, e.g. possibilities of generic substitution or monitoring of sensitive medications, (3) reasons why GPs do not follow the hospitals' recommendations and (4) possibilities for improving the medication-related communication between GPs and hospitals.
Results: 39% of the GPs responded sufficiently to the questionnaire. The majority of the GPs (82%) stated that the SR gives only brand names (often or always) and that neither the generic name nor any further information on generic substitution is available (seldom or never). 65% of the responders stated that even the DL gives only brand names. Only 41% of the responders stated that further treatment-relevant specifications are given (often or always). 95% responded that new medications or changes to the customary medication are seldom or never explained in the DL and that GPs were not explicitly informed about relevant medication changes. 58% of the responders quoted economic reasons, e.g. generic substitution, for re-adjusting the discharge medication. The majority of responders (83%) favour (useful or very useful) pre-discharge information (e.g. via fax) about the medication, and 54% a hotline to a relevant person in the hospital when treatment problems emerge. 67% of the responders were in favour of regular meetings between GPs and hospital doctors regarding current pharmacotherapy.
Conclusion: In conclusion, our survey pointed to marked deficiencies in reporting the discharge medication to GPs.
Conflict of interest: None
Teaching information literacy: substance and process
This presentation explores the concept of information literacy within the broader context of higher education. It argues that, certain assertions in the library literature notwithstanding, the concepts associated with information literacy are not new, but rather very closely resemble the qualities traditionally considered to characterize a well-educated person. The presentation also considers the extent to which the higher education system does indeed foster the attributes commonly associated with information literacy. The term information literacy has achieved the immediacy it currently enjoys within the library community with the advent of the so-called "information age". The information age is commonly touted in the literature, both popular and professional, as constituting nothing short of a revolution. Academic librarians and other educators have of course felt called upon to make their teaching reflect both the growing proliferation of information formats and the major transformations affecting the process of information seeking. Faced with so much novelty and uncertainty, it is no surprise that many have felt that these changes call for a revolution in teaching. It is within this context that the concept of information literacy has flourished. It is argued in this presentation, however, that by treating information literacy as an essentially new specialty that owes much of its importance to the plethora of electronic information, we risk obscuring some of the most fundamental and enduring educational values we should be imparting to our students. Much of the literature on information literacy assumes - rather than argues - that recent changes in the way we approach education are indications of progress. Indeed, much of the self-narrative that institutions produce (in bulletins, mission statements, web sites, etc.) endorses an approach to education that will result in lifelong learners who are critical consumers of information. After critically examining the degree to which such statements of educational approach reflect reality, this presentation concludes by considering the effects of certain changes in the culture of higher education. It considers particularly the transformation - at least in North America - of the traditional model of higher education as a public good to a market-driven business model. It poses the question of whether a change of this significance might in fact detract from, rather than promote, the development of information-literate students.
The hadronic final state of central Pb+Pb collisions at 20, 30, 40, 80, and 158 AGeV has been measured by the CERN NA49 collaboration. The mean transverse mass of pions and kaons at midrapidity stays nearly constant in this energy range, whereas at lower energies, at the AGS, a steep increase with beam energy was measured. Compared to p+p collisions as well as to model calculations, anomalies in the energy dependence of pion and kaon production at lower SPS energies are observed. These findings can be explained, assuming that the energy density reached in central A+A collisions at lower SPS energies is sufficient to transform the hot and dense nuclear matter into a deconfined phase.
0. Introduction
1. Observations concerning the structure of morphosyntactically marked focus constructions
1.1 First observation: SF vs. NSF asymmetry
1.2 Second observation: NSF-NAR parallelism
1.3 Affirmative ex-situ focus constructions (SF, NSF), and narrative clauses (NAR)
2. Grammaticalization
2.1 Cleft hypothesis
2.2 Movement hypothesis
2.3 Narrative hypothesis
2.3.1 Back- or Foregrounding?
2.3.2 Converse directionality of FM and conjunction
3. Language specific analysis
4. Conclusionary remarks
References
The research performed in the DeepThought project aims at demonstrating the potential of deep linguistic processing if combined with shallow methods for robustness. Classical information retrieval is extended by high-precision concept indexing and relation detection. On the basis of this approach, the feasibility of three ambitious applications will be demonstrated, namely: precise information extraction for business intelligence; email response management for customer relationship management; and creativity support for document production and collective brainstorming. Common to these applications, and the basis for their development, is the XML-based, RMRS-enabled core architecture framework that will be described in detail in this paper. The framework is not limited to the applications envisaged in the DeepThought project, but can also be employed, e.g., to generate and make use of XML standoff annotation of documents and linguistic corpora, and in general for a wide range of NLP-based applications and research purposes.
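The XML standoff annotation mentioned above can be sketched as follows: the source text stays untouched and annotations refer into it by character offsets. Element and attribute names here are illustrative assumptions, not the framework's actual schema.

```python
# Sketch of XML standoff annotation: annotations live in a separate XML
# document and point into the unmodified source text by character offsets.
import xml.etree.ElementTree as ET

text = "DeepThought combines deep and shallow processing."

standoff = ET.Element("standoff", source="doc1")
for start, end, tag in [(0, 11, "NE"), (21, 48, "phrase")]:
    ET.SubElement(standoff, "annotation", {
        "type": tag, "start": str(start), "end": str(end),
    })

# an annotation is resolved against the source text only when needed
first = standoff[0]
span = text[int(first.get("start")):int(first.get("end"))]
print(span)  # -> DeepThought
```

Because the annotations are external, multiple components (shallow and deep) can layer independent analyses over the same immutable text, which is one reason standoff markup suits a multi-component architecture.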
Information literacy is a mosaic of attitudes, understandings, capabilities and knowledge, about which there are three myths. The first myth is that it is about the ability to use ICTs to access a wealth of information. The second is that students entering higher education are information literate because student-centred, resource-based, and ICT-focused learning are now pervasive in secondary education. The third myth is that information literacy development can be addressed by library-centric generic approaches. This paper addresses those myths and emphasises the need for information literacy to be recognised as a critical whole-of-education and societal issue, fundamental to an information-enabled and better world. In formal education, information literacy can only be developed by infusion into curriculum design, pedagogies, and assessment.
[Abstract] Occurrence of hepatitis B virus (HBV) reactivation following kidney transplantation
(2004)
While the sortal constraints associated with Japanese numeral classifiers are well-studied, less attention has been paid to the details of their syntax. We describe an analysis implemented within a broad-coverage HPSG that handles an intricate set of numeral classifier construction types and compositionally relates each to an appropriate semantic representation, using Minimal Recursion Semantics.
The volume is a collection of papers given at the conference "sub8 -- Sinn und Bedeutung", the eighth annual conference of the Gesellschaft für Semantik, held at the Johann-Wolfgang-Goethe-Universität, Frankfurt (Germany) in September 2003. During this conference, experts presented and discussed various aspects of semantics. The very different topics included in this book provide insight into the fields of ongoing semantics research.