Japanese is often taken to be strictly head-final in its syntax. In our work on a broad-coverage, precision-implemented HPSG for Japanese, we have found that while this is generally true, there are nonetheless a few minor exceptions to the broad trend. In this paper, we describe the grammar engineering project, present the exceptions we have found, and conclude that this kind of phenomenon motivates, on the one hand, the HPSG type-hierarchical approach, which allows both broad generalizations and exceptions to those generalizations to be stated, and, on the other hand, the usefulness of grammar engineering as a means of testing linguistic hypotheses.
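The interplay of broad generalizations and exceptions that the abstract describes can be illustrated with a minimal sketch. This is not the grammar's actual TDL encoding; it only uses Python class inheritance, with hypothetical type names, to show the pattern of stating head-finality once on a supertype and overriding it only on the rare exceptional construction types.

```python
# Hypothetical sketch of an HPSG-style type hierarchy: a broad
# generalization (head-finality) is stated once on a supertype and
# inherited by default; an exception overrides it locally.

class Phrase:
    """Supertype: Japanese phrases are head-final by default."""
    head_final = True

class HeadComplementPhrase(Phrase):
    """Ordinary phrase type: inherits the generalization unchanged."""
    pass

class ExceptionalPhrase(Phrase):
    """Rare head-initial construction: states only the exception."""
    head_final = False

print(HeadComplementPhrase.head_final)  # True: inherited generalization
print(ExceptionalPhrase.head_final)     # False: local override
```

The design point is that the exception is expressed as a single local statement, while the generalization remains intact for every other subtype.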
Hybrid robust deep and shallow semantic processing for creativity support in document production
(2004)
The research performed in the DeepThought project (http://www.project-deepthought.net) aims at demonstrating the potential of deep linguistic processing when combined with existing shallow methods that ensure robustness. Classical information retrieval is extended by high-precision concept indexing and relation detection. We use this approach to demonstrate the feasibility of three ambitious applications, one of which is a tool for creativity support in document production and collective brainstorming. This application is described in detail in this paper. Common to all three applications, and the basis for their development, is a platform for integrated linguistic processing. This platform is based on a generic software architecture that combines multiple NLP components and on robust minimal recursion semantics (RMRS) as a uniform representation language.
The research performed in the DeepThought project aims at demonstrating the potential of deep linguistic processing when combined with shallow methods for robustness. Classical information retrieval is extended by high-precision concept indexing and relation detection. On the basis of this approach, the feasibility of three ambitious applications will be demonstrated, namely: precise information extraction for business intelligence; email response management for customer relationship management; and creativity support for document production and collective brainstorming. Common to these applications, and the basis for their development, is the XML-based, RMRS-enabled core architecture framework that is described in detail in this paper. The framework is not limited to the applications envisaged in the DeepThought project, but can also be employed, for example, to generate and make use of XML standoff annotation of documents and linguistic corpora, and in general for a wide range of NLP-based applications and research purposes.
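The XML standoff annotation mentioned above can be sketched briefly. In standoff annotation, markup lives outside the source text and refers to it by character offsets rather than wrapping it inline. The element and attribute names below are illustrative assumptions, not the DeepThought schema:

```python
# Minimal sketch of XML standoff annotation (illustrative schema only):
# an annotation records character offsets into an untouched source text.
import xml.etree.ElementTree as ET

text = "DeepThought combines deep and shallow processing."

root = ET.Element("standoff")
ET.SubElement(root, "annotation",
              start="0", end="11", type="named-entity")

# Resolve the annotation against the source text via its offsets.
ann = root.find("annotation")
span = text[int(ann.get("start")):int(ann.get("end"))]
print(span)  # DeepThought
```

Because the text itself is never modified, multiple annotation layers (e.g. shallow and deep analyses of the same corpus) can coexist without conflicting markup.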
While the sortal constraints associated with Japanese numeral classifiers are well-studied, less attention has been paid to the details of their syntax. We describe an analysis implemented within a broad-coverage HPSG that handles an intricate set of numeral classifier construction types and compositionally relates each to an appropriate semantic representation, using Minimal Recursion Semantics.
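The kind of compositional relation the abstract mentions can be sketched with a toy flat-semantics representation. This is not the paper's MRS analysis; the relation names and the example phrase (roughly, "hon san-satsu", 'three books') are illustrative assumptions:

```python
# Toy flat-semantics sketch for a numeral classifier phrase:
# numeral + classifier jointly contribute a cardinality relation
# on the same variable as the noun. Predicate names are invented.

def noun(pred, var):
    return [{"pred": pred, "arg0": var}]

def num_cl(value, var):
    # the numeral-classifier combination yields one cardinality relation
    return [{"pred": "card", "arg0": var, "value": value}]

def combine(noun_rels, cl_rels):
    # composition is the union of relation lists sharing the variable
    return noun_rels + cl_rels

sem = combine(noun("_hon_n", "x1"), num_cl(3, "x1"))
print(sem)
```

The point of the flat representation is that different classifier construction types can all be related to the same target semantics simply by contributing relations on a shared variable.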
Using faculty-librarian partnerships to ensure that students become information fluent in the 21st century
In the 21st century, educators, in partnership with librarians, must prepare students effectively for the productive use of information, especially in higher education. Students will need to graduate from universities with appropriate information and technology skills to enable them to become productive citizens in the workplace and in society. Technology is having a major impact on society: in economics, e-business is moving to the forefront; in communication, e-mail, the Internet and cellular telephones have transformed how people communicate; in the work environment, computers and web applications are emphasized; and in education, virtual learning and teaching are becoming more important. These few examples indicate how the 21st-century information environment requires future members of the workforce to be information fluent, so that they have the ability to locate information efficiently, evaluate information for specific needs, organize information to address issues, apply information skillfully to solve problems, use information to communicate effectively, and use information responsibly to ensure a productive work environment. Individuals can achieve information fluency by acquiring cultural, visual, computer, technology, research and information management skills that enable them to think critically.
Teaching information literacy: substance and process
This presentation explores the concept of information literacy within the broader context of higher education. It argues that, certain assertions in the library literature notwithstanding, the concepts associated with information literacy are not new, but rather closely resemble the qualities traditionally considered to characterize a well-educated person. The presentation also considers the extent to which the higher education system does indeed foster the attributes commonly associated with information literacy. The term "information literacy" has achieved the immediacy it currently enjoys within the library community with the advent of the so-called "information age". The information age is commonly touted in the literature, both popular and professional, as constituting nothing short of a revolution. Academic librarians and other educators have, of course, felt called upon to make their teaching reflect both the growing proliferation of information formats and the major transformations affecting the process of information seeking. Faced with so much novelty and uncertainty, it is no surprise that many have felt these changes call for a revolution in teaching. It is within this context that the concept of information literacy has flourished. It is argued in this presentation, however, that by treating information literacy as an essentially new specialty that owes much of its importance to the plethora of electronic information, we risk obscuring some of the most fundamental and enduring educational values we should be imparting to our students. Much of the literature on information literacy assumes, rather than argues, that recent changes in the way we approach education are indications of progress. Indeed, much of the self-narrative that institutions produce (in bulletins, mission statements, web sites, etc.) endorses an approach to education that will result in lifelong learners who are critical consumers of information. After critically examining the degree to which such statements of educational approach reflect reality, this presentation concludes by considering the effects of certain changes in the culture of higher education. It considers particularly the transformation, at least in North America, of the traditional model of higher education as a public good into a market-driven business model. It poses the question of whether a change of this significance might in fact detract from, rather than promote, the development of information-literate students.
Research on dialectal varieties was for a long time concentrated on the phonetic aspects of language. While much work has been done on segmental aspects, suprasegmentals remained largely unexplored until the last few years, despite the fact that prosody has been noted as a salient feature of dialectal variants both by linguists and by naive speakers. Current research on dialectal prosody in the German-speaking area often employs discourse-analytic methods, correlating intonation curves with communicative functions (P. Auer et al. 2000, P. Gilles & R. Schrambke 2000, R. Kehrein & S. Rabanus 2001). The project I present here has a different focus. It looks at general prosodic aspects, abstracted from actual situations. These global structures are modelled and integrated into a speech synthesis system. Today it is mostly intonation that is investigated; rhythm, the temporal organisation of speech, is not at the core of current research on prosody. But there is evidence that temporal organisation is one of the main structuring elements of speech (B. Zellner 1998, B. Zellner Keller 2002). Following this approach developed for speech synthesis, I will present the modelling of the timing of two Swiss German dialects (the Bernese and Zurich dialects), which are considered quite different on the prosodic level. These models are part of the project on the "development of basic knowledge for research on Swiss German prosody by means of speech synthesis modelling", funded by the Swiss National Science Foundation.
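Timing models of the kind used in speech synthesis can be sketched in miniature. The following is an illustrative assumption, not the project's actual model: a segment's intrinsic duration is scaled by contextual factors, and dialect-specific factor values (all numbers here are invented) are what would distinguish, say, Bernese from Zurich timing.

```python
# Illustrative multiplicative duration rule for speech synthesis timing.
# Intrinsic durations and dialect factors below are invented placeholders.

INTRINSIC_MS = {"a": 90, "n": 60}  # hypothetical intrinsic durations (ms)

DIALECT_FACTORS = {
    "bernese": {"phrase_final": 1.5},   # hypothetical lengthening factors
    "zurich":  {"phrase_final": 1.25},
}

def duration(segment, dialect, phrase_final=False):
    """Scale a segment's intrinsic duration by contextual factors."""
    d = INTRINSIC_MS[segment]
    if phrase_final:
        d *= DIALECT_FACTORS[dialect]["phrase_final"]
    return d

print(duration("a", "bernese", phrase_final=True))
print(duration("a", "zurich", phrase_final=True))
```

The same rule structure serves both dialects; only the factor tables differ, which is one simple way a synthesis system can parameterize dialectal timing differences.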
0. Introduction
1. Observations concerning the structure of morphosyntactically marked focus constructions
1.1 First observation: SF vs. NSF asymmetry
1.2 Second observation: NSF-NAR parallelism
1.3 Affirmative ex-situ focus constructions (SF, NSF), and narrative clauses (NAR)
2. Grammaticalization
2.1 Cleft hypothesis
2.2 Movement hypothesis
2.3 Narrative hypothesis
2.3.1 Back- or Foregrounding?
2.3.2 Converse directionality of FM and conjunction
3. Language specific analysis
4. Conclusionary remarks
References