This paper is a follow-up to Müller (2006). It contains comments on suggestions about the interaction of phrasal constructions with constituent order that Adele Goldberg has made on various occasions. In addition, the paper discusses various HPSG analyses of particle verbs that assume lexical representations including phonologically specified parts of particle-verb lexical entries. A recent phrasal analysis of resultatives (Haugereid, 2007) is discussed as well, and it is pointed out that control constructions pose problems for phrasal analyses that do not assume empty elements but require that the subject be realized in a phrasal configuration.
Modern Hebrew is considered a 'partial pro-drop language'. Traditionally, the distinction between cases where pro-drop is licensed and those in which it is prohibited was based on the person and tense features of the verb: 1st and 2nd person pronominal subjects may be omitted in the past and future tenses. This generalization, however, has been shown to be false in a number of papers, each discussing a subset of the data. Thus, contrary to conventional wisdom, dropped 3rd person pronominal subjects do occur in the language in particular contexts.
Identifying these contexts by way of a corpus-based survey is the initial step taken in this study. Subsequently, a careful syntactic analysis of the data reveals broad generalizations which have not been made to date. Thus, what was initially assumed to be a uniform phenomenon of 3rd person pro-drop turns out to be manifested in three distinct types of constructions. Finally, the proposed HPSG-based analysis incorporates insights concerning locality, correlations between finite and non-finite control, non-canonical elements, and binding.
The so-called floating quantifier constructions in languages like Korean display intriguing properties whose successful processing can demonstrate the robustness of a parsing system. This paper shows that a constraint-based analysis, in particular one couched in the framework of HPSG, can offer an efficient way of analyzing these constructions together with proper semantic representations. It also shows how the analysis has been successfully implemented in the LKB (Linguistic Knowledge Building) system.
Multiple nominative constructions (MNCs) in Korean have two main subtypes: possessive and adjunct types. This paper shows that a grammar allowing the interaction of declarative constraints on types of signs - in particular, having constructions (phrases and clauses) - can provide a robust and efficient way of encoding generalizations for the two different MNCs. The feasibility of the grammar developed here has been checked with its implementation in the LKB (Linguistic Knowledge Building) system.
The paper examines two verb sequencing constructions in Ga: the Serial Verb Construction (SVC) and the Extended Verb Complex (EVC). The former is an instance of a commonly recognized construction; the latter is typically found in the Volta Basin area of West Africa. EVCs are sequences of verbs functioning as single verb units relative to the syntax, but with an internal structure much like syntactic complementation. Both constructions show agreement of aspect and mood marking throughout the sequence, but with differences in exponence: in an SVC all verbs carry such marking, in an EVC only a limited number of verbs (down to one), depending on the inflectional category. The paper presents the basic facts, based on work by Dakubu (2002, 2004, to appear), and gives an HPSG account of their morphology, syntax and semantics. The analysis is supported by a grammar of the phenomena implemented with the 'Linguistic Knowledge Builder' (LKB), an engineering platform for natural language processing.
Licenser rules were originally introduced in Müller (1999) as part of a grammar based on discontinuous constituents. We propose licenser rules as a means of avoiding underspecified empty elements in grammars with continuous constituents. We applied them to a verb-movement analysis of the German main clause with a right sentence bracket, and to complement extraposition. To reduce the number of unnecessary hypotheses, we extended the licenser-rule concept with a licenser-binding technique. We compared the licenser-rule approach to an approach based on underspecified traces with respect to processing performance. In our experiment, the use of licenser rules reduced parse time by a factor of 13.5.
This paper examines the syntactic behavior of the Mauritian copula in predicative and extracted sentences. As is the case in many languages, the Mauritian copula ete is absent in certain constructions: it appears only in extraction contexts. Our aim is to show that the postulation of a null copula, which has been proposed in various analyses, is inadequate for the Mauritian data. We argue that the phenomenon instead lends itself to a strictly construction-based analysis within the framework of HPSG, based on the distribution of weak pronouns and TAM markers.
In this paper I suggest an interface level of semantic representations that, on the one hand, corresponds to morpho-syntactic entities such as phrase structure rules, function words and inflections, and that, on the other hand, can be mapped to the lexical semantic representations one ultimately needs in order to make good predictions about the argument frames of lexical items. This interface level consists of basic constructions that can be decomposed into five sub-constructions (arg1-role, arg2-role ... arg5-role). I argue in favour of phrasal constructions in order to account for alternating argument frames, and perhaps also coercion, without having to use lexical rules or multiple lexical entries.
Townsend and Bever (2001) and Ferreira (2003) argue that simple templates representing the most commonly used orderings of arguments within a clause (e.g., NP-V-NP = Agent-Action-Patient) are used early in sentence comprehension to derive a preliminary interpretation before a full parse is completed. Sentences which match these templates (e.g., active sentences, subject clefts) are understood quickly and accurately, while sentences which deviate from the templates (e.g. passive sentences, object clefts) require additional processing to arrive at the correct interpretation. The present study extends the idea of canonical templates to the domain of noun phrases. I report on two experiments showing that possessive free relative clauses in English, which involve a non-canonical ordering of the head noun, are more difficult to understand than canonically headed noun phrases. I propose two reasons for this finding: (1) possessive free relatives deviate from the canonical template for interpreting noun phrases; and (2) the formal cues for interpreting possessive free relatives are relatively subtle. More generally I suggest that canonical templates help constrain mismatch in language by making certain kinds of mismatches costly for language users. Finally, I argue that evidence for canonical templates fits best within a parallel-architecture, constructionist theory of grammar.
In this paper, we report on an experiment showing how the introduction of prosodic information from detailed syntactic structures into synthetic speech leads to better disambiguation of structurally ambiguous sentences. Using modifier attachment (MA) ambiguities and subject/object fronting (OF) in German as test cases, we show that prosody which is automatically generated from deep syntactic information provided by an HPSG generator can lead to considerable disambiguation effects, and can even override a strong semantics-driven bias. The architecture used in the experiment, consisting of the LKB generator running a large-scale grammar for German, a syntax-prosody interface module, and the speech synthesis system MARY, is shown to be a valuable platform for testing hypotheses in intonation studies.
This paper provides a background on the role of world knowledge in disambiguating modals and proposes treating the disambiguation of counterfactuals as a slightly more tractable sub-case of the general problem. Using a model theoretic possible worlds approach, counterfactuals are disambiguated with respect to a world of evaluation resembling classic Formal Semantic treatments (e.g., Kratzer 1977, 1981, 1989; Lewis 1973; Veltman 2005). The world, which provides a context of evaluation, is located through the interaction of the antecedent and consequent propositions with world knowledge axioms. This approach to modal disambiguation provides a connection between a grammar and the type of inferences typically handled in Knowledge Representation Systems (e.g., Hobbs et al. 1990) in a limited domain. The model theoretic semantics are linked with typed feature structures in an HPSG syntax (Pollard and Sag 1994). This grammar is implemented in TRALE, Penn's (2004) Prolog-based framework for typed feature structure grammar development. The compositional semantics in TRALE is specified in Penn and Richters' (2004, 2005) Constraint Language for Lexical Resource Semantics (CLLRS). This semantic component provides a semantic parse in which heads and arguments are combined systematically and the scope of negation or quantification can be accurately reflected. In the case of counterfactuals, the CLLRS semantic parse is passed to a model-theoretic interpreter. The mapping between the CLLRS semantic parse and the well-formed formulas of the model is defined by checking the parseability of the formula in the compositional semantics. Sets of possible worlds interact with constraints on world knowledge and constraints defining counterfactual validity. The truth value for a counterfactual is returned to the grammar relative to a context of evaluation. The results of counterfactual evaluation are returned in a form consistent with the grammar's internal compositional semantics. 
By the method described above, the interpreter provides a grammar-external component in which inferences involving world knowledge have the potential to be more efficiently evaluated. Through the development of model-checking techniques, for instance, it could be shown whether or not well-formed formulas and constraints hold in larger models and move towards capturing more fine-grained modal inferences in a larger domain.
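The closest-world evaluation step can be illustrated with a small sketch of Lewis/Stalnaker-style similarity semantics. This is an invented toy, not the paper's TRALE/CLLRS implementation: the worlds, propositions, and the agreement-counting similarity measure are all assumptions made for illustration.

```python
def similarity(w1, w2):
    """Similarity as the number of propositions on which two worlds agree."""
    return sum(1 for p in w1 if w1[p] == w2[p])

def counterfactual(antecedent, consequent, actual, worlds):
    """'If antecedent were true, consequent would be true':
    true iff the consequent holds in every antecedent-world
    maximally similar to the actual world."""
    candidates = [w for w in worlds if w[antecedent]]
    if not candidates:
        return True  # vacuously true: no antecedent-world exists
    best = max(similarity(w, actual) for w in candidates)
    closest = [w for w in candidates if similarity(w, actual) == best]
    return all(w[consequent] for w in closest)

# Toy world-knowledge axiom "rain makes things wet", encoded by only
# admitting worlds that satisfy it.
worlds = [
    {"rain": True,  "wet": True,  "match_lit": False},
    {"rain": False, "wet": False, "match_lit": True},
    {"rain": False, "wet": False, "match_lit": False},
]
actual = worlds[2]  # no rain, not wet, match not lit

print(counterfactual("rain", "wet", actual, worlds))        # True
print(counterfactual("rain", "match_lit", actual, worlds))  # False
```

In a full system the set of admissible worlds would be constrained by world-knowledge axioms and the antecedent/consequent would come from the compositional semantic parse; here they are hand-coded to keep the evaluation step visible.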
This paper discusses a non-constituent coordination construction that occurs in Russian in which constituents with different syntactic functions and different thematic roles are conjoined. These conjuncts are co-arguments of the same head and are subject to a number of idiosyncrasies.
We consider several alternative analyses of the phenomena, and conclude that these are unable to account for the full range of the facts. Thus, even though these conjuncts do not form a semantic unit, there is evidence that they do form a kind of coordination structure. The phenomena are challenging for any theory of grammar, but the syntax-semantics account that we provide involves minimal changes to the standard HPSG architecture.
Three distinctions seem relevant for the scope properties of adverbs: their function (adjuncts or complements), their prosody (incidental or integrated) and their lexical semantics (parenthetical or non-parenthetical). We propose an analysis in which the scope of French adverbs is aligned with their syntactic properties, relying on a view of adjuncts as loci for quantification, a linearization approach to word order, and an explicit modelling of dialogue.
Pseudocoordination in Danish
(2007)
In this paper we propose an analysis of Danish pseudocoordination constructions. The analysis is based on a hybrid phrase hierarchy where phrase types are assumed to be subtypes of types that cut across the traditional division of phrasal types, allowing the phrase type of pseudocoordinations to be a subtype of both coordinate phrases and headed phrases, and consequently inherit properties from both types. The analysis is linearization-based. We further develop a set of constraints on the phrasal types in the hierarchy.
The hybrid phrase hierarchy and the set of constraints on the various types in the hierarchy explain why, on the one hand, pseudocoordinations contain conjunctions and the conjuncts must have the same form and tense, and on the other, have a fixed order, allow extraction out of the second conjunct, do not allow overt subjects in the second conjunct and allow transitive verbs to appear in there-constructions.
The scientific innovation process embraces the steps from problem definition through the development and evaluation of innovative solutions to their successful exploitation. The challenges imposed by this process can be answered by the creation of a powerful and flexible next-generation e-Science infrastructure, which exploits leading-edge information and knowledge technologies and enables a comprehensive and intelligent means of supporting this process. This paper describes our vision of a knowledge-based e-Science infrastructure, which is based on the results of an in-depth study of researchers' requirements. Furthermore, it introduces the Fraunhofer e-Science Cockpit as a first implementation of our vision.
The conference "Die Rückkehr der deutschen Geschichtswissenschaft in die 'Ökumene der Historiker' nach 1945" ("The Return of German Historiography to the 'Ecumene of Historians' after 1945 – A History-of-Scholarship Approach") took place on 5–6 July 2007 at the German Historical Institute Paris. In three sections the participants discussed questions of persistence and change in German historiography after 1945 (I), the re-institutionalization and reorientation of West German historiography (II), and the historian as a transnational actor (III). The aim was to analyze which substantive, personnel-related, methodological and epistemological ruptures and continuities can be identified in the return of German historiography to the ecumene of historians.
After a welcome by the director of the institute, Werner Paravicini, the conference opened with two introductory lectures. First, Christoph CORNELISSEN (Kiel) gave an overview of German historiography after 1945. ...
The Euclidean strong coupling expansion of the partition function is applied to lattice Yang-Mills theory at finite temperature, i.e. for lattices with a compactified temporal direction. The expansions have a finite radius of convergence and thus are valid only for β < β_c, where β_c denotes the nearest singularity of the free energy on the real axis. The accessible temperature range is thus the confined regime up to the deconfinement transition. We have calculated the first few orders of these expansions of the free energy density as well as the screening masses for the gauge groups SU(2) and SU(3). The resulting free energy series can be summed up and corresponds to a glueball gas of the lowest-mass glueballs up to the calculated order. Our result can be used to fix the lower integration constant for Monte Carlo calculations of the thermodynamic pressure via the integral method, and shows from first principles that in the confined phase this constant is indeed exponentially small. Similarly, our results also explain the weak temperature dependence of glueball screening masses below T_c, as observed in Monte Carlo simulations. Possibilities and difficulties in extracting β_c from the series are discussed.
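For orientation, the kind of expression such a summed series reproduces is the textbook pressure of a free Boltzmann gas of a single glueball species of mass $m$ and degeneracy $d$ (a standard relation, not taken from the paper itself):

```latex
p(T) = \frac{d}{2\pi^2}\, m^2 T^2\, K_2\!\left(\frac{m}{T}\right)
\;\longrightarrow\;
d \left(\frac{m T}{2\pi}\right)^{3/2} T\, e^{-m/T}
\quad (m \gg T),
```

where $K_2$ is a modified Bessel function of the second kind. Since glueball masses are well above the temperatures of the confined phase, the pressure (and hence the integration constant discussed above) is exponentially suppressed by the factor $e^{-m/T}$.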
Lattice simulations employing reweighting and Taylor expansion techniques have predicted a (μ,T) phase diagram in line with general expectations, with an analytic quark-hadron crossover at μ = 0 turning into a first-order transition at some critical chemical potential μ_E. By contrast, recent simulations using imaginary μ followed by analytic continuation obtained a critical structure in the {m_{u,d}, m_s, T, μ} parameter space favouring the absence of a critical point and first-order line. I review the evidence for the latter scenario, arguing that the various raw data are not inconsistent with each other. Rather, the discrepancy appears when attempting to extract continuum results from the coarse (N_t = 4) lattices simulated so far, and can be explained by cut-off effects. New (as yet unpublished) data are presented which, for N_f = 3 and on N_t = 4, confirm the scenario without a critical point. Moreover, simulations on finer N_t = 6 lattices show that even if there is a critical point, continuum extrapolation moves it to significantly larger values of μ_E than anticipated on coarse lattices.
We discuss the use of Wilson fermions with twisted mass for simulations of QCD thermodynamics. As a prerequisite for a future analysis of the finite-temperature transition making use of automatic O(a) improvement, we investigate the phase structure in the space spanned by the hopping parameter κ, the coupling β, and the twisted mass parameter μ. We present results for N_f = 2 degenerate quarks on a 16³×8 lattice, for which we investigate the possibility of an Aoki phase existing at strong coupling and vanishing μ, as well as of a thermal phase transition at moderate gauge couplings and non-vanishing μ.