We obtain the D-meson spectral density at finite temperature for the conditions of density and temperature expected at FAIR. We perform a self-consistent coupled-channel calculation taking, as a bare interaction, a separable potential model. The Lambda_c (2593) resonance is generated dynamically. We observe that the D-meson spectral density develops a sizeable width while the quasiparticle peak stays close to the free position. The consequences for the D-meson production at FAIR are discussed.
At present, there are no quantitative, objective methods for diagnosing Parkinson's disease. Existing methods of quantitative analysis based on myograms suffer from inaccuracy and strain on the patient; electronic-tablet analysis is limited to the visible drawing and does not capture writing forces or hand movements. In this paper we show how handwriting analysis can be performed with a new electronic pen and new features of the recorded signals, which yields good diagnostic results. Keywords: Parkinson diagnosis, electronic pen, automatic handwriting analysis
Gravitational radiation from ultra high energy cosmic rays in models with large extra dimensions
(2006)
The effects of classical gravitational radiation in models with large extra dimensions are investigated for ultra high energy cosmic rays (CRs). The cross sections are implemented into a simulation package (SENECA) for high energy hadron-induced CR air showers. We predict that gravitational radiation from quasi-elastic scattering could be observed at incident CR energies above 10^9 GeV for a setting with more than two extra dimensions. It is further shown that this gravitational energy loss can alter the energy reconstruction for CR energies E_CR > 5×10^9 GeV.
A flood of publications has addressed a range of linguistic patterns not included in codifications of the standard language, examining their careers in various spoken and written texts, mostly in the hope of discovering grammatical developments there and a basis for orienting grammar writing toward pragmatics. In what follows, language is not to be conceived as "conceptually written" and, so to speak, "idealized as literate". Instead, we argue for a unified explanation of non-standardized linguistic patterns within a theory of grammaticalization, one compatible with language history, ontogenetic language acquisition, and variant formation.
The following text examines the use of language varieties by Swiss chat users, with a focus on age-specific questions. In contrast to many attempts to approach the language of adolescents, a quantitative approach is applied here, comparing the language of adolescent chat users with that of chat users of other generations.
In the past, a divide could be seen between 'deep' parsers on the one hand, which construct a semantic representation out of their input but usually have significant coverage problems, and more robust parsers on the other hand, which are usually based on a (statistical) model derived from a treebank and have larger coverage, but leave the problem of semantic interpretation to the user. More recently, approaches have emerged that combine the robustness of data-driven (statistical) models with more detailed linguistic interpretation, such that the output can be used for deeper semantic analysis. Cahill et al. (2002) use a PCFG-based parsing model in combination with a set of principles and heuristics to derive functional (f-)structures of Lexical-Functional Grammar (LFG). They show that the derived functional structures have a better quality than those generated by a parser based on a state-of-the-art hand-crafted LFG grammar. Advocates of Dependency Grammar usually point out that dependencies already are a semantically meaningful representation (cf. Menzel, 2003). However, parsers based on dependency grammar normally create underspecified representations with respect to certain phenomena such as coordination, apposition and control structures. In these areas they are too "shallow" to be directly used for semantic interpretation. In this paper, we adopt an approach similar to that of Cahill et al. (2002), using a dependency-based analysis to derive functional structure, and demonstrate the feasibility of this approach using German data. A major focus of our discussion is on the treatment of coordination and other potentially underspecified structures of the dependency data input. F-structure is one of the two core levels of syntactic representation in LFG (Bresnan, 2001).
Independently of surface order, it encodes abstract syntactic functions that constitute predicate argument structure and other dependency relations such as subject, predicate, adjunct, but also further semantic information such as the semantic type of an adjunct (e.g. directional). Normally f-structure is captured as a recursive attribute value matrix, which is isomorphic to a directed graph representation. Figure 5 depicts an example target f-structure. As mentioned earlier, these deeper-level dependency relations can be used to construct logical forms as in the approaches of van Genabith and Crouch (1996), who construct underspecified discourse representations (UDRSs), and Spreyer and Frank (2005), who have robust minimal recursion semantics (RMRS) as their target representation. We therefore think that f-structures are a suitable target representation for automatic syntactic analysis in a larger pipeline of mapping text to interpretation. In this paper, we report on the conversion from dependency structures to f-structure. Firstly, we evaluate the f-structure conversion in isolation, starting from hand-corrected dependencies based on the TüBa-D/Z treebank and the conversion of Versley (2005). Secondly, we start from tokenized text to evaluate the combined process of automatic parsing (using the parser of Foth and Menzel (2006)) and f-structure conversion. As a test set, we randomly selected 100 sentences from TüBa-D/Z, which we annotated using a scheme very close to that of the TiGer Dependency Bank (Forst et al., 2004). In the next section, we sketch dependency analysis, the underlying theory of our input representations, and introduce four different representations of coordination. We also describe Weighted Constraint Dependency Grammar (WCDG), the dependency parsing formalism that we use in our experiments. Section 3 characterises the conversion of dependencies to f-structures.
Our evaluation is presented in section 4, and finally, section 5 summarises our results and gives an overview of problems remaining to be solved.
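The core of the conversion step can be illustrated schematically: dependency triples are walked from the root and mapped onto a recursive attribute-value matrix. The following sketch uses a toy German sentence, an invented label mapping, and nested dicts in place of a proper AVM implementation; it is not the actual WCDG label inventory or conversion procedure, only a minimal illustration of the idea.

```python
# toy dependency analysis of "Der Mann sieht den Hund"
# as (dependent, label, head) triples
deps = [("Mann", "SUBJ", "sieht"), ("Der", "SPEC", "Mann"),
        ("Hund", "OBJA", "sieht"), ("den", "SPEC", "Hund")]

# hypothetical mapping from dependency labels to f-structure attributes
LABEL_MAP = {"SUBJ": "subj", "OBJA": "obj", "SPEC": "spec"}

def to_fstructure(deps, root):
    """Build a recursive attribute-value matrix (nested dicts) from triples."""
    fs = {"pred": root}
    for dep, label, head in deps:
        if head == root:
            # each dependent becomes an embedded f-structure under its label
            fs[LABEL_MAP[label]] = to_fstructure(deps, dep)
    return fs

fs = to_fstructure(deps, "sieht")
```

Coordination is exactly the case where such a naive head-based walk breaks down, which is why the representations discussed in section 2 matter.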
In the recent literature there is growing interest in the morpho-syntactic encoding of hierarchical effects. The paper investigates one domain where such effects are attested: ergative splits conditioned by person. This type of split is then compared to hierarchical effects in direct-inverse alternations. On the basis of two case studies (Lummi instantiating a language with a person-based ergative split and Passamaquoddy an inverse language), we offer an account that makes no use of hierarchies as a primitive. We propose that the two language types differ as far as the location of person features is concerned. In inverse systems person features are located exclusively in T, while in ergative systems they are located in T and a particular type of v. A consequence of our analysis is that Case checking in split and inverse systems is guided by the presence/absence of specific phi-features. This in turn provides evidence for a close connection between Case and phi-features, reminiscent of Chomsky's (2000, 2001) Agree.
Elliptic flow analysis at RHIC with the Lee-Yang Zeroes method in a relativistic transport approach
(2006)
The Lee-Yang zeroes method is applied to study elliptic flow (v_2) in Au+Au collisions at sqrt s_NN = 200 GeV with the UrQMD model. In this transport approach, the true event plane is known, and both nonflow effects and event-by-event v_2 fluctuations are present. Although low resolution prevents applying the method to the most central and most peripheral collisions, the integral and differential elliptic flow from the Lee-Yang zeroes method agree with the exact v_2 values very well for semi-central collisions.
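The extraction behind the Lee-Yang zeroes method can be illustrated on toy data: the generating function G^theta(ir) = <exp(ir Q^theta)> is scanned in r, its first minimum r_0 approximates the first zero, and the integrated flow follows as V_2 = j_01/r_0, with j_01 ≈ 2.405 the first zero of the Bessel function J_0. The sketch below is not the UrQMD analysis itself; the event generation, multiplicity, grid, and projection angles are illustrative assumptions for a pure-flow toy model with a random reaction plane per event.

```python
import numpy as np

def lee_yang_v2(events, r_grid, thetas):
    """Estimate elliptic flow per particle via the Lee-Yang zeroes method."""
    m = events.shape[1]
    j01 = 2.40483                      # first zero of the Bessel function J0
    estimates = []
    for theta in thetas:
        # per-event flow-vector projection Q^theta = sum_j cos(2(phi_j - theta))
        q = np.cos(2.0 * (events - theta)).sum(axis=1)
        g = np.abs(np.exp(1j * np.outer(r_grid, q)).mean(axis=1))
        # first local minimum of |G(ir)| approximates the first zero r_0
        is_min = (g[1:-1] < g[:-2]) & (g[1:-1] < g[2:])
        i = int(np.argmax(is_min)) + 1
        estimates.append(j01 / (r_grid[i] * m))
    return float(np.mean(estimates))

# toy events: pure elliptic flow v2 = 0.05, random reaction plane per event,
# azimuthal angles drawn from dN/dphi ~ 1 + 2 v2 cos(2(phi - Psi))
rng = np.random.default_rng(1)
v2_true, mult, n_events = 0.05, 500, 3000
events = np.empty((n_events, mult))
for n in range(n_events):
    psi = rng.uniform(0.0, 2.0 * np.pi)          # reaction-plane angle
    cand = rng.uniform(0.0, 2.0 * np.pi, 4 * mult)
    accept = (rng.uniform(0.0, 1.0 + 2.0 * v2_true, 4 * mult)
              < 1.0 + 2.0 * v2_true * np.cos(2.0 * cand))
    events[n] = (cand[accept][:mult] + psi) % (2.0 * np.pi)

v2_est = lee_yang_v2(events, np.linspace(0.01, 0.2, 400),
                     [0.0, np.pi / 8, np.pi / 4, 3 * np.pi / 8])
```

With v_2 sqrt(M) well above 1, as here, the first minimum is well resolved and the estimate lands close to the input value; for low-resolution (very central or very peripheral) samples the minimum washes out, which is the limitation noted in the abstract.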
Using a qualitative analysis of disagreements from a referentially annotated newspaper corpus, we show that, in coreference annotation, vague referents are prone to greater disagreement. We show how potentially problematic cases can be dealt with in a way that is practical even for larger-scale annotation, considering a real-world example from newspaper text.
We propose to measure azimuthal correlations of heavy-flavor hadrons to address the status of thermalization at the partonic stage of light quarks and gluons in high-energy nuclear collisions. In particular, we show that hadronic interactions at the late stage cannot significantly disturb the initial back-to-back azimuthal correlations of DDbar pairs. Thus, a decrease or the complete absence of these initial correlations does indicate frequent interactions of heavy-flavor quarks and also light partons in the partonic stage, which are essential for the early thermalization of light partons.
We propose to measure correlations of heavy-flavor hadrons to address the status of thermalization at the partonic stage of light quarks and gluons in high-energy nuclear collisions, shown on the example of azimuthal correlations of D-Dbar pairs. We show that hadronic interactions at the late stage cannot disturb these correlations significantly. Thus, a decrease or the complete absence of these initial correlations indicates frequent interactions of heavy-flavor quarks in the partonic stage. Therefore, early thermalization of light quarks is likely to be reached. PACS numbers: 25.75.-q
The globalized Western culture of innovation, as propagated by major aid institutions, does not necessarily lead to empowerment or improvement of the well-being of the stakeholders. On the contrary, it often blocks viable indigenous innovation cultures. In African societies and African Diasporas in Latin America, cultures of innovation largely accrue from the informal, not the formal sector. Crucial for their proper understanding is a threefold structural differentiation: between the formal and informal sector; within the informal sector, according to class, gender or religion; and between different transnational social spaces. Different innovation cultures may be complementary, mutually reinforcing, or conflicting, leading in extreme cases even to a 'clash of cultures' at the local level. The repercussions of competing, even antagonistic agencies of innovative strategic groups are demonstrated by analyzing the case of the African poor in Benin and the African Diasporas of Brazil and Haiti.
This paper compares two approaches to computational semantics, namely semantic unification in Lexicalized Tree Adjoining Grammars (LTAG) and Lexical Resource Semantics (LRS) in HPSG. There are striking similarities between the frameworks that make them comparable in many respects. We will exemplify the differences and similarities by looking at several phenomena. We will show, first of all, that many intuitions about the mechanisms of semantic computations can be implemented in similar ways in both frameworks. Secondly, we will identify some aspects in which the frameworks intrinsically differ due to more general differences between the approaches to formal grammar adopted by LTAG and HPSG.
The work presented here addresses the question of how to determine whether a grammar formalism is powerful enough to describe natural languages. The expressive power of a formalism can be characterized in terms of i) the string languages it generates (weak generative capacity (WGC)) or ii) the tree languages it generates (strong generative capacity (SGC)). The notion of WGC is not enough to determine whether a formalism is adequate for natural languages. We argue that even SGC is problematic, since the set of trees a grammar formalism for natural languages should be able to generate is difficult to determine. The concrete syntactic structures assumed for natural languages depend very much on theoretical stipulations, and empirical evidence for syntactic structures is rather hard to obtain. Therefore, for lexicalized formalisms, we propose to consider the ability to generate certain strings together with specific predicate argument dependencies as a criterion for adequacy for natural languages.
We demonstrate the occurrence of canonical suppression associated with the conservation of a U(1) charge in current transport models. For this study a pion gas is simulated within two different transport approaches by incorporating inelastic and volume-limited collisions pi pi <-> K Kbar for the production of kaon pairs. Both descriptions can dynamically account for the suppression in the yields of rare strange particles in a limited box, in full accordance with a canonical statistical description.
Event-by-event fluctuations of the net baryon number and electric charge in nucleus-nucleus collisions are studied in Pb+Pb at SPS energies within the HSD transport model. We reveal an important role of the fluctuations in the number of target nucleon participants. They strongly influence all measured fluctuations, even in samples of events with a rather rigid centrality trigger. This fact can be used to check different scenarios of nucleus-nucleus collisions by measuring the multiplicity fluctuations as a function of collision centrality in fixed kinematical regions of the projectile and target hemispheres. The HSD results for the event-by-event fluctuations of electric charge in central Pb+Pb collisions at 20, 30, 40, 80 and 158 A GeV are in good agreement with the NA49 experimental data and considerably larger than expected in a quark-gluon plasma. This demonstrates that the distortions of the initial fluctuations by the hadronization phase and, in particular, by the final resonance decays dominate the observable fluctuations.
The transverse momentum dependence of the anisotropic flow v_2 for pi, K, nucleon, Lambda, Xi and Omega is studied for Au+Au collisions at sqrt s_NN = 200 GeV within two independent string-hadron transport approaches (RQMD and UrQMD). Although both models reach only 60% of the absolute magnitude of the measured v_2, they both predict the particle type dependence of v_2, as observed by the RHIC experiments: v_2 exhibits a hadron-mass hierarchy (HMH) in the low p_T region and a number-of-constituent-quark (NCQ) dependence in the intermediate p_T region. The failure of the hadronic models to reproduce the absolute magnitude of the observed v_2 indicates that transport calculations of heavy ion collisions at RHIC must incorporate interactions among quarks and gluons in the early, hot and dense phase. The presence of an NCQ scaling in the string-hadron model results suggests that the particle-type dependencies observed in heavy-ion collisions at intermediate p_T are related to the hadronic cross sections in vacuum rather than to the hadronization process itself, as suggested by quark recombination models.
In this paper, we investigate a wide range of features for their usefulness in the resolution of nominal coreference, both as hard constraints (i.e., completely removing elements from the list of possible candidates) and as soft constraints (where an accumulation of violations of soft constraints makes it less likely that a candidate is chosen as the antecedent). We present a state-of-the-art system based on such constraints, with weights estimated by a maximum entropy model, using lexical information to resolve cases of coreferent bridging.
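The interplay of hard and soft constraints can be sketched as a candidate-ranking step: hard constraints filter the candidate list outright, while soft constraints contribute weighted terms to a log-linear score that is normalized as in a maximum-entropy ranker. The mention representation, feature names, and weights below are invented for illustration; a real system estimates the weights from annotated data.

```python
import math

# hypothetical feature weights; in practice estimated by a maxent model
WEIGHTS = {"same_gender": 1.2, "head_match": 2.0, "recency": 0.8}

def violates_hard_constraints(mention, candidate):
    # hard constraint: number disagreement removes the candidate outright
    return mention["number"] != candidate["number"]

def score(mention, candidate):
    # soft constraints: each satisfied feature adds its weight to a
    # log-linear score; a violation simply contributes nothing
    s = 0.0
    if mention["gender"] == candidate["gender"]:
        s += WEIGHTS["same_gender"]
    if mention["head"] == candidate["head"]:
        s += WEIGHTS["head_match"]
    s += WEIGHTS["recency"] / (1 + mention["pos"] - candidate["pos"])
    return s

def resolve(mention, candidates):
    viable = [c for c in candidates if not violates_hard_constraints(mention, c)]
    if not viable:
        return None
    scores = [score(mention, c) for c in viable]
    z = sum(math.exp(s) for s in scores)          # maxent-style normalization
    best = max(range(len(viable)), key=lambda i: scores[i])
    return viable[best], math.exp(scores[best]) / z

mention = {"head": "company", "gender": "neut", "number": "sg", "pos": 5}
candidates = [
    {"head": "company", "gender": "neut", "number": "sg", "pos": 2},
    {"head": "firms", "gender": "neut", "number": "pl", "pos": 4},   # hard-filtered
    {"head": "product", "gender": "neut", "number": "sg", "pos": 4},
]
antecedent, prob = resolve(mention, candidates)
```

Lexical features for coreferent bridging would enter the same way, as additional weighted terms in the score.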