At present, there are no quantitative, objective methods for diagnosing Parkinson's disease. Existing methods of quantitative analysis based on myograms suffer from inaccuracy and strain on the patient; electronic tablet analysis is limited to the visible drawing and does not capture writing forces or hand movements. In this paper we show how handwriting analysis can be performed with a new electronic pen and new features extracted from the recorded signals. This yields good results for diagnostics. Keywords: Parkinson diagnosis, electronic pen, automatic handwriting analysis
Elliptic flow analysis at RHIC with the Lee-Yang Zeroes method in a relativistic transport approach
(2006)
The Lee-Yang zeroes method is applied to study elliptic flow (v_2) in Au+Au collisions at sqrt(s) = 200 A GeV with the UrQMD model. In this transport approach, the true event plane is known, and both nonflow effects and event-by-event v_2 fluctuations are present. Although the low resolution prohibits applying the method to the most central and the peripheral collisions, the integral and differential elliptic flow from the Lee-Yang zeroes method agree very well with the exact v_2 values for semi-central collisions.
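The idea behind the Lee-Yang zeroes method can be sketched on toy data: the integrated flow V_2 = M*v_2 is read off from the first zero r_0 of the modulus of the generating function G(ir) = <exp(i r Q)>, via V_2 = j_01 / r_0 with j_01 the first zero of the Bessel function J_0. The multiplicity, v_2 value and event count below are invented for the illustration; this is not the UrQMD analysis itself, and a real analysis averages over several projection angles theta rather than using only theta = 0.

```python
import numpy as np

J01 = 2.40483  # first zero of the Bessel function J0

def sample_event(rng, mult, v2, psi):
    """Draw azimuthal angles from dN/dphi ~ 1 + 2*v2*cos(2*(phi - psi))
    by rejection sampling (psi is the event plane angle)."""
    phis = np.empty(0)
    while phis.size < mult:
        cand = rng.uniform(0.0, 2.0 * np.pi, 4 * mult)
        keep = rng.uniform(0.0, 1.0 + 2.0 * v2, cand.size) \
            < 1.0 + 2.0 * v2 * np.cos(2.0 * (cand - psi))
        phis = np.concatenate([phis, cand[keep]])
    return phis[:mult]

def lee_yang_v2(events, r_grid):
    """Integrated v2 per particle from the first zero of |G(ir)|,
    with flow vector Q = sum_j cos(2*phi_j) (projection angle theta = 0)."""
    q = np.array([np.cos(2.0 * phis).sum() for phis in events])
    mod_g = np.abs(np.exp(1j * np.outer(q, r_grid)).mean(axis=0))
    # locate the first local minimum of |G(ir)| -> estimate of r0 = j01 / V2
    i = 1
    while i < mod_g.size - 1 and not (mod_g[i] < mod_g[i - 1]
                                      and mod_g[i] <= mod_g[i + 1]):
        i += 1
    mean_mult = np.mean([p.size for p in events])
    return J01 / (r_grid[i] * mean_mult)

rng = np.random.default_rng(7)
n_ev, mult, v2_true = 4000, 100, 0.10
events = [sample_event(rng, mult, v2_true, rng.uniform(0, 2 * np.pi))
          for _ in range(n_ev)]
v2_est = lee_yang_v2(events, np.arange(0.05, 0.45, 0.01))
```

Because the zero is located in the modulus of a genuine all-particle correlator, the estimate is insensitive to few-particle nonflow correlations, which is the method's main appeal.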
We propose to measure correlations of heavy-flavor hadrons to address the degree of thermalization of light quarks and gluons at the partonic stage of high-energy nuclear collisions, illustrated with azimuthal correlations of D-Dbar pairs. We show that hadronic interactions at the late stage cannot disturb these correlations significantly. Thus, a decrease or the complete absence of these initial correlations indicates frequent interactions of heavy-flavor quarks in the partonic stage, and early thermalization of light quarks is therefore likely to be reached. PACS numbers: 25.75.-q
In the past, a divide could be seen between 'deep' parsers on the one hand, which construct a semantic representation from their input but usually have significant coverage problems, and more robust parsers on the other, which are usually based on a (statistical) model derived from a treebank and have larger coverage but leave the problem of semantic interpretation to the user. More recently, approaches have emerged that combine the robustness of data-driven (statistical) models with more detailed linguistic interpretation, such that the output can be used for deeper semantic analysis. Cahill et al. (2002) use a PCFG-based parsing model in combination with a set of principles and heuristics to derive functional (f-)structures of Lexical-Functional Grammar (LFG). They show that the derived functional structures are of better quality than those generated by a parser based on a state-of-the-art hand-crafted LFG grammar. Advocates of Dependency Grammar usually point out that dependencies already are a semantically meaningful representation (cf. Menzel, 2003). However, parsers based on dependency grammar normally create representations that are underspecified with respect to certain phenomena such as coordination, apposition and control structures. In these areas they are too "shallow" to be used directly for semantic interpretation. In this paper, we adopt an approach similar to that of Cahill et al. (2002), using a dependency-based analysis to derive functional structure, and demonstrate the feasibility of this approach on German data. A major focus of our discussion is the treatment of coordination and other potentially underspecified structures in the dependency input. F-structure is one of the two core levels of syntactic representation in LFG (Bresnan, 2001).
Independently of surface order, it encodes abstract syntactic functions that constitute predicate-argument structure and other dependency relations such as subject, predicate and adjunct, but also further semantic information such as the semantic type of an adjunct (e.g. directional). Normally, f-structure is represented as a recursive attribute-value matrix, which is isomorphic to a directed graph representation. Figure 5 depicts an example target f-structure. As mentioned earlier, these deeper-level dependency relations can be used to construct logical forms, as in the approaches of van Genabith and Crouch (1996), who construct underspecified discourse representations (UDRSs), and Spreyer and Frank (2005), who have robust minimal recursion semantics (RMRS) as their target representation. We therefore think that f-structures are a suitable target representation for automatic syntactic analysis in a larger pipeline mapping text to interpretation. In this paper, we report on the conversion from dependency structures to f-structure. Firstly, we evaluate the f-structure conversion in isolation, starting from hand-corrected dependencies based on the TüBa-D/Z treebank and Versley's (2005) conversion. Secondly, we start from tokenized text to evaluate the combined process of automatic parsing (using Foth and Menzel's (2006) parser) and f-structure conversion. As a test set, we randomly selected 100 sentences from TüBa-D/Z, which we annotated using a scheme very close to that of the TiGer Dependency Bank (Forst et al., 2004). In the next section, we sketch dependency analysis, the underlying theory of our input representations, and introduce four different representations of coordination. We also describe Weighted Constraint Dependency Grammar (WCDG), the dependency parsing formalism that we use in our experiments. Section 3 characterises the conversion of dependencies to f-structures.
Our evaluation is presented in section 4, and finally, section 5 summarises our results and gives an overview of problems remaining to be solved.
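The recursive attribute-value matrix described above is conveniently modelled as nested dictionaries, and evaluation against a gold standard typically flattens such structures into dependency triples. A minimal sketch (not the paper's actual conversion code; the example f-structure and attribute names follow common LFG practice, not necessarily the paper's exact annotation scheme):

```python
def avm_to_triples(avm, head=None):
    """Flatten a recursive attribute-value matrix (nested dicts) into
    (head, attribute, value) dependency triples for evaluation."""
    pred = avm.get("PRED", head)
    triples = []
    for attr, val in avm.items():
        if isinstance(val, dict):
            # an embedded f-structure: record the link, then recurse
            triples.append((pred, attr, val.get("PRED")))
            triples.extend(avm_to_triples(val, pred))
        elif attr != "PRED":
            triples.append((pred, attr, val))
    return triples

# Hypothetical f-structure for "Der Mann schlaeft" ("The man sleeps").
fs = {
    "PRED": "schlafen",
    "TENSE": "pres",
    "SUBJ": {"PRED": "Mann", "CASE": "nom", "NUM": "sg",
             "SPEC": {"PRED": "der"}},
}
triples = avm_to_triples(fs)
```

Since the AVM is isomorphic to a directed graph, the triples are exactly that graph's labelled edges, which is what makes precision/recall evaluation on f-structures straightforward.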
In this paper, we investigate a wide range of features for their usefulness in the resolution of nominal coreference, both as hard constraints (i.e. completely removing elements from the list of possible candidates) and as soft constraints (where an accumulation of soft-constraint violations makes it less likely that a candidate is chosen as the antecedent). We present a state-of-the-art system based on such constraints, with weights estimated by a maximum entropy model, that uses lexical information to resolve cases of coreferent bridging.
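The hard/soft constraint architecture can be sketched as a two-stage ranker: hard constraints filter the candidate list outright, soft constraints contribute learned weights to a log-linear score. The features, weight values and attribute names below are invented for illustration; in the paper the weights come from a trained maximum entropy model.

```python
import math

# Hypothetical soft-constraint weights (stand-ins for maxent-estimated ones).
WEIGHTS = {"same_head": 1.8, "distance_violation": -0.9, "number_mismatch": -1.4}

def hard_filter(mention, candidates):
    """Hard constraints remove candidates outright, e.g. gender clashes."""
    return [c for c in candidates if c["gender"] == mention["gender"]]

def score(mention, cand):
    """Soft constraints accumulate as weighted violations; the logistic
    transform turns the summed weights into a probability-like score."""
    s = 0.0
    s += WEIGHTS["same_head"] * (cand["head"] == mention["head"])
    s += WEIGHTS["distance_violation"] * (mention["pos"] - cand["pos"] > 2)
    s += WEIGHTS["number_mismatch"] * (cand["number"] != mention["number"])
    return 1.0 / (1.0 + math.exp(-s))

def resolve(mention, candidates):
    """Pick the highest-scoring surviving candidate as antecedent."""
    survivors = hard_filter(mention, candidates)
    return max(survivors, key=lambda c: score(mention, c)) if survivors else None

mention = {"head": "Regierung", "gender": "fem", "number": "sg", "pos": 10}
candidates = [
    {"head": "Regierung", "gender": "fem", "number": "sg", "pos": 4},
    {"head": "Partei", "gender": "fem", "number": "sg", "pos": 9},
    {"head": "Minister", "gender": "masc", "number": "sg", "pos": 8},
]
antecedent = resolve(mention, candidates)
```

The division of labour mirrors the paper's distinction: a hard constraint can never be outvoted, whereas several soft violations merely lower a candidate's rank.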
Using a qualitative analysis of disagreements from a referentially annotated newspaper corpus, we show that, in coreference annotation, vague referents are prone to greater disagreement. We show how potentially problematic cases can be dealt with in a way that is practical even for larger-scale annotation, considering a real-world example from newspaper text.
We have calculated the D-meson spectral density at finite temperature within a self-consistent coupled-channel approach that dynamically generates the Lambda_c (2593) resonance. We find a small mass shift for the D-meson in this hot and dense medium, while the spectral density develops a sizeable width. The reduced attraction felt by the D-meson in hot and dense matter, together with the large width observed, has important consequences for D-meson production in the future CBM experiment at FAIR.
We obtain the D-meson spectral density at finite temperature for the conditions of density and temperature expected at FAIR. We perform a self-consistent coupled-channel calculation taking, as a bare interaction, a separable potential model. The Lambda_c (2593) resonance is generated dynamically. We observe that the D-meson spectral density develops a sizeable width while the quasiparticle peak stays close to the free position. The consequences for the D-meson production at FAIR are discussed.
The experimental signatures of TeV-mass black hole (BH) formation in heavy ion collisions at the LHC are examined. We find that black hole production results in a complete disappearance of all very high p_T (> 500 GeV) back-to-back correlated di-jets of total mass M > M_f ~ 1 TeV. We show that the subsequent Hawking decay produces multiple hard mono-jets, and we discuss their detection. We study the possibility of cold black hole remnant (BHR) formation of mass ~ M_f and the experimental distinguishability of scenarios with BHRs from those with complete black hole decay. Due to the rather moderate luminosity in the first year of LHC running, the best chance for the observation of BHs or BHRs at this early stage will be via ionizing tracks in the ALICE TPC. Finally, we point out that stable BHRs would be interesting candidates for energy production by conversion of mass to Hawking radiation.
The production of Large Extra Dimension (LXD) black holes (BHs), with a new fundamental mass scale of M_f = 1 TeV, has been predicted to occur at the Large Hadron Collider (LHC) with the formidable rate of 10^8 per year in p-p collisions at full energy, 14 TeV, and at full luminosity. We show that such LXD-BH formation will be experimentally observable at the LHC through the complete disappearance of all very high p_t (> 500 GeV) back-to-back correlated di-jets of total mass M > M_f = 1 TeV. We suggest complementing this clear cut-off signal at M > 2*500 GeV in the di-jet correlation function by detecting the subsequent Hawking-decay products of the LXD-BHs: either multiple high-energy (> 100 GeV) SM mono-jets (i.e. with the away-side jet missing), sprayed off the evaporating BHs isentropically in all directions, or the thermalization of the multiple overlapping Hawking radiation in a Heckler-Kapusta plasma. Microcanonical quantum statistical calculations of the Hawking evaporation process for these LXD-BHs show that cold black hole remnants (BHRs) of mass ~ M_f remain as the ashes of these spectacular di-jet-suppressed events. Strong di-jet suppression is also expected with heavy ion beams at the LHC, due to quark-gluon-plasma-induced jet attenuation at medium to low jet energies, p_t < 200 GeV. The (mono-)jets in these events can be used to trigger on tsunami-like emission of secondary compressed QCD matter at well-defined Mach angles, both on the trigger side and on the away-side (missing) jet. The Mach shock angles allow for a direct measurement of both the equation of state (EoS) and the speed of sound c_s via the supersonic bang in the "big bang" matter. We discuss the importance of the underlying strong collective flow - the gluon storm - of the QCD matter for the formation and evolution of these Mach shock cones.
We predict a significant deformation of Mach shocks by the gluon storm in central Au+Au collisions at RHIC and LHC energies, as compared to the case of weakly coupled jets propagating through a static medium. A possible complete stopping of p_t > 50 GeV jets at the LHC within 2-3 fm yields nonlinear high-density Mach shocks in the quark-gluon plasma, which can be studied in the complex emission and disintegration pattern of the possibly supercooled matter. We report on first fully 3-dimensional fluid dynamical studies of the strong effects of a first-order phase transition on the evolution and the tsunami-like Mach shock emission of the QCD matter.
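The textbook relation underlying the Mach-angle measurement of the speed of sound can be sketched directly. For a weakly coupled jet in a static medium (the baseline case the abstracts above contrast with), the Mach-cone half-angle obeys sin(alpha) = c_s / v_jet, so a measured emission angle gives c_s, and hence EoS information; the ideal ultrarelativistic-gas value c_s = 1/sqrt(3) is used purely as a numerical illustration.

```python
import math

def mach_angle(c_s, v_jet=1.0):
    """Mach-cone half-angle alpha (radians) from sin(alpha) = c_s / v_jet,
    valid for a weakly coupled jet in a static medium (units of c)."""
    return math.asin(c_s / v_jet)

def speed_of_sound(alpha, v_jet=1.0):
    """Invert the relation: a measured Mach angle gives c_s directly."""
    return v_jet * math.sin(alpha)

# Ideal ultrarelativistic gas: c_s = 1/sqrt(3)  ->  alpha ~ 35 degrees.
alpha_qgp = mach_angle(1.0 / math.sqrt(3))
```

The deformation predicted above means the measured cone deviates from this simple relation in a flowing medium, which is precisely what makes the shock pattern sensitive to the collective flow and the EoS.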