Refine
Year of publication
- 2004 (477)
Document Type
- Article (162)
- Working Paper (71)
- Part of a Book (67)
- Preprint (48)
- Doctoral Thesis (43)
- Part of Periodical (31)
- Conference Proceeding (28)
- Report (13)
- Book (10)
- Diploma Thesis (2)
Language
- English (477)
Has Fulltext
- yes (477)
Is part of the Bibliography
- no (477)
Keywords
- Syntax (26)
- Generative Transformational Grammar (23)
- Word Order (19)
- German (16)
- Optimality Theory (12)
- Phonology (11)
- Germany (9)
- English (8)
- Formal Semantics (8)
- Information Structure (8)
Institute
- Physics (75)
- Economics (38)
- Center for Financial Studies (CFS) (28)
- Medicine (27)
- External (24)
- Biochemistry and Chemistry (23)
- Frankfurt Institute for Advanced Studies (FIAS) (20)
- Life Sciences (12)
- Computer Science (12)
- Mathematics (9)
In the last decade, much effort has gone into the design of robust third-person pronominal anaphor resolution algorithms. Typical approaches are reported to achieve an accuracy of 60-85%. Recent research addresses the question of how to deal with the remaining difficult-to-resolve anaphors. Lappin (2004) proposes a sequenced model of anaphor resolution according to which a cascade of processing modules employing knowledge and inferencing techniques of increasing complexity should be applied. The individual modules should only deal with, and hence recognize, the subset of anaphors for which they are competent. It will be shown that the problem of focusing on the competence cases is equivalent to the problem of giving precision precedence over recall. Three systems for high-precision robust knowledge-poor anaphor resolution will be designed and compared: a ruleset-based approach, a salience-threshold approach, and a machine-learning-based approach. According to corpus-based evaluation, there is no unique best approach. Which approach scores highest depends on the type of pronominal anaphor as well as on the text genre.
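The trade-off this abstract describes — trading recall for precision by letting a resolver abstain on anaphors outside its competence — can be made concrete with a toy sketch. The scores and the threshold value below are hypothetical, purely illustrative, and not taken from the paper's systems:

```python
# Toy illustration of a salience-threshold resolver: abstaining on
# low-confidence anaphors raises precision at the cost of recall.

def precision_recall(decisions, total_anaphors):
    """decisions: list of (attempted, correct) flags, one per anaphor."""
    attempted = sum(1 for a, _ in decisions if a)
    correct = sum(1 for a, c in decisions if a and c)
    precision = correct / attempted if attempted else 1.0
    recall = correct / total_anaphors
    return precision, recall

# Hypothetical anaphors: (salience score, would be resolved correctly)
candidates = [(0.9, True), (0.8, True), (0.7, False), (0.4, True), (0.2, False)]

def resolve_with_threshold(candidates, threshold):
    """Attempt only anaphors whose salience score reaches the threshold."""
    return [(score >= threshold, correct) for score, correct in candidates]

low = precision_recall(resolve_with_threshold(candidates, 0.0), len(candidates))
high = precision_recall(resolve_with_threshold(candidates, 0.75), len(candidates))
print(low)   # attempts all five anaphors: precision 0.6, recall 0.6
print(high)  # abstains below the threshold: precision 1.0, recall 0.4
```

Raising the threshold is exactly "giving precision precedence over recall": the resolver answers less often, but is right more often when it does.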
Assessing enhanced knowledge discovery systems (eKDSs) is an intricate issue that is as yet only partially understood. Based upon an analysis of why it is difficult to formally evaluate eKDSs, a change of perspective is argued for: eKDSs should be understood as intelligent tools for qualitative analysis that support, rather than substitute for, the user in the exploration of the data. A qualitative gap is identified as the main reason why the evaluation of enhanced knowledge discovery systems is difficult. In order to deal with this problem, the construction of a best-practice model for eKDSs is advocated. Based on a brief recapitulation of similar work on spoken-language dialogue systems, first steps towards achieving this goal are taken, and directions for future research are outlined.
This study analyses the labour market effects of fixed-term contracts (FTCs) in West Germany by microeconometric methods using individual and establishment level data. In the first part of the study, the role of FTCs in firms’ labour demand is analysed. An econometric investigation of the firms’ reasons for using FTCs is presented, focussing on the identification of the link between dismissal protection for permanent contract workers and the firms’ use of FTCs. Furthermore, a descriptive analysis of the role of FTCs in worker and job flows at the firm level is provided. The second part of the study evaluates the short-run effects of being employed on an FTC on working conditions and wages using a large cross-sectional dataset of employees. The final part of the study analyses whether taking up an FTC increases (permanent contract) employment opportunities in the long run (stepping-stone effect) and whether FTCs affect the job finding behaviour of unemployed job searchers. First, an econometric unemployment duration analysis distinguishing between both types of contracts as destination states is performed. Second, the effects of entering into FTCs from unemployment on future (permanent contract) employment opportunities are evaluated, attempting to account for the sequential decision problem of job searchers.
We modify the concept of LLL-reduction of lattice bases in the sense of Lenstra, Lenstra, Lovász [LLL82] towards a faster reduction algorithm. We organize LLL-reduction in segments of the basis. Our SLLL-bases approximate the successive minima of the lattice in nearly the same way as LLL-bases. For integer lattices of dimension n given by a basis of length 2^O(n), SLLL-reduction runs in O(n^(5+ε)) bit operations for every ε > 0, compared to O(n^(7+ε)) for the original LLL algorithm and O(n^(6+ε)) for the LLL algorithms of Schnorr (1988) and Storjohann (1996). We present an even faster algorithm for SLLL-reduction via iterated subsegments, running in O(n^3 log n) arithmetic steps.
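As background for the segment-based variant above, the classical LLL procedure it modifies can be sketched as follows. This is a minimal textbook implementation (size reduction plus the Lovász swap condition with δ = 3/4), not the paper's SLLL algorithm; it recomputes the Gram-Schmidt orthogonalization at each step for clarity rather than updating it incrementally:

```python
from fractions import Fraction

def dot(u, v):
    return sum(Fraction(a) * b for a, b in zip(u, v))

def gram_schmidt(basis):
    """Exact Gram-Schmidt: returns orthogonal vectors b* and coefficients mu."""
    n = len(basis)
    bstar, mu = [], [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = [Fraction(x) for x in basis[i]]
        for j in range(i):
            mu[i][j] = dot(basis[i], bstar[j]) / dot(bstar[j], bstar[j])
            v = [vi - mu[i][j] * bj for vi, bj in zip(v, bstar[j])]
        bstar.append(v)
    return bstar, mu

def lll(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction of an integer basis (rows are lattice vectors)."""
    basis = [list(b) for b in basis]
    n, k = len(basis), 1
    while k < n:
        for j in range(k - 1, -1, -1):       # size-reduce b_k against b_j
            _, mu = gram_schmidt(basis)
            q = round(mu[k][j])
            if q:
                basis[k] = [x - q * y for x, y in zip(basis[k], basis[j])]
        bstar, mu = gram_schmidt(basis)
        lovasz = dot(bstar[k], bstar[k]) >= \
            (delta - mu[k][k - 1] ** 2) * dot(bstar[k - 1], bstar[k - 1])
        if lovasz:
            k += 1                            # condition holds: advance
        else:
            basis[k - 1], basis[k] = basis[k], basis[k - 1]
            k = max(k - 1, 1)                 # swap and step back

    return basis

print(lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]]))  # a short, nearly orthogonal basis
```

The SLLL idea described in the abstract amortizes this per-step work over segments of the basis, which is where the improved bit-complexity bounds come from.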
Let G be a Fuchsian group containing two torsion-free subgroups defining isomorphic Riemann surfaces. These surface subgroups K and αKα^(-1) are then conjugate in PSL(2,R), but in general the conjugating element α cannot be taken in G or in a finite-index Fuchsian extension of G. We will show that in the case of a normal inclusion in a triangle group G, these α can be chosen in some triangle group extending G. It turns out that the method leading to this result also allows us to answer the question of how many different regular dessins of the same type can exist on a given quasiplatonic Riemann surface.
The large conductance voltage- and Ca2+-activated potassium (BK) channel has been suggested to play an important role in the signal transduction process of cochlear inner hair cells. BK channels have been shown to be composed of the pore-forming alpha-subunit coexpressed with the auxiliary beta-1-subunit. Analyzing the hearing function and cochlear phenotype of BK channel alpha- (BKalpha–/–) and beta-1-subunit (BKbeta-1–/–) knockout mice, we demonstrate normal hearing function and cochlear structure of BKbeta-1–/– mice. Most surprisingly, BKalpha–/– mice also did not show any obvious hearing deficits during the first 4 postnatal weeks. High-frequency hearing loss developed in BKalpha–/– mice only from ca. 8 weeks postnatally onward and was accompanied by a lack of distortion product otoacoustic emissions, suggesting outer hair cell (OHC) dysfunction. Hearing loss was linked to a loss of the KCNQ4 potassium channel in membranes of OHCs in the basal and midbasal cochlear turn, preceding hair cell degeneration and leading to a phenotype similar to that elicited by pharmacologic blockade of KCNQ4 channels. Although the actual link between BK gene deletion, loss of KCNQ4 in OHCs, and OHC degeneration requires further investigation, the data already suggest mutation of the human BK-coding slo1 gene as a susceptibility factor for progressive deafness, similar to KCNQ4 potassium channel mutations. © 2004, The National Academy of Sciences. Freely available online through the PNAS open access option.
Dendritic cells (DC) are known to present exogenous protein Ag effectively to T cells. In this study we sought to identify the proteases that DC employ during antigen processing. The murine epidermal-derived DC line XS52, when pulsed with PPD, optimally activated the PPD-reactive Th1 clone LNC.2F1 as well as the Th2 clone LNC.4k1, and this activation was completely blocked by chloroquine pretreatment. These results validate the capacity of XS52 DC to digest PPD into immunogenic peptides that induce antigen-specific T cell immune responses. XS52 DC, as well as splenic DC and bone marrow-derived DC, degraded standard substrates for cathepsins B, C, D/E, H, J, and L, tryptase, and chymases, indicating that DC express a variety of protease activities. Treatment of XS52 DC with pepstatin A, an inhibitor of aspartic acid proteases, completely abrogated their capacity to present native PPD, but not trypsin-digested PPD fragments, to Th1 and Th2 cell clones. Pepstatin A also selectively inhibited cathepsin D/E activity among the XS52 DC-associated protease activities. On the other hand, inhibitors of serine proteases (dichloroisocoumarin, DCI) or of cysteine proteases (E-64) did not impair XS52 DC presentation of PPD, nor did they inhibit cathepsin D/E activity. Finally, all tested DC populations (XS52 DC, splenic DC, and bone marrow-derived DC) constitutively expressed cathepsin D mRNA. These results suggest that DC primarily employ cathepsin D (and perhaps E) to digest PPD into antigenic peptides.
Background: The neurophysiological and neuroanatomical foundations of persistent developmental stuttering (PDS) are still a matter of dispute. A main argument is that stutterers show atypical anatomical asymmetries of speech-relevant brain areas, which possibly affect speech fluency. The major aim of this study was to determine whether adults with PDS have anomalous anatomy in cortical speech-language areas. Methods: Adults with PDS (n = 10) and controls (n = 10) matched for age, sex, hand preference, and education were studied using high-resolution MRI scans. Using a new variant of the voxel-based morphometry technique (augmented VBM) the brains of stutterers and non-stutterers were compared with respect to white matter (WM) and grey matter (GM) differences. Results: We found increased WM volumes in a right-hemispheric network comprising the superior temporal gyrus (including the planum temporale), the inferior frontal gyrus (including the pars triangularis), the precentral gyrus in the vicinity of the face and mouth representation, and the anterior middle frontal gyrus. In addition, we detected a leftward WM asymmetry in the auditory cortex in non-stutterers, while stutterers showed symmetric WM volumes. Conclusions: These results provide strong evidence that adults with PDS have anomalous anatomy not only in perisylvian speech and language areas but also in prefrontal and sensorimotor areas. Whether this atypical asymmetry of WM is the cause or the consequence of stuttering is still an unanswered question. This article is available from: http://www.biomedcentral.com/1471-2377/4/23 © 2004 Jäncke et al; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Background: In rat, deafferentation of one labyrinth (unilateral labyrinthectomy) results in a characteristic syndrome of ocular and motor postural disorders (e.g., barrel rotation, circling behavior, and spontaneous nystagmus). Behavioral recovery (e.g., diminished symptoms), encompassing 1 week after unilateral labyrinthectomy, has been termed vestibular compensation. Evidence suggesting that the histamine H3 receptor plays a key role in vestibular compensation comes from studies indicating that betahistine, a histamine-like drug that acts as both a partial histamine H1 receptor agonist and an H3 receptor antagonist, can accelerate the process of vestibular compensation. Results: Expression levels of the histamine H3 receptor (total), as well as of three isoforms displaying variable lengths of the third intracellular loop of the receptor, were analyzed using in situ hybridization on brain sections containing the rat medial vestibular nucleus after unilateral labyrinthectomy. We compared these expression levels to H3 receptor binding densities. Total H3 receptor mRNA levels (detected by oligo probe H3X) as well as mRNA levels of the three receptor isoforms studied (detected by oligo probes H3A, H3B, and H3C) showed a pattern of increase, which was bilaterally significant at 24 h post-lesion for both H3X and H3C, followed by significant bilateral decreases in medial vestibular nuclei occurring 48 h (H3X and H3B) and 1 week post-lesion (H3A, H3B, and H3C). Expression levels of H3B were an exception to the aforementioned pattern, with significant decreases already detected at 24 h post-lesion. Coinciding with the decreasing trends in H3 receptor mRNA levels was an observed increase in H3 receptor binding densities occurring in the ipsilateral medial vestibular nuclei 48 h post-lesion.
Conclusion: Progressive recovery of the resting discharge of the deafferented medial vestibular nucleus neurons results in functional restoration of the static postural and oculomotor deficits, usually occurring within a time frame of 48 hours in rats. Our data suggest that the H3 receptor may be an essential part of the pre-synaptic mechanisms required for re-establishing resting activities 48 h after unilateral labyrinthectomy.
Western cultures have witnessed a tremendous cultural and social transformation of sexuality in the years since the sexual revolution. Apart from a few public debates and scandals, the process has moved along gradually and quietly. Yet its real and symbolic effects are probably much more consequential than those generated by the sexual revolution of the sixties. Sigusch refers to the broad-based recoding and reassessment of the sexual sphere during the eighties and nineties as the "neosexual revolution". The neosexual revolution is dismantling the old patterns of sexuality and reassembling them anew. In the process, dimensions, intimate relationships, preferences and sexual fragments emerge, many of which had been submerged, were unnamed or simply did not exist before. In general, sexuality has lost much of its symbolic meaning as a cultural phenomenon. Sexuality is no longer the great metaphor for pleasure and happiness, nor is it so greatly overestimated as it was during the sexual revolution. It is now widely taken for granted, much like egotism or motility. Whereas sex was once mystified in a positive sense - as ecstasy and transgression - it has now taken on a negative mystification characterized by abuse, violence and deadly infection. While the old sexuality was based primarily upon sexual instinct, orgasm and the heterosexual couple, neosexualities revolve predominantly around gender difference, thrills, self-gratification and prosthetic substitution. From the vast number of interrelated processes from which neosexualities emerge, three empirically observable phenomena have been selected for discussion here: the dissociation of the sexual sphere, the dispersion of sexual fragments and the diversification of intimate relationships. The outcome of the neosexual revolution may be described as "lean sexuality" and "self-sex".
Background: Common warts (verrucae vulgares) are human papilloma virus (HPV) infections with a high incidence and prevalence, most often affecting hands and feet, and able to impair quality of life. The roughly 30 different therapeutic regimens described in the literature reveal the lack of a single convincing strategy. Recent publications showed positive results of photodynamic therapy (PDT) with 5-aminolevulinic acid (5-ALA) in the treatment of HPV-induced skin diseases, especially warts, using visible light (VIS) to stimulate an absorption band of endogenously formed protoporphyrin IX. Additional experience with adding water-filtered infrared A (wIRA) during 5-ALA-PDT revealed positive effects. Aim of the study: First prospective randomised controlled blind study including PDT and wIRA in the treatment of recalcitrant common hand and foot warts, comparing "5-ALA cream (ALA) vs. placebo cream (PLC)" and "irradiation with visible light and wIRA (VIS+wIRA) vs. irradiation with visible light alone (VIS)". Methods: Pre-treatment with keratolysis (salicylic acid) and curettage. PDT treatment: topical application of 5-ALA (Medac) in "unguentum emulsificans aquosum" vs. placebo; irradiation: combination of VIS and a large amount of wIRA (Hydrosun® radiator type 501, 4 mm water cuvette, water-filtered spectrum 590-1400 nm, contact-free, typically painless) vs. VIS alone. Post-treatment with retinoic acid ointment. One to three therapy cycles every 3 weeks. Main variable of interest: "percent change of total wart area of each patient over time" (18 weeks). Global judgement by patient and by physician and subjective rating of feeling/pain (visual analogue scales). 80 patients with therapy-resistant common hand and foot warts were assigned randomly to one of the four therapy groups, with comparable numbers of warts at comparable sites in all groups.
Results: The individual total wart area decreased during 18 weeks in group 1 (ALA+VIS+wIRA) and in group 2 (PLC+VIS+wIRA) significantly more than in both groups without wIRA (group 3 (ALA+VIS) and group 4 (PLC+VIS)): medians and interquartile ranges: -94% (-100%/-84%) vs. -99% (-100%/-71%) vs. -47% (-75%/0%) vs. -73% (-92%/-27%). After 18 weeks the two groups with wIRA differed remarkably from the two groups without wIRA: 42% vs. 7% completely cured patients; 72% vs. 34% vanished warts. Global judgement by patient and by physician and subjective rating of feeling were much better in the two groups with wIRA than in the two groups without wIRA. Conclusions: The complete treatment scheme for hand and foot warts described above (keratolysis, curettage, PDT treatment, irradiation with VIS+wIRA, retinoic acid ointment; three therapy cycles every 3 weeks) proved to be effective. Within this treatment scheme, wIRA, a non-invasive and painless treatment modality, proved to be an important, effective factor, while photodynamic therapy with 5-ALA in the described form did not contribute recognisably to clinical improvement, either alone (without wIRA) or in combination with wIRA. For the future treatment of warts an improved scheme is proposed: one treatment cycle (keratolysis, curettage, wIRA, without PDT) once a week for six to nine weeks. © 2004 Fuchs et al; licensee German Medical Science. This is an Open Access article: verbatim copying and redistribution of this article are permitted in all media for any purpose, provided this notice is preserved along with the article's original URL: http://www.egms.de/en/gms/volume2.shtml
We present an overview of the mathematics underlying the quantum Zeno effect. Classical functional-analytic results are put into perspective and compared with more recent ones. This yields some new insights into the mathematical preconditions entailing the Zeno paradox, in particular a simplified proof of Misra and Sudarshan's theorem. We emphasise the complex-analytic structures associated with the question of the existence of the Zeno dynamics. On the grounds of the assembled material, we reason about possible future mathematical developments pertaining to the Zeno paradox and its counterpart, the anti-Zeno paradox, both of which seem to be close to complete characterisations. PACS classification: 03.65.Xp, 03.65.Db, 05.30.-d, 02.30.T. See the corresponding presentations: Schmidt, Andreas U.: "Zeno Dynamics of von Neumann Algebras" and "Zeno Dynamics in Quantum Statistical Mechanics"
We study the quantum Zeno effect in quantum statistical mechanics within the operator algebraic framework. We formulate a condition for the appearance of the effect in W*-dynamical systems, in terms of the short-time behaviour of the dynamics. Examples of quantum spin systems show that this condition can be effectively applied to quantum statistical mechanical models. Furthermore, we derive an explicit form of the Zeno generator, and use it to construct Gibbs equilibrium states for the Zeno dynamics. As a concrete example, we consider the X-Y model, for which we show that a frequent measurement at a microscopic level, e.g. a single lattice site, can produce a macroscopic effect in changing the global equilibrium. PACS classification: 03.65.Xp, 05.30.-d, 02.30. See the corresponding papers: Schmidt, Andreas U.: "Zeno Dynamics of von Neumann Algebras" and "Mathematics of the Quantum Zeno Effect" and the talk "Zeno Dynamics in Quantum Statistical Mechanics" - http://publikationen.ub.uni-frankfurt.de/volltexte/2005/1167/
A fundamental work on THz measurement techniques for application to steel manufacturing processes
(2004)
Until the invention of a photo-mixing technique at Bell Laboratories in 1984 [1], terahertz (THz) waves could only be obtained with huge systems such as free electron lasers. The first method, using the Auston switch, could generate frequencies up to 1 THz [2]. Subsequent efforts to extend this frequency limit, combining antennas for generation and detection, reached several THz [3, 4]. The technique has since developed to fill the so-called "THz gap". At the same time, much research has also aimed at increasing the output power [5-7]. In the 1990s, non-linear optical methods brought a major advance in the accessible frequency band [8-11]; they drastically expanded the frequency region and recently enabled measurements up to 41 THz [12]. In parallel, other approaches have yielded new generation and detection methods, for CW-THz as well as pulsed generation [13-19]. In particular, THz luminescence and lasing, originating in research on the Bloch oscillator, have recently been obtained from quantum cascade structures, albeit only at a low temperature of 60 K [20-22]. This research attracts a lot of attention because, owing to its low cost and easier operation, it could be the breakthrough that spreads THz techniques into industry as well as research. The technology of short-pulse lasers has naturally helped the THz field to develop: against the background of the appearance of stable Ti:sapphire lasers and high-power chirped pulse amplification (CPA) lasers, in place of dye lasers, much effort has been concentrated on pulse compression and amplification techniques [23]. Viewed from the application side, the THz technique has come into the limelight as a promising measurement method.
The discovery of absorption peaks of proteins and DNA in the THz region has, over the past several years, promoted the application of the technique in medicine and pharmaceutical science [24-27]. It is also known that polar molecules absorb in this region, so gas and water-content monitoring has been proposed for the chemical and food industries [28-32]. Furthermore, many reports, such as measurements of carrier distributions in semiconductors, of the refractive index of thin films, and of object shapes by radar, indicate that the technique has a wide range of applications [33-37]. I believe it is worth attempting to apply it in the steel-making industry, owing to its unique advantages. For remote surface inspection, THz wavelengths of 30-300 μm are both insensitive to the surface roughness of steel products and capable of detection with sub-millimetre precision. There is also the possibility of measuring the thickness or dielectric constants of relatively highly conductive materials, thanks to high transmission through non-polar dielectric materials, short-pulse detection, and a high signal-to-noise ratio of 10^3-10^5. Furthermore, THz measurements could be applicable at high temperatures, being less influenced by thermal radiation than visible and infrared light. These ideas motivated me to begin this THz work.
The Kochen-Specker theorem has been discussed intensely ever since its original proof in 1967. It is one of the central no-go theorems of quantum theory, showing the non-existence of a certain kind of hidden-state models. In this paper, we first offer a new, non-combinatorial proof for quantum systems with a type I_n factor as algebra of observables, including I_infinity. Afterwards, we give a proof of the Kochen-Specker theorem for an arbitrary von Neumann algebra R without summands of types I_1 and I_2, using a known result on two-valued measures on the projection lattice P(R). Some connections with the presheaf formulations proposed by Isham and Butterfield are made.
The paper provides a comprehensive overview of the gradual evolution of the supervisory policy adopted by the Basle Committee for the regulatory treatment of asset securitisation. We carefully highlight the pathology of the new “securitisation framework” to facilitate a general understanding of what constitutes the current state of computing adequate capital requirements for securitised credit exposures. Although we incorporate a simplified sensitivity analysis of the varying levels of capital charges depending on the security design of asset securitisation transactions, we do not engage in a profound analysis of the benefits and drawbacks implicated in the new securitisation framework. JEL classification: E58, G21, G24, K23, L51. Forthcoming in Journal of Financial Regulation and Compliance, Vol. 13, No. 1.
The Basel Committee plans to differentiate risk-adjusted capital requirements between banks regulated under the internal ratings based (IRB) approach and banks under the standard approach. We investigate the consequences for the lending capacity and the failure risk of banks in a model with endogenous interest rates. The optimal regulatory response depends on the banks' inclination to increase their portfolio risk. If IRB banks are well-capitalized or gain little from taking risks, then they will increase their market share and hold safe portfolios. As risk-taking incentives become more important, the optimal portfolio size of banks adopting internal rating systems will be increasingly constrained, and ultimately they may lose market share relative to banks using the standard approach. The regulator has only limited options to avoid the excessive adoption of internal rating systems. JEL classification: K13, H41.
We develop an estimated model of the U.S. economy in which agents form expectations by continually updating their beliefs regarding the behavior of the economy and monetary policy. We explore the effects of policymakers' misperceptions of the natural rate of unemployment during the late 1960s and 1970s on the formation of expectations and macroeconomic outcomes. We find that the combination of monetary policy directed at tight stabilization of unemployment near its perceived natural rate and large real-time errors in estimates of the natural rate uprooted heretofore quiescent inflation expectations and destabilized the economy. Had monetary policy reacted less aggressively to perceived unemployment gaps, inflation expectations would have remained anchored and the stagflation of the 1970s would have been avoided. Indeed, we find that less activist policies would have been more effective at stabilizing both inflation and unemployment. We argue that policymakers, learning from the experience of the 1970s, eschewed activist policies in favor of policies that concentrated on the achievement of price stability, contributing to the subsequent improvements in macroeconomic performance of the U.S. economy.
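The mechanism of agents "continually updating their beliefs" can be illustrated with the constant-gain learning rule that is standard in this literature. This is an illustrative sketch only, not the paper's estimated model; the gain parameter and the sequence of observations are hypothetical:

```python
# Constant-gain ("perpetual") learning: each period, agents move their
# belief a fixed fraction of the way toward the latest observation, so a
# run of one-sided surprises makes expectations drift instead of
# averaging out - beliefs become "un-anchored".

def update_belief(belief, observation, gain=0.02):
    # gain is a hypothetical small value; larger gains track data faster
    return belief + gain * (observation - belief)

belief = 2.0  # perceived long-run inflation, percent
for observed in [2.0, 4.0, 6.0, 8.0]:  # a run of upward inflation surprises
    belief = update_belief(belief, observed)

print(round(belief, 3))  # belief has drifted above its initial 2.0
```

A single overshoot would barely move the belief, but the persistent policy errors described in the abstract accumulate period after period, which is how quiescent expectations become destabilized.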