This paper gives a brief overview of computation models for data stream processing and introduces a new model for multi-pass processing of multiple streams, the so-called mp2s-automata. Two algorithms for solving the set disjointness problem with these automata are presented. The main technical contribution of this paper is the proof of a lower bound on the memory size and the number of heads required for solving the set disjointness problem with mp2s-automata.
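The set disjointness problem the abstract refers to can be illustrated with the trivial streaming solution, which buffers one stream in memory. Below is a minimal Python sketch; the function name and the one-pass setup are illustrative assumptions, not the paper's mp2s-automata model:

```python
def streams_disjoint(stream_a, stream_b):
    """Decide whether two streams share no element, using one pass
    over each stream and a hash set over the first.

    This naive solution needs memory linear in the size of the first
    stream; lower bounds of the kind the paper proves concern how much
    memory and how many heads a multi-pass streaming model needs for
    this problem.
    """
    seen = set()
    for x in stream_a:  # pass 1: buffer stream A
        seen.add(x)
    # pass 2: scan stream B for a common element
    return all(y not in seen for y in stream_b)
```

The linear-memory buffer is exactly what lower bounds for set disjointness show to be hard to avoid in one-pass models.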
This article discusses the divergent status of the two particles lé and lá in the grammar of Konkomba, a Gur language (Niger-Congo) of the Gurma subgroup. While previous studies claim that both particles are focus markers, the author argues that only the particle lá should be analyzed as a pure pragmatic device. Distributional studies suggest that the particle lé, on the other hand, is required only under specific focus conditions and primarily represents a syntactic device.
Is the digital future a blessing for philologists, especially those working in the vast area of Germanic Languages & Literatures? Or does it rather come with problems that jeopardize philology, both in the Germanic and the broader scope? This paper sets out to explore the status quo (1.) of digital source material in Germanic philology, ranging from medieval manuscripts to 21st-century e-books and their, at times, highly restricted availability to the scientific community. Do we really face a quantum leap in terms of open access, or is this leap confined only to those who pay the exorbitant fees specialist libraries charge for the use of their rare manuscript and book collections? How about Google Books (2.)? Are we in danger of neglecting everything that is (still) missing there? And what is in it for the German scholar? Can we believe that in a few years' time we will be able to get our hands on every source text we desire within seconds, since it is only a click away? This paper critically assesses the process and progress of the digitization of mankind's written records (3.), focusing on problems to be overcome by, e.g., medievalists wishing to consult certain source material. This is illustrated by the example of materials related to the Franciscan preacher Berthold von Regensburg († 1272) that cannot yet be consulted digitally. A short concluding summary (4.) highlights perspectives for further thinking and discussion.
Experimental data shows that adult learners of an artificial language with a phonotactic restriction learned this restriction better when being trained on word types (e.g. when they were presented with 80 different words twice each) than when being trained on word tokens (e.g. when presented with 40 different words four times each) (Hamann & Ernestus submitted). These findings support Pierrehumbert’s (2003) observation that phonotactic co-occurrence restrictions are formed across lexical entries, since only lexical levels of representation can be sensitive to type frequencies.
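The type/token contrast in the two training regimes can be made concrete in a few lines. The sketch below (function and variable names are illustrative) reproduces the abstract's arithmetic: both regimes expose learners to 160 word tokens, but they differ in the number of word types.

```python
from collections import Counter

def type_and_token_counts(training_sequence):
    """Return (number of distinct word types, number of word tokens)."""
    counts = Counter(training_sequence)
    return len(counts), sum(counts.values())

words = [f"word{i}" for i in range(80)]  # 80 distinct artificial words

type_training = words * 2        # 80 types, presented twice each
token_training = words[:40] * 4  # 40 types, presented four times each

assert type_and_token_counts(type_training) == (80, 160)
assert type_and_token_counts(token_training) == (40, 160)
```

Because the regimes are matched on token count, any learning difference between them must track type frequency, which is the abstract's point.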
The multiplicity fluctuations in A+A collisions at SPS and RHIC energies are studied within the HSD transport approach. We find a dominant role of the fluctuations in the number of nucleon participants for the final fluctuations. In order to extract the physical fluctuations one should reduce the fluctuations in the participant number, which can be done by considering very central collisions. The system size dependence of the multiplicity fluctuations in central A+A collisions in the SPS energy range – obtained with the HSD and UrQMD transport models – is presented. The results can be used as a 'background' for experimental measurements of fluctuations as a signal of the critical point. Event-by-event fluctuations of the K/π, K/p and p/π ratios in A+A collisions are also studied. Event-by-event fluctuations of the kaon-to-pion number ratio in nucleus-nucleus collisions are studied for SPS and RHIC energies. We find that the HSD model can qualitatively reproduce the measured excitation function for the K/π ratio fluctuations in central Au+Au (or Pb+Pb) collisions from low SPS up to top RHIC energies. The forward-backward correlation coefficient measured by the STAR Collaboration in Au+Au collisions at RHIC is also studied. We discuss the effects of the initial collision geometry and of the centrality bin definition on correlations in nucleus-nucleus collisions. We argue that a study of the dependence of correlations on the centrality bin definition as well as on the bin size may distinguish between these 'trivial' correlations and correlations arising from 'new physics'.
5th International Workshop on Critical Point and Onset of Deconfinement (CPOD 2009), June 8-12, 2009, Brookhaven National Laboratory, Long Island, New York, USA
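The event-by-event ratio fluctuations mentioned above have a simple statistical baseline that a toy Monte Carlo can reproduce. The sketch below uses independent Poisson multiplicities with invented mean values for illustration (it is not HSD output): it computes the relative event-by-event width of the kaon-to-pion ratio, which for uncorrelated Poisson production should approach sqrt(1/<K> + 1/<pi>).

```python
import math
import random
import statistics

def sample_poisson(rng, lam):
    """Knuth's Poisson sampler; adequate for the moderate means used here."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def kaon_pion_ratio_width(n_events=10000, mean_k=20.0, mean_pi=200.0, seed=7):
    """Relative event-by-event width sigma(K/pi) / <K/pi> for independent
    Poisson kaon and pion multiplicities (purely statistical fluctuations)."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_events):
        n_pi = sample_poisson(rng, mean_pi)
        n_k = sample_poisson(rng, mean_k)
        if n_pi > 0:  # virtually always true for mean_pi = 200
            ratios.append(n_k / n_pi)
    return statistics.stdev(ratios) / statistics.mean(ratios)

# statistical expectation for these means: sqrt(1/20 + 1/200) ~ 0.235
```

Dynamical fluctuations in a transport model would show up as a deviation of the measured width from this purely statistical baseline.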
The objective of this paper is to test the hypothesis that financially constrained firms in particular lease a higher share of their assets in order to mitigate problems of asymmetric information. The hypotheses are tested in a GMM framework that simultaneously controls for endogeneity problems and firms' fixed effects. We find that the share of total annual lease expenses attributable to either finance or operating leases is considerably higher for smaller firms with higher average interest rates and for high-growth firms - those likely to face higher agency-cost premiums on marginal financing. Furthermore, our results confirm that leasing and debt financing are substitutes for lessee firms. However, we find no evidence that firms use leasing as an instrument to reduce their tax burden.
Keywords: Leasing, financial constraints, asymmetric information, GMM
JEL Classifications: D23, D92, C23
Ambiguity and communication
(2009)
The ambiguity of a nondeterministic finite automaton (NFA) N for input size n is the maximal number of accepting computations of N on an input of size n. For all k, r ∈ ℕ we construct languages L_{r,k} which can be recognized by NFAs of size k · poly(r) and with ambiguity O(n^k), but L_{r,k} has only NFAs of exponential size if ambiguity o(n^k) is required. In particular, a hierarchy for polynomial ambiguity is obtained, solving a long-standing open problem (Ravikumar and Ibarra, 1989; Leung, 1998).
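The notion of ambiguity can be made concrete with a small path-counting sketch. The NFA below is a hypothetical two-state example, not one of the constructions from the abstract: each 'a' in the input starts one distinct accepting computation, so this NFA has linear, i.e. O(n), ambiguity.

```python
def count_accepting_runs(nfa, start, accepting, word):
    """Count accepting computations of an NFA on `word` by dynamic
    programming: runs[q] = number of distinct computation paths that
    reach state q after reading the input so far."""
    runs = {start: 1}
    for symbol in word:
        nxt = {}
        for state, n in runs.items():
            for target in nfa.get((state, symbol), ()):
                nxt[target] = nxt.get(target, 0) + n
        runs = nxt
    return sum(n for q, n in runs.items() if q in accepting)

# A 2-state NFA for "contains an 'a'": in state 0 it may either skip an
# 'a' or commit to it, so every 'a' in the input spawns one distinct
# accepting run.
nfa = {
    (0, "a"): (0, 1),
    (0, "b"): (0,),
    (1, "a"): (1,),
    (1, "b"): (1,),
}
assert count_accepting_runs(nfa, 0, {1}, "ababa") == 3  # one run per 'a'
```

The lower bounds in the paper concern the opposite direction: how large an NFA must be when its ambiguity is forced below a given growth rate.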
Word formation in Distributed Morphology (see Arad 2005, Marantz 2001, Embick 2008):
1. Language has atomic, non-decomposable elements: roots.
2. Roots combine with the functional vocabulary to build larger elements.
3. Roots are category-neutral; they are then categorized by combining with category-defining functional heads.