The redeemers of the land
(1999)
Jan Snyman papers
(2007)
Biographical history and context: Professor Jan Snyman spent most of his life researching the lesser known and marginalised San languages of Botswana and South West Africa (now Namibia). Together with O. Kohler, E. Westphal and A. Traill, he pioneered linguistic studies on these endangered languages of Africa. He contributed significantly to the collection of data that helped classify and understand the grammar of San languages. Snyman also wrote several grammars in the form of monographs and notes on these languages. By the time he died in 2002, a draft for the Tshwaa and Kua languages had been completed. Content: Linguistic, phonetic and orthographic research materials, including fonts for phonetic languages. Covering dates: 1967-2000
In this paper I argue in favor of a Matching Analysis for German relative clauses. The Head Raising Analysis is shown to fail to account for parts of the reconstruction pattern in German, especially cases where only the external head is interpreted, and for the absence of Principle C effects. I propose a Matching Analysis with Vehicle Change and make consistent assumptions about possible deletion operations in relatives, so that the entire pattern can be captured by one analysis, which therefore proves superior to previous ones.
In April 2002 the European Central Bank (ECB) and the Center for Financial Studies (CFS) launched the ECB-CFS Research Network to promote research on “Capital Markets and Financial Integration in Europe”. The ECB-CFS research network aims at stimulating top-level and policy-relevant research, significantly contributing to the understanding of the current and future structure and integration of the financial system in Europe and its international linkages with the United States and Japan. This report summarises the work done under the network after two years. Over time the network formed a coherent and growing group of researchers interested in the integration of European financial markets, while using light organisational structures and budgets. The members of this evolving group met repeatedly at the events organised by the network to present the latest results of their research and to share views on policy options. In this sense, the “network of people” intended at the start was created. Overall, the network aroused great interest, as leading academic researchers, researchers from the main policy institutions and high-level policy makers participated actively in it by presenting research results, through speeches and in policy panels. It also stimulated a new research field on securities settlement systems, an area of high policy relevance and interest to the ECB that had not attracted much interest in the research community beforehand. Also, the network seems to have triggered several related outside initiatives by international institutions, such as the IMF or the OECD. During its first two years the network was organised around three workshops and a final symposium on 10-11 May 2004. 
To focus research resources and to ensure medium-term policy relevance, a limited number of areas have been given top priority: bank competition and the geographical scope of banking; international portfolio choices and asset market linkages between Europe, the United States and Japan; European bond markets; European securities settlement systems; and the emergence and evolution of new markets in Europe (in particular start-up financing markets). In order to stimulate further research focused on the priority fields of the network, the ECB Lamfalussy research fellowships were established. These fellowships sponsor projects proposed by young researchers, both advanced doctoral students and younger professors. Five Lamfalussy fellowships were granted in 2003 and five more in 2004. The first papers from this program have already been issued in the ECB working paper series or are forthcoming. One of them won the prize for the best paper written by a Ph.D. student at the 2004 European Finance Association Meetings in Maastricht. Results of the network in the five top priority areas can be summarised as follows: Bank competition and the geographical scope of banking. First, integration does not appear to be very advanced in many retail banking markets. Second, some of the inherent characteristics of traditional loan and deposit business constrain the cross-border expansion of commercial banking, even in a common currency area. Hence, the implementation of some policies to foster cross-border integration in retail banking may be ineffective. Third, theoretical research suggests that supervisory structures may not be neutral towards further European banking integration. Finally, a stronger role of area-wide competition policies could be beneficial for further banking integration. This would also stimulate economic growth, as more competition in the banking sector induces financially dependent firms to grow more. European bond markets. 
While the government bond market has integrated rapidly with the EMU convergence process, its full integration has not yet been achieved. The introduction of a common electronic trading platform reduced transaction costs substantially, but yield spreads of long-term sovereign bonds of the euro area are still heterogeneous. This is largely explained by different sensitivities to an international risk factor, whereas liquidity differentials only play a role in conjunction with this latter factor. Somewhat surprisingly in this context, the dynamically developing corporate bond market exhibits a relatively high level of integration. There is also increasing evidence that the introduction of the euro has contributed to a reduction in the cost of capital in the euro area, in particular through the reduction of corporate bond underwriting fees. As a result, firms may wish to increase bond financing relative to equity financing. The development of a larger corporate bond market is also important for monetary policy. For example, US evidence suggests that the rating of corporate bonds may contribute to the persistence of recessions, as rating agencies' policies affect firms asymmetrically in their access to the bond market over the business cycle. US evidence also suggests that liquidity conditions in stock and bond markets tend to be positively correlated. European securities settlement systems. European securities settlement infrastructures are highly fragmented and further integration and/or consolidation would exploit economies of scale that could greatly benefit investors. It is not clear, however, whether direct public intervention in favour of consolidation would lead to the highest level of efficiency, for example because of the existence of strong vertical integration between trading and securities platforms (“silos”). In contrast, promoting open access to clearing and settlement systems could lead to consolidation and the highest level of efficiency. 
Finally, regarding concerns about unfair practices by Central Securities Depositories (CSDs) toward custodian banks, regulatory interventions favouring custodian banks should be discouraged, as long as CSDs are not allowed to price discriminate between custodian banks and investor banks. The emergence and evolution of new markets in Europe (in particular start-up financing markets). While fairly well integrated, “new markets” and start-up financing are less developed and integrated in Europe than in the United States. However, new markets and venture capitalists are the most important intermediaries for the financing of projects with high risk but with potentially very high return. The analysis carried out within the network reveals that European start-up financiers are mostly institutional investors, while US venture capitalists are mostly rich individuals. Also, new markets are essential for the development of start-up finance in Europe, as they provide an exit strategy for start-up financiers, who can then sell successful new projects through initial public offerings. Finally, the legal framework affects the development of venture capital firms. For example, very strict personal bankruptcy laws constrain early stage entrepreneurs, reducing demand for venture capital finance. International portfolio choices and asset market linkages between Europe, the United States and Japan. On a global scale, asset market linkages have increased recently. For example, major economies such as the United States and the euro area have become more financially interdependent. This phenomenon can be observed in stock and bond markets as well as in money markets, where the main direction of spillovers has recently been from the US to the euro area. Country-specific shocks now play a smaller role in explaining stock return variations of firms whose sales are internationally diversified. 
Increases in firm-by-firm market linkages are a global phenomenon, but they are stronger within the euro area than in the rest of the world. Various other phenomena also increase market linkages and therefore the likelihood that financial shocks spread across countries. One example is the use of global bonds. Finally, the now more direct access of unsophisticated investors to financial markets may increase volatility. Other areas. Financial integration affects financial structures, but it does not need to lead to their convergence across countries. Financial structures matter for growth, as market-oriented financial systems benefit all sectors and firms, whereas bank-based systems primarily benefit younger firms that depend on external finance. Moreover, good corporate governance increases firms’ value. In particular, the dual board system, where the monitoring and advising roles of the board of directors are separated, is found to dominate the single board structure. Therefore, the further development of the European single market strongly calls for good corporate governance. In general, well designed institutions foster entrepreneurial activity, partly by relaxing capital constraints. The results of the network clearly illustrated the substantial effects the introduction of the euro had on euro area financial markets. In addition to the effects on bond markets, stock markets and the cost of capital summarised above, the research produced showed that the single currency had its strongest effects on money markets, whose unsecured segment is now completely integrated. Without any doubt the euro generally enhanced the liquidity and efficiency of euro area financial markets, and ongoing initiatives such as the European Union’s Financial Services Action Plan will help to continue this process. In sum, in the first two years the network has established itself as the hub for the research debate on European financial integration. 
Some of the best papers produced by the network, leading to the conclusions mentioned above, are currently being considered for publication in two special issues of academic journals. An issue of the Oxford Review of Economic Policy on “European financial integration” is published contemporaneously with this report, and an issue of the Review of Finance is planned for next year. The current policy context, the gradual progress of integration as well as the creation of other related non-ECB or non-CFS initiatives on financial integration suggest that this topic will remain high on the agendas of policy makers and academics for the years to come. Therefore, the ECB Executive Board and the CFS decided to continue the network, refocusing its priorities. Three priority areas have been added: 1) the relationship between financial integration and financial stability, 2) EU accession, financial development and financial integration, and 3) financial system modernisation and economic growth in Europe. These three areas have become particularly important at the current juncture, but have not received particularly strong attention in the first two years of the network. For example, the area of financial stability research was highlighted by the ECB research evaluators as an area deserving further development. Moreover, despite the results found in the first two years of the network, new developments remain to be further explored in the earlier priority areas. A three-year extension is envisaged, running from after the May 2004 symposium until 2007, with two events to be held per year. The three-year period is long enough to consider the first effects of the Financial Services Action Plan. It also constitutes a realistic horizon for the ambitious agenda implied by the three new priorities. The generally light organisational structure and working of the network will not be changed. 
In addition, given the value of the Lamfalussy fellowship research program in inducing further research in the areas of the network, the program has also been extended for all the research topics in the area of the network.
When Angela Merkel arrives at the United Nations for the opening of the 62nd session of the General Assembly on Tuesday [25 September] to deliver her first address as German chancellor she will be very well received. Just after two years in power she has already become something like a foreign policy legend...
Bency Eichorn learns in kollel and, on the side, has been researching various segulos. For his wedding he authored a book, Simchas Zion, discussing the segulah of keeping the afikoman from year to year. The post below is a small part of a much larger project on this segulah and has been adapted for the blog.
Exploring elites and their relations to institutions can assist in understanding the day-to-day realities of politics in Africa (Chabal and Daloz 1999, Amundsen 2001, Lindberg 2003). This review is a scoping exercise in what has been written on the subject in recent years. The main task of the review is to summarise current understandings of how elites work with and through political institutions in Africa. There is a huge literature in this subject area. We have tried to pick out a) that which is most pertinent and non-repetitive, and b) that which raises as many questions as it provides answers. On the whole we have focused on literature published in the last five to ten years and we have inclined towards the literature on Anglophone Africa. The review is presented as follows: Section 1 is an introduction to Africa’s recent political landscape and it introduces some of the major issues that appear in the literature. Section 2 provides some working definitions of elites, institutions and democratisation as three of the recurring themes in the review. Section 3 reviews literature broadly on democratisation in Africa and specifically on elections and elites. Section 4 examines how political parties have evolved over the last 15 years. Section 5 reviews the three branches of government and Section 6 briefly examines decentralisation and its relation to elites and politics. The remaining sections of the review move outside the more formal political structures to examine the media (Section 7), civil society (Section 8), women’s movements (Section 9), trade unions (Section 10) and business associations (Section 11). The final Section 12 pulls out a number of gaps in the research that we have identified in the course of the review. Section 13 contains a complete bibliography of citations used in the review. 
It is crucial to remember that Africa’s experiences of democratisation are no more than 15 years old, and many scholars have cautioned that it is still very early to draw any definite conclusions (Amundsen 2001; Randall and Svasand 2002). Table of contents:
1. Africa’s political landscape (diversity of ‘Africa’; elections do not mean democracy; presidentialism; ethnicity; personal rule and patronage)
2. Definitions (elites; political institutions; democratisation)
3. Democratisation and elites (elections; elites and elections)
4. Political parties
5. Branches of government (the executive; the legislature; the judiciary)
6. Decentralisation
7. Media (radio; television; newspapers; internet)
8. Civil society
9. Women’s movements
10. Trade unions
11. Business associations
12. Gaps in the research
13. Bibliography
The basic problem of primary audio and video research materials is clearly shown by the survey: A great and important part of the entire heritage is still outside archival custody in the narrower sense, scattered over many institutions in fairly small collections, and even in private hands. Preservation following generally accepted standards can only be carried out effectively if collections represent a critical mass. Specialised audiovisual archives will solve their problems, as they will sooner or later succeed in getting appropriate funding to achieve their aims. A very encouraging example is the case of the Netherlands. The larger audiovisual research archives will also manage, more or less autonomously, the transfer of contents in time. For a considerable part of the research collections, however, the concept of cooperative models and competence centres is the only viable model to successfully safeguard their holdings. Their organisation and funding is a considerable challenge for the scientific community. TAPE has significantly raised awareness of the fact that, unless action is swiftly taken, the loss of audiovisual materials is inevitable. TAPE’s international and regional workshops were generally overbooked. While TAPE was already underway, several other projects for the promotion of archives have received grants from organisations other than the European Commission, inter alia support for the St. Petersburg Phonogram Archive and the Folklore Archive in Tirana, obviously as a result of a better understanding of the need for audiovisual preservation. When the TAPE project started, its partners assumed that cooperative projects would fail because of the notorious distrust of researchers, specifically in the post-communist countries. One of the most encouraging surprises was to learn, at least in the most recent survey, that this social obstacle is fading. TAPE may have contributed to this important development.
This article generalizes Schwinger’s mechanism for particle production to an arbitrary finite field volume. The McLerran-Venugopalan (MV) model and an iterative solution of the DGLAP equation in the double-leading-log approximation for the small-x gluon distribution function are used to derive a new formula for the initial chromofield energy density. This initial chromofield energy is distributed among color-neutral clusters, or strings, of different lengths. These strings are stretched by the receding nuclei. From the proposed mechanism of string fragmentation, or color-field decay, based on an exact solution of the Dirac equation in different finite volumes, new formulae for the estimated baryon kinetic energy loss and the rapidity spectrum of produced partons are derived.
In this article, a travel sketch by the Danish playwright Kaj Munk (1898–1944) is considered analytically. The analysis of this text allows at least three conclusions: 1. The explicit motif of seasickness, figuring here as an antithetic modification of the implicitly present motif of free-standing posture, symbolizes the idea of a disintegrating personality. 2. Such symbolism is deeply rooted in that of Danish identity. 3. From the point of view of the history of literary styles and trends, the sketch appears typical of the line in expressionism that continues the tradition of symbolism as the artistic and literary movement of the late 19th century.
When one considers the results of social scientific surveys, secularisation in Germany seems to be a more or less linear process of erosion of what is traditionally called religiosity. The percentage of citizens who affirm that they are “religious”, believe in God or otherworldly beings, hope for life after death or participate regularly in the praxis of a religious community has been, by and large, steadily declining for decades. This decline has occurred over succeeding generations: the younger the generation, the fewer “religious” people in it. But the process of secularisation is apparent not only in this persistent quantitative shrinkage from generation to generation. Above all it also manifests itself, this is the thesis of the article, in the transformation of the habitus formations and contents of faith of the generations. The essence of ongoing secularisation naturally is reflected most clearly in its contemporary state of development, which is represented in the youngest adult generation. Therefore the analysis of this generation is particularly interesting for the sociology of religion. But the article does not confine itself to this generation. After indicating some basic premises of the sociology of generations and the notion of secularisation presupposed in this paper, the succession of generations in Germany is outlined hypothetically, from the so-called generation of ’68 to the youngest adult generation, concluding with some remarks about the progress of secularisation.
This paper documents the experiences of assurance evaluation during the early stage of a large software development project. This project researches, contracts and integrates privacy-respecting software into business environments. While assurance evaluation with ISO 15408 Common Criteria (CC) within the certification schemes is done after a system has been completed, our approach executes evaluation during the early phases of the software life cycle. The promise is to increase quality and to reduce testing and fault removal costs for later phases of the development process. First results from the still-ongoing project suggest that the Common Criteria can define a framework for assurance evaluation in ongoing development projects.
We review arguments for and against reserve requirements and conclude that the main question is whether a distinction between money creation and intermediation can be made. We argue that such a distinction can be made in a money-in-advance economy and show that if the money-in-advance constraint is universally binding then reserve requirements on checkable accounts have no effect on intermediation. We then proceed to show that in a model in which trade is uncertain and sequential, a fractional reserve banking system gives rise to endogenous monetary shocks. These endogenous monetary shocks lead to fluctuations in capacity utilisation and waste. When the money-in-advance constraint is universally binding, a 100% reserve requirement on checkable accounts can eliminate this waste.
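The reserve-requirement argument above rests on how much deposit money a banking system can create from a given base. The textbook deposit-multiplier arithmetic below illustrates the limiting case the paper highlights; it is a standard illustration, not the paper's money-in-advance model:

```python
def broad_money(base, reserve_ratio):
    """Textbook deposit multiplier: total deposits supported by a
    monetary base when banks hold the fraction `reserve_ratio` of
    deposits as reserves and lend out the rest (geometric series)."""
    return base / reserve_ratio

print(broad_money(100, 0.10))  # fractional reserves: 1000.0
print(broad_money(100, 1.00))  # 100% reserves: 100.0 -- no deposit money creation
```

With a 100% requirement the multiplier collapses to one, so checkable deposits can no longer amplify the base, which is the sense in which such a requirement shuts down money creation while leaving pure intermediation untouched.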
The European Strategy on Invasive Alien Species T-PWS(2002) 8 mandates intensified research by member nations on invasive species. This research will not be restricted solely to the biology and remediation of invasive species, but will also evaluate their adverse health effects and economic impact. Previous studies of these issues have only been carried out in the United States of America, or in a limited, regional manner. Consequently, 20 plant and animal species from various problem areas (species which pose a threat to public health; losses to agriculture, fisheries, and forestry; damage to public roads and waterways; costs associated with the protection of native species threatened by non-native species as mandated by Recommendation 77 of the Bern Convention) were assessed in Germany nation-wide. The accruing costs were sorted into 3 categories: a) direct economic losses, such as those caused by destructive pest species; b) ecological costs, in the form of extra care and protection of native taxa, biotopes, or ecosystems threatened by invasive species; c) costs of measures to combat invasive species. Because of the nature of available data, as well as the different biology and ecology of the invasive species, each had to be treated individually, and the associated costs vary greatly from species to species. Moreover, not all of the species investigated cause economic losses. Accordingly, a nuanced approach to alien species is essential. Cost assessment of losses deriving from ecological damage was only possible in a few cases. Ongoing, multi-year studies incorporating cost/benefit analysis will be necessary to resolve remaining issues.
Left dislocation in Zulu
(2004)
This paper examines left dislocation constructions in Zulu, a Southern Bantu language belonging to the Nguni group (Zone S 40). In Zulu left dislocation configurations, a topic phrase at the beginning of the sentence is linked to a resumptive element within the associated clause. Typically, the resumptive element is an incorporated pronoun (cf. Bresnan & Mchombo 1987), as illustrated by the examples in (1) and (2). In these examples, the object pronoun (in italics) is part of the verbal morphology and agrees with the noun class (gender) of the dislocate. This situation is schematically illustrated in (3), where co-indexation represents agreement: ...
In this paper I discuss the properties of particle verbs in light of a proposal about syntactic projection. In section 2 I suggest that projection involves functional structure in two important ways: (i) only functional phrases can be complements, and (ii) lexical heads that take complements and project must be inflected. In section 3, I show that the structure of particle verbs is not uniform with respect to (i) and (ii). On the one hand, a particle always combines with an inflected verb; in this respect, particle verbs look like verb-complement constructions. On the other hand, the particle is not a functional phrase and therefore is not a proper complement, which makes the combination of the particle and the verb look more like a morphologically complex verb. I argue that syntactic rules can in fact interpret the node dominating the particle and the verb as a projection and as a complex head. In section 4, I show that many of the characteristic properties of particle verbs in the Germanic languages follow from the fact that they are structural hybrids.
In this paper, I examine the potential of mobile alerting services to empower investors to react quickly to critical market events. To this end, an analysis of short-term (intraday) price effects is performed. I find abnormal returns to company announcements that are completed within a timeframe of minutes. To make use of these findings, these price effects are predicted using pre-defined external metrics and different estimation methodologies. Compared to previous research, the results support artificial neural networks and multiple linear regression as good estimation models for forecasting price effects on an intraday basis as well. As most of the price-effect magnitude and effect delay can be estimated correctly, it is demonstrated how a suitable mobile alerting service combining a low level of user intrusiveness with timely information supply can be designed.
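The multiple-linear-regression leg of such a forecast can be sketched in a few lines. The feature set and the synthetic data below are illustrative assumptions, not the paper's metrics or sample; the point is only the mechanics of fitting pre-defined external metrics to price-effect magnitudes:

```python
import numpy as np

# Illustrative only: three hypothetical pre-announcement metrics
# (e.g. volume, volatility, announcement-type dummy) generate an
# abnormal-return magnitude with additive noise.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                 # pre-defined external metrics
beta_true = np.array([0.5, -0.2, 0.1])
y = X @ beta_true + 0.05 * rng.normal(size=n)

X1 = np.column_stack([np.ones(n), X])       # add an intercept column
beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)
pred = X1 @ beta_hat                        # forecast price-effect magnitude
print(np.round(beta_hat[1:], 2))
```

A neural-network variant would replace the least-squares fit with a small feed-forward regressor trained on the same metrics; the comparison of the two is exactly the estimation-methodology question the paper addresses.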
Multiplayer games have become very popular in the PC market. Almost none of the current games are shipped without some support for multiplayer gaming. At the same time mobile devices are becoming more powerful, and the popularity of games on these platforms increases. However, there are almost no games that support multiplayer gaming, despite the multiple options these devices have to connect with each other and build mobile ad hoc networks. Reasons for this lack of multiplayer support are the high diversity of mobile devices as well as the different protocols, with differing properties, that these devices support. With “SmartBlaster” we developed a multiplayer game for several different platforms that uses several different channels (Bluetooth, IrDA, 802.11 and other networks supporting TCP/IP) to communicate between devices.
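Supporting several transports behind one interface is the key design move such a game needs. The sketch below shows one way to shape that abstraction; the class and method names are assumptions for illustration, not SmartBlaster's actual API, and the loopback transport stands in for real Bluetooth, IrDA or TCP/IP channels:

```python
from abc import ABC, abstractmethod

class Channel(ABC):
    """Uniform send/receive interface over heterogeneous transports,
    so game logic never needs to know which link carries a message."""
    @abstractmethod
    def send(self, data: bytes) -> None: ...
    @abstractmethod
    def receive(self) -> bytes: ...

class LoopbackChannel(Channel):
    """In-memory stand-in; a real port would add TCP, Bluetooth, IrDA
    subclasses implementing the same two methods."""
    def __init__(self):
        self._queue = []
    def send(self, data: bytes) -> None:
        self._queue.append(data)
    def receive(self) -> bytes:
        return self._queue.pop(0)

ch = LoopbackChannel()
ch.send(b"player-move:3,4")
print(ch.receive())
```

The game then holds a list of `Channel` objects and broadcasts state updates over whichever ones are currently connected, which is how device diversity stays out of the gameplay code.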
Serial correlation in dynamic panel data models with weakly exogenous regressor and fixed effects
(2005)
Our paper presents and compares two estimation methodologies for dynamic panel data models in the presence of serially correlated errors and weakly exogenous regressors. The first is the first-difference GMM estimator proposed by Arellano and Bond (1991), and the second is the transformed Maximum Likelihood Estimator proposed by Hsiao, Pesaran, and Tahmiscioglu (2002). Thereby, we consider the fixed-effects case and weakly exogenous regressors. The finite-sample properties of both estimation methodologies are analysed within a simulation experiment. Furthermore, we present an empirical example to consider the performance of both estimators with real data. JEL Classification: C23, J64
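The first-difference idea behind the GMM estimator can be illustrated with its simplest relative, the Anderson-Hsiao IV estimator: differencing removes the fixed effect, and a lagged level instruments the differenced lag. The simulation below is a hedged sketch of that mechanism, not the paper's design; Arellano-Bond GMM generalizes it by stacking all available lagged levels as instruments:

```python
import numpy as np

def anderson_hsiao_rho(y):
    """IV estimate of rho in y_it = rho*y_{i,t-1} + mu_i + e_it.

    First-differencing removes the fixed effect mu_i; the level
    y_{i,t-2} instruments the endogenous regressor Delta y_{i,t-1}."""
    dy = np.diff(y, axis=1)
    dep = dy[:, 1:].ravel()     # Delta y_it for t = 2..T-1
    lag = dy[:, :-1].ravel()    # Delta y_{i,t-1}
    z = y[:, :-2].ravel()       # instrument y_{i,t-2}
    return (z @ dep) / (z @ lag)

# Simulate a small AR(1) panel with fixed effects and estimate rho.
rng = np.random.default_rng(1)
N, T, rho = 500, 8, 0.5
mu = rng.normal(size=N)
y = np.zeros((N, T))
y[:, 0] = mu + rng.normal(size=N)
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + mu + rng.normal(size=N)
print(round(anderson_hsiao_rho(y), 2))   # IV estimate of rho
```

With serially correlated errors, the moment condition E[y_{i,t-2} Δe_it] = 0 fails, which is precisely the complication the paper's two estimators are built to handle.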
The effects of vocational training programmes on the duration of unemployment in Eastern Germany
(2005)
Vocational training programmes have been the most important active labour market policy instrument in Germany in recent years. However, the still unsatisfying situation of the labour market has raised doubts about the efficiency of these programmes. In this paper, we analyse the effects of participation in vocational training programmes on the duration of unemployment in Eastern Germany. Based on administrative data of the Federal Employment Administration for the period between October 1999 and December 2002, we apply a bivariate mixed proportional hazards model. By doing so, we are able to use the information on the timing of treatment as well as observable and unobservable influences to identify the treatment effects. The results show that participation in vocational training prolongs the unemployment duration in Eastern Germany. Furthermore, the results suggest that locking-in effects are a serious problem of vocational training programmes. JEL Classification: J64, J24, I28, J68
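The locking-in effect the paper identifies can be illustrated with a toy timing-of-events simulation; the hazard rates and treatment-effect size below are assumptions for illustration, not the paper's estimates, and the setup is far simpler than a bivariate mixed proportional hazards model:

```python
import numpy as np

def sim_unemployment(n, base_hazard=0.05, treat_effect=0.5, seed=0):
    """Toy simulation: weekly exit hazard from unemployment is
    base_hazard; entering training at a random week multiplies the
    hazard by treat_effect < 1 (a locking-in effect), so spells with
    treatment tend to last longer."""
    rng = np.random.default_rng(seed)
    spells = np.empty(n)
    for i in range(n):
        t_treat = rng.integers(1, 40)      # week training would start
        t, hazard = 0, base_hazard
        while rng.random() > hazard:       # no exit this week
            t += 1
            if t == t_treat:               # training begins: hazard drops
                hazard = base_hazard * treat_effect
        spells[i] = t
    return spells

treated = sim_unemployment(2000, treat_effect=0.5)
untreated = sim_unemployment(2000, treat_effect=1.0)
print(treated.mean(), untreated.mean())
```

Comparing the two mean spell lengths reproduces, in caricature, the paper's finding that participation prolongs unemployment duration while the programme locks participants in.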
Comparison of MSACD models
(2003)
We propose a new framework for modelling time dependence in duration processes on financial markets. The well-known autoregressive conditional duration (ACD) approach introduced by Engle and Russell (1998) will be extended in a way that allows the conditional expectation of the duration process to depend on an unobservable stochastic process which is modelled via a Markov chain. The Markov switching ACD model (MSACD) is a very flexible tool for description and forecasting of financial duration processes. In addition, the introduction of an unobservable, discrete-valued regime variable can be justified in the light of recent market microstructure theories. In an empirical application we show that the MSACD approach is able to capture several specific characteristics of inter-trade durations while alternative ACD models fail. JEL classification: C22, C25, C41, G14
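The regime-switching mechanism can be sketched with a short simulation of a two-regime MSACD(1,1): the conditional expected duration follows the usual ACD recursion, but its intercept depends on a hidden two-state Markov chain. The parameter values and the symmetric transition probability below are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

def simulate_msacd(n, omegas=(0.1, 0.5), alpha=0.1, beta=0.8,
                   p_stay=0.95, seed=0):
    """Simulate n durations from a two-regime Markov-switching ACD(1,1).

    The regime s_i follows a symmetric two-state Markov chain; given
    the regime, psi_i = omega[s_i] + alpha*x_{i-1} + beta*psi_{i-1}
    and x_i = psi_i * eps_i with unit-exponential innovations."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    psi = omegas[0] / (1 - alpha - beta)   # start at regime-0 mean
    state = 0
    for i in range(n):
        if rng.random() > p_stay:          # regime switch
            state = 1 - state
        psi = omegas[state] + alpha * (x[i - 1] if i else psi) + beta * psi
        x[i] = psi * rng.exponential()
    return x

durations = simulate_msacd(5000)
print(durations.mean())
```

Because the regime changes the intercept of the recursion, simulated durations cluster into calm and active spells, which is the kind of inter-trade behaviour single-regime ACD models struggle to reproduce.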
We propose a new framework for modelling the time dependence in duration processes on financial markets. The pioneering ACD model introduced by Engle and Russell (1998) will be extended so that the duration process is accompanied by an unobservable stochastic process. The Discrete Mixture ACD framework provides us with a general methodology which puts this idea into practice. It is established by introducing a discrete-valued latent regime variable which can be justified in the light of recent market microstructure theories. The empirical application demonstrates its ability to capture specific characteristics of intraday transaction durations while alternative approaches fail. JEL classification: C41, C22, C25, C51, G14.
In recent methodological work the well-known ACD approach, originally introduced by Engle and Russell (1998), has been supplemented by an unobservable stochastic process that accompanies the underlying duration process via a discrete mixture of distributions. The Mixture ACD model, emanating from the more specialized proposal of De Luca and Gallo (2004), has proved to be a workable tool for describing financial duration data. Until now it has been common practice to use one and the same family of ordinary distributions. Our contribution advocates the use of a richly parameterized, comprehensive family of distributions which allows different distributional idiosyncrasies to interact. JEL classification: C41, C22, C25, C51, G14.
We propose a new framework for modeling time dependence in duration processes. The ACD approach introduced by Engle and Russell (1998) is extended so that the conditional expectation of the durations depends on an unobservable stochastic process which is modeled via a Markov chain. The Markov switching ACD model (MSACD) is a flexible tool for describing financial duration processes. The introduction of a latent information regime variable can be justified in the light of recent market microstructure theories. In an empirical application we show that the MSACD approach is able to capture specific characteristics of inter-trade durations while alternative ACD models fail. JEL classification: C41, C22, C25, C51, G14
During the last decade, there has been a significant bias towards bond financing on emerging markets, with private investors relying on a bail-out of bonds by the international community. This bias has been a main cause of the recent excessive fragility of international capital markets. The paper shows how collective action clauses in bond contracts help to involve the private sector in risk sharing. It argues that such clauses, as a market-based instrument, will raise spreads for emerging market debt and so help to correct a market failure towards excessive bond finance. Recent pressure by the IMF to involve the private sector faces a conflict between the principle of honouring existing contracts and the principle of equal treatment of bondholders.
Structural positions are very common in investment practice. A structural position is defined as a permanent overweighting of a riskier asset class relative to a prespecified benchmark portfolio. The most prominent example of a structural position is the equity bias in a balanced fund that arises from consistently overweighting equities in tactical asset allocation. Another example is the permanent allocation of credit in a fixed income portfolio with a government benchmark. The analysis provided in this article shows that whenever possible, structural positions should be avoided. Graphical illustrations based on the Pythagorean theorem are used to connect the active risk/return and the total risk/return framework. Structural positions alter the risk profile of the portfolio substantially, and the appeal of active management (to provide active returns uncorrelated to benchmark returns and hence to shift the efficient frontier outwards) gets lost. The article demonstrates that the commonly used alpha-tracking-error criterion is not sufficient for active management. In addition, structural positions complicate the measurement of managers' skill. The paper also develops normative implications for active portfolio management. Tactical asset allocation should be based on comparing the expected excess returns of an asset class to the equilibrium risk premium of the same asset class, not to the expected excess returns of other asset classes. For cases where structural positions cannot be avoided, a risk budgeting approach is introduced and applied to determine the optimal position size. Finally, investors are advised not to base performance evaluation only on simple manager rankings, because this encourages managers to take structural positions and does not reward efforts to produce alpha. The same holds true for comparing managers' information ratios.
Information ratios, defined in investment practice as the ratio of active return to active risk, do not uncover structural positions.
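The Pythagorean intuition behind the article's graphical illustrations reduces to a variance decomposition: when active returns are uncorrelated with benchmark returns, portfolio variance is the sum of benchmark variance and active variance. A minimal numerical check, using invented return series constructed to be exactly uncorrelated:

```python
def variance(xs):
    """Population variance of a return series."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Benchmark and active return series with zero means and zero covariance
bench  = [0.02, -0.02, 0.01, -0.01]
active = [0.01,  0.01, -0.01, -0.01]

portfolio = [b + a for b, a in zip(bench, active)]
# Pythagorean identity: sigma_P^2 = sigma_B^2 + sigma_A^2 when corr(B, A) = 0
print(abs(variance(portfolio) - (variance(bench) + variance(active))) < 1e-12)  # True
```

A structural position breaks exactly this orthogonality: the active position becomes correlated with the benchmark, the cross term reappears, and the portfolio's total risk profile changes.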
Hackethal and Schmidt (2003) criticize a large body of literature on the financing of corporate sectors in different countries that questions some of the distinctions conventionally drawn between financial systems. Their criticism is directed against the use of net flows of finance and they propose alternative measures based on gross flows which they claim re-establish conventional distinctions. This paper argues that their criticism is invalid and that their alternative measures are misleading. There are real issues raised by the use of aggregate data but they are not the ones discussed in Hackethal and Schmidt’s paper. JEL Classification: G30
Empirical evidence suggests that even those firms presumably most in need of monitoring-intensive financing (young, small, and innovative firms) have a multitude of bank lenders, where one may be special in the sense of relationship lending. However, theory does not tell us a lot about the economic rationale for relationship lending in the context of multiple bank financing. To fill this gap, we analyze the optimal debt structure in a model that allows for multiple but asymmetric bank financing. The optimal debt structure balances the risk of lender coordination failure from multiple lending and the bargaining power of a pivotal relationship bank. We show that firms with low expected cash-flows or low interim liquidation values of assets prefer asymmetric financing, while firms with high expected cash-flow or high interim liquidation values of assets tend to finance without a relationship bank.
Tractable hedging - an implementation of robust hedging strategies : [This Version: March 30, 2004]
(2004)
This paper provides a theoretical and numerical analysis of robust hedging strategies in diffusion–type models including stochastic volatility models. A robust hedging strategy avoids any losses as long as the realised volatility stays within a given interval. We focus on the effects of restricting the set of admissible strategies to tractable strategies which are defined as the sum over Gaussian strategies. Although a trivial Gaussian hedge is either not robust or prohibitively expensive, this is not the case for the cheapest tractable robust hedge which consists of two Gaussian hedges for one long and one short position in convex claims which have to be chosen optimally.
We study the approximability of the following NP-complete (in their feasibility recognition forms) number-theoretic optimization problems: 1. Given n numbers a_1, ..., a_n ∈ Z, find a minimum gcd set for a_1, ..., a_n, i.e., a subset S ⊆ {a_1, ..., a_n} with minimum cardinality satisfying gcd(S) = gcd(a_1, ..., a_n). 2. Given n numbers a_1, ..., a_n ∈ Z, find a 1-minimum gcd multiplier for a_1, ..., a_n, i.e., a vector x ∈ Z^n with minimum max_{1<=i<=n} |x_i| satisfying ∑ ...
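For intuition, problem 1 can be solved by brute force on tiny instances; the paper's point is precisely that no efficient method approximates it well in general. A minimal sketch:

```python
from math import gcd
from functools import reduce
from itertools import combinations

def minimum_gcd_set(nums):
    """Exhaustive search for a smallest subset S of nums with
    gcd(S) == gcd(nums).  Exponential time, so usable only on tiny
    instances, in line with the hardness results above."""
    target = reduce(gcd, nums)
    for k in range(1, len(nums) + 1):          # try subsets by size
        for sub in combinations(nums, k):
            if reduce(gcd, sub) == target:
                return list(sub)

print(minimum_gcd_set([4, 6, 9]))    # [4, 9]  (gcd(4, 9) = 1 = gcd of all three)
print(minimum_gcd_set([6, 10, 15]))  # [6, 10, 15]  (no proper subset has gcd 1)
```

The second example shows why the problem is combinatorial: every pair of {6, 10, 15} shares a common factor, yet the full triple has gcd 1.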
To preserve the required beam quality in an e+/e- collider it is necessary to have very precise beam position control at each accelerating cavity. An elegant method to avoid additional length and beam disturbance is the use of signals from existing HOM-dampers. The magnitude of the displacement is derived from the amplitude of a dipole mode, whereas the sign follows from the phase comparison of a dipole and a monopole HOM. To check the performance of the system, a measurement setup has been built with an antenna which can be moved with micrometer resolution to simulate the beam. Furthermore, we have developed a signal processing scheme to determine the absolute beam displacement. Measurements on the HOM-damper cell can be done in the frequency domain using a network analyser. Final measurements with the nonlinear time-dependent signal processing circuit have to be done with very short electric pulses simulating electron bunches. Thus, we have designed a sub-nanosecond pulse generator using a clipping line and the step recovery effect of a diode. The measurement can be done with a resolution of about 10 micrometers. Measurements and numerical calculations concerning the monitor design and the pulse generator are presented.
Although the commoditisation of illiquid asset exposures through securitisation facilitates the disciplining effect of capital markets on risk management, private information about securitised debt as well as complex transaction structures could possibly impair fair market valuation. In a simple issue design model without intermediaries we maximise issuer proceeds over a positive measure of issue quality, where a direct revelation mechanism (DRM) by profitable informed investors engages endogenous price discovery through auction-style allocation preference as a continuous function of perceived issue quality. We derive an optimal allocation schedule for maximum issuer payoffs under different pricing regimes if asymmetric information requires underpricing. In particular, we study how the incidence of uninformed investors at varying levels of valuation uncertainty, and their function of clearing the market, affects profitable informed investment. We find that the issuer optimises own payoffs at each valuation irrespective of the applicable pricing mechanism by awarding informed investors the lowest possible allocation (and attendant underpricing) that still guarantees profitable informed investment. Under uniform pricing the composition of the investor pool ensures that informed investors appropriate higher profit than uninformed types. Any reservation utility by issuers lowers the probability of information disclosure by informed investors and the scope of issuers to curtail profitable informed investment. JEL Classifications: D82, G12, G14, G23
Asset securitisation as a risk management and funding tool : what does it hold in store for SMEs?
(2005)
The following chapter critically surveys the attendant benefits and drawbacks of asset securitisation for both financial institutions and firms. It also elicits salient lessons to be learned about the securitisation of SME-related obligations from a cursory review of SME securitisation in Germany, as a foray of asset securitisation in a bank-centred financial system paired with a strong presence of SMEs in industrial production. JEL Classification: D81, G15, M20
As a sign of ambivalence in the regulatory definition of capital adequacy for credit risk and the quest for more efficient refinancing sources, collateralised loan obligations (CLOs) have become a prominent securitisation mechanism. This paper presents a loss-based asset pricing model for the valuation of constituent tranches within a CLO-style security design. The model specifically examines how tranche subordination translates securitised credit risk into investment risk of issued tranches as beneficial interests on a designated loan pool typically underlying a CLO transaction. We obtain a tranche-specific term structure from an intensity-based simulation of defaults under both robust statistical analysis and extreme value theory (EVT). Loss sharing between issuers and investors according to a simplified subordination mechanism allows issuers to decompose securitised credit risk exposures into a collection of default-sensitive debt securities with divergent risk profiles and expected investor returns. Our estimation results suggest a dichotomous effect of loss cascading, with the default term structure of the most junior tranche of CLO transactions ("first loss position") being distinctly different from that of the remaining, more senior "investor tranches". The first loss position carries large expected loss (with high investor return) and low leverage, whereas all other tranches mainly suffer from loss volatility (unexpected loss). These findings might explain why issuers retain the most junior tranche as credit enhancement to attenuate asymmetric information between issuers and investors. At the same time, the issuer's discretion in the configuration of loss subordination within a particular security design might give rise to implicit investment risk in senior tranches in the event of systemic shocks. JEL Classifications: C15, C22, D82, F34, G13, G18, G20
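The simplified subordination mechanism can be pictured as a loss cascade: realised pool losses hit the first loss position up to its notional before touching more senior tranches. A minimal sketch with invented tranche sizes:

```python
def allocate_losses(pool_loss, tranche_sizes):
    """Cascade a realised pool loss through tranches ordered from most
    junior (first loss position) to most senior.  Each tranche absorbs
    losses up to its notional before the next one is touched."""
    losses = []
    remaining = pool_loss
    for size in tranche_sizes:
        hit = min(remaining, size)
        losses.append(hit)
        remaining -= hit
    return losses

# 100m pool: 5m first loss, 15m mezzanine, 80m senior; 12m realised loss
print(allocate_losses(12.0, [5.0, 15.0, 80.0]))  # [5.0, 7.0, 0.0]
```

The dichotomy described above falls out of this waterfall: the thin junior tranche is almost always wiped out to some degree (large expected loss), while the senior tranche is hit only in extreme loss scenarios, so its risk is dominated by loss volatility rather than expected loss.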
See also the German version: Rechtshistorisches Journal 15, 1996, 255-290, and in: Eric Schwarz (ed.), La théorie des systèmes: une approche inter- et transdisciplinaire. Bösch, Sion 1996, 101-119. Italian version: La Bukowina globale: il pluralismo giuridico nella società mondiale. Sociologia e politiche sociali 2, 1999, 49-80. Portuguese version: Bukowina global sobre a emergência de um pluralismo jurídico transnacional. Impulso: Direito e Globalização 14, 2003. Georgian version: Globaluri bukovina: samarTlebrivi pluralizmi msoflio sazogadoebaSi. Journal of the Institute of State and Law of the Georgian Academy of Sciences 2005 (forthcoming)
In recent years, much effort has gone into the design of robust anaphor resolution algorithms. Many algorithms are based on antecedent filtering and preference strategies that are manually designed. Along a different line of research, corpus-based approaches have been investigated that employ machine-learning techniques for deriving strategies automatically. Since the knowledge-engineering effort for designing and optimizing the strategies is reduced, the latter approaches are considered particularly attractive. Since, however, the hand-coding of robust antecedent filtering strategies such as syntactic disjoint reference and agreement in person, number, and gender constitutes a once-for-all effort, the question arises whether they should be derived automatically at all. In this paper, it is investigated what might be gained by combining the best of two worlds: designing the universally valid antecedent filtering strategies manually, in a once-for-all fashion, and deriving the (potentially genre-specific) antecedent selection strategies automatically by applying machine-learning techniques. An anaphor resolution system, ROSANA-ML, which follows this paradigm, is designed and implemented. Through a series of formal evaluations, it is shown that, while exhibiting additional advantages, ROSANA-ML reaches a performance level that compares with the performance of its manually designed ancestor ROSANA.
We address the problem of factoring a large composite number by lattice reduction algorithms. Schnorr has shown that under reasonable number-theoretic assumptions this problem can be reduced to a simultaneous Diophantine approximation problem. The latter in turn can be solved by finding sufficiently many l_1-short vectors in a suitably defined lattice. Using lattice basis reduction algorithms, Schnorr and Euchner applied Schnorr's reduction technique to 40-bit long integers. Their implementation needed several hours to compute a 5% fraction of the solution, i.e., 6 out of 125 congruences which are necessary to factorize the composite. In this report we describe a more efficient implementation using stronger lattice basis reduction techniques incorporating ideas of Schnorr, Hoerner and Ritter. For 60-bit long integers our algorithm yields a complete factorization in less than 3 hours.
Given a real vector alpha = (alpha_1, ..., alpha_d) and a real number epsilon > 0, a good Diophantine approximation to alpha is a number Q such that ||Q alpha mod Z||_inf <= epsilon, where ||.||_inf denotes the maximum norm ||x||_inf := max_{1<=i<=d} |x_i| for x = (x_1, ..., x_d). Lagarias [12] proved the NP-completeness of the corresponding decision problem, i.e., given a vector alpha ∈ Q^d, a rational number epsilon > 0 and a number N ∈ N_+, decide whether there exists a number Q with 1 <= Q <= N and ||Q alpha mod Z||_inf <= epsilon. We prove that, unless ...
We generalize the concept of block reduction for lattice bases from the l_2-norm to arbitrary norms. This extends the results of Schnorr. We give algorithms for block reduction and apply the resulting enumeration concept to solve subset sum problems. The deterministic algorithm solves all subset sum problems. For up to 66 weights it needs on average less than two hours on a HP 715/50 under HP-UX 9.05.
We present an efficient variant of LLL-reduction of lattice bases in the sense of Lenstra, Lenstra, Lovász [LLL82]. We organize LLL-reduction in segments of size k. Local LLL-reduction of segments is done using local coordinates of dimension 2k. Strong segment LLL-reduction yields bases of the same quality as LLL-reduction but the reduction is n-times faster for lattices of dimension n. We extend segment LLL-reduction to iterated subsegments. The resulting reduction algorithm runs in O(n^3 log n) arithmetic steps for integer lattices of dimension n with basis vectors of length 2^{O(n)}, compared to O(n^5) steps for LLL-reduction.
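For reference, the classical algorithm that segment reduction accelerates can be written compactly. This is plain textbook LLL with exact rational arithmetic and delta = 3/4, not the segment variant of the paper; the example basis is made up.

```python
from fractions import Fraction

def lll(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction (Lenstra-Lenstra-Lovász) on a list of
    integer basis vectors, using exact Fraction arithmetic.  Inefficient
    (Gram-Schmidt is recomputed each round) but easy to follow."""
    b = [list(map(Fraction, v)) for v in basis]
    n = len(b)

    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))

    def gram_schmidt():
        bs, mu = [], [[Fraction(0)] * n for _ in range(n)]
        for i in range(n):
            v = b[i][:]
            for j in range(i):
                mu[i][j] = dot(b[i], bs[j]) / dot(bs[j], bs[j])
                v = [x - mu[i][j] * y for x, y in zip(v, bs[j])]
            bs.append(v)
        return bs, mu

    k = 1
    while k < n:
        bs, mu = gram_schmidt()
        for j in range(k - 1, -1, -1):           # size reduction of b_k
            q = round(mu[k][j])
            if q:
                b[k] = [x - q * y for x, y in zip(b[k], b[j])]
        bs, mu = gram_schmidt()
        # Lovász condition: ||b*_k||^2 >= (delta - mu_{k,k-1}^2) ||b*_{k-1}||^2
        if dot(bs[k], bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(bs[k - 1], bs[k - 1]):
            k += 1
        else:
            b[k], b[k - 1] = b[k - 1], b[k]      # swap and step back
            k = max(k - 1, 1)
    return [[int(x) for x in v] for v in b]

print(lll([[201, 37], [1648, 297]]))  # [[1, 32], [40, 1]]
```

The reduced basis spans the same lattice (the determinant is preserved up to sign) but consists of much shorter, nearly orthogonal vectors; the segment variant obtains bases of comparable quality while avoiding the repeated global Gram-Schmidt work.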
We study the following problem: given x ∈ R^n, either find a short integer relation m ∈ Z^n, so that <x, m> = 0 holds for the inner product <., .>, or prove that no short integer relation exists for x. Hastad, Just, Lagarias and Schnorr (1989) give a polynomial time algorithm for the problem. We present a stable variation of the HJLS-algorithm that preserves lower bounds on lambda(x) for infinitesimal changes of x. Given x ∈ R^n and alpha ∈ N, this algorithm finds a nearby point x' and a short integer relation m for x'. The nearby point x' is 'good' in the sense that no very short relation exists for points within half the x'-distance from x. On the other hand, if x' = x then m is, up to a factor 2^{n/2}, a shortest integer relation for x. Our algorithm uses, for arbitrary real input x, at most O(n^4 (n + log alpha)) arithmetical operations on real numbers. If x is rational, the algorithm operates on integers having at most O(n^5 + n^3 (log alpha)^2 + log(||q x||^2)) bits, where q is the common denominator of x.
Black box cryptanalysis applies to hash algorithms consisting of many small boxes, connected by a known graph structure, so that the boxes can be evaluated forward and backwards by given oracles. We study attacks that work for any choice of the black boxes, i.e. we scrutinize the given graph structure. For example we analyze the graph of the fast Fourier transform (FFT). We present optimal black box inversions of FFT-compression functions and black box constructions of collisions. This determines the minimal depth of FFT-compression networks for collision-resistant hashing. We propose the concept of multipermutation, which is a pair of orthogonal latin squares, as a new cryptographic primitive that generalizes the boxes of the FFT. Our examples of multipermutations are based on the operations circular rotation, bitwise xor, addition and multiplication.
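A multipermutation, in the sense proposed above, is a pair of orthogonal Latin squares: superimposing the two squares must yield every ordered pair of symbols exactly once. A minimal checker, with a classic order-3 example:

```python
def is_latin(sq):
    """Check that every row and every column of sq is a permutation
    of the symbols 0..n-1."""
    n = len(sq)
    syms = set(range(n))
    return (all(set(row) == syms for row in sq)
            and all({sq[r][c] for r in range(n)} == syms for c in range(n)))

def are_orthogonal(a, b):
    """Two n x n Latin squares are orthogonal iff superimposing them
    yields all n^2 ordered symbol pairs exactly once."""
    n = len(a)
    pairs = {(a[r][c], b[r][c]) for r in range(n) for c in range(n)}
    return is_latin(a) and is_latin(b) and len(pairs) == n * n

# Rows of A shift left, rows of B shift right: a standard orthogonal pair
A = [[0, 1, 2], [1, 2, 0], [2, 0, 1]]
B = [[0, 1, 2], [2, 0, 1], [1, 2, 0]]
print(are_orthogonal(A, B))  # True
```

The orthogonality property is what makes the primitive attractive for hashing: fixing either output coordinate of the pair still leaves the other a permutation of its input, so no single black-box query collapses the structure.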
With the ubiquitous use of digital camera devices, especially in mobile phones, privacy is no longer threatened only by governments and companies. The new technology creates a new threat from ordinary people, who now have the means to take and distribute pictures of one's face at no risk and little cost in any situation in public and private spaces. Fast distribution via web-based photo albums, online communities and web pages exposes an individual's private life to the public in unprecedented ways. Social and legal measures are increasingly taken to deal with this problem. In practice, however, they lack effectiveness, as they are hard to enforce. In this paper, we discuss a supportive infrastructure aimed at the distribution channel; as soon as the picture is publicly available, the exposed individual has a chance to find it and take proper action.
Correction to: C.P. Schnorr: Security of 2^t-Root Identification and Signatures, Proceedings CRYPTO '96, Springer LNCS 1109 (1996), pp. 143-156, page 148, section 3, line 5 of the proof of Theorem 3. The correction was presented as "Factoring N via proper 2^t-roots of 1 mod N" at the Eurocrypt '97 rump session.
Let G be a finite cyclic group with generator \alpha and with an encoding so that multiplication is computable in polynomial time. We study the security of bits of the discrete log x when given \exp_{\alpha}(x), assuming that the exponentiation function \exp_{\alpha}(x) = \alpha^x is one-way. We reduce the general problem to the case that G has odd order q. If G has odd order q the security of the least-significant bits of x and of the most significant bits of the rational number \frac{x}{q} \in [0,1) follows from the work of Peralta [P85] and Long and Wigderson [LW88]. We generalize these bits and study the security of consecutive shift bits lsb(2^{-i}x mod q) for i=k+1,...,k+j. When we restrict \exp_{\alpha} to arguments x such that some sequence of j consecutive shift bits of x is constant (i.e., not depending on x) we call it a 2^{-j}-fraction of \exp_{\alpha}. For groups of odd group order q we show that every two 2^{-j}-fractions of \exp_{\alpha} are equally one-way by a polynomial time transformation: Either they are all one-way or none of them. Our key theorem shows that arbitrary j consecutive shift bits of x are simultaneously secure when given \exp_{\alpha}(x) iff the 2^{-j}-fractions of \exp_{\alpha} are one-way. In particular this applies to the j least-significant bits of x and to the j most-significant bits of \frac{x}{q} \in [0,1). For one-way \exp_{\alpha} the individual bits of x are secure when given \exp_{\alpha}(x) by the method of Hastad, Näslund [HN98]. For groups of even order 2^{s}q we show that the j least-significant bits of \lfloor x/2^s\rfloor, as well as the j most-significant bits of \frac{x}{q} \in [0,1), are simultaneously secure iff the 2^{-j}-fractions of \exp_{\alpha'} are one-way for \alpha' := \alpha^{2^s}. We use and extend the models of generic algorithms of Nechaev (1994) and Shoup (1997). We determine the generic complexity of inverting fractions of \exp_{\alpha} for the case that \alpha has prime order q.
As a consequence, arbitrary segments of (1-\varepsilon)\lg q consecutive shift bits of random x are for constant \varepsilon > 0 simultaneously secure against generic attacks. Every generic algorithm using t generic steps (group operations) for distinguishing bit strings of j consecutive shift bits of x from random bit strings has at most advantage O((\lg q) j\sqrt{t} (2^j/q)^{\frac14}).
We modify the concept of LLL-reduction of lattice bases in the sense of Lenstra, Lenstra, Lovász [LLL82] towards a faster reduction algorithm. We organize LLL-reduction in segments of the basis. Our SLLL-bases approximate the successive minima of the lattice in nearly the same way as LLL-bases. For integer lattices of dimension n given by a basis of length 2^{O(n)}, SLLL-reduction runs in O(n^{5+epsilon}) bit operations for every epsilon > 0, compared to O(n^{7+epsilon}) for the original LLL and to O(n^{6+epsilon}) for the LLL-algorithms of Schnorr (1988) and Storjohann (1996). We present an even faster algorithm for SLLL-reduction via iterated subsegments running in O(n^3 log n) arithmetic steps.
We present a practical algorithm that, given an LLL-reduced lattice basis of dimension n, runs in time O(n^3 (k/6)^{k/4} + n^4) and approximates the length of the shortest non-zero lattice vector to within a factor (k/6)^{n/(2k)}. This result is based on reasonable heuristics. Compared to previous practical algorithms, the new method reduces the proven approximation factor achievable in a given time to less than its fourth root. We also present a sieve algorithm inspired by Ajtai, Kumar, Sivakumar [AKS01].
We consider Schwarz maps for triangles whose angles are rather general rational multiples of pi. Under which conditions can they have algebraic values at algebraic arguments? The answer is based mainly on considerations of complex multiplication of certain Prym varieties in Jacobians of hypergeometric curves. The paper can serve as an introduction to transcendence techniques for hypergeometric functions, but contains also new results and examples.
We calculate the kaon HBT radius parameters for high energy heavy ion collisions, assuming a first order phase transition from a thermalized Quark-Gluon-Plasma to a gas of hadrons. At high transverse momenta K_T ~ 1 GeV/c direct emission from the phase boundary becomes important, the emission duration signal, i.e., the R_out/R_side ratio, and its sensitivity to T_c (and thus to the latent heat of the phase transition) are enlarged. Moreover, the QGP+hadronic rescattering transport model calculations do not yield unusual large radii (R_i<9fm). Finite momentum resolution effects have a strong impact on the extracted HBT parameters (R_i and lambda) as well as on the ratio R_out/R_side.
We calculate the antibaryon-to-baryon ratios, anti-p/p, anti-Lambda/Lambda, anti-Xi/Xi, and anti-Omega/Omega for Au+Au collisions at RHIC (sqrt{s}_{NN}=200 GeV). The effects of strong color fields associated with an enhanced strangeness and diquark production probability and with an effective decrease of formation times are investigated. Antibaryon-to-baryon ratios increase with the color field strength. The ratios also increase with the strangeness content |S|. The net baryon number at midrapidity increases considerably with the color field strength while the net proton number remains roughly the same. This shows that the enhanced baryon transport involves a conversion into the hyperon sector (hyperonization) which can be observed in the (Lambda - anti-Lambda)/(p - anti-p) ratio.
Report-no: UFTP-492/1999 Journal-ref: Phys.Rev. C61 (2000) 024909 We investigate flow in semi-peripheral nuclear collisions at AGS and SPS energies within macroscopic as well as microscopic transport models. The hot and dense zone assumes the shape of an ellipsoid which is tilted by an angle Theta with respect to the beam axis. If matter is close to the softest point of the equation of state, this ellipsoid expands predominantly orthogonal to the direction given by Theta. This antiflow component is responsible for the previously predicted reduction of the directed transverse momentum around the softest point of the equation of state.
The wide-area deployment of WiFi hot spots challenges IP access providers. While providers seek new profit models, the profitability as well as the logistics of large-scale deployment of 802.11 wireless technology are still to be proven. Expenditure for hardware, locations, maintenance, connectivity, marketing, billing and customer care must be considered. Even for large carriers with existing infrastructure, the deployment of a large-scale WiFi infrastructure may be risky. This paper proposes a multi-level scheme for hot spot distribution and customer acquisition that reduces the financial risk, the cost of marketing and the cost of maintenance for the large-scale deployment of WiFi hot spots.
Central wage bargaining and local wage flexibility : evidence from the entire wage distribution
(1998)
We argue that in labor markets with central wage bargaining wage flexibility varies systematically across the wage distribution: local wage flexibility is more relevant for the upper part of the wage distribution, and flexibility of wages negotiated under central wage bargaining affects the lower part of the wage distribution. Using a random sample of German social-security accounts, we estimate wage flexibility across the wage distribution by means of quantile regressions. The results support our hypothesis, as employees with low wages have significantly lower local wage flexibility than high wage employees. This effect is particularly relevant for the lower educational groups. On the other hand, employees with low wages tend to have a higher wage flexibility with respect to national unemployment.
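The objective behind the quantile regressions used here is the pinball (check-function) loss. A minimal sketch of the intercept-only special case, with invented wage data, shows that minimising this loss over constants recovers the sample quantile:

```python
def pinball_loss(tau, residuals):
    """Check-function loss of quantile regression: positive residuals
    are weighted tau, negative residuals are weighted (tau - 1)."""
    return sum((tau if r >= 0 else tau - 1) * r for r in residuals)

def sample_quantile(tau, data):
    """The tau-quantile minimises the pinball loss over constants;
    the minimiser is always attained at a data point, so a search
    over the observations suffices."""
    return min(data, key=lambda c: pinball_loss(tau, [x - c for x in data]))

wages = [1200, 1500, 1700, 2100, 2600, 3400, 5200]
print(sample_quantile(0.5, wages))   # 2100 (the median)
print(sample_quantile(0.25, wages))  # 1500 (lower quartile)
```

Full quantile regression replaces the constant c with a linear predictor, so each tau traces out how covariates (here, e.g., central versus local bargaining factors) shift a different part of the conditional wage distribution.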
This paper shows that abnormal stock price returns around open market repurchase announcements are about four times higher in Germany than in the US (12% versus 3%). We hypothesize that this observation can be explained by country differences in repurchase regulation. Our empirical evidence indicates that German managers primarily buy back shares to signal an undervaluation of their firm. We demonstrate that the stringent repurchase process prescribed by German law attributes a higher credibility to such a signal than lax US regulations and thereby corroborate our hypothesis.