1 Institutional organization and political structure of the Federal Republic of Germany
2 The socio-economic situation
2.1 Income distribution and poverty
2.1.1 General income trends of the 1990s
2.1.2 Income inequality
2.1.3 Poverty and poverty rates
2.1.4 Distribution of poverty rates across specific groups of persons
2.1.5 The duration of poverty spells
2.2 Gainful employment and unemployment
2.2.1 Structures of employment
2.2.2 Structures of unemployment
2.3 Education and training
2.4 Housing situation
2.5 Health and poverty
3 Demographic situation
4 Strategies for combating social exclusion and poverty
4.1 Characterization of the German welfare state
4.2 Social-policy strategies for combating poverty
4.2.1 The federal government's report on poverty and wealth
4.2.2 The National Action Plan against social exclusion and poverty
4.2.3 Forum Teilhabe und soziale Integration (FORTEIL)
4.2.4 The Agenda 2010 as a partial implementation of the poverty report and of NAP-incl
Wider participation in stockholding is often presumed to reduce wealth inequality. We measure and decompose changes in US wealth inequality between 1989 and 2001, a period of considerable spread of equity culture. Inequality in equity wealth is found to be important for net wealth inequality, despite equity's limited share. Our findings show that reduced wealth inequality is not a necessary outcome of the spread of equity culture. We estimate contributions of stockholder characteristics to levels and inequality in equity holdings, and we distinguish changes in configuration of the stockholder pool from changes in the influence of given characteristics. Our estimates imply that both the 1989 and the 2001 stockholder pools would have produced higher equity holdings in 1998 than were actually observed for 1998 stockholders. This arises from differences both in optimal holdings and in financial attitudes and practices, suggesting a dilution effect of the boom followed by a cleansing effect of the downturn. Cumulative gains and losses in stockholding are shown to be significantly influenced by length of household investment horizon and portfolio breadth but, controlling for those, use of professional advice is either insignificant or counterproductive. JEL Classification: E21, G11
EU financial integration : is there a 'Core Europe'? ; evidence from a cluster-based approach
(2005)
Numerous recent studies, e.g. EU Commission (2004a), Baele et al. (2004), Adam et al. (2002), and the research pooled in ECB-CFS (2005) and Gaspar, Hartmann, and Sleijpen (2003), have documented progress in EU financial integration from a micro-level view. This paper contributes to this research by identifying groups of financially integrated countries from a holistic, macro-level view. It calculates cross-sectional dispersions, and innovates by applying an inter-temporal cluster analysis to eight euro area countries for the period 1995-2002. The indicators employed represent the money, government bond and credit markets. Our results show that euro countries were divided into two stable groups of financially more closely integrated countries in the pre-EMU period. Back then, geographic proximity and country size might have played a role. This situation has changed remarkably with the euro's introduction. EMU has led to a shake-up both in the number and composition of groups. The evidence puts a question mark behind using Germany as a benchmark in the post-EMU period. The findings suggest as well that financial integration takes place in waves. Stable periods and periods of intense transition alternate. Based on the notion of 'maximum similarity', the results suggest that there exist 'maximum similarity barriers'. It takes extraordinary events, such as EMU, to push the degree of financial integration beyond these barriers. The research encourages policymakers to move forward courageously in the post-FSAP era, and provides comfort that the substantial differences between the current and potentially new euro states can be overcome. The analysis could be extended to the new EU member countries, to the global level, and to additional indicators.
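The clustering idea in the abstract can be sketched in a few lines: countries whose indicator series move closely together are grouped, and the grouping can be recomputed period by period. The country labels, rate series, and the greedy single-linkage rule below are all illustrative stand-ins, not the paper's actual data or algorithm.

```python
# Illustrative sketch: grouping countries by similarity of a financial-market
# indicator over time, in the spirit of a cluster-based integration analysis.
# Country names and rate series are hypothetical.

def distance(a, b):
    """Euclidean distance between two equally long time series."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster(series, threshold):
    """Greedy single-linkage clustering: a country joins the first cluster
    containing a member within `threshold` of its series."""
    clusters = []
    for name, s in series.items():
        for c in clusters:
            if any(distance(s, series[m]) <= threshold for m in c):
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

# Hypothetical money-market rates, 1995-2002 yearly averages:
rates = {
    "DE": [4.5, 3.3, 3.2, 3.5, 2.9, 4.4, 4.3, 3.3],
    "FR": [6.0, 3.9, 3.4, 3.5, 2.9, 4.4, 4.3, 3.3],
    "IT": [10.4, 8.8, 6.9, 5.0, 2.9, 4.4, 4.3, 3.3],
    "NL": [4.4, 3.0, 3.3, 3.4, 2.9, 4.4, 4.3, 3.3],
}
print(cluster(rates, threshold=2.0))
```

Rerunning such a grouping on rolling windows would show the pre-EMU two-group structure dissolving after the euro's introduction, which is the kind of shake-up the paper documents.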
Innovation is a key factor in ensuring the competitiveness of establishments as well as in enhancing the growth and wealth of nations. But more than any other economic activity, decisions about innovations are plagued by failures of the market mechanism. As a response, public instruments have been implemented to stimulate private innovation activities. The effectiveness of these measures, however, is ambiguous and calls for an empirical evaluation. In this paper we make use of the IAB Establishment Panel and apply various microeconometric methods to estimate the effect of public measures on innovation activities of German establishments. We find that neglecting sample selection due to observable as well as to unobservable characteristics leads to an overestimation of the treatment effect and that there are considerable differences with regard to size class and between West and East German establishments.
The gender digital divide in francophone Africa: a worrying reality; Gender and ICT Network (Réseau genre et TIC)
(2005)
This document presents the main results of the research "Fracture numérique de genre en Afrique francophone : données et indicateurs" (the gender digital divide in francophone Africa: data and indicators), carried out in 2004-2005 by the Réseau genre et TIC (Gender and ICT Network) with a grant from the Centre de Recherches pour le Développement International (CRDI, Ottawa, Canada). The Réseau genre et TIC is a joint initiative of the international organization Environnement et Développement du Tiers Monde (ENDA), the Observatoire des Systèmes d'Information sur les Réseaux et Inforoutes du Sénégal (OSIRIS), and the Agence sénégalaise de Régulation des Télécommunications (ART). Composed of individuals and organizations active in promoting gender equality in the ICT sector, its mission, in consultation with all national actors and international partners, is to promote gender equality in the information society.
This article presents an overview of the contemporary German insurance market, its structure, players, and development trends. First, brief information about the history of the insurance industry in Germany is provided. Second, the contemporary market is analyzed in terms of its legal and economic structure, with statistics on the number of companies, insurance density and penetration, the role of insurers in the capital markets, premiums split, and main market players and their market shares. Furthermore, the three biggest insurance lines—life, health, and property and casualty—are considered in more detail, such as product range, country specifics, and insurance and investment results. A section on regulation outlines its implementation in the insurance sector, offering information on the underlying legislative basis, supervisory body, technical procedures, expected developments, and sources of more detailed information.
The law governing so-called equity-replacing shareholder loans (eigenkapitalersetzende Gesellschafterdarlehen) has increasingly come under criticism in the recent past. Based on a critical analysis of the lex lata, this paper develops a proposal for simplifying the rules on shareholder debt financing of a company in crisis.
Groundwater recharge is the major limiting factor for the sustainable use of groundwater. To support water management in a globalized world, it is necessary to estimate, in a spatially resolved way, global-scale groundwater recharge. In this report, improved model estimates of diffuse groundwater recharge at the global-scale, with a spatial resolution of 0.5° by 0.5°, are presented. They are based on calculations of the global hydrological model WGHM (WaterGAP Global Hydrology Model) which, for semi-arid and arid areas of the globe, was tuned against independent point estimates of diffuse groundwater recharge. This has led to a decrease of estimated groundwater recharge under semi-arid and arid conditions as compared to the model results before tuning, and the new estimates are more similar to country level data on groundwater recharge. Using the improved model, the impact of climate change on groundwater recharge was simulated, applying two greenhouse gas emissions scenarios as interpreted by two different climate models.
This paper provides global terrestrial surface balances of nitrogen (N) at a resolution of 0.5 by 0.5 degree for the years 1961, 1995 and 2050 as simulated by the model WaterGAP-N. The terms livestock N excretion (Nanm), synthetic N fertilizer (Nfert), atmospheric N deposition (Ndep) and biological N fixation (Nfix) are considered as input, while N export by plant uptake (Nexp) and ammonia volatilization (Nvol) are taken into account as output terms. The different terms in the balance are compared to results of other global models and uncertainties are described. Total global surface N surplus increased from 161 Tg N yr-1 in 1961 to 230 Tg N yr-1 in 1995. Using assumptions for the scenario A1B of the Special Report on Emission Scenarios (SRES) of the Intergovernmental Panel on Climate Change (IPCC) as quantified by the IMAGE model, total global surface N surplus is estimated to be 229 Tg N yr-1 in 2050. However, the implementation of these scenario assumptions leads to negative surface balances in many agricultural areas on the globe, which indicates that the assumptions about N fertilizer use and crop production changes are not consistent. Recommendations are made on how to change the assumptions about N fertilizer use to obtain a more consistent scenario, which would lead to higher N surpluses in 2050 as compared to 1995.
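The balance the abstract describes is a straightforward inputs-minus-outputs accounting per grid cell. A minimal sketch, with hypothetical per-hectare values (the paper reports global totals in Tg N yr-1, not these cell-level numbers):

```python
# Sketch of the surface nitrogen balance described above:
# surplus = inputs (manure, fertilizer, deposition, fixation)
#         - outputs (plant-uptake export, ammonia volatilization).
# Values in kg N per hectare per year; the numbers are hypothetical.

def n_surplus(nanm, nfert, ndep, nfix, nexp, nvol):
    """Surface N surplus for one grid cell. A negative result flags an
    inconsistent input/output combination, as discussed in the text."""
    return (nanm + nfert + ndep + nfix) - (nexp + nvol)

print(n_surplus(nanm=40, nfert=80, ndep=10, nfix=15, nexp=95, nvol=12))  # 38
```

A scenario in which assumed crop N export grows faster than assumed fertilizer inputs drives this quantity negative in many cells, which is exactly the inconsistency the paper flags for the A1B assumptions.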
Small and medium-sized firms typically obtain capital via bank financing. They often rely on a mixture of relationship and arm’s-length banking. This paper explores the reasons for the dominance of heterogeneous multiple banking systems. We show that the incidence of inefficient credit termination and subsequent firm liquidation is contingent on the borrower’s quality and on the relationship bank’s information precision. Generally, heterogeneous multiple banking leads to fewer inefficient credit decisions than monopoly relationship lending or homogeneous multiple banking, provided that the relationship bank’s fraction of total firm debt is not too large.
Previous empirical studies of job creation schemes in Germany have shown that the average effects for the participating individuals are negative. However, we find that this is not true for all strata of the population. Identifying individual characteristics that are responsible for the effect heterogeneity and using this information for a better allocation of individuals therefore bears some scope for improving programme efficiency. We present several stratification strategies and discuss the resulting effect heterogeneity. Our findings show that job creation schemes neither harm nor improve the labour market chances for most of the groups. Exceptions are long-term unemployed men in West Germany and long-term unemployed women in East and West Germany, who benefit from participation in terms of higher employment rates. JEL Classification: C13, J68, H43
Using a set of regional inflation rates we examine the dynamics of inflation dispersion within the U.S.A., Japan and across U.S. and Canadian regions. We find that inflation rate dispersion is significant throughout the sample period in all three samples. Based on methods applied in the empirical growth literature, we provide evidence in favor of significant mean reversion (β-convergence) in inflation rates in all considered samples. The evidence on σ-convergence is mixed, however. Observed declines in dispersion are usually associated with decreasing overall inflation levels, which indicates a positive relationship between mean inflation and overall inflation rate dispersion. Our findings for the within-distribution dynamics of regional inflation rates show that dynamics are largest for Japanese prefectures, followed by U.S. metropolitan areas. For the combined U.S.-Canadian sample, we find a pattern of within-distribution dynamics that is comparable to that found for regions within the European Monetary Union (EMU). In line with findings in the so-called 'border literature', these results suggest that frictions across European markets are at least as large as they are, e.g., across North American markets. JEL Classification: E31, E52, E58
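The β-convergence test mentioned above regresses the change in a region's inflation rate on its lagged level; a negative slope means high-inflation regions fall back toward the mean. A minimal sketch with hypothetical regional data (the actual studies use panel methods and many more observations):

```python
# Beta-convergence sketch: regress inflation changes on lagged levels.
# A negative OLS slope indicates mean reversion. Data are hypothetical.

def ols_slope(x, y):
    """Slope of the OLS regression of y on x (covariance over variance)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Lagged inflation and subsequent change for five hypothetical regions:
lagged = [1.0, 2.0, 3.0, 4.0, 5.0]
change = [0.8, 0.4, 0.0, -0.4, -0.8]  # high-inflation regions fall back
beta = ols_slope(lagged, change)
print(beta)  # -0.4: negative, i.e. mean reversion
```

σ-convergence, by contrast, asks whether the cross-sectional dispersion of the levels shrinks over time; the abstract's point is that the two need not move together.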
This paper is part of the conceptual preparation of a dissertation on innovation financing in small and medium-sized enterprises (Mittelstand). At the same time, it forms the basis of an exploratory survey on the innovation activity and financing problems of plastics-processing companies in the southern Westpfalz, which was conducted in the summer of 2004. Innovation-policy debates in Germany focus almost exclusively on so-called high-tech industries. Using indicators that measure staffing and investment in research and development departments, the frequency of cooperation between research institutions and companies, or patent applications, industries are assessed in terms of their innovativeness and whether they merit sustained support. Because alternative indicators of firms' innovation activity are lacking, large parts of the Mittelstand are ignored. Regions in which traditional industries are concentrated are classified as less important for the dynamic development of the economy. The R&D expenditure of the plastics-processing industry, for example, lies below the average for the manufacturing sector. It is a typical supplier industry. Through interactions with their own suppliers and customers, plastics-processing companies tap important innovation impulses. With its practical competencies, the industry generates added technological progress for a large number of upstream and downstream branches. The relationships between plastics processors and their customers are strongly project-based. Incremental innovations dominate. The structure of the industry is therefore in constant flux. Successful contract-manufacturing suppliers become system suppliers or bring their own products to market. The industry's supplier role additionally complicates the statistical recording of its innovation performance.
Nor can the actual innovation expenditure be derived straightforwardly; an adapted understanding of innovation is required. The aim of this paper is to develop an understanding of the innovation performance of a Mittelstand-dominated, low-tech industry embedded in value chains, and to work out starting points for a classification of innovation activities and expenditures. After a brief discussion of why a one-sided use of R&D-based indicators is inappropriate, the third chapter identifies and classifies aspects of industry-specific innovation processes in the plastics-processing industry. The development of product, material, and tooling concepts by plastics-processing companies is identified as the industry's central innovation activity. The final chapter discusses the risks and investment outlays to be expected for these development efforts and derives possible forms of financing.
In this paper, we examine the cost of insurance against model uncertainty for the Euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity-aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61.
How do markets spread risk when events are unknown or unknowable, and hence not anticipated in an insurance contract? While the policyholder can "hold up" the insurer for extra contractual payments, the continuing gains from trade on a single contract are often too small to yield useful coverage. By acting as a repository of the reputations of the parties, we show that brokers provide a coordinating mechanism to leverage the collective hold-up power of policyholders. This extends the degree of both implicit and explicit coverage. The role is reflected in the terms of broker engagement, specifically in the ownership by the broker of the renewal rights. Finally, we argue that brokers can be motivated to play this role when they receive commissions that are contingent on insurer profits. This last feature questions a recent, well publicized attack on broker compensation by New York attorney general Eliot Spitzer. JEL Classification: G22, G24, L14
The theory of intertemporal consumption choice makes sharp predictions about the evolution of the entire distribution of household consumption, not just about its conditional mean. In the paper, we study the empirical transition matrix of consumption using a panel drawn from the Bank of Italy Survey of Household Income and Wealth. We estimate the parameters that minimize the distance between the empirical and the theoretical transition matrix of the consumption distribution. The transition matrix generated by our estimates matches the empirical matrix remarkably well, both in the aggregate and in samples stratified by education. Our estimates strongly reject the consumption insurance model and suggest that households smooth income shocks to a lesser extent than implied by the permanent income hypothesis. JEL Classification: D52, D91, I30
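The estimation idea can be sketched as a minimum-distance problem: pick the model parameter whose implied transition matrix is closest to the empirical one. The two-state matrices and the one-parameter "model" below are hypothetical stand-ins, not the paper's actual specification.

```python
# Minimum-distance sketch: choose the parameter that makes a model-implied
# transition matrix closest (Frobenius norm) to an empirical one.
# Matrices and the toy one-parameter model are hypothetical.

def frobenius(a, b):
    """Frobenius distance between two equally sized matrices."""
    return sum((x - y) ** 2
               for ra, rb in zip(a, b)
               for x, y in zip(ra, rb)) ** 0.5

def model_matrix(p):
    """Toy two-state transition matrix with persistence p on the diagonal."""
    return [[p, 1 - p], [1 - p, p]]

empirical = [[0.8, 0.2], [0.3, 0.7]]

# Grid search over the persistence parameter:
best = min((frobenius(model_matrix(p / 100), empirical), p / 100)
           for p in range(101))
print(best[1])  # 0.75 minimizes the distance for this empirical matrix
```

In the paper the "model matrix" is generated by the intertemporal consumption model, and the distance is minimized over its structural parameters rather than a single persistence value.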
This paper examines intraday stock price effects and trading activity caused by ad hoc disclosures in Germany. The evidence suggests that the observed stock prices react within 90 minutes after the ad hoc disclosures. Trading volumes take even longer to adjust. We find no evidence for abnormal price reactions or abnormal trading volume before announcements. The bigger the company that announces an ad hoc disclosure, the less severe is the abnormal price effect following the announcement. The number of analysts is negatively correlated with the trading volume effect before the ad hoc disclosure. The higher the trading volume on the last trading day before the announcement, the greater is the price effect after the ad hoc disclosures and the greater the trading volume effect. Keywords: ad hoc disclosure rules, intraday stock price adjustments, market efficiency.
As part of DFG research project Scha 237/12-1 (supervisor: Prof. Dr. Eike W. Schamp), in cooperation with the University of Jordan in Amman, Jordan (supervisor: Prof. Dr. Nasim Barham), the role of German tour operators in the (global) value chain (Global Commodity Chain, GCC) of package tourism from Germany to Jordan is currently being studied, among other topics. This paper primarily reports first interim results from the ongoing empirical survey of small and medium-sized tour operators in Germany that design trips to Jordan within their programme portfolios and sell them to the end customer, the tourist. Its focus is on identifying mechanisms of cooperation between two different firms in a tourism GCC: the German tour operator and the Jordanian destination agency. Both act as central players in assembling a package tour, each bundling services from third-party providers and thus functioning as (differently powerful) "nodes" in the production process. This process takes place across large geographic distances.
Capital-market-oriented risk management in banks: market-value management instead of the Marktzinsmethode
(2005)
This paper questions the concept of the Marktzinsmethode (market interest rate method) as the basis for the dual management of credit and market price risks. A bank's credit risks imply credit-quality-induced market price risks and bank-specific refinancing costs. While the dual risk-management approach ignores the credit-quality-induced market price risks, it recognizes the bank-specific refinancing costs but does not allocate them internally according to their causes. The basic model of the Marktzinsmethode offers no way to remedy these problems. By contrast, these misdirected management incentives can be avoided from the outset by consistently marking all financial instruments to market. As an outlook, first considerations on implementing comprehensive market-value management in banks are developed, and a suitable valuation model is presented as an example.
Current thinking on African conflicts suffers from misinterpretations, oversimplification, lack of focus, lack of conceptual clarity, state-centrism and lack of vision. The paper analyses a variety of the dominant explanations of major international actors and donors, showing how these frequently do not distinguish with sufficient clarity between the 'root causes' of a conflict, its aggravating factors and its triggers. Specifically, a correct assessment of conflict-prolonging (or -sustaining) factors is of vital importance in Africa's lingering confrontations. Broader approaches (e.g. "structural stability") offer a better analytical framework than familiar one-dimensional explanations. Moreover, for explaining and dealing with violent conflicts a shift of attention from the nation-state towards the local and sub-regional level is needed.
While much of classical statistical analysis is based on Gaussian distributional assumptions, statistical modeling with the Laplace distribution has gained importance in many applied fields. This phenomenon is rooted in the fact that, like the Gaussian, the Laplace distribution has many attractive properties. This paper investigates two methods of combining them and their use in modeling and predicting financial risk. Based on 25 daily stock return series, the empirical results indicate that the new models offer a plausible description of the data. They are also shown to be competitive with, or superior to, use of the hyperbolic distribution, which has gained some popularity in asset-return modeling and, in fact, also nests the Gaussian and Laplace. JEL Classification: C16, C50. March 2005.
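The two densities being combined are both symmetric, but the Laplace is peaked at the center and heavier in the tails, which is why Gaussian-Laplace combinations appeal for asset returns. A sketch of the densities and one simple way to combine them (a two-component mixture; the paper's exact constructions may differ):

```python
import math

# Gaussian and Laplace densities, plus a simple two-component mixture.
# The mixture is one illustrative way to "combine" the two distributions;
# parameter values are arbitrary.

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def laplace_pdf(x, mu=0.0, b=1.0):
    return math.exp(-abs(x - mu) / b) / (2 * b)

def mixture_pdf(x, w=0.5):
    """Convex combination of the two densities (still a valid density)."""
    return w * gaussian_pdf(x) + (1 - w) * laplace_pdf(x)

# The Laplace puts markedly more mass in the tails:
print(laplace_pdf(4.0) > gaussian_pdf(4.0))  # True
```

Heavier tails matter for risk prediction because tail mass drives quantile-based measures such as Value-at-Risk.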
From a macroeconomic perspective, the short-term interest rate is a policy instrument under the direct control of the central bank. From a finance perspective, long rates are risk-adjusted averages of expected future short rates. Thus, as illustrated by much recent research, a joint macro-finance modeling strategy will provide the most comprehensive understanding of the term structure of interest rates. We discuss various questions that arise in this research, and we also present a new examination of the relationship between two prominent dynamic, latent factor models in this literature: the Nelson-Siegel and affine no-arbitrage term structure models. JEL Classification: G1, E4, E5.
In this paper, we propose a model of credit rating agencies using the global games framework to incorporate information and coordination problems. We introduce a refined utility function of a credit rating agency that, in addition to reputation maximization, also embeds aspects of competition and feedback effects of the rating on the rated firms. Apart from hinting at explanations for several hypotheses with regard to agencies' optimal rating assessments, our model suggests that the existence of rating agencies may decrease the incidence of multiple equilibria. If investors have discretionary power over the precision of their private information, we can prove that public rating announcements and private information collection are complements rather than substitutes in order to secure uniqueness of equilibrium. In this respect, rating agencies may spark off a virtuous circle that increases the efficiency of the market outcome.
The paper is a follow-up to an article published in Technique Financière et Developpement in 2000 (see the appendix to the hardcopy version), which portrayed the first results of a new strategy in the field of development finance implemented in South-East Europe. This strategy consists in creating microfinance banks as greenfield investments, that is, of building up new banks which specialise in providing credit and other financial services to micro and small enterprises, instead of transforming existing credit-granting NGOs into formal banks, which had been the dominant approach in the 1990s. The present paper shows that this strategy has, in the course of the last five years, led to the emergence of a network of microfinance banks operating in several parts of the world. After discussing why financial sector development is a crucial determinant of general social and economic development and contrasting the new strategy to former approaches in the area of development finance, the paper provides information about the shareholder composition and the investment portfolio of what is at present the world's largest and most successful network of microfinance banks. This network is a good example of a well-functioning "private public partnership". The paper then provides performance figures and discusses why the creation of such a network seems to be a particularly promising approach to the creation of financially self-sustaining financial institutions with a clear developmental objective.
This paper computes the optimal progressivity of the income tax code in a dynamic general equilibrium model with household heterogeneity in which uninsurable labor productivity risk gives rise to a nontrivial income and wealth distribution. A progressive tax system serves as a partial substitute for missing insurance markets and enhances an equal distribution of economic welfare. These beneficial effects of a progressive tax system have to be traded off against the efficiency loss arising from distorting endogenous labor supply and capital accumulation decisions. Using a utilitarian steady state social welfare criterion we find that the optimal US income tax is well approximated by a flat tax rate of 17.2% and a fixed deduction of about $9,400. The steady state welfare gains from a fundamental tax reform towards this tax system are equivalent to 1.7% higher consumption in each state of the world. An explicit computation of the transition path induced by a reform of the current towards the optimal tax system indicates that a majority of the population currently alive (roughly 62%) would experience welfare gains, suggesting that such fundamental income tax reform is not only desirable, but may also be politically feasible. JEL Classification: E62, H21, H24.
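The optimal schedule found above (a 17.2% flat rate beyond a fixed deduction of about $9,400) is simple enough to state directly; the code below just evaluates that schedule for an arbitrary income, using the abstract's two numbers.

```python
# Flat-rate-with-deduction tax schedule from the abstract:
# 17.2% of income beyond a fixed deduction of about $9,400.

def optimal_tax(income, rate=0.172, deduction=9400):
    """Tax owed under the flat schedule; income at or below the
    deduction pays nothing."""
    return rate * max(0.0, income - deduction)

print(optimal_tax(9400))    # 0.0
print(optimal_tax(50000))   # 6983.2  (17.2% of the 40,600 above the deduction)
```

Note that the deduction makes the *average* tax rate rise with income even though the marginal rate is flat, which is how this schedule retains some progressivity.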
The purpose of this dissertation is to defend the idea that the empirical responsibilities of binding theory can be handled in a more psychologically and historically realistic way when assigned to the field of pragmatics. In particular, I wish to show that Optimality Theory (OT) (Prince & Smolensky, 1993), the stochastic OT and Gradual Learning Algorithm of Boersma (1998), the Recoverability of OT of Wilson (2001) and Buchwald et al. (2002), and the bidirectional OT of Blutner (2000b) and Bidirectional Gradual Learning Algorithm of Jäger (2003a) can all participate in a formal framework in which one can formally spell out and justify the idea that the distributional behavior of bound pronouns and reflexives is a pragmatic phenomenon.
This paper studies an overlapping generations model with stochastic production and incomplete markets to assess whether the introduction of an unfunded social security system leads to a Pareto improvement. When returns to capital and wages are imperfectly correlated, a system that endows retired households with claims to labor income enhances the sharing of aggregate risk between generations. Our quantitative analysis shows that, abstracting from the capital crowding-out effect, the introduction of social security represents a Pareto improving reform, even when the economy is dynamically efficient. However, the severity of the crowding-out effect in general equilibrium tends to overturn these gains. JEL Classification: E62, H55, H31, D91, D58. April 2005.
Sharing of substructures like subterms and subcontexts in terms is a common method for space-efficient representation of terms, which allows, for example, exponentially large terms to be represented in polynomial space, or terms with iterated substructures to be represented in a compact form. We present singleton tree grammars as a general formalism for the treatment of sharing in terms. Singleton tree grammars (STG) are recursion-free context-free tree grammars without alternatives for non-terminals and at most unary second-order nonterminals. STGs generalize Plandowski's singleton context free grammars to terms (trees). We show that testing whether two different nonterminals in an STG generate the same term can be done in polynomial time, which implies that the equality test of terms with shared terms and contexts, where composition of contexts is permitted, can be done in polynomial time in the size of the representation. This will allow polynomial-time algorithms for terms exploiting sharing. We hope that this technique will lead to improved upper complexity bounds for variants of second order unification algorithms, in particular for variants of context unification and bounded second order unification.
What do academics have to offer market risk management practitioners in financial institutions? Current industry practice largely follows one of two extremely restrictive approaches: historical simulation or RiskMetrics. In contrast, we favor flexible methods based on recent developments in financial econometrics, which are likely to produce more accurate assessments of market risk. Clearly, the demands of real-world risk management in financial institutions - in particular, real-time risk tracking in very high-dimensional situations - impose strict limits on model complexity. Hence we stress parsimonious models that are easily estimated, and we discuss a variety of practical approaches for high-dimensional covariance matrix modeling, along with what we see as some of the pitfalls and problems in current practice. In so doing we hope to encourage further dialog between the academic and practitioner communities, hopefully stimulating the development of improved market risk management technologies that draw on the best of both worlds.
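Of the two industry approaches named above, RiskMetrics is the easier to sketch: it tracks the conditional covariance matrix as an exponentially weighted moving average of outer products of return vectors, Sigma_t = lam * Sigma_{t-1} + (1 - lam) * r_t r_t'. A dependency-free sketch for small dimension, with hypothetical returns and the commonly cited daily decay lam = 0.94:

```python
# EWMA (RiskMetrics-style) covariance update:
#   Sigma_t = lam * Sigma_{t-1} + (1 - lam) * r_t r_t'
# Pure-Python sketch for small dimension; returns are hypothetical.

def ewma_update(sigma, r, lam=0.94):
    """One-step update of the covariance matrix given today's returns r."""
    n = len(r)
    return [[lam * sigma[i][j] + (1 - lam) * r[i] * r[j]
             for j in range(n)] for i in range(n)]

sigma = [[0.0001, 0.0], [0.0, 0.0004]]  # yesterday's covariance (2 assets)
r = [0.01, -0.02]                        # today's return vector
sigma = ewma_update(sigma, r)
print(sigma[0][0])  # lam-weighted blend of old variance and r[0]**2
print(sigma[0][1])  # covariance term picks up today's negative co-movement
```

The article's point is that in realistic, very high-dimensional settings this single-decay recursion trades accuracy for tractability, which is where parsimonious econometric alternatives come in.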
Using a unique data set of regional inflation rates, we examine the extent and dynamics of inflation dispersion in major EMU countries before and after the introduction of the euro. For both periods, we find strong evidence in favor of mean reversion (β-convergence) in inflation rates. However, half-lives to convergence are considerable and seem to have increased after 1999. The results indicate that the convergence process is nonlinear in the sense that its speed declines the further convergence has proceeded. An examination of the dynamics of overall inflation dispersion (σ-convergence) shows a decline in dispersion in the first half of the 1990s. For the second half of the 1990s, no further decline can be observed; at the end of the sample period, dispersion has even increased. The large persistence of European inflation rates is confirmed when distribution dynamics methodology is applied. At the end of the paper we present evidence on the sustainability of the ECB's inflation target of an EMU-wide average inflation rate of less than, but close to, 2%. JEL Classification: E31, E52, E58
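The β-convergence idea, regressing the change in an inflation differential on its lagged level, can be sketched on synthetic data; all parameters here are illustrative assumptions, not the paper's EMU data.

```python
# Illustrative beta-convergence test on synthetic regional inflation
# gaps: regress the change of the gap on its lagged level. A negative
# slope beta indicates mean reversion; the implied half-life is
# ln(0.5)/ln(1 + beta). Data and persistence are invented.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_periods, rho = 12, 80, 0.9   # assumed AR(1) persistence

gap = np.zeros((n_regions, n_periods))
for t in range(1, n_periods):
    gap[:, t] = rho * gap[:, t - 1] + rng.normal(0, 0.2, n_regions)

y = (gap[:, 1:] - gap[:, :-1]).ravel()     # change in the gap
x = gap[:, :-1].ravel()                    # lagged gap
X = np.column_stack([np.ones_like(x), x])
alpha, beta = np.linalg.lstsq(X, y, rcond=None)[0]

half_life = np.log(0.5) / np.log(1.0 + beta)
print(beta < 0)          # mean reversion detected
print(round(half_life, 1))  # periods until half the gap is closed
```

With persistence rho close to one, the estimated beta is close to zero and the half-life is long, which is the pattern the paper reports for post-1999 EMU inflation rates.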
Attendance at the general meetings of the large German stock corporations has been declining for years. Average general meeting attendance at the 30 DAX-listed companies stood at only 45.87% in 2005, down from 60.95% in 1998. In connection with the initiatives of institutional investors at Deutsche Börse AG in the summer of this year, concern has grown that minorities could exploit the absence of more than half of the voting shareholders to exert increased influence on far-reaching corporate decisions at general meetings in order to obtain special advantages. Irrespective of this, it must in any case be borne in mind that, where general meeting attendance is low, a small minority of capital may take decisions without being monitored and, where necessary, corrected by those equally affected by these decisions. To counter the problem of declining general meeting attendance, various measures are being considered at both national and international level. In connection with simplifying the representation of shareholders at the general meeting, it has been proposed to provide by statute for a voting proxy independent of the company, in addition to the company-appointed voting proxy already provided for in § 134 III S. 3 AktG (so-called proxy voting). Further recommendations range from the increased use of new technical means of communication for casting votes (the "virtual general meeting") to the wider introduction of registered shares in order to facilitate communication with shareholders.
In response to the growing influence of international investors, who at present regularly do not attend the general meetings of European stock corporations for reasons of time and cost, because of language problems, overly short notice periods, or disruptions of the flow of information between them and the company, the European Commission is also working on a draft directive intended to promote cross-border voting. Related to this are the European Commission's efforts to achieve, in the medium term, greater disclosure of the investment and voting strategies of institutional investors. This is intended to ensure a more intensive participation of this shareholder group in the affairs of the company. Increased attendance and the disclosure of voting behaviour by investment funds, insurers and pension funds are also the subject of recommendations by national and international organisations. Already in the last legislative period, the Finanzmarktgesetzgebung working group of the Ministries of Justice and Finance considered the proposal of a so-called dividend bonus for those shareholders who exercise their voting rights at the general meeting. A lively discussion followed, in which voices in the literature as well as interest groups advocated the introduction of a financial incentive system for shareholders in the form of a dividend bonus. Spanish general meeting practice is regarded as the model for a corresponding rule in Germany: there, most large companies in the Spanish blue-chip index Ibex 35 offer a financial incentive for general meeting attendance. The payments range between two and ten cents per share, and interest in attending the general meeting has generally increased as a result.
With a bonus of two cents, the energy group Endesa nearly doubled attendance, from 37% to 66%. These successes in Spain can be attributed to the fact that the payment of the so-called prima de asistencia induced the local custodian banks, in the interest of the best possible representation of their clients, to seek the corresponding voting proxies even without an explicit request. The following sections attempt to set out how a financial incentive to strengthen general meeting attendance could sensibly be designed and which legislative steps are necessary to permit the payment of a bonus for attendance at the general meeting.
The effects of public policy programmes which aim at internalising spill-overs due to successful innovation are analysed in a sequential framework with double-sided moral hazard and double-sided adverse selection. The central focus lies on analysing their impact on contract design. We show that in our framework only ex post grants are a robust instrument for implementing the first-best situation, whereas the success of guarantee programmes, ex ante grants and some public-private partnerships depends strongly on the characteristics of the project: in certain cases they not only provide no further incentives but even destroy contract mechanisms and so worsen the outcome.
The paper considers optimal monetary stabilization policy in a forward-looking model, when the central bank recognizes that private-sector expectations need not be precisely model-consistent, and wishes to choose a policy that will be as good as possible in the case of any beliefs that are close enough to model-consistency. It is found that commitment continues to be important for optimal policy, that the optimal long-run inflation target is unaffected by the degree of potential distortion of beliefs, and that optimal policy is even more history-dependent than if rational expectations are assumed. JEL Classification: E52, E58, E42
Despite considerable restructuring and many innovations in recent years, the securities transaction industry in the European Union is still a highly inefficient and inconsistently configured system for cross-border transactions. This paper analyzes the functions performed, the institutions involved and the parameters that shape market and ownership structure in the industry. Of particular interest are microeconomic incentives of the main players that can conflict with social welfare. We develop a framework and analyze three consistent systems for the securities transaction industry in the EU that offer greater efficiency than the current, inefficient arrangement. Some policy advice is given on selecting the 'best' system for the Single European Financial Market.
This paper has shown that some of the principal arguments against shareholder voice are unfounded. It has shown that shareholders do own corporations, and that the nature of their property interest is structured to meet the needs of the relationships found in stock corporations. The paper has explained that fiduciary and other duties restrain the actions of shareholders just as they do those of management, and that critics cannot reasonably expect court-imposed fiduciary duties to extend beyond the actual powers of shareholders. It has also illustrated how, although corporate statutes give shareholders complete power to structure governance as they will, the default governance structures of U.S. corporations leave shareholders almost powerless to initiate any sort of action, and the interaction between state and federal law makes it almost impossible for shareholders to elect directors of their choice. Lastly, the paper has recalled how the percentage of U.S. corporate equities owned by institutional investors has increased dramatically in recent decades, and it has outlined some of the major developments in shareholder rights that followed this increase. I hope that this paper has deflated some of the strong rhetoric used against shareholder voice by contrasting rhetoric with law, and that it has illustrated why the picture of weak owners painted in the early 20th century should be updated to new circumstances. This will help avoid projecting an old description as a current normative model that perpetuates the inevitability of "managerialism", perhaps better known as "dirigisme".
This special issue of the ZAS Papers in Linguistics contains a collection of papers from the French-German Thematic Summerschool on "Cognitive and physical models of speech production, and speech perception and of their interaction".
Organized by Susanne Fuchs (ZAS Berlin), Jonathan Harrington (IPdS Kiel), Pascal Perrier (ICP Grenoble) and Bernd Pompino-Marschall (HUB and ZAS Berlin), and funded by the German-French University in Saarbrücken, this summerschool was held from September 19 to 24, 2004, on the Baltic Sea coast at the Heimvolkshochschule Lubmin (Germany), with 45 participants from Germany, France, Great Britain, Italy and Canada. The scientific program of the summerschool, reprinted at the end of this volume, included 11 keynote presentations by invited speakers, 21 oral presentations and a poster session (8 presentations). The names and addresses of all participants are also given in the back matter of this volume.
All participants were offered the opportunity to publish an extended version of their presentation in the ZAS Papers in Linguistics. All submitted papers underwent review and editing by external experts and the organizers of the summerschool. As is typical for a summerschool, the papers present works in progress, works at a more advanced stage, or tutorials. They are ordered alphabetically by first author's name, which fortunately means that this special issue opens with the paper that won the award for best pre-doctoral presentation: Sophie Dupont, Jérôme Aubin and Lucie Ménard with "A study of the McGurk effect in 4 and 5-year-old French Canadian children".
For as complete an analytical description as possible, statistical climatology treats observed climate time series as realizations of a stochastic process, that is, as a sequence of random variables. The time series is to be described essentially by an analytical function of time, with the observations deviating from this function only through random influences. This analytical function is composed of the sum of temporally structured components that lend themselves to climatological interpretation. Functions are admitted that describe the annual cycle, trends, episodic components, and their changes. Extreme events are included in the time series analysis as a separate additional component and are defined as extreme values independent of changes in the parameters of the distribution. The random influences are initially interpreted as realizations of independent, normally distributed random variables with expectation zero and time-constant variance. In this case, the analytical function of time, the sum of the detected structured components, describes the temporal evolution of the mean. A value actually observed at a given time can then be interpreted as one possible realization of a random variable following a Gaussian distribution with mean µ(t) at time t and constant variance. Since these underlying assumptions, using climatologically interpretable basis functions, are usually not satisfied in the analysis of climate time series other than temperature, a generalization of the concept of decomposing a time series into a deterministic and a stochastic part is introduced. Temporally structured changes are now sought in various distribution parameters of freely chosen probability density functions.
The usual restriction to estimating a time-varying location is lifted: estimators of scale as well as of the shape parameter play equally relevant roles in describing observed climate variability. The climate time series are again understood as realizations of a random process, but the random variables now follow a freely chosen probability density function. The temporally structured changes in the distribution parameters are estimated for each point in time on the basis of the entire series. The resulting analytical description, in the form of a time-dependent probability density function, further allows the estimation of exceedance and shortfall probabilities for arbitrarily chosen thresholds at every point of the observation period. In particular, this method permits a statistical modelling of monthly precipitation series through decomposition into a deterministic and a stochastic part. In the specific case of 132 series of monthly precipitation totals at German stations for 1901-2000, a complete analytical description of the series is achieved by interpreting them as realizations of a Gumbel-distributed random variable with variable location and scale parameters. On the basis of this analytical description, one can infer, for example, shifts of the annual exceedance maxima of the 95% percentile from the summer to the winter months in western Germany. These shifts are caused by relatively strong increases in the exceedance probability (up to 10%) in the winter months and only small increases, or even decreases, in the summer months. This is accompanied by an increase in the shortfall probability in the winter months and a decrease in the summer months.
Monte Carlo simulations show that seasonally differentiated least-squares estimates of changes in the expected value, i.e. conventional trends, exhibit systematic bias and high variance. Estimating trends in the mean on the basis of the statistical model is therefore also preferable to least-squares estimation. With respect to the precipitation analyses, however, arid regions with very rare precipitation in certain seasons mark the limits of the method, since at those times a trustworthy estimate of a probability density function is not possible. In such cases a fundamentally different approach to modelling the series is required.
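The idea of describing a series by a Gumbel distribution with time-varying parameters, and of reading off exceedance probabilities for a fixed threshold, can be sketched on synthetic data. This is a simplified method-of-moments fit in two sub-periods rather than the pointwise estimation described above, and all values are illustrative.

```python
# Sketch: fit Gumbel location/scale by moments in an "early" and a
# "late" sub-period of a synthetic precipitation-like series, then
# compare the exceedance probability of a fixed threshold. Parameter
# values and the threshold are illustrative assumptions.
import numpy as np

EULER = 0.5772156649015329  # Euler-Mascheroni constant

def fit_gumbel(x):
    # Method of moments: var = (pi^2/6) beta^2, mean = mu + EULER*beta
    beta = np.sqrt(6.0 * np.var(x)) / np.pi
    mu = np.mean(x) - EULER * beta
    return mu, beta

def exceed_prob(u, mu, beta):
    # P(X > u) under the Gumbel CDF exp(-exp(-(u - mu)/beta))
    return 1.0 - np.exp(-np.exp(-(u - mu) / beta))

rng = np.random.default_rng(1)
early = rng.gumbel(loc=50.0, scale=15.0, size=600)  # e.g. first half
late = rng.gumbel(loc=60.0, scale=15.0, size=600)   # e.g. second half

u = 95.0  # fixed threshold, e.g. mm of monthly precipitation
p_early = exceed_prob(u, *fit_gumbel(early))
p_late = exceed_prob(u, *fit_gumbel(late))
print(p_late > p_early)  # an upward location shift raises P(X > u)
```

A time-dependent version replaces the two sub-period fits with parameters estimated as smooth functions of time, which is what makes exceedance probabilities available at every point of the observation period.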
In this paper, we examine the cost of insurance against model uncertainty for the Euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with the highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61
We explore the macro/finance interface in the context of equity markets. In particular, using half a century of Livingston expected business conditions data we characterize directly the impact of expected business conditions on expected excess stock returns. Expected business conditions consistently affect expected excess returns in a statistically and economically significant counter-cyclical fashion: depressed expected business conditions are associated with high expected excess returns. Moreover, inclusion of expected business conditions in otherwise standard predictive return regressions substantially reduces the explanatory power of the conventional financial predictors, including the dividend yield, default premium, and term premium, while simultaneously increasing R2. Expected business conditions retain predictive power even after controlling for an important and recently introduced non-financial predictor, the generalized consumption/wealth ratio, which accords with the view that expected business conditions play a role in asset pricing different from and complementary to that of the consumption/wealth ratio. We argue that time-varying expected business conditions likely capture time-varying risk, while time-varying consumption/wealth may capture time-varying risk aversion. JEL Classification: G12
This paper starts out by pointing out the challenges and weaknesses which the German banking system faces according to the prevailing views among national and international observers. These challenges include a general problem of profitability and, possibly as its main cause, the strong role of public banks. These concerns raise the questions of whether the facts support this assessment of a general profitability problem and whether there are reasons to expect a fundamental or structural transformation of the German banking system. The paper contains four sections. The first presents the evidence concerning the profitability problem in a comparative, international perspective. The second section presents information about the so-called three-pillar system of German banking. What might be surprising in this context is that the group of public banks is not only the largest segment of the German banking system, but that the primary savings banks are also its financially most successful part. The German banking system is highly fragmented, which suggests discussing past, present and possible future consolidation in the banking system in the third section. The authors provide evidence that within-group consolidation has been proceeding at a rapid pace in the public and cooperative banking groups in recent years and that this development has not yet come to an end, while within-group consolidation among the large private banks, consolidation across group boundaries at the national level, and cross-border or international consolidation have so far happened only on a limited scale and do not appear to be gaining momentum for the near future. In the last section, the authors develop their explanation for the fact that large-scale and cross-border consolidation has so far not materialized to any great extent. Drawing on the concept of complementarity, they argue that it would be difficult to expect these kinds of mergers and acquisitions to happen within a financial system which is itself surprisingly stable or, as one can also call it, resistant to change.
This study investigates supralaryngeal mechanisms of the two-way voicing contrast among German velar stops and the three-way contrast among Korean velar stops, both in intervocalic position. Articulatory data obtained via electromagnetic articulography from three Korean speakers and acoustic recordings of three Korean and three German speakers are analysed. It was found that in both languages the voicing contrast is created by more than one mechanism. However, for Korean velar stops in intervocalic position, stop closure duration is the most important parameter, while for German it is closure voicing. The results support the phonological description proposed by Kohler (1984).
We provide insights into determinants of the rating level of 371 issuers which defaulted in the years 1999 to 2003, and into the leader-follower relationship between Moody’s and S&P. The evidence for the rating level suggests that Moody’s assigns lower ratings than S&P for all observed periods before the default event. Furthermore, we observe two-way Granger causality, which signifies information flow between the two rating agencies. Since lagged rating changes influence the magnitude of the agencies’ own rating changes, it would appear that the two rating agencies apply a policy of taking a severe downgrade through several mild downgrades. Further, our analysis of rating changes shows that issuers with headquarters in the US are downgraded less sharply than non-US issuers. For rating changes by Moody’s we also find that larger issuers seem to be downgraded less severely than smaller issuers.
The Basle securitisation framework explained: the regulatory treatment of asset securitisation
(2005)
The paper provides a comprehensive overview of the gradual evolution of the supervisory policy adopted by the Basle Committee for the regulatory treatment of asset securitisation. We carefully highlight the pathology of the new “securitisation framework” to facilitate a general understanding of what constitutes the current state of computing adequate capital requirements for securitised credit exposures. Although we incorporate a simplified sensitivity analysis of the varying levels of capital charges depending on the security design of asset securitisation transactions, we do not engage in a profound analysis of the benefits and drawbacks implied by the new securitisation framework. JEL Classification: E58, G21, G24, K23, L51. Forthcoming in Journal of Financial Regulation and Compliance, Vol. 13, No. 1.
In this paper we evaluate the employment effects of job creation schemes on the participating individuals in Germany. Job creation schemes are a major element of active labour market policy in Germany and are targeted at long-term unemployed and other hard-to-place individuals. Access to very informative administrative data of the Federal Employment Agency justifies the application of a matching estimator and allows us to account for individual (group-specific) and regional effect heterogeneity. We extend previous studies in four directions. First, we are able to evaluate the effects on regular (unsubsidised) employment. Second, we observe the outcomes of participants and non-participants for nearly three years after programme start and can therefore analyse mid- and long-term effects. Third, we test the sensitivity of the results with respect to various decisions which have to be made during implementation of the matching estimator, e.g. choosing the matching algorithm or estimating the propensity score. Finally, we check whether possible unobserved heterogeneity distorts our interpretation. The overall results are rather discouraging, since the employment effects are negative or insignificant for most of the analysed groups. One notable exception is long-term unemployed individuals, who benefit from participation. Hence, one policy implication is to target programmes at this problem group more tightly. JEL Classification: J68, H43, C13
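The matching idea underlying such evaluations can be sketched as a minimal nearest-neighbour propensity-score match on synthetic data; this is a simplification of the estimator discussed in the paper, and the data, the covariate model, and the effect size are all invented. For brevity the sketch reuses the known propensity index where a real study would fit a logit.

```python
# Sketch of nearest-neighbour propensity-score matching: pair each
# participant with the non-participant closest on the score and
# average the outcome differences. Everything here is synthetic.
import numpy as np

rng = np.random.default_rng(5)
n = 2000
x = rng.normal(size=n)                       # single covariate
p_true = 1 / (1 + np.exp(-0.8 * x))          # true propensity score
d = rng.uniform(size=n) < p_true             # treatment indicator
y = 1.0 * d + 0.5 * x + rng.normal(size=n)   # assumed effect on treated: 1.0

score = p_true  # stand-in for a fitted logit score

treated = np.where(d)[0]
controls = np.where(~d)[0]
# Nearest neighbour on the score, with replacement
dist = np.abs(score[treated][:, None] - score[controls][None, :])
nn = controls[dist.argmin(axis=1)]

att = np.mean(y[treated] - y[nn])  # average treatment effect on treated
print(round(att, 2))               # close to the assumed effect of 1.0
```

The sensitivity checks the paper describes correspond to varying exactly these implementation choices: the score model, the matching algorithm (with or without replacement, calipers, kernels), and the handling of ties.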
This paper introduces a method for solving numerical dynamic stochastic optimization problems that avoids rootfinding operations. The idea is applicable to many microeconomic and macroeconomic problems, including life cycle, buffer-stock, and stochastic growth problems. Software is provided. JEL Classification: C6, D9, E2. July 28, 2005.
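The rootfinding-free idea can be illustrated, under assumed CRRA preferences, for a two-period consumption problem: fixing end-of-period assets lets the Euler equation be inverted in closed form, yielding an endogenous grid of cash-on-hand values. This is a sketch in the spirit of such an endogenous-gridpoints step, not the paper's implementation, and all parameter values are assumptions.

```python
# Two-period CRRA problem: c_2 = m_2, Euler: u'(c_1) = beta*R*E[u'(c_2)].
# Instead of solving for c_1 at each fixed cash-on-hand m_1 with a
# numerical solver, fix end-of-period assets a, invert u'(c) = c^(-rho)
# in closed form, and recover the endogenous grid m_1 = a + c_1.
import numpy as np

rho, beta, R = 2.0, 0.96, 1.03          # illustrative preferences/return
y2 = np.array([0.7, 1.0, 1.3])          # second-period income states
prob = np.array([0.25, 0.5, 0.25])      # their probabilities

a_grid = np.linspace(0.1, 5.0, 50)      # exogenous end-of-period assets

c2 = R * a_grid[:, None] + y2[None, :]  # next-period consumption by state
emu = prob @ (c2.T ** (-rho))           # E[u'(c_2)] at each asset point

c1 = (beta * R * emu) ** (-1.0 / rho)   # closed-form inversion: no solver
m1 = a_grid + c1                        # endogenous cash-on-hand grid

# The implied policy is increasing and consumption stays below resources.
print(np.all(np.diff(m1) > 0), np.all(c1 < m1))
```

Interpolating consumption on the pairs (m1, c1) then gives the first-period policy function without a single rootfinding call.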
Vocational training programmes have been the most important active labour market policy instrument in Germany in recent years. However, the still unsatisfactory situation of the labour market has raised doubts about the efficiency of these programmes. In this paper, we analyse the effects of participation in vocational training programmes on the duration of unemployment in Eastern Germany. Based on administrative data of the Federal Employment Administration for the period between October 1999 and December 2002, we apply a bivariate mixed proportional hazards model. By doing so, we are able to use the information on the timing of treatment as well as observable and unobservable influences to identify the treatment effects. The results show that participation in vocational training prolongs the unemployment duration in Eastern Germany. Furthermore, the results suggest that locking-in effects are a serious problem of vocational training programmes. JEL Classification: J64, J24, I28, J68
This paper characterizes the optimal inflation buffer consistent with a zero lower bound on nominal interest rates in a New Keynesian sticky-price model. It is shown that a purely forward-looking version of the model that abstracts from inflation inertia would significantly underestimate the inflation buffer. If the central bank follows the prescriptions of a welfare-theoretic objective, a larger buffer appears optimal than would be the case employing a traditional loss function. Additionally taking into account potential downward nominal rigidities in the price-setting behavior of firms appears not to impose significant further distortions on the economy. JEL Classification: C63, E31, E52.
This study offers a historical review of the monetary policy reform of October 6, 1979, and discusses the influences behind it and its significance. We lay out the record from the start of 1979 through the spring of 1980, relying almost exclusively upon contemporaneous sources, including the recently released transcripts of Federal Open Market Committee (FOMC) meetings during 1979. We then present and discuss in detail the reasons for the FOMC's adoption of the reform and the communications challenge presented to the Committee during this period. Further, we examine whether the essential characteristics of the reform were consistent with monetarism, new, neo, or old-fashioned Keynesianism, nominal income targeting, and inflation targeting. The record suggests that the reform was adopted when the FOMC became convinced that its earlier gradualist strategy using finely tuned interest rate moves had proved inadequate for fighting inflation and reversing inflation expectations. The new plan had to break dramatically with established practice, allow for the possibility of substantial increases in short-term interest rates, yet be politically acceptable, and convince financial market participants that it would be effective. The new operating procedures were also adopted for the pragmatic reason that they would likely succeed. JEL Classification: E52, E58, E61, E65.
Using unobservable conditional variance as a measure, latent-variable approaches such as GARCH and stochastic-volatility models have traditionally dominated the empirical finance literature. In recent years, with the availability of high-frequency financial market data, modeling realized volatility has become a new and innovative research direction. By constructing "observable" or realized volatility series from intraday transaction data, the use of standard time series models, such as ARFIMA models, has become a promising strategy for modeling and predicting (daily) volatility. In this paper, we show that the residuals of the commonly used time-series models for realized volatility exhibit non-Gaussianity and volatility clustering. We propose extensions to explicitly account for these properties and assess their relevance when modeling and forecasting realized volatility. In an empirical application for S&P500 index futures we show that allowing for time-varying volatility of realized volatility leads to a substantial improvement of the model's fit as well as its predictive performance. Furthermore, the distributional assumption for the residuals plays a crucial role in density forecasting. JEL Classification: C22, C51, C52, C53
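The volatility clustering in the residuals can be illustrated on synthetic data: simulating a log-realized-volatility series with GARCH-type errors and fitting a plain homoskedastic AR(1) leaves autocorrelated squared residuals, the property the paper proposes to model explicitly. All parameters are assumptions, not estimates from the paper.

```python
# Simulate an AR(1) series with GARCH(1,1)-style errors (a stand-in for
# log realized volatility), fit a homoskedastic AR(1) by least squares,
# and check that the squared residuals remain autocorrelated.
import numpy as np

rng = np.random.default_rng(7)
n, phi = 4000, 0.6                      # illustrative sample size, AR root
omega, a1, b1 = 0.05, 0.15, 0.70        # illustrative GARCH(1,1) errors

y = np.zeros(n)
h = omega / (1 - a1 - b1)               # start at unconditional variance
e_prev = 0.0
for t in range(1, n):
    h = omega + a1 * e_prev**2 + b1 * h
    e_prev = np.sqrt(h) * rng.standard_normal()
    y[t] = phi * y[t - 1] + e_prev

# Homoskedastic AR(1) fit by ordinary least squares
X = np.column_stack([np.ones(n - 1), y[:-1]])
coef = np.linalg.lstsq(X, y[1:], rcond=None)[0]
resid = y[1:] - X @ coef

r2 = resid**2
acf1 = np.corrcoef(r2[1:], r2[:-1])[0, 1]
print(acf1 > 0.05)  # squared residuals cluster: variance is time-varying
```

A positive first-order autocorrelation of the squared residuals is exactly the symptom that motivates adding a GARCH layer (and a heavier-tailed error distribution) on top of the standard time-series model for realized volatility.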
Trusting the stock market
(2005)
We provide a new explanation to the limited stock market participation puzzle. In deciding whether to buy stocks, investors factor in the risk of being cheated. The perception of this risk is a function not only of the objective characteristics of the stock, but also of the subjective characteristics of the investor. Less trusting individuals are less likely to buy stock and, conditional on buying stock, they will buy less. The calibration of the model shows that this problem is sufficiently severe to account for the lack of participation of some of the richest investors in the United States as well as for differences in the rate of participation across countries. We also find evidence consistent with these propositions in Dutch and Italian micro data, as well as in cross-country data. JEL Classification: D1, D8
It is commonplace in the debate on Germany's labor market problems to argue that high unemployment and low wage dispersion are related. This paper analyses the relationship between unemployment and residual wage dispersion for individuals with comparable attributes. In the conventional neoclassical view, wages are determined by the marginal product of workers; accordingly, increases in union minimum wages result in a decline of residual wage dispersion and higher unemployment. A competing view regards wage dispersion as the outcome of search frictions and the associated monopsony power of firms; accordingly, an increase in search frictions causes both higher unemployment and higher wage dispersion. The empirical analysis attempts to discriminate between the two hypotheses for West Germany by analyzing the relationship between wage dispersion and both the level of unemployment and the transition rates between different labor market states. The findings are not completely consistent with either theory. However, as predicted by search theory, one robust result is that unemployment by cells is not negatively correlated with the within-cell wage dispersion.
This analysis investigates the employment effects of placement vouchers (Vermittlungsgutscheine) and personnel service agencies (Personal-Service-Agenturen) by means of a macroeconometric evaluation. Beyond a microeconometric evaluation, which examines effects at the individual level, a macroeconometric analysis can make statements about the economy-wide effects of the measures; structural multiplier effects in the macroeconomic circular flow are, however, not taken into account. The econometric model for analysing the two measures is based on a matching function that captures the search process of firms and of workers for an employment relationship. The empirical analyses are carried out separately for eastern and western Germany as well as for the strategy types of the Federal Employment Agency. They show that the issuing of placement vouchers has a significantly positive effect on the search process only in "urban districts, predominantly in western Germany, with high unemployment" (strategy type II). For the personnel service agencies, significantly positive effects are found for both eastern and western Germany. However, owing to the relatively small number of participants, a final assessment of the results for the personnel service agencies still requires a comparison with microeconometric analyses.
Taking into account the implementation analyses and causal analyses for the introductory phase of the placement vouchers, the authors conclude that the trial phase of this labour market policy instrument should be continued. The implementation analysis shows that the take-up of the instrument remains very low even after 27 months of the trial phase. Owing to limited data availability, the causal analyses refer only to two issuing months one year after the introduction of the instrument (May and June 2003) and show small positive employment effects at the micro level. Whether these justify the costs of the placement vouchers cannot yet be conclusively assessed. There are indications of deadweight effects and/or misuse, and successful placements via vouchers have reduced the employment chances of other groups. Various proposals for a more cost-efficient design of the placement vouchers are therefore presented and discussed, in particular the proposals of the Federal Cabinet of 1 September 2004.
Volatility forecasting
(2005)
Volatility has been one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. This chapter provides a selective survey of the most important theoretical developments and empirical insights to emerge from this burgeoning literature, with a distinct focus on forecasting applications. Volatility is inherently latent, and Section 1 begins with a brief intuitive account of various key volatility concepts. Section 2 then discusses a series of different economic situations in which volatility plays a crucial role, ranging from the use of volatility forecasts in portfolio allocation to density forecasting in risk management. Sections 3, 4 and 5 present a variety of alternative procedures for univariate volatility modeling and forecasting based on the GARCH, stochastic volatility and realized volatility paradigms, respectively. Section 6 extends the discussion to the multivariate problem of forecasting conditional covariances and correlations, and Section 7 discusses volatility forecast evaluation methods in both univariate and multivariate cases. Section 8 concludes briefly. JEL Classification: C10, C53, G1.
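The GARCH paradigm surveyed in Section 3 can be illustrated with a minimal GARCH(1,1) filter and one-step-ahead variance forecast. This is a sketch for intuition only; the parameter values are arbitrary assumptions, not estimates from the chapter, which in practice would be obtained by maximum likelihood.

```python
# Minimal GARCH(1,1) variance recursion:
#   sigma^2_{t+1} = omega + alpha * r_t^2 + beta * sigma^2_t
# Filters the conditional variance through a return series and returns the
# one-step-ahead forecast. Parameters are illustrative, not fitted.

def garch11_forecast(returns, omega=0.05, alpha=0.08, beta=0.90):
    # Initialise at the unconditional variance implied by the parameters
    # (requires alpha + beta < 1 for covariance stationarity).
    sigma2 = omega / (1.0 - alpha - beta)
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return sigma2

sample_returns = [0.1, -0.4, 0.3, 1.2, -0.8]
print(round(garch11_forecast(sample_returns), 4))
```

Multi-step forecasts follow by iterating the same recursion with squared returns replaced by their conditional expectation, which is how the term structure of volatility forecasts discussed in the chapter is built up.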
"Important financial centres" are narrowly delimited places with a considerable concentration of major professional activities from the financial-services sector and of the corresponding institutions. However, "finance is a footloose industry": the financial industry can migrate, a financial centre can relocate, possibly even simply dissolve. The possibility of dissolution and migration constitutes a threat that, in an era of globalisation and rapid advances in transport and in information and communication technology, is probably more pronounced than it has ever been. Frankfurt is undoubtedly an "important financial centre", and some also regard it as threatened. For that reason alone our topic is important; and although assessments of its importance and of the threat to it are by no means new, the topic remains timely. The aspect of threat shapes how we understand and wish to discuss the question posed in the title. What is an "important financial centre"? Even setting aside the attribute "important" for the moment, the question is by no means trivial. It does not aim merely at a clarification of terms, a linguistic convention; behind the term often stands a conception of the "essence" of what it denotes. So: what constitutes a financial centre? And further: why are there financial centres at all, as considerable concentrations of certain major activities and institutions? Which forces lead, or at least led, to the spatial concentration of these activities and institutions, how do these forces operate, and how, if at all, are they changing? This contribution is essentially devoted to these questions, and they shape its structure. Section II discusses what an "important financial centre" is, how one recognises it, and "what it needs".
In Section III we first address the notion, recently debated intensely under the heading "the end of geography", of a dissolution or virtualisation of financial centres – not because this is the more important threat, but because it is the more fundamental question. We then discuss the competition among financial centres in Europe. The paper closes with reflections on the prospects of Frankfurt as a financial centre and on how its development might be fostered.
This paper makes a case for the future development of European corporate law through regulatory competition rather than EC legislation. It is for the first time becoming legally possible for firms within the EU to select the national company law that they wish to govern their activities. A significant number of firms can be expected to exercise this freedom, and national legislatures can be expected to respond by seeking to make their company laws more attractive to firms. Whilst the UK is likely to be the single most successful jurisdiction in attracting firms, the presence of different models of corporate governance within Europe makes it quite possible that competition will result in specialisation rather than convergence, and that no Member State will come to dominate as Delaware has done in the US. Procedural safeguards in the legal framework will direct the selection towards laws which increase social welfare, as opposed simply to the welfare of those making the choice. Given that European legislators cannot be sure of the 'optimal' model for company law, the future of European company law-making would be better left with the Member States than take the form of harmonised legislation.
The anticipated early dissolution of the Bundestag will probably mean that the bill currently before Parliament on the disclosure of management-board remuneration will no longer be passed. This offers an opportunity to set out the basic conception underlying that draft and the draft introduced into the Bundestag by the FDP parliamentary group, and, with a view to later legislation, to examine whether the drafts meet their objectives (Section II). Beyond that, it is worth considering as a matter of principle whether a detailed statutory rule on the disclosure of management-board remuneration is advisable, or whether, as has frequently been proposed, the current legal position should remain, under which the statutory duty to provide aggregate remuneration figures pursuant to §§ 285 no. 9, 314 para. 1 no. 6 HGB is supplemented by the recommendation of the Deutscher Corporate Governance Kodex (no. 4.2.4 DCGK) that listed companies disclose management-board remuneration individually (Section III).