Refine
Year of publication
- 2012 (73)
Document Type
- Report (34)
- Working Paper (16)
- Book (7)
- Part of Periodical (7)
- Article (6)
- Doctoral Thesis (3)
Has Fulltext
- yes (73)
Is part of the Bibliography
- no (73)
Keywords
- forecasting (5)
- model uncertainty (5)
- DSGE models (4)
- monetary policy (4)
- Greenbook (3)
- Social Interaction (3)
- complexity (3)
- density forecasts (3)
- forecast combination (3)
- real-time data (3)
Institute
- Wirtschaftswissenschaften (73)
This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, in contrast to the existing literature, which applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth as a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant in itself, but it also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides that aim to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
We develop a dynamic network model with heterogeneous banks which undertake optimizing portfolio decisions subject to liquidity and capital constraints and trade in the interbank market, whose equilibrium is governed by a tatonnement process. Due to the micro-founded structure of the decision process as well as the iterative dynamic adjustment taking place in the market, the links in the network structures are endogenous and evolve dynamically. We use the model to assess the diffusion of systemic risk (measured as default probability), the contribution of each bank to it, as well as the evolution of the network in response to financial shocks and across different prudential policy regimes.
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks thereby fostering a comparative instead of insular approach to model development.
We introduce a new measure of systemic risk, the change in the conditional joint probability of default, which assesses the effects of the interdependence in the financial system on the general default risk of sovereign debtors. We apply our measure to examine the fragility of the European financial system during the ongoing sovereign debt crisis. Our analysis documents an increase in systemic risk contributions in the euro area during the post-Lehman global recession and especially after the beginning of the euro area sovereign debt crisis. We also find a considerable potential for cascade effects from small to large euro area sovereigns. When we investigate the effect of sovereign default on the European Union banking system, we find that bigger banks, banks with riskier activities, with poor asset quality, and funding and liquidity constraints tend to be more vulnerable to a sovereign default. Surprisingly, an increase in leverage does not seem to influence systemic vulnerability.
We outline a procedure for consistent estimation of marginal and joint default risk in the euro area financial system. We interpret the latter risk as the intrinsic financial system fragility and derive several systemic fragility indicators for euro area banks and sovereigns, based on CDS prices. Our analysis documents that although the fragility of the euro area banking system had started to deteriorate before Lehman Brothers' bankruptcy filing, investors did not expect the crisis to affect euro area sovereigns' solvency until September 2008. Since then, and especially after November 2009, joint sovereign default risk has outpaced the rise of systemic risk within the banking system.
We develop a dynamic network model with heterogeneous banks which undertake optimizing portfolio decisions subject to liquidity and capital constraints and trade in the interbank market, whose equilibrium is governed by a tatonnement process. Due to the micro-founded structure of the decision process as well as the iterative dynamic adjustment taking place in the market, the links in the network structures are endogenous and evolve dynamically. We use the model to assess the diffusion of systemic risk, the contribution of each bank to it, as well as the evolution of the network in response to financial shocks and across different prudential policy regimes.
This paper presents a theory that explains why it is beneficial for banks to engage in circular lending activities on the interbank market. Using a simple network structure, it shows that if there is a non-zero bailout probability, banks can significantly increase the expected repayment of uninsured creditors by entering into cyclical liabilities on the interbank market before investing in loan portfolios. Therefore, banks are better able to attract funds from uninsured creditors. Our results show that implicit government guarantees incentivize banks to have large interbank exposures, to be highly interconnected, and to invest in highly correlated, risky portfolios. This can serve as an explanation for the observed high interconnectedness between banks and their investment behavior in the run-up to the subprime mortgage crisis.
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated, and forecasts are derived, successively from historical U.S. data vintages synchronized with the Fed's Greenbook projections. Point forecasts of some models are similar in accuracy to the forecasts of nonstructural large-dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections for medium-term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact, focusing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
We examine both the degree and the structural stability of inflation persistence at different quantiles of the conditional inflation distribution. Previous research focused exclusively on persistence at the conditional mean of the inflation rate. Economic theory, however, provides various reasons (for example, downward wage rigidities or menu costs) to expect higher inflation persistence at the upper than at the lower tail of the conditional inflation distribution.
Based on post-war US data we indeed find slower mean reversion in response to positive than to negative shocks. We find robust evidence for a structural break in persistence at all quantiles of the inflation process in the early 1980s. Inflation persistence has decreased and become more homogeneous across quantiles. Persistence at the conditional mean became more informative about the degree of persistence across the entire conditional inflation distribution. While prior to the 1980s inflation was not mean reverting in response to large positive shocks, our evidence strongly suggests that since the end of the Volcker disinflation the unit root can be rejected at every quantile including the upper tail of the conditional inflation distribution.
This chapter aims to provide a hands-on approach to New Keynesian models and their uses for macroeconomic policy analysis. It starts by reviewing the origins of the New Keynesian approach, the key model ingredients and representative models. Building blocks of current-generation dynamic stochastic general equilibrium (DSGE) models are discussed in detail. These models address the famous Lucas critique by deriving behavioral equations systematically from the optimizing and forward-looking decision-making of households and firms subject to well-defined constraints. State-of-the-art methods for solving and estimating such models are reviewed and presented in examples. The chapter goes beyond the mere presentation of the most popular benchmark model by providing a framework for model comparison along with a database that includes a wide variety of macroeconomic models. Thus, it offers a convenient approach for comparing new models to available benchmarks and for investigating whether particular policy recommendations are robust to model uncertainty. Such robustness analysis is illustrated by evaluating the performance of simple monetary policy rules across a range of recently-estimated models including some with financial market imperfections and by reviewing recent comparative findings regarding the magnitude of government spending multipliers. The chapter concludes with a discussion of important objectives for on-going and future research using the New Keynesian framework.
This paper investigates the accuracy of forecasts from four DSGE models for inflation, output growth and the federal funds rate using a real-time dataset synchronized with the Fed's Greenbook projections. Conditioning the model forecasts on the Greenbook nowcasts leads to forecasts that are as accurate as the Greenbook projections for output growth and the federal funds rate. Only for inflation are the model forecasts dominated by the Greenbook projections. A comparison with forecasts from Bayesian VARs shows that the economic structure of the DSGE models, which is useful for the interpretation of forecasts, does not lower forecast accuracy. Combining forecasts of several DSGE models increases precision in comparison to individual model forecasts. Comparing density forecasts with the actual distribution of observations shows that DSGE models overestimate uncertainty around point forecasts.
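The equal-weighted forecast combination described in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's estimation setup: all forecast numbers below are hypothetical, and the "models" are placeholders for the four DSGE models.

```python
import numpy as np

# Hypothetical point forecasts of quarterly inflation (percent) from four
# DSGE models at two forecast horizons -- illustrative numbers only.
model_forecasts = np.array([
    [2.1, 2.3, 2.0, 2.4],  # horizon 1, models A-D
    [2.2, 2.5, 1.9, 2.6],  # horizon 2, models A-D
])

# Equal-weighted pooling: average across models (columns) at each horizon.
combined = model_forecasts.mean(axis=1)
```

More sophisticated schemes would replace the equal weights with, e.g., weights based on past predictive performance, but the simple average is the benchmark the abstract compares to the Greenbook projections.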
After initial temporary measures in support of Greece proved insufficient to end the sovereign debt crisis, extensive countermeasures have ensued. Over the course of the past two years, the heads of state of the euro group have agreed on permanent support mechanisms. In addition, the European Central Bank (ECB) has become involved in the assistance program. The article provides an overview of the various support mechanisms installed and cautions against the associated legal problems.
In its decision of December 13, 2011, the Constitutional Court of the state of North Rhine-Westphalia ruled that the constitution grants a State Court of Auditors broad powers to audit not only the immediate state administration but also entities outside it, insofar as they exercise financial responsibility for the state. This ruling may have serious implications for the capital guarantees extended by EU Member States to newly established institutions at the European level, such as the European Stability Mechanism (ESM).
The latest appointment to the ECB's Executive Board initiated a political dispute between the European Parliament and the Euro Group over the representation of women on the Executive Board and the Governing Council of the ECB. The dispute has raised awareness of the fact that a culture of equality and equal opportunity should be built from the ground up. A long-term plan that helps talented women emerge and prepares them to take on increasing responsibilities is necessary to ensure a growing pool of qualified female candidates.
The focus of this cumulative dissertation is the investigation of the management of public-private information technology (IT) partnerships. This chapter therefore first explains the relevance of research on public-private partnerships (PPPs) in the IT domain and then the central research questions under investigation. Subsequently, the structure of this thesis is briefly presented to show how the central research questions were addressed.
...
As described in the introduction, the central motivation for this cumulative dissertation was the lack of research on conceptualizing public and private organizational cultures and on analyzing their effects on the successful realization of IT PPPs. The case studies conducted therefore aimed to investigate how IT PPPs can be shaped successfully despite initial difficulties, and how an IT PPP can be established and maintained over time.
The first central research question accordingly focused on analyzing the differences between public and private organizational cultures. Article 1 answers this research question by conceptualizing public and private organizational cultures (consisting of diverging mindsets, knowledge bases, and organizational structures) and provides first insights into their effects on public-private collaboration and on the success of IT PPPs. Article 1 thereby contributes to the IT PPP research domain. In addition, Article 1 contributes to the theoretical domain of organizational cultural differences by revealing the influence of public and private norms and values on organization-specific behavior. In summary, Article 1 illustrates that establishing a sustainable IT PPP requires awareness and understanding of the differences between public and private organizational cultures in order to negotiate a mode of cooperation.
Articles 2, 3, and 4 extend the findings on the effects of differing organizational cultures on public-private collaboration in IT PPPs by analyzing the reasons for, and the approaches to, successfully shaping an IT PPP despite initial difficulties. The articles thereby answer the second central research question. Article 2 contributes to the research domain by examining the role of diverging understandings and expectations, whose effects are amplified by the PPP context, in the failure of IT PPP projects. Its theoretical contribution lies in identifying the causes of psychological contract violation at the individual level in IT PPPs. Building on these findings, Article 2 illustrates the importance of maintaining informal relationships for the realization of IT projects in the PPP context. Article 3 builds on the results of Article 2 and illustrates the organizational prerequisites and management practices (also shaped by the specific PPP context) for restoring a sustainable partnership in failing IT PPP projects. Article 3 thereby contributes to the IT PPP research domain. From a theoretical perspective, Article 3 extends the existing literature on boundary spanning by examining the necessary prerequisites and activities of boundary spanning at the organizational level for bridging the cultural gap in interorganizational collaborations. Based on these findings, Article 3 shows that turning around a negative trajectory of IT PPP projects requires establishing an unencumbered public-private relationship and continuously cultivating the partnership as well as the relationships with stakeholders. Article 4 likewise analyzes the reasons for, and approaches to, successful IT PPPs. Building on the conceptualization of public and private organizational cultures from Article 1, Article 4 extends that analysis by showing in detail how organization-specific behaviors are grounded and how they impede public-private collaboration. In addition, Article 4 extends the findings of the IT PPP research domain by illustrating how differences between organizational cultures can be balanced in order to establish a sustainable partnership and to make IT PPPs successful from an administrative, political, and business perspective. With respect to the theoretical domain of organizational cultural differences, Article 4 provides in-depth insights into the influence of the norms and routines of public and private logics on the characteristics of public and private organizational cultures, as well as a detailed account of the effects of cultural differences on organization-specific behavior. Finally, Article 4 explains the hierarchy of IT PPP success criteria from an administrative, political, and business perspective and illustrates that the success of IT PPPs depends on negotiating compromises on shared partnership goals and procedures with respect to the parties' differing interests.
Article 5 integrates the results of Articles 1 through 4 and examines the establishment and maintenance of an IT PPP over time, thereby answering the third central research question. By analyzing the partnership development process in IT PPP projects, which is exposed to the challenge of colliding public and private organizational cultures, Article 5 explains the three phases of IT PPP development and the events that trigger the transitions between these phases. In addition to these contributions to the research domain, Article 5 illustrates the interplay of competing institutional logics over time and documents the gradual replacement of competing logics by a new, dominant logic. In summary, Article 5 shows that establishing and maintaining a sustainable, bilateral partnership between public and private organizations requires negotiating a mode of cooperation by softening public and private institutional norms and principles and by bringing the parties' mindsets closer together.
Overall, this cumulative dissertation provides insights into the differences between public and private organizational cultures, into how these differences impede public-private collaboration in IT PPPs, into the management practices for establishing and maintaining a sustainable partnership over time, and into the successful design of IT PPPs. The thesis thereby creates a starting point for deeper research into IT PPP management, for analyzing the specific PPP context, and for improving the realization of IT PPP projects.
Financing asset growth
(2012)
We document the existence of a debt anomaly in addition to the asset growth anomaly: for a given asset growth rate, firms that issue more debt, as well as firms that retire more debt, have lower stock returns in the 12 months starting 6 months after the calendar year of asset growth. Exploring the reasons for debt issuance, we find that managers of firms for which analyst expectations are more over-optimistic, which suffer from declining investment profitability, and whose earnings-price ratios are relatively high are inclined to rely more heavily on debt financing. On the other hand, firms that retire more debt for a given asset growth rate tend to have improving profitability but to be over-priced. We also find that the financing decision is influenced by the prior debt ratio, the asset growth rate, profitability, and CEO pay sensitivity. We interpret our results in terms of managerial incentives, signaling, and market timing.
The idea of appointing a non-national as Central Bank Governor remains surprisingly controversial. Nevertheless, given the skills required by the Governor in order to manage what no doubt are increasingly complex institutions, considering non-nationals makes good sense for at least two reasons. First, increasing the pool of candidates to include those with broader skills and backgrounds makes it easier to find a suitable person for the job. Second, non-nationals are less likely to be beholden to domestic pressure groups and could help better insulate the central bank from political pressures.
Großer Beifall
(2012)
How can the common-currency project restore its credibility? Otmar Issing argues that a common currency without political union can only function under the no-bail-out principle. At the same time, he warns against pushing political union forward merely as a means of crisis management.
This comment suggests an amendment to the proposal for a directive of the European Parliament and of the Council establishing a framework for the recovery and resolution of credit institutions and investment firms. The current proposal focuses on bail-in but does not sufficiently take into account the pressure exerted on central bankers, supervisors and politicians by the fear of interbank contagion. The only way out of this hold-up situation is bail-in bonds: dedicated loss-absorbing debt instruments whose first-in-line status in the event of default is clearly communicated from day one.
In this paper, I propose a simple asset pricing model that accounts for the influence of social interaction. Investors are assumed to form their view of an asset's price based on a forecasting strategy and its past profitability, as well as on the contemporaneous expectations of other market participants. Empirically analysing stocks of the DAX30 index, I provide evidence that social interaction tends to destabilise financial markets; at the very least, it does not have a stabilising effect.
An analyst who works in Germany is more likely to publish a high (low) price target for a DAX30 stock if other Germany-based analysts are also optimistic (pessimistic) about the same stock. This finding is not biased by the fact that DAX30 companies are headquartered in Germany. In bull markets, the price targets of analysts who regularly exchange opinions are more highly correlated than those of other analysts. This effect vanishes in a bearish market environment, which suggests that communication among analysts indeed plays an important role; however, analysts' incentives induce them not to deviate too far from the overall average during an economic downturn.
This text summarizes a study on the customer value of investment advice written for the German Federal Ministry of Food, Agriculture and Consumer Protection (Bundesministerium für Ernährung, Landwirtschaft und Verbraucherschutz). It highlights the considerable potential of incentive-aligned investment advice and criticizes the currently low transparency of advisory performance in the market. It recommends introducing a standardized vocabulary for portfolio risk and ensuring that all investors have access to easily understandable and comparable information on historical portfolio risk and historical portfolio returns. The study focuses on securities advice and thus primarily on the subset of consumers who hold investable assets. However, its core ideas on performance transparency and a standardized risk vocabulary can also be transferred, for example, to the retirement provision market.
This note proposes a new concept for a European deposit insurance scheme that takes account of the strong political reservations against mutualizing liability for bank deposits. The sketched three-tier deposit insurance model continues existing national deposit insurance schemes, offers a European loss-sharing mechanism, and prevents excessive risk-taking at the expense of the international community.
Throughout his life, Goethe engaged with economic theories; in his literary work he sketched economic visions that the late nineteenth and early twentieth centuries overlooked. His positive vision of capitalism is imbued with a morality that restrains extreme forms of acquisitiveness and exploitation. It stands in contrast to Goethe's nightmare vision of a capitalism that appears modern to us, as conjured up in "Faust".
The exceptional circumstances in which the ECB has been operating in the past years are testing not only the currency union itself, but also its institutional design. While the Governing Council of the ECB was designed mainly to set interest rates optimally for the union as a whole, the recent crisis has expanded the tools of the ECB to include unconventional monetary policy actions that potentially increase the risk exposure of its balance sheet. Since each country would contribute to the losses according to its capital key, a different voting mechanism that takes into account each country's contribution to the ECB's capital could be advisable.
In the event of a Greek exit from the Eurozone, the stronger members of the monetary union, especially Germany, face at least two risks. First, the debt of the Greek National Bank vis-à-vis the Eurosystem of central banks will most likely be lost. Second, the large flow of capital from Greece and other periphery countries to Germany will accelerate inflation.
Option-implied information and predictability of extreme returns : [Version 24 September 2012]
(2012)
We study whether the option-implied conditional expectation of market loss due to tail events, or tail loss measure, contains information about future returns, especially negative ones. Our tail loss measure predicts future market returns, as well as the magnitude and probability of market crashes, over and above other option-implied variables. A stock-specific tail loss measure predicts individual expected returns and the magnitude of realized stock-specific crashes in the cross-section of stocks. An investor, especially one who cares about the left tail of her wealth distribution (e.g., a disappointment-averse investor), benefits from using the tail loss measure as an information variable to construct managed portfolios of a risk-free asset and the market index. The tail loss measure is motivated by results from extreme value theory, and it is computed from observed prices of out-of-the-money puts as the risk-neutral expected value of a loss beyond a given relative threshold.
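To illustrate the idea of extracting a tail loss number from out-of-the-money put prices, one can back out a discrete risk-neutral density via the Breeden-Litzenberger relation and compute the expected relative loss beyond a threshold. This is only a rough sketch: the paper's actual estimator is based on extreme value theory, and the strike grid and put prices below are hypothetical (discounting is ignored for simplicity).

```python
import numpy as np

# Hypothetical strike grid and out-of-the-money put prices.
strikes = np.array([70.0, 75.0, 80.0, 85.0, 90.0])
puts = np.array([0.10, 0.25, 0.55, 1.10, 2.05])
spot, dK = 100.0, 5.0

# Breeden-Litzenberger: the risk-neutral density is proportional to the
# second derivative of the put price with respect to the strike.
density = np.diff(puts, 2) / dK**2       # defined on the interior strikes
interior = strikes[1:-1]
rel_loss = (spot - interior) / spot      # relative loss if S_T equals K

# Risk-neutral expected relative loss, conditional on losing more than 15%.
tail = rel_loss > 0.15
tail_loss = (rel_loss[tail] * density[tail]).sum() / density[tail].sum()
```

With a realistic, finely spaced strike grid the same two lines at the end yield a conditional expected loss over the chosen threshold; the paper's extreme-value approach instead fits the tail of the distribution parametrically from a small number of put prices.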
Few industries have been shaped as profoundly by the advent of information technology over the past two decades as securities trading. Traditional business models have changed fundamentally. The classic trading floor has been almost entirely replaced by electronic trading platforms.
Numerous cuts to the public pension system have massively increased the importance of private retirement provision in recent years. Alongside real estate ownership, life insurance, and state-subsidized private pension programs, self-directed retirement saving through securities accounts has become established, with the number of private securities accounts rising from 8.0 to 27.9 million over the past 25 years. Against this background, the central question is how well investors invest their money.
In this paper, I analyse the reciprocal social influence on investment decisions within an international group of roughly 2,000 mutual fund managers who invested in companies of the DAX30. Using a robust estimation procedure, I provide empirical evidence that, on average, a fund manager puts 0.69% more portfolio weight on a particular stock if other fund managers increase the corresponding position by 1%. The dynamics of this influence on portfolio weights suggest that fund managers adjust their behaviour to the prevailing market situation and are more strongly influenced by others in times of an economic downturn. Analysing the working locations of the fund managers, I conclude that more than 90% of the magnitude of influence is due to pure observation. While this form of influence varies considerably over time, the magnitude of influence resulting from the exchange of opinions is more or less constant.
The German Corporate Governance Code (Deutscher Corporate Governance Kodex) is intended to make the German corporate governance system transparent and comprehensible. The Code presents statutory provisions on the management and supervision of German listed companies and contains internationally recognized standards of good and responsible corporate governance. This statement addresses the amendments proposed by the Government Commission on the German Corporate Governance Code.