This paper explores entrepreneurs’ initially intended exit strategies and compares them to their final exit paths, using an inductive approach that builds on grounded theory methodology. Our data show that initially intended and final exit strategies differ among entrepreneurs. Two groups emerged from our data: the first comprises entrepreneurs who financed their firms through equity investors; the second consists of entrepreneurs who financed their businesses solely with their own equity. The first group originally intended a financial harvest exit strategy and ultimately followed it. The second group initially intended a stewardship exit strategy but did not succeed. We use the theory of planned behavior and the behavioral agency model to analyze our data. By examining our results from these two theoretical perspectives, our study explains how entrepreneurs’ exit intentions lead to their actual exit strategies.
Within the research field of knowledge management, the subfield of knowledge transfer is of great importance, but it is also associated with many problems that must be identified and solved on the way to successful knowledge transfer. This thesis presents a framework that can be used to create a detailed overall picture of knowledge transfer within any organization. The framework maps the roles, objects, and actions of knowledge transfer and relates them to one another. Potential problems, identified through a literature review, are assigned to these constructs; such problems can impede the smooth flow of knowledge transfer within organizations. A set of guidelines developed for the framework describes how it can be used as the basis for a concrete investigation of the current state of knowledge transfer in organizations. Several practical cases show that the framework, applied with these guidelines, can be used to survey the current state of knowledge transfer in organizations and to uncover existing problems in it. The research procedure follows the principles of Design Science. The contribution to research resulting from the Design Science process is the framework itself, whose validity and relevance are demonstrated against several criteria.
Entwicklungsfinanzierung
(2000)
We analyze the implications of the structure of a network for asset prices in a general equilibrium model. Networks are represented via self- and mutually exciting jump processes, and the representative agent has Epstein-Zin preferences. Our approach provides a flexible and tractable unifying foundation for asset pricing in networks. The model endogenously generates results in accordance with, e.g., the robust-yet-fragile feature of financial networks shown in Acemoglu, Ozdaglar, and Tahbaz-Salehi (2014) and the positive centrality premium documented in Ahern (2013). We also show that models with simpler preference assumptions cannot generate all these findings simultaneously.
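The "self- and mutually exciting jump processes" mentioned here are Hawkes-type processes, in which each jump temporarily raises the probability of further jumps. As a rough illustration of the mechanism only (not the paper's multivariate network model; the parameter names mu, alpha, and beta are ours), a one-dimensional Hawkes process can be simulated with Ogata's thinning algorithm:

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Simulate a one-dimensional self-exciting (Hawkes) jump process with
    background intensity mu, excitation jump size alpha, and decay rate beta,
    using Ogata's thinning algorithm.  Returns the list of event times."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while t < horizon:
        # Intensity at the current time: background plus the exponentially
        # decaying excitation left behind by all past events.  Because the
        # intensity only decays until the next event, this value is an upper
        # bound for the intensity on the interval ahead.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        t += rng.expovariate(lam_bar)      # candidate next event time
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        if rng.random() <= lam_t / lam_bar:  # accept with prob lambda(t)/bound
            events.append(t)
    return events
```

With alpha/beta < 1 the process is stationary; clusters of jumps after an initial jump are the feature that lets such processes capture contagion-like spillovers in a network.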
Is wider access to stockholding opportunities related to reduced wealth inequality, given that it creates challenges for small and less sophisticated investors? Counterfactual analysis is used to study the influence of changes in the US stockholder pool and economic environment on the distribution of stock and net household wealth during a period of dramatic increase in stock market participation. We uncover substantial shifts in stockholder pool composition, favoring smaller holdings during the 1990s upswing but larger holdings around the burst of the Internet bubble. We find no evidence that widening access to stocks was associated with reduced net wealth inequality.
This study simulates three income tax scenarios in a Mirrleesian setting for 24 EU countries using data from the 2014 Structure of Earnings Survey. In scenario 1, each country individually maximizes its own welfare (benchmark). In scenarios 2 and 3, total welfare in the EU is maximized over a common budget constraint. Unlike scenario 2, the social planner of scenario 3 differentiates taxes by country of residence. If a common tax and transfer system were implemented in the EU, countries with a relatively higher mean wage rate—particularly those in Western and some of the Northern European countries—would transfer resources to the others. Scenario 2 implies increased labor distortions for almost all countries and, hence, leads to a contraction in total output. Scenario 3 produces higher (lower) marginal taxes for high- (low-) mean countries compared to the benchmark. The change in total output depends on the income effects on labor supply. Overall, total welfare is higher for the scenarios involving a European tax and transfer system, even though more than two-thirds of all agents become worse off relative to the benchmark. A politically more feasible integrated tax system improves the well-being of almost half of all agents in the EU but considerably reduces the aggregate welfare benefits.
"In this lecture I would like to relate Gutenberg's theory of the firm, laid out in his habilitation thesis and developed in the 'Grundlagen der Betriebswirtschaftslehre', to current developments in the theory of the firm. Although the occasion for this lecture sufficiently justifies the topic, the question arises whether my undertaking is a scientifically meaningful one: can Gutenberg's theory of the firm still be current?"
Erkenntnis voraus
(2016)
"Here, for the first time at a university, the commercial sciences become a complementary part of social science, political science, and economics," rejoiced the initiator Wilhelm Merton. As a co-founder of Metallgesellschaft, the entrepreneur advocated strengthening modern economic society in matters of education and teaching.
Goethe University still benefits today from civic engagement and the foundation culture in Frankfurt: 100 years after its founding by Frankfurt citizens, it is one of the largest universities in Germany and among the strongest in third-party funding; since 2008 it has again been a foundation university and thus enjoys special autonomy. ...
In light of the stimulus package recently adopted by the German federal government, the authors of the Policy Letter ask whether, and to what extent, the announced VAT cut and the child bonus will substantially boost domestic consumption. From the data on income changes and on income and job-loss expectations collected for the Household Crisis Barometer (Haushaltskrisenbarometer), the economists cannot infer any expected weakening of domestic demand. The majority of the German resident population does not appear to expect short-run financial losses from the pandemic. Expectations regarding future income development have even gradually improved over the last four survey waves. Furthermore, the authors show that, at present, neither the propensity to consume nor the propensity to save is strongly affected by the corona crisis in the long run. Only 10 percent of respondents currently report having completely cancelled major purchases because of the pandemic; at the beginning of April 2020, this figure was still 16 percent. Respondents reported in 71 percent of cases that they had not changed their consumption plans, and in 78 percent of cases that they had not changed their saving behavior. In light of these results, measures aimed at an unspecific stimulation of domestic demand cannot be substantially justified.
The object of this study is the empirical content of the economic theory of corporate hedging. In the United States, hedging theory has been taken up in a series of empirical studies. The findings are mostly consistent with the explanation of Froot/Scharfstein/Stein (1993), according to which a reduction in cash-flow volatility, under the assumption of increasing costs of external financing, reduces the costs of underinvestment. Remarkably, however, this approach has only little explanatory power for German firms. The differences in results can be traced back to different capital market conditions: the assumed increasing costs of external financing are comparatively less relevant for German firms, owing to the dominance of the rights-issue procedure and the role of the house bank (Hausbank) as a mechanism for overcoming information problems. Managerial interests prove to be an essential hedging determinant for German firms: in line with hedging theory, there is a significant positive relation between the amount of tied-up managerial wealth and the probability of hedging. Contrary to the American findings, however, a disciplining effect of large shareholders on the hedging decision cannot be observed. To account for the specific German capital market conditions, the influence of bank shareholdings and of family firms on the hedging decision is examined. No bank influence on the decision to use derivatives can be detected. Interestingly, and contrary to diversification and capital market considerations, family firms show a significantly lower probability of hedging.
Through vocational training, trainees are expected to acquire, among other things, the competence to solve occupational problems. Final examinations serve to assess this competence, but written commercial examination tasks still inadequately represent problem situations whose solution requires problem-solving competence. Teachers at commercial vocational schools are also involved in creating examination tasks. This thesis investigates how they are prepared, in the first and second phases of teacher education, for creating problem-based tasks for summative-diagnostic purposes. To this end, document analyses of both phases of teacher education are carried out. The results are validated through a questionnaire study with degree-program directors and through interviews with subject leaders of the teacher training seminars. To capture the perceptions of prospective teachers, interviews are conducted with master's students of business education and with trainee teachers (Lehrkräfte im Vorbereitungsdienst, LiV) at commercial vocational schools.
The preliminary studies identify needs for improvement in the education of teachers at commercial vocational schools. On this basis, a training concept is selected with justification and evaluated in a quasi-experimental study with master's students and trainee teachers. For the qualitative evaluation, interviews are conducted with participants of the intervention group. The results show that participants perceive the training predominantly positively and that it leads to a learning gain, at least with respect to creating problem-based tasks. Based on the needs-oriented intervention and its evaluation results, a concept is proposed that offers a solution for meeting the identified needs for improvement. The results of this thesis have the potential to contribute, in the long run, to improving teacher education and thus, among other things, to making assessment tasks more valid.
Revised version of the working paper "The dynamics of labour market participation, unemployment and non-participation in Great Britain, Sweden and Germany" / Wolfgang Strengmann-Kuhn. [Johann-Wolfgang-Goethe-Universität Frankfurt am Main, Fachbereich Wirtschaftswissenschaften, Institut für Volkswirtschaftslehre]
Inflation has lost considerable popularity worldwide in recent years. While moderate inflation rates of 5 to 10 percent were still considered conducive to growth and employment in the 1960s and 1970s, it is now almost undisputed in politics and academia that inflation above all imposes economic costs, and that price stability must therefore be the primary goal of modern monetary policy. In particular, the Frankfurt-based European Central Bank (ECB) sees its main task as keeping the annual inflation rate in the euro area below 2 percent. If the inflation rate climbs only a few decimal points above this target, interest rate hikes and a restrictive monetary policy by the central bank are to be expected. Such a monetary policy is justified if even low inflation rates have measurable real economic effects. A study by the Chair of Empirical Macroeconomics therefore examines the influence of inflation on the variability of relative prices.
The euro crisis was fueled by the diabolic loop between sovereign risk and bank risk, coupled with cross-border flight-to-safety capital flows. European Safe Bonds (ESBies), a union-wide safe asset without joint liability, would help to resolve these problems. We make three contributions. First, numerical simulations show that ESBies would be at least as safe as German bunds and approximately double the supply of euro safe assets when protected by a 30%-thick junior tranche. Second, a model shows how, when and why the two features of ESBies — diversification and seniority — can weaken the diabolic loop and its diffusion across countries. Third, we propose a step-by-step guide on how to create ESBies, starting with limited issuance by public or private-sector entities.
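The tranching arithmetic behind the first contribution can be illustrated with a toy calculation. The sketch below is not the paper's numerical simulation; it only shows how a 30%-thick junior (first-loss) tranche shields the senior tranche, the ESBies, from losses on the underlying bond pool:

```python
def tranche_losses(pool_loss, junior_thickness=0.30):
    """Split a fractional loss on a diversified sovereign bond pool between a
    junior tranche (first-loss) and a senior tranche (the ESBies).
    pool_loss and junior_thickness are fractions of the pool's face value;
    the returned losses are expressed relative to each tranche's own size."""
    junior_loss = min(pool_loss, junior_thickness)       # junior absorbs first
    senior_loss = max(pool_loss - junior_thickness, 0.0)  # overflow hits senior
    return (junior_loss / junior_thickness,
            senior_loss / (1.0 - junior_thickness))
```

For example, a pool loss of 35% of face value wipes out the junior tranche entirely, while ESBies holders lose only about 7% of their tranche (0.05 / 0.70); any pool loss below 30% leaves the senior tranche untouched.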
In this statement the European Shadow Financial Regulatory Committee (ESFRC) advocates a conditional relief of Greece's government debt, based on Greece meeting certain targets for structural economic reforms in areas such as its labor market and pensions sector. The authors argue that the position of the European institutions that debt relief for Greece cannot be part of an agreement is based on the illusion that Greece will be able to service its sovereign debt and reduce its debt overhang after implementing a set of fiscal and structural reforms. However, the Greek economy would need to grow at an unrealistic rate to achieve debt sustainability solely on the basis of reforms. The authors therefore view substantial debt relief as inevitable and argue that three questions must be resolved urgently in order to structure debt relief adequately: first, which groups must accept the losses associated with debt relief; second, how much debt relief should be offered; and third, under what conditions relief should be offered.
We examine whether the uncertainty related to environmental, social, and governance (ESG) regulation developments is reflected in asset prices. We proxy the sensitivity of firms to ESG regulation uncertainty by the disparity across the components of their ESG ratings. Firms with high ESG disparity have a higher option-implied cost of protection against downside tail risk. The impact of the misalignment across the different dimensions of the ESG score is distinct from that of ESG score level itself. Aggregate downside risk bears a negative price for firms with low ESG disparity.
Essays in behavioral economics - evidence on self-selection into jobs, social networks and leniency
(2013)
The dissertation entitled "Essays in Behavioral Economics – Evidence on Self-Selection into Jobs, Social Networks and Leniency" consists of a collection of four scientific essays. All are united by the analysis of theoretical concepts and insights from behavioral economics using the experimental method. The first essay, entitled "Sorting of Motivated Agents - Empirical Evidence on Self-Selection into the German Police", examines the self-selection of certain individuals into the police profession. The experimental study investigates whether police applicants self-select into the profession based on their preferences for norm enforcement. The second essay builds on these findings and likewise examines police cadets during their vocational training with respect to their willingness to enforce norms; it is entitled "Selection and formation of motivated agents -- empirical evidence from the German Police". The third essay examines gender-specific differences in the choice of partners and the formation of social networks. It is entitled "Selectivity and opportunism: two dimensions of gender differences in trust games and network formation" and was written jointly with Guido Friebel, Marie Lalanne, Paul Seabright and Peter Schwardmann. The fourth essay pursues a current question in industrial organization and is entitled "Antitrust, auditing and leniency programs: evidence from the laboratory", written with Mehdi Feizi and Ali Mazyaki. Taken together, my dissertation provides answers to questions in personnel economics, social economics, and industrial organization.
This dissertation consists of three chapters. The first two chapters investigate the real effects of inflation; the third examines the role of child care for fertility and female labor supply. Chapter 1 introduces a generalized panel threshold model to analyze the relation between inflation and economic growth for a sample of developing countries. It is demonstrated that allowing for regime intercepts can be crucial for obtaining unbiased estimates of both inflation thresholds and their marginal effects on growth in the various regimes. The empirical results confirm that the omitted-variable bias of standard panel threshold models can be statistically and economically significant. Chapter 2, which is joint work with Dieter Nautz, investigates the impact of inflation on relative price variability (RPV) as a further important channel of the real effects of inflation. With a view to the recent debate on the Fed's implicit lower and upper bounds of its inflation objective, the econometric model introduced in Chapter 1 is used to explore the inflation-RPV linkage in U.S. cities. Chapter 3 investigates the relationship between fertility, female labor supply, and child care in the context of a life-cycle model for Germany. A particular emphasis is placed on the differences between West and East Germany. Counterfactual policy experiments mimicking recent policy reforms on maternal leave and the provision of subsidized child care are conducted with a structurally estimated version of the model.
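The regime idea behind a threshold model can be sketched in a stylized, cross-sectional form: below and above a threshold value of inflation, growth is allowed its own intercept and slope. This is only an illustration of separate regime intercepts, not the dissertation's panel estimator, and the threshold tau is taken as given rather than estimated:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept, slope

def threshold_regression(inflation, growth, tau):
    """Fit growth on inflation separately below and above the threshold tau,
    giving each regime its own intercept and slope."""
    low = [(x, y) for x, y in zip(inflation, growth) if x <= tau]
    high = [(x, y) for x, y in zip(inflation, growth) if x > tau]
    return fit_line(*zip(*low)), fit_line(*zip(*high))
```

Forcing a common intercept across regimes (as standard threshold specifications may do) would distort both the estimated slopes and the apparent threshold effect, which is the omitted-variable bias the chapter highlights.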
CHAPTER A: THE INVESTMENT BEHAVIOR OF PRIVATE EQUITY FUND MANAGERS
I The Bright and Dark Side of Staging: Investment Performance and the Varying Motivations of Private Equity Firms
II The Liquidation Dilemma of Money Losing Investments – The Impact of Investment Experience and Window Dressing of Private Equity and Venture Capital Funds
CHAPTER B: THE ASSESSMENT OF RISK AND RETURN OF PRIVATE EQUITY
I Venture Capital Performance Projection: A Simulation Approach
II Modeling Default Risk of Private Equity Funds – A Market-based Framework
This thesis consists of four chapters. Each chapter covers a topic in international macroeconomics and monetary policy. The first chapter investigates the impact of unexpected monetary policy shocks on exchange rates in a multi-country econometric model. The second chapter examines the linkage between macroeconomic fundamentals and exchange rates through the monetary policy expectation channel. The third chapter focuses on the international transmission of bank and corporate distress. The last chapter unfolds the interest rate channel of monetary policy transmission in an emerging economy, China, where regulations and market forces coexist in this transmission.
This thesis studies the behavior of banks in the financial markets they frequently use to obtain short-term as well as long-term financing. In the first chapter we incorporate an interbank market for collateralized lending among banks into a dynamic stochastic general equilibrium (DSGE) framework to analyze the impact of variations in the expected value of the collateral on the interbank lending volume. We find that a central bank that decides to lower the haircut on eligible collateral in repurchase agreements is able to stimulate interbank markets. In the second chapter, a microeconomic model of bank behavior on the interbank market is set up to analyze how the risk-taking of interbank borrowers, and uncertainty about their balance sheet quality, affect the behavior of interbank lenders. We find that disruptions on the interbank market result from optimal behavior on the part of lending banks in response to uncertainty about the balance sheet quality of a borrowing bank. In the third chapter we regress monthly data on German bank bond spreads on bank-specific risk factors to assess the degree of market discipline in the German bank bond market. The regression results for the whole market indicate that the bond spread does not show signs of market discipline. However, a structural break analysis uncovers that, since the beginning of the financial crisis, the German bank bond market has exhibited at least a weak form of market discipline for bonds issued by medium-sized and large banks.
This dissertation introduces in chapter 1 a new comparative approach to model-based research and policy analysis by constructing an archive of business cycle models. It includes many well-known models used in academia and at policy institutions. A computational platform is created that allows straightforward comparisons of models’ implications for monetary and fiscal stabilization policies. Chapter 2 applies business cycle models to forecasting. Several New Keynesian models are estimated on historical U.S. data vintages and forecasts are computed for the five most recent recessions. The extent of forecast heterogeneity for models and professional forecasts is analysed. Chapter 3 extends the forecasting analysis to a long sample and to the evaluation of density forecasts. Weighted forecasts are computed using a variety of weighting schemes. The accuracy of forecasts is evaluated and compared to professional forecasts and forecasts from nonstructural time series methods. Chapter 4 adds a new feature to existing business cycle models. Specifically, a medium-scale New Keynesian model is constructed that allows for strategic complementarities in price-setting. The role of trade integration for monetary policy transmission is explored. A new dimension of the exchange rate channel is highlighted by which monetary policy directly impacts domestic inflation. Chapter 5 tests whether simple symmetric monetary policy rules used in most business cycle models are a sufficient description of reality. I use quantile regressions to estimate policy parameters and find asymmetric reactions to inflation, the output gap and past interest rates.
The dissertation consists of three thematically related research papers that consider continuous-time consumption, investment, and insurance problems over the life cycle. A particular focus lies on realistic features such as stochastic mortality risk and non-replicable income. In the first paper, I examine the relevance of stochastic mortality risk and show that a jump component in the mortality rate significantly affects agents' optimal decisions and welfare, whereas a diffusion component is negligible. The second paper examines the term life insurance demand of a family whose sole earner is exposed to stochastic mortality risk. We pay particular attention to a realistic modeling of the insurance contract and show that, as a result, young agents stay out of the insurance market and insurance demand increases with age, in contrast to models with simple continuously adjustable insurance. Furthermore, long-term insurance contracts amplify the negative effects of income shocks and are therefore purchased less by risk-averse agents. In the third paper, I examine the critical illness insurance demand of an agent in a model with stochastic mortality risk and health expenditures. The insurance covers the additional health costs that arise at a jump. Almost all agents purchase such insurance before retirement age, even when it is very costly. Agents with low health expenditures and high income in particular exhibit high insurance demand.
This thesis deals with continuous-time portfolio optimization as well as with topics from the field of credit risk. The goal of portfolio optimization is to find, for a given initial capital, the best possible consumption and investment strategies. This thesis primarily examines the influence of income on these decisions. Since the future income stream is random, and since there are no financial products that can replicate it, incorporating income into portfolio optimization poses a major problem: the assumptions of a complete market no longer hold, so the standard solution methods cannot be applied. This thesis analyzes several manifestations of this problem and discusses various approaches to solving it. Furthermore, the study examines the influence of a firm's credit risk on its stock returns, referring in particular to an anomaly that has been discussed extensively in the literature: firms with high default probabilities earn lower returns than firms with lower default probabilities. A further question falling within the field of credit risk is to what extent models are able to price and hedge structured products. This thesis attempts to provide answers to these questions.
In total, this dissertation comprises three research papers. The objective of all three papers is to detect mistakes private investors make when investing in mutual funds and to analyze their implications. Moreover, the question is addressed whether financial advisors help private investors avoid these investment mistakes. All three research papers use the same database, provided by a German online brokerage house. The detailed data set allows contributions to the existing literature on mutual fund investments, smart decision making, household finance, and financial advice at an investor- and transaction-specific level. The first paper addresses the question of which decision criteria private investors use when purchasing mutual funds. It can be shown that fund volume is the dominant decision criterion, whereas historical performance is only of minor importance. As performance persistence exists in the underlying data set, it can be concluded that the majority of investors make investment mistakes. The second paper shows that smart investors, i.e. investors who purchase mutual funds by chasing historical performance, are older, wealthier, more experienced, and less likely to be overconfident. In addition, the ability to select mutual funds by chasing historical performance has a positive impact on overall investment success; the quality of mutual fund selection is thus an ex-ante measure of investment success. Finally, the third paper analyzes the influence of financial advice on the mutual fund decisions of private investors. Evidence shows that financial advisors do not help their customers purchase mutual funds by chasing historical performance. In fact, advisors recommend high-volume mutual funds from well-known fund families. Apparently, financial advisors are much more salesmen than real advisors. These results hold when controlling for potential endogeneity issues.
This dissertation contains three essays on monetary policy, dynamics of the interest rates and spillovers across economies. In the first essay I examine the effects of monetary policy and its interaction with financial regulation within a micro-founded macroeconometric framework for a closed economy with a heterogeneous banking system, facing a period of low interest rates. I analyse the interplay between monetary policy and banking regulation and study the role of agents’ expectations for the effectiveness of unconventional monetary policy tools. In the next essay, I argue that openness is crucial for understanding the dynamics of the term structure. In an empirical application, I show that my model of the term structure fits well the yield curve in-sample and has a sound ability to forecast interest rates out-of-sample. The model accounts for the expectations hypothesis, replicates the forward premium anomaly and reconciles the uncovered interest rate parity implications. The last essay is concerned with the dynamics of co-movement among macroeconomic aggregates and the degree of convergence or decoupling amongst economies. The model includes measures of financial and trade-based interdependencies and incorporates feedback between macroeconomic variables and time-varying weights. The findings point at the importance of asset price movements and financial linkages.
For private investors it is imperative to a) understand and define their own, individual risk preferences, b) assess their financial and demographic circumstances to determine their individual risk-taking potential, and c) form and maintain a well-diversified risky portfolio. The three chapters of my thesis each match one of these three tasks. The first chapter of my thesis presents novel experimental evidence to test the existence of a potential projection bias in loss aversion, a significant determinant of investor preferences, thus matching task a). The second chapter is devoted to the determination of private investors' risk-taking potential based on their financial and socio-demographic circumstances, matching task b): in a large portfolio experiment, we examine the ability and heterogeneity of lay and professional advisors in matching investor demographics, such as age and income, with risky asset portfolio shares. The third and final chapter addresses the question of how to reach and maintain an efficient risky portfolio, therefore matching task c): it analyzes a decision support system for private investors that allows its users to simulate any arbitrary set of securities and, by reporting aggregated expected return and risk, to optimize their current portfolio.
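The "aggregated expected return and risk" such a decision support system reports typically come from standard portfolio arithmetic. A minimal sketch, assuming the inputs are per-asset expected returns and a covariance matrix (these names and the setup are illustrative, not the system described in the chapter):

```python
import math

def portfolio_stats(weights, means, cov):
    """Aggregate expected return and risk (standard deviation) of a portfolio,
    given asset weights, per-asset expected returns, and a covariance matrix
    supplied as a list of lists."""
    n = len(weights)
    exp_ret = sum(w * m for w, m in zip(weights, means))
    # Portfolio variance: w' * Cov * w
    var = sum(weights[i] * weights[j] * cov[i][j]
              for i in range(n) for j in range(n))
    return exp_ret, math.sqrt(var)
```

For a 60/40 mix of a stock-like asset (8% expected return, 20% volatility) and a bond-like asset (3%, 5%) with low correlation, the portfolio earns a 6% expected return at roughly 12.4% volatility, below the 14% weighted average of the individual volatilities, which is the diversification effect such a tool makes visible.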
This cumulative dissertation contains four self-contained chapters on stochastic games and learning in intertemporal choice.
Chapter 1 presents an experiment on value learning in a setting where actions have both immediate and delayed consequences. Subjects make a series of choices between abstract options, with values that have to be learned by sampling. Each option is associated with two payoff components: One is revealed immediately after the choice, the other with one round delay. Objectively, both payoff components are equally important, but most subjects systematically underreact to the delayed consequences. The resulting behavior appears impatient or myopic. However, there is no inherent reason to discount: All rewards are paid simultaneously, after the experiment. Elicited beliefs on the value of options are in accordance with choice behavior. These results demonstrate that revealed impatience may arise from frictions in learning, and that discounting does not necessarily reflect deep time preferences. In a treatment variation, subjects first learn passively from the evidence generated by others, before then making a series of own choices. Here, the underweighting of delayed consequences is attenuated, in particular for the earliest own decisions. Active decision making thus seems to play an important role in the emergence of the observed bias.
Chapter 2 introduces and proves existence of Markov quantal response equilibrium (QRE), an application of QRE to finite discounted stochastic games. We then study a specific case, logit Markov QRE, which arises when players react to total discounted payoffs using the logit choice rule with precision parameter λ. We show that the set of logit Markov QRE always contains a smooth path that leads from the unique QRE at λ = 0 to a stationary equilibrium of the game as λ goes to infinity. Following this path makes it possible to solve arbitrary finite discounted stochastic games numerically; an implementation of this algorithm is publicly available as part of the package sgamesolver. We further show that all logit Markov QRE are ε-equilibria, with a bound for ε that is independent of the payoff function of the game and decreases hyperbolically in λ. Finally, we establish a link to reinforcement learning by characterizing logit Markov QRE as the stationary points of a game dynamic that arises when all players follow the well-established reinforcement learning algorithm expected SARSA.
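The logit choice rule with precision parameter λ referred to above is a standard softmax over payoffs; a minimal generic sketch (textbook logit, not code taken from sgamesolver):

```python
import numpy as np

def logit_choice_probabilities(values, lam):
    """Logit choice rule: action probabilities proportional to
    exp(lam * value). lam = 0 yields uniform randomization; as lam
    grows, choice concentrates on the highest-value action."""
    z = lam * np.asarray(values, dtype=float)
    z -= z.max()                      # stabilize the exponentials
    p = np.exp(z)
    return p / p.sum()

# At lam = 0 every action is equally likely (the unique QRE at lambda = 0).
print(logit_choice_probabilities([1.0, 2.0, 4.0], lam=0.0))  # uniform: 1/3 each
```

Tracing the path described in the chapter amounts to following these choice probabilities, jointly across all players and states, as λ is increased from zero.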
Chapter 3 introduces the logarithmic stochastic tracing procedure, a homotopy method to compute stationary equilibria for finite and discounted stochastic games. We build on the linear stochastic tracing procedure (Herings and Peeters 2004), but introduce logarithmic penalty terms as a regularization device, which brings two major improvements. First, the scope of the method is extended: it now has a convergence guarantee for all games of this class, rather than just generic ones. Second, by ensuring a smooth and interior solution path, computational performance is increased significantly. A ready-to-use implementation is publicly available. As demonstrated here, its speed compares quite favorably to other available algorithms, and it can solve games of considerable size in reasonable time. Because the method involves the gradual transformation of a prior into equilibrium strategies, it is possible to search the prior space and uncover potentially multiple equilibria and their respective basins of attraction. This also connects the method to established theory of equilibrium selection.
Chapter 4 introduces sgamesolver, a Python package that uses the homotopy method to compute stationary equilibria of finite discounted stochastic games. A short user guide is complemented by a discussion of the homotopy method, the two implemented homotopy functions (logit Markov QRE and logarithmic tracing), and the predictor-corrector procedure and its implementation in sgamesolver. Basic and advanced use cases are demonstrated using several example games. Finally, we discuss the topic of symmetries in stochastic games.
The exchange industry has undergone a significant transformation over the past two decades - and not only in Germany. Exchanges have long since shed the character of bygone days, when members haggled on the trading floor over blocks of shares and prices of domestic companies, and a confidential club atmosphere prevailed at the cooperatively organized trading venues. Many exchanges have abolished floor trading, are themselves listed on an exchange, and are oriented primarily towards shareholder value and thus the interests of an international shareholder base. There are now trading venues that span several countries; the French-dominated Euronext has played a pioneering role here. Other exchanges, such as Deutsche Börse and the Swiss exchange, have likewise merged their derivatives trading platforms across borders and, with their joint venture Eurex, created the world's largest derivatives exchange by turnover. More recently, transatlantic alliances between American and European exchanges have come under consideration. Both the strategy of Nasdaq, which so far holds a blocking minority of over 25% in the London Stock Exchange, and that of the New York Stock Exchange, which is seeking a merger with Euronext, attest to this. Moreover, exchanges now compete directly with their customers and former owners, financial intermediaries such as banks and securities houses. They compete for investors' securities orders, since banks no longer automatically route every order to an exchange. Instead, some financial intermediaries attempt to match incoming investor orders in-house against a corresponding reciprocal order, thereby retaining the security's bid-ask spread as profit.
This internalization of order execution has in recent years become a significant source of income for securities houses, particularly in England and Germany. At the same time, exchanges are pushing ever further into business areas that used to be the domain of their customers. One example is the trading of certain credit derivative products that have so far been traded over the counter between large securities houses; both the Chicago Mercantile Exchange and Eurex plan to trade these instruments on their own platforms. Another example is the vertical integration of securities settlement and custody services, where large international banks such as BNP Paribas, Citigroup and State Street are battling exchanges for market share. How did the transformation described here come about? The decisive catalyst has been the increased competitive pressure on traditional exchanges, which in many cases led to a restructuring of their organizational form and ownership structure. These newly oriented exchanges came to see themselves as regular, profit-oriented firms that were no longer primarily accountable to their customers but to their new owners, the shareholders. ...
This thesis is concerned with various aspects of estimating trend output and growth and discusses and evaluates methods to prepare medium-term GDP growth projections. Furthermore, econometric techniques suited for cross-correlated macroeconomic panel data with a focus on factor models are applied for unit root and cointegration testing as well as panel error correction estimation. Applications involve the identification of growth determinants as well as the modelling of aggregate labor supply in a multi-country framework. The first chapter evaluates a very popular method for potential output estimation and medium-term forecasting---the production function approach---in terms of predictive performance. For this purpose, a particular forecast evaluation framework is developed and an evaluation of the predictions of GDP growth for the three to five years ahead for each individual G7 country is carried out. In chapter two, a new approach for estimating trend growth of advanced economies is proposed. The suggestion combines econometric methods that have been used to test and estimate the implications of the extended Solow growth model in a cross sectional time series setting with an application of multivariate time series filter techniques. The last chapter discusses several panel unit root tests designed to accommodate cross-sectional dependence. These methods are then applied to an OECD country sample of the aggregate labor supply measure "hours worked".
We estimate a semiparametric single-risk discrete-time duration model to assess the effect of vocational training on the duration of unemployment spells. The data basis used in this study is the German Socio-Economic Panel (GSOEP) for West Germany for the period from 1986 to 1994. To take a possible selection bias into account, actual participation in vocational training is instrumented using estimates of a random-effects probit model for participation in qualification measures. Our main results show that training does have a significant short-term effect of reducing unemployment duration, but that this effect does not persist in the long run. JEL classifications: C41, J20, J64
On July 4, 2013 the ECB Governing Council provided more specific forward guidance than in the past by stating that it expects ECB interest rates to remain at present or lower levels for an extended period of time. As explained by ECB President Mario Draghi this expectation is based on the Council’s medium-term outlook for inflation conditional on economic activity and money and credit. Draghi also stressed that there is no precise deadline for this extended period of time, but that a reasonable period can be estimated by extracting a reaction function. In this note, we use such a reaction function, namely the interest rate rule from Orphanides and Wieland (2013) that matches past ECB interest rate decisions quite well, to project the rate path consistent with inflation and growth forecasts from the survey of professional forecasters published by the ECB on August 8, 2013. This evaluation suggests an increase in ECB interest rates by May 2014 at the latest. We also use the Eurosystem staff projection from June 6, 2013 for comparison. While it would imply a longer period of low rates, it does not match past ECB decisions as well as the reaction function with SPF forecasts.
In this study, we develop a technique for estimating a firm’s expected cost of equity capital derived from analyst consensus forecasts and stock prices. Building on the work of Gebhardt/Lee/Swaminathan (2001) and Easton/Taylor/Shroff/Sougiannis (2002), our approach allows daily estimation, using only publicly available information at that date. We then estimate the expected cost of equity capital at the market, industry and individual firm level using historical German data from 1989-2002 and examine firm characteristics which are systematically related to these estimates. Finally, we demonstrate the applicability of the concept in a contemporary case study for DaimlerChrysler and the European automobile industry.
We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process which is subject to noise and non-synchronous observations. The estimator is constructed based on a local average of block-wise parametric spectral covariance estimates. The latter originate from a local method of moments (LMM) which recently has been introduced by Bibinger et al. (2014). We extend the LMM estimator to allow for autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. We prove the consistency and asymptotic normality of the proposed spot covariance estimator. Based on extensive simulations we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data of a cross-section of NASDAQ blue chip stocks. Employing the estimator to estimate spot covariances, correlations and betas in normal but also extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously if new information arrives.
The authors propose a new method to forecast macroeconomic variables that combines two existing approaches to mixed-frequency data in DSGE models. The first existing approach estimates the DSGE model at a quarterly frequency and uses higher-frequency auxiliary data only for forecasting. The second method transforms a quarterly state space into a monthly frequency. Their algorithm combines the advantages of these two existing approaches. They compare the new method with the existing methods using simulated and real-world data. With simulated data, the new method outperforms all other methods, including forecasts from the standard quarterly model. With real-world data, incorporating auxiliary variables as in their method substantially decreases forecasting errors for recessions, but casting the model in a monthly frequency delivers better forecasts in normal times.
Causality is a widely-used concept in theoretical and empirical economics. The recent financial economics literature has used Granger causality to detect the presence of contemporaneous links between financial institutions and, in turn, to obtain a network structure. Subsequent studies combined the estimated networks with traditional pricing or risk measurement models to improve their fit to empirical data. In this paper, we provide two contributions: we show how to use a linear factor model as a device for estimating a combination of several networks that monitor the links across variables from different viewpoints; and we demonstrate that Granger causality should be combined with quantile-based causality when the focus is on risk propagation. The empirical evidence supports the latter claim.
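For readers unfamiliar with the Granger-causality building block used in this literature, a minimal bivariate F-test can be sketched as follows (an illustrative textbook version with a single lag, not the network estimator applied in the paper):

```python
import numpy as np

def granger_f_test(x, y, p=1):
    """F-statistic for 'x Granger-causes y': compare a restricted AR(p)
    regression of y on its own lags with an unrestricted one that adds
    p lags of x. A large F indicates x has predictive content for y."""
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    const = np.ones((n - p, 1))
    X_r = np.hstack([const, lags_y])            # restricted model
    X_u = np.hstack([const, lags_y, lags_x])    # unrestricted model
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    df2 = len(Y) - X_u.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df2)
```

Running such pairwise tests across all institutions and keeping the significant links yields the kind of directed network structure the paper starts from; the quantile-based causality it advocates replaces the mean regression with quantile regressions of the tails.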
Effort estimates are of utmost economic importance in software development projects. Estimates bridge the gap between managers and the invisible and almost artistic domain of developers, and give managers a means to track and control projects. Consequently, numerous estimation approaches have been developed over the past decades, starting with Allan Albrecht's Function Point Analysis in the late 1970s. However, this work neither tries to develop just another estimation approach, nor does it focus on improving the accuracy of existing techniques. Instead of characterizing software development as a technological problem, this work understands software development as a sociological challenge. Consequently, it focuses on the question of what happens when developers are confronted with estimates representing the major instrument of management control. Do estimates influence developers, or are developers unaffected? Is it irrational to expect that developers start to communicate and discuss estimates, conform to them, work strategically, or hide progress or delay? This study shows that it is inappropriate to assume independence of estimated and actual development effort. A theory is developed and tested that explains how developers and managers influence the relationship between estimated and actual development effort, elaborating the phenomenon of estimation fulfillment.
Markets are central to modern society, so their failures can have devastating effects. Here, we examine a prominent failure: price bubbles. We propose that bubbles are affected by ethnic homogeneity in the market and can be thwarted by diversity. Using experimental markets in Southeast Asia and North America, we find a marked difference: Market prices fit true values 58% better in diverse markets. In homogenous markets, overpricing is higher and traders’ errors are more correlated than in diverse markets. The findings suggest that price bubbles arise not only from individual errors or financial conditions, but also from the social context of decision making. Informing public discussion, our findings suggest that diversity facilitates friction that enhances deliberation and upends conformity.
This working paper suggests analysing agencification as a double process of institutional and policy centralisation. To that end, it develops a categorisation of agencies that incorporates these two dimensions. More specifically, it is argued that mixed outcomes, where the levels of institutional and policy centralisation diverge, can be expected to be the rule rather than the exception, in line with the hybrid nature of EU agencies as in-betweeners. Moreover, the fiduciary setting hits important legal constraints given the limits to delegation in the EU context. Against this backdrop, a process whereby institutional centralisation develops incrementally and remains limited, yet is accompanied by a process of substantial policy centralisation, appears as the most promising path for EU agencification. A fiduciary setting, where a strong agency enjoys a high degree of independence and operates in a centralised policy space, should by contrast be the exception. The comparative study of the process of agencification in the energy and banking sectors is insightful in the light of these expectations. The incremental nature of institutional change in energy exemplifies the usual path of agencification, which is conducive to a weak agency operating in a relatively centralised policy space. Agencification in banking, by contrast, has led to a rather unusual outcome where the strong agency model combines with a fragmented policy context.
EU financial integration : is there a 'Core Europe'? ; evidence from a cluster-based approach
(2005)
Numerous recent studies, e.g. EU Commission (2004a), Baele et al. (2004), Adam et al. (2002), and the research pooled in ECB-CFS (2005) and Gaspar, Hartmann, and Sleijpen (2003), have documented progress in EU financial integration from a micro-level view. This paper contributes to this research by identifying groups of financially integrated countries from a holistic, macro-level view. It calculates cross-sectional dispersions and innovates by applying an inter-temporal cluster analysis to eight euro area countries for the period 1995-2002. The indicators employed represent the money, government bond and credit markets. Our results show that euro countries were divided into two stable groups of financially more closely integrated countries in the pre-EMU period. Back then, geographic proximity and country size might have played a role. This situation has changed remarkably with the euro's introduction. EMU has led to a shake-up both in the number and composition of groups. The evidence puts a question mark behind using Germany as a benchmark in the post-EMU period. The findings suggest as well that financial integration takes place in waves: stable periods and periods of intense transition alternate. Based on the notion of 'maximum similarity', the results suggest that there exist 'maximum similarity barriers'. It takes extraordinary events, such as EMU, to push the degree of financial integration beyond these barriers. The research encourages policymakers to move forward courageously in the post-FSAP era, and provides comfort that the substantial differences between the current and potentially new euro states can be overcome. The analysis could be extended to the new EU member countries, to the global level, and to additional indicators.
This study provides a graphic overview on core legislation in the area of economic and financial services. The presentation essentially covers the areas within the responsibility of the Economic and Monetary Affairs Committee (ECON); hence it starts with core ECON areas but also displays neighbouring areas of other Committees' competences which are closely connected to and impacting on ECON's work. It shows legislation in force, proposals and other relevant provisions on banking, securities markets and investment firms, market infrastructure, insurance and occupational pensions, payment services, consumer protection in financial services, the European System of Financial Supervision, European Monetary Union, euro bills and coins and statistics, competition, taxation, commerce and company law, accounting and auditing. Moreover, it notes selected provisions that might become relevant in the upcoming Article 50 TEU negotiations.
In this study prepared for the ECON Committee of the European Parliament, Gellings, Jungbluth and Langenbucher present a graphic overview on core legislation in the area of economic and financial services in Europe. The mapping overview can serve as background for further deliberations. The study covers legislation in force, proposals and other relevant provisions in fourteen policy areas, i.e. banking, securities markets and investment firms, market infrastructure, insurance and occupational pensions, payment services, consumer protection in financial services, the European System of Financial Supervision, European Monetary Union, Euro bills and Coins and statistics, competition, taxation, commerce and company law, accounting and auditing.
The global financial crisis (as well as the European sovereign debt crisis) has led to a substantial redesign of rules and institutions – aiming in particular at underwriting financial stability. At the same time, the crisis generated a renewed interest in properly appraising systemic financial vulnerabilities. Employing the most recent data and applying a variety of largely only recently developed methods, we provide an assessment of indicators of financial stability within the Euro Area. Taking a “functional” approach, we analyze comprehensively all financial intermediary activities, regardless of the institutional roof – banks or non-bank (shadow) banks – under which they are conducted. Our results reveal a declining role of banks (and a commensurate increase in non-bank banking). These structural shifts (between institutions) coincide with regulatory and supervisory reforms (implemented or firmly anticipated) as well as a non-standard monetary policy environment. They might, unintentionally, imply a rise in systemic risk. Overall, however, our analyses suggest that financial imbalances have been reduced over the course of recent years. Hence, the financial intermediation sector has become more resilient. Nonetheless, existing (equity) buffers would probably not suffice to absorb substantial volatility shocks.
Euro area shadow banking activities in a low-interest-rate environment: a flow-of-funds perspective
(2016)
Very low policy rates as well as the substantial redesign of rules and supervisory institutions have changed background conditions for the Euro Area’s financial intermediary sector substantially. Both policy initiatives have been targeted at improving societal welfare. And their potential side effects (or costs) have been discussed intensively, in academic as well as policy circles. Very low policy rates (and correspondingly low market rates) are likely to whet investors’ risk taking incentives. Concurrently, the tightened regulatory framework, in particular for banks, increases the comparative attractiveness of the less regulated, so-called shadow banking sector. Employing flow-of-funds data for the Euro Area’s non-bank banking sector we take stock of recent developments in this part of the financial sector. In addition, we examine to which extent low interest rates have had an impact on investment behavior. Our results reveal a declining role of banks (and, simultaneously, an increase in non-bank banking). Overall intermediation activity, hence, has remained roughly at the same level. Moreover, our findings also suggest that non-bank banks have tended to take positions in riskier assets (particularly in equities). In line with this observation, balance-sheet based risk measures indicate a rise in sector-specific risks in the non-bank banking sector (when narrowly defined).
Euro crash risk
(2015)
Using fiscal reaction functions for a panel of euro-area countries, the paper investigates whether euro membership has reduced the responsiveness of countries to increases in the level of inherited debt compared to the period prior to accession to the euro. While we find some evidence for such a loss in prudence, the results are not robust to changes in the specification, for example an exclusion of Greece from the panel. This suggests that the current debt problems may result to a large extent from pre-existing debt levels prior to entry or from a larger need for fiscal prudence in a common currency, while an adverse change in the fiscal reaction functions does not apply to most countries.
The paper uses fiscal reaction functions for a panel of euro-area countries to investigate whether euro membership has reduced the responsiveness of countries to shocks in the level of inherited debt compared to the period prior to accession to the euro. While we find some evidence for such a loss in prudence, the results are not robust to changes in the specification, such as an exclusion of Greece from the panel. This suggests that the current debt problems may result to a large extent from pre-existing debt levels prior to entry or from a larger need for fiscal prudence in a common currency, while an adverse change in the fiscal reaction functions does not apply to most countries.
Euro nicht gefährdet
(2017)
Europa - wohin?
(2011)
According to the "coronation theory" of European monetary union, the euro was introduced in order to make the necessity of joint governance in the European Union evident to all and thus to enable an orderly advance towards European integration. In the current phase, however, political opportunism appears to be driving integration.
A carte blanche for the central bank would, strictly speaking, amount to the democratic constitutional state declaring bankruptcy before technocratic arbitrariness, writes Helmut Siekmann in this signed contribution. He emphasizes that the European Union is an indispensable institution and should be a genuine federal state. It is, however, essentially (only) a legal construct, which makes it all the more important that the legal rules on which it rests be observed meticulously.
This note proposes a new concept for a European deposit insurance scheme that takes account of the strong political reservations against mutualizing liability for bank deposits. The three-stage deposit insurance model outlined here retains existing national deposit insurance schemes, provides a European loss-compensation mechanism, and prevents excessive risk-taking at the expense of the international community.
This chapter discusses whether and how 'new quantitative trade models' (NQTMs) can be fruitfully applied to quantify the welfare effects of trade liberalization, thus shedding light on the trade-related effects of further European integration. On the one hand, it argues that NQTMs indeed have the potential to supplement traditional 'computable general equilibrium' (CGE) analysis thanks to their tight connection between theory and data, appealing micro-theoretical foundations, and enhanced attention to the estimation of structural parameters. On the other hand, further work is still needed in order to fully exploit this potential.
The SVB case is a wake-up call for Europe’s regulators as it demonstrates the destructive power of a bank run: it undermines the role of loss-absorbing capital, elbowing governments into bailing out affected banks. Many types of bank management weaknesses, like excessive duration risk, may raise concerns of bank losses – but to serve as a run trigger, there needs to be a large enough group of bank depositors that fails to be fully covered by a deposit insurance scheme. Latent run risk is the root cause of inefficient liquidations, and we argue that a run on SVB assets could have been avoided altogether by a more thoughtful deposit insurance scheme, sharply distinguishing between loss-absorbing capital (equity plus bail-in debt) and other liabilities which are deemed not to be bail-inable, namely demand deposits. These evidence-based insights have direct implications for Europe’s banking regulation, suggesting a minimum and a maximum for a bank’s loss absorption capacity.
Asset-backed securitisation (ABS) is an asset funding technique that involves the issuance of structured claims on the cash flow performance of a designated pool of underlying receivables. Efficient risk management and asset allocation in this growing segment of fixed income markets requires both investors and issuers to thoroughly understand the longitudinal properties of spread prices. We present a multi-factor GARCH process in order to model the heteroskedasticity of secondary market spreads for valuation and forecasting purposes. In particular, accounting for the variance of errors is instrumental in deriving more accurate estimators of time-varying forecast confidence intervals. On the basis of CDO, MBS and Pfandbrief transactions as the most important asset classes of off-balance sheet and on-balance sheet securitisation in Europe, we find that expected spread changes for these asset classes tend to be level-stationary, with model estimates indicating asymmetric mean reversion. Furthermore, spread volatility (conditional variance) is found to follow an asymmetric stochastic process contingent on the value of past residuals. This ABS spread behaviour implies negative investor sentiment during cyclical downturns, which is likely to escape stationary approximation the longer this market situation lasts.
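The kind of asymmetric conditional-variance dynamics described above can be illustrated with a simple GJR-GARCH(1,1) simulation, in which negative past residuals raise volatility by an extra term (parameter values here are arbitrary illustrations, not estimates from the paper):

```python
import numpy as np

def simulate_gjr_garch(n, omega=0.05, alpha=0.08, gamma=0.10, beta=0.85, seed=0):
    """Simulate a GJR-GARCH(1,1) process: the conditional variance
    reacts asymmetrically to the sign of past residuals (negative
    shocks add an extra gamma term to next period's variance)."""
    rng = np.random.default_rng(seed)
    eps = np.zeros(n)
    sigma2 = np.zeros(n)
    # Start at the unconditional variance implied by the parameters.
    sigma2[0] = omega / (1 - alpha - gamma / 2 - beta)
    for t in range(1, n):
        neg = 1.0 if eps[t - 1] < 0 else 0.0
        sigma2[t] = (omega + (alpha + gamma * neg) * eps[t - 1] ** 2
                     + beta * sigma2[t - 1])
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return eps, sigma2
```

The asymmetry parameter gamma captures exactly the feature the abstract highlights: volatility responds more strongly after negative residuals, consistent with deteriorating investor sentiment in downturns.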
Evaluating the quality of credit portfolio risk models is an important issue for both banks and regulators. Lopez and Saidenberg (2000) suggest cross-sectional resampling techniques in order to make efficient use of available data. We show that their proposal disregards cross-sectional dependence in resampled portfolios, which renders standard statistical inference invalid. We proceed by suggesting the Berkowitz (1999) procedure, which relies on standard likelihood ratio tests performed on transformed default data. We simulate the power of this approach in various settings including one in which the test is extended to incorporate cross-sectional information. To compare the predictive ability of alternative models, we propose to use either Bonferroni bounds or the likelihood-ratio of the two models. Monte Carlo simulations show that a default history of ten years can be sufficient to resolve uncertainties currently present in credit risk modeling.
Evaluating the quality of credit portfolio risk models is an important question for both banks and regulators. Lopez and Saidenberg (2000) suggest cross-sectional resampling techniques in order to make efficient use of available data and to produce measures of forecast accuracy. We first show that their proposal disregards cross-sectional dependence in simulated subportfolios, which renders standard statistical inference invalid. We proceed by suggesting another evaluation methodology which draws on the concept of likelihood ratio tests. Specifically, we compare the predictive quality of alternative models by comparing the probabilities that observed data have been generated by these models. The distribution of the test statistic can be derived through Monte Carlo simulation. To exploit differences in cross-sectional predictions of alternative models, the test can be based on a linear combination of subportfolio statistics. In the construction of the test, the weight of a subportfolio depends on the difference in the loss distributions which alternative models predict for this particular portfolio. This makes efficient use of the data, and reduces computational burden. Monte Carlo simulations suggest that the power of the tests is satisfactory.
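The likelihood-ratio idea underlying these evaluation methodologies can be sketched in its simplest Berkowitz (1999) form, testing whether observed losses are consistent with a candidate model's loss distribution (a simplified two-parameter version; the portfolio-weighted extensions described above are not reproduced here):

```python
import numpy as np
from scipy import stats

def berkowitz_lr_test(pit):
    """Berkowitz (1999)-style check: map the probability integral
    transforms (PIT) of observed losses under the candidate model
    to z-scores. If the model is correct, z ~ N(0, 1); a likelihood
    ratio compares the fitted Gaussian against this null
    (asymptotically chi-squared with 2 degrees of freedom)."""
    z = stats.norm.ppf(np.clip(pit, 1e-10, 1 - 1e-10))
    mu, sigma = z.mean(), z.std(ddof=0)
    ll_fitted = stats.norm(mu, sigma).logpdf(z).sum()
    ll_null = stats.norm(0.0, 1.0).logpdf(z).sum()
    lr = 2.0 * (ll_fitted - ll_null)
    return lr, stats.chi2.sf(lr, df=2)
```

A correctly specified model produces approximately uniform PIT values and hence a small LR statistic; a model that systematically overstates losses shifts the z-scores away from zero and is rejected.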
JEL classification: G2; G28; C52
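The Berkowitz (1999) procedure referenced above transforms observed losses through the model's predicted loss distribution, so that a correct model yields standard normal transforms, which can then be checked with a likelihood ratio test. A minimal sketch of that transform-and-test idea; the gamma loss model below is a hypothetical stand-in for a credit portfolio loss distribution, not any model from the paper:

```python
import numpy as np
from scipy import stats

def berkowitz_lr_stat(losses, model_cdf):
    """LR statistic testing whether losses are consistent with model_cdf.

    Transform: u_t = F(L_t) is U(0,1) if the model is correct, so
    z_t = Phi^{-1}(u_t) should be standard normal under the null.
    """
    u = np.clip(model_cdf(losses), 1e-10, 1 - 1e-10)
    z = stats.norm.ppf(u)
    mu, sigma = z.mean(), z.std(ddof=0)
    # log-likelihood under fitted N(mu, sigma^2) versus the null N(0, 1)
    ll_fitted = stats.norm.logpdf(z, mu, sigma).sum()
    ll_null = stats.norm.logpdf(z, 0.0, 1.0).sum()
    lr = 2.0 * (ll_fitted - ll_null)  # asymptotically chi^2 with 2 df
    p_value = stats.chi2.sf(lr, df=2)
    return lr, p_value

# Hypothetical example: losses actually drawn from the assumed model,
# so the test should not reject.
rng = np.random.default_rng(0)
losses = rng.gamma(shape=2.0, scale=1.0, size=500)
lr, p = berkowitz_lr_stat(losses, stats.gamma(a=2.0).cdf)
```

Because the fitted normal nests the null, the statistic is non-negative, and large values signal a misspecified loss distribution.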
We compare the cost-effectiveness of two pronatalist policies:
(a) child allowances; and
(b) daycare subsidies.
We pay special attention to estimating how intended fertility (fertility before children are born) responds to these policies. We use two evaluation tools:
(i) a dynamic model of fertility, labor supply, outsourced childcare time, parental time, asset accumulation, and consumption; and
(ii) randomized vignette-survey policy experiments.
We implement both tools in the United States and Germany, finding consistent evidence that daycare subsidies are more cost-effective. Nevertheless, the public expenditure required to raise fertility to the replacement level might be viewed as prohibitively high.
Under a new Basel capital accord, bank regulators might use quantitative measures when evaluating the eligibility of internal credit rating systems for the internal ratings-based approach. Based on data from Deutsche Bundesbank and using a simulation approach, we find that it is possible to identify strongly inferior rating systems out-of-time based on statistics that measure either the quality of ranking borrowers from good to bad, or the quality of individual default probability forecasts. Banks do not significantly improve system quality if they use credit scores instead of ratings, or logistic regression default probability estimates instead of historical data. Banks that are not able to discriminate between high- and low-risk borrowers increase their average capital requirements due to the concavity of the capital requirements function.
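The quality of ranking borrowers from good to bad is commonly summarized by the accuracy ratio (the Gini coefficient of the cumulative accuracy profile), which relates to the area under the ROC curve as AR = 2·AUC − 1. A minimal sketch of how such a statistic can be computed, using hypothetical score data rather than anything from the Bundesbank sample:

```python
import numpy as np

def accuracy_ratio(scores, defaults):
    """Accuracy ratio (Gini) of a rating score: AR = 2*AUC - 1.

    Higher scores are assumed to indicate higher default risk.
    AUC is computed from pairwise comparisons, counting ties as half.
    """
    scores = np.asarray(scores, dtype=float)
    defaults = np.asarray(defaults, dtype=bool)
    bad, good = scores[defaults], scores[~defaults]
    # fraction of (defaulter, non-defaulter) pairs ranked correctly
    wins = (bad[:, None] > good[None, :]).sum()
    ties = (bad[:, None] == good[None, :]).sum()
    auc = (wins + 0.5 * ties) / (len(bad) * len(good))
    return 2.0 * auc - 1.0

# Hypothetical example: a perfectly discriminating score yields AR = 1,
# an uninformative (constant) score yields AR = 0.
ar_perfect = accuracy_ratio([5, 6, 1, 2], [True, True, False, False])
ar_random = accuracy_ratio([1, 1], [True, False])
```

A rating system whose AR is close to zero cannot separate high- from low-risk borrowers, which is the situation the last sentence of the abstract refers to.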
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed’s Greenbook projections. Point forecasts of some models are similar in accuracy to the forecasts of nonstructural large-dataset methods. Despite their common underlying New Keynesian modeling philosophy, the forecasts of the different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of the DSGE model forecasts is comparable to the Greenbook projections at medium-term horizons. Comparing the models’ density forecasts with the actual distribution of observations shows that the models overestimate the uncertainty around their point forecasts.
This paper investigates the accuracy of forecasts from four DSGE models for inflation, output growth and the federal funds rate using a real-time dataset synchronized with the Fed’s Greenbook projections. Conditioning the model forecasts on the Greenbook nowcasts leads to forecasts that are as accurate as the Greenbook projections for output growth and the federal funds rate. Only for inflation are the model forecasts dominated by the Greenbook projections. A comparison with forecasts from Bayesian VARs shows that the economic structure of the DSGE models, which is useful for interpreting the forecasts, does not lower forecast accuracy. Combining the forecasts of several DSGE models increases precision relative to individual model forecasts. Comparing density forecasts with the actual distribution of observations shows that the DSGE models overestimate the uncertainty around their point forecasts.
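The precision gain from combining model forecasts comes from offsetting individual model errors. A toy sketch, with hypothetical numbers rather than the paper's data, of an equal-weight combination beating each individual model on root mean squared error:

```python
import numpy as np

def rmse(forecasts, actuals):
    """Root mean squared forecast error."""
    f, a = np.asarray(forecasts, float), np.asarray(actuals, float)
    return float(np.sqrt(np.mean((f - a) ** 2)))

# Hypothetical inflation outcomes and two models with offsetting errors
actual = np.array([2.0, 2.5, 1.8])
model_a = actual + np.array([0.4, -0.3, 0.5])   # model A's forecast errors
model_b = actual + np.array([-0.5, 0.4, -0.4])  # model B's, opposite sign
combined = 0.5 * (model_a + model_b)            # equal-weight average
```

Because the two error series here are nearly mirror images, averaging cancels most of the error; with positively correlated errors the gain would be smaller.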
This paper evaluates the effects of publicly sponsored training in East Germany in the context of repeated treatments. Selection bias based on observed characteristics is corrected for by applying kernel matching based on the propensity score. We control for further selection and the presence of Ashenfelter's dip before the program with conditional difference-in-differences estimators. Training as a first treatment shows insignificant effects on the transition rates. The effect of program sequences and the incremental effect of a second program on the reemployment probability are insignificant. However, the incremental effect on the probability of remaining employed is slightly positive.
JEL classification: H43; C23; J6; J64; C14
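Kernel matching on the propensity score compares each treated unit with a kernel-weighted average of control outcomes, with weights declining in propensity-score distance. A minimal sketch using a Gaussian kernel; the bandwidth, variable names, and example data are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def kernel_matching_att(y, treated, pscore, bandwidth=0.06):
    """Average treatment effect on the treated via Gaussian-kernel matching.

    Each treated unit's counterfactual is a kernel-weighted average of
    control outcomes, weighted by propensity-score proximity.
    """
    y, treated, pscore = map(np.asarray, (y, treated, pscore))
    y_t, p_t = y[treated == 1], pscore[treated == 1]
    y_c, p_c = y[treated == 0], pscore[treated == 0]
    effects = []
    for yi, pi in zip(y_t, p_t):
        w = np.exp(-0.5 * ((p_c - pi) / bandwidth) ** 2)  # Gaussian kernel
        effects.append(yi - np.average(y_c, weights=w))
    return float(np.mean(effects))

# Hypothetical example: the true treatment effect is 1 at every
# propensity score, so the estimate should be close to 1.
pscore = np.array([0.2, 0.4, 0.6, 0.3, 0.5])
treated = np.array([0, 0, 0, 1, 1])
y = np.where(treated == 1, pscore + 1.0, pscore)
att = kernel_matching_att(y, treated, pscore)
```

In practice the bandwidth is chosen by cross-validation and inference is done by bootstrapping; this sketch only illustrates the weighting logic.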