Inflation has lost considerable popularity worldwide in recent years. While moderate inflation rates of 5 to 10 percent were still considered conducive to growth and employment in the 1960s and 1970s, it is by now almost undisputed in both policy and academia that inflation primarily imposes economic costs and that price stability must therefore be the primary goal of modern monetary policy. The Frankfurt-based European Central Bank (ECB), in particular, sees its main task as keeping the annual inflation rate in the euro area below 2 percent. If the inflation rate climbs only a few decimal points above this target, interest rate hikes and a restrictive monetary policy by the central bank are to be expected. This monetary policy is justified if even low inflation rates have measurable real economic effects. A study by the Chair of Empirical Macroeconomics therefore examines the influence of inflation on the variability of relative prices.
The euro crisis was fueled by the diabolic loop between sovereign risk and bank risk, coupled with cross-border flight-to-safety capital flows. European Safe Bonds (ESBies), a union-wide safe asset without joint liability, would help to resolve these problems. We make three contributions. First, numerical simulations show that ESBies would be at least as safe as German bunds and approximately double the supply of euro safe assets when protected by a 30%-thick junior tranche. Second, a model shows how, when and why the two features of ESBies — diversification and seniority — can weaken the diabolic loop and its diffusion across countries. Third, we propose a step-by-step guide on how to create ESBies, starting with limited issuance by public or private-sector entities.
In this statement the European Shadow Financial Regulatory Committee (ESFRC) advocates a conditional relief of Greece's government debt, based on Greece meeting certain targets for structural economic reforms in areas such as its labor market and pension sector. The authors argue that the position of the European institutions that debt relief for Greece cannot be part of an agreement is based on the illusion that Greece will be able to service its sovereign debt and reduce its debt overhang after implementing a set of fiscal and structural reforms. However, the Greek economy would need to grow at an unrealistic rate to achieve debt sustainability solely on the basis of reforms. The authors therefore view substantial debt relief as inevitable and argue that three questions must be resolved urgently in order to structure debt relief adequately: first, which groups must accept the losses associated with debt relief; second, how much debt relief should be offered; and third, under what conditions relief should be offered.
We examine whether the uncertainty related to environmental, social, and governance (ESG) regulation developments is reflected in asset prices. We proxy the sensitivity of firms to ESG regulation uncertainty by the disparity across the components of their ESG ratings. Firms with high ESG disparity have a higher option-implied cost of protection against downside tail risk. The impact of the misalignment across the different dimensions of the ESG score is distinct from that of ESG score level itself. Aggregate downside risk bears a negative price for firms with low ESG disparity.
Essays in behavioral economics - evidence on self-selection into jobs, social networks and leniency
(2013)
The dissertation, entitled „Essays in Behavioral Economics – Evidence on Self-Selection into Jobs, Social Networks and Leniency", is a collection of four scientific essays. All of them apply theoretical concepts and insights from behavioral economics using experimental methods. The first essay, entitled „Sorting of Motivated Agents - Empirical Evidence on Self-Selection into the German Police", examines the self-selection of certain individuals into the police profession. The experimental study investigates whether police applicants self-select into the profession according to their preferences for norm enforcement. The second essay builds on these findings and likewise examines police trainees during their vocational training with respect to their willingness to enforce norms. It is entitled „Selection and formation of motivated agents -- empirical evidence from the German Police". The third essay examines gender differences in the choice of partners and the formation of social networks. It is entitled „Selectivity and opportunism: two dimensions of gender differences in trust games and network formation" and was co-authored with Guido Friebel, Marie Lalanne, Paul Seabright and Peter Schwardmann. The fourth essay addresses a current question in industrial organization and is entitled „Antitrust, auditing and leniency programs: evidence from the laboratory", written with Mehdi Feizi and Ali Mazyaki. Taken together, my dissertation provides answers to questions in personnel economics, social behavior and industrial organization.
This dissertation consists of three chapters. The first two chapters investigate the real effects of inflation, and the third chapter the role of child care for fertility and female labor supply. Chapter 1 introduces a generalized panel threshold model to analyze the relation between inflation and economic growth for a sample of developing countries. It is demonstrated that allowing for regime intercepts can be crucial for obtaining unbiased estimates of both inflation thresholds and their marginal effects on growth in the various regimes. The empirical results confirm that the omitted variable bias of standard panel threshold models can be statistically and economically significant. Chapter 2, which is joint work with Dieter Nautz, investigates the impact of inflation on relative price variability (RPV) as a further important channel for the real effects of inflation. With a view to the recent debate on the Fed's implicit lower and upper bounds of its inflation objective, the econometric model introduced in Chapter 1 is used to explore the inflation-RPV linkage in U.S. cities. Chapter 3 investigates the relationship between fertility, female labor supply and child care in the context of a life cycle model for Germany. A particular emphasis is placed on the differences between West and East Germany. Counterfactual policy experiments mimicking recent policy reforms on maternal leave and the provision of subsidized child care are conducted with a structurally estimated version of the model.
CHAPTER A: THE INVESTMENT BEHAVIOR OF PRIVATE EQUITY FUND MANAGERS
I The Bright and Dark Side of Staging: Investment Performance and the Varying Motivations of Private Equity Firms
II The Liquidation Dilemma of Money Losing Investments – The Impact of Investment Experience and Window Dressing of Private Equity and Venture Capital Funds
CHAPTER B: THE ASSESSMENT OF RISK AND RETURN OF PRIVATE EQUITY
I Venture Capital Performance Projection: A Simulation Approach
II Modeling Default Risk of Private Equity Funds – A Market-based Framework
This thesis consists of four chapters. Each chapter covers a topic in international macroeconomics and monetary policy. The first chapter investigates the impact of unexpected monetary policy shocks on exchange rates in a multi-country econometric model. The second chapter examines the linkage between macroeconomic fundamentals and exchange rates through the monetary policy expectation channel. The third chapter focuses on the international transmission of bank and corporate distress. The last chapter unfolds the interest rate channel of monetary policy transmission in an emerging economy, China, where regulation and market forces coexist in this transmission.
This thesis studies the behavior of banks in the financial markets they frequently use to obtain short-term as well as long-term financing. In the first chapter we incorporate an interbank market for collateralized lending among banks into a dynamic, stochastic, general equilibrium (DSGE) framework to analyze the impact of variations in the expected value of the collateral on the interbank lending volume. We find that a central bank which decides to lower the haircut on eligible collateral in repurchase agreements is able to stimulate interbank markets. In the second chapter a microeconomic model of bank behavior on the interbank market is set up to analyze the impact of the risk-taking behavior of interbank borrowing banks, and of uncertainty about their balance sheet quality, on the lending behavior of interbank lending banks. It is found that disruptions on the interbank market are the result of optimal behavior on the part of interbank lending banks in response to the uncertainty about the balance sheet quality of an interbank borrowing bank. In the third chapter we use monthly data on German bank bond spreads and regress them on bank-specific risk factors to assess the degree of market discipline in the German bank bond market. The regression results for the whole German bank bond market indicate that the bond spread does not show signs of market discipline. However, a structural break analysis uncovers that since the beginning of the financial crisis the German bank bond market has exhibited at least a weak form of market discipline for bonds issued by medium-size and large banks.
This dissertation introduces in chapter 1 a new comparative approach to model-based research and policy analysis by constructing an archive of business cycle models. It includes many well-known models used in academia and at policy institutions. A computational platform is created that allows straightforward comparisons of models’ implications for monetary and fiscal stabilization policies. Chapter 2 applies business cycle models to forecasting. Several New Keynesian models are estimated on historical U.S. data vintages and forecasts are computed for the five most recent recessions. The extent of forecast heterogeneity for models and professional forecasts is analysed. Chapter 3 extends the forecasting analysis to a long sample and to the evaluation of density forecasts. Weighted forecasts are computed using a variety of weighting schemes. The accuracy of forecasts is evaluated and compared to professional forecasts and forecasts from nonstructural time series methods. Chapter 4 adds a new feature to existing business cycle models. Specifically, a medium-scale New Keynesian model is constructed that allows for strategic complementarities in price-setting. The role of trade integration for monetary policy transmission is explored. A new dimension of the exchange rate channel is highlighted by which monetary policy directly impacts domestic inflation. Chapter 5 tests whether simple symmetric monetary policy rules used in most business cycle models are a sufficient description of reality. I use quantile regressions to estimate policy parameters and find asymmetric reactions to inflation, the output gap and past interest rates.
The dissertation consists of three thematically related research papers that consider continuous-time consumption, investment and insurance problems over the life cycle. A particular focus lies on realistic features such as stochastic mortality risk and non-replicable income. In the first paper, I examine the relevance of stochastic mortality risk. I show that a jump component in the mortality rate significantly affects agents' optimal decisions and their welfare level, whereas a diffusion component is negligible. In the second paper, we examine the term life insurance demand of a family whose sole earner is exposed to stochastic mortality risk. We pay particular attention to a realistic modeling of the insurance contract. We show that, as a result, young agents stay out of the insurance market and insurance demand increases with age, in contrast to models with simple, continuously adjustable insurance. Furthermore, long-term insurance contracts amplify the negative effects of income shocks and are therefore purchased less by risk-averse agents. In the third paper, I examine the critical illness insurance demand of an agent in a model with stochastic mortality risk and health expenses. The insurance covers the additional health costs that arise at a jump. Almost all agents purchase such insurance before retirement age, even when it is very costly. Agents with low health expenses and high income in particular exhibit high insurance demand.
This thesis deals with continuous-time portfolio optimization as well as with topics from the field of credit risk. The goal of portfolio optimization is to find the best possible consumption and investment strategies for a given initial capital. This thesis primarily examines the influence of income on these decisions. Since the future income stream is random and, at the same time, no financial products exist that can replicate it, incorporating income into portfolio optimization poses a major problem: the assumptions of a complete market no longer hold, so the standard solution methods cannot be applied. This thesis analyzes several variants of this problem and discusses different solution approaches. Furthermore, the study examines the influence of a firm's credit risk on its stock return, with particular reference to an anomaly that has been discussed extensively in the literature: firms with high default probabilities earn lower returns than firms with lower default probabilities. A further question falling within the field of credit risk is the extent to which models are able to price and hedge structured products. This thesis attempts to provide answers.
In total, this dissertation comprises three research papers. The objective of all three papers is to detect mistakes private investors make when investing in mutual funds and to analyze their implications. Moreover, the question is addressed whether financial advisors help private investors avoid these investment mistakes. All three research papers use the same database, provided by a German online brokerage house. The detailed data set allows contributing to the existing literature on mutual fund investments, smart decision making, household finance and financial advice at an investor- and transaction-specific level. The first paper addresses the question which particular decision criteria private investors use when purchasing mutual funds. It can be shown that fund volume is the dominating decision criterion, whereas historical performance is only of minor importance. As performance persistence exists in the underlying data set, it can be concluded that the majority of investors make investment mistakes. In the second paper it is shown that smart investors, i.e. investors who purchase mutual funds by chasing historical performance, are older, wealthier, more experienced and less likely to be overconfident. In addition, it can be verified that the ability to select mutual funds by chasing historical performance has a positive impact on overall investment success. Hence, the quality of mutual fund selection is an ex-ante measure of investment success. Finally, the third paper analyzes the influence of financial advice on the mutual fund decisions of private investors. Evidence is provided that financial advisors do not help their customers purchase mutual funds by chasing historical performance. In fact, advisors recommend high-volume mutual funds from well-known fund families. Apparently, financial advisors are much more salesmen than real advisors. These results hold when controlling for potential endogeneity issues.
This dissertation contains three essays on monetary policy, dynamics of the interest rates and spillovers across economies. In the first essay I examine the effects of monetary policy and its interaction with financial regulation within a micro-founded macroeconometric framework for a closed economy with a heterogeneous banking system, facing a period of low interest rates. I analyse the interplay between monetary policy and banking regulation and study the role of agents’ expectations for the effectiveness of unconventional monetary policy tools. In the next essay, I argue that openness is crucial for understanding the dynamics of the term structure. In an empirical application, I show that my model of the term structure fits well the yield curve in-sample and has a sound ability to forecast interest rates out-of-sample. The model accounts for the expectations hypothesis, replicates the forward premium anomaly and reconciles the uncovered interest rate parity implications. The last essay is concerned with the dynamics of co-movement among macroeconomic aggregates and the degree of convergence or decoupling amongst economies. The model includes measures of financial and trade-based interdependencies and incorporates feedback between macroeconomic variables and time-varying weights. The findings point at the importance of asset price movements and financial linkages.
For private investors it is imperative to a) understand and define their own, individual risk preferences, b) assess their financial and demographic circumstances to determine their individual risk-taking potential, and c) form and maintain a well-diversified risky portfolio. The three chapters of my thesis each match one of these three tasks. The first chapter presents novel experimental evidence to test for a potential projection bias in loss aversion, a significant determinant of investor preferences, thus matching task a). The second chapter is devoted to the determination of private investors' risk-taking potential based on their financial and socio-demographic circumstances, matching task b): in a large portfolio experiment, we examine the ability and heterogeneity of lay and professional advisors in matching investor demographics, such as age and income, with risky asset portfolio shares. The third and final chapter addresses the question of how to reach and maintain an efficient risky portfolio, matching task c): it analyzes a decision support system for private investors that allows its users to simulate any arbitrary set of securities and, by reporting aggregated expected return and risk, to optimize their current portfolio.
This cumulative dissertation contains four self-contained chapters on stochastic games and learning in intertemporal choice.
Chapter 1 presents an experiment on value learning in a setting where actions have both immediate and delayed consequences. Subjects make a series of choices between abstract options, with values that have to be learned by sampling. Each option is associated with two payoff components: One is revealed immediately after the choice, the other with one round delay. Objectively, both payoff components are equally important, but most subjects systematically underreact to the delayed consequences. The resulting behavior appears impatient or myopic. However, there is no inherent reason to discount: All rewards are paid simultaneously, after the experiment. Elicited beliefs on the value of options are in accordance with choice behavior. These results demonstrate that revealed impatience may arise from frictions in learning, and that discounting does not necessarily reflect deep time preferences. In a treatment variation, subjects first learn passively from the evidence generated by others, before then making a series of own choices. Here, the underweighting of delayed consequences is attenuated, in particular for the earliest own decisions. Active decision making thus seems to play an important role in the emergence of the observed bias.
Chapter 2 introduces and proves existence of Markov quantal response equilibrium (QRE), an application of QRE to finite discounted stochastic games. We then study a specific case, logit Markov QRE, which arises when players react to total discounted payoffs using the logit choice rule with precision parameter λ. We show that the set of logit Markov QRE always contains a smooth path that leads from the unique QRE at λ = 0 to a stationary equilibrium of the game as λ goes to infinity. Following this path allows one to solve arbitrary finite discounted stochastic games numerically; an implementation of this algorithm is publicly available as part of the package sgamesolver. We further show that all logit Markov QRE are ε-equilibria, with a bound for ε that is independent of the payoff function of the game and decreases hyperbolically in λ. Finally, we establish a link to reinforcement learning, by characterizing logit Markov QRE as the stationary points of a game dynamic that arises when all players follow the well-established reinforcement learning algorithm expected SARSA.
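The logit choice rule underlying logit Markov QRE can be sketched in a few lines. This is a generic illustration with made-up payoff values, not code from sgamesolver:

```python
import numpy as np

def logit_response(payoffs, lam):
    """Logit choice probabilities with precision parameter lam.

    lam = 0 yields uniform randomization over actions; as lam grows,
    probability mass concentrates on the best response."""
    z = lam * np.asarray(payoffs, dtype=float)
    z -= z.max()  # subtract the max for numerical stability
    p = np.exp(z)
    return p / p.sum()

# Hypothetical total discounted payoffs of three actions
v = [1.0, 2.0, 1.5]
print(logit_response(v, 0.0))   # uniform: [1/3, 1/3, 1/3]
print(logit_response(v, 10.0))  # nearly all mass on the second action
```

Tracing the equilibrium path described above amounts to following the fixed points of this map as λ is increased from 0.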
Chapter 3 introduces the logarithmic stochastic tracing procedure, a homotopy method to compute stationary equilibria for finite and discounted stochastic games. We build on the linear stochastic tracing procedure (Herings and Peeters 2004), but introduce logarithmic penalty terms as a regularization device, which brings two major improvements. First, the scope of the method is extended: it now has a convergence guarantee for all games of this class, rather than just generic ones. Second, by ensuring a smooth and interior solution path, computational performance is increased significantly. A ready-to-use implementation is publicly available. As demonstrated here, its speed compares quite favorably with other available algorithms, and it makes it possible to solve games of considerable size in reasonable time. Because the method involves the gradual transformation of a prior into equilibrium strategies, it is possible to search the prior space and uncover potentially multiple equilibria and their respective basins of attraction. This also connects the method to the established theory of equilibrium selection.
Chapter 4 introduces sgamesolver, a python package that uses the homotopy method to compute stationary equilibria of finite discounted stochastic games. A short user guide is complemented with discussion of the homotopy method, the two implemented homotopy functions logit Markov QRE and logarithmic tracing, and the predictor-corrector procedure and its implementation in sgamesolver. Basic and advanced use cases are demonstrated using several example games. Finally, we discuss the topic of symmetries in stochastic games.
The exchange industry has undergone a significant transformation over the past two decades, and not only in Germany. Exchanges have long since lost the character of earlier days, when their members haggled on the trading floor over blocks of shares and prices of domestic companies and a confidential club atmosphere prevailed at the cooperatively organized trading venues. Many exchanges have abolished floor trading, are themselves listed on an exchange and are primarily oriented toward shareholder value and thus toward the interests of an international shareholder base. By now there are exchanges that span several countries; the French-dominated Euronext has played a pioneering role here. Other exchanges, such as Deutsche Börse and the Swiss exchange, have merged their derivatives trading platforms across borders and, with their joint venture Eurex, created the world's largest derivatives exchange by turnover. More recently, transatlantic alliances between American and European exchanges have been contemplated. Both the strategy of Nasdaq, which so far holds a blocking minority of over 25% in the London Stock Exchange, and that of the New York Stock Exchange, which is seeking a merger with Euronext, attest to this. Moreover, exchanges now compete directly with their customers and former owners, the financial intermediaries such as banks and securities houses. They compete for investors' securities orders, since banks no longer automatically route every order to them. Instead, some financial intermediaries try to match incoming investor orders in-house against a corresponding reciprocal order, thereby retaining the security's bid-ask spread as profit.
This internalization of order execution has in recent years become an important source of income for securities houses, particularly in England and Germany. At the same time, exchanges are pushing ever further into business areas that used to be the domain of their customers. One example is the trading of certain credit derivative products that were previously traded over the counter between large securities houses; both the Chicago Mercantile Exchange and Eurex plan to trade these instruments on their own platforms. Another example is the vertical integration of securities settlement and custody services, where large international banks such as BNP Paribas, Citigroup and State Street fight exchanges for market share. How did the transformation described here come about? The decisive catalyst was the increased competitive pressure on traditional exchanges, which in many cases led to a restructuring of their organizational form and ownership structure. These newly oriented exchanges now saw themselves as regular, profit-oriented firms that were no longer primarily accountable to their customers but to their new owners, the shareholders. ...
This thesis is concerned with various aspects of estimating trend output and growth and discusses and evaluates methods to prepare medium-term GDP growth projections. Furthermore, econometric techniques suited for cross-correlated macroeconomic panel data, with a focus on factor models, are applied for unit root and cointegration testing as well as panel error correction estimation. Applications involve the identification of growth determinants as well as the modelling of aggregate labor supply in a multi-country framework. The first chapter evaluates a very popular method for potential output estimation and medium-term forecasting, the production function approach, in terms of predictive performance. For this purpose, a particular forecast evaluation framework is developed, and an evaluation of the predictions of GDP growth three to five years ahead is carried out for each individual G7 country. In chapter two, a new approach for estimating the trend growth of advanced economies is proposed. The suggested approach combines econometric methods that have been used to test and estimate the implications of the extended Solow growth model in a cross-sectional time series setting with an application of multivariate time series filter techniques. The last chapter discusses several panel unit root tests designed to accommodate cross-sectional dependence. These methods are then applied to an OECD country sample of the aggregate labor supply measure "hours worked".
We estimate a semiparametric single-risk discrete-time duration model to assess the effect of vocational training on the duration of unemployment spells. The data basis used in this study is the German Socio-Economic Panel (GSOEP) for West Germany for the period from 1986 to 1994. To take a possible selection bias into account, actual participation in vocational training is instrumented using estimates of a random-effects probit model for participation in qualification measures. Our main results show that training does have a significant short-term effect of reducing unemployment duration, but that this effect does not persist in the long run. JEL classifications: C41, J20, J64
On July 4, 2013 the ECB Governing Council provided more specific forward guidance than in the past by stating that it expects ECB interest rates to remain at present or lower levels for an extended period of time. As explained by ECB President Mario Draghi this expectation is based on the Council’s medium-term outlook for inflation conditional on economic activity and money and credit. Draghi also stressed that there is no precise deadline for this extended period of time, but that a reasonable period can be estimated by extracting a reaction function. In this note, we use such a reaction function, namely the interest rate rule from Orphanides and Wieland (2013) that matches past ECB interest rate decisions quite well, to project the rate path consistent with inflation and growth forecasts from the survey of professional forecasters published by the ECB on August 8, 2013. This evaluation suggests an increase in ECB interest rates by May 2014 at the latest. We also use the Eurosystem staff projection from June 6, 2013 for comparison. While it would imply a longer period of low rates, it does not match past ECB decisions as well as the reaction function with SPF forecasts.
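The mechanics of projecting a rate path from a reaction function can be illustrated with a stylized first-difference rule. The coefficients, reference values and forecast numbers below are illustrative assumptions, not the estimates of Orphanides and Wieland (2013) or the actual SPF figures:

```python
def rule_rate_change(pi_forecast, growth_forecast,
                     pi_target=2.0, growth_potential=1.5,
                     alpha=0.5, beta=0.5):
    """First-difference policy rule: the policy rate changes in response
    to deviations of forecast inflation and growth from reference values.
    All coefficients and reference values are illustrative assumptions."""
    return (alpha * (pi_forecast - pi_target)
            + beta * (growth_forecast - growth_potential))

# Hypothetical forecasts: inflation 1.5%, growth 1.1%
delta_i = rule_rate_change(1.5, 1.1)
print(round(delta_i, 2))  # -0.45: the rule points to a lower rate
```

Iterating such a rule forward on a sequence of forecasts traces out the implied rate path; the date at which the implied change turns positive marks the end of the "extended period" of low rates.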
In this study, we develop a technique for estimating a firm's expected cost of equity capital derived from analyst consensus forecasts and stock prices. Building on the work of Gebhardt/Lee/Swaminathan (2001) and Easton/Taylor/Shroff/Sougiannis (2002), our approach allows daily estimation, using only information publicly available at that date. We then estimate the expected cost of equity capital at the market, industry and individual firm level using historical German data from 1989-2002 and examine firm characteristics which are systematically related to these estimates. Finally, we demonstrate the applicability of the concept in a contemporary case study for DaimlerChrysler and the European automobile industry.
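The core idea, solving for the discount rate that reconciles a stock price with forecast payoffs, can be sketched with a simple bisection. The valuation equation and all numbers below are simplified assumptions for illustration, not the Gebhardt/Lee/Swaminathan or Easton et al. implementations:

```python
def implied_cost_of_equity(price, payoffs, lo=1e-4, hi=1.0, tol=1e-8):
    """Find the discount rate r that equates the present value of a
    stream of expected payoffs (last element treated as a perpetuity)
    with the current price. A stylized sketch of the implied-cost-of-
    capital idea; present value is decreasing in r, so bisection works."""
    def pv(r):
        v = sum(c / (1 + r) ** t for t, c in enumerate(payoffs[:-1], start=1))
        # terminal value: final payoff as a flat perpetuity, discounted back
        v += payoffs[-1] / r / (1 + r) ** (len(payoffs) - 1)
        return v
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if pv(mid) > price else (lo, mid)
    return (lo + hi) / 2

# Hypothetical: price 100, expected payoffs 5, 5.5, then 6 as a perpetuity
r = implied_cost_of_equity(100.0, [5.0, 5.5, 6.0])
print(round(r, 4))
```

Because only the current price and published forecasts enter the calculation, such an estimate can be refreshed daily, which is what makes the approach in the study attractive.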
We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process which is subject to noise and non-synchronous observations. The estimator is constructed based on a local average of block-wise parametric spectral covariance estimates. The latter originate from a local method of moments (LMM) which recently has been introduced by Bibinger et al. (2014). We extend the LMM estimator to allow for autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. We prove the consistency and asymptotic normality of the proposed spot covariance estimator. Based on extensive simulations we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data of a cross-section of NASDAQ blue chip stocks. Employing the estimator to estimate spot covariances, correlations and betas in normal but also extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously if new information arrives.
The authors propose a new method to forecast macroeconomic variables that combines two existing approaches to mixed-frequency data in DSGE models. The first approach estimates the DSGE model at a quarterly frequency and uses higher-frequency auxiliary data only for forecasting. The second transforms a quarterly state space into a monthly frequency. Their algorithm combines the advantages of these two approaches. They compare the new method with the existing ones using simulated and real-world data. With simulated data, the new method outperforms all other methods, including forecasts from the standard quarterly model. With real-world data, incorporating auxiliary variables as in their method substantially decreases forecasting errors for recessions, but casting the model in a monthly frequency delivers better forecasts in normal times.
Causality is a widely-used concept in theoretical and empirical economics. The recent financial economics literature has used Granger causality to detect the presence of contemporaneous links between financial institutions and, in turn, to obtain a network structure. Subsequent studies combined the estimated networks with traditional pricing or risk measurement models to improve their fit to empirical data. In this paper, we provide two contributions: we show how to use a linear factor model as a device for estimating a combination of several networks that monitor the links across variables from different viewpoints; and we demonstrate that Granger causality should be combined with quantile-based causality when the focus is on risk propagation. The empirical evidence supports the latter claim.
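The network-construction step described here, testing pairwise Granger causality and keeping the significant links as edges, can be sketched in a few lines. This is a simplified single-lag, bivariate version (the literature typically uses multivariate specifications and, as this paper argues, quantile-based causality as well); the data-generating process below is an assumption for illustration:

```python
import random

def ols(X, y):
    # Solve the normal equations (X'X) b = X'y by Gauss-Jordan elimination.
    k = len(X[0])
    A = [[sum(r[p] * r[q] for r in X) for q in range(k)]
         + [sum(r[p] * yi for r, yi in zip(X, y))] for p in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        for r in range(k):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    return [A[i][k] / A[i][i] for i in range(k)]

def rss(X, y, b):
    return sum((yi - sum(bj * xj for bj, xj in zip(b, row))) ** 2
               for row, yi in zip(X, y))

def granger_stat(x, y):
    # F-type statistic: does x[t-1] help predict y[t] beyond y[t-1]?
    Xr = [[1.0, y[t - 1]] for t in range(1, len(y))]
    Xu = [[1.0, y[t - 1], x[t - 1]] for t in range(1, len(y))]
    yy = y[1:]
    rss_r = rss(Xr, yy, ols(Xr, yy))
    rss_u = rss(Xu, yy, ols(Xu, yy))
    return (rss_r - rss_u) / (rss_u / (len(yy) - 3))

random.seed(0)
x, y = [0.0], [0.0]
for _ in range(500):
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.8 * x[-2] + random.gauss(0, 1))   # x Granger-causes y, not vice versa

s_xy = granger_stat(x, y)   # large: draw the edge x -> y
s_yx = granger_stat(y, x)   # small: no reverse edge
```

Thresholding such statistics across all institution pairs (e.g. against the 5% critical value of roughly 3.84 for one restriction) yields the adjacency matrix of the estimated network.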
Effort estimates are of utmost economic importance in software development projects. Estimates bridge the gap between managers and the invisible and almost artistic domain of developers, and give managers a means to track and control projects. Consequently, numerous estimation approaches have been developed over the past decades, starting with Allan Albrecht's Function Point Analysis in the late 1970s. However, this work neither tries to develop just another estimation approach, nor focuses on improving the accuracy of existing techniques. Instead of characterizing software development as a technological problem, this work understands software development as a sociological challenge. Consequently, it focuses on the question of what happens when developers are confronted with estimates as the major instrument of management control. Do estimates influence developers, or are developers unaffected by them? Is it irrational to expect that developers start to communicate and discuss estimates, conform to them, work strategically, or hide progress or delay? This study shows that it is inappropriate to assume independence of estimated and actual development effort. A theory is developed and tested that explains how developers and managers influence the relationship between estimated and actual development effort. The theory thereby elaborates the phenomenon of estimation fulfillment.
Markets are central to modern society, so their failures can have devastating effects. Here, we examine a prominent failure: price bubbles. We propose that bubbles are affected by ethnic homogeneity in the market and can be thwarted by diversity. Using experimental markets in Southeast Asia and North America, we find a marked difference: Market prices fit true values 58% better in diverse markets. In homogeneous markets, overpricing is higher and traders’ errors are more correlated than in diverse markets. The findings suggest that price bubbles arise not only from individual errors or financial conditions, but also from the social context of decision making. Informing public discussion, our findings suggest that diversity facilitates friction that enhances deliberation and upends conformity.
This working paper suggests analysing agencification as a double process of institutional and policy centralisation. To that end, it develops a categorisation of agencies that incorporates these two dimensions. More specifically, it is argued that mixed outcomes, where the levels of institutional and policy centralisation diverge, can be expected to be the rule rather than the exception, in line with the hybrid nature of EU agencies as in-betweeners. Moreover, the fiduciary setting hits important legal constraints given the limits to delegation in the EU context. Against this backdrop, a process whereby institutional centralisation develops incrementally and remains limited, yet is accompanied by a process of substantial policy centralisation, appears as the most promising path for EU agencification. A fiduciary setting, where a strong agency enjoys a high degree of independence and operates in a centralised policy space, by contrast, should be the exception. The comparative study of the process of agencification in the energy and banking sectors is insightful in the light of these expectations. The incremental nature of institutional change in energy exemplifies the usual path of agencification, which is conducive to a weak agency operating in a relatively centralised policy space. Agencification in banking, by contrast, has led to a rather unusual outcome where the strong agency model combines with a fragmented policy context.
EU financial integration: is there a 'Core Europe'? Evidence from a cluster-based approach
(2005)
Numerous recent studies, e.g. EU Commission (2004a), Baele et al. (2004), Adam et al. (2002), and the research pooled in ECB-CFS (2005) and Gaspar, Hartmann, and Sleijpen (2003), have documented progress in EU financial integration from a micro-level view. This paper contributes to this research by identifying groups of financially integrated countries from a holistic, macro-level view. It calculates cross-sectional dispersions, and innovates by applying an inter-temporal cluster analysis to eight euro area countries for the period 1995-2002. The indicators employed represent the money, government bond and credit markets. Our results show that euro countries were divided into two stable groups of financially more closely integrated countries in the pre-EMU period. Back then, geographic proximity and country size might have played a role. This situation has changed remarkably with the euro's introduction. EMU has led to a shake-up both in the number and composition of groups. The evidence puts a question mark behind using Germany as a benchmark in the post-EMU period. The findings suggest as well that financial integration takes place in waves: stable periods and periods of intense transition alternate. Based on the notion of 'maximum similarity', the results suggest that there exist 'maximum similarity barriers'. It takes extraordinary events, such as EMU, to push the degree of financial integration beyond these barriers. The research encourages policymakers to move forward courageously in the post-FSAP era, and provides comfort that the substantial differences between the current and potentially new euro states can be overcome. The analysis could be extended to the new EU member countries, to the global level, and to additional indicators.
This study provides a graphic overview on core legislation in the area of economic and financial services. The presentation essentially covers the areas within the responsibility of the Economic and Monetary Affairs Committee (ECON); hence it starts with core ECON areas but also displays neighbouring areas of other Committees' competences which are closely connected to and impacting on ECON's work. It shows legislation in force, proposals and other relevant provisions on banking, securities markets and investment firms, market infrastructure, insurance and occupational pensions, payment services, consumer protection in financial services, the European System of Financial Supervision, European Monetary Union, euro bills and coins and statistics, competition, taxation, commerce and company law, accounting and auditing. Moreover, it notes selected provisions that might become relevant in the upcoming Article 50 TEU negotiations.
In this study prepared for the ECON Committee of the European Parliament, Gellings, Jungbluth and Langenbucher present a graphic overview on core legislation in the area of economic and financial services in Europe. The mapping overview can serve as background for further deliberations. The study covers legislation in force, proposals and other relevant provisions in fourteen policy areas, i.e. banking, securities markets and investment firms, market infrastructure, insurance and occupational pensions, payment services, consumer protection in financial services, the European System of Financial Supervision, European Monetary Union, Euro bills and Coins and statistics, competition, taxation, commerce and company law, accounting and auditing.
The global financial crisis (as well as the European sovereign debt crisis) has led to a substantial redesign of rules and institutions – aiming in particular at underwriting financial stability. At the same time, the crisis generated a renewed interest in properly appraising systemic financial vulnerabilities. Employing the most recent data and applying a variety of largely only recently developed methods, we provide an assessment of indicators of financial stability within the Euro Area. Taking a “functional” approach, we analyze comprehensively all financial intermediary activities, regardless of the institutional roof – banks or non-bank (shadow) banks – under which they are conducted. Our results reveal a declining role of banks (and a commensurate increase in non-bank banking). These structural shifts (between institutions) are coincident with regulatory and supervisory reforms (implemented or firmly anticipated) as well as a non-standard monetary policy environment. They might, unintentionally, imply a rise in systemic risk. Overall, however, our analyses suggest that financial imbalances have been reduced over the course of recent years. Hence, the financial intermediation sector has become more resilient. Nonetheless, existing (equity) buffers would probably not suffice to face substantial volatility shocks.
Euro area shadow banking activities in a low-interest-rate environment: a flow-of-funds perspective
(2016)
Very low policy rates as well as the substantial redesign of rules and supervisory institutions have changed background conditions for the Euro Area’s financial intermediary sector substantially. Both policy initiatives have been targeted at improving societal welfare. And their potential side effects (or costs) have been discussed intensively, in academic as well as policy circles. Very low policy rates (and correspondingly low market rates) are likely to whet investors’ risk taking incentives. Concurrently, the tightened regulatory framework, in particular for banks, increases the comparative attractiveness of the less regulated, so-called shadow banking sector. Employing flow-of-funds data for the Euro Area’s non-bank banking sector we take stock of recent developments in this part of the financial sector. In addition, we examine to which extent low interest rates have had an impact on investment behavior. Our results reveal a declining role of banks (and, simultaneously, an increase in non-bank banking). Overall intermediation activity, hence, has remained roughly at the same level. Moreover, our findings also suggest that non-bank banks have tended to take positions in riskier assets (particularly in equities). In line with this observation, balance-sheet based risk measures indicate a rise in sector-specific risks in the non-bank banking sector (when narrowly defined).
Euro crash risk
(2015)
Using fiscal reaction functions for a panel of euro-area countries, the paper investigates whether euro membership has reduced the responsiveness of countries to increases in the level of inherited debt compared to the period prior to accession to the euro. While we find some evidence for such a loss in prudence, the results are not robust to changes in the specification, as for example an exclusion of Greece from the panel. This suggests that the current debt problems may result to a large extent from pre-existing debt levels prior to entry or from a larger need for fiscal prudence in a common currency, while an adverse change in the fiscal reaction functions for most countries does not apply.
The paper uses fiscal reaction functions for a panel of euro-area countries to investigate whether euro membership has reduced the responsiveness of countries to shocks in the level of inherited debt compared to the period prior to accession to the euro. While we find some evidence for such a loss in prudence, the results are not robust to changes in the specification, such as an exclusion of Greece from the panel. This suggests that the current debt problems may result to a large extent from pre-existing debt levels prior to entry or from a larger need for fiscal prudence in a common currency, while an adverse change in the fiscal reaction functions for most countries does not apply.
Euro nicht gefährdet
(2017)
Europa - wohin?
(2011)
According to the coronation theory of European monetary union, the euro was introduced in order to make the need for common governance in the European Union evident to all and thus to enable an orderly advance towards European integration. In the current phase, however, political opportunism appears to be driving integration.
A carte blanche for the central bank amounts, strictly speaking, to a declaration of bankruptcy by the democratic constitutional state in the face of technocratic arbitrariness, writes Helmut Siekmann in this signed contribution. He stresses that the European Union is an indispensable institution and should be a genuine federal state. It is, however, essentially (only) a legal construct, which makes it all the more important that the legal rules on which it rests are observed meticulously.
This note proposes a new concept for a European deposit insurance scheme that takes account of the strong political reservations against mutualising liability for bank deposits. The three-tier deposit insurance model sketched here carries forward existing national deposit insurance schemes, offers a European loss compensation mechanism, and prevents excessive risk-taking at the expense of the international community.
This chapter discusses whether and how 'new quantitative trade models' (NQTMs) can be fruitfully applied to quantify the welfare effects of trade liberalization, thus shedding light on the trade-related effects of further European integration. On the one hand, it argues that NQTMs have indeed the potential of being used to supplement traditional 'computable general equilibrium' (CGE) analysis thanks to their tight connection between theory and data, appealing micro-theoretical foundations, and enhanced attention to the estimation of structural parameters. On the other hand, further work is still needed in order to fully exploit such potential.
The SVB case is a wake-up call for Europe’s regulators as it demonstrates the destructive power of a bank run: it undermines the role of loss-absorbing capital, elbowing governments to bail out affected banks. Many types of bank management weaknesses, like excessive duration risk, may raise concerns of bank losses – but to serve as a run trigger, there needs to be a large enough group of bank depositors that fails to be fully covered by a deposit insurance scheme. Latent run risk is the root cause of inefficient liquidations, and we argue that a run on SVB assets could have been avoided altogether by a more thoughtful deposit insurance scheme, sharply distinguishing between loss-absorbing capital (equity plus bail-in debt) and other liabilities which are deemed not to be bail-inable, namely demand deposits. These evidence-based insights have direct implications for Europe’s banking regulation, suggesting a minimum and a maximum for a bank’s loss absorption capacity.
Asset-backed securitisation (ABS) is an asset funding technique that involves the issuance of structured claims on the cash flow performance of a designated pool of underlying receivables. Efficient risk management and asset allocation in this growing segment of fixed income markets requires both investors and issuers to thoroughly understand the longitudinal properties of spread prices. We present a multi-factor GARCH process in order to model the heteroskedasticity of secondary market spreads for valuation and forecasting purposes. In particular, accounting for the variance of errors is instrumental in deriving more accurate estimators of time-varying forecast confidence intervals. On the basis of CDO, MBS and Pfandbrief transactions as the most important asset classes of off-balance sheet and on-balance sheet securitisation in Europe, we find that expected spread changes for these asset classes tend to be level-stationary, with model estimates indicating asymmetric mean reversion. Furthermore, spread volatility (conditional variance) is found to follow an asymmetric stochastic process contingent on the value of past residuals. This ABS spread behaviour implies negative investor sentiment during cyclical downturns, which is likely to escape stationary approximation the longer this market situation lasts.
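The mechanics behind such conditional-variance modelling can be sketched with a plain GARCH(1,1) recursion (the paper uses a richer multi-factor, asymmetric specification; the parameter values and spread-change series below are purely illustrative):

```python
def garch11_path(returns, omega, alpha, beta):
    """Conditional-variance recursion h_t = omega + alpha*r_{t-1}**2 + beta*h_{t-1},
    started at the unconditional variance omega / (1 - alpha - beta)."""
    h = [omega / (1 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

# spread-change series with a volatile episode in the middle (illustrative numbers)
spreads = [0.0] * 50 + [2.0, -2.0, 2.0] + [0.0] * 50
h = garch11_path(spreads, omega=0.1, alpha=0.1, beta=0.8)
```

The fitted variance path spikes after the shock episode and then decays geometrically back toward the unconditional level; plugging h_t into a normal quantile gives the time-varying forecast confidence intervals the abstract refers to.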
Evaluating the quality of credit portfolio risk models is an important issue for both banks and regulators. Lopez and Saidenberg (2000) suggest cross-sectional resampling techniques in order to make efficient use of available data. We show that their proposal disregards cross-sectional dependence in resampled portfolios, which renders standard statistical inference invalid. We proceed by suggesting the Berkowitz (1999) procedure, which relies on standard likelihood ratio tests performed on transformed default data. We simulate the power of this approach in various settings including one in which the test is extended to incorporate cross-sectional information. To compare the predictive ability of alternative models, we propose to use either Bonferroni bounds or the likelihood-ratio of the two models. Monte Carlo simulations show that a default history of ten years can be sufficient to resolve uncertainties currently present in credit risk modeling.
Evaluating the quality of credit portfolio risk models is an important question for both banks and regulators. Lopez and Saidenberg (2000) suggest cross-sectional resampling techniques in order to make efficient use of available data and to produce measures of forecast accuracy. We first show that their proposal disregards cross-sectional dependence in simulated subportfolios, which renders standard statistical inference invalid. We proceed by suggesting another evaluation methodology which draws on the concept of likelihood ratio tests. Specifically, we compare the predictive quality of alternative models by comparing the probabilities that observed data have been generated by these models. The distribution of the test statistic can be derived through Monte Carlo simulation. To exploit differences in cross-sectional predictions of alternative models, the test can be based on a linear combination of subportfolio statistics. In the construction of the test, the weight of a subportfolio depends on the difference in the loss distributions which alternative models predict for this particular portfolio. This makes efficient use of the data, and reduces computational burden. Monte Carlo simulations suggest that the power of the tests is satisfactory.
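The likelihood-ratio idea, prefer the model under which the observed default history is more probable, can be sketched as follows. This is a simplified stand-in, not the authors' test: it assumes independent binomial defaults and ignores the cross-sectional dependence the paper is centrally concerned with; the default history and candidate PDs are hypothetical:

```python
from math import comb, log

def binom_loglik(default_counts, n_obligors, pd):
    # Log-likelihood of yearly default counts under i.i.d. binomial defaults.
    return sum(log(comb(n_obligors, d)) + d * log(pd) + (n_obligors - d) * log(1 - pd)
               for d in default_counts)

defaults = [2, 3, 1, 4, 2, 3, 2, 5, 1, 2]    # ten years, 100-name portfolio (hypothetical)
lr = binom_loglik(defaults, 100, 0.025) - binom_loglik(defaults, 100, 0.10)
```

A positive log-likelihood ratio favours the first model (PD of 2.5%, close to the realized rate) over the second (10%); in the paper the null distribution of such a statistic is obtained by Monte Carlo simulation rather than by asymptotic approximation.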
JEL classification: G2; G28; C52
We compare the cost effectiveness of two pronatalist policies: (a) child allowances; and (b) daycare subsidies. We pay special attention to estimating how intended fertility (fertility before children are born) responds to these policies. We use two evaluation tools: (i) a dynamic model of fertility, labor supply, outsourced childcare time, parental time, asset accumulation and consumption; and (ii) randomized vignette-survey policy experiments. We implement both tools in the United States and Germany, finding consistent evidence that daycare subsidies are more cost effective. Nevertheless, the required public expenditure to increase fertility to the replacement level might be viewed as prohibitively high.
Under a new Basel capital accord, bank regulators might use quantitative measures when evaluating the eligibility of internal credit rating systems for the internal ratings based approach. Based on data from Deutsche Bundesbank and using a simulation approach, we find that it is possible to identify strongly inferior rating systems out-of-time based on statistics that measure either the quality of ranking borrowers from good to bad, or the quality of individual default probability forecasts. Banks do not significantly improve system quality if they use credit scores instead of ratings, or logistic regression default probability estimates instead of historical data. Banks that are not able to discriminate between high- and low-risk borrowers increase their average capital requirements due to the concavity of the capital requirements function.
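The "quality of ranking borrowers from good to bad" is commonly summarized by the AUC, or equivalently the accuracy ratio (Gini coefficient). A minimal sketch, computing the AUC directly from its pairwise-comparison definition (the score values below are hypothetical):

```python
def auc(defaulter_scores, survivor_scores):
    # Probability that a randomly drawn defaulter has a worse (higher-risk)
    # score than a randomly drawn survivor; ties count half.
    wins = ties = 0
    for d in defaulter_scores:
        for s in survivor_scores:
            if d > s:
                wins += 1
            elif d == s:
                ties += 1
    return (wins + 0.5 * ties) / (len(defaulter_scores) * len(survivor_scores))

def accuracy_ratio(defaulter_scores, survivor_scores):
    # Gini / accuracy ratio: 0 = no discrimination, 1 = perfect ranking.
    return 2 * auc(defaulter_scores, survivor_scores) - 1
```

For example, a system assigning risk scores 3 and 4 to the defaulters and 1 and 2 to the survivors ranks perfectly (AUC = 1, accuracy ratio = 1), while identical score distributions give an accuracy ratio of 0.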
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed’s Greenbook projections. Point forecasts of some models are of similar accuracy as the forecasts of nonstructural large dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections for medium term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
This paper investigates the accuracy of forecasts from four DSGE models for inflation, output growth and the federal funds rate using a real-time dataset synchronized with the Fed’s Greenbook projections. Conditioning the model forecasts on the Greenbook nowcasts leads to forecasts that are as accurate as the Greenbook projections for output growth and the federal funds rate. Only for inflation are the model forecasts dominated by the Greenbook projections. A comparison with forecasts from Bayesian VARs shows that the economic structure of the DSGE models, which is useful for the interpretation of forecasts, does not lower the accuracy of forecasts. Combining forecasts of several DSGE models increases precision in comparison to individual model forecasts. Comparing density forecasts with the actual distribution of observations shows that DSGE models overestimate uncertainty around point forecasts.
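Why a simple average of model forecasts can beat each individual model is a diversification effect: offsetting errors cancel. A toy sketch with invented numbers (deliberately constructed with offsetting errors, so the gain is exaggerated relative to real forecast combinations):

```python
from math import sqrt

def rmse(forecasts, actuals):
    return sqrt(sum((f - a) ** 2 for f, a in zip(forecasts, actuals)) / len(actuals))

actual  = [2.0, 2.5, 1.5, 3.0]                     # illustrative outcomes
model_a = [2.4, 2.1, 1.9, 2.6]                     # two models whose errors
model_b = [1.6, 2.9, 1.1, 3.4]                     # point in opposite directions
pooled  = [(a + b) / 2 for a, b in zip(model_a, model_b)]
```

Here rmse(pooled, actual) is below the RMSE of either model; in practice model errors are only partially offsetting, so the combination gain is smaller but, as the papers above find, still systematic.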
This paper evaluates the effects of Public Sponsored Training in East Germany in the context of reiterated treatments. Selection bias based on observed characteristics is corrected for by applying kernel matching based on the propensity score. We control for further selection and the presence of Ashenfelter's Dip before the program with conditional difference-in-differences estimators. Training as a first treatment shows insignificant effects on the transition rates. The effect of program sequences and the incremental effect of a second program on the reemployment probability are insignificant. However, the incremental effect on the probability to remain employed is slightly positive. JEL classification: H43, C23, J6, J64, C14
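The matching logic behind such evaluations can be sketched in its simplest form: pair each treated unit with its closest control and average the outcome gaps. This is one-to-one nearest-neighbour matching rather than the kernel matching used in the study, and the data are hypothetical; in the study the matching variable is the estimated propensity score:

```python
def att_one_to_one(treated, controls):
    """treated/controls: lists of (covariate, outcome) pairs. ATT via
    nearest-neighbour matching on the covariate (in the study, this role
    is played by the estimated propensity score)."""
    gaps = []
    for x_t, y_t in treated:
        _, y_c = min(controls, key=lambda c: abs(c[0] - x_t))
        gaps.append(y_t - y_c)
    return sum(gaps) / len(gaps)

treated  = [(0.6, 5.0), (0.7, 6.0)]            # (propensity, outcome), hypothetical
controls = [(0.58, 4.0), (0.71, 4.5), (0.1, 0.0)]
att = att_one_to_one(treated, controls)
```

Kernel matching replaces the single nearest neighbour with a weighted average over all controls, with weights declining in the propensity-score distance, which lowers variance at the cost of some bias.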
We develop a novel empirical approach to identify the effectiveness of policies against a pandemic. The essence of our approach is the insight that epidemic dynamics are best tracked over stages, rather than over time. We use a normalization procedure that makes the pre-policy paths of the epidemic identical across regions. The procedure uncovers regional variation in the stage of the epidemic at the time of policy implementation. This variation delivers clean identification of the policy effect based on the epidemic path of a leading region that serves as a counterfactual for other regions. We apply our method to evaluate the effectiveness of the nationwide stay-home policy enacted in Spain against the Covid-19 pandemic. We find that the policy saved 15.9% of lives relative to the number of deaths that would have occurred had it not been for the policy intervention. Its effectiveness evolves with the epidemic and is larger when implemented at earlier stages.
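The normalization step, re-indexing each region's epidemic path by stage rather than calendar time, can be sketched as follows. The alignment rule (first day a common cumulative-death threshold is reached) is one simple implementation choice; the toy series are invented:

```python
def align_by_stage(deaths_by_region, threshold):
    # Re-index each cumulative series so that "stage 0" is the first day the
    # region reaches `threshold` deaths; regions never reaching it are dropped.
    aligned = {}
    for region, series in deaths_by_region.items():
        for t, v in enumerate(series):
            if v >= threshold:
                aligned[region] = series[t:]
                break
    return aligned

paths = {"A": [0, 1, 5, 9], "B": [0, 0, 1, 5, 9], "C": [0, 0, 0]}
stages = align_by_stage(paths, threshold=1)
```

After alignment, regions A and B follow identical stage-indexed paths even though their outbreaks started on different calendar days; a policy enacted at different stages across such regions is then identified by comparing post-policy deviations from the leading region's path.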
Innovations are a key factor to ensure the competitiveness of establishments as well as to enhance the growth and wealth of nations. But more than any other economic activity, decisions about innovations are plagued by failures of the market mechanism. As a response, public instruments have been implemented to stimulate private innovation activities. The effectiveness of these measures, however, is ambiguous and calls for an empirical evaluation. In this paper we make use of the IAB Establishment Panel and apply various microeconometric methods to estimate the effect of public measures on innovation activities of German establishments. We find that neglecting sample selection due to observable as well as to unobservable characteristics leads to an overestimation of the treatment effect and that there are considerable differences with regard to size class and between West and East German establishments.
This study evaluates the effects of job creation schemes (Arbeitsbeschaffungsmaßnahmen, ABM) in Germany on participants' individual probabilities of integration into regular employment. The analysis draws on a large and informative dataset from the administrative sources of the Federal Employment Agency (Bundesagentur für Arbeit, BA), which makes it possible to examine programme effects differentiated by participants' individual characteristics and with regard to the heterogeneous structure of the labour market. The dataset contains information on all ABM participants who started their programmes in February 2000 and on a control group of non-participants who were unemployed in January 2000 and did not enter the programmes in February 2000. Using information from the employment statistics, it is possible for the first time to study transitions into regular employment on the basis of administrative data. The observation window extends to December 2002. Using matching methods based on the potential-outcomes approach, the effects of ABM are estimated with regional differentiation and for particular problem and target groups of the labour market. Although the results show clear differences in the effects across subgroups, the empirical findings indicate overall that the goal of integration into regular, unsubsidised employment was largely not achieved by ABM. JEL: C40, C13, J64, H43, J68
Persistently high unemployment, tight government budgets and the growing scepticism regarding the effects of active labour market policies (ALMP) are the basis for a growing interest in evaluating these measures. This paper intends to explain the need for evaluation on the micro- and macroeconomic level, introduce the fundamental evaluation problem and solutions to it, give an overview of the newer developments in evaluation literature and finally take a look on empirical estimations of ALMP effects. JEL Classification: C14, C33, H43, J64, J68
We develop a methodology to identify and rank “systemically important financial institutions” (SIFIs). Our approach is consistent with that followed by the Financial Stability Board (FSB) but, unlike the latter, it is free of judgment and it is based entirely on publicly available data, thus filling the gap between the official views of the regulator and those that market participants can form with their own information set. We apply the methodology to annual data on three samples of banks (global, EU and euro area) for the years 2007-2012. We examine the evolution of the SIFIs over time and document the shifts in the relative weights of the major geographic areas. We also discuss the implication of the 2013 update of the identification methodology proposed by the FSB.
Prior studies indicate the protective role of Ultraviolet-B (UVB) radiation in human health, mediated by vitamin D synthesis. In this observational study, we empirically outline a negative association of UVB radiation as measured by ultraviolet index (UVI) with the number of COVID-19 deaths. We apply a fixed-effect log-linear regression model to a panel dataset of 152 countries over 108 days (n = 6524). We use the cumulative number of COVID-19 deaths and case-fatality rate (CFR) as the main dependent variables and isolate the UVI effect from potential confounding factors. After controlling for time-constant and time-varying factors, we find that a permanent unit increase in UVI is associated with a 1.2 percentage points decline in daily growth rates of cumulative COVID-19 deaths [p < 0.01] and a 1.0 percentage points decline in the CFR daily growth rate [p < 0.05]. These results represent a significant percentage reduction in terms of daily growth rates of cumulative COVID-19 deaths (− 12%) and CFR (− 38%). We find a significant negative association between UVI and COVID-19 deaths, indicating evidence of the protective role of UVB in mitigating COVID-19 deaths. If confirmed via clinical studies, then the possibility of mitigating COVID-19 deaths via sensible sunlight exposure or vitamin D intervention would be very attractive.
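The fixed-effect estimation behind such panel results can be sketched with the classic within transformation: demeaning regressor and outcome inside each unit removes time-constant country effects before pooled OLS. This is a bare-bones sketch with invented numbers; the study additionally includes time-varying controls and works with log growth rates:

```python
def within_slope(panel):
    """panel: {unit: [(x, y), ...]}. Within (fixed-effects) estimator:
    demean x and y inside each unit, then take the pooled OLS slope
    on the demeaned data."""
    xs, ys = [], []
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        xs.extend(x - mx for x, _ in obs)
        ys.extend(y - my for _, y in obs)
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# y = 2*x + unit-specific level: the level shifts cancel, the slope survives
panel = {"A": [(1, 12), (2, 14), (3, 16)], "B": [(1, -3), (2, -1)]}
```

Because each unit's mean absorbs its fixed effect, the recovered slope is exactly the common within-unit coefficient (here 2), regardless of how far apart the unit levels sit.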
The use of evidence and economic analysis in policymaking is on the rise, and accounting standard setting and financial regulation are no exception. This article discusses the promise of evidence-based policymaking in accounting and financial markets as well as the challenges and opportunities for research supporting this endeavor. In principle, using sound theory and robust empirical evidence should lead to better policies and regulations. But despite its obvious appeal and substantial promise, evidence-based policymaking is easier demanded than done. It faces many challenges related to the difficulty of providing relevant causal evidence, lack of data, the reliability of published research, and the transmission of research findings. Overcoming these challenges requires substantial infrastructure investments for generating and disseminating relevant research. To illustrate this point, I draw parallels to the rise of evidence-based medicine. The article provides several concrete suggestions for the research process and the aggregation of research findings if scientific evidence is to inform policymaking. I discuss how policymakers can foster and support policy-relevant research, chiefly by providing and generating data. The article also points to potential pitfalls when research becomes increasingly policy-oriented.
Quantum game theory is a mathematical and conceptual extension of classical game theory. The space of all conceivable decision paths of the players is extended from the purely real, measurable space into the space of complex numbers (real and imaginary numbers). Through the concept of possible quantum-theoretical entanglement of decision paths in the imaginary space of all conceivable quantum strategies, shared lines of thinking that arise from cultural or moral norms can be incorporated. If the entanglement of the players' strategies in the imaginary space of conceivable decision paths is sufficiently large, additional Nash equilibria can appear and previously existing dominant strategies can dissolve. In evolutionary quantum game theory, the replicator dynamics underlying evolutionary development has a more complex structure, and the respective evolutionarily stable strategies can change depending on the degree of entanglement. In addition to a detailed presentation of evolutionary quantum game theory, this dissertation discusses several applications: a quantum-theoretical extension is used to shed light on the recent financial crisis by means of an anti-coordination game, to explain the differing publication behavior of scientists, and to present first approaches toward an experimental confirmation of the theory.
For the academic audience, this paper presents the outcome of a well-identified, large change in the monetary policy rule from the lens of a standard New Keynesian model and asks whether the model properly captures the effects. For policymakers, it presents a cautionary tale of the dismal effects of ignoring basic macroeconomics. The Turkish monetary policy experiment of the past decade, stemming from a belief of the government that higher interest rates cause higher inflation, provides an unfortunately clean exogenous variance in the policy rule. The mandate to keep rates low, and the frequent policymaker turnover orchestrated by the government to enforce this, led to the Taylor principle not being satisfied and eventually a negative coefficient on inflation in the policy rule. In such an environment, was the exchange rate still a random walk? Was inflation anchored? Does the “standard model” suffice to explain the broad contours of macroeconomic outcomes in an emerging economy with large identifying variance in the policy rule? There are no surprises for students of open-economy macroeconomics; the answers are no, no, and yes.
We investigate the suggested substitutive relation between executive compensation and the disciplinary threat of takeover imposed by the market for corporate control. We complement other empirical studies on managerial compensation and corporate control mechanisms in three distinct ways. First, we concentrate on firms in the oil industry for which agency problems were especially severe in the 1980s. Due to the extensive generation of excess cash flow, product and factor market discipline was ineffective. Second, we obtain a unique data set drawn directly from proxy statements which accounts not only for salary and bonus but for the value of all stock-market based compensation held in the portfolio of a CEO. Our data set consists of 51 firms in the U.S. oil industry from 1977 to 1994. Third, we employ ex ante measures of the threat of takeover at the individual firm level which are superior to ex post measures like actual takeover occurrence or past incidence of takeovers in an industry. Results show that annual compensation and, to a much higher degree, stock-based managerial compensation increase after a firm becomes protected from a hostile takeover. However, clear-cut evidence that CEOs of protected firms receive higher compensation than those of firms considered susceptible to a takeover cannot be found.
We develop a model of managerial compensation structure and asset risk choice. The model provides predictions about how inside debt features affect the relation between credit spreads and compensation components. First, inside debt reduces credit spreads only if it is unsecured. Second, inside debt exerts important indirect effects on the role of equity incentives: when inside debt is large and unsecured, equity incentives increase credit spreads; when inside debt is small or secured, this effect is weakened or reversed. We test our model on a sample of U.S. public firms with traded CDS contracts, finding evidence supportive of our predictions. To alleviate endogeneity concerns, we also show that our results are robust to using an instrumental variable approach.
Exit strategies
(2014)
We study alternative scenarios for exiting the post-crisis fiscal and monetary accommodation using a macromodel in which banks choose their capital structure and are subject to runs. Under a Taylor rule, the post-crisis interest rate hits the zero lower bound (ZLB) and remains there for several years. In that condition, pre-announced and fast fiscal consolidations dominate alternative strategies incorporating various degrees of gradualism and surprise, based on output and inflation performance and bank stability. We also examine an alternative monetary strategy in which the interest rate does not reach the ZLB; the benefits from fiscal consolidation persist, but are more nuanced.
Expectations on others
(2017)
This paper explores the interplay of feature-based explainable AI (XAI) techniques, information processing, and human beliefs. Using a novel experimental protocol, we study the impact of providing users with explanations about how an AI system weighs inputted information to produce individual predictions (LIME) on users’ weighting of information and beliefs about the task-relevance of information. On the one hand, we find that feature-based explanations cause users to alter their mental weighting of available information according to observed explanations. On the other hand, explanations lead to asymmetric belief adjustments that we interpret as a manifestation of the confirmation bias. Trust in the prediction accuracy plays an important moderating role for XAI-enabled belief adjustments. Our results show that feature-based XAI does not only superficially influence decisions but really changes internal cognitive processes, bearing the potential to manipulate human beliefs and reinforce stereotypes. Hence, the current regulatory efforts that aim at enhancing algorithmic transparency may benefit from going hand in hand with measures ensuring the exclusion of sensitive personal information in XAI systems. Overall, our findings put assertions that XAI is the silver bullet solving all of AI systems’ (black box) problems into perspective.
When requesting a web-based service, users often fail to set the website’s privacy settings according to their own privacy preferences. Being overwhelmed by the number of options, lacking knowledge of the related technologies, or being unaware of their own privacy preferences are just some of the reasons why users struggle. To address these problems, privacy setting prediction tools are particularly well suited: they aim to lower the burden of setting privacy preferences in line with users’ actual preferences. To comply with the increased demand for explainability and interpretability arising from regulatory obligations, such as the General Data Protection Regulation (GDPR) in Europe, this paper introduces an explainable model for default privacy setting prediction. Compared to previous work, we present an improved feature selection, increased interpretability of each step in the model design, and enhanced evaluation metrics to better identify weaknesses in the model’s design before it goes into production. As a result, we aim to provide an explainable and transparent tool for default privacy setting prediction which users easily understand and are therefore more likely to use.
A number of recent studies have concluded that consumer spending patterns over the month are closely linked to the timing of income receipt. This correlation is interpreted as evidence of hyperbolic discounting. I re-examine patterns of spending in the diary sample of the U.S. Consumer Expenditure Survey, incorporating information on the timing of the main consumption commitment for most households - their monthly rent or mortgage payment. I find that non-durable and food spending increase by 30-48% on the day housing payments are made, with smaller increases in the days after. Moreover, households with weekly, biweekly and monthly income streams but the same timing of rent/mortgage payments have very similar consumption patterns. Exploiting variation in income, I find that households with extra liquidity decrease non-durable spending around housing payments, especially those households with a large budget share of housing.
Market risks account for an integral part of life insurers' risk profiles. This paper explores the market risk sensitivities of insurers in two large life insurance markets, namely the U.S. and Europe. Based on panel regression models and daily market data from 2012 to 2018, we analyze the reaction of insurers' stock returns to changes in interest rates and CDS spreads of sovereign counterparties. We find that the influence of interest rate movements on stock returns is more than 50% larger for U.S. than for European life insurers. Falling interest rates reduce stock returns in particular for less solvent firms, insurers with a high share of life insurance reserves and unit-linked insurers. Moreover, life insurers' sensitivity to interest rate changes is seven times larger than their sensitivity towards CDS spreads. Only European insurers significantly suffer from rising CDS spreads, whereas U.S. insurers are immunized against increasing sovereign default probabilities.
Market risks account for an integral part of insurers' risk profiles. We explore market risk sensitivities of insurers in the United States and Europe. Based on panel regression models and daily market data from 2012 to 2018, we find that sensitivities are particularly driven by insurers' product portfolio. The influence of interest rate movements on stock returns is 60% larger for US than for European life insurers. For the former, interest rate risk is a dominant market risk with an effect that is five times larger than through corporate credit risk. For European life insurers, the sensitivity to interest rate changes is only 44% larger than toward credit default swap of government bonds, underlining the relevance of sovereign credit risk.
We survey a representative sample of US households to study how exposure to the COVID-19 stock market crash affects expectations and planned behavior. Wealth shocks are associated with upward adjustments of expectations about retirement age, desired working hours, and household debt, but have only small effects on expected spending. We provide correlational and experimental evidence that beliefs about the duration of the stock market recovery shape households’ expectations about their own wealth and their planned investment decisions and labor market activity. Our findings shed light on the implications of household exposure to stock market crashes for expectation formation.
Chen and Zadrozny (1998) developed the linear extended Yule-Walker (XYW) method for determining the parameters of a vector autoregressive (VAR) model with available covariances of mixed-frequency observations on the variables of the model. If the parameters are determined uniquely for available population covariances, then, the VAR model is identified. The present paper extends the original XYW method to an extended XYW method for determining all ARMA parameters of a vector autoregressive moving-average (VARMA) model with available covariances of single- or mixed-frequency observations on the variables of the model. The paper proves that under conditions of stationarity, regularity, miniphaseness, controllability, observability, and diagonalizability on the parameters of the model, the parameters are determined uniquely with available population covariances of single- or mixed-frequency observations on the variables of the model, so that the VARMA model is identified with the single- or mixed-frequency covariances.
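The classical Yule-Walker logic that the XYW method builds on can be illustrated for a simple single-frequency VAR(1): with population covariances Γ0 = E[x_t x_t'] and Γ1 = E[x_t x_{t-1}'], the autoregressive matrix satisfies Γ1 = A Γ0, so A = Γ1 Γ0⁻¹. The sketch below uses sample covariances from simulated data in place of population ones; the coefficient matrix is invented for the example, and the full XYW machinery for MA parameters and mixed frequencies is not attempted here.

```python
import numpy as np

rng = np.random.default_rng(1)

# A stationary VAR(1): x_t = A x_{t-1} + e_t (coefficients chosen for the example).
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
T = 50_000
x = np.zeros((T, 2))
eps = rng.normal(size=(T, 2))
for t in range(1, T):
    x[t] = A @ x[t - 1] + eps[t]

# Sample autocovariances Gamma0 = E[x_t x_t'] and Gamma1 = E[x_t x_{t-1}'].
gamma0 = x[1:].T @ x[1:] / (T - 1)
gamma1 = x[1:].T @ x[:-1] / (T - 1)

# Yule-Walker: Gamma1 = A Gamma0  =>  A = Gamma1 Gamma0^{-1}.
A_hat = gamma1 @ np.linalg.inv(gamma0)
print(np.round(A_hat, 2))
```

Identification in the sense of the abstract means exactly this: the mapping from covariances to parameters has a unique solution, here given in closed form.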
In a parsimonious regime switching model, we find strong evidence that expected consumption growth varies over time. Adding inflation as a second variable, we uncover two states in which expected consumption growth is low, one with high and one with negative expected inflation. Embedded in a general equilibrium asset pricing model with learning, these dynamics replicate the observed time variation in stock return volatilities and stock- bond return correlations. They also provide an alternative derivation for a measure of time-varying disaster risk suggested by Wachter (2013), implying that both the disaster and the long-run risk paradigm can be extended towards explaining movements in the stock-bond correlation.
Using a normalized CES function with factor-augmenting technical progress, we estimate a supply-side system of the US economy from 1953 to 1998. Avoiding potential estimation biases that have occurred in earlier studies and putting a high emphasis on the consistency of the data set required by the estimated system, we obtain robust results not only for the aggregate elasticity of substitution but also for the parameters of labor and capital augmenting technical change. We find that the elasticity of substitution is significantly below unity and that the growth rates of technical progress show an asymmetrical pattern where the growth of labor-augmenting technical progress is exponential, while that of capital is hyperbolic or logarithmic.
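A normalized CES production function with factor-augmenting technical progress is commonly written as follows (the notation here is generic and not necessarily the paper's own): output, inputs, and the distribution parameter are expressed relative to a normalization point, which fixes the units in which the elasticity of substitution σ is estimated.

```latex
Y_t = Y_0 \left[ \pi_0 \left( \frac{\Gamma^K_t K_t}{K_0} \right)^{\frac{\sigma-1}{\sigma}}
    + \left(1-\pi_0\right) \left( \frac{\Gamma^L_t L_t}{L_0} \right)^{\frac{\sigma-1}{\sigma}} \right]^{\frac{\sigma}{\sigma-1}}
```

Here $Y_0$, $K_0$, $L_0$ and $\pi_0$ are values at the normalization point, and $\Gamma^K_t$, $\Gamma^L_t$ are the capital- and labor-augmenting efficiency levels whose growth patterns the abstract describes; as $\sigma \to 1$ the function approaches Cobb-Douglas.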
A series of recent articles has called into question the validity of VAR models of the global market for crude oil. These studies seek to replace existing oil market models by structural VAR models of their own based on different data, different identifying assumptions, and a different econometric approach. Their main aim has been to revise the consensus in the literature that oil demand shocks are a more important determinant of oil price fluctuations than oil supply shocks. Substantial progress has been made in recent years in sorting out the pros and cons of the underlying econometric methodologies and data in this debate, and in separating claims that are supported by empirical evidence from claims that are not. The purpose of this paper is to take stock of the VAR literature on global oil markets and to synthesize what we have learned. Combining this evidence with new data and analysis, I make the case that the concerns regarding the existing VAR oil market literature have been overstated and that the results from these models are quite robust to changes in the model specification.
Mis-selling by banks has occurred repeatedly in many nations over the last decade. While clients may benefit from competition – enabling them to choose financial services at lowest costs – economic frictions between banks and clients may give rise to mis-selling. Examples of mis-selling are mis-representation of information, overly complex product design and non-customized advice. European regulators address the problem of mis-selling in the "Markets in Financial Instruments Directive" (MiFID) I and II and the "Markets in Financial Instruments Regulation" (MiFIR), by setting behavioral requirements for banks, regulating the compensation of employees, and imposing requirements on offered financial products and disclosure rules.
This paper argues that MiFID II protects clients but is not as effective as it could be. (1) It does not differentiate between client groups with different levels of financial literacy. Effective advice requires different advice for different client groups. (2) MiFID II uses too many rules and too many instruments to achieve identical goals and thereby generates excessive compliance costs. High compliance costs and low revenues would drive banks out of some segments of retail business.
A key solution for public good provision is the voluntary formation of institutions that commit players to cooperate. Such institutions generate inequality if some players decide not to participate but cannot be excluded from cooperation benefits. Prior research with small groups emphasizes the role of fairness concerns with positive effects on cooperation. We show that effects do not generalize to larger groups: if group size increases, groups are less willing to form institutions generating inequality. In contrast to smaller groups, however, this does not increase the number of participating players, thereby limiting the positive impact of institution formation on cooperation.
This paper presents three approaches to reforming the German system of family benefits (Familienleistungsausgleich, FLA), each in two variants, and systematically compares their fiscal effects and their impact on different segments of the income distribution. The most far-reaching concept, a basic child benefit (Kindergrundsicherung), proposes a subsistence-level, taxable child benefit of 502 euros or 454 euros per child per month; the existing child-related tax allowances and several social transfers could be abolished or reduced. In addition, the paper examines increases of the child benefit to a uniform 238 euros or 328 euros per child per month, which would reach all children, including those receiving SGB II benefits; the child benefit would remain untaxed, but the existing child-related tax allowances would be abolished. Finally, the paper discusses a substantial increase of the child supplement (Kinderzuschlag) with a reduced minimum income threshold and abolition of the maximum income threshold. This approach additionally provides either a further increased child supplement for single parents, analogous to the additional-needs supplement under SGB II (first variant), or a reduction of the transfer withdrawal rate on non-labor income from 100% to 70% (second variant). The expected fiscal burden of the simple child benefit increase (without taxation) can readily be extrapolated (16 or 35 billion euros per year), but the costs of the other reform models can hardly be estimated without a microanalytic foundation that takes the income distribution into account. While the gross costs of the basic child benefit can also be computed simply (multiplying the number of child-benefit-eligible children by the benefit amount), the aggregate savings in other social transfers and, in particular, the additional tax revenues that would have to be deducted from them are not obvious.
A first rough calculation shows that the net costs of the first, "generous" variant of the basic child benefit (502 euros) are roughly equal to those of raising the child benefit to the material subsistence minimum (322 euros) without taxation (on the order of 35 billion euros). A more precise quantification, however, is only possible on the basis of representative microdata and a simulation model, since the taxation effect of the basic child benefit in particular depends on the actual income distribution. A cost estimate for the child supplement reform likewise requires microanalytic methods; regardless, this reform, targeted at a limited income range, would clearly cause the lowest costs. For a systematic comparison of the distributional effects of the reform proposals, this paper presents model calculations for two selected household types. These show that the comparatively limited concept of expanding the child supplement could markedly improve the situation of families ranging from precarious income positions up to the lower middle class. The extent to which this effect would materialize, however, also depends on take-up behavior; so far, non-take-up of the child supplement and housing benefit is widespread. Moreover, the situation of the poorest families, who depend on SGB II benefits, would not improve, and at the upper end the relief effects of the FLA would, as a consequence of the child-related tax allowances, continue to increase with parental income.
Under the variants of the child benefit increase (without taxation), by contrast, the marked improvements relative to the status quo would be spread evenly across the income spectrum, from the low-wage segment (with the problematic effects of the child supplement, namely the income discontinuity at the maximum income threshold, unchanged) up to the upper strata, declining with rising income only at the very top (owing to the abolition of the existing child-related tax allowances). Support through the basic child benefit, finally, would be strongest in the lower and lower-middle ranges and, in contrast to the other concepts, would by design largely eliminate hidden poverty, virtually "automatically". In the upper-middle range and especially in higher strata, the transfers would instead decline continuously with increasing income. Overall, this would lead to a comparatively steadily rising profile of disposable family income; the frequently voiced fears of negative work incentives in the lower income range, caused by the high transfer withdrawal rates of the child supplement, possibly in combination with housing benefit, would be unfounded. The extent to which the reform concepts discussed here can reduce child and family poverty and the inequality of the personal income distribution cannot, however, be foreseen on the basis of model calculations alone. This requires detailed analyses based on representative microdata reflecting the actual income distribution, together with simulation calculations that quantify the effects of the reform variants, including the financing of their respective net costs.
This work is being carried out in the project "Vom Kindergeld zu einer Grundsicherung für Kinder" on the basis of data from the German Socio-Economic Panel (SOEP) 2007; the representativeness of the dataset with respect to the coverage of income types, in particular transfer types, has already been examined, with good results.
Social Security rules that determine retirement, spousal, and survivor benefits, along with benefit adjustments according to the age at which these are claimed, open up a complex set of financial options for household decisions. These rules influence optimal household asset allocation, insurance, and work decisions, subject to life cycle demographic shocks, such as marriage, divorce, and children. Our model-based research generates a wealth profile and a low and stable equity fraction consistent with empirical evidence. We confirm predictions that wives will claim retirement benefits earlier than husbands, while life insurance is mainly purchased by younger men. Our policy simulations imply that eliminating survivor benefits would sharply reduce claiming differences by sex while dramatically increasing men’s life insurance purchases.
The Federal Reserve’s muddled mandate to attain simultaneously the incompatible goals of maximum employment and price stability invites short-term-oriented discretionary policymaking inconsistent with the systematic approach needed for monetary policy to contribute best to the economy over time. Fear of liftoff—the reluctance to start the process of policy normalization after the end of a recession—serves as an example. Causes of the problem are discussed, drawing on public choice and cognitive psychology perspectives. The Federal Reserve could adopt a framework that relies on a simple policy rule subject to periodic reviews and adaptation. Replacing meeting-by-meeting discretion with a simple policy rule would eschew discretion in favor of systematic policy. Periodic review of the rule would allow the Federal Reserve the flexibility to account for and occasionally adapt to the evolving understanding of the economy. Congressional legislation could guide the Federal Reserve in this direction. However, the Federal Reserve may be best placed to select the simple rule and could embrace this improvement on its own, within its current mandate, with the publication of a simple rule along the lines of its statement of longer-run goals.
Are we in a new “Polanyian moment”? If we are, it is essential to examine how “spontaneous” and punctual expressions of discontent at the individual level may give rise to collective discourses driving social and political change. It is also important to examine whether and how the framing of these discourses may vary across political economies. This paper contributes to this endeavor with the analysis of anti-finance discourses on Twitter in France, Germany, Italy, Spain and the UK between 2019 and 2020. This paper presents three main findings. First, the analysis shows that, more than ten years after the financial crisis, finance is still a strong catalyzer of political discontent. Second, it shows that there are important variations in the dominant framing of public anti-finance discourses on social media across European political economies. While the antagonistic “us versus them” is prominent in all the cases, the identification of who “us” and “them” are varies significantly. Third, it shows that the presence of far-right tropes in the critique of finance varies greatly, from virtually nonexistent to a solid minority of statements.
Central banks sometimes evaluate their own policies. To assess the inherent conflict of interest, the authors compare the research findings of central bank researchers and academic economists regarding the macroeconomic effects of quantitative easing (QE). They find that central bank papers report larger effects of QE on output and inflation. Central bankers are also more likely to report significant effects of QE on output and to use more positive language in the abstract. Central bankers who report larger QE effects on output experience more favorable career outcomes. A survey of central banks reveals substantial involvement of bank management in research production.
Fabo, Jančoková, Kempf, and Pástor (2021) show that papers written by central bank researchers find quantitative easing (QE) to be more effective than papers written by academics. Weale and Wieladek (2022) show that a subset of these results lose statistical significance when OLS regressions are replaced by regressions that downweight outliers. We examine those outliers and find no reason to downweight them. Most of them represent estimates from influential central bank papers published in respectable academic journals. For example, among the five papers finding the largest peak effect of QE on output, all five are published in high-quality journals (Journal of Monetary Economics, Journal of Money, Credit and Banking, and Applied Economics Letters), and their average number of citations is well over 200. Moreover, we show that these papers have supported policy communication by the world’s leading central banks and shaped the public perception of the effectiveness of QE. New evidence based on quantile regressions further supports the results in Fabo et al. (2021).
The paper analyses the contagion channels of the European financial system through the stochastic block model (SBM). The model groups homogeneous connectivity patterns among the financial institutions and describes the shock transmission mechanisms of the financial networks in a compact way. We analyse the global financial crisis and European sovereign debt crisis and show that the network exhibits a strong community structure with two main blocks acting as shock spreader and receiver, respectively. Moreover, we provide evidence of the prominent role played by insurances in the spread of systemic risk in both crises. Finally, we demonstrate that policy interventions focused on institutions with inter-community linkages (community bridges) are more effective than the ones based on the classical connectedness measures and represents consequently, a better early warning indicator in predicting future financial losses.
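The two-block community structure the SBM uncovers can be illustrated on a toy network: generate an assortative two-community graph and recover the partition from the sign of the second eigenvector of the adjacency matrix. This spectral shortcut is a stand-in for the likelihood-based SBM estimation used in the paper, and all sizes and connection probabilities below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two planted blocks of 60 nodes; dense within blocks, sparse between.
n = 60
labels = np.repeat([0, 1], n)
p_in, p_out = 0.5, 0.05
probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)

# Symmetric 0/1 adjacency matrix without self-loops.
upper = np.triu(rng.random((2 * n, 2 * n)) < probs, k=1)
adj = (upper | upper.T).astype(float)

# Spectral partition: sign pattern of the eigenvector belonging to the
# second-largest eigenvalue (eigh returns eigenvalues in ascending order).
eigvals, eigvecs = np.linalg.eigh(adj)
partition = (eigvecs[:, -2] > 0).astype(int)

# Recovered communities should match the planted ones up to relabeling.
accuracy = max(np.mean(partition == labels), np.mean(partition != labels))
print(accuracy)
```

With a signal this strong the planted partition is recovered almost exactly; the paper's analysis additionally interprets the recovered blocks as shock spreaders and receivers in the financial network.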
Past research suggests that international real estate markets show return characteristics and interrelationships with other asset classes, which probably qualify them as an interesting component of national and international asset allocation decisions. However, the special characteristics of real estate assets are quite distinct from those of financial assets, such as stocks and bonds. This is also the case for real estate return distributions. Therefore, the proper integration of real estate markets into asset allocation decisions requires profound understanding of real estate returns' distributional characteristics.
Because of the particular characteristics of real estate, representing real estate markets through a reliable time-series is a complex task. Consequently, reliable real estate indices with a sufficiently long history in major international real estate markets are only scarcely available. Most of the research on real estate returns has been done for the U.K. and U.S., where eligible indices exist. In other important real estate markets, such as Germany, either little or no research has been performed.
In this analysis, the methodology of Maurer, Sebastian and Stephan (2000) for indirectly deriving an appraisal-based index for the German commercial real estate market will be applied. This approach is solely based on publicly available data from German open-ended real estate investment trusts. It could also provide a solution to deriving a reliable real estate time-series for other markets.
We will extend previous analyses for the U.K. and U.S. to provide additional fundamental insights into the return characteristics of the German commercial real estate market. Beyond univariate considerations, the main focus is on the interrelationships between various international real estate markets, as well as between those markets and the international stock and bond markets.