The authors identify U.S. monetary and fiscal dominance regimes using machine learning techniques. The algorithms are trained and verified on simulated data from Markov-switching DSGE models before they classify regimes from 1968 to 2017 using actual U.S. data. On the simulated data, all machine learning methods outperform a standard logistic regression; among them, the Boosted Ensemble Trees classifier yields the best results. The authors find clear evidence of fiscal dominance before Volcker. Monetary dominance is detected between 1984 and 1988, before a fiscally led regime emerges around the stock market crash and lasts until 1994. Until the beginning of the new century, monetary dominance is established, while the more recent evidence following the financial crisis is mixed, with a tendency towards fiscal dominance.
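To illustrate the kind of classifier the abstract refers to, here is a minimal, self-contained sketch of a boosted ensemble of decision stumps (AdaBoost) in Python. The two-feature Gaussian "regimes" are purely hypothetical stand-ins for DSGE-simulated data; none of the features, parameters, or results come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data standing in for DSGE-simulated regime features
# (e.g. responses of inflation and debt); labels: +1 fiscal, -1 monetary.
n = 400
X = np.vstack([rng.normal(1.0, 0.8, size=(n, 2)),
               rng.normal(-1.0, 0.8, size=(n, 2))])
y = np.hstack([np.ones(n), -np.ones(n)])

def stump_predict(X, j, thr, pol):
    """Depth-1 tree: threshold rule on a single feature."""
    return pol * np.where(X[:, j] <= thr, 1.0, -1.0)

def fit_stump(X, y, w):
    """Pick the weighted-error-minimizing threshold rule over all features."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.quantile(X[:, j], np.linspace(0.05, 0.95, 19)):
            for pol in (1.0, -1.0):
                err = w[stump_predict(X, j, thr, pol) != y].sum()
                if err < best_err:
                    best, best_err = (j, thr, pol), err
    return best, best_err

def adaboost(X, y, rounds=20):
    """Boosted ensemble of decision stumps (AdaBoost)."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(rounds):
        (j, thr, pol), err = fit_stump(X, y, w)
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        pred = stump_predict(X, j, thr, pol)
        w *= np.exp(-alpha * y * pred)   # upweight misclassified points
        w /= w.sum()
        ensemble.append((j, thr, pol, alpha))
    return ensemble

def classify(ensemble, X):
    score = sum(a * stump_predict(X, j, t, p) for j, t, p, a in ensemble)
    return np.sign(score)

ensemble = adaboost(X, y)
acc = (classify(ensemble, X) == y).mean()
```

Training accuracy improves over the rounds because each new stump focuses on the observations its predecessors misclassified.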
This note argues that the European Central Bank should adjust its strategy in order to consider broader measures of inflation in its policy deliberations and communications. In particular, it points out that a broad measure of domestic goods and services price inflation such as the GDP deflator has increased along with the euro area recovery and the expansion of monetary policy since 2013, while HICP inflation has become more variable and, on average, has declined. Similarly, the cost of owner-occupied housing, which is excluded from the HICP, has risen during this period. Furthermore, it shows that optimal monetary policy at the effective lower bound on nominal interest rates aims to return inflation more slowly to the inflation target from below than in normal times because of uncertainty about the effects and potential side effects of quantitative easing.
Rising temperatures, falling ratings: the effect of climate change on sovereign creditworthiness
(2021)
How will a changing climate impact the creditworthiness of governments over the very long term? Financial markets need credible, digestible information on how climate change translates into material risks. To bridge the gap between climate science and real-world financial indicators, the authors simulate the effect of climate change on sovereign credit ratings for 108 countries, creating the world’s first climate-adjusted sovereign credit rating. The study offers a first methodological approach to extend the long-term rating to an ultra-long-term reality, aimed at long-term investors as well as regulators and rating agencies.
Central banks normally accept debt of their own governments as collateral in liquidity operations without reservations. This gives rise to a valuable liquidity premium that reduces the cost of government finance. The ECB is an interesting exception in this respect. It relies on external assessments of the creditworthiness of its member states, such as credit ratings, to determine eligibility and the haircut it imposes on such debt. The authors show how such features in a central bank’s collateral framework can give rise to cliff effects and multiple equilibria in bond yields and increase the vulnerability of governments to external shocks. This can potentially induce sovereign debt crises and defaults that would not otherwise arise.
Can boundedly rational agents survive competition with fully rational agents? The authors develop a highly nonlinear heterogeneous agents model with rational, forward-looking versus boundedly rational, backward-looking agents and evolving market shares depending on their relative performance. Their novel numerical solution method detects equilibrium paths characterized by complex bubble and crash dynamics. Boundedly rational trend-extrapolators amplify small deviations from fundamentals, while rational agents anticipate market crashes after large bubbles and drive prices back close to fundamental value. Overall, rational and non-rational beliefs co-evolve over time, with time-varying impact, and their interaction produces complex endogenous bubbles and crashes without any exogenous shocks.
High-frequency changes in interest rates around FOMC announcements are a standard method of measuring monetary policy shocks. However, some recent studies have documented puzzling effects of these shocks on private-sector forecasts of GDP, unemployment, or inflation that are opposite in sign to what standard macroeconomic models would predict. This evidence has been viewed as supportive of a "Fed information effect" channel of monetary policy, whereby an FOMC tightening (easing) communicates that the economy is stronger (weaker) than the public had expected.
The authors show that these empirical results are also consistent with a "Fed response to news" channel, in which incoming, publicly available economic news causes both the Fed to change monetary policy and the private sector to revise its forecasts. They provide substantial new evidence that distinguishes between these two channels and strongly favors the latter; for example, regressions that include the previously omitted public macroeconomic news, high-frequency stock market responses to Fed announcements, and a new survey that they conduct of individual Blue Chip forecasters all indicate that the Fed and private sector are simply responding to the same public news, and that there is little if any role for a "Fed information effect".
On the accuracy of linear DSGE solution methods and the consequences for log-normal asset pricing
(2021)
This paper demonstrates a failure of standard solution methods for linear dynamic stochastic general equilibrium (DSGE) models based on the generalized Schur (or QZ) decomposition when there is insufficient eigenvalue separation about the unit circle. The significance of this is demonstrated in a simple production-based asset pricing model with external habit formation. While the exact solution afforded by the simplicity of the model matches post-war US consumption growth and the equity premium, QZ-based numerical solutions miss the latter by many annualized percentage points.
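The failure mode described above hinges on how QZ-based solvers sort generalized eigenvalues about the unit circle. The following sketch, with illustrative 2x2 matrices that are not the paper's model, uses SciPy's `ordqz` to measure the eigenvalue separation that such solvers rely on:

```python
import numpy as np
from scipy.linalg import ordqz

# Stylized linear RE pencil A x_{t+1} = B x_t with generalized eigenvalues
# 1/0.999 and 1/1.001, i.e. barely separated about the unit circle.
A = np.eye(2)
B = np.diag([0.999, 1.001])

# sort='iuc' reorders the decomposition so that eigenvalues inside the
# unit circle come first -- the sorting step that DSGE solvers depend on.
AA, BB, alpha, beta, Q, Z = ordqz(A, B, sort='iuc', output='real')
lam = alpha / beta                       # generalized eigenvalues of (A, B)
gap = np.abs(lam).max() - np.abs(lam).min()
```

When `gap` is this small, rounding errors in the decomposition can blur the stable/unstable split, which is the numerical-accuracy problem the paper documents in its asset pricing application.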
The recently observed disconnect between inflation and economic activity can be explained by the interplay between the zero lower bound (ZLB) and the costs of external financing. In normal times, credit spreads and the nominal interest rate balance out; factor costs dominate firms' marginal costs. When nominal rates are constrained, larger spreads can more than offset the effect of lower factor costs and induce only moderate inflation responses. The Phillips curve is hence flat at the ZLB, but features a positive slope in normal times and thus a hockey stick shape. Via this mechanism, forward guidance may induce deflationary effects.
The authors examine the effectiveness of labor cost reductions as a means to stimulate economic activity and assess the differences which may occur under the prevailing exchange rate regime. They develop a medium-scale three-region DSGE model and show that the impact of a cut in the employers’ social security contributions rate does not vary significantly under different exchange rate regimes. They find that both the interest rate and the exchange rate channel matter. Furthermore, the measure appears to be effective even if it comes along with a consumption tax increase to preserve long-term fiscal sustainability.
Finally, they assess whether the theoretical results hold up empirically by applying the local projection method. Regression results suggest that changes in employers’ social security contributions rates have statistically significant real effects: a one percentage point reduction leads to an average cumulative rise in output of around 1.3 percent in the medium term. Moreover, the outcome does not differ significantly across the different exchange rate regimes.
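As an illustration of the local projection method mentioned above, here is a minimal Jordà-style sketch on simulated data; the data-generating process and all parameters are illustrative assumptions, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
T, H = 300, 8

# Hypothetical data: output responds persistently to an observed policy shock
shock = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.9 * y[t - 1] + 0.5 * shock[t] + 0.1 * rng.normal()

def local_projection(y, shock, H):
    """Local projections: one OLS regression of y_{t+h} on the shock
    (plus a constant) per horizon h; the slopes trace out the IRF."""
    T = len(y)
    irf = np.empty(H)
    for h in range(H):
        X = np.column_stack([np.ones(T - h), shock[:T - h]])
        irf[h] = np.linalg.lstsq(X, y[h:], rcond=None)[0][1]
    return irf

irf = local_projection(y, shock, H)   # true IRF here is 0.5 * 0.9**h
```

In applied work the regressions would add control variables and robust standard errors; the point of the sketch is only the horizon-by-horizon regression structure.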
The so-called Troika, consisting of the EU-Commission, the European Central Bank (ECB) and the International Monetary Fund (IMF), was supposed to support the member states of the euro area which had been hit hard by a sovereign debt crisis. For that purpose, economic adjustment programs were drafted and monitored in order to prevent the break-up of the euro area and sovereign defaults. The cooperation of these institutions, which was born out of necessity, has been partly successful, but has also created persistent problems. With the further increase of public debt, especially in France and Italy, the danger of a renewed crisis in the euro area has been growing. The European Stability Mechanism (ESM) together with the European Commission will replace the Troika in the future, following decisions of the EU Summit of December 2018. It shall play the role of a European Monetary Fund in the event of a crisis. The IMF, on the other hand, will no longer play an active role in solving sovereign debt crises in the euro area. The current course is, however, inadequate to tackle the core problems of the euro area and to avoid future crises, which are mainly structural in nature and due to escalating public debt and lack of international competitiveness of some member countries. The current Corona crisis will aggravate the institutional problems. It has led to a common European fiscal response ("Next Generation EU"). This rescue and recovery program will not be financed by ESM resources and will not be monitored by the ESM. One important novelty of this package is that it involves the issuance of substantial common European debt.
Despite the increasing use of cashless payment instruments, the notion that cash loses importance over time can be unambiguously refuted. On the contrary, the authors show that cash demand has increased steeply over the past 30 years. This is true not only on a global scale, but also for the most important currencies of advanced countries (USD, EUR, CHF, GBP and JPY). In this paper, they focus especially on the role of different crises (technological crises, financial market crises, natural disasters) and analyse the demand for small and large banknote denominations since the 1990s from an international perspective. It is evident that cash demand always increases in times of crisis, independent of the nature of the crisis itself. However, largely unaffected by crises, we observe a trend increase in global cash demand aligned with a shift from transaction balances towards more hoarding, especially in the form of large denomination banknotes.
The authors embed human capital-based endogenous growth into a New-Keynesian model with search and matching frictions in the labor market and skill obsolescence from long-term unemployment. The model can account for key features of the Great Recession: a decline in productivity growth, the relative stability of inflation despite a pronounced fall in output (the "missing disinflation puzzle"), and a permanent gap between output and the pre-crisis trend output.
In the model, lower aggregate demand raises unemployment and the training costs associated with skill obsolescence. Lower employment hinders learning-by-doing, which slows down human capital accumulation, feeding back into even fewer vacancies than justified by the demand shock alone. These feedback channels mitigate the disinflationary effect of the demand shock while amplifying its contractionary effect on output. The temporary growth slowdown translates into output hysteresis (permanently lower output and labor productivity).
Occasionally binding constraints have become an important part of economic modelling, especially since western central banks see themselves (again) constrained by the so-called zero lower bound (ZLB) on the nominal interest rate. A binding ZLB constraint poses a major problem for quantitative-structural analysis: linear solution methods do not work in the presence of a non-linearity such as the ZLB, and existing alternatives tend to be computationally demanding. The urge to study macroeconomic questions related to the Great Recession and the Covid-19 crisis in a quantitative-structural framework requires algorithms that are not only accurate, but also robust, fast, and computationally efficient.
A particularly important application where efficient and fast methods for occasionally binding constraints (OBCs) are needed is the Bayesian estimation of macroeconomic models. This paper shows that a linear dynamic rational expectations system with OBCs, depending on the expected duration of the constraint, can be represented in closed form. Combined with a set of simple equilibrium conditions, this can be exploited to avoid matrix inversions and simulations at runtime, for significant gains in computational speed.
Central banks sometimes evaluate their own policies. To assess the inherent conflict of interest, the authors compare the research findings of central bank researchers and academic economists regarding the macroeconomic effects of quantitative easing (QE). They find that central bank papers report larger effects of QE on output and inflation. Central bankers are also more likely to report significant effects of QE on output and to use more positive language in the abstract. Central bankers who report larger QE effects on output experience more favorable career outcomes. A survey of central banks reveals substantial involvement of bank management in research production.
Empirical estimates of equilibrium real interest rates are so far mostly limited to advanced economies, since no statistical procedure suitable for a large set of countries is available. This is surprising, as equilibrium rates have strong policy implications in emerging markets and developing economies as well; current estimates of the global equilibrium rate rely on only a few countries; and estimates for a more diverse set of countries can improve understanding of the drivers. The authors propose a model and estimation strategy that decompose ex ante real interest rates into a permanent and transitory component even with short samples and high volatility. This is done with an unobserved component local level stochastic volatility model, which is used to estimate equilibrium rates for 50 countries with Bayesian methods.
Equilibrium rates were lower in emerging markets and developing economies than in advanced economies in the 1980s, similar in the 1990s, and have been higher since 2000. In line with economic integration and rising global capital markets, synchronization has been rising over time and is higher among advanced economies. Equilibrium rates of countries with stronger trade linkages and similar demographic and economic trends are more synchronized.
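A bare-bones version of the permanent-transitory decomposition behind such equilibrium-rate estimates can be sketched with a constant-volatility local level model and a Kalman filter. The authors' model adds stochastic volatility and Bayesian estimation; everything below, including all parameters and the simulated series, is an illustrative simplification:

```python
import numpy as np

def local_level_filter(y, q):
    """Kalman filter for the local level model
       y_t = mu_t + eps_t,  mu_t = mu_{t-1} + eta_t,
    with measurement variance normalized to 1 and q = var(eta)/var(eps)."""
    m, P = 0.0, 1e6              # diffuse initial state
    mu = np.empty(len(y))
    for t, obs in enumerate(y):
        P = P + q                # predict: random-walk state
        K = P / (P + 1.0)        # Kalman gain
        m = m + K * (obs - m)    # update with the new observation
        P = (1.0 - K) * P
        mu[t] = m
    return mu

rng = np.random.default_rng(3)
true_mu = np.cumsum(0.05 * rng.normal(size=500))  # slow-moving "equilibrium rate"
y = true_mu + rng.normal(size=500)                # noisy observed real rate
est = local_level_filter(y, q=0.05 ** 2)
```

The filtered `est` tracks the slow-moving permanent component while averaging out the transitory noise, which is the basic logic the unobserved components model exploits.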
The ruling of the German Federal Constitutional Court and its call for conducting and communicating proportionality assessments regarding monetary policy have been the subject of some controversy. However, it can also be understood as a way to strengthen the de facto independence of the European Central Bank. The authors show how a regular proportionality check could be integrated into the ECB’s strategy, which is currently undergoing a systematic review. In particular, they propose to include quantitative benchmarks for policy rates and the central bank balance sheet. Deviations from such benchmarks can have benefits in terms of the intended path for inflation, while involving costs in terms of risks and side effects that need to be balanced. Practical applications to the euro area are provided.
In this paper we adapt the Hamiltonian Monte Carlo (HMC) estimator to DSGE models, a method presently used in various fields due to its superior sampling and diagnostic properties. We implement it in a state-of-the-art, freely available high-performance software package, Stan. We estimate a small-scale textbook New-Keynesian model and the Smets-Wouters model using US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we find bimodality in the Smets-Wouters model even if we estimate the model using the original tight priors. Finally, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm to create a powerful tool which permits the estimation of DSGE models with ill-behaved posterior densities.
In this paper we adopt the Hamiltonian Monte Carlo (HMC) estimator for DSGE models by implementing it in a state-of-the-art, freely available high-performance software package. We estimate a small-scale textbook New-Keynesian model and the Smets-Wouters model on US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm, which permits the estimation of DSGE models with ill-behaved posterior densities.
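The core HMC machinery these abstracts rely on can be illustrated on a toy target density. This is a generic leapfrog-plus-Metropolis sketch in plain Python, not the Stan implementation and not a DSGE posterior:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy posterior: standard normal. U is the negative log density ("potential").
def U(x):
    return 0.5 * x ** 2

def grad_U(x):
    return x

def hmc_step(x0, eps=0.15, L=20):
    """One HMC transition: leapfrog integration of Hamiltonian dynamics,
    then a Metropolis accept/reject to correct discretization error."""
    p0 = rng.normal()                     # draw auxiliary momentum
    x, p = x0, p0 - 0.5 * eps * grad_U(x0)
    for i in range(L):                    # leapfrog integrator
        x = x + eps * p
        if i < L - 1:
            p = p - eps * grad_U(x)
    p = p - 0.5 * eps * grad_U(x)
    h0 = U(x0) + 0.5 * p0 ** 2            # initial Hamiltonian
    h1 = U(x) + 0.5 * p ** 2              # proposed Hamiltonian
    return x if np.log(rng.uniform()) < h0 - h1 else x0

draws = np.empty(5000)
x = 0.0
for i in range(5000):
    x = hmc_step(x)
    draws[i] = x
```

Because proposals follow the gradient of the log posterior, HMC explores the target with far less random-walk behavior than a plain Metropolis sampler, which is the property that makes it attractive for high-dimensional DSGE posteriors.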
Using a nonlinear Bayesian likelihood approach that fully accounts for the zero lower bound on nominal interest rates, the authors analyze US post-crisis business cycle dynamics and provide reference parameter estimates. They find that neither the inclusion of financial frictions nor that of household heterogeneity improve the empirical fit of the standard model, or its ability to provide a joint explanation for the post-2007 dynamics. Associated financial shocks mis-predict an increase in consumption. The common practice of omitting the ZLB period in the estimation severely distorts the analysis of the more recent economic dynamics.
Did the Federal Reserve’s Quantitative Easing (QE) in the aftermath of the financial crisis have macroeconomic effects? To answer this question, the authors estimate a large-scale DSGE model over the sample from 1998 to 2020, including data on the Fed’s balance sheet. They allow for QE to affect the economy via multiple channels that arise from several financial frictions. Their nonlinear Bayesian likelihood approach fully accounts for the zero lower bound on nominal interest rates. They find that between 2009 and 2015, QE increased output by about 1.2 percent. This reflects a net increase in investment of nearly 9 percent that was accompanied by a 0.7 percent drop in aggregate consumption. Both government bond and capital asset purchases were effective in improving financing conditions. Capital asset purchases in particular significantly facilitated new investment and increased production capacity. Against the backdrop of a fall in consumption, supply-side effects dominated, which led to a mild disinflationary effect of about 0.25 percent annually.
Using an epidemiological SIRD model extended by the treatment capacity of the health care system, the mechanisms and dynamics of a virus epidemic such as Corona are simulated under stylized policy response patterns (Ignore, Shutdown, Ignore-Shutdown-Relax). The model is also used to draw lessons for the statistical analysis of Corona, such as the informative value of published doubling times and reproduction numbers. The number of unreported cases and the accuracy of medical infection tests, which varies over the course of the epidemic, are discussed. To measure the medical costs of Corona and for regional and international comparisons, a damage index of lost lifetime is proposed. Finally, the paper briefly addresses the economic costs of Corona in Germany.
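The capacity-dependent SIRD mechanics can be sketched in a few lines. All parameter values below are illustrative assumptions, not calibrations from the paper:

```python
import numpy as np

def simulate_sird(beta, gamma=0.1, delta_lo=0.005, delta_hi=0.02,
                  capacity=0.05, days=400, i0=1e-4):
    """Discrete-time SIRD model in population shares, with a higher death
    rate once the infected share exceeds hospital treatment capacity."""
    S, I, R, D = 1.0 - i0, i0, 0.0, 0.0
    path = []
    for _ in range(days):
        delta = delta_lo if I <= capacity else delta_hi  # capacity kink
        new_inf = beta * S * I          # new infections
        recov = gamma * I               # recoveries
        deaths = delta * I              # deaths
        S, I = S - new_inf, I + new_inf - recov - deaths
        R, D = R + recov, D + deaths
        path.append((S, I, R, D))
    return np.array(path)

ignore = simulate_sird(beta=0.30)    # "Ignore": no intervention
shutdown = simulate_sird(beta=0.12)  # "Shutdown": transmission reduced
```

The "Ignore" path pushes the infected share above the capacity threshold, triggering the higher death rate, while the "Shutdown" path keeps transmission low enough to avoid it — the qualitative contrast the stylized policy patterns are meant to capture.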
This paper summarizes key elements of the German Federal Constitutional Court’s decision on the European Central Bank’s Public Sector Asset Purchase Programme. It briefly explains how it is possible for the German Court to disagree with the ruling of the Court of Justice of the European Union. Finally, it makes suggestions concerning a practical way forward for the Governing Council of the ECB in light of these developments.
This working paper provides the summary statement by Prof. Volker Wieland on the European Central Bank’s purchase programme for public sector bonds (Public Sector Purchase Programme, PSPP) at the Federal Constitutional Court hearing on July 30, 2019. The focus is on the classification of the PSPP as a monetary policy measure and on the proportionality of the programme and its implementation. The paper also briefly addresses further questions of implementation, in particular the announcement of purchases, their limits, and the distance to the primary market for government bonds.
The IMFS Annual Report 2019, now published, provides an overview of the latest research findings of the researchers at the IMFS, reports on conferences and lectures, and detailed information on the institute’s currently largest research project, the Macroeconomic Modeling and Comparison Initiative (MMCI). In addition, IMFS Professor Michael Haliassos gives an interview offering insight into his work on the financial behavior of private households, and former staff members Philipp Lieberknecht and Felix Strobel report how they build on their research at the IMFS in their professional careers.
On around 100 pages, the report presents the highlights of the year, all staff members, and the projects, publications, and events of the IMFS, including “The ECB and Its Watchers”. The annual report is published in English and is available in PDF format.
Household finance
(2020)
Household financial decisions are complex, interdependent, and heterogeneous, and central to the functioning of the financial system. We present an overview of the rapidly expanding literature on household finance (with some important exceptions) and suggest directions for future research. We begin with the theory and empirics of asset market participation and asset allocation over the lifecycle. We then discuss household choices in insurance markets, trading behavior, decisions on retirement saving, and financial choices by retirees. We survey research on liabilities, including mortgage choice, refinancing, and default, and household behavior in unsecured credit markets, including credit cards and payday lending. We then connect the household to its social environment, including peer effects, cultural and hereditary factors, intra-household financial decision making, financial literacy, cognition and educational interventions. We also discuss literature on the provision and consumption of financial advice.
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates and term premia, is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
We design, field and exploit survey data from a representative sample of the French population to examine whether informative social interactions enter households’ stockholding decisions. Respondents report perceptions about their circle of peers with whom they interact about financial matters, their social circle and the population. We provide evidence for the presence of an information channel through which social interactions influence perceptions and expectations about stock returns, and financial behavior. We also find evidence of mindless imitation of peers in the outer social circle, but this does not permeate as many layers of financial behavior as informative social interactions do.
Abundant studies show that individuals often struggle and frequently fail to form a correct perception of how much they are worth in terms of income or net wealth, both in absolute terms and relative to others. The authors find that wealth misperception arises even in a frictionless environment. They show that this wealth misperception is related to low cognitive abilities and inattention, and that subjects who misperceive wealth have a greater tendency to borrow and spend out of gains. A standard optimal consumption choice model, enriched with a rational but inattentive agent à la Gabaix, aligns with the key experimental findings.
In the course of the crisis, the European System of Central Banks (ESCB) has acted several times to support the EU Member States and banking systems in financial distress by purchasing debt instruments: Covered Bonds Programmes (CBP), Securities Market Programmes (SMP), Long Term Refinancing Operations (LTRO), and Targeted Long Term Refinancing Operations (TLTRO), followed by the Outright Monetary Transactions (OMT) and then the Extended Asset Purchase Programmes (EAPP) – colloquially labelled as Quantitative Easing (QE).
At first sight, the support measures of the ESCB might be judged as monetary policy, but the selectivity of the OMT and, even more so, of the SMP, in conjunction with the transfer of risks to the ESCB, speaks against this.
We propose a simple modification of the time series filter by Hamilton (2018) that yields reliable and economically meaningful real-time output gap estimates. The original filter relies on 8-quarter-ahead forecast errors of a simple autoregression of log real GDP. While this approach yields a cyclical component of GDP that is hardly revised with new incoming data due to the one-sided filtering approach, it does not cover typical business cycle frequencies evenly: short business cycles are muted and medium-length business cycles are amplified. Further, the estimated trend is as volatile as GDP itself and can thus hardly be interpreted as potential GDP. A simple modification that is based on the mean of 4- to 12-quarter-ahead forecast errors shares the favorable real-time properties of the Hamilton filter, but leads to a much better coverage of typical business cycle frequencies and a smooth estimated trend. Based on output growth and inflation forecasts and a comparison to revised output gap estimates from policy institutions, we find that real-time output gaps based on the modified Hamilton filter are economically much more meaningful measures of the business cycle than those based on other simple statistical trend-cycle decomposition techniques such as the HP or the Bandpass filter.
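A compact rendering of the modification described above (the mean of 4- to 12-quarter-ahead forecast errors from Hamilton-style regressions), applied to a simulated random walk rather than actual GDP data:

```python
import numpy as np

def hamilton_cycle(y, h):
    """Hamilton (2018) cycle: the h-step-ahead forecast error from an OLS
    regression of y_{t+h} on a constant and y_t, y_{t-1}, y_{t-2}, y_{t-3}."""
    T = len(y)
    X = np.column_stack([np.ones(T - h - 3)] +
                        [y[3 - j: T - h - j] for j in range(4)])
    Y = y[3 + h:]
    beta = np.linalg.lstsq(X, Y, rcond=None)[0]
    return Y - X @ beta              # residuals dated t + h

def modified_hamilton(y, h_lo=4, h_hi=12):
    """Modification sketched in the abstract: average the forecast errors
    over horizons h = 4..12 on a common, end-aligned sample."""
    cycles = [hamilton_cycle(y, h) for h in range(h_lo, h_hi + 1)]
    n = min(len(c) for c in cycles)
    return np.mean([c[-n:] for c in cycles], axis=0)

rng = np.random.default_rng(4)
log_gdp = np.cumsum(0.005 + 0.01 * rng.normal(size=200))  # illustrative series
cycle = modified_hamilton(log_gdp)
```

Averaging across horizons is what smooths the implied trend and spreads the filter's gain more evenly across business cycle frequencies; the paper's empirical comparisons use actual GDP data rather than this simulated series.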
We analyze cyclical co-movement in credit, house prices, equity prices, and long-term interest rates across 17 advanced economies. Using a time-varying multi-level dynamic factor model and more than 130 years of data, we analyze the dynamics of co-movement at different levels of aggregation and compare recent developments to earlier episodes such as the early era of financial globalization from 1880 to 1913 and the Great Depression. We find that joint global dynamics across various financial quantities and prices as well as variable-specific global co-movements are important to explain fluctuations in the data. From a historical perspective, global co-movement in financial variables is not a new phenomenon, but its importance has increased for some variables since the 1980s. For equity prices, global cycles currently play a historically unprecedented role, explaining more than half of the fluctuations in the data. Global cycles in credit and housing have become much more pronounced and longer, but their importance in explaining dynamics has only increased for some economies including the US, the UK and Nordic European countries. We also include GDP in the analysis and find an increasing role for a global business cycle.
There is substantial disagreement about the consequences of the Tax Cuts and Jobs Act (TCJA) of 2017, which constitutes the most extensive tax reform in the United States in more than 30 years. Using a large-scale two-country dynamic general equilibrium model with nominal rigidities, we find that the TCJA increases GDP by about 2% in the medium-run and by about 2.5% in the long-run. The short-run impact depends crucially on the degree and costs of variable capital utilization, with GDP effects ranging from 1 to 3%. At the same time, the TCJA does not pay for itself. In our analysis, the reform decreases tax revenues and raises the debt-to-GDP ratio by about 15 percentage points in the medium-run until 2025. We show that combining the TCJA with spending cuts can dampen the increase in government indebtedness without reducing its expansionary effect.
Distributed ledger technology, especially in the form of publicly coordinated validation networks such as Ethereum and Bitcoin with their own monetary circles, provides a revealing litmus test for current financial regulatory schemes. The paper highlights the interrelation between distributed coordination and the emission of virtual currency to make sense of the function of the new monetary phenomenon. It then argues for the regulation of financial services on the basis of the technology to ensure integrity standards. In this respect, it is useful to gear the development of a regulatory scheme towards the existing financial regulatory principles. However, future measures of the regulators must take the distributed nature of the platforms into account by relying on a “regulated self-regulation” of the community. Finally, the article focuses on the shortcomings of the current EU regulatory regimes, especially the regulation frameworks regarding financial services, payment services and electronic money.
Exploiting the natural experiment of the German reunification, we examine how consumers adapt to a new environment in their macroeconomic forecasting. We document that East Germans expect higher inflation and make larger forecast errors than West Germans even decades after reunification. Differences in consumption baskets, financial literacy, risk aversion or trust in the central bank cannot fully account for these patterns. We find most support for the explanation that East Germans, who were used to a strong norm of zero inflation, persistently overadjusted the level of their expectations in the face of the initial inflation shock in reunified Germany. Our findings suggest that large changes in the economic environment can permanently impede people's ability to form accurate macroeconomic expectations, with an important role for the interaction of old norms and new experiences around the event.
Policymakers attach an important role to the macroeconomic outlook of households. Using a representative online panel from the U.S., the authors provide individuals with different professional forecasts about the likelihood of a recession and examine how their macroeconomic expectations causally affect their personal economic prospects and their behavior. The authors find that groups with the largest exposure to aggregate risk, such as individuals working in cyclical industries, are most likely to respond to an improved macroeconomic outlook, while a large fraction of the population is unlikely to react.
This paper uses unique administrative data and a quasi-field experiment of exogenous allocation in Sweden to estimate medium- and longer-run effects on financial behavior from exposure to financially literate neighbors. It contributes evidence of causal impact of exposure and of a social multiplier of financial knowledge, but also of unfavorable distributional aspects of externalities. Exposure promotes saving in private retirement accounts and stockholding, especially when neighbors have economics or business education, but only for educated households and when interaction possibilities are substantial. Findings point to transfer of knowledge rather than mere imitation or effects through labor, education, or mobility channels.
The recent sovereign debt crisis in the Eurozone was characterized by monetary policy constrained by the zero lower bound (ZLB) on nominal interest rates and by several countries facing high risk spreads on their sovereign bonds. How is the government spending multiplier affected by such an economic environment? While prominent results in the academic literature point to high government spending multipliers at the ZLB, higher public indebtedness is often associated with small government spending multipliers. I develop a DSGE model with leverage-constrained banks that captures both features of this economic environment, the ZLB and fiscal stress. In this model, I analyze the effects of government spending shocks. I find that not only are multipliers large at the ZLB, the presence of fiscal stress can even increase their size. For longer durations of the ZLB, multipliers in this model can be considerably larger than one.
JEL Classification: E32, E44, E62
The authors relax the standard assumption in the dynamic stochastic general equilibrium (DSGE) literature that exogenous processes are governed by AR(1) processes and estimate the ARMA(p,q) orders and parameters of exogenous processes. Methodologically, they contribute to the Bayesian DSGE literature by using Reversible Jump Markov Chain Monte Carlo (RJMCMC) to sample from the unknown ARMA orders and their associated parameter spaces of varying dimensions.
In estimating the technology process in the neoclassical growth model using postwar US GDP data, they cast considerable doubt on the standard AR(1) assumption in favor of higher-order processes. They find that the posterior concentrates density on hump-shaped impulse responses for all endogenous variables, consistent with alternative empirical estimates and the rigidities behind many richer structural models. Sampling from noninvertible MA representations, a negative response of hours to a positive technology shock is contained within the posterior credible set. While the posterior contains significant uncertainty regarding the exact order, the results are insensitive to the choice of data filter; this contrasts with the authors' ARMA estimates of GDP itself, which vary significantly depending on the choice of HP or first-difference filter.
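The paper's order selection relies on RJMCMC; as a much simpler illustrative stand-in (not the authors' method), one can compare autoregressive orders with the Bayesian information criterion, fitting each AR(p) by least squares. The data-generating process and all parameter values below are hypothetical.

```python
import numpy as np

def fit_ar_bic(y, p):
    """Fit AR(p) by least squares and return its BIC."""
    n = len(y) - p
    # Columns of lagged values: lag 1, lag 2, ..., lag p, plus a constant.
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(n), X])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    rss = np.sum((y[p:] - X @ coef) ** 2)
    return n * np.log(rss / n) + (p + 1) * np.log(n)

rng = np.random.default_rng(0)
# Hypothetical persistent series with hump-shaped dynamics (an AR(2)).
y = np.zeros(2000)
for t in range(2, 2000):
    y[t] = 1.2 * y[t - 1] - 0.5 * y[t - 2] + rng.normal()

bics = {p: fit_ar_bic(y, p) for p in (1, 2, 3, 4)}
best_p = min(bics, key=bics.get)  # the order BIC prefers
```

Unlike RJMCMC, this picks a single order rather than sampling a posterior over orders, but it conveys the same idea: an AR(1) restriction can be tested against higher-order alternatives rather than imposed.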
What institutional arrangements for an independent central bank with a price stability mandate promote good policy outcomes when unconventional policies become necessary? Unconventional monetary policy poses challenges. The large-scale asset purchases needed to counteract the zero lower bound on nominal interest rates have uncomfortable fiscal and distributional consequences and require central banks to assume greater risks on their balance sheets.
In his paper, Athanasios Orphanides draws lessons from the experience of the Bank of Japan (BoJ) since the late 1990s for the institutional design of independent central banks. He comes to the conclusion that lack of clarity on the precise definition of price stability, coupled with concerns about the legitimacy of large balance sheet expansions, hinders policy: It encourages the central bank to eschew the decisive quantitative easing needed to reflate the economy and instead to accommodate too-low inflation. The BoJ’s experience with the zero lower bound suggests important benefits from a clear definition of price stability as a symmetric 2% goal for inflation, which the Bank adopted in 2013.
For purposes of private consumption, present and future goods are constantly valued and traded. A reliable and comprehensive measure of the general purchasing power of money and its changes should take this basic fact into account. In contrast to conventional statistical consumer price indices, an economic cost-of-living index is intertemporal in design, since it bundles the effective consumer goods prices (effective prices) over the planning horizon of private households. A price stability standard that ignores this relationship tends to be biased and encourages an asymmetric monetary policy.
Effective prices are present prices for future consumption; they take into account goods prices and interest rates or changes in asset prices, are grounded in consumption theory and welfare economics, and form the central building blocks for the model class of economic cost-of-living indices. In utility-theoretic terms, effective prices are the valued marginal utility of the last unit of a good consumed, and the effective inflation rates derived from them are intertemporal marginal rates of substitution.
The authors develop an intertemporal cost-of-living index based on the concept of effective prices and present empirical time series and cohort-specific scenario analyses for Germany.
Using an example, the paper illustrates the importance of consistency between the empirical measurement and the concept of variables in estimated macroeconomic models. Since standard New Keynesian models do not account for demographic trends and sectoral shifts, the authors propose adjusting the hours worked per capita used to estimate such models accordingly to enhance the consistency between the data and the model. Without this adjustment, low-frequency shifts in hours lead to unreasonable trends in the output gap, caused by the close link between hours and the output gap in such models.
The retirement wave of baby boomers, for example, lowers U.S. aggregate hours per capita, which leads to erroneous, permanently negative output gap estimates following the Great Recession. After correcting hours for changes in the age composition, the estimated output gap instead closes gradually in the years after the Great Recession.
In his contribution, Helmut Siekmann examines the liability of the Federal Republic of Germany for the Deutsche Bundesbank and the European Central Bank. He concludes that there is neither a "liability of the Federal Republic of Germany for losses of the ECB nor an obligation to replenish depleted equity capital."
This contribution first appeared in: Festschrift für Theodor Baums zum siebzigsten Geburtstag, pp. 1145-1179, Helmut Siekmann, Andreas Cahn, Tim Florstedt, Katja Langenbucher, Julia Redenius-Hövermann, Tobias Tröger, Ulrich Segna (eds.), Tübingen, Mohr Siebeck 2017
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank's interest rate setting can amplify the expectation feedback in the financial market, and this can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well-advised to handle tools like asset price targeting with care, since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor indices are a good indicator on which to base decisions.
The level of capital tax gains has high explanatory power regarding what drives economic inequality. On this basis, the authors develop a simple, yet micro-founded portfolio selection model to explain the dynamics of wealth inequality given empirical tax series in the US. The results emphasize that the level and the speed of transition of wealth inequality depend crucially on the degree of capital taxation. The projections predict that, continuing on the present path of capital taxation in the US, the gap between rich and poor is expected to shrink, whereas "massive" tax cuts will further increase the degree of wealth concentration.
The Institute for Monetary and Financial Stability (IMFS) is a research center of the Johann Wolfgang Goethe University, Frankfurt am Main, located in the "House of Finance". The Institute was established to implement the project "Currency and Financial Stability", funded by a grant from the Stiftung Geld und Währung (Foundation of Monetary and Financial Stability). The Foundation of Monetary and Financial Stability was created on January 1, 2002 by federal law. ...
The purpose of the data presented in this article is its use in ex post estimations of interest rate decisions by the European Central Bank (ECB), as done by Bletzinger and Wieland (2017) [1]. The data is of quarterly frequency from 1999 Q1 until 2013 Q2 and consists of the ECB's policy rate, the inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data consists of expectations about future inflation and output dynamics. While potential output is constructed based on data from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources, both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication date of the collected data to ensure a real-time dataset consisting only of information that was available to the decision makers at the time of the decision.
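As a sketch of how such a dataset can be used, the following recovers the coefficients of a forward-looking Taylor-type rule by least squares from synthetic data. The rule form, coefficient values and series are hypothetical stand-ins, not the Bletzinger-Wieland estimates or the actual dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 58  # 1999 Q1 - 2013 Q2, quarterly observations

# Hypothetical real-time forecast series (in percent).
infl_forecast = 2.0 + rng.normal(0, 0.5, T)
growth_gap = rng.normal(0, 1.0, T)  # forecast growth minus potential growth

# Synthetic policy rate generated from an assumed rule (noise-free, for illustration):
# i_t = c + a * E[pi_t+h] + b * (E[dy_t+h] - dy*_t+h)
policy_rate = 1.0 + 1.5 * infl_forecast + 0.5 * growth_gap

# Ex post estimation of the rule coefficients by ordinary least squares.
X = np.column_stack([np.ones(T), infl_forecast, growth_gap])
coef, *_ = np.linalg.lstsq(X, policy_rate, rcond=None)
# coef recovers (c, a, b) = (1.0, 1.5, 0.5)
```

With the real-time dataset described above, `policy_rate`, `infl_forecast` and `growth_gap` would simply be replaced by the published series, which is why the publication-date discipline of the dataset matters: the regressors must contain only what the Governing Council could have known.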
The bail-in tool as implemented in the European bank resolution framework suffers from severe shortcomings. To some extent, the regulatory framework can remedy the impediments to the desirable incentive effect of private sector involvement (PSI) that emanate from a lack of predictability of outcomes, if it compels banks to issue a sufficiently sized minimum of high-quality, easy to bail-in (subordinated) liabilities. Yet, even the limited improvements any prescription of bail-in capital can offer for PSI’s operational effectiveness seem compromised in important respects.
The main problem, echoing the general concerns voiced against the European bail-in regime, is that the specifications for minimum requirements for own funds and eligible liabilities (MREL) are also highly detailed and discretionary and thus alleviate the predicament of investors in bail-in debt, at best, only insufficiently. Quite importantly, given the character of typical MREL instruments as non-runnable long-term debt, even if investors are able to gauge the relevant risk of PSI in a bank's failure correctly at the time of purchase, subsequent adjustment of MREL prescriptions by competent or resolution authorities potentially changes the risk profile of the pertinent instruments. Therefore, original pricing decisions may prove inadequate, and so may the market discipline that follows from them.
The pending European legislation aims at an implementation of the already complex specifications of the Financial Stability Board (FSB) for Total Loss Absorbing Capacity (TLAC) by very detailed and case specific amendments to both the regulatory capital and the resolution regime with an exorbitant emphasis on proportionality and technical fine-tuning. What gets lost in this approach, however, is the key policy objective of enhanced market discipline through predictable PSI: it is hardly conceivable that the pricing of MREL-instruments reflects an accurate risk-assessment of investors because of the many discretionary choices a multitude of agencies are supposed to make and revisit in the administration of the new regime. To prove this conclusion, this chapter looks in more detail at the regulatory objectives of the BRRD’s prescriptions for MREL and their implementation in the prospectively amended European supervisory and resolution framework.
This paper analyzes the bail-in tool under the Bank Recovery and Resolution Directive (BRRD) and predicts that it will not reach its policy objective. To make this argument, this paper first describes the policy rationale that calls for mandatory private sector involvement (PSI). From this analysis, the key features for an effective bail-in tool can be derived.
These insights serve as the background to make the case that the European resolution framework is likely ineffective in establishing adequate market discipline through risk-reflecting prices for bank capital. The main reason for this lies in the avoidable embeddedness of the BRRD’s bail-in tool in the much broader resolution process, which entails ample discretion of the authorities also in forcing private sector involvement. Moreover, the idea that nearly all positions on the liability side of a bank’s balance sheet should be subjected to bail-in is misguided. Instead, a concentration of PSI in instruments that fall under the minimum requirements for own funds and eligible liabilities (MREL) is preferable.
Finally, this paper synthesizes the prior analysis by putting forward an alternative regulatory approach that seeks to disentangle private sector involvement, as a precondition for effective bank resolution, as much as possible from the resolution process as such.
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions, and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions as compared to earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the euro area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
During the 1970s, industrial countries, including the US and continental Europe, experienced a combination of slow productivity growth and high unemployment. Subsequent research has shown that the standard model of unemployment actually gives counterfactual predictions. Motivated by the observation that the 1970s were also characterized by high and rising inflation, Tesfaselassie and Wolters examine the effect of growth on unemployment in the presence of nominal price rigidity.
The authors demonstrate that the effect of growth on unemployment may be positive or negative. Faster growth leads to lower unemployment if the rate of inflation is high enough. There is a threshold level of inflation below which faster growth leads to higher unemployment and above which faster growth leads to lower unemployment. The threshold level in turn depends on labor market characteristics, such as hiring efficiency, the job destruction rate, workers' relative bargaining power and the opportunity cost of work.
To broaden the scope of monetary policy, cash abolishment is often suggested as a means of breaking through the zero lower bound. However, practically nothing is said about the welfare costs of such a proposal. Rösl, Seitz and Tödter argue that the welfare costs of bypassing the zero lower bound can be analyzed analytically and empirically by assuming negative interest rates on cash holdings. They gauge the welfare effects of abolishing cash both for the euro area and for Germany.
Their findings suggest that the welfare losses of negative interest rates incurred by money holders are large, notably if implemented in the current low interest rate environment. Imposing a negative interest rate of 3 percentage points on cash holdings and reducing the interest on all assets included in M3 creates a deadweight loss of €62bn for the euro area and of €18bn for Germany. Therefore, the authors argue that cash abolishment or negative interest rates on cash to break through the zero lower bound at any price can hardly be a meaningful policy goal.
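The logic behind such a deadweight-loss figure can be sketched with a Bailey-style consumer-surplus calculation on a money demand curve: a negative interest rate on cash acts like a tax, and the welfare loss is the surplus holders give up beyond what the tax transfers. The demand function and all parameter values below are hypothetical placeholders, not the calibration of Rösl, Seitz and Tödter.

```python
import numpy as np

def money_demand(i, m0=1.0, eta=8.0):
    """Semi-log money demand for cash: holdings shrink as the cost i of holding
    cash rises. m0 and eta are purely illustrative parameters."""
    return m0 * np.exp(-eta * i)

def deadweight_loss(tax, steps=100_000):
    """Bailey-style welfare triangle from a tax (negative rate) of size `tax` on cash:
    area under the demand curve between 0 and `tax`, minus the revenue rectangle."""
    di = tax / steps
    i = np.arange(steps) * di
    surplus_lost = np.sum(money_demand(i)) * di  # Riemann sum of demand over [0, tax]
    revenue = tax * money_demand(tax)            # a transfer, not a welfare loss
    return surplus_lost - revenue

loss_3pct = deadweight_loss(0.03)  # a 3-percentage-point negative rate
loss_1pct = deadweight_loss(0.01)
```

Scaled to actual euro-area cash holdings and an empirically estimated demand curve, a calculation of this shape is what produces billion-euro loss figures; the loss grows more than proportionally with the size of the negative rate.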
The current debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. Beyer and Wieland re-estimate the U.S. equilibrium rate with the methodology of Laubach and Williams and further modifications. They provide new estimates for the United States, the euro area and Germany and subject them to sensitivity tests. Beyer and Wieland conclude that, due to the great uncertainty and sensitivity, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if such estimates are employed to determine the appropriate monetary policy stance, they are best used together with a consistent estimate of the level of potential output.
I propose a dynamic stochastic general equilibrium model in which the leverage of borrowers as well as banks and housing finance play a crucial role in the model dynamics. The model is used to evaluate the relative effectiveness of a policy to inject capital into banks versus a policy to relieve households of mortgage debt. In normal times, when the economy is near the steady state and policy rates are set according to a Taylor-type rule, capital injections to banks are more effective in stimulating the economy in the long run. However, in the middle of a housing debt crisis, when households are highly leveraged, the short-run output effects of the debt relief are more substantial. When the zero lower bound (ZLB) is additionally considered, the debt relief policy can be much more powerful in boosting the economy both in the short run and in the long run. Moreover, the output effects of the debt relief become increasingly larger, the longer the ZLB is binding.
The global financial crisis and the ensuing criticism of macroeconomics have inspired researchers to explore new modeling approaches. There are many new models that deliver improved estimates of the transmission of macroeconomic policies and aim to better integrate the financial sector in business cycle analysis. Policy making institutions need to compare available models of policy transmission and evaluate the impact and interaction of policy instruments in order to design effective policy strategies. This paper reviews the literature on model comparison and presents a new approach for comparative analysis. Its computational implementation enables individual researchers to conduct systematic model comparisons and policy evaluations easily and at low cost. This approach also contributes to improving reproducibility of computational research in macroeconomic modeling. Several applications serve to illustrate the usefulness of model comparison and the new tools in the area of monetary and fiscal policy. They include an analysis of the impact of parameter shifts on the effects of fiscal policy, a comparison of monetary policy transmission across model generations and a cross-country comparison of the impact of changes in central bank rates in the United States and the euro area. Furthermore, the paper includes a large-scale comparison of the dynamics and policy implications of different macro-financial models. The models considered account for financial accelerator effects in investment financing, credit and house price booms and a role for bank capital. A final exercise illustrates how these models can be used to assess the benefits of leaning against credit growth in monetary policy.
Under ordinary circumstances, the fiscal implications of central bank policies tend to be seen as relatively minor and escape close scrutiny. The global financial crisis of 2008, however, demanded an extraordinary response by central banks which brought to light the immense power of central bank balance sheet policies as well as their major fiscal implications. Once the zero lower bound on interest rates is reached, expanding a central bank’s balance sheet becomes the central instrument for providing additional monetary policy accommodation. However, with interest rates near zero, the line separating fiscal and monetary policy is blurred. Furthermore, discretionary decisions associated with asset purchases and liquidity provision, as well as with lender-of-last-resort operations benefiting private entities, can have major distributional effects that are ordinarily associated with fiscal policy. In the euro area, discretionary central bank decisions can have immense distributional effects across member states. However, decisions of this nature are incompatible with the role of unelected officials in democratic societies. Drawing on the response to the crisis by the Federal Reserve and the ECB, this paper explores the tensions arising from central bank balance sheet policies and addresses pertinent questions about the governance and accountability of independent central banks in a democratic society.
Recently there has been an explosion of research on whether the equilibrium real interest rate has declined, an issue with significant implications for monetary policy. A common finding is that the rate has declined. In this paper we provide evidence that contradicts this finding. We show that the perceived decline may well be due to shifts in regulatory policy and monetary policy that have been omitted from the research. In developing the monetary policy implications, it is promising that much of the research approaches the policy problem through the framework of monetary policy rules, as uncertainty in the equilibrium real rate is not a reason to abandon rules in favor of discretion. But the results are still inconclusive and too uncertain to incorporate into policy rules in the ways that have been suggested.
A number of contributions to research on monetary policy have suggested that policy should be asymmetric near the lower bound on nominal interest rates. As inflation and economic activity decline, policy should ease more aggressively than it would in the absence of the lower bound. As activity recovers and inflation picks up, the central bank should act to keep interest rates lower for longer than without the bound. In this note, we investigate to what extent the policy easing implemented by the ECB since summer 2013 mirrors the rate recommendations of a simple policy rule or deviates from it in a way that indicates a “lower for longer” approach to policy near zero interest rates.
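A minimal way to make such a comparison concrete is to compute a simple Taylor-type rule rate, truncate it at zero, and inspect the sign of the actual rate's deviation from it. The rule coefficients below follow the textbook Taylor (1993) weights, and all data points are hypothetical, not ECB figures.

```python
def taylor_rule_rate(inflation, output_gap, r_star=1.0, pi_star=1.9):
    """Simple Taylor-type rule with standard 0.5 response coefficients,
    truncated at the zero lower bound. All parameter values are illustrative."""
    i = r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap
    return max(i, 0.0)

# Hypothetical quarters: (inflation, output gap, actual policy rate), in percent.
quarters = [(1.0, -1.0, 0.25), (0.5, -0.5, 0.05), (1.5, 0.5, 0.0)]

# A persistently negative deviation of the actual rate from the rule rate is
# the signature of a "lower for longer" policy near the zero bound.
deviations = [actual - taylor_rule_rate(pi, gap) for pi, gap, actual in quarters]
```

In an actual exercise, the inputs would be real-time inflation and output gap data over the period since summer 2013; the question the note asks is whether the deviations are roughly zero (rule-mirroring) or systematically negative (lower for longer).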
Estimates of medium-term equilibrium interest rates obtained with the method of Laubach and Williams (2003) are now widely cited in the debate on monetary and fiscal policy. Among others, Summers (2014a) has invoked them as evidence of secular stagnation, and Yellen (2015) has used them to justify the zero interest rate policy. In this paper, we conduct an extensive examination and sensitivity analysis of these estimates for the United States, Germany and the euro area. Given the high uncertainty and sensitivity associated with estimates of medium-term equilibrium rates based on the Laubach-Williams method and similar approaches, such estimates should not be decisive for major course settings in monetary and fiscal policy.
The economic and legal limits of central banks and the future of central banking were two major topics on which the IMFS focused over the past year. The Annual Report 2013, now available as a download version, provides an overview of all activities and publications of the IMFS in 2013. The report comprises 58 pages and is available in English only.
In the course of the financial crisis, Commerzbank received more than 18 billion euros in the form of state guarantees, capital injections and deposits. Hypo Real Estate, WestLB, SachsenLB and IKB also benefited from support measures. The EU approved these and other state aid measures. In principle, however, state support measures constitute an economic advantage and are thus, at first sight, prohibited state aid. In his working paper, Tuschl examines the legal foundations of EU state aid law and highlights the partly divergent practice of the European Commission.
The Federal Reserve’s muddled mandate to attain simultaneously the incompatible goals of maximum employment and price stability invites short-term-oriented discretionary policymaking inconsistent with the systematic approach needed for monetary policy to contribute best to the economy over time. Fear of liftoff—the reluctance to start the process of policy normalization after the end of a recession—serves as an example. Causes of the problem are discussed, drawing on public choice and cognitive psychology perspectives. The Federal Reserve could adopt a framework that relies on a simple policy rule subject to periodic reviews and adaptation. Replacing meeting-by-meeting discretion with a simple policy rule would eschew discretion in favor of systematic policy. Periodic review of the rule would allow the Federal Reserve the flexibility to account for and occasionally adapt to the evolving understanding of the economy. Congressional legislation could guide the Federal Reserve in this direction. However, the Federal Reserve may be best placed to select the simple rule and could embrace this improvement on its own, within its current mandate, with the publication of a simple rule along the lines of its statement of longer-run goals.
This paper investigates the effect of a change in the informational environment of borrowers on the organizational design of bank lending. We use micro-data from a large multinational bank and exploit the sudden introduction of a credit registry, an information-sharing mechanism across banks, for a subset of borrowers. Using within-borrower and within-loan-officer variation in a difference-in-differences empirical design, we show that the expansion of the credit registry led to an improvement in the allocation of credit to affected borrowers. There was a concurrent change in the organizational structure of the bank that involved a dramatic increase in the delegation of lending decisions on affected borrowers to loan officers. We also find a significant expansion in the scope of activities of loan officers who deal primarily with affected borrowers, as well as of their superiors. There is suggestive evidence that larger banks in the economy were better able to implement changes similar to those at our bank. We argue that these patterns can be understood within the framework of incentive-based and information-processing-cost theories. Our findings could help rationalize why improvements in the information environment of borrowers may be altering the landscape of lending by moving decisions outside the boundaries of financial intermediaries.
The IMFS Interdisciplinary Study 2/2013 contains speeches of Michael Burda (Humboldt University), Benoît Coeuré (European Central Bank), Stefan Gerlach (Bank of Ireland and former IMFS Professor), Patrick Honohan (Bank of Ireland), Sabine Lautenschläger (Deutsche Bundesbank) and Athanasios Orphanides (MIT), as well as Helmut Siekmann and Volker Wieland.
We analyze the macroeconomic implications of increasing the top marginal income tax rate using a dynamic general equilibrium framework with heterogeneous agents and a fiscal structure resembling the actual U.S. tax system. The wealth and income distributions generated by our model replicate the empirical ones. In two policy experiments, we increase the statutory top marginal tax rate from 35 to 70 percent and redistribute the additional tax revenue among households, either by decreasing all other marginal tax rates or by paying out a lump-sum transfer to all households. We find that increasing the top marginal tax rate decreases inequality in both wealth and income but also leads to a contraction of the aggregate economy. This is primarily driven by the negative effects that the tax change has on top income earners. The aggregate gain in welfare is sizable in both experiments mainly due to a higher degree of distributional equality.
This paper looks into the specific influence that the European banking union will have on (future) bank client relationships. It shows that the intended regulatory influence on market conditions in principle serves as a powerful governance tool to achieve financial stability objectives.
From this vantage, it analyzes macro-prudential instruments with a particular view to mortgage lending markets – the latter have been critical in the emergence of many modern financial crises. In gauging the impact of the new European supervisory framework, it finds that the ECB will lack influence on key macro-prudential tools to push through more rigid supervisory policies vis-à-vis forbearing national authorities.
Furthermore, this paper points out that the current design of the European bail-in tool supplies resolution authorities with undue discretion. This feature, which also afflicts the SRM, imperils the key policy objective of re-instilling market discipline on banks' debt financing operations. The latter is also called into question because the nested regulatory technique that aims at preventing bail-outs unintentionally opens additional maneuvering space for political decision makers.
This paper empirically investigates how organizational hierarchy affects the allocation of credit within a bank. Using an exogenous variation in organizational design, induced by a reorganization plan implemented in roughly 2,000 bank branches in India during 1999-2006, and employing a difference-in-differences research strategy, we find that increased hierarchization of a branch decreases its ability to produce "soft" information on loans, leads to increased standardization of loans and rationing of "soft information" loans. Furthermore, this loss of information brings about a reduction in performance on loans: delinquency rates and returns on similar loans are worse in more hierarchical branches. We also document how hierarchical structures perform better in environments that are characterized by a high degree of corruption, thus highlighting the benefits of hierarchical decision making in restraining rent seeking activities. Finally, we document a channel - managerial interference - through which hierarchy affects loan outcomes.
Our paper evaluates recent regulatory proposals mandating the deferral of bonus payments and claw-back clauses in the financial sector. We study a broadly applicable principal agent setting, in which the agent exerts effort for an immediately observable task (acquisition) and a task for which information is only gradually available over time (diligence). Optimal compensation contracts trade off the cost and benefit of delay resulting from agent impatience and the informational gain. Mandatory deferral may increase or decrease equilibrium diligence depending on the importance of the acquisition task. We provide concrete conditions on economic primitives that make mandatory deferral socially (un)desirable.
In its meeting on 6 September 2012, the Governing Council of the ECB took decisions on a number of technical features regarding the Eurosystem’s Outright Monetary Transactions in secondary sovereign bond markets (OMT). This decision was challenged before the German Federal Constitutional Court (GFCC) by a number of constitutional complaints and other petitions. In its seminal judgment of 14 January 2014, the German court expressed serious doubts about the compatibility of the ECB’s decision with European Union law.
It admitted the complaints and petitions even though actual purchases had not yet been executed and reviewing acts of an organ of the EU is, in principle, not the task of the GFCC. As justification for this procedure, the court drew on its case law on a reserved “ultra vires” review and on the defense of Germany’s “constitutional identity”. In the end, however, the court referred the case pursuant to Article 267 TFEU to the European Court of Justice (ECJ) for preliminary rulings on several questions of EU law. In substance, the German court assessed OMT as an act of economic policy that is not covered by the competences of the ECB. Furthermore, it judged OMT to be monetary financing of sovereign debt, which is prohibited by EU primary law. The ECB’s defense (disruption of the monetary policy transmission mechanism) was dismissed without closer scrutiny as “irrelevant”. Finally, however, the court opened a path to compromise by interpreting OMT in conformity with EU law, subject to preconditions it specified in detail.
Procedure and findings of this judgment were harshly criticized by many economists, but also by the majority of legal scholars. This criticism is largely convincing with regard to the admissibility of the complaints. Even if the “ultra vires” review is in line with prior decisions of the court, it is expanded further in this judgment without compelling reasons. It is also questionable whether the standing of the complaining parties had to be accepted and whether the referral to the ECJ was warranted. The arguments of the court are, however, conclusive in respect of the transgression of competences by the ECB and, to a somewhat lesser extent, in respect of the monetary financing of sovereign debt. The dismissal of the defense as “irrelevant” is absolutely persuasive.
The Treaty of Maastricht imposed on the European Union (EU) the strict obligation to establish an economic and monetary union, now Article 3(4) TEU. This economic and monetary union is, however, not designed as a separate entity but as an integral part of the EU. The single currency was to become the currency of the EU and to be legal tender in all Member States unless an exemption was explicitly granted in the primary law of the EU, as in the case of the UK and Denmark. Newly admitted Member States are obliged to introduce the euro as their currency as soon as they fulfil the admission criteria. Technically, this has been achieved by transferring the exclusive competence for the monetary policy of the Member States whose currency is the euro to the EU, Article 3(1)(c) TFEU, and by bestowing on the euro the quality of legal tender, the only legal tender in the EU, Article 128(1) sentence 3 TFEU.
Savings accounts are owned by most households, but little is known about the performance of households’ investments. We create a unique dataset by matching information on individual savings accounts from the DNB Household Survey with market data on account-specific interest rates and characteristics. We document considerable heterogeneity in returns across households, which can be partly explained by financial sophistication. A one-standard-deviation increase in financial literacy is associated with a 13% increase relative to the median interest rate. We isolate the usage of modern technology (online accounts) as one channel through which financial literacy has a positive association with returns.
A theory of the boundaries of banks with implications for financial integration and regulation
(2015)
We offer a theory of the “boundary of the firm” that is tailored to banking, as it builds on a single inefficiency arising from risk-shifting and as it takes into account both interbank lending as an alternative to integration and the role of possibly insured deposit funding. Among other things, it explains both why deeper economic integration should also cause greater financial integration through both bank mergers and interbank lending, albeit one that typically remains inefficiently incomplete, and why economic disintegration (or “desynchronization”), as currently witnessed in the European Union, should cause less interbank exposure. It also suggests that recent policy measures such as the preferential treatment of retail deposits, the extension of deposit insurance, or penalties on “connectedness” could all lead to substantial welfare losses.
In this paper, we examine how the institutional design affects the outcome of bank bailout decisions. In the German savings bank sector, distress events can be resolved by local politicians or a state-level association. We show that decisions by local politicians with close links to the bank are distorted by personal considerations: While distress events per se are not related to the electoral cycle, the probability of local politicians injecting taxpayers’ money into a bank in distress is 30 percent lower in the year directly preceding an election. Using the electoral cycle as an instrument, we show that banks that are bailed out by local politicians experience less restructuring and perform considerably worse than banks that are supported by the savings bank association. Our findings illustrate that larger distance between banks and decision makers reduces distortions in the decision making process, which has implications for the design of bank regulation and supervision.
This study contains articles based on speeches and presentations at the 14th CFS-IMFS Conference "The ECB and its Watchers" on June 15, 2012 by Mario Draghi, John Vickers, Peter Praet, Lucrezia Reichlin, Vitor Gaspar, Lucio Pench and Stefan Gerlach and a post-conference outlook by Helmut Siekmann and Volker Wieland.
This paper investigates the risk channel of monetary policy on the asset side of banks’ balance sheets. We use a factor-augmented vector autoregression (FAVAR) model to show that aggregate lending standards of U.S. banks, such as their collateral requirements for firms, are significantly loosened in response to an unexpected decrease in the Federal Funds rate. Based on this evidence, we reformulate the costly state verification (CSV) contract to allow for an active financial intermediary, embed it in a New Keynesian dynamic stochastic general equilibrium (DSGE) model, and show that – consistent with our empirical findings – an expansionary monetary policy shock implies a temporary increase in bank lending relative to borrower collateral. In the model, this is accompanied by a higher default rate of borrowers.
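The impulse-response mechanics underlying this kind of (FA)VAR evidence can be illustrated with a toy bivariate VAR(1); the coefficients below are made up and do not come from the paper:

```python
# Toy impulse-response calculation for a bivariate VAR(1),
# x_t = A x_{t-1} + e_t, with invented coefficients (not the paper's FAVAR).
# Variable 1: policy rate; variable 2: an indicator of lending standards.

A = [[0.5, 0.2],
     [0.1, 0.6]]

def impulse_response(A, shock, horizon):
    """Trace how a one-off shock propagates through the system over time."""
    path = [shock]
    for _ in range(horizon):
        prev = path[-1]
        # Matrix-vector product: next state = A @ previous state.
        path.append([sum(a * p for a, p in zip(row, prev)) for row in A])
    return path

# Unit shock to the policy rate (variable 1), traced for 3 periods.
irf = impulse_response(A, [1.0, 0.0], 3)
for t, x in enumerate(irf):
    print(t, [round(v, 3) for v in x])
```

Period by period, the shock to the first variable spills over into the second and both decay geometrically, which is the shape of the responses a FAVAR study reports; the factor-augmented version simply extracts the variables from a large panel first.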
Are rules and boundaries sufficient to limit harmful central bank discretion? Lessons from Europe
(2014)
Marvin Goodfriend’s (2014) insightful, informative and provocative work explains concisely and convincingly why the Fed needs rules and boundaries. This paper reviews the broader institutional design problem regarding the effectiveness of the central bank in practice and confirms the need for rules and boundaries. The framework proposed for improving the Fed incorporates key elements that have already been adopted in the European Union. The case of ELA provision by the ECB and the Central Bank of Cyprus to Marfin-Laiki Bank during the crisis, however, suggests that the existence of rules and boundaries may not be enough to limit harmful discretion. During a crisis, novel interpretations of the legal authority of the central bank may be introduced to create a grey area that might be exploited to justify harmful discretionary decisions even in the presence of rules and boundaries. This raises the question of how to ensure that rules and boundaries are respected in practice.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact. We consider two types of dynamic stochastic general equilibrium models: a neoclassical growth model and more complicated models with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the initial model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run.
How special are they? - Targeting systemic risk by regulating shadow banking (October 5, 2014)
(2014)
This essay argues that at least some of the financial stability concerns associated with shadow banking can be addressed by an approach to financial regulation that imports its functional foundations more vigorously into the interpretation and implementation of existing rules. It shows that the general policy goals of prudential banking regulation remain constant over time despite dramatic transformations in the financial and technological landscape. Moreover, these overarching policy goals also legitimize intervention in the shadow banking sector. On these grounds, this essay encourages a more normative construction of available rules that potentially limits both the scope for regulatory arbitrage and the need for ever more rapid updates and a constant increase in the complexity of the regulatory framework. By tying the regulatory treatment of financial innovation closely to existing prudential rules and their underlying policy rationales, the proposed approach potentially ends the socially wasteful race between hare and tortoise that characterizes the relationship between regulators and a highly dynamic industry. In doing so, it does not generally hamper market participants’ efficient discoveries where disintermediation proves socially beneficial. Instead, it only weeds out rent-seeking circumventions of existing rules and standards.
In this paper, we investigate how the introduction of complex, model-based capital regulation affected credit risk of financial institutions. Model-based regulation was meant to enhance the stability of the financial sector by making capital charges more sensitive to risk. Exploiting the staggered introduction of the model-based approach in Germany and the richness of our loan-level data set, we show that (1) internal risk estimates employed for regulatory purposes systematically underpredict actual default rates by 0.5 to 1 percentage points; (2) both default rates and loss rates are higher for loans that were originated under the model-based approach, while corresponding risk-weights are significantly lower; and (3) interest rates are higher for loans originated under the model-based approach, suggesting that banks were aware of the higher risk associated with these loans and priced them accordingly. Further, we document that large banks benefited from the reform as they experienced a reduction in capital charges and consequently expanded their lending at the expense of smaller banks that did not introduce the model-based approach. Counter to the stated objectives, the introduction of complex regulation adversely affected the credit risk of financial institutions. Overall, our results highlight the pitfalls of complex regulation and suggest that simpler rules may increase the efficacy of financial regulation.
The recent decline in euro area inflation has triggered new calls for additional monetary stimulus by the ECB in order to counter the threat of a self-reinforcing deflation and recession spiral. This note reviews the available evidence on inflation expectations, output gaps and other factors driving current inflation through the lens of the Phillips curve. It also draws a comparison to the Japanese experience with deflation in the late 1990s and the evidence from Japan concerning the output-inflation nexus at low trend inflation. The note concludes from this evidence that the risk of a self-reinforcing deflation remains very small. Thus, the ECB should best await the impact of the long-term refinancing operations decided in June, which have the potential to induce substantial monetary accommodation once implemented for the first time in September.
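The "lens of the Phillips curve" referred to above can be summarized by the standard New Keynesian form (a textbook specification, not necessarily the exact equation used in the note):

```latex
\pi_t = \beta\, \mathbb{E}_t[\pi_{t+1}] + \kappa\, x_t + u_t
```

where $\pi_t$ is inflation, $\mathbb{E}_t[\pi_{t+1}]$ expected future inflation, $x_t$ the output gap, $\kappa > 0$ the slope, $\beta$ a discount factor, and $u_t$ a cost-push shock. In this framing, as long as inflation expectations remain anchored near target, a negative output gap pulls inflation down but does not by itself produce a self-reinforcing spiral; a spiral requires the expectations term itself to drift downward.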
Dem Druck standhalten [Withstanding the pressure]
(2013)