High-frequency changes in interest rates around FOMC announcements are a standard method of measuring monetary policy shocks. However, some recent studies have documented puzzling effects of these shocks on private-sector forecasts of GDP, unemployment, or inflation that are opposite in sign to what standard macroeconomic models would predict. This evidence has been viewed as supportive of a “Fed information effect” channel of monetary policy, whereby an FOMC tightening (easing) communicates that the economy is stronger (weaker) than the public had expected.
The authors show that these empirical results are also consistent with a “Fed response to news” channel, in which incoming, publicly available economic news causes both the Fed to change monetary policy and the private sector to revise its forecasts. They provide substantial new evidence that distinguishes between these two channels and strongly favors the latter; for example, regressions that include the previously omitted public macroeconomic news, high-frequency stock market responses to Fed announcements, and a new survey that they conduct of individual Blue Chip forecasters all indicate that the Fed and private sector are simply responding to the same public news, and that there is little if any role for a “Fed information effect”.
Whatever it takes to understand a central banker: embedding their words using neural networks
(2023)
Dictionary approaches are at the forefront of current techniques for quantifying central bank communication. In this paper, the authors propose a novel language model that is able to capture subtleties in messages, such as one of the most famous sentences in central bank communication, when ECB President Mario Draghi stated that "within [its] mandate, the ECB is ready to do whatever it takes to preserve the euro".
The authors utilize a text corpus that is unparalleled in size and diversity in the central bank communication literature, as well as introduce a novel approach to text quantification from computational linguistics. This allows them to provide high-quality central bank-specific textual representations and demonstrate their applicability by developing an index that tracks deviations in the Fed's communication towards inflation targeting. Their findings indicate that these deviations in communication significantly impact monetary policy actions, substantially reducing the reaction towards inflation deviation in the US.
We use a novel disaggregate sectoral euro area data set with a regional breakdown to investigate price changes and suggest a new method to extract factors from overlapping data blocks. This allows us to separately estimate aggregate, sectoral, country-specific and regional components of price changes. We thereby provide an improved estimate of the sectoral factor in comparison with previous literature, which decomposes price changes into an aggregate and idiosyncratic component only, and interprets the latter as sectoral. We find that the sectoral component explains much less of the variation in sectoral regional inflation rates and exhibits much less volatility than previous findings for the US indicate. We further contribute to the literature on price setting by providing evidence that country- and region-specific factors play an important role in addition to the sector-specific factors, emphasising heterogeneity of inflation dynamics along different dimensions. We also conclude that sectoral price changes have a “geographical” dimension that leads to new insights regarding the properties of sectoral price changes.
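The block structure described above can be illustrated with a deliberately simplified sketch: here each component is recovered by plain cell averages over the blocks of series that share it (the paper uses a proper factor extraction method), and all data are simulated rather than taken from the euro area data set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated panel of price changes indexed by (time, sector, country).
T, n_sec, n_cty = 200, 4, 6
agg = rng.normal(size=T)                      # aggregate factor
sec = rng.normal(size=(T, n_sec))             # sector-specific factors
cty = rng.normal(size=(T, n_cty))             # country-specific factors

panel = (agg[:, None, None]
         + 0.8 * sec[:, :, None]
         + 0.6 * cty[:, None, :]
         + rng.normal(scale=0.5, size=(T, n_sec, n_cty)))

# Hierarchical decomposition: estimate each component from the block of
# series that share it, then strip it out before the next step.
agg_hat = panel.mean(axis=(1, 2))             # aggregate component
resid = panel - agg_hat[:, None, None]
sec_hat = resid.mean(axis=2)                  # sectoral components
resid = resid - sec_hat[:, :, None]
cty_hat = resid.mean(axis=1)                  # country components
idio = resid - cty_hat[:, None, :]            # idiosyncratic remainder

# The estimated aggregate component should track the true factor closely.
corr = np.corrcoef(agg, agg_hat)[0, 1]
```

Averaging over all series that load on a component is the simplest consistent estimator in this balanced, equal-loading setup; with heterogeneous loadings, a principal-components step per block would take its place.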
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
In this paper, we examine how the institutional design affects the outcome of bank bailout decisions. In the German savings bank sector, distress events can be resolved by local politicians or a state-level association. We show that decisions by local politicians with close links to the bank are distorted by personal considerations: While distress events per se are not related to the electoral cycle, the probability of local politicians injecting taxpayers’ money into a bank in distress is 30 percent lower in the year directly preceding an election. Using the electoral cycle as an instrument, we show that banks that are bailed out by local politicians experience less restructuring and perform considerably worse than banks that are supported by the savings bank association. Our findings illustrate that larger distance between banks and decision makers reduces distortions in the decision making process, which has implications for the design of bank regulation and supervision.
In this paper, we investigate how the introduction of complex, model-based capital regulation affected credit risk of financial institutions. Model-based regulation was meant to enhance the stability of the financial sector by making capital charges more sensitive to risk. Exploiting the staggered introduction of the model-based approach in Germany and the richness of our loan-level data set, we show that (1) internal risk estimates employed for regulatory purposes systematically underpredict actual default rates by 0.5 to 1 percentage points; (2) both default rates and loss rates are higher for loans that were originated under the model-based approach, while corresponding risk-weights are significantly lower; and (3) interest rates are higher for loans originated under the model-based approach, suggesting that banks were aware of the higher risk associated with these loans and priced them accordingly. Further, we document that large banks benefited from the reform as they experienced a reduction in capital charges and consequently expanded their lending at the expense of smaller banks that did not introduce the model-based approach. Counter to the stated objectives, the introduction of complex regulation adversely affected the credit risk of financial institutions. Overall, our results highlight the pitfalls of complex regulation and suggest that simpler rules may increase the efficacy of financial regulation.
This dissertation consists of three essays, which study the relation between stock prices and the macroeconomy using vector autoregressions (VARs). The first essay focuses on the link between stock prices and the current account. I find that stock markets provide a channel, in addition to the traditional exchange rate channel, through which external balance for a country with a current account imbalance can be restored. The second essay explores the transmission of U.S. stock price shocks to real activity and prices in G-7 countries. I achieve identification by imposing a small number of sign restrictions on impulse responses, while controlling for monetary policy, business cycle and government spending shocks. The results suggest that stock price movements are important for fluctuations in G-7 real activity and prices, but do not qualify as demand side business cycle shocks. The third essay investigates the impact of monetary and technology shocks on the stock market. I find an important role for technology shocks, but not monetary shocks, in explaining variations in real stock prices. The identification method is flexible enough to study the effects of technology news shocks. The responses are consistent with the idea that news on technology improvements has an immediate impact on stock prices.
This paper explores the relationship between equity prices and the current account for 17 industrialized countries in the period 1980-2007. Based on a panel vector autoregression, I compare the effects of equity price shocks to those originating from monetary policy and exchange rates. While monetary policy shocks have a limited impact, shocks to equity prices have sizeable effects. The results suggest that equity prices impact on the current account through their effects on real activity and exchange rates. Furthermore, shocks to exchange rates play a key role as well.
Keywords: current account fluctuations, equity prices, panel vector autoregression
We investigate how unconventional monetary policy, via central banks’ purchases of corporate bonds, unfolds in credit-saturated markets. While this policy results in a loosening of credit market conditions as intended by policymakers, we report two unintended side effects. First, the policy impacts the allocation of credit among industries. Affected banks reallocate loans from investment-grade firms active on bond markets almost entirely to real estate asset managers. Other industries do not obtain more loans, particularly real estate developers and construction firms. We document an increase in real estate prices due to this policy, which fuels real estate overvaluation. Second, more loan write-offs arise from lending to these firms, and banks are not compensated for this risk by higher interest rates. We document a drop in bank profitability and, at the same time, a higher reliance on real estate collateral. Our findings suggest that central banks’ quantitative easing has substantial adverse effects in credit-saturated economies.
The Russian war of aggression against Ukraine since 24 February 2022 has intensified the discussion of Europe’s reliance on energy imports from Russia. A ban on Russian imports of oil, natural gas and coal has already been imposed by the United States, while the United Kingdom plans to cease imports of oil and coal from Russia by the end of 2022. The German Federal Government is currently opposing an energy embargo against Russia. However, the Federal Ministry for Economic Affairs and Climate Action is working on a strategy to reduce energy imports from Russia. In this paper, the authors give an overview of the German and European reliance on energy imports from Russia with a focus on gas imports and discuss price effects, alternative suppliers of natural gas, and the potential for saving and replacing natural gas. They also provide an overview of estimates of the consequences for the economic outlook if the conflict intensifies.
Empirical estimates of equilibrium real interest rates are so far mostly limited to advanced economies, since no statistical procedure suitable for a large set of countries is available. This is surprising, as equilibrium rates have strong policy implications in emerging markets and developing economies as well; current estimates of the global equilibrium rate rely on only a few countries; and estimates for a more diverse set of countries can improve understanding of the drivers. The authors propose a model and estimation strategy that decompose ex ante real interest rates into a permanent and transitory component even with short samples and high volatility. This is done with an unobserved component local level stochastic volatility model, which is used to estimate equilibrium rates for 50 countries with Bayesian methods.
Equilibrium rates were lower in emerging markets and developing economies than in advanced economies in the 1980s, similar in the 1990s, and have been higher since 2000. In line with economic integration and rising global capital markets, synchronization has been rising over time and is higher among advanced economies. Equilibrium rates of countries with stronger trade linkages and similar demographic and economic trends are more synchronized.
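The permanent-transitory decomposition at the heart of this approach can be illustrated with a plain local level model and a Kalman filter. The stochastic volatility layer and the Bayesian estimation of the paper are omitted here, and all numbers are simulated, so this is only a sketch of the filtering idea.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an ex ante real rate: random-walk trend (the equilibrium rate)
# plus white-noise transitory deviations.
T, q, r = 300, 0.02, 0.40           # q: trend innovation var, r: noise var
trend = np.cumsum(rng.normal(scale=np.sqrt(q), size=T))
rate = trend + rng.normal(scale=np.sqrt(r), size=T)

# Kalman filter for the local level model:
#   rate_t = mu_t + eps_t,   mu_t = mu_{t-1} + eta_t
mu, P = 0.0, 1.0                    # state mean and variance
mu_hat = np.empty(T)
for t in range(T):
    P += q                          # predict: trend variance grows
    K = P / (P + r)                 # Kalman gain
    mu += K * (rate[t] - mu)        # update with the observed rate
    P *= (1 - K)
    mu_hat[t] = mu

# The filtered trend should sit much closer to the true trend than the
# raw, noisy rate does.
rmse_raw = np.sqrt(np.mean((rate - trend) ** 2))
rmse_filt = np.sqrt(np.mean((mu_hat - trend) ** 2))
```

With short samples and high volatility, as stressed in the abstract, the gain K stays well below one and the filter shrinks noisy observations heavily toward the slowly moving trend.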
Estimates of medium-term equilibrium interest rates based on the method of Laubach and Williams (2003) are now widely cited in the debate on monetary and fiscal policy. Among other uses, Summers (2014a) has cited them as evidence for secular stagnation, and Yellen (2015) has used them to justify the zero interest rate policy. In this paper, we undertake an extensive examination and sensitivity analysis of these estimates for the United States, Germany and the euro area. Given the high uncertainty and sensitivity associated with estimates of medium-term equilibrium rates obtained with the Laubach-Williams method and similar approaches, these estimates should not be decisive for major course changes in monetary and fiscal policy.
The current debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. Beyer and Wieland re-estimate the U.S. equilibrium rate with the methodology of Laubach and Williams and further modifications. They provide new estimates for the United States, the euro area and Germany and subject them to sensitivity tests. Beyer and Wieland conclude that, due to the great uncertainty and sensitivity, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if those estimates are employed to determine the appropriate monetary policy stance, they are better used together with a consistent estimate of the level of potential output.
The debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. In particular, this concerns estimates derived from a simple aggregate demand and Phillips curve model with time-varying components as proposed by Laubach and Williams (2003). For example, Summers (2014a) refers to these estimates as important evidence for a secular stagnation and the need for fiscal stimulus. Yellen (2015, 2017) has made use of such estimates in order to explain and justify why the Federal Reserve has held interest rates so low for so long. First, we re-estimate the U.S. equilibrium rate with the methodology of Laubach and Williams (2003). Second, we build on their approach and an alternative specification to provide new estimates for the United States, Germany, the euro area and Japan. Third, we subject these estimates to a battery of sensitivity tests. Due to the great uncertainty and sensitivity that accompany these equilibrium rate estimates, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if these estimates are employed to determine the appropriate monetary policy stance, they are better used together with a consistent estimate of the level of potential output.
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions as compared to earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the euro area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
The purpose of the data presented in this article is to use it in ex post estimations of interest rate decisions by the European Central Bank (ECB), as it is done by Bletzinger and Wieland (2017) [1]. The data is of quarterly frequency from 1999 Q1 until 2013 Q2 and consists of the ECB's policy rate, inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data consists of expectations about future inflation and output dynamics. While potential output is constructed based on data from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication date of the collected data to ensure a real-time dataset only consisting of information which was available to the decision makers at the time of the decision.
A number of contributions to research on monetary policy have suggested that policy should be asymmetric near the lower bound on nominal interest rates. As inflation and economic activity decline, policy should ease more aggressively than it would in the absence of the lower bound. As activity recovers and inflation picks up, the central bank should act to keep interest rates lower for longer than without the bound. In this note, we investigate to what extent the policy easing implemented by the ECB since summer 2013 mirrors the rate recommendations of a simple policy rule or deviates from it in a way that indicates a “lower for longer” approach to policy near zero interest rates.
On July 4, 2013 the ECB Governing Council provided more specific forward guidance than in the past by stating that it expects ECB interest rates to remain at present or lower levels for an extended period of time. As explained by ECB President Mario Draghi this expectation is based on the Council’s medium-term outlook for inflation conditional on economic activity and money and credit. Draghi also stressed that there is no precise deadline for this extended period of time, but that a reasonable period can be estimated by extracting a reaction function. In this note, we use such a reaction function, namely the interest rate rule from Orphanides and Wieland (2013) that matches past ECB interest rate decisions quite well, to project the rate path consistent with inflation and growth forecasts from the survey of professional forecasters published by the ECB on August 8, 2013. This evaluation suggests an increase in ECB interest rates by May 2014 at the latest. We also use the Eurosystem staff projection from June 6, 2013 for comparison. While it would imply a longer period of low rates, it does not match past ECB decisions as well as the reaction function with SPF forecasts.
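A stylized version of such a forward-looking, first-difference reaction function can be sketched as follows. The coefficients, target values and forecast figures are purely illustrative and are not the Orphanides-Wieland estimates or the SPF data used in the note.

```python
# Stylized first-difference reaction function: the policy rate moves with
# forecast deviations of inflation from target and of output growth from
# potential, subject to a lower bound. All numbers are illustrative.

def rule(rate_prev, infl_forecast, growth_forecast,
         infl_target=2.0, growth_potential=1.5,
         a_pi=0.5, a_y=0.5, floor=0.0):
    """One step of a first-difference interest rate rule with a floor."""
    rate = rate_prev + a_pi * (infl_forecast - infl_target) \
                     + a_y * (growth_forecast - growth_potential)
    return max(rate, floor)

# Project a rate path from a sequence of hypothetical survey forecasts
# (inflation forecast, growth forecast), improving over time.
path, rate = [], 0.5
forecasts = [(1.5, 1.0), (1.7, 1.4), (1.9, 1.8), (2.1, 2.2)]
for infl_f, growth_f in forecasts:
    rate = rule(rate, infl_f, growth_f)
    path.append(round(rate, 2))
```

The projected path stays at the floor while forecasts remain below target and potential, then lifts off once they recover, which is the qualitative exercise the note performs with actual SPF and Eurosystem staff projections.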
We analyze the macroeconomic implications of increasing the top marginal income tax rate using a dynamic general equilibrium framework with heterogeneous agents and a fiscal structure resembling the actual U.S. tax system. The wealth and income distributions generated by our model replicate the empirical ones. In two policy experiments, we increase the statutory top marginal tax rate from 35 to 70 percent and redistribute the additional tax revenue among households, either by decreasing all other marginal tax rates or by paying out a lump-sum transfer to all households. We find that increasing the top marginal tax rate decreases inequality in both wealth and income but also leads to a contraction of the aggregate economy. This is primarily driven by the negative effects that the tax change has on top income earners. The aggregate gain in welfare is sizable in both experiments mainly due to a higher degree of distributional equality.
How does the need to preserve government debt sustainability affect the optimal monetary and fiscal policy response to a liquidity trap? To provide an answer, we employ a small stochastic New Keynesian model with a zero bound on nominal interest rates and characterize optimal time-consistent stabilization policies. We focus on two policy tools, the short-term nominal interest rate and debt-financed government spending. The optimal policy response to a liquidity trap critically depends on the prevailing debt burden. While the optimal amount of government spending is decreasing in the level of outstanding government debt, future monetary policy is becoming more accommodative, triggering a change in private sector expectations that helps to dampen the fall in output and inflation at the outset of the liquidity trap.
The author proposes a Differential-Independence Mixture Ensemble (DIME) sampler for the Bayesian estimation of macroeconomic models. It allows sampling from particularly challenging, high-dimensional black-box posterior distributions which may also be computationally expensive to evaluate. DIME is a “Swiss Army knife”, combining the advantages of a broad class of gradient-free global multi-start optimizers with the properties of a Markov chain Monte Carlo (MCMC) method. This includes fast burn-in and convergence absent any prior numerical optimization or initial guesses, good performance for multimodal distributions, a large number of chains (the “ensemble”) running in parallel, and an endogenous proposal density generated from the state of the full ensemble, which respects the bounds of the prior distribution. The author shows that the number of parallel chains scales well with the number of necessary ensemble iterations.
DIME is used to estimate the medium-scale heterogeneous agent New Keynesian (“HANK”) model with liquid and illiquid assets, thereby, for the first time, allowing the households’ preference parameters to be included in the estimation. The results mildly point towards a less accentuated role of household heterogeneity for the empirical macroeconomic dynamics.
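The flavor of such an ensemble sampler can be conveyed with the classic differential evolution MCMC scheme of ter Braak (2006) on a toy bimodal target. DIME itself differs in how it constructs proposals, so this is only a generic sketch of ensemble sampling from a multimodal black-box posterior.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_post(theta):
    """Toy bimodal 'black-box' posterior: equal mixture of two Gaussians
    centered at (2, 2) and (-2, -2)."""
    d1 = np.sum((theta - 2.0) ** 2)
    d2 = np.sum((theta + 2.0) ** 2)
    return np.logaddexp(-0.5 * d1, -0.5 * d2)

n_chains, dim, n_iter = 40, 2, 1500
chains = rng.normal(scale=3.0, size=(n_chains, dim))   # overdispersed start
logp = np.array([log_post(c) for c in chains])
gamma = 2.38 / np.sqrt(2 * dim)                        # classic DE scaling

draws = []
for it in range(n_iter):
    for i in range(n_chains):
        # Propose along the difference of two other randomly chosen chains.
        a, b = rng.choice(n_chains - 1, size=2, replace=False)
        a += a >= i                                    # remap to skip chain i
        b += b >= i
        prop = chains[i] + gamma * (chains[a] - chains[b]) \
               + rng.normal(scale=1e-4, size=dim)      # small jitter
        lp = log_post(prop)
        if np.log(rng.uniform()) < lp - logp[i]:       # Metropolis step
            chains[i], logp[i] = prop, lp
    if it >= n_iter // 2:                              # keep post-burn-in
        draws.append(chains.copy())

sample = np.concatenate(draws)
share_pos = np.mean(sample[:, 0] > 0)                  # mass in each mode
```

Because proposals are built from the ensemble itself, the proposal scale and orientation adapt to the posterior without gradients or tuning, which is the property the abstract emphasizes.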
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank’s interest rate setting can amplify the expectation feedback in the financial market and this can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well-advised to handle tools like asset price targeting with care since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor indices are a good indicator to base decisions on.
Occasionally binding constraints have become an important part of economic modelling, especially since western central banks see themselves (again) constrained by the so-called zero lower bound (ZLB) on the nominal interest rate. A binding ZLB constraint poses a major problem for a quantitative-structural analysis: linear solution methods do not work in the presence of a nonlinearity such as the ZLB, and existing alternatives tend to be computationally demanding. The urge to study macroeconomic questions related to the Great Recession and the Covid-19 crisis in a quantitative-structural framework requires algorithms that are not only accurate, but also robust, fast, and computationally efficient.
A particularly important application where efficient and fast methods for occasionally binding constraints (OBCs) are needed is the Bayesian estimation of macroeconomic models. This paper shows that a linear dynamic rational expectations system with OBCs, depending on the expected duration of the constraint, can be represented in closed form. Combined with a set of simple equilibrium conditions, this can be exploited to avoid matrix inversions and simulations at runtime, for significant gains in computational speed.
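The core nonlinearity that breaks linear solvers can be seen in a minimal backward-looking simulation in which the policy rule is wrapped in a max operator. All coefficients below are illustrative and unrelated to the paper's model; the point is only that the kink makes the system piecewise linear, with the binding spell endogenous to the path.

```python
# Toy backward-looking model with a zero lower bound on the policy rate:
#   i_t  = max(0, rstar + 1.5 * pi_t)            (rule with ZLB kink)
#   pi_t = 0.8 * pi_{t-1} + 0.2 * x_{t-1}        (Phillips curve)
#   x_t  = 0.7 * x_{t-1} - 0.5 * (i - pi - rstar)  (real-rate gap demand)
# Illustrative coefficients; not taken from the paper.

rstar = 1.0
pi, x = -1.0, -2.0          # start in a deep slump with deflation
rates = []
for t in range(40):
    i = max(0.0, rstar + 1.5 * pi)              # the occasionally binding rule
    rates.append(i)
    pi, x = (0.8 * pi + 0.2 * x,                # update both states using
             0.7 * x - 0.5 * (i - pi - rstar))  # last period's values
```

Along this deterministic path the bound binds for the first several periods and then lifts as the slump fades; a purely linear solution would instead set the rate deeply negative and miss the constrained dynamics entirely.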
The level of capital taxation has high explanatory power regarding the question of what drives economic inequality. On this basis, the authors develop a simple, yet micro-founded portfolio selection model to explain the dynamics of wealth inequality given empirical tax series in the US. The results emphasize that the level and the transition speed of wealth inequality depend crucially on the degree of capital taxation. The projections predict that – continuing on the present path of capital taxation in the US – the gap between rich and poor is expected to shrink, whereas “massive” tax cuts will further increase the degree of wealth concentration.
Did the Federal Reserve’s Quantitative Easing (QE) in the aftermath of the financial crisis have macroeconomic effects? To answer this question, the authors estimate a large-scale DSGE model over the sample from 1998 to 2020, including data on the Fed’s balance sheet. The authors allow for QE to affect the economy via multiple channels that arise from several financial frictions. Their nonlinear Bayesian likelihood approach fully accounts for the zero lower bound on nominal interest rates. They find that between 2009 and 2015, QE increased output by about 1.2 percent. This reflects a net increase in investment of nearly 9 percent, accompanied by a 0.7 percent drop in aggregate consumption. Both government bond and capital asset purchases were effective in improving financing conditions. Capital asset purchases in particular significantly facilitated new investment and increased production capacity. Against the backdrop of a fall in consumption, supply-side effects dominated, which led to a mild disinflationary effect of about 0.25 percent annually.
Can boundedly rational agents survive competition with fully rational agents? The authors develop a highly nonlinear heterogeneous agents model with rational, forward-looking agents versus boundedly rational, backward-looking agents, and market shares that evolve depending on their relative performance. Their novel numerical solution method detects equilibrium paths characterized by complex bubble and crash dynamics. Boundedly rational trend-extrapolators amplify small deviations from fundamentals, while rational agents anticipate market crashes after large bubbles and drive prices back close to fundamental value. Overall, rational and non-rational beliefs co-evolve over time, with time-varying impact, and their interaction produces complex endogenous bubbles and crashes, without any exogenous shocks.
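A minimal two-type switching model in the spirit of Brock and Hommes conveys the basic mechanism of performance-based belief competition. The parameters below are deliberately chosen in the stable region, so the price converges back to fundamentals; the paper studies richer settings where extrapolation is strong enough to generate endogenous bubbles and crashes.

```python
import math

# Two belief types forecasting next period's asset price: trend
# extrapolators vs. fundamentalists, with market shares updated by a
# discrete-choice rule on past forecast accuracy. Illustrative parameters.

R, pf = 1.05, 10.0              # gross return, fundamental price
g, beta = 1.02, 2.0             # extrapolation strength, choice intensity
y = pf * (R - 1)                # dividend consistent with the fundamental
price = 10.4                    # start away from the fundamental
f_trend_old = f_fund_old = pf   # last period's forecasts (for fitness)
n_trend = 0.5                   # market share of trend extrapolators
path = []
for t in range(300):
    f_trend = pf + g * (price - pf)   # extrapolate the current deviation
    f_fund = pf                       # expect reversion to the fundamental
    new_price = (n_trend * f_trend + (1 - n_trend) * f_fund + y) / R
    # Fitness: negative squared error of the forecasts made last period.
    u_trend = -(f_trend_old - new_price) ** 2
    u_fund = -(f_fund_old - new_price) ** 2
    n_trend = 1.0 / (1.0 + math.exp(-beta * (u_trend - u_fund)))
    f_trend_old, f_fund_old = f_trend, f_fund
    price = new_price
    path.append(price)
```

With extrapolation strength above the discount factor (g > R), the same loop instead produces self-reinforcing deviations, which is the unstable region the paper's solution method is built to handle.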
The recently observed disconnect between inflation and economic activity can be explained by the interplay between the zero lower bound (ZLB) and the costs of external financing. In normal times, credit spreads and the nominal interest rate balance out; factor costs dominate firms' marginal costs. When nominal rates are constrained, larger spreads can more than offset the effect of lower factor costs and induce only moderate inflation responses. The Phillips curve is hence flat at the ZLB, but features a positive slope in normal times and thus a hockey stick shape. Via this mechanism, forward guidance may induce deflationary effects.
Using a nonlinear Bayesian likelihood approach that fully accounts for the zero lower bound on nominal interest rates, the authors analyze US post-crisis business cycle dynamics and provide reference parameter estimates. They find that neither the inclusion of financial frictions nor that of household heterogeneity improve the empirical fit of the standard model, or its ability to provide a joint explanation for the post-2007 dynamics. Associated financial shocks mis-predict an increase in consumption. The common practice of omitting the ZLB period in the estimation severely distorts the analysis of the more recent economic dynamics.
Renewed interest in fiscal policy has increased the use of quantitative models to evaluate policy. Because of modeling uncertainty, it is essential that policy evaluations be robust to alternative assumptions. We find that models currently being used in practice to evaluate fiscal policy stimulus proposals are not robust. Government spending multipliers in an alternative empirically-estimated and widely-cited new Keynesian model are much smaller than in these old Keynesian models; the estimated stimulus is extremely small with GDP and employment effects only one-sixth as large.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact focussing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
Recently, we evaluated a fiscal consolidation strategy for the United States that would bring the government budget into balance by gradually reducing government spending relative to GDP to the ratio that prevailed prior to the crisis (Cogan et al., JEDC 2013). Specifically, we published an analysis of the macroeconomic consequences of the 2013 Budget Resolution that was passed by the U.S. House of Representatives in March 2012. In this note, we provide an update of our research that evaluates this year’s budget reform proposal that is to be discussed and voted on in the House of Representatives in March 2013. Contrary to the views voiced by critics of fiscal consolidation, we show that such a reduction in government purchases and transfer payments can increase GDP immediately and permanently relative to a policy without spending restraint. Our research makes use of a modern structural model of the economy that incorporates the long-standing essential features of economics: opportunity costs, efficiency, foresight and incentives. GDP rises because households take into account that spending restraint helps avoid future increases in tax rates. Lower taxes imply less distorted incentives for work, investment and production relative to a scenario without fiscal consolidation and lead to higher growth.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact. We consider two types of dynamic stochastic general equilibrium models: a neoclassical growth model and more complicated models with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the initial model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run.
Savings accounts are owned by most households, but little is known about the performance of households’ investments. We create a unique dataset by matching information on individual savings accounts from the DNB Household Survey with market data on account-specific interest rates and characteristics. We document considerable heterogeneity in returns across households, which can be partly explained by financial sophistication. A one-standard-deviation increase in financial literacy is associated with a 13% increase compared to the median interest rate. We isolate the usage of modern technology (online accounts) as one channel through which financial literacy has a positive association with returns.
Corporate borrowers care about the overall riskiness of a bank’s operations as their continued access to credit may rely on the bank’s ability to roll over loans or to expand existing credit facilities. As we show, a key implication of this observation is that increasing competition among banks should have an asymmetric impact on banks’ incentives to take on risk: Banks that are already riskier will take on yet more risk, while their safer rivals will become even more prudent. Our results offer new guidance for bank supervision in an increasingly competitive environment and may help to explain existing, ambiguous findings on the relationship between competition and risk-taking in banking. Furthermore, our results stress the beneficial role that competition can have for financial stability as it turns a bank’s "prudence" into an important competitive advantage.
The authors study the effects of forward-looking communication in an environment of rising inflation rates on German consumers’ inflation expectations using a randomized control trial. They show that information about rising inflation increases short- and long-term inflation expectations. This initial increase in expectations can be mitigated using forward-looking information about inflation. Among these information treatments, professional forecasters’ projections seem to reduce inflation expectations by more than policymakers’ characterization of inflation as a temporary phenomenon.
I. INTRODUCTION II. PROPOSAL OF THE COMMERCIAL LAW SECTION TO THE 67th GERMAN JURISTS’ FORUM (DEUTSCHER JURISTENTAG) 1. Description and definition of terms 2. Rationale III. SIGNIFICANCE OF OVER-THE-COUNTER TRADING IN GERMANY IV. COMPARATIVE ANALYSIS OF STOCK CORPORATION AND CAPITAL MARKET LAW 1. Germany a) Organization of the capital market b) Differentiation within stock corporation law 2. United Kingdom a) Organization of the capital market b) Differentiations in the “Companies Act 2006” 3. USA a) Sources of corporate and capital market law b) Organization of the capital market c) Corporate law V. ASSESSMENT 1. Linking of existing rules to capital market orientation 2. Blurring of the boundaries between stock corporation and capital market law 3. Risk of abuse through a self-determined choice of statutory strictness 4. Previous reform approaches in the German literature 5. The departure from differentiation within stock corporation law in the current reform debate 6. Economic analysis of stock corporation law (“opt-in model”) VI. CONCLUSION: The deregulation approach that provides for a differentiation between listed and unlisted stock corporations is not to be endorsed. Against the background of the comparative analysis of the examples of the United Kingdom and the USA, a capital-market-oriented differentiation of the investor protection provisions of stock corporation law appears preferable instead. Linking deregulation measures to the criterion of capital market orientation can already be found, in outline, in current German law: both stock corporation law and capital market law contain correspondingly differentiating rules. Moreover, current national legislative initiatives and developments in European company law also show tendencies toward a distinction based on the criterion of distance from, or openness to, the capital market.
Moreover, the narrow scope of application of the mandatory investor protection norms of stock corporation law, which apply only to listed stock corporations, entails considerable risks of abuse. Stock corporations could move to over-the-counter trading in order to benefit from deregulation and lower transparency and investor protection requirements. Finally, the preference for a capital-market-oriented differentiation also follows from the current debate on reform approaches to increase the competitiveness of German company and capital market law. The abolition of statutory strictness demanded in this context, combined with the simultaneous codification of corresponding information and investor protection duties in capital market law, would make it possible to build on existing differentiations in capital market law.
Climate change has become one of the most prominent concerns globally. In this paper, the authors study the transition risk of greenhouse gas emission reduction in structural environmental-macroeconomic DSGE models. First, they analyze the uncertainty in model prediction on the effect of unanticipated and pre-announced carbon price increases. Second, they conduct optimal model-robust policy in different settings. They find that reducing emissions by 40% causes 0.7% to 4% output loss with 2% on average. Pre-announcement of carbon prices affects the inflation dynamics significantly. The central bank should react slightly less to inflation and output growth during the transition risk. With optimal carbon price designs, it should react even less to inflation, and more to output growth.
Optimal monetary policy studies typically rely on a single structural model and identification of model-specific rules that minimize the unconditional volatilities of inflation and real activity. In their proposed approach, the authors take a large set of structural models and look for the model-robust rules that minimize the volatilities at those frequencies that policymakers are most interested in stabilizing. Compared to the status quo approach, their results suggest that policymakers should be more restrained in their inflation responses when their aim is to stabilize inflation and output growth at specific frequencies. Additional caution is called for due to model uncertainty.
Fabo, Jančoková, Kempf, and Pástor (2021) show that papers written by central bank researchers find quantitative easing (QE) to be more effective than papers written by academics. Weale and Wieladek (2022) show that a subset of these results lose statistical significance when OLS regressions are replaced by regressions that downweight outliers. We examine those outliers and find no reason to downweight them. Most of them represent estimates from influential central bank papers published in respectable academic journals. For example, among the five papers finding the largest peak effect of QE on output, all five are published in high-quality journals (Journal of Monetary Economics, Journal of Money, Credit and Banking, and Applied Economics Letters), and their average number of citations is well over 200. Moreover, we show that these papers have supported policy communication by the world’s leading central banks and shaped the public perception of the effectiveness of QE. New evidence based on quantile regressions further supports the results in Fabo et al. (2021).
Central banks sometimes evaluate their own policies. To assess the inherent conflict of interest, the authors compare the research findings of central bank researchers and academic economists regarding the macroeconomic effects of quantitative easing (QE). They find that central bank papers report larger effects of QE on output and inflation. Central bankers are also more likely to report significant effects of QE on output and to use more positive language in the abstract. Central bankers who report larger QE effects on output experience more favorable career outcomes. A survey of central banks reveals substantial involvement of bank management in research production.
In this paper we adapt the Hamiltonian Monte Carlo (HMC) estimator, a method presently used in various fields due to its superior sampling and diagnostic properties, to DSGE models. We implement it in the state-of-the-art, freely available high-performance software package Stan. We estimate a small-scale textbook New Keynesian model and the Smets-Wouters model using US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we find bimodality in the Smets-Wouters model even if we estimate the model using the original tight priors. Finally, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm to create a powerful tool which permits the estimation of DSGE models with ill-behaved posterior densities.
In this paper we adopt the Hamiltonian Monte Carlo (HMC) estimator for DSGE models by implementing it in a state-of-the-art, freely available high-performance software package. We estimate a small-scale textbook New Keynesian model and the Smets-Wouters model on US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm, which permits the estimation of DSGE models with ill-behaved posterior densities.
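HMC is the sampler at the center of both abstracts above. As a purely illustrative sketch — not the authors' Stan implementation, and with all tuning values chosen arbitrarily — the following pure-Python toy draws from a one-dimensional standard normal "posterior" using the leapfrog integrator and accept/reject step that define HMC:

```python
import math
import random

def neg_log_post(theta):
    """Negative log posterior: standard normal up to a constant (toy target)."""
    return 0.5 * theta * theta

def grad_neg_log_post(theta):
    return theta

def hmc_step(theta, step_size=0.1, n_leapfrog=20):
    """One HMC transition: draw momentum, integrate, Metropolis accept/reject."""
    p = random.gauss(0.0, 1.0)                       # auxiliary momentum
    current_h = neg_log_post(theta) + 0.5 * p * p    # Hamiltonian at the start
    theta_new, p_new = theta, p
    p_new -= 0.5 * step_size * grad_neg_log_post(theta_new)   # half step
    for _ in range(n_leapfrog - 1):
        theta_new += step_size * p_new
        p_new -= step_size * grad_neg_log_post(theta_new)
    theta_new += step_size * p_new
    p_new -= 0.5 * step_size * grad_neg_log_post(theta_new)   # final half step
    proposed_h = neg_log_post(theta_new) + 0.5 * p_new * p_new
    if random.random() < math.exp(min(0.0, current_h - proposed_h)):
        return theta_new                             # accept
    return theta                                     # reject

random.seed(0)
theta, draws = 1.0, []
for _ in range(5000):
    theta = hmc_step(theta)
    draws.append(theta)

mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

The gradient-guided trajectories are what give HMC its favorable mixing in high-dimensional DSGE posteriors relative to random-walk Metropolis.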
A theory of the boundaries of banks with implications for financial integration and regulation
(2015)
We offer a theory of the "boundary of the firm" that is tailored to banking, as it builds on a single inefficiency arising from risk-shifting and as it takes into account both interbank lending as an alternative to integration and the role of possibly insured deposit funding. Among others, it explains both why deeper economic integration should also cause greater financial integration through both bank mergers and interbank lending, albeit this typically remains inefficiently incomplete, and why economic disintegration (or "desynchronization"), as currently witnessed in the European Union, should cause less interbank exposure. It also suggests that recent policy measures such as the preferential treatment of retail deposits, the extension of deposit insurance, or penalties on "connectedness" could all lead to substantial welfare losses.
The ruling of the German Federal Constitutional Court and its call for conducting and communicating proportionality assessments regarding monetary policy have been the subject of some controversy. However, it can also be understood as a way to strengthen the de-facto independence of the European Central Bank. The authors show how a regular proportionality check could be integrated in the ECB’s strategy that is currently undergoing a systematic review. In particular, they propose to include quantitative benchmarks for policy rates and the central bank balance sheet. Deviations from such benchmarks can have benefits in terms of the intended path for inflation while involving costs in terms of risks and side effects that need to be balanced. Practical applications to the euro area are provided.
Highly interconnected global supply chains make countries vulnerable to supply chain disruptions. The authors estimate the macroeconomic effects of global supply chain shocks for the euro area. Their empirical model combines business cycle variables with data from international container trade.
Using a novel identification scheme, they augment conventional sign restrictions on the impulse responses by narrative information about three episodes: the Tohoku earthquake in 2011, the Suez Canal obstruction in 2021, and the Shanghai backlog in 2022. They show that a global supply chain shock causes a drop in euro area real economic activity and a strong increase in consumer prices. Over a horizon of one year, the global supply chain shock explains about 30% of inflation dynamics. They also use regional data on supply chain pressure to isolate shocks originating in China.
Their results show that supply chain disruptions originating in China are an important driver for unexpected movements in industrial production, while disruptions originating outside China are an especially important driver for the dynamics of consumer prices.
Central banks sowing the seeds for a green financial sector? NGFS membership and market reactions
(2024)
In December 2017, during the One Planet Summit in Paris, a group of eight central banks and supervisory authorities launched the “Network for Greening the Financial Sector” (NGFS) to address challenges and risks posed by climate change to the global financial system. By June 2023, an additional 69 central banks from all around the world had joined the network. We find that the propensity to join the network can be described as a function of the country’s economic development (e.g., GDP per capita), national institutions (e.g., central bank independence), and performance of the central bank on its mandates (e.g., price stability and output gap). Using an event study design to examine consequences of network expansions in capital markets, we document that a difference portfolio that is long in clean energy stocks and short in fossil fuel stocks benefits from an enlargement of the NGFS. Overall, our results suggest that an increasing number of central banks and supervisory authorities are concerned about climate change and willing to go beyond their traditional objectives, and that the capital market believes they will do so.
Hong Kong’s Linked Exchange Rate System (LERS) has been in operation for twenty-five years, during which time many other fixed exchange rate systems have succumbed to shocks and/or speculative attacks. This fact alone suggests that the LERS is a robust system which enjoys a large measure of credibility in financial markets. This paper intends to investigate whether this is indeed the case, and whether it has been the case throughout its 25-year history. In particular, we will use the tools of modern finance to extract information from financial asset prices about market expectations that are related to the credibility of the LERS. The main focus is on how market participants ‘judged’ the various changes made to the LERS, such as the ‘seven technical measures’ introduced in September 1998 and the ‘three refinements’ made in May 2005. These changes have been characterized as making the system less discretionary over time, and we hypothesize that they have also made it more credible, as revealed in exchange-rate-related asset prices. We also investigate the relationship between interest rates and exchange rates in the current system in light of modern models of target-zone exchange rate systems. We will examine whether the intramarginal intervention in November 2007 changed the dynamic properties of the exchange rate as suggested by such models.
The risk of deflation
(2009)
This paper was prepared for the meeting on Financial Regulation and Macroeconomic Stability: Key issues for the G20, organised by the CEPR and the Reinventing Bretton Woods Committee, London, 31 January 2009. Introduction: The onset of financial instability in August 2007, which quickly spread across the world, raises a number of questions for policy makers. First, what are the roots of the crisis? Many factors have been emphasized in the debate, including the opacity of complex financial products; the excessive confidence in ratings; weak risk management by financial institutions; massive reliance on wholesale funding; and the presumption that markets would always be liquid. Furthermore, poorly understood incentive effects – arising from the originate-to-distribute-model, remuneration policies and the period of low interest rates – are also widely seen as having played a role. Second, how can a repetition of the crisis be avoided? Much attention is being focused on regulation and supervision of financial intermediaries. The G-20, at its summit in November 2008, noted that measures need to be taken in five areas: (i) financial market transparency and disclosure by firms need to be strengthened; (ii) regulation needs to be enhanced to ensure that all financial markets, products and participants are regulated or subject to oversight, as appropriate; (iii) the integrity of financial markets should be improved by bolstering investor and consumer protection, avoiding conflicts of interest, and by promoting information sharing; (iv) international cooperation among regulators must be enhanced; and (v) international financial institutions must be reformed to reflect changing economic weights in the world economy better in order to increase the legitimacy and effectiveness of these institutions. Third, how can the consequences for economic activity be minimized? 
Many of the adverse developments in financial markets – in particular the collapse of term interbank markets – reflect deeply entrenched perceptions of counterparty risk. Prompt and far-reaching action to support the financial system, in particular the infusion of equity capital in financial institutions to reduce counter-party risk and get credit to flow again, is essential in order to restore market functioning. A particular risk at present is that the rapid decline in inflation in many countries in recent months will turn into deflation with highly adverse real economic developments. This background paper considers how large the risk of deflation may be and discusses what policy can do to reduce it. It is organized as follows. Section 2 defines deflation and discusses downward nominal wage rigidities and the zero lower bound on interest rates. While these factors are frequently seen as two reasons why deflation can be associated with very poor economic outcomes, they should not be overemphasized. Section 3 looks at the current situation. Inflation expectations and forecasts in the subset of economies we look at (the euro area, the UK and the US) are positive, indicating that deflation is not expected. This does not imply that the current concerns of deflation are unwarranted, only that the public expects the central bank to be successful in avoiding deflation. The section also looks at the evolution of headline and “core” inflation, focusing on data from the US and the euro area. Section 4 reviews how monetary and fiscal policy can be conducted to ensure that deflation is avoided. Section 5 briefly discusses special issues arising in emerging market economies. Finally, Section 6 offers some conclusions. An Appendix discusses deflation episodes in the period 1882-1939.
We test the menu cost model of Ball and Mankiw (1994, 1995), which implies that the impact of price dispersion on inflation should differ between inflation and deflation episodes, using data for Japan and Hong Kong. We use a random cross-section sample split when calculating the moments of the distribution of price changes to mitigate the small-cross-section-sample bias noted by Cecchetti and Bryan (1999). The parameter on the third moment is positive and significant in both countries during both the inflation and deflation periods, and the parameter on the second moment changes sign in the deflation period, as the theory predicts.
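The regressors in tests of this kind are the cross-sectional moments of item-level price changes. A minimal sketch of how the second and third moments would be computed, using simulated data (not the actual Japanese or Hong Kong price data):

```python
import random

def cross_section_moments(price_changes):
    """Mean, second central moment (variance), and skewness of a
    cross-section of item-level price changes."""
    n = len(price_changes)
    mean = sum(price_changes) / n
    m2 = sum((x - mean) ** 2 for x in price_changes) / n
    m3 = sum((x - mean) ** 3 for x in price_changes) / n
    skewness = m3 / m2 ** 1.5 if m2 > 0 else 0.0
    return mean, m2, skewness

random.seed(1)
# A hypothetical right-skewed cross-section: most items roughly flat,
# a minority with large price increases.
changes = [random.gauss(0.0, 0.01) for _ in range(900)] + \
          [random.gauss(0.08, 0.02) for _ in range(100)]
mean, variance, skew = cross_section_moments(changes)
```

In the Ball–Mankiw logic, a positively skewed distribution of desired price changes raises inflation under menu costs, which is why the coefficient on the third moment is the object of interest.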
Exploiting the natural experiment of the German reunification, we examine how consumers adapt to a new environment in their macroeconomic forecasting. We document that East Germans expect higher inflation and make larger forecast errors than West
Germans even decades after reunification. Differences in consumption baskets, financial literacy, risk aversion or trust in the central bank cannot fully account for these patterns. We find most support for the explanation that East Germans, who were used to a strong norm of zero inflation, persistently overadjusted the level of their expectations in the face of the initial inflation shock in reunified Germany. Our findings suggest that large changes in the economic environment can permanently impede people's ability to form accurate macroeconomic expectations, with an important role for the interaction of old norms and new experiences around the event.
Household finance
(2020)
Household financial decisions are complex, interdependent, and heterogeneous, and central to the functioning of the financial system. We present an overview of the rapidly expanding literature on household finance (with some important exceptions) and suggest directions for future research. We begin with the theory and empirics of asset market participation and asset allocation over the lifecycle. We then discuss household choices in insurance markets, trading behavior, decisions on retirement saving, and financial choices by retirees. We survey research on liabilities, including mortgage choice, refinancing, and default, and household behavior in unsecured credit markets, including credit cards and payday lending. We then connect the household to its social environment, including peer effects, cultural and hereditary factors, intra-household financial decision making, financial literacy, cognition and educational interventions. We also discuss literature on the provision and consumption of financial advice.
We assemble a data set of more than eight million German Twitter posts related to the war in Ukraine. Based on state-of-the-art methods of text analysis, we construct a daily index of uncertainty about the war as perceived by German Twitter. The approach also allows us to separate this index into uncertainty about sanctions against Russia, energy policy and other dimensions. We then estimate a VAR model with daily financial and macroeconomic data and identify an exogenous uncertainty shock. The increase in uncertainty has strong effects on financial markets and causes a significant decline in economic activity as well as an increase in expected inflation. We find the effects of uncertainty to be particularly strong in the first months of the war.
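A dictionary-style daily index can be sketched in a few lines; the authors use more sophisticated state-of-the-art text-analysis methods, and the uncertainty terms and posts below are purely hypothetical examples:

```python
import datetime
from collections import defaultdict

# Hypothetical German uncertainty terms, for illustration only.
UNCERTAINTY_TERMS = {"unsicher", "unsicherheit", "risiko"}

def daily_uncertainty_index(posts):
    """posts: iterable of (date, text). Returns {date: share of posts that
    contain at least one uncertainty term}."""
    counts = defaultdict(lambda: [0, 0])          # date -> [hits, total]
    for date, text in posts:
        tokens = set(text.lower().split())
        counts[date][0] += int(bool(tokens & UNCERTAINTY_TERMS))
        counts[date][1] += 1
    return {d: hits / total for d, (hits, total) in counts.items()}

sample = [
    (datetime.date(2022, 3, 1), "Grosse Unsicherheit wegen Sanktionen"),
    (datetime.date(2022, 3, 1), "Heute schoenes Wetter"),
    (datetime.date(2022, 3, 2), "Risiko bei der Energiepolitik steigt"),
]
index = daily_uncertainty_index(sample)
```

Splitting the term list by topic (sanctions, energy policy, and so on) would give the topic-specific sub-indices described in the abstract.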
The authors study the impact of dissent in the ECB’s Governing Council on uncertainty surrounding households’ inflation expectations. They conduct a randomized controlled trial using the Bundesbank Online Panel Households. Participants are provided with alternative information treatments concerning the vote in the Council, e.g. unanimity and dissent, and are asked to submit probabilistic inflation expectations. The results show that the vote is informative.
Households revise their subjective inflation forecast after receiving information about the vote. Dissenting votes cause a wider individual distribution of future inflation. Hence, dissent increases households’ uncertainty about inflation. This effect is statistically significant once the authors allow for the interaction between the treatments and individual characteristics of respondents.
The results are robust with respect to alternative measures of forecast uncertainty and hold for different model specifications. The findings suggest that providing information about dissenting votes without additional information about the nature of dissent is detrimental to coordinating household expectations.
Veronika Grimm, Lukas Nöh, and Volker Wieland assess the possible development of government interest expenditures as a share of GDP for Germany, France, Italy and Spain. Until 2021, these and other member states could anticipate a further reduction of interest expenditure in the future. This outlook has changed considerably with the recent surge in inflation and government bond rates. Nevertheless, under reasonable assumptions current yield curves still imply that interest expenditure relative to GDP can be stabilized at the current level. The authors also review the implications of a further upward shift in the yield curves of 1 or 2 percentage points. These implications suggest significant medium-term risks for highly indebted member states with interest expenditure approaching or exceeding levels last observed on the eve of the euro area debt crisis. In light of these risks, governments of euro area member states should take substantive action to achieve a sustained decline in debt-to-GDP ratios towards safer levels. They bear the responsibility for making sure that government finances can weather the higher interest rates which are required to achieve price stability in the euro area.
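The mechanism behind such projections can be illustrated with back-of-the-envelope arithmetic (all numbers below are hypothetical, not the authors' scenarios): interest expenditure relative to GDP equals the average effective rate on outstanding debt times the debt-to-GDP ratio, and a shift in market yields feeds through only gradually as maturing debt is refinanced.

```python
def interest_expenditure_path(debt_ratio, old_rate, new_rate,
                              avg_maturity_years, horizon):
    """Interest expenditure as a share of GDP, year by year, after a
    permanent shift in market rates, assuming a constant debt ratio and
    uniform refinancing of 1/maturity of the stock each year."""
    refinanced = 0.0
    path = []
    for _ in range(horizon):
        refinanced = min(1.0, refinanced + 1.0 / avg_maturity_years)
        avg_rate = (1 - refinanced) * old_rate + refinanced * new_rate
        path.append(debt_ratio * avg_rate)
    return path

# Hypothetical: debt at 100% of GDP, yields rising from 1% to 3%,
# average debt maturity of 7 years, 10-year horizon.
path = interest_expenditure_path(1.0, 0.01, 0.03, 7, 10)
```

The gradual pass-through is why the abstract distinguishes between the immediate surge in bond rates and the medium-term risks to interest expenditure.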
Despite the turn in interest rates initiated by the ECB in the second half of 2022 as a late response to the clearly underestimated persistence of high inflation rates in the euro area, real interest rates can by no means be considered restrictive, either in an ex-post or in an ex-ante perspective. Banks, however, tightened their lending standards quite quickly, and demand in residential construction and for mortgage loans has collapsed sharply.
The authors address the importance of cash-flow effects in annuity loans, analyzing above all the so-called front-loading effect. According to this effect, higher nominal interest rates lead to large additional financial burdens in the early phases of the typically long loan term, even when inflation is fully anticipated and real interest rates are unchanged. Such liquidity effects can significantly reduce the ability or willingness of private investors to pay. This applies above all to loans in the form of a percentage annuity, since these additionally exhibit a maturity-shortening effect. Such loans are quite popular in Germany.
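The front-loading effect can be illustrated with the standard annuity formula (the numbers are hypothetical, not taken from the paper): a higher nominal rate raises the constant annual payment, and hence the burden in the early years, even for the same principal and maturity.

```python
def annuity_payment(principal, annual_rate, years):
    """Constant annual payment of a fully amortizing annuity loan."""
    r = annual_rate
    return principal * r / (1.0 - (1.0 + r) ** -years)

# Hypothetical loan: EUR 300,000 over 30 years.
principal, years = 300_000.0, 30
low = annuity_payment(principal, 0.02, years)   # 2% nominal rate
high = annuity_payment(principal, 0.06, years)  # 6% nominal rate
```

Even if the higher nominal rate merely reflects fully anticipated inflation, the roughly 60% jump in the annual payment must be financed out of current income, which is the liquidity effect the authors emphasize.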
Looking ahead, the authors also see a real danger for the stock of housing loans once the large stock of cheap housing loans comes up for refinancing, a risk that also has implications for macroeconomic and financial stability.
We use a structural VAR model to study the German natural gas market and investigate the impact of the 2022 Russian supply stop on the German economy. Combining conventional and narrative sign restrictions, we find that gas supply and demand shocks have large and persistent price effects, while output effects tend to be moderate. The 2022 natural gas price spike was driven by adverse supply
shocks and positive storage demand shocks, as Germany filled its inventories before the winter. Counterfactual simulations of an embargo on natural gas imports from Russia indicate similar positive price and negative output effects compared to what we observe in the data.
This paper uses unique administrative data and a quasi-field experiment of exogenous allocation in Sweden to estimate medium- and longer-run effects on financial behavior from exposure to financially literate neighbors. It contributes evidence of causal impact of exposure and of a social multiplier of financial knowledge, but also of unfavorable distributional aspects of externalities. Exposure promotes saving in private retirement accounts and stockholding, especially when neighbors have economics or business education, but only for educated households and when interaction possibilities are substantial. Findings point to transfer of knowledge rather than mere imitation or effects through labor, education, or mobility channels.
The authors present evidence of a new propagation mechanism for wealth inequality, based on differential responses, by education, to greater inequality at the start of economic life. The paper is motivated by a novel positive cross-country relationship between wealth inequality and perceptions of opportunity and fairness, which holds only for the more educated. Using unique administrative micro data and a quasi-field experiment of exogenous allocation of households, the authors find that exposure to a greater top 10% wealth share at the start of economic life in the country leads only the more educated placed in locations with above-median wealth mobility to attain higher wealth levels and position in the cohort-specific wealth distribution later on. Underlying this effect is greater participation in risky financial and real assets and in self-employment, with no evidence for a labor income, unemployment risk, or human capital investment channel. This differential response is robust to controlling for initial exposure to fixed or other time-varying local features, including income inequality, and consistent with self-fulfilling responses of the more educated to perceived opportunities, without evidence of imitation or learning from those at the top.
The authors identify U.S. monetary and fiscal dominance regimes using machine learning techniques. The algorithms are trained and verified by employing simulated data from Markov-switching DSGE models, before they classify regimes from 1968-2017 using actual U.S. data. All machine learning methods outperform a standard logistic regression concerning the simulated data. Among those the Boosted Ensemble Trees classifier yields the best results. The authors find clear evidence of fiscal dominance before Volcker. Monetary dominance is detected between 1984-1988, before a fiscally led regime turns up around the stock market crash lasting until 1994. Until the beginning of the new century, monetary dominance is established, while the more recent evidence following the financial crisis is mixed with a tendency towards fiscal dominance.
This paper examines the sustainability of the currency board arrangements in Argentina and Hong Kong. We employ a Markov switching model with two regimes to infer the exchange rate pressure due to economic fundamentals and market expectations. The empirical results suggest that economic fundamentals and expectations are key determinants of a currency board’s sustainability. We also show that the government’s credibility played a more important role in Argentina than in Hong Kong. The trade surplus, real exchange rate and inflation rate were more important drivers of the sustainability of the Hong Kong currency board.
Distributed ledger technology especially in the form of publicly coordinated validation networks such as Ethereum and Bitcoin with their own monetary circles provide for a revealing litmus test for current financial regulatory schemes. The paper highlights the interrelation between distributed coordination and the emission of virtual currency to make sense of the function of the new monetary phenomenon. It then argues for the regulation of financial services on the ground of the technology to ensure integrity standards. In this respect, it is useful to gear the development of a regulatory scheme towards the existing financial regulatory principles. However, future measures of the regulators must take the distributed nature of the platforms into account by relying on a “regulated self-regulation” of the community. Finally, the article focuses on the shortcomings of the current EU regulatory regimes, especially the regulation frameworks regarding financial services, payment services and electronic money.
Our paper evaluates recent regulatory proposals mandating the deferral of bonus payments and claw-back clauses in the financial sector. We study a broadly applicable principal agent setting, in which the agent exerts effort for an immediately observable task (acquisition) and a task for which information is only gradually available over time (diligence). Optimal compensation contracts trade off the cost and benefit of delay resulting from agent impatience and the informational gain. Mandatory deferral may increase or decrease equilibrium diligence depending on the importance of the acquisition task. We provide concrete conditions on economic primitives that make mandatory deferral socially (un)desirable.
This paper applies structure-preserving doubling methods to solve the matrix quadratic underlying the recursive solution of linear DSGE models. We present and compare two structure-preserving doubling algorithms (SDAs) with competing methods – the QZ method, a Newton algorithm, and an iterative Bernoulli approach – as well as the related cyclic and logarithmic reduction algorithms. Our comparison covers nearly 100 different models from the Macroeconomic Model Data Base (MMB) and different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). We find that both SDAs perform very favorably relative to QZ, generally computing more accurate solutions in less time. While we collect theoretical convergence results that promise quadratic convergence rates to a unique stable solution, the algorithms may fail to converge when a singularity of the coefficient matrices causes a breakdown in the recursion. One of the proposed algorithms can overcome this problem through an appropriate (re)initialization. This SDA also performs particularly well in refining solutions obtained by other methods or from nearby parameterizations.
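A scalar sketch can convey the problem the paper solves. The matrix quadratic A·X² + B·X + C = 0 collapses, in a one-variable model, to an ordinary quadratic whose stable root is the model's decision rule; the Bernoulli-type fixed-point iteration below is the simplest of the competing methods the paper benchmarks (the SDAs achieve quadratic rather than linear convergence on the matrix version). Coefficients and function names are illustrative, not from the paper.

```python
import math

# Scalar stand-in for the matrix quadratic A*X^2 + B*X + C = 0 that
# underlies linear DSGE solutions. Illustrative coefficients.
a, b, c = 0.5, -1.0, 0.3

def bernoulli(a, b, c, iters=200):
    """Bernoulli-type fixed-point iteration x <- -c / (b + a*x).

    Converges linearly to the stable (inside-the-unit-circle) root when
    the two roots are split around the unit circle.
    """
    x = 0.0
    for _ in range(iters):
        x = -c / (b + a * x)
    return x

x = bernoulli(a, b, c)
stable_root = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)  # closed form
print(abs(x - stable_root))
```

The iteration recovers the stable root to machine precision here; in the matrix case the analogous recursion can break down at singular coefficient matrices, which is the failure mode the paper's (re)initialization addresses.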
This paper considers a firm that has to delegate to an agent, such as a mortgage broker or a security dealer, the twin tasks of approaching and advising customers. The main contractual restriction, in particular in light of related research in Inderst and Ottaviani (2007), is that the firm can only compensate the agent through commissions. This standard contracting restriction has the following key implications. First, the firm can only ensure internal compliance to a "standard of sales", in terms of advice for the customer, if this standard is not too high. Second, if this is still feasible, then a higher standard is associated with higher, instead of lower, sales commissions. Third, once the limit for internal compliance is approached, tougher regulation and prosecution of "misselling" have (almost) no effect on the prevailing standard. Besides having practical implications, in particular on how to (re-)regulate the sale of financial products, the novel model, which embeds a problem of advice into a framework with repeated interactions, may also be of separate interest for future work on sales force compensation. JEL Classification: D18 (Consumer Protection), D83 (Search; Learning; Information and Knowledge), M31 (Marketing), M52 (Compensation and Compensation Methods and Their Effects).
This paper presents a novel model of the lending process that takes into account that loan officers must spend time and effort to originate new loans. Besides generating predictions on loan officers’ compensation and its interaction with the loan review process, the model sheds light on why competition could lead to excessively low lending standards. We also show how more intense competition may hasten the adoption of credit scoring. More generally, hard-information lending techniques such as credit scoring allow firms to give loan officers high-powered incentives without compromising the integrity and quality of the loan approval process. The model is finally applied to study the implications of loan sales for the adopted lending process and lending standard.
We present a simple model of personal finance in which an incumbent lender has an information advantage vis-a-vis both potential competitors and households. In order to extract more consumer surplus, a lender with sufficient market power may engage in "irresponsible" lending, approving credit even if this is knowingly against a household’s best interest. Unless rival lenders are equally well informed, competition may reduce welfare. This holds, in particular, if less informed rivals can free ride on the incumbent’s superior screening ability.
This paper presents a novel model of the lending process that takes into account that loan officers must spend time and effort to originate new loans. Besides generating predictions on loan officers’ compensation and its interaction with the loan review process, the model sheds light on why competition could lead to excessively low lending standards. We also show how more intense competition may hasten the adoption of credit scoring. More generally, hard-information lending techniques such as credit scoring allow firms to give loan officers high-powered incentives without compromising the integrity and quality of the loan approval process.
We analyze how two key managerial tasks interact: that of growing the business through creating new investment opportunities and that of providing accurate information about these opportunities in the corporate budgeting process. We show how this interaction endogenously biases managers toward overinvesting in their own projects. This bias is exacerbated if managers compete for limited resources in an internal capital market, which provides us with a novel theory of the boundaries of the firm. Finally, managers of more risky and less profitable divisions should obtain steeper incentives to facilitate efficient investment decisions.
We consider an imperfectly competitive loan market in which a local relationship lender has an information advantage vis-à-vis distant transaction lenders. Competitive pressure from the transaction lenders prevents the local lender from extracting the full surplus from projects, so that she inefficiently rejects marginally profitable projects. Collateral mitigates the inefficiency by increasing the local lender’s payoff from precisely those marginal projects that she inefficiently rejects. The model predicts that, controlling for observable borrower risk, collateralized loans are more likely to default ex post, which is consistent with the empirical evidence. The model also predicts that borrowers for whom local lenders have a relatively smaller information advantage face higher collateral requirements, and that technological innovations that narrow the information advantage of local lenders, such as small business credit scoring, lead to a greater use of collateral in lending relationships. JEL classification: D82; G21 Keywords: Collateral; Soft information; Loan market competition; Relationship lending
This paper shows that active investors, such as venture capitalists, can affect the speed at which new ventures grow. In the absence of product market competition, new ventures financed by active investors grow faster initially, though in the long run those financed by passive investors are able to catch up. By contrast, in a competitive product market, new ventures financed by active investors may prey on rivals that are financed by passive investors by “strategically overinvesting” early on, resulting in long-run differences in investment, profits, and firm growth. The value of active investors is greater in highly competitive industries as well as in industries with learning curves, economies of scope, and network effects, as is typical for many “new economy” industries. For such industries, our model predicts that start-ups with access to venture capital may dominate their industry peers in the long run. JEL Classifications: G24; G32 Keywords: Venture capital; dynamic investment; product market competition
We study a model of “information-based entrenchment” in which the CEO has private information that the board needs to make an efficient replacement decision. Eliciting the CEO’s private information is costly, as it implies that the board must pay the CEO both higher severance pay and higher on-the-job pay. While higher CEO pay is associated with higher turnover in our model, there is too little turnover in equilibrium. Our model makes novel empirical predictions relating CEO turnover, severance pay, and on-the-job pay to firm-level attributes such as size, corporate governance, and the quality of the firm’s accounting system.
This paper argues that banks must be sufficiently levered to have first-best incentives to make new risky loans. This result, which is at odds with the notion that leverage invariably leads to excessive risk taking, derives from two key premises that focus squarely on the role of banks as informed lenders. First, banks finance projects that they do not own, which implies that they cannot extract all the profits. Second, banks conduct a credit risk analysis before making new loans. Our model may help understand why banks take on additional unsecured debt, such as unsecured deposits and subordinated loans, over and above their existing deposit base. It may also help understand why banks and finance companies have similar leverage ratios, even though the latter are not deposit takers and hence not subject to the same regulatory capital requirements as banks.
This article shows that investors financing a portfolio of projects may use the depth of their financial pockets to overcome entrepreneurial incentive problems. Competition for scarce informed capital at the refinancing stage strengthens investors’ bargaining positions. And yet, entrepreneurs’ incentives may be improved, because projects funded by investors with ‘‘shallow pockets’’ must have not only a positive net present value at the refinancing stage, but one that is higher than that of competing portfolio projects. Our article may help understand provisions used in venture capital finance that limit a fund’s initial capital and make it difficult to add more capital once the initial venture capital fund is raised. (JEL G24, G31)
This paper shows that investors financing a portfolio of projects may use the depth of their financial pockets to overcome entrepreneurial incentive problems. Competition for scarce informed capital at the refinancing stage strengthens investors’ bargaining positions. And yet, entrepreneurs’ incentives may be improved, because projects funded by investors with “shallow pockets” must have not only a positive net present value at the refinancing stage, but one that is higher than that of competing portfolio projects. Our paper may help to understand provisions used in venture capital finance that limit a fund’s initial capital and make it difficult to add more capital once the initial venture capital fund is raised.
Misselling through agents
(2009)
This paper analyzes the implications of the inherent conflict between two tasks performed by direct marketing agents: prospecting for customers and advising on the product's "suitability" for the specific needs of customers. When structuring sales-force compensation, firms trade off the expected losses from "misselling" unsuitable products with the agency costs of providing marketing incentives. We characterize how the equilibrium amount of misselling (and thus the scope of policy intervention) depends on features of the agency problem including: the internal organization of a firm's sales process, the transparency of its commission structure, and the steepness of its agents' sales incentives. JEL Classification: D18 (Consumer Protection), D83 (Search; Learning; Information and Knowledge), M31 (Marketing), M52 (Compensation and Compensation Methods and Their Effects).
In this paper, we provide some reflections on the development of monetary theory and monetary policy over the last 150 years. Rather than presenting an encompassing overview, which would be overambitious, we simply concentrate on a few selected aspects that we view as milestones in the development of this subject. We also try to illustrate some of the interactions with the political and financial system, academic discussion and the views and actions of central banks.
In his speech at the conference „The SNB and its Watchers“, Otmar Issing, member of the ECB Executive Board from its start in 1998 until 2006, takes a look back at more than twenty years of the conference series „The ECB and Its Watchers“. In June 1999, Issing established this format together with Axel Weber, then Director of the Center for Financial Studies, to discuss the monetary policy strategy of the newly founded central bank on „neutral ground“ with a broad circle of participants, that is, academics, bank economists and members of the media. At the annual conference, the ECB and its representatives would play an active role and engage in a lively exchange of views with the other participants. Over the years, Volker Wieland took over as organizer of the conference series, a format that was also adopted by other central banks. In his contribution at the second conference „The SNB and its Watchers“, Issing summarizes the experience gained from over twenty years of the ECB Watchers Conference.
The Eurosystem and the Deutsche Bundesbank will incur substantial losses in 2023 that are likely to persist for several years. Due to the massive purchases of securities over the last ten years, especially of government bonds, banks' excess reserves have risen sharply. The resulting high interest payments to banks since the turnaround in monetary policy, combined with little income from the large-scale securities holdings, have drawn massive criticism. The banks were said to be making "unfair" profits, while the fiscal authorities had to forgo the previously customary transfers of central bank profits. Populist demands to limit bank profits, for example by drastically increasing the minimum reserve ratios in the Eurosystem to reduce excess reserves, would create severe new problems and are neither justified nor helpful. Ultimately, the EU member states benefited for a very long time from historically low interest rates because of the Eurosystem's extraordinarily loose monetary policy, and they must now bear the flip side of the massive expansion of central bank balance sheets during the necessary period of monetary policy normalisation.
The so-called Troika, consisting of the EU Commission, the European Central Bank (ECB) and the International Monetary Fund (IMF), was supposed to support the member states of the euro area that had been hit hard by a sovereign debt crisis. For that purpose, economic adjustment programs were drafted and monitored in order to prevent the break-up of the euro area and sovereign defaults. The cooperation of these institutions, born out of necessity, has been partly successful but has also created persistent problems. With the further increase of public debt, especially in France and Italy, the danger of a renewed crisis in the euro area was growing. Following decisions of the EU Summit of December 2018, the European Stability Mechanism (ESM), together with the European Commission, will replace the Troika in the future and play the role of a European Monetary Fund in the event of a crisis. The IMF, by contrast, will no longer play an active role in solving sovereign debt crises in the euro area. The current course is, however, inadequate to tackle the core problems of the euro area and to avoid future crises, which are mainly structural in nature and due to escalating public debt and the lack of international competitiveness of some member countries. The current corona crisis will aggravate these institutional problems. It has led to a common European fiscal response ("Next Generation EU"). This rescue and recovery program will not be financed by ESM resources and will not be monitored by the ESM. One important novelty of the package is that it involves the issuance of substantial common European debt.
Debt levels in the eurozone have reached new record highs. The member countries have tried to cushion the economic consequences of the corona pandemic with a massive increase in government spending. By the end of 2021, public debt in relation to GDP will approach 100% on average. There are various calls to abolish or soften the Maastricht rules limiting sovereign debt. We see the risk of a new sovereign debt crisis in this decade if it is not possible to bring public debt down to an acceptable level. Our new fiscal rule would be suitable and appropriate for this purpose, because the Maastricht criteria have evidently failed. In contrast to the rigid 3% Maastricht criterion, our rule is flexible and addresses the main problem: excessively high public debt ratios. It also lowers the existing incentives for highly indebted governments to exert expansionary pressure on monetary policy. If obeyed strictly, our rule counteracts the snowball effect and reduces the excessively high debt ratios within a manageable period, even if nominal growth is weak. This is confirmed by simulations of different scenarios as well as by a hypothetical application of the new fiscal rule to eurozone economies from 2022 to 2026. Finally, we take up the recent proposal by ESM economists to increase the permissible debt ratio from 60 to 100% of GDP in the eurozone.
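The "snowball effect" invoked above follows a standard accounting identity: the debt ratio evolves as b(t+1) = (1+i)/(1+g)·b(t) − pb, where i is the nominal interest rate, g nominal growth, and pb the primary balance. A minimal simulation of that identity (with purely illustrative numbers, not the paper's calibrations or its fiscal rule) shows how the ratio drifts up when interest exceeds growth, and falls when a sufficient primary surplus offsets the differential.

```python
# Minimal debt-ratio recursion b_{t+1} = (1+i)/(1+g) * b_t - pb.
# All parameter values are illustrative.
def simulate_debt(b0, i, g, pb, years):
    path = [b0]
    for _ in range(years):
        path.append((1 + i) / (1 + g) * path[-1] - pb)
    return path

# High debt, interest above nominal growth, zero primary balance: ratio rises.
drifting = simulate_debt(b0=1.0, i=0.03, g=0.01, pb=0.0, years=5)
# A primary surplus large enough to offset the snowball: ratio falls.
consolidating = simulate_debt(b0=1.0, i=0.03, g=0.01, pb=0.03, years=5)
print(round(drifting[-1], 3), round(consolidating[-1], 3))
```

The contrast between the two paths is the mechanism any debt-reduction rule of the kind proposed above has to work against.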
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates and term premia, is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates, and term premia is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
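The decomposition that both entries above turn on can be written in one line: an n-period yield equals the average of expected future short rates plus a term premium. The sketch below states that identity with invented numbers (the function name and inputs are illustrative, not output of the paper's structural model).

```python
# Stylized yield decomposition: long yield = average expected short rate
# over the bond's life + term premium. Illustrative numbers only.
def long_yield(expected_short_rates, term_premium):
    return sum(expected_short_rates) / len(expected_short_rates) + term_premium

# A 3-period bond with rising expected short rates and a 50bp premium.
y = long_yield([0.02, 0.025, 0.03], 0.005)
print(round(y, 4))
```

Disentangling which of the two components moves after a policy announcement is exactly the identification problem the paper's structural model addresses.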
Rising temperatures, falling ratings: the effect of climate change on sovereign creditworthiness
(2021)
How will a changing climate impact the creditworthiness of governments over the very long term? Financial markets need credible, digestible information on how climate change translates into material risks. To bridge the gap between climate science and real-world financial indicators, the authors simulate the effect of climate change on sovereign credit ratings for 108 countries, creating the world’s first climate-adjusted sovereign credit rating. The study offers a first methodological approach to extend the long-term rating to an ultra-long-term reality, aiming at long-term investors, but also regulators and rating agencies.
This paper proposes a new approach for modeling investor fear after rare disasters. The key element is to take into account that investors’ information about fundamentals driving rare downward jumps in the dividend process is not perfect. Bayesian learning implies that beliefs about the likelihood of rare disasters drop to a much more pessimistic level once a disaster has occurred. Such a shift in beliefs can trigger massive declines in price-dividend ratios. Pessimistic beliefs persist for some time. Thus, belief dynamics are a source of apparent excess volatility relative to a rational expectations benchmark. Due to the low frequency of disasters, even an infinitely-lived investor will remain uncertain about the exact probability. Our analysis is conducted in continuous time and offers closed-form solutions for asset prices. We distinguish between rational and adaptive Bayesian learning. Rational learners account for the possibility of future changes in beliefs in determining their demand for risky assets, while adaptive learners take beliefs as given. Thus, risky assets tend to be lower-valued and price-dividend ratios vary less under adaptive versus rational learning for identical priors. Keywords: beliefs, Bayesian learning, controlled diffusions and jump processes, learning about jumps, adaptive learning, rational learning. JEL classification: D83, G11, C11, D91, E21, D81, C61
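A discrete-time analogue of the belief channel described above is Beta-Bernoulli updating (the paper itself works in continuous time with jump processes; the function name and prior parameters below are illustrative). A single observed disaster moves the posterior disaster probability to a markedly more pessimistic level, and because disasters are rare, this pessimism decays only slowly.

```python
# Beta-Bernoulli sketch of learning about a rare-disaster probability.
# Prior Beta(alpha, beta); illustrative parameters, not the paper's.
def posterior_mean(alpha, beta, disasters, periods):
    """Posterior mean disaster probability after observing `disasters`
    jumps over `periods` periods under a Beta(alpha, beta) prior."""
    return (alpha + disasters) / (alpha + beta + periods)

prior = posterior_mean(1, 99, 0, 0)   # prior mean: 1% per period
after = posterior_mean(1, 99, 1, 1)   # one disaster observed
print(prior, round(after, 4))
```

The near-doubling of the believed disaster probability after one observation is the kind of belief shift that, in the paper's asset-pricing model, triggers large declines in price-dividend ratios.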
This paper studies the macro-financial implications of using carbon prices to achieve ambitious greenhouse gas (GHG) emission reduction targets. My empirical evidence shows a 0.6% output loss and a 0.3% rise in inflation in response to a 1% carbon policy shock. I also observe financial instability and reallocation between the clean and highly polluting energy sectors. To better predict the medium- and long-term impact, I use a medium-to-large macro-financial DSGE model with environmental aspects and show the recessionary effect of ambitious carbon pricing aimed at climate targets: a 40% reduction in GHG emissions causes a 0.7% output loss, while reaching a zero-emission economy within 30 years causes a 2.6% output loss. I document an amplifying effect of the banking sector along the transition path. The paper also uncovers the beneficial role of pre-announcing carbon policies, which mitigates inflation volatility by 0.2% at its peak; my results thus favor well-communicated carbon policies and investment to expand the green sector. My findings also stress the role of optimal green monetary and financial policies in mitigating the effects of transition risk and assisting the transition to a zero-emission world. Using a heterogeneous approach with macroprudential tools, I find that optimal macroprudential tools can mitigate the output loss by 0.1% and the investment loss by 1%. Importantly, my work highlights the use of capital flow management in the green transition when a global cooperative solution is challenging.
The authors embed human capital-based endogenous growth into a New-Keynesian model with search and matching frictions in the labor market and skill obsolescence from long-term unemployment. The model can account for key features of the Great Recession: a decline in productivity growth, the relative stability of inflation despite a pronounced fall in output (the "missing disinflation puzzle"), and a permanent gap between output and the pre-crisis trend output.
In the model, lower aggregate demand raises unemployment and the training costs associated with skill obsolescence. Lower employment hinders learning-by-doing, which slows down human capital accumulation, feeding back into even fewer vacancies than justified by the demand shock alone. These feedback channels mitigate the disinflationary effect of the demand shock while amplifying its contractionary effect on output. The temporary growth slowdown translates into output hysteresis (permanently lower output and labor productivity).
Central banks normally accept debt of their own governments as collateral in liquidity operations without reservations. This gives rise to a valuable liquidity premium that reduces the cost of government finance. The ECB is an interesting exception in this respect. It relies on external assessments of the creditworthiness of its member states, such as credit ratings, to determine eligibility and the haircut it imposes on such debt. The authors show how such features in a central bank’s collateral framework can give rise to cliff effects and multiple equilibria in bond yields and increase the vulnerability of governments to external shocks. This can potentially induce sovereign debt crises and defaults that would not otherwise arise.
This paper characterises optimal monetary policy in an economy with endogenous firm entry, a cash-in-advance constraint and preset wages. Firms must make profits to cover entry costs; thus the markup on goods prices is efficient. However, because leisure is not priced at a markup, the consumption-leisure tradeoff is distorted. Consequently, the real wage, hours and production are suboptimally low. Due to the labour requirement in entry, insufficient labour supply also implies that entry is too low. The paper shows that in the absence of fiscal instruments such as labour income subsidies, the optimal monetary policy under sticky wages achieves higher welfare than under flexible wages. The policy maker uses the money supply instrument to raise the real wage - the cost of leisure - above its flexible-wage level, in response to expansionary shocks to productivity and entry costs. This raises labour supply, expanding production and firm entry.
How do changes in market structure affect the US business cycle? We estimate a monetary DSGE model with endogenous firm/product entry and a translog expenditure function by Bayesian methods. The dynamics of net business formation allow us to identify the 'competition effect', by which desired price markups and inflation decrease when entry rises. We find that a 1 percent increase in the number of competitors lowers desired markups by 0.18 percent. Most of the cyclical variability in inflation is driven by markup fluctuations due to sticky prices or exogenous shocks rather than endogenous changes in desired markups.
This paper investigates the effect of a change in the informational environment of borrowers on the organizational design of bank lending. We use micro-data from a large multinational bank and exploit the sudden introduction of a credit registry, an information-sharing mechanism across banks, for a subset of borrowers. Using within-borrower and within-loan-officer variation in a difference-in-differences empirical design, we show that the expansion of the credit registry led to an improvement in the allocation of credit to affected borrowers. There was a concurrent change in the organizational structure of the bank, involving a dramatic increase in the delegation of lending decisions for affected borrowers to loan officers. We also find a significant expansion in the scope of activities of loan officers who deal primarily with affected borrowers, as well as of their superiors. There is suggestive evidence that larger banks in the economy were better able to implement changes similar to our bank's. We argue that these patterns can be understood within the framework of incentive-based and information-processing-cost theories. Our findings could help rationalize why improvements in the information environment of borrowers may be altering the landscape of lending by moving decisions outside the boundaries of financial intermediaries.
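The difference-in-differences design used above reduces, in its simplest form, to one subtraction of subtractions: the outcome change for registry-covered borrowers minus the change for uncovered borrowers. The sketch below states that calculation with hypothetical averages (the function name and all numbers are invented for illustration; the paper's estimates additionally absorb borrower and loan-officer fixed effects).

```python
# Stylized difference-in-differences estimate: change for the treated
# group net of the change for the control group. Illustrative numbers.
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical average credit-allocation outcomes before/after the registry.
effect = diff_in_diff(treat_pre=0.50, treat_post=0.65,
                      ctrl_pre=0.48, ctrl_post=0.53)
print(round(effect, 2))
```

Netting out the control group's change is what lets the design attribute the residual improvement to the registry rather than to common time trends.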
There is substantial disagreement about the consequences of the Tax Cuts and Jobs Act (TCJA) of 2017, which constitutes the most extensive tax reform in the United States in more than 30 years. Using a large-scale two-country dynamic general equilibrium model with nominal rigidities, we find that the TCJA increases GDP by about 2% in the medium run and by about 2.5% in the long run. The short-run impact depends crucially on the degree and costs of variable capital utilization, with GDP effects ranging from 1 to 3%. At the same time, the TCJA does not pay for itself. In our analysis, the reform decreases tax revenues and raises the debt-to-GDP ratio by about 15 percentage points in the medium run, until 2025. We show that combining the TCJA with spending cuts can dampen the increase in government indebtedness without reducing its expansionary effect.
This paper develops and implements a backward and forward error analysis of, and condition numbers for, the numerical stability of the solutions of linear dynamic stochastic general equilibrium (DSGE) models. Comparing seven different solution methods from the literature, I demonstrate an economically significant loss of accuracy specifically in standard solution methods based on the generalized Schur (or QZ) decomposition, resulting from large backward errors in solving the associated matrix quadratic problem. This is illustrated in the monetary macro model of Smets and Wouters (2007) and two production-based asset pricing models: a simple model of external habits with a readily available symbolic solution, and the model of Jermann (1998), which lacks such a symbolic solution. QZ-based numerical solutions miss the equity premium by up to several annualized percentage points for parameterizations that either match the chosen calibration targets or are close to the parameterizations in the literature. While the numerical solution methods from the literature give no indication of these potential errors, easily implementable backward-error metrics and condition numbers are shown to successfully warn of such potential inaccuracies. The analysis is then performed for a database of roughly 100 DSGE models from the literature and a large set of draws from the model of Smets and Wouters (2007). While economically relevant errors do not appear pervasive in these latter applications, accuracies that differ by several orders of magnitude persist.
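The backward-error idea used above has a compact scalar analogue: for a computed root of a·x² + b·x + c, the residual scaled by |a|x² + |b||x| + |c| measures how much the coefficients would have to be perturbed (relatively) for the computed root to be exact. This is a toy stand-in for the matrix-quadratic metrics the paper implements; the function name and coefficients are illustrative.

```python
# Residual-based relative backward error for a computed root x of the
# scalar quadratic a*x^2 + b*x + c. Illustrative toy analogue.
def backward_error(a, b, c, x):
    """Relative coefficient perturbation that would make x an exact root."""
    residual = abs(a * x * x + b * x + c)
    scale = abs(a) * x * x + abs(b) * abs(x) + abs(c)
    return residual / scale

print(backward_error(1.0, -3.0, 2.0, 1.0))        # exact root of x^2 - 3x + 2
print(backward_error(1.0, -3.0, 2.0, 1.0000001))  # slightly perturbed root
```

A small backward error certifies that the computed solution exactly solves a nearby problem; the paper's point is that QZ-based DSGE solutions can fail this check in economically meaningful ways while the solver itself reports no warning.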