How does the need to preserve government debt sustainability affect the optimal monetary and fiscal policy response to a liquidity trap? To provide an answer, we employ a small stochastic New Keynesian model with a zero bound on nominal interest rates and characterize optimal time-consistent stabilization policies. We focus on two policy tools, the short-term nominal interest rate and debt-financed government spending. The optimal policy response to a liquidity trap critically depends on the prevailing debt burden. While the optimal amount of government spending is decreasing in the level of outstanding government debt, future monetary policy is becoming more accommodative, triggering a change in private sector expectations that helps to dampen the fall in output and inflation at the outset of the liquidity trap.
The author proposes a Differential-Independence Mixture Ensemble (DIME) sampler for the Bayesian estimation of macroeconomic models. It allows sampling from particularly challenging, high-dimensional black-box posterior distributions that may also be computationally expensive to evaluate. DIME is a “Swiss Army knife”, combining the advantages of a broad class of gradient-free global multi-start optimizers with the properties of a Markov chain Monte Carlo (MCMC) method. This includes fast burn-in and convergence without any prior numerical optimization or initial guesses, good performance for multimodal distributions, a large number of chains (the “ensemble”) running in parallel, and an endogenous proposal density generated from the state of the full ensemble, which respects the bounds of the prior distribution. The author shows that the number of parallel chains scales well with the number of necessary ensemble iterations.
DIME is used to estimate the medium-scale heterogeneous agent New Keynesian (“HANK”) model with liquid and illiquid assets, for the first time allowing the households’ preference parameters to be included in the estimation. The results mildly point towards a less accentuated role of household heterogeneity for empirical macroeconomic dynamics.
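The abstract leaves the mechanics implicit, but the ensemble idea behind DIME can be illustrated with a bare-bones differential-evolution MCMC sweep, in which each chain proposes a move built from the positions of two other chains, so that the ensemble itself supplies the proposal density. The bimodal toy target and all tuning constants below are assumptions for illustration, not the actual DIME algorithm:

```python
import numpy as np

def log_density(x):
    # Toy bimodal target: mixture of two Gaussians (a stand-in for a
    # hard DSGE posterior, NOT the model from the paper)
    return np.logaddexp(-0.5 * np.sum((x - 2.0) ** 2),
                        -0.5 * np.sum((x + 2.0) ** 2))

def de_mc_step(ensemble, logp, rng, gamma=None, eps=1e-4):
    """One differential-evolution proposal sweep over the ensemble.

    Chain i proposes x_i + gamma * (x_j - x_k) + noise, where j and k
    are two other randomly chosen chains, so the proposal density is
    generated endogenously from the state of the full ensemble.
    """
    n, d = ensemble.shape
    if gamma is None:
        gamma = 2.38 / np.sqrt(2 * d)  # standard DE-MC scaling
    new, new_logp = ensemble.copy(), logp.copy()
    for i in range(n):
        j, k = rng.choice([c for c in range(n) if c != i], size=2, replace=False)
        prop = ensemble[i] + gamma * (ensemble[j] - ensemble[k]) \
               + eps * rng.standard_normal(d)
        lp = log_density(prop)
        if np.log(rng.uniform()) < lp - logp[i]:  # Metropolis accept/reject
            new[i], new_logp[i] = prop, lp
    return new, new_logp

rng = np.random.default_rng(0)
n_chains, dim = 40, 2
ens = rng.standard_normal((n_chains, dim))
lp = np.array([log_density(x) for x in ens])
for _ in range(500):
    ens, lp = de_mc_step(ens, lp, rng)
```

DIME itself goes further, for instance by mixing such differential proposals with an endogenous independence proposal fitted to the full ensemble, as the abstract describes; the sketch above only shows the shared ensemble logic.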
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank’s interest rate setting can amplify the expectation feedback in the financial market and this can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well-advised to handle tools like asset price targeting with care since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor indices are a good indicator to base decisions on.
Occasionally binding constraints have become an important part of economic modelling, especially since western central banks see themselves (again) constrained by the so-called zero lower bound (ZLB) on the nominal interest rate. A binding ZLB constraint poses a major problem for quantitative-structural analysis: linear solution methods do not work in the presence of a nonlinearity such as the ZLB, and existing alternatives tend to be computationally demanding. The urge to study macroeconomic questions related to the Great Recession and the Covid-19 crisis in a quantitative-structural framework requires algorithms that are not only accurate, but also robust, fast, and computationally efficient.
A particularly important application where efficient and fast methods for occasionally binding constraints (OBCs) are needed is the Bayesian estimation of macroeconomic models. This paper shows that a linear dynamic rational expectations system with OBCs, depending on the expected duration of the constraint, can be represented in closed form. Combined with a set of simple equilibrium conditions, this can be exploited to avoid matrix inversions and simulations at runtime, yielding significant gains in computational speed.
The level of capital tax gains has high explanatory power regarding the question of what drives economic inequality. On this basis, the authors develop a simple, yet micro-founded portfolio selection model to explain the dynamics of wealth inequality given empirical tax series in the US. The results emphasize that the level and the transition speed of wealth inequality depend crucially on the degree of capital taxation. The projections predict that – continuing on the present path of capital taxation in the US – the gap between rich and poor is expected to shrink, whereas “massive” tax cuts would further increase the degree of wealth concentration.
Did the Federal Reserve’s Quantitative Easing (QE) in the aftermath of the financial crisis have macroeconomic effects? To answer this question, the authors estimate a large-scale DSGE model over the sample from 1998 to 2020, including data on the Fed’s balance sheet. The authors allow QE to affect the economy via multiple channels that arise from several financial frictions. Their nonlinear Bayesian likelihood approach fully accounts for the zero lower bound on nominal interest rates. They find that between 2009 and 2015, QE increased output by about 1.2 percent. This reflects a net increase in investment of nearly 9 percent, which was accompanied by a 0.7 percent drop in aggregate consumption. Both government bond and capital asset purchases were effective in improving financing conditions. Capital asset purchases in particular significantly facilitated new investment and increased production capacity. Against the backdrop of a fall in consumption, supply-side effects dominated, leading to a mild disinflationary effect of about 0.25 percent annually.
Can boundedly rational agents survive competition with fully rational agents? The authors develop a highly nonlinear heterogeneous agents model with rational, forward-looking versus boundedly rational, backward-looking agents, whose market shares evolve depending on their relative performance. Their novel numerical solution method detects equilibrium paths characterized by complex bubble and crash dynamics. Boundedly rational trend-extrapolators amplify small deviations from fundamentals, while rational agents anticipate market crashes after large bubbles and drive prices back close to fundamental value. Overall, rational and non-rational beliefs co-evolve over time, with time-varying impact, and their interaction produces complex endogenous bubbles and crashes without any exogenous shocks.
The recently observed disconnect between inflation and economic activity can be explained by the interplay between the zero lower bound (ZLB) and the costs of external financing. In normal times, credit spreads and the nominal interest rate balance out; factor costs dominate firms' marginal costs. When nominal rates are constrained, larger spreads can more than offset the effect of lower factor costs and induce only moderate inflation responses. The Phillips curve is hence flat at the ZLB, but features a positive slope in normal times and thus a hockey stick shape. Via this mechanism, forward guidance may induce deflationary effects.
Using a nonlinear Bayesian likelihood approach that fully accounts for the zero lower bound on nominal interest rates, the authors analyze US post-crisis business cycle dynamics and provide reference parameter estimates. They find that neither the inclusion of financial frictions nor that of household heterogeneity improves the empirical fit of the standard model, or its ability to provide a joint explanation for the post-2007 dynamics. Associated financial shocks mis-predict an increase in consumption. The common practice of omitting the ZLB period in the estimation severely distorts the analysis of the more recent economic dynamics.
Renewed interest in fiscal policy has increased the use of quantitative models to evaluate policy. Because of modeling uncertainty, it is essential that policy evaluations be robust to alternative assumptions. We find that models currently being used in practice to evaluate fiscal policy stimulus proposals are not robust. Government spending multipliers in an alternative empirically-estimated and widely-cited new Keynesian model are much smaller than in these old Keynesian models; the estimated stimulus is extremely small with GDP and employment effects only one-sixth as large.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact focussing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
Recently, we evaluated a fiscal consolidation strategy for the United States that would bring the government budget into balance by gradually reducing government spending relative to GDP to the ratio that prevailed prior to the crisis (Cogan et al., JEDC 2013). Specifically, we published an analysis of the macroeconomic consequences of the 2013 Budget Resolution that was passed by the U.S. House of Representatives in March 2012. In this note, we provide an update of our research that evaluates this year’s budget reform proposal that is to be discussed and voted on in the House of Representatives in March 2013. Contrary to the views voiced by critics of fiscal consolidation, we show that such a reduction in government purchases and transfer payments can increase GDP immediately and permanently relative to a policy without spending restraint. Our research makes use of a modern structural model of the economy that incorporates the long-standing essential features of economics: opportunity costs, efficiency, foresight and incentives. GDP rises because households take into account that spending restraint helps avoid future increases in tax rates. Lower taxes imply less distorted incentives for work, investment and production relative to a scenario without fiscal consolidation and lead to higher growth.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact. We consider two types of dynamic stochastic general equilibrium models: a neoclassical growth model and more complicated models with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the initial model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run.
Savings accounts are owned by most households, but little is known about the performance of households’ investments. We create a unique dataset by matching information on individual savings accounts from the DNB Household Survey with market data on account-specific interest rates and characteristics. We document considerable heterogeneity in returns across households, which can be partly explained by financial sophistication. A one-standard-deviation increase in financial literacy is associated with a 13% increase relative to the median interest rate. We isolate the usage of modern technology (online accounts) as one channel through which financial literacy has a positive association with returns.
Corporate borrowers care about the overall riskiness of a bank’s operations as their continued access to credit may rely on the bank’s ability to roll over loans or to expand existing credit facilities. As we show, a key implication of this observation is that increasing competition among banks should have an asymmetric impact on banks’ incentives to take on risk: Banks that are already riskier will take on yet more risk, while their safer rivals will become even more prudent. Our results offer new guidance for bank supervision in an increasingly competitive environment and may help to explain existing, ambiguous findings on the relationship between competition and risk-taking in banking. Furthermore, our results stress the beneficial role that competition can have for financial stability as it turns a bank’s "prudence" into an important competitive advantage.
The authors study the effects of forward-looking communication in an environment of rising inflation rates on German consumers’ inflation expectations using a randomized control trial. They show that information about rising inflation increases short- and long-term inflation expectations. This initial increase in expectations can be mitigated using forward-looking information about inflation. Among these information treatments, professional forecasters’ projections seem to reduce inflation expectations by more than policymakers’ characterization of inflation as a temporary phenomenon.
I. Introduction
II. The proposal of the commercial law section for the 67th German Jurists’ Forum (Deutscher Juristentag): 1. description and definition; 2. rationale
III. The significance of over-the-counter trading in Germany
IV. Comparative analysis of stock corporation and capital market law: 1. Germany: a) organization of the capital market, b) differentiation within stock corporation law; 2. United Kingdom: a) organization of the capital market, b) differentiations in the “Companies Act 2006”; 3. USA: a) sources of corporate and capital market law, b) organization of the capital market, c) corporate law
V. Assessment: 1. the link between existing rules and capital market orientation; 2. the blurring of the boundaries between stock corporation and capital market law; 3. the risk of abuse through a self-determined choice of statutory strictness (Satzungsstrenge); 4. previous reform approaches in the German literature; 5. the move away from differentiation within stock corporation law in the current reform debate; 6. an economic analysis of stock corporation law (“opt-in model”)
VI. Conclusion: The deregulation approach that differentiates between listed and unlisted stock corporations is not to be endorsed. In light of the comparative analysis of the United Kingdom and the USA, a capital-market-oriented differentiation of the investor protection provisions of stock corporation law appears preferable instead. Tying deregulation measures to the criterion of capital market orientation can already be found, in part, in current German law: both stock corporation law and capital market law contain correspondingly differentiating provisions. Moreover, current national legislative initiatives and developments in European company law likewise show tendencies towards a distinction based on the criterion of a company’s distance from, or openness to, the capital market.
The narrow application of the mandatory investor protection rules of stock corporation law to listed companies only also carries substantial risks of abuse: stock corporations could move into over-the-counter trading in order to benefit from deregulation and from lower transparency and investor protection requirements. Finally, the preference for a capital-market-oriented differentiation also follows from the current debate on reforms to increase the competitiveness of German company and capital market law. Abolishing the strict statutory framework (Satzungsstrenge), as demanded in that debate, while simultaneously codifying corresponding disclosure and investor protection duties in capital market law, would make it possible to build on the existing differentiations of capital market law.
Climate change has become one of the most prominent concerns globally. In this paper, the authors study the transition risk of greenhouse gas emission reduction in structural environmental-macroeconomic DSGE models. First, they analyze the uncertainty in model predictions of the effect of unanticipated and pre-announced carbon price increases. Second, they derive optimal model-robust policy in different settings. They find that reducing emissions by 40% causes an output loss of between 0.7% and 4%, with 2% on average. Pre-announcing carbon prices significantly affects inflation dynamics. The central bank should react slightly less to inflation and output growth during the transition. With optimal carbon price designs, it should react even less to inflation, and more to output growth.
Optimal monetary policy studies typically rely on a single structural model and identification of model-specific rules that minimize the unconditional volatilities of inflation and real activity. In their proposed approach, the authors take a large set of structural models and look for the model-robust rules that minimize the volatilities at those frequencies that policymakers are most interested in stabilizing. Compared to the status quo approach, their results suggest that policymakers should be more restrained in their inflation responses when their aim is to stabilize inflation and output growth at specific frequencies. Additional caution is called for due to model uncertainty.
Fabo, Jančoková, Kempf, and Pástor (2021) show that papers written by central bank researchers find quantitative easing (QE) to be more effective than papers written by academics. Weale and Wieladek (2022) show that a subset of these results lose statistical significance when OLS regressions are replaced by regressions that downweight outliers. We examine those outliers and find no reason to downweight them. Most of them represent estimates from influential central bank papers published in respectable academic journals. For example, among the five papers finding the largest peak effect of QE on output, all five are published in high-quality journals (Journal of Monetary Economics, Journal of Money, Credit and Banking, and Applied Economics Letters), and their average number of citations is well over 200. Moreover, we show that these papers have supported policy communication by the world’s leading central banks and shaped the public perception of the effectiveness of QE. New evidence based on quantile regressions further supports the results in Fabo et al. (2021).
Central banks sometimes evaluate their own policies. To assess the inherent conflict of interest, the authors compare the research findings of central bank researchers and academic economists regarding the macroeconomic effects of quantitative easing (QE). They find that central bank papers report larger effects of QE on output and inflation. Central bankers are also more likely to report significant effects of QE on output and to use more positive language in the abstract. Central bankers who report larger QE effects on output experience more favorable career outcomes. A survey of central banks reveals substantial involvement of bank management in research production.
In this paper we adapt the Hamiltonian Monte Carlo (HMC) estimator to DSGE models, a method presently used in various fields due to its superior sampling and diagnostic properties. We implement it in a state-of-the-art, freely available high-performance software package, Stan. We estimate a small-scale textbook New Keynesian model and the Smets-Wouters model using US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we find bimodality in the Smets-Wouters model even if we estimate the model using the original tight priors. Finally, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm to create a powerful tool which permits the estimation of DSGE models with ill-behaved posterior densities.
In this paper we adopt the Hamiltonian Monte Carlo (HMC) estimator for DSGE models by implementing it in a state-of-the-art, freely available high-performance software package. We estimate a small-scale textbook New Keynesian model and the Smets-Wouters model on US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm, which permits the estimation of DSGE models with ill-behaved posterior densities.
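For readers unfamiliar with the machinery, a minimal HMC sampler with a leapfrog integrator can be sketched on a toy Gaussian target. This is an illustrative sketch only, not the papers’ implementation: Stan layers NUTS, step-size adaptation and a mass matrix on top of this basic scheme, and a DSGE application would additionally require the model’s likelihood and its gradients.

```python
import numpy as np

def hmc_sample(log_prob, grad_log_prob, x0, n_samples=2000,
               step_size=0.1, n_leapfrog=20, seed=0):
    """Minimal Hamiltonian Monte Carlo with a leapfrog integrator."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)          # resample momentum
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new += step_size * grad_log_prob(x_new)
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        # Metropolis correction for the discretization error
        h_old = -log_prob(x) + 0.5 * p @ p
        h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h_old - h_new:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Toy target: 2-d standard normal "posterior"
logp = lambda x: -0.5 * x @ x
grad = lambda x: -x
draws = hmc_sample(logp, grad, x0=np.zeros(2))
print(draws.mean(axis=0), draws.std(axis=0))  # means near 0, std devs near 1
```

Because the leapfrog integrator is volume-preserving and reversible, the Metropolis step only has to correct for discretization error, which is what gives HMC its high acceptance rates in high dimensions.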
A theory of the boundaries of banks with implications for financial integration and regulation
(2015)
We offer a theory of the "boundary of the firm" that is tailored to banking, as it builds on a single inefficiency arising from risk-shifting and takes into account both interbank lending as an alternative to integration and the role of possibly insured deposit funding. Among other things, it explains why deeper economic integration should also cause greater financial integration, through both bank mergers and interbank lending, albeit this typically remains inefficiently incomplete, and why economic disintegration (or "desynchronization"), as currently witnessed in the European Union, should cause less interbank exposure. It also suggests that recent policy measures such as the preferential treatment of retail deposits, the extension of deposit insurance, or penalties on "connectedness" could all lead to substantial welfare losses.
The ruling of the German Federal Constitutional Court and its call for conducting and communicating proportionality assessments regarding monetary policy have been the subject of some controversy. However, it can also be understood as a way to strengthen the de facto independence of the European Central Bank. The authors show how a regular proportionality check could be integrated into the ECB’s strategy, which is currently undergoing a systematic review. In particular, they propose to include quantitative benchmarks for policy rates and the central bank balance sheet. Deviations from such benchmarks can have benefits in terms of the intended path for inflation while involving costs in terms of risks and side effects that need to be balanced. Practical applications to the euro area are provided.
Highly interconnected global supply chains make countries vulnerable to supply chain disruptions. The authors estimate the macroeconomic effects of global supply chain shocks for the euro area. Their empirical model combines business cycle variables with data from international container trade.
Using a novel identification scheme, they augment conventional sign restrictions on the impulse responses by narrative information about three episodes: the Tohoku earthquake in 2011, the Suez Canal obstruction in 2021, and the Shanghai backlog in 2022. They show that a global supply chain shock causes a drop in euro area real economic activity and a strong increase in consumer prices. Over a horizon of one year, the global supply chain shock explains about 30% of inflation dynamics. They also use regional data on supply chain pressure to isolate shocks originating in China.
Their results show that supply chain disruptions originating in China are an important driver for unexpected movements in industrial production, while disruptions originating outside China are an especially important driver for the dynamics of consumer prices.
Central banks sowing the seeds for a green financial sector? NGFS membership and market reactions
(2024)
In December 2017, during the One Planet Summit in Paris, a group of eight central banks and supervisory authorities launched the “Network for Greening the Financial Sector” (NGFS) to address challenges and risks posed by climate change to the global financial system. By June 2023, a further 69 central banks from all around the world had joined the network. We find that the propensity to join the network can be described as a function of a country’s economic development (e.g., GDP per capita), its national institutions (e.g., central bank independence), and the central bank’s performance on its mandates (e.g., price stability and the output gap). Using an event study design to examine the consequences of network expansions in capital markets, we document that a difference portfolio that is long in clean energy stocks and short in fossil fuel stocks benefits from an enlargement of the NGFS. Overall, our results suggest that an increasing number of central banks and supervisory authorities are concerned about climate change and willing to go beyond their traditional objectives, and that the capital market believes they will do so.
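The event-study mechanics of the capital market exercise can be sketched in a few lines: compute the cumulative return of the clean-minus-fossil difference portfolio in a window around each NGFS enlargement date and average across events. The return series, event dates and size of the reaction below are entirely synthetic assumptions, standing in for the actual stock data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily returns of a long-clean / short-fossil difference portfolio
T = 1000
ret = 0.0005 * rng.standard_normal(T)
events = [200, 450, 800]           # hypothetical NGFS enlargement dates
for d in events:
    ret[d:d + 3] += 0.002          # assumed positive reaction after each enlargement

def car(returns, event_days, window=(0, 5)):
    """Average cumulative (abnormal) return across events over [t0, t1]."""
    cars = [returns[d + window[0]: d + window[1] + 1].sum() for d in event_days]
    return float(np.mean(cars))

print(car(ret, events))   # positive average CAR around enlargements
```

In the actual study, "abnormal" returns would first be obtained by netting out a factor model benchmark; the averaging across event windows is the same.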
Hong Kong’s Linked Exchange Rate System (LERS) has been in operation for twenty-five years during which time many other fixed exchange rate systems have succumbed to shocks and/or speculative attacks. This fact alone suggests that the LERS is a robust system which enjoys a large measure of credibility in financial markets. This paper intends to investigate whether this is indeed the case, and whether it has been the case throughout its 25-year history. In particular we will use the tools of modern finance to extract information from financial asset prices about market expectations that are related to the credibility of the LERS. The main focus is on how market participants ‘judged’ the various changes made to the LERS, such as the ‘seven technical measures’ introduced in September 1998 and the ‘three refinements’ made in May 2005. These changes have been characterized as making the system less discretionary over time, and we hypothesize that they have also made it more credible, as revealed in exchange rate related asset prices. We also investigate the relationship between interest rates and exchange rates in the current system in light of modern models of target-zone exchange rate systems. We will examine whether the intramarginal intervention in November 2007 changed the dynamic properties of the exchange rate as suggested by such models.
The risk of deflation
(2009)
This paper was prepared for the meeting on Financial Regulation and Macroeconomic Stability: Key issues for the G20, organised by the CEPR and the Reinventing Bretton Woods Committee, London, 31 January 2009. Introduction: The onset of financial instability in August 2007, which quickly spread across the world, raises a number of questions for policy makers. First, what are the roots of the crisis? Many factors have been emphasized in the debate, including the opacity of complex financial products; the excessive confidence in ratings; weak risk management by financial institutions; massive reliance on wholesale funding; and the presumption that markets would always be liquid. Furthermore, poorly understood incentive effects – arising from the originate-to-distribute-model, remuneration policies and the period of low interest rates – are also widely seen as having played a role. Second, how can a repetition of the crisis be avoided? Much attention is being focused on regulation and supervision of financial intermediaries. The G-20, at its summit in November 2008, noted that measures need to be taken in five areas: (i) financial market transparency and disclosure by firms need to be strengthened; (ii) regulation needs to be enhanced to ensure that all financial markets, products and participants are regulated or subject to oversight, as appropriate; (iii) the integrity of financial markets should be improved by bolstering investor and consumer protection, avoiding conflicts of interest, and by promoting information sharing; (iv) international cooperation among regulators must be enhanced; and (v) international financial institutions must be reformed to reflect changing economic weights in the world economy better in order to increase the legitimacy and effectiveness of these institutions. Third, how can the consequences for economic activity be minimized?
Many of the adverse developments in financial markets – in particular the collapse of term interbank markets – reflect deeply entrenched perceptions of counterparty risk. Prompt and far-reaching action to support the financial system, in particular the infusion of equity capital in financial institutions to reduce counterparty risk and get credit to flow again, is essential in order to restore market functioning. A particular risk at present is that the rapid decline in inflation in many countries in recent months will turn into deflation with highly adverse real economic developments. This background paper considers how large the risk of deflation may be and discusses what policy can do to reduce it. It is organized as follows. Section 2 defines deflation and discusses downward nominal wage rigidities and the zero lower bound on interest rates. While these factors are frequently seen as two reasons why deflation can be associated with very poor economic outcomes, they should not be overemphasized. Section 3 looks at the current situation. Inflation expectations and forecasts in the subset of economies we look at (the euro area, the UK and the US) are positive, indicating that deflation is not expected. This does not imply that the current concerns of deflation are unwarranted, only that the public expects the central bank to be successful in avoiding deflation. The section also looks at the evolution of headline and “core” inflation, focusing on data from the US and the euro area. Section 4 reviews how monetary and fiscal policy can be conducted to ensure that deflation is avoided. Section 5 briefly discusses special issues arising in emerging market economies. Finally, Section 6 offers some conclusions. An Appendix discusses deflation episodes in the period 1882-1939.
We test the menu cost model of Ball and Mankiw (1994, 1995), which implies that the impact of price dispersion on inflation should differ between inflation and deflation episodes, using data for Japan and Hong Kong. We use a random cross-section sample split when calculating the moments of the distribution of price changes to mitigate the small cross-section sample bias noted by Cecchetti and Bryan (1999). The parameter on the third moment is positive and significant in both countries during both the inflation and deflation periods, and the parameter on the second moment changes sign in the deflation period, as the theory predicts. Keywords: inflation, deflation, menu costs, Hong Kong, Japan. JEL Numbers: E31
Exploiting the natural experiment of the German reunification, we examine how consumers adapt to a new environment in their macroeconomic forecasting. We document that East Germans expect higher inflation and make larger forecast errors than West
Germans even decades after reunification. Differences in consumption baskets, financial literacy, risk aversion or trust in the central bank cannot fully account for these patterns. We find most support for the explanation that East Germans, who were used to a strong norm of zero inflation, persistently overadjusted the level of their expectations in the face of the initial inflation shock in reunified Germany. Our findings suggest that large changes in the economic environment can permanently impede people's ability to form accurate macroeconomic expectations, with an important role for the interaction of old norms and new experiences around the event.
Household finance
(2020)
Household financial decisions are complex, interdependent, and heterogeneous, and central to the functioning of the financial system. We present an overview of the rapidly expanding literature on household finance (with some important exceptions) and suggest directions for future research. We begin with the theory and empirics of asset market participation and asset allocation over the lifecycle. We then discuss household choices in insurance markets, trading behavior, decisions on retirement saving, and financial choices by retirees. We survey research on liabilities, including mortgage choice, refinancing, and default, and household behavior in unsecured credit markets, including credit cards and payday lending. We then connect the household to its social environment, including peer effects, cultural and hereditary factors, intra-household financial decision making, financial literacy, cognition and educational interventions. We also discuss literature on the provision and consumption of financial advice.
We assemble a data set of more than eight million German Twitter posts related to the war in Ukraine. Based on state-of-the-art methods of text analysis, we construct a daily index of uncertainty about the war as perceived by German Twitter. The approach also allows us to separate this index into uncertainty about sanctions against Russia, energy policy and other dimensions. We then estimate a VAR model with daily financial and macroeconomic data and identify an exogenous uncertainty shock. The increase in uncertainty has strong effects on financial markets and causes a significant decline in economic activity as well as an increase in expected inflation. We find the effects of uncertainty to be particularly strong in the first months of the war.
The authors study the impact of dissent in the ECB’s Governing Council on uncertainty surrounding households’ inflation expectations. They conduct a randomized controlled trial using the Bundesbank Online Panel Households. Participants are provided with alternative information treatments concerning the vote in the Council, e.g. unanimity and dissent, and are asked to submit probabilistic inflation expectations. The results show that the vote is informative.
Households revise their subjective inflation forecast after receiving information about the vote. Dissenting votes cause a wider individual distribution of future inflation. Hence, dissent increases households’ uncertainty about inflation. This effect is statistically significant once the authors allow for the interaction between the treatments and individual characteristics of respondents.
The results are robust with respect to alternative measures of forecast uncertainty and hold for different model specifications. The findings suggest that providing information about dissenting votes without additional information about the nature of dissent is detrimental to coordinating household expectations.
Veronika Grimm, Lukas Nöh, and Volker Wieland assess the possible development of government interest expenditures as a share of GDP for Germany, France, Italy and Spain. Until 2021, these and other member states could anticipate a further reduction of interest expenditure in the future. This outlook has changed considerably with the recent surge in inflation and government bond rates. Nevertheless, under reasonable assumptions current yield curves still imply that interest expenditure relative to GDP can be stabilized at the current level. The authors also review the implications of a further upward shift in the yield curves of 1 or 2 percentage points. These implications suggest significant medium-term risks for highly indebted member states with interest expenditure approaching or exceeding levels last observed on the eve of the euro area debt crisis. In light of these risks, governments of euro area member states should take substantive action to achieve a sustained decline in debt-to-GDP ratios towards safer levels. They bear the responsibility for making sure that government finances can weather the higher interest rates which are required to achieve price stability in the euro area.
Despite the interest rate turnaround initiated by the ECB in the second half of 2022 as a late response to the clearly underestimated persistence of high inflation rates in the euro area, real interest rates can by no means be considered restrictive, whether viewed ex post or ex ante. Banks, however, tightened their lending standards quite quickly, and demand in housing construction and for mortgage loans has collapsed sharply.
The authors address the importance of cash-flow effects in annuity loans and analyze, above all, the so-called front-loading effect. According to this effect, higher nominal interest rates lead to a heavy additional financial burden in the early phases of the typically long loan term, even when inflation is fully anticipated and real interest rates remain unchanged. Such liquidity effects can significantly reduce the ability or willingness of private investors to pay. This applies especially to loans structured as percentage annuities, since these additionally feature a maturity-shortening effect. Such loans are quite popular in Germany.
Looking ahead, the authors also see a real danger for the stock of housing loans once the large stock of cheap housing loans comes up for refinancing, a risk that also has implications for macroeconomic and financial stability.
We use a structural VAR model to study the German natural gas market and investigate the impact of the 2022 Russian supply stop on the German economy. Combining conventional and narrative sign restrictions, we find that gas supply and demand shocks have large and persistent price effects, while output effects tend to be moderate. The 2022 natural gas price spike was driven by adverse supply
shocks and positive storage demand shocks, as Germany filled its inventories before the winter. Counterfactual simulations of an embargo on natural gas imports from Russia indicate similar positive price and negative output effects compared to what we observe in the data.
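The sign-restriction identification used in this abstract can be sketched in a few lines. The bivariate setting, the covariance matrix, and the particular sign pattern below (a supply shock moving price up and output down on impact) are illustrative assumptions, not the paper's actual narrative restrictions:

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])        # reduced-form residual covariance (illustrative)
P = np.linalg.cholesky(Sigma)         # one valid factorization of Sigma

accepted = []
for _ in range(5000):
    # Draw a Haar-uniform random orthogonal matrix via QR decomposition.
    Q, R = np.linalg.qr(rng.standard_normal((2, 2)))
    Q = Q @ np.diag(np.sign(np.diag(R)))
    B = P @ Q                         # candidate structural impact matrix
    for j in range(2):
        # Normalize the shock so the price response (row 0) is positive.
        col = B[:, j] if B[0, j] > 0 else -B[:, j]
        # Keep the draw if output (row 1) falls on impact.
        if col[0] > 0 and col[1] < 0:
            accepted.append(col)

print(len(accepted), "accepted impact vectors out of 10000 candidate columns")
```

Any retained draw satisfies B @ B.T = Sigma by construction, so the set of accepted columns traces out the impact responses consistent with both the data covariance and the sign restrictions.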
This paper uses unique administrative data and a quasi-field experiment of exogenous allocation in Sweden to estimate medium- and longer-run effects on financial behavior from exposure to financially literate neighbors. It contributes evidence of causal impact of exposure and of a social multiplier of financial knowledge, but also of unfavorable distributional aspects of externalities. Exposure promotes saving in private retirement accounts and stockholding, especially when neighbors have economics or business education, but only for educated households and when interaction possibilities are substantial. Findings point to transfer of knowledge rather than mere imitation or effects through labor, education, or mobility channels.
The authors present evidence of a new propagation mechanism for wealth inequality, based on differential responses, by education, to greater inequality at the start of economic life. The paper is motivated by a novel positive cross-country relationship between wealth inequality and perceptions of opportunity and fairness, which holds only for the more educated. Using unique administrative micro data and a quasi-field experiment of exogenous allocation of households, the authors find that exposure to a greater top 10% wealth share at the start of economic life in the country leads only the more educated placed in locations with above-median wealth mobility to attain higher wealth levels and position in the cohort-specific wealth distribution later on. Underlying this effect is greater participation in risky financial and real assets and in self-employment, with no evidence for a labor income, unemployment risk, or human capital investment channel. This differential response is robust to controlling for initial exposure to fixed or other time-varying local features, including income inequality, and consistent with self-fulfilling responses of the more educated to perceived opportunities, without evidence of imitation or learning from those at the top.
The authors identify U.S. monetary and fiscal dominance regimes using machine learning techniques. The algorithms are trained and verified by employing simulated data from Markov-switching DSGE models, before they classify regimes from 1968-2017 using actual U.S. data. All machine learning methods outperform a standard logistic regression on the simulated data. Among these, the Boosted Ensemble Trees classifier yields the best results. The authors find clear evidence of fiscal dominance before Volcker. Monetary dominance is detected between 1984-1988, before a fiscally led regime emerges around the stock market crash and lasts until 1994. Until the beginning of the new century, monetary dominance is established, while the more recent evidence following the financial crisis is mixed, with a tendency towards fiscal dominance.
This paper examines the sustainability of the currency board arrangements in Argentina and Hong Kong. We employ a Markov switching model with two regimes to infer the exchange rate pressure due to economic fundamentals and market expectations. The empirical results suggest that economic fundamentals and expectations are key determinants of a currency board’s sustainability. We also show that the government’s credibility played a more important role in Argentina than in Hong Kong. The trade surplus, real exchange rate and inflation rate were more important drivers of the sustainability of the Hong Kong currency board.
Distributed ledger technology, especially in the form of publicly coordinated validation networks such as Ethereum and Bitcoin with their own monetary circles, provides a revealing litmus test for current financial regulatory schemes. The paper highlights the interrelation between distributed coordination and the issuance of virtual currency to make sense of the function of the new monetary phenomenon. It then argues for the regulation of financial services provided on the basis of the technology to ensure integrity standards. In this respect, it is useful to gear the development of a regulatory scheme towards the existing financial regulatory principles. However, future measures of the regulators must take the distributed nature of the platforms into account by relying on a “regulated self-regulation” of the community. Finally, the article focuses on the shortcomings of the current EU regulatory regimes, especially the regulatory frameworks for financial services, payment services and electronic money.
Our paper evaluates recent regulatory proposals mandating the deferral of bonus payments and claw-back clauses in the financial sector. We study a broadly applicable principal agent setting, in which the agent exerts effort for an immediately observable task (acquisition) and a task for which information is only gradually available over time (diligence). Optimal compensation contracts trade off the cost and benefit of delay resulting from agent impatience and the informational gain. Mandatory deferral may increase or decrease equilibrium diligence depending on the importance of the acquisition task. We provide concrete conditions on economic primitives that make mandatory deferral socially (un)desirable.
This paper applies structure-preserving doubling methods to solve the matrix quadratic underlying the recursive solution of linear DSGE models. We present and compare two Structure-Preserving Doubling Algorithms (SDAs) to other competing methods – the QZ method, a Newton algorithm, and an iterative Bernoulli approach – as well as the related cyclic and logarithmic reduction algorithms. Our comparison is conducted using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). We find that both SDAs perform very favorably relative to QZ, with generally more accurate solutions computed in less time. While we collect theoretical convergence results that promise quadratic convergence rates to a unique stable solution, the algorithms may fail to converge when there is a breakdown due to singularity of the coefficient matrices in the recursion. One of the proposed algorithms can overcome this problem by an appropriate (re)initialization. This SDA also performs particularly well in refining solutions of other methods or from nearby parameterizations.
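Of the competing methods the abstract lists, the iterative Bernoulli approach is simple enough to sketch. The routine below is a generic fixed-point iteration for the matrix quadratic A X² + B X + C = 0, with an illustrative scalar test case; the function name and parameters are assumptions for exposition, and this is not one of the paper's SDAs, which converge quadratically rather than linearly:

```python
import numpy as np

def bernoulli_solve(A, B, C, tol=1e-12, max_iter=500):
    """Fixed-point (Bernoulli) iteration for A @ X @ X + B @ X + C = 0,
    iterating X <- -inv(A @ X + B) @ C. Under standard spectral-gap
    conditions this converges linearly to the minimal-modulus solvent,
    which corresponds to the stable recursive solution."""
    X = np.zeros_like(C)
    for _ in range(max_iter):
        X_new = -np.linalg.solve(A @ X + B, C)
        if np.max(np.abs(X_new - X)) < tol:
            return X_new
        X = X_new
    raise RuntimeError("Bernoulli iteration did not converge")

# Scalar check: x^2 - 3x + 2 = 0 has solvents 1 and 2; the iteration
# picks the stable (minimal-modulus) root x = 1.
A = np.array([[1.0]]); B = np.array([[-3.0]]); C = np.array([[2.0]])
X = bernoulli_solve(A, B, C)
print(X)  # converges to the stable solvent, [[1.]]
```

The contrast with a doubling method is purely in the convergence rate: Bernoulli gains roughly a fixed number of accurate digits per step, while an SDA doubles them.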
This paper considers a firm that has to delegate to an agent, such as a mortgage broker or a security dealer, the twin tasks of approaching and advising customers. The main contractual restriction, in particular in light of related research in Inderst and Ottaviani (2007), is that the firm can only compensate the agent through commissions. This standard contracting restriction has the following key implications. First, the firm can only ensure internal compliance to a "standard of sales", in terms of advice for the customer, if this standard is not too high. Second, if this is still feasible, then a higher standard is associated with higher, instead of lower, sales commissions. Third, once the limit for internal compliance is approached, tougher regulation and prosecution of "misselling" have (almost) no effect on the prevailing standard. Besides having practical implications, in particular on how to (re-)regulate the sale of financial products, the novel model, which embeds a problem of advice into a framework with repeated interactions, may also be of separate interest for future work on sales force compensation. JEL Classification: D18 (Consumer Protection), D83 (Search; Learning; Information and Knowledge), M31 (Marketing), M52 (Compensation and Compensation Methods and Their Effects).
This paper presents a novel model of the lending process that takes into account that loan officers must spend time and effort to originate new loans. Besides generating predictions on loan officers’ compensation and its interaction with the loan review process, the model sheds light on why competition could lead to excessively low lending standards. We also show how more intense competition may hasten the adoption of credit scoring. More generally, hard-information lending techniques such as credit scoring make it possible to give loan officers high-powered incentives without compromising the integrity and quality of the loan approval process. The model is finally applied to study the implications of loan sales for the adopted lending process and lending standard.
We present a simple model of personal finance in which an incumbent lender has an information advantage vis-a-vis both potential competitors and households. In order to extract more consumer surplus, a lender with sufficient market power may engage in "irresponsible"lending, approving credit even if this is knowingly against a household’s best interest. Unless rival lenders are equally well informed, competition may reduce welfare. This holds, in particular, if less informed rivals can free ride on the incumbent’s superior screening ability.
This paper presents a novel model of the lending process that takes into account that loan officers must spend time and effort to originate new loans. Besides generating predictions on loan officers’ compensation and its interaction with the loan review process, the model sheds light on why competition could lead to excessively low lending standards. We also show how more intense competition may hasten the adoption of credit scoring. More generally, hard-information lending techniques such as credit scoring make it possible to give loan officers high-powered incentives without compromising the integrity and quality of the loan approval process.
We analyze how two key managerial tasks interact: that of growing the business through creating new investment opportunities and that of providing accurate information about these opportunities in the corporate budgeting process. We show how this interaction endogenously biases managers toward overinvesting in their own projects. This bias is exacerbated if managers compete for limited resources in an internal capital market, which provides us with a novel theory of the boundaries of the firm. Finally, managers of more risky and less profitable divisions should obtain steeper incentives to facilitate efficient investment decisions.
We consider an imperfectly competitive loan market in which a local relationship lender has an information advantage vis-à-vis distant transaction lenders. Competitive pressure from the transaction lenders prevents the local lender from extracting the full surplus from projects, so that she inefficiently rejects marginally profitable projects. Collateral mitigates the inefficiency by increasing the local lender’s payoff from precisely those marginal projects that she inefficiently rejects. The model predicts that, controlling for observable borrower risk, collateralized loans are more likely to default ex post, which is consistent with the empirical evidence. The model also predicts that borrowers for whom local lenders have a relatively smaller information advantage face higher collateral requirements, and that technological innovations that narrow the information advantage of local lenders, such as small business credit scoring, lead to a greater use of collateral in lending relationships. JEL classification: D82; G21. Keywords: Collateral; Soft information; Loan market competition; Relationship lending
This paper shows that active investors, such as venture capitalists, can affect the speed at which new ventures grow. In the absence of product market competition, new ventures financed by active investors grow faster initially, though in the long run those financed by passive investors are able to catch up. By contrast, in a competitive product market, new ventures financed by active investors may prey on rivals that are financed by passive investors by “strategically overinvesting” early on, resulting in long-run differences in investment, profits, and firm growth. The value of active investors is greater in highly competitive industries as well as in industries with learning curves, economies of scope, and network effects, as is typical for many “new economy” industries. For such industries, our model predicts that start-ups with access to venture capital may dominate their industry peers in the long run. JEL Classifications: G24; G32 Keywords: Venture capital; dynamic investment; product market competition
We study a model of “information-based entrenchment” in which the CEO has private information that the board needs to make an efficient replacement decision. Eliciting the CEO’s private information is costly, as it implies that the board must pay the CEO both higher severance pay and higher on-the-job pay. While higher CEO pay is associated with higher turnover in our model, there is too little turnover in equilibrium. Our model makes novel empirical predictions relating CEO turnover, severance pay, and on-the-job pay to firm-level attributes such as size, corporate governance, and the quality of the firm’s accounting system.
This paper argues that banks must be sufficiently levered to have first-best incentives to make new risky loans. This result, which is at odds with the notion that leverage invariably leads to excessive risk taking, derives from two key premises that focus squarely on the role of banks as informed lenders. First, banks finance projects that they do not own, which implies that they cannot extract all the profits. Second, banks conduct a credit risk analysis before making new loans. Our model may help understand why banks take on additional unsecured debt, such as unsecured deposits and subordinated loans, over and above their existing deposit base. It may also help understand why banks and finance companies have similar leverage ratios, even though the latter are not deposit takers and hence not subject to the same regulatory capital requirements as banks.
This article shows that investors financing a portfolio of projects may use the depth of their financial pockets to overcome entrepreneurial incentive problems. Competition for scarce informed capital at the refinancing stage strengthens investors’ bargaining positions. And yet, entrepreneurs’ incentives may be improved, because projects funded by investors with ‘‘shallow pockets’’ must have not only a positive net present value at the refinancing stage, but one that is higher than that of competing portfolio projects. Our article may help understand provisions used in venture capital finance that limit a fund’s initial capital and make it difficult to add more capital once the initial venture capital fund is raised. (JEL G24, G31)
This paper shows that investors financing a portfolio of projects may use the depth of their financial pockets to overcome entrepreneurial incentive problems. Competition for scarce informed capital at the refinancing stage strengthens investors’ bargaining positions. And yet, entrepreneurs’ incentives may be improved, because projects funded by investors with “shallow pockets” must have not only a positive net present value at the refinancing stage, but one that is higher than that of competing portfolio projects. Our paper may help to understand provisions used in venture capital finance that limit a fund’s initial capital and make it difficult to add more capital once the initial venture capital fund is raised.
Misselling through agents
(2009)
This paper analyzes the implications of the inherent conflict between two tasks performed by direct marketing agents: prospecting for customers and advising on the product's "suitability" for the specific needs of customers. When structuring sales-force compensation, firms trade off the expected losses from "misselling" unsuitable products with the agency costs of providing marketing incentives. We characterize how the equilibrium amount of misselling (and thus the scope of policy intervention) depends on features of the agency problem including: the internal organization of a firm's sales process, the transparency of its commission structure, and the steepness of its agents' sales incentives. JEL Classification: D18 (Consumer Protection), D83 (Search; Learning; Information and Knowledge), M31 (Marketing), M52 (Compensation and Compensation Methods and Their Effects).
In this paper, we provide some reflections on the development of monetary theory and monetary policy over the last 150 years. Rather than presenting an encompassing overview, which would be overambitious, we simply concentrate on a few selected aspects that we view as milestones in the development of this subject. We also try to illustrate some of the interactions with the political and financial system, academic discussion and the views and actions of central banks.
In his speech at the conference “The SNB and its Watchers”, Otmar Issing, member of the ECB Governing Council from its start in 1998 until 2006, looks back at more than twenty years of the conference series “The ECB and Its Watchers”. In June 1999, Issing established this format together with Axel Weber, then Director of the Center for Financial Studies, to discuss the monetary policy strategy of the newly founded central bank on “neutral ground” with a broad circle of participants, that is, academics, bank economists and members of the media. At the annual conference, the ECB and its representatives would play an active role and engage in a lively exchange of views with the other participants. Over the years, Volker Wieland took over as organizer of the conference series, which was also adopted by other central banks. In his contribution at the second conference “The SNB and its Watchers”, Issing summarizes the experience gained from over twenty years of the ECB Watchers Conference.
The Eurosystem and the Deutsche Bundesbank will incur substantial losses in 2023 that are likely to persist for several years. Due to the massive purchases of securities in the last 10 years, especially of government bonds, the banks' excess reserves have risen sharply. The resulting high interest payments to the banks since the turnaround in monetary policy, combined with little income from the large-scale securities holdings, have led to massive criticism. The banks were said to be making "unfair" profits as a result, while the fiscal authorities had to forego the previously customary transfers of central bank profits. Populist demands to limit bank profits by, for example, drastically increasing the minimum reserve ratios in the Eurosystem to reduce excess reserves would create severe new problems and are neither justified nor helpful. Ultimately, the EU member states have benefited for a very long time from historically low interest rates because of the Eurosystem's extraordinarily loose monetary policy and must now bear the flip side of the massive expansion of central bank balance sheets during the necessary period of monetary policy normalisation.
The so-called Troika, consisting of the EU Commission, the European Central Bank (ECB) and the International Monetary Fund (IMF), was supposed to support the member states of the euro area which had been hit hard by a sovereign debt crisis. For that purpose, economic adjustment programs were drafted and monitored in order to prevent the break-up of the euro area and sovereign defaults. The cooperation of these institutions, which was born out of necessity, has been partly successful, but has also created persistent problems. With the further increase of public debt, especially in France and Italy, the danger of a renewed crisis in the euro area has been growing. The European Stability Mechanism (ESM) together with the European Commission will replace the Troika in the future, following decisions of the EU Summit of December 2018. It shall play the role of a European Monetary Fund in the event of a crisis. The IMF, on the other hand, will no longer play an active role in solving sovereign debt crises in the euro area. The current course is, however, inadequate to tackle the core problems of the euro area and to avoid future crises, which are mainly structural in nature and due to escalating public debt and lack of international competitiveness of some member countries. The current Corona crisis will aggravate the institutional problems. It has led to a common European fiscal response ("Next Generation EU"). This rescue and recovery program will not be financed by ESM resources and will not be monitored by the ESM. One important novelty of this package is that it involves the issuance of substantial common European debt.
Debt levels in the eurozone have reached new record highs. The member countries have tried to cushion the economic consequences of the corona pandemic with a massive increase in government spending. By the end of 2021, public debt in relation to GDP will approach 100% on average. There are various calls to abolish or soften the Maastricht rules limiting sovereign debt. We see the risk of a new sovereign debt crisis in this decade if it is not possible to bring public debt down to an acceptable level. Our new fiscal rule would be suitable and appropriate for this purpose, because the Maastricht criteria have obviously failed. In contrast to the rigid 3% Maastricht criterion, our rule is flexible and addresses the main problem: excessively high public debt ratios. It also lowers the existing incentives for highly indebted governments to exert expansionary pressure on monetary policy. If obeyed strictly, our rule reinforces the snowball effect and reduces the excessively high debt ratios within a manageable period, even if nominal growth is weak. This is confirmed by simulations with different scenarios as well as by the hypothetical application of the new fiscal rule to eurozone economies from 2022 to 2026. Finally, we take up the recent proposal by ESM economists to increase the permissible debt ratio from 60 to 100% of GDP in the eurozone.
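The snowball effect behind such debt simulations is the standard debt-ratio recursion b' = b(1+i)/(1+g) - pb, with debt ratio b, nominal interest rate i, nominal growth g and primary balance pb. A minimal sketch, with all parameter values purely illustrative and not taken from the paper:

```python
def debt_path(b0, i, g, pb, years):
    """Iterate the debt-ratio recursion b' = b * (1 + i) / (1 + g) - pb
    and return the full path, including the starting ratio."""
    b = b0
    path = [b]
    for _ in range(years):
        b = b * (1 + i) / (1 + g) - pb
        path.append(b)
    return path

# Nominal growth slightly above the interest rate plus a small primary
# surplus: the snowball works favorably and the ratio declines gradually.
path = debt_path(b0=1.00, i=0.02, g=0.03, pb=0.005, years=5)
print([round(b, 3) for b in path])
```

Flipping the sign of the interest-growth differential (i > g) turns the same arithmetic into the adverse snowball that makes high debt ratios self-reinforcing.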
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates and term premia, is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates, and term premia is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
Rising temperatures, falling ratings: the effect of climate change on sovereign creditworthiness
(2021)
How will a changing climate impact the creditworthiness of governments over the very long term? Financial markets need credible, digestible information on how climate change translates into material risks. To bridge the gap between climate science and real-world financial indicators, the authors simulate the effect of climate change on sovereign credit ratings for 108 countries, creating the world’s first climate-adjusted sovereign credit rating. The study offers a first methodological approach to extend the long-term rating to an ultra-long-term reality, aiming at long-term investors, but also regulators and rating agencies.
This paper proposes a new approach for modeling investor fear after rare disasters. The key element is to take into account that investors’ information about fundamentals driving rare downward jumps in the dividend process is not perfect. Bayesian learning implies that beliefs about the likelihood of rare disasters drop to a much more pessimistic level once a disaster has occurred. Such a shift in beliefs can trigger massive declines in price-dividend ratios. Pessimistic beliefs persist for some time. Thus, belief dynamics are a source of apparent excess volatility relative to a rational expectations benchmark. Due to the low frequency of disasters, even an infinitely-lived investor will remain uncertain about the exact probability. Our analysis is conducted in continuous time and offers closed-form solutions for asset prices. We distinguish between rational and adaptive Bayesian learning. Rational learners account for the possibility of future changes in beliefs in determining their demand for risky assets, while adaptive learners take beliefs as given. Thus, risky assets tend to be lower-valued and price-dividend ratios vary less under adaptive versus rational learning for identical priors. Keywords: beliefs, Bayesian learning, controlled diffusions and jump processes, learning about jumps, adaptive learning, rational learning. JEL classification: D83, G11, C11, D91, E21, D81, C61
This paper studies the macro-financial implications of using carbon prices to achieve ambitious greenhouse gas (GHG) emission reduction targets. My empirical evidence shows a 0.6% output loss and a 0.3% rise in inflation in response to a 1% carbon policy shock. I also observe financial instability and reallocation effects between the clean and highly polluting energy sectors. To better predict the medium- and long-term impact, I use a medium-large macro-financial DSGE model with environmental aspects to show the recessionary effect of implementing an ambitious carbon price to achieve climate targets: a 40% reduction in GHG emissions causes a 0.7% output loss, while reaching a zero-emission economy in 30 years causes a 2.6% output loss. I document an amplifying effect of the banking sector along the transition path. The paper also uncovers the beneficial role of pre-announcing carbon policies, which mitigates inflation volatility by 0.2% at its peak; my results thus favor well-communicated carbon policies and investment to expand the green sector. My findings also stress the use of optimal green monetary and financial policies in mitigating the effects of transition risk and assisting the transition to a zero-emission world. Utilizing a heterogeneous approach with macroprudential tools, I find that optimal macroprudential tools can mitigate the output loss by 0.1% and the investment loss by 1%. Importantly, my work highlights the use of capital flow management in the green transition when a global cooperative solution is challenging.
In this paper, we construct a Dynamic Stochastic General Equilibrium (DSGE) model to examine the implications of dual rates for green lending. We demonstrate that implementing a distinct interest rate for banks engaged in green lending can effectively mitigate transition risks while channeling more capital towards green production sectors and firms, supporting an immediate cut in emissions and the target of a net-zero-emission economy.
The authors embed human capital-based endogenous growth into a New-Keynesian model with search and matching frictions in the labor market and skill obsolescence from long-term unemployment. The model can account for key features of the Great Recession: a decline in productivity growth, the relative stability of inflation despite a pronounced fall in output (the "missing disinflation puzzle"), and a permanent gap between output and the pre-crisis trend output.
In the model, lower aggregate demand raises unemployment and the training costs associated with skill obsolescence. Lower employment hinders learning-by-doing, which slows down human capital accumulation, feeding back into even fewer vacancies than justified by the demand shock alone. These feedback channels mitigate the disinflationary effect of the demand shock while amplifying its contractionary effect on output. The temporary growth slowdown translates into output hysteresis (permanently lower output and labor productivity).
Central banks normally accept debt of their own governments as collateral in liquidity operations without reservations. This gives rise to a valuable liquidity premium that reduces the cost of government finance. The ECB is an interesting exception in this respect. It relies on external assessments of the creditworthiness of its member states, such as credit ratings, to determine eligibility and the haircut it imposes on such debt. The authors show how such features in a central bank’s collateral framework can give rise to cliff effects and multiple equilibria in bond yields and increase the vulnerability of governments to external shocks. This can potentially induce sovereign debt crises and defaults that would not otherwise arise.
This paper characterises optimal monetary policy in an economy with endogenous firm entry, a cash-in-advance constraint and preset wages. Firms must make profits to cover entry costs; thus the markup on goods prices is efficient. However, because leisure is not priced at a markup, the consumption-leisure tradeoff is distorted. Consequently, the real wage, hours and production are suboptimally low. Due to the labour requirement in entry, insufficient labour supply also implies that entry is too low. The paper shows that in the absence of fiscal instruments such as labour income subsidies, the optimal monetary policy under sticky wages achieves higher welfare than under flexible wages. The policy maker uses the money supply instrument to raise the real wage - the cost of leisure - above its flexible-wage level, in response to expansionary shocks to productivity and entry costs. This raises labour supply, expanding production and firm entry.
How do changes in market structure affect the US business cycle? We estimate a monetary DSGE model with endogenous firm/product entry and a translog expenditure function by Bayesian methods. The dynamics of net business formation allow us to identify the 'competition effect', by which desired price markups and inflation decrease when entry rises. We find that a 1 percent increase in the number of competitors lowers desired markups by 0.18 percent. Most of the cyclical variability in inflation is driven by markup fluctuations due to sticky prices or exogenous shocks rather than endogenous changes in desired markups.
This paper investigates the effect of a change in the informational environment of borrowers on the organizational design of bank lending. We use micro-data from a large multinational bank and exploit the sudden introduction of a credit registry, an information-sharing mechanism across banks, for a subset of borrowers. Using within-borrower and within-loan-officer variation in a difference-in-differences empirical design, we show that the expansion of the credit registry led to an improvement in the allocation of credit to affected borrowers. There was a concurrent change in the organizational structure of the bank, involving a dramatic increase in the delegation of lending decisions on affected borrowers to loan officers. We also find a significant expansion in the scope of activities of loan officers who deal primarily with affected borrowers, as well as of their superiors. There is suggestive evidence that larger banks in the economy were better able to implement changes similar to our bank's. We argue that these patterns can be understood within the framework of incentive-based and information-processing-cost theories. Our findings could help rationalize why improvements in the information environment of borrowers may be altering the landscape of lending by moving decisions outside the boundaries of financial intermediaries.
There is substantial disagreement about the consequences of the Tax Cuts and Jobs Act (TCJA) of 2017, which constitutes the most extensive tax reform in the United States in more than 30 years. Using a large-scale two-country dynamic general equilibrium model with nominal rigidities, we find that the TCJA increases GDP by about 2% in the medium run and by about 2.5% in the long run. The short-run impact depends crucially on the degree and costs of variable capital utilization, with GDP effects ranging from 1 to 3%. At the same time, the TCJA does not pay for itself. In our analysis, the reform decreases tax revenues and raises the debt-to-GDP ratio by about 15 percentage points in the medium run, until 2025. We show that combining the TCJA with spending cuts can dampen the increase in government indebtedness without reducing its expansionary effect.
On the accuracy of linear DSGE solution methods and the consequences for log-normal asset pricing
(2021)
This paper demonstrates a failure of standard, generalized Schur (or QZ) decomposition based solution methods for linear dynamic stochastic general equilibrium (DSGE) models when there is insufficient eigenvalue separation about the unit circle. The significance of this is demonstrated in a simple production-based asset pricing model with external habit formation. While the exact solution afforded by the simplicity of the model matches post-war US consumption growth and the equity premium, QZ-based numerical solutions miss the latter by many annualized percentage points.
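The eigenvalue separation about the unit circle that the abstract identifies as the failure point can be made concrete with a minimal sketch. The following is my own illustration on a toy 2x2 matrix pencil, not the paper's asset pricing model: `scipy.linalg.ordqz` computes and reorders the generalized Schur (QZ) form so that eigenvalues inside the unit circle come first, exactly as DSGE solvers do to split stable from unstable dynamics; when the eigenvalues straddling the unit circle are barely separated, this split becomes ill-conditioned.

```python
import numpy as np
from scipy.linalg import ordqz

# Toy matrix pencil (A, B): the generalized eigenvalues lambda = alpha/beta
# solving A x = lambda B x play the role of the model's transition roots.
# 0.99 and 1.01 sit on opposite sides of the unit circle but are barely
# separated -- the situation in which QZ-based DSGE solvers lose accuracy.
A = np.array([[0.99, 0.0],
              [0.0, 1.01]])
B = np.eye(2)

# Reorder the generalized Schur form so eigenvalues inside the unit
# circle ('iuc') come first, as done when selecting the stable subspace.
AA, BB, alpha, beta, Q, Z = ordqz(A, B, sort='iuc')
lam = alpha / beta

# The gap about the unit circle that governs the conditioning of the split.
separation = abs(lam[1]) - abs(lam[0])
```

On this diagonal example the separation is only 0.02; shrinking it further drives the stable/unstable split toward singularity.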
This paper presents and compares Bernoulli iterative approaches for solving linear DSGE models. The methods are compared using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). I find that Bernoulli methods compare favorably to QZ in solving DSGE models, providing similar accuracy as measured by the forward error of the solution at a comparable computational burden. The method can guarantee convergence to a particular, e.g., unique stable, solution and can be combined with other iterative methods, such as the Newton method, lending itself especially to refining solutions.
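The basic idea behind a Bernoulli-type iteration can be sketched as follows. This is a minimal illustration of the generic scheme, not necessarily the paper's exact algorithm: the recursive solution of a linear DSGE model solves a matrix quadratic A X² + B X + C = 0, and the fixed-point map X ← -(A X + B)⁻¹ C converges linearly, under standard eigenvalue-separation conditions, to the minimal (stable) solvent.

```python
import numpy as np

def bernoulli_solve(A, B, C, tol=1e-12, max_iter=500):
    """Bernoulli-type fixed-point iteration for the matrix quadratic
    A X^2 + B X + C = 0: iterate X <- -(A X + B)^{-1} C, which (under
    standard eigenvalue-separation conditions) converges linearly to
    the minimal solvent -- the stable solution DSGE methods select."""
    n = B.shape[0]
    X = np.zeros((n, n))
    for _ in range(max_iter):
        X_new = -np.linalg.solve(A @ X + B, C)
        if np.linalg.norm(X_new - X) < tol:
            return X_new
        X = X_new
    return X
```

On a diagonal test problem with A = I, B = diag(-3, -7), C = diag(2, 10), the scalar quadratics x² - 3x + 2 and x² - 7x + 10 have roots {1, 2} and {2, 5}, and the iteration picks the smaller root of each, i.e. X converges to diag(1, 2).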
This paper develops and implements a backward and forward error analysis of, and condition numbers for, the numerical stability of the solutions of linear dynamic stochastic general equilibrium (DSGE) models. Comparing seven different solution methods from the literature, I demonstrate an economically significant loss of accuracy specifically in standard, generalized Schur (or QZ) decomposition based solution methods resulting from large backward errors in solving the associated matrix quadratic problem. This is illustrated in the monetary macro model of Smets and Wouters (2007) and two production-based asset pricing models, a simple model of external habits with a readily available symbolic solution and the model of Jermann (1998) that lacks such a symbolic solution - QZ-based numerical solutions miss the equity premium by up to several annualized percentage points for parameterizations that either match the chosen calibration targets or lie near the parameterizations in the literature. While the numerical solution methods from the literature failed to give any indication of these potential errors, easily implementable backward-error metrics and condition numbers are shown to successfully warn of such potential inaccuracies. The analysis is then performed for a database of roughly 100 DSGE models from the literature and a large set of draws from the model of Smets and Wouters (2007). While economically relevant errors do not appear pervasive from these latter applications, accuracies that differ by several orders of magnitude persist.
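A residual-based backward-error metric of the "easily implementable" kind the abstract mentions can be sketched in a few lines. This is my own illustration of a standard normwise formula for matrix polynomials, not necessarily the paper's exact definition:

```python
import numpy as np

def quad_backward_error(A, B, C, X):
    """Normwise, residual-based backward error of a computed solvent X
    of the matrix quadratic A X^2 + B X + C = 0 (Frobenius norms).
    A value near machine epsilon indicates a backward-stable solution;
    large values warn of the inaccuracies QZ-based solvers can produce."""
    nX = np.linalg.norm(X)
    resid = np.linalg.norm(A @ X @ X + B @ X + C)
    scale = (np.linalg.norm(A) * nX**2
             + np.linalg.norm(B) * nX
             + np.linalg.norm(C))
    return resid / scale
```

An exact solvent returns (numerically) zero, while even a small perturbation of the solvent produces a backward error orders of magnitude above machine epsilon, which is the warning signal the paper's metrics exploit.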
I provide a solution method in the frequency domain for multivariate linear rational expectations models. The method works with the generalized Schur decomposition, providing a numerical implementation of the underlying analytic function solution methods suitable for standard DSGE estimation and analysis procedures. This approach generalizes the time-domain restriction of autoregressive-moving average exogenous driving forces to arbitrary covariance-stationary processes. Applied to the standard New Keynesian model, I find that a Bayesian analysis favors a single-parameter log-harmonic function of the lag operator over the usual AR(1) assumption, as it generates hump-shaped autocorrelation patterns more consistent with the data.
The authors relax the standard assumption in the dynamic stochastic general equilibrium (DSGE) literature that exogenous processes are governed by AR(1) processes and estimate ARMA (p,q) orders and parameters of exogenous processes. Methodologically, they contribute to the Bayesian DSGE literature by using Reversible Jump Markov Chain Monte Carlo (RJMCMC) to sample from the unknown ARMA orders and their associated parameter spaces of varying dimensions.
In estimating the technology process in the neoclassical growth model using post-war US GDP data, they cast considerable doubt on the standard AR(1) assumption in favor of higher-order processes. They find that the posterior concentrates density on hump-shaped impulse responses for all endogenous variables, consistent with alternative empirical estimates and the rigidities behind many richer structural models. Sampling from noninvertible MA representations, a negative response of hours to a positive technology shock is contained within the posterior credible set. While the posterior contains significant uncertainty regarding the exact order, the results are insensitive to the choice of data filter; this contrasts with the authors' ARMA estimates of GDP itself, which vary significantly depending on the choice of HP or first-difference filter.
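The hump-shaped impulse responses that higher-order ARMA processes generate, and an AR(1) cannot, are easy to illustrate. The coefficients below are my own toy values, not the authors' estimates; the function computes the moving-average (impulse response) coefficients of an ARMA(p, q) process by the usual recursion.

```python
import numpy as np

def arma_irf(ar, ma, horizon):
    """Impulse response psi_h of the ARMA process
    y_t = sum_i ar[i] * y_{t-i} + e_t + sum_j ma[j] * e_{t-j},
    computed by the standard recursion on the MA(inf) coefficients."""
    psi = np.zeros(horizon)
    for h in range(horizon):
        acc = 1.0 if h == 0 else 0.0          # contemporaneous shock
        if 1 <= h <= len(ma):
            acc += ma[h - 1]                  # direct MA term
        for i, a in enumerate(ar, start=1):
            if h - i >= 0:
                acc += a * psi[h - i]         # propagation via AR terms
        psi[h] = acc
    return psi
```

With ar = [1.3, -0.4] and ma = [0.5], the response rises before decaying (a hump peaking after impact), whereas an AR(1) response declines monotonically from the start.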
The authors present and compare Newton-based methods from the applied mathematics literature for solving the matrix quadratic that underlies the recursive solution of linear DSGE models. The methods are compared using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007) iteratively. They find that Newton-based methods compare favorably in solving DSGE models, providing higher accuracy as measured by the forward error of the solution at a comparable computation burden. The methods, however, suffer from their inability to guarantee convergence to a particular, e.g. unique stable, solution, but their iterative procedures lend themselves to refining solutions either from different methods or parameterizations.
Highlights
• Six Newton methods for solving matrix quadratic equations in linear DSGE models.
• Compared to QZ using 99 different DSGE models including Smets and Wouters (2007).
• Newton methods more accurate than QZ with comparable computation burden.
• Apt for refining solutions from alternative methods or nearby parameterizations.
Abstract
This paper presents and compares Newton-based methods from the applied mathematics literature for solving the matrix quadratic that underlies the recursive solution of linear DSGE models. The methods are compared using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007) iteratively. We find that Newton-based methods compare favorably in solving DSGE models, providing higher accuracy as measured by the forward error of the solution at a comparable computation burden. The methods, however, suffer from their inability to guarantee convergence to a particular, e.g. unique stable, solution, but their iterative procedures lend themselves to refining solutions either from different methods or parameterizations.
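The common core of Newton methods for this problem can be sketched compactly. Linearizing F(X) = A X² + B X + C at the current iterate gives a generalized Sylvester equation for the update; the Kronecker-vectorization solve below is my own minimal illustration, fine for small matrices (the paper's methods presumably use more efficient Sylvester solvers).

```python
import numpy as np

def newton_solve(A, B, C, X0, tol=1e-12, max_iter=50):
    """Newton's method for the matrix quadratic F(X) = A X^2 + B X + C = 0.
    Each step solves the generalized Sylvester equation
        (A X + B) H + A H X = -F(X)
    for the update H, here via Kronecker vectorization (column-major vec):
        [I (x) (A X + B) + X^T (x) A] vec(H) = -vec(F(X))."""
    n = X0.shape[0]
    X = X0.copy()
    for _ in range(max_iter):
        F = A @ X @ X + B @ X + C
        if np.linalg.norm(F) < tol:
            break
        M = np.kron(np.eye(n), A @ X + B) + np.kron(X.T, A)
        H = np.linalg.solve(M, -F.reshape(-1, order='F'))
        X = X + H.reshape((n, n), order='F')
    return X
```

Convergence is locally quadratic, but, as the abstract notes, which solvent the iteration converges to depends on the starting point, so there is no built-in guarantee of landing on the unique stable solution.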
The authors propose a new method to forecast macroeconomic variables that combines two existing approaches to mixed-frequency data in DSGE models. The first existing approach estimates the DSGE model at a quarterly frequency and uses higher-frequency auxiliary data only for forecasting. The second method transforms a quarterly state space into a monthly frequency. Their algorithm combines the advantages of these two existing approaches. They compare the new method with the existing methods using simulated data and real-world data. With simulated data, the new method outperforms all other methods, including forecasts from the standard quarterly model. With real-world data, incorporating auxiliary variables as in their method substantially decreases forecasting errors for recessions, but casting the model in a monthly frequency delivers better forecasts in normal times.
We present determinacy bounds on monetary policy in the sticky information model. We find that these bounds are more conservative here when the long run Phillips curve is vertical than in the standard Calvo sticky price New Keynesian model. Specifically, the Taylor principle is now necessary directly - no amount of output targeting can substitute for the monetary authority’s concern for inflation. These determinacy bounds are obtained by appealing to frequency domain techniques that themselves provide novel interpretations of the Phillips curve.
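The paper's benchmark, the standard Calvo sticky-price New Keynesian model, admits a simple numerical determinacy check against which the sticky-information bounds can be contrasted. The three-equation system and parameter values below are my own textbook illustration, not the paper's sticky-information model: writing the system as E_t z_{t+1} = M z_t with z = (output gap, inflation)', both variables are non-predetermined, so determinacy requires both eigenvalues of M to lie outside the unit circle.

```python
import numpy as np

def is_determinate(phi_pi, phi_y, beta=0.99, kappa=0.1, sigma=1.0):
    """Determinacy check for the textbook three-equation New Keynesian
    model (Calvo sticky prices) under the rule i_t = phi_pi*pi_t + phi_y*x_t.
    Substituting the rule into the IS and Phillips curves yields
    E_t z_{t+1} = M z_t with z = (x, pi)'; with two jump variables,
    determinacy requires both eigenvalues of M outside the unit circle."""
    M = np.array([
        [1.0 + phi_y / sigma + kappa / (sigma * beta), (phi_pi - 1.0 / beta) / sigma],
        [-kappa / beta, 1.0 / beta],
    ])
    return bool(np.all(np.abs(np.linalg.eigvals(M)) > 1.0))
```

The numerical check reproduces the analytic Calvo-model bound phi_pi + (1 - beta) * phi_y / kappa > 1. Note that in this benchmark sufficiently aggressive output targeting can substitute for inflation targeting (e.g. phi_pi = 0.9 with phi_y = 2 is determinate), which is precisely the substitution the abstract reports fails under sticky information.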
The authors examine the effectiveness of labor cost reductions as a means to stimulate economic activity and assess the differences that may arise under the prevailing exchange rate regime. They develop a medium-scale three-region DSGE model and show that the impact of a cut in the employers' social security contribution rate does not vary significantly under different exchange rate regimes. They find that both the interest rate and the exchange rate channel matter. Furthermore, the measure appears to be effective even if it comes along with a consumption tax increase to preserve long-term fiscal sustainability.
Finally, they assess whether the theoretical results hold up empirically by applying the local projection method. Regression results suggest that changes in employers' social security contribution rates have statistically significant real effects – a one percentage point reduction leads to an average cumulative rise in output of around 1.3 percent in the medium term. Moreover, the outcome does not differ significantly across the different exchange rate regimes.
Helmut Schlesinger: trailblazer and guarantor of German monetary and stability policy turns 100
(2024)
On 4 September 2024, Professor Dr. Helmut Schlesinger celebrates his 100th birthday. From 1991 to 1993 he served as President of the Deutsche Bundesbank. Before that, he worked for the Bank in various positions, including as its long-serving Vice President (from 1980 to 1991) and as head of the Economics and Statistics Department. The anniversary provides an occasion to describe and honor his life's work. For former staff members, Helmut Schlesinger was a great role model and a source of motivation in many respects. Four areas of his work in particular shaped the work of his staff: first, his ability to understand, convey and organize economic thinking as a synthesis of analysis and statistics; second, his achievement in helping to create and preserve a culture of stability in leading positions; third, his ordoliberal credo on price stability and central bank independence; and fourth, his clear ideas about the conditions for a successful European Economic and Monetary Union.
The following provides an overview of these four focal points of his career. In this context, Schlesinger's decisive role in the creation of the German-German monetary union in 1990 and in the long process that produced the Eurosystem and the European Central Bank deserves particular emphasis. Highly respected among the German public as well as internationally, Helmut Schlesinger was often called the "soul of the Bundesbank". The demands he placed on each individual were high. He was held in great esteem by his staff, not least because of his strong work ethic and his tireless energy, which were marked by constancy, straightforwardness and fidelity to principle.
This treatise is a revised and extended version of the author's Guest Lecture on "Demystifying Hedge Funds", hosted by the Institute for Monetary and Financial Stability on 19 June 2006.
What happened in Cyprus? The economic consequences of the last communist government in Europe
(2014)
This paper reviews developments in the Cypriot economy following the introduction of the euro on 1 January 2008 and leading to the economic collapse of the island five years later. The main cause of the collapse is identified with the election of a communist government in February 2008, within two months of the introduction of the euro, and its subsequent choices for action and inaction on economic policy matters. The government allowed a rapid deterioration of public finances, and despite repeated warnings, damaged the country's creditworthiness and lost market access in May 2011. The destruction of the island's largest power station in July 2011 subsequently threw the economy into recession. Together with the intensification of the euro area crisis in the summer and fall of 2011, these events weakened the banking system which was vulnerable due to its exposure in Greece. Rather than deal with its fiscal crisis, the government secured a loan from the Russian government that allowed it to postpone action until after the February 2013 election. Rather than protect the banking system, losses were imposed on banks and a campaign against them was coordinated and used as a platform by the communist party for the February 2013 election. The strategy succeeded in delaying resolution of the crisis and avoiding short-term political cost for the communist party before the election, but also in precipitating a catastrophe right after the election.
Under ordinary circumstances, the fiscal implications of central bank policies tend to be seen as relatively minor and escape close scrutiny. The global financial crisis of 2008, however, demanded an extraordinary response by central banks which brought to light the immense power of central bank balance sheet policies as well as their major fiscal implications. Once the zero lower bound on interest rates is reached, expanding a central bank’s balance sheet becomes the central instrument for providing additional monetary policy accommodation. However, with interest rates near zero, the line separating fiscal and monetary policy is blurred. Furthermore, discretionary decisions associated with asset purchases and liquidity provision, as well as with lender-of-last-resort operations benefiting private entities, can have major distributional effects that are ordinarily associated with fiscal policy. In the euro area, discretionary central bank decisions can have immense distributional effects across member states. However, decisions of this nature are incompatible with the role of unelected officials in democratic societies. Drawing on the response to the crisis by the Federal Reserve and the ECB, this paper explores the tensions arising from central bank balance sheet policies and addresses pertinent questions about the governance and accountability of independent central banks in a democratic society.
The Federal Reserve’s muddled mandate to attain simultaneously the incompatible goals of maximum employment and price stability invites short-term-oriented discretionary policymaking inconsistent with the systematic approach needed for monetary policy to contribute best to the economy over time. Fear of liftoff—the reluctance to start the process of policy normalization after the end of a recession—serves as an example. Causes of the problem are discussed, drawing on public choice and cognitive psychology perspectives. The Federal Reserve could adopt a framework that relies on a simple policy rule subject to periodic reviews and adaptation. Replacing meeting-by-meeting discretion with a simple policy rule would eschew discretion in favor of systematic policy. Periodic review of the rule would allow the Federal Reserve the flexibility to account for and occasionally adapt to the evolving understanding of the economy. Congressional legislation could guide the Federal Reserve in this direction. However, the Federal Reserve may be best placed to select the simple rule and could embrace this improvement on its own, within its current mandate, with the publication of a simple rule along the lines of its statement of longer-run goals.
What institutional arrangements for an independent central bank with a price stability mandate promote good policy outcomes when unconventional policies become necessary? Unconventional monetary policy poses challenges. The large scale asset purchases needed to counteract the zero lower bound on nominal interest rates have uncomfortable fiscal and distributional consequences and require central banks to assume greater risks on their balance sheets.
In his paper, Athanasios Orphanides draws lessons from the experience of the Bank of Japan (BoJ) since the late 1990s for the institutional design of independent central banks. He comes to the conclusion that lack of clarity on the precise definition of price stability, coupled with concerns about the legitimacy of large balance sheet expansions, hinders policy: It encourages the central bank to eschew the decisive quantitative easing needed to reflate the economy and instead to accommodate too-low inflation. The BoJ’s experience with the zero lower bound suggests important benefits from a clear definition of price stability as a symmetric 2% goal for inflation, which the Bank adopted in 2013.
Following the experience of the global financial crisis, central banks have been asked to undertake unprecedented responsibilities. Governments and the public appear to have high expectations that monetary policy can provide solutions to problems that do not necessarily fit in the realm of traditional monetary policy. This paper examines three broad public policy goals that may overburden monetary policy: full employment; fiscal sustainability; and financial stability. While central banks have a crucial position in public policy, the appropriate policy mix also involves other institutions, and overreliance on monetary policy to achieve these goals is bound to disappoint. Central Bank policies that facilitate postponement of needed policy actions by governments may also have longer-term adverse consequences that could outweigh more immediate benefits. Overburdening monetary policy may eventually diminish and compromise the independence and credibility of the central bank, thereby reducing its effectiveness to preserve price stability and contribute to crisis management.
Are rules and boundaries sufficient to limit harmful central bank discretion? Lessons from Europe
(2014)
Marvin Goodfriend’s (2014) insightful, informative and provocative work explains concisely and convincingly why the Fed needs rules and boundaries. This paper reviews the broader institutional design problem regarding the effectiveness of the central bank in practice and confirms the need for rules and boundaries. The framework proposed for improving the Fed incorporates key elements that have already been adopted in the European Union. The case of ELA provision by the ECB and the Central Bank of Cyprus to Marfin-Laiki Bank during the crisis, however, suggests that the existence of rules and boundaries may not be enough to limit harmful discretion. During a crisis, novel interpretations of the legal authority of the central bank may be introduced to create a grey area that might be exploited to justify harmful discretionary decisions even in the presence of rules and boundaries. This raises the question of how to ensure that rules and boundaries are respected in practice.
Despite a number of helpful changes, including the adoption of an inflation target, the Fed’s monetary policy strategy proved insufficiently resilient in recent years. While the Fed eased policy appropriately during the pandemic, it fell behind the curve during the post-pandemic recovery. During 2021, the Fed kept easing policy while the inflation outlook was deteriorating and the economy was growing considerably faster than the economy’s natural growth rate—the sum of the Fed’s 2% inflation goal and the growth rate of potential output.
The resilience of the Fed’s monetary policy strategy could be enhanced, and such errors avoided, with guidance from a simple natural growth targeting rule that prescribes that the federal funds rate during each quarter be raised (cut) when projected nominal income growth exceeds (falls short of) the economy’s natural growth rate. An illustration with real-time data and forecasts since the early 1990s shows that Fed policy has not persistently deviated from this simple rule, with the notable exception of the period coinciding with the Fed’s post-pandemic policy error.