Central bank intervention in the form of quantitative easing (QE) during times of low interest rates is a controversial topic. The author introduces a novel approach to study the effectiveness of such unconventional measures. Using U.S. data on six key financial and macroeconomic variables between 1990 and 2015, the author estimates the economy with artificial neural networks. Historical counterfactual analyses show that real effects are less pronounced than yield effects.
Disentangling the effects of the individual asset purchase programs, impulse response functions provide evidence that QE becomes less effective the more the crisis is overcome. The peak effects of all QE interventions during the Financial Crisis amount to only 1.3 pp for GDP growth and 0.6 pp for inflation, respectively. Hence, both the timing and the volume of the interventions should be deliberated.
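The counterfactual exercise described above can be sketched in a few lines. The snippet below uses a linear one-step-ahead model as a stand-in for the paper's artificial neural networks; the data, the QE indicator, and all coefficients are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated quarterly data: GDP growth depends on its own lag and a QE
# indicator (entirely made-up numbers, not the paper's data).
T = 200
qe = (np.arange(T) > 100).astype(float)        # QE "on" in second half
growth = np.zeros(T)
for t in range(1, T):
    growth[t] = 0.5 * growth[t - 1] + 0.8 * qe[t] + rng.normal(0, 0.1)

# Fit the one-step-ahead model (a linear stand-in for the paper's
# artificial neural network).
X = np.column_stack([np.ones(T - 1), growth[:-1], qe[1:]])
beta, *_ = np.linalg.lstsq(X, growth[1:], rcond=None)

# Historical counterfactual: replay the economy with QE switched off.
cf = np.zeros(T)
for t in range(1, T):
    cf[t] = beta[0] + beta[1] * cf[t - 1]      # QE term set to zero

qe_effect = float(growth[150] - cf[150])       # effect at a QE-era date
print(round(qe_effect, 2))
```

The difference between the realized and counterfactual paths is the estimated intervention effect at that date; the paper performs the analogous comparison with a neural network over six variables.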
This study analyses potential consequences of exiting the Targeted Long-Term Refinancing Operations (TLTRO) of the European Central Bank (ECB). Thanks to its asset purchase programs, the Eurosystem still holds plenty of reserves even with a full exit from the TLTROs. This explains why voluntary and mandatory repayments of TLTRO III borrowing went smoothly. Nevertheless, the more liquidity is drained from the banking system, the more important interbank borrowing and lending become, ideally between euro area member states. At present, the usual fault lines of the euro area are showing up. The German banking system has plenty of reserves, while there are first signs of aggregate scarcity in the Italian banking system. This need not be a source of concern if the interbank market can be sufficiently reactivated. Moreover, the ECB has several tools to address possible future liquidity shortages.
This document was provided/prepared by the Economic Governance and EMU scrutiny Unit at the request of the ECON Committee.
The Eurosystem and the Deutsche Bundesbank will incur substantial losses in 2023 that are likely to persist for several years. Due to the massive purchases of securities over the last 10 years, especially of government bonds, banks' excess reserves have risen sharply. The resulting high interest payments to the banks since the turnaround in monetary policy, combined with little income from the large-scale securities holdings, have drawn massive criticism. The banks were said to be making "unfair" profits as a result, while the fiscal authorities had to forgo the previously customary transfers of central bank profits. Populist demands to limit bank profits, for example by drastically increasing minimum reserve ratios in the Eurosystem to reduce excess reserves, would create severe new problems and are neither justified nor helpful. Ultimately, the EU member states benefited for a very long time from historically low interest rates because of the Eurosystem's extraordinarily loose monetary policy and must now bear the flip side of the massive expansion of central bank balance sheets during the necessary period of monetary policy normalisation.
We study the redistributive effects of inflation combining administrative bank data with an information provision experiment during an episode of historic inflation. On average, households are well-informed about prevailing inflation and are concerned about its impact on their wealth; yet, while many households know about inflation eroding nominal assets, most are unaware of nominal-debt erosion. Once they receive information on the debt-erosion channel, households revise their beliefs about nominal debt and their own real net wealth upward. These changes in beliefs causally affect actual consumption and hypothetical debt decisions. Our findings suggest that real wealth mediates the sensitivity of consumption to inflation once households are aware of the wealth effects of inflation.
In the euro area, monetary policy is conducted by a single central bank for 20 member countries. However, countries are heterogeneous in their economic development, including their inflation rates. This paper combines a New Keynesian model and a neural network to assess whether the European Central Bank (ECB) conducted monetary policy between 2002 and 2022 according to the weighted average of the inflation rates within the European Monetary Union (EMU) or reacted more strongly to the inflation rate developments of certain EMU countries.
The New Keynesian model first generates data which are used to train and evaluate several machine learning algorithms. The authors find that a neural network performs best out-of-sample. They use this algorithm to classify historical EMU data and to determine the exact weight on the inflation rate of each EMU member in every quarter of the past two decades. Their findings suggest a disproportionate emphasis of the ECB on the inflation rates of EMU members that exhibited high inflation rate volatility for the vast majority of the time frame considered (80%), with a median inflation weight of 67% on these countries. They show that these results stem from a tendency of the ECB to react more strongly to countries whose inflation rates exhibit greater deviations from their long-term trend.
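The classification step can be illustrated on synthetic data. Below, a crude rule-comparison classifier stands in for the paper's neural network; the two country groups, the weights (0.5 vs. 0.9 on the volatile group), and all series are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic EMU-style data: inflation in a "stable" and a "volatile"
# country group (illustrative numbers, not ECB data).
n = 400
pi_stable = 2.0 + rng.normal(0, 0.3, n)
pi_volatile = 2.0 + rng.normal(0, 1.5, n)

# Two candidate policy regimes: react to the equally weighted average
# (regime 0) or mostly to the volatile group's inflation (regime 1).
regime = rng.integers(0, 2, n)
w = np.where(regime == 0, 0.5, 0.9)            # weight on volatile group
pi_policy = w * pi_volatile + (1 - w) * pi_stable
rate = 1.0 + 1.5 * (pi_policy - 2.0) + rng.normal(0, 0.1, n)

# Classify each quarter by which rule explains the observed rate better,
# a simple stand-in for the paper's trained neural-network classifier.
err0 = np.abs(rate - (1.0 + 1.5 * (0.5 * pi_volatile + 0.5 * pi_stable - 2.0)))
err1 = np.abs(rate - (1.0 + 1.5 * (0.9 * pi_volatile + 0.1 * pi_stable - 2.0)))
pred = (err1 < err0).astype(int)

accuracy = float((pred == regime).mean())
print(round(accuracy, 2))
```

In the paper this classification is done with a network trained on model-generated data; the point of the sketch is only that regime weights are recoverable from observed policy rates.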
There is much discussion today about a possible digital euro (PDE). Is this attention exaggerated? Are “central bank digital currencies” (CBDCs) “a solution in search of a problem”, as some have argued? This article summarizes the main facts about the PDE and concludes that, if the decision on adoption had to be taken today, the arguments against would outweigh those in favor. However, there may be future circumstances in which having a CBDC ready for use can indeed be useful. Therefore, preparing is a good thing, even if the odds of its usefulness in normal conditions are slim.
On 15 August 2017, the Bundesverfassungsgericht (BVerfG) referred the case against the European Central Bank’s policy of Quantitative Easing (QE) to the European Court of Justice (ECJ). The author argues that this event differs in several aspects from the OMT case in 2015 – in content as well as in form. The BVerfG recognizes that it is a legitimate goal of the ECB’s monetary policy to bring inflation up close to 2%, and that the instrument employed for QE is one of monetary policy. However, it doubts whether the sheer volume of QE would not distort the character of the program as one of monetary policy. The ECJ will now have to clarify the extent to which the ECJ’s findings in its OMT judgment are relevant for QE as well as the standard of review applicable to monetary policy. The author raises the questions of whether the principle of democracy under German constitutional law can actually provide the standard by which the ECB is to be measured, and how tight judicial review could be exercised over the ECB without encroaching upon its autonomy in monetary policy matters – and thus upon the very essence of central bank independence.
This paper analyzes the relationship between monetary policy and financial stability in the Banking Union. There is no uniform global model regarding the relationship between monetary policy-making on the one hand, and prudential supervision on the other. Before the crisis, EU Member States followed different approaches, some of them uniting monetary and supervisory functions in one institution, others assigning them to different, neatly separated institutions. The financial crisis has underlined that monetary policy and prudential supervision deeply affect each other, especially in case of systemic events. Even in normal times, monetary and supervisory decisions might conflict with each other. After the crisis, some jurisdictions have moved towards a more holistic approach under which monetary policy takes supervisory considerations into account, while supervisory decisions pay due regard to monetary policy.
The Banking Union puts prudential supervision in the hands of the European Central Bank (ECB), the institution responsible for monetary policy. Nevertheless, at its establishment there was the political understanding that the ECB should follow a policy of meticulous separation in the discharge of its different functions. This raises the question whether the ECB may pursue a holistic approach to monetary policy and supervisory decision-making, respectively. On the basis of a purposive reading of the monetary policy mandate and the SSM Regulation, the paper answers this question in the affirmative. Effective monetary policy (or supervision) requires financial stability (or smooth monetary policy transmission). Moreover, without a holistic approach, the SSM Regulation is more likely to provoke the adoption of mutually defeating decisions by the Governing Council. The reputation of the ECB would suffer considerably under such a situation, in a field where reputation is of paramount importance for effective policy.
As any meticulous separation between monetary and supervisory functions turns out to be infeasible, the paper explores the reasons. Parting from Katharina Pistor's legal theory of finance, which puts the emphasis on exogenous factors to explain the (non)enforcement of legal rules, the paper suggests a legal instability theorem which focuses on endogenous reasons, such as law's indeterminacy, contextuality, and responsiveness to democratic deliberation. This raises the question whether the holistic approach would be democratically legitimate under the current framework of the ESCB. The idea of technocratic legitimacy that exempts the ECB from representative structures is effectively called into question by the legal instability theorem. This does not imply that the independence of the ECB should be given up, as there are no viable alternatives to protect monetary policy against the time inconsistency problem. Rather, any solution might benefit from recognizing the ECB, in its mixed technocratic and political shape, as a centerpiece of European integration.
This paper undertakes a quantitative investigation of the effects of anticipated inflation on the distribution of household wealth and welfare. Data from the Survey of Consumer Finances (SCF) on household financial wealth suggest that about a third of the US population holds all its financial assets in transaction accounts. The remaining two-thirds of the US population hold most of their financial assets outside transaction accounts. To account for this evidence, I introduce portfolio choice into a standard incomplete markets model with heterogeneous agents. I calibrate the model economy to SCF 2010 US data and use this environment to study the distributive effects of changes in anticipated inflation. An increase in anticipated inflation leads households to reshuffle their portfolio towards real assets. This crowding-in of supply for real assets lowers equilibrium interest rates and thereby redistributes wealth from creditors to borrowers. Because borrowers have a higher marginal utility, this redistribution improves aggregate welfare. First, this paper shows that inflation acts not only as a regressive consumption tax, as in Erosa and Ventura (2002), but also as a progressive tax. Second, this paper shows that the welfare costs of inflation are even lower than the estimates computed by Lucas (2000) and Ireland (2009). Finally, this paper offers insights into why deflationary environments should be avoided.
No. And not only for the reason you think. In a world with multiple inefficiencies the single policy tool the central bank has control over will not undo all inefficiencies; this is well understood. We argue that the world is better characterized by multiple inefficiencies and multiple policy makers with various objectives. Asking the policy question only in terms of optimal monetary policy effectively turns the central bank into the residual claimant of all policy and gives the other policymakers a free hand in pursuing their own goals. This further worsens the tradeoffs faced by the central bank. The optimal monetary policy literature and the optimal simple rules often labeled flexible inflation targeting assign all of the cyclical policymaking duties to central banks. This distorts the policy discussion and narrows the policy choices to a suboptimal set. We highlight this issue and call for broader thinking about optimal policy.
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
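A first-difference rule of the kind evaluated above can be written down directly: the change in the policy rate responds to the inflation gap and to output growth, so no unobservable output-gap level is needed. The 0.5 coefficients and the example numbers below are illustrative placeholders, not the paper's estimates.

```python
# Simple first-difference interest rate rule: the rate is adjusted by
# the inflation gap and output growth, avoiding any dependence on a
# (mismeasured) output-gap level. Coefficients are illustrative.
def difference_rule(i_prev, inflation, growth, pi_star=2.0,
                    a_pi=0.5, a_dy=0.5):
    """Return the new policy rate under a simple difference rule."""
    return i_prev + a_pi * (inflation - pi_star) + a_dy * growth

# Example: rate at 2%, inflation 3%, output growth 1%.
rate = difference_rule(2.0, 3.0, 1.0)
print(rate)  # 3.0
```

Because only observed outcomes enter, the rule sidesteps the real-time output-gap mismeasurement the paper documents, which is exactly why it fares well across the 11 models.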
Central banks have recently introduced new policy initiatives, including a policy called ‘Quantitative Easing’ (QE). Since it has been argued by the Bank of England that “Standard economic models are of limited use in these unusual circumstances, and the empirical evidence is extremely limited” (Bank of England, 2009b), we have taken an entirely empirical approach and have focused on the QE experience on which substantial data are available, namely that of Japan (2001-2006). Recent literature on the effectiveness of QE has neglected any reference to final policy goals. In this paper, we adopt the view that ultimately effectiveness will be measured by whether it will be able to “boost spending” (Bank of England, 2009b) and “will ultimately be judged by their impact on the wider macroeconomy” (Bank of England, 2010). In line with a widely held view among leading macroeconomists from various persuasions, while attempting to stay agnostic and open-minded on the distribution of demand changes between real output and inflation, we have thus identified nominal GDP growth as the key final policy goal of monetary policy. The empirical research finds that the policy conducted by the Bank of Japan between 2001 and 2006 made little empirical difference, while an alternative policy targeting credit creation (the original definition of QE) would likely have been more successful.
The lessons from QE and other 'unconventional' monetary policies - evidence from the Bank of England
(2011)
This paper investigates the effectiveness of the ‘quantitative easing’ policy, as implemented by the Bank of England in March 2009. Similar policies had been previously implemented in Japan, the U.S. and the Eurozone. The effectiveness is measured by the impact of Bank of England policies (including, but not limited to QE) on nominal GDP growth – the declared goal of the policy, according to the Bank of England. Unlike the majority of the literature on the topic, the general-to-specific econometric modeling methodology (a.k.a. the ‘Hendry’ or ‘LSE’ methodology) is employed for this purpose. The empirical analysis indicates that QE as defined and announced in March 2009 had no apparent effect on the UK economy. Meanwhile, it is found that a policy of ‘quantitative easing’ defined in the original sense of the term (Werner, 1994) is supported by empirical evidence: a stable relationship between a lending aggregate (disaggregated M4 lending, i.e. bank credit for GDP transactions) and nominal GDP is found. The findings imply that BoE policy should more directly target the growth of bank credit for GDP-transactions.
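The core empirical relationship found in the paper, a stable link between a lending aggregate and nominal GDP, can be illustrated with a toy OLS fit. This is only a sketch on synthetic data, not the general-to-specific search itself; the numbers and the 0.9 coefficient are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic illustration of the paper's core relationship: nominal GDP
# growth moving with growth of bank credit for GDP transactions.
# All series and coefficients are made up for the example.
n = 120
credit_growth = rng.normal(5.0, 2.0, n)
ngdp_growth = 0.5 + 0.9 * credit_growth + rng.normal(0, 0.5, n)

# OLS fit, a minimal stand-in for the general-to-specific methodology.
X = np.column_stack([np.ones(n), credit_growth])
beta, *_ = np.linalg.lstsq(X, ngdp_growth, rcond=None)
slope = float(beta[1])
print(round(slope, 2))
```

A stable, well-determined slope of this kind is what underpins the paper's conclusion that policy should target the growth of bank credit for GDP transactions.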
The unintended consequences of the debt ... will increased government expenditure hurt the economy?
(2011)
In 2008, governments in many countries embarked on large fiscal expenditure programmes, with the intention to support the economy and prevent a more serious recession. In this study, the overall impact of a substantial increase in fiscal expenditure is considered by providing a novel analysis of the most relevant recent experience in similar circumstances, namely that of Japan in the 1990s. Then a weak economy with risk-averse banks seemed to require some of the largest peacetime fiscal stimulation programmes on record, albeit with disappointing results. The explanations provided by the literature and their unsatisfactory empirical record are reviewed. An alternative explanation, derived from early Keynesian models on the ineffectiveness of fiscal policy, is presented in the form of a modified Fisher equation, which incorporates the recent findings in the credit view literature. The model postulates complete quantity crowding out. It is subjected to empirical tests, which are supportive. Thus evidence is found that fiscal policy, if not supported by suitable monetary policy, is likely to crowd out private sector demand, even in an environment of falling or near-zero interest rates. As a policy conclusion it is pointed out that by changing the funding strategy, complete crowding out can be avoided and a positive net effect produced. The proposed framework creates common ground between proponents of Keynesian views (as held, among others, by Blinder and Solow), monetarist views (as held in particular by Milton Friedman) and those of leading contemporary macroeconomists (such as Mankiw).
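The complete-crowding-out logic can be shown with simple arithmetic. The function below is an illustrative toy consistent with the argument above, not the paper's modified Fisher equation, and all figures are invented.

```python
# Complete quantity crowding out, in the spirit of the credit-view
# argument: with bank credit creation fixed, total nominal spending is
# pinned down, so bond-financed government spending displaces private
# demand one-for-one. Illustrative numbers only.
def nominal_demand(credit_creation, gov_spending, bond_financed=True):
    """Total nominal demand under complete quantity crowding out."""
    if bond_financed:
        # Bond sales absorb private purchasing power: zero net effect.
        return credit_creation
    # Spending funded by newly created bank credit adds to demand.
    return credit_creation + gov_spending

base = nominal_demand(100.0, 10.0, bond_financed=True)
boost = nominal_demand(100.0, 10.0, bond_financed=False)
print(base, boost)  # 100.0 110.0
```

This is the sense in which "changing the funding strategy" avoids complete crowding out: the same expenditure has no net effect when bond-financed, but raises total demand when funded by credit creation.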
The recent financial crisis has highlighted the limits of the “originate to distribute” model of banking, but its nexus with the macroeconomy and monetary policy remains unexplored. I build a DSGE model with banks (along the lines of Holmström and Tirole [28] and Parlour and Plantin [39]) and examine its properties with and without active secondary markets for credit risk transfer. The possibility of transferring credit risk reduces the impact of liquidity shocks on bank balance sheets, but also reduces the banks' incentive to monitor. As a result, secondary markets allow banks to release capital and exacerbate the effect of productivity and other macroeconomic shocks on output and inflation. By offering a possibility of capital recycling and by reducing bank monitoring, secondary credit markets in general equilibrium allow banks to take on more risk.
Keywords: Credit Risk Transfer, Dual Moral Hazard, Monetary Policy, Liquidity, Welfare
JEL Classification: E3, E5, G3
First Draft: December 2009, This Draft: September 2010
This paper reviews the rationale for quantitative easing when central bank policy rates reach near zero levels in light of recent announcements regarding direct asset purchases by the Bank of England, the Bank of Japan, the U.S. Federal Reserve and the European Central Bank. Empirical evidence from the previous period of quantitative easing in Japan between 2001 and 2006 is presented. During this earlier period the Bank of Japan was able to expand the monetary base very quickly and significantly. Quantitative easing translated into a greater and more lasting expansion of M1 relative to nominal GDP. Deflation subsided by 2005. As soon as inflation appeared to stabilize near a rate of zero, the Bank of Japan rapidly reduced the monetary base as a share of nominal income as it had announced in 2001. The Bank was able to exit from extensive quantitative easing within less than a year. Some implications for the current situation in Europe and the United States are discussed.
In the New-Keynesian model, optimal interest rate policy under uncertainty is formulated without reference to monetary aggregates as long as certain standard assumptions on the distributions of unobservables are satisfied. The model has been criticized for failing to explain common trends in money growth and inflation, and that therefore money should be used as a cross-check in policy formulation (see Lucas (2007)). We show that the New-Keynesian model can explain such trends if one allows for the possibility of persistent central bank misperceptions. Such misperceptions motivate the search for policies that include additional robustness checks. In earlier work, we proposed an interest rate rule that is near-optimal in normal times but includes a cross-check with monetary information. In case of unusual monetary trends, interest rates are adjusted. In this paper, we show in detail how to derive the appropriate magnitude of the interest rate adjustment following a significant cross-check with monetary information, when the New-Keynesian model is the central bank’s preferred model. The cross-check is shown to be effective in offsetting persistent deviations of inflation due to central bank misperceptions.
Keywords: Monetary Policy, New-Keynesian Model, Money, Quantity Theory, European Central Bank, Policy Under Uncertainty
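The cross-check mechanism can be sketched as follows: the baseline rule's rate is adjusted only when smoothed money growth deviates significantly from the rate consistent with the inflation target. The reference growth rate, threshold, smoothing window, and adjustment coefficient below are all invented for illustration, not the values derived in the paper.

```python
import numpy as np

# Monetary cross-check sketch: adjust the baseline policy rate only if
# filtered money growth drifts persistently from its reference value.
# All parameter values are illustrative placeholders.
def cross_check(baseline_rate, money_growth, target_growth=4.5,
                kappa=0.5, threshold=1.0, window=8):
    """Adjust the policy rate after a significant monetary deviation."""
    m = np.asarray(money_growth, dtype=float)
    smoothed = m[-window:].mean()               # filtered money growth
    gap = smoothed - target_growth
    if abs(gap) > threshold:                    # cross-check triggers
        return baseline_rate + kappa * gap
    return baseline_rate

# Money growth persistently above its reference value triggers an
# upward adjustment of the rate prescribed by the baseline rule.
rate = cross_check(3.0, [7.0] * 10)
print(rate)  # 4.25
```

In normal times the condition does not trigger and the near-optimal baseline rule applies unchanged, which is the design property the paper emphasizes.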
Opting out of the great inflation: German monetary policy after the breakdown of Bretton Woods
(2009)
During the turbulent 1970s and 1980s the Bundesbank established an outstanding reputation in the world of central banking. Germany achieved a high degree of domestic stability and provided a safe haven for investors in times of turmoil in the international financial system. Eventually, the Bundesbank provided the role model for the European Central Bank. Hence, we examine an episode of lasting importance in European monetary history. The purpose of this paper is to highlight how the Bundesbank's monetary policy strategy contributed to this success. We analyze the strategy as it was conceived, communicated and refined by the Bundesbank itself. We propose a theoretical framework (following Söderström, 2005) in which monetary targeting is interpreted, first and foremost, as a commitment device. In our setting, a monetary target helps anchor inflation and inflation expectations. We derive an interest rate rule and show empirically that it approximates the way the Bundesbank conducted monetary policy over the period 1975-1998. We compare the Bundesbank's monetary policy rule with those of the Fed and of the Bank of England. We find that the Bundesbank's policy reaction function was characterized by strong persistence of policy rates as well as a strong response to deviations of inflation from target and to the activity growth gap. In contrast, the response to the level of the output gap was not significant. In our empirical analysis we use real-time data, as available to policy-makers at the time.
JEL Classification: E31, E32, E41, E52, E58
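A reaction function with the features described above (strong rate persistence plus responses to the inflation gap and the activity growth gap) can be simulated and re-estimated in a few lines. All coefficients and series below are illustrative, not the paper's estimates from real-time data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a persistent policy-rate reaction function: partial
# adjustment toward a target that responds to the inflation gap and
# the activity growth gap. Coefficients are made up for illustration.
T = 200
infl_gap = rng.normal(0, 1, T)
growth_gap = rng.normal(0, 1, T)
i = np.zeros(T)
for t in range(1, T):
    i[t] = (0.85 * i[t - 1]
            + 0.15 * (1.5 * infl_gap[t] + 0.5 * growth_gap[t])
            + rng.normal(0, 0.05))

# Recover the persistence parameter by OLS on the simulated data.
X = np.column_stack([i[:-1], infl_gap[1:], growth_gap[1:]])
beta, *_ = np.linalg.lstsq(X, i[1:], rcond=None)
persistence = float(beta[0])
print(round(persistence, 2))
```

A coefficient near one on the lagged rate is what "strong persistence of policy rates" means empirically: the rate target is reached only gradually.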
This paper introduces adaptive learning and endogenous indexation in the New-Keynesian Phillips curve and studies disinflation under inflation targeting policies. The analysis is motivated by the disinflation performance of many inflation-targeting countries, in particular the gradual Chilean disinflation with temporary annual targets. At the start of the disinflation episode, price-setting firms expect inflation to be highly persistent and opt for backward-looking indexation. As the central bank acts to bring inflation under control, price-setting firms revise their estimates of the degree of persistence. Such adaptive learning lowers the cost of disinflation. This reduction can be exploited by a gradual approach to disinflation. Firms that choose the rate for indexation also re-assess the likelihood that announced inflation targets determine steady-state inflation and adjust indexation of contracts accordingly. A strategy of announcing and pursuing short-term targets for inflation is found to influence the likelihood that firms switch from backward-looking indexation to the central bank's targets. As firms abandon backward-looking indexation the costs of disinflation decline further. We show that an inflation targeting strategy that employs temporary targets can benefit from lower disinflation costs due to the reduction in backward-looking indexation.
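The learning step can be sketched with a constant-gain recursive least-squares update of firms' estimated inflation persistence: as realized inflation becomes less persistent during a disinflation, the estimated AR(1) coefficient declines. The gain, the initial belief, and the inflation path below are invented for the example.

```python
# Constant-gain recursive least-squares (RLS) learning of inflation
# persistence, a minimal sketch of the adaptive-learning mechanism.
# Gain, initial belief, and data are illustrative placeholders.
def update_persistence(rho_hat, R, pi_prev, pi_now, gain=0.05):
    """One constant-gain RLS update of the AR(1) coefficient."""
    R = R + gain * (pi_prev * pi_prev - R)
    rho_hat = rho_hat + gain * pi_prev * (pi_now - rho_hat * pi_prev) / R
    return rho_hat, R

# Firms start out believing inflation is highly persistent (rho = 0.95).
# Feeding in a disinflation path lowers the estimated persistence.
rho, R = 0.95, 1.0
path = [4.0, 2.4, 1.5, 0.9, 0.6, 0.4, 0.3, 0.2]   # declining inflation
for prev, now in zip(path[:-1], path[1:]):
    rho, R = update_persistence(rho, R, prev, now)
print(rho < 0.95)  # True
```

A lower estimated persistence feeds back into less backward-looking indexation, which is the channel through which learning lowers the cost of disinflation in the paper.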
Monetary policy analysts often rely on rules-of-thumb, such as the Taylor rule, to describe historical monetary policy decisions and to compare current policy to historical norms. Analysis along these lines also permits evaluation of episodes where policy may have deviated from a simple rule and examination of the reasons behind such deviations. One interesting question is whether such rules-of-thumb should draw on policymakers' forecasts of key variables such as inflation and unemployment or on observed outcomes. Importantly, deviations of policy from the prescriptions of a Taylor rule that relies on outcomes may be due to systematic responses to information captured in policymakers' own projections. We investigate this proposition in the context of FOMC policy decisions over the past 20 years using publicly available FOMC projections from the semi-annual monetary policy reports to the Congress (Humphrey-Hawkins reports). Our results indicate that FOMC decisions can indeed be predominantly explained in terms of the FOMC's own projections rather than observed outcomes. Thus, a forecast-based rule-of-thumb better characterizes FOMC decision-making. We also confirm that many of the apparent deviations of the federal funds rate from an outcome-based Taylor-style rule may be considered systematic responses to information contained in FOMC projections.
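The horse race between outcome-based and forecast-based rules can be illustrated on synthetic data: if the committee actually sets rates on its own forecasts, a rule fitted to forecasts explains the rate path better than one fitted to realized outcomes. All series, noise levels, and coefficients below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: the policy rate is set on the committee's inflation
# forecast, which differs from the realized outcome. Numbers invented.
n = 80
outcome = rng.normal(2.0, 1.0, n)                    # realized inflation
forecast = outcome + rng.normal(0, 0.8, n)           # committee forecast
rate = 1.0 + 1.5 * (forecast - 2.0) + rng.normal(0, 0.1, n)

def fit_rmse(x):
    """Root-mean-square error of an OLS fit of the rate on x."""
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
    resid = rate - X @ beta
    return float(np.sqrt((resid ** 2).mean()))

rmse_outcome, rmse_forecast = fit_rmse(outcome), fit_rmse(forecast)
print(rmse_forecast < rmse_outcome)  # True
```

The better in-sample fit of the forecast-based rule mirrors the paper's finding that FOMC decisions are predominantly explained by the committee's own projections rather than by observed outcomes.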