University Publications
Institute for Monetary and Financial Stability (IMFS): 204 publications with fulltext
We analyze the repercussions of different kinds of uncertainty on cash demand, including uncertainty about digital infrastructures, confidence crises in the financial system, natural disasters, political uncertainty, and inflationary crises. Based on a comprehensive literature survey and theoretical considerations, complemented by case studies, we derive a classification scheme for how cash holdings typically evolve in each of these types of uncertainty, separating demand for domestic and international cash as well as transaction and store-of-value balances. In doing so, we focus on the stabilizing macroeconomic properties of cash and recommend guidelines for cash supply by central banks and the banking system. Finally, we illustrate our analysis with five case studies from the developing world, namely Venezuela, Zimbabwe, Afghanistan, Iraq, and Libya.
In the euro area, monetary policy is conducted by a single central bank for 20 member countries. However, countries are heterogeneous in their economic development, including their inflation rates. This paper combines a New Keynesian model and a neural network to assess whether the European Central Bank (ECB) conducted monetary policy between 2002 and 2022 according to the weighted average of the inflation rates within the European Monetary Union (EMU) or reacted more strongly to the inflation rate developments of certain EMU countries.
The New Keynesian model first generates data that are used to train and evaluate several machine learning algorithms. The authors find that a neural network performs best out of sample. They use this algorithm to classify historical EMU data and to determine the exact weight placed on the inflation rate of each EMU member in each quarter of the past two decades. Their findings suggest a disproportionate emphasis by the ECB on the inflation rates of EMU members that exhibited high inflation rate volatility for the vast majority of the time frame considered (80%), with a median inflation weight of 67% on these countries. They show that these results stem from a tendency of the ECB to react more strongly to countries whose inflation rates deviate more from their long-term trend.
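The identification idea behind the paper can be illustrated with a much simpler sketch than the actual neural-network classifier: simulate a stylized Taylor rule in which the policy rate responds to a weighted average of two country groups' inflation rates, then recover the weight by regression. All numbers here (the response coefficient phi = 1.5, the true weight w = 0.7, the volatility asymmetry) are hypothetical, not taken from the paper.

```python
import numpy as np

# Illustrative sketch, not the paper's method: the paper trains a neural
# network on data simulated from a New Keynesian model. Here we only show
# the underlying identification idea with a stylized Taylor rule and OLS.
rng = np.random.default_rng(0)
T = 200
pi_a = rng.normal(2.0, 1.0, T)   # inflation, high-volatility country group
pi_b = rng.normal(2.0, 0.3, T)   # inflation, low-volatility country group
w_true, phi = 0.7, 1.5
i = phi * (w_true * pi_a + (1 - w_true) * pi_b) + rng.normal(0, 0.05, T)

# Regress the policy rate on both inflation series and recover the weight
X = np.column_stack([pi_a, pi_b])
b, *_ = np.linalg.lstsq(X, i, rcond=None)
w_hat = b[0] / (b[0] + b[1])     # implied weight on group A
print(round(w_hat, 2))
```

The neural network in the paper plays the same role as the regression here, but can recover the weight from nonlinear model-generated dynamics rather than a clean linear rule.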
Climate change has become one of the most prominent concerns globally. In this paper, the authors study the transition risk of greenhouse gas emission reduction in structural environmental-macroeconomic DSGE models. First, they analyze the uncertainty in model predictions of the effect of unanticipated and pre-announced carbon price increases. Second, they derive optimal model-robust policy in different settings. They find that reducing emissions by 40% causes an output loss of 0.7% to 4%, with an average of 2%. Pre-announcement of carbon prices significantly affects inflation dynamics. The central bank should react slightly less to inflation and output growth during the transition. With optimal carbon price designs, it should react even less to inflation, and more to output growth.
We assemble a data set of more than eight million German Twitter posts related to the war in Ukraine. Based on state-of-the-art methods of text analysis, we construct a daily index of uncertainty about the war as perceived by German Twitter. The approach also allows us to separate this index into uncertainty about sanctions against Russia, energy policy and other dimensions. We then estimate a VAR model with daily financial and macroeconomic data and identify an exogenous uncertainty shock. The increase in uncertainty has strong effects on financial markets and causes a significant decline in economic activity as well as an increase in expected inflation. We find the effects of uncertainty to be particularly strong in the first months of the war.
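The index construction can be sketched in miniature: flag each post that contains an uncertainty-related term and compute the daily share of flagged posts. The four example posts and the keyword list below are made-up placeholders; the paper's actual pipeline uses state-of-the-art text-analysis methods on more than eight million posts, not keyword matching.

```python
from collections import Counter
from datetime import date

# Hypothetical mini-corpus and keyword list, purely for illustration.
UNCERTAINTY_TERMS = {"unsicher", "uncertainty", "risiko", "risk"}

posts = [
    (date(2022, 2, 24), "große unsicherheit wegen des krieges"),
    (date(2022, 2, 24), "sanktionen gegen russland beschlossen"),
    (date(2022, 2, 25), "energiepreise steigen, risiko für die wirtschaft"),
    (date(2022, 2, 25), "fußball am wochenende"),
]

hits, totals = Counter(), Counter()
for day, text in posts:
    totals[day] += 1
    tokens = text.lower().split()
    if any(term in tok for term in UNCERTAINTY_TERMS for tok in tokens):
        hits[day] += 1

# Daily index: share of war-related posts flagged as expressing uncertainty
index = {day: hits[day] / totals[day] for day in sorted(totals)}
print(index)
```

Splitting the keyword set by topic (sanctions, energy policy, other) would give the sub-indices described above; the resulting daily series is what enters the VAR.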
Optimal monetary policy studies typically rely on a single structural model and identification of model-specific rules that minimize the unconditional volatilities of inflation and real activity. In their proposed approach, the authors take a large set of structural models and look for the model-robust rules that minimize the volatilities at those frequencies that policymakers are most interested in stabilizing. Compared to the status quo approach, their results suggest that policymakers should be more restrained in their inflation responses when their aim is to stabilize inflation and output growth at specific frequencies. Additional caution is called for due to model uncertainty.
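The notion of stabilizing "at specific frequencies" can be made concrete with a periodogram decomposition: a series' variance is split across frequencies, and the policy objective weights only a chosen band. The AR(1) test series and the business-cycle band of 6 to 32 quarters below are illustrative choices, not the paper's specification.

```python
import numpy as np

# Sketch of a frequency-specific volatility measure via the periodogram.
rng = np.random.default_rng(1)
T = 512
x = np.zeros(T)
for t in range(1, T):                 # persistent AR(1) process
    x[t] = 0.9 * x[t - 1] + rng.normal()
x -= x.mean()

freqs = np.fft.rfftfreq(T, d=1.0)     # cycles per quarter
pgram = np.abs(np.fft.rfft(x)) ** 2 / T
band = (freqs >= 1 / 32) & (freqs <= 1 / 6)   # periods of 6-32 quarters

total_var = pgram[1:].sum() / T       # excludes the zero frequency
band_var = pgram[band].sum() / T      # business-cycle contribution
share = band_var / total_var
print(round(share, 2))
```

A rule that minimizes `band_var` rather than `total_var` is, loosely, the frequency-targeted objective the authors pursue across their model set.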
I have assessed changes in the monetary policy stance in the euro area since its inception by applying a Bayesian time-varying parameter framework in conjunction with the Hamiltonian Monte Carlo algorithm. I find that the estimated policy response has varied considerably over time. Most of the results suggest that the response weakened after the onset of the financial crisis and while quantitative measures were still in place, although there are also indications that the weakening of the response to the expected inflation gap may have been less pronounced. I also find that the policy response has become more forceful over the course of the recent sharp rise in inflation. Furthermore, it is essential to model the stochastic volatility relating to deviations from the policy rule as it materially influences the results.
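The time variation in the policy response can be illustrated with a deliberately simplified setup: a single response coefficient that follows a random walk, recovered with a scalar Kalman filter. This is not the paper's Bayesian framework with Hamiltonian Monte Carlo and stochastic volatility; all variances and the data-generating process below are assumed for the sketch.

```python
import numpy as np

# Simulate a drifting policy-response coefficient and observed rates.
rng = np.random.default_rng(2)
T = 300
phi = np.cumsum(rng.normal(0, 0.02, T)) + 1.5   # slowly drifting response
pi_gap = rng.normal(0, 1.0, T)                   # expected inflation gap
i = phi * pi_gap + rng.normal(0, 0.2, T)         # observed policy rate

# Kalman filter for i_t = phi_t * pi_gap_t + eps,  phi_t = phi_{t-1} + eta
phi_hat, P = 1.5, 1.0
q, r = 0.02 ** 2, 0.2 ** 2                       # assumed noise variances
est = np.empty(T)
for t in range(T):
    P += q                                        # predict
    h = pi_gap[t]
    k = P * h / (h * P * h + r)                  # Kalman gain
    phi_hat += k * (i[t] - h * phi_hat)          # update
    P *= (1 - k * h)
    est[t] = phi_hat

rmse = np.sqrt(np.mean((est - phi) ** 2))
print(round(rmse, 3))
```

The filtered path `est` is the analog of the posterior coefficient path in the paper; the Bayesian HMC approach additionally delivers full uncertainty bands and handles the stochastic volatility the abstract stresses.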
This paper presents and compares Bernoulli iterative approaches for solving linear DSGE models. The methods are compared using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). I find that Bernoulli methods compare favorably to the QZ algorithm in solving DSGE models, providing similar accuracy as measured by the forward error of the solution at a comparable computational burden. The method can guarantee convergence to a particular, e.g., unique stable, solution and can be combined with other iterative methods, such as Newton's method, lending itself especially to refining solutions.
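The core object is the quadratic matrix equation A P² + B P + C = 0 that characterizes the recursive solution x_t = P x_{t-1} of a linear model A x_{t+1} + B x_t + C x_{t-1} = 0. A Bernoulli-type scheme iterates the rearranged fixed point (A P + B) P = -C. The 2×2 coefficients below are made up for illustration and happen to satisfy the convergence conditions; see the paper for when convergence to the stable solvent is guaranteed.

```python
import numpy as np

# Hypothetical coefficient matrices of a linear model
# A x_{t+1} + B x_t + C x_{t-1} = 0.
A = np.array([[0.3, 0.0], [0.1, 0.2]])
B = np.array([[-1.5, 0.2], [0.0, -1.4]])
C = np.array([[0.4, 0.0], [0.1, 0.3]])

P = np.zeros((2, 2))
for _ in range(200):
    # Fixed point: (A P + B) P = -C  =>  P = -(A P + B)^{-1} C
    P_next = -np.linalg.solve(A @ P + B, C)
    if np.max(np.abs(P_next - P)) < 1e-12:
        P = P_next
        break
    P = P_next

# Forward-style check: residual of the quadratic matrix equation
residual = np.max(np.abs(A @ P @ P + B @ P + C))
print(residual)
```

Starting from P = 0, the iteration converges linearly to the minimal (stable) solvent, whose eigenvalues lie inside the unit circle; refining this P with a Newton step is the kind of combination the abstract mentions.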
Fabo, Jančoková, Kempf, and Pástor (2021) show that papers written by central bank researchers find quantitative easing (QE) to be more effective than papers written by academics. Weale and Wieladek (2022) show that a subset of these results lose statistical significance when OLS regressions are replaced by regressions that downweight outliers. We examine those outliers and find no reason to downweight them. Most of them represent estimates from influential central bank papers published in respectable academic journals. For example, among the five papers finding the largest peak effect of QE on output, all five are published in high-quality journals (Journal of Monetary Economics, Journal of Money, Credit and Banking, and Applied Economics Letters), and their average number of citations is well over 200. Moreover, we show that these papers have supported policy communication by the world's leading central banks and shaped the public perception of the effectiveness of QE. New evidence based on quantile regressions further supports the results in Fabo et al. (2021).
Output gap revisions can be large even after many years. Real-time reliability tests might therefore be sensitive to the choice of the final output gap vintage that the real-time estimates are compared to. This is the case for the Federal Reserve’s output gap. When accounting for revisions in response to the global financial crisis in the final output gap, the improvement in real-time reliability since the mid-1990s is much smaller than found by Edge and Rudd (Review of Economics and Statistics, 2016, 98(4), 785-791). The negative bias of real-time estimates from the 1980s has disappeared, but the size of revisions continues to be as large as the output gap itself.
The authors systematically analyse how the real-time reliability assessment is affected by varying the final output gap vintage. They find that the largest changes are caused by output gap revisions after recessions. Economists revise their models in response to such events, leading to economically important revisions not only for the most recent years, but reaching back up to two decades. This might improve the understanding of past business cycle dynamics, but it decreases the reliability of real-time output gaps ex post.
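The reliability metrics at stake can be sketched with simulated data: compare one real-time series against two different "final" vintages and see how the bias and RMSE of revisions depend on the chosen benchmark. All series below are placeholders, not Federal Reserve data; the post-crisis level shift of -0.5 is an assumed illustration of a crisis-induced revision.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 120                                     # thirty years of quarters
true_gap = np.sin(np.linspace(0, 12, T))    # stylized cycle
real_time = true_gap + rng.normal(0, 0.8, T)
vintage_2007 = true_gap + rng.normal(0, 0.2, T)
vintage_post_crisis = true_gap - 0.5 + rng.normal(0, 0.2, T)  # crisis shift

def revision_stats(final, rt):
    """Mean (bias) and RMSE of revisions, final minus real-time."""
    rev = final - rt
    return rev.mean(), np.sqrt((rev ** 2).mean())

bias_a, rmse_a = revision_stats(vintage_2007, real_time)
bias_b, rmse_b = revision_stats(vintage_post_crisis, real_time)
print(round(bias_a, 2), round(bias_b, 2))
```

Against the pre-crisis vintage the real-time estimates look roughly unbiased; against the post-crisis vintage a sizable negative bias and larger RMSE appear, which is the sensitivity to the final-vintage choice that the paper documents.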
High-frequency changes in interest rates around FOMC announcements are an important tool for identifying the effects of monetary policy on asset prices and the macroeconomy. However, some recent studies have questioned both the exogeneity and the relevance of these monetary policy surprises as instruments, especially for estimating the macroeconomic effects of monetary policy shocks. For example, monetary policy surprises are correlated with macroeconomic and financial data that is publicly available prior to the FOMC announcement. The authors address these concerns in two ways: First, they expand the set of monetary policy announcements to include speeches by the Fed Chair, which essentially doubles the number and importance of announcements in their dataset. Second, they explain the predictability of the monetary policy surprises in terms of the "Fed response to news" channel of Bauer and Swanson (2021) and account for it by orthogonalizing the surprises with respect to macroeconomic and financial data. Their subsequent reassessment of the effects of monetary policy yields two key results: First, estimates of the high-frequency effects on financial markets are largely unchanged. Second, estimates of the macroeconomic effects of monetary policy are substantially larger and more significant than what most previous empirical studies have found.
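The orthogonalization step is, mechanically, a regression of the surprises on pre-announcement data, keeping the residual as the cleaned instrument. The data below are simulated placeholders with an assumed predictable "response to news" component; the paper's predictor set is of course far richer.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 250                                       # announcements incl. speeches
X = rng.normal(size=(n, 3))                   # pre-announcement predictors
fed_response = X @ np.array([0.3, -0.2, 0.1]) # predictable component
surprise = fed_response + rng.normal(0, 1.0, n)

# Orthogonalize: regress surprises on predictors, keep the residual
Z = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Z, surprise, rcond=None)
surprise_orth = surprise - Z @ beta

# By construction the residual is uncorrelated with the predictors
corr = np.corrcoef(surprise_orth, X[:, 0])[0, 1]
print(round(corr, 6))
```

`surprise_orth` is the kind of cleaned series that then enters the high-frequency and macroeconomic regressions in place of the raw surprises.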