This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve these problems by dynamic programming, in contrast to the existing literature, which applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth as a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant in itself, but it also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
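The contingent-claim construction above has a familiar special case: with a hard floor on terminal wealth, the constrained payoff equals the unconstrained wealth plus a put struck at the floor (classic portfolio-insurance logic). The following one-line sketch illustrates only that special case; the floor value is a placeholder, and the paper itself treats more general probability and expected-shortfall constraints.

```python
def constrained_terminal_wealth(w_unconstrained, floor):
    """Hard-floor special case of viewing constrained wealth as a
    contingent claim on unconstrained wealth:
    max(w, floor) = w + max(floor - w, 0), i.e. wealth plus a put."""
    return max(w_unconstrained, floor)
```

The identity on the right-hand side of the comment is what makes the "contingent claim on the unconstrained wealth" reading concrete in this simple case.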
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides that aim to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
We develop a dynamic network model with heterogeneous banks which make optimizing portfolio decisions subject to liquidity and capital constraints and trade in the interbank market, whose equilibrium is governed by a tatonnement process. Due to the micro-founded structure of the decision process as well as the iterative dynamic adjustment taking place in the market, the links in the network are endogenous and evolve dynamically. We use the model to assess the diffusion of systemic risk (measured as default probability), the contribution of each bank to it, as well as the evolution of the network in response to financial shocks and across different prudential policy regimes.
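A tatonnement process of the kind invoked here iteratively adjusts the market price (or interbank rate) in the direction of excess demand until the market approximately clears. A generic sketch, not the paper's specific implementation; the excess-demand function, starting price and step size are placeholders:

```python
def tatonnement(excess_demand, p0=1.0, step=0.1, tol=1e-8, max_iter=10_000):
    """Generic tatonnement iteration: raise the price when excess demand
    is positive, lower it when negative, stop when the market clears."""
    p = p0
    for _ in range(max_iter):
        z = excess_demand(p)
        if abs(z) < tol:
            break
        p += step * z
    return p
```

For instance, with the linear excess-demand function `lambda p: 2.0 - p` the iteration converges to the clearing price 2.0.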
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks thereby fostering a comparative instead of insular approach to model development.
We introduce a new measure of systemic risk, the change in the conditional joint probability of default, which assesses the effects of the interdependence in the financial system on the general default risk of sovereign debtors. We apply our measure to examine the fragility of the European financial system during the ongoing sovereign debt crisis. Our analysis documents an increase in systemic risk contributions in the euro area during the post-Lehman global recession and especially after the beginning of the euro area sovereign debt crisis. We also find a considerable potential for cascade effects from small to large euro area sovereigns. When we investigate the effect of sovereign default on the European Union banking system, we find that bigger banks, banks with riskier activities, with poor asset quality, and funding and liquidity constraints tend to be more vulnerable to a sovereign default. Surprisingly, an increase in leverage does not seem to influence systemic vulnerability.
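The quantity underlying such a measure is a change in a conditional joint default probability: how much likelier a joint default of the remaining entities becomes once one entity is observed to default. As a toy illustration only (the paper's measure is estimated from market data, not from this model), the following Monte Carlo sketch uses a one-factor Gaussian copula with made-up marginal default probabilities:

```python
import math
import random
from statistics import NormalDist

def conditional_joint_default_change(pd_marginal, rho, trigger,
                                     n_sims=100_000, seed=0):
    """Toy one-factor Gaussian-copula sketch (NOT the paper's estimator):
    change in the joint default probability of all *other* entities when
    entity `trigger` defaults, relative to the unconditional level."""
    rng = random.Random(seed)
    nd = NormalDist()
    thresholds = [nd.inv_cdf(p) for p in pd_marginal]  # latent thresholds
    a, b = math.sqrt(rho), math.sqrt(1.0 - rho)
    joint_count = 0        # all others default (unconditional)
    trig_count = 0         # trigger entity defaults
    joint_given_trig = 0   # all others default AND trigger defaults
    for _ in range(n_sims):
        m = rng.gauss(0, 1)  # common systematic factor
        latent = [a * m + b * rng.gauss(0, 1) for _ in pd_marginal]
        defaults = [x < t for x, t in zip(latent, thresholds)]
        others_all = all(d for i, d in enumerate(defaults) if i != trigger)
        joint_count += others_all
        if defaults[trigger]:
            trig_count += 1
            joint_given_trig += others_all
    p_uncond = joint_count / n_sims
    p_cond = joint_given_trig / max(trig_count, 1)
    return p_cond - p_uncond
```

With positive factor correlation the conditional joint default probability exceeds the unconditional one, which is the interdependence effect the measure is designed to capture.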
We outline a procedure for consistent estimation of marginal and joint default risk in the euro area financial system. We interpret the latter risk as the intrinsic financial system fragility and derive several systemic fragility indicators for euro area banks and sovereigns, based on CDS prices. Our analysis documents that although the fragility of the euro area banking system had started to deteriorate before Lehman Brothers' bankruptcy filing, investors did not expect the crisis to affect euro area sovereigns' solvency until September 2008. Since then, and especially after November 2009, joint sovereign default risk has outpaced the rise of systemic risk within the banking system.
This paper presents a theory that explains why it is beneficial for banks to engage in circular lending activities on the interbank market. Using a simple network structure, it shows that if there is a non-zero bailout probability, banks can significantly increase the expected repayment of uninsured creditors by entering into cyclical liabilities on the interbank market before investing in loan portfolios. Therefore, banks are better able to attract funds from uninsured creditors. Our results show that implicit government guarantees incentivize banks to have large interbank exposures, to be highly interconnected, and to invest in highly correlated, risky portfolios. This can serve as an explanation for the observed high interconnectedness between banks and their investment behavior in the run-up to the subprime mortgage crisis.
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed’s Greenbook projections. Point forecasts of some models are of similar accuracy as the forecasts of nonstructural large dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections for medium term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
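The "simple average of DSGE model forecasts" is equal-weight forecast pooling, whose accuracy can be checked against realized outcomes with a root mean squared error. A minimal sketch with hypothetical forecast sequences (the data here are made up, not the paper's vintages):

```python
def combine_forecasts(model_forecasts):
    """Equal-weight average across models; `model_forecasts` is a list of
    equally long point-forecast sequences, one per model."""
    n_models = len(model_forecasts)
    horizon = len(model_forecasts[0])
    return [sum(f[t] for f in model_forecasts) / n_models
            for t in range(horizon)]

def rmse(forecasts, outcomes):
    """Root mean squared forecast error against realized outcomes."""
    return (sum((f - y) ** 2 for f, y in zip(forecasts, outcomes))
            / len(outcomes)) ** 0.5
```

For example, pooling the hypothetical sequences `[1.0, 2.0]` and `[3.0, 4.0]` yields `[2.0, 3.0]`; comparing the pooled RMSE with each model's own RMSE is the kind of exercise behind the weighted-versus-individual comparison in the abstract.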
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact, focusing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
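The consolidation path described above gradually returns the outlays-to-GDP ratio from over 24 percent to the pre-crisis 19.5 percent. As a purely arithmetic sketch, the following computes a linear glide path; the 10-year horizon and the linear shape are illustrative assumptions, not taken from the paper:

```python
def consolidation_path(start=24.0, target=19.5, years=10):
    """Linear glide path for the spending-to-GDP ratio, in percent.
    Horizon and linearity are illustrative assumptions."""
    step = (start - target) / years
    return [round(start - step * t, 3) for t in range(years + 1)]
```

With the defaults this cuts the ratio by 0.45 percentage points per year, from 24.0 down to 19.5.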
We examine both the degree and the structural stability of inflation persistence at different quantiles of the conditional inflation distribution. Previous research focused exclusively on persistence at the conditional mean of the inflation rate. Economic theory, however, provides various reasons (for example, downward wage rigidities or menu costs) to expect higher inflation persistence at the upper than at the lower tail of the conditional inflation distribution.
Based on post-war US data we indeed find slower mean reversion in response to positive than to negative shocks. We find robust evidence for a structural break in persistence at all quantiles of the inflation process in the early 1980s. Inflation persistence has decreased and become more homogeneous across quantiles. Persistence at the conditional mean became more informative about the degree of persistence across the entire conditional inflation distribution. While prior to the 1980s inflation was not mean reverting in response to large positive shocks, our evidence strongly suggests that since the end of the Volcker disinflation the unit root can be rejected at every quantile including the upper tail of the conditional inflation distribution.
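Persistence at a given quantile can be estimated by quantile autoregression: minimizing the check (pinball) loss of an AR(1) residual at quantile tau. A crude grid-search sketch on synthetic data (not the authors' estimation code; the data-generating slope of 0.8 and all grid ranges are illustrative):

```python
import random

def check_loss(u, tau):
    # Pinball/check loss of quantile regression: u*(tau - 1{u<0}).
    return u * (tau - (1 if u < 0 else 0))

def qar1_slope(y, tau):
    """AR(1) slope at quantile tau via brute-force grid search over
    (intercept c, slope b), minimizing sum_t check_loss(y[t]-c-b*y[t-1])."""
    best_loss, best_b = float("inf"), None
    for bi in range(-25, 76):        # slopes b in [-0.5, 1.5], step 0.02
        b = bi * 0.02
        for ci in range(-40, 41):    # intercepts c in [-2, 2], step 0.05
            c = ci * 0.05
            loss = sum(check_loss(y[t] - c - b * y[t - 1], tau)
                       for t in range(1, len(y)))
            if loss < best_loss:
                best_loss, best_b = loss, b
    return best_b

# Synthetic AR(1) series with persistence 0.8 (illustration only).
rng = random.Random(1)
y = [0.0]
for _ in range(120):
    y.append(0.8 * y[-1] + rng.gauss(0, 0.1))
```

Comparing `qar1_slope(y, 0.1)`, `qar1_slope(y, 0.5)` and `qar1_slope(y, 0.9)` is the kind of exercise that reveals asymmetric persistence across the conditional distribution; in this symmetric synthetic example the estimates all sit near 0.8.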
This chapter aims to provide a hands-on approach to New Keynesian models and their uses for macroeconomic policy analysis. It starts by reviewing the origins of the New Keynesian approach, the key model ingredients and representative models. Building blocks of current-generation dynamic stochastic general equilibrium (DSGE) models are discussed in detail. These models address the famous Lucas critique by deriving behavioral equations systematically from the optimizing and forward-looking decision-making of households and firms subject to well-defined constraints. State-of-the-art methods for solving and estimating such models are reviewed and presented in examples. The chapter goes beyond the mere presentation of the most popular benchmark model by providing a framework for model comparison along with a database that includes a wide variety of macroeconomic models. Thus, it offers a convenient approach for comparing new models to available benchmarks and for investigating whether particular policy recommendations are robust to model uncertainty. Such robustness analysis is illustrated by evaluating the performance of simple monetary policy rules across a range of recently estimated models including some with financial market imperfections and by reviewing recent comparative findings regarding the magnitude of government spending multipliers. The chapter concludes with a discussion of important objectives for ongoing and future research using the New Keynesian framework.
This paper investigates the accuracy of forecasts from four DSGE models for inflation, output growth and the federal funds rate using a real-time dataset synchronized with the Fed’s Greenbook projections. Conditioning the model forecasts on the Greenbook nowcasts leads to forecasts that are as accurate as the Greenbook projections for output growth and the federal funds rate. Only for inflation are the model forecasts dominated by the Greenbook projections. A comparison with forecasts from Bayesian VARs shows that the economic structure of the DSGE models, which is useful for the interpretation of forecasts, does not lower the accuracy of forecasts. Combining forecasts of several DSGE models increases precision in comparison to individual model forecasts. Comparing density forecasts with the actual distribution of observations shows that DSGE models overestimate uncertainty around point forecasts.