CFS working paper series
https://gfk-cfs.de/working-papers/
489
US data and new stockholding data from fifteen European countries and China exhibit a common pattern: stockholding shares increase in household income and wealth. Yet models face a multitude of numbers to match. Using a single utility function across households (parsimony), we suggest a strategy for fitting stockholding numbers while also replicating that saving rates increase in wealth. The key is introducing subsistence consumption into an Epstein-Zin-Weil utility function, creating endogenous risk-aversion differences between rich and poor households. A closed-form solution for the model with insurable labor-income risk serves as a calibration guide for numerical simulations with uninsurable labor-income risk.
488
Using data from the US Health and Retirement Study, we study the causal effect of increased health insurance coverage through Medicare and the associated reduction in health-related background risk on financial risk-taking. Given the onset of Medicare at age 65, we identify our effect of interest using a regression discontinuity approach. We find that getting Medicare coverage induces stockholding for those with at least some college education, but not for their less-educated counterparts. Hence, our results indicate that a reduction in background risk induces financial risk-taking in individuals for whom informational and pecuniary stock market participation costs are relatively low.
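The regression-discontinuity idea behind this identification can be illustrated with a toy example: simulate a binary stockholding outcome whose probability jumps at the age-65 cutoff, then estimate the jump by fitting separate linear trends on each side of the cutoff within a bandwidth. Everything here (the 5-point jump, the bandwidth, the sample size) is hypothetical and purely illustrative, not taken from the paper:

```python
import numpy as np

def rd_estimate(running, outcome, cutoff=65.0, bandwidth=5.0):
    """Local linear regression-discontinuity estimate of the jump at the cutoff."""
    mask = np.abs(running - cutoff) <= bandwidth
    a = running[mask] - cutoff          # centered running variable (age - 65)
    y = outcome[mask]
    right = (a >= 0).astype(float)      # indicator: past the eligibility cutoff
    # Regress on an intercept, the cutoff dummy, and separate linear trends
    # on each side; the dummy's coefficient is the estimated discontinuity.
    X = np.column_stack([np.ones_like(a), right, a, a * right])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Hypothetical data: stockholding probability jumps by 5 points at age 65.
rng = np.random.default_rng(0)
age = rng.uniform(55.0, 75.0, 100_000)
p_hold = 0.20 + 0.05 * (age >= 65)
holds_stocks = (rng.random(age.size) < p_hold).astype(float)

jump = rd_estimate(age, holds_stocks)   # should be close to the true 0.05
```

In the paper's application the running variable is age and the discontinuity is Medicare eligibility; here the local linear fit on each side is the standard way to avoid conflating the jump with smooth age trends.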
487
This paper studies the effect of graduating from college on lifetime earnings. We develop a quantitative model of college choice with uncertain graduation. Departing from much of the literature, we model in detail how students progress through college. This allows us to parameterize the model using transcript data. College transcripts reveal substantial and persistent heterogeneity in students’ credit accumulation rates that are strongly related to graduation outcomes. From this data, the model infers a large ability gap between college graduates and high school graduates that accounts for 54% of the college lifetime earnings premium.
486
This paper distils three lessons for bank regulation from the experience of the 2009-12 euro-area financial crisis. First, it highlights the key role that banks' sovereign debt exposures have played in the feedback loop between bank and fiscal distress, and asks how the regulation of banks' sovereign exposures in the euro area should be changed to mitigate this feedback loop in the future. Second, it explores the relationship between European banks' forbearance of non-performing loans and the tendency of EU regulators to rescue rather than resolve distressed banks, and asks to what extent the new regulatory framework of the euro-area "banking union" can be expected to mitigate excessive forbearance and facilitate the resolution of insolvent banks. Finally, the paper highlights that capital requirements based on the ratio of Tier-1 capital to banks' risk-weighted assets were massively gamed by large banks, which engaged in various forms of regulatory arbitrage to minimize their capital charges while expanding leverage. This argues in favor of relying on a set of simpler and more robust indicators to determine banks' capital shortfall, such as book and market leverage ratios.
485
We study a model where some investors ("hedgers") are bad at information processing, while others ("speculators") have superior information-processing ability and trade purely to exploit it. The disclosure of financial information induces a trade externality: if speculators refrain from trading, hedgers do the same, depressing the asset price. Market transparency reinforces this mechanism, by making speculators' trades more visible to hedgers. As a consequence, issuers will oppose both the disclosure of fundamentals and trading transparency. Issuers may either under- or over-provide information compared to the socially efficient level if speculators have more bargaining power than hedgers, while they never under-provide it otherwise. When hedgers have low financial literacy, forbidding their access to the market may be socially efficient.
484
Most simulated micro-founded macro models use solely consumer-demand aggregates in order to estimate deep economy-wide preference parameters, which are useful for policy evaluation. The demand-aggregation properties that this approach requires should be easy to disprove empirically: since consumption choices differ for households with more members, aggregation can be rejected if appropriate data violate an affine equation governing how much individuals benefit from within-household sharing of goods. We develop a survey method that tests the validity of this equation without imposing model-based utility-estimation restrictions. Surprisingly, this equation is not rejected in six countries, lending support to the use of consumer-demand aggregates.
483
Riley's (1979) reactive equilibrium concept addresses problems of equilibrium existence in competitive markets with adverse selection. The game-theoretic interpretation of the reactive equilibrium concept in Engers and Fernandez (1987) yields the Rothschild-Stiglitz (1976)/Riley (1979) allocation as an equilibrium allocation, but multiple equilibria emerge. In this note we embed the reactive equilibrium's logic in a dynamic market context with active consumers. We show that the Riley/Rothschild-Stiglitz contracts constitute the unique equilibrium allocation in any pure-strategy subgame-perfect Nash equilibrium.
482
Advertising arbitrage
(2014)
Speculators often advertise arbitrage opportunities in order to persuade other investors and thus accelerate the correction of mispricing. We show that, in order to minimize the risk and the cost of arbitrage, an investor who identifies several mispriced assets optimally advertises only one of them and overweights it in his portfolio; a risk-neutral arbitrageur invests only in this asset. The choice of the asset to be advertised depends not only on its mispricing but also on its "advertisability" and on the accuracy of future news about it. When several arbitrageurs identify the same arbitrage opportunities, their decisions are strategic complements: they invest in the same asset and advertise it. Multiple equilibria may then arise, some of them inefficient: arbitrageurs may correct small mispricings while failing to eliminate large ones. Finally, prices react more strongly to the ads of arbitrageurs with a successful track record, and reputation-building induces high-skill arbitrageurs to advertise more than others.
481
Has economic research been helpful in dealing with the financial crises of the early 2000s? On the whole, the answer is negative, although there are bright spots. Economists largely failed to predict both crises, mainly because most of them were not analytically equipped to understand them, in spite of their recurrence over the last 25 years. In the pre-crisis period, however, there were important exceptions – theoretical and empirical strands of research that largely laid the basis for our current thinking about financial crises. Since 2008, a flurry of new studies has offered several different interpretations of the US crisis: to some extent, they point to potentially complementary factors, but disagree on their relative importance, and therefore on policy recommendations. Research on the euro debt crisis has so far been much more limited: even Europe-based researchers – including CEPR ones – have often directed their attention more to the US crisis than to the one unfolding on their doorstep. In terms of impact on policy and regulatory reform, the record is uneven. On the one hand, the swift and massive liquidity provision by central banks in the wake of both crises is, at least partly, to be credited to previous research on the role of central banks as lenders of last resort and on the real effects of bank lending and monetary policy. On the other hand, economists have had limited impact on the reform of prudential and securities market regulation. In part, this is due to their neglect of important regulatory choices, which policy-makers are therefore left to take without the guidance of academic research-based analysis.
480
Consumption-based asset pricing with rare disaster risk : a simulated method of moments approach
(2014)
The rare disaster hypothesis suggests that the extraordinarily high postwar U.S. equity premium arose because investors ex ante demanded compensation for unlikely but calamitous risks that they happened not to incur. Although convincing in theory, empirical tests of the rare disaster explanation are scarce. We estimate a disaster-including consumption-based asset pricing model (CBM) using a combination of the simulated method of moments and bootstrapping. We consider several methodological alternatives that differ in the moments matched and in the way disasters enter the simulated consumption growth and return series. Whichever specification is used, the estimated preference parameters are of economically plausible size, and the estimation precision is much higher than in previous studies that use the canonical CBM. Our results thus provide empirical support for the rare disaster hypothesis and help substantiate the nexus between the real economy and financial markets implied by the consumption-based asset pricing paradigm.
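The simulated-method-of-moments logic can be illustrated with a deliberately simple toy problem (not the paper's disaster-including model): estimate the mean and volatility of an i.i.d. normal consumption-growth process by choosing parameters so that moments of a simulated series match the observed ones. The numbers, the identity weighting matrix, and the grid search standing in for a proper optimizer are all hypothetical simplifications:

```python
import numpy as np

def smm_objective(theta, data_moments, sim_draws):
    """Squared distance between data moments and simulated moments."""
    mu, sigma = theta
    sims = mu + sigma * sim_draws               # fixed draws -> smooth objective
    sim_moments = np.array([sims.mean(), sims.var()])
    diff = sim_moments - data_moments
    return float(diff @ diff)                   # identity weighting matrix

rng = np.random.default_rng(1)
# Hypothetical "observed" growth series: i.i.d. normal, mu=0.02, sigma=0.03.
data = rng.normal(0.02, 0.03, 5_000)
data_moments = np.array([data.mean(), data.var()])
# Common random numbers, held fixed across all parameter evaluations.
sim_draws = rng.standard_normal(20_000)

# A coarse grid search stands in for a numerical optimizer.
candidates = ((smm_objective((m, s), data_moments, sim_draws), m, s)
              for m in np.linspace(0.0, 0.05, 51)
              for s in np.linspace(0.01, 0.06, 51))
_, mu_hat, sigma_hat = min(candidates)
```

Holding the simulation draws fixed across parameter evaluations is the standard SMM device that keeps the objective smooth in the parameters; bootstrapping, as in the paper, would additionally resample the data moments to gauge estimation uncertainty.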
479
The long-run consumption risk (LRR) model is a promising approach to resolve prominent asset pricing puzzles. The simulated method of moments (SMM) provides a natural framework to estimate its deep parameters, but caveats concern model solubility and weak identification. We propose a two-step estimation strategy that combines GMM and SMM, and for which we elicit informative macroeconomic and financial moment matches from the LRR model structure. In particular, we exploit the persistent serial correlation of consumption and dividend growth and the equilibrium conditions for market return and risk-free rate, as well as the model-implied predictability of the risk-free rate. We match analytical moments when possible and simulated moments when necessary and determine the crucial factors required for both identification and reasonable estimation precision. A simulation study – the first in the context of long-run risk modeling – delineates the pitfalls associated with SMM estimation of a non-linear dynamic asset pricing model. Our study provides a blueprint for successful estimation of the LRR model.
478
The predictive likelihood is of particular relevance in a Bayesian setting when the purpose is to rank models in a forecast comparison exercise. This paper discusses how the predictive likelihood can be estimated with Bayesian methods for any subset of the observable variables in linear Gaussian state-space models, and proposes to utilize a missing-observations-consistent Kalman filter in achieving this objective. As an empirical application, we analyze euro-area data and compare the density forecast performance of a DSGE model to DSGE-VARs and reduced-form linear Gaussian models.
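The missing-observations-consistent filtering idea can be sketched for the simplest possible case, a univariate local-level model: when an observation is missing, the filter carries the state prediction forward and skips the update, so the predictive log-likelihood accumulates only over the observed points. This is a minimal sketch under assumed unit variances, not the paper's multivariate state-space implementation:

```python
import math

def kalman_loglik(y, q=1.0, r=1.0):
    """Predictive log-likelihood for a local-level model:
    state  a_t = a_{t-1} + eta_t,  eta_t ~ N(0, q)
    obs    y_t = a_t + eps_t,      eps_t ~ N(0, r)
    Entries of y that are None are treated as missing: the prediction step
    still runs, but the update and the likelihood contribution are skipped."""
    a, p = 0.0, 1e6              # diffuse-ish initial state mean and variance
    loglik = 0.0
    for obs in y:
        p = p + q                # predict: state variance grows by q
        if obs is None:          # missing observation: no update, no term
            continue
        f = p + r                # innovation variance
        v = obs - a              # innovation (one-step-ahead forecast error)
        loglik += -0.5 * (math.log(2 * math.pi * f) + v * v / f)
        k = p / f                # Kalman gain
        a, p = a + k * v, (1 - k) * p
    return loglik
```

Evaluating only a subset of the observables amounts to marking the remaining series as missing, which is exactly why a filter that handles missing entries consistently yields the predictive likelihood for that subset.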
477
We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process which is subject to noise and non-synchronous observations. The estimator is constructed based on a local average of block-wise parametric spectral covariance estimates. The latter originate from a local method of moments (LMM) which recently has been introduced by Bibinger et al. (2014). We extend the LMM estimator to allow for autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. We prove the consistency and asymptotic normality of the proposed spot covariance estimator. Based on extensive simulations we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data of a cross-section of NASDAQ blue chip stocks. Employing the estimator to estimate spot covariances, correlations and betas in normal but also extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously if new information arrives.
476
This paper studies the use of performance pricing (PP) provisions in debt contracts and compares accounting-based with rating-based pricing designs. We find that rating-based provisions are used by volatile-growth borrowers and allow for stronger spread increases over the credit period. Accounting-based provisions are employed by opaque-growth borrowers and stipulate stronger spread reductions. Further, a higher spread-increase potential in rating-based contracts lowers the spread at the loan’s inception and improves the borrower’s performance later on. In contrast, a higher spread-decrease potential in accounting-based contracts lowers the initial spread and raises the borrower’s leverage afterwards. The evidence indicates that rating-based contracts are indeed employed for different reasons than accounting-based contracts: the former to signal a borrower’s quality, the latter to mitigate investment inefficiencies.
475
This paper examines the effect of imperfect labor market competition on the efficiency of compensation schemes in a setting with moral hazard, private information and risk-averse agents. Two vertically differentiated firms compete for agents by offering contracts with fixed and variable payments. Vertical differentiation between firms leads to endogenous, type-dependent exit options for agents. In contrast to screening models with perfect competition, we find that existence of equilibria does not depend on whether the least-cost separating allocation is interim efficient. Rather, vertical differentiation allows the inferior firm to offer (cross-)subsidizing fixed payments even above the interim efficient level. We further show that the efficiency of variable pay depends on the degree of competition for agents: For small degrees of competition, low-ability agents are under-incentivized and exert too little effort. For large degrees of competition, high-ability agents are over-incentivized and bear too much risk. For intermediate degrees of competition, however, contracts are second-best despite private information.
474
We analyze the differential impact of domestic and foreign monetary policy on the local supply of bank credit in domestic and foreign currencies. We exploit a novel supervisory dataset from Hungary that records all bank lending to firms, including its currency denomination. Accounting for time-varying firm-specific heterogeneity in loan demand, we find that a lower domestic interest rate expands the supply of credit in the domestic but not in the foreign currency. A lower foreign interest rate, on the other hand, expands lending relatively more in the foreign than in the domestic currency, and more so for less-capitalized than for highly capitalized banks.
473
In this paper we argue that very high marginal labor income tax rates are an effective tool for social insurance even when households have preferences with high labor supply elasticity, make dynamic savings decisions, and policies have general equilibrium effects. To make this point we construct a large-scale Overlapping Generations Model with uninsurable labor productivity risk, show that it has a wealth distribution that matches the data well, and then use it to characterize fiscal policies that achieve a desired degree of redistribution in society. We find that marginal tax rates on the top 1% of the earnings distribution of close to 90% are optimal. We document that this result is robust to plausible variation in the labor supply elasticity and holds regardless of whether social welfare is measured at the steady state only or includes transitional generations.
472
What would be the economic effects of the UK leaving the European Union on the living standards of British people? We focus on the effects of trade on welfare net of lower fiscal transfers to the EU. We use a standard quantitative static general equilibrium trade model with multiple sectors, countries and intermediates, as in Costinot and Rodriguez-Clare (2013). Static losses range between 1.13% and 3.09% of GDP, depending on the assumptions used in our counterfactual scenarios. Including dynamic effects could more than double such losses.
471
We use data from the 2009 Internet Survey of the Health and Retirement Study to examine the consumption impact of wealth shocks and unemployment during the Great Recession in the US. We find that many households experienced large capital losses in housing and in their financial portfolios, and that a non-trivial fraction of respondents lost their job. As a consequence of these shocks, many households substantially reduced their expenditures. We estimate that the marginal propensities to consume with respect to housing and financial wealth are 1 and 3.3 percentage points, respectively. In addition, those who became unemployed reduced spending by 10 percent. We also distinguish the effect of perceived transitory and permanent wealth shocks, splitting the sample between households who think that the stock market is likely to recover within a year and those who do not. In line with the predictions of standard models of intertemporal choice, we find that the latter group adjusted its spending much more than the former in response to financial wealth shocks.
470
This chapter discusses whether and how 'new quantitative trade models' (NQTMs) can be fruitfully applied to quantify the welfare effects of trade liberalization, thus shedding light on the trade-related effects of further European integration. On the one hand, it argues that NQTMs indeed have the potential to supplement traditional 'computable general equilibrium' (CGE) analysis, thanks to their tight connection between theory and data, appealing micro-theoretical foundations, and enhanced attention to the estimation of structural parameters. On the other hand, further work is still needed in order to fully exploit this potential.