CFS working paper series
https://gfk-cfs.de/working-papers/
466
Futures markets are a potentially valuable source of information about market expectations. Exploiting this information has proved difficult in practice, because the presence of a time-varying risk premium often renders the futures price a poor measure of the market expectation of the price of the underlying asset. Even though the expectation in principle may be recovered by adjusting the futures price by the estimated risk premium, a common problem in applied work is that there are as many measures of market expectations as there are estimates of the risk premium. We propose a general solution to this problem that allows us to uniquely pin down the best possible estimate of the market expectation for any set of risk premium estimates. We illustrate this approach by solving the long-standing problem of how to recover the market expectation of the price of crude oil. We provide a new measure of oil price expectations that is considerably more accurate than the alternatives and more economically plausible. We discuss implications of our analysis for the estimation of economic models of energy-intensive durables, for the debate on speculation in oil markets, and for oil price forecasting.
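As an illustration only (the abstract does not spell out the selection criterion, so the ranking rule below is an assumption), a minimal Python sketch of the adjustment idea: form a candidate expectation as the futures price net of each estimated risk premium and rank the candidates by their predictive accuracy for the realized price of the underlying.

    import numpy as np

    def best_expectation(futures, risk_premia, realized):
        # futures, realized: arrays of futures prices and realized spot
        # prices at maturity; risk_premia: dict of candidate risk-premium
        # series. Rank candidates by mean squared prediction error.
        mspe = {name: np.mean((futures - rp - realized) ** 2)
                for name, rp in risk_premia.items()}
        best = min(mspe, key=mspe.get)
        return futures - risk_premia[best], best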
452
This paper solves a dynamic model of households' mortgage decisions incorporating labor income, house price, inflation, and interest rate risk. It uses a zero-profit condition for mortgage lenders to solve for equilibrium mortgage rates given borrower characteristics and optimal decisions. The model quantifies the effects of adjustable vs. fixed mortgage rates, loan-to-value ratios, and mortgage affordability measures on mortgage premia and default. Heterogeneity in borrowers' labor income risk is important for explaining the higher default rates on adjustable-rate mortgages during the recent US housing downturn, and the variation in mortgage premia with the level of interest rates.
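To fix ideas, a one-period simplification of the zero-profit condition (the paper's model is dynamic; the payoff structure below is an illustrative assumption): the lender's expected return on the mortgage must equal the risk-free return, which pins down the equilibrium mortgage rate by bisection.

    def equilibrium_rate(default_prob, recovery, risk_free,
                         lo=0.0, hi=0.25, tol=1e-8):
        # Zero-profit condition: (1 - p)(1 + m) + p * recovery = 1 + rf.
        # Expected lender profit is increasing in the mortgage rate m,
        # so bisection converges if profit(lo) < 0 < profit(hi).
        profit = lambda m: ((1 - default_prob) * (1 + m)
                            + default_prob * recovery - (1 + risk_free))
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if profit(mid) > 0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)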
483
Riley's (1979) reactive equilibrium concept addresses problems of equilibrium existence in competitive markets with adverse selection. The game-theoretic interpretation of the reactive equilibrium concept in Engers and Fernandez (1987) yields the Rothschild-Stiglitz (1976)/Riley (1979) allocation as an equilibrium allocation; however, multiple equilibria emerge. In this note we embed the reactive equilibrium's logic in a dynamic market context with active consumers. We show that the Riley/Rothschild-Stiglitz contracts constitute the unique equilibrium allocation in any pure-strategy subgame perfect Nash equilibrium.
482
Advertising arbitrage
(2014)
Speculators often advertise arbitrage opportunities in order to persuade other investors and thus accelerate the correction of mispricing. We show that, in order to minimize the risk and the cost of arbitrage, an investor who identifies several mispriced assets optimally advertises only one of them and overweights it in his portfolio; a risk-neutral arbitrageur invests only in this asset. The choice of the asset to be advertised depends not only on mispricing but also on its "advertisability" and the accuracy of future news about it. When several arbitrageurs identify the same arbitrage opportunities, their decisions are strategic complements: they invest in the same asset and advertise it. Multiple equilibria may then arise, some of which are inefficient: arbitrageurs may correct small mispricings while failing to eliminate large ones. Finally, prices react more strongly to the ads of arbitrageurs with a successful track record, and reputation-building induces high-skill arbitrageurs to advertise more than others.
494
This chapter analyzes the risk and return characteristics of investments in artists from the Middle East and Northern Africa (MENA) region over the sample period 2000 to 2012. Using hedonic regression modeling, we create an annual index based on 3,544 paintings created by 663 MENA artists. Our empirical results show that investing in such a hypothetical index would have provided strong financial returns. While the results show exponential growth in sales since 2006, the geometric annual return of the MENA art index is a stable 13.9 percent over the whole period. We conclude that investing in MENA paintings would have been profitable, but we also note that we examined the performance of an emerging art market that has so far seen only an upward trend without any correction.
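The index construction can be sketched as follows (column names and covariates are hypothetical; the paper's hedonic specification is richer): regress log sale prices on painting characteristics plus sale-year dummies, and read the index off the exponentiated year coefficients.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def hedonic_index(df):
        # df columns assumed: price, area, year (illustrative only).
        X = pd.get_dummies(df["year"], prefix="y", drop_first=True).astype(float)
        X["log_area"] = np.log(df["area"])          # example characteristic
        X = sm.add_constant(X)
        fit = sm.OLS(np.log(df["price"]), X).fit()
        years = [c for c in X.columns if c.startswith("y_")]
        index = np.exp(fit.params[years])           # base year = 1
        n = len(index)
        geo_return = index.iloc[-1] ** (1 / n) - 1  # geometric annual return
        return index, geo_return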
454
We analyze the risk premium on bank bonds at origination with a special focus on the role of implicit and explicit public guarantees and the systemic relevance of the issuing institutions. Looking at the asset swap spread on 5,500 bonds, we find that explicit guarantees and sovereign creditworthiness have a substantial effect on the risk premium. In addition, while large institutions still enjoy lower issuance costs linked to the too-big-to-fail (TBTF) framework, we find evidence of enhanced market discipline for systemically important banks, which have faced an increased premium on bond placements since the onset of the financial crisis.
480
Consumption-based asset pricing with rare disaster risk: a simulated method of moments approach
(2014)
The rare disaster hypothesis suggests that the extraordinarily high postwar U.S. equity premium emerged because investors ex ante demanded compensation for unlikely but calamitous risks that they happened not to incur. Although convincing in theory, empirical tests of the rare disaster explanation are scarce. We estimate a disaster-including consumption-based asset pricing model (CBM) using a combination of the simulated method of moments and bootstrapping. We consider several methodological alternatives that differ in the moment matches and in the way disasters are accounted for in the simulated consumption growth and return series. Whichever specification is used, the estimated preference parameters are of an economically plausible size, and the estimation precision is much higher than in previous studies that use the canonical CBM. Our results thus provide empirical support for the rare disaster hypothesis and help clarify the nexus between the real economy and financial markets implied by the consumption-based asset pricing paradigm.
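A stylized sketch of the SMM step in Python (the moments and the disaster process below are illustrative assumptions, not the paper's specification): simulate consumption growth with rare disasters and minimize the weighted distance between simulated and empirical moments.

    import numpy as np

    def simulate_growth(theta, T, rng):
        # Log consumption growth: Gaussian shocks plus rare disasters
        # with probability p and (hypothetical) jump size b.
        mu, sigma, p, b = theta
        g = mu + sigma * rng.standard_normal(T)
        return g - b * (rng.random(T) < p)

    def smm_objective(theta, data_moments, W, T_sim=100_000, seed=0):
        # Fixed seed: common random numbers keep the objective smooth.
        rng = np.random.default_rng(seed)
        g = simulate_growth(theta, T_sim, rng)
        m = np.array([g.mean(), g.std(),
                      ((g - g.mean()) ** 3).mean() / g.std() ** 3])
        d = m - data_moments
        return d @ W @ d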
462
We examine the effects of credit default swaps (CDS), a major type of over-the-counter derivative, on the corporate liquidity management of the reference firms. CDS help firms to access the credit market since lenders can hedge their credit risk more easily using these contracts. However, CDS-protected creditors can be tougher in debt renegotiations and less willing to support distressed borrowers, causing some firms to become more cautious. Consequently, we find that firms hold significantly more cash after the inception of CDS trading on their debt. The increase in cash holdings by CDS firms is more pronounced for financially constrained firms and firms facing higher refinancing risk. Moreover, bank relationships and outstanding credit facilities intensify the CDS effect on cash holdings. Finally, firms with greater financial expertise hold more cash when their debt is referenced by CDS. These findings suggest that CDS, which are primarily a risk management tool for lenders, induce firms to adopt more conservative liquidity policies.
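The core finding could be checked with a two-way fixed-effects regression of the following form (variable names are hypothetical; the paper's identification strategy may be more involved):

    import statsmodels.formula.api as smf

    def cds_cash_regression(panel):
        # panel: firm-year DataFrame with columns cash_to_assets,
        # cds_traded (1 after CDS inception on the firm's debt),
        # firm_id, year. Firm and year effects absorb level differences;
        # standard errors are clustered at the firm level.
        model = smf.ols("cash_to_assets ~ cds_traded + C(firm_id) + C(year)",
                        data=panel)
        return model.fit(cov_type="cluster",
                         cov_kwds={"groups": panel["firm_id"]})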
481
Has economic research been helpful in dealing with the financial crises of the early 2000s? On the whole, the answer is negative, although there are bright spots. Economists largely failed to predict both crises, mainly because most of them were not analytically equipped to understand them, in spite of their recurrence over the last 25 years. In the pre-crisis period, however, there were important exceptions: theoretical and empirical strands of research that largely laid the basis for our current thinking about financial crises. Since 2008, a flurry of new studies has offered several different interpretations of the US crisis: to some extent they point to potentially complementary factors, but they disagree on their relative importance, and therefore on policy recommendations. Research on the euro debt crisis has so far been much more limited: even Europe-based researchers, including CEPR ones, have often directed their attention more to the US crisis than to the one occurring on their doorstep. In terms of impact on policy and regulatory reform, the record is uneven. On the one hand, the swift and massive liquidity provision by central banks in the wake of both crises is, at least partly, to be credited to previous research on the role of central banks as lenders of last resort and on the real effects of bank lending and monetary policy. On the other hand, economists have had limited impact on the reform of prudential and securities market regulation. In part, this is due to their neglect of important regulatory choices, which policy-makers are therefore left to make without the guidance of academic research-based analysis.
484
Most simulated micro-founded macro models use only consumer-demand aggregates to estimate deep economy-wide preference parameters, which are useful for policy evaluation. The demand-aggregation properties that this approach requires should be easy to disprove empirically: since household consumption choices differ for households with more members, aggregation can be rejected if appropriate data violate an affine equation describing how much individuals benefit from within-household sharing of goods. We develop a survey method that tests the validity of this equation without imposing utility-estimation restrictions via models. Surprisingly, in six countries, this equation is not rejected, lending support to the use of consumer-demand aggregates.
450
We propose an iterative procedure to efficiently estimate models with complex log-likelihood functions and a potentially large number of parameters relative to the number of observations. Given consistent but inefficient estimates of sub-vectors of the parameter vector, the procedure yields computationally tractable, consistent and asymptotically efficient estimates of all parameters. We show asymptotic normality and derive the estimator's asymptotic covariance as a function of the number of iteration steps. To mitigate the curse of dimensionality in highly parameterized models, we combine the procedure with a penalization approach that yields sparsity and reduces model complexity. The estimator's small-sample properties are illustrated for two time series models in a simulation study. In an empirical application, we use the proposed method to estimate the connectedness between companies by extending the approach of Diebold and Yilmaz (2014) to a high-dimensional non-Gaussian setting.
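A minimal sketch of the block-wise iteration (the update rule and penalty below are assumptions for illustration; the paper derives the efficient version): cycle over sub-vectors of the parameter vector, re-optimizing the full log-likelihood in one block while holding the others fixed, optionally with an L1 penalty to induce sparsity.

    import numpy as np
    from scipy.optimize import minimize

    def iterative_mle(neg_loglik, theta0, blocks, n_iter=5, lam=0.0):
        # neg_loglik: negative log-likelihood of the full parameter
        # vector; blocks: list of index arrays partitioning it.
        theta = np.asarray(theta0, dtype=float)
        for _ in range(n_iter):
            for idx in blocks:
                def obj(sub, idx=idx):
                    full = theta.copy()
                    full[idx] = sub
                    return neg_loglik(full) + lam * np.abs(sub).sum()
                theta[idx] = minimize(obj, theta[idx],
                                      method="Nelder-Mead").x
        return theta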
495
Emotions-at-risk: an experimental investigation into emotions, option prices and risk perception
(2014)
This paper experimentally investigates how emotions are associated with option prices and risk perception. Using a binary lottery, we find evidence that the emotion ‘surprise’ plays a significant role in the negative correlation between lottery returns and estimates of the price of a put option. Our findings shed new light on various existing theories on emotions and affect. We find gratitude, admiration, and joy to be positively associated with risk perception, although the affect heuristic predicts a negative association. In contrast with the predictions of the appraisal tendency framework (ATF), we document a negative correlation between option price and surprise for lottery winners. Finally, the results show that the option price is not associated with risk perception as commonly used in psychology.
477
We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process that is subject to noise and non-synchronous observations. The estimator is constructed from a local average of block-wise parametric spectral covariance estimates. The latter originate from the local method of moments (LMM) recently introduced by Bibinger et al. (2014). We extend the LMM estimator to allow for autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. We prove the consistency and asymptotic normality of the proposed spot covariance estimator. Based on extensive simulations, we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data on a cross-section of NASDAQ blue chip stocks. Employing the estimator to estimate spot covariances, correlations and betas in normal as well as extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously when new information arrives.
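The local-averaging idea can be caricatured in a few lines (this naive version ignores the noise and asynchronicity corrections that are the paper's actual contribution): compute realized covariances block by block, then smooth over neighbouring blocks.

    import numpy as np

    def spot_cov(returns, block_len, n_blocks):
        # returns: (T, d) array of synchronized log returns.
        T, d = returns.shape
        blocks = [returns[i:i + block_len]
                  for i in range(0, T - block_len + 1, block_len)]
        rc = np.array([b.T @ b for b in blocks])    # block realized cov
        half = n_blocks // 2
        return np.array([rc[max(0, k - half):k + half + 1].mean(axis=0)
                         for k in range(len(rc))])  # local averages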
470
This chapter discusses whether and how 'new quantitative trade models' (NQTMs) can be fruitfully applied to quantify the welfare effects of trade liberalization, thus shedding light on the trade-related effects of further European integration. On the one hand, it argues that NQTMs indeed have the potential to supplement traditional 'computable general equilibrium' (CGE) analysis, thanks to their tight connection between theory and data, appealing micro-theoretical foundations, and enhanced attention to the estimation of structural parameters. On the other hand, further work is still needed to fully exploit this potential.
463
We develop a methodology to identify and rank "systemically important financial institutions" (SIFIs). Our approach is consistent with that followed by the Financial Stability Board (FSB) but, unlike the latter, it is free of judgment and based entirely on publicly available data, thus filling the gap between the official views of the regulator and those that market participants can form with their own information set. We apply the methodology to annual data on three samples of banks (global, EU and euro area) for the years 2007-2012. We examine the evolution of the SIFIs over time and document the shifts in the relative weights of the major geographic areas. We also discuss the implications of the 2013 update of the identification methodology proposed by the FSB.
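A stylized version of the underlying scoring approach (patterned on the BCBS indicator-based method; the paper's exact implementation may differ): express each indicator as the bank's share of the sample total, average indicators within each category, and weight the categories equally.

    import pandas as pd

    def sifi_scores(indicators, category_of):
        # indicators: banks x indicators DataFrame; category_of maps
        # each indicator name to one of the five FSB categories (size,
        # interconnectedness, substitutability, complexity,
        # cross-jurisdictional activity).
        shares = indicators / indicators.sum()         # share of sample total
        cats = shares.T.groupby(category_of).mean().T  # bank x category
        return cats.mean(axis=1).sort_values(ascending=False)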
485
We study a model where some investors ("hedgers") are bad at information processing, while others ("speculators") have superior information-processing ability and trade purely to exploit it. The disclosure of financial information induces a trade externality: if speculators refrain from trading, hedgers do the same, depressing the asset price. Market transparency reinforces this mechanism, by making speculators' trades more visible to hedgers. As a consequence, issuers will oppose both the disclosure of fundamentals and trading transparency. Issuers may either under- or over-provide information compared to the socially efficient level if speculators have more bargaining power than hedgers, while they never under-provide it otherwise. When hedgers have low financial literacy, forbidding their access to the market may be socially efficient.
489
US data and new stockholding data from fifteen European countries and China exhibit a common pattern: stockholding shares increase in household income and wealth. Yet this pattern presents a multitude of numbers for models to match. Using a single utility function across households (parsimony), we suggest a strategy for fitting stockholding numbers while also replicating that saving rates increase in wealth. The key is introducing subsistence consumption into an Epstein-Zin-Weil utility function, creating endogenous risk-aversion differences between rich and poor. A closed-form solution for the model with insurable labor-income risk serves as a calibration guide for numerical simulations with uninsurable labor-income risk.
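One standard way to introduce subsistence consumption into Epstein-Zin-Weil preferences (the paper's exact specification may differ) is to replace consumption c_t by c_t - \bar{c} in the recursion:

    V_t = \Big[ (1-\beta)\,(c_t - \bar{c})^{1-1/\psi}
          + \beta \big( \mathbb{E}_t \big[ V_{t+1}^{1-\gamma} \big] \big)^{\frac{1-1/\psi}{1-\gamma}} \Big]^{\frac{1}{1-1/\psi}}

With subsistence level \bar{c} > 0, effective relative risk aversion over consumption becomes \gamma c_t / (c_t - \bar{c}), so households close to subsistence are endogenously more risk averse than wealthy ones, which is the mechanism behind the rising stockholding shares.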
479
The long-run consumption risk (LRR) model is a promising approach to resolving prominent asset pricing puzzles. The simulated method of moments (SMM) provides a natural framework to estimate its deep parameters, but caveats concern model solvability and weak identification. We propose a two-step estimation strategy that combines GMM and SMM, for which we elicit informative macroeconomic and financial moment matches from the LRR model structure. In particular, we exploit the persistent serial correlation of consumption and dividend growth and the equilibrium conditions for the market return and the risk-free rate, as well as the model-implied predictability of the risk-free rate. We match analytical moments when possible and simulated moments when necessary, and we determine the crucial factors required for both identification and reasonable estimation precision. A simulation study, the first in the context of long-run risk modeling, delineates the pitfalls associated with SMM estimation of a non-linear dynamic asset pricing model. Our study provides a blueprint for successful estimation of the LRR model.
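Schematically, the two-step strategy can be written as follows (notation is mine, not the paper's):

    \hat{\theta}_1 = \arg\min_{\theta_1} \; g_T(\theta_1)' \, W_1 \, g_T(\theta_1)
    \hat{\theta}_2 = \arg\min_{\theta_2} \; [\bar{m}_T - \tilde{m}_S(\theta_2; \hat{\theta}_1)]' \, W_2 \, [\bar{m}_T - \tilde{m}_S(\theta_2; \hat{\theta}_1)]

where g_T collects the analytical moment conditions handled by GMM in the first step, and \bar{m}_T and \tilde{m}_S(\cdot) are the empirical and simulated moments matched by SMM in the second step, conditional on the first-step estimates.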
473
In this paper we argue that very high marginal labor income tax rates are an effective tool for social insurance even when households have preferences with high labor supply elasticity, make dynamic savings decisions, and policies have general equilibrium effects. To make this point we construct a large-scale overlapping generations model with uninsurable labor productivity risk, show that it has a wealth distribution that matches the data well, and then use it to characterize fiscal policies that achieve a desired degree of redistribution in society. We find that marginal tax rates on the top 1% of the earnings distribution of close to 90% are optimal. We document that this result is robust to plausible variation in the labor supply elasticity and holds regardless of whether social welfare is measured at the steady state only or includes transitional generations.
490
How much additional tax revenue can the government generate by increasing labor income taxes? In this paper we provide a quantitative answer to this question and study the importance of the progressivity of the tax schedule for the government's ability to generate tax revenues. We develop a rich overlapping generations model featuring an explicit family structure, extensive and intensive margins of labor supply, endogenous accumulation of labor market experience, and standard intertemporal consumption-savings choices in the presence of uninsurable idiosyncratic labor productivity risk. We calibrate the model to US macro, micro and tax data and characterize the labor income tax Laffer curve both under the current progressivity of the labor income tax code and as progressivity varies. We find that more progressive labor income taxes significantly reduce tax revenues. For the US, converting to a flat tax code raises the peak of the Laffer curve by 6%, whereas converting to a tax system with progressivity similar to Denmark's would lower the peak by 7%. We also show that, relative to a representative-agent economy, tax revenues in our economy are less sensitive to the progressivity of the tax code. This finding reflects the fact that the labor supply of two-earner households is less elastic (along the intensive margin) and that the endogenous accumulation of labor market experience makes female labor supply less elastic (along the extensive margin) to changes in tax progressivity.
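As a purely illustrative sketch of how such a Laffer curve can be traced out (the tax function below is the common two-parameter form T(y) = y - lam * y^(1-tau); the static model and numbers here are far simpler than the paper's):

    import numpy as np

    def laffer_curve(scales, tau=0.18, eps=0.3, n=10_000):
        # Workers with log-normal wages choose hours with constant
        # Frisch elasticity eps under net income lam * (w*h)**(1-tau);
        # the closed-form labor supply is
        # h = [lam*(1-tau)*w**(1-tau)]**(eps/(1+eps*tau)).
        wages = np.exp(np.random.default_rng(1).normal(0.0, 0.5, n))
        revenues = []
        for lam in scales:          # lower lam = heavier taxation
            h = (lam * (1 - tau) * wages ** (1 - tau)) ** (eps / (1 + eps * tau))
            y = wages * h
            revenues.append(np.sum(y - lam * y ** (1 - tau)))
        return np.array(revenues)

Scanning lam traces out the revenue curve; setting tau = 0 delivers the flat-tax benchmark, and raising tau mimics a move toward Danish-style progressivity.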