How special are they? - Targeting systemic risk by regulating shadow banking (October 5, 2014)
(2014)
This essay argues that at least some of the financial stability concerns associated with shadow banking can be addressed by an approach to financial regulation that imports its functional foundations more vigorously into the interpretation and implementation of existing rules. It shows that the general policy goals of prudential banking regulation remain constant over time despite dramatic transformations in the financial and technological landscape. Moreover, these overarching policy goals also legitimize intervention in the shadow banking sector. On these grounds, this essay encourages a more normative construction of available rules that potentially limits both the scope for regulatory arbitrage and the need for ever more rapid updates and a constant increase in the complexity of the regulatory framework. By tying the regulatory treatment of financial innovation closely to existing prudential rules and their underlying policy rationales, the proposed approach potentially ends the socially wasteful race between hare and tortoise that characterizes the relation between regulators and a highly dynamic industry. In doing so, it does not generally hamper market participants’ efficient discoveries where disintermediation proves socially beneficial. Instead, it only weeds out rent-seeking circumventions of existing rules and standards.
Advertising arbitrage
(2014)
Speculators often advertise arbitrage opportunities in order to persuade other investors and thus accelerate the correction of mispricing. We show that in order to minimize the risk and the cost of arbitrage, an investor who identifies several mispriced assets optimally advertises only one of them, and overweights it in his portfolio; a risk-neutral arbitrageur invests only in this asset. The choice of the asset to be advertised depends not only on mispricing but also on its "advertisability" and accuracy of future news about it. When several arbitrageurs identify the same arbitrage opportunities, their decisions are strategic complements: they invest in the same asset and advertise it. Then, multiple equilibria may arise, some of which are inefficient: arbitrageurs may correct small mispricings while failing to eliminate large ones. Finally, prices react more strongly to the ads of arbitrageurs with a successful track record, and reputation-building induces high-skill arbitrageurs to advertise more than others.
Has economic research been helpful in dealing with the financial crises of the early 2000s? On the whole, the answer is negative, although there are bright spots. Economists largely failed to predict both crises, mainly because most of them were not analytically equipped to understand them, in spite of their recurrence in the last 25 years. In the pre-crisis period, however, there have been important exceptions – theoretical and empirical strands of research that largely laid out the basis for our current thinking about financial crises. Since 2008, a flurry of new studies offered several different interpretations of the US crisis: to some extent, they point to potentially complementary factors, but disagree on their relative importance, and therefore on policy recommendations. Research on the euro debt crisis has so far been much more limited: even Europe-based researchers – including CEPR ones – have often directed their attention more to the US crisis than to that occurring on their doorstep. In terms of impact on policy and regulatory reform, the record is uneven. On the one hand, the swift and massive liquidity provision by central banks in the wake of both crises is, at least partly, to be credited to previous research on the role of central banks as lenders of last resort in crises and on the real effects of bank lending and monetary policy. On the other hand, economists have had limited impact on the reform of prudential and security market regulation. In part, this is due to their neglect of important regulatory choices, which policy-makers are therefore left to take without the guidance of academic research-based analysis.
Consumption-based asset pricing with rare disaster risk : a simulated method of moments approach
(2014)
The rare disaster hypothesis suggests that the extraordinarily high postwar U.S. equity premium resulted because investors ex ante demanded compensation for unlikely but calamitous risks that they happened not to incur. Although convincing in theory, empirical tests of the rare disaster explanation are scarce. We estimate a disaster-including consumption-based asset pricing model (CBM) using a combination of the simulated method of moments and bootstrapping. We consider several methodological alternatives that differ in the moment matches and the way to account for disasters in the simulated consumption growth and return series. Whichever specification is used, the estimated preference parameters are of an economically plausible size, and the estimation precision is much higher than in previous studies that use the canonical CBM. Our results thus provide empirical support for the rare disaster hypothesis, and help reconcile the nexus between the real economy and financial markets implied by the consumption-based asset pricing paradigm.
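As a rough illustration of the simulated method of moments the abstract relies on, the sketch below estimates the mean and volatility of a toy disaster-prone consumption-growth process by matching simulated to data moments over a parameter grid. The disaster probability and size, the moment choices, and all numbers are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def simulate_growth(mu, sigma, n, rng, p_disaster=0.017, disaster_size=0.3):
    """Toy consumption-growth process with rare disasters (illustrative)."""
    g = rng.normal(mu, sigma, n)
    g[rng.random(n) < p_disaster] -= disaster_size
    return g

def smm_distance(mu, sigma, data_moments, n_sim, seed=0):
    """Squared distance between simulated and data moments (identity weights).
    Re-seeding each call gives common random numbers across parameter values,
    so the objective is smooth in the parameters."""
    g = simulate_growth(mu, sigma, n_sim, np.random.default_rng(seed))
    diff = np.array([g.mean(), g.std()]) - data_moments
    return diff @ diff

data_moments = np.array([0.02, 0.06])   # illustrative mean/std of log growth
# crude grid search; a real estimator would use a proper optimizer
grid = [(m, s) for m in np.linspace(0.0, 0.05, 26)
               for s in np.linspace(0.02, 0.10, 17)]
mu_hat, sigma_hat = min(grid, key=lambda p: smm_distance(*p, data_moments, 20_000))
```

Note how the point estimate of the "normal-times" mean exceeds the matched mean of 0.02: part of average growth is eaten by simulated disasters, which is the mechanism the rare disaster literature exploits.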
The long-run consumption risk (LRR) model is a promising approach to resolve prominent asset pricing puzzles. The simulated method of moments (SMM) provides a natural framework to estimate its deep parameters, but caveats concern model solvability and weak identification. We propose a two-step estimation strategy that combines GMM and SMM, and for which we elicit informative macroeconomic and financial moment matches from the LRR model structure. In particular, we exploit the persistent serial correlation of consumption and dividend growth and the equilibrium conditions for market return and risk-free rate, as well as the model-implied predictability of the risk-free rate. We match analytical moments when possible and simulated moments when necessary and determine the crucial factors required for both identification and reasonable estimation precision. A simulation study – the first in the context of long-run risk modeling – delineates the pitfalls associated with SMM estimation of a non-linear dynamic asset pricing model. Our study provides a blueprint for successful estimation of the LRR model.
The predictive likelihood is of particular relevance in a Bayesian setting when the purpose is to rank models in a forecast comparison exercise. This paper discusses how the predictive likelihood can be estimated for any subset of the observable variables in linear Gaussian state-space models with Bayesian methods, and proposes to utilize a missing observations consistent Kalman filter in the process of achieving this objective. As an empirical application, we analyze euro area data and compare the density forecast performance of a DSGE model to DSGE-VARs and reduced-form linear Gaussian models.
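A minimal sketch of the missing-observations-consistent Kalman filtering idea, for a univariate local-level model rather than the paper's DSGE state-space setting: periods whose observation is missing (NaN) contribute no likelihood term and receive only the prediction step. The model and all parameter values are illustrative assumptions.

```python
import numpy as np

def kalman_loglik(y, q, r, a0=0.0, p0=10.0):
    """Log likelihood of a local-level model
        y_t = a_t + e_t,  e_t ~ N(0, r)
        a_t = a_{t-1} + w_t,  w_t ~ N(0, q),
    where NaN entries of y are treated as missing observations."""
    a, p, ll = a0, p0, 0.0
    for obs in y:
        p = p + q                        # prediction step
        if np.isnan(obs):
            continue                     # missing: skip the update step
        f = p + r                        # prediction-error variance
        v = obs - a                      # prediction error
        ll -= 0.5 * (np.log(2.0 * np.pi * f) + v * v / f)
        k = p / f                        # Kalman gain
        a = a + k * v                    # updated state mean
        p = p * (1.0 - k)                # updated state variance
    return ll

y = np.array([0.9, 1.1, np.nan, 1.0, np.nan, 1.2])   # two missing periods
ll = kalman_loglik(y, q=0.05, r=0.1)
```

Because the filter simply propagates uncertainty through missing periods, the same routine evaluates the likelihood of any subset of observables, which is the property the paper exploits for predictive-likelihood estimation.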
We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process which is subject to noise and non-synchronous observations. The estimator is constructed based on a local average of block-wise parametric spectral covariance estimates. The latter originate from a local method of moments (LMM) which recently has been introduced by Bibinger et al. (2014). We extend the LMM estimator to allow for autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. We prove the consistency and asymptotic normality of the proposed spot covariance estimator. Based on extensive simulations we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data of a cross-section of NASDAQ blue chip stocks. Employing the estimator to estimate spot covariances, correlations and betas in normal but also extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously if new information arrives.
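The local-averaging idea behind the spot covariance estimator can be illustrated with a toy version that averages outer products of returns in a window around the target time. Unlike the LMM-based estimator described above, this stand-in ignores microstructure noise and asynchronous observations; everything below is an illustrative assumption, not the proposed estimator.

```python
import numpy as np

def spot_covariance(returns, t, window):
    """Flat-kernel local average of return outer products around index t.
    Toy stand-in: assumes synchronous, noise-free returns."""
    lo = max(0, t - window // 2)
    hi = min(len(returns), lo + window)
    block = returns[lo:hi]
    return block.T @ block / len(block)

rng = np.random.default_rng(1)
true_cov = np.array([[1.0, 0.4], [0.4, 1.0]]) * 1e-4   # two assets, corr 0.4
returns = rng.multivariate_normal([0.0, 0.0], true_cov, size=2000)
cov_hat = spot_covariance(returns, t=1000, window=400)
corr_hat = cov_hat[0, 1] / np.sqrt(cov_hat[0, 0] * cov_hat[1, 1])
```

Shrinking the window localizes the estimate in time but raises its variance; the block-wise spectral construction in the text is one way to improve this trade-off in the presence of noise.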
This paper studies the use of performance pricing (PP) provisions in debt contracts and compares accounting-based with rating-based pricing designs. We find that rating-based provisions are used by volatile-growth borrowers and allow for stronger spread increases over the credit period. Accounting-based provisions are employed by opaque-growth borrowers and stipulate stronger spread reductions. Further, a higher spread-increase potential in rating-based contracts lowers the spread at the loan’s inception and improves the borrower’s performance later on. In contrast, a higher spread-decrease potential in accounting-based contracts lowers the initial spread and raises the borrower’s leverage afterwards. The evidence indicates that rating-based contracts are indeed employed for different reasons than accounting-based contracts: the former to signal a borrower’s quality, the latter to mitigate investment inefficiencies.
This paper examines the effect of imperfect labor market competition on the efficiency of compensation schemes in a setting with moral hazard, private information and risk-averse agents. Two vertically differentiated firms compete for agents by offering contracts with fixed and variable payments. Vertical differentiation between firms leads to endogenous, type-dependent exit options for agents. In contrast to screening models with perfect competition, we find that existence of equilibria does not depend on whether the least-cost separating allocation is interim efficient. Rather, vertical differentiation allows the inferior firm to offer (cross-)subsidizing fixed payments even above the interim efficient level. We further show that the efficiency of variable pay depends on the degree of competition for agents: For small degrees of competition, low-ability agents are under-incentivized and exert too little effort. For large degrees of competition, high-ability agents are over-incentivized and bear too much risk. For intermediate degrees of competition, however, contracts are second-best despite private information.
We analyze the differential impact of domestic and foreign monetary policy on the local supply of bank credit in domestic and foreign currencies. We exploit a novel, supervisory dataset from Hungary that records all bank lending to firms including its currency denomination. Accounting for time-varying firm-specific heterogeneity in loan demand, we find that a lower domestic interest rate expands the supply of credit in the domestic but not in the foreign currency. A lower foreign interest rate, on the other hand, expands lending by weakly capitalized banks, relative to highly capitalized ones, more in the foreign than in the domestic currency.
In this paper we argue that very high marginal labor income tax rates are an effective tool for social insurance even when households have preferences with high labor supply elasticity, make dynamic savings decisions, and policies have general equilibrium effects. To make this point we construct a large scale Overlapping Generations Model with uninsurable labor productivity risk, show that it has a wealth distribution that matches the data well, and then use it to characterize fiscal policies that achieve a desired degree of redistribution in society. We find that marginal tax rates on the top 1% of the earnings distribution of close to 90% are optimal. We document that this result is robust to plausible variation in the labor supply elasticity and holds regardless of whether social welfare is measured at the steady state only or includes transitional generations.
What would be the economic effects of the UK leaving the European Union on living standards of British people? We focus on the effects of trade on welfare net of lower fiscal transfers to the EU. We use a standard quantitative static general equilibrium trade model with multiple sectors, countries and intermediates, as in Costinot and Rodriguez-Clare (2013). Static losses range between 1.13% and 3.09% of GDP, depending on the assumptions used in our counterfactual scenarios. Including dynamic effects could more than double such losses.
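The class of quantitative trade models referenced here admits, in the one-sector case, the Arkolakis-Costinot-Rodriguez-Clare sufficient-statistic formula for welfare changes. The sketch below applies it with made-up numbers; the shares and elasticity are illustrative assumptions, not the paper's counterfactual scenarios.

```python
def acr_welfare_change(lambda_hat, epsilon):
    """ACR sufficient-statistic formula: proportional welfare change
    W'/W = (lambda'/lambda) ** (-1/epsilon), where lambda is the domestic
    expenditure share and epsilon the trade elasticity (one-sector case;
    the multi-sector model with intermediates in the text is richer)."""
    return lambda_hat ** (-1.0 / epsilon)

# illustrative: if reduced trade raised the domestic expenditure share
# by 10% and the trade elasticity were 5,
w_hat = acr_welfare_change(1.10, 5.0)
welfare_loss_pct = (1.0 - w_hat) * 100.0   # about 1.9% of real income
```

The formula makes clear why the paper's loss estimates hinge on how much trade shares move in each counterfactual scenario and on the assumed trade elasticity.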
We use data from the 2009 Internet Survey of the Health and Retirement Study to examine the consumption impact of wealth shocks and unemployment during the Great Recession in the US. We find that many households experienced large capital losses in housing and in their financial portfolios, and that a non-trivial fraction of respondents have lost their job. As a consequence of these shocks, many households substantially reduced their expenditures. We estimate that the marginal propensities to consume with respect to housing and financial wealth are 1 and 3.3 percentage points, respectively. In addition, those who became unemployed reduced spending by 10 percent. We also distinguish the effect of perceived transitory and permanent wealth shocks, splitting the sample between households who think that the stock market is likely to recover in a year’s time, and those who do not. In line with the predictions of standard models of intertemporal choice, we find that the latter group adjusted its spending in response to financial wealth shocks much more than the former.
This chapter discusses whether and how 'new quantitative trade models' (NQTMs) can be fruitfully applied to quantify the welfare effects of trade liberalization, thus shedding light on the trade-related effects of further European integration. On the one hand, it argues that NQTMs have indeed the potential of being used to supplement traditional 'computable general equilibrium' (CGE) analysis thanks to their tight connection between theory and data, appealing micro-theoretical foundations, and enhanced attention to the estimation of structural parameters. On the other hand, further work is still needed in order to fully exploit such potential.
Especially in developing countries, credit constraints are often perceived as one of the most important market frictions constraining firm innovation and growth. Huge amounts of public money are being devoted to the removal of such constraints but their effectiveness is still subject to an intense policy debate. This paper contributes to this debate by analysing the effects of the Brazilian Development Bank (BNDES) loans. It finds that, before receiving BNDES support, granted firms are indeed more credit constrained than comparable non-granted firms. It also finds that BNDES support allows granted firms to achieve the same level of performance as similar non-granted firms that are not credit constrained. However, it does not allow granted firms to outperform similar non-granted ones.
We develop a model of an order-driven exchange competing for order flow with off-exchange trading mechanisms. Liquidity suppliers face a trade-off between benefits and costs of order exposure. If they display trading intentions, they attract additional trade demand. We show that, in equilibrium, hiding trade intentions can induce mis-coordination between liquidity supply and demand, generate excess price fluctuations and harm price efficiency. Econometric high-frequency analysis based on unique data on hidden orders from NASDAQ reveals strong empirical support for these predictions: We find abnormal reactions in prices and order flow after periods of high excess-supply of hidden liquidity.
We propose a framework for estimating network-driven time-varying systemic risk contributions that is applicable to a high-dimensional financial system. Tail risk dependencies and contributions are estimated based on a penalized two-stage fixed-effects quantile approach, which explicitly links bank interconnectedness to systemic risk contributions. The framework is applied to a system of 51 large European banks and 17 sovereigns through the period 2006 to 2013, utilizing both equity and CDS prices. We provide new evidence on how banking sector fragmentation and sovereign-bank linkages evolved over the European sovereign debt crisis and how they are reflected in network statistics and systemic risk measures. Illustrating the usefulness of the framework as a monitoring tool, we provide evidence that the fragmentation of the European financial system has peaked and that recovery has started.
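Tail-risk dependence in frameworks like this is estimated with quantile methods, which minimize the check (pinball) loss. The toy below recovers an unconditional 5% quantile that way; it is far simpler than the penalized two-stage fixed-effects estimator described above, and all inputs are simulated rather than bank data.

```python
import numpy as np

def pinball_loss(y, c, tau):
    """Check (pinball) loss of a constant predictor c at quantile level tau."""
    u = y - c
    return np.mean(np.where(u >= 0.0, tau * u, (tau - 1.0) * u))

def grid_quantile(y, tau, grid):
    """tau-quantile estimated by minimizing the pinball loss over a grid."""
    return min(grid, key=lambda c: pinball_loss(y, c, tau))

rng = np.random.default_rng(3)
returns = rng.standard_normal(20_000)        # stand-in for equity returns
q_5pct = grid_quantile(returns, 0.05, np.linspace(-3.0, 0.0, 301))
```

Replacing the constant predictor with a regression on other institutions' returns turns this loss into the conditional-quantile (CoVaR-style) machinery that links interconnectedness to systemic risk contributions.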
Futures markets are a potentially valuable source of information about market expectations. Exploiting this information has proved difficult in practice, because the presence of a time-varying risk premium often renders the futures price a poor measure of the market expectation of the price of the underlying asset. Even though the expectation in principle may be recovered by adjusting the futures price by the estimated risk premium, a common problem in applied work is that there are as many measures of market expectations as there are estimates of the risk premium. We propose a general solution to this problem that allows us to uniquely pin down the best possible estimate of the market expectation for any set of risk premium estimates. We illustrate this approach by solving the long-standing problem of how to recover the market expectation of the price of crude oil. We provide a new measure of oil price expectations that is considerably more accurate than the alternatives and more economically plausible. We discuss implications of our analysis for the estimation of economic models of energy-intensive durables, for the debate on speculation in oil markets, and for oil price forecasting.
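The multiplicity problem the abstract describes can be made concrete with back-of-the-envelope arithmetic: adjusting the futures price by each candidate risk-premium estimate yields a different measure of the market expectation. All numbers below are invented for illustration.

```python
# Illustrative only: expectation = futures price - risk premium,
# so each candidate premium estimate implies a different expectation.
futures_price = 80.0                          # hypothetical $/barrel
risk_premium_estimates = [-4.0, -1.5, 2.0]    # three candidate estimates

expectations = [futures_price - rp for rp in risk_premium_estimates]
# -> one implied market expectation per risk-premium estimate
```

With three premium estimates there are three implied expectations; the paper's contribution is a criterion that pins down a unique best estimate from any such set.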
This paper shows that a capital budgeting process in which the division manager is required to engage in personally costly influence activities prior to a project approval has beneficial incentive effects: It provides the manager with incentives to acquire costly information about project prospects and helps to elicit the revelation of the acquired information. As a consequence, imposing influence costs on the manager can lead to improved capital allocations. The optimal level of influence costs, chosen by the firm, trades off ex ante incentives for information acquisition against efficient use of the acquired information ex post.
We analyze the effect of committee formation on how corporate boards perform two main functions: setting CEO pay and overseeing the financial reporting process. The use of performance-based pay schemes induces the CEO to manipulate earnings, which leads to an increased need for board oversight. If the whole board is responsible for both functions, it is inclined to provide the CEO with a compensation scheme that is relatively insensitive to performance in order to reduce the burden of subsequent monitoring. When the functions are separated through the formation of committees, the compensation committee is willing to choose a higher pay-performance sensitivity as the increased cost of oversight is borne by the audit committee. Our model generates predictions relating the board committee structure to the pay-performance sensitivity of CEO compensation, the quality of board oversight, and the level of earnings management.