The paper uses an example to illustrate the importance of consistency between the empirical measurement and the conceptual definition of variables in estimated macroeconomic models. Since standard New Keynesian models do not account for demographic trends and sectoral shifts, the authors propose adjusting the hours-worked-per-capita series used to estimate such models accordingly, to enhance the consistency between the data and the model. Without this adjustment, low-frequency shifts in hours lead to unreasonable trends in the output gap, caused by the close link between hours and the output gap in such models.
The retirement wave of baby boomers, for example, lowers U.S. aggregate hours per capita, which leads to erroneously permanent negative output gap estimates following the Great Recession. After correcting hours for changes in the age composition, the estimated output gap instead closes gradually in the years after the Great Recession.
While record-setting prices at art auctions receive headline news coverage, artists typically do not receive any direct proceeds from those sales. Early-stage creative work in any field is perennially difficult to value, but the valuation, reward, and incentivization of artistic labor are particularly fraught. A core challenge in studying the real return on artists’ work is the extreme difficulty of accessing data from when an artwork was first sold. Galleries keep private records that are difficult to access and to match to public auction results. This paper, for the first time, uses archivally sourced primary market records for the artists Jasper Johns and Robert Rauschenberg. Although this approach restricts the size of the data set, it yields much more accurate returns on art than typical regression and hedonic models. We find that if Johns and Rauschenberg had retained 10% equity in their work when it was first sold, the returns to them when the work was resold at auction would have outperformed the US S&P 500 by between 2 and 986 times. This work opens up policy recommendations with regard to secondary art market sales, entrepreneurial strategies using blockchain technology, and implications for how we compensate creative work.
This paper studies the distributional consequences of a systematic variation in expenditure shares and prices. Using European Union Household Budget Survey and Harmonized Index of Consumer Prices data, we construct household-specific price indices and reveal the existence of pro-rich inflation in Europe. In particular, over the period 2001-15, the consumption bundles of the poorest deciles in 25 European countries became, on average, 10.5 percentage points more expensive than those of the richest decile. We find that ignoring this differential inflation across the distribution underestimates the change in the Gini coefficient (based on consumption expenditure) by up to 0.03 points. Cross-country heterogeneity in this change is large enough to alter the inequality ranking of numerous countries. The average inflation effect we detect is almost as large as the change in the standard Gini measure over the period of interest.
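The arithmetic behind such household-specific price indices can be sketched with a minimal, purely illustrative example (the two-good setup, shares, and price changes below are hypothetical, not the paper's data): each household's inflation rate is the expenditure-share-weighted average of category price relatives.

```python
import numpy as np

# Hypothetical category price relatives over one period: food +30%, services +10%
price_relatives = np.array([1.30, 1.10])

# Hypothetical expenditure shares: poorer households spend more on food
shares_poor = np.array([0.6, 0.4])
shares_rich = np.array([0.2, 0.8])

def household_index(shares, relatives):
    """Laspeyres-type household price index: share-weighted price relatives."""
    return float(shares @ relatives)

inflation_poor = household_index(shares_poor, price_relatives) - 1.0  # 22%
inflation_rich = household_index(shares_rich, price_relatives) - 1.0  # 14%
print(round(inflation_poor - inflation_rich, 3))  # 0.08 -> pro-rich inflation gap
```

With these made-up numbers, the poorer household's bundle inflates faster, which is the mechanism behind the "pro-rich inflation" the paper documents.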
Digitalization expands the possibilities for corporations to reduce taxes, mainly, but not exclusively, by allowing improved planning of where profits can be shifted. Against this background, the European Commission and several countries are emphatically demanding and designing new tax instruments. However, a selective turning away from internationally accepted principles of international taxation will raise more questions than it solves. While there are good reasons to think about a fundamental regime switch in international corporate taxation, there are also good arguments against ad hoc measures that selectively target the relatively small market of Google and Facebook and raise only negligible tax revenues.
A number of recent studies have concluded that consumer spending patterns over the month are closely linked to the timing of income receipt. This correlation is interpreted as evidence of hyperbolic discounting. I re-examine patterns of spending in the diary sample of the U.S. Consumer Expenditure Survey, incorporating information on the timing of the main consumption commitment for most households - their monthly rent or mortgage payment. I find that non-durable and food spending increase by 30-48% on the day housing payments are made, with smaller increases in the days after. Moreover, households with weekly, biweekly, and monthly income streams but the same timing of rent/mortgage payments have very similar consumption patterns. Exploiting variation in income, I find that households with extra liquidity decrease non-durable spending around housing payments, especially those households with a large budget share of housing.
For purposes of private consumption, present and future goods are constantly being valued and traded. A reliable and comprehensive measure of the general purchasing power of money and its change should take this basic fact into account. In contrast to conventional statistical consumer price indices, an economic cost-of-living index is intertemporal in design, since it bundles the effective consumer goods prices (effective prices) over households' planning horizon. A price stability standard that ignores this relationship tends to be biased and encourages an asymmetric monetary policy.
Effective prices are present prices for future consumption; they take account of goods prices and interest rates or asset price changes, are grounded in consumption theory and welfare economics, and form the central building blocks of the class of economic cost-of-living indices. In utility-theoretic terms, effective prices are the valued marginal utility of the last unit of a good consumed, and the effective inflation rates derived from them are intertemporal marginal rates of substitution.
The authors develop an intertemporal cost-of-living index based on the concept of effective prices and present empirical time series and cohort-specific scenario analyses for Germany.
Germany Inc. was an idiosyncratic form of industrial organization that put financial institutions at the center. This paper argues that the consumption of private benefits in related party transactions by these key agents can be understood as compensation for their coordinating and monitoring function in Germany Inc. As a consequence, legal tools apt to curb tunneling remained weak in Germany from the perspective of outside shareholders. While banks were in a position to use their firm-level knowledge and influence to limit rent-seeking by other related parties, their own behavior was not subject to meaningful controls. With the dismantling of Germany Inc., banks ceased their monitoring function and left an unprecedented void with regard to related party transactions. Hence, a “traditionalist” stance which opposes law reform for related party transactions in Germany negatively affects capital market development, growth opportunities and ultimately social welfare.
This paper is the national report for Germany prepared for the 20th General Congress of the International Academy of Comparative Law 2018 and gives an overview of the regulation of crowdfunding in Germany and the typical design of crowdfunding campaigns under this legal framework. After a brief survey of market data, it delineates the classification of crowdfunding transactions in German contract law and their treatment under the applicable conflict of laws regime. It then turns to the relevant rules in prudential banking regulation and capital market law. It highlights disclosure requirements that flow from both the contractual obligations of the initiators of campaigns vis-à-vis contributors and securities regulation (prospectus regime). After sketching the most important duties of the parties involved in crowdfunding, the report also looks at the key features of the respective transactions’ tax treatment.
The recent sovereign debt crisis in the Eurozone was characterized by monetary policy constrained by the zero lower bound (ZLB) on nominal interest rates and by several countries facing high risk spreads on their sovereign bonds. How is the government spending multiplier affected by such an economic environment? While prominent results in the academic literature point to high government spending multipliers at the ZLB, higher public indebtedness is often associated with small government spending multipliers. I develop a DSGE model with leverage-constrained banks that captures both features of this economic environment, the ZLB and fiscal stress. In this model, I analyze the effects of government spending shocks. I find that not only are multipliers large at the ZLB, the presence of fiscal stress can even increase their size. For longer durations of the ZLB, multipliers in this model can be considerably larger than one.
JEL Classification: E32, E44, E62
Does the German three-pillar system fit into an increasingly harmonized European banking structure?
(2018)
For decades, the German banking system has rested on three pillars: the private credit banks, including the large shareholder-owned banks; the public banks; and the cooperative banks. Almost nowhere else in Europe has such a three-pillar system survived. Does it still fit into a Europe in which banking policy, regulation, and supervision now largely fall within the competence of the EU? The main arguments for preserving the system concern stability. Given their group affiliation, the German "stakeholder-value-oriented" banks of pillars 2 and 3 are financially by no means less successful, and indeed slightly more successful, than the "shareholder-value-oriented" large banks of pillar 1. In particular, their business figures fluctuate considerably less than those of the large banks, which generally pursue a riskier business model. In many private banks, the profit orientation, and with it the willingness to take high risks, is too high from a regulatory perspective, which tends to endanger systemic stability. In addition, the cooperative banks and savings banks fulfill a regional equalization function and have a stabilizing effect on the economy as a whole.
We develop a simple theoretical model to motivate testable hypotheses about how peer-to-peer (P2P) platforms compete with banks for loans. The model predicts that (i) P2P lending grows when some banks are faced with exogenously higher regulatory costs; (ii) P2P loans are riskier than bank loans; and (iii) the risk-adjusted interest rates on P2P loans are lower than those on bank loans. We confront these predictions with data on P2P lending and the consumer bank credit market in Germany and find empirical support. Overall, our analysis indicates that P2P lenders are bottom fishing when regulatory shocks create a competitive disadvantage for some banks.
Policymakers attach an important role to the macroeconomic outlook of households. Using a representative online panel from the U.S., the authors provide individuals with different professional forecasts about the likelihood of a recession and examine how individuals' macroeconomic expectations causally affect their personal economic prospects and their behavior. The authors find that groups with the largest exposure to aggregate risk, such as individuals working in cyclical industries, are most likely to respond to an improved macroeconomic outlook, while a large fraction of the population is unlikely to react.
We show that bond purchases undertaken in the context of quantitative easing efforts by the European Central Bank created a large mispricing between German and Italian government bonds and their respective futures contracts. On top of the direct effect of the buying pressure on bond prices, we show three indirect effects through which the scarcity of bonds resulting from the asset purchases drove a wedge between the futures contracts and the underlying bonds: the deterioration of bond market liquidity, the increased bond specialness in the repurchase agreement market, and the greater uncertainty about bond availability as collateral.
This paper investigates the effect of conventional and unconventional (e.g. quantitative easing, QE) monetary policy interventions on the insurance industry. We first analyze the impact of the last QE programme launched by the European Central Bank (ECB) on the stock performance of 166 (re)insurers by constructing an event study around the announcement date. We then enlarge the scope by looking at monetary policy surprise effects on the same sample of (re)insurers over a timeframe of 12 years, extending the analysis to the Credit Default Swap (CDS) market. In the second part of the paper, by building a set of balance-sheet-based indices, we identify the characteristics of (re)insurers that determine their sensitivity to monetary policy actions. Our evidence suggests that a single intervention extrapolated from the comprehensive strategy cannot be used to estimate the effect of monetary policy intervention on the market. With respect to the impact of monetary policies, we show how the effect of interventions changes over time. Expansionary monetary policy interventions that generated an instantaneous reduction of interest rates moved stock prices in the same direction until September 2010. This effect turned positive during the European sovereign debt crisis. However, the effect faded away in 2014-2015. The pattern is confirmed by the impact on the CDS market. As for the determinants of these effects, our analysis suggests that sensitivity is mainly driven by asset allocation, in particular by exposure to fixed-income assets.
The paper investigates the determinants of the idiosyncratic volatility puzzle by allowing linkages across asset returns. The first contribution of the paper is to show that portfolios sorted by increasing indegree, computed on a network based on pairwise Granger-causality tests, have lower expected returns that are not related to idiosyncratic volatility. Second, empirical evidence indicates that stocks with higher idiosyncratic volatility have lower exposure to the indegree risk factor.
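As a rough illustration of the sorting variable, here is a minimal numpy sketch that computes each stock's indegree from pairwise lag-1 Granger-causality F-tests. This is a simplified stand-in for the paper's procedure: the simulated returns, the single lag, and the critical value are assumptions for illustration only.

```python
import numpy as np

def granger_f_stat(x, y, lag=1):
    """F-statistic for H0: the lagged value of x does not help predict y."""
    T = len(y) - lag
    y_t = y[lag:]
    X_r = np.column_stack([np.ones(T), y[:-lag]])            # restricted: own lag only
    X_u = np.column_stack([np.ones(T), y[:-lag], x[:-lag]])  # unrestricted: add x's lag
    rss = lambda X: float(np.sum((y_t - X @ np.linalg.lstsq(X, y_t, rcond=None)[0]) ** 2))
    rss_r, rss_u = rss(X_r), rss(X_u)
    return (rss_r - rss_u) / (rss_u / (T - X_u.shape[1]))

def indegree(returns, crit=3.9):
    """For each stock, count how many other stocks Granger-cause it
    (crit is roughly the 5% critical value of F(1, large))."""
    n = returns.shape[1]
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j and granger_f_stat(returns[:, j], returns[:, i]) > crit:
                deg[i] += 1
    return deg

# Simulated returns: stock 1's lagged return drives stock 0,
# so stock 0's indegree should be at least 1
rng = np.random.default_rng(0)
r = rng.standard_normal((500, 4))
r[1:, 0] += 0.5 * r[:-1, 1]
deg = indegree(r)
print(deg[0] >= 1)
```

Stocks would then be sorted into portfolios by this indegree measure; a dedicated time-series package would be used for lag selection and proper inference in practice.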
In talent-intensive jobs, workers’ quality is revealed by their performance. This enhances productivity and earnings, but also increases layoff risk. Firms cannot insure workers against this risk if they compete fiercely for talent. In this case, the more risk-averse workers will choose less quality-revealing jobs. This lowers expected productivity and salaries. Public unemployment insurance corrects this inefficiency, enhancing employment in talent-sensitive industries, consistent with international evidence. Unemployment insurance dominates legal restrictions on firms’ dismissals, which penalize more talent-sensitive firms and thus depress expected productivity. Finally, unemployment insurance fosters education, by encouraging investment in risky human capital that enhances talent discovery.
What institutional arrangements for an independent central bank with a price stability mandate promote good policy outcomes when unconventional policies become necessary? Unconventional monetary policy poses challenges. The large scale asset purchases needed to counteract the zero lower bound on nominal interest rates have uncomfortable fiscal and distributional consequences and require central banks to assume greater risks on their balance sheets.
In his paper, Athanasios Orphanides draws lessons from the experience of the Bank of Japan (BoJ) since the late 1990s for the institutional design of independent central banks. He comes to the conclusion that lack of clarity on the precise definition of price stability, coupled with concerns about the legitimacy of large balance sheet expansions, hinders policy: It encourages the central bank to eschew the decisive quantitative easing needed to reflate the economy and instead to accommodate too-low inflation. The BoJ’s experience with the zero lower bound suggests important benefits from a clear definition of price stability as a symmetric 2% goal for inflation, which the Bank adopted in 2013.
We assess the relationship between finance and growth over the period 1980-2014. We estimate a cross-country growth regression for 48 countries during 20 periods of 15 years starting in 1980 (to 1995) and ending in 1999 (to 2014). We use OLS and IV estimations and we find that: 1) overall financial development had a positive effect on economic growth during all periods of our sample, i.e., we confirm that from 1980 to 2014 financial services provided by the various financial systems were significant (to various degrees) for firm creation, industrial expansion and economic growth; but that, 2) the structure of financial markets was particularly relevant for economic growth until the financial crisis; while 3) the structure of the banking sector has played a major role since then; and finally that, 4) the legal system is the primary determinant of the effectiveness of the overall financial system in facilitating innovation and growth in (almost) all of our sample period. Hence, overall our results suggest that the relationship between finance and growth matters but also that it varies over time in strength and in sector origination.
JEL Classification: O16, G16, G20.
Bargaining with a bank
(2018)
This paper examines bargaining as a mechanism to resolve information problems. To guide the analysis, I develop a parsimonious model of a credit negotiation between a bank and firms with varying levels of impatience. In equilibrium, impatient firms accept the bank’s offer immediately, while patient firms wait and negotiate price adjustments. I test the empirical predictions using a hand-collected dataset on credit line negotiations. Firms signing the bank’s offer right away draw down their line of credit after origination and default more than late signers. Late signers negotiate price adjustments more frequently, and, consistent with the model, these adjustments predict better ex post performance.
The authors relax the standard assumption in the dynamic stochastic general equilibrium (DSGE) literature that exogenous processes are governed by AR(1) processes and estimate the ARMA(p,q) orders and parameters of exogenous processes. Methodologically, they contribute to the Bayesian DSGE literature by using Reversible Jump Markov Chain Monte Carlo (RJMCMC) to sample from the unknown ARMA orders and their associated parameter spaces of varying dimensions.
In estimating the technology process in the neoclassical growth model using post war US GDP data, they cast considerable doubt on the standard AR(1) assumption in favor of higher order processes. They find that the posterior concentrates density on hump-shaped impulse responses for all endogenous variables, consistent with alternative empirical estimates and the rigidities behind many richer structural models. Sampling from noninvertible MA representations, a negative response of hours to a positive technology shock is contained within the posterior credible set. While the posterior contains significant uncertainty regarding the exact order, the results are insensitive to the choice of data filter; this contrasts with the authors’ ARMA estimates of GDP itself, which vary significantly depending on the choice of HP or first difference filter.
This paper presents new evidence on the expectation formation process from a Dutch household survey. Households become too optimistic about their future income after their income has improved, consistent with over-extrapolation of their experience. We show that this effect of experience is persistent and that households over-extrapolate income losses more than income gains. Furthermore, older households over-extrapolate more, suggesting that they have not learned over time to form more accurate expectations. Finally, we study the relationship between expectation errors and consumption. We find that more over-optimistic households intend to consume more and subsequently report higher consumption, even though they do not consume as much as they intended to. These results suggest that over-extrapolation hurts consumers and amplifies business cycles.
I present a new business cycle model in which decision making follows a simple mental process motivated by neuroeconomics. Decision makers first compute the value of two different options and then choose the option that offers the highest value, but with errors. The resulting model is highly tractable and intuitive. A demand function in levels replaces the traditional Euler equation. As a result, even liquid consumers can have a large marginal propensity to consume. The interest rate affects consumption through the cost of borrowing and not through intertemporal substitution. I discuss the implications for stimulus policies.
Manipulative communications touting stocks are common in capital markets around the world. Although the price distortions created by so-called “pump-and-dump” schemes are well known, little is known about the investors in these frauds. By examining 421 “pump-and-dump” schemes between 2002 and 2015 and a proprietary set of trading records for over 110,000 individual investors from a major German bank, we provide evidence on the participation rate, magnitude of the investments, losses, and the characteristics of the individuals who invest in such schemes. Our evidence suggests that participation is quite common and involves sizable losses, with nearly 6% of active investors participating in at least one “pump-and-dump” and an average loss of nearly 30%. Moreover, we identify several distinct types of investors, some of which should not be viewed as falling prey to these frauds. We also show that portfolio composition and past trading behavior can better explain participation in touted stocks than demographics. Our analysis offers insights into the challenges associated with designing effective investor protection against market manipulation.
The use of evidence and economic analysis in policymaking is on the rise, and accounting standard setting and financial regulation are no exception. This article discusses the promise of evidence-based policymaking in accounting and financial markets as well as the challenges and opportunities for research supporting this endeavor. In principle, using sound theory and robust empirical evidence should lead to better policies and regulations. But despite its obvious appeal and substantial promise, evidence-based policymaking is easier demanded than done. It faces many challenges related to the difficulty of providing relevant causal evidence, lack of data, the reliability of published research, and the transmission of research findings. Overcoming these challenges requires substantial infrastructure investments for generating and disseminating relevant research. To illustrate this point, I draw parallels to the rise of evidence-based medicine. The article provides several concrete suggestions for the research process and the aggregation of research findings if scientific evidence is to inform policymaking. I discuss how policymakers can foster and support policy-relevant research, chiefly by providing and generating data. The article also points to potential pitfalls when research becomes increasingly policy-oriented.
Through the lens of market participants' objective to minimize counterparty risk, we provide an explanation for the reluctance to clear derivative trades in the absence of a central clearing obligation. We develop a comprehensive understanding of the benefits and potential pitfalls with respect to a single market participant's counterparty risk exposure when moving from a bilateral to a clearing architecture for derivative markets. Previous studies suggest that central clearing is beneficial for single market participants in the presence of a sufficiently large number of clearing members. We show that three elements can render central clearing harmful for a market participant's counterparty risk exposure regardless of the number of its counterparties: 1) correlation across and within derivative classes (i.e., systematic risk), 2) collateralization of derivative claims, and 3) loss sharing among clearing members. Our results have substantial implications for the design of derivatives markets, and highlight that recent central clearing reforms might not incentivize market participants to clear derivatives.
We study the relevance of signaling and marketing as explanations for the discount control mechanisms that a closed-end fund may choose to adopt in its prospectus. These policies are designed to narrow the potential gap between share price and net asset value, measured by the fund’s discount. The two most common discount control mechanisms are explicit discretion to repurchase shares based on the magnitude of the fund discount and mandatory continuation votes that provide shareholders the opportunity to liquidate the fund. We find very limited evidence that a discount control mechanism serves as a costly signal of information. Funds with mandatory voting are not more likely to delist than the rest of the CEFs in general or whenever the fund discount is large. Similarly, funds that explicitly discuss share repurchases as a potential response do not subsequently buy back shares more often when discounts do increase. Instead, the existence of these policies is more consistent with marketing explanations, because the policies are associated with an increased probability of issuing more equity in subsequent periods.
Direct financing of consumer credit by individual investors or non-bank institutions through an implementation of marketplace lending is a relatively new phenomenon in financial markets. The emergence of online platforms has made this type of financial intermediation widely available. This paper analyzes the performance of marketplace lending using proprietary cash flow data for each individual loan from the largest platform, Lending Club. While individual loan characteristics would be important for amateur investors holding a few loans, sophisticated lenders, including institutional investors, usually form broad portfolios to benefit from diversification. We find high risk-adjusted performance of approximately 40 basis points per month for these basic loan portfolios. This abnormal performance indicates that Lending Club, and similar marketplace lenders, are likely to attract capital to finance a growing share of the consumer credit market. In the absence of a competitive response from traditional credit providers, these loans lower costs to the ultimate borrowers and increase returns for the ultimate lenders.
We characterize the optimal linear tax on capital in an Overlapping Generations model with two period lived households facing uninsurable idiosyncratic labor income risk. The Ramsey government internalizes the general equilibrium feedback of private precautionary saving. For logarithmic utility our full analytical solution of the Ramsey problem shows that the optimal aggregate saving rate is independent of income risk. The optimal time-invariant tax on capital is increasing in income risk. Its sign depends on the extent of risk and on the Pareto weight of future generations. If the Ramsey tax rate that maximizes steady state utility is positive, then implementing this tax rate permanently generates a Pareto-improving transition even if the initial equilibrium is dynamically efficient. We generalize our results to Epstein-Zin-Weil utility and show that the optimal steady state saving rate is increasing in income risk if and only if the intertemporal elasticity of substitution is smaller than 1.
On phantom giants: what TARGET balances really mean. A financial-economic review
(2018)
The Bundesbank's TARGET balance currently amounts to almost 1 trillion euros. According to critics, this entails high burdens and risks for the German taxpayer and shows that Germany has become a "self-service store" within the Eurosystem. Against this background, the paper discusses in detail how TARGET balances arise in the first place and what they mean in financial-economic terms. The economic policy analysis concludes that, contrary to the critics' claims, under the conditions of a monetary union in normal operation TARGET balances are merely clearing balances without further implications, though they can provide useful information about deeper regional economic shifts. Under the extreme scenario of a breakup of the monetary union, TARGET balances could indeed be interpreted as open positions, but their eventual settlement would, much as with Brexit, depend on complicated political negotiations, so that their recoverable value can only be speculated about. If one considers the extreme scenario significant and calls for political action, two solutions appear sensible. Both proposals would institutionally strengthen the eurozone: i) introducing a settlement practice like that used in the US Fedwire system, which amounts to a purely notional settlement in the form of a transfer to a joint (open market) account at the ECB; ii) bundling all monetary activities at the ECB, so that a regional delineation of payment flows becomes unnecessary (and TARGET balances thereby disappear), because all banks stand in a direct relationship with one and the same central bank and payments take place directly between the banks involved.
Based on OECD evidence, equity/housing-price busts and credit crunches are followed by substantial increases in public consumption. These increases in unproductive public spending lead to increases in distortionary marginal taxes, a policy in sharp contrast with presumably optimal Keynesian fiscal stimulus after a crisis. Here we claim that this seemingly adverse policy selection is optimal under rational learning about the frequency of rare capital-value busts. Bayesian updating after a bust implies massive belief jumps toward pessimism, with investors and policymakers believing that busts will arrive more frequently in the future. Lowering taxes would be like kicking a sick horse to make it stand up and run, since pessimistic markets would be unwilling to invest enough under any temporarily generous tax regime.
Asset transaction prices sampled at high frequency are much staler than one might expect, in the sense that they frequently lack new updates and show zero returns. In this paper, we propose a theoretical framework for formalizing this phenomenon. It hinges on the existence of a latent continuous-time stochastic process p_t valued in the open interval (0, 1), which represents at any point in time the probability of the occurrence of a zero return. Using a standard infill asymptotic design, we develop an inferential theory for nonparametrically testing the null hypothesis that p_t is constant over one day. Under the alternative, which encompasses a semimartingale model for p_t, we develop nonparametric inferential theory for the probability of staleness that includes the estimation of various integrated functionals of p_t and its quadratic variation. Using a large dataset of stocks, we provide empirical evidence that the null of a constant probability of staleness is firmly rejected. We then show that the variability of p_t is mainly driven by transaction volume and is almost unaffected by bid-ask spread and realized volatility.
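The object of interest can be illustrated crudely: estimate the share of zero returns per day and test whether it is constant across days with a Pearson chi-square homogeneity test. This is only a back-of-the-envelope stand-in for the paper's infill-asymptotic test, and all numbers below are made up.

```python
import numpy as np

def zero_return_share(prices):
    """Share of consecutive price observations with a zero return (stale updates)."""
    returns = np.diff(np.asarray(prices, dtype=float))
    return float(np.mean(returns == 0.0))

def chi2_homogeneity(zero_counts, totals):
    """Pearson chi-square statistic for H0: the zero-return probability is the
    same on every day (a crude stand-in for the paper's test)."""
    p_hat = zero_counts.sum() / totals.sum()
    exp_zero = totals * p_hat
    exp_move = totals * (1.0 - p_hat)
    obs_move = totals - zero_counts
    return float(((zero_counts - exp_zero) ** 2 / exp_zero).sum()
                 + ((obs_move - exp_move) ** 2 / exp_move).sum())

# Hypothetical tick series: two of the four price updates are stale (zero return)
print(zero_return_share([10.0, 10.0, 10.1, 10.1, 10.2]))  # 0.5

# Hypothetical two-day comparison: day 2 has three times as many zero returns
zeros = np.array([100.0, 300.0])
totals = np.array([1000.0, 1000.0])
stat = chi2_homogeneity(zeros, totals)
print(stat > 3.84)  # exceeds the 5% critical value of chi-square(1): reject constancy
```

A rejection here mimics, in a very coarse way, the paper's finding that the probability of staleness is not constant within or across days.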
We present empirical evidence on the heterogeneity in monetary policy transmission across countries with different home ownership rates. We use household-level data together with shocks to the policy rate identified from high-frequency data. We find that housing tenure reacts more strongly to unexpected changes in the policy rate in Germany and Switzerland (the OECD countries with the lowest home ownership rates) compared with existing evidence for the U.S. An unexpected decrease in the policy rate by 25 basis points increases the home ownership rate by 0.8 percentage points in Germany and by 0.6 percentage points in Switzerland. The response of non-housing consumption in Switzerland is less heterogeneous across renters and mortgagors, and has a different pattern across age groups than in the U.S. We discuss economic explanations for these findings and implications for monetary policy.
The propagation of regional shocks in housing markets: evidence from oil price shocks in Canada
(2018)
Shocks to the demand for housing that originate in one region may seem important only for that regional housing market. We provide evidence that such shocks can also affect housing markets in other regions. Our analysis focuses on the response of Canadian housing markets to oil price shocks. Oil price shocks constitute an important source of exogenous regional variation in income in Canada because oil production is highly geographically concentrated. We document that, at the national level, real oil price shocks account for 11% of the variability in real house price growth over time. At the regional level, we find that unexpected increases in the real price of oil raise housing demand and real house prices not only in oil-producing regions, but also in other regions. We develop a theoretical model of the propagation of real oil price shocks across regions that helps explain this finding. The model differentiates between oil-producing and non-oil-producing regions and incorporates multiple sectors, trade between provinces, government redistribution, and consumer spending on fuel. We empirically confirm the model prediction that oil price shocks are propagated to housing markets in non-oil-producing regions by the government redistribution of oil revenue and by increased interprovincial trade.
Even though the importance of micro data transparency is a well-established fact, European institutions still lag behind the US when it comes to providing financial market data to academics. In this Policy Letter we discuss five types of micro data that are crucial for monitoring (systemic) risk in the financial system and for identifying and understanding interlinkages in financial markets, and that thus have important implications for policymakers and regulatory authorities. We conclude that for all five areas of micro data outlined in this Policy Letter (bank balance sheet data, asset portfolio data, market transaction data, high-frequency market data and central bank data), the benefits of increased transparency greatly outweigh the potential downsides. Hence, European policymakers would do well to follow the US example and close the sizeable gap in micro data transparency. In most cases, the relevant data is already collected (at least at the national level), but simply not made available to academics, for reasons that are partly incomprehensible. Overcoming these obstacles could foster financial stability in Europe and ensure a level playing field with US regulators and policymakers.
Germany and Europe
(2018)
Otmar Issing discusses the reactions in Germany to the plans of French President Macron set out in his widely noted speech on the future of Europe at the Sorbonne in Paris. Issing interprets the outcome of the exploratory talks between CDU/CSU and SPD as a departure from the idea of a stability-oriented European community, and warns against jeopardizing the single market and its associated freedoms through excessive ambitions, thereby fueling growing mistrust of Europe.
In particular, in the planned development of the ESM into a European Monetary Fund anchored in Union law, Issing sees the funds provided by the Fund being surrendered to the disposal of a political majority. Moreover, the appointment of a European finance minister would lead to the creation of a fiscal union complementing the monetary union, and thus to a shift of fiscal-policy competence from the national to the European level. Ultimately, this would mean abandoning the fundamental principle of democratic legitimation and control of fiscal-policy decisions.
The outcome of the exploratory talks must be understood as a departure from the idea of a stability-oriented European community. It breaks the promises made to German citizens before the introduction of the euro.
This contribution analyzes the preconditions for stable money, engaging fundamentally with Hayek's theses on alternative currency systems and with his fundamental critique of the possibility of conducting monetary policy on a scientific basis. It examines Hayek's proposal for the denationalization of money and his theses on the superiority of money created through private competition. In this context, the contribution draws a connection to the current debate on cryptocurrencies and raises the question of whether virtual currencies such as Bitcoin are suited to bringing about the Hayekian currency competition. In contrast to Hayek's call for the abolition of central banks, the paper then sketches their decisive role in sustained growth with stable prices and emphasizes the importance of central bank independence for the lasting conduct of a stability-oriented monetary policy. At the same time, it notes that central banks which overstep their mandate may, in the long run, themselves undermine their independent status and thereby provoke a re-transfer of the competence for central monetary-policy decisions to government and parliament. Acknowledging the dangers of the far-reaching independence of a few individuals at the top of central banks, the paper then underlines the importance of their accountability and of the transparency of their decisions.
In this study we investigate which economic ideas were prevalent in the post-crisis macroprudential discourse in order to understand the availability of ideas for reform-minded agents. We base our analysis on new findings in the field of ideational shifts and regulatory science, which posit that change agents engage with new ideas pragmatically and strategically in their effort to have their economic ideas institutionalized. We argue that in these epistemic battles over new regulation, scientific backing by academia is the key resource determining the outcome. We show that the reforms implemented internationally to date follow this pattern. In our analysis we contrast the entire discourse on systemic risk and macroprudential regulation with Borio's initial 2003 proposal for a macroprudential framework. We find that mostly cross-sectional measures targeted at increasing the resilience of the financial system, rather than inter-temporal measures dampening the financial cycle, have been implemented. We provide evidence for the lack of support for new macroprudential thinking within academia and argue that this is partially responsible for the absence of anti-cyclical macroprudential regulation. Most worryingly, the financial cycle is largely absent from the academic discourse and is only tacitly assumed, rather than fully fleshed out, in technocratic discourses, pointing to the possibility that no anti-cyclical measures will be forthcoming.
In the last decade, central bank interventions, flights to safety, and the shift in derivatives clearing have resulted in exceptionally high demand for high-quality liquid assets, such as German treasuries, in the securities lending market, in addition to traditional repo market activities. Despite this high demand, the realizable securities lending income has remained economically negligible for most beneficial owners. We provide empirical evidence of pricing inefficiencies in the non-transparent, oligopolistic securities lending market for German treasuries from 2006 to 2015. Consistent with the theory of Duffie, Gârleanu and Pedersen (2005), we find that the interests of less connected market participants are underrepresented, evident in the longer maturity segment, where lenders are more likely to be conservative passive investors, such as pension funds and insurance firms. The low price elasticity in this segment prevents these beneficial owners from fully capitalizing on the additional income from securities lending, giving rise to important negative welfare implications.
We show that time-varying volatility of volatility is a significant risk factor which affects the cross-section and the time-series of index and VIX option returns, beyond volatility risk itself. Volatility and volatility-of-volatility measures, identified model-free from the option price data as the VIX and VVIX indices, respectively, are only weakly related to each other. Delta-hedged index and VIX option returns are negative on average, and are more negative for strategies which are more exposed to volatility and volatility-of-volatility risks. Volatility and volatility of volatility significantly and negatively predict future delta-hedged option payoffs. The evidence is consistent with a no-arbitrage model featuring time-varying market volatility and volatility-of-volatility factors, both of which carry a negative market price of risk.
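The delta-hedged option returns discussed above can be illustrated with a plain Black-Scholes sketch; the parameters, the discrete hedging scheme, and the constant-volatility setting are assumptions for illustration, not the paper's data or its stochastic-volatility model:

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price and delta."""
    if T <= 1e-12:
        return max(S - K, 0.0), float(S > K)
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2), norm_cdf(d1)

def delta_hedged_gain(S0=100.0, K=100.0, T=1.0 / 12.0, r=0.0,
                      implied=0.20, realized=0.20, steps=21, seed=1):
    """Terminal gain of a long call hedged by shorting delta shares each step.

    If realized vol matches implied vol, the expected gain is ~0 (up to
    discretization noise); if realized vol falls short of implied vol, the
    strategy loses on average.
    """
    rng = random.Random(seed)
    dt = T / steps
    S = S0
    price, delta = bs_call(S, K, T, r, implied)
    cash = delta * S - price  # proceeds of the share short minus option cost
    for i in range(1, steps + 1):
        S *= math.exp(-0.5 * realized ** 2 * dt
                      + realized * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        cash *= math.exp(r * dt)
        price, new_delta = bs_call(S, K, T - i * dt, r, implied)
        cash += (new_delta - delta) * S  # rebalance the share short
        delta = new_delta
    return price + cash - delta * S  # liquidation value (initial value was 0)
```

Averaged over many simulated paths, the gain is near zero when realized volatility matches implied volatility and negative when realized falls short, the textbook analogue of negative average delta-hedged returns under a negative volatility risk premium.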
A recent US Treasury regulation allowed deferred longevity income annuities to be included in pension plan menus as a default payout solution, yet little research has investigated whether more people should convert some of the $15 trillion they hold in employer-based defined contribution plans into lifelong income streams. We investigate this innovation using a calibrated lifecycle consumption and portfolio choice model embodying realistic institutional considerations. Our welfare analysis shows that defaulting a small portion of retirees' 401(k) assets (above a threshold) into such annuities is an attractive way to strengthen retirement security, enhancing welfare by up to 20% of retiree plan accruals.
This paper provides a complete characterization of optimal contracts in principal-agent settings where the agent's action has persistent effects. We model general information environments via the stochastic process of the likelihood-ratio. The martingale property of this performance metric captures the information benefit of deferral. Costs of deferral may result from both the agent's relative impatience as well as her consumption smoothing needs. If the relatively impatient agent is risk neutral, optimal contracts take a simple form in that they only reward maximal performance for at most two payout dates. If the agent is additionally risk-averse, optimal contracts stipulate rewards for a larger selection of dates and performance states: The performance hurdle to obtain the same level of compensation is increasing over time whereas the pay-performance sensitivity is declining.
This paper studies heterogeneity in the reaction to rank feedback. In a laboratory experiment, individuals take part in a series of dynamic real-effort contests with intermediate feedback. To solve the identification problem in estimating the causal effect of rank feedback on subsequent effort provision, we implement a random multiplier in the first round of each contest. The realization of this multiplier then serves as a valid instrument for rank feedback. While rank feedback has a robust effect on subsequent effort provision on average, an explicit analysis of between-subject heterogeneity reveals that a substantial fraction of participants in fact react in exactly the opposite way to what the aggregate results indicate. We further show that this heterogeneity has consequences for overall outcomes, arguing that heterogeneous sensitivities to rank feedback could have implications for the design of various policies in education and organizations.
This paper revisits the macroeconomic effects of the large-scale asset purchase programmes launched by the Federal Reserve and the Bank of England from 2008. Using a Bayesian VAR, we investigate the macroeconomic impact of shocks to asset purchase announcements and assess changes in their effectiveness based on subsample analysis. The results suggest that the early asset purchase programmes had significant positive macroeconomic effects, while those of the subsequent ones were weaker and in part not significantly different from zero. The reduced effectiveness seems to reflect in part better anticipation of asset purchase programmes over time, since we find significant positive macroeconomic effects when we consider shocks to survey expectations of the Federal Reserve’s last asset purchase programme. Finally, in all estimations we find a significant and persistent positive impact of asset purchase shocks on stock prices.
Departing from the principle of absolute priority, CoCo bonds are particularly exposed to bank losses despite not conferring ownership rights. This paper shows the link between adverse CoCo design and CoCo yields, confirming the existence of market monitoring in designated bail-in debt. Specifically, focusing on the write-down feature as the loss absorption mechanism in CoCo debt, I find a yield premium on this feature relative to equity-conversion CoCo bonds, as predicted by theoretical models. Moreover, and consistent with theories on moral hazard, I find this premium to be largest when existing incentives for opportunistic behavior are strongest, and non-existent when moral hazard is perceived to be small. The findings show that write-down CoCo bonds introduce a moral hazard problem in banks. At the same time, they support the idea of CoCo investors acting as monitors, which is a prerequisite for a meaningful role of CoCo debt in banks' regulatory capital mix.
Distributed ledger technologies rely on consensus protocols confronting traders with random waiting times until the transfer of ownership is accomplished. This time consuming settlement process exposes arbitrageurs to price risk and imposes limits to arbitrage. We derive theoretical arbitrage boundaries under general assumptions and show that they increase with expected latency, latency uncertainty, spot volatility, and risk aversion. Using high-frequency data from the Bitcoin network, we estimate arbitrage boundaries due to settlement latency of on average 124 basis points, covering 88% of the observed cross-exchange price differences. Settlement through decentralized systems thus induces non-trivial frictions affecting market efficiency and price formation.
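As a back-of-the-envelope illustration of how such boundaries scale, not the paper's derivation: if log prices diffuse while settlement is pending, the price risk borne by the arbitrageur grows with volatility and with the mean and dispersion of the latency, so a required-spread rule might look like the following (the functional form and all parameters are assumptions):

```python
import math

def arbitrage_boundary_bps(sigma_per_min, exp_latency_min, latency_std_min,
                           risk_aversion=2.0):
    """Back-of-the-envelope arbitrage boundary in basis points.

    Stylized rule: a diffusion with per-minute volatility sigma accumulates
    price risk of order sigma * sqrt(latency) over the settlement window; we
    add the latency standard deviation inside the square root as a crude
    stand-in for latency uncertainty and scale by risk aversion. This is an
    illustrative functional form, not the paper's closed-form boundary.
    """
    exposure = sigma_per_min * math.sqrt(exp_latency_min + latency_std_min)
    return risk_aversion * exposure * 1e4  # convert to basis points

# Example: 10 bps/min volatility, 10-minute expected latency, 5-minute std.
print(f"{arbitrage_boundary_bps(0.0010, 10.0, 5.0):.0f} bps")
```

With these assumed inputs the rule yields a boundary of several dozen basis points, the same order of magnitude as the 124 basis points estimated in the paper, and it is increasing in expected latency, latency uncertainty, volatility, and risk aversion, matching the comparative statics stated above.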
How demanding and consistent is the 2018 stress test design in comparison to previous exercises?
(2018)
Bank regulators have the discretion to discipline banks by executing enforcement actions to ensure that banks correct deficiencies regarding safe and sound banking principles. We highlight the trade-offs regarding the execution of enforcement actions for financial stability. Following this, we provide an overview of the differences in the legal framework governing supervisors' execution of enforcement actions in the Banking Union and the United States. After discussing work on the effect of enforcement actions on bank behaviour and the real economy, we present data on the evolution of enforcement actions and monetary penalties by U.S. regulators. We conclude by noting the importance of supervisors levying efficient monetary penalties and by stressing that a division of competences among different regulators should not lead to a loss of efficiency in the execution of enforcement actions.
This paper analyzes how the combination of borrowing constraints and idiosyncratic risk affects the equity premium in an overlapping generations economy. I find that introducing a zero-borrowing constraint in an economy without idiosyncratic risk increases the equity premium by 70 percent, which means that the mechanism described in Constantinides, Donaldson, and Mehra (2002) is dampened because of the large number of generations and production. With social security, the effect of the zero-borrowing constraint is much weaker. More surprisingly, when I introduce idiosyncratic labor income risk in an economy without a zero-borrowing constraint, the equity premium increases by 50 percent, even though the income shocks are independent of aggregate risk and are not permanent. The reason is that idiosyncratic risk makes the endogenous natural borrowing limits much tighter, so that they have an effect similar to an exogenously imposed zero-borrowing constraint. This intuition is confirmed when I add idiosyncratic risk to an economy with a zero-borrowing constraint: neither the equity premium nor the Sharpe ratio changes, because the zero-borrowing constraint is already tighter than the natural borrowing limits that result when idiosyncratic risk is added.
This paper uses unique administrative data and a quasi-field experiment of exogenous allocation in Sweden to estimate medium- and longer-run effects on financial behavior from exposure to financially literate neighbors. It contributes evidence of a causal impact of exposure and of a social multiplier of financial knowledge, but also of unfavorable distributional aspects of such externalities. Exposure promotes saving in private retirement accounts and stockholding, especially when neighbors have economics or business education, but only for educated households and when interaction possibilities are substantial. The findings point to a transfer of knowledge rather than mere imitation or effects through labor, education, or mobility channels.
We use minutes from 17,000 financial advisory sessions and corresponding client portfolio data to study how active client involvement affects advisor recommendations and portfolio outcomes. We find that advisors confronted with acquiescent clients stick to their standards and recommend expensive but well diversified mutual fund portfolios. However, if clients take an active role in the meetings, advisors deviate markedly from their standards, resulting in poorer portfolio diversification and lower Sharpe ratios. Our findings that advisors cater to client requests parallel the phenomenon of doctors prescribing antibiotics to insistent patients even if inappropriate, and imply that pandering diminishes the quality of advice.