Working Paper
This publication aims to provide an overview of how the digitalisation of communication gives rise to societal trends such as an “always-on” culture, “shitstorms” and “fake news”, and of their effects on schools, the media, non-governmental organisations, work and sports.
Table of Contents
Christian Reuter, Tanjev Schultz, Christian Stegbauer: Digitalisation and Communication: Societal Trends and the Change in Organisations — Preface
Daniel Lambach: Digital World and Real World – Opposites no more
Leonard Reinecke: Brave New Smartphone World? Psychological Wellbeing between Digital Autonomy and Constant Connectedness
Christian Reuter: Fake News and the Manipulation of Public Opinion
Christian Stegbauer: Tantrums on a Massive Scale, or: Could Anybody be a Victim of Social Media Outrage?
Volker Schaeffer: “We Have Always Been Living in Bubbles” – The Opportunities and Risks in the Digitalisation of Media
Angela Menig, Verena Zimmermann, Joachim Vogt: Digital Transformation of the Workplace – Risk or Opportunity?
Stefan Aufenanger, Jasmin Bastian: Digital Technology in Schools
Angelika Böhling: Development Assistance Goes Digital – The Opportunities and Challenges Non-Governmental Organisations Face in Digital Communication
Josef Wiemeyer: Digital Interaction and Communication in Sports
We uncover a new channel for spillovers of funding dry-ups. The 2016 US money market fund (MMF) reform exogenously reduced unsecured MMF funding for some banks. We use novel data to trace those banks to a platform for corporate deposit funding. We show that intensified competition for corporate deposits spilled the funding squeeze over to other banks with no MMF exposure. These banks paid more for deposits, and their pool of funding providers deteriorated. Moreover, their lending volumes and margins declined, and their stocks underperformed. Our results suggest that banks' competitiveness in funding markets affects their competitiveness in lending markets.
We study nominal wage rigidity in the Netherlands using administrative data with three key features: (1) high frequency (monthly), (2) high quality (administrative records), and (3) high coverage (the universe of workers and the universe of firms). We find wage rigidity patterns in the data that are similar to the wage behavior documented for other European countries. In particular, we find that the hazard function has two spikes, one at 12 months and another at 24 months, and that wage changes have both time- and state-dependent components. As a novel and important piece of evidence, we also uncover substantial heterogeneity in the frequency of wage changes due to explicit terms of the labor contract. In particular, contracts featuring flexible hours, such as on-call contracts, exhibit a higher probability of a change in the contract wage than fixed-hour contracts. Once we split the sample by contract characteristics, we also find that the response of wage changes to the time and state components is heterogeneous across contract types, with relatively more downward adjustments in flexible-hour contract wages in response to aggregate unemployment.
Since the financial crisis, financial literacy has attracted growing interest among researchers and policy makers, as international empirical evidence shows that financial literacy is poor among both adults and students. For Germany, there is almost no empirical evidence on financial literacy, especially for students attending secondary school, as financial education has not featured on German school curricula to date. Moreover, Germany has not yet participated in the optional financial literacy module of PISA, which was offered for the first time in 2012. However, a lack of private pension provisioning in spite of demographic change, and low stock ownership among German households, indicate a deficit in financial knowledge and skills in this country as well.
In this paper we investigate financial literacy among students aged 14 to 16 attending secondary school in the state of Hesse. The analysis is based on a test designed according to international standards. The statistical analysis of the test reveals substantial deficits in key areas of financial literacy. Particular deficits were identified in basic knowledge of financial matters and, to an even greater degree, in more advanced concepts such as risk diversification. Applying interest calculations to financial matters also proved problematic for many students.
Furthermore, the paper analyses the impact of gender and type of school on the overall test score as well as test performance in specific tasks. The findings suggest that financial matters should be covered in some form at secondary schools. In light of the potentially far-reaching consequences of financial illiteracy for financial wellbeing, German participation in future PISA financial literacy tests seems highly advisable to gain a deeper understanding of the preliminary findings presented in this paper.
Revisiting the stealth trading hypothesis: does time-varying liquidity explain the size-effect?
(2019)
Large trades have a smaller price impact per share than medium-sized trades. So far, the literature has attributed this effect to the informational content of trades. In this paper, we show that this effect can arise from strategic order placement. We introduce the concept of a liquidity elasticity, measuring the responsiveness of liquidity demand with respect to changes in liquidity supply, as a major driver for a declining price impact per share. Empirical evidence based on Nasdaq stocks strongly supports theoretical predictions and shows that the aspect of liquidity coordination is an important complement to rationales based on asymmetric information.
The paper analyses the linkages from financial developments to public finances. It maps and discusses the transmission channels to fiscal variables, including asset prices, financing conditions, the balance sheets of banks, non-banks and central banks, and international linkages. The study argues that the fiscal effects via each and all of these channels can be very large and can put the sustainability of public finances at risk. However, in-depth analysis of these channels and risks remains limited.
Depressed demand and supply
(2019)
We investigate the implications of experienced-based learning on consumption-saving and labor supply, two fundamental decisions in business cycle models. Using the Dutch Household Survey, we find that individuals who have experienced higher national unemployment rates over their lifetime save more, borrow less, and work less, after controlling for aggregate shocks, income, wealth, and demographics. Possibly explaining these behavioral responses, these individuals find it more important to save for retirement and to cover unexpected expenses, are more worried about losing their job, and dislike their job more. These results have implications for business cycle models and stabilization policies.
In early July 2019, Christian Sewing, the CEO of Deutsche Bank, proclaimed a fundamental shift in the bank’s strategy after finally obtaining the approval of the Supervisory Board, which the management seems to have sought for quite some time. The essential point of the reorientation is a deep cut into the bank’s investment banking activities. At the same time, those parts of the bank’s activity portfolio that were the mainstay of Deutsche Bank’s business 20 to 25 years ago, in particular lending to large and mid-sized German and European corporate clients, are to be strengthened, despite a simultaneous reduction of the bank’s staff by 18,000 FTEs over the next three years.
The bank’s CEO, who has been in office for only about a year, was reported to have called this shift of strategy a “return to the roots of Deutsche Bank” at the press conference at which it was announced, without, however, making clear which roots he was referring to: those of some 40 years ago, when Deutsche Bank was essentially a Germany-focused commercial bank, or those of the late 19th century, when the bank was founded with the mission of becoming an international bank with a strong capital market orientation. In any event, the press was impressed and keeps repeating these words, which deserve to be taken seriously and which, irrespective of their vagueness, may be justified. If successfully implemented, this change of strategy would indeed be fundamental and would imply undoing what Deutsche Bank’s former management teams had aspired to achieve over the last 20 to 25 years.
The newly announced strategy shift raises two questions. Can it be successful, and what does it mean for the bank itself and its shareholders, for its staff and for its clients? And what does it imply for the German financial system? This note focuses on the latter question. What makes it interesting is the fact that the last fundamental change of Deutsche Bank’s strategy, two decades ago, which aimed at transforming Deutsche Bank from a Germany-centered commercial bank into a leading international investment bank, had a profound – and in my view clearly negative – effect on the entire German financial system.
The financial crisis of 2007-08 has stressed the importance of a sound financial system. Unlike other studies weighing the pros and cons of market-based versus bank-based systems, this paper investigates whether the main elements of the German financial system can be regarded as complementary and consistent. This assessment refers to the idea that there is a potential for positive interaction between different elements in the system that is actually used to make it more valuable to the economy and society and more robust to crises. It is shown that the old German bank-based system, in which the risk of long-term lending by large private commercial banks was limited by membership in supervisory boards and strong personal ties between all stakeholders, was a consistent system of well-adjusted complementary elements. After reunification, a hybrid system emerged in which, on the one hand, public savings banks and cooperative banks maintain their role as lenders while, on the other, large private banks have withdrawn from their former dominant role in financing and corporate governance. It is argued that this transition to a stronger capital-market and, accordingly, shareholder-value orientation has occurred at the expense of consistency.
This paper aspires to provide an overview of the diversity of banking and financial systems and its development over time, from both a positive and a normative perspective. In other words: how different are banks within a given country, how much do banking systems and entire financial systems differ between countries and regions, and do in-country diversity and between-country diversity change over time, as one would be inclined to expect as a consequence of globalization and increasingly global standards of regulation?
As the first part of this paper shows, the general answer to these questions is that there is still a surprisingly high level of diversity in finance today. This raises the two questions addressed in the second part of the paper: how can the persistence of diversity be explained, and how should it be assessed? In contrast to prevailing views, the author argues that persistent diversity should be regarded as valuable in a context in which there is no clear answer to the question of which structures of banking and financial systems are optimal from an economic perspective.
It has been documented that vertical customer-supplier links between industries are the basis for strong cross-sectional stock return predictability (Menzly and Ozbas (2010)). We show that robust predictability also arises from horizontal links between industries, i.e., from the fact that industries are competitors or offer products that are substitutes for each other. These horizontally linked industries exhibit positively correlated fundamentals. The signal derived from this type of connectedness is the basis for significant alpha in sorted portfolio strategies, and informed investors take the related information into account when they form their portfolios. We thus provide evidence of return predictability based on a new type of economic link between industries not captured in previous studies.
In the course of the crisis, the European System of Central Banks (ESCB) has acted several times to support the EU Member States and banking systems in financial distress by purchasing debt instruments: Covered Bonds Programmes (CBP), Securities Market Programmes (SMP), Long Term Refinancing Operations (LTRO), and Targeted Long Term Refinancing Operations (TLTRO), followed by the Outright Monetary Transactions (OMT) and then the Extended Asset Purchase Programmes (EAPP) – colloquially labelled as Quantitative Easing (QE).
Initially, the ESCB's support measures might be judged as monetary policy, but the selectivity of the OMT and – even more so – the SMP, in conjunction with the transfer of risks to the ESCB, speaks against this.
Using a novel regulatory dataset of fully identified derivatives transactions, this paper provides the first comprehensive analysis of the structure of the euro area interest rate swap (IRS) market after the start of the mandatory clearing obligation. Our dataset contains 1.7 million bilateral IRS transactions of banks and non-banks. Our key results are as follows:
1) The euro area IRS market is highly standardised and concentrated around the group of the G16 Dealers, but also around a significant group of core “intermediaries” (and major CCPs).
2) Banks are active in all segments of the euro area IRS market, whereas non-banks are often specialised.
3) Using relative net exposures as a proxy for the “flow of risk” in the IRS market, we find that risk absorption takes place in the core as well as the periphery of the network, but in absolute terms risk absorption occurs largely at the core.
4) Among the Basel III capital and liquidity ratios, the leverage ratio plays a key role in determining a bank's IRS trading activity.
We build a search-and-matching algorithm of network dynamics with decision-making under incomplete information, seeking to understand the determinants of the observed gradual downgrading of expert opinion on complicated issues and the decreasing trust in science. Even without fake news, combining the internet’s ease of forming networks with (a) individual biases, such as confirmation bias or assimilation bias, and (b) people’s tendency to align their actions with those of peers, produces populist and polarization network dynamics. Homophily leads to actions with more weight on biases and less weight on expert opinion, and such actions lead to more homophily.
Exploiting heterogeneity in U.S. firms' exposure to an unconventional monetary policy shock that reduced debt financing costs, I identify the impact of financing conditions on firms' toxic emissions. I find robust evidence that lower financing costs reduce toxic emissions and boost investments in emission reduction activities, especially capital-intensive pollution control activities. The effect is stronger for firms in noncompliance with environmental regulation. Examining firms' ability to regain regulatory compliance by implementing pollution control activities, I find that only capital-intensive activities help firms regain compliance. These findings underscore the impact of firms' financing conditions on emissions and the environment.
We show that "quasi-dark" trading venues, i.e., markets with somewhat non-transparent trading mechanisms, are important parts of modern equity market structure alongside lit markets and dark pools. Using the European MiFID II regulation as a quasi-natural experiment, we find that dark pool bans lead to (i) volume spill-overs into quasi-dark trading mechanisms including periodic auctions and order internalization systems; (ii) little volume returning to transparent public markets; and consequently, (iii) a negligible impact on market liquidity and short-term price efficiency. These results show that quasi-dark markets serve as close substitutes for dark pools and consequently mitigate the effectiveness of dark pool regulation. Our findings highlight the need for a broader approach to transparency regulation in modern markets that takes into consideration the many alternative forms of quasi-dark trading.
We study the effects of market incompleteness on speculation, investor survival, and asset pricing moments when investors disagree about the likelihood of jumps and have recursive preferences. We consider two models. In a model with jumps in aggregate consumption, incompleteness barely matters, since the consumption claim resembles an insurance product against jump risk and effectively reproduces approximate spanning. In a long-run risk model with jumps in the long-run growth rate, market incompleteness affects speculation and investor survival. Jump and diffusive risks are more balanced in importance, and the consumption claim therefore cannot reproduce approximate spanning.
This paper investigates what we can learn from the financial crisis about the link between accounting and financial stability. The picture that emerges ten years after the crisis is substantially different from the picture that dominated the accounting debate during and shortly after the crisis. Widespread claims about the role of fair-value (or mark-to-market) accounting in the crisis have been debunked. However, we identify several other core issues for the link between accounting and financial stability. Our analysis suggests that, going into the financial crisis, banks’ disclosures about relevant risk exposures were relatively sparse. Such disclosures came only later, after major concerns about banks’ exposures had arisen in markets. Similarly, banks delayed the recognition of loan losses. Banks’ incentives seem to drive this evidence, suggesting that reporting discretion and enforcement deserve careful consideration. In addition, bank regulation, through its interlinkage with financial accounting, may have dampened banks’ incentives for corrective actions. Our analysis illustrates that a number of serious challenges remain if accounting and financial reporting are to contribute to financial stability.
We examine the degree to which competition amongst lenders interacts with the cyclicality in lending standards using a simple measure, the average physical distance of borrowers from banks’ branches. We propose that this novel measure captures the extent to which lenders are willing to stretch their lending portfolio. Consistent with this idea, we find a significant cyclical component in the evolution of lending distances. Distances widen considerably when credit conditions are lax and shorten considerably when credit conditions become tighter. Next, we show that a sharp departure from the trend in distance between banks and borrowers is indicative of increased risk taking. Finally, we provide evidence that as competition in banks’ local markets increases, their willingness to make loans at greater distance increases. Since average lending distance is easily measurable, it is potentially a useful measure for bank supervisors.
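The paper's measure is deliberately simple: the average physical distance between a bank's branches and its borrowers. A minimal sketch of how such a measure could be computed is shown below; the function names, data layout, and the use of the great-circle (haversine) formula are assumptions of this sketch, not the paper's actual procedure.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius of 6371 km

def average_lending_distance(loans, branches):
    """loans: iterable of (bank_id, borrower_lat, borrower_lon);
    branches: dict bank_id -> (lat, lon) of the originating branch.
    Returns a dict bank_id -> mean borrower distance in km."""
    totals, counts = {}, {}
    for bank, lat, lon in loans:
        blat, blon = branches[bank]
        totals[bank] = totals.get(bank, 0.0) + haversine_km(lat, lon, blat, blon)
        counts[bank] = counts.get(bank, 0) + 1
    return {bank: totals[bank] / counts[bank] for bank in totals}
```

Tracking this average per bank and year would then give the cyclical distance series the abstract describes.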
Visibility and digital accessibility of the School of Salamanca in a linked open-data environment
(2019)
This paper presents a bibliographic and technological approach to increasing the visibility and accessibility of the work of the School of Salamanca in the current technological state of the web. The objective is to avoid the cultural effect of not acting in this field, for which the authors draw an analogy with Plantin's privilege in 16th-century Spanish printing. The Virtual Library of the School of Salamanca is described as a Linked Open Data resource about the authors of this school and their digitized works, in which the relationships between authors and concepts are crucial. For this purpose, different properties of the DBpedia ontology are used, and the descriptions of the authors are systematically linked to other Linked Open Data resources. All descriptions (authors, works and concepts) are offered in the Europeana Data Model and in MARC 21. The advantages of Wikipedia and Wikidata in increasing visibility are also discussed.
A summary of this text was presented at the international conference organized by the Max Planck Institute for the History of European Law: "The School of Salamanca: A Case of Global Knowledge Production?", held in Buenos Aires from 24th to 26th October 2018.
In this exploratory article, we consider the future of Deutsche Bank and Commerzbank and develop a new approach to the topic: instead of a merger of DB and CB, we propose a partial merger of their IT and related back-office functions in order to create the basis for an Open Banking platform in Germany. Such a platform would act as a cross-institutional infrastructure company in which the participating banks develop a common data and IT platform (while respecting data protection regulations). Significant parts of the transaction processes would be pooled by the institutions and executed by the Open Banking platform. Moreover, the institutions would remain legally independent and compete with each other at the level of products and services that are developed and produced using this common data and IT platform – “national champions” would not be created.
Such an Open Banking platform could even become the nucleus of a European banking platform that could compete with existing global data platforms from the USA and China, which already offer financial services and are likely to expand their offerings in the foreseeable future. The proposed model of an open data platform for banks prevents the emergence of national champions and supports the main goal of the banking union: the creation of a financial system in which individual banks can be resolved without provoking a systemic crisis and forcing taxpayers to finance bailouts.
We study the information flow from the ECB on policy dates since its inception, using tick data. We show that three factors capture essentially all of the variation in the yield curve, but that these are different factors, with different variance shares, in the window that contains the policy decision announcement and the window that contains the press conference. We also show that the QE-related policy factor has been dominant in the recent period and that Forward Guidance and QE effects have been very persistent on the longer end of the yield curve. We further show that the responses of broad and banking stock indices to monetary policy surprises depended on the perceived nature of the surprises. We find no evidence of asymmetric responses of financial markets to positive and negative surprises, in contrast to the literature on asymmetric real effects of monetary policy. Lastly, we show how to implement our methodology for any policy-related news release, such as policymaker speeches. To carry out the analysis, we construct the Euro Area Monetary Policy Event-Study Database (EA-MPD). This database, which contains intraday asset price changes around the policy decision announcement as well as around the press conference, is a contribution in its own right, and we expect it to become the standard in monetary policy research for the euro area.
Decisions under ambiguity depend both on the belief regarding possible scenarios and on the attitude towards ambiguity. This paper investigates exclusively the belief formation and belief updating process under ambiguity, using laboratory experiments. The results show that half of the subjects tend to adopt a simple heuristic strategy when updating beliefs, while the other half seem to partially adopt Bayesian updating. We recover beliefs, represented by distributions of the priors/posteriors. The recoverable initial priors mostly follow a uniform distribution. We also find that subjects on average demonstrate slight pessimism in an ambiguous environment.
The Multilingual Assessment Instrument for Narratives (MAIN) was designed in order to assess narrative skills in children who acquire one or more languages from birth or from early age. MAIN is suitable for children from 3 to 10 years and evaluates both comprehension and production of narratives. Its design allows for the assessment of several languages in the same child, as well as for different elicitation modes: Model Story, Retelling, and Telling. MAIN contains four parallel stories, each with a carefully designed six-picture sequence. The stories are controlled for cognitive and linguistic complexity, parallelism in macrostructure and microstructure, as well as for cultural appropriateness and robustness. The instrument has been developed on the basis of extensive piloting with more than 550 monolingual and bilingual children aged 3 to 10, for 15 different languages and language combinations. Even though MAIN has not been norm-referenced yet, its standardized procedures can be used for evaluation, intervention and research purposes. MAIN is currently available in the following languages: English, Afrikaans, Albanian, Basque, Bulgarian, Croatian, Cypriot Greek, Danish, Dutch, Estonian, Finnish, French, German, Greek, Hebrew, Icelandic, Italian, Lithuanian, Norwegian, Polish, Russian, Spanish, Standard Arabic, Swedish, Turkish, Vietnamese, and Welsh.
Doing safe by doing good : ESG investing and corporate social responsibility in the U.S. and Europe
(2019)
This paper examines the profitability of investing according to environmental, social and governance (ESG) criteria in the U.S. and Europe. Based on data from 2003 to 2017, we show that a portfolio long in stocks with the highest ESG scores and short in those with the lowest scores yields a significantly negative abnormal return. Interestingly, this is caused by the strong positive return of firms with the lowest ESG activity. As we find that increasing ESG scores reduce firm risk (particularly downside risk), this hints at an insurance-like character of corporate social responsibility: Firms with low ESG activity need to offer a corresponding risk premium. The perception of ESG as an insurance can be shown to be stronger in more volatile capital markets for U.S. firms, but not for European firms. Socially responsible investment may therefore be of varying attractiveness in different market phases.
Do household inflation expectations affect consumption-savings decisions? We link survey data on quantitative inflation expectations to administrative data on income and wealth. We document that households with higher inflation expectations save less. Estimating panel data models with year and household fixed effects, we find that a one percentage point increase in a household's inflation expectation over time is associated with a 250-400 euro reduction in the household's change in net worth per year on average. We also document that households with higher inflation expectations are more likely to acquire a car and acquire higher-value cars. In addition, we provide a quantitative model of household-level inflation expectations.
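A panel regression with household and year fixed effects, as described above, amounts to the two-way within transformation. The sketch below is illustrative only (a single-regressor implementation under hypothetical variable names, not the authors' code): both variables are demeaned within households and within years before the slope is computed.

```python
import numpy as np

def twoway_fe_slope(y, x, unit, year, n_iter=50):
    """Slope from regressing y on x while absorbing unit and year fixed
    effects, via iterated demeaning (the within transformation).
    For a balanced panel a single pass suffices; iterating also covers
    unbalanced panels. All variable names here are illustrative."""
    y = np.asarray(y, dtype=float).copy()
    x = np.asarray(x, dtype=float).copy()
    for _ in range(n_iter):
        for g in (unit, year):          # alternate the two margins
            for v in (y, x):
                for code in np.unique(g):
                    mask = g == code
                    v[mask] -= v[mask].mean()
    return (x @ y) / (x @ x)            # within-estimator slope
```

With inflation expectations as `x` and the change in net worth as `y`, the slope corresponds to the within-household association the abstract reports.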
We propose a simple modification of the time series filter by Hamilton (2018) that yields reliable and economically meaningful real-time output gap estimates. The original filter relies on 8-quarter-ahead forecast errors of a simple autoregression of log real GDP. While this approach yields a cyclical component of GDP that is hardly revised as new data arrive, thanks to the one-sided filtering, it does not cover typical business cycle frequencies evenly: short business cycles are muted and medium-length business cycles are amplified. Further, the estimated trend is as volatile as GDP itself and can thus hardly be interpreted as potential GDP. A simple modification based on the mean of the 4- to 12-quarter-ahead forecast errors shares the favorable real-time properties of the Hamilton filter but covers typical business cycle frequencies much better and yields a smooth estimated trend. Based on output growth and inflation forecasts and a comparison with revised output gap estimates from policy institutions, we find that real-time output gaps based on the modified Hamilton filter are economically much more meaningful measures of the business cycle than those based on other simple statistical trend-cycle decompositions such as the HP or bandpass filter.
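The original filter and the proposed modification can be sketched in a few lines. The sketch below is illustrative, not the authors' code; the AR specification with four lags and the 4- to 12-quarter horizons follow the description above, everything else is an assumption.

```python
import numpy as np

def hamilton_cycle(y, h=8, p=4):
    """Hamilton (2018) filter: the cycle at t+h is the error from
    forecasting y[t+h] with a constant and the p most recent values
    y[t], ..., y[t-p+1]. Returns the cycle for observations
    p-1+h, ..., T-1."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    n = T - h - p + 1                       # usable observations
    X = np.column_stack(
        [np.ones(n)] + [y[p - 1 - j : T - h - j] for j in range(p)]
    )
    target = y[p - 1 + h :]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta                # one-sided cyclical component

def modified_hamilton_cycle(y, horizons=range(4, 13), p=4):
    """Proposed modification: average the h-step forecast errors over
    h = 4, ..., 12 instead of using h = 8 alone."""
    cycles = [hamilton_cycle(y, h=h, p=p) for h in horizons]
    n = min(len(c) for c in cycles)         # all cycles end at obs. T-1
    return np.mean([c[-n:] for c in cycles], axis=0)
```

Applied to quarterly log real GDP, the averaged forecast errors give the cyclical component; the trend is simply the difference between GDP and this cycle.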
We analyze cyclical co-movement in credit, house prices, equity prices, and long-term interest rates across 17 advanced economies. Using a time-varying multi-level dynamic factor model and more than 130 years of data, we analyze the dynamics of co-movement at different levels of aggregation and compare recent developments to earlier episodes such as the early era of financial globalization from 1880 to 1913 and the Great Depression. We find that joint global dynamics across various financial quantities and prices, as well as variable-specific global co-movements, are important for explaining fluctuations in the data. From a historical perspective, global co-movement in financial variables is not a new phenomenon, but its importance has increased for some variables since the 1980s. For equity prices, global cycles currently play a historically unprecedented role, explaining more than half of the fluctuations in the data. Global cycles in credit and housing have become much more pronounced and longer, but their importance in explaining dynamics has increased only for some economies, including the US, the UK and Nordic European countries. We also include GDP in the analysis and find an increasing role for a global business cycle.
There is substantial disagreement about the consequences of the Tax Cuts and Jobs Act (TCJA) of 2017, which constitutes the most extensive tax reform in the United States in more than 30 years. Using a large-scale two-country dynamic general equilibrium model with nominal rigidities, we find that the TCJA increases GDP by about 2% in the medium run and by about 2.5% in the long run. The short-run impact depends crucially on the degree and costs of variable capital utilization, with GDP effects ranging from 1 to 3%. At the same time, the TCJA does not pay for itself. In our analysis, the reform decreases tax revenues and raises the debt-to-GDP ratio by about 15 percentage points in the medium run, until 2025. We show that combining the TCJA with spending cuts can dampen the increase in government indebtedness without reducing its expansionary effect.
We investigate the transmission of central bank liquidity to bank deposits and loan spreads in Europe over the January 2006 to June 2010 period. We find evidence consistent with an impaired transmission channel due to bank risk. Central bank liquidity does not translate into lower loan spreads for high-risk banks, even as it lowers deposit rates for both high-risk and low-risk banks. This adversely affects the balance sheets of high-risk bank borrowers, leading to lower payouts, lower capital expenditures, and lower employment. Overall, our results suggest that banks’ capital constraints at the time of an easing of monetary policy pose a challenge to the effectiveness of the bank lending channel and the effectiveness of the central bank as a lender of last resort.
Job loss expectations, durable consumption and household finances : evidence from linked survey data
(2019)
Job security is important for durable consumption and household savings. Using surveys, workers express a probability that they will lose their job in the next 12 months. In order to assess the empirical content of these probabilities, we link survey data to administrative data with labor market outcomes. Workers predict job loss quite well, in particular those whose job loss is followed by unemployment. Workers with higher job loss expectations acquire cheaper cars, and are less likely to buy new cars. In line with models of precautionary saving, higher job loss expectations are associated with more savings and less exposure to risky assets.
We study how the informativeness of stock prices changes with the presence of high-frequency trading (HFT). Our estimate is based on the staggered start of HFT participation in a panel of international exchanges. With HFT presence, market prices are a less reliable predictor of future cash flows and investment, even more so for longer horizons. Further, firm-level idiosyncratic volatility decreases, and the holdings and trades by institutional investors deviate less from the market-capitalization weighted portfolio as a benchmark. Our results document that the informativeness of prices decreases subsequent to the start of HFT. These findings are consistent with theoretical models of HFTs' ability to anticipate informed order flow, resulting in decreased incentives to acquire fundamental information.
In this note, we first highlight different developments for banks under direct ECB supervision within the SSM that may prompt further investigation by supervisors. We find that banks that were weakly capitalized at the start of direct ECB supervision (1) still face elevated levels of non-performing loans, (2) are less cost-efficient and (3) have reduced their share of subordinated debt financing over the last years. We then stress the importance of continuous and ongoing cost-benefit analysis of banking supervision in Europe. We also encourage processes that question existing supervisory practices to ensure lean and efficient banking supervision. Finally, we underline the need for continuous and intensified coordination among regulatory bodies in the Banking Union, since the efficacy of European bank supervision rests on its interplay with many different institutions.
This document was requested by the European Parliament's Committee on Economic and Monetary Affairs. It was originally published on the European Parliament’s webpage.
Do competition and incentives offered to designated market makers (DMMs) improve market liquidity? Using data from NYSE Euronext Paris, we show that an exogenous increase in competition among DMMs leads to a significant decrease in quoted and effective spreads, mainly through a reduction in adverse selection costs. In contrast, changes in incentives, through small changes in rebates and requirements for DMMs, do not have any tangible effect on market liquidity. Our results are of relevance for designing optimal contracts between exchanges and DMMs and for regulatory market oversight.
We show that banks that are facing relatively high locally non-diversifiable risks in their home region expand more across states than banks that do not face such risks following branching deregulation in the 1990s and 2000s. These banks with high locally non-diversifiable risks also benefit relatively more from deregulation in terms of higher bank stability. Further, these banks expand more into counties where risks are relatively high and positively correlated with risks in their home region, suggesting that they do not only diversify but also build on their expertise in local risks when they expand into new regions.
Self-control failure is among the major pathologies (Baumeister et al. (1994)) affecting individual investment decisions which has hardly been measurable in empirical research. We use cigarette addiction identified from checking account transactions to proxy for low self-control and compare over 5,000 smokers to 14,000 nonsmokers. Smokers self-directing their investment trade more frequently, exhibit more biases and achieve lower portfolio returns. We also find that smokers, some of which might be aware of their limited levels of self-control, exhibit a higher propensity than nonsmokers to delegate decision making to professional advisors and fund managers. We document that such precommitments work successfully.
We explore space improvements in LRP, a polymorphically typed call-by-need functional core language. A relaxed space measure is chosen for the maximal space used during an evaluation. It abstracts from the details of the implementation via abstract machines, but takes garbage collection into account and can thus be seen as a realistic approximation of space usage. The results are: a context lemma for space-improving translations and for space equivalences; all but one reduction rule of the calculus are shown to be space improvements, and the exceptional one, the copy rule, is shown to increase space only moderately.
Several further program transformations are shown to be space improvements or space equivalences; in particular, the translation into machine expressions is a space equivalence. These results are a step forward in making predictions about the change in runtime space behavior of optimizing transformations in call-by-need functional languages.
The synchronous pi-calculus is translated into a core language of Concurrent Haskell extended by futures (CHF). The translation simulates the synchronous message-passing of the pi-calculus by sending messages and adding synchronization using Concurrent Haskell's mutable shared-memory locations (MVars). The semantic criterion is a contextual semantics of the pi-calculus and of CHF using may- and should-convergence as observations. The results are equivalence with respect to the observations, full abstraction of the translation of closed processes, and adequacy of the translation on open processes. The translation transports the semantics of the pi-calculus processes under rather strong criteria, since error-free programs are translated into error-free ones, and programs without non-deterministic error possibilities are also translated into programs without non-deterministic error-possibilities. This investigation shows that CHF embraces the expressive power and the concurrency capabilities of the pi-calculus.
Distributed ledger technology, especially in the form of publicly coordinated validation networks such as Ethereum and Bitcoin with their own monetary circuits, provides a revealing litmus test for current financial regulatory schemes. The paper highlights the interrelation between distributed coordination and the emission of virtual currency to make sense of the function of the new monetary phenomenon. It then argues for the regulation of financial services on the ground of the technology to ensure integrity standards. In this respect, it is useful to gear the development of a regulatory scheme towards the existing financial regulatory principles. However, future measures of the regulators must take the distributed nature of the platforms into account by relying on a “regulated self-regulation” of the community. Finally, the article focuses on the shortcomings of the current EU regulatory regimes, especially the regulatory frameworks for financial services, payment services and electronic money.
Exploiting the natural experiment of German reunification, we examine how consumers adapt to a new environment in their macroeconomic forecasting. We document that East Germans expect higher inflation and make larger forecast errors than West Germans even decades after reunification. Differences in consumption baskets, financial literacy, risk aversion or trust in the central bank cannot fully account for these patterns. We find most support for the explanation that East Germans, who were used to a strong norm of zero inflation, persistently overadjusted the level of their expectations in the face of the initial inflation shock in reunified Germany. Our findings suggest that large changes in the economic environment can permanently impede people's ability to form accurate macroeconomic expectations, with an important role for the interaction of old norms and new experiences around the event.
Policymakers attach an important role to the macroeconomic outlook of households. Using a representative online panel from the U.S., the authors provide individuals with different professional forecasts about the likelihood of a recession and examine how individuals' macroeconomic expectations causally affect their personal economic prospects and their behavior. The authors find that groups with the largest exposure to aggregate risk, such as individuals working in cyclical industries, are most likely to respond to an improved macroeconomic outlook, while a large fraction of the population is unlikely to react.
This paper uses unique administrative data and a quasi-field experiment of exogenous allocation in Sweden to estimate medium- and longer-run effects on financial behavior from exposure to financially literate neighbors. It contributes evidence of causal impact of exposure and of a social multiplier of financial knowledge, but also of unfavorable distributional aspects of externalities. Exposure promotes saving in private retirement accounts and stockholding, especially when neighbors have economics or business education, but only for educated households and when interaction possibilities are substantial. Findings point to transfer of knowledge rather than mere imitation or effects through labor, education, or mobility channels.
The recent sovereign debt crisis in the Eurozone was characterized by a monetary policy constrained by the zero lower bound (ZLB) on nominal interest rates, and by several countries facing high risk spreads on their sovereign bonds. How is the government spending multiplier affected by such an economic environment? While prominent results in the academic literature point to high government spending multipliers at the ZLB, higher public indebtedness is often associated with small government spending multipliers. I develop a DSGE model with leverage-constrained banks that captures both features of this economic environment, the ZLB and fiscal stress. In this model, I analyze the effects of government spending shocks. I find that not only are multipliers large at the ZLB, but the presence of fiscal stress can even increase their size. For longer durations of the ZLB, multipliers in this model can be considerably larger than one.
JEL Classification: E32, E44, E62
Recently, Fuest and Sinn (2018) have demanded a change of rules for the Eurozone’s Target 2 payment system, claiming it would violate the Statutes of the European System of Central Banks and of the European Central Bank. The authors present a stylized model based on a set of macro-economic assumptions, and show that Target 2 may lead to loss sharing among national central banks (NCBs), thus violating the no risk-sharing requirement laid out by the Eurosystem Statutes.
In this note, I present an augmented model that incorporates essential features of the micro- and macroprudential regulatory and supervisory regime that today is hard-wired into Europe’s banking system. The model shows that the original no-risk-sharing principle is not necessarily violated during a financial crisis of a member state. Moreover, it shows that under a banking union regime, financial crisis asset value losses at or below the 99.9th percentile are borne by private investors, not by taxpayers, and particularly not by central banks.
Therefore, policy conclusions from the micro-founded model differ significantly from those suggested by Fuest and Sinn (2018).
We propose a shrinkage and selection methodology specifically designed for network inference from high-dimensional data, using a regularised linear regression model with a Spike-and-Slab prior on the parameters. The approach extends to the case where the error terms are heteroscedastic by adding an ARCH-type equation through an approximate Expectation-Maximisation algorithm. The proposed model accounts for two sets of covariates. The first set contains predetermined variables which are not penalised in the model (i.e., the autoregressive component and common factors), while the second set contains all the (lagged) financial institutions in the system, each included with a given probability. The financial linkages are expressed in terms of inclusion probabilities, resulting in a weighted directed network whose adjacency matrix is built "row by row". In the empirical application, we estimate the network over time using a rolling-window approach on 1248 world financial firms (banks, insurers, brokers and other financial services), both active and dead, from 29 December 2000 to 6 October 2017 at a weekly frequency. Findings show that over time the shape of the out-degree distribution exhibits the typical behavior of financial stress indicators and is a significant predictor of market returns at the first lag (one week) and the fourth lag (one month).
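The row-by-row construction of the weighted directed network from inclusion probabilities can be sketched in a few lines. Everything below is an invented toy illustration, not the paper's estimation procedure: the 5-by-5 size, the uniform random "inclusion probabilities" (which the paper instead obtains from the Spike-and-Slab EM algorithm), and the 0.5 cut-off for drawing a link are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5  # number of financial institutions (toy size)
# Hypothetical posterior inclusion probabilities: entry (i, j) is the
# probability that lagged institution j enters institution i's equation.
incl_prob = rng.uniform(size=(n, n))
np.fill_diagonal(incl_prob, 0.0)  # no self-links

# Build the weighted directed adjacency matrix row by row: keep a link
# only if its inclusion probability exceeds an (assumed) 0.5 threshold,
# and use the probability itself as the edge weight.
threshold = 0.5
adjacency = np.where(incl_prob > threshold, incl_prob, 0.0)

# Out-degree of institution j: number of peers whose equations include j.
out_degree = (adjacency > 0).sum(axis=0)
print(out_degree)
```

In the paper, the analogous matrix is re-estimated on each rolling window, and the evolution of the out-degree distribution over the windows is what tracks financial stress.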
Extending the data set used in Beyer (2009) to 2017, we estimate I(1) and I(2) money demand models for euro area M3. After including two broken trends and a few dummies to account for shifts in the variables following the global financial crisis and the ECB's non-standard monetary policy measures, we find that the money demand and the real wealth relations identified in Beyer (2009) have remained remarkably stable throughout the extended sample period. Testing for price homogeneity in the I(2) model we find that the nominal-to-real transformation is not rejected for the money relation whereas the wealth relation cannot be expressed in real terms.
This paper examines how networks of professional contacts contribute to the development of the careers of executives of North American and European companies. We build a dynamic model of career progression in which career moves may both depend upon existing networks and contribute to the development of future networks. We test the theory on an original dataset of nearly 73,000 executives in over 10,000 firms. In principle, professional networks could be relevant both because they are rewarded by the employer and because they facilitate job mobility. Our econometric analysis suggests that, although there is a substantial positive correlation between network size and executive compensation, with an elasticity of around 20%, almost all of this is due to unobserved individual characteristics. The true causal impact of networks on compensation is closer to an elasticity of 1 or 2% on average, all of it due to an enhanced probability of moving to a higher-paid job. And there appear to be strongly diminishing returns to network size.
Using a unique confidential contract level dataset merged with firm-level asset price data, we find robust evidence that firms' stock market valuations and employment levels respond more to monetary policy announcements the higher the degree of wage rigidity. Data on the renegotiations of collective bargaining agreements allow us to construct an exogenous measure of wage rigidity. We also find that the amplification induced by wage rigidity is stronger for firms with high labor intensity and low profitability, providing evidence of distributional consequences of monetary policy. We rationalize the evidence through a model in which firms in different sectors feature different degrees of wage rigidity due to staggered renegotiations vis-a-vis unions.
This paper analyzes the effect of financial constraints on firms' corporate social responsibility. Exploiting heterogeneity in firms' exposure to a monetary policy shock in the U.S., which reduced financial constraints for some firms, I find that firms increase their environmental responsibility. I use facility-level data to account for unobservable time-varying influences on pollution and find that toxic emissions decrease when parent companies are more exposed to the monetary policy shock. I further find that these facilities are also more likely to implement pollution abatement activities. Examining within-parent company heterogeneity I find that pollution abatement investments center on facilities at greater risk of facing additional costs due to environmental regulation. The findings are consistent with the idea that a reduction in financial constraints reduces pollution as it allows firms to implement pollution abatement measures.
Households buy life insurance as part of their liquidity management. The option to surrender such a policy can serve as a buffer when a household faces a liquidity need. In this study, we investigate empirically which individual and household-specific sociodemographic factors influence the surrender behavior of life insurance policyholders. Based on the Socio-Economic Panel (SOEP), an ongoing wide-ranging representative longitudinal study of around 11,000 private households in Germany, we construct a proxy to identify life insurance surrender in the data. We use this proxy to conduct fixed-effects regressions and support the results with survival analyses. We find that life events that possibly impose a liquidity shock on the household, such as the birth of a child or a divorce, increase the likelihood of surrendering an existing life insurance policy for an average household in the panel. The acquisition of a dwelling and unemployment are further aspects that can foster life insurance surrender. Our results are robust with respect to different models and hold when conditioning on region-specific trends; they vary, however, across age groups. Our analyses contribute to the existing literature supporting the emergency fund hypothesis. The findings obtained in this study can help life insurers and regulators detect and understand industry-specific challenges of demographic change.
Higher capital ratios are believed to improve system-wide financial stability through three main channels: (i) higher loss-absorption capacity, (ii) lower moral hazard, (iii) stabilization of the financial cycle if capital ratios are increased during good times. We examine these mechanisms in a laboratory asset market experiment with indebted participants. We find support for the loss-absorption channel: higher capital ratios reduce the bankruptcy rate. However, we do not find support for the moral hazard channel. Higher capital ratios (insignificantly) increase asset price bubbles, an aggregate measure of excessive risk-taking. Additional evidence suggests that bankruptcy aversion explains this surprising result. Finally, the evidence supports the idea that higher capital ratios in good times stabilize the financial cycle.
Whither artificial intelligence? Debating the policy challenges of the upcoming transformation
(2018)
The School of Salamanca, and Iberian late Scholasticism in general, had the merit of transposing the wisdom of medieval scholasticism into the coordinates of early modernity. Due to the economic growth after the discovery of America, economic terms and moral problems become a central focus for moral theologians. In this article, I consider important key economic concepts that deliver a surprising wealth of insights into the modernization brought about by the leading scholars of the time. Social mobility, the principle of majority decision, the inviolability of property, human rights of the person, limited political power of the pope, and other key concepts that were decisive for the development of democracy and modernity are to be found in the works of the School of Salamanca in connection with economic issues.
Distributed ledger technologies rely on consensus protocols confronting traders with random waiting times until the transfer of ownership is accomplished. This time consuming settlement process exposes arbitrageurs to price risk and imposes limits to arbitrage. We derive theoretical arbitrage boundaries under general assumptions and show that they increase with expected latency, latency uncertainty, spot volatility, and risk aversion. Using high-frequency data from the Bitcoin network, we estimate arbitrage boundaries due to settlement latency of on average 124 basis points, covering 88% of the observed cross-exchange price differences. Settlement through decentralized systems thus induces non-trivial frictions affecting market efficiency and price formation.
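The comparative statics described above, that the arbitrage boundary widens with expected latency, latency uncertainty, spot volatility, and risk aversion, can be illustrated with a toy calculation. The functional form and every number below are assumptions made for illustration; they are not the bound derived in the paper, only a sketch that reproduces its qualitative behaviour.

```python
import math

def arbitrage_boundary(spot_vol, exp_latency, latency_var, risk_aversion):
    """Hypothetical minimum cross-exchange price gap (in return units) at
    which an arbitrageur trades despite bearing price risk over the random
    settlement delay. Illustrative form only: the gap scales with risk
    aversion and with the volatility accumulated over the expected latency
    plus a penalty for latency uncertainty."""
    return risk_aversion * spot_vol * math.sqrt(exp_latency + latency_var)

# A trader facing roughly ten minutes of expected settlement latency
# (all parameter values are made up for the example).
base = arbitrage_boundary(spot_vol=0.02, exp_latency=10.0,
                          latency_var=4.0, risk_aversion=1.5)
```

Raising any one of the four inputs widens the boundary, which is the qualitative pattern the paper estimates at an average of 124 basis points in the Bitcoin data.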
Much ado about nothing : a study of differential pricing and liquidity of short and long term bonds
(2018)
Are yields of long-maturity bonds distorted by demand pressure from clientele investors, regulatory effects, or default, flight-to-safety or liquidity premiums? Using data on German nominal bonds between 2005 and 2015, we study the differential pricing and liquidity of short- and long-maturity bonds. We find statistically significant, but economically negligible, segmentation in yields and some degree of liquidity segmentation of short-term versus long-term bonds. These results have important policy implications for the €17.5 trillion European pension and insurance industries: long-maturity bond yields seem appropriate for the valuation of long-term liabilities.
A number of recent studies have concluded that consumer spending patterns over the month are closely linked to the timing of income receipt. This correlation is interpreted as evidence of hyperbolic discounting. I re-examine patterns of spending in the diary sample of the U.S. Consumer Expenditure Survey, incorporating information on the timing of the main consumption commitment for most households - their monthly rent or mortgage payment. I find that non-durable and food spending increase by 30-48% on the day housing payments are made, with smaller increases in the days after. Moreover, households with weekly, biweekly and monthly income streams but the same timing of rent/mortgage payments have very similar consumption patterns. Exploiting variation in income, I find that households with extra liquidity decrease non-durable spending around housing payments, especially those households with a large budget share of housing.
A recent US Treasury regulation allowed deferred longevity income annuities to be included in pension plan menus as a default payout solution, yet little research has investigated whether more people should convert some of the $15 trillion they hold in employer-based defined contribution plans into lifelong income streams. We investigate this innovation using a calibrated lifecycle consumption and portfolio choice model embodying realistic institutional considerations. Our welfare analysis shows that defaulting a small portion of retirees’ 401(k) assets (over a threshold) is an attractive way to enhance retirement security, raising welfare by up to 20% of retiree plan accruals.
We provide the first partner tenure and rotation analysis for a large cross-section of U.S. publicly listed firms over an extended period. We analyze the effects on audit quality as well as economic tradeoffs with respect to audit hours and fees. On average, we find no evidence for audit quality declines over the tenure cycle and, consistent with the former, little support for fresh-look benefits after five-year mandatory rotations. Nevertheless, partner rotations have significant economic consequences. We find increases in audit fees and decreases in audit hours over the tenure cycle, which differ by partner experience, client size, and competitiveness of the local audit market. Our findings are consistent with efforts by the audit firms to minimize disruptions and audit failures around mandatory rotations. We also analyze special circumstances, such as audit firm or audit team switches and early partner rotations. We show that these situations are more disruptive and more likely to exhibit audit quality effects. In particular, we find that low quality audits give rise to early engagement partner rotations and in this sense have (career) consequences for partners.
Manipulative communications touting stocks are common in capital markets around the world. Although the price distortions created by so-called “pump-and-dump” schemes are well known, little is known about the investors in these frauds. By examining 421 “pump-and-dump” schemes between 2002 and 2015 and a proprietary set of trading records for over 110,000 individual investors from a major German bank, we provide evidence on the participation rate, magnitude of the investments, losses, and the characteristics of the individuals who invest in such schemes. Our evidence suggests that participation is quite common and involves sizable losses, with nearly 6% of active investors participating in at least one “pump-and-dump” and an average loss of nearly 30%. Moreover, we identify several distinct types of investors, some of which should not be viewed as falling prey to these frauds. We also show that portfolio composition and past trading behavior can better explain participation in touted stocks than demographics. Our analysis offers insights into the challenges associated with designing effective investor protection against market manipulation.
An important question in banking is how strict supervision affects bank lending and in turn local business activity. Forcing banks to recognize losses could choke off lending and amplify local economic woes, especially after financial crises. But stricter supervision could also lead to changes in how banks assess loans and manage their loan portfolios. Estimating such effects is challenging. We exploit the extinction of the thrift regulator (OTS) – a large change in prudential supervision, affecting ten percent of all U.S. depository institutions. Using this event, we analyze economic links between strict supervision, bank lending and business activity. We first show that the OTS replacement indeed resulted in stricter supervision of former OTS banks. We then analyze the lending effects of this regulatory change and show that former OTS banks increase small business lending by approximately 10 percent. This increase stems primarily from well capitalized banks and those more affected by the new regime. These findings suggest that stricter supervision operates not only through capital but can also overcome frictions in bank management, leading to more lending and a reallocation of loans. Consistent with the latter, we find increases in business entry and exit in counties with greater exposure to OTS banks.
The use of evidence and economic analysis in policymaking is on the rise, and accounting standard setting and financial regulation are no exception. This article discusses the promise of evidence-based policymaking in accounting and financial markets as well as the challenges and opportunities for research supporting this endeavor. In principle, using sound theory and robust empirical evidence should lead to better policies and regulations. But despite its obvious appeal and substantial promise, evidence-based policymaking is easier demanded than done. It faces many challenges related to the difficulty of providing relevant causal evidence, lack of data, the reliability of published research, and the transmission of research findings. Overcoming these challenges requires substantial infrastructure investments for generating and disseminating relevant research. To illustrate this point, I draw parallels to the rise of evidence-based medicine. The article provides several concrete suggestions for the research process and the aggregation of research findings if scientific evidence is to inform policymaking. I discuss how policymakers can foster and support policy-relevant research, chiefly by providing and generating data. The article also points to potential pitfalls when research becomes increasingly policy-oriented.
We examine whether the economy can be insured against banking crises with deposit and loan contracts contingent on macroeconomic shocks. We study banking competition and show that the private sector insures the banking system through such contracts, and banking crises are avoided, provided that failed banks are not bailed out. When risks are large, banks may shift part of them to depositors. In contrast, when banks are bailed out by the next generation, depositors receive non-contingent contracts with high interest rates, while entrepreneurs obtain loan contracts that demand high repayment in good times and low repayment in bad times. As a result, the present generation overinvests, and banks generate large macroeconomic risks for future generations, even if the underlying productivity risk is small or zero. We conclude that a joint policy package of orderly default procedures and contingent contracts is a promising way to reduce the threat of a fragile banking system.
Following the introduction of the one-child policy in China, the capital-labor (K/L) ratio of China increased relative to that of India, and, simultaneously, FDI inflows relative to GDP for China versus India declined. These observations are explained in the context of a simple neoclassical OLG paradigm. The adjustment mechanism works as follows: the reduction in the growth rate of the (urban) labor force due to the one-child policy permanently increases the capital per worker inherited from the previous generation. The resulting increase in China's (domestic K)/L thus "crowds out" the need for FDI in China relative to India. Our paper is a contribution to the nascent literature exploring demographic transitions and their effects on FDI flows.
Based on OECD evidence, equity/housing-price busts and credit crunches are followed by substantial increases in public consumption. These increases in unproductive public spending lead to increases in distortionary marginal taxes, a policy in sharp contrast with presumably optimal Keynesian fiscal stimulus after a crisis. Here we claim that this seemingly adverse policy selection is optimal under rational learning about the frequency of rare capital-value busts. Bayesian updating after a bust implies massive belief jumps toward pessimism, with investors and policymakers believing that busts will be arriving more frequently in the future. Lowering taxes would be as if trying to kick a sick horse in order to stand up and run, since pessimistic markets would be unwilling to invest enough under any temporarily generous tax regime.
We present empirical evidence on the heterogeneity in monetary policy transmission across countries with different home ownership rates. We use household-level data together with shocks to the policy rate identified from high-frequency data. We find that housing tenure reacts more strongly to unexpected changes in the policy rate in Germany and Switzerland, the OECD countries with the lowest home ownership rates, compared with existing evidence for the U.S. An unexpected decrease in the policy rate by 25 basis points increases the home ownership rate by 0.8 percentage points in Germany and by 0.6 percentage points in Switzerland. The response of non-housing consumption in Switzerland is less heterogeneous across renters and mortgagors, and has a different pattern across age groups than in the U.S. We discuss economic explanations for these findings and implications for monetary policy.
In 1983, Brian Henderson published an article that examined various types of narrative structure in film, including flashbacks and flashforwards. After analyzing a whole spectrum of techniques capable of effecting a transition between past and present – blurs, fades, dissolves, and so on – he concluded: "Our discussions indicate that cinema has not (yet) developed the complexity of tense structures found in literary works". His "yet" (in parentheses) was an instance of laudable caution, as very soon – in some ten–fifteen years – the situation would change drastically, and temporal twists would become a trademark of a new genre that has not (yet) acquired a standardized name: "modular narratives", "puzzle films", and "complex films" are among the labels used.
Asset transaction prices sampled at high frequency are much staler than one might expect, in the sense that they frequently lack new updates and thus show zero returns. In this paper, we propose a theoretical framework for formalizing this phenomenon. It hinges on the existence of a latent continuous-time stochastic process pt valued in the open interval (0, 1), which represents at any point in time the probability of the occurrence of a zero return. Using a standard infill asymptotic design, we develop an inferential theory for nonparametrically testing the null hypothesis that pt is constant over one day. Under the alternative, which encompasses a semimartingale model for pt, we develop non-parametric inferential theory for the probability of staleness that includes the estimation of various integrated functionals of pt and its quadratic variation. Using a large dataset of stocks, we provide empirical evidence that the null of a constant probability of staleness is strongly rejected. We then show that the variability of pt is mainly driven by transaction volume and is almost unaffected by bid-ask spread and realized volatility.
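A crude finite-sample version of the idea, comparing zero-return frequencies across intraday blocks, can be sketched as follows (a deliberate simplification of the paper's formal infill-asymptotics test):

```python
# Sketch (an assumed simplification, not the paper's procedure): split the
# trading day into blocks, estimate the zero-return probability in each block,
# and flag time variation in p_t when the block estimates disperse by more
# than binomial sampling noise under a constant p would allow.

def block_zero_freqs(returns, n_blocks):
    """Fraction of exactly-zero returns in each intraday block."""
    size = len(returns) // n_blocks
    return [
        sum(r == 0 for r in returns[i * size:(i + 1) * size]) / size
        for i in range(n_blocks)
    ]

# Toy day: staleness rises from 20% in the morning to 80% near the close.
returns = [0, 1, 1, 1, 1] * 40 + [0, 0, 0, 0, 1] * 40
freqs = block_zero_freqs(returns, n_blocks=2)
assert freqs[0] < 0.3 < 0.7 < freqs[1]  # clear time variation in p_t
```

The paper's actual test replaces this informal comparison with a rigorous limit theory as the sampling interval shrinks, but the intuition is the same: under the null, block-level zero-return frequencies should agree up to sampling error.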
Through the lens of market participants' objective to minimize counterparty risk, we provide an explanation for the reluctance to clear derivative trades in the absence of a central clearing obligation. We develop a comprehensive understanding of the benefits and potential pitfalls with respect to a single market participant's counterparty risk exposure when moving from a bilateral to a clearing architecture for derivative markets. Previous studies suggest that central clearing is beneficial for single market participants in the presence of a sufficiently large number of clearing members. We show that three elements can render central clearing harmful for a market participant's counterparty risk exposure regardless of the number of its counterparties: 1) correlation across and within derivative classes (i.e., systematic risk), 2) collateralization of derivative claims, and 3) loss sharing among clearing members. Our results have substantial implications for the design of derivatives markets, and highlight that recent central clearing reforms might not incentivize market participants to clear derivatives.
A tale of one exchange and two order books : effects of fragmentation in the absence of competition
(2018)
Exchanges nowadays routinely operate multiple, almost identically structured limit order markets for the same security. We study the effects of such fragmentation on market performance using a dynamic model where agents trade strategically across two identically-organized limit order books. We show that fragmented markets, in equilibrium, offer higher welfare to intermediaries at the expense of investors with intrinsic trading motives, and lower liquidity than consolidated markets. Consistent with our theory, we document improvements in liquidity and lower profits for liquidity providers when Euronext, in 2009, consolidated its order flow for stocks traded across two country-specific and identically-organized order books into a single order book. Our results suggest that competition in market design, not fragmentation, drives previously documented improvements in market quality when new trading venues emerge; in the absence of such competition, market fragmentation is harmful.
This paper presents new evidence on the expectation formation process from a Dutch household survey. Households become too optimistic about their future income after their income has improved, consistent with over-extrapolation of their experience. We show that this effect of experience is persistent and that households over-extrapolate income losses more than income gains. Furthermore, older households over-extrapolate more, suggesting that they have not learned over time to form more accurate expectations. Finally, we study the relationship between expectation errors and consumption. We find that more over-optimistic households intend to consume more and subsequently report higher consumption, even though they do not consume as much as they intended to. These results suggest that over-extrapolation hurts consumers and amplifies business cycles.
Popularity/Prestige
(2018)
What is the canon? Usually this question is just a proxy for something like, "Which works are in the canon?" But the first question is not just a concise version of the second, or at least it doesn’t have to be. Instead, it can ask what the structure of the canon is - in other words, when things are in the canon, what are they in? This question came to the fore during the project that resulted in Pamphlet 11. The members of that group were looking for morphological differences between the canon and the archive. The latter they define, straightforwardly and capaciously, as "that portion of published literature that has been preserved—in libraries and elsewhere". The canon is a slipperier concept; the authors speak instead of multiple canons, like the books preserved in the Chadwyck-Healey Nineteenth-Century Fiction Collection, the constituents of the six different "best-twentieth century novels" lists analyzed by Mark Algee-Hewitt and Mark McGurl in Pamphlet 8, authors included in the British Dictionary of National Biography, and so forth. [...] This last conundrum points the way out of these difficulties and into a workable model of the structure of the canon. It suggests two different ways of entering the canon: being read by many and being prized by an elite few—or, to use the terms arrived at in Pamphlet 11, popularity and prestige. With these two dimensions, we arrive at a canonical space [...].
The propagation of regional shocks in housing markets: evidence from oil price shocks in Canada
(2018)
Shocks to the demand for housing that originate in one region may seem important only for that regional housing market. We provide evidence that such shocks can also affect housing markets in other regions. Our analysis focuses on the response of Canadian housing markets to oil price shocks. Oil price shocks constitute an important source of exogenous regional variation in income in Canada because oil production is highly geographically concentrated. We document that, at the national level, real oil price shocks account for 11% of the variability in real house price growth over time. At the regional level, we find that unexpected increases in the real price of oil raise housing demand and real house prices not only in oil-producing regions, but also in other regions. We develop a theoretical model of the propagation of real oil price shocks across regions that helps understand this finding. The model differentiates between oil-producing and non-oil-producing regions and incorporates multiple sectors, trade between provinces, government redistribution, and consumer spending on fuel. We empirically confirm the model prediction that oil price shocks are propagated to housing markets in non-oil-producing regions by the government redistribution of oil revenue and by increased interprovincial trade.
We analytically characterize optimal monetary policy for an augmented New Keynesian model with a housing sector. In a setting where the private sector has rational expectations about future housing prices and inflation, optimal monetary policy can be characterized without making reference to housing price developments: commitment to a 'target criterion' that refers to inflation and the output gap only is optimal, as in the standard model without a housing sector. When the policymaker is concerned with potential departures of private sector expectations from rational ones and seeks to choose a policy that is robust against such possible departures, then the optimal target criterion must also depend on housing prices. In the empirically realistic case where housing is subsidized and where monopoly power causes output to fall short of its optimal level, the robustly optimal target criterion requires the central bank to 'lean against' housing prices: following unexpected housing price increases, policy should adopt a stance that is projected to undershoot its normal targets for inflation and the output gap, and similarly aim to overshoot those targets in the case of unexpected declines in housing prices. The robustly optimal target criterion does not require that policy distinguish between 'fundamental' and 'non-fundamental' movements in housing prices.
We establish that the labor market helps discipline asset managers via the impact of fund liquidations on their careers. Using hand-collected data on 1,948 professionals, we find that top managers working for funds liquidated after persistently poor relative performance suffer demotion coupled with a significant loss in imputed compensation. Scarring effects are absent when liquidations are preceded by normal relative performance or involve mid-level employees. Seen through the lens of a model with moral hazard and adverse selection, these results can be ascribed to reputation loss rather than bad luck. The findings suggest that performance-induced liquidations supplement compensation-based incentives.
In talent-intensive jobs, workers’ quality is revealed by their performance. This enhances productivity and earnings, but also increases layoff risk. Firms cannot insure workers against this risk if they compete fiercely for talent. In this case, the more risk-averse workers will choose less quality-revealing jobs. This lowers expected productivity and salaries. Public unemployment insurance corrects this inefficiency, enhancing employment in talent-sensitive industries, consistent with international evidence. Unemployment insurance dominates legal restrictions on firms’ dismissals, which penalize more talent-sensitive firms and thus depress expected productivity. Finally, unemployment insurance fosters education, by encouraging investment in risky human capital that enhances talent discovery.
We assess the relationship between finance and growth over the period 1980-2014. We estimate a cross-country growth regression for 48 countries during 20 periods of 15 years starting in 1980 (to 1995) and ending in 1999 (to 2014). We use OLS and IV estimations and we find that: 1) overall financial development had a positive effect on economic growth during all periods of our sample, i.e., we confirm that from 1980 to 2014 financial services provided by the various financial systems were significant (to various degrees) for firm creation, industrial expansion and economic growth; but that, 2) the structure of financial markets was particularly relevant for economic growth until the financial crisis; while 3) the structure of the banking sector played a major role since; and finally that, 4) the legal system is the primary determinant of the effectiveness of the overall financial system in facilitating innovation and growth in (almost) all of our sample period. Hence, overall our results suggest that the relationship between finance and growth matters but also that it varies over time in strength and in sector origination.
JEL Classification: O16, G16, G20.
Motivated by the observation that survey expectations of stock returns are inconsistent with rational return expectations under real-world probabilities, we investigate whether alternative expectations hypotheses entertained in the asset pricing literature are consistent with the survey evidence. We empirically test (1) the notion that survey forecasts constitute rational but risk-neutral forecasts of future returns, and (2) the notion that survey forecasts are ambiguity averse/robust forecasts of future returns. We find that these alternative hypotheses are also strongly rejected by the data, albeit for different reasons. Hypothesis (1) is rejected because survey return forecasts are not in line with risk-free interest rates and because survey expected excess returns are predictable. Hypothesis (2) is rejected because agents are not always pessimistic about future returns, and instead often display overly optimistic return expectations. We speculate as to what kind of expectations theories might be consistent with the available survey evidence.
Europe is a key normative power. Its legitimacy as a force for ensuring the reign of rule of law in international relations is unparalleled. It also packs an economic punch. In data protection and the fight against cybercrime, European norms have been successfully globalized. The time is right to take the next step: Europe must now become the international normative leader for developing a new deal on internet governance. To ensure this, European powers should commit to rules that work in security, economic development and human rights on the internet and implement them in a reinvigorated IGF.
This paper argues that the introduction of the Banking Recovery and Resolution Directive (BRRD) improved market discipline in the European bank market for unsecured debt. The different impact of the BRRD on bank bonds provides a quasi-natural experiment that allows us to study the effect of the BRRD within banks using a difference-in-difference approach. Identification is based on the fact that (otherwise identical) bonds of a given bank maturing before 2016 are explicitly protected from BRRD bail-in. The empirical results are consistent with the hypothesis that debt holders actively monitor banks and that the BRRD diminished bail-out expectations. Bank bonds subject to BRRD bail-in carry a 10 basis point bail-in premium in terms of the yield spread. While there is some evidence that the bail-in premium is more pronounced for non-GSIB banks and banks domiciled in peripheral European countries, weak capitalization is the main driver.
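The identification strategy can be illustrated with the basic difference-in-differences arithmetic (the spread numbers below are made up for illustration; only the 10 bp magnitude echoes the abstract's headline result):

```python
# Hypothetical illustration of the within-bank diff-in-diff: bonds of the
# same bank maturing before 2016 are exempt from BRRD bail-in (control),
# while otherwise identical bonds are bail-in-able (treated). The bail-in
# premium is the change in the yield-spread gap around the BRRD event.

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Classic 2x2 difference-in-differences estimator."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Yield spreads in basis points (invented numbers, for illustration only).
premium = diff_in_diff(treat_pre=120, treat_post=135,
                       control_pre=118, control_post=123)
assert premium == 10  # a 10 bp bail-in premium, matching the headline figure
```

The within-bank design matters: because both bond groups belong to the same issuer, bank-level shocks difference out and only the bail-in exposure remains.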
The authors relax the standard assumption in the dynamic stochastic general equilibrium (DSGE) literature that exogenous processes are governed by AR(1) processes and estimate ARMA(p,q) orders and parameters of exogenous processes. Methodologically, they contribute to the Bayesian DSGE literature by using Reversible Jump Markov Chain Monte Carlo (RJMCMC) to sample from the unknown ARMA orders and their associated parameter spaces of varying dimensions.
In estimating the technology process in the neoclassical growth model using postwar US GDP data, they cast considerable doubt on the standard AR(1) assumption in favor of higher order processes. They find that the posterior concentrates density on hump-shaped impulse responses for all endogenous variables, consistent with alternative empirical estimates and the rigidities behind many richer structural models. Sampling from noninvertible MA representations, a negative response of hours to a positive technology shock is contained within the posterior credible set. While the posterior contains significant uncertainty regarding the exact order, the results are insensitive to the choice of data filter; this contrasts with the authors’ ARMA estimates of GDP itself, which vary significantly depending on the choice of HP or first difference filter.
What institutional arrangements for an independent central bank with a price stability mandate promote good policy outcomes when unconventional policies become necessary? Unconventional monetary policy poses challenges. The large scale asset purchases needed to counteract the zero lower bound on nominal interest rates have uncomfortable fiscal and distributional consequences and require central banks to assume greater risks on their balance sheets.
In his paper, Athanasios Orphanides draws lessons from the experience of the Bank of Japan (BoJ) since the late 1990s for the institutional design of independent central banks. He comes to the conclusion that lack of clarity on the precise definition of price stability, coupled with concerns about the legitimacy of large balance sheet expansions, hinders policy: It encourages the central bank to eschew the decisive quantitative easing needed to reflate the economy and instead to accommodate too-low inflation. The BoJ’s experience with the zero lower bound suggests important benefits from a clear definition of price stability as a symmetric 2% goal for inflation, which the Bank adopted in 2013.
The paper illustrates, with an example, the importance of consistency between the empirical measurement and the concept of variables in estimated macroeconomic models. Since standard New Keynesian models do not account for demographic trends and sectoral shifts, the authors propose adjusting hours worked per capita used to estimate such models accordingly to enhance the consistency between the data and the model. Without this adjustment, low frequency shifts in hours lead to unreasonable trends in the output gap, caused by the close link between hours and the output gap in such models.
The retirement wave of baby boomers, for example, lowers U.S. aggregate hours per capita, which leads to erroneous permanently negative output gap estimates following the Great Recession. After correcting hours for changes in the age composition, the estimated output gap instead closes gradually in the years following the Great Recession.
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank’s interest rate setting can amplify the expectation feedback in the financial market and this can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well-advised to handle tools like asset price targeting with care since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor indices are a good indicator to base decisions on.
The level of capital tax gains has high explanatory power regarding the question of what drives economic inequality. On this basis, the authors develop a simple, yet micro-founded portfolio selection model to explain the dynamics of wealth inequality given empirical tax series in the US. The results emphasize that the level and the transition speed of wealth inequality depend crucially on the degree of capital taxation. The projections predict that – continuing on the present path of capital taxation in the US – the gap between rich and poor is expected to shrink, whereas “massive” tax cuts will further increase the degree of wealth concentration.
We investigate the characteristics of infrastructure as an asset class from an investment perspective of a limited partner. While non-U.S. institutional investors gain exposure to infrastructure assets through a mix of direct investments and private fund vehicles, U.S. investors predominantly invest in infrastructure through private funds. We find that the stream of cash flows delivered by private infrastructure funds to institutional investors is very similar to that delivered by other types of private equity, as reflected by the frequency and amounts of net cash flows. U.S. public pension funds perform worse than other institutional investors in their infrastructure fund investments, although they are exposed to underlying deals with very similar project stage, concession terms, ownership structure, industry, and geographical location. By selecting funds that invest in projects with poor financial performance, U.S. public pension funds have created an implicit subsidy to infrastructure as an asset class, which we estimate within the range of $730 million to $3.16 billion per year depending on the benchmark.
Direct financing of consumer credit by individual investors or non-bank institutions through an implementation of marketplace lending is a relatively new phenomenon in financial markets. The emergence of online platforms has made this type of financial intermediation widely available. This paper analyzes the performance of marketplace lending using proprietary cash flow data for each individual loan from the largest platform, Lending Club. While individual loan characteristics would be important for amateur investors holding a few loans, sophisticated lenders, including institutional investors, usually form broad portfolios to benefit from diversification. We find high risk-adjusted performance of approximately 40 basis points per month for these basic loan portfolios. This abnormal performance indicates that Lending Club, and similar marketplace lenders, are likely to attract capital to finance a growing share of the consumer credit market. In the absence of a competitive response from traditional credit providers, these loans lower costs to the ultimate borrowers and increase returns for the ultimate lenders.
We study the relevance of signaling and marketing as explanations for the discount control mechanisms that a closed-end fund may choose to adopt in its prospectus. These policies are designed to narrow the potential gap between share price and net asset value, measured by the fund’s discount. The two most common discount control mechanisms are explicit discretion to repurchase shares based on the magnitude of the fund discount and mandatory continuation votes that provide shareholders the opportunity to liquidate the fund. We find very limited evidence that a discount control mechanism serves as costly signal of information. Funds with mandatory voting are not more likely to delist than the rest of the CEFs in general or whenever the fund discount is large. Similarly, funds that explicitly discuss share repurchases as a potential response do not subsequently buy back shares more often when discounts do increase. Instead, the existence of these policies is more consistent with marketing explanations because the policies are associated with an increased probability of issuing more equity in subsequent periods.
This paper investigates how biases in macroeconomic forecasts are associated with economic surprises and market responses across asset classes around US data announcements. We find that the skewness of the distribution of economic forecasts is a strong predictor of economic surprises, suggesting that forecasters behave strategically (rational bias) and possess private information. Our results also show that consensus forecasts of US macroeconomic releases embed anchoring. Under these conditions, both economic surprises and the returns of assets that are sensitive to macroeconomic conditions are predictable. Our findings indicate that local equities and bond markets are more predictable than foreign markets, currencies and commodities. Economic surprises are found to link to asset returns very distinctively through the stages of the economic cycle, while they also depend strongly on whether economic releases are inflation- or growth-related. Yet, when forecasters fail to correctly forecast the direction of economic surprises, regret becomes a relevant cognitive bias to explain asset price responses. We find that the behavioral and rational biases encountered in US economic forecasting also exist in Continental Europe, the United Kingdom and Japan, albeit to a lesser extent.
In the secondary art market, artists play no active role. This allows us to isolate cultural influences on the demand for female artists’ work from supply-side factors. Using 1.5 million auction transactions in 45 countries, we document a 47.6% gender discount in auction prices for paintings. The discount is higher in countries with greater gender inequality. In experiments, participants are unable to guess the gender of an artist simply by looking at a painting and they vary in their preferences for paintings associated with female artists. Women's art appears to sell for less because it is made by women.
While record-making prices at art auctions receive headline news coverage, artists typically do not receive any direct proceeds from those sales. Early-stage creative work in any field is perennially difficult to value, but the valuation, reward, and incentivization for artistic labor are particularly fraught. A core challenge in studying the real return on artists’ work is the extreme difficulty accessing data from when an artwork was first sold. Galleries keep private records that are difficult to access and to match to public auction results. This paper, for the first time, uses archivally sourced primary market records, for the artists Jasper Johns and Robert Rauschenberg. Although this approach restricts the size of the data set, this innovative method shows much more accurate returns on art than typical regression and hedonic models. We find that if Johns and Rauschenberg had retained 10% equity in their work when it was first sold, the returns to them when the work was resold at auction would have outperformed the US S&P 500 by between 2 and 986 times. The implication of this work opens up vast policy recommendations with regard to secondary art market sales, entrepreneurial strategies using blockchain technology, and implications about how we compensate creative work.
We study the introduction of single-market liquidity provider incentives in fragmented securities markets. Specifically, we investigate whether fee rebates for liquidity providers enhance liquidity on the introducing market and thereby increase its competitiveness and market share. Further, we analyze whether single-market liquidity provider incentives increase overall market liquidity available for market participants. Therefore, we measure the specific liquidity contribution of individual markets to the aggregate liquidity in the fragmented market environment. While liquidity and market share of the venue introducing incentives increase, we find no significant effect for turnover and liquidity of the whole market.
Reliability and relevance of fair values : private equity investments and investee fundamentals
(2018)
We directly test the reliability and relevance of fair values reported by listed private equity firms (LPEs), where the unit of account for the fair value measurement (FVM) is an investment stake in an individual investee company. FVMs are observable for multiple investment stakes, fair values are economically important, and granular data on investee economic fundamentals that should underpin fair values are available in public disclosures. We find that LPE fund managers determine valuations based on accounting-based fundamentals—equity book value and net income—that are in line with those investors derive for listed companies. Additionally, our findings suggest that LPE fund managers apply a lower valuation weight to investee net income if direct market inputs are unobservable during investment value estimation. We interpret these findings as evidence that LPE fund managers do not appear to mechanically apply market valuation weights for publicly traded investees when determining valuations of non-listed investees. We also document that the judgments that LPE fund managers apply when determining investee valuations appear to be perceived as reliable by their investors.
We study the impact of transparency on liquidity in OTC markets. We do so by providing an analysis of liquidity in a corporate bond market without trade transparency (Germany), and comparing our findings to a market with full post-trade disclosure (the U.S.). We employ a unique regulatory dataset of transactions of German financial institutions from 2008 until 2014 to find that: First, overall trading activity is much lower in the German market than in the U.S. Second, similar to the U.S., the determinants of German corporate bond liquidity are in line with search theories of OTC markets. Third, surprisingly, frequently traded German bonds have transaction costs that are 39-61 bp lower than a matched sample of bonds in the U.S. Our results support the notion that, while market liquidity is generally higher in transparent markets, a sub-set of bonds could be more liquid in more opaque markets because of investors "crowding" their demand into a small number of more actively traded securities.
This paper analyzes how the combination of borrowing constraints and idiosyncratic risk affects the equity premium in an overlapping generations economy. I find that introducing a zero-borrowing constraint in an economy without idiosyncratic risk increases the equity premium by 70 percent, which means that the mechanism described in Constantinides, Donaldson, and Mehra (2002) is dampened because of the large number of generations and production. With social security the effect of the zero-borrowing constraint is a lot weaker. More surprisingly, when I introduce idiosyncratic labor income risk in an economy without a zero-borrowing constraint, the equity premium increases by 50 percent, even though the income shocks are independent of aggregate risk and are not permanent. The reason is that idiosyncratic risk makes the endogenous natural borrowing limits much tighter, so that they have a similar effect to an exogenously imposed zero-borrowing constraint. This intuition is confirmed when I add idiosyncratic risk in an economy with a zero-borrowing constraint: neither the equity premium nor the Sharpe ratio change, because the zero-borrowing constraint is already tighter than the natural borrowing limits that result when idiosyncratic risk is added.
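The tightening of natural borrowing limits described above can be illustrated with a present-value sketch (stylized numbers, not the paper's calibration):

```python
# Stylized illustration (not the paper's model): the natural borrowing limit
# is the present value of the worst-case future income path. Idiosyncratic
# risk lowers the worst-case income, tightening the limit toward zero and
# mimicking an explicit zero-borrowing constraint.

def natural_limit(min_income, r, periods):
    """Present value of the minimum income stream over remaining work life."""
    return sum(min_income / (1 + r) ** t for t in range(1, periods + 1))

no_risk = natural_limit(min_income=1.0, r=0.04, periods=40)    # certain income
with_risk = natural_limit(min_income=0.1, r=0.04, periods=40)  # bad-shock path
assert with_risk < 0.2 * no_risk  # a much tighter borrowing limit
```

This is why adding idiosyncratic risk to an economy that already has a zero-borrowing constraint changes little: the endogenous limits were already near zero.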
We propose a spatiotemporal approach for modeling risk spillovers using time-varying proximity matrices based on observable financial networks and introduce a new bilateral specification. We study covariance stationarity and identification of the model, and analyze consistency and asymptotic normality of the quasi-maximum-likelihood estimator. We show how to isolate risk channels and we discuss how to compute target exposure able to reduce system variance. An empirical analysis on Euro-area cross-country holdings shows that Italy and Ireland are key players in spreading risk, France and Portugal are the major risk receivers, and we uncover Spain's non-trivial role as risk middleman.