Working Paper
We show that bond purchases undertaken in the context of the European Central Bank's quantitative easing efforts created a large mispricing between German and Italian government bonds and their respective futures contracts. Beyond the direct effect of the buying pressure exerted on bond prices, we show three indirect effects through which the scarcity of bonds resulting from the asset purchases drove a wedge between the futures contracts and the underlying bonds: the deterioration of bond market liquidity, the increased bond specialness in the repurchase agreement market, and the greater uncertainty about bond availability as collateral.
We study the role of various trader types in providing liquidity in spot and futures markets, based on complete order-book and transactions data as well as cross-market trader identifiers from the National Stock Exchange of India for a single large stock. During normal times, short-term traders who carry little inventory overnight are the primary intermediaries in both spot and futures markets, and changes in futures prices Granger-cause changes in spot prices. During two days of fast crashes, however, Granger causality ran both ways. Both crashes were due to large-scale selling by foreign institutional investors in the spot market. Buying by short-term traders and cross-market traders was insufficient to stop the crashes. Mutual funds, patient traders with better trade-execution quality who were initially slow to move in, eventually bought sufficient quantities, leading to price recovery in both markets. Our findings suggest that market stability requires the presence of well-capitalized standby liquidity providers.
An important assumption underlying the designation of some insurers as systemically important is that their overlapping portfolio holdings can result in common selling. We measure the overlap in holdings using cosine similarity, and show that insurers with more similar portfolios have larger subsequent common sales. This relationship can be magnified for some insurers when they are regulatory capital constrained or markets are under stress. When faced with an exogenous liquidity shock, insurers with greater portfolio similarity have even larger common sales that impact prices. Our measure can be used by regulators to predict which institutions may contribute most to financial instability through the asset liquidation channel of risk transmission.
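The overlap measure described in this abstract, cosine similarity between holdings vectors, can be sketched in a few lines. The holdings figures below are hypothetical, and the paper's actual construction (security universe, weighting) may differ:

```python
import numpy as np

def cosine_similarity(holdings_a, holdings_b):
    """Cosine similarity between two insurers' portfolio-holdings vectors.

    Each vector records the value held in a common universe of securities.
    Identical portfolio mixes score 1; portfolios with no common positions
    score 0.
    """
    a = np.asarray(holdings_a, dtype=float)
    b = np.asarray(holdings_b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical holdings across four securities:
print(cosine_similarity([10, 0, 5, 0], [8, 0, 4, 0]))  # same mix, scaled -> 1.0
print(cosine_similarity([10, 0, 0, 0], [0, 0, 5, 0]))  # no overlap -> 0.0
```

Because the measure is scale-invariant, it captures the similarity of portfolio composition rather than portfolio size, which is what matters for predicting common selling.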
This paper investigates inertia within and across banks in retail deposit markets, using detailed panel data on consumer choices and account characteristics. In a structural choice model, I find that the costs of inertia are around one third higher for switching accounts across banks than for switching within a bank. Observable proxies of bank-level switching costs (the number and type of additional financial products) explain most of this cost premium, while online banking usage reduces inertia. Consistent with theory, I provide evidence that banks incorporate inertia into their pricing, as older accounts pay lower rates than comparable newer accounts. Counterfactual policies reducing inertia shift market share to more competitive smaller banks, but eliminating inertia within banks alone already yields high potential gains in consumer surplus. This suggests that facilitating bank switching alone might be insufficient to improve consumer choices.
In recent years, European financial regulation has undergone a marked reorientation with respect to the shadow banking system, manifested first and foremost in its reframing as market-based finance. Regulatory initiatives that initially identified shadow banking as a source of systemic risk not only fell far short of the envisaged changes but, on the contrary, were substantially modified so that they now aim at revitalizing these activities. This post-crisis reorientation of European regulatory agency, from curtailing shadow banking to facilitating resilient market-based finance, has puzzled academic observers, being dismissed by some as mere rebranding or taken as a sign of regulatory capture. Against these readings, this paper documents the central role of regulatory agency in shadow banking's reconfiguration. It does so by analyzing the European initiatives concerning the regulation of Asset-Backed Commercial Paper (ABCP) and another prime example of shadow banking, Money Market Mutual Funds (MMFs). Based on documentary analysis and expert interviews, we trace how the recently published EU frameworks for MMFs and ABCP were designed (in particular the STS, CRR and MMF regulations of 2017). Furthermore, we show how they were transformed in such a way that their final versions re-establish the shadow banking chain linking MMFs, the ABCP market and, arguably, the regular banking system. This transformation is driven by a new form of proactive European regulatory agency that aims at creating a regulatory infrastructure able to sustain the orderly flow of real-economy debt. Far from being captured by the industry, regulators acted consciously and in cooperation with private actors in order to maintain a channel for credit creation outside of bank credit, a task made more complicated by rushed, politicized final negotiations coupled with technical complexity.
This paper thereby contributes to a new strand of literature that views the creation and reconfiguration of the shadow banking system as characterized by the active and conscious role of state actors.
We propose a unified framework to measure the effects of different reforms of the pension system on retirement ages and macroeconomic indicators in the face of demographic change. A rich overlapping generations (OLG) model is built and endogenous retirement decisions are explicitly modeled within a public pension system. Heterogeneity with respect to consumption preferences, wage profiles, and survival rates is embedded in the model. Besides the expected direct effects of these reforms on the behavior of households, we observe that feedback effects do occur. Results suggest that individual retirement decisions are strongly influenced by numerous incentives produced by the pension system and macroeconomic variables, such as the statutory eligibility age, adjustment rates, the presence of a replacement rate, and interest rates. Those decisions, in turn, have several impacts on the macro-economy which can create feedback cycles working through equilibrium effects on interest rates and wages. Taken together, these reform scenarios have strong implications for the sustainability of pension systems. Because of the rich nature of our unified model framework, we are able to rank the reform proposals according to several individual and macroeconomic measures, thereby providing important support for policy recommendations on pension systems.
The paper investigates the determinants of the idiosyncratic volatility puzzle by allowing for linkages across asset returns. Its first contribution is to show that portfolios sorted by increasing indegree, computed on a network estimated from pairwise Granger-causality tests, have lower expected returns, unrelated to idiosyncratic volatility. Second, empirical evidence indicates that stocks with higher idiosyncratic volatility have lower exposure to the indegree risk factor.
We examine how a firm's investment behavior affects the investment of neighboring firms. Economic theory yields ambiguous predictions regarding the direction of firm peer effects, and, consistent with earlier work, we find in OLS analyses that firms within an area display similar investment behavior. Exploiting time variation in U.S. states' corporate income taxes and heterogeneity in firms' exposure to increases in corporate income tax rates, we identify the causal impact of local firms' investment. Using this as an instrumental variable in a 2SLS estimation, we find that an increase in local firms' investment reduces the investment of a local peer firm. This effect is more pronounced where local competition among firms is stronger, supporting theories in which firm investments are strategic substitutes due to competition.
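The 2SLS logic in this abstract can be sketched with a minimal two-stage implementation. The variable roles (peers' investment as the endogenous regressor, tax-driven variation as the instrument) mirror the abstract, but the code is an illustrative textbook 2SLS, not the paper's actual specification:

```python
import numpy as np

def two_stage_least_squares(y, x_endog, z):
    """Minimal 2SLS sketch: one endogenous regressor, one instrument.

    Stage 1 regresses the endogenous regressor (e.g. local peers'
    investment) on the instrument (e.g. tax-driven variation) plus a
    constant; stage 2 regresses the outcome (own investment) on the
    stage-1 fitted values.
    """
    const = np.ones(len(y))
    Z = np.column_stack([const, z])
    x_hat = Z @ np.linalg.lstsq(Z, x_endog, rcond=None)[0]  # stage 1 fit
    X = np.column_stack([const, x_hat])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]             # stage 2
    return coef[1]  # slope on the instrumented regressor
```

On simulated data with a confounder that biases OLS, the two-stage estimate recovers the true causal slope because the instrument is correlated with the regressor but not with the confounder.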
We use minutes from 17,000 financial advisory sessions and corresponding client portfolio data to study how active client involvement affects advisor recommendations and portfolio outcomes. We find that advisors confronted with acquiescent clients stick to their standards and recommend expensive but well diversified mutual fund portfolios. However, if clients take an active role in the meetings, advisors deviate markedly from their standards, resulting in poorer portfolio diversification and lower Sharpe ratios. Our findings that advisors cater to client requests parallel the phenomenon of doctors prescribing antibiotics to insistent patients even if inappropriate, and imply that pandering diminishes the quality of advice.
This paper provides a complete characterization of optimal contracts in principal-agent settings where the agent's action has persistent effects. We model general information environments via the stochastic process of the likelihood-ratio. The martingale property of this performance metric captures the information benefit of deferral. Costs of deferral may result from both the agent's relative impatience as well as her consumption smoothing needs. If the relatively impatient agent is risk neutral, optimal contracts take a simple form in that they only reward maximal performance for at most two payout dates. If the agent is additionally risk-averse, optimal contracts stipulate rewards for a larger selection of dates and performance states: The performance hurdle to obtain the same level of compensation is increasing over time whereas the pay-performance sensitivity is declining.
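The martingale structure mentioned in this abstract can be stated compactly. The following is a standard formulation in our own notation, not necessarily the paper's:

```latex
% Likelihood-ratio process of the recommended action a against a
% reference action a_0, evaluated on the public history \mathcal{F}_t:
L_t \;=\; \left.\frac{d\mathbb{P}^{a}}{d\mathbb{P}^{a_0}}\right|_{\mathcal{F}_t},
\qquad
\mathbb{E}^{a_0}\!\left[\,L_{t+1}\mid \mathcal{F}_t\,\right] \;=\; L_t .
```

The martingale property says that the likelihood ratio aggregates all performance information observed up to date t without systematic drift, which is why deferring pay yields an information benefit: later payout dates condition on a strictly richer performance record.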
A growing body of literature shows the importance of financial literacy in households' financial decisions. However, fewer studies focus on understanding the determinants of financial literacy. Our paper fills this gap by analyzing a specific determinant, the educational system, to explain the heterogeneity in financial literacy scores across Germany. We suggest that the lower financial literacy observed in East Germany is partially caused by a different institutional framework experienced during the Cold War, more specifically, by the socialist educational system of the GDR which affected specific cohorts of individuals. By exploiting the unique set-up of the German reunification, we identify education as a channel through which institutions and financial literacy are related in the German context.
How demanding and consistent is the 2018 stress test design in comparison to previous exercises?
(2018)
Bank regulators have the discretion to discipline banks by executing enforcement actions to ensure that banks correct deficiencies regarding safe and sound banking principles. We highlight the trade-offs that the execution of enforcement actions poses for financial stability. We then provide an overview of the differences in the legal frameworks governing supervisors' execution of enforcement actions in the Banking Union and the United States. After discussing work on the effect of enforcement actions on bank behaviour and the real economy, we present data on the evolution of enforcement actions and monetary penalties levied by U.S. regulators. We conclude by noting the importance of supervisors levying efficient monetary penalties and by stressing that a division of competences among different regulators should not lead to a loss of efficiency in the execution of enforcement actions.
A new governance architecture for European financial markets? Towards a European supervision of CCPs
(2018)
Does the new European outlook on financial markets, as voiced by the EU Commission since the launch of the Capital Markets Union, imply a movement of the EU towards an alignment of market integration and direct supervision of common rules? This paper sets out to answer this question for the case of common supervision of Central Counterparties (CCPs) in the European Union. These entities gained crucial importance post-crisis due to new regulation that requires the mandatory clearing of standardized derivative contracts, transforming clearing houses into central nodes for cross-border financial transactions. While the EU-wide regulatory framework EMIR, enacted in 2012, stipulates common regulatory requirements, the framework still relies on home-country supervision of those rules, arguably leading to regulatory as well as supervisory arbitrage. The regulatory reform to stabilize the OTC derivatives market therefore replicated at its center a governance flaw that had been identified as one of the major causes of the gravity of the financial crisis in the EU: the coupling of intense competition based on private risk-management systems with national supervision of European rules. This paper traces the history of this problem awareness and asks which factors account for the fact that only in 2017 did serious negotiations ensue at the EU level envisioning a common supervision of CCPs to fix the flawed system of governance. Analyzing this shift in the European governance architecture, we argue that Brexit has opened a window of opportunity for a centralization of CCP supervision: Brexit aligns the urgency of the problem with the material interests of crucial political stakeholders, in particular Germany and France, creating the possibility of a grand European bargain.
Improving the financial conditions of individuals requires an understanding of the mechanisms through which bad financial decision-making leads to worse financial outcomes. From a theoretical point of view, a key candidate for inducing mistakes in financial decision-making is so-called present-biased preferences, one of the cornerstones of behavioral economics. According to theory, present-biased households should behave systematically differently when it comes to consumption and saving decisions, as they should be more prone to spending too much and saving too little.
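Present-biased preferences are typically formalized via quasi-hyperbolic (beta-delta) discounting. The following is the standard textbook statement, not taken from the policy letter itself:

```latex
% Quasi-hyperbolic (\beta\text{-}\delta) lifetime utility at date t:
U_t \;=\; u(c_t) \;+\; \beta \sum_{k=1}^{\infty} \delta^{k}\, u(c_{t+k}),
\qquad 0 < \beta \le 1,\; 0 < \delta < 1 .
```

With beta = 1 this reduces to standard exponential discounting; beta < 1 puts an extra premium on immediate consumption relative to all future periods, generating exactly the overspending and undersaving pattern described above.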
In this policy letter we show how high-frequency financial transaction data, available in digitized form, allows individual financial decision-making to be precisely categorized as present-biased or not. Using this categorization, we find that one out of five individuals in our sample exhibits present bias and that this present-biased behavior is associated with a stronger use of overdrafts. As overdrafts represent a particularly expensive form of short-term borrowing, their systematic use can be interpreted as a measure of suboptimal financial decision-making. Overall, our results indicate that the combination of economic theory and Big Data can generate valuable insights with applications for policy makers and businesses alike.
The object of this study is one of the most ambitious projects of twentieth-century art history: Aby Warburg's 'Atlas Mnemosyne', conceived in the summer of 1926 – when the first mention of a 'Bilderatlas', or "atlas of images", occurs in his journal – and truncated three years later, unfinished, by his sudden death in October 1929. Mnemosyne consisted of a series of large black panels, about 170 × 140 cm, on which were attached black-and-white photographs of paintings, sculptures, book pages, stamps, newspaper clippings, tarot cards, coins, and other types of images. Warburg kept changing the order of the panels and the position of the images until the very end, and three main versions of the Atlas have been recorded: one from 1928 (the "1-43 version", with 682 images); one from the early months of 1929, with 71 panels and 1050 images; and the one Warburg was working on at the time of his death, also known as the "1-79 version", with 63 panels and 971 images (which is the one we will examine). But Warburg was planning to have more panels – possibly many more – and there is no doubt that Mnemosyne is a dramatically unfinished and controversial object of study.
Patterns and interpretation
(2017)
One thing is for sure: digitization has completely changed the literary archive. People like me used to work on a few hundred nineteenth-century novels; today, we work on thousands of them; tomorrow, hundreds of thousands. This has had a major effect on literary history, obviously enough, but also on critical methodology; because, when we work on 200,000 novels instead of 200, we are not doing the same thing, 1,000 times bigger; we are doing a different thing. The new scale changes our relationship to our object, and in fact 'it changes the object itself'.
The Emotions of London
(2016)
A few years ago, a group formed by Ben Allen, Cameron Blevins, Ryan Heuser, and Matt Jockers decided to use topic modeling to extract geographical information from nineteenth-century novels. Though the study was eventually abandoned, it had revealed that London-related topics had become significantly more frequent in the course of the century, and when some of us were later asked to design a crowd-sourcing experiment, we decided to add a further dimension to those early findings, and see whether London place-names could become the cornerstone for an emotional geography of the city.
Literature, measured
(2016)
There comes a moment, in digital humanities talks, when someone raises their hand and says: "Ok. Interesting. But is it really new?" Good question... And let's leave aside the obvious lines of defense, such as "but the field is still only at its beginning!", or "and traditional literary criticism, is that always new?" All true, and all irrelevant; because the digital humanities have presented themselves as a radical break with the past, and must therefore produce evidence of such a break. And the evidence, let's be frank, is not strong. What is there, moreover, comes in a variety of forms, beginning with the slightly paradoxical fact that, in a new approach, not everything has to be new. When "Network Theory, Plot Analysis" pointed out, in passing, that a network of Hamlet had Hamlet at its center, the New York Times gleefully mentioned the passage as an unmistakable sign of stupidity. Maybe; but the point, of course, was not to present Hamlet's centrality as a surprise; it was exactly the opposite: had the new approach not found Hamlet at the center of the play, its plausibility would have disintegrated. Before using network theory for dramatic analysis, I had to test it, and prove that it corroborated the main results of previous research.
Of the novelties introduced by digitization in the study of literature, the size of the archive is probably the most dramatic: we used to work on a couple of hundred nineteenth-century novels, and now we can analyze thousands of them, tens of thousands, tomorrow hundreds of thousands. It's a moment of euphoria, for quantitative literary history: like having a telescope that makes you see entirely new galaxies. And it's a moment of truth: so, have the digital skies revealed anything that changes our knowledge of literature? This is not a rhetorical question. In the famous 1958 essay in which he hailed "the advent of a quantitative history" that would "break with the traditional form of nineteenth-century history", Fernand Braudel mentioned as its typical materials "demographic progressions, the movement of wages, the variations in interest rates [...] productivity [...] money supply and demand." These were all quantifiable entities, clearly enough; but they were also completely new objects compared to the study of legislation, military campaigns, political cabinets, diplomacy, and so on. It was this double shift that changed the practice of history; not quantification alone. In our case, though, there is no shift in materials: we may end up studying 200,000 novels instead of 200; but, they're all still novels. Where exactly is the novelty?
Different scales, different features. This is the main difference between the thesis we have presented here and the one that has so far dominated the study of the paragraph. By defining it as "a sentence writ large", or, symmetrically, as "a short discourse", previous research was implicitly asserting the irrelevance of scale: sentence, paragraph, and discourse were all equally involved in the "development of one topic". We have found the exact opposite: 'scale is directly correlated to the differentiation of textual functions'. By this, we don't simply mean that the scale of sentences or paragraphs allows us to "see" style or themes more clearly. This is true, but secondary. Paragraphs allow us to "see" themes, because themes fully "exist" only at the scale of the paragraph. Ours is not just an epistemological claim, but an ontological one: if style and themes and episodes exist in the form they do, it's because writers work at different scales – and do different things according to the level at which they are operating.