Optimal investment decisions by institutional investors require accurate predictions of stock market developments. Motivated by previous research that revealed the unsatisfactory performance of existing stock market prediction models, this study proposes a novel prediction approach. Our proposed system combines Artificial Intelligence (AI) with data from Virtual Investment Communities (VICs) and leverages VICs’ ability to support the process of predicting stock markets. An empirical study with two different models using real data shows the potential of the AI-based system with VICs information as an instrument for stock market predictions. VICs can be a valuable addition but our results indicate that this type of data is only helpful in certain market phases.
Artificial Intelligence (AI) and Machine Learning (ML) are currently hot topics in industry and business practice, while management-oriented research disciplines seem reluctant to adopt these sophisticated data analytics methods as research instruments. Even the Information Systems (IS) discipline with its close connections to Computer Science seems to be conservative when conducting empirical research endeavors. To assess the magnitude of the problem and to understand its causes, we conducted a bibliographic review on publications in high-level IS journals. We reviewed 1,838 articles that matched corresponding keyword-queries in journals from the AIS senior scholar basket, Electronic Markets and Decision Support Systems (Ranked B). In addition, we conducted a survey among IS researchers (N = 110). Based on the findings from our sample we evaluate different potential causes that could explain why ML methods are rather underrepresented in top-tier journals and discuss how the IS discipline could successfully incorporate ML methods in research undertakings.
This article discusses the counterpart of interactive machine learning, i.e., human learning while being in the loop in a human-machine collaboration. For such cases we propose the use of a Contradiction Matrix to assess the overlap and the contradictions of human and machine predictions. We show in a small-scale user study with experts in the area of pneumology (1) that machine-learning based systems can classify X-rays with respect to diseases with a meaningful accuracy, (2) humans partly use contradictions to reconsider their initial diagnosis, and (3) that this leads to a higher overlap between human and machine diagnoses at the end of the collaboration situation. We argue that disclosure of information on diagnosis uncertainty can be beneficial to make the human expert reconsider her or his initial assessment which may ultimately result in a deliberate agreement. In the light of the observations from our project, it becomes apparent that collaborative learning in such a human-in-the-loop scenario could lead to mutual benefits for both human learning and interactive machine learning. Bearing the differences in reasoning and learning processes of humans and intelligent systems in mind, we argue that interdisciplinary research teams have the best chances at tackling this undertaking and generating valuable insights.
We focus on the role of social media as a high-frequency, unfiltered mass information transmission channel and how its use for government communication affects the aggregate stock markets. To measure this effect, we concentrate on one of the most prominent Twitter users, the 45th President of the United States, Donald J. Trump. We analyze around 1,400 of his tweets related to the US economy and classify them by topic and textual sentiment using machine learning algorithms. We investigate whether the tweets contain relevant information for financial markets, i.e. whether they affect market returns, volatility, and trading volumes. Using high-frequency data, we find that Trump’s tweets are most often a reaction to pre-existing market trends and therefore do not provide material new information that would influence prices or trading. We show that past market information can help predict Trump’s decision to tweet about the economy.
We develop a two-sector incomplete markets integrated assessment model to analyze the effectiveness of green quantitative easing (QE) in complementing fiscal policies for climate change mitigation. We model green QE through an outstanding stock of private assets held by a monetary authority and its portfolio allocation between a clean and a dirty sector of production. Green QE leads to a partial crowding out of private capital in the green sector and to a modest reduction of the global temperature of 0.04 degrees Celsius by 2100. A moderate global carbon tax of 50 USD per tonne of carbon is four times more effective.
The ECB’s Outright Monetary Transactions (OMT) program, launched in summer 2012, indirectly recapitalized periphery country banks through its positive impact on the value of sovereign bonds. However, the regained stability of the European banking sector has not fully transferred into economic growth. We show that zombie lending behavior of banks that still remained undercapitalized after the OMT announcement is an important reason for this development. As a result, there was no positive impact on real economic activity like employment or investment. Instead, firms mainly used the newly acquired funds to build up cash reserves. Finally, we document that creditworthy firms in industries with a high prevalence of zombie firms suffered significantly from the credit misallocation, which slowed down the economic recovery.
We investigate the transmission of central bank liquidity to bank deposits and loan spreads in Europe over the January 2006 to June 2010 period. We find evidence consistent with an impaired transmission channel due to bank risk. Central bank liquidity does not translate into lower loan spreads for high-risk banks, even as it lowers deposit rates for both high-risk and low-risk banks. This adversely affects the balance sheets of high-risk bank borrowers, leading to lower payouts, lower capital expenditures, and lower employment. Overall, our results suggest that banks’ capital constraints at the time of an easing of monetary policy pose a challenge to the effectiveness of the bank lending channel and the effectiveness of the central bank as a lender of last resort.
The European Central Bank (ECB) has finalized its comprehensive assessment of the solvency of the largest banks in the euro area and on October 26 disclosed the results of this assessment. In the present paper, Acharya and Steffen compare the outcomes of the ECB's assessment to their own benchmark stress tests conducted for 39 publicly listed financial institutions that are also included in the ECB's regulatory review. The authors identify a negative correlation between their benchmark estimates for capital shortfalls and the regulatory capital shortfall, but a positive correlation between their benchmark estimates for losses under stress both in the banking book and in the trading book. They conclude that the regulatory stress test outcomes are potentially heavily affected by the discretion of national regulators in measuring what counts as capital, and especially by the use of risk-weighted assets in calculating the prudential capital requirement.
We develop a dynamic recursive model where political and economic decisions interact, to study how excessive debt-GDP ratios affect political sustainability of prudent fiscal policies. Rent-seeking groups make political decisions – to cooperate (or not) – on the allocation of fiscal budgets (including rents) and issuance of sovereign debt. A classic commons problem triggers collective fiscal impatience and excessive debt issuing, leading to a vicious circle of high borrowing costs and sovereign default. We analytically characterize debt-GDP thresholds that foster cooperation among rent-seeking groups and avoid default. Our analysis and application help in understanding the politico-economic sustainability of sovereign rescues, emphasizing the need for fiscal targets and possible debt haircuts. We provide a calibrated example that quantifies the threshold debt-GDP ratio at 137%, remarkably close to the target set for private sector involvement in the case of Greece.
We determine optimal monetary policy under commitment in a forward-looking New Keynesian model when nominal interest rates are bounded below by zero. The lower bound represents an occasionally binding constraint that causes the model and optimal policy to be nonlinear. A calibration to the U.S. economy suggests that policy should reduce nominal interest rates more aggressively than suggested by a model without lower bound. Rational agents anticipate the possibility of reaching the lower bound in the future and this amplifies the effects of adverse shocks well before the bound is reached. While the empirical magnitude of U.S. mark-up shocks seems too small to entail zero nominal interest rates, shocks affecting the natural real interest rate plausibly lead to a binding lower bound. Under optimal policy, however, this occurs quite infrequently and does not imply positive average inflation rates in equilibrium. Interestingly, the presence of binding real rate shocks alters the policy response to (non-binding) mark-up shocks.
Reforming deposit insurance: elements of an incentive-compatible European reinsurance scheme
(2020)
Bank deposits of up to 100,000 euros are de jure protected against losses equally throughout the euro area. De facto, the value of this statutory guarantee depends, among other things, on the funding of the national deposit insurance fund and the relative size of the banking sector in an economy. To ensure homogeneous deposit protection and to complete the Banking Union, a common European deposit insurance scheme is needed. The existing implicit risk sharing in the euro area is undesirable from a regulatory-policy perspective. Moreover, an explicit and credible second-tier insurance can prevent incentives to take on excessive risks before any losses materialize. This article therefore argues for a two-tier, strictly subsidiary reinsurance model: national primary insurers would cover a fixed share of the insured amount, and the European reinsurer would cover the remainder on a subordinated basis. The reinsurer would grant this liquidity assistance in the form of cash advances. Because liability remains at the national level, risks are shared but not mutualized. Market-based premiums must take into account not only a bank's individual risk weight but also country-specific risk factors. Finally, the reinsurer needs extensive supervisory rights to ensure at all times that the primary insurers can meet their national liability obligations.
Motivated by the observation that survey expectations of stock returns are inconsistent with rational return expectations under real-world probabilities, we investigate whether alternative expectations hypotheses entertained in the asset pricing literature are consistent with the survey evidence. We empirically test (1) the notion that survey forecasts constitute rational but risk-neutral forecasts of future returns, and (2) the notion that survey forecasts are ambiguity averse/robust forecasts of future returns. We find that these alternative hypotheses are also strongly rejected by the data, albeit for different reasons. Hypothesis (1) is rejected because survey return forecasts are not in line with risk-free interest rates and because survey expected excess returns are predictable. Hypothesis (2) is rejected because agents are not always pessimistic about future returns, and instead often display overly optimistic return expectations. We speculate as to what kind of expectations theories might be consistent with the available survey evidence.
Optimal trend inflation
(2017)
We present a sticky-price model incorporating heterogeneous firms and systematic firm-level productivity trends. Aggregating the model in closed form, we show that it delivers radically different predictions for the optimal inflation rate than canonical sticky-price models featuring homogeneous firms:
(1) the optimal steady-state inflation rate generically differs from zero and,
(2) inflation optimally responds to productivity disturbances.
Using micro data from the US Census Bureau to estimate the inflation-relevant productivity trends at the firm level, we find that the optimal US inflation rate is positive. It was slightly above 2 percent in the year 1986, but continuously declined thereafter, reaching about 1 percent in the year 2013.
We analytically characterize optimal monetary policy for an augmented New Keynesian model with a housing sector. In a setting where the private sector has rational expectations about future housing prices and inflation, optimal monetary policy can be characterized without making reference to housing price developments: commitment to a 'target criterion' that refers to inflation and the output gap only is optimal, as in the standard model without a housing sector. When the policymaker is concerned with potential departures of private sector expectations from rational ones and seeks to choose a policy that is robust against such possible departures, then the optimal target criterion must also depend on housing prices. In the empirically realistic case where housing is subsidized and where monopoly power causes output to fall short of its optimal level, the robustly optimal target criterion requires the central bank to 'lean against' housing prices: following unexpected housing price increases, policy should adopt a stance that is projected to undershoot its normal targets for inflation and the output gap, and similarly aim to overshoot those targets in the case of unexpected declines in housing prices. The robustly optimal target criterion does not require that policy distinguish between 'fundamental' and 'non-fundamental' movements in housing prices.
In the secondary art market, artists play no active role. This allows us to isolate cultural influences on the demand for female artists’ work from supply-side factors. Using 1.5 million auction transactions in 45 countries, we document a 47.6% gender discount in auction prices for paintings. The discount is higher in countries with greater gender inequality. In experiments, participants are unable to guess the gender of an artist simply by looking at a painting and they vary in their preferences for paintings associated with female artists. Women's art appears to sell for less because it is made by women.
In this paper, we develop a state-dependent sensitivity value-at-risk (SDSVaR) approach that enables us to quantify the direction, size, and duration of risk spillovers among financial institutions as a function of the state of financial markets (tranquil, normal, and volatile). Within a system of quantile regressions for four sets of major financial institutions (commercial banks, investment banks, hedge funds, and insurance companies) we show that while small during normal times, equivalent shocks lead to considerable spillover effects in volatile market periods. Commercial banks and, especially, hedge funds appear to play a major role in the transmission of shocks to other financial institutions. Using daily data, we can trace out the spillover effects over time in a set of impulse response functions and find that they reach their peak after 10 to 15 days.
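The state dependence at the heart of the SDSVaR idea can be illustrated with a small simulation. The sketch below uses made-up data and simple conditional quantiles in place of the paper's system of quantile regressions; the data-generating process, regime cutoffs, and coefficients are all illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily returns for two institution types (illustrative data,
# not the paper's sample): hedge fund shocks spill over into bank
# returns, and the spillover is larger when market volatility is high.
n = 5000
vol = rng.uniform(0.5, 2.0, n)            # market volatility state
hedge = rng.normal(0.0, vol)              # hedge fund returns
bank = (0.1 + 0.4 * vol) * hedge + rng.normal(0.0, 1.0, n)

# State-dependent 5% value-at-risk of bank returns, conditioning on the
# volatility regime (tranquil / normal / volatile terciles of vol).
cuts = np.quantile(vol, [1 / 3, 2 / 3])
regime = np.digitize(vol, cuts)           # 0 = tranquil, 1 = normal, 2 = volatile
var_by_state = [-np.quantile(bank[regime == s], 0.05) for s in range(3)]
print(var_by_state)  # VaR rises across the three volatility states
```

In this toy setup the same hedge fund shock widens the bank's loss tail far more in the volatile regime than in the tranquil one, which is the qualitative pattern the abstract describes.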
Credit boom detection methodologies (such as the threshold method) lack robustness as they are based on univariate detrending analysis and resort to ratios of credit to real activity. I propose a quantitative indicator to detect atypical behavior of credit from a multivariate system - a monetary VAR. This methodology explicitly accounts for endogenous interactions between credit, asset prices and real activity and detects atypical credit expansions and contractions in the Euro Area, Japan and the U.S. robustly and in a timely manner. The analysis also proves useful in real time.
This paper investigates the risk channel of monetary policy on the asset side of banks’ balance sheets. We use a factor-augmented vector autoregression (FAVAR) model to show that aggregate lending standards of U.S. banks, such as their collateral requirements for firms, are significantly loosened in response to an unexpected decrease in the Federal Funds rate. Based on this evidence, we reformulate the costly state verification (CSV) contract to allow for an active financial intermediary, embed it in a New Keynesian dynamic stochastic general equilibrium (DSGE) model, and show that – consistent with our empirical findings – an expansionary monetary policy shock implies a temporary increase in bank lending relative to borrower collateral. In the model, this is accompanied by a higher default rate of borrowers.
Even as online advertising continues to grow, a central question remains: whom to target? Yet advertisers know little about how to select, from the hundreds of audience segments available for targeting (and combinations thereof), those that make an online advertising campaign profitable. Utilizing insights from a field experiment on Facebook (Study 1), we develop a model that helps advertisers solve the cold-start problem of selecting audience segments for targeting. Our model enables advertisers to calculate the break-even performance of an audience segment to make a targeted ad campaign at least as profitable as an untargeted one. Advertisers can use this novel model to decide whether to test specific audience segments in their campaigns (e.g., in randomized controlled trials). We apply our model to data from the Spotify ad platform to study the profitability of different audience segments (Study 2). Approximately half of those audience segments require the click-through rate to double compared to an untargeted campaign, which is unrealistically high for most ad campaigns. Our model also shows that narrow segments require a lift that is likely not attainable, specifically when the data quality of these segments is poor. We confirm this theoretical finding in an empirical study (Study 3): A decrease in data quality due to Apple’s introduction of the App Tracking Transparency (ATT) framework more negatively affects the click-through rate of narrow (versus broad) audience segments.
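The break-even logic can be sketched with a stylized linear profit model. The functional form, the targeting premium, and all numbers below are illustrative assumptions for exposition, not the paper's actual model or estimates.

```python
# A minimal sketch of break-even targeting: at what click-through rate
# does a targeted campaign match the profit of an untargeted one?
# Assumed stylization: profit per impression = ctr * value_per_click - cpm / 1000.

def break_even_ctr(ctr_untargeted, cpm_untargeted, cpm_targeted, value_per_click):
    """CTR at which targeted profit per impression equals untargeted profit."""
    premium_per_impression = (cpm_targeted - cpm_untargeted) / 1000
    return ctr_untargeted + premium_per_impression / value_per_click

# Hypothetical numbers: an untargeted campaign with a 0.5% CTR and a
# $2 CPM, a targeting premium raising the CPM to $4, and $0.40 of
# value per click.
ctr_star = break_even_ctr(0.005, 2.0, 4.0, 0.40)
print(ctr_star)  # 0.01: the targeted CTR must double to break even
```

The example mirrors the abstract's finding in miniature: a modest targeting premium can already require the click-through rate to double before targeting pays off.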
A common prediction of macroeconomic models of credit market frictions is that the tightness of financial constraints is countercyclical. As a result, theory implies a negative collateralizability premium; that is, capital that can be used as collateral to relax financial constraints provides insurance against aggregate shocks and commands a lower risk compensation compared with non-collateralizable assets. We show that a long-short portfolio constructed using a novel measure of asset collateralizability generates an average excess return of around 8% per year. We develop a general equilibrium model with heterogeneous firms and financial constraints to quantitatively account for the collateralizability premium.
Most insurers in the European Union determine their regulatory capital requirements based on the standard formula of Solvency II. However, there is evidence that the standard formula inaccurately reflects insurers’ risk situation and may provide misleading steering incentives. In the second pillar, Solvency II requires insurers to perform a so-called “Own Risk and Solvency Assessment” (ORSA). In their ORSA, insurers must establish their own risk measurement approaches, including those based on scenarios, in order to derive suitable risk assessments and address shortcomings of the standard formula. The idea of this paper is to identify scenarios in such a way that the standard formula in connection with the ORSA provides a reliable basis for risk management decisions. Using an innovative method for scenario identification, our approach allows for a simple but relatively precise assessment of marginal and even non-marginal portfolio changes. We numerically evaluate the proposed approach in the context of market risk employing an internal model from the academic literature and the Solvency Capital Requirement (SCR) calculation under Solvency II.
Gradient capital allocation, also known as Euler allocation, is a technique used to redistribute diversified capital requirements among different segments of a portfolio. The method is commonly employed to identify dominant risks, assess the risk-adjusted profitability of segments, and install limit systems. However, capital allocation can be misleading in all these applications because it only accounts for the current portfolio composition and ignores how diversification effects may change with a portfolio restructuring. This paper proposes enhancing the gradient capital allocation by adding “orthogonal convexity scenarios” (OCS). OCS identify risk concentrations that potentially drive portfolio risk and become relevant after restructuring. OCS have strong ties with principal component analysis (PCA), but they are a more general concept and compatible with common empirical patterns of risk drivers being fat-tailed and increasingly dependent in market downturns. We illustrate possible applications of OCS in terms of risk communication and risk limits.
Research on interbank networks and systemic importance is starting to recognise that the web of exposures linking banks' balance sheets is more complex than the single-layer-of-exposure paradigm. We use data on exposures between large European banks, broken down by both maturity and instrument type, to characterise the main features of the multiplex structure of the network of large European banks. This multiplex network exhibits positively correlated multiplexity and a high similarity between layers, as evidenced both by standard similarity analyses and by core-periphery analyses of the different layers. We propose measures of systemic importance that fit the case in which banks are connected through an arbitrary number of layers (be it by instrument, maturity or a combination of both). Such measures allow for a decomposition of the global systemic importance index for any bank into the contributions of each of the sub-networks, providing a useful tool for banking regulators and supervisors. We use the dataset of exposures between large European banks to illustrate the proposed measures.
The analyses of intersectoral linkages of Leontief (1941) and Hirschman (1958) provide a natural way to study the transmission of risk among interconnected banks and to measure their systemic importance. In this paper we show how classic input-output analysis can be applied to banking and how to derive six indicators that capture different aspects of systemic importance, using a simple numerical example for illustration. We also discuss the relationship with other approaches, most notably network centrality measures, both formally and by means of a simulated network.
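The classic input-output logic carries over to banking almost verbatim: total (direct plus indirect) exposures are captured by the Leontief inverse of the exposure-share matrix. The following three-bank sketch is a hypothetical illustration in that spirit; the exposure matrix and the column-sum importance indicator are assumptions for exposition, not the paper's six indicators.

```python
import numpy as np

# Hypothetical exposure-share matrix A for three banks (illustrative
# numbers): A[i, j] is the share of bank j's balance sheet that bank i
# is exposed to.
A = np.array([
    [0.0, 0.2, 0.1],
    [0.3, 0.0, 0.2],
    [0.1, 0.1, 0.0],
])

# Input-output logic: total direct and indirect exposure is the
# Leontief inverse L = (I - A)^{-1} = I + A + A^2 + ..., which
# converges because the spectral radius of A is below one.
I = np.eye(3)
L = np.linalg.inv(I - A)

# A simple systemic-importance indicator in the Hirschman spirit:
# column sums of L measure how strongly a shock originating at one
# bank propagates through the whole system.
importance = L.sum(axis=0)
print(importance)
```

Each entry of `importance` exceeds one because every bank transmits its own shock fully plus the higher-round effects through its counterparties; ranking banks by such sums is the input-output analogue of a network centrality ranking.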
We uncover a new channel for spillovers of funding dry-ups. The 2016 US money market fund (MMF) reform exogenously reduced unsecured MMF funding for some banks. We use novel data to trace those banks to a platform for corporate deposit funding. We show that intensified competition for corporate deposits spilled the funding squeeze over to other banks with no MMF exposure. These banks paid more for deposits, and their pool of funding providers deteriorated. Moreover, their lending volumes and margins declined, and their stocks underperformed. Our results suggest that banks' competitiveness in funding markets affects their competitiveness in lending markets.
We present a network model of the interbank market in which optimizing risk averse banks lend to each other and invest in non-liquid assets. Market clearing takes place through a tâtonnement process which yields the equilibrium price, while traded quantities are determined by means of a matching algorithm. We compare three alternative matching algorithms: maximum entropy, closest matching and random matching. Contagion occurs through liquidity hoarding, interbank interlinkages and fire sale externalities. The resulting network configurations exhibit a core-periphery structure, disassortative behavior and a low clustering coefficient. We measure systemic importance by means of network centrality and input-output metrics and the contribution to systemic risk by means of Shapley values. Within this framework we analyze the effects of prudential policies on the stability/efficiency trade-off. Liquidity requirements unequivocally decrease systemic risk but at the cost of lower efficiency (measured by aggregate investment in non-liquid assets); equity requirements tend to reduce risk (hence increase stability) without reducing significantly overall investment.
In many cases, the dire situation of public finances calls into question the very soundness of sovereigns and prompts corrective actions with far-reaching consequences. In this context, European authorities responded with several measures on different fronts, for instance by passing the "Fiscal Compact", which entered into force on January 1, 2013. Of critical importance in this framework is the assessment of a country’s situation by way of statistical measures, in order to take corrective actions when called for according to the letter of the law. If these statistics are not correct, there is a risk of imposing draconian measures on countries that do not really need them.
The implications of delegating fiscal decision-making power to sub-national governments have become an area of significant interest over the past two decades, in the expectation that these reforms will lead to better and more efficient provision of public goods and services. The move towards decentralization has, however, not been homogeneously implemented on the revenue and expenditure side: decentralization has materialized more substantially on the latter than on the former, creating "vertical fiscal imbalances". These imbalances measure the extent to which sub-national governments’ expenditures are financed through their own revenues. This mismatch between own revenues and expenditures may have negative consequences for public finances performance, for example by softening the budget constraint of sub-national governments. Using a large sample of countries covering a long time period from the IMF’s Government Finance Statistics Yearbook, this paper is the first to examine the effects of vertical fiscal imbalances on fiscal performance through the accumulation of government debt. Our findings suggest that vertical fiscal imbalances are indeed relevant in explaining government debt accumulation, and call for a degree of caution when promoting fiscal decentralization.
[Conference report] Making finance sustainable: Ten years equator principles – success or letdown?
(2013)
In 2003, a number of banks adopted the Equator Principles (EPs), a voluntary code of conduct based on the International Finance Corporation’s (IFC) performance standards, to ensure the ecological and social sustainability of project finance. These so-called Equator Principles Financial Institutions (EPFIs) commit to requiring their borrowers to adopt sustainable management plans for the environmental and social risks associated with their projects. The Principles apply to the project finance business segment of the banks and cover projects with a total cost of US $10 million or more. While developing countries long relied on the World Bank and other public assistance to finance infrastructure projects, recent years have seen a shift to private funding. NGOs were frustrated by this shift, as they had spent their resources pressuring public financial institutions to incorporate environmental and social standards into their project finance activities. Once NGO pressure shifted to private financial institutions, however, the latter adopted the EPs for fear of reputational risks. NGOs had laid down their own, more ambitious ideas about sustainable finance in the Collevecchio Declaration on Financial Institutions and Sustainability. Legally speaking, the EPs are a self-regulatory soft-law instrument. However, they have a hard-law dimension, as the Equator Banks require their borrowers to comply with the EPs through covenants in the loan contracts that may trigger a default in case of violation. ...
Since the outbreak of the financial crisis, the macro-prudential policy paradigm has gained increasing prominence (Bank of England, 2009; Bernanke, 2011). The dynamics of this shift in the economic discourse, and the reasons this shift has not taken place prior to the crisis, have not been addressed systematically. This paper investigates the evolution of the economic discourse on systemic risk and banking regulation to better understand these changes and their timing. Further, we use our sample to inquire whether, and if so, why the economic regulatory studies failed to recommend a reliable banking regulation prior to the crisis. By following a discourse analysis, we establish that the economic discourse on banking regulation has not been suitable for providing the knowledge basis required for a dynamically reliable banking regulation, and we identify the underlying reasons for such failure. These reasons include the obsession of economic discourse with optimization and particular forms of formalism, in particular partial equilibrium analysis. Further, the economic discourse on banking regulation excludes historical and practitioners’ discourses and ignores weak signals. We point out that post-crisis, these epistemological failures of the economic discourse on banking regulation were not sufficiently recognized and that recent attempts to conceptualize systemic risk as a negative externality and to thus price it point to the persistence of formalism, equilibrium thinking and optimization, with their attending dangers.
We develop a novel empirical approach to identify the effectiveness of policies against a pandemic. The essence of our approach is the insight that epidemic dynamics are best tracked over stages, rather than over time. We use a normalization procedure that makes the pre-policy paths of the epidemic identical across regions. The procedure uncovers regional variation in the stage of the epidemic at the time of policy implementation. This variation delivers clean identification of the policy effect based on the epidemic path of a leading region that serves as a counterfactual for other regions. We apply our method to evaluate the effectiveness of the nationwide stay-home policy enacted in Spain against the Covid-19 pandemic. We find that the policy saved 15.9% of lives relative to the number of deaths that would have occurred had it not been for the policy intervention. Its effectiveness evolves with the epidemic and is larger when implemented at earlier stages.
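The normalization idea can be illustrated with stylized data: instead of comparing regions on the calendar, each region's path is re-indexed so that "day 0" is the day a common threshold of cumulative deaths is crossed. The curves, the threshold, and the logistic functional form below are made up for exposition and are not the paper's data or procedure in detail.

```python
import numpy as np

# Stylized cumulative-death paths for three regions: identical epidemic
# dynamics, but outbreaks starting weeks apart on the calendar.
days = np.arange(60)

def path(start, rate):
    # Illustrative logistic curve for cumulative deaths.
    return 1000 / (1 + np.exp(-rate * (days - start)))

regions = {"A": path(20, 0.3), "B": path(30, 0.3), "C": path(40, 0.3)}

# Stage-based normalization: re-index each region so that day 0 is the
# first day cumulative deaths reach a common threshold. This makes the
# pre-policy paths directly comparable across regions.
threshold = 50
aligned = {}
for name, deaths in regions.items():
    t0 = int(np.argmax(deaths >= threshold))
    aligned[name] = deaths[t0:]

# After alignment, the early paths coincide even though the calendar
# timing differed by weeks; a leading region can then serve as a
# counterfactual for laggards.
print({name: round(float(series[0]), 1) for name, series in aligned.items()})
```

In this toy example the aligned paths overlap exactly, so region A's later stages trace out what regions B and C would experience absent any policy change, which is the identification idea the abstract describes.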
We show that the correct experiment to evaluate the effects of a fiscal adjustment is the simulation of a multi-year fiscal plan rather than of individual fiscal shocks. Simulation of fiscal plans adopted by 16 OECD countries over a 30-year period supports the hypothesis that the effects of consolidations depend on their design. Fiscal adjustments based upon spending cuts are much less costly, in terms of output losses, than tax-based ones and have especially low output costs when they consist of permanent rather than stop-and-go changes in taxes and spending. The difference between tax-based and spending-based adjustments appears not to be explained by accompanying policies, including monetary policy. It is mainly due to the different response of business confidence and private investment.
We develop a methodology to identify and rank “systemically important financial institutions” (SIFIs). Our approach is consistent with that followed by the Financial Stability Board (FSB) but, unlike the latter, it is free of judgment and based entirely on publicly available data, thus filling the gap between the official views of the regulator and those that market participants can form with their own information set. We apply the methodology to annual data on three samples of banks (global, EU and euro area) for the years 2007-2012. We examine the evolution of the SIFIs over time and document the shifts in the relative weights of the major geographic areas. We also discuss the implications of the 2013 update of the identification methodology proposed by the FSB.
Banking and markets
(2001)
This paper integrates a number of recent themes in the literature on banking and asset markets–optimal risk sharing, limited market participation, asset-price volatility, market liquidity, and financial crises–in a general-equilibrium theory of the financial system. A complex financial system comprises both financial markets and financial institutions. Financial institutions can take the form of intermediaries or banks. Banks, unlike intermediaries, are subject to runs, but crises do not imply market failure. We show that a sophisticated financial system–a system with complete markets for aggregate risk and limited market participation–is incentive-efficient if the institutions take the form of intermediaries, or else constrained-efficient if they take the form of banks. We also consider an economy in which the markets for aggregate risks are incomplete. In this context, there is a role for prudential regulation: regulating liquidity can improve welfare.
In this paper we propose a way forward towards increased financial resilience in times of growing disagreement concerning open borders, free trade and global regulatory standards. In light of these concerns, financial resilience remains a highly valued policy objective. We contribute by suggesting an agenda of concrete, doable steps supporting an enhanced level of resilience, combined with a deeper understanding of its relevance in the public domain.
First, remove inconsistencies across regulatory rules and territorial regimes, and ensure their credibility concerning implementation. Second, discourage the use of financial regulatory standards as means of international competition. Third, give more weight to pedagogically explaining the established regulatory standards in public, to strengthen their societal backing.
This thesis empirically examines the reactions that the publication of periodic earnings, as part of corporate disclosure, triggers on the German capital market, the determinants of the magnitude of these capital market reactions, and whether firms are able to influence the capital market reactions to earnings announcements through their disclosure decisions. Capital market reactions to earnings announcements are measured using stock prices and trading volume (information content) as well as bid-ask spreads (information asymmetry). The results show an elevated level of abnormal returns and trading volume on the announcement day, indicating that earnings announcements have information content. Furthermore, information asymmetries as measured by bid-ask spreads decline following earnings announcements. The capital market reaction is also found to be stronger the more additional information is published together with the periodic earnings. In this context, both stock price and trading volume reactions as well as bid-ask spreads are larger under internationally accepted accounting standards than under German commercial-law accounting principles. Over the sample period, which can be regarded as a phase of adjustment to internationally accepted accounting standards, stock price reactions and bid-ask spreads decline while trading volume increases. Finally, the capital market reaction is shown to be weaker the greater the quantity or quality of corporate disclosure prior to the earnings announcement.
We study the information flow from the ECB on policy dates since its inception, using tick data. We show that three factors capture essentially all of the variation in the yield curve, but that these are different factors with different variance shares in the window that contains the policy decision announcement and the window that contains the press conference. We also show that the QE-related policy factor has been dominant in the recent period and that Forward Guidance and QE effects have been very persistent on the longer end of the yield curve. We further show that the responses of broad and banking stock indices to monetary policy surprises depended on the perceived nature of the surprises. We find no evidence of asymmetric responses of financial markets to positive and negative surprises, in contrast to the literature on asymmetric real effects of monetary policy. Lastly, we show how to implement our methodology for any policy-related news release, such as policymaker speeches. To carry out the analysis, we construct the Euro Area Monetary Policy Event-Study Database (EA-MPD). This database, which contains intraday asset price changes around the policy decision announcement as well as around the press conference, is a contribution in its own right, and we expect it to become the standard in monetary policy research for the euro area.
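The factor extraction described above can be sketched as a principal-component decomposition of a matrix of event-window yield changes. The sketch below uses synthetic data and NumPy's SVD; the function and variable names are illustrative assumptions, not the paper's actual estimation code.

```python
import numpy as np

def policy_factors(yield_changes, n_factors=3):
    """Extract latent policy factors from a matrix of yield-curve changes
    (events x maturities) via PCA on demeaned data; return factor scores,
    loadings, and the variance share captured by each factor."""
    X = yield_changes - yield_changes.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_factors] * S[:n_factors]      # event-level factor realizations
    loadings = Vt[:n_factors]                      # how each maturity loads on the factors
    var_share = (S**2 / (S**2).sum())[:n_factors]  # explained-variance shares
    return scores, loadings, var_share

# Synthetic example: 200 "events" x 8 maturities driven by 3 latent factors
rng = np.random.default_rng(0)
F = rng.normal(size=(200, 3))
L = rng.normal(size=(3, 8))
X = F @ L + 0.01 * rng.normal(size=(200, 8))
scores, loadings, share = policy_factors(X)
print(share.sum())  # close to 1: three factors capture nearly all variation
```

Running the same decomposition separately on the announcement window and the press-conference window would then reveal whether the dominant factors, and their variance shares, differ across the two windows, as the abstract reports.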
We investigate whether government credit guarantee schemes, extensively used at the onset of the Covid-19 pandemic, led to substitution of non-guaranteed with guaranteed credit rather than fully adding to the supply of lending. We study this issue using unique euro-area credit register data, matched with supervisory bank data, and establish two main findings. First, guaranteed loans were mostly extended to small but comparatively creditworthy firms in sectors severely affected by the pandemic, borrowing from large, liquid and well-capitalized banks. Second, guaranteed loans partially substitute pre-existing non-guaranteed debt. For firms borrowing from multiple banks, the substitution mainly arises from the lending behavior of the bank extending guaranteed loans. Substitution was highest for funding granted to riskier and smaller firms in sectors more affected by the pandemic, and borrowing from larger and stronger banks. Overall, the evidence indicates that government guarantees contributed to the continued extension of credit to relatively creditworthy firms hit by the pandemic, but also benefited banks’ balance sheets to some extent.
Using novel monthly data for 226 euro-area banks from 2007 to 2015, we investigate the determinants of changes in banks’ sovereign exposures and their effects during and after the crisis. First, public, bailed out and poorly capitalized banks responded to sovereign stress by purchasing domestic public debt more than other banks, with public banks’ purchases growing especially in coincidence with the largest ECB liquidity injections. Second, bank exposures significantly amplified the transmission of risk from the sovereign and its impact on lending. This amplification of the impact on lending does not appear to arise from spurious correlation or reverse causality.
We extend the classical “martingale-plus-noise” model for high-frequency prices by an error correction mechanism originating from prevailing mispricing. The speed of price reversal is a natural measure for informational efficiency. The strength of the price reversal relative to the signal-to-noise ratio determines the signs of the return serial correlation and the bias in standard realized variance estimates. We derive the model’s properties and locally estimate it based on mid-quote returns of the NASDAQ 100 constituents. There is evidence of mildly persistent local regimes of positive and negative serial correlation, arising from lagged feedback effects and sluggish price adjustments. The model performance is decidedly superior to existing stylized microstructure models. Finally, we document intraday periodicities in the speed of price reversion and noise-to-signal ratios.
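One plausible way to write down such an error-correction extension of the martingale-plus-noise model is sketched below; the notation is our own and need not match the paper's.

```latex
% Efficient log-price m_t follows a martingale; the observed mid-quote
% p_t deviates by a mispricing component s_t that reverts at speed alpha.
\begin{align*}
  m_t &= m_{t-1} + u_t, \qquad u_t \sim (0,\sigma_u^2), \\
  p_t &= m_t + s_t, \\
  s_t &= (1-\alpha)\, s_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim (0,\sigma_\varepsilon^2).
\end{align*}
% alpha in (0,1] is the price-reversal speed; the sign of the return
% serial correlation depends on alpha relative to the noise-to-signal
% ratio sigma_eps^2 / sigma_u^2, consistent with the abstract's claim.
```

In this formulation, alpha = 1 collapses the model to the classical martingale-plus-noise case, and local estimation of alpha over intraday windows delivers the time-varying efficiency measure the abstract describes.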
We investigate the characteristics of infrastructure as an asset class from an investment perspective of a limited partner. While non U.S. institutional investors gain exposure to infrastructure assets through a mix of direct investments and private fund vehicles, U.S. investors predominantly invest in infrastructure through private funds. We find that the stream of cash flows delivered by private infrastructure funds to institutional investors is very similar to that delivered by other types of private equity, as reflected by the frequency and amounts of net cash flows. U.S. public pension funds perform worse than other institutional investors in their infrastructure fund investments, although they are exposed to underlying deals with very similar project stage, concession terms, ownership structure, industry, and geographical location. By selecting funds that invest in projects with poor financial performance, U.S. public pension funds have created an implicit subsidy to infrastructure as an asset class, which we estimate within the range of $730 million to $3.16 billion per year depending on the benchmark.
Shallow meritocracy
(2023)
Meritocracies aspire to reward hard work and promise not to judge individuals by the circumstances into which they were born. However, circumstances often shape the choice to work hard. I show that people's merit judgments are "shallow" and insensitive to this effect. They hold others responsible for their choices, even if these choices have been shaped by unequal circumstances. In an experiment, US participants judge how much money workers deserve for the effort they exert. Unequal circumstances disadvantage some workers and discourage them from working hard. Nonetheless, participants reward the effort of disadvantaged and advantaged workers identically, regardless of the circumstances under which choices are made. For some participants, this reflects their fundamental view regarding fair rewards. For others, the neglect results from the uncertain counterfactual. They understand that circumstances shape choices but do not correct for this because the counterfactual—what would have happened under equal circumstances—remains uncertain.
We document the individual willingness to act against climate change and study the role of social norms in a large sample of US adults. Individual beliefs about social norms positively predict pro-climate donations, comparable in strength to universal moral values and economic preferences such as patience and reciprocity. However, we document systematic misperceptions of social norms. Respondents vastly underestimate the prevalence of climate-friendly behaviors and norms. Correcting these misperceptions in an experiment causally raises individual willingness to act against climate change as well as individual support for climate policies. The effects are strongest for individuals who are skeptical about the existence and threat of global warming.
This paper shows that support for climate action is high across survey participants from all EU countries in three dimensions: (1) Participants are willing to contribute personally to combating climate change, (2) they approve of pro-climate social norms, and (3) they demand government action. In addition, there is a significant perception gap where individuals underestimate others' willingness to contribute to climate action by over 10 percentage points, influencing their own willingness to act. Policymakers should recognize the broad support for climate action among European citizens and communicate this effectively to counteract the vocal minority opposed to it.
Investors' return expectations are pivotal in stock markets, but the reasoning behind these expectations remains a black box for economists. This paper sheds light on economic agents' mental models -- their subjective understanding -- of the stock market, drawing on surveys with the US general population, US retail investors, US financial professionals, and academic experts. Respondents make return forecasts in scenarios describing stale news about the future earnings streams of companies, and we collect rich data on respondents' reasoning. We document three main results. First, inference from stale news is rare among academic experts but common among households and financial professionals, who believe that stale good news lead to persistently higher expected returns in the future. Second, while experts refer to the notion of market efficiency to explain their forecasts, households and financial professionals reveal a neglect of equilibrium forces. They naively equate higher future earnings with higher future returns, neglecting the offsetting effect of endogenous price adjustments. Third, a series of experimental interventions demonstrate that these naive forecasts do not result from inattention to trading or price responses but reflect a gap in respondents' mental models -- a fundamental unfamiliarity with the concept of equilibrium.
Speculative news on corporate takeovers may hurt productivity because uncertainty and threat of job loss cause anxiety, distraction, and reduced collaboration and morale among employees and managers. Using a panel of OECD-headquartered firms, we show that firm productivity temporarily declines upon announcements of speculative takeover rumors that do not materialize. This productivity dip is more pronounced for targets and for firms in countries with weaker employee rights and less long-term orientation. Abnormal stock returns mirror these results. The evidence fosters our understanding of potential real effects of speculative financial news and the costs of takeover threats.
Recent advances in natural language processing have contributed to the development of market sentiment measures through text content analysis of news providers and social media. The effectiveness of these sentiment variables depends on the implemented techniques and the type of source on which they are based. In this paper, we investigate the impact of the release of public financial news on the S&P 500. Using automatic labeling techniques based on either stock index returns or dictionaries, we apply a classification approach based on long short-term memory (LSTM) neural networks to extract alternative proxies of investor sentiment. Our findings provide evidence that these sentiment measures affect the market over a 20-minute time frame. We find that dictionary-based sentiment provides more meaningful results than sentiment based on stock index returns, where labeling partly fails in the mapping process between news and financial returns.
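To make the classification step concrete, the sketch below runs a single-layer LSTM (implemented from scratch in NumPy) over a sequence of token embeddings and maps the final hidden state to a sentiment probability. The parameters are random and untrained, and all names are illustrative assumptions; the paper's actual architecture and preprocessing may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates stacked as [input, forget, output, candidate]."""
    d = h.shape[0]
    z = W @ x + U @ h + b
    i, f = sigmoid(z[:d]), sigmoid(z[d:2*d])
    o, g = sigmoid(z[2*d:3*d]), np.tanh(z[3*d:])
    c_new = f * c + i * g          # update cell state
    h_new = o * np.tanh(c_new)     # emit hidden state
    return h_new, c_new

def classify_headline(token_vecs, params):
    """Run an LSTM over embedded tokens and map the final hidden
    state to a positive-sentiment probability."""
    W, U, b, w_out, b_out = params
    d = U.shape[1]
    h, c = np.zeros(d), np.zeros(d)
    for x in token_vecs:
        h, c = lstm_step(x, h, c, W, U, b)
    return sigmoid(w_out @ h + b_out)

# Toy run with random, untrained parameters -- illustration only
rng = np.random.default_rng(1)
emb, d = 16, 8
params = (rng.normal(scale=0.1, size=(4*d, emb)),
          rng.normal(scale=0.1, size=(4*d, d)),
          np.zeros(4*d),
          rng.normal(scale=0.1, size=d),
          0.0)
tokens = [rng.normal(size=emb) for _ in range(5)]  # stand-in for embedded news tokens
p = classify_headline(tokens, params)
print(0.0 < p < 1.0)  # a valid probability
```

In a trained setting, the labels for supervision would come from either the dictionary-based or the return-based scheme the abstract compares, and the predicted probabilities would serve as the sentiment proxy.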
Discussions about the banking union have restarted. Its success so far is limited: national banking sectors are still overwhelmingly exposed to their own countries’ economies, cross border banking has not increased and capital and liquidity remain locked within national boundaries. The policy letter highlights that the current debate, centered on sovereign exposures and deposit insurance, misses critical underlying problems in the supervision and resolution frameworks. The ECB supervisors’ efforts to facilitate cross-border banking have been hampered by national ringfencing. The resolution framework is not up to its task: limited powers of the SRB, prohibitive access conditions and limited size of the Single Resolution Fund limit its effectiveness. A lack of a coherent European framework for insolvency unlevels the regulatory field and creates incentives to bypass European rules. The new Commission and European Parliament, with the new ECB leadership, provide a unique opportunity to address these shortcomings and make the banking union work.
There is much discussion today about a possible digital euro (PDE). Is this attention exaggerated? Are “central bank digital currencies” (CBDCs) “a solution in search of a problem”, as some have argued? This article summarizes the main facts about the PDE and concludes that, if the decision on adoption had to be taken today, the arguments against would outweigh those in favor. However, there may be future circumstances in which having a CBDC ready for use can indeed be useful. Therefore, preparing is a good thing, even if the odds of its usefulness in normal conditions are slim.
In its first ten years (2014-2023), the banking union was successful in its prudential agenda but failed spectacularly in its underlying objective: establishing a single banking market in the euro area. This goal is now more important than ever, and easier to attain than at any time in the last decade. To make progress, cross-border banks should receive a specific treatment within general banking union legislation. Suggestions are made on how to make such regulatory carve-out effective and legally sound.
This policy note summarizes our assessment of financial sanctions against Russia. We see an increase in sanctions severity starting from (1) the widely discussed SWIFT exclusions, followed by (2) blocking of correspondent banking relationships with Russian banks, including the Central Bank, alongside secondary sanctions, and (3) a full blacklisting of the ‘real’ export-import flows underlying the financial transactions. We assess option (1) as being less impactful than often believed yet sending a strong signal of EU unity; option (2) as an effective way to isolate the Russian banking system, particularly if secondary sanctions are in place, to avoid workarounds. Option (3) represents possibly the most effective way to apply economic and financial pressure, interrupting trade relationships.
We assess, through VAR evidence, the effects of monetary policy on banks’ risk exposure and find the presence of a risk-taking channel. A model combining fragile banks prone to risk mis-incentives and credit-constrained firms, whose collateral fluctuations generate a balance sheet channel, is used to rationalize the evidence. A monetary expansion increases bank leverage, with two consequences: on the one hand, it exacerbates risk exposure; on the other, the risk spiral depresses output, thereby dampening the conventional amplification effect of the financial accelerator.
We assess the effects of monetary policy on bank risk to verify the existence of a risk-taking channel - monetary expansions inducing banks to assume more risk. We first present VAR evidence confirming that this channel exists and tends to concentrate on the bank funding side. Then, to rationalize this evidence we build a macro model where banks subject to runs endogenously choose their funding structure (deposits vs. capital) and risk level. A monetary expansion increases bank leverage and risk. In turn, higher bank risk in steady state increases asset price volatility and reduces equilibrium output.
Exit strategies
(2014)
We study alternative scenarios for exiting the post-crisis fiscal and monetary accommodation using a macromodel where banks choose their capital structure and are subject to runs. Under a Taylor rule, the post-crisis interest rate hits the zero lower bound (ZLB) and remains there for several years. In that condition, pre-announced and fast fiscal consolidations dominate - based on output and inflation performance and bank stability - alternative strategies incorporating various degrees of gradualism and surprise. We also examine an alternative monetary strategy in which the interest rate does not reach the ZLB; the benefits from fiscal consolidation persist, but are more nuanced.
We present new statistical indicators of the structure and performance of US banks from 1990 to today, geographically disaggregated at the level of individual counties. The constructed data set (20 indicators for some 3150 counties over 31 years, for a total of about 2 million data points) conveys a detailed picture of how the geography of US banking has evolved in the last three decades. We consider the data as a stepping stone to understand the role banks and banking policies may have played in mitigating, or exacerbating, the rise of poverty and inequality in certain US regions.
We examine the impact of so-called "Crisis Contracts" on bank managers' risk-taking incentives and on the probability of banking crises. Under a Crisis Contract, managers are required to contribute a pre-specified share of their past earnings to finance public rescue funds when a crisis occurs. This can be viewed as a retroactive tax that is levied only when a crisis occurs and that leads to a form of collective liability for bank managers. We develop a game-theoretic model of a banking sector whose shareholders have limited liability, so that society at large will suffer losses if a crisis occurs. Without Crisis Contracts, the managers' and shareholders' interests are aligned, and managers take more than the socially optimal level of risk. We investigate how the introduction of Crisis Contracts changes the equilibrium level of risk-taking and the remuneration of bank managers. We establish conditions under which the introduction of Crisis Contracts will reduce the probability of a banking crisis and improve social welfare. We explore how Crisis Contracts and capital requirements can supplement each other and we show that the efficacy of Crisis Contracts is not undermined by attempts to hedge.
Executive Stock Option Programs (SOPs) have become the dominant compensation instrument for top management in recent years. The incentive effects of an SOP with respect to both corporate investment and financing decisions critically depend on the design of the SOP. A specific problem in designing SOPs concerns dividend protection. Usually, SOPs are not dividend protected, i.e. any dividend payout decreases the value of a manager’s options. Empirical evidence shows that this results in a significant decrease in the level of corporate dividends and, at the same time, in an increase in share repurchases. Yet, few suggestions have been made on how to account for dividends in SOPs. This paper applies arguments from principal-agent theory and from the theory of finance to analyze different forms of dividend protection, and to address the relevance of dividend protection in SOPs. Finally, the paper relates the theoretical analysis to empirical work on the link between share repurchases and SOPs.
We design, field and exploit survey data from a representative sample of the French population to examine whether informative social interactions enter households’ stockholding decisions. Respondents report perceptions about their circle of peers with whom they interact about financial matters, their social circle and the population. We provide evidence for the presence of an information channel through which social interactions influence perceptions and expectations about stock returns, and financial behavior. We also find evidence of mindless imitation of peers in the outer social circle, but this does not permeate as many layers of financial behavior as informative social interactions do.
We consider the continuous-time portfolio optimization problem of an investor with constant relative risk aversion who maximizes expected utility of terminal wealth. The risky asset follows a jump-diffusion model with a diffusion state variable. We propose an approximation method that replaces the jumps by a diffusion and solve the resulting problem analytically. Furthermore, we provide explicit bounds on the true optimal strategy and the relative wealth equivalent loss that do not rely on results from the true model. We apply our method to a calibrated affine model and find that relative wealth equivalent losses are below 1.16% if the jump size is stochastic and below 1% if the jump size is constant and γ ≥ 5. We perform robustness checks for various levels of risk aversion, expected jump size, and jump intensity.
We consider the continuous-time portfolio optimization problem of an investor with constant relative risk aversion who maximizes expected utility of terminal wealth. The risky asset follows a jump-diffusion model with a diffusion state variable. We propose an approximation method that replaces the jumps by a diffusion and solve the resulting problem analytically. Furthermore, we provide explicit bounds on the true optimal strategy and the relative wealth equivalent loss that do not rely on quantities known only in the true model. We apply our method to a calibrated affine model. Our findings are threefold: Jumps matter more, i.e. our approximation is less accurate, if (i) the expected jump size or (ii) the jump intensity is large. Fixing the average impact of jumps, we find that (iii) rare, but severe jumps matter more than frequent, but small jumps.
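One standard way to make the "replace jumps by a diffusion" step concrete is moment matching: absorb the jump component into the drift and variance of an approximating diffusion, under which the classical CRRA (Merton) solution applies. The notation below is our own sketch and need not coincide with the paper's affine model.

```latex
% Risky asset with jumps: N_t a Poisson process with intensity \lambda,
% i.i.d. jump sizes J:
%   dS_t / S_{t-} = \mu\,dt + \sigma\,dW_t + (e^{J}-1)\,dN_t.
% A moment-matched diffusion approximation absorbs the jump part into
% adjusted drift and variance:
\begin{align*}
  \tilde{\mu}      &= \mu + \lambda\,\mathbb{E}\!\left[e^{J}-1\right], \\
  \tilde{\sigma}^2 &= \sigma^2 + \lambda\,\mathbb{E}\!\left[(e^{J}-1)^2\right],
\end{align*}
% under which the optimal CRRA portfolio weight takes the familiar
% Merton form \pi^* = (\tilde{\mu}-r)/(\gamma\,\tilde{\sigma}^2).
```

This also makes the abstracts' findings intuitive: for a fixed average jump impact \(\lambda\,\mathbb{E}[e^{J}-1]\), rare but severe jumps contribute more to higher moments that the two-moment approximation ignores, so the approximation error grows in that regime.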