Sustainable Architecture for Finance in Europe (SAFE)
The loan impairment rules recently introduced by IFRS 9 require banks to estimate their future credit losses using forward-looking information. We use supervisory loan-level data from Germany to investigate how banks apply their reporting discretion and adjust their lending upon the announcement of the new rules. Our identification strategy exploits a cut-off for the level of provisions at the investment grade threshold based on banks’ internal rating of a borrower. We find that banks required to adopt the new rules assign better internal ratings to exactly the same borrowers compared to banks that do not apply IFRS 9 around this cut-off. This pattern is consistent with a strategic use of the increased reporting discretion that is inherent to rules requiring forward-looking loss estimation. At the same time, banks also reduce their lending exposure to exactly those borrowers at the highest risk of experiencing a rating downgrade below the cut-off. These loans would be associated with additional provisions in future periods, at both the intensive and the extensive margin. The lending change thus mitigates some of the negative effects of increased reporting opportunism on banks’ crisis resilience. However, when these firms with internal ratings around the investment grade cut-off obtain less external funding through banks, the introduction of IFRS 9 will likely also be associated with real economic effects.
Using German and US brokerage data, we find that investors are more likely to sell speculative stocks trading at a gain. Investors’ gain realizations are monotonically increasing in a stock’s speculativeness. This translates into a high disposition effect for speculative stocks and a much lower disposition effect for non-speculative stocks. Our findings hold across asset classes (stocks, passive funds, and active funds) and explain cross-sectional differences in investor selling behavior which previous literature attributed primarily to investor demographics. Our results are robust to rank or attention effects and can be linked to realization utility and rolling mental accounts.
Recent empirical evidence shows that most international prices are sticky in dollars. This paper studies the policy implications of this fact in the context of an open economy model, allowing for an arbitrary structure of asset markets, general preferences and technologies, time- or state-dependent price setting, and a rich set of shocks. We show that although monetary policy is less efficient and cannot implement the flexible-price allocation, inflation targeting remains robustly optimal in non-U.S. economies. The implementation of this non-cooperative policy results in a "global monetary cycle", with other countries importing the monetary stance of the U.S. Capital controls cannot unilaterally improve the allocation and are useful only when coordinated across countries. Thanks to the dominance of the dollar, the U.S. can extract rents in international goods and asset markets and enjoy higher welfare than other economies. Although international cooperation benefits other countries by improving global demand for dollar-invoiced goods, it is not in the self-interest of the U.S. and may be hard to sustain.
Output gap revisions can be large even after many years. Real-time reliability tests might therefore be sensitive to the choice of the final output gap vintage that the real-time estimates are compared to. This is the case for the Federal Reserve’s output gap. When accounting for revisions in response to the global financial crisis in the final output gap, the improvement in real-time reliability since the mid-1990s is much smaller than found by Edge and Rudd (Review of Economics and Statistics, 2016, 98(4), 785-791). The negative bias of real-time estimates from the 1980s has disappeared, but the size of revisions continues to be as large as the output gap itself.
The authors systematically analyse how the real-time reliability assessment is affected by varying the final output gap vintage. They find that the largest changes are caused by output gap revisions after recessions. Economists revise their models in response to such events, leading to economically important revisions not only for the most recent years but reaching back up to two decades. This might improve the understanding of past business cycle dynamics, but it decreases the reliability of real-time output gaps ex post.
The great financial crisis and the euro area crisis led to a substantial reform of financial safety nets across Europe and – critically – to the introduction of supranational elements. Specifically, a supranational supervisor was established for the euro area, with distinct arrangements for supervisory competences and tasks depending on the systemic relevance of supervised credit institutions. A resolution mechanism was created to allow the frictionless resolution of large financial institutions. This resolution mechanism has now been complemented with a funding instrument.
While much more progress has been achieved than most observers could imagine 12 years ago, the banking union remains unfinished with important gaps and deficiencies. The experience over the past years, especially in the area of crisis management and resolution, has provided impetus for reform discussions, as reflected most lately in the Eurogroup statement of 16 June 2022.
This Policy Insight looks primarily at the current and the desired state of the banking union project. The key underlying question, and the focus here, is the level of ambition and how it is matched with effective legal and regulatory tools. Specifically, two questions will structure the discussions:
What would be a reasonable definition and rationale for a ‘complete’ banking union? And what legal reforms would be required to achieve it?
Banking union is a case of a new remit of EU-level policy that so far has been established on the basis of long pre-existing treaty stipulations, namely, Article 127(6) TFEU (for banking supervision) and Article 114 TFEU (for crisis management and deposit insurance). Could its completion be similarly carried out through secondary law? Or would a more comprehensive overhaul of the legal architecture be required to ensure legal certainty and legitimacy?
Using the negotiation process of the Basel Committee on Banking Supervision (BCBS), this paper studies the way regulators form their positions on regulatory issues in the process of international standard-setting and the consequences for the resultant harmonized framework. Leveraging leaked voting records and corroborating them with machine learning techniques applied to publicly available speeches, we construct a unique dataset containing the positions of banks and national regulators on the regulatory initiatives of Basel II and III. We document that the probability of a regulator opposing a specific initiative increases by 30% if their domestic national champion opposes the new rule, particularly when the proposed rule disproportionately affects it. We find the effect is driven by regulators who had prior experience of working in large banks – lending support to private-interest theories of regulation. Meanwhile, smaller banks, even when they collectively have a higher share of the domestic market, do not have any impact on regulators’ stance – providing little support to public-interest theories of regulation. Finally, we show that this decision-making process manifests in a significant watering down of proposed rules, thereby limiting the potential gains from harmonization of international financial regulation.
Highly interconnected global supply chains make countries vulnerable to supply chain disruptions. The authors estimate the macroeconomic effects of global supply chain shocks for the euro area. Their empirical model combines business cycle variables with data from international container trade.
Using a novel identification scheme, they augment conventional sign restrictions on the impulse responses by narrative information about three episodes: the Tohoku earthquake in 2011, the Suez Canal obstruction in 2021, and the Shanghai backlog in 2022. They show that a global supply chain shock causes a drop in euro area real economic activity and a strong increase in consumer prices. Over a horizon of one year, the global supply chain shock explains about 30% of inflation dynamics. They also use regional data on supply chain pressure to isolate shocks originating in China.
Their results show that supply chain disruptions originating in China are an important driver for unexpected movements in industrial production, while disruptions originating outside China are an especially important driver for the dynamics of consumer prices.
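The identification strategy described above combines sign restrictions with narrative information. A minimal sketch of the sign-restriction step (not the authors' actual model, and with a purely illustrative toy covariance) draws random rotations of a Cholesky factor and keeps only impact vectors consistent with an adverse supply chain shock, i.e. activity falling while prices rise:

```python
import numpy as np

rng = np.random.default_rng(1)
# toy reduced-form covariance for (activity, prices); illustrative numbers only
Sigma = np.array([[1.0, -0.3],
                  [-0.3, 1.0]])
L = np.linalg.cholesky(Sigma)

accepted = []
for _ in range(2000):
    # random orthogonal matrix via QR of a Gaussian draw,
    # sign-normalized so the rotation is uniformly distributed
    Q, R = np.linalg.qr(rng.normal(size=(2, 2)))
    Q = Q @ np.diag(np.sign(np.diag(R)))
    col = (L @ Q)[:, 0]        # candidate impact vector of the shock
    if col[1] < 0:
        col = -col             # normalize: prices respond upward
    if col[0] < 0:             # keep only draws where activity falls
        accepted.append(col)

impact = np.mean(accepted, axis=0)  # set-identified average impact
```

In the paper this step is augmented with narrative restrictions that require the shock to be dominant around known disruption episodes (Tohoku 2011, Suez 2021, Shanghai 2022); the sketch omits that layer.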
The author proposes a Differential-Independence Mixture Ensemble (DIME) sampler for the Bayesian estimation of macroeconomic models. It allows sampling from particularly challenging, high-dimensional black-box posterior distributions which may also be computationally expensive to evaluate. DIME is a “Swiss Army knife”, combining the advantages of a broad class of gradient-free global multi-start optimizers with the properties of a Markov chain Monte Carlo (MCMC) method. This includes fast burn-in and convergence absent any prior numerical optimization or initial guesses, good performance for multimodal distributions, a large number of chains (the “ensemble”) running in parallel, and an endogenous proposal density generated from the state of the full ensemble that respects the bounds of the prior distribution. The author shows that the number of parallel chains scales well with the number of necessary ensemble iterations.
DIME is used to estimate the medium-scale heterogeneous agent New Keynesian (“HANK”) model with liquid and illiquid assets, thereby allowing, for the first time, the households’ preference parameters to be included as well. The results point mildly towards a less accentuated role of household heterogeneity in empirical macroeconomic dynamics.
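The ensemble idea behind DIME can be illustrated with a minimal differential-evolution MCMC sketch (in the style of ter Braak's DE-MC; this is not the author's full DIME algorithm, which adds an independence-mixture proposal). Each chain proposes a jump along the difference of two other chains, which makes the sampler gradient-free and self-tuning:

```python
import numpy as np

def de_mc_sample(log_prob, n_chains, n_iter, dim, rng):
    # Minimal differential-evolution ensemble MCMC: illustrative only.
    gamma = 2.38 / np.sqrt(2 * dim)            # standard DE-MC step scale
    chains = rng.normal(size=(n_chains, dim))  # initial ensemble
    logp = np.array([log_prob(c) for c in chains])
    history = []
    for _ in range(n_iter):
        for i in range(n_chains):
            # propose by jumping along the difference of two other chains
            others = [j for j in range(n_chains) if j != i]
            r1, r2 = rng.choice(others, size=2, replace=False)
            prop = (chains[i] + gamma * (chains[r1] - chains[r2])
                    + 1e-4 * rng.normal(size=dim))
            lp = log_prob(prop)
            if np.log(rng.uniform()) < lp - logp[i]:  # Metropolis step
                chains[i], logp[i] = prop, lp
        history.append(chains.copy())
    return np.array(history)

# toy posterior: standard bivariate normal (stand-in for a model posterior)
rng = np.random.default_rng(0)
draws = de_mc_sample(lambda x: -0.5 * np.sum(x**2),
                     n_chains=20, n_iter=400, dim=2, rng=rng)
post = draws[200:].reshape(-1, 2)  # drop burn-in, pool all chains
```

Because the proposal is built from the ensemble itself, no gradients, preliminary mode search, or hand-tuned step sizes are needed, which is the property the abstract emphasizes.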
The authors estimate perceptions about the Fed's monetary policy rule from panel data on professional forecasts of interest rates and macroeconomic conditions. The perceived dependence of the federal funds rate on economic conditions is time-varying and cyclical: high during tightening episodes but low during easings. Forecasters update their perceptions about the policy rule in response to monetary policy actions, measured by high-frequency interest rate surprises, suggesting that forecasters have imperfect information about the rule. The perceived rule impacts asset prices crucial for monetary policy transmission, driving how interest rates respond to macroeconomic news and explaining term premia in long-term interest rates.
We employ a proprietary transaction-level dataset in Germany to examine how capital requirements affect the liquidity of corporate bonds. Using the 2011 European Banking Authority capital exercise that mandated certain banks to increase regulatory capital, we find that affected banks reduce their inventory holdings, pre-arrange more trades, and have smaller average trade size. While non-bank affiliated dealers increase their market-making activity, they are unable to bridge this gap - aggregate liquidity declines. Our results are stronger for banks with a higher capital shortfall, for non-investment grade bonds, and for bonds where the affected banks were the dominant market-maker.
We develop a two-sector incomplete markets integrated assessment model to analyze the effectiveness of green quantitative easing (QE) in complementing fiscal policies for climate change mitigation. We model green QE through an outstanding stock of private assets held by a monetary authority and its portfolio allocation between a clean and a dirty sector of production. Green QE leads to a partial crowding out of private capital in the green sector and to a modest reduction of the global temperature of 0.04 degrees Celsius by 2100. A moderate global carbon tax of 50 USD per tonne of carbon is four times more effective.
Many people do not understand the concepts of life expectancy and longevity risk, potentially leading them to under-save for retirement or to not purchase longevity insurance, which in turn could reduce wellbeing at older ages. We investigate alternative ways to increase the salience of both concepts, allowing us to assess whether these change people’s perceptions and financial decision making. Using randomly-assigned vignettes providing subjects with information about either life expectancy or longevity, we show that merely prompting people to think about financial decisions changes their perceptions regarding subjective survival probabilities. Moreover, this information also boosts respondents’ interest in saving and demand for longevity insurance. In particular, longevity information influences both subjective survival probabilities and financial decisions, while life expectancy information influences only annuity choices. We provide some evidence that many people are simply unaware of longevity risk.
When the COVID-19 crisis struck, banks using internal-rating based (IRB) models quickly recognized the increase in risk and reduced lending more than banks using a standardized approach. This effect is not driven by borrowers’ quality or by banks in countries with credit booms before the pandemic. The higher risk sensitivity of IRB models does not always result in lower credit provision when risk intensifies. Certain features of the IRB models – the use of a downturn Loss Given Default parameter – can increase banks’ resilience and preserve their intermediation capacity also during downturns. Affected borrowers were not able to fully insulate themselves and reduced corporate investment.
Previous studies document a relationship between gambling activity at the aggregate level and investments in securities with lottery-like features. We combine data on individual gambling consumption with portfolio holdings and trading records to examine whether gambling and trading act as substitutes or complements. We find that gamblers are more likely than the average investor to hold lottery stocks, but significantly less likely than active traders who do not gamble. Our results suggest that gambling behavior across domains is less relevant compared to other portfolio characteristics that predict investing in high-risk and high-skew securities, and that gambling on and off the stock market act as substitutes to satisfy the same need, e.g., sensation seeking.
Colocation services offered by stock exchanges enable market participants to achieve execution costs for large orders that are substantially lower and less sensitive to transacting against high-frequency traders. However, these benefits manifest only for orders executed on the colocated brokers' own behalf, whereas customers' order execution costs are substantially higher. Analyses of individual order executions indicate that customer orders originating from colocated brokers are less actively monitored and achieve inferior execution quality. This suggests that brokers do not make effective use of their technology, possibly due to agency frictions or poor algorithm selection and parameter choice by customers.
The leading premium
(2022)
In this paper, we consider conditional measures of lead-lag relationships between aggregate growth and industry-level cash-flow growth in the US. Our results show that firms in leading industries pay an average annualized return 3.6% higher than that of firms in lagging industries. Using both time-series and cross-sectional tests, we estimate an annual pure timing premium ranging from 1.2% to 1.7%. This finding can be rationalized in a model in which (a) agents price growth news shocks, and (b) leading industries provide valuable resolution of uncertainty about the growth prospects of lagging industries.
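The notion of a "leading" industry above rests on lead-lag correlations between industry and aggregate growth. A minimal unconditional sketch (the paper uses conditional measures; the data here are simulated for illustration) classifies an industry as leading when its growth correlates most strongly with future aggregate growth:

```python
import numpy as np

def lead_lag_corr(industry, aggregate, max_lag=4):
    # Correlation of industry growth at t with aggregate growth at t+k;
    # a peak at k > 0 marks the industry as "leading" the aggregate.
    T = len(aggregate)
    out = {}
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            x, y = industry[:T - k], aggregate[k:]
        else:
            x, y = industry[-k:], aggregate[:T + k]
        out[k] = np.corrcoef(x, y)[0, 1]
    return out

# simulated example: the industry leads the aggregate by two periods
rng = np.random.default_rng(0)
industry = rng.normal(size=300)
aggregate = (np.concatenate([np.zeros(2), industry[:-2]])
             + 0.1 * rng.normal(size=300))
cc = lead_lag_corr(industry, aggregate)
```

Sorting industries by the lag at which this correlation peaks yields leading and lagging portfolios, whose return spread is the premium the abstract quantifies.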
Advances in machine learning (ML) have led organizations to increasingly implement predictive decision aids intended to improve employees’ decision-making performance. While such systems improve organizational efficiency in many contexts, they might be a double-edged sword when there is the danger of a system discontinuance. Following cognitive theories, the provision of ML-based predictions can adversely affect the development of decision-making skills, a deficit that comes to light when people lose access to the system. The purpose of this study is to put this assertion to the test. Using a novel experiment specifically tailored to deal with organizational obstacles and endogeneity concerns, we show that the initial provision of ML decision aids can latently prevent the development of decision-making skills, which later becomes apparent when the system gets discontinued. We also find that the degree to which individuals 'blindly' trust observed predictions determines the ultimate performance drop in the post-discontinuance phase. Our results suggest that making it clear to people that ML decision aids are imperfect can have benefits, especially if there is a reasonable danger of (temporary) system discontinuance.
Search costs for lenders when evaluating potential borrowers are driven by the quality of the underwriting model and by access to data. Both have undergone radical change over the last years, due to the advent of big data and machine learning. For some, this holds the promise of inclusion and better access to finance. Invisible prime applicants perform better under AI than under traditional metrics. Broader data and more refined models help to detect them without triggering prohibitive costs. However, not all applicants profit to the same extent. Historic training data shape algorithms, biases distort results, and data as well as model quality are not always assured. Against this background, an intense debate over algorithmic discrimination has developed. This paper takes a first step towards developing principles of fair lending in the age of AI. It submits that there are fundamental difficulties in fitting algorithmic discrimination into the traditional regime of anti-discrimination laws. Received doctrine with its focus on causation is in many cases ill-equipped to deal with algorithmic decision-making under both, disparate treatment, and disparate impact doctrine. The paper concludes with a suggestion to reorient the discussion and with the attempt to outline contours of fair lending law in the age of AI.
Many nations incentivize retirement saving by letting workers defer taxes on pension contributions, imposing them when retirees withdraw their funds. Using a dynamic life cycle model, we show how ‘Rothification’ – that is, taxing 401(k) contributions rather than payouts – alters saving, investment, consumption, and Social Security claiming patterns. We find that taxing pension contributions instead of withdrawals leads to delayed retirement, somewhat lower lifetime tax payments, and relatively small reductions in consumption. Indeed, the two tax regimes generate quite similar relative inequality metrics: the relative consumption inequality ratio under TEE is only four percent higher than in the EET case. Moreover, results indicate that the Gini measures are also strikingly similar under the EET and the TEE regimes for lifetime consumption, cash on hand, and 401(k) assets, differing by only 1-4 percent. While tax payments are higher early in life under the TEE regime, they are slightly lower in the long run. Moreover, higher EET tax payments are also accompanied by higher volatility. We therefore find few reasons for policymakers to favor either tax approach on egalitarian or revenue-enhancing grounds.
We analyze how market fragmentation affects market quality of SME and other less actively traded stocks. Compared to large stocks, they are less likely to be traded on multiple venues and show, if at all, low levels of fragmentation. Concerning the impact of fragmentation on market quality, we find evidence for a hockey stick effect: Fragmentation has no effect for infrequently traded stocks, a negative effect on liquidity of slightly more active stocks, and increasing benefits for liquidity of large and actively traded stocks. Consequently, being traded on multiple venues is not necessarily harmful for SME stock market quality.
SAFE Update December 2022
(2022)
SAFE Update October 2022
(2022)
The authors propose a new method to forecast macroeconomic variables that combines two existing approaches to mixed-frequency data in DSGE models. The first existing approach estimates the DSGE model at a quarterly frequency and uses higher-frequency auxiliary data only for forecasting. The second method transforms a quarterly state space into a monthly frequency. Their algorithm combines the advantages of these two existing approaches. They compare the new method with the existing methods using simulated data and real-world data. With simulated data, the new method outperforms all other methods, including forecasts from the standard quarterly model. With real-world data, incorporating auxiliary variables as in their method substantially decreases forecasting errors for recessions, but casting the model in a monthly frequency delivers better forecasts in normal times.
We investigate the impact of uneven transparency regulation across countries and industries on the location of economic activity. Using two distinct sources of regulatory variation—the varying extent of financial-reporting requirements and the staggered introduction of electronic business registers in Europe—we consistently document that direct exposure to transparency regulation is negatively associated with the focal industry’s economic activity in terms of inputs (e.g., employment) and outputs (e.g., production). By contrast, we find that indirect exposure to supplier and customer industries’ transparency regulation is positively associated with the focal industry’s economic activity. Our evidence suggests uneven transparency regulation can reallocate economic activity from regulated toward unregulated countries and industries, distorting the location of economic activity.
To ensure the credibility of market discipline induced by bail-in, neither retail investors nor peer banks should appear prominently among the investor base of banks’ loss absorbing capital. Empirical evidence on bank-level data provided by the German Federal Financial Supervisory Authority raises a few red flags. Our list of policy recommendations encompasses disclosure policy, data sharing among supervisors, information transparency on holdings of bail-inable debt for all stakeholders, threshold values, and a well-defined upper limit for any bail-in activity. This document was provided by the Economic Governance Support Unit at the request of the ECON Committee.
European banks have substantial investments in assets that are measured without directly observable market prices (mark-to-model). Financial disclosures of these value estimates lack standardization and are hard to compare across banks. These comparability concerns are concentrated in large European banks that extensively rely on level 3 estimates with the most unobservable inputs. Although the relevant balance sheet positions only represent a small fraction of these large banks’ total assets (2.9%), their value equals a significant fraction of core equity tier 1 (48.9%). Incorrect valuations thus have the potential to impact financial stability. 85% of these bank assets are under direct ECB supervision. Prudential regulation requires value adjustments that are apt to shield capital against valuation risk. Yet, stringent enforcement is critical for achieving this objective. This document was provided by the Economic Governance Support Unit at the request of the ECON Committee.
Short sale bans may improve market quality during crises: new evidence from the 2020 Covid crash
(2022)
In theory, banning short selling stabilizes stock prices but undermines pricing efficiency and has ambiguous impacts on market liquidity. Empirical studies find mixed and conflicting results. This paper leverages cross-country policy variation during the 2020 Covid crisis to assess the differential impacts of bans on stock liquidity, prices, and volatility. Results suggest that bans improved liquidity and stabilized prices for illiquid stocks but temporarily diminished liquidity for highly liquid stocks. The findings support theories in which short sale bans may improve liquidity by selectively filtering out informed—potentially predatory—traders. Thus, policies that target the most illiquid stocks may deliver better overall market quality than uniform short sale bans imposed on all stocks.
With open banking, consumers take greater control over their own financial data and share it at their discretion. Using a rich set of loan application data from the largest German FinTech lender in consumer credit, this paper studies what characterizes borrowers who share data and assesses its impact on loan application outcomes. I show that riskier borrowers share data more readily, which subsequently leads to an increase in the probability of loan approval and a reduction in interest rates. The effects hold across all credit risk profiles but are the most pronounced for borrowers with lower credit scores (a higher increase in loan approval rate) and higher credit scores (a larger reduction in interest rate). I also find that standard variables used in credit scoring explain substantially less variation in loan application outcomes when customers share data. Overall, these findings suggest that open banking improves financial inclusion, and also provide policy implications for regulators engaged in the adoption or extension of open banking policies.
With free delivery of products virtually being a standard in E-commerce, product returns pose a major challenge for online retailers and society. For retailers, product returns involve significant transportation, labor, disposal, and administrative costs. From a societal perspective, product returns contribute to greenhouse gas emissions and packaging disposal and are often a waste of natural resources. Therefore, reducing product returns has become a key challenge. This paper develops and validates a novel smart green nudging approach to tackle the problem of product returns during customers’ online shopping processes. We combine a green nudge with a novel data enrichment strategy and a modern causal machine learning method. We first run a large-scale randomized field experiment in the online shop of a German fashion retailer to test the efficacy of a novel green nudge. Subsequently, we fuse the data from about 50,000 customers with publicly available aggregate data to create what we call enriched digital footprints and train a causal machine learning system capable of optimizing the administration of the green nudge. We report two main findings: First, our field study shows that the large-scale deployment of a simple, low-cost green nudge can significantly reduce product returns while increasing retailer profits. Second, we show how a causal machine learning system trained on the enriched digital footprint can amplify the effectiveness of the green nudge by “smartly” administering it only to certain types of customers. Overall, this paper demonstrates how combining a low-cost marketing instrument, a privacy-preserving data enrichment strategy, and a causal machine learning method can create a win-win situation from both an environmental and economic perspective by simultaneously reducing product returns and increasing retailers’ profits.
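The "smart" administration step, training on randomized experiment data and nudging only customers with a predicted beneficial effect, can be sketched with a simple two-model (T-learner) approach. This is a hedged illustration on simulated data, not the authors' proprietary system; all variable names and effect sizes are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
x = rng.normal(size=(n, 2))            # stand-in "digital footprint" features
t = rng.integers(0, 2, size=n)         # randomized nudge assignment (field experiment)
# simulated return propensity: the nudge reduces returns only when x[:,0] > 0
y = 0.5 + 0.3 * x[:, 0] - 0.4 * t * (x[:, 0] > 0) + 0.1 * rng.normal(size=n)

# T-learner: fit one outcome model per experimental arm, then difference them
X = np.column_stack([np.ones(n), x, x[:, 0] > 0])
beta1, *_ = np.linalg.lstsq(X[t == 1], y[t == 1], rcond=None)
beta0, *_ = np.linalg.lstsq(X[t == 0], y[t == 0], rcond=None)
tau_hat = X @ beta1 - X @ beta0        # predicted individual treatment effect

# nudge only customers with a materially negative predicted effect on returns
target = tau_hat < -0.1
```

Targeting on `tau_hat` rather than nudging everyone is what lets the causal model amplify the nudge's aggregate effect, as the abstract describes.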
Financial literacy affects wealth accumulation, and pension planning plays a key role in this relationship. In a large field experiment, we employ a digital pension aggregation tool to confront a treatment group with a simplified overview of their current pension claims across all pillars of the pension system. We combine survey and administrative bank data to measure the effects on actual saving behavior. Access to the tool decreases pension uncertainty for treated individuals. Average savings increase, especially for the financially less literate. We conclude that simplification of pension information can potentially reduce disparities in pension planning and savings behavior.
This paper utilizes a comprehensive worker-firm panel for the Netherlands to quantify the impact of ICT capital-skill complementarity on the finance wage premium after the Global Financial Crisis. We apply additive worker and firm fixed-effect models to account for unobserved worker and firm heterogeneity and show that firm fixed effects correct for a downward bias in the estimated finance wage premium. Our results indicate a sizable finance wage premium for both fixed and full hourly wages. The complementarity between ICT capital spending and the share of high-skill workers at the firm level reduces the full-wage premium considerably and the fixed-wage premium almost entirely.
This note argues that, in a situation of inelastic natural gas supply, a restrictive monetary policy in the euro zone could reduce the energy bill and therefore has additional merits. A more hawkish monetary policy may be able to indirectly exploit monopsony power on the gas market. The welfare benefits of such a policy are diluted to the extent that some of the supply (approximately 10 percent) comes from within the euro zone, which may give rise to distributional concerns.