By computing a volatility index (CVX) from cryptocurrency option prices, we analyze this market’s expectation of future volatility. Our method addresses the challenging liquidity environment of this young asset class and allows us to extract stable market-implied volatilities. Two alternative methods are considered to compute volatilities from granular intra-day cryptocurrency options data, which spans the COVID-19 pandemic period. CVX data therefore capture ‘normal’ market dynamics as well as distress and recovery periods. The methods yield two cointegrated index series, where the corresponding error correction model can be used as an indicator for market-implied tail risk. Comparing our CVX to existing volatility benchmarks for traditional asset classes, such as VIX (equity) or GVX (gold), confirms that cryptocurrency volatility dynamics are often disconnected from traditional markets yet share common shocks.
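The cointegration-plus-error-correction logic behind the tail-risk indicator can be sketched with a minimal two-step Engle-Granger estimation on synthetic data. This is a generic textbook sketch, not the paper's actual estimation; the data-generating process and variable names are purely illustrative.

```python
import numpy as np

def engle_granger_ecm(y, x):
    """Two-step Engle-Granger sketch: (1) cointegrating regression,
    (2) error-correction model on first differences.
    Returns the long-run slope beta and the adjustment speed alpha."""
    # Step 1: long-run relation y_t = c + beta * x_t + u_t
    X = np.column_stack([np.ones_like(x), x])
    c, beta = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - c - beta * x                      # equilibrium error
    # Step 2: ECM  dy_t = a0 + alpha * u_{t-1} + g * dx_t + e_t
    dy, dx = np.diff(y), np.diff(x)
    Z = np.column_stack([np.ones_like(dx), u[:-1], dx])
    a0, alpha, g = np.linalg.lstsq(Z, dy, rcond=None)[0]
    return beta, alpha

# Synthetic cointegrated pair: x is a random walk, y tracks 2 * x.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=2000))
y = 2.0 * x + rng.normal(scale=0.5, size=2000)
beta, alpha = engle_granger_ecm(y, x)   # alpha < 0: y corrects toward 2 * x
```

A negative, significant `alpha` indicates that deviations between the two series are transitory; in the paper's setting, the size of the equilibrium error itself is what serves as the tail-risk signal.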
The pricing of digital art
(2023)
The intersection of recent advancements in generative artificial intelligence and blockchain technology has propelled digital art into the spotlight. Digital art pricing recognizes that owners derive utility beyond the artwork’s inherent value. We incorporate the consumption utility associated with digital art and model the stochastic discount factor and risk premiums. Furthermore, we conduct a calibration exercise to analyze the effects of shifts in the real and digital economy. Higher returns are required in a digital market upswing due to increased exposure to systematic risk, and digital art prices are especially responsive to fluctuations in business cycles within digital markets.
Using a field study at a German brokerage, we investigate advised individual investors’ behavior and outcomes after self-selecting into a flat-fee scheme (percentage of portfolio value) for mutual funds. In a difference-in-differences setting, we compare 699 switchers to propensity-score-matched advisory clients who remained in the commission-based scheme. Switchers increase their portfolio values, improve portfolio diversification, and increase their portfolio performance. They also demand more financial advice and follow more advisor recommendations. We argue that switchers attribute a higher quality to the unchanged advisory services.
We study the role mutual funds play in the recovery from fast intraday crashes based on data from the National Stock Exchange of India for a single large stock. During normal times, trading activity and liquidity provision by mutual funds are negligible compared to other traders, at around 4% of overall activity. Nevertheless, for the two intraday market-wide crashes in our sample, price recovery took place only after mutual funds moved in. Market stability may require the presence of well-capitalized standby liquidity providers for recovery from fast crashes.
The recent COVID-19 pandemic represents an unprecedented worldwide event to study the influence of related news on the financial markets, especially during the early stage of the pandemic when information on the new threat came rapidly and was complex for investors to process. In this paper, we investigate whether the flow of news on COVID-19 had an impact on forming market expectations. We analyze 203,886 online articles dealing with COVID-19 and published on three news platforms (MarketWatch.com, NYTimes.com, and Reuters.com) in the period from January to June 2020. Using machine learning techniques, we extract the news sentiment through a financial market-adapted BERT model that enables recognizing the context of each word in a given item. Our results show that there is a statistically significant and positive relationship between sentiment scores and S&P 500 market returns. Furthermore, we provide evidence that sentiment components and news categories on NYTimes.com were differently related to market returns.
Can consumption-based mechanisms generate positive and time-varying real term premia as we see in the data? I show that only models with time-varying risk aversion or models with high consumption risk can independently produce these patterns. The latter explanation has not been analysed before with respect to real term premia, and it relies on a small group of investors exposed to high consumption risk. Additionally, it can give rise to a “consumption-based arbitrageur” story of term premia. In relation to preferences, I consider models with both time-separable and recursive utility functions. Specifically for recursive utility, I introduce a novel perturbation solution method in terms of the intertemporal elasticity of substitution. This approach has not been used before in such models, it is easy to implement, and it allows a wide range of values for the parameter of intertemporal elasticity of substitution.
The complexities of geopolitical events, financial and fiscal crises, and the ebb and flow of personal life circumstances can weigh heavily on individuals’ minds as they make critical economic decisions. To investigate the impact of cognitive load on such decisions, the authors conducted an incentivized online experiment involving a representative sample of 2,000 French households. The results revealed that exposure to a taxing and persistent cognitive load significantly reduced consumption, particularly for individuals under the threat of furlough, while simultaneously increasing their account balances, particularly for those not facing such employment uncertainty. These effects were not driven by supply constraints or a worsening of credit constraints. Instead, cognitive load primarily affected the optimality of the chosen policy rules and impaired the ability of the standard economic model to accurately predict consumption patterns, although this effect was less pronounced among college-educated subjects.
We investigate how unconventional monetary policy, via central banks’ purchases of corporate bonds, unfolds in credit-saturated markets. While this policy results in a loosening of credit market conditions as intended by policymakers, we report two unintended side effects. First, the policy impacts the allocation of credit among industries. Affected banks reallocate loans from investment-grade firms active on bond markets almost entirely to real estate asset managers. Other industries do not obtain more loans, particularly real estate developers and construction firms. We document an increase in real estate prices due to this policy, which fuels real estate overvaluation. Second, more loan write-offs arise from lending to these firms, and banks are not compensated for this risk by higher interest rates. We document a drop in bank profitability and, at the same time, a higher reliance on real estate collateral. Our findings suggest that central banks’ quantitative easing has substantial adverse effects in credit-saturated economies.
We conduct a field experiment with clients of a German universal bank to explore the impact of peer information on sustainable retail investments. Our results show that information about peers’ inclination towards sustainable investing raises the amount allocated to stock funds labeled sustainable, when communicated during a buying decision. This effect is primarily driven by participants initially underestimating peers’ propensity to invest sustainably. Further, treated individuals indicate an increased interest in additional information on sustainable investments, primarily on risk and return expectations. However, when analyzing account-level portfolio holding data over time, we detect no spillover effects of peer information on later sustainable investment decisions.
Many consumers care about climate change and other externalities associated with their purchases. We analyze the behavior and market effects of such “socially responsible consumers” in three parts. First, we develop a flexible theoretical framework to study competitive equilibria with rational consequentialist consumers. In violation of price taking, equilibrium feedback non-trivially dampens a consumer’s mitigation efforts, undermining responsible behavior. This leads to a new type of market failure, where even consumers who fully “internalize the externality” overconsume externality-generating goods. At the same time, socially responsible consumers change the relative effectiveness of taxes, caps, and other policies in lowering the externality. Second, since consumer beliefs about and preferences over dampening play a crucial role in our framework, we investigate them empirically via a tailored survey. Consistent with our model, consumers are predominantly consequentialist, and on average believe in dampening. Inconsistent with our model, however, many consumers fail to anticipate dampening. Third, therefore, we analyze how such “naive” consumers modify our theoretical conclusions. Naive consumers behave more responsibly than rational consumers in a single-good economy, but may behave less responsibly in a multi-good economy with cross-market spillovers. A mix of naive and rational consumers may yield the worst outcomes.
This paper investigates stock market reaction to greenwashing by analyzing a new channel whereby companies change their names to green-related ones (i.e., names that evoke green and sustainable sentiments) to persuade the public that their activities are green. The findings reveal a striking positive stock price reaction to the announcement of corporate name changes to green-related names only for companies not involved in green activities at the time of the announcement. However, over an extended period of time, companies unrelated to green activities experience substantial negative abnormal returns if they fail to align their operational focus with the new name after the change.
How does group identity affect belief formation? To address this question, we conduct a series of online experiments with a representative sample of individuals in the US. Using the setting of the 2020 US presidential election, we find evidence of intergroup preference across three distinct components of the belief formation cycle: a biased prior belief, avoidance of outgroup information sources, and a belief-updating process that places greater (less) weight on prior (new) information. We further find that an intervention reducing the salience of information sources decreases outgroup information avoidance by 50%. In a social learning context in wave 2, we find participants place 33% more weight on ingroup than outgroup guesses. Through two waves of interventions, we identify source utility as the mechanism driving group effects in belief formation. Our analyses indicate that our observed effects are driven by groupy participants who exhibit stable and consistent intergroup preferences in both allocation decisions and belief formation across all three waves. These results suggest that policymakers could reduce the salience of group and partisan identity associated with a policy to decrease outgroup information avoidance and increase policy uptake.
This paper applies structure-preserving doubling methods to solve the matrix quadratic underlying the recursive solution of linear DSGE models. We present and compare two Structure-Preserving Doubling Algorithms (SDAs) to other competing methods – the QZ method, a Newton algorithm, and an iterative Bernoulli approach – as well as the related cyclic and logarithmic reduction algorithms. Our comparison is completed using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and iterating over different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). We find that both SDAs perform very favorably relative to QZ, with generally more accurate solutions computed in less time. While we collect theoretical convergence results that promise quadratic convergence rates to a unique stable solution, the algorithms may fail to converge when there is a breakdown due to singularity of the coefficient matrices in the recursion. One of the proposed algorithms can overcome this problem by an appropriate (re)initialization. This SDA also performs particularly well in refining solutions of different methods or from nearby parameterizations.
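For intuition, the matrix quadratic at the heart of all these methods, A X² + B X + C = 0, can be solved for its minimal (stable) solvent with a simple fixed-point iteration. This is a generic illustration only, not one of the paper's SDAs, which are considerably more involved and converge quadratically rather than linearly; the diagonal test problem is ours.

```python
import numpy as np

def solve_matrix_quadratic(A, B, C, tol=1e-10, max_iter=500):
    """Fixed-point sketch for the matrix quadratic A X^2 + B X + C = 0
    arising in linear DSGE solutions: iterate X <- -(A X + B)^{-1} C.
    Converges linearly to the minimal solvent under standard
    eigenvalue-separation conditions."""
    n = A.shape[0]
    X = np.zeros((n, n))
    for _ in range(max_iter):
        X_new = -np.linalg.solve(A @ X + B, C)
        if np.linalg.norm(X_new - X) < tol:
            return X_new
        X = X_new
    raise RuntimeError("no convergence; coefficient matrices may be ill-conditioned")

# Diagonal example with known solvent: X^2 - 3X + 2I = 0 has roots I and 2I;
# the iteration picks the stable one, X = I.
A, B, C = np.eye(2), -3 * np.eye(2), 2 * np.eye(2)
X = solve_matrix_quadratic(A, B, C)
residual = np.linalg.norm(A @ X @ X + B @ X + C)
```

The breakdown mode mentioned in the abstract corresponds to `A @ X + B` (or its SDA analogue) becoming singular along the recursion, which is what an appropriate (re)initialization is meant to avoid.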
Whatever it takes to understand a central banker : embedding their words using neural networks
(2023)
Dictionary approaches are at the forefront of current techniques for quantifying central bank communication. In this paper, the authors propose a novel language model that is able to capture subtleties of messages, such as one of the most famous sentences in central bank communication: ECB President Mario Draghi's statement that "within [its] mandate, the ECB is ready to do whatever it takes to preserve the euro".
The authors utilize a text corpus that is unparalleled in size and diversity in the central bank communication literature, and introduce a novel approach to text quantification from computational linguistics. This allows them to provide high-quality central bank-specific textual representations and demonstrate their applicability by developing an index that tracks deviations in the Fed's communication towards inflation targeting. Their findings indicate that these deviations in communication significantly impact monetary policy actions, substantially reducing the reaction to inflation deviations in the US.
Christine Laudenbach and Vincent Lindner: To promote financial education among children, young people, and adults in the long term, comprehensive information services must reach the entire population in Germany with the help of cooperation partners. Talking about finances can no longer be a taboo subject.
Standard applications of the consumption-based asset pricing model assume that goods and services within the nondurable consumption bundle are substitutes. We estimate substitution elasticities between different consumption bundles and show that households cannot substitute energy consumption by consumption of other nondurables. As a consequence, energy consumption affects the pricing function as a separate factor. Variation in energy consumption betas explains a large part of the premia related to value, investment, and operating profitability. For example, value stocks are typically more energy-intensive than growth stocks and thus riskier, since they suffer more from the oil supply shocks that also affect households.
We propose a model with mean-variance foreign investors who exhibit a convex disutility associated with brown bond holdings. The model predicts that bond green premia should be smaller in economies with a more closed financial account and highly volatile exchange rates. This happens because foreign intermediaries invest relatively less in such economies, and this lowers the marginal disutility of investing in polluting activities. We find strong empirical evidence in favor of this hypothesis using a global bond market dataset. Exchange rate volatility and financial account openness are thus able to explain the higher financing costs of green projects in emerging markets relative to advanced economies, especially when green bonds are denominated in local currency: a disadvantage that we can call the "green sin" of emerging economies.
This study looks at potential windfall profits for the four banking acquisitions in 2023. Based on accounting figures, an FT article states that a total of USD 44bn was left on the table. We regard accounting figures as misleading for this analysis. By estimating market-based cumulative abnormal returns (CARs), we find positive abnormal returns in all four cases which, when quantified, amount to around half of the FT’s accounting figures. Furthermore, we argue that transparent auctions with enough bidders should be preferred to negotiated bank sales.
This document was provided/prepared by the Economic Governance and EMU Scrutiny Unit at the request of the ECON Committee.
This paper explores entrepreneurs’ initially intended exit strategies and compares them to their final exit paths using an inductive approach that builds on the grounded theory methodology. Our data shows that initially intended and final exit strategies differ among entrepreneurs. Two groups of entrepreneurs emerged from our data. The first group comprises entrepreneurs who financed their firms through equity investors. The second group is made up of entrepreneurs who financed their businesses solely with their own equity. Our data shows that the first group originally intended a financial harvest exit strategy and settled with this harvest exit strategy. The second group initially intended a stewardship exit strategy but did not succeed. We used the theory of planned behavior and the behavioral agency model to analyze our data. By examining our results from these two theoretical perspectives, our study explains how entrepreneurs’ exit intentions lead to their actual exit strategies.
This paper develops and implements a backward and forward error analysis of and condition numbers for the numerical stability of the solutions of linear dynamic stochastic general equilibrium (DSGE) models. Comparing seven different solution methods from the literature, I demonstrate an economically significant loss of accuracy specifically in standard, generalized Schur (or QZ) decomposition based solutions methods resulting from large backward errors in solving the associated matrix quadratic problem. This is illustrated in the monetary macro model of Smets and Wouters (2007) and two production-based asset pricing models, a simple model of external habits with a readily available symbolic solution and the model of Jermann (1998) that lacks such a symbolic solution - QZ-based numerical solutions miss the equity premium by up to several annualized percentage points for parameterizations that either match the chosen calibration targets or are nearby to the parameterization in the literature. While the numerical solution methods from the literature failed to give any indication of these potential errors, easily implementable backward-error metrics and condition numbers are shown to successfully warn of such potential inaccuracies. The analysis is then performed for a database of roughly 100 DSGE models from the literature and a large set of draws from the model of Smets and Wouters (2007). While economically relevant errors do not appear pervasive from these latter applications, accuracies that differ by several orders of magnitude persist.
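The kind of easily implementable backward-error metric advocated here can be illustrated with the standard normalized residual for a computed solvent of the matrix quadratic. This is a sketch of the general idea on a toy problem of our own; the paper's exact metrics and condition numbers may differ.

```python
import numpy as np

def backward_error(A, B, C, X):
    """Normalized residual of a computed solvent X of A X^2 + B X + C = 0,
    a standard backward-error style metric for quadratic matrix equations:
    a small value means X exactly solves a nearby problem."""
    num = np.linalg.norm(A @ X @ X + B @ X + C)
    den = (np.linalg.norm(A) * np.linalg.norm(X) ** 2
           + np.linalg.norm(B) * np.linalg.norm(X)
           + np.linalg.norm(C))
    return num / den

# Exact solvent X = I of X^2 - 3X + 2I = 0 has zero backward error;
# a perturbed candidate does not, and the metric flags it.
A, B, C = np.eye(2), -3 * np.eye(2), 2 * np.eye(2)
err_exact = backward_error(A, B, C, np.eye(2))
err_bad = backward_error(A, B, C, 1.1 * np.eye(2))
```

The point of the paper is that metrics like this, computed after the fact, warn of inaccuracies that the solution methods themselves give no indication of.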
Product aesthetics is a powerful means for achieving competitive advantage. Yet most studies to date have focused on the role of aesthetics in shaping pre-purchase preferences and have failed to consider how product aesthetics affects post-purchase processes and consumers' usage behavior. This research focuses on the relationship between aesthetics and usage behavior in the context of durable products. Studies 1A to 1C provide evidence of a positive effect of product aesthetics on usage intensity using market data from the car and the fashion industries. Study 2 corroborates these findings and shows that the more intensive use of highly aesthetic products may lead to the acquisition of product-specific usage skills that form the basis for a cognitive lock-in. Hence, consumers are less likely to switch away from products with appealing designs, an effect that is labeled as the ‘aesthetic fidelity’ effect. Study 3 addresses an alternative explanation for the ‘aesthetic fidelity effect’ based on mood and motivation but finds that the ‘aesthetic fidelity’ effect is indeed determined by usage intensity. Finally, Study 4 identifies a boundary condition of the positive effect of product aesthetics on product usage, showing that it is limited to durable products. In sum, this research demonstrates that the effects of product aesthetics extend beyond the pre-consumption stage and have an enduring impact on people's consumption experiences.
This research examines the impact of online display advertising and paid search advertising relative to offline advertising on firm performance and firm value. Using proprietary data on annualized advertising expenditures for 1651 firms spanning seven years, we document that both display advertising and paid search advertising exhibit positive effects on firm performance (measured by sales) and firm value (measured by Tobin's q). Paid search advertising has a more positive effect on sales than offline advertising, consistent with paid search being closest to the actual purchase decision and having enhanced targeting abilities. Display advertising exhibits a relatively more positive effect on Tobin's q than offline advertising, consistent with its long-term effects. The findings suggest heterogeneous economic benefits across different types of advertising, with direct implications for managers in analyzing advertising effectiveness and external stakeholders in assessing firm performance.
Most event studies rely on cumulative abnormal returns, measured as percentage changes in stock prices, as their dependent variable. Stock price reflects the value of the operating business plus non-operating assets minus debt. Yet, many events, in particular in marketing, only influence the value of the operating business, but not non-operating assets and debt. For these cases, the authors argue that the cumulative abnormal return on the operating business, defined as the ratio between the cumulative abnormal return on stock price and the firm-specific leverage effect, is a more appropriate dependent variable. Ignoring the differences in firm-specific leverage effects inflates the impact of observations pertaining to firms with large debt and deflates those pertaining to firms with large non-operating assets. Observations of firms with high debt receive several times the weight attributed to firms with low debt. A simulation study and the reanalysis of three previously published marketing event studies shows that ignoring the firm-specific leverage effects influences an event study's results in unpredictable ways.
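Our reading of the proposed adjustment can be sketched as follows, assuming equity value equals the value of the operating business plus non-operating assets minus debt; the paper's exact definition of the firm-specific leverage effect may differ, and the numbers in the example are hypothetical.

```python
def car_operating(car_stock, market_cap, debt, non_operating_assets):
    """Translate a stock-price CAR into a CAR on the operating business.
    Assumes equity = operating business + non-operating assets - debt,
    so the firm-specific leverage effect is OB / equity and
    CAR_operating = CAR_stock / leverage_effect."""
    operating = market_cap + debt - non_operating_assets
    leverage_effect = operating / market_cap
    return car_stock / leverage_effect

# A 2% stock-price CAR for a firm with equity 100 and debt 100 (no excess
# assets) corresponds to only a 1% abnormal return on the operating business:
# the same operating-value shock moves the thin equity slice twice as much.
car = car_operating(0.02, market_cap=100.0, debt=100.0, non_operating_assets=0.0)
```

This makes the abstract's weighting argument concrete: pooling raw stock CARs lets the high-debt firm's 2% observation dominate an unlevered firm's 1%, even though both reflect the same operating-business effect.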
This article uses information from two data sources, Compustat and Nexis Uni, and textual analysis to measure and validate the brand focus and customer focus of 109 U.S. listed retailers. The results from an analysis of their 853 earnings calls in 2010 and 2018 outline that on average, both foci increased over time. Although both foci vary substantially, brand focus varies more widely across retailers than their customer focus. Both foci are independent of each other. Specialty retailers have the highest brand focus, and internet & direct marketing retailers have the highest customer focus. A positive correlation exists between a retailer’s customer focus and its profitability, but not between a retailer’s brand focus and its profitability. The authors use the results to generate a research agenda that can direct future research in further systematically exploring firms’ brand and customer focus.
Knowledge of consumers' willingness to pay (WTP) is a prerequisite to profitable price-setting. To gauge consumers' WTP, practitioners often rely on a direct single question approach in which consumers are asked to explicitly state their WTP for a product. Despite its popularity among practitioners, this approach has been found to suffer from hypothetical bias. In this paper, we propose a rigorous method that improves the accuracy of the direct single question approach. Specifically, we systematically assess the hypothetical biases associated with the direct single question approach and explore ways to de-bias it. Our results show that by using the de-biasing procedures we propose, we can generate a de-biased direct single question approach that is accurate enough to be useful for managerial decision-making. We validate this approach with two studies in this paper.
In recent years, European regulators have debated restricting the time an online tracker can track a user to better protect consumer privacy. Despite the significance of these debates, there has been a noticeable absence of any comprehensive cost-benefit analysis. This article fills this gap on the cost side by suggesting an approach to estimate the economic consequences of lifetime restrictions on cookies for publishers. The empirical study on cookies of 54,127 users who received ∼128 million ad impressions over ∼2.5 years yields an average cookie lifetime of 279 days, with an average value of €2.52 per cookie. Only ∼13 % of all cookies increase their daily value over time, but their average value is about four times larger than the average value of all cookies. Restricting cookies’ lifetime to one year (two years) could potentially decrease their lifetime value by ∼25 % (∼19 %), which represents a potential decrease in the value of all cookies of ∼9 % (∼5 %). Most cookies, however, would not be affected by lifetime restrictions of 12 or 24 months as 72 % (85 %) of the users delete their cookies within 12 (24) months. In light of the €10.60 billion cookie-based display ad revenue in Europe, such restrictions would endanger €904 million (€576 million) annually, equivalent to €2.08 (€1.33) per EU internet user. The article discusses the marketing-strategy challenges and opportunities these results present for advertisers and publishers.
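The headline figures above are mutually consistent, which a quick back-of-the-envelope check makes explicit. All inputs are the rounded numbers quoted in the abstract; the implied quantities are our own arithmetic, not figures from the article.

```python
# Rounded figures quoted in the abstract (1-year lifetime cap scenario).
COOKIE_AD_REVENUE_EUR = 10.60e9   # annual cookie-based display ad revenue, Europe
AT_RISK_1Y_EUR = 904e6            # annual revenue endangered by a 1-year cap
PER_USER_1Y_EUR = 2.08            # endangered revenue per EU internet user

# Share of total cookie revenue at risk: ~8.5%, i.e. the "~9 %" in the text.
implied_share = AT_RISK_1Y_EUR / COOKIE_AD_REVENUE_EUR

# Number of EU internet users implied by the per-user figure: ~435 million.
implied_users = AT_RISK_1Y_EUR / PER_USER_1Y_EUR
```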
Even as online advertising continues to grow, a central question remains: Who to target? Yet, advertisers know little about how to select from the hundreds of audience segments for targeting (and combinations thereof) for a profitable online advertising campaign. Utilizing insights from a field experiment on Facebook (Study 1), we develop a model that helps advertisers solve the cold-start problem of selecting audience segments for targeting. Our model enables advertisers to calculate the break-even performance of an audience segment to make a targeted ad campaign at least as profitable as an untargeted one. Advertisers can use this novel model to decide whether to test specific audience segments in their campaigns (e.g., in randomized controlled trials). We apply our model to data from the Spotify ad platform to study the profitability of different audience segments (Study 2). Approximately half of those audience segments require the click-through rate to double compared to an untargeted campaign, which is unrealistically high for most ad campaigns. Our model also shows that narrow segments require a lift that is likely not attainable, specifically when the data quality of these segments is poor. We confirm this theoretical finding in an empirical study (Study 3): A decrease in data quality due to Apple’s introduction of the App Tracking Transparency (ATT) framework more negatively affects the click-through rate of narrow (versus broad) audience segments.
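The break-even idea can be formalized in a deliberately simplified way: compare profit per impression under targeted and untargeted delivery and solve for the click-through rate at which they coincide. This is our own stripped-down formalization for illustration, not the paper's model, and all numbers are hypothetical.

```python
def breakeven_ctr(ctr_untargeted, cpm_untargeted, cpm_targeted, value_per_click):
    """Smallest click-through rate a targeted audience segment must reach for
    the targeted campaign to be at least as profitable per impression as an
    untargeted one, under the simplification
        profit per impression = CTR * value_per_click - CPM / 1000."""
    extra_cost = (cpm_targeted - cpm_untargeted) / 1000.0
    return ctr_untargeted + extra_cost / value_per_click

# If targeting doubles the CPM (4 vs. 2 EUR) and a click is worth 1 EUR,
# a 0.2% baseline CTR must rise to 0.4%: the CTR must double to break even,
# mirroring the lift the abstract reports for about half of the segments.
ctr_star = breakeven_ctr(0.002, cpm_untargeted=2.0, cpm_targeted=4.0, value_per_click=1.0)
```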
Small businesses face major challenges to becoming more innovative. These challenges are particularly prevalent in emerging economies where high uncertainties are a barrier to innovation. We know from previous studies that linkages to universities, on the one hand, and public procurement, on the other, support large and innovative firms in their efforts to become more innovative. However, we do not know whether these positive effects also hold true for small businesses. In this paper, we focus on how policy strategies reducing information, market and financial uncertainties shape small businesses’ innovation in China. Based on a sample of 926 small businesses derived from the World Bank Enterprises Survey in China (2012), we find that university-industry linkages enhance innovation, though only when it comes to minor forms of innovation. In line with the resource-based view of the firm, this effect is stronger for small businesses with higher capabilities. Moreover, we show that bidding for or delivering contracts to public sector clients has a positive effect on innovation, and in particular on major forms of innovation. In the bidding selection process, private firms and firms with higher capabilities are selected. Our findings show that both policy strategies have enhanced innovation, though with different effects on the degree of novelty. We attribute this finding to the different degrees of uncertainty they address.
In this article, we examine anti-refugee hate crime in the wake of the large influx of refugees to Germany in 2014 and 2015. By exploiting institutional features of the assignment of refugees to German regions, we estimate the impact of unexpected and sudden large-scale immigration on hate crime against refugees. Results indicate that it is not simply the size of local refugee inflows which drives the increase in hate crime, but rather the combination of refugee arrivals and latent anti-refugee sentiment. We show that ethnically homogeneous areas, areas which experienced hate crimes in the 1990s, and areas with high support for the Nazi party in the Weimar Republic, are more prone to respond to the arrival of refugees with incidents of hate crime against this group. Our results highlight the importance of regional anti-immigration sentiment in the analysis of the incumbent population’s reaction to immigration.
A novel spatial autoregressive model for panel data is introduced, which incorporates multilayer networks and accounts for time-varying relationships. Moreover, the proposed approach allows the structural variance to evolve smoothly over time and enables the analysis of shock propagation in terms of time-varying spillover effects.
The framework is applied to analyse the dynamics of international relationships among the G7 economies and their impact on stock market returns and volatilities. The findings underscore the substantial impact of cooperative interactions and highlight discernible disparities in network exposure across G7 nations, along with nuanced patterns in direct and indirect spillover effects.
In his speech at the conference "The SNB and its Watchers", Otmar Issing, member of the ECB Governing Council from its start in 1998 until 2006, takes a look back at more than twenty years of the conference series "The ECB and Its Watchers". In June 1999, Issing established this format together with Axel Weber, then Director of the Center for Financial Studies, to discuss the monetary policy strategy of the newly founded central bank on "neutral ground" with a broad circle of participants, that is, academics, bank economists, and members of the media. At the annual conference, the ECB and its representatives would play an active role and engage in a lively exchange of views with the other participants. Over the years, Volker Wieland took over as organizer of the conference series, which was also adopted by other central banks. In his contribution at the second conference "The SNB and its Watchers", Issing summarizes the experience gained from over twenty years of the ECB Watchers Conference.
Vulnerability comes, according to Orio Giarini, with two risks: human-made risks, also called entrepreneurial risks, and natural or pure risks such as accidents and earthquakes. Both types of risk are growing in dimension and are increasingly interrelated. To control the vulnerability, sophisticated insurance products are called for. Here, mutual insurance is relevant, in particular when risks are large, probabilities uncertain or unknown, and events interrelated or correlated. In this paper the following three examples are discussed and the advantages of mutual insurance are shown: unknown probabilities connected with unforeseeable events, correlated risks and macroeconomic or demographic risks.
Investors' return expectations are pivotal in stock markets, but the reasoning behind these expectations remains a black box for economists. This paper sheds light on economic agents' mental models -- their subjective understanding -- of the stock market, drawing on surveys with the US general population, US retail investors, US financial professionals, and academic experts. Respondents make return forecasts in scenarios describing stale news about the future earnings streams of companies, and we collect rich data on respondents' reasoning. We document three main results. First, inference from stale news is rare among academic experts but common among households and financial professionals, who believe that stale good news leads to persistently higher expected returns in the future. Second, while experts refer to the notion of market efficiency to explain their forecasts, households and financial professionals reveal a neglect of equilibrium forces. They naively equate higher future earnings with higher future returns, neglecting the offsetting effect of endogenous price adjustments. Third, a series of experimental interventions demonstrates that these naive forecasts do not result from inattention to trading or price responses but reflect a gap in respondents' mental models -- a fundamental unfamiliarity with the concept of equilibrium.
Shallow meritocracy
(2023)
Meritocracies aspire to reward hard work and promise not to judge individuals by the circumstances into which they were born. However, circumstances often shape the choice to work hard. I show that people's merit judgments are "shallow" and insensitive to this effect. They hold others responsible for their choices, even if these choices have been shaped by unequal circumstances. In an experiment, US participants judge how much money workers deserve for the effort they exert. Unequal circumstances disadvantage some workers and discourage them from working hard. Nonetheless, participants reward the effort of disadvantaged and advantaged workers identically, regardless of the circumstances under which choices are made. For some participants, this reflects their fundamental view regarding fair rewards. For others, the neglect results from the uncertain counterfactual. They understand that circumstances shape choices but do not correct for this because the counterfactual—what would have happened under equal circumstances—remains uncertain.
We estimate the causal effect of shared e-scooter services on traffic accidents by exploiting the variation in the availability of e-scooter services induced by the staggered rollout across 93 cities in six countries. Police-reported accidents involving personal injuries in the average month increased by around 8.2% after shared e-scooters were introduced. Effects are large during summer and insignificant during winter. Further heterogeneity analysis reveals the largest estimated effects for cities with limited cycling infrastructure, while no effects are detectable in cities with high bike-lane density. This difference suggests that public policy can play a crucial role in mitigating accidents related to e-scooters and, more generally, to changes in urban mobility.
This paper proposes tests for out-of-sample comparisons of interval forecasts based on parametric conditional quantile models. The tests rank the distance between actual and nominal conditional coverage with respect to the set of conditioning variables from all models, for a given loss function. We propose a pairwise test to compare two models for a single predictive interval. The set-up is then extended to a comparison across multiple models and/or intervals. The limiting distribution varies depending on whether models are strictly non-nested or overlapping. In the latter case, degeneracy may occur. We establish the asymptotic validity of wild bootstrap based critical values across all cases. An empirical application to Growth-at-Risk (GaR) uncovers situations in which a richer set of financial indicators is found to outperform a commonly-used benchmark model when predicting downside risk to economic activity.
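The quantity these tests rank, the distance between actual and nominal coverage of a predictive interval, can be sketched in a few lines. The simulated data, the fixed Gaussian 90% interval, and all variable names below are illustrative assumptions of mine, not the paper's actual test statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical realized values and a model's fixed 90% predictive interval
y = rng.normal(size=1000)
lower, upper = -1.645 * np.ones(1000), 1.645 * np.ones(1000)

# Actual coverage: share of realizations falling inside the interval
hits = (y >= lower) & (y <= upper)
actual = hits.mean()

# Distance between actual and nominal coverage -- the kind of quantity
# the tests compare across competing models
nominal = 0.90
distance = abs(actual - nominal)
print(actual, distance)
```

The paper's tests evaluate this distance conditionally and rank it across models via wild-bootstrap critical values; the unconditional version above only conveys the basic object being compared.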
This paper studies the macro-financial implications of using carbon prices to achieve ambitious greenhouse gas (GHG) emission reduction targets. My empirical evidence shows a 0.6% output loss and a 0.3% rise in inflation in response to a 1% carbon policy shock. Furthermore, I also observe financial instability and allocation effects between the clean and highly polluting energy sectors. To better predict the medium- and long-term impact, I use a medium-large macro-financial DSGE model with environmental aspects to show the recessionary effect of an ambitious carbon price implementation to achieve climate targets: a 40% reduction in GHG emissions causes a 0.7% output loss, while reaching a zero-emission economy in 30 years causes a 2.6% output loss. I document an amplified effect of the banking sector during the transition path. The paper also uncovers the beneficial role of pre-announcements of carbon policies in mitigating inflation volatility by 0.2% at its peak, and my results suggest well-communicated carbon policies from authorities and investment to expand the green sector. My findings also stress the use of optimal green monetary and financial policies in mitigating the effects of transition risk and assisting the transition to a zero-emission world. Utilizing a heterogeneous approach with macroprudential tools, I find that optimal macroprudential tools can mitigate the output loss by 0.1% and the investment loss by 1%. Importantly, my work highlights the use of capital flow management in the green transition when a global cooperative solution is challenging.
This study explores the implications of rising markups for optimal Mirrleesian income and profit taxation. Using a stylized model with two individuals, the main forces shaping welfare-optimal policies are analytically characterized. Although a higher profit tax has redistributive benefits, it adversely affects market competition, leading to a greater equilibrium cost-of-living. Rising markups directly contribute to a decline in optimal marginal taxes on labor income. The optimal policy response to higher markups includes increasingly relying on the profit tax to fund redistribution. Declining optimal marginal income taxes assists the redistributive function of the profit tax by contributing to the expansion of the profit tax base. This response alone considerably increases the equilibrium cost-of-living. Nevertheless, a majority of the individuals become better off with the optimal policy. If it is not possible to tax profits optimally, due, for example, to profit shifting, increasing redistribution via income taxes is not optimal; every individual is worse off relative to the scenario with optimal profit taxation.
The debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. In particular, this concerns estimates derived from a simple aggregate demand and Phillips curve model with time-varying components as proposed by Laubach and Williams (2003). For example, Summers (2014a) refers to these estimates as important evidence for a secular stagnation and the need for fiscal stimulus. Yellen (2015, 2017) has made use of such estimates in order to explain and justify why the Federal Reserve has held interest rates so low for so long. First, we re-estimate the United States equilibrium rate with the methodology of Laubach and Williams (2003). Second, we build on their approach and an alternative specification to provide new estimates for the United States, Germany, the euro area and Japan. Third, we subject these estimates to a battery of sensitivity tests. Due to the great uncertainty and sensitivity that accompany these equilibrium rate estimates, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if these estimates are employed to determine the appropriate monetary policy stance, such estimates are better used together with the consistent estimate of the level of potential output.
While the COVID-19 pandemic had a large and asymmetric impact on firms, many countries quickly enacted massive business rescue programs which are specifically targeted to smaller firms. Little is known about the effects of such policies on business entry and exit, investment, factor reallocation, and macroeconomic outcomes. This paper builds a general equilibrium model with heterogeneous and financially constrained firms in order to evaluate the short- and long-term consequences of small firm rescue programs in a pandemic recession. We calibrate the stationary equilibrium and the pandemic shock to the U.S. economy, taking into account the actual Paycheck Protection Program (PPP) as a specific policy. We find that the policy has only a modest impact on aggregate output and employment because (i) jobs are saved predominantly in the smallest firms that account for a minor share of employment and (ii) the grant reduces the reallocation of resources towards larger and less impacted firms. Much of the reallocation effects occur in the aftermath of the pandemic episode. By preventing inefficient liquidations, the policy dampens the long-term declines of aggregate consumption and of the real wage, thus delivering small welfare gains.
Nowadays, digitalization has an immense impact on the landscape of jobs. This technological revolution creates new industries and professions, promises greater efficiency and improves the quality of working life. However, emerging technologies such as robotics and artificial intelligence (AI) are reducing human intervention, thus advancing automation and eliminating thousands of jobs and entire occupational profiles. To prepare employees for the changing demands of work, adequate and timely training of the workforce and real-time support of workers in new positions is necessary. Therefore, it is investigated whether user-oriented technologies, such as augmented reality (AR) and virtual reality (VR), can be applied "on-the-job" for such training and support, also known as intelligence augmentation (IA). To address this problem, this work synthesizes the results of a systematic literature review as well as a practically oriented search on augmented reality and virtual reality use cases within the IA context. A total of 150 papers and use cases are analyzed to identify suitable areas of application in which it is possible to enhance employees' capabilities. The results of both the theoretical and the practical work show that VR is primarily used to train employees without prior knowledge, whereas AR is used to expand the scope of competence of individuals in their field of expertise while on the job. Based on these results, a framework is derived which provides practitioners with guidelines as to how AR or VR can support workers at their job so that they can keep up with anticipated skill demands. Furthermore, it shows for which application areas AR or VR can provide workers with sufficient training to learn new job tasks. By that, this research provides practical recommendations in order to accompany the imminent distortions caused by AI and similar technologies and to alleviate associated negative effects on the German labor market.
Goal setting is vital in learning sciences, but the scientific evaluation of optimal learning goals is underexplored. This study proposes a novel methodological approach to determine optimal learning goals. The data in this study comes from a gamified learning app implemented in an undergraduate accounting course at a large German university. With a combination of decision trees and regression analyses, the goals connected to the badges implemented in the app are evaluated. The results show that the initial badge set already motivated learning strategies that led to better grades on the exam. However, the results indicate that the levels of the goals could be improved, and additional badges could be implemented. In addition to new goal levels, new goal types are also discussed. The findings show that learning goals initially determined by the instructors need to be evaluated to offer an optimal motivational effect. The new methodological approach used in this study can be easily transferred to other learning data sets to provide further insights.
Life insurers use accounting and actuarial techniques to smooth reporting of firm assets and liabilities, seeking to transfer surpluses in good years to cover benefit payouts in bad years. Yet these techniques have been criticized as they make it difficult to assess insurers’ true financial status. We develop stylized and realistically-calibrated models of a participating life annuity, an insurance product that pays retirees guaranteed lifelong benefits along with variable non-guaranteed surplus. Our goal is to illustrate how accounting and actuarial techniques for this type of financial contract shape policyholder wellbeing, along with insurer profitability and stability. Smoothing adds value to both the annuitant and the insurer, so curtailing smoothing could undermine the market for long-term retirement payout products.
We investigate how financial literacy shapes older Americans' demand for financial advice. Using an experimental module fielded in the Health and Retirement Study, we show that financial literacy strongly improves the quality but not the quantity of financial advice sought. In particular, more financially literate people seek financial help from professionals. This effect is more pronounced among older people and those with more wealth and more complex financial positions. Our results imply that financial literacy and financial advisory services are complementary with, rather than substitutes for, each other.
This paper examines heterogeneity in time discounting among a representative sample of elderly Americans, as well as its role in explaining key economic behaviors at older ages. We show how older Americans evaluate simple (hypothetical) inter-temporal choices in which payments today are compared with payments in the future. Using the indicators derived from this measure, we then demonstrate that differences in discounting patterns are associated with characteristics of particular importance in elderly populations. For example, cognitive deficits are associated with greater impatience, whereas bequest motives are associated with less impatience. We then relate our discounting measure to key economic outcomes and find that impatience is associated with lower wealth, fewer investments in health, and less planning for end of life care.
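The elicitation the abstract describes amounts to backing an implied discount rate out of an intertemporal indifference point. A minimal sketch, where the helper name and the example amounts are hypothetical rather than taken from the survey instrument:

```python
# If a respondent is indifferent between an amount today and a larger
# amount after t years, the implied annual discount rate r solves
#   today = future / (1 + r) ** t
def implied_discount_rate(today: float, future: float, years: float) -> float:
    return (future / today) ** (1 / years) - 1

# Example: indifferent between $100 now and $120 in one year
print(implied_discount_rate(100, 120, 1))  # approximately 0.20
```

Larger implied rates correspond to greater impatience, which the paper then relates to wealth, health investments, and end-of-life planning.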
The US Treasury recently permitted deferred longevity income annuities to be included in pension plan menus as a default payout solution, yet little research has investigated whether more people should convert some of the $18 trillion they hold in employer-based defined contribution plans into lifelong income streams. We investigate this innovation using a calibrated lifecycle consumption and portfolio choice model embodying realistic institutional considerations. Our welfare analysis shows that defaulting a modest portion of retirees' 401(k) assets (above a threshold) into annuities is an attractive way to enhance retirement security, raising welfare by up to 20% of retiree plan accruals.
Do required minimum distribution 401(k) rules matter, and for whom? Insights from a lifecycle model
(2023)
Tax-qualified vehicles have helped U.S. private-sector workers accumulate $33Tr in retirement plans. An often-overlooked but important institutional feature shaping decumulations from these plans is the "Required Minimum Distribution" (RMD) regulation, which requires retirees to withdraw a minimum fraction from their retirement accounts or pay excise taxes on withdrawal shortfalls. Our calibrated lifecycle model measures the impact of RMD rules on heterogeneous households' financial behavior during their working lives and in retirement. The model shows that reforms delaying or eliminating the RMD rules have little effect on consumption profiles, but they would influence withdrawals and tax payments for households with bequest motives.
Artificial Intelligence (AI) and Machine Learning (ML) are currently hot topics in industry and business practice, while management-oriented research disciplines seem reluctant to adopt these sophisticated data analytics methods as research instruments. Even the Information Systems (IS) discipline with its close connections to Computer Science seems to be conservative when conducting empirical research endeavors. To assess the magnitude of the problem and to understand its causes, we conducted a bibliographic review on publications in high-level IS journals. We reviewed 1,838 articles that matched corresponding keyword-queries in journals from the AIS senior scholar basket, Electronic Markets and Decision Support Systems (Ranked B). In addition, we conducted a survey among IS researchers (N = 110). Based on the findings from our sample we evaluate different potential causes that could explain why ML methods are rather underrepresented in top-tier journals and discuss how the IS discipline could successfully incorporate ML methods in research undertakings.
Measuring and reducing energy consumption constitutes a crucial concern in public policies aimed at mitigating global warming. The real estate sector faces the challenge of enhancing building efficiency, where insights from experts play a pivotal role in the evaluation process. This research employs a machine learning approach to analyze expert opinions, seeking to extract the key determinants influencing potential residential building efficiency and establishing an efficient prediction framework. The study leverages open Energy Performance Certificate databases from two countries with distinct latitudes, namely the UK and Italy, to investigate whether enhancing energy efficiency necessitates different intervention approaches. The findings reveal the existence of non-linear relationships between efficiency and building characteristics, which cannot be captured by conventional linear modeling frameworks. By offering insights into the determinants of residential building efficiency, this study provides guidance to policymakers and stakeholders in formulating effective and sustainable strategies for energy efficiency improvement.
The forward guidance trap
(2023)
This paper examines the policy experience of the Fed, ECB and BOJ during and after the Covid-19 pandemic and draws lessons for monetary policy strategy and its communication. All three central banks provided appropriate accommodation during the pandemic, but two failed to unwind this accommodation in a timely manner. The Fed and ECB guided real interest rates to inappropriately negative levels as the economy recovered from the pandemic, fueling high inflation. The policy error can be traced to decisions regarding forward guidance on policy rates that delayed lift-off while the two central banks continued to expand their balance sheets. The Fed and the ECB fell into the forward guidance trap. This could have been avoided if policy were guided by a forward-looking rule that properly adjusted the nominal interest rate with the evolution of the inflation outlook.
Tail-correlation matrices are an important tool for aggregating risk measurements across risk categories, asset classes and/or business segments. This paper demonstrates that traditional tail-correlation matrices—which are conventionally assumed to have ones on the diagonal—can lead to substantial biases of the aggregate risk measurement’s sensitivities with respect to risk exposures. Due to these biases, decision-makers receive an odd view of the effects of portfolio changes and may be unable to identify the optimal portfolio from a risk-return perspective. To overcome these issues, we introduce the “sensitivity-implied tail-correlation matrix”. The proposed tail-correlation matrix allows for a simple deterministic risk aggregation approach which reasonably approximates the true aggregate risk measurement according to the complete multivariate risk distribution. Numerical examples demonstrate that our approach is a better basis for portfolio optimization than the Value-at-Risk implied tail-correlation matrix, especially if the calibration portfolio (or current portfolio) deviates from the optimal portfolio.
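The deterministic aggregation a tail-correlation matrix enables is the familiar square-root formula. A minimal sketch with hypothetical standalone risks and correlations, not the paper's calibration or its sensitivity-implied construction:

```python
import numpy as np

# Hypothetical standalone risk measurements (e.g. VaR) for three segments
standalone = np.array([100.0, 80.0, 50.0])

# Hypothetical tail-correlation matrix in its conventional form,
# with ones on the diagonal
C = np.array([
    [1.0, 0.4, 0.2],
    [0.4, 1.0, 0.3],
    [0.2, 0.3, 1.0],
])

# Square-root aggregation: aggregate risk = sqrt(s' C s)
aggregate = np.sqrt(standalone @ C @ standalone)
print(round(aggregate, 2))  # 172.34
```

The paper's point is that the gradient of this formula with respect to the exposures can be badly biased under the conventional unit-diagonal convention, which is what the sensitivity-implied matrix is designed to correct.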
We empirically examine how systemic risk in the banking sector leads to correlated risk in office markets of global financial centers. In so doing, we compute an aggregated measure of systemic risk in financial centers as the cumulated expected capital shortfall of local financial institutions. Our identification strategy is based on a double counterfactual approach by comparing normal with financial distress periods as well as office with retail markets. We find that office market interconnectedness arises from systemic risk during financial turmoil periods. Office market performance in a financial center is affected by returns of systemically linked financial center office markets only during a systemic banking crisis. In contrast, there is no evidence of correlated risk during normal times and among the within-city counterfactual retail sector. The decline in office market returns during a banking crisis is larger in financial centers compared to non-financial centers.
Review of: Social preferences: an introduction to behavioural economics and experimental research, by Michalis Drouvelis, Newcastle upon Tyne: Agenda Publishing, 2021, 205 pages, £22.99, ISBN 978-1-78821-417-9 (paperback).
Having a gatekeeper position in a collaborative network offers firms great potential to gain competitive advantages. However, it is not well understood what kind of collaborations are associated with such a position. Conceptually grounded in social network theory, this study draws on the resource-based view and the relational factors view to investigate which types of collaboration characterize firms that are in a gatekeeper position, which ultimately could improve firm performance in subsequent periods. The empirical analysis utilizes a unique longitudinal data set to examine dynamic network formation. We used a data crawling approach to reconstruct collaboration networks among the 500 largest companies in Germany over nine years and matched these networks with performance data. The results indicate that firms in gatekeeper positions often engage in medium-intensity collaborations and are less likely to engage in weak-intensity collaborations. Strong-intensity collaborations are not related to the likelihood of being a gatekeeper. Our study further reveals that a firm's knowledge base is an important moderator and that this knowledge base can increase the benefits of having a gatekeeper position in terms of firm performance.
A safe core mandate
(2023)
Central banks have vastly expanded their footprint on capital markets. At a time of extraordinary pressure by many sides, a simple benchmark for the scale and scope of their core mandate of price and financial stability may be useful.
We make a case for a narrow mandate to maintain and safeguard the border between safe and quasi safe assets. This ex-ante definition minimizes ambiguity, discourages risk creation, and limits panic runs, primarily by separating market demand for reliable liquidity from risk-intolerant, price-insensitive demand for a safe store of value. The central bank may be occasionally forced to intervene beyond the safe core but should not be bound by any such ex-ante mandate, unless directed to specific goals set by legislation with explicit fiscal support.
We review distinct features of liquidity and safety demand, seeking a definition of the safety border, and discuss LOLR support for borderline safe assets such as MMF or uninsured deposits.
A safe core formulation is close to the historical focus on regulated entities, collateralized lending and attention to the public debt market, but its specific framing offers some context on controversial issues such as the extent of LOLR responsibilities. It also justifies a persistently large scale for central bank liabilities (Greenwood, Hanson and Stein 2016), as safety demand is related to financial wealth rather than GDP. Finally, it is consistent with an active central bank role in supporting liquidity in government debt market trading and clearing (Duffie 2020, 2021).
A key solution for public good provision is the voluntary formation of institutions that commit players to cooperate. Such institutions generate inequality if some players decide not to participate but cannot be excluded from cooperation benefits. Prior research with small groups emphasizes the role of fairness concerns with positive effects on cooperation. We show that these effects do not generalize to larger groups: if group size increases, groups are less willing to form institutions that generate inequality. In contrast to smaller groups, however, this does not increase the number of participating players, thereby limiting the positive impact of institution formation on cooperation.
This Policy Letter presents two event studies based on pre-war data that foreshadow the remarkable way in which the Russian economy was able to withstand the pressure from an unprecedented package of international sanctions. First, it shows that a sudden stop of one of the two domestic producers of zinc in 2018 did not lead to a slowdown in the steel industry, which heavily relied on this input. Second, it demonstrates that a huge increase in the cost of a fuel called mazut in 2020 had virtually no impact on firms that used it, even in regions where it was hard to replace it with alternative fuels. This Policy Letter argues that such stability in production can be explained by the fact that the Russian economy is heavily oriented toward commodities. It is much easier to replace a commodity supplier than a supplier of manufactured goods, and many commodity producers operate at high profit margins that allow them to continue operating even after big increases in their costs. Thus, sanctions had a much smaller impact on Russia than they would have on an economy with a larger manufacturing sector, where inputs are less substitutable and profit margins are smaller.
We study the interplay of capital and liquidity regulation in a general equilibrium setting by focusing on future funding risks. The model consists of a banking sector with long-term illiquid investment opportunities that need to be financed by short-term debt and by issuing equity. Reliance on refinancing long-term investment in the middle of its lifetime is risky, since the next generation of potential short-term debt holders may not be willing to provide funding when the return prospects on the long-term investment turn out to be bad. For moderate return risk, equilibria with and without bank default coexist, and bank default is a self-fulfilling prophecy. Capital and liquidity regulation can prevent bank default and may implement the first-best. Yet the former is more powerful in ruling out undesirable equilibria and thus dominates liquidity regulation. Adding liquidity regulation to optimal capital regulation is redundant.
In current discussions on large language models (LLMs) such as GPT, understanding their ability to emulate facets of human intelligence stands central. Using behavioral economic paradigms and structural models, we investigate GPT's cooperativeness in human interactions and assess its rational goal-oriented behavior. We discover that GPT cooperates more than humans and has overly optimistic expectations about human cooperation. Intriguingly, additional analyses reveal that GPT's behavior is not random; it displays a level of goal-oriented rationality surpassing human counterparts. Our findings suggest that GPT hyper-rationally aims to maximize social welfare, coupled with a drive for self-preservation. Methodologically, our research highlights how structural models, typically employed to decipher human behavior, can illuminate the rationality and goal-orientation of LLMs. This opens a compelling path for future research into the intricate rationality of sophisticated, yet enigmatic artificial agents.
We study the redistributive effects of inflation combining administrative bank data with an information provision experiment during an episode of historic inflation. On average, households are well-informed about prevailing inflation and are concerned about its impact on their wealth; yet, while many households know about inflation eroding nominal assets, most are unaware of nominal-debt erosion. Once they receive information on the debt-erosion channel, households update upwards their beliefs about nominal debt and their own real net wealth. These changes in beliefs causally affect actual consumption and hypothetical debt decisions. Our findings suggest that real wealth mediates the sensitivity of consumption to inflation once households are aware of the wealth effects of inflation.
Dynamics of life course family transitions in Germany: exploring patterns, process and relationships
(2023)
This paper explores the dynamics of family life events in Germany using discrete-time event history analysis based on SOEP data. We find that higher educational attainment, a better income level, and marriage emerge as salient protective factors mitigating the risk of mortality; better education also reduces the likelihood of first marriage, whereas lower educational attainment, a protracted marriage duration, and the presence of children act as protective factors against divorce. Our key finding shows that the disparity in mean life expectancy between individuals from low- and high-income brackets is 9 years among males and 6 years among females, illustrating the mortality inequality attributed to income disparities. Our estimates show that West Germans have a lower risk of death and a lower likelihood of first marriage, but higher risks of divorce and remarriage, compared to East Germans.
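Discrete-time event history analysis rests on expanding each individual into one row per period at risk, with the event indicator switched on only in the final period if the event occurred. A minimal sketch of that expansion, where the function and field names are my own illustrations, not SOEP variables:

```python
# Build person-period rows for discrete-time event history analysis.
# 'duration' is the number of periods observed; 'event' indicates whether
# the transition (e.g. first marriage, divorce) occurred in the last period.
def person_periods(person_id: str, duration: int, event: bool) -> list:
    rows = []
    for t in range(1, duration + 1):
        rows.append({
            "id": person_id,
            "period": t,
            "event": int(event and t == duration),  # 1 only at the event period
        })
    return rows

rows = person_periods("A", 3, event=True)   # three rows: event = 0, 0, 1
censored = person_periods("B", 2, event=False)  # censored: event = 0, 0
```

A logit (or similar binary model) fitted to such person-period data then estimates the discrete-time hazard as a function of covariates like education and income.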
We present determinacy bounds on monetary policy in the sticky information model. We find that these bounds are more conservative here, where the long-run Phillips curve is vertical, than in the standard Calvo sticky price New Keynesian model. Specifically, the Taylor principle is now directly necessary: no amount of output targeting can substitute for the monetary authority's concern for inflation. These determinacy bounds are obtained by appealing to frequency domain techniques that themselves provide novel interpretations of the Phillips curve.
Questionable research practices have generated considerable recent interest throughout and beyond the scientific community. We subsume such practices involving secret data snooping that influences subsequent statistical inference under the term MESSing (manipulating evidence subject to snooping) and discuss, illustrate and quantify the possibly dramatic effects of several forms of MESSing using an empirical and a simple theoretical example. The empirical example uses numbers from the most popular German lottery, which seem to suggest that 13 is an unlucky number.
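The snooping effect can be illustrated with simulated draws: picking the most extreme of 49 number frequencies after seeing the data makes a naive test look significant even under pure chance. A sketch with simulated data, not the paper's analysis of actual German lottery numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate hypothetical lottery draws: 5000 draws of 6 numbers out of 49
draws = np.array([rng.choice(49, size=6, replace=False) + 1
                  for _ in range(5000)])
counts = np.bincount(draws.ravel(), minlength=50)[1:]  # counts for 1..49

# Snooping: select the least-drawn number *after* looking at the data, then
# "test" whether it is drawn significantly less often than expected. The
# naive z-statistic ignores that the minimum of 49 counts was selected.
expected = 5000 * 6 / 49
se = np.sqrt(expected * (1 - 6 / 49))
z_min = (counts.min() - expected) / se
print(round(z_min, 2))  # tends to look "significant" under pure chance
```

The minimum of 49 roughly standard-normal statistics is systematically far below zero, so an "unlucky number" will almost always appear in fair data, which is the kind of MESSing distortion the paper quantifies.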
In this study, we introduce a novel entity matching (EM) framework. It combines state-of-the-art EM approaches based on Artificial Neural Networks (ANN) with a new similarity encoding derived from matching techniques that are prevalent in finance and economics. Our framework is on par with or outperforms alternative end-to-end frameworks in standard benchmark cases. Because similarity encoding is constructed using (edit) distances instead of semantic similarities, it avoids out-of-vocabulary problems when matching dirty data. We highlight this property by applying the framework to dirty financial firm-level data extracted from historical archives.
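The normalized edit-distance similarity that such an encoding relies on can be sketched as follows; this is a generic Levenshtein implementation for illustration, not the paper's actual encoder:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def similarity(a: str, b: str) -> float:
    """Edit distance normalized to [0, 1]; 1.0 means identical strings."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

# Dirty firm names as they might appear in historical archives: a
# character-level score is robust to typos that break token-based matching
print(similarity("Deutsche Bank AG", "Deutsche Bnak A.G."))
```

Because the score is computed over characters rather than vocabulary tokens, misspelled or archaic firm names still receive a high similarity, which is the out-of-vocabulary robustness the abstract highlights.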
Biodiversity loss poses a significant threat to the global economy and affects ecosystem services on which most large companies rely heavily. The severe financial implications of such a reduced species diversity have attracted the attention of companies and stakeholders, with numerous calls to increase corporate transparency. Using textual analysis, this study thus investigates the current state of voluntary biodiversity reporting of 359 European blue-chip companies and assesses the extent to which it aligns with the upcoming disclosure framework of the Task Force on Nature-related Financial Disclosures (TNFD). The descriptive results suggest a substantial gap between current reporting practices and the proposed TNFD framework, with disclosures largely lacking quantification, details and clear targets. In addition, the disclosures appear to be relatively unstandardized. Companies in sectors or regions exposed to higher nature-related risks as well as larger companies are more likely to report on aspects of biodiversity. This study contributes to the emerging literature on nature-related risks and provides detailed insights on the extent of the reporting gap in light of the upcoming standards.
This paper analyzes the current implementation status of sustainability and taxonomy-aligned disclosure under the Sustainable Finance Disclosure Regulation (SFDR) as well as the development of the SFDR categorization of funds offered via banks in Germany. Examining data provided by WM Group, which covers more than 10,000 investment funds and 2,000 index funds between September 2022 and March 2023, we observe a significant proportion of Article 9 (dark green) funds transitioning to Article 8 (light green) funds, particularly among index funds. As a consequence of this process, the profile of the SFDR classes has sharpened, which reflects an increased share of sustainable investments in the group of Article 9 funds. When differentiating between environmental and social investments, the share of environmental investments increased, but the share of social investments decreased in the group of Article 9 funds at the beginning of 2023. The share of taxonomy-aligned investments is very low, but slightly increasing for Article 9 funds. However, by March 2023 only around 1,000 funds had reported their sustainability proportions, and this picture might change due to legal changes requiring all funds within the scope of the SFDR to report these proportions in their annual reports published after 1 January 2023.
Industry classification groups firms into partitions to support investment decisions and empirical analysis. To overcome the well-documented limitations of existing industry definitions, such as their stale nature and coarse categories for firms with multiple operations, we employ a clustering approach on 69 firm characteristics and allocate companies to novel economic sectors maximizing the within-group explained variation. Such sectors are dynamic yet stable, and represent a superior investment set compared to standard classification schemes for portfolio optimization and for trading strategies based on within-industry mean-reversion, which give rise to a latent risk factor significantly priced in the cross-section. We provide a new metric to quantify feature importance for clustering methods, finding that size drives differences across classical industries while book-to-market and financial liquidity variables matter for clustering-based sectors.
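The within-group explained variation criterion mentioned above can be sketched in a single-feature toy version (the paper clusters on 69 characteristics; the function name and example data here are ours, not the authors'):

```python
def within_group_explained_variation(values, labels):
    """Share of total variance explained by group membership,
    an R^2-style criterion that a clustering can maximize."""
    n = len(values)
    overall = sum(values) / n
    total = sum((x - overall) ** 2 for x in values)
    # collect observations per assigned group
    groups = {}
    for x, g in zip(values, labels):
        groups.setdefault(g, []).append(x)
    # within-group sum of squared deviations from each group mean
    within = sum(
        sum((x - sum(xs) / len(xs)) ** 2 for x in xs)
        for xs in groups.values()
    )
    return 1.0 - within / total

# two tight clusters of a hypothetical firm characteristic: almost all
# variation is between the groups, so the score is close to 1
score = within_group_explained_variation(
    [1.0, 1.1, 0.9, 5.0, 5.1, 4.9], [0, 0, 0, 1, 1, 1]
)
```

A clustering that shuffles unrelated firms together would score far lower on the same data, which is the sense in which the sectors in the abstract "maximize" this quantity.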
We estimate the transmission of the pandemic shock in 2020 to prices in the residential and commercial real estate market by causal machine learning, using new granular data at the municipal level for Germany. We exploit differences in the incidence of COVID-19 infections or short-time work at the municipal level for identification. In contrast to evidence for other countries, we find that the pandemic had only temporary negative effects on rents for some real estate types and increased asset prices of real estate, particularly in the top price segment of commercial real estate.
This study analyzes information production and trading behavior of banks with lending relationships. We combine trade-by-trade supervisory data and credit-registry data to examine banks' proprietary trading in borrower stocks around a large number of corporate events. We find that relationship banks build up positive (negative) trading positions in the two weeks before events with positive (negative) news, even when these events are unscheduled, and unwind positions shortly after the event. This trading pattern is more pronounced in situations when banks are likely to possess private information about their borrowers, and cannot be explained by specialized expertise in certain industries or certain firms. The results suggest that banks' lending relationships inform their trading and underscore the potential for conflicts of interest in universal banking, which have been a prominent concern in the regulatory debate for a long time. Our analysis illustrates how combining large data sets can uncover unusual trading patterns and enhance the supervision of financial institutions.
We examine whether the uncertainty related to environmental, social, and governance (ESG) regulation developments is reflected in asset prices. We proxy the sensitivity of firms to ESG regulation uncertainty by the disparity across the components of their ESG ratings. Firms with high ESG disparity have a higher option-implied cost of protection against downside tail risk. The impact of the misalignment across the different dimensions of the ESG score is distinct from that of the ESG score level itself. Aggregate downside risk bears a negative price for firms with low ESG disparity.
A common practice in empirical macroeconomics is to examine alternative recursive orderings of the variables in structural vector autoregressive (VAR) models. When the implied impulse responses look similar, the estimates are considered trustworthy. When they do not, the estimates are used to bound the true response without directly addressing the identification challenge. A leading example of this practice is the literature on the effects of uncertainty shocks on economic activity. We prove by counterexample that this practice is invalid in general, whether the data generating process is a structural VAR model or a dynamic stochastic general equilibrium model.
This paper analyzes the scope of the private market for pandemic insurance. We develop a framework that explains theoretically how the equilibrium price of pandemic insurance depends on accumulation risk, covariance between pandemic claims and other claims, and covariance between pandemic claims and stock market performance. Using the natural catastrophe (NatCat) insurance market as a laboratory, we estimate the relationship between the insurance price markup and the tail characteristics of the loss distribution. Then, using high-frequency data tracking the economic impact of the COVID-19 pandemic in the United States, we calibrate the loss distribution of a hypothetical insurance contract designed to alleviate the impact of the pandemic on small businesses. The pandemic insurance contract price markup corresponds to the top 20% of markups observed in the NatCat insurance market. Finally, we analyze an intertemporal risk-sharing scheme that can reduce the expected shortfall of the loss distribution by 50%.
A key technology driving the digital transformation of the economy is artificial intelligence (AI). It has gained a high degree of public attention with the initial release of the chatbot ChatGPT, which demonstrates the potential of generative AI (GAI) as a relatively new segment within AI. It is widely expected that GAI will shape the future of many industries and society in the coming years. This article provides a brief overview of the foundations of GAI, including machine learning, and what distinguishes it from other fields of AI. Furthermore, we look at important players in this emerging market, possible use cases, and the expected economic potential as of today. It is apparent that, once again, a few US-based Big Tech firms are about to dominate this emerging technology and that the European tech sector is falling further behind. Finally, we conclude that the recently adopted Digital Markets Act (DMA) and the Digital Services Act (DSA) as well as the upcoming AI Act should be reviewed to ensure that the regulatory framework of European digital markets keeps up with the accelerated development of AI.
Homeownership rates differ widely across European countries. We document that part of this variation is driven by differences in the fraction of adults co-residing with their parents. Comparing Germany and Italy, we show that in contrast to homeownership rates per household, homeownership rates per individual are very similar during the first part of the life cycle. To understand these patterns, we build an overlapping-generations model where individuals face uninsurable income risk and make consumption-saving and housing tenure decisions. We embed an explicit intergenerational link between children and parents to capture the three-way trade-off between owning, renting, and co-residing. Calibrating the model to Germany, we explore the role of income profiles, housing policies, and the taste for independence and show that a combination of these factors goes a long way in explaining the differential life-cycle patterns of living arrangements between the two countries.
We develop a quantity-driven general equilibrium model that integrates the term structure of interest rates with the repurchase agreements (repo) market to shed light on the combined effects of quantitative easing (QE) on the bond and money markets. We characterize in closed form the endogenous dynamic interaction between bond prices and repo rates, and show (i) that repo specialness dampens the impact of any given quantity of asset purchases due to QE on the slope of the term structure and (ii) that bond scarcity resulting from QE increases repo specialness, thus strengthening the local supply channel of QE.
Recent regulatory measures such as the European Union’s AI Act require artificial intelligence (AI) systems to be explainable. As such, understanding how explainability impacts human-AI interaction and pinpointing the specific circumstances and groups affected is imperative. In this study, we devise a formal framework and conduct an empirical investigation involving real estate agents to explore the complex interplay between explainability of and delegation to AI systems. On an aggregate level, our findings indicate that real estate agents display a higher propensity to delegate apartment evaluations to an AI system when its workings are explainable, thereby surrendering control to the machine. However, at an individual level, we detect considerable heterogeneity. Agents possessing extensive domain knowledge are generally more inclined to delegate decisions to AI and minimize their effort when provided with explanations. Conversely, agents with limited domain knowledge only exhibit this behavior when explanations correspond with their preconceived notions regarding the relationship between apartment features and listing prices. Our results illustrate that the introduction of explainability in AI systems may transfer the decision-making control from humans to AI under the veil of transparency, which has notable implications for policy makers and practitioners that we discuss.
We provide evidence on the extent to which survey items in the Preference Survey Module and the resulting Global Preference Survey measuring social preferences − trust, altruism, positive and negative reciprocity − predict behavior in corresponding experimental games outside the original participant sample of Falk et al. (2022). Our results, which are based on a replication study with university students in Tehran, Iran, are mixed. While quantitative items considering hypothetical versions of the experimental games correlate significantly and economically meaningfully with individual behavior, none of the qualitative items show significant correlations. The only exception is altruism, where results correspond more closely to the original findings.
In the euro area, monetary policy is conducted by a single central bank for 20 member countries. However, countries are heterogeneous in their economic development, including their inflation rates. This paper combines a New Keynesian model and a neural network to assess whether the European Central Bank (ECB) conducted monetary policy between 2002 and 2022 according to the weighted average of the inflation rates within the European Monetary Union (EMU) or reacted more strongly to the inflation rate developments of certain EMU countries.
The New Keynesian model first generates data, which are used to train and evaluate several machine learning algorithms. The authors find that a neural network performs best out-of-sample. They use this algorithm to classify historical EMU data, and to determine the exact weight on the inflation rate of EMU members in each quarter of the past two decades. Their findings suggest disproportional emphasis of the ECB on the inflation rates of EMU members that exhibited high inflation rate volatility for the vast majority of the time frame considered (80%), with a median inflation weight of 67% on these countries. They show that these results stem from a tendency of the ECB to react more strongly to countries whose inflation rates exhibit greater deviations from their long-term trend.
Climate change has become one of the most prominent concerns globally. In this paper, the authors study the transition risk of greenhouse gas emission reduction in structural environmental-macroeconomic DSGE models. First, they analyze the uncertainty in model predictions of the effect of unanticipated and pre-announced carbon price increases. Second, they conduct optimal model-robust policy analysis in different settings. They find that reducing emissions by 40% causes 0.7% to 4% output loss, with 2% on average. Pre-announcement of carbon prices affects the inflation dynamics significantly. The central bank should react slightly less to inflation and output growth during the transition risk. With optimal carbon price designs, it should react even less to inflation and more to output growth.
We analyze the repercussions of different kinds of uncertainty on cash demand, including uncertainty of the digital infrastructures, confidence crises of the financial system, natural disasters, political uncertainties, and inflationary crises. Based on a comprehensive literature survey and theoretical considerations, complemented by case studies, we derive a classification scheme for how cash holdings typically evolve in each of these types of uncertainty, separating between demand for domestic and international cash as well as between transaction and store-of-value balances. In doing so, we focus on the stabilizing macroeconomic properties of cash and recommend guidelines for cash supply by central banks and the banking system. Finally, we exemplify our analysis with five case studies from the developing world, namely Venezuela, Zimbabwe, Afghanistan, Iraq, and Libya.
Data are considered the new oil of the economy, but privacy concerns limit their use, leading to a widespread sense that data analytics and privacy are contradictory. Yet such a view is too narrow, because firms can implement a wide range of methods that satisfy different degrees of privacy and still enable them to leverage varied data analytics methods. Therefore, the current study specifies different functions related to data analytics and privacy (i.e., data collection, storage, verification, analytics, and dissemination of insights), compares how these functions might be performed at different levels (consumer, intermediary, and firm), outlines how well different analytics methods address consumer privacy, and draws several conclusions, along with future research directions.
This paper studies the impact of banks’ dividend restrictions on the behavior of their institutional investors. Using an identification strategy that relies on within-investor variation and a difference-in-differences setup, I find that funds permanently decrease their ownership shares in treated banks during the 2020 dividend restrictions in the Eurozone and even exit treated banks’ stocks. Using data from before the introduction of the ban reveals a positive relationship between fund ownership and banks’ dividend yield, highlighting again the importance of dividends for European banks’ fund investors. This reaction also has pricing implications, since there is a negative relationship between the dividend restriction announcement day cumulative abnormal returns and the percentage of fund owners per bank.
This literature survey explores the potential avenues for the design of a green auto asset-backed security by focusing on the European auto securitization market. In this context, we examine the entire value chain of the securitization process to understand the incentives and interests involved at various stages of the transaction. We review recent regulatory developments, feasibility concerns, and potential designs of a sustainable securitization framework. Our study suggests that a Green Auto ABS should be based on both a green use of proceeds and a green collateral-based methodology.
Climate risk has become a major concern for financial institutions and financial markets. Yet, climate policy is still in its infancy and contributes to increased uncertainty. For example, the lack of a sufficiently high carbon price and the variety of definitions for green activities lower the value of existing and new capital, and complicate risk management. This column argues that it would be welfare-enhancing if policy changes were to follow a predictable longer-term path. Accordingly, the authors suggest a role for financial regulation in the transition.
We document the structure of firm-bank relationships across the eleven largest euro area countries and present new stylised facts using novel data from the recent credit registry of the Eurosystem - AnaCredit. We look at the number of banking relationships, reliance on the main bank, credit instruments, loan maturity and interest rates. The granularity of the data allows us to account for cross-country differences in firm characteristics. Firms in Southern European countries borrow from a larger number of banks and obtain a lower share of credit from the main bank compared to those in Northern European countries. They also tend to borrow more through short-term, more expensive instruments and to obtain loans with shorter maturities. This is consistent with the hypothesis that Southern European countries rely less on relationship banking and obtain credit less conducive to firm growth, in line with the smaller average size of Southern European firms. Instead, no clear pattern emerges in terms of interest rates, consistent with the idea that banks appropriate part of the surplus generated by relationship lending through higher rates.