The 2011 Arab Spring marked the opening of the Central Mediterranean Route for irregular border crossings between Libya and Italy, which produced heterogeneous reductions of bilateral smuggling distances between country pairs in the Mediterranean region. We exploit this source of spatial and temporal variation in bilateral distance along land and sea routes to estimate the elasticity of irregular migration intentions for African and Near East countries. We estimate an elasticity of migration intentions to smuggling distances exceeding −3, mainly driven by countries with weak rule of law and high internet penetration. Our findings are consistent across irregular migration measures both at the aggregate and individual levels. We show that irregular migration elasticity is higher for youth, relatively skilled individuals and those with an informational advantage (having a social network abroad or a mobile phone).
Nations are imposing unprecedented measures at a large scale to contain the spread of the COVID-19 pandemic. While recent studies show that non-pharmaceutical intervention measures such as lockdowns may have mitigated the spread of COVID-19, those measures also lead to substantial economic and social costs, and might limit exposure to ultraviolet-B radiation (UVB). Emerging observational evidence indicates the protective role of UVB and vitamin D in reducing the severity and mortality of COVID-19. This observational study empirically outlines the protective roles of lockdown and UVB exposure as measured by the ultraviolet index (UVI). Specifically, we examine whether the severity of lockdown is associated with a reduction in the protective role of UVB exposure. We use a log-linear fixed-effects model on a panel dataset of secondary data of 155 countries from 22 January 2020 until 7 October 2020 (n = 29,327). We use the cumulative number of COVID-19 deaths as the dependent variable and isolate the mitigating influence of lockdown severity on the association between UVI and growth rates of COVID-19 deaths from time-constant and time-varying country-specific potentially confounding factors. After controlling for these factors, we find that a unit increase in UVI and a unit increase in lockdown severity are independently associated with declines of 0.85 percentage points (p.p.) and 4.7 p.p. in the COVID-19 deaths growth rate, indicating their respective protective roles. The change of UVI over time is typically large (e.g., on average, UVI in New York City increases by up to 6 units between January and June), indicating that the protective role of UVI might be substantial. However, the widely utilized and least severe lockdown (governmental recommendation not to leave the house) is associated with a mitigation of the protective role of UVI by 81% (0.76 p.p.), which indicates a downside risk associated with its widespread use.
We find that lockdown severity and UVI are independently associated with a slowdown in the daily growth rates of cumulative COVID-19 deaths. However, we find evidence that an increase in lockdown severity is associated with a significant mitigation of the protective role of UVI in reducing COVID-19 deaths. Our results suggest that lockdowns in conjunction with adequate exposure to UVB radiation might have reduced the number of COVID-19 deaths even more strongly than lockdowns alone. For example, we estimate that there would have been 11% fewer deaths on average with sufficient UVB exposure during the period in which people were recommended not to leave their house. Therefore, our study outlines the importance of considering UVB exposure, especially while implementing lockdowns, and could inspire further clinical studies that may support policy decision-making in countries imposing such measures.
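The study's headline estimates come from a log-linear fixed-effects panel model. As a minimal, hypothetical sketch of the one-way within (fixed-effects) transformation that such models rely on, not the authors' actual specification or data, the estimator can be written as:

```python
import numpy as np

def within_estimator(y, x, groups):
    """Toy one-way fixed-effects (within) estimator.

    Demeans y and x within each group (e.g., country), which removes
    time-constant group effects, then runs OLS on the demeaned data.
    """
    y = np.asarray(y, dtype=float).copy()
    x = np.asarray(x, dtype=float).copy()
    yd, xd = y.copy(), x.copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= y[m].mean()          # remove group mean of outcome
        xd[m] -= x[m].mean(axis=0)    # remove group means of regressors
    beta, *_ = np.linalg.lstsq(xd, yd, rcond=None)
    return beta
```

Demeaning within each country absorbs any time-constant country effect, which is why such a toy estimator recovers the slope even though each group has its own intercept.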
This research focuses on the cost of financing green projects on the primary bond market and tests for a potential price differential between green bonds issued by government entities and those issued by supranational and private sector issuers. Our findings indicate that government entities benefit from more favorable pricing conditions worldwide. This advantage is growing over time and particularly pronounced for sovereigns and municipal authorities. Our analysis also reveals that country-specific factors, such as strong political commitment to address climate change, low income level and high degree of indebtedness are significant predictors of the pricing spread across bonds.
Contagious stablecoins?
(2023)
Can competing stablecoins produce efficient and stable outcomes? We study competition among stablecoins pegged to a stable currency. They are backed by interest-bearing safe assets and can be redeemed with the issuer or traded in a secondary market. If an issuer sticks to an appropriate investment and redemption rule, its stablecoin is invulnerable to runs. However, since an issuer must pay interest on its stablecoin if other issuers also pay interest, competing interest-bearing stablecoins are contagious and can render the economy inefficient and unstable. The efficient allocation is uniquely implemented when regulation prevents interest payments on stablecoins.
In this study, we unpack the ESG ratings of four prominent agencies in Europe and find that (i) each single E, S, G pillar explains the overall ESG score differently, (ii) there is a low co-movement between the three E, S, G pillars and (iii) there are specific ESG Key Performance Indicators (KPIs) that are driving these ratings more than others. We argue that such discrepancies might mislead firms about their actual ESG status, potentially leading to cherry-picking areas for improvement, thus raising questions about the accuracy and effectiveness of ESG evaluations in both explaining sustainability and driving capital toward sustainable companies.
We document the individual willingness to act against climate change and study the role of social norms in a large sample of US adults. Individual beliefs about social norms positively predict pro-climate donations, comparable in strength to universal moral values and economic preferences such as patience and reciprocity. However, we document systematic misperceptions of social norms. Respondents vastly underestimate the prevalence of climate-friendly behaviors and norms. Correcting these misperceptions in an experiment causally raises individual willingness to act against climate change as well as individual support for climate policies. The effects are strongest for individuals who are skeptical about the existence and threat of global warming.
Despite a number of helpful changes, including the adoption of an inflation target, the Fed’s monetary policy strategy proved insufficiently resilient in recent years. While the Fed eased policy appropriately during the pandemic, it fell behind the curve during the post-pandemic recovery. During 2021, the Fed kept easing policy while the inflation outlook was deteriorating and the economy was growing considerably faster than the economy’s natural growth rate—the sum of the Fed’s 2% inflation goal and the growth rate of potential output.
The resilience of the Fed’s monetary policy strategy could be enhanced, and such errors be avoided with guidance from a simple natural growth targeting rule that prescribes that the federal funds rate during each quarter be raised (cut) when projected nominal income growth exceeds (falls short of) the economy’s natural growth rate. An illustration with real-time data and forecasts since the early 1990s shows that Fed policy has not persistently deviated from this simple rule with the notable exception of the period coinciding with the Fed’s post-pandemic policy error.
Highlights
• Pathways for a circular economy towards the EU goals require policy support that, in turn, requires legitimacy.
• Legitimacy is often contested in the public discourse at all phases of the technological innovation system.
• Legitimacy remains poorly understood for ‘in-between’ technologies that struggle to move from the formative to the growth stage.
• The article explores legitimacy for chemical recycling primarily based on evidence from the UK, Germany, and Italy.
Abstract
The European Commission aims to increase the recycling of plastic packaging to 60% by 2025, requiring fundamental changes towards a more circular economy. Pathways for this transition require policy support that largely depends on their legitimacy in the public discourse. These normative aspects remain poorly understood for ‘in-between’ technologies, i.e., technologies that are no longer novel but struggle to move to the growth phase within the technological innovation system. Therefore, we ask: How do discourses shape technology legitimacy for in-between technologies? Drawing on the empirical example of chemical recycling, the analysis renders two principal findings. First, legitimising and delegitimising storylines present contesting views on in-between technologies regarding their technological aspects, environmental and social impacts, and economic and policy implications. Second, how discourses contribute to technology legitimacy depends on the actors and interests that drive the prevalent storylines in particular contexts.
Highlights
• Six Newton methods for solving matrix quadratic equations in linear DSGE models.
• Compared to QZ using 99 different DSGE models including Smets and Wouters (2007).
• Newton methods more accurate than QZ with comparable computation burden.
• Apt for refining solutions from alternative methods or nearby parameterizations.
Abstract
This paper presents and compares Newton-based methods from the applied mathematics literature for solving the matrix quadratic that underlies the recursive solution of linear DSGE models. The methods are compared using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and, iteratively, different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). We find that Newton-based methods compare favorably in solving DSGE models, providing higher accuracy as measured by the forward error of the solution at a comparable computational burden. The methods, however, suffer from their inability to guarantee convergence to a particular (e.g., unique stable) solution, but their iterative procedures lend themselves to refining solutions from different methods or parameterizations.
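The matrix quadratic in question has the generic form A X² + B X + C = 0. A minimal sketch of a textbook Newton iteration for this equation, solving each linearized step by Kronecker vectorization rather than using the paper's own (more efficient) implementations, is:

```python
import numpy as np

def newton_matrix_quadratic(A, B, C, X0, tol=1e-12, max_iter=50):
    """Toy Newton iteration for the quadratic matrix equation
    A X^2 + B X + C = 0.

    Each step solves the linearized (generalized Sylvester) equation
        (A X + B) H + A H X = -(A X^2 + B X + C)
    for the update H via Kronecker vectorization, then sets X <- X + H.
    """
    n = A.shape[0]
    X = X0.copy()
    for _ in range(max_iter):
        R = A @ X @ X + B @ X + C          # current residual
        if np.linalg.norm(R) < tol:
            break
        M = A @ X + B
        # vec(M H + A H X) = (I kron M + X^T kron A) vec(H), column-major vec
        K = np.kron(np.eye(n), M) + np.kron(X.T, A)
        h = np.linalg.solve(K, -R.flatten(order="F"))
        X = X + h.reshape((n, n), order="F")
    return X
```

The Kronecker system is only practical for small n; production codes would solve the Sylvester step directly. The sketch also illustrates the caveat from the abstract: which solution Newton converges to depends on the initial guess X0.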
In a unifying framework generalizing established theories, we characterize the conditions under which Joint Ownership of assets creates the best cooperation incentives in a partnership. We endogenise renegotiation costs and assume that they weakly increase with additional assets. A salient sufficient condition for optimal cooperation incentives among patient partners is that Joint Ownership be a Strict Coasian Institution, for which transaction costs impede an efficient asset reallocation after a breakdown. In contrast to Halonen (2002), the logic behind our results is that Joint Ownership maximizes the value of the relationship and the costs of renegotiating ownership after a broken relationship.
The hierarchical feature regression (HFR) is a novel graph-based regularized regression estimator, which mobilizes insights from the domains of machine learning and graph theory to estimate robust parameters for a linear regression. The estimator constructs a supervised feature graph that decomposes parameters along its edges, adjusting first for common variation and successively incorporating idiosyncratic patterns into the fitting process. The graph structure has the effect of shrinking parameters towards group targets, where the extent of shrinkage is governed by a hyperparameter, and group compositions as well as shrinkage targets are determined endogenously. The method offers rich resources for the visual exploration of the latent effect structure in the data, and demonstrates good predictive accuracy and versatility when compared to a panel of commonly used regularization techniques across a range of empirical and simulated regression tasks.
In its first ten years (2014-2023), the banking union was successful in its prudential agenda but failed spectacularly in its underlying objective: establishing a single banking market in the euro area. This goal is now more important than ever, and easier to attain than at any time in the last decade. To make progress, cross-border banks should receive a specific treatment within general banking union legislation. Suggestions are made on how to make such regulatory carve-out effective and legally sound.
The Eurosystem and the Deutsche Bundesbank will incur substantial losses in 2023 that are likely to persist for several years. Due to the massive purchases of securities in the last 10 years, especially of government bonds, the banks' excess reserves have risen sharply. The resulting high interest payments to the banks since the turnaround in monetary policy, with little income from the large-scale securities holdings, have led to massive criticism. The banks were said to be making "unfair" profits as a result, while the fiscal authorities had to forego the previously customary transfers of central bank profits. Populist demands to limit bank profits by, for example, drastically increasing the minimum reserve ratios in the Eurosystem to reduce excess reserves would create severe new problems and are neither justified nor helpful. Ultimately, the EU member states have benefited for a very long time from historically low interest rates because of the Eurosystem's extraordinarily loose monetary policy and must now bear the flip-side consequences of the massive expansion of central bank balance sheets during the necessary period of monetary policy normalisation.
This cumulative dissertation contains four self-contained chapters on stochastic games and learning in intertemporal choice.
Chapter 1 presents an experiment on value learning in a setting where actions have both immediate and delayed consequences. Subjects make a series of choices between abstract options, with values that have to be learned by sampling. Each option is associated with two payoff components: One is revealed immediately after the choice, the other with one round delay. Objectively, both payoff components are equally important, but most subjects systematically underreact to the delayed consequences. The resulting behavior appears impatient or myopic. However, there is no inherent reason to discount: All rewards are paid simultaneously, after the experiment. Elicited beliefs on the value of options are in accordance with choice behavior. These results demonstrate that revealed impatience may arise from frictions in learning, and that discounting does not necessarily reflect deep time preferences. In a treatment variation, subjects first learn passively from the evidence generated by others, before then making a series of own choices. Here, the underweighting of delayed consequences is attenuated, in particular for the earliest own decisions. Active decision making thus seems to play an important role in the emergence of the observed bias.
Chapter 2 introduces and proves existence of Markov quantal response equilibrium (QRE), an application of QRE to finite discounted stochastic games. We then study a specific case, logit Markov QRE, which arises when players react to total discounted payoffs using the logit choice rule with precision parameter λ. We show that the set of logit Markov QRE always contains a smooth path that leads from the unique QRE at λ = 0 to a stationary equilibrium of the game as λ goes to infinity. Following this path makes it possible to solve arbitrary finite discounted stochastic games numerically; an implementation of this algorithm is publicly available as part of the package sgamesolver. We further show that all logit Markov QRE are ε-equilibria, with a bound for ε that is independent of the payoff function of the game and decreases hyperbolically in λ. Finally, we establish a link to reinforcement learning by characterizing logit Markov QRE as the stationary points of a game dynamic that arises when all players follow the well-established reinforcement learning algorithm expected SARSA.
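The logit choice rule with precision parameter λ is the basic building block of logit Markov QRE: it maps a vector of (continuation) values into choice probabilities. A minimal, generic sketch (not the chapter's implementation), with a max-shift for numerical stability:

```python
import numpy as np

def logit_choice(values, lam):
    """Logit choice rule with precision lam.

    Choice probabilities are proportional to exp(lam * v). At lam = 0
    all actions are chosen uniformly; as lam grows, probability mass
    concentrates on the highest-value action.
    """
    z = lam * np.asarray(values, dtype=float)
    z -= z.max()            # shift for numerical stability; probabilities unchanged
    p = np.exp(z)
    return p / p.sum()
```

The two limits mirror the path described in the abstract: λ = 0 yields the uniform (unique) QRE, while λ → ∞ approaches best-response behavior and hence a stationary equilibrium.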
Chapter 3 introduces the logarithmic stochastic tracing procedure, a homotopy method to compute stationary equilibria for finite and discounted stochastic games. We build on the linear stochastic tracing procedure (Herings and Peeters 2004), but introduce logarithmic penalty terms as a regularization device, which brings two major improvements. First, the scope of the method is extended: it now has a convergence guarantee for all games of this class, rather than just generic ones. Second, by ensuring a smooth and interior solution path, computational performance is increased significantly. A ready-to-use implementation is publicly available. As demonstrated here, its speed compares quite favorably to other available algorithms, and it can solve games of considerable size in reasonable time. Because the method involves the gradual transformation of a prior into equilibrium strategies, it is possible to search the prior space and uncover potentially multiple equilibria and their respective basins of attraction. This also connects the method to the established theory of equilibrium selection.
Chapter 4 introduces sgamesolver, a Python package that uses the homotopy method to compute stationary equilibria of finite discounted stochastic games. A short user guide is complemented by a discussion of the homotopy method, the two implemented homotopy functions, logit Markov QRE and logarithmic tracing, and the predictor-corrector procedure and its implementation in sgamesolver. Basic and advanced use cases are demonstrated using several example games. Finally, we discuss the topic of symmetries in stochastic games.
Central banks sowing the seeds for a green financial sector? NGFS membership and market reactions
(2024)
In December 2017, during the One Planet Summit in Paris, a group of eight central banks and supervisory authorities launched the “Network for Greening the Financial Sector” (NGFS) to address challenges and risks posed by climate change to the global financial system. By June 2023, an additional 69 central banks from around the world had joined the network. We find that the propensity to join the network can be described as a function of a country’s economic development (e.g., GDP per capita), national institutions (e.g., central bank independence), and the performance of the central bank on its mandates (e.g., price stability and output gap). Using an event study design to examine consequences of network expansions in capital markets, we document that a difference portfolio that is long in clean energy stocks and short in fossil fuel stocks benefits from an enlargement of the NGFS. Overall, our results suggest that an increasing number of central banks and supervisory authorities are concerned about climate change and willing to go beyond their traditional objectives, and that the capital market believes they will do so.
By computing a volatility index (CVX) from cryptocurrency option prices, we analyze this market’s expectation of future volatility. Our method addresses the challenging liquidity environment of this young asset class and allows us to extract stable market implied volatilities. Two alternative methods are considered to compute volatilities from granular intra-day cryptocurrency options data, which spans the COVID-19 pandemic period. CVX data therefore capture ‘normal’ market dynamics as well as distress and recovery periods. The methods yield two cointegrated index series, where the corresponding error correction model can be used as an indicator for market implied tail-risk. Comparing our CVX to existing volatility benchmarks for traditional asset classes, such as VIX (equity) or GVX (gold), confirms that cryptocurrency volatility dynamics are often disconnected from traditional markets, yet share common shocks.
The pricing of digital art
(2023)
The intersection of recent advancements in generative artificial intelligence and blockchain technology has propelled digital art into the spotlight. Digital art pricing recognizes that owners derive utility beyond the artwork’s inherent value. We incorporate the consumption utility associated with digital art and model the stochastic discount factor and risk premiums. Furthermore, we conduct a calibration analysis to analyze the effects of shifts in the real and digital economy. Higher returns are required in a digital market upswing due to increased exposure to systematic risk and digital art prices are especially responsive to fluctuations in business cycles within digital markets.
Using a field study at a German brokerage, we investigate advised individual investors’ behavior and outcomes after self-selecting into a flat-fee scheme (percentage of portfolio value) for mutual funds. In a difference-in-differences setting, we compare 699 switchers to propensity-score-matched advisory clients who remained in the commission-based scheme. Switchers increase their portfolio values, improve portfolio diversification, and increase their portfolio performance. They also demand more financial advice and follow more advisor recommendations. We argue that switchers attribute a higher quality to the unchanged advisory services.
We study the role mutual funds play in the recovery from fast intraday crashes based on data from the National Stock Exchange of India for a single large stock. During normal times, trading activity and liquidity provision by mutual funds is negligible compared to other traders at around 4% of overall activity. Nevertheless, for the two intraday market-wide crashes in our sample, price recovery took place only after mutual funds moved in. Market stability may require the presence of well-capitalized standby liquidity providers for recovery from fast crashes.
The recent COVID-19 pandemic represents an unprecedented worldwide event to study the influence of related news on the financial markets, especially during the early stage of the pandemic when information on the new threat came rapidly and was complex for investors to process. In this paper, we investigate whether the flow of news on COVID-19 had an impact on forming market expectations. We analyze 203,886 online articles dealing with COVID-19 and published on three news platforms (MarketWatch.com, NYTimes.com, and Reuters.com) in the period from January to June 2020. Using machine learning techniques, we extract the news sentiment through a financial market-adapted BERT model that enables recognizing the context of each word in a given item. Our results show that there is a statistically significant and positive relationship between sentiment scores and S&P 500 market returns. Furthermore, we provide evidence that sentiment components and news categories on NYTimes.com were differently related to market returns.
Can consumption-based mechanisms generate positive and time-varying real term premia as we see in the data? I show that only models with time-varying risk aversion or models with high consumption risk can independently produce these patterns. The latter explanation has not been analysed before with respect to real term premia, and it relies on a small group of investors exposed to high consumption risk. Additionally, it can give rise to a “consumption-based arbitrageur” story of term premia. In relation to preferences, I consider models with both time-separable and recursive utility functions. Specifically for recursive utility, I introduce a novel perturbation solution method in terms of the intertemporal elasticity of substitution. This approach has not been used before in such models, it is easy to implement, and it allows a wide range of values for the parameter of intertemporal elasticity of substitution.
The complexities of geopolitical events, financial and fiscal crises, and the ebb and flow of personal life circumstances can weigh heavily on individuals’ minds as they make critical economic decisions. To investigate the impact of cognitive load on such decisions, the authors conducted an incentivized online experiment involving a representative sample of 2,000 French households. The results revealed that exposure to a taxing and persistent cognitive load significantly reduced consumption, particularly for individuals under the threat of furlough, while simultaneously increasing their account balances, particularly for those not facing such employment uncertainty. These effects were not driven by supply constraints or a worsening of credit constraints. Instead, cognitive load primarily affected the optimality of the chosen policy rules and impaired the ability of the standard economic model to accurately predict consumption patterns, although this effect was less pronounced among college-educated subjects.
We investigate how unconventional monetary policy, via central banks’ purchases of corporate bonds, unfolds in credit-saturated markets. While this policy results in a loosening of credit market conditions as intended by policymakers, we report two unintended side effects. First, the policy impacts the allocation of credit among industries. Affected banks reallocate loans from investment-grade firms active on bond markets almost entirely to real estate asset managers. Other industries do not obtain more loans, particularly real estate developers and construction firms. We document an increase in real estate prices due to this policy, which fuels real estate overvaluation. Second, more loan write-offs arise from lending to these firms, and banks are not compensated for this risk by higher interest rates. We document a drop in bank profitability and, at the same time, a higher reliance on real estate collateral. Our findings suggest that central banks’ quantitative easing has substantial adverse effects in credit-saturated economies.
We conduct a field experiment with clients of a German universal bank to explore the impact of peer information on sustainable retail investments. Our results show that information about peers’ inclination towards sustainable investing raises the amount allocated to stock funds labeled sustainable, when communicated during a buying decision. This effect is primarily driven by participants initially underestimating peers’ propensity to invest sustainably. Further, treated individuals indicate an increased interest in additional information on sustainable investments, primarily on risk and return expectations. However, when analyzing account-level portfolio holding data over time, we detect no spillover effects of peer information on later sustainable investment decisions.
Many consumers care about climate change and other externalities associated with their purchases. We analyze the behavior and market effects of such “socially responsible consumers” in three parts. First, we develop a flexible theoretical framework to study competitive equilibria with rational consequentialist consumers. In violation of price taking, equilibrium feedback non-trivially dampens a consumer’s mitigation efforts, undermining responsible behavior. This leads to a new type of market failure, where even consumers who fully “internalize the externality” overconsume externality-generating goods. At the same time, socially responsible consumers change the relative effectiveness of taxes, caps, and other policies in lowering the externality. Second, since consumer beliefs about and preferences over dampening play a crucial role in our framework, we investigate them empirically via a tailored survey. Consistent with our model, consumers are predominantly consequentialist, and on average believe in dampening. Inconsistent with our model, however, many consumers fail to anticipate dampening. Third, therefore, we analyze how such “naive” consumers modify our theoretical conclusions. Naive consumers behave more responsibly than rational consumers in a single-good economy, but may behave less responsibly in a multi-good economy with cross-market spillovers. A mix of naive and rational consumers may yield the worst outcomes.
This paper investigates stock market reaction to greenwashing by analyzing a new channel whereby companies change their names to green-related ones (i.e., names that evoke green and sustainable sentiments) to persuade the public that their activities are green. The findings reveal a striking positive stock price reaction to the announcement of corporate name changes to green-related names only for companies not involved in green activities at the time of the announcement. However, over an extended period of time, companies unrelated to green activities experience substantial negative abnormal returns if they fail to align their operational focus with the new name after the change.
How does group identity affect belief formation? To address this question, we conduct a series of online experiments with a representative sample of individuals in the US. Using the setting of the 2020 US presidential election, we find evidence of intergroup preference across three distinct components of the belief formation cycle: a biased prior belief, avoidance of outgroup information sources, and a belief-updating process that places greater (less) weight on prior (new) information. We further find that an intervention reducing the salience of information sources decreases outgroup information avoidance by 50%. In a social learning context in wave 2, we find participants place 33% more weight on ingroup than outgroup guesses. Through two waves of interventions, we identify source utility as the mechanism driving group effects in belief formation. Our analyses indicate that our observed effects are driven by groupy participants who exhibit stable and consistent intergroup preferences in both allocation decisions and belief formation across all three waves. These results suggest that policymakers could reduce the salience of group and partisan identity associated with a policy to decrease outgroup information avoidance and increase policy uptake.
This paper applies structure-preserving doubling methods to solve the matrix quadratic underlying the recursive solution of linear DSGE models. We present and compare two Structure-Preserving Doubling Algorithms (SDAs) to other competing methods – the QZ method, a Newton algorithm, and an iterative Bernoulli approach – as well as the related cyclic and logarithmic reduction algorithms. Our comparison is completed using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and, iteratively, different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). We find that both SDAs perform very favorably relative to QZ, with generally more accurate solutions computed in less time. While we collect theoretical convergence results that promise quadratic convergence rates to a unique stable solution, the algorithms may fail to converge when there is a breakdown due to singularity of the coefficient matrices in the recursion. One of the proposed algorithms can overcome this problem by an appropriate (re)initialization. This SDA also performs particularly well in refining solutions from different methods or from nearby parameterizations.
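Of the competing methods listed, the iterative Bernoulli approach admits a particularly compact sketch. The following is a generic fixed-point iteration for A X² + B X + C = 0, not the paper's implementation, which under suitable spectral conditions converges linearly to the minimal solution:

```python
import numpy as np

def bernoulli_iteration(A, B, C, max_iter=500, tol=1e-12):
    """Toy Bernoulli-type fixed-point iteration for A X^2 + B X + C = 0.

    Rearranging the equation as (A X + B) X = -C suggests the iteration
        X_{k+1} = -(A X_k + B)^{-1} C,
    started from X_0 = 0. Convergence is linear, with a rate governed by
    the separation of the relevant eigenvalues.
    """
    n = A.shape[0]
    X = np.zeros((n, n))
    for _ in range(max_iter):
        X_new = -np.linalg.solve(A @ X + B, C)   # one fixed-point step
        if np.linalg.norm(X_new - X) < tol:
            return X_new
        X = X_new
    return X
```

The contrast in convergence rates (linear here versus quadratic for doubling and Newton schemes) is one reason the SDAs compare favorably in the paper's timings.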
Whatever it takes to understand a central banker: embedding their words using neural networks
(2023)
Dictionary approaches are at the forefront of current techniques for quantifying central bank communication. In this paper, the authors propose a novel language model that is able to capture subtleties of messages, such as one of the most famous sentences in central bank communication, when ECB President Mario Draghi stated that "within [its] mandate, the ECB is ready to do whatever it takes to preserve the euro".
The authors utilize a text corpus that is unparalleled in size and diversity in the central bank communication literature, as well as introduce a novel approach to text quantification from computational linguistics. This allows them to provide high-quality central bank-specific textual representations and demonstrate their applicability by developing an index that tracks deviations in the Fed's communication towards inflation targeting. Their findings indicate that these deviations in communication significantly impact monetary policy actions, substantially reducing the reaction to inflation deviations in the US.
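As a hedged illustration of how such a deviation index might be scored once document embeddings exist (the vectors and the anchor below are assumptions; this is not the authors' model), each communication can be compared against an "inflation targeting" anchor vector:

```python
import numpy as np

def deviation_index(doc_vecs, anchor_vec):
    # Cosine distance of each document embedding to an "inflation targeting"
    # anchor vector; larger values indicate a larger deviation in communication.
    a = anchor_vec / np.linalg.norm(anchor_vec)
    sims = np.array([v @ a / np.linalg.norm(v) for v in doc_vecs])
    return 1.0 - sims
```

A document embedded identically to the anchor scores 0 (no deviation), while an embedding pointing in the opposite direction scores 2.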
Christine Laudenbach and Vincent Lindner: To promote financial education among children, young people, and adults in the long term, comprehensive information services must reach the entire population in Germany with the help of cooperation partners. Talking about finances can no longer be a taboo subject.
Standard applications of the consumption-based asset pricing model assume that goods and services within the nondurable consumption bundle are substitutes. We estimate substitution elasticities between different consumption bundles and show that households cannot substitute energy consumption by consumption of other nondurables. As a consequence, energy consumption affects the pricing function as a separate factor. Variation in energy consumption betas explains a large part of the premia related to value, investment, and operating profitability. For example, value stocks are typically more energy-intensive than growth stocks and thus riskier, since they suffer more from the oil supply shocks that also affect households.
We propose a model with mean-variance foreign investors who exhibit a convex disutility associated with brown bond holdings. The model predicts that bond green premia should be smaller in economies with a more closed financial account and highly volatile exchange rates. This happens because foreign intermediaries invest relatively less in such economies, and this lowers the marginal disutility of investing in polluting activities. We find strong empirical evidence in favor of this hypothesis using a global bond market dataset. Exchange rate volatility and financial account openness are thus able to explain the higher financing costs of green projects in emerging markets relative to advanced economies, especially when green bonds are denominated in local currency: a disadvantage that we can call the "green sin" of emerging economies.
This study looks at potential windfall profits from the four banking acquisitions in 2023. Based on accounting figures, an FT article states that a total of USD 44bn was left on the table. We see accounting figures as a misleading basis for this analysis. By estimating market-based cumulative abnormal returns (CAR), we find positive abnormal returns in all four cases which, when quantified, are around half of the FT’s accounting figures. Furthermore, we argue that transparent auctions with enough bidders should be preferred to negotiated bank sales.
This document was provided/prepared by the Economic Governance and EMU Scrutiny Unit at the request of the ECON Committee.
This paper explores entrepreneurs’ initially intended exit strategies and compares them to their final exit paths using an inductive approach that builds on the grounded theory methodology. Our data shows that initially intended and final exit strategies differ among entrepreneurs. Two groups of entrepreneurs emerged from our data. The first group comprises entrepreneurs who financed their firms through equity investors. The second group is made up of entrepreneurs who financed their businesses solely with their own equity. Our data shows that the first group originally intended a financial harvest exit strategy and settled with this harvest exit strategy. The second group initially intended a stewardship exit strategy but did not succeed. We used the theory of planned behavior and the behavioral agency model to analyze our data. By examining our results from these two theoretical perspectives, our study explains how entrepreneurs’ exit intentions lead to their actual exit strategies.
This paper develops and implements a backward and forward error analysis of and condition numbers for the numerical stability of the solutions of linear dynamic stochastic general equilibrium (DSGE) models. Comparing seven different solution methods from the literature, I demonstrate an economically significant loss of accuracy specifically in standard, generalized Schur (or QZ) decomposition based solutions methods resulting from large backward errors in solving the associated matrix quadratic problem. This is illustrated in the monetary macro model of Smets and Wouters (2007) and two production-based asset pricing models, a simple model of external habits with a readily available symbolic solution and the model of Jermann (1998) that lacks such a symbolic solution - QZ-based numerical solutions miss the equity premium by up to several annualized percentage points for parameterizations that either match the chosen calibration targets or are nearby to the parameterization in the literature. While the numerical solution methods from the literature failed to give any indication of these potential errors, easily implementable backward-error metrics and condition numbers are shown to successfully warn of such potential inaccuracies. The analysis is then performed for a database of roughly 100 DSGE models from the literature and a large set of draws from the model of Smets and Wouters (2007). While economically relevant errors do not appear pervasive from these latter applications, accuracies that differ by several orders of magnitude persist.
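A cheap residual-based proxy for the backward error of a computed solvent X of the matrix quadratic A X² + B X + C = 0, in the spirit of the metrics described above (a sketch, not the paper's exact definition), can be computed as:

```python
import numpy as np

def quadratic_backward_error(A, B, C, X):
    # Normwise relative residual of the matrix quadratic: a small value
    # certifies that X exactly solves a nearby perturbed problem, which is
    # the backward-error notion used to flag inaccurate solutions.
    res = A @ X @ X + B @ X + C
    nX = np.linalg.norm(X)
    denom = (np.linalg.norm(A) * nX**2
             + np.linalg.norm(B) * nX
             + np.linalg.norm(C))
    return np.linalg.norm(res) / denom
```

An exact solvent yields a backward error of zero; values many orders of magnitude above machine precision would warn of the QZ-related inaccuracies documented in the paper.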
Product aesthetics is a powerful means for achieving competitive advantage. Yet most studies to date have focused on the role of aesthetics in shaping pre-purchase preferences and have failed to consider how product aesthetics affects post-purchase processes and consumers' usage behavior. This research focuses on the relationship between aesthetics and usage behavior in the context of durable products. Studies 1A to 1C provide evidence of a positive effect of product aesthetics on usage intensity using market data from the car and the fashion industries. Study 2 corroborates these findings and shows that the more intensive use of highly aesthetic products may lead to the acquisition of product-specific usage skills that form the basis for a cognitive lock-in. Hence, consumers are less likely to switch away from products with appealing designs, an effect that is labeled as the ‘aesthetic fidelity’ effect. Study 3 addresses an alternative explanation for the ‘aesthetic fidelity’ effect based on mood and motivation but finds that the ‘aesthetic fidelity’ effect is indeed determined by usage intensity. Finally, Study 4 identifies a boundary condition of the positive effect of product aesthetics on product usage, showing that it is limited to durable products. In sum, this research demonstrates that the effects of product aesthetics extend beyond the pre-consumption stage and have an enduring impact on people's consumption experiences.
This research examines the impact of online display advertising and paid search advertising relative to offline advertising on firm performance and firm value. Using proprietary data on annualized advertising expenditures for 1651 firms spanning seven years, we document that both display advertising and paid search advertising exhibit positive effects on firm performance (measured by sales) and firm value (measured by Tobin's q). Paid search advertising has a more positive effect on sales than offline advertising, consistent with paid search being closest to the actual purchase decision and having enhanced targeting abilities. Display advertising exhibits a relatively more positive effect on Tobin's q than offline advertising, consistent with its long-term effects. The findings suggest heterogeneous economic benefits across different types of advertising, with direct implications for managers in analyzing advertising effectiveness and external stakeholders in assessing firm performance.
Most event studies rely on cumulative abnormal returns, measured as percentage changes in stock prices, as their dependent variable. Stock price reflects the value of the operating business plus non-operating assets minus debt. Yet, many events, in particular in marketing, only influence the value of the operating business, but not non-operating assets and debt. For these cases, the authors argue that the cumulative abnormal return on the operating business, defined as the ratio between the cumulative abnormal return on stock price and the firm-specific leverage effect, is a more appropriate dependent variable. Ignoring the differences in firm-specific leverage effects inflates the impact of observations pertaining to firms with large debt and deflates those pertaining to firms with large non-operating assets. Observations of firms with high debt receive several times the weight attributed to firms with low debt. A simulation study and the reanalysis of three previously published marketing event studies show that ignoring the firm-specific leverage effects influences an event study's results in unpredictable ways.
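Under the balance-sheet identity equity = operating value + non-operating assets − debt (a stylized simplification, not the authors' full estimator), the proposed adjustment can be sketched as:

```python
def car_operating(car_stock, equity, debt, nonoperating_assets):
    # Firm-specific leverage effect: the percentage change in equity value
    # produced by a one-percent change in operating value, holding debt
    # and non-operating assets fixed.
    operating_value = equity + debt - nonoperating_assets
    leverage_effect = operating_value / equity
    # CAR on the operating business = CAR on stock / leverage effect.
    return car_stock / leverage_effect
```

For example, a firm with equity 100, debt 100, and no non-operating assets has a leverage effect of 2, so a 4% abnormal stock return corresponds to only a 2% abnormal return on the operating business, which illustrates how high-debt observations get inflated when the adjustment is ignored.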
This article uses information from two data sources, Compustat and Nexis Uni, and textual analysis to measure and validate the brand focus and customer focus of 109 U.S. listed retailers. The results from an analysis of their 853 earnings calls in 2010 and 2018 outline that on average, both foci increased over time. Although both foci vary substantially, brand focus varies more widely across retailers than their customer focus. Both foci are independent of each other. Specialty retailers have the highest brand focus, and internet & direct marketing retailers have the highest customer focus. A positive correlation exists between a retailer’s customer focus and its profitability, but not between a retailer’s brand focus and its profitability. The authors use the results to generate a research agenda that can direct future research in further systematically exploring firms’ brand and customer focus.
Knowledge of consumers' willingness to pay (WTP) is a prerequisite to profitable price-setting. To gauge consumers' WTP, practitioners often rely on a direct single question approach in which consumers are asked to explicitly state their WTP for a product. Despite its popularity among practitioners, this approach has been found to suffer from hypothetical bias. In this paper, we propose a rigorous method that improves the accuracy of the direct single question approach. Specifically, we systematically assess the hypothetical biases associated with the direct single question approach and explore ways to de-bias it. Our results show that by using the de-biasing procedures we propose, we can generate a de-biased direct single question approach that is accurate enough to be useful for managerial decision-making. We validate this approach with two studies in this paper.
In recent years, European regulators have debated restricting the time an online tracker can track a user to better protect consumer privacy. Despite the significance of these debates, there has been a noticeable absence of any comprehensive cost-benefit analysis. This article fills this gap on the cost side by suggesting an approach to estimate the economic consequences of lifetime restrictions on cookies for publishers. The empirical study on cookies of 54,127 users who received ∼128 million ad impressions over ∼2.5 years yields an average cookie lifetime of 279 days, with an average value of €2.52 per cookie. Only ∼13% of all cookies increase their daily value over time, but their average value is about four times larger than the average value of all cookies. Restricting cookies’ lifetime to one year (two years) could potentially decrease their lifetime value by ∼25% (∼19%), which represents a potential decrease in the value of all cookies of ∼9% (∼5%). Most cookies, however, would not be affected by lifetime restrictions of 12 or 24 months, as 72% (85%) of the users delete their cookies within 12 (24) months. In light of the €10.60 billion cookie-based display ad revenue in Europe, such restrictions would endanger €904 million (€576 million) annually, equivalent to €2.08 (€1.33) per EU internet user. The article discusses the marketing strategy challenges and opportunities these results present for advertisers and publishers.
Even as online advertising continues to grow, a central question remains: Who to target? Yet, advertisers know little about how to select from the hundreds of audience segments for targeting (and combinations thereof) for a profitable online advertising campaign. Utilizing insights from a field experiment on Facebook (Study 1), we develop a model that helps advertisers solve the cold-start problem of selecting audience segments for targeting. Our model enables advertisers to calculate the break-even performance of an audience segment to make a targeted ad campaign at least as profitable as an untargeted one. Advertisers can use this novel model to decide whether to test specific audience segments in their campaigns (e.g., in randomized controlled trials). We apply our model to data from the Spotify ad platform to study the profitability of different audience segments (Study 2). Approximately half of those audience segments require the click-through rate to double compared to an untargeted campaign, which is unrealistically high for most ad campaigns. Our model also shows that narrow segments require a lift that is likely not attainable, specifically when the data quality of these segments is poor. We confirm this theoretical finding in an empirical study (Study 3): A decrease in data quality due to Apple’s introduction of the App Tracking Transparency (ATT) framework more negatively affects the click-through rate of narrow (versus broad) audience segments.
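Under a stylized per-impression profit calculus (a hypothetical simplification with equal value per click and costs quoted per mille, not the paper's full model), the break-even click-through rate of a targeted segment can be sketched as:

```python
def breakeven_ctr(ctr_untargeted, cpm_untargeted, cpm_targeted, value_per_click):
    # CTR at which a targeted campaign's profit per impression,
    # ctr * value_per_click - cpm / 1000, matches the untargeted campaign's.
    return ctr_untargeted + (cpm_targeted - cpm_untargeted) / (1000.0 * value_per_click)
```

For instance, with an untargeted CTR of 0.5%, an untargeted CPM of €2, a targeted CPM of €7, and €1 earned per click, the targeted segment must reach a CTR of 1%, i.e. double the untargeted rate; this is the kind of lift the study flags as unrealistic for most ad campaigns.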
Small businesses face major challenges to becoming more innovative. These challenges are particularly prevalent in emerging economies where high uncertainties are a barrier to innovation. We know from previous studies that linkages to universities, on the one hand, and public procurement, on the other, support large and innovative firms in their efforts to become more innovative. However, we do not know whether these positive effects also hold true for small businesses. In this paper, we focus on how policy strategies reducing information, market and financial uncertainties shape small businesses’ innovation in China. Based on a sample of 926 small businesses derived from the World Bank Enterprises Survey in China (2012), we find that university-industry linkages enhance innovation, though only when it comes to minor forms of innovation. In line with the resource-based view of the firm, this effect is stronger for small businesses with higher capabilities. Moreover, we show that bidding for or delivering contracts to public sector clients has a positive effect on innovation, in particular on major forms of innovation. In the bidding selection process, private firms and firms with higher capabilities are selected. Our findings show that both policy strategies have enhanced innovation, though with different effects on the degree of novelty. We attribute this finding to the different degrees of uncertainties they address.
In this article, we examine anti-refugee hate crime in the wake of the large influx of refugees to Germany in 2014 and 2015. By exploiting institutional features of the assignment of refugees to German regions, we estimate the impact of unexpected and sudden large-scale immigration on hate crime against refugees. Results indicate that it is not simply the size of local refugee inflows which drives the increase in hate crime, but rather the combination of refugee arrivals and latent anti-refugee sentiment. We show that ethnically homogeneous areas, areas which experienced hate crimes in the 1990s, and areas with high support for the Nazi party in the Weimar Republic, are more prone to respond to the arrival of refugees with incidents of hate crime against this group. Our results highlight the importance of regional anti-immigration sentiment in the analysis of the incumbent population’s reaction to immigration.
A novel spatial autoregressive model for panel data is introduced, which incorporates multilayer networks and accounts for time-varying relationships. Moreover, the proposed approach allows the structural variance to evolve smoothly over time and enables the analysis of shock propagation in terms of time-varying spillover effects.
The framework is applied to analyse the dynamics of international relationships among the G7 economies and their impact on stock market returns and volatilities. The findings underscore the substantial impact of cooperative interactions and highlight discernible disparities in network exposure across G7 nations, along with nuanced patterns in direct and indirect spillover effects.
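As a hedged illustration of the basic mechanics, a minimal sketch of the reduced form of a single-layer, time-invariant spatial autoregressive model, a far simpler object than the multilayer, time-varying model proposed here, is:

```python
import numpy as np

def sar_reduced_form(W, rho, Xb):
    # Solves y = rho * W @ y + Xb for y, giving y = (I - rho * W)^{-1} Xb.
    # W is a spatial weight (network) matrix; |rho| < 1 ensures invertibility
    # for row-normalized W. Spillovers arise because the inverse propagates
    # each unit's shock through the whole network.
    n = W.shape[0]
    return np.linalg.solve(np.eye(n) - rho * W, Xb)
```

With two connected units and rho = 0.5, a shock of 1 to the first unit raises its own outcome by 4/3 and spills 2/3 over to the second, which is the direct/indirect spillover decomposition the framework tracks over time.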
In his speech at the conference „The SNB and its Watchers“, Otmar Issing, member of the ECB Governing Council from its start in 1998 until 2006, takes a look back at more than twenty years of the conference series „The ECB and Its Watchers“. In June 1999, Issing established this format together with Axel Weber, then Director of the Center for Financial Studies, to discuss the monetary policy strategy of the newly founded central bank with a broad circle of participants, that is, academics, bank economists and members of the media, on „neutral ground“. At the annual conference, the ECB and its representatives would play an active role and engage in a lively exchange of views with the other participants. Over the years, Volker Wieland took over as organizer of the conference series, which was also adopted by other central banks. In his contribution at the second conference „The SNB and its Watchers“, Issing summarizes the experience gained from over twenty years of the ECB Watchers Conference.
Vulnerability comes, according to Orio Giarini, with two risks: human-made risks, also called entrepreneurial risks, and natural or pure risks such as accidents and earthquakes. Both types of risk are growing in dimension and are increasingly interrelated. To control this vulnerability, sophisticated insurance products are called for. Here, mutual insurance is relevant, in particular when risks are large, probabilities uncertain or unknown, and events interrelated or correlated. In this paper, three examples are discussed to show the advantages of mutual insurance: unknown probabilities connected with unforeseeable events, correlated risks, and macroeconomic or demographic risks.
Investors' return expectations are pivotal in stock markets, but the reasoning behind these expectations remains a black box for economists. This paper sheds light on economic agents' mental models -- their subjective understanding -- of the stock market, drawing on surveys with the US general population, US retail investors, US financial professionals, and academic experts. Respondents make return forecasts in scenarios describing stale news about the future earnings streams of companies, and we collect rich data on respondents' reasoning. We document three main results. First, inference from stale news is rare among academic experts but common among households and financial professionals, who believe that stale good news lead to persistently higher expected returns in the future. Second, while experts refer to the notion of market efficiency to explain their forecasts, households and financial professionals reveal a neglect of equilibrium forces. They naively equate higher future earnings with higher future returns, neglecting the offsetting effect of endogenous price adjustments. Third, a series of experimental interventions demonstrate that these naive forecasts do not result from inattention to trading or price responses but reflect a gap in respondents' mental models -- a fundamental unfamiliarity with the concept of equilibrium.
Shallow meritocracy
(2023)
Meritocracies aspire to reward hard work and promise not to judge individuals by the circumstances into which they were born. However, circumstances often shape the choice to work hard. I show that people's merit judgments are "shallow" and insensitive to this effect. They hold others responsible for their choices, even if these choices have been shaped by unequal circumstances. In an experiment, US participants judge how much money workers deserve for the effort they exert. Unequal circumstances disadvantage some workers and discourage them from working hard. Nonetheless, participants reward the effort of disadvantaged and advantaged workers identically, regardless of the circumstances under which choices are made. For some participants, this reflects their fundamental view regarding fair rewards. For others, the neglect results from the uncertain counterfactual. They understand that circumstances shape choices but do not correct for this because the counterfactual—what would have happened under equal circumstances—remains uncertain.
We estimate the causal effect of shared e-scooter services on traffic accidents by exploiting the variation in the availability of e-scooter services induced by the staggered rollout across 93 cities in six countries. Police-reported accidents involving personal injuries in the average month increased by around 8.2% after shared e-scooters were introduced. Effects are large during summer and insignificant during winter. Further heterogeneity analysis reveals the largest estimated effects for cities with limited cycling infrastructure, while no effects are detectable in cities with high bike-lane density. This difference suggests that public policy can play a crucial role in mitigating accidents related to e-scooters and, more generally, to changes in urban mobility.
This paper proposes tests for out-of-sample comparisons of interval forecasts based on parametric conditional quantile models. The tests rank the distance between actual and nominal conditional coverage with respect to the set of conditioning variables from all models, for a given loss function. We propose a pairwise test to compare two models for a single predictive interval. The set-up is then extended to a comparison across multiple models and/or intervals. The limiting distribution varies depending on whether models are strictly non-nested or overlapping. In the latter case, degeneracy may occur. We establish the asymptotic validity of wild bootstrap based critical values across all cases. An empirical application to Growth-at-Risk (GaR) uncovers situations in which a richer set of financial indicators are found to outperform a commonly-used benchmark model when predicting downside risk to economic activity.
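A hedged sketch of wild bootstrap critical values for a pairwise comparison, reduced to a generic loss-differential statistic with Rademacher weights (not the paper's exact test, which ranks conditional coverage distances), is:

```python
import numpy as np

def wild_bootstrap_pvalue(loss_diff, n_boot=999, seed=0):
    # Two-sided p-value for H0: E[loss_diff] = 0, where loss_diff is the
    # per-period loss differential between two interval forecast models.
    # The demeaned series is resampled with Rademacher (+/-1) weights.
    rng = np.random.default_rng(seed)
    d = np.asarray(loss_diff, dtype=float)
    stat = abs(d.mean())
    centered = d - d.mean()  # impose the null before resampling
    boot = [abs((centered * rng.choice([-1.0, 1.0], size=d.size)).mean())
            for _ in range(n_boot)]
    return (1 + sum(b >= stat for b in boot)) / (n_boot + 1)
```

The wild scheme preserves conditional heteroskedasticity in the losses, which is why bootstrap critical values remain valid across the strictly non-nested, overlapping, and degenerate cases discussed above.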
This paper studies the macro-financial implications of using carbon prices to achieve ambitious greenhouse gas (GHG) emission reduction targets. My empirical evidence shows a 0.6% output loss and a 0.3% rise in inflation in response to a 1% carbon policy shock. Furthermore, I also observe financial instability and allocation effects between the clean and highly polluting energy sectors. To better predict medium- and long-term impacts, using a medium-large macro-financial DSGE model with environmental aspects, I show the recessionary effect of an ambitious carbon price implementation to achieve climate targets: a 40% reduction in GHG emissions causes a 0.7% output loss, while reaching a zero-emission economy in 30 years causes a 2.6% output loss. I document an amplified effect of the banking sector during the transition path. The paper also uncovers the beneficial role of pre-announcements of carbon policies in mitigating inflation volatility by 0.2% at its peak, and my results suggest well-communicated carbon policies from authorities and investment to expand the green sector. My findings also stress the use of optimal green monetary and financial policies in mitigating the effects of transition risk and assisting the transition to a zero-emission world. Utilizing a heterogeneous approach with macroprudential tools, I find that optimal macroprudential tools can mitigate the output loss by 0.1% and the investment loss by 1%. Importantly, my work highlights the use of capital flow management in the green transition when a global cooperative solution is challenging.
This study explores the implications of rising markups for optimal Mirrleesian income and profit taxation. Using a stylized model with two individuals, the main forces shaping welfare-optimal policies are analytically characterized. Although a higher profit tax has redistributive benefits, it adversely affects market competition, leading to a greater equilibrium cost-of-living. Rising markups directly contribute to a decline in optimal marginal taxes on labor income. The optimal policy response to higher markups includes increasingly relying on the profit tax to fund redistribution. Declining optimal marginal income taxes assists the redistributive function of the profit tax by contributing to the expansion of the profit tax base. This response alone considerably increases the equilibrium cost-of-living. Nevertheless, a majority of the individuals become better off with the optimal policy. If it is not possible to tax profits optimally, due, for example, to profit shifting, increasing redistribution via income taxes is not optimal; every individual is worse off relative to the scenario with optimal profit taxation.
The debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. In particular, this concerns estimates derived from a simple aggregate demand and Phillips curve model with time-varying components as proposed by Laubach and Williams (2003). For example, Summers (2014a) refers to these estimates as important evidence for a secular stagnation and the need for fiscal stimulus. Yellen (2015, 2017) has made use of such estimates in order to explain and justify why the Federal Reserve has held interest rates so low for so long. First, we re-estimate the United States equilibrium rate with the methodology of Laubach and Williams (2003). Second, we build on their approach and an alternative specification to provide new estimates for the United States, Germany, the euro area and Japan. Third, we subject these estimates to a battery of sensitivity tests. Due to the great uncertainty and sensitivity that accompany these equilibrium rate estimates, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if these estimates are employed to determine the appropriate monetary policy stance, they are better used together with the consistent estimate of the level of potential output.
While the COVID-19 pandemic had a large and asymmetric impact on firms, many countries quickly enacted massive business rescue programs which are specifically targeted to smaller firms. Little is known about the effects of such policies on business entry and exit, investment, factor reallocation, and macroeconomic outcomes. This paper builds a general equilibrium model with heterogeneous and financially constrained firms in order to evaluate the short- and long-term consequences of small firm rescue programs in a pandemic recession. We calibrate the stationary equilibrium and the pandemic shock to the U.S. economy, taking into account the factual Paycheck Protection Program (PPP) as a specific policy. We find that the policy has only a modest impact on aggregate output and employment because (i) jobs are saved predominantly in the smallest firms that account for a minor share of employment and (ii) the grant reduces the reallocation of resources towards larger and less impacted firms. Much of the reallocation effects occur in the aftermath of the pandemic episode. By preventing inefficient liquidations, the policy dampens the long-term declines of aggregate consumption and of the real wage, thus delivering small welfare gains.
Nowadays, digitalization has an immense impact on the landscape of jobs. This technological revolution creates new industries and professions, promises greater efficiency and improves the quality of working life. However, emerging technologies such as robotics and artificial intelligence (AI) are reducing human intervention, thus advancing automation and eliminating thousands of jobs and entire occupational profiles. To prepare employees for the changing demands of work, adequate and timely training of the workforce and real-time support of workers in new positions is necessary. Therefore, it is investigated whether user-oriented technologies, such as augmented reality (AR) and virtual reality (VR), can be applied “on-the-job” for such training and support—also known as intelligence augmentation (IA). To address this problem, this work synthesizes results of a systematic literature review as well as a practically oriented search on augmented reality and virtual reality use cases within the IA context. A total of 150 papers and use cases are analyzed to identify suitable areas of application in which it is possible to enhance employees' capabilities. The results of both theoretical and practical work show that VR is primarily used to train employees without prior knowledge, whereas AR is used to expand the scope of competence of individuals in their field of expertise while on the job. Based on these results, a framework is derived which provides practitioners with guidelines as to how AR or VR can support workers at their job so that they can keep up with anticipated skill demands. Furthermore, it shows for which application areas AR or VR can provide workers with sufficient training to learn new job tasks. In this way, this research provides practical recommendations in order to accompany the imminent distortions caused by AI and similar technologies and to alleviate associated negative effects on the German labor market.
Goal setting is vital in learning sciences, but the scientific evaluation of optimal learning goals is underexplored. This study proposes a novel methodological approach to determine optimal learning goals. The data in this study comes from a gamified learning app implemented in an undergraduate accounting course at a large German university. With a combination of decision trees and regression analyses, the goals connected to the badges implemented in the app are evaluated. The results show that the initial badge set already motivated learning strategies that led to better grades on the exam. However, the results indicate that the levels of the goals could be improved, and additional badges could be implemented. In addition to new goal levels, new goal types are also discussed. The findings show that learning goals initially determined by the instructors need to be evaluated to offer an optimal motivational effect. The new methodological approach used in this study can be easily transferred to other learning data sets to provide further insights.
Life insurers use accounting and actuarial techniques to smooth reporting of firm assets and liabilities, seeking to transfer surpluses in good years to cover benefit payouts in bad years. Yet these techniques have been criticized as they make it difficult to assess insurers’ true financial status. We develop stylized and realistically-calibrated models of a participating life annuity, an insurance product that pays retirees guaranteed lifelong benefits along with variable non-guaranteed surplus. Our goal is to illustrate how accounting and actuarial techniques for this type of financial contract shape policyholder wellbeing, along with insurer profitability and stability. Smoothing adds value to both the annuitant and the insurer, so curtailing smoothing could undermine the market for long-term retirement payout products.
We investigate how financial literacy shapes older Americans' demand for financial advice. Using an experimental module fielded in the Health and Retirement Study, we show that financial literacy strongly improves the quality but not the quantity of financial advice sought. In particular, more financially literate people seek financial help from professionals. This effect is more pronounced among older people and those with more wealth and more complex financial positions. Our results imply that financial literacy and financial advisory services are complements rather than substitutes.
This paper examines heterogeneity in time discounting among a representative sample of elderly Americans, as well as its role in explaining key economic behaviors at older ages. We show how older Americans evaluate simple (hypothetical) inter-temporal choices in which payments today are compared with payments in the future. Using the indicators derived from this measure, we then demonstrate that differences in discounting patterns are associated with characteristics of particular importance in elderly populations. For example, cognitive deficits are associated with greater impatience, whereas bequest motives are associated with less impatience. We then relate our discounting measure to key economic outcomes and find that impatience is associated with lower wealth, fewer investments in health, and less planning for end of life care.
The US Treasury recently permitted deferred longevity income annuities to be included in pension plan menus as a default payout solution, yet little research has investigated whether more people should convert some of the $18 trillion they hold in employer-based defined contribution plans into lifelong income streams. We investigate this innovation using a calibrated lifecycle consumption and portfolio choice model embodying realistic institutional considerations. Our welfare analysis shows that defaulting a modest portion of retirees' 401(k) assets (above a threshold) into such annuities is an attractive way to enhance retirement security, raising welfare by up to 20% of retiree plan accruals.
Do required minimum distribution 401(k) rules matter, and for whom? Insights from a lifecycle model
(2023)
Tax-qualified vehicles have helped U.S. private-sector workers accumulate $33 trillion in retirement plans. An important but often-overlooked institutional feature shaping decumulations from these plans is the "Required Minimum Distribution" (RMD) regulation, which requires retirees to withdraw a minimum fraction from their retirement accounts or pay excise taxes on withdrawal shortfalls. Our calibrated lifecycle model measures the impact of RMD rules on heterogeneous households' financial behavior during their work lives and in retirement. The model shows that reforms delaying or eliminating the RMD rules have little effect on consumption profiles, but they would influence withdrawals and tax payments for households with bequest motives.
Artificial Intelligence (AI) and Machine Learning (ML) are currently hot topics in industry and business practice, while management-oriented research disciplines seem reluctant to adopt these sophisticated data analytics methods as research instruments. Even the Information Systems (IS) discipline with its close connections to Computer Science seems to be conservative when conducting empirical research endeavors. To assess the magnitude of the problem and to understand its causes, we conducted a bibliographic review on publications in high-level IS journals. We reviewed 1,838 articles that matched corresponding keyword-queries in journals from the AIS senior scholar basket, Electronic Markets and Decision Support Systems (Ranked B). In addition, we conducted a survey among IS researchers (N = 110). Based on the findings from our sample we evaluate different potential causes that could explain why ML methods are rather underrepresented in top-tier journals and discuss how the IS discipline could successfully incorporate ML methods in research undertakings.
Measuring and reducing energy consumption constitutes a crucial concern in public policies aimed at mitigating global warming. The real estate sector faces the challenge of enhancing building efficiency, where insights from experts play a pivotal role in the evaluation process. This research employs a machine learning approach to analyze expert opinions, seeking to extract the key determinants influencing potential residential building efficiency and establishing an efficient prediction framework. The study leverages open Energy Performance Certificate databases from two countries with distinct latitudes, namely the UK and Italy, to investigate whether enhancing energy efficiency necessitates different intervention approaches. The findings reveal the existence of non-linear relationships between efficiency and building characteristics, which cannot be captured by conventional linear modeling frameworks. By offering insights into the determinants of residential building efficiency, this study provides guidance to policymakers and stakeholders in formulating effective and sustainable strategies for energy efficiency improvement.
The forward guidance trap
(2023)
This paper examines the policy experience of the Fed, the ECB and the BOJ during and after the Covid-19 pandemic and draws lessons for monetary policy strategy and its communication. All three central banks provided appropriate accommodation during the pandemic, but two failed to unwind this accommodation in a timely manner. The Fed and the ECB guided real interest rates to inappropriately negative levels as the economy recovered from the pandemic, fueling high inflation. The policy error can be traced to decisions regarding forward guidance on policy rates that delayed lift-off while the two central banks continued to expand their balance sheets. The Fed and the ECB fell into the forward guidance trap. This could have been avoided if policy were guided by a forward-looking rule that properly adjusted the nominal interest rate with the evolution of the inflation outlook.
Tail-correlation matrices are an important tool for aggregating risk measurements across risk categories, asset classes and/or business segments. This paper demonstrates that traditional tail-correlation matrices—which are conventionally assumed to have ones on the diagonal—can lead to substantial biases of the aggregate risk measurement’s sensitivities with respect to risk exposures. Due to these biases, decision-makers receive an odd view of the effects of portfolio changes and may be unable to identify the optimal portfolio from a risk-return perspective. To overcome these issues, we introduce the “sensitivity-implied tail-correlation matrix”. The proposed tail-correlation matrix allows for a simple deterministic risk aggregation approach which reasonably approximates the true aggregate risk measurement according to the complete multivariate risk distribution. Numerical examples demonstrate that our approach is a better basis for portfolio optimization than the Value-at-Risk implied tail-correlation matrix, especially if the calibration portfolio (or current portfolio) deviates from the optimal portfolio.
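The square-root aggregation idea behind tail-correlation matrices, and why the diagonal matters for sensitivities, can be sketched numerically. This is an illustrative sketch of the generic formula sqrt(r' T r) and its gradient, with hypothetical numbers; it is not the paper's exact sensitivity-implied construction.

```python
import numpy as np

def aggregate_risk(standalone, tail_corr):
    """Square-root aggregation sqrt(r' T r) of stand-alone risk
    measurements r using a tail-correlation matrix T."""
    r = np.asarray(standalone, dtype=float)
    return float(np.sqrt(r @ tail_corr @ r))

def sensitivities(standalone, tail_corr):
    """Gradient of sqrt(r' T r) with respect to each stand-alone
    measurement: (T r) / sqrt(r' T r). The diagonal of T scales the
    own-exposure terms, which is why fixing it to ones can bias
    these sensitivities."""
    r = np.asarray(standalone, dtype=float)
    return tail_corr @ r / np.sqrt(r @ tail_corr @ r)

# Hypothetical stand-alone measurements for three risk categories.
r = [100.0, 60.0, 40.0]
T = np.array([[1.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.0]])
agg = aggregate_risk(r, T)
sens = sensitivities(r, T)
```

A decision-maker comparing portfolios effectively acts on `sens`; if the assumed unit diagonal distorts it, the risk-return ranking of portfolio changes is distorted as well.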
We empirically examine how systemic risk in the banking sector leads to correlated risk in office markets of global financial centers. In so doing, we compute an aggregated measure of systemic risk in financial centers as the cumulated expected capital shortfall of local financial institutions. Our identification strategy is based on a double counterfactual approach by comparing normal with financial distress periods as well as office with retail markets. We find that office market interconnectedness arises from systemic risk during financial turmoil periods. Office market performance in a financial center is affected by returns of systemically linked financial center office markets only during a systemic banking crisis. In contrast, there is no evidence of correlated risk during normal times and among the within-city counterfactual retail sector. The decline in office market returns during a banking crisis is larger in financial centers compared to non-financial centers.
Rezension zu: Social preferences: an introduction to behavioural economics and experimental research, by Michalis Drouvelis, Newcastle upon Tyne: Agenda Publishing, 2021, 205 pages, £22.99, ISBN 978-1-78821-417-9 (paperback).
Having a gatekeeper position in a collaborative network offers firms great potential to gain competitive advantages. However, it is not well understood what kind of collaborations are associated with such a position. Conceptually grounded in social network theory, this study draws on the resource-based view and the relational factors view to investigate which types of collaboration characterize firms that are in a gatekeeper position, which ultimately could improve firm performance in subsequent periods. The empirical analysis utilizes a unique longitudinal data set to examine dynamic network formation. We used a data crawling approach to reconstruct collaboration networks among the 500 largest companies in Germany over nine years and matched these networks with performance data. The results indicate that firms in gatekeeper positions often engage in medium-intensity collaborations and are less likely to engage in weak-intensity collaborations. Strong-intensity collaborations are not related to the likelihood of being a gatekeeper. Our study further reveals that a firm's knowledge base is an important moderator and that this knowledge base can increase the benefits of having a gatekeeper position in terms of firm performance.
A safe core mandate
(2023)
Central banks have vastly expanded their footprint on capital markets. At a time of extraordinary pressure from many sides, a simple benchmark for the scale and scope of their core mandate of price and financial stability may be useful.
We make a case for a narrow mandate to maintain and safeguard the border between safe and quasi-safe assets. This ex-ante definition minimizes ambiguity, discourages risk creation and limits panic runs, primarily by separating market demand for reliable liquidity from risk-intolerant, price-insensitive demand for a safe store of value. The central bank may occasionally be forced to intervene beyond the safe core but should not be bound by any such ex-ante mandate, unless directed to specific goals set by legislation with explicit fiscal support.
We review distinct features of liquidity and safety demand, seeking a definition of the safety border, and discuss LOLR support for borderline safe assets such as MMF or uninsured deposits.
A safe core formulation is close to the historical focus on regulated entities, collateralized lending and attention to the public debt market, but its specific framing offers some context on controversial issues such as the extent of LOLR responsibilities. It also justifies a persistently large scale for central bank liabilities (Greenwood, Hanson and Stein 2016), as safety demand is related to financial wealth rather than GDP. Finally, it is consistent with an active central bank role in supporting liquidity in government debt market trading and clearing (Duffie 2020, 2021).
A key solution for public good provision is the voluntary formation of institutions that commit players to cooperate. Such institutions generate inequality if some players decide not to participate but cannot be excluded from cooperation benefits. Prior research with small groups emphasizes the role of fairness concerns with positive effects on cooperation. We show that effects do not generalize to larger groups: if group size increases, groups are less willing to form institutions generating inequality. In contrast to smaller groups, however, this does not increase the number of participating players, thereby limiting the positive impact of institution formation on cooperation.
This Policy Letter presents two event studies based on pre-war data that foreshadow the remarkable way in which the Russian economy was able to withstand the pressure of an unprecedented package of international sanctions. First, it shows that a sudden stop of one of the two domestic producers of zinc in 2018 did not lead to a slowdown in the steel industry, which heavily relied on this input. Second, it demonstrates that a huge increase in the cost of a fuel called mazut in 2020 had virtually no impact on the firms that used it, even in regions where it was hard to substitute with alternative fuels. This Policy Letter argues that such stability in production can be explained by the fact that the Russian economy is heavily oriented toward commodities. It is much easier to replace a commodity supplier than a supplier of manufactured goods, and many commodity producers operate at high profit margins that allow them to continue operating even after large increases in their costs. Thus, the sanctions had a much smaller impact on Russia than they would have had on an economy with a larger manufacturing sector, where inputs are less substitutable and profit margins are smaller.
We study the interplay of capital and liquidity regulation in a general equilibrium setting by focusing on future funding risks. The model consists of a banking sector with long-term illiquid investment opportunities that need to be financed by short-term debt and by issuing equity. Refinancing long-term investment in the middle of its lifetime is risky, since the next generation of potential short-term debt holders may not be willing to provide funding when the return prospects of the long-term investment turn out to be bad. For moderate return risk, equilibria with and without bank default coexist, and bank default is a self-fulfilling prophecy. Capital and liquidity regulation can prevent bank default and may implement the first-best. Yet the former is more powerful in ruling out undesirable equilibria and thus dominates liquidity regulation. Adding liquidity regulation to optimal capital regulation is redundant.
In current discussions of large language models (LLMs) such as GPT, understanding their ability to emulate facets of human intelligence stands central. Using behavioral economic paradigms and structural models, we investigate GPT's cooperativeness in human interactions and assess its rational, goal-oriented behavior. We discover that GPT cooperates more than humans and has overly optimistic expectations about human cooperation. Intriguingly, additional analyses reveal that GPT's behavior is not random; it displays a level of goal-oriented rationality surpassing human counterparts. Our findings suggest that GPT hyper-rationally aims to maximize social welfare, coupled with a drive for self-preservation. Methodologically, our research highlights how structural models, typically employed to decipher human behavior, can illuminate the rationality and goal-orientation of LLMs. This opens a compelling path for future research into the intricate rationality of sophisticated, yet enigmatic artificial agents.
We study the redistributive effects of inflation combining administrative bank data with an information provision experiment during an episode of historic inflation. On average, households are well-informed about prevailing inflation and are concerned about its impact on their wealth; yet, while many households know about inflation eroding nominal assets, most are unaware of nominal-debt erosion. Once they receive information on the debt-erosion channel, households update upwards their beliefs about nominal debt and their own real net wealth. These changes in beliefs causally affect actual consumption and hypothetical debt decisions. Our findings suggest that real wealth mediates the sensitivity of consumption to inflation once households are aware of the wealth effects of inflation.
Dynamics of life course family transitions in Germany: exploring patterns, process and relationships
(2023)
This paper explores the dynamics of family life events in Germany using discrete-time event history analysis based on SOEP data. We find that higher educational attainment, a better income level, and marriage emerge as salient protective factors mitigating the risk of mortality; better education also reduces the likelihood of first marriage, whereas lower educational attainment, a protracted marriage duration, and the presence of children act as protective factors against divorce. Our key finding is that the disparity in mean life expectancies between individuals from low- and high-income brackets amounts to 9 years among males and 6 years among females, illustrating the mortality inequality attributable to income disparities. Our estimates show that West Germans have a lower risk of death, a lower likelihood of first marriage, and a higher risk of divorce and remarriage compared to East Germans.
We present determinacy bounds on monetary policy in the sticky information model. We find that these bounds are more conservative when the long-run Phillips curve is vertical than in the standard Calvo sticky price New Keynesian model. Specifically, the Taylor principle is now directly necessary: no amount of output targeting can substitute for the monetary authority's concern for inflation. These determinacy bounds are obtained by appealing to frequency domain techniques that themselves provide novel interpretations of the Phillips curve.
Questionable research practices have generated considerable recent interest throughout and beyond the scientific community. We subsume such practices involving secret data snooping that influences subsequent statistical inference under the term MESSing (manipulating evidence subject to snooping) and discuss, illustrate and quantify the possibly dramatic effects of several forms of MESSing using an empirical and a simple theoretical example. The empirical example uses numbers from the most popular German lottery, which seem to suggest that 13 is an unlucky number.
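The "unlucky 13" illustration can be made concrete with a small simulation: picking the least-drawn number after looking at the data, then testing it as if it had been specified in advance, is a form of MESSing. This is a hedged sketch of the general snooping mechanism with hypothetical draw counts, not a reproduction of the paper's lottery data.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws, balls, k = 4000, 49, 6   # hypothetical "6 out of 49" history
p = k / balls                     # chance any fixed number is drawn

# Simulate a fair lottery and count appearances of each number.
counts = np.zeros(balls, dtype=int)
for _ in range(n_draws):
    counts[rng.choice(balls, size=k, replace=False)] += 1
worst = counts.min()              # the "unlucky" number, picked AFTER looking

# Snooped p-value: pretend the worst number had been specified in advance.
naive_p = (rng.binomial(n_draws, p, size=20_000) <= worst).mean()

# Honest p-value: compare against the distribution of the MINIMUM count
# over all 49 numbers (independence approximation of the true joint law).
mins = rng.binomial(n_draws, p, size=(5_000, balls)).min(axis=1)
honest_p = (mins <= worst).mean()
# honest_p is far larger than naive_p: snooping manufactures "significance"
```

The gap between `naive_p` and `honest_p` quantifies the inferential damage of this particular form of MESSing: the minimum of many well-behaved counts always looks extreme when judged against a single-count benchmark.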
In this study, we introduce a novel entity matching (EM) framework. It combines state-of-the-art EM approaches based on Artificial Neural Networks (ANN) with a new similarity encoding derived from matching techniques that are prevalent in finance and economics. Our framework is on par with or outperforms alternative end-to-end frameworks in standard benchmark cases. Because the similarity encoding is constructed using (edit) distances instead of semantic similarities, it avoids out-of-vocabulary problems when matching dirty data. We highlight this property by applying the framework to dirty financial firm-level data extracted from historical archives.
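The edit-distance idea behind such a similarity encoding can be sketched as follows: each field of a record pair is mapped to a normalized Levenshtein similarity, yielding a numeric feature vector that a classifier can consume and that tolerates misspellings unseen in any vocabulary. This is a generic sketch of the technique, not the paper's exact encoding; the record fields and firm names are hypothetical.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic edit distance via dynamic programming (two rows)."""
    if len(a) < len(b):
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def similarity(a: str, b: str) -> float:
    """Normalize edit distance into [0, 1]; 1 means identical strings."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a.lower(), b.lower()) / max(len(a), len(b))

# Similarity encoding of a record pair: one feature per field, which can
# then be fed to a downstream matcher (fields hypothetical).
pair = (("Goldmann Sachs & Co", "New York"), ("Goldman Sachs Co", "New York"))
features = [similarity(x, y) for x, y in zip(*pair)]
```

Because the encoding never consults a vocabulary, a misspelled name such as "Goldmann" still yields a high similarity feature rather than an out-of-vocabulary token.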
Biodiversity loss poses a significant threat to the global economy and affects ecosystem services on which most large companies rely heavily. The severe financial implications of such a reduced species diversity have attracted the attention of companies and stakeholders, with numerous calls to increase corporate transparency. Using textual analysis, this study thus investigates the current state of voluntary biodiversity reporting of 359 European blue-chip companies and assesses the extent to which it aligns with the upcoming disclosure framework of the Task Force on Nature-related Financial Disclosures (TNFD). The descriptive results suggest a substantial gap between current reporting practices and the proposed TNFD framework, with disclosures largely lacking quantification, details and clear targets. In addition, the disclosures appear to be relatively unstandardized. Companies in sectors or regions exposed to higher nature-related risks as well as larger companies are more likely to report on aspects of biodiversity. This study contributes to the emerging literature on nature-related risks and provides detailed insights on the extent of the reporting gap in light of the upcoming standards.
This paper analyzes the current implementation status of sustainability and taxonomy-aligned disclosure under the Sustainable Finance Disclosure Regulation (SFDR) as well as the development of the SFDR categorization of funds offered via banks in Germany. Examining data provided by WM Group, which covers more than 10,000 investment funds and 2,000 index funds between September 2022 and March 2023, we observe a significant proportion of Article 9 (dark green) funds transitioning to Article 8 (light green) funds, particularly among index funds. As a consequence of this process, the profile of the SFDR classes has sharpened, which reflects an increased share of sustainable investments in the group of Article 9 funds. When differentiating between environmental and social investments, the share of environmental investments increased, but the share of social investments decreased in the group of Article 9 funds at the beginning of 2023. The share of taxonomy-aligned investments is very low, but slightly increasing for Article 9 funds. However, by March 2023 only around 1,000 funds had reported their sustainability proportions, and this picture might change due to legal changes requiring all funds within the scope of the SFDR to report these proportions in their annual reports published after 1 January 2023.
Industry classification groups firms into finer partitions to help investments and empirical analysis. To overcome the well-documented limitations of existing industry definitions, like their stale nature and coarse categories for firms with multiple operations, we employ a clustering approach on 69 firm characteristics and allocate companies to novel economic sectors maximizing the within-group explained variation. Such sectors are dynamic yet stable, and represent a superior investment set compared to standard classification schemes for portfolio optimization and for trading strategies based on within-industry mean-reversion, which give rise to a latent risk factor significantly priced in the cross-section. We provide a new metric to quantify feature importance for clustering methods, finding that size drives differences across classical industries while book-to-market and financial liquidity variables matter for clustering-based sectors.
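The clustering logic described above, assigning firms to groups that maximize the within-group explained variation of their characteristics, can be sketched with a minimal Lloyd's-algorithm k-means and an explained-variation score. This is a toy stand-in under stated assumptions (simulated characteristics, plain k-means), not the paper's 69-characteristic procedure.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's algorithm: a stand-in for the clustering step."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():            # skip empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def explained_variation(X, labels):
    """Within-group explained variation: 1 - SS_within / SS_total."""
    ss_total = ((X - X.mean(axis=0)) ** 2).sum()
    ss_within = sum(((X[labels == j] - X[labels == j].mean(axis=0)) ** 2).sum()
                    for j in np.unique(labels))
    return 1.0 - ss_within / ss_total

# Toy "firm characteristics": two well-separated groups, three features,
# standardized before clustering (all numbers hypothetical).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(5, 1, (50, 3))])
X = (X - X.mean(axis=0)) / X.std(axis=0)
labels = kmeans(X, k=2)
```

Comparing `explained_variation` across candidate groupings (clustering-based sectors versus a standard classification applied to the same characteristics) is one concrete way to judge which partition explains more of the cross-sectional variation.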
We estimate the transmission of the pandemic shock in 2020 to prices in the residential and commercial real estate market by causal machine learning, using new granular data at the municipal level for Germany. We exploit differences in the incidence of Covid infections or short-time work at the municipal level for identification. In contrast to evidence for other countries, we find that the pandemic had only temporary negative effects on rents for some real estate types and increased asset prices of real estate particularly in the top price segment of commercial real estate.
This study analyzes information production and trading behavior of banks with lending relationships. We combine trade-by-trade supervisory data and credit-registry data to examine banks' proprietary trading in borrower stocks around a large number of corporate events. We find that relationship banks build up positive (negative) trading positions in the two weeks before events with positive (negative) news, even when these events are unscheduled, and unwind positions shortly after the event. This trading pattern is more pronounced in situations when banks are likely to possess private information about their borrowers, and cannot be explained by specialized expertise in certain industries or certain firms. The results suggest that banks' lending relationships inform their trading and underscore the potential for conflicts of interest in universal banking, which have been a prominent concern in the regulatory debate for a long time. Our analysis illustrates how combining large data sets can uncover unusual trading patterns and enhance the supervision of financial institutions.
We examine whether the uncertainty related to environmental, social, and governance (ESG) regulation developments is reflected in asset prices. We proxy the sensitivity of firms to ESG regulation uncertainty by the disparity across the components of their ESG ratings. Firms with high ESG disparity have a higher option-implied cost of protection against downside tail risk. The impact of the misalignment across the different dimensions of the ESG score is distinct from that of ESG score level itself. Aggregate downside risk bears a negative price for firms with low ESG disparity.
A common practice in empirical macroeconomics is to examine alternative recursive orderings of the variables in structural vector autoregressive (VAR) models. When the implied impulse responses look similar, the estimates are considered trustworthy. When they do not, the estimates are used to bound the true response without directly addressing the identification challenge. A leading example of this practice is the literature on the effects of uncertainty shocks on economic activity. We prove by counterexample that this practice is invalid in general, whether the data generating process is a structural VAR model or a dynamic stochastic general equilibrium model.
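The mechanics of what "alternative recursive orderings" means can be illustrated numerically: each ordering corresponds to a Cholesky factorization of the same reduced-form innovation covariance, and the two factorizations can imply very different impact responses. This is a minimal sketch with hypothetical numbers, not the paper's counterexample.

```python
import numpy as np

# Hypothetical reduced-form innovation covariance for a bivariate VAR,
# e.g. (uncertainty proxy, economic activity).
Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.5]])

# Ordering 1: uncertainty first -> impact matrix is the Cholesky factor.
B1 = np.linalg.cholesky(Sigma)

# Ordering 2: activity first -> permute, factor, permute back.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
B2 = P @ np.linalg.cholesky(P @ Sigma @ P) @ P

# Both satisfy B @ B.T == Sigma, yet the on-impact response of the
# second variable to the first shock is 0.6 under ordering 1 and
# exactly 0 under ordering 2 (a zero imposed by the recursion itself).
```

Since both orderings fit the data equally well, similarity of the implied responses is a property of the chosen example, not evidence of identification, which is exactly the point the counterexample formalizes.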
This paper analyzes the scope of the private market for pandemic insurance. We develop a framework that explains theoretically how the equilibrium price of pandemic insurance depends on accumulation risk, the covariance between pandemic claims and other claims, and the covariance between pandemic claims and stock market performance. Using the natural catastrophe (NatCat) insurance market as a laboratory, we estimate the relationship between the insurance price markup and the tail characteristics of the loss distribution. Then, using high-frequency data tracking the economic impact of the COVID-19 pandemic in the United States, we calibrate the loss distribution of a hypothetical insurance contract designed to alleviate the impact of the pandemic on small businesses. The price markup of the pandemic insurance contract corresponds to the top 20% of markups observed in the NatCat insurance market. Finally, we analyze an intertemporal risk-sharing scheme that can reduce the expected shortfall of the loss distribution by 50%.