Working Paper
We develop a novel empirical approach to identify the effectiveness of policies against a pandemic. The essence of our approach is the insight that epidemic dynamics are best tracked over stages, rather than over time. We use a normalization procedure that makes the pre-policy paths of the epidemic identical across regions. The procedure uncovers regional variation in the stage of the epidemic at the time of policy implementation. This variation delivers clean identification of the policy effect based on the epidemic path of a leading region that serves as a counterfactual for other regions. We apply our method to evaluate the effectiveness of the nationwide stay-home policy enacted in Spain against the Covid-19 pandemic. We find that the policy saved 15.9% of lives relative to the number of deaths that would have occurred had it not been for the policy intervention. Its effectiveness evolves with the epidemic and is larger when implemented at earlier stages.
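The stage-based idea can be sketched as follows: instead of comparing regions on calendar dates, each region’s path is re-indexed by the day on which it reaches common epidemic milestones, so a leading region’s later stages can serve as a counterfactual for a lagging one. A minimal illustration in Python, with hypothetical death counts and thresholds (not the authors’ actual normalization procedure or data):

```python
# Illustrative stage-based re-indexing: compare regions at common
# epidemic milestones (here: cumulative deaths) instead of calendar
# dates. Region paths and thresholds are hypothetical.

def stage_index(cum_deaths, thresholds):
    """For each milestone, return the first day on which cumulative
    deaths reach it (None if it is never reached)."""
    return [next((day for day, c in enumerate(cum_deaths) if c >= t), None)
            for t in thresholds]

leader = [1, 3, 8, 20, 45, 90]   # leading region
lagger = [0, 1, 3, 8, 20, 45]    # same path, one day behind

thresholds = [3, 8, 20]
lead_stages = stage_index(leader, thresholds)   # [1, 2, 3]
lag_stages = stage_index(lagger, thresholds)    # [2, 3, 4]
```

The constant one-day offset across all milestones is what identifies the lagging region as being at an earlier stage of an otherwise identical path.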
This paper studies a household’s optimal demand for a reverse mortgage. These contracts allow homeowners to tap their home equity to finance consumption needs. In stylized frameworks, we show that the decision to enter a reverse mortgage is mainly driven by the differential between the aggregate appreciation of the house price and the principal limit factor on the one hand and the funding costs of a household on the other hand. We also study a rich life-cycle model that can explain the low demand for reverse mortgages observed in US data. In this model, we analyze the optimal response of a household that is confronted with a health shock or a financial disaster. If an agent suffers an unexpected health shock, she reduces the risky portfolio share and is more likely to enter a reverse mortgage. If, on the other hand, there is a large drop in the stock market, she keeps the risky portfolio share almost constant by buying additional shares of stock. Moreover, the probability of taking out a reverse mortgage is hardly affected.
The ruling of the German Federal Constitutional Court and its call for conducting and communicating proportionality assessments regarding monetary policy have been the subject of some controversy. However, the ruling can also be understood as a way to strengthen the de facto independence of the European Central Bank. The authors show how a regular proportionality check could be integrated into the ECB’s strategy, which is currently undergoing a systematic review. In particular, they propose to include quantitative benchmarks for policy rates and the central bank balance sheet. Deviations from such benchmarks can have benefits in terms of the intended path for inflation while involving costs in terms of risks and side effects that need to be balanced. Practical applications to the euro area are provided.
In this paper we adapt the Hamiltonian Monte Carlo (HMC) estimator to DSGE models, a method presently used in various fields due to its superior sampling and diagnostic properties. We implement it in the state-of-the-art, freely available high-performance software package Stan. We estimate a small-scale textbook New-Keynesian model and the Smets-Wouters model using US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we find bimodality in the Smets-Wouters model even if we estimate the model using the original tight priors. Finally, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm to create a powerful tool which permits the estimation of DSGE models with ill-behaved posterior densities.
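For intuition on what an HMC sampler does under the hood, here is a toy one-dimensional implementation in plain Python. It is purely illustrative: Stan’s sampler (NUTS) adapts the step size and path length automatically, and the target below is a standard normal rather than a DSGE posterior.

```python
import math
import random

def hmc_sample(logp_grad, x0, n_draws, eps=0.1, n_leapfrog=20, seed=0):
    """Toy 1-D Hamiltonian Monte Carlo. logp_grad(x) must return
    (log density, gradient). Fixed eps and path length are used here
    for brevity; production samplers tune them adaptively."""
    rng = random.Random(seed)
    x = x0
    draws = []
    for _ in range(n_draws):
        p0 = rng.gauss(0.0, 1.0)             # resample momentum
        x_new, p = x, p0
        lp0, g = logp_grad(x_new)
        p += 0.5 * eps * g                   # leapfrog half step
        for step in range(n_leapfrog):
            x_new += eps * p
            lp_new, g = logp_grad(x_new)
            if step < n_leapfrog - 1:
                p += eps * g
        p += 0.5 * eps * g                   # final half step
        # Metropolis correction on the Hamiltonian
        h0 = -lp0 + 0.5 * p0 * p0
        h1 = -lp_new + 0.5 * p * p
        if math.log(rng.random()) < h0 - h1:
            x = x_new
        draws.append(x)
    return draws

# target: standard normal, log p(x) = -x^2/2 up to a constant
draws = hmc_sample(lambda x: (-0.5 * x * x, -x), 0.0, 2000)
mean = sum(draws) / len(draws)
var = sum(d * d for d in draws) / len(draws)
```

The gradient information is what lets HMC take long, low-rejection moves through the posterior, which is the sampling advantage the abstract refers to.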
This paper determines the cost of employee stock options (ESOs) to shareholders. I present a pricing method that seeks to replicate the empirics of exercise and cancellation as closely as possible. In a first step, an intensity-based pricing model of El Karoui and Martellini is adapted to the needs of ESOs. In a second step, I calibrate the model with a regression analysis of exercise rates from the empirical work of Heath, Huddart and Lang. The pricing model thus takes account of all effects captured in the regression. Separate regressions enable me to compare options for top executives with those for subordinates. I find no price differences. The model is also applied to test the precision of the fair value accounting method for ESOs, SFAS 123. Using my model as a reference, the SFAS method results in surprisingly accurate prices.
JEL classification: G13; J33; M41; M52
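The intensity-based approach can be illustrated with a small Monte Carlo sketch in which the exercise time is drawn from an exponential distribution, a crude stand-in for an empirically calibrated exercise/cancellation intensity. All parameter values are hypothetical; this is not the calibrated model from the paper.

```python
import math
import random

def eso_value_mc(s0, strike, r, sigma, lam, T, n_paths=20000, seed=1):
    """Monte Carlo sketch of intensity-based ESO valuation: the holder
    exercises at an exponential random time with rate lam; any option
    still alive at maturity T pays off then. The stock follows a
    risk-neutral geometric Brownian motion."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        t = min(rng.expovariate(lam), T)     # exercise time, capped at T
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp((r - 0.5 * sigma ** 2) * t
                            + sigma * math.sqrt(t) * z)
        total += math.exp(-r * t) * max(s_t - strike, 0.0)
    return total / n_paths

# hypothetical at-the-money grant with a 10-year maturity
value = eso_value_mc(100.0, 100.0, r=0.05, sigma=0.2, lam=0.3, T=10.0)
```

Because early exercise truncates the option’s life, the resulting value lies below the corresponding European call held to maturity, which is the basic cost-reduction channel an exercise intensity captures.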
This paper studies the link between bank recapitalization and welfare in a dynamic production economy. The model features financial frictions because banks benefit from a cost advantage at monitoring firms and face costly equity issuance. The competitive equilibrium outcome is inefficient because agents do not internalize the effects of banks’ capitalization on the allocation of capital, its price and, in turn, firms’ investments. It follows that individual recapitalizations are sub-optimal and that bailout policies may benefit social welfare in the long run. Bailouts improve capital allocation in states where aggregate banks are poorly capitalized, thereby enhancing their market valuation, fostering investment, and stabilizing the economy’s recovery path.
Market fragmentation and technological advances increasing the speed of trading have altered the functioning and stability of global equity limit order markets. Taking market resiliency as an indicator of market quality, we investigate how resilient trading venues are in a high-frequency environment with cross-venue fragmented order flow. Employing a Hawkes process methodology on high-frequency data for FTSE 100 stocks on LSE, a traditional exchange, and on Chi-X, an alternative venue, we find that when liquidity becomes scarce, Chi-X is a less resilient venue than LSE, with variations existing across stocks and time. In comparison with LSE, Chi-X has more, longer, and more severe liquidity shocks. Whereas the vast majority of liquidity droughts on both venues disappear within less than one minute, the recovery is not lasting, as liquidity shocks spiral over the time dimension. Over half of the shocks on both venues are caused by spiralling. Liquidity shocks tend to spiral more on Chi-X than on LSE for large stocks, suggesting that the liquidity supply on Chi-X is thinner than on LSE. Finally, a significant amount of liquidity shocks spills over across venues, providing supporting evidence for the competition for order flow between LSE and Chi-X.
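The key object in a Hawkes methodology is the conditional intensity, in which every past shock temporarily raises the arrival rate of further shocks, which is the spiralling described above. A minimal sketch with an exponential kernel; all parameter values and event times are hypothetical.

```python
import math

def hawkes_intensity(t, event_times, mu, alpha, beta):
    """Conditional intensity of a univariate Hawkes process with an
    exponential kernel: mu + sum of alpha * exp(-beta * (t - t_i))
    over past events."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in event_times if ti < t)

shocks = [1.0, 1.2, 1.3]   # liquidity-shock times (seconds, hypothetical)
lam_now = hawkes_intensity(1.4, shocks, mu=0.5, alpha=0.8, beta=2.0)
lam_later = hawkes_intensity(50.0, shocks, mu=0.5, alpha=0.8, beta=2.0)
# right after a cluster the intensity is elevated; long afterwards it
# decays back to the baseline mu
```

Fitting mu, alpha and beta to observed shock times is what lets one quantify how much of the shock arrival rate is baseline versus self-excited spiralling.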
The Multilingual Assessment Instrument for Narratives (MAIN) is a theoretically grounded toolkit that employs parallel pictorial stimuli to explore and assess narrative skills in children in many different languages. It is part of the LITMUS (Language Impairment Testing in Multilingual Settings) battery of tests that were developed in connection with the COST Action IS0804 Language Impairment in a Multilingual Society: Linguistic Patterns and the Road to Assessment (2009−2013). MAIN has been designed to assess both narrative production and comprehension in children who acquire one or more languages from birth or from an early age. Its design allows for the comparable assessment of narrative skills in several languages in the same child and in different elicitation modes: Telling, Retelling and Model Story. MAIN contains four parallel stories, each with a carefully designed six-picture sequence based on a theoretical model of multidimensional story organization. The stories are controlled for cognitive and linguistic complexity, parallelism in macrostructure and microstructure, as well as for cultural appropriateness and robustness. As a tool, MAIN has been used to compare children’s narrative skills across languages, and also to help differentiate between children with and without developmental language disorders, both monolinguals and bilinguals.
This volume consists of two parts. The main content of Part I consists of 33 papers describing the process of adapting and translating MAIN into a large number of languages from different parts of the world. Part II contains materials for use in about 80 languages, including pictorial stimuli, which are accessible after registration.
MAIN was first published in 2012/2013 (ZASPiL 56). Several years of theory development and material construction preceded this launch. In 2019 (ZASPiL 63), the revised English version (revised on the basis of over 2,500 transcribed MAIN narratives as well as ca. 24,000 responses to MAIN comprehension questions, collected from around 700 monolingual and bilingual children in Germany, Russia and Sweden between 2013 and 2019) was published together with revised versions in German, Russian, Swedish, and Turkish for the bilingual Turkish-Swedish population in Sweden. The present 2020 (ZASPiL 64) volume contains new and revised language versions of MAIN.
On the basis of the economic theory of network effects, this article provides a novel explanation of the so-called patent paradox, i.e. the question why the propensity to patent is so strong when the expected average value of most patents is low. It demonstrates that the patent system of a country resembles a telephone network or a social media platform. Patents are perceived as nodes in a virtual network that, as a whole, exhibits network effects. It is explained why patents are not independent of other patents but complement each other in several ways both within and beyond markets and fields of technology, and why patents thus create synchronization value over and above individual interests of patent holders in exclusivity. As a consequence, the more patents there are, the more valuable it is to also seek patents, and vice versa. Since patents thus display increasing returns to adoption, the willingness to pay for the next patent slopes upwards. This explains why, after a phase of early instability and a certain tipping point, many countries’ patent systems expanded quickly and eventually became a rigid standard (“lock-in”). The concluding section raises the question of what regulatory measures are suitable to effectively address the ensuing anticommons effects.
The long-standing battle between economic nationalism and globalism has again taken center stage in geopolitics. This article applies this dichotomy to the law and policy of international intellectual property (IP). Most commentators see IP as a prime example of globalization. The article challenges this view on several levels. In a nutshell, it claims that economic nationalist concerns about domestic industries and economic development lie at the heart of the global IP system. To support this argument, the article summarizes and categorizes IP policies adopted by selected European countries, the European Union, and the U.S. Section I presents three types of inbound IP policies that aim to foster local economic development and innovation. Section II adds three versions of outbound IP policies that, in contrast, target foreign countries and markets. Concluding section III traces a dialectic virtuous circle of economic nationalist motives leading to global legal structures and identifies the function and legal structure of IP as the reason for the resilience and even dominance of economic nationalist motives in international IP politics. IP concerns exclusive private rights that are territorially limited creatures of (supra-)national statutes. These legal structures make up the economic nationalist DNA of IP.
Using a structural life-cycle model, we quantify the long-term impact of school closures during the Corona crisis on children affected at different ages and coming from households with different parental characteristics. In the model, public investment through schooling is combined with parental time and resource investments in the production of child human capital at different stages in the children's development process. We quantitatively characterize both the long-term earnings consequences on children from a Covid-19 induced loss of schooling, as well as the associated welfare losses. Due to self-productivity in the human capital production function, skill attainment at a younger stage of the life cycle raises skill attainment at later stages, and thus younger children are hurt more by the school closures than older children. We find that parental reactions reduce the negative impact of the school closures, but do not fully offset it. The negative impact of the crisis on children's welfare is especially severe for those with parents with low educational attainment and low assets. The school closures themselves are primarily responsible for the negative impact of the Covid-19 shock on the long-run welfare of the children, with the pandemic-induced income shock to parents playing a secondary role.
Central banks unexpectedly tightening policy rates often observe the exchange value of their currency depreciate, rather than appreciate as predicted by standard models. We document this for Fed and ECB policy days using event studies and ask whether an information effect, where the public attributes the policy surprise to an unobserved state of the economy that the central bank is signaling by its policy, may explain the anomaly. It turns out that many informational assumptions make a standard two-country New Keynesian model match this behavior. To identify the particular mechanism, we condition on multiple asset prices in the event study and model implications for these. We find that there is heterogeneity in this dimension in the event study and no model with a single regime can match the evidence. Further, even after conditioning on possible information effects driving longer-term interest rates, there appear to be other drivers of exchange rates. Our results show that existing models have a long way to go in reconciling event study analysis with model-based mechanisms of asset pricing.
The Multilingual Assessment Instrument for Narratives (MAIN) is part of LITMUS (Language Impairment Testing in Multilingual Settings). LITMUS is a battery of tests that have been developed in connection with the COST Action IS0804 Language Impairment in a Multilingual Society: Linguistic Patterns and the Road to Assessment (2009−2013).
In these volumes, we are very pleased to present a collection of papers based on talks and posters at Sinn und Bedeutung 22, which took place in Berlin and Potsdam on September 7-10, 2017, jointly organized by the Leibniz-Centre for General Linguistics (ZAS) and the University of Potsdam.
SuB22 received 183 submitted abstracts. Out of these, the organizing committee selected 39 oral presentations in the main session, 4 oral presentations in the special session ‘Semantics and Natural Logic’, and 24 poster presentations. There were an additional 6 invited talks. In total, 58 of these contributions appear in paper form in the present volumes.
Past research suggests that international real estate markets show return characteristics and interrelationships with other asset classes which probably qualify them as an interesting component of national and international asset allocation decisions. However, the special characteristics of real estate assets are quite distinct from those of financial assets, such as stocks and bonds. This is also the case for real estate return distributions. Therefore, the proper integration of real estate markets into asset allocation decisions requires a profound understanding of real estate returns’ distributional characteristics.
Because of the particular characteristics of real estate, representing real estate markets through a reliable time series is a complex task. Consequently, reliable real estate indices with a sufficiently long history in major international real estate markets are only scarcely available. Most of the research on real estate returns has been done for the U.K. and U.S., where eligible indices exist. In other important real estate markets, such as Germany, by contrast, little or no research has been performed.
In this analysis, the methodology of Maurer, Sebastian and Stephan (2000) for indirectly deriving an appraisal-based index for the German commercial real estate market will be applied. This approach is based solely on publicly available data from German open-ended real estate investment trusts. It could also provide a solution for deriving reliable real estate time series for other markets.
We will extend previous analyses for the U.K. and U.S. to provide additional fundamental insights into the return characteristics of the German commercial real estate market. Beyond univariate considerations, the main focus is on the interrelationships between various international real estate markets, as well as between those respective markets and the international stock and bond markets.
The classical approaches to asset allocation lead to very different conclusions about how large a share of foreign stocks a US investor should hold. US investors should either allocate a large portion of about 40% to foreign stocks (the result of mean/variance optimization and the international CAPM) or hold no foreign stocks at all (the conclusion of the domestic CAPM and of mean/variance spanning tests). There is no middle ground.
The idea of the Bayesian approach discussed in this article is to shrink the mean/variance efficient portfolio towards the market portfolio. The shrinkage effect is determined by the investor's prior belief in the efficiency of the market portfolio and by the degree of violation of the CAPM in the sample. Interestingly, this Bayesian approach leads to the same implications for asset allocation as the mean-variance/tracking error criterion. In both cases, the optimal portfolio is a combination of the market portfolio and the mean/variance efficient portfolio with the highest Sharpe ratio.
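The shrinkage combination described above can be written as w = (1 − δ)·w_MV + δ·w_mkt, where δ reflects the strength of the prior belief in market efficiency. A minimal sketch with hypothetical two-asset weights; in the paper the mixing weight follows from the Bayesian posterior rather than being chosen freely.

```python
def shrink_portfolio(w_mv, w_mkt, delta):
    """Convex combination of the sample mean/variance (tangency)
    portfolio and the market portfolio; delta in [0, 1] is the
    shrinkage intensity toward the market portfolio."""
    return [(1.0 - delta) * a + delta * b for a, b in zip(w_mv, w_mkt)]

w_mv = [0.6, 0.4]    # hypothetical sample-efficient weights [US, foreign]
w_mkt = [1.0, 0.0]   # prior: fully home-biased US market portfolio
w_opt = shrink_portfolio(w_mv, w_mkt, delta=0.5)   # [0.8, 0.2]
```

A stronger prior belief in market efficiency (larger δ) pulls the allocation further toward the home-biased market portfolio, which is exactly the mechanism that can rationalize the observed home bias.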
Applying both approaches to the subject of international diversification, we find that a substantial home bias is only justified when a US investor has a strong belief in the global mean/variance efficiency of the US market portfolio and a high regret aversion to falling behind the US market portfolio. We also find that the current level of home bias can be justified whenever regret aversion is significantly higher than risk aversion.
Finally, we compare the Bayesian approach of shrinking the mean/variance efficient portfolio towards the market portfolio to another Bayesian approach which shrinks the mean/variance efficient portfolio towards the minimum-variance portfolio. An empirical out-of-sample study shows that both Bayesian approaches lead to a clearly superior performance compared to the classical mean/variance efficient portfolio.
Predictability and the cross-section of expected returns: a challenge for asset pricing models
(2020)
Many modern macro finance models imply that excess returns on arbitrary assets are predictable via the price-dividend ratio and the variance risk premium of the aggregate stock market. We propose a simple empirical test for the ability of such a model to explain the cross-section of expected returns by sorting stocks based on the sensitivity of expected returns to these quantities. Models with only one uncertainty-related state variable, like the habit model or the long-run risks model, cannot pass this test. However, even extensions with more state variables mostly fail. We derive criteria models have to satisfy to produce expected return patterns in line with the data and discuss various examples.
The possibility to investigate the impact of news on stock prices has evolved strongly thanks to the recent use of natural language processing (NLP) in finance and economics. In this paper, we investigate COVID-19 news, processed with the "Natural Language Toolkit", which uses machine learning models to extract the news’ sentiment. We consider the period from January to June 2020 and analyze 203,886 online articles that deal with the pandemic and that were published on three platforms: MarketWatch.com, Reuters.com and NYtimes.com. Our findings show that there is a significant and positive relationship between sentiment score and market returns. This result indicates that an increase (decrease) in the sentiment score implies a rise in positive (negative) news and corresponds to positive (negative) market returns. We also find that the variance of the sentiments and the volume of the news sources for Reuters and MarketWatch, respectively, are negatively associated with market returns, indicating that an increase in the uncertainty of the sentiment and an increase in the arrival of news have an adverse impact on the stock market.
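As a stylized illustration of lexicon-based sentiment scoring, here is a toy scorer in plain Python. It is a simplified stand-in, not the actual NLTK pipeline used in the paper, and the word lists are hypothetical.

```python
# Toy lexicon-based sentiment score: (#positive - #negative) / #words.
# Word lists are hypothetical and far smaller than any real lexicon.
POSITIVE = {"recovery", "gain", "vaccine", "hope"}
NEGATIVE = {"lockdown", "death", "crash", "fear"}

def sentiment_score(text):
    """Return a score in [-1, 1]; 0.0 for empty text."""
    words = text.lower().split()
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

score = sentiment_score("vaccine hope lifts markets after crash")
```

Averaging such scores over all articles in a day yields the daily sentiment series that can then be related to market returns.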
Using experimental data from a comprehensive field study, we explore the causal effects of algorithmic discrimination on economic efficiency and social welfare. We harness economic, game-theoretic, and state-of-the-art machine learning concepts, allowing us to overcome the central challenge of missing counterfactuals, which generally impedes assessing the economic downstream consequences of algorithmic discrimination. This way, we are able to precisely quantify downstream efficiency and welfare ramifications, which provides us with a unique opportunity to assess whether the introduction of an AI system is actually desirable. Our results highlight that AI systems’ capabilities in enhancing welfare critically depend on the degree of inherent algorithmic bias. While an unbiased system in our setting outperforms humans and creates substantial welfare gains, the positive impact steadily decreases and ultimately reverses the more biased an AI system becomes. We show that this relation is particularly concerning in selective-labels environments, i.e., settings where outcomes are only observed if decision-makers take a particular action so that the data is selectively labeled, because commonly used technical performance metrics like the precision measure are prone to be deceptive. Finally, our results show that continued learning, by creating feedback loops, can remedy algorithmic discrimination and its associated negative effects over time.
In this paper we adopt the Hamiltonian Monte Carlo (HMC) estimator for DSGE models by implementing it in a state-of-the-art, freely available high-performance software package. We estimate a small-scale textbook New-Keynesian model and the Smets-Wouters model on US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm, which permits the estimation of DSGE models with ill-behaved posterior densities.
Incentivized experiments in which individuals receive monetary rewards according to the outcomes of their decisions are regarded as the gold standard for preference elicitation in experimental economics. These task-related real payments are considered necessary to reveal subjects' "true preferences". Using a systematic, large-sample approach with three subject pools of private investors, professional investors, and students, we test the effect of task-related monetary incentives on risk preferences elicited in four standard experimental tasks. We find no systematic differences in behavior between subjects in the incentivized and non-incentivized regimes. We discuss implications for academic research and for applications in the field.
The present paper studies the diversification potential of integrating indirect real estate investments into international portfolios. To this end, monthly index-return time series from January 1985 to December 1998 for real estate investment companies as well as common stocks and bonds in Germany, France, Switzerland, Great Britain and the USA were used. Because of the critical normal-distribution assumption, we utilize a mean/lower-partial-moment framework. To take into account the influence of currency risk on international investments, the analyses have been undertaken both with and without hedging the currency risk. We take the viewpoint of a German as well as that of a US investor to gain insight into how the diversification potential depends on the investor’s reference currency.
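The lower partial moment replaces variance as the risk measure by penalizing only returns below a target, which is why it does not rely on the normal-distribution assumption. A minimal sketch with hypothetical sample returns:

```python
def lower_partial_moment(returns, target=0.0, order=2):
    """LPM_n(tau) = average of max(tau - r, 0)**n over the sample:
    unlike variance, only returns below the target contribute."""
    return sum(max(target - r, 0.0) ** order for r in returns) / len(returns)

rets = [0.04, -0.02, 0.01, -0.05, 0.03]   # hypothetical monthly returns
lpm = lower_partial_moment(rets)           # only -0.02 and -0.05 contribute
```

In a mean/LPM framework, portfolios are then ranked by expected return against this downside measure instead of against the standard deviation.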
Access to loans and other financial services is extremely valuable for micro-, small- and medium-sized enterprises in developing and transition countries as it enables their owners as well as their employees to exploit their economic potential and to increase their income. Although this insight has led development aid institutions to undertake many attempts to create sustainable microfinance institutions, only a small fraction of these has been successful so far. This article analyses what determines the success of attempts to provide financial services in general, and credit in particular, to low-income target groups in these countries. We argue that it is crucial to understand, and to mitigate or even eliminate in practice, the serious and numerous incentive problems at the level of the lending operations as well as those at the levels of the human resource management and the governance of microfinance institutions. We attempt to show, moreover, that unsolved incentive problems at only one level will ultimately undermine any potential success at the other levels. In our paper, we first analyse information and incentive problems from a theoretical perspective, using and extending the well-known Stiglitz-Weiss model of credit rationing, and derive theoretical requirements for solutions of these problems. In the light of these considerations, we then discuss how problems are solved in practice. Section 3 deals with the credit relationship. Section 4 extends the argument by showing how incentive problems within the institution can be handled, and section 5 analyses corporate governance-related problems of development finance institutions as incentive problems. In section 6 it is demonstrated why, and how, the incentive problems at the different levels, as well as their solutions, are interrelated.
From this we derive the proposition that, as the institutional devices for dealing with these problems constitute a complementary system, any sustainable solution requires consistent arrangements of all elements and at all levels of the system. In the last section we will show the potential of strategic networks to set up institutions which we consider to be consistent systems for successfully solving the problems at all three levels simultaneously.
Insider trading and portfolio structure in experimental asset markets with a long-lived asset
(1997)
We report results of a series of nine market experiments with asymmetric information and a fundamental value process that is more "realistic" than those in previous experiments. Both a call market institution and a continuous double auction mechanism are employed. We find considerable pricing inefficiencies that are only partially exploited by insiders. The magnitude of insider gains is analyzed separately for each experiment. We find support for the hypothesis that the continuous double auction leads to more efficient outcomes. Finally, we present evidence of an endowment effect: the initial portfolio structure influences the final asset holdings of experimental subjects.
During the last years, issues of strategic management accounting have received widespread attention in the accounting literature. Yet the conceptual foundation of most proposals is not clear. This paper presents a theoretical analysis of one of the most prominent approaches to strategic management accounting, i.e., Target Costing. First, the relationship between Target Costing and Life-Cycle Costing is shown. Secondly, a model based on a mechanism-design approach is used to answer the question of whether the "Market-into-Company" method of Target Costing can somehow be endogenized. The model captures problems of asymmetric information, price policy and cost structures (i.e., learning effects etc.). The analysis shows that the more "strategic" the firm’s cost function is, the less valid "strategic" management accounting becomes in terms of the usual way Target Costing is employed.
The main argument in this paper is that new information and communication technologies (ICT) in the financial industry will increase specialisation and competition within the European financial centre system and thereby lead to a ‘re-bundling’ of functions of the various financial centres. Frankfurt plays an interesting role in this development as it is one of the main development centres for ‘financial technology’. With these technologies, remote access to the Frankfurt stock exchange and inter-bank payment system is now feasible from most European cities. This leads to a reduced need for physical presence, which opens up new possibilities for the financial sector’s spatial organisation. However, as financial production is information- and knowledge-intensive, spatial and other types of proximity between financial actors and clients are still essential in many stages. We examine the value chains of three different products (advisory, lending, trading) with regard to different proximities, in order to identify possible patterns of their spatial (re)organisation. From these findings, inferences are drawn for a ‘new’ role for Frankfurt in the European financial centre system.
Economic theory suggests that a commitment by a firm to increased levels of disclosure should lower the information asymmetry component of the firm’s cost of capital. But while the theory is compelling, so far empirical results relating increased levels of disclosure to measurable economic benefits have been mixed. One explanation for the mixed results among studies using data from firms publicly registered in the US is that, under current US reporting standards, the disclosure environment is already rich. In this paper, we study German firms that have switched from the German to an international reporting regime (IAS or US-GAAP), thereby committing themselves to increased levels of disclosure. We show that proxies for the information asymmetry component of the cost of capital for the switching firms, namely the bid-ask spread and trading volume, behave in the predicted direction compared to firms employing the German reporting regime.
Traditional tests of the CAPM following the Fama/MacBeth (1973) procedure are tests of the joint hypotheses that there is a relationship between beta and realized return and that the market risk premium is positive. The conditional test procedure developed by Pettengill/Sundaram/Mathur (1995) makes it possible to test the hypothesis of a relation between beta and realized returns independently. Monte Carlo simulations show that the conditional test reliably identifies this relation. In an empirical examination for the German stock market we find a significant relation between beta and return. Previous studies probably failed to identify this relationship because the average market risk premium in the sample period was close to zero. Our results provide a justification for the use of betas estimated from historical return data by portfolio managers.
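The conditional test splits the sample by the sign of the realized market excess return and estimates the beta-return relation separately in up and down markets. A stylized sketch with hypothetical numbers:

```python
# Stylized version of the conditional test: the cross-sectional slope
# of returns on beta should be positive in up markets and negative in
# down markets. All figures are hypothetical.

def ols_slope(xs, ys):
    """OLS slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

betas = [0.5, 1.0, 1.5]                 # estimated from historical returns
up_month = [0.02, 0.04, 0.06]           # cross-section in an up market
down_month = [-0.01, -0.03, -0.05]      # cross-section in a down market

up_slope = ols_slope(betas, up_month)       # positive
down_slope = ols_slope(betas, down_month)   # negative
```

Averaging the two slopes unconditionally gives a value near zero, which illustrates how a near-zero sample market risk premium can hide a perfectly systematic beta-return relation.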
In international accounting literature there are various approaches to assess the quality of national accounting systems with respect to specific key functions, e.g. the intensity of capital market information. An empirical approach often used measures the quality of disclosure by ranking the national systems with the so-called "disclosure index" (e.g. Choi 1973, Barrett 1975, Cooke 1992, Taylor/Zarzeski 1996). Concentrating on disclosure regulation rather than accounting practices, Cooke/Wallace 1990 construct an index which measures the "degree of financial regulation". They identify groups of countries which can be clearly classified into highly regulated, regulated, and moderately regulated national accounting systems.
In our analysis, we extend the idea of measuring the degree of financial disclosure regulation into a concept for evaluating the degree of determination of financial measurement. Assuming that a high degree of determination of a national accounting system leads to more comparable accounts than a low degree, the index can be interpreted as a quality measure of national accounting systems with respect to the intensity of capital market information. The following hypothesis is to be tested: the degree of disclosure regulation equals the degree of measurement regulation in order to serve the information needs of the national capital markets.
Three groups with different degrees of determination of national accounting systems can be readily identified and are compared to the results of Cooke/Wallace. For some of the national systems the above hypothesis seems appropriate, whereas for others opposing results can be shown. Possible explanations that can be causally related to these diverging results are presented. They are based on historical developments, on the differentiation between rules for individual and group accounts, and on conditions under which different degrees seem plausible.
This paper provides a detailed empirical analysis of the call auction procedure on the German stock exchanges. The auction is conducted by the Makler, whose position resembles that of a NYSE specialist. We use a dataset which contains information about all individual orders for a sample of stocks traded on the Frankfurt Stock Exchange (FSE). This sample allows us to calculate the costs of transacting in a call market and compare them to the costs of transacting in a continuous market. We find that transaction costs for small transactions in the call market are lower than the quoted spread in the order book of the continuous market, whereas transaction costs for large transactions are higher than the spread in the continuous market.
We further address the question whether active participation of the Makler is advantageous. On the one hand he may accommodate order imbalances, increase the liquidity of the market and stabilize prices. On the other hand, the discretion in price setting gives him an incentive to manipulate prices. This may increase return volatility. Our dataset identifies the trades the Maklers make for their own accounts. We eliminate these trades and determine the price that would have obtained without their participation. Comparing this hypothetical price series to the actual transaction prices, we find that Makler participation tends to reduce return volatility. A further analysis shows that the actual prices are much closer to the surrounding prices of the continuous trading session than the hypothetical prices that would have obtained without Makler participation. These results indicate that the Maklers provide a valuable service to the market. We further calculate the profits associated with the positions taken by the Maklers and find that, on average, they do not earn profits on the positions they take. Their compensation is thus restricted to the commissions they receive.
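The price determination in a call auction like the one studied above can be illustrated with a toy uniform-price batch auction. The order book below is hypothetical and the volume-maximizing rule is the textbook one, not the Makler's actual discretionary procedure or FSE data.

```python
# Hypothetical order book entries: (limit price, size).
buys  = [(101.0, 300), (100.5, 200), (100.0, 400)]   # bids
sells = [(100.0, 250), (100.5, 300), (101.0, 350)]   # asks

def clearing_price(buys, sells):
    """Return the (price, volume) pair that maximizes executable volume,
    as in a standard uniform-price call auction."""
    candidates = sorted({p for p, _ in buys} | {p for p, _ in sells})
    best = None
    for p in candidates:
        demand = sum(q for bp, q in buys if bp >= p)   # bids willing to pay p or more
        supply = sum(q for sp, q in sells if sp <= p)  # asks willing to sell at p or less
        volume = min(demand, supply)
        if best is None or volume > best[1]:
            best = (p, volume)
    return best

p, v = clearing_price(buys, sells)
print(f"clearing price {p}, executed volume {v}")   # here: 100.5 and 500 shares
```

All orders execute at the single clearing price, which is why the abstract can speak of one transaction cost per auction rather than a bid-ask spread.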
This paper studies the incentives of German firms to voluntarily disclose cash flow statements over time. While cash flow statements are mandated under many GAAP regimes, their disclosure was not mandatory in Germany until recently. Nevertheless, an increasing number of firms provide cash flow statements voluntarily. These firms are likely to be influenced by recommendations of the German accounting profession, IAS 7, as well as the respective standards of other countries. The idea of the paper is to study this influence by looking at the adoption pattern over time and at the format of the cash flow statement. It documents the development of voluntary cash flow statement disclosures by German firms with respect to "milestones" in the evolution of German professional recommendations and the respective international standards. The cross-sectional determinants of voluntary and international cash flow statements are analyzed using probit regressions and factor analysis. The results are generally consistent with the idea that capital-market forces drive voluntary cash flow statements that are in line with international reporting practice.
Our article integrates the manager’s care into the literature on auditor’s liability. With unobservable efforts, we face a double moral hazard setting. It is well-known that efficient liability rules without punitive damages do not exist under these circumstances. However, we show that the problem can be solved through strict liability, contingent auditing fees, and fair insurance contracts. Neither punitive damages nor deductibles above the damages are required.
Discretionary disclosure theory suggests that firms' incentives to provide proprietary versus nonproprietary information differ markedly. To test this conjecture, the paper investigates the incentives of German firms to voluntarily disclose business segment reports and cash flow statements in their annual financial reports. While the former is likely to reveal proprietary information to competitors, the latter is less proprietary in nature. Using these proxies for proprietary and non-proprietary disclosures, respectively, I find that the determinants or at least their relative magnitudes differ in a way consistent with the proprietary cost hypothesis. That is, cash flow statement disclosures appear to be governed primarily by capital-market considerations, whereas segment disclosures are more strongly associated with proxies for product-market and proprietary-cost considerations.
Compliance with prevailing accounting standards is induced if the expected disadvantage from sanctions imposed when non-compliance is detected outweighs the advantage of non-compliant accounting choices. This expected disadvantage constitutes the threat potential of sanctions imposed by an enforcement agency. The capital market mechanism unfolds an important threat potential if companies expect an adverse share price reaction following enforcement actions. Enforcement agencies in turn can make use of this capital-market-related sanction by releasing information on defections to the market after the settlement of an investigation. The present contribution analyses the capital market reaction to accounting standards enforcement activities of the British Financial Reporting Review Panel (FRRP). After a brief introduction to the legal basis and working procedure of the Panel, the analysis of its activities serves a dual purpose: first, the significance of capital-market-related sanctions for the overall enforcement regime is elaborated upon; second, the extent to which capital-market-related sanctions accomplish their function within the overall enforcement regime is assessed empirically. The results of the empirical analysis suggest that the capital-market-related sanctioning by the FRRP may not unfold a threat potential sufficient for compliance enhancement.
This paper investigates whether firms employing IAS or US GAAP exhibit measurable differences in proxies for information asymmetry and market liquidity. Sample firms are drawn from the "New Market" at the Frankfurt Stock Exchange. All firms listed in this market segment are required to provide financial statements in accordance with either IAS or US GAAP as part of the listing agreement. The sample choice provides a market-based comparison of the two standards holding disclosure requirements and standard enforcement constant. I find that differences in the bid-ask spread and trading volume are relatively small and more likely to be driven by firm characteristics than by the choice of accounting standards. In contrast, New Market firms have lower spreads and higher turnover when compared with size-matched firms in other market segments following German GAAP. The results suggest that the rigid disclosure regulation of the New Market matters in terms of information asymmetry and liquidity, but that the choice between IAS and US GAAP is of second-order importance.
JEL Classification: D82, G30, M41
Discussions regarding the planned European Deposit Insurance Scheme (EDIS), the missing third pillar of the European Banking Union, have been ongoing since the Commission published its initial legislative proposal in 2015. A breakthrough in negotiations has yet to be achieved. The gridlock on EDIS is most commonly attributed to moral hazard concerns over insufficient risk reduction harboured on the side of northern member states, particularly Germany, due to the weak state of some other member states’ banking sectors. While moral hazard based on uneven risk reduction is helpful for explaining divergent member-state preferences on the scope of necessary risk reduction, this does not explain preferences on the institutional design of EDIS. In this paper, we argue that contrary to persistent differences on necessary risk reduction, preferences regarding the institutional design of EDIS have become more closely aligned. We analyse how preferences on EDIS developed in the key member states of Germany, France, and Italy. In all sampled countries, we find path-dependent benefits connected to the current design of national Deposit Guarantee Schemes (DGS) that shifted preferences of the banking sector or significant subsectors in favour of retaining national DGSs. Overall, given that a compromise on risk-reduction can be accomplished, we argue that current preferences in these key member states provide an opportunity to implement EDIS in the form of a reinsurance system that maintains national DGSs in combination with a supranational fund.
Hope and reasons
(2020)
This paper argues that hope can be understood as an attitude or an attitudinal complex that is partially sensitive to reasons. One way that an attitude is sensitive to reasons is that it is permitted given the reasons available. A second way in which an attitude is sensitive to reasons is that it might be required in light of available reasons. This paper argues that hope may be permitted by the available reasons, and although it is sometimes good or praiseworthy to hope, hope is never categorically required. In that sense, hope is partially sensitive to reasons.
We employ a representative sample of 80,972 Italian firms to forecast the drop in profits and the equity shortfall triggered by the COVID-19 lockdown. A 3-month lockdown generates an aggregate yearly drop in profits of about 10% of GDP, and 17% of sample firms, which employ 8.8% of the sample’s employees, become financially distressed. Distress is more frequent for small and medium-sized enterprises, for firms with high pre-COVID-19 leverage, and for firms belonging to the Manufacturing and Wholesale Trading sectors. Listed companies are less likely to enter distress, whereas the correlation between distress rates and family firm ownership is unclear.
(JEL G01, G32, G33)
We analyze the ESG rating criteria used by prominent agencies and show that there is a lack of commonality in the definition of ESG (i) characteristics, (ii) attributes and (iii) standards in defining the E, S and G components. We provide evidence that heterogeneity in rating criteria can lead agencies to have opposite opinions on the same evaluated companies and that agreement across those providers is substantially low. These alternative definitions of ESG also affect sustainable investments, leading to the identification of different investment universes and consequently to the creation of different benchmarks. This implies that in the asset management industry it is extremely difficult to measure the ability of a fund manager if financial performances are strongly conditioned by the chosen ESG benchmark. Finally, we find that the disagreement in the scores provided by the rating agencies disperses the effect of the preferences of ESG investors on asset prices, to the point that even when there is agreement, it has no impact on financial performances.
Advertising arbitrage
(2020)
Arbitrageurs with a short investment horizon gain from accelerating price discovery by advertising their private information. However, advertising many assets may overload investors' attention, reducing the number of informed traders per asset and slowing price discovery. So arbitrageurs optimally concentrate advertising on just a few assets, which they overweight in their portfolios. Unlike classic insiders, advertisers prefer assets with the least noise trading. If several arbitrageurs share information about the same assets, inefficient equilibria can arise, where investors' attention is overloaded and substantial mispricing persists. When they do not share, the overloading of investors' attention is maximal.
In fifteen European countries, China, and the US, stocks and business equity as a share of total household assets are represented by an increasing and convex function of income/wealth. A parsimonious model fitted to the data shows why background labor-income risk can explain much of this risk-taking pattern. Uncontrollable labor-income risk stresses middle-income households more because labor income is a larger fraction of their total lifetime resources compared with the rich. In response, middle-income households reduce (controllable) financial risk. Richer households, having less pressure, can afford more risk-taking. The poor take low risk because they avoid jeopardizing their subsistence consumption.
The Wirecard scandal is a wake-up call alerting German politics to the importance of securities market integrity. The role of market supervision is to ensure the smooth functioning of capital markets and their integrity, creating trust among and acceptance by investors locally and globally. The existing patchwork of national supervisory practice in Europe is under discussion today, in the wake of Brexit that will end the role of London as a de facto lead supervisor in stock and bond markets. A fundamental overhaul of a fragmented securities markets supervisory regime in Europe would offer the potential to lead to the establishment of an independent European Single Market Supervisor (ESMS). Endowed with strong enforcement powers, and supported by the existing national agencies, the ESMS would be entrusted with ensuring a uniform market standard as to transparency and other issues of market integrity across Europe. This would not rule out maintaining a variety of market organization structures at the national level. The ESMS would need executive powers in the world of markets (i.e. securities and trading), much like the SSM in the world of banking. To fill this new role, the ESMS would have to be established as a new, independent institution, including a staff scaled up enormously compared with, e.g., ESMA.
Accounting for financial stability: Bank disclosure and loss recognition in the financial crisis
(2020)
This paper examines banks’ disclosures and loss recognition in the financial crisis and identifies several core issues for the link between accounting and financial stability. Our analysis suggests that, going into the financial crisis, banks’ disclosures about relevant risk exposures were relatively sparse. Such disclosures came later after major concerns about banks’ exposures had arisen in markets. Similarly, the recognition of loan losses was relatively slow and delayed relative to prevailing market expectations. Among the possible explanations for this evidence, our analysis suggests that banks’ reporting incentives played a key role, which has important implications for bank supervision and the new expected loss model for loan accounting. We also provide evidence that shielding regulatory capital from accounting losses through prudential filters can dampen banks’ incentives for corrective actions. Overall, our analysis reveals several important challenges if accounting and financial reporting are to contribute to financial stability.
Using a nonlinear Bayesian likelihood approach that fully accounts for the zero lower bound on nominal interest rates, the authors analyze US post-crisis business cycle dynamics and provide reference parameter estimates. They find that neither the inclusion of financial frictions nor that of household heterogeneity improves the empirical fit of the standard model or its ability to provide a joint explanation for the post-2007 dynamics. Associated financial shocks mispredict an increase in consumption. The common practice of omitting the ZLB period in the estimation severely distorts the analysis of the more recent economic dynamics.
Do current levels of bank capital in Europe suffice to support a swift recovery from the COVID-19 crisis? Recent research shows that a well-capitalized banking sector is a major factor driving the speed and breadth of recoveries from economic downturns. In particular, loan supply is negatively affected by low levels of capital. We estimate a capital shortfall in European banks of up to 600 billion euro in a severe scenario, and around 143 billion euro in a moderate scenario. We propose a precautionary recapitalization on the European level that puts the European Stability Mechanism (ESM) center stage. This proposal would cut through the sovereign-bank nexus, safeguard financial stability, and position the Eurozone for a quick recovery from the pandemic.
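The kind of shortfall figure quoted above rests on simple balance-sheet arithmetic: capital remaining after stress losses is compared with the capital required at a target ratio on risk-weighted assets. The sketch below uses invented bank figures and an assumed 10% post-stress CET1 target, not the paper's data or calibration.

```python
# Stylized capital-shortfall arithmetic. Units: billions of euros.
banks = [
    # (CET1 capital, risk-weighted assets, stress-scenario loss)
    (40.0, 350.0, 12.0),
    (25.0, 280.0, 15.0),
    (60.0, 520.0,  9.0),
]
TARGET_RATIO = 0.10   # assumed post-stress CET1 requirement

def total_shortfall(banks, target=TARGET_RATIO):
    short = 0.0
    for cet1, rwa, loss in banks:
        stressed = cet1 - loss                  # capital left after the stress loss
        required = target * rwa                 # capital needed at the target ratio
        short += max(0.0, required - stressed)  # only deficits add to the shortfall
    return short

print(f"aggregate shortfall: {total_shortfall(banks):.1f} bn EUR")
```

Note that surpluses at well-capitalized banks do not offset deficits elsewhere, which is why the aggregate shortfall is a sum of bank-level maxima rather than a system-wide net position.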
This article provides a proposal to use IMF Article VIII, Section 2 (b) to establish a binding mechanism on private creditors for a sovereign debt standstill. The proposal builds on the original idea by Whitney Deveboise (1984). Using arguments brought forward by confidential IMF staff papers (1988, 1996) and the IMF General Counsel (1988), this paper shows how an authoritative interpretation of Article VIII, Section 2 (b) can provide protection from litigation to countries at risk of debt distress.
The envisaged mechanism presents several advantages over recent proposals for a binding standstill mechanism, such as the International Developing Country Debt Authority (IDCDA) by UNCTAD and a Central Credit Facility (CCF) by the Bolton Committee. First, this approach would not require the creation of new intergovernmental mechanisms or facilities. Second, the activation of the standstill mechanism can be set in motion by any IMF member country and does not require a modification of its Articles of Agreement. Third, debtor countries acting in good faith under an IMF program would be protected from aggressive litigation strategies from holdout creditors in numerous jurisdictions, including the US and the UK. Fourth, courts in key jurisdictions would avoid becoming overburdened by a cascade of sovereign debt litigation covering creditors and debtors across the globe. Fifth, private creditors would receive uniform treatment, ensuring intercreditor equality. Sixth and last, the mechanism would provide additional safeguards to protect emergency multilateral financing provided to tackle Covid-19.
Using a novel experimental design, I test how exposure to information about a group’s relative performance causally affects the members’ level of identification and thereby their propensity to harm affiliates of comparison groups. I find that being informed about either a high or a poor relative performance of the ingroup similarly fosters identification. Stronger ingroup identification creates increased hostility against the group of comparison. In cases where participants learn about poor relative performance, there appears to be a direct level effect that additionally elevates hostile discrimination. My findings shed light on a specific channel through which social media may contribute to intergroup fragmentation and polarization.
Did the Federal Reserve’s Quantitative Easing (QE) in the aftermath of the financial crisis have macroeconomic effects? To answer this question, the authors estimate a large-scale DSGE model over the sample from 1998 to 2020, including data on the Fed’s balance sheet. The authors allow for QE to affect the economy via multiple channels that arise from several financial frictions. Their nonlinear Bayesian likelihood approach fully accounts for the zero lower bound on nominal interest rates. They find that between 2009 and 2015, QE increased output by about 1.2 percent. This reflects a net increase in investment of nearly 9 percent that was accompanied by a 0.7 percent drop in aggregate consumption. Both government bond and capital asset purchases were effective in improving financing conditions. Capital asset purchases in particular significantly facilitated new investment and increased production capacity. Against the backdrop of a fall in consumption, supply-side effects dominated, which led to a mild disinflationary effect of about 0.25 percent annually.
This Policy White Paper assesses several main elements of ECB’s upcoming review of its monetary policy strategy, announced in January 2020. Four aspects of the review are discussed in detail: i) ECB’s definition of price stability and the arguments for and against inflation targeting; ii) the scope of ECB’s objectives, considering financial stability, employment and the sustainability of the environment; iii) an update of ECB’s economic and monetary analyses to assess the risks to price stability; iv) the ECB’s communication practice. Furthermore, an overview of the ECB’s monetary policy strategy and its last evaluation in 2003 is given.