Working Paper
The FOMC risk shift
(2021)
We identify a component of monetary policy news that is extracted from high-frequency changes in risky asset prices. These surprises, which we call “risk shifts”, are uncorrelated with, and therefore complementary to, risk-free rate surprises. We show that (i) risk shifts capture the lion’s share of stock price movements around FOMC announcements; (ii) they are accompanied by significant investor fund flows, suggesting that investors react heterogeneously to monetary policy news; and (iii) price pressure amplifies the stock market response to monetary policy news. Our results imply that central bank information effects are overshadowed by short-term dynamics stemming from investor rebalancing activities and are likely to be more difficult to identify than previously thought.
Broad, long-term financial and economic datasets are a scarce resource, particularly in the European context. In this paper, we present an approach for an extensible data model, i.e. one adaptable to future changes in technologies and sources, that may constitute a basis for digitized and structured long-term historical datasets. The data model covers the specific peculiarities of historical financial and economic data and is flexible enough to accommodate data of different types (quantitative as well as qualitative) from different historical sources, hence achieving extensibility. Furthermore, based on historical German company and stock market data, we discuss a relational implementation of this approach.
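A minimal sketch of what such an extensible relational design could look like: a generic observation table links securities to dated values of any type, quantitative or qualitative, and every row records its historical source. All table and column names below are illustrative assumptions, not the paper's actual schema.

```python
import sqlite3

# Hypothetical extensible schema for historical financial data (illustration only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE source (
    source_id   INTEGER PRIMARY KEY,
    description TEXT NOT NULL          -- e.g. a stock exchange yearbook
);
CREATE TABLE security (
    security_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE observation (
    obs_id      INTEGER PRIMARY KEY,
    security_id INTEGER REFERENCES security(security_id),
    source_id   INTEGER REFERENCES source(source_id),
    obs_date    TEXT NOT NULL,         -- ISO date as recorded historically
    obs_type    TEXT NOT NULL,         -- 'price', 'dividend', 'note', ...
    value_num   REAL,                  -- quantitative observations
    value_text  TEXT                   -- qualitative observations
);
""")
cur.execute("INSERT INTO source VALUES (1, 'Example exchange yearbook 1913')")
cur.execute("INSERT INTO security VALUES (1, 'Example AG')")
cur.execute("INSERT INTO observation VALUES (1, 1, 1, '1913-01-02', 'price', 98.5, NULL)")
cur.execute("INSERT INTO observation VALUES (2, 1, 1, '1913-01-02', 'note', NULL, 'ex dividend')")
rows = cur.execute(
    "SELECT obs_type, value_num, value_text FROM observation WHERE security_id = 1"
).fetchall()
print(rows)  # quantitative and qualitative data live in one structure
```

New observation types or sources can be added without altering the schema, which is the sense in which such a design is extensible.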
The so-called Troika, consisting of the EU-Commission, the European Central Bank (ECB) and the International Monetary Fund (IMF), was supposed to support the member states of the euro area which had been hit hard by a sovereign debt crisis. For that purpose, economic adjustment programs were drafted and monitored in order to prevent the break-up of the euro area and sovereign defaults. The cooperation of these institutions, which was born out of necessity, has been partly successful, but has also created persistent problems. With the further increase of public debt, especially in France and Italy, the danger of a renewed crisis in the euro area was growing. The European Stability Mechanism (ESM) together with the European Commission will replace the Troika in the future, following decisions of the EU Summit of December 2018. It shall play the role of a European Monetary Fund in the event of a crisis. The IMF, on the other hand, will no longer play an active role in solving sovereign debt crises in the euro area. The current course is, however, inadequate to tackle the core problems of the euro area and to avoid future crises, which are mainly structural in nature and due to escalating public debt and lack of international competitiveness of some member countries. The current Corona crisis will aggravate the institutional problems. It has led to a common European fiscal response ("Next Generation EU"). This rescue and recovery program will not be financed by ESM resources and will not be monitored by the ESM. One important novelty of this package is that it involves the issuance of substantial common European debt.
Despite the increasing use of cashless payment instruments, the notion that cash loses importance over time can be unambiguously refuted. On the contrary, the authors show that cash demand has increased steeply over the past 30 years. This is not only true on a global scale, but also for the most important currencies in advanced countries (USD, EUR, CHF, GBP and JPY). In this paper, they focus especially on the role of different crises (technological crises, financial market crises, natural disasters) and analyse the demand for small and large banknote denominations since the 1990s from an international perspective. It is evident that cash demand always increases in times of crises, independent of the nature of the crisis itself. However, largely unaffected by crises, we observe a trend increase in global cash demand, aligned with a shift from transaction balances towards more hoarding, especially in the form of large-denomination banknotes.
Although the elderly are more vulnerable to COVID-19, the empirical evidence suggests that they do not behave more cautiously in the pandemic than younger individuals. This theoretical model argues that some individuals might not comply with the COVID-19 measures to reassure themselves that they are not vulnerable, and that the incentives for such self-signaling can be stronger for the elderly. The results suggest that communication strategies emphasizing the dangers of COVID-19 could backfire and reduce compliance among the elderly.
We study risk taking in a panel of subjects in Wuhan, China - before, during the COVID-19 crisis, and after the country reopened. Subjects in our sample traveled for semester break in January, generating variation in exposure to the virus and quarantine in Wuhan. Higher exposure leads subjects to reduce planned risk taking, risky investments, and optimism. Our findings help unify existing studies by showing that aggregate shocks affect general preferences for risk and economic expectations, while heterogeneity in experience further affects risk taking through beliefs about individuals’ own outcomes such as luck and sense of control.
JEL Classification: G50, G51, G11, D14, G41
The authors embed human capital-based endogenous growth into a New-Keynesian model with search and matching frictions in the labor market and skill obsolescence from long-term unemployment. The model can account for key features of the Great Recession: a decline in productivity growth, the relative stability of inflation despite a pronounced fall in output (the "missing disinflation puzzle"), and a permanent gap between output and the pre-crisis trend output.
In the model, lower aggregate demand raises unemployment and the training costs associated with skill obsolescence. Lower employment hinders learning-by-doing, which slows down human capital accumulation, feeding back into even fewer vacancies than justified by the demand shock alone. These feedback channels mitigate the disinflationary effect of the demand shock while amplifying its contractionary effect on output. The temporary growth slowdown translates into output hysteresis (permanently lower output and labor productivity).
Occasionally binding constraints have become an important part of economic modelling, especially since western central banks see themselves (again) constrained by the so-called zero lower bound (ZLB) of the nominal interest rate. A binding ZLB constraint poses a major problem for a quantitative-structural analysis: linear solution methods do not work in the presence of a non-linearity such as the ZLB, and existing alternatives tend to be computationally demanding. The need to study macroeconomic questions related to the Great Recession and the Covid-19 crisis in a quantitative-structural framework requires algorithms that are not only accurate, but also robust, fast, and computationally efficient.
A particularly important application where efficient and fast methods for occasionally binding constraints (OBCs) are needed is the Bayesian estimation of macroeconomic models. This paper shows that a linear dynamic rational expectations system with OBCs can, depending on the expected duration of the constraint, be represented in closed form. Combined with a set of simple equilibrium conditions, this can be exploited to avoid matrix inversions and simulations at runtime for significant gains in computational speed.
Central banks sometimes evaluate their own policies. To assess the inherent conflict of interest, the authors compare the research findings of central bank researchers and academic economists regarding the macroeconomic effects of quantitative easing (QE). They find that central bank papers report larger effects of QE on output and inflation. Central bankers are also more likely to report significant effects of QE on output and to use more positive language in the abstract. Central bankers who report larger QE effects on output experience more favorable career outcomes. A survey of central banks reveals substantial involvement of bank management in research production.
Empirical estimates of equilibrium real interest rates are so far mostly limited to advanced economies, since no statistical procedure suitable for a large set of countries is available. This is surprising, as equilibrium rates have strong policy implications in emerging markets and developing economies as well; current estimates of the global equilibrium rate rely on only a few countries; and estimates for a more diverse set of countries can improve understanding of the drivers. The authors propose a model and estimation strategy that decompose ex ante real interest rates into a permanent and transitory component even with short samples and high volatility. This is done with an unobserved component local level stochastic volatility model, which is used to estimate equilibrium rates for 50 countries with Bayesian methods.
Equilibrium rates were lower in emerging markets and developing economies than in advanced economies in the 1980s, similar in the 1990s, and have been higher since 2000. In line with economic integration and rising global capital markets, synchronization has been rising over time and is higher among advanced economies. Equilibrium rates of countries with stronger trade linkages and similar demographic and economic trends are more synchronized.
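A stripped-down sketch of the permanent/transitory decomposition behind this approach: a local level model, y_t = mu_t + e_t with mu_t = mu_{t-1} + u_t, filtered with a Kalman filter. The paper's model additionally features stochastic volatility and Bayesian estimation; the constant variances below are illustrative assumptions.

```python
import numpy as np

# Simulate an observed "real rate" as a random-walk trend plus noise,
# then recover the trend (the equilibrium-rate analogue) by Kalman filtering.
rng = np.random.default_rng(0)
T, var_e, var_u = 300, 1.0, 0.1
mu = np.cumsum(np.sqrt(var_u) * rng.standard_normal(T))   # permanent component
y = mu + np.sqrt(var_e) * rng.standard_normal(T)          # observed series

a, p = 0.0, 1e6          # diffuse initial state
trend = np.empty(T)
for t in range(T):
    f = p + var_e                 # prediction-error variance
    k = p / f                     # Kalman gain
    a = a + k * (y[t] - a)        # filtered level estimate
    p = p * (1 - k) + var_u       # one-step-ahead state variance
    trend[t] = a

corr = np.corrcoef(trend, mu)[0, 1]
print(round(corr, 2))  # filtered trend tracks the permanent component closely
```

The same recursion works with short samples, which is the practical point for emerging-market data; adding stochastic volatility replaces the fixed variances with time-varying ones.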
With the second wave of the Covid-19 pandemic in full swing, banks face a challenging environment. They will need to address disappointing results and adverse balance sheet restatements, whose intensity depends on the evolution of the euro area economies. At the same time, vulnerable banks reinforce real economy deficiencies. The contribution of this paper is a comparative assessment of the various policy responses to a looming banking crisis. Such a crisis will fully materialize when non-performing assets drag down banks simultaneously, raising the specter of a full-blown systemic crisis. The available policy responses range from forbearance and recapitalization (with public or private resources) to asset separation (bad banks, at national or EU level) and debt conversion schemes. We evaluate these responses according to a set of five criteria that define the efficacy of each. In practice, these responses are not mutually exclusive, nor have they ever been. They may also go hand in hand with other restructuring initiatives, including potential consolidation in the banking sector. Although we do not make a specific recommendation, we provide a framework to guide policymakers in their decision making.
Following the financial crash and the subsequent recession, European policymakers have undertaken major reforms regarding the European Economic and Monetary Union (EMU). Yet, the success rate is mixed. Several reform proposals have either completely failed due to opposition forces or are still pending, sometimes for years. This article provides an overview of reforms in four major policy fields: financial stabilisation, economic governance, fiscal solidarity, and cooperative dissolution. Building on the conceptual foundation of policy analysis, it distinguishes between policy outputs and outcomes. Policy output refers to legislation being adopted or agreement on treaty changes, while policy outcomes depict the result from the implementation process.
This policy white paper shows, using data on European Commission (EC) lobby meetings, that financial institutions and finance trade associations have substantial access to EC policymakers. While lobbying can transfer policy-relevant information and expertise to policymakers, it can also result in the capture of policymakers by the industry, which could harm consumers and taxpayers. How can policymakers prevent regulatory capture while retaining the benefits of sector expertise in policy decisions? Policymakers' awareness of regulatory capture is one of the most important remedies. This paper provides an overview of the origins of regulatory capture theory and recent academic evidence. It shows that regulatory capture can emerge in a variety of institutions and policy areas but is not ubiquitous and depends on the incentives of policymakers and the policy environment. Subsequently, the paper discusses various measures to prevent regulatory capture, such as more transparency, diverse expert groups, and cooling-off periods.
The working paper reflects on the status that "sciences" have held at different points in time, and on the normative orders found in scientific works, as well as on the normative orders imposed by the sciences of a particular place and time on their environment. The latter is also suggested by recent developments concerning the influence (or lack thereof) of scientists on daily life and politics. The paper touches on several fundamental issues in the history of science as a discipline that have been or are still being intensely debated.
“Right to Buy” (RTB), a large-scale natural experiment by which incumbent tenants in public housing could buy properties at heavily subsidised prices, increased the UK homeownership rate by over 10 percentage points between 1980 and the late 1990s. This paper studies its impact on crime, showing that RTB generated significant reductions in property and violent crime that persist to this day. The behavioural changes of incumbent tenants and the renovation of public properties were the main drivers of the crime reduction. This is evidence of a novel means by which subsidised homeownership and housing policy may contribute to reducing criminality.
We derive the Bayes estimator of vectors of structural VAR impulse responses under a range of alternative loss functions. We also derive joint credible regions for vectors of impulse responses as the lowest posterior risk region under the same loss functions. We show that conventional impulse response estimators such as the posterior median response function or the posterior mean response function are not in general the Bayes estimator of the impulse response vector obtained by stacking the impulse responses of interest. We show that such pointwise estimators may imply response function shapes that are incompatible with any possible parameterization of the underlying model. Moreover, conventional pointwise quantile error bands are not a valid measure of the estimation uncertainty about the impulse response vector because they ignore the mutual dependence of the responses. In practice, they tend to understate substantially the estimation uncertainty about the impulse response vector.
This paper examines the advantages and drawbacks of alternative methods of estimating oil supply and oil demand elasticities and of incorporating this information into structural VAR models. I not only summarize the state of the literature, but also draw attention to a number of econometric problems that have been overlooked in this literature. Once these problems are recognized, seemingly conflicting conclusions in the recent literature can be resolved. My analysis reaffirms the conclusion that the one-month oil supply elasticity is close to zero, which implies that oil demand shocks are the dominant driver of the real price of oil. The focus of this paper is not only on correcting some misunderstandings in the recent literature, but on the substantive and methodological insights generated by this exchange, which are of broader interest to applied researchers.
Using a novel dataset, we develop a structural model of the Very Large Crude Carrier (VLCC) market between the Arabian Gulf and the Far East. We study how fluctuations in oil tanker rates, oil exports, shipowner profits, and bunker fuel prices are determined by shocks to the supply and demand for oil tankers, to the utilization of tankers, and to the cost of operating tankers, including bunker fuel costs. Our analysis shows that time charter rates are largely unresponsive to tanker cost shocks. In response to higher costs, voyage profits decline, as cost shocks are only partially passed on to round-trip voyage rates. Oil exports from the Arabian Gulf also decline, reflecting lower demand for VLCCs. Positive utilization shocks are associated with higher profits, a slight increase in time charter rates and lower fuel prices and oil export volumes. Tanker supply and tanker demand shocks have persistent effects on time charter rates, round-trip voyage rates, the volume of oil exports, fuel prices, and profits with the expected sign.
Fiscal policies and household consumption during the COVID-19 pandemic: a review of early evidence
(2020)
We review early evidence on how household consumption behavior has evolved over the pandemic and how different groups of households have responded to fiscal stimulus programs. Due to the scarcity of evidence for Europe, our review focuses on evidence from the US. Notwithstanding the institutional and demographic differences, we highlight generalizable findings and challenges to the design of stimulus policies from the pandemic. In conclusion, we identify several open issues for discussion.
We study the effects of releases from the U.S. Strategic Petroleum Reserve (SPR) within the context of fully specified models of the global oil market that explicitly allow for storage demand as well as unanticipated changes in the SPR. We show that historically, SPR policy interventions, defined as sequences of exogenous SPR shocks during selected periods, have helped stabilize the price of oil. Their effect on the price of oil, however, has been modest. For example, the cumulative effect of the SPR releases after the invasion of Kuwait in 1990 was a reduction of $2/barrel in the real price of oil after 7 months. Whereas emergency drawdowns tend to lower the real price of oil, we find that exchanges tend to raise the real price of oil in the long run. We also provide a detailed analysis of the benefits of the 2018 White House proposal to sell off half of the SPR within the next decade. We show that the expected fiscal benefits of this plan are somewhat higher than the revenue of $16.6 billion projected by the White House.
There has been much interest in the relationship between the price of crude oil, the value of the U.S. dollar, and the U.S. interest rate since the 1980s. For example, the sustained surge in the real price of oil in the 2000s is often attributed to the declining real value of the U.S. dollar as well as low U.S. real interest rates, along with a surge in global real economic activity. Quantifying these effects one at a time is difficult not only because of the close relationship between the interest rate and the exchange rate, but also because demand and supply shocks in the oil market in turn may affect the real value of the dollar and real interest rates. We propose a novel identification strategy for disentangling the causal effects of traditional oil demand and oil supply shocks from the effects of exogenous variation in the U.S. real interest rate and in the real value of the U.S. dollar. Our approach exploits a combination of sign and zero restrictions and narrative restrictions motivated by economic theory and extraneous evidence. We empirically evaluate popular views about the role of exogenous real exchange rate shocks in driving the real price of oil, and we examine the extent to which shocks in the global oil market drive the U.S. real exchange rate and U.S. real interest rates. Our evidence for the first time provides direct empirical support for theoretical models of the link between these variables.
The conventional wisdom that inflation expectations respond to the level of the price of oil (or the price of gasoline) is based on testing the null hypothesis of a zero slope coefficient in a static single-equation regression model fit to aggregate data. Given that the regressor in this model is not stationary, the null distribution of the t-test statistic is nonstandard, invalidating the use of the normal approximation. Once the critical values are adjusted, these regressions provide no support for the conventional wisdom. Using a new structural vector regression model, however, we demonstrate that gasoline price shocks may indeed drive one-year household inflation expectations. The model shows that there have been several such episodes since 1990. In particular, the rise in household inflation expectations between 2009 and 2013 is almost entirely explained by a large increase in gasoline prices. However, on average, gasoline price shocks account for only 39% of the variation in household inflation expectations since 1981.
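A small Monte Carlo illustrates the pitfall noted above: in a static regression involving persistent series (here, two independent random walks, a deliberately simplified stand-in for expectations regressed on an oil-price level), the usual t-test rejects a true zero slope far more often than its nominal 5% level, so normal critical values mislead. Sample size and replication count are arbitrary sketch choices.

```python
import numpy as np

# Spurious-regression style experiment: regress one random walk on another
# and count how often the conventional t-test rejects the (true) zero slope.
rng = np.random.default_rng(1)
T, reps, rejections = 100, 500, 0
for _ in range(reps):
    x = np.cumsum(rng.standard_normal(T))      # nonstationary regressor
    y = np.cumsum(rng.standard_normal(T))      # independent of x: true slope is 0
    X = np.column_stack([np.ones(T), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    s2 = resid @ resid / (T - 2)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    if abs(beta[1] / se) > 1.96:               # nominal 5% two-sided test
        rejections += 1
rate = rejections / reps
print(rate)  # far above the nominal 0.05
```

The oversized rejection rate is exactly why the critical values must be adjusted before reading anything into a level-on-level regression of this kind.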
In the wake of the global pandemic known as COVID-19, retirees, along with those hoping to retire someday, have been shocked into a new awareness of the need for better risk management tools to handle longevity and aging. This paper offers an assessment of the status quo prior to the spread of the coronavirus and evaluates how retirement systems are faring in the wake of the shock. Next, we examine insurance and financial market products that may render retirement systems more resilient for the world’s aging population. Finally, we evaluate potential roles for policymakers.
OTC discount
(2020)
We document a sizable OTC discount in the interdealer market for German sovereign bonds, where exchange and over-the-counter trading coexist: the vast majority of OTC prices are favorable with respect to exchange quotes. This is a challenge for theories of OTC markets centered around search frictions but consistent with models of hybrid markets based on information frictions. We show empirically that proxies for both frictions determine variation in the discount, which is largely passed on to customers. Dealers trade on the exchange for immediacy and via brokers for opacity and anonymity, highlighting the complementary roles played by the different protocols.
We relate time-varying aggregate ambiguity (V-VSTOXX) to individual investor trading. We use the trading records of more than 100,000 individual investors from a large German online brokerage from March 2010 to December 2015. We find that an increase in ambiguity is associated with increased investor activity. It also leads to a reduction in risk-taking which does not reverse over the following days. When ambiguity is high, the effect of sentiment looms larger. Survey evidence reveals that ambiguity averse investors are more prone to ambiguity shocks. Our results are robust to alternative survey-, newspaper- or market-based ambiguity measures.
Supranational rules, national discretion: increasing versus inflating regulatory bank capital?
(2020)
We study how higher capital requirements introduced at the supranational level affect the regulatory capital of banks across countries. Using the 2011 EBA capital exercise as a quasi-natural experiment, we find that treated banks exploit discretion in the calculation of regulatory capital to inflate their capital ratios without a commensurate increase in their book equity and without a reduction in bank risk. Regulatory capital inflation is more pronounced in countries where credit supply is expected to tighten, suggesting that national authorities exercise forbearance so that their domestic banks can meet supranational requirements, with a focus on short-term economic considerations.
This article documents and classifies instances of transnational intellectual property (IP) enforcement and licensing on the Internet with a particular focus on the territorial reach of the respective regimes. Regarding IP enforcement, I show that the bulk of transnational or even global measures is adopted in the context of “voluntary” self-regulation by various intermediaries, namely domain name registrars, access and host providers, search engines, and advertising and payment services. Global IP licensing is, in contrast, less prevalent than one might expect. It is practically limited to freely accessible Open Content, whereas markets for fee-based services remain territorially fragmented. Overall, three layers of IP governance on the Internet can be distinguished. Based on global licenses, Open Content is freely accessible everywhere. Plain IP infringements are equally combatted on a worldwide scale. Territorial fragmentation persists, instead, in the market segment of fee-based services and in hard cases of conflicts of IP laws/rights. All three universal norms (global accessibility, global illegality, global fragmentation) are supported by a quite solid, “rough” global consensus.
The paper discusses the policy implications of the Wirecard scandal. The study finds that all lines of defense against corporate fraud, including internal control systems, external audits, the oversight bodies for financial reporting and auditing and the market supervisor, contributed to the scandal and are in need of reform. To ensure market integrity and investor protection in the future, the authors make eight suggestions for the market and institutional oversight architecture in Germany and in Europe.
Banks are not immune from COVID-19. The economic downturn may drive some banks to the point of non-viability (PONV). If so, is the resolution regime in the Euro-area ready to respond? No, for banks may not have the right amount of the right kind of liabilities to make bail-in work. That could lead to a banking crisis. The Euro area can avoid this risk, by arranging now for a recap later. This would plug the gap between what the failing bank has and what it would need to make bail-in work. To do so, banks would pay – possibly via the contributions they make to the Single Resolution Fund – a commitment fee to a European backstop authority for a mandatory, system-wide note issuance facility. This would compel each bank, as it approached or reached the PONV, to issue to the backstop, and the backstop to purchase from the bank, the obligations the failing bank needs in order to make bail-in work. Such obligations would take the form of “senior-most” non-preferred debt, and bail-in would stop with such debt. That would allow the SRB to use the bail-in tool to resolve the failed bank, reopen it and run it under a solvent wind-down strategy. That protects counterparties and customers and ensures the continuity of critical economic functions. It also keeps investors at risk and promotes market discipline. Above all, it preserves financial stability.
The paper compares provision of public infrastructure via public-private partnerships (PPPs) with provision under government management. Because government management operates under soft budget constraints, PPPs exert more effort and therefore have a cost advantage in building infrastructure. At the same time, the hard budget constraints PPPs face introduce a bankruptcy risk and bankruptcy costs. Consequently, if bankruptcy costs are high, PPPs may be less efficient than public management, although this does not result from PPPs’ higher interest costs.
We develop a novel empirical approach to identify the effectiveness of policies against a pandemic. The essence of our approach is the insight that epidemic dynamics are best tracked over stages, rather than over time. We use a normalization procedure that makes the pre-policy paths of the epidemic identical across regions. The procedure uncovers regional variation in the stage of the epidemic at the time of policy implementation. This variation delivers clean identification of the policy effect based on the epidemic path of a leading region that serves as a counterfactual for other regions. We apply our method to evaluate the effectiveness of the nationwide stay-home policy enacted in Spain against the Covid-19 pandemic. We find that the policy saved 15.9% of lives relative to the number of deaths that would have occurred had it not been for the policy intervention. Its effectiveness evolves with the epidemic and is larger when implemented at earlier stages.
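The stage-based normalization idea can be sketched in a few lines: instead of comparing regions on calendar time, re-index each region's series by "days since cumulative deaths first reached a threshold". Regions on the same underlying epidemic path but hit on different dates then line up exactly. The data and threshold below are made up for illustration and are not from the paper.

```python
# Toy illustration of aligning epidemic paths by stage rather than by date.

def align_by_stage(daily_deaths, threshold):
    """Return the series from the day cumulative deaths reach the threshold."""
    total = 0
    for i, d in enumerate(daily_deaths):
        total += d
        if total >= threshold:
            return daily_deaths[i:]
    return []

path = [1, 2, 4, 8, 16, 32, 64]        # common underlying epidemic path
early = [0, 0] + path                   # region hit two days earlier
late = [0, 0, 0, 0, 0] + path           # same path, hit later

a = align_by_stage(early, 3)
b = align_by_stage(late, 3)
print(a[:5], b[:5])  # identical once stages are aligned
```

With the pre-policy paths made identical in this way, a region that implemented the policy at a later stage can serve as the counterfactual for one that implemented it earlier.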
This paper studies a household’s optimal demand for a reverse mortgage. These contracts allow homeowners to tap their home equity to finance consumption needs. In stylized frameworks, we show that the decision to enter a reverse mortgage is mainly driven by the differential between the aggregate appreciation of the house price and principal limiting factor on the one hand and the funding costs of a household on the other hand. We also study a rich life-cycle model that can explain the low demand for reverse mortgages as observed in US data. In this model, we analyze the optimal response of a household that is confronted with a health shock or financial disaster. If an agent suffers from an unexpected health shock, she reduces the risky portfolio share and is more likely to enter a reverse mortgage. On the other hand, if there is a large drop in the stock market, she keeps the risky portfolio share almost constant by buying additional shares of stock. Besides, the probability to take out a reverse mortgage is hardly affected.
The ruling of the German Federal Constitutional Court and its call for conducting and communicating proportionality assessments regarding monetary policy have been the subject of some controversy. However, it can also be understood as a way to strengthen the de-facto independence of the European Central Bank. The authors show how a regular proportionality check could be integrated into the ECB’s strategy, which is currently undergoing a systematic review. In particular, they propose to include quantitative benchmarks for policy rates and the central bank balance sheet. Deviations from such benchmarks can have benefits in terms of the intended path for inflation while involving costs in terms of risks and side effects that need to be balanced. Practical applications to the euro area are provided.
In this paper, we adapt the Hamiltonian Monte Carlo (HMC) estimator to DSGE models, a method presently used in various fields due to its superior sampling and diagnostic properties. We implement it in a state-of-the-art, freely available high-performance software package, Stan. We estimate a small-scale textbook New-Keynesian model and the Smets-Wouters model using US data. Our results and sampling diagnostics confirm the parameter estimates available in existing literature. In addition, we find bimodality in the Smets-Wouters model even if we estimate the model using the original tight priors. Finally, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm to create a powerful tool which permits the estimation of DSGE models with ill-behaved posterior densities.
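The core mechanism that Stan automates, leapfrog integration of Hamiltonian dynamics plus a Metropolis accept/reject step, can be sketched in one dimension for a standard normal target. Step size and path length below are illustrative tuning choices, not values from the paper.

```python
import math, random

# Minimal 1-D Hamiltonian Monte Carlo targeting a standard normal density.
random.seed(0)

def grad_U(q):
    return q  # potential U(q) = q^2 / 2 for a standard normal target

def hmc_step(q, eps=0.1, L=20):
    p = random.gauss(0.0, 1.0)                  # resample momentum
    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_U(q_new)          # leapfrog: half momentum step
    for _ in range(L - 1):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)          # final half momentum step
    h_old = 0.5 * q * q + 0.5 * p * p           # Hamiltonian = potential + kinetic
    h_new = 0.5 * q_new * q_new + 0.5 * p_new * p_new
    if random.random() < math.exp(min(0.0, h_old - h_new)):
        return q_new                             # accept proposal
    return q                                     # reject, keep current draw

draws, q = [], 3.0                               # deliberately poor start
for i in range(3000):
    q = hmc_step(q)
    if i >= 500:                                 # discard burn-in
        draws.append(q)

mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(round(mean, 2), round(var, 2))
```

Gradient-guided proposals of this kind are what give HMC its favorable sampling properties relative to random-walk Metropolis, especially in the high-dimensional posteriors of DSGE models.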
This paper determines the cost of employee stock options (ESOs) to shareholders. I present a pricing method that seeks to replicate the empirics of exercise and cancellation as closely as possible. In a first step, the intensity-based pricing model of El Karoui and Martellini is adapted to the needs of ESOs. In a second step, I calibrate the model with a regression analysis of exercise rates from the empirical work of Heath, Huddart and Lang. The pricing model thus takes account of all effects captured in the regression. Separate regressions enable me to compare options for top executives with those for subordinates. I find no price differences. The model is also applied to test the precision of the fair value accounting method for ESOs, SFAS 123. Using my model as a reference, the SFAS method results in surprisingly accurate prices.
JEL classification: G13; J33; M41; M52
This paper studies the link between bank recapitalization and welfare in a dynamic production economy. The model features financial frictions because banks benefit from a cost advantage in monitoring firms and face costly equity issuance. The competitive equilibrium outcome is inefficient because agents do not internalize the effects of banks’ capitalization on the allocation of capital, its price and, in turn, firms’ investments. It follows that individual recapitalizations are sub-optimal and that bailout policies may improve social welfare in the long run. Bailouts improve capital allocation in states where aggregate banks are poorly capitalized, thereby enhancing their market valuation, fostering investment, and stabilizing the economy’s recovery path.
Market fragmentation and technological advances that increase the speed of trading have altered the functioning and stability of global equity limit order markets. Taking market resiliency as an indicator of market quality, we investigate how resilient trading venues are in a high-frequency environment with cross-venue fragmented order flow. Employing a Hawkes process methodology on high-frequency data for FTSE 100 stocks on LSE, a traditional exchange, and on Chi-X, an alternative venue, we find that when liquidity becomes scarce, Chi-X is a less resilient venue than LSE, with variations across stocks and time. In comparison with LSE, Chi-X has more, longer, and more severe liquidity shocks. Whereas the vast majority of liquidity droughts on both venues disappear within less than one minute, the recovery is not lasting, as liquidity shocks spiral over the time dimension. Over half of the shocks on both venues are caused by spiralling. Liquidity shocks tend to spiral more on Chi-X than on LSE for large stocks, suggesting that the liquidity supply on Chi-X is thinner than on LSE. Finally, a significant number of liquidity shocks spill over across venues, providing supporting evidence for the competition for order flow between LSE and Chi-X.
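In a self-exciting Hawkes process, each event temporarily raises the arrival intensity of further events, which is the formal counterpart of the liquidity-shock "spiralling" described above. A minimal simulation sketch with an exponential kernel follows; the parameters are illustrative, not the paper's estimates.

```python
import numpy as np

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity with exponential kernel:
    lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i)) over past events."""
    events = np.asarray(events)
    past = events[events < t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

def simulate_hawkes(horizon, mu=0.5, alpha=0.8, beta=1.2, rng=None):
    """Ogata's thinning algorithm: simulate event times on [0, horizon]."""
    rng = rng or np.random.default_rng()
    t, events = 0.0, []
    while t < horizon:
        # intensity is non-increasing between events, so this bounds it above
        lam_bar = hawkes_intensity(t, events, mu, alpha, beta) + alpha
        t += rng.exponential(1.0 / lam_bar)        # candidate waiting time
        if t < horizon and rng.uniform() * lam_bar <= hawkes_intensity(t, events, mu, alpha, beta):
            events.append(t)                        # accept with prob lambda/lam_bar
    return events

events = simulate_hawkes(100.0, rng=np.random.default_rng(1))
# stationarity requires alpha/beta < 1; the long-run rate is mu / (1 - alpha/beta)
print(len(events) / 100.0)
```

With alpha/beta close to one, a single shock triggers long cascades of follow-up shocks, which is the mechanism behind the spiralling statistics reported in the abstract.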
The Multilingual Assessment Instrument for Narratives (MAIN) is a theoretically grounded toolkit that employs parallel pictorial stimuli to explore and assess narrative skills in children in many different languages. It is part of the LITMUS (Language Impairment Testing in Multilingual Settings) battery of tests that were developed in connection with the COST Action IS0804 Language Impairment in a Multilingual Society: Linguistic Patterns and the Road to Assessment (2009−2013). MAIN has been designed to assess both narrative production and comprehension in children who acquire one or more languages from birth or from an early age. Its design allows for the comparable assessment of narrative skills in several languages in the same child and in different elicitation modes: Telling, Retelling and Model Story. MAIN contains four parallel stories, each with a carefully designed six-picture sequence based on a theoretical model of multidimensional story organization. The stories are controlled for cognitive and linguistic complexity, parallelism in macrostructure and microstructure, as well as for cultural appropriateness and robustness. As a tool, MAIN has been used to compare children’s narrative skills across languages, and also to help differentiate between children with and without developmental language disorders, both monolinguals and bilinguals.
This volume consists of two parts. The main content of Part I consists of 33 papers describing the process of adapting and translating MAIN to a large number of languages from different parts of the world. Part II contains materials for use for about 80 languages, including pictorial stimuli, which are accessible after registration.
MAIN was first published in 2012/2013 (ZASPiL 56). Several years of theory development and material construction preceded this launch. In 2019 (ZASPiL 63), the revised English version (revised on the basis of over 2,500 transcribed MAIN narratives as well as ca. 24,000 responses to MAIN comprehension questions, collected from around 700 monolingual and bilingual children in Germany, Russia and Sweden between 2013 and 2019) was published together with revised versions in German, Russian, Swedish, and Turkish for the bilingual Turkish-Swedish population in Sweden. The present 2020 (ZASPiL 64) volume contains new and revised language versions of MAIN.
On the basis of the economic theory of network effects, this article provides a novel explanation of the so-called patent paradox, i.e. the question of why the propensity to patent is so strong when the expected average value of most patents is low. It demonstrates that the patent system of a country resembles a telephone network or a social media platform. Patents are perceived as nodes in a virtual network that, as a whole, exhibits network effects. It is explained why patents are not independent of other patents but complement each other in several ways both within and beyond markets and fields of technology, and why patents thus create synchronization value over and above the individual interests of patent holders in exclusivity. As a consequence, the more patents there are, the more valuable it is to also seek patents, and vice versa. Since patents thus display increasing returns to adoption, the willingness to pay for the next patent slopes upwards. This explains why, after a phase of early instability and a certain tipping point, many countries’ patent systems expanded quickly and eventually became a rigid standard (“lock-in”). The concluding section raises the question of which regulatory measures are suitable to effectively address the ensuing anticommons effects.
The long-standing battle between economic nationalism and globalism has again taken center stage in geopolitics. This article applies this dichotomy to the law and policy of international intellectual property (IP). Most commentators see IP as a prime example of globalization. The article challenges this view on several levels. In a nutshell, it claims that economic nationalist concerns about domestic industries and economic development lie at the heart of the global IP system. To support this argument, the article summarizes and categorizes IP policies adopted by selected European countries, the European Union, and the U.S. Section I presents three types of inbound IP policies that aim to foster local economic development and innovation. Section II adds three versions of outbound IP policies that, in contrast, target foreign countries and markets. Concluding section III traces a dialectic virtuous circle of economic nationalist motives leading to global legal structures and identifies the function and legal structure of IP as the reason for the resilience and even dominance of economic nationalist motives in international IP politics. IP concerns exclusive private rights that are territorially limited creatures of (supra-)national statutes. These legal structures make up the economic nationalist DNA of IP.
Using a structural life-cycle model, we quantify the long-term impact of school closures during the Corona crisis on children affected at different ages and coming from households with different parental characteristics. In the model, public investment through schooling is combined with parental time and resource investments in the production of child human capital at different stages in the children's development process. We quantitatively characterize both the long-term earnings consequences on children from a Covid-19 induced loss of schooling, as well as the associated welfare losses. Due to self-productivity in the human capital production function, skill attainment at a younger stage of the life cycle raises skill attainment at later stages, and thus younger children are hurt more by the school closures than older children. We find that parental reactions reduce the negative impact of the school closures, but do not fully offset it. The negative impact of the crisis on children's welfare is especially severe for those with parents with low educational attainment and low assets. The school closures themselves are primarily responsible for the negative impact of the Covid-19 shock on the long-run welfare of the children, with the pandemic-induced income shock to parents playing a secondary role.
Central banks unexpectedly tightening policy rates often observe the exchange value of their currency depreciate, rather than appreciate as predicted by standard models. We document this for Fed and ECB policy days using event studies and ask whether an information effect, where the public attributes the policy surprise to an unobserved state of the economy that the central bank is signaling by its policy, may explain the anomaly. It turns out that many informational assumptions make a standard two-country New Keynesian model match this behavior. To identify the particular mechanism, we condition on multiple asset prices in the event study and model implications for these. We find that there is heterogeneity in this dimension in the event study and that no model with a single regime can match the evidence. Further, even after conditioning on possible information effects driving longer-term interest rates, there appear to be other drivers of exchange rates. Our results show that existing models have a long way to go in reconciling event study analysis with model-based mechanisms of asset pricing.
The Multilingual Assessment Instrument for Narratives (MAIN) is part of LITMUS (Language Impairment Testing in Multilingual Settings). LITMUS is a battery of tests that have been developed in connection with the COST Action IS0804 Language Impairment in a Multilingual Society: Linguistic Patterns and the Road to Assessment (2009−2013).
In these volumes, we are very pleased to present a collection of papers based on talks and posters at Sinn und Bedeutung 22, which took place in Berlin and Potsdam on September 7-10, 2017, jointly organized by the Leibniz-Centre for General Linguistics (ZAS) and the University of Potsdam.
SuB22 received 183 submitted abstracts. Out of these, the organizing committee selected 39 oral presentations in the main session, 4 oral presentations in the special session ‘Semantics and Natural Logic’, and 24 poster presentations. There were an additional 6 invited talks. In total, 58 of these contributions appear in paper form in the present volumes.
Past research suggests that international real estate markets show return characteristics and interrelationships with other asset classes that probably qualify them as an interesting component of national and international asset allocation decisions. However, the special characteristics of real estate assets are quite distinct from those of financial assets such as stocks and bonds. This also holds for real estate return distributions. Therefore, the proper integration of real estate markets into asset allocation decisions requires a profound understanding of the distributional characteristics of real estate returns.
Because of the particular characteristics of real estate, representing real estate markets through a reliable time-series is a complex task. Consequently, reliable real estate indices with a sufficiently long history are scarce even in major international real estate markets. Most of the research on real estate returns has been done for the U.K. and the U.S., where eligible indices exist. In other important real estate markets, such as Germany, by contrast, little or no research has been performed.
In this analysis, the methodology of Maurer, Sebastian and Stephan (2000) for indirectly deriving an appraisal-based index for the German commercial real estate market will be applied. This approach is based solely on publicly available data from German open-ended real estate investment trusts. It could also provide a solution for deriving reliable real estate time-series in other markets.
We will extend previous analyses for the U.K. and U.S. to provide additional fundamental insights into the return characteristics of the German commercial real estate market. Beyond univariate considerations, the main focus is on the interrelationships between various international real estate markets, as well as between those markets and the international stock and bond markets.
The classical approaches to asset allocation yield very different conclusions about how large a share of foreign stocks a US investor should hold. US investors should either allocate a large portion, about 40%, to foreign stocks (the result of mean/variance optimization and the international CAPM) or hold no foreign stocks at all (the conclusion of the domestic CAPM and mean/variance spanning tests). There is no middle ground.
The idea of the Bayesian approach discussed in this article is to shrink the mean/variance efficient portfolio towards the market portfolio. The shrinkage effect is determined by the investor's prior belief in the efficiency of the market portfolio and by the degree of violation of the CAPM in the sample. Interestingly, this Bayesian approach leads to the same implications for asset allocation as the mean-variance/tracking error criterion. In both cases, the optimal portfolio is a combination of the market portfolio and the mean/variance efficient portfolio with the highest Sharpe ratio.
Applying both approaches to the subject of international diversification, we find that a substantial home bias is only justified when a US investor has a strong belief in the global mean/variance efficiency of the US market portfolio and a high regret aversion to falling behind the US market portfolio. We also find that the current level of home bias can be justified whenever regret aversion is significantly higher than risk aversion.
Finally, we compare the Bayesian approach of shrinking the mean/variance efficient portfolio towards the market portfolio to another Bayesian approach, which shrinks the mean/variance efficient portfolio towards the minimum-variance portfolio. An empirical out-of-sample study shows that both Bayesian approaches lead to clearly superior performance compared to the classical mean/variance efficient portfolio.
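The shrinkage logic described above can be sketched numerically. All moments and weights below are hypothetical, and the confidence parameter stands in for the investor's prior belief in the CAPM efficiency of the market portfolio.

```python
import numpy as np

# Hypothetical inputs: expected excess returns and covariances for a
# US and a foreign equity index (illustrative numbers only).
mu = np.array([0.06, 0.08])            # sample mean excess returns
sigma = np.array([[0.04, 0.018],
                  [0.018, 0.09]])      # sample covariance matrix
w_market = np.array([0.8, 0.2])        # market-cap weights (home-heavy)

# Mean/variance efficient (tangency) weights, rescaled to sum to one
w_mv = np.linalg.solve(sigma, mu)
w_mv /= w_mv.sum()

def shrunk_portfolio(prior_confidence):
    """Shrink the tangency portfolio toward the market portfolio.
    prior_confidence in [0, 1]: 1 = dogmatic belief in the efficiency of
    the market portfolio, 0 = pure sample-based mean/variance weights."""
    return prior_confidence * w_market + (1 - prior_confidence) * w_mv

for c in (0.0, 0.5, 1.0):
    print(c, shrunk_portfolio(c).round(3))
```

A stronger prior belief pulls the allocation toward the home-heavy market weights, which is the sense in which home bias can be rationalized by prior confidence in market efficiency.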
Predictability and the cross-section of expected returns: a challenge for asset pricing models
(2020)
Many modern macro finance models imply that excess returns on arbitrary assets are predictable via the price-dividend ratio and the variance risk premium of the aggregate stock market. We propose a simple empirical test for the ability of such a model to explain the cross-section of expected returns by sorting stocks based on the sensitivity of expected returns to these quantities. Models with only one uncertainty-related state variable, like the habit model or the long-run risks model, cannot pass this test. However, even extensions with more state variables mostly fail. We derive criteria models have to satisfy to produce expected return patterns in line with the data and discuss various examples.
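The proposed test, sorting stocks on the sensitivity of expected returns to the price-dividend ratio and the variance risk premium, can be sketched on simulated data. Everything below is illustrative; this is neither the authors' dataset nor their estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
n_stocks, n_months = 200, 120

# Hypothetical aggregate state variables (standardized): log price-dividend
# ratio and variance risk premium -- stand-ins, not real data.
pd_ratio = rng.standard_normal(n_months)
vrp = rng.standard_normal(n_months)

# Simulated stock excess returns with stock-specific sensitivities (betas)
betas = rng.standard_normal((n_stocks, 2))
noise = 0.5 * rng.standard_normal((n_stocks, n_months))
returns = betas @ np.vstack([pd_ratio, vrp]) + noise

# Step 1: estimate each stock's sensitivities by time-series OLS
X = np.column_stack([np.ones(n_months), pd_ratio, vrp])
coefs = np.array([np.linalg.lstsq(X, r, rcond=None)[0] for r in returns])

# Step 2: sort stocks into quintile portfolios on estimated VRP sensitivity
vrp_sens = coefs[:, 2]
quintile = np.digitize(vrp_sens, np.quantile(vrp_sens, [0.2, 0.4, 0.6, 0.8]))
for q in range(5):
    print(q, round(returns[quintile == q].mean(), 3))
```

The test in the paper then asks whether a candidate model reproduces the pattern of average returns across such sensitivity-sorted portfolios.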
The possibility of investigating the impact of news on stock prices has evolved considerably thanks to the recent use of natural language processing (NLP) in finance and economics. In this paper, we investigate COVID-19 news, processed with the "Natural Language Toolkit", which uses machine learning models to extract the news' sentiment. We consider the period from January to June 2020 and analyze 203,886 online articles that deal with the pandemic and were published on three platforms: MarketWatch.com, Reuters.com and NYtimes.com. Our findings show that there is a significant and positive relationship between sentiment score and market returns. This result indicates that an increase (decrease) in the sentiment score implies a rise in positive (negative) news and corresponds to positive (negative) market returns. We also find that the variance of the sentiments and the volume of the news sources, for Reuters and MarketWatch respectively, are negatively associated with market returns, indicating that greater uncertainty of sentiment and an increase in the arrival of news have an adverse impact on the stock market.
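The general idea behind a news sentiment score can be illustrated with a toy lexicon-based scorer. This is not the authors' NLTK pipeline; the word lists and scoring rule are made up for illustration.

```python
# Toy lexicon-based sentiment scorer (illustrative word lists, not the
# authors' machine-learning pipeline).
POSITIVE = {"recovery", "gain", "growth", "optimism", "rally"}
NEGATIVE = {"lockdown", "losses", "fear", "crash", "pandemic"}

def sentiment_score(text):
    """Score in [-1, 1]: (positive hits - negative hits) / total hits."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

print(sentiment_score("Markets rally on vaccine optimism"))   # 1.0
print(sentiment_score("Pandemic fear deepens losses"))        # -1.0
```

Article-level scores of this kind can then be aggregated by day and source and regressed on market returns, which is the shape of the analysis in the abstract.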
Using experimental data from a comprehensive field study, we explore the causal effects of algorithmic discrimination on economic efficiency and social welfare. We harness economic, game-theoretic, and state-of-the-art machine learning concepts that allow us to overcome the central challenge of missing counterfactuals, which generally impedes assessing the economic downstream consequences of algorithmic discrimination. This way, we are able to precisely quantify downstream efficiency and welfare ramifications, which gives us a unique opportunity to assess whether the introduction of an AI system is actually desirable. Our results highlight that AI systems’ capabilities to enhance welfare depend critically on the degree of inherent algorithmic bias. While an unbiased system in our setting outperforms humans and creates substantial welfare gains, the positive impact steadily decreases and ultimately reverses the more biased an AI system becomes. We show that this relation is particularly concerning in selective-labels environments, i.e., settings where outcomes are observed only if decision-makers take a particular action, so that the data is selectively labeled, because commonly used technical performance metrics such as precision are prone to be deceptive. Finally, our results show that continued learning, by creating feedback loops, can remedy algorithmic discrimination and the associated negative effects over time.
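Why precision can deceive under selective labels is easy to show with a small numeric example. The numbers below are hypothetical and only illustrate the general mechanism, not the study's data.

```python
# Under selective labels, outcomes are observed only for approved cases,
# so precision is computed on the selected sample and ignores rejections.
def precision(tp, fp):
    """Share of approved cases with a good outcome (true positives)."""
    return tp / (tp + fp)

# Hypothetical groups: A gets 90 approvals (81 good outcomes),
# B gets only 10 approvals (9 good outcomes).
print(precision(81, 9))  # group A
print(precision(9, 1))   # group B

# Both groups show identical precision of 0.9 -- yet if many creditworthy
# applicants in group B were wrongly rejected, their outcomes are never
# labeled, and no metric on the selected sample reveals that welfare loss.
```

This is the sense in which a biased system can look "accurate" on standard metrics while destroying welfare through unobserved rejections.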
In this paper we adapt the Hamiltonian Monte Carlo (HMC) estimator for DSGE models by implementing it in a state-of-the-art, freely available high-performance software package. We estimate a small-scale textbook New Keynesian model and the Smets-Wouters model on US data. Our results and sampling diagnostics confirm the parameter estimates available in the existing literature. In addition, we combine the HMC framework with the Sequential Monte Carlo (SMC) algorithm, which permits the estimation of DSGE models with ill-behaved posterior densities.