This paper is the national report for Germany prepared for the 20th General Congress of the International Academy of Comparative Law 2018 and gives an overview of the regulation of crowdfunding in Germany and the typical design of crowdfunding campaigns under this legal framework. After a brief survey of market data, it delineates the classification of crowdfunding transactions in German contract law and their treatment under the applicable conflict of laws regime. It then turns to the relevant rules in prudential banking regulation and capital market law. It highlights disclosure requirements that flow both from contractual obligations of the initiators of campaigns vis-à-vis contributors and from securities regulation (prospectus regime). After sketching the most important duties of the parties involved in crowdfunding, the report also looks at the key features of the respective transactions’ tax treatment.
This paper gives an account of the unmaking of Soviet workers at the Vernissage in Armenia. I argue that the unmaking of Soviet workers, first, is the irrelevance of Soviet workers as workers once they lost their jobs after the collapse of the Soviet Union and came to the Vernissage to trade. During the Soviet period, private trade was forbidden, and the Soviet government persecuted people who dared to engage in it. Consequently, many people grew up thinking of trade as a criminal activity that was non-productive and parasitic, as opposed to productive work that facilitated the modernization of the USSR. After the dissolution of the USSR, when trade was liberalized and many former Soviet workers were pushed into trade as they lost their jobs, it still retained its quality of not being “real” work, to borrow Roberman’s (2013) wording. Even 25 years after the dissolution of the USSR, former Soviet workers at the Vernissage still want to be identified with their former Soviet occupations and not with trade. However, now engaged in trade, former Soviet workers came up with a “new” way of establishing identity and hierarchy—through production. I describe this “new” way as “the identification game”; employing it, I demonstrate how former Soviet workers at the Vernissage identify and represent themselves as masters, whose work is productive and intellectual. In doing so, they single out resellers, people who resell the work of other masters, by implying that their work is parasitic and selfish. However, this “identification game” is reified only by the older generation of traders, former Soviet workers. The younger generation of traders at the Vernissage, which does not have any experience of being Soviet workers, is disengaged from it, thus undermining the Soviet view of trade as not “real” work and making it irrelevant in the postsocialist era. 
Thus, I contend that the unmaking of Soviet workers consists in, first, their irrelevance as workers in a postsocialist period, and second, the irrelevance of their ideas about trade as not “real” work. Furthermore, to support my depiction of a master who engages in “the identification game” and a younger-generation trader who is disengaged from it, I give two ethnographic portraits of traders at the Vernissage. I assert that the disengagement of a younger generation of traders at the Vernissage signals a change in the perception of trade as “real” work and runs parallel to the unmaking of Soviet workers.
The recent sovereign debt crisis in the Eurozone was characterized by monetary policy constrained by the zero lower bound (ZLB) on nominal interest rates and by several countries facing high risk spreads on their sovereign bonds. How is the government spending multiplier affected by such an economic environment? While prominent results in the academic literature point to high government spending multipliers at the ZLB, higher public indebtedness is often associated with small government spending multipliers. I develop a DSGE model with leverage-constrained banks that captures both features of this economic environment, the ZLB and fiscal stress. In this model, I analyze the effects of government spending shocks. I find that not only are multipliers large at the ZLB, but the presence of fiscal stress can even increase their size. For longer durations of the ZLB, multipliers in this model can be considerably larger than one.
JEL Classification: E32, E44, E62
The streets of German cities are largely dominated by the automobile, which shapes not only the built environment but also influences political decisions about the allocation of street space. It is often forgotten, however, that walking is where all mobility begins and that well-planned urban pedestrian traffic not only contributes to pedestrian safety but also secures the future of urban mobility. Although walking offers considerable potential, it is granted far less room to unfold in public space. Pedestrians are frequently channeled and pushed to the margins there, which affects their behavior. This marginalization of pedestrians is further reinforced by the little attention they receive in urban planning and politics as well as in academia and society. They thus stand in conflict with the persistent, socially constructed structure of the street.
This study examines why walking plays a subordinate role in the built street environment and its spatial allocation compared to other modes of transport, and what influence planning and political practice and the design of street space have on this. It does so by means of a case study in Frankfurt am Main, the Schweizer Straße. The study brings together findings from participant observation of pedestrians on the Schweizer Straße and their behavior in public street space, as well as from qualitative interviews with experts on Frankfurt's built street environment. The overarching aim of the thesis is to develop an understanding of the interaction between spatial structure, planning influence, and mobility, and to work out the conflicts that the allocation of street space creates for pedestrians, in order to support walking in a more targeted way.
Automated deduction in higher-order program calculi, where properties of transformation rules are demanded, or confluence or other equational properties are requested, can often be done by syntactically computing overlaps (critical pairs) of reduction rules and transformation rules. Since higher-order calculi have alpha-equivalence as their fundamental equivalence, the reasoning procedure must deal with it. We define ASD1-unification problems, which are higher-order equational unification problems employing variables for atoms, expressions and contexts, with additional distinct-variable constraints, and which have to be solved w.r.t. alpha-equivalence. Our proposal is to extend nominal unification to solve these unification problems. We succeeded in constructing the nominal unification algorithm NomUnifyASC. We show that NomUnifyASC is sound and complete for this problem class, and outputs a set of unifiers with constraints in nondeterministic polynomial time if the final constraints are satisfiable. We also show that solvability of the output constraints can be decided in NEXPTIME, and in NP for a fixed number of context-variables. For terms without context-variables and atom-variables, NomUnifyASC runs in polynomial time, is unitary, and extends the classical problem by permitting distinct-variable constraints.
1998 ACM Subject Classification F.4.1 Mathematical Logic
We explore space improvements in LRP, a polymorphically typed call-by-need functional core language. A relaxed space measure is chosen for the maximal size usage during an evaluation. It abstracts from the details of the implementation via abstract machines, but it takes garbage collection into account and thus can be seen as a realistic approximation of space usage. The results are: a context lemma for space-improving translations and for space equivalences; all but one reduction rule of the calculus are shown to be space improvements, and the exceptional one, the copy-rule, is shown to increase space only moderately.
Several further program transformations are shown to be space improvements or space equivalences; in particular, the translation into machine expressions is a space equivalence. These results are a step forward in making predictions about the change in runtime space behavior of optimizing transformations in call-by-need functional languages.
Does the German three-pillar system fit into an increasingly harmonized banking structure for Europe?
(2018)
For decades the German banking system has rested on three pillars: the private credit banks, including the large shareholder-owned banks; the public banks; and the cooperative banks. Almost nowhere else in Europe has such a three-pillar system survived. Does it still fit into a Europe in which banking policy, regulation, and supervision now largely fall within the competence of the EU? The main arguments for preserving the system concern stability. Given their group affiliation, the German "stakeholder-value-oriented" banks of pillars 2 and 3 are financially by no means less successful, and indeed slightly more successful, than the "shareholder-value-oriented" large banks of pillar 1. In particular, their business figures fluctuate considerably less than those of the large banks, which generally pursue a riskier business model. In many private banks, the profit orientation, and with it the willingness to take on high risks, is too high from a regulatory policy perspective, which tends to endanger systemic stability. Moreover, the cooperative banks and savings banks perform a regional equalization function and have a stabilizing effect on the economy as a whole.
The School of Salamanca, and Iberian late Scholasticism in general, had the merit of transposing the wisdom of medieval scholasticism into the coordinates of early modernity. Due to the economic growth after the discovery of America, economic terms and moral problems become a central focus for moral theologians. In this article, I consider important key economic concepts that deliver a surprising wealth of insights into the modernization brought about by the leading scholars of the time. Social mobility, the principle of majority decision, the inviolability of property, human rights of the person, limited political power of the pope, and other key concepts that were decisive for the development of democracy and modernity are to be found in the works of the School of Salamanca in connection with economic issues.
Our recently developed LRSX Tool implements a technique to automatically prove the correctness of program transformations in higher-order program calculi which may permit recursive let-bindings as they occur in functional programming languages. A program transformation is correct if it preserves the observational semantics of programs. In our tool the so-called diagram method is automated by combining unification, matching, and reasoning on alpha-renamings on the higher-order metalanguage, and by automating induction proofs via an encoding into termination problems of term rewrite systems. We explain the techniques, we illustrate the usage of the tool, and we report on experiments.
We develop a simple theoretical model to motivate testable hypotheses about how peer-to-peer (P2P) platforms compete with banks for loans. The model predicts that (i) P2P lending grows when some banks are faced with exogenously higher regulatory costs; (ii) P2P loans are riskier than bank loans; and (iii) the risk-adjusted interest rates on P2P loans are lower than those on bank loans. We confront these predictions with data on P2P lending and the consumer bank credit market in Germany and find empirical support. Overall, our analysis indicates that P2P lenders are bottom fishing when regulatory shocks create a competitive disadvantage for some banks.
Policymakers attach an important role to the macroeconomic outlook of households. Using a representative online panel from the U.S., the authors provide individuals with different professional forecasts about the likelihood of a recession and examine how individuals' macroeconomic expectations causally affect their personal economic prospects and their behavior. The authors find that groups with the largest exposure to aggregate risk, such as individuals working in cyclical industries, are most likely to respond to an improved macroeconomic outlook, while a large fraction of the population is unlikely to react.
Popularity/Prestige
(2018)
What is the canon? Usually this question is just a proxy for something like, "Which works are in the canon?" But the first question is not just a concise version of the second, or at least it doesn’t have to be. Instead, it can ask what the structure of the canon is - in other words, when things are in the canon, what are they in? This question came to the fore during the project that resulted in Pamphlet 11. The members of that group were looking for morphological differences between the canon and the archive. The latter they define, straightforwardly and capaciously, as "that portion of published literature that has been preserved—in libraries and elsewhere." The canon is a slipperier concept; the authors speak instead of multiple canons, like the books preserved in the Chadwyck-Healey Nineteenth-Century Fiction Collection, the constituents of the six different "best twentieth-century novels" lists analyzed by Mark Algee-Hewitt and Mark McGurl in Pamphlet 8, authors included in the British Dictionary of National Biography, and so forth. [...] This last conundrum points the way out of these difficulties and into a workable model of the structure of the canon. It suggests two different ways of entering the canon: being read by many and being prized by an elite few—or, to use the terms arrived at in Pamphlet 11, popularity and prestige. With these two dimensions, we arrive at a canonical space [...].
This contribution offers an overview of the relationships between intellectual property (IP) rights, private autonomy, and innovation. IP law rests on the assumption that only the combination of fungible exclusive rights and private autonomy (that is, the legal form of the market economy) promises an innovation-friendly effect. Accordingly, current law combines a high substantive level of IP protection with far-reaching recognition of the private autonomy of rights holders. This regulatory approach has the advantage of creating highly adaptable framework conditions for innovation. Those who need comprehensive exclusivity for their innovation can operate under these two principles just as well as actors who wish to waive IP protection in part or entirely because this seems preferable to them under the prevailing competitive conditions. And yet the contribution explains that the obvious conclusion falls short, namely that the legislature could confine itself to codifying IP rights that are as comprehensive and at the same time as fungible as possible, on the assumption that the market would always ensure an efficient and otherwise socially desirable allocation of resources. The transaction costs associated with exclusive IP rights frequently stand in the way of this goal. It thus becomes apparent that no contract-law theory, however elaborate, renders superfluous the question of the purpose of the logically prior property right.
Intellectual property law is one of the oldest and by now most extensive areas of uniform private law. Almost all states in the world are members of the World Intellectual Property Organization and as such commit themselves to promoting "intellectual property." However, under the territoriality principle, which is itself universally recognized, legal protection is limited to the territory of the respective legislature. This geographic fragmentation is compounded by restrictions on foreigners' access to local legal protection. The contribution explains which actors address the tension between global communication and fragmented intellectual property protection, and in which regulatory ways. A distinction is drawn between harmonization of laws under continued fragmentation; the creation of supranationally uniform procedures, intellectual property rights, and courts; and informal cooperation between private actors and patent offices. The guiding question of this survey is whether all these phenomena can be understood as functional uniform law in the sense of Kropholler and David, that is, whether they are legal rules whose uniform validity has been elevated to a special legal purpose in the interest of undistorted international trade, or whether one can merely state, in an objective and formal sense, a legally binding uniformity that primarily pursues another goal, namely the worldwide strengthening of intellectual property protection. The contribution concludes with a critical assessment of the widespread assumption that the far-advanced unification of intellectual property law is a great success.
We show that bond purchases undertaken in the context of quantitative easing efforts by the European Central Bank created a large mispricing between the market for German and Italian government bonds and their respective futures contracts. On top of the direct effect of the buying pressure on bond prices, we show three indirect effects through which the scarcity of bonds, resulting from the asset purchases, drove a wedge between the futures contracts and the underlying bonds: the deterioration of bond market liquidity, the increased bond specialness on the repurchase agreement market, and the greater uncertainty about bond availability as collateral.
This paper investigates the effect of conventional and unconventional (e.g. Quantitative Easing - QE) monetary policy intervention on the insurance industry. We first analyze the impact on the stock performance of 166 (re)insurers of the last QE programme launched by the European Central Bank (ECB) by constructing an event study around the announcement date. Then we enlarge the scope by looking at the monetary policy surprise effects on the same sample of (re)insurers over a timeframe of 12 years, also extending the analysis to the Credit Default Swaps (CDS) market. In the second part of the paper, by building a set of balance sheet-based indices, we identify the characteristics of (re)insurers that determine sensitivity to monetary policy actions. Our evidence suggests that a single intervention extrapolated from the comprehensive strategy cannot be used to estimate the effect of monetary policy intervention on the market. With respect to the impact of monetary policies, we show how the effect of interventions changes over time. Expansionary monetary policy interventions that generated an instantaneous reduction of interest rates moved stock prices in the same direction until September 2010. This effect turned positive during the European sovereign debt crisis. However, the effect faded away in 2014-2015. The pattern is confirmed by the impact on the CDS market. With regard to the determinants of these effects, our analysis suggests that sensitivity is mainly driven by asset allocation and in particular by exposure to fixed income assets.
The paper investigates the determinants of the idiosyncratic volatility puzzle by allowing linkages across asset returns. The first contribution of the paper is to show that portfolios sorted by increasing indegree, computed on a network based on Granger causality tests, have lower expected returns, not related to idiosyncratic volatility. Second, empirical evidence indicates that stocks with higher idiosyncratic volatility have lower exposure to the indegree risk factor.
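The indegree construction described in this abstract can be sketched in a few lines. The following is a minimal numpy-only illustration, not the paper's actual estimation procedure: the bivariate lag-augmented F-test, the fixed critical value `f_crit`, and the function names are assumptions made here for illustration.

```python
import numpy as np

def granger_f_stat(x, y, lags=1):
    """F-statistic for the hypothesis 'x Granger-causes y'.
    Restricted model:   y_t ~ const + lags of y
    Unrestricted model: y_t ~ const + lags of y + lags of x"""
    T = len(y)
    rows = T - lags
    Y = y[lags:]
    # Build lag matrices: column k holds the k-th lag of the series.
    Ylags = np.column_stack([y[lags - k:T - k] for k in range(1, lags + 1)])
    Xlags = np.column_stack([x[lags - k:T - k] for k in range(1, lags + 1)])
    ones = np.ones((rows, 1))
    Xr = np.hstack([ones, Ylags])          # restricted regressors
    Xu = np.hstack([ones, Ylags, Xlags])   # unrestricted regressors

    def ssr(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        return resid @ resid

    ssr_r, ssr_u = ssr(Xr), ssr(Xu)
    df_den = rows - Xu.shape[1]
    return ((ssr_r - ssr_u) / lags) / (ssr_u / df_den)

def indegree_sort(returns, lags=1, f_crit=4.0):
    """returns: dict name -> 1D return series. Draw an edge i -> j when
    the F-stat exceeds f_crit; a stock's indegree counts the stocks that
    Granger-cause it. Returns names sorted by indegree, plus the counts."""
    names = list(returns)
    indeg = {n: 0 for n in names}
    for i in names:
        for j in names:
            if i != j and granger_f_stat(returns[i], returns[j], lags) > f_crit:
                indeg[j] += 1
    return sorted(names, key=lambda n: indeg[n]), indeg
```

Portfolios would then be formed by splitting the sorted list into buckets; the abstract's claim is that the high-indegree buckets earn lower expected returns.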
In talent-intensive jobs, workers’ quality is revealed by their performance. This enhances productivity and earnings, but also increases layoff risk. Firms cannot insure workers against this risk if they compete fiercely for talent. In this case, the more risk-averse workers will choose less quality-revealing jobs. This lowers expected productivity and salaries. Public unemployment insurance corrects this inefficiency, enhancing employment in talent-sensitive industries, consistent with international evidence. Unemployment insurance dominates legal restrictions on firms’ dismissals, which penalize more talent-sensitive firms and thus depress expected productivity. Finally, unemployment insurance fosters education, by encouraging investment in risky human capital that enhances talent discovery.
What institutional arrangements for an independent central bank with a price stability mandate promote good policy outcomes when unconventional policies become necessary? Unconventional monetary policy poses challenges. The large-scale asset purchases needed to counteract the zero lower bound on nominal interest rates have uncomfortable fiscal and distributional consequences and require central banks to assume greater risks on their balance sheets.
In his paper, Athanasios Orphanides draws lessons from the experience of the Bank of Japan (BoJ) since the late 1990s for the institutional design of independent central banks. He comes to the conclusion that lack of clarity on the precise definition of price stability, coupled with concerns about the legitimacy of large balance sheet expansions, hinders policy: It encourages the central bank to eschew the decisive quantitative easing needed to reflate the economy and instead to accommodate too-low inflation. The BoJ’s experience with the zero lower bound suggests important benefits from a clear definition of price stability as a symmetric 2% goal for inflation, which the Bank adopted in 2013.
We assess the relationship between finance and growth over the period 1980-2014. We estimate a cross-country growth regression for 48 countries during 20 periods of 15 years starting in 1980 (to 1995) and ending in 1999 (to 2014). We use OLS and IV estimations and we find that: 1) overall financial development had a positive effect on economic growth during all periods of our sample, i.e., we confirm that from 1980 to 2014 financial services provided by the various financial systems were significant (to various degrees) for firm creation, industrial expansion and economic growth; but that, 2) the structure of financial markets was particularly relevant for economic growth until the financial crisis; while 3) the structure of the banking sector has played a major role since then; and finally that, 4) the legal system is the primary determinant of the effectiveness of the overall financial system in facilitating innovation and growth in (almost) all of our sample period. Hence, overall our results suggest that the relationship between finance and growth matters but also that it varies over time in strength and in sector origination.
JEL Classification: O16, G16, G20.
Bargaining with a bank
(2018)
This paper examines bargaining as a mechanism to resolve information problems. To guide the analysis, I develop a parsimonious model of a credit negotiation between a bank and firms with varying levels of impatience. In equilibrium, impatient firms accept the bank’s offer immediately, while patient firms wait and negotiate price adjustments. I test the empirical predictions using a hand-collected dataset on credit line negotiations. Firms signing the bank’s offer right away draw down their line of credit after origination and default more than late signers. Late signers negotiate price adjustments more frequently, and, consistent with the model, these adjustments predict better ex post performance.
The authors relax the standard assumption in the dynamic stochastic general equilibrium (DSGE) literature that exogenous processes are governed by AR(1) processes and estimate ARMA(p,q) orders and parameters of exogenous processes. Methodologically, they contribute to the Bayesian DSGE literature by using Reversible Jump Markov Chain Monte Carlo (RJMCMC) to sample from the unknown ARMA orders and their associated parameter spaces of varying dimensions.
In estimating the technology process in the neoclassical growth model using postwar US GDP data, they cast considerable doubt on the standard AR(1) assumption in favor of higher-order processes. They find that the posterior concentrates density on hump-shaped impulse responses for all endogenous variables, consistent with alternative empirical estimates and the rigidities behind many richer structural models. Sampling from noninvertible MA representations, a negative response of hours to a positive technology shock is contained within the posterior credible set. While the posterior contains significant uncertainty regarding the exact order, the results are insensitive to the choice of data filter; this contrasts with the authors’ ARMA estimates of GDP itself, which vary significantly depending on the choice of HP or first difference filter.
The Eastern Steppe of Mongolia is one of the world's largest mostly intact grassland ecosystems and is characterised by a close coupling of societal and natural processes. In this ecosystem, mobility is one of the key characteristics of wildlife and human societies alike. The current economic development of Mongolia is accompanied by extensive societal transformation and changes in nomadic lifestyles, which potentially affects the unique steppe ecosystem and its biodiversity. The changing lifestyles are mainly characterised by rural-urban migration, resulting in reduced mobility of herders and their livestock, and presumably affecting wildlife. The question is how mobility can be fostered under these transformation processes. Time is pressing as a new generation is born which is growing up in urban environments and with new skill sets but a potential loss of the tight connection to nature and the nomadic lifestyle.
This paper presents new evidence on the expectation formation process from a Dutch household survey. Households become too optimistic about their future income after their income has improved, consistent with the over-extrapolation of their experience. We show that this effect of experience is persistent and that households over-extrapolate income losses more than income gains. Furthermore, older households over-extrapolate more, suggesting that they did not learn over time to form more accurate expectations. Finally, we study the relationship between expectation errors and consumption. We find that more over-optimistic households intend to consume more and subsequently report higher consumption, even though they do not consume as much as they intended to. These results suggest that over-extrapolation hurts consumers and amplifies business cycles.
I present a new business cycle model in which decision making follows a simple mental process motivated by neuroeconomics. Decision makers first compute the value of two different options and then choose the option that offers the highest value, but with errors. The resulting model is highly tractable and intuitive. A demand function in levels replaces the traditional Euler equation. As a result, even liquid consumers can have a large marginal propensity to consume. The interest rate affects consumption through the cost of borrowing and not through intertemporal substitution. I discuss the implications for stimulus policies.
Manipulative communications touting stocks are common in capital markets around the world. Although the price distortions created by so-called “pump-and-dump” schemes are well known, little is known about the investors in these frauds. By examining 421 “pump-and-dump” schemes between 2002 and 2015 and a proprietary set of trading records for over 110,000 individual investors from a major German bank, we provide evidence on the participation rate, magnitude of the investments, losses, and the characteristics of the individuals who invest in such schemes. Our evidence suggests that participation is quite common and involves sizable losses, with nearly 6% of active investors participating in at least one “pump-and-dump” and an average loss of nearly 30%. Moreover, we identify several distinct types of investors, some of which should not be viewed as falling prey to these frauds. We also show that portfolio composition and past trading behavior can better explain participation in touted stocks than demographics. Our analysis offers insights into the challenges associated with designing effective investor protection against market manipulation.
The use of evidence and economic analysis in policymaking is on the rise, and accounting standard setting and financial regulation are no exception. This article discusses the promise of evidence-based policymaking in accounting and financial markets as well as the challenges and opportunities for research supporting this endeavor. In principle, using sound theory and robust empirical evidence should lead to better policies and regulations. But despite its obvious appeal and substantial promise, evidence-based policymaking is easier demanded than done. It faces many challenges related to the difficulty of providing relevant causal evidence, lack of data, the reliability of published research, and the transmission of research findings. Overcoming these challenges requires substantial infrastructure investments for generating and disseminating relevant research. To illustrate this point, I draw parallels to the rise of evidence-based medicine. The article provides several concrete suggestions for the research process and the aggregation of research findings if scientific evidence is to inform policymaking. I discuss how policymakers can foster and support policy-relevant research, chiefly by providing and generating data. The article also points to potential pitfalls when research becomes increasingly policy-oriented.
Through the lens of market participants' objective to minimize counterparty risk, we provide an explanation for the reluctance to clear derivative trades in the absence of a central clearing obligation. We develop a comprehensive understanding of the benefits and potential pitfalls with respect to a single market participant's counterparty risk exposure when moving from a bilateral to a clearing architecture for derivative markets. Previous studies suggest that central clearing is beneficial for single market participants in the presence of a sufficiently large number of clearing members. We show that three elements can render central clearing harmful for a market participant's counterparty risk exposure regardless of the number of its counterparties: 1) correlation across and within derivative classes (i.e., systematic risk), 2) collateralization of derivative claims, and 3) loss sharing among clearing members. Our results have substantial implications for the design of derivatives markets, and highlight that recent central clearing reforms might not incentivize market participants to clear derivatives.
We study the relevance of signaling and marketing as explanations for the discount control mechanisms that a closed-end fund (CEF) may choose to adopt in its prospectus. These policies are designed to narrow the potential gap between share price and net asset value, measured by the fund's discount. The two most common discount control mechanisms are explicit discretion to repurchase shares based on the magnitude of the fund discount and mandatory continuation votes that give shareholders the opportunity to liquidate the fund. We find very limited evidence that a discount control mechanism serves as a costly signal of information. Funds with mandatory voting are not more likely to delist than other CEFs, either in general or when the fund discount is large. Similarly, funds that explicitly discuss share repurchases as a potential response do not subsequently buy back shares more often when discounts do increase. Instead, the existence of these policies is more consistent with marketing explanations, because they are associated with an increased probability of issuing more equity in subsequent periods.
Direct financing of consumer credit by individual investors or non-bank institutions through marketplace lending is a relatively new phenomenon in financial markets. The emergence of online platforms has made this type of financial intermediation widely available. This paper analyzes the performance of marketplace lending using proprietary cash flow data for each individual loan from the largest platform, Lending Club. While individual loan characteristics matter for amateur investors holding a few loans, sophisticated lenders, including institutional investors, usually form broad portfolios to benefit from diversification. We find high risk-adjusted performance of approximately 40 basis points per month for these basic loan portfolios. This abnormal performance suggests that Lending Club and similar marketplace lenders are likely to attract capital to finance a growing share of the consumer credit market. In the absence of a competitive response from traditional credit providers, these loans lower costs for the ultimate borrowers and increase returns for the ultimate lenders.
We characterize the optimal linear tax on capital in an Overlapping Generations model with two-period-lived households facing uninsurable idiosyncratic labor income risk. The Ramsey government internalizes the general equilibrium feedback of private precautionary saving. For logarithmic utility, our full analytical solution of the Ramsey problem shows that the optimal aggregate saving rate is independent of income risk, while the optimal time-invariant tax on capital is increasing in income risk. Its sign depends on the extent of risk and on the Pareto weight of future generations. If the Ramsey tax rate that maximizes steady-state utility is positive, then implementing this tax rate permanently generates a Pareto-improving transition even if the initial equilibrium is dynamically efficient. We generalize our results to Epstein-Zin-Weil utility and show that the optimal steady-state saving rate is increasing in income risk if and only if the intertemporal elasticity of substitution is smaller than one.
On apparent giants: what TARGET balances really mean. A financial-economic review
(2018)
The Bundesbank's TARGET balance currently amounts to almost EUR 1 trillion. According to critics, this entails heavy burdens and risks for German taxpayers and shows that Germany has become a "self-service store" within the Eurosystem. Against this background, the paper discusses in detail how TARGET balances arise in the first place and what they mean in financial-economic terms. The policy analysis concludes that, contrary to the critics' claims, under the conditions of a normally functioning monetary union TARGET balances are merely clearing balances with no further implications, although they can provide useful information about deeper regional economic shifts. In the extreme scenario of a break-up of the monetary union, TARGET balances could indeed be interpreted as open positions, but their eventual settlement would, much like Brexit, depend on complicated political negotiations, so that their recoverable value is a matter of pure speculation. For those who consider the extreme scenario significant and call for political action, two solutions appear sensible, both of which would strengthen the euro area institutionally: (i) introducing a settlement practice along the lines of the US Fedwire system, which amounts to a purely notional settlement in the form of a transfer to a joint (open-market) account at the ECB; (ii) consolidating all monetary activities at the ECB, so that the regional attribution of payment flows disappears (and with it the TARGET balances), because all banks would then deal directly with one and the same central bank and payments would be settled directly between the banks involved.
Based on OECD evidence, equity/housing-price busts and credit crunches are followed by substantial increases in public consumption. These increases in unproductive public spending lead to increases in distortionary marginal taxes, a policy in sharp contrast with the presumably optimal Keynesian fiscal stimulus after a crisis. Here we argue that this seemingly adverse policy selection is optimal under rational learning about the frequency of rare capital-value busts. Bayesian updating after a bust implies massive belief jumps toward pessimism, with investors and policymakers believing that busts will arrive more frequently in the future. Lowering taxes would be like kicking a sick horse to make it stand up and run: pessimistic markets would be unwilling to invest enough under any temporarily generous tax regime.
Asset transaction prices sampled at high frequency are much staler than one might expect, in the sense that they frequently lack new updates and thus show zero returns. In this paper, we propose a theoretical framework for formalizing this phenomenon. It hinges on the existence of a latent continuous-time stochastic process pt valued in the open interval (0, 1), which represents at any point in time the probability of the occurrence of a zero return. Using a standard infill asymptotic design, we develop an inferential theory for nonparametrically testing the null hypothesis that pt is constant over one day. Under the alternative, which encompasses a semimartingale model for pt, we develop nonparametric inferential theory for the probability of staleness that includes the estimation of various integrated functionals of pt and its quadratic variation. Using a large dataset of stocks, we provide empirical evidence that the null of a constant probability of staleness is soundly rejected. We then show that the variability of pt is mainly driven by transaction volume and is almost unaffected by bid-ask spread and realized volatility.
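As a rough illustration of the staleness notion in the abstract above (not the authors' estimator, which is a nonparametric infill-asymptotic procedure), the share of zero high-frequency returns in a day gives a naive estimate of the average staleness probability pt. The sketch below uses synthetic tick data; all names and the 40% update rate are illustrative assumptions.

```python
import random

def zero_return_share(prices):
    """Fraction of consecutive price pairs with a zero return:
    a naive proxy for the day's average staleness probability pt."""
    returns = [b - a for a, b in zip(prices, prices[1:])]
    return sum(1 for r in returns if r == 0) / len(returns)

# Synthetic tick prices: a fresh quote arrives with probability 0.4,
# otherwise the last price is repeated (a stale observation).
random.seed(7)
price, prices = 100.0, []
for _ in range(1000):
    if random.random() < 0.4:
        price += random.gauss(0.0, 0.05)
    prices.append(price)

# The estimate should sit near the non-update rate of 0.6.
print(round(zero_return_share(prices), 2))
```

Testing whether pt is constant, as the paper does, requires comparing such local estimates across subintervals of the day rather than a single daily average.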
We present empirical evidence on the heterogeneity in monetary policy transmission across countries with different home ownership rates. We use household-level data together with shocks to the policy rate identified from high-frequency data. We find that housing tenure reacts more strongly to unexpected changes in the policy rate in Germany and Switzerland (the OECD countries with the lowest home ownership rates) than existing evidence suggests for the U.S. An unexpected decrease in the policy rate by 25 basis points increases the home ownership rate by 0.8 percentage points in Germany and by 0.6 percentage points in Switzerland. The response of non-housing consumption in Switzerland is less heterogeneous across renters and mortgagors, and has a different pattern across age groups, than in the U.S. We discuss economic explanations for these findings and implications for monetary policy.
In 1564, the Ulm military expert and writer Leonhard Fronsperger published the treatise "Von dem Lob deß Eigen Nutzen" ("In Praise of Self-Interest"), in which he argues that the consistent pursuit of one's own benefit as an individual maxim of action ultimately promotes the common good. The work of slightly more than one hundred pages was published in Frankfurt am Main, a center of European book printing and trade, and is mentioned in the first published catalogue of the Frankfurt Book Fair. Fronsperger presents his thesis, quite revolutionary for its time, in the form of a satirical encomium and supports it with an extensive analysis of society. He observes that the prevailing forms of political rule, social institutions, and commercial relations all rest on the consistent pursuit of self-interest by every actor, and that the orientation of individual action toward the common good demanded by the Church cannot be found in reality. Indeed, he considers the theologians' critique of egoistic individual action to be mistaken, since he finds the state, the economy, and society to be functioning well on the whole.
In what follows, we first document the author's biography, the genesis and dissemination of the work, and its distinctive literary form. We then discuss its central thesis in three different intellectual-historical contexts, each of particular importance for the emergence of modern social and economic theory. In epistemological and political-theoretical terms, Fronsperger's work shows clear parallels to the analyses that Niccolò Machiavelli and later Giovanni Botero presented in Italy on the importance of a reason of state grounded in the individual interests of the prince, and on the driving forces of successful urban development. By contrast, there are marked differences from the views of the German-speaking reformers following Luther, who propagated the distinction between the spiritual and the worldly sphere and thereby fostered the development of a distinct morality for economic life, but who, within that sphere, mostly advocated an orientation toward the "common good." By demanding the pursuit of self-interest instead, Fronsperger anticipates economic and social-theoretical insights into the nature and effects of the division of labor that were formulated only some 150 years later by Bernard Mandeville and Adam Smith in England and Scotland. Fronsperger's work thus offers an outstanding example of how the interplay of economic success, a realistic view of human nature, and certain aspects of the Reformation gave rise to a new normative understanding of the driving forces of economic and social dynamism, later known as the "spirit of capitalism."
The propagation of regional shocks in housing markets: evidence from oil price shocks in Canada
(2018)
Shocks to the demand for housing that originate in one region may seem important only for that regional housing market. We provide evidence that such shocks can also affect housing markets in other regions. Our analysis focuses on the response of Canadian housing markets to oil price shocks. Oil price shocks constitute an important source of exogenous regional variation in income in Canada because oil production is highly geographically concentrated. We document that, at the national level, real oil price shocks account for 11% of the variability in real house price growth over time. At the regional level, we find that unexpected increases in the real price of oil raise housing demand and real house prices not only in oil-producing regions, but also in other regions. We develop a theoretical model of the propagation of real oil price shocks across regions that helps explain this finding. The model differentiates between oil-producing and non-oil-producing regions and incorporates multiple sectors, trade between provinces, government redistribution, and consumer spending on fuel. We empirically confirm the model prediction that oil price shocks are propagated to housing markets in non-oil-producing regions by the government redistribution of oil revenue and by increased interprovincial trade.
Europe is a key normative power. Its legitimacy as a force for ensuring the reign of rule of law in international relations is unparalleled. It also packs an economic punch. In data protection and the fight against cybercrime, European norms have been successfully globalized. The time is right to take the next step: Europe must now become the international normative leader for developing a new deal on internet governance. To ensure this, European powers should commit to rules that work in security, economic development and human rights on the internet and implement them in a reinvigorated IGF.
Even though the importance of micro data transparency is a well-established fact, European institutions still lag behind the US when it comes to the provision of financial market data to academics. In this Policy Letter we discuss five types of micro data that are crucial for monitoring (systemic) risk in the financial system and for identifying and understanding interlinkages in financial markets, and which thus have important implications for policymakers and regulatory authorities. We conclude that for all five areas of micro data outlined in this Policy Letter (bank balance sheet data, asset portfolio data, market transaction data, high-frequency market data, and central bank data), the benefits of increased transparency greatly outweigh the potential downsides. Hence, European policymakers would do well to follow the US example and close the sizeable gap in micro data transparency. In most cases, the relevant data is already collected (at least at the national level) but is simply not made available to academics, for reasons that are partly incomprehensible. Overcoming these obstacles could foster financial stability in Europe and ensure a level playing field with US regulators and policymakers.
Germany and Europe
(2018)
Otmar Issing discusses the reactions in Germany to French President Macron's plans from his much-noted speech on the future of Europe at the Sorbonne in Paris. Issing interprets the outcome of the exploratory coalition talks between CDU/CSU and SPD as a farewell to the idea of a stability-oriented European community and warns against endangering the single market and the freedoms associated with it through excessive ambitions, thereby fueling growing distrust of Europe.
In particular, Issing sees the planned development of the ESM into a European Monetary Fund anchored in Union law as surrendering the funds it provides to the disposal of a political majority. Moreover, he argues, the appointment of a European finance minister would create a fiscal union complementing the monetary union and thus shift fiscal-policy competence from the national to the European level. Taken to its logical conclusion, this would mean abandoning the fundamental principle of democratic legitimation and control of fiscal-policy decisions.
The outcome of the exploratory talks must be understood as a farewell to the idea of a stability-oriented European community. It breaks the promises made to the citizens of Germany before the introduction of the euro.
The article analyzes the preconditions for stable money and engages fundamentally with Hayek's theses on alternative currency systems as well as his fundamental critique of the possibility of conducting monetary policy on a scientific basis. It examines Hayek's proposal for the denationalization of money and his theses on the superiority of money created through private competition. In this context, the article draws a connection to the current debate on cryptocurrencies and raises the question of whether virtual currencies such as Bitcoin are suited to bring about Hayekian currency competition. In contrast to Hayek's call for the abolition of central banks, their decisive role in sustaining growth with stable prices is then sketched, and the importance of central bank independence for the lasting conduct of a stability-oriented monetary policy is emphasized. Nevertheless, the article notes that central banks that overstep their mandate may, in the long run, themselves undermine their independent status and thus provoke the re-transfer of the competence for key monetary-policy decisions to government and parliament. Acknowledging the dangers of far-reaching independence vested in a few at the top of the central banks, the article then underscores the importance of their accountability and of the transparency of their decisions.
In this study we investigate which economic ideas were prevalent in the post-crisis macroprudential discourse, in order to understand the availability of ideas for reform-minded agents. We base our analysis on new findings in the field of ideational shifts and regulatory science, which posit that change agents engage with new ideas pragmatically and strategically in their effort to have their economic ideas institutionalized. We argue that in these epistemic battles over new regulation, scientific backing by academia is the key resource determining the outcome. We show that the reforms currently implemented internationally follow this pattern. In our analysis we contrast the entire discourse on systemic risk and macroprudential regulation with Borio's initial 2003 proposal for a macroprudential framework. We find that mostly cross-sectional measures aimed at increasing the resilience of the financial system, rather than inter-temporal measures dampening the financial cycle, have been implemented. We provide evidence of the lack of support for new macroprudential thinking within academia and argue that this is partially responsible for the absence of anti-cyclical macroprudential regulation. Most worryingly, the financial cycle is largely absent from the academic discourse and is only tacitly assumed, rather than fully fleshed out, in technocratic discourses, pointing to the possibility that no anti-cyclical measures will be forthcoming.
In the last decade, central bank interventions, flights to safety, and the shift in derivatives clearing have resulted in exceptionally high demand for high-quality liquid assets, such as German treasuries, in the securities lending market, in addition to traditional repo market activities. Despite this high demand, the realizable securities lending income has remained economically negligible for most beneficial owners. We provide empirical evidence of pricing inefficiencies in the non-transparent, oligopolistic securities lending market for German treasuries from 2006 to 2015. Consistent with the theory of Duffie, Gârleanu, and Pedersen (2005), we find that the interests of less connected market participants are underrepresented, which is evident in the longer maturity segment, where lenders are more likely to be conservative passive investors such as pension funds and insurance firms. The low price elasticity in this segment prevents these beneficial owners from fully capitalizing on the additional income from securities lending, giving rise to important negative welfare implications.
We show that time-varying volatility of volatility is a significant risk factor which affects the cross-section and the time-series of index and VIX option returns, beyond volatility risk itself. Volatility and volatility-of-volatility measures, identified model-free from option price data as the VIX and VVIX indices, respectively, are only weakly related to each other. Delta-hedged index and VIX option returns are negative on average, and are more negative for strategies that are more exposed to volatility and volatility-of-volatility risks. Volatility and volatility of volatility significantly and negatively predict future delta-hedged option payoffs. The evidence is consistent with a no-arbitrage model featuring time-varying market volatility and volatility-of-volatility factors, both of which carry a negative market price of risk.