We use a novel disaggregate sectoral euro area data set with a regional breakdown to investigate price changes and suggest a new method to extract factors from overlapping data blocks. This allows us to separately estimate aggregate, sectoral, country-specific and regional components of price changes. We thereby provide an improved estimate of the sectoral factor in comparison with previous literature, which decomposes price changes into an aggregate and an idiosyncratic component only and interprets the latter as sectoral. We find that the sectoral component explains much less of the variation in sectoral regional inflation rates and exhibits much less volatility than previous findings for the US indicate. We further contribute to the literature on price setting by providing evidence that country- and region-specific factors play an important role in addition to the sector-specific factors, emphasising the heterogeneity of inflation dynamics along different dimensions. We also conclude that sectoral price changes have a “geographical” dimension that leads to new insights regarding the properties of sectoral price changes.
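The abstract does not spell out the overlapping-blocks factor extraction itself. As a stylized illustration of the underlying idea, separating aggregate, sectoral and regional components of inflation, the following sketch simulates a sector-by-region panel and recovers the components by nested averaging; all dimensions, variances and the data-generating process are invented for illustration and are not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
T, S, R = 200, 5, 8  # periods, sectors, regions (illustrative sizes)

# simulate latent components of inflation
agg = rng.normal(0.0, 1.0, T)            # common aggregate component
sec = rng.normal(0.0, 0.5, (T, S))       # sector-specific components
reg = rng.normal(0.0, 0.3, (T, R))       # region-specific components
noise = rng.normal(0.0, 0.2, (T, S, R))  # idiosyncratic part
pi = agg[:, None, None] + sec[:, :, None] + reg[:, None, :] + noise

# recover the components by nested averaging across the panel
agg_hat = pi.mean(axis=(1, 2))                 # aggregate factor estimate
sec_hat = pi.mean(axis=2) - agg_hat[:, None]   # sectoral deviations
reg_hat = pi.mean(axis=1) - agg_hat[:, None]   # regional deviations

corr = np.corrcoef(agg, agg_hat)[0, 1]
```

With many sector-region cells, the cross-sectional average tracks the true aggregate component closely, and the sectoral and regional deviations can then be analysed separately, which is the intuition behind decomposing price changes along several dimensions rather than into an aggregate and a single idiosyncratic part.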
Euro area shadow banking activities in a low-interest-rate environment: a flow-of-funds perspective
(2016)
Very low policy rates as well as the substantial redesign of rules and supervisory institutions have changed background conditions for the Euro Area’s financial intermediary sector substantially. Both policy initiatives have been targeted at improving societal welfare, but their potential side effects (or costs) have been discussed intensively, in academic as well as policy circles. Very low policy rates (and correspondingly low market rates) are likely to whet investors’ risk-taking incentives. Concurrently, the tightened regulatory framework, in particular for banks, increases the comparative attractiveness of the less regulated, so-called shadow banking sector. Employing flow-of-funds data for the Euro Area’s non-bank banking sector, we take stock of recent developments in this part of the financial sector. In addition, we examine to what extent low interest rates have had an impact on investment behavior. Our results reveal a declining role of banks (and, simultaneously, an increase in non-bank banking). Overall intermediation activity, hence, has remained roughly at the same level. Moreover, our findings also suggest that non-bank banks have tended to take positions in riskier assets (particularly in equities). In line with this observation, balance-sheet-based risk measures indicate a rise in sector-specific risks in the non-bank banking sector (when narrowly defined).
The global financial crisis (as well as the European sovereign debt crisis) has led to a substantial redesign of rules and institutions, aiming in particular at underwriting financial stability. At the same time, the crisis generated a renewed interest in properly appraising systemic financial vulnerabilities. Employing the most recent data and applying a variety of recently developed methods, we provide an assessment of indicators of financial stability within the Euro Area. Taking a “functional” approach, we comprehensively analyze all financial intermediary activities, regardless of the institutional roof, banks or non-bank (shadow) banks, under which they are conducted. Our results reveal a declining role of banks (and a commensurate increase in non-bank banking). These structural shifts (between institutions) are coincident with regulatory and supervisory reforms (implemented or firmly anticipated) as well as a non-standard monetary policy environment. They might, unintendedly, imply a rise in systemic risk. Overall, however, our analyses suggest that financial imbalances have been reduced over the course of recent years. Hence, the financial intermediation sector has become more resilient. Nonetheless, existing (equity) buffers would probably not suffice to face substantial volatility shocks.
Non-bank (-balance sheet) based financial intermediation has become considerably more important over the last couple of decades. For the U.S., this trend has been discussed ever since the mid-1990s. As a consequence, traditional monetary transmission mechanisms, mainly operating through bank balance sheets, have apparently become less relevant. This applies in particular to the bank lending channel. Concurrently, recent theoretical and empirical work has uncovered a "risk-taking channel" of monetary policy. This mechanism is not confined to traditional banks but has been found to operate across the spectrum of financial intermediaries and intermediation devices, including securitization and collateralized lending/borrowing. In addition, recent empirical evidence suggests that the increasing importance of shadow-banking activities might have given rise to a so-called "waterbed effect", a mediating mechanism that dampens or counteracts the reactions typically expected in response to monetary policy impulses. Employing flow-of-funds data, we document that a trend towards non-bank (though not necessarily more 'market'-based) intermediation has also occurred in the Euro Area. This is, however, a fairly recent development, substantially weaker than in the U.S. Nonetheless, analyzing the response of Euro Area bank and non-bank financial intermediaries to monetary policy impulses, we find some notable behavioral differences between mainly deposit-funded and more 'market'-based financial intermediaries. We also detect, inter alia, the existence of a (still) fairly weak, but potentially policy-relevant, "waterbed" effect.
Regional inflation dynamics within and across Euro area countries and a comparison with the US
(2006)
We investigate co-movements and heterogeneity in inflation dynamics of different regions within and across euro area countries using a novel disaggregate dataset to improve the understanding of inflation differentials in the European Monetary Union. We employ a model where regional inflation dynamics are explained by common euro area and country-specific factors as well as an idiosyncratic regional component. Our findings indicate a substantial common area-wide component that can be related to the common monetary policy in the euro area and to external developments, in particular exchange rate movements and changes in oil prices. The effects of the area-wide factors differ across regions, however. We relate these differences to structural economic characteristics of the various regions. We also find a substantial national component. Our findings do not differ substantially before and after the formal introduction of the euro in 1999, suggesting that convergence had largely taken place before the mid-1990s. Analysing US regional inflation developments yields similar results regarding the relevance of common US factors. Finally, we find that disaggregate regional inflation information, as summarised by the area-wide factors, is important in explaining aggregate euro area and US inflation rates, even after conditioning on macroeconomic variables. Therefore, monitoring regional inflation rates within euro area countries can enhance the monetary policy maker’s understanding of aggregate area-wide inflation dynamics. JEL Classification: E31, E52, E58, C33
Studies employing micro price data to examine the extent of international goods market integration tend to find that borders induce arbitrage-impeding transaction costs which contribute to segmenting national markets. Analyzing household scanner price data from three euro area countries, Belgium, Germany and the Netherlands, we document that Belgian households living in the vicinity of the border with the Netherlands pay almost 10% more for the same good than their Dutch counterparts. German consumers, on the other hand, face prices that are on average up to around 3% lower than those in the neighboring Netherlands. A counterfactual analysis of within-country price discontinuities reveals no comparable effects. The induced costs of crossing national borders amount to at least 13%. We also find evidence of border discontinuities in various household preference characteristics (such as demand elasticities and goods valuation) and in household shopping patterns such as shopping frequencies.
Microeconometric evidence on demand-side real rigidity and implications for monetary non-neutrality
(2016)
To model the observed slow response of aggregate real variables to nominal shocks, most macroeconomic models incorporate real rigidities in addition to nominal rigidities. One popular way of modelling such a real rigidity is to assume a non-constant demand elasticity. Using a homescan data set for three European countries, which includes prices and quantities bought for a large number of goods as well as consumer characteristics, we provide estimates of price elasticities of demand and of the degree of demand-side real rigidity. We find that the median price elasticity of demand is about 4. Furthermore, we find evidence of demand-side real rigidities. These are, however, much smaller than what is often assumed in macroeconomic models. The median estimate of demand-side real rigidity, the super-elasticity, lies in a range between 1 and 2. To quantitatively assess the implications of our empirical estimates, we calibrate a menu-cost model with the estimated super-elasticity. We find that the degree of monetary non-neutrality doubles in the model including demand-side real rigidity compared to the model with only nominal rigidity, suggesting a multiplier effect of around two. However, the model can explain only up to 6% of the monetary non-neutrality observed in the data, implying that additional multipliers are necessary to match the behavior of aggregate variables.
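A minimal sketch of the first estimation step described above, recovering a price elasticity of demand from prices and quantities, is a log-log demand regression. This deliberately ignores the endogeneity of prices and the consumer-characteristics controls that actual homescan estimation requires; all numbers below are invented, with the true elasticity set near the reported median of 4.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
log_p = rng.normal(0.0, 0.3, n)      # log prices across observed purchases
eps_true = -4.0                      # assumed constant elasticity (illustrative)
log_q = 1.0 + eps_true * log_p + rng.normal(0.0, 0.5, n)

# OLS of log quantity on log price; the slope is the demand elasticity here
X = np.column_stack([np.ones(n), log_p])
coef = np.linalg.lstsq(X, log_q, rcond=None)[0]
```

The super-elasticity discussed in the abstract measures how this elasticity itself varies with the relative price, which would require a richer specification than this constant-elasticity sketch.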
Using a set of regional inflation rates, we examine the dynamics of inflation dispersion within the U.S.A., Japan and across U.S. and Canadian regions. We find that inflation rate dispersion is significant throughout the sample period in all three samples. Based on methods applied in the empirical growth literature, we provide evidence in favor of significant mean reversion (β-convergence) in inflation rates in all considered samples. The evidence on σ-convergence is mixed, however. Observed declines in dispersion are usually associated with decreasing overall inflation levels, which indicates a positive relationship between mean inflation and overall inflation rate dispersion. Our findings for the within-distribution dynamics of regional inflation rates show that dynamics are largest for Japanese prefectures, followed by U.S. metropolitan areas. For the combined U.S.-Canadian sample, we find a pattern of within-distribution dynamics that is comparable to that found for regions within the European Monetary Union (EMU). In line with findings in the so-called 'border literature', these results suggest that frictions across European markets are at least as large as they are, e.g., across North American markets. JEL Classification: E31, E52, E58
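The two convergence concepts borrowed from the growth literature can be illustrated on simulated regional inflation data: β-convergence is a negative slope when the change in inflation is regressed on its initial level, and σ-convergence is a decline in cross-sectional dispersion over time. The parameters below are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 30          # regions, years (illustrative)
mu, rho = 3.0, 0.6     # common mean and mean-reversion parameter (assumed)

pi = np.zeros((N, T))
pi[:, 0] = rng.normal(mu, 1.5, N)  # widely dispersed initial inflation rates
for t in range(1, T):
    pi[:, t] = mu + rho * (pi[:, t - 1] - mu) + rng.normal(0.0, 0.3, N)

# beta-convergence: average change in inflation regressed on the initial level;
# a negative slope indicates mean reversion
dy = (pi[:, -1] - pi[:, 0]) / (T - 1)
X = np.column_stack([np.ones(N), pi[:, 0]])
beta = np.linalg.lstsq(X, dy, rcond=None)[0]

# sigma-convergence: cross-sectional dispersion over time
sigma = pi.std(axis=0)
```

In this mean-reverting setup both criteria are satisfied; the paper's point is that in real data β-convergence can hold while σ-convergence need not, since dispersion also co-moves with the overall inflation level.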
We use consumer price data for 205 cities/regions in 21 countries to study deviations from the law-of-one-price before, during and after the major currency crises of the 1990s. We combine data from industrialised nations in North America (United States, Canada, Mexico), Europe (Germany, Italy, Spain and Portugal) and the Asia-Pacific region (Japan, Korea, New Zealand, Australia) with corresponding data from emerging market economies in South America (Argentina, Bolivia, Brazil, Colombia) and Asia (India, Indonesia, Malaysia, Philippines, Taiwan, Thailand). We confirm previous results that both distance and border explain a significant amount of relative price variation across different locations. We also find that currency attacks had major disintegration effects by significantly increasing these border effects, and by raising within-country relative price dispersion in emerging market economies. These effects are found to be quite persistent since relative price volatility across emerging markets today is still significantly larger than a decade ago. JEL classification: F40, F41
We use consumer price data for 81 European cities (in Germany, Austria, Switzerland, Italy, Spain and Portugal) to study deviations from the law-of-one-price before and during the European Economic and Monetary Union (EMU). Analysing both aggregate and disaggregate CPI data for 7 categories of goods, we find that the distance between cities explains a significant amount of the variation in the prices of similar goods in different locations. We also find that the variation of the relative price is much higher for two cities located in different countries than for two equidistant cities in the same country. Under EMU, the elimination of nominal exchange rate volatility has largely reduced these border effects, but distance and border still matter for intra-European relative price volatility. JEL classification: F40, F41
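The distance-and-border regressions in the two abstracts above can be sketched as follows: build all city pairs, then regress a measure of relative-price volatility on log distance and a cross-border dummy. The geography, coefficients and noise below are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_cities = 40
country = rng.integers(0, 2, n_cities)    # two stylized countries
pos = rng.uniform(0.0, 100.0, n_cities)   # city locations on a line

# all city pairs, with pairwise distance and a cross-border dummy
pairs = [(i, j) for i in range(n_cities) for j in range(i + 1, n_cities)]
dist = np.array([abs(pos[i] - pos[j]) for i, j in pairs])
border = np.array([float(country[i] != country[j]) for i, j in pairs])

# stylized relative-price volatility: rises with distance, jumps at the border
vol = (0.5 + 0.1 * np.log(dist + 1.0) + 0.8 * border
       + rng.normal(0.0, 0.1, len(pairs)))

# regress volatility on log distance and the border dummy
X = np.column_stack([np.ones(len(pairs)), np.log(dist + 1.0), border])
coef = np.linalg.lstsq(X, vol, rcond=None)[0]
```

A significantly positive border coefficient over and above the distance effect is precisely the "border effect" finding that the papers report, and its shrinkage under EMU corresponds to a smaller jump term.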
In the New-Keynesian model, optimal interest rate policy under uncertainty is formulated without reference to monetary aggregates as long as certain standard assumptions on the distributions of unobservables are satisfied. The model has been criticized for failing to explain common trends in money growth and inflation, with critics arguing that money should therefore be used as a cross-check in policy formulation (see Lucas (2007)). We show that the New-Keynesian model can explain such trends if one allows for the possibility of persistent central bank misperceptions. Such misperceptions motivate the search for policies that include additional robustness checks. In earlier work, we proposed an interest rate rule that is near-optimal in normal times but includes a cross-check with monetary information. In case of unusual monetary trends, interest rates are adjusted. In this paper, we show in detail how to derive the appropriate magnitude of the interest rate adjustment following a significant cross-check with monetary information, when the New-Keynesian model is the central bank’s preferred model. The cross-check is shown to be effective in offsetting persistent deviations of inflation due to central bank misperceptions. Keywords: Monetary Policy, New-Keynesian Model, Money, Quantity Theory, European Central Bank, Policy Under Uncertainty
Research with Keynesian-style models has emphasized the importance of the output gap for policies aimed at controlling inflation while declaring monetary aggregates largely irrelevant. Critics, however, have argued that these models need to be modified to account for observed money growth and inflation trends, and that monetary trends may serve as a useful cross-check for monetary policy. We identify an important source of monetary trends in the form of persistent central bank misperceptions regarding potential output. Simulations with historical output gap estimates indicate that such misperceptions may induce persistent errors in monetary policy and sustained trends in money growth and inflation. If interest rate prescriptions derived from Keynesian-style models are augmented with a cross-check against money-based estimates of trend inflation, inflation control is improved substantially.
The European Central Bank has assigned a special role to money in its two pillar strategy and has received much criticism for this decision. In this paper, we explore possible justifications. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. Of course, if one allows for a direct effect of money on output or inflation as in the empirical “two-pillar” Phillips curves estimated in some recent contributions, it would be optimal to include a measure of (long-run) money growth in the rule. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. Such misperceptions cause a bias in policy setting. We find that cross-checking and changing interest rates in response to sustained deviations of long-run money growth helps the central bank to overcome this bias. Our argument in favor of ECB-style cross-checking does not require direct effects of money on output or inflation. JEL Classification: E32, E41, E43, E52, E58
The European Central Bank has assigned a special role to money in its two pillar strategy and has received much criticism for this decision. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. JEL Classification: E32, E41, E43, E52, E58
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
Despite the apparent stability of the wage bargaining institutions in West Germany, aggregate union membership has been declining dramatically since the early 1990s. However, aggregate gross membership numbers do not distinguish by employment status and it is impossible to disaggregate them sufficiently. This paper uses four waves of the German Socioeconomic Panel (1985, 1989, 1993, and 1998) to perform a panel analysis of net union membership among employees. We estimate a correlated random effects probit model suggested by Chamberlain (1984) to take proper account of individual-specific effects. Our results suggest that at the individual level the propensity to be a union member has not changed considerably over time. Thus, the aggregate decline in membership is due to composition effects. We also use the estimates to predict net union density at the industry level based on the IAB employment subsample for the period 1985 to 1997. JEL Classification: J5
This paper examines empirically the question whether the presence of foreign banks and a liberal trade regime with regard to financial services can contribute to a stabilization of capital flows to emerging markets. Since foreign banks, so the argument goes, provide better information to foreign investors and increase transparency, the danger of herding is reduced. Previous findings by Kono and Schuknecht (1998) confirmed empirically that such an effect does exist. This study expands their data set with respect to the length of the time period and the number of countries. Contrary to Kono and Schuknecht, it is found that foreign bank penetration tends to increase the volatility of capital flows. The trade regime variables are not significant in explaining cross-country variations in the volatility of capital flows. This result does not change significantly when alternative measures of volatility are considered. This paper was presented at the conference "Financial crisis in transition countries: recent lessons and problems yet to solve" on 13-14 July 2000 at the Institute for Economic Research (IWH) in Halle, Germany.
This paper shows that emerging market eurobond spreads after the Asian crisis can be almost completely explained by market expectations about macroeconomic fundamentals and international interest rates. Contrary to the claim that emerging market bond spreads are driven by market variables such as stock market volatility in the developed countries, it is found that this did not play a significant role after the Asian crisis. Using panel data techniques, it is shown that the determinants of bond spreads can be divided into long-term structural variables and medium-term variables which explain month-to-month changes in bond spreads. As relevant medium-term variables, "consensus forecasts" of real GDP growth and inflation, and international interest rates are identified. The long-term structural factors do not explicitly enter the model and show up as fixed or random country-specific effects. These intercepts are highly correlated with the countries' credit rating.
Mindfully Resisting the Bandwagon – IT Implementation and Its Consequences in the Financial Crisis
(2013)
Although the “financial meltdown” between 2007 and 2009 can be substantially attributed to herding behaviour in the subprime market for credit default swaps, a “mindless” IT implementation by participating financial services providers played a major role in facilitating the underlying bandwagon. The problem was a discrepancy between two core complementary capabilities: (1) the (economic-rationalistic) ability to execute financial transactions (to comply with the herd) in milliseconds and (2) the required contextualized mindfulness capabilities to comprehend the implications of the transactions being executed and the associated IT innovation decisions that enabled these transactions.
The great financial crisis and the euro area crisis led to a substantial reform of financial safety nets across Europe and, critically, to the introduction of supranational elements. Specifically, a supranational supervisor was established for the euro area, with discrete arrangements for supervisory competences and tasks depending on the systemic relevance of supervised credit institutions. A resolution mechanism was created to allow the frictionless resolution of large financial institutions. This resolution mechanism has now been complemented with a funding instrument.
While much more progress has been achieved than most observers could imagine 12 years ago, the banking union remains unfinished, with important gaps and deficiencies. The experience of the past years, especially in the area of crisis management and resolution, has provided impetus for reform discussions, as reflected most recently in the Eurogroup statement of 16 June 2022.
This Policy Insight looks primarily at the current and the desired state of the banking union project. The key underlying question, and the focus here, is the level of ambition and how it is matched with effective legal and regulatory tools. Specifically, two questions will structure the discussions:
What would be a reasonable definition and rationale for a ‘complete’ banking union? And what legal reforms would be required to achieve it?
Banking union is a case of a new remit of EU-level policy that so far has been established on the basis of long pre-existing treaty stipulations, namely, Article 127(6) TFEU (for banking supervision) and Article 114 TFEU (for crisis management and deposit insurance). Could its completion be similarly carried out through secondary law? Or would a more comprehensive overhaul of the legal architecture be required to ensure legal certainty and legitimacy?
Did earnings inequality in the Federal Republic of Germany increase from the 1960s to the 1980s?
(1996)
We investigate the default probability, recovery rates and loss distribution of a portfolio of securitised loans granted to Italian small and medium enterprises (SMEs). To this end, we use loan-level data provided by the European DataWarehouse platform and employ a logistic regression to estimate the company default probability. We combine loan-level default probabilities and recovery rates to estimate the loss distribution of the underlying assets. We find that securitised bank loans are less risky than average bank lending to small and medium enterprises.
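The core estimation step described above, a logistic regression for loan-level default probability, can be sketched with plain numpy via Newton-Raphson. The covariates, coefficients and the 55% loss-given-default used at the end are invented for illustration and are not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
# stylized loan-level covariates, e.g. leverage and profitability (invented)
X = np.column_stack([np.ones(n), rng.normal(0, 1, n), rng.normal(0, 1, n)])
b_true = np.array([-2.0, 1.0, -0.8])      # implies a baseline PD near 12%
p = 1.0 / (1.0 + np.exp(-X @ b_true))
y = (rng.random(n) < p).astype(float)     # default indicator

# fit the logit by Newton-Raphson
b = np.zeros(3)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ b))
    grad = X.T @ (y - mu)                          # score vector
    H = (X * (mu * (1.0 - mu))[:, None]).T @ X     # Fisher information
    b += np.linalg.solve(H, grad)

pd_hat = 1.0 / (1.0 + np.exp(-X @ b))
expected_loss = (pd_hat * 0.55).mean()    # assumed loss-given-default of 55%
```

Feeding the fitted loan-level PDs and recovery rates into a portfolio simulation then yields a loss distribution for the underlying assets, which is the object the paper studies.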
In this paper, we examine how the institutional design affects the outcome of bank bailout decisions. In the German savings bank sector, distress events can be resolved by local politicians or a state-level association. We show that decisions by local politicians with close links to the bank are distorted by personal considerations: While distress events per se are not related to the electoral cycle, the probability of local politicians injecting taxpayers’ money into a bank in distress is 30 percent lower in the year directly preceding an election. Using the electoral cycle as an instrument, we show that banks that are bailed out by local politicians experience less restructuring and perform considerably worse than banks that are supported by the savings bank association. Our findings illustrate that larger distance between banks and decision makers reduces distortions in the decision making process, which has implications for the design of bank regulation and supervision.
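The identification strategy described above, using the electoral cycle as an instrument for whether local politicians resolve a distressed bank, can be illustrated with a stylized two-stage least squares estimate on simulated data. The data-generating process and all coefficients below are invented; the true causal effect of a political bailout on performance is set to -0.4.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3000
z = rng.integers(0, 2, n).astype(float)   # pre-election-year dummy (instrument)
u = rng.normal(0.0, 1.0, n)               # unobserved bank quality

# first stage: politicians are less likely to bail out right before an election
d = (0.5 - 1.0 * z + 0.5 * u + rng.normal(0.0, 1.0, n) > 0).astype(float)
# outcome: bank performance; d is endogenous because both depend on quality u
perf = 1.0 - 0.4 * d - 0.6 * u + rng.normal(0.0, 0.5, n)

# naive OLS is biased by the omitted quality term
X_ols = np.column_stack([np.ones(n), d])
beta_ols = np.linalg.lstsq(X_ols, perf, rcond=None)[0]

# 2SLS: instrument d with the electoral-cycle dummy z
Z = np.column_stack([np.ones(n), z])
d_hat = Z @ np.linalg.lstsq(Z, d, rcond=None)[0]       # first-stage fit
X_iv = np.column_stack([np.ones(n), d_hat])
beta_iv = np.linalg.lstsq(X_iv, perf, rcond=None)[0]   # second stage
```

The OLS slope overstates the negative effect because weaker banks are both more likely to be bailed out by politicians and more likely to perform poorly, while the instrumented estimate recovers a value near the true -0.4, which is the logic behind using the electoral cycle as an instrument.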
In this paper, we investigate how the introduction of complex, model-based capital regulation affected credit risk of financial institutions. Model-based regulation was meant to enhance the stability of the financial sector by making capital charges more sensitive to risk. Exploiting the staggered introduction of the model-based approach in Germany and the richness of our loan-level data set, we show that (1) internal risk estimates employed for regulatory purposes systematically underpredict actual default rates by 0.5 to 1 percentage points; (2) both default rates and loss rates are higher for loans that were originated under the model-based approach, while corresponding risk-weights are significantly lower; and (3) interest rates are higher for loans originated under the model-based approach, suggesting that banks were aware of the higher risk associated with these loans and priced them accordingly. Further, we document that large banks benefited from the reform as they experienced a reduction in capital charges and consequently expanded their lending at the expense of smaller banks that did not introduce the model-based approach. Counter to the stated objectives, the introduction of complex regulation adversely affected the credit risk of financial institutions. Overall, our results highlight the pitfalls of complex regulation and suggest that simpler rules may increase the efficacy of financial regulation.
Using loan-level data from Germany, we investigate how the introduction of model-based capital regulation affected banks’ ability to absorb shocks. The objective of this regulation was to enhance financial stability by making capital requirements responsive to asset risk. Our evidence suggests that banks ‘optimized’ model-based regulation to lower their capital requirements. Banks systematically underreported risk, with underreporting being more pronounced for banks with higher gains from it. Moreover, large banks benefited from the regulation at the expense of smaller banks. Overall, our results suggest that sophisticated rules may have undesired effects if strategic misbehavior is difficult to detect.
In this paper we investigate the implications of providing loan officers with a compensation structure that rewards loan volume and penalizes poor performance versus a fixed wage unrelated to performance. We study detailed transaction information for more than 45,000 loans issued by 240 loan officers of a large commercial bank in Europe. We examine the three main activities that loan officers perform: monitoring, originating, and screening. We find that when the performance of their portfolio deteriorates, loan officers increase their effort to monitor existing borrowers, reduce loan origination, and approve a higher fraction of loan applications. These loans, however, are of above-average quality. Consistent with the theoretical literature on multitasking in incomplete contracts, we show that loan officers neglect activities that are not directly rewarded under the contract, but are in the interest of the bank. In addition, while the response by loan officers constitutes a rational response to a time allocation problem, their reaction to incentives appears myopic in other dimensions.
This paper investigates whether the stock market reacts to unsolicited ratings for a sample of S&P rated firms from January 1996 to December 2005. We first analyze the stock market reaction associated with the assignment of an initial unsolicited rating. We find evidence that this reaction is negative and particularly accentuated for Japanese firms. A comparison between S&P’s initial unsolicited ratings with previously published ratings of two Japanese rating agencies for a Japanese subsample shows that ratings assigned by S&P are systematically worse. Further, we find that the stock market does not react to the transition from an unsolicited to a solicited rating. Comparison of the upgrades in the sample with a matched-sample of upgrades of solicited ratings reveals that the price reactions are no different. In addition, abnormal returns are worse for firms whose rating remained unchanged after the solicitation compared to those for upgraded firms. Finally, we find that Japanese firms are less likely to receive an upgrade. Our findings suggest that unsolicited ratings are biased downwards, that the capital market therefore expects upgrades of formerly unsolicited ratings and punishes firms whose ratings remain unchanged. All these effects seem to be more pronounced for Japanese firms.
We derive the effects of credit risk transfer (CRT) markets on real sector productivity and on the volume of financial intermediation in a model where banks choose their optimal degree of CRT and monitoring. We find that CRT increases productivity in the up-market real sector but decreases it in the low-end segment. If optimal, CRT unambiguously fosters financial deepening, i.e., it reduces credit-rationing in the economy. These effects rely upon the ability of banks to commit to the optimal CRT at the funding stage. The optimal degree of CRT depends on the combination of moral hazard, general riskiness, and the cost of monitoring in non-monotonic ways.
This paper gives an overview of the German banking system and the current challenges it is facing. It starts with an overview of the so-called ‘Three-Pillar Banking System’ and a detailed description of the current structure of the banking system in Germany. A brief comparison of the banking system in Germany with those in other European countries points out its uniqueness. The consequences of the financial crisis of 2007/2008 and further challenges for the German banking system are discussed, as well as the ongoing debate around the question of whether the strong government involvement should be sustained.
Different languages employ different morphosyntactic devices for expressing genericity. And, of course, they also make use of different morphosyntactic and semantic or pragmatic cues which may contribute to the interpretation of a sentence as generic rather than episodic. [...] We will advance the strong hypothesis that it is a fundamental property of lexical elements in natural language that they are neutral with respect to different modes of reference or non-reference. That is, we reject the idea that a certain use of a lexical element, e.g. a use which allows reference to particular spatio-temporally bounded objects in the world, should be linguistically prior to all other possible uses, e.g. to generic and non-specific uses. From this it follows that we do not consider generic uses as derived from non-generic uses as it is occasionally assumed in the literature. Rather, we regard these two possibilities of use as equivalent alternative uses of lexical elements. The typological differences to be noted therefore concern the formal and semantic relationship of generic and non-generic uses to each other; they do not pertain to the question of whether lexical elements are predetermined for one of these two uses. Even supposing we found a language where generic uses are always zero-marked and identical to lexical stems, we would still not assume that lexical elements in this language primarily have a generic use from which the non-generic uses are derived. (Incidentally, none of the languages examined, not even Vietnamese, meets this criterion.)
We show that High Frequency Traders (HFTs) are not beneficial to the stock market during flash crashes. They actually consume liquidity when it is most needed, even when they are rewarded by the exchange to provide immediacy. The behavior of HFTs exacerbates the transient price impact, unrelated to fundamentals, typically observed during a flash crash. Slow traders provide liquidity instead of HFTs, taking advantage of the discounted price. We thus uncover a trade-off between the greater liquidity and efficiency provided by HFTs in normal times, and the disruptive consequences of their trading activity during distressed times.
This paper analyses whether the post-crisis regulatory reforms developed by global standard-setting bodies have created appropriate incentives for different types of market participants to centrally clear Over-The-Counter (OTC) derivative contracts. Beyond documenting the observed facts, we analyze four main drivers of the decision to clear: 1) the liquidity and riskiness of the reference entity; 2) the credit risk of the counterparty; 3) the clearing member’s portfolio net exposure with the Central Counterparty Clearing House (CCP); and 4) post-trade transparency. We use confidential European trade repository data on single-name Sovereign Credit Default Swap (CDS) transactions, and show that of all the transactions reported in 2016 on Italian, German and French Sovereign CDS, 48% were centrally cleared, 42% were not cleared despite being eligible for central clearing, while 9% of the contracts were not clearable because they did not satisfy certain CCP clearing criteria. However, there is a large difference between CCP clearing members, which clear about 53% of their transactions, and non-clearing members, even those subject to counterparty risk capital requirements, which almost never clear their trades. Moreover, we find that diverse factors explain clearing members’ decision to clear different CDS contracts: for Italian CDS, counterparty credit risk exposures matter most for the decision to clear, while for French and German CDS, margin costs are the most important factor. Clearing members use clearing to reduce their exposures to the CCP and largely clear contracts when at least one of the traders has a high counterparty credit risk.
Coming early to the party
(2017)
We examine the strategic behavior of High Frequency Traders (HFTs) during the pre-opening phase and the opening auction of the NYSE-Euronext Paris exchange. HFTs actively participate, and profitably extract information from the order flow. They also post "flash crash" orders, to gain time priority. They make profits on their last-second orders; however, so do others, suggesting that there is no speed advantage. HFTs lead price discovery, and neither harm nor improve liquidity. They "come early to the party", and enjoy it (make profits); however, they also help others enjoy the party (improve market quality) and do not have privileges (their speed advantage is not crucial).
Do competition and incentives offered to designated market makers (DMMs) improve market liquidity? Using data from NYSE Euronext Paris, we show that an exogenous increase in competition among DMMs leads to a significant decrease in quoted and effective spreads, mainly through a reduction in adverse selection costs. In contrast, changes in incentives, through small changes in rebates and requirements for DMMs, do not have any tangible effect on market liquidity. Our results are of relevance for designing optimal contracts between exchanges and DMMs and for regulatory market oversight.
We study whether the presence of low-latency traders (including high-frequency traders (HFTs)) in the pre-opening period contributes to market quality, defined by price discovery and liquidity provision, in the opening auction. We use a unique dataset from the Tokyo Stock Exchange (TSE) based on server-IDs and find that HFTs dynamically alter their presence in different stocks and on different days. In spite of the lack of immediate execution, about one quarter of HFTs participate in the pre-opening period, and contribute significantly to market quality in the pre-opening period, the opening auction that ensues and the continuous trading period. Their contribution is largely different from that of the other HFTs during the continuous period.
In the microstructure literature, information asymmetry is an important determinant of market liquidity. The classic setting is that uninformed dedicated liquidity suppliers charge price concessions when incoming market orders are likely to be informationally motivated. In limit order book markets, however, this relationship is less clear, as market participants can switch roles, and freely choose to immediately demand or patiently supply liquidity by submitting either market or limit orders. We study the importance of information asymmetry in limit order books based on a recent sample of thirty German DAX stocks. We find that Hasbrouck’s (1991) measure of trade informativeness Granger-causes book liquidity, in particular that required to fill large market orders. Picking-off risk due to public news induced volatility is more important for top-of-the-book liquidity supply. In our multivariate analysis we control for volatility, trading volume, trading intensity and order imbalance to isolate the effect of trade informativeness on book liquidity. JEL Classification: G14 Keywords: Price Impact of Trades, Trading Intensity, Dynamic Duration Models, Spread Decomposition Models, Adverse Selection Risk
Previous evidence suggests that less liquid stocks entail higher average returns. Using NYSE data, we present evidence that both the sensitivity of returns to liquidity and liquidity premia have significantly declined over the past four decades to levels that we cannot statistically distinguish from zero. Furthermore, the profitability of trading strategies based on buying illiquid stocks and selling liquid stocks has declined over the past four decades, rendering such strategies virtually unprofitable. Our results are robust to several conventional liquidity measures related to volume. When using a liquidity measure that is not related to volume, we find only weak evidence of a liquidity premium even in the early periods of our sample. The gradual introduction and proliferation of index funds and exchange-traded funds is a possible explanation for these results.
Do firms buy their stock at bargain prices? Evidence from actual stock repurchase disclosure
(2011)
We use new data from SEC filings to investigate how S&P 500 firms execute their open market repurchase programs. We find that smaller S&P 500 firms repurchase less frequently than larger firms, and at a price which is significantly lower than the average market price. Their repurchase activity is followed by a positive and significant abnormal return which lasts up to three months after the repurchase. These findings do not hold for large S&P 500 firms. Our interpretation is that small firms repurchase strategically, whereas the repurchase activity of large firms is more focused on the disbursement of free cash. JEL Classification: G14, G30, G35 Keywords: Stock Repurchases, Stock Buybacks, Payout Policy, Timing, Bid-Ask Spread, Liquidity
Stability maintenance at the grassroots: China’s weiwen apparatus as a form of conflict resolution
(2013)
This working paper explores the history and potential of “stability maintenance” (weiwen) as a form of conflict resolution in China. Its emphasis on conflict resolution is novel. Previous examinations of the weiwen apparatus have concentrated on its political function, namely to manage resistance within society and maintain the authority of the party-state. This avenue of investigation has proved fruitful as a means of characterising the political motivation and the higher-level strategies involved in stability maintenance. Nonetheless, there remain significant conceptual and empirical gaps relating to how stability maintenance offices and processes actually function, particularly outside larger cities and at local levels. The research described in this paper aims to consider the effectiveness of stability maintenance as a part of the “market” for conflict resolution in local China, and to test the hypothesis that conflict resolution as facilitated by weiwen is the most pragmatic and effective means of actually resolving conflicts in the current Chinese political context, notwithstanding the closeness of the stability maintenance discourse to state authority and its relative distance from rule of law-based methods of dispute resolution...
We review arguments for and against reserve requirements and conclude that the main question is whether a distinction between money creation and intermediation can be made. We argue that such a distinction can be made in a money-in-advance economy and show that if the money-in-advance constraint is universally binding then reserve requirements on checkable accounts have no effect on intermediation. We then proceed to show that in a model in which trade is uncertain and sequential, a fractional reserve banking system gives rise to endogenous monetary shocks. These endogenous monetary shocks lead to fluctuations in capacity utilisation and waste. When the money-in-advance constraint is universally binding, a 100% reserve requirement on checkable accounts can eliminate this waste.
This paper examines how networks of professional contacts contribute to the development of the careers of executives of North American and European companies. We build a dynamic model of career progression in which career moves may both depend upon existing networks and contribute to the development of future networks. We test the theory on an original dataset of nearly 73,000 executives in over 10,000 firms. In principle professional networks could be relevant both because they are rewarded by the employer and because they facilitate job mobility. Our econometric analysis suggests that, although there is a substantial positive correlation between network size and executive compensation, with an elasticity of around 20%, almost all of this is due to unobserved individual characteristics. The true causal impact of networks on compensation is closer to an elasticity of 1 or 2% on average, all of this due to enhanced probability of moving to a higher-paid job. And there appear to be strongly diminishing returns to network size.
In this paper I assess the effect of interest rate risk and longevity risk on the solvency position of a life insurer selling policies with a minimum guaranteed rate of return, profit participation and an annuitization option at maturity. The life insurer is assumed to be based in Germany and therefore subject to German regulation as well as to Solvency II regulation. The model features an existing back book of policies and an existing asset allocation calibrated on observed data, which are then projected forward under stochastic financial markets and stochastic mortality developments. Different scenarios are proposed, with particular focus on a prolonged period of low interest rates and a strong reduction in mortality rates. Results suggest that interest rate risk is by far the greatest threat for life insurers, whereas longevity risk can be more easily mitigated and is thereby less detrimental. Introducing a dynamic demand for new policies, i.e. assuming that lower offered guarantees are less attractive to savers, shows that a decreasing demand may even be beneficial for the insurer in a protracted period of low interest rates. Introducing stochastic annuitization rates, i.e. allowing for deviations from the expected annuitization rate, the solvency position of the life insurer worsens substantially. Profitability also declines strongly over time, casting doubt on the sustainability of traditional life business going forward in the low interest rate environment. In general, in the proposed framework it is possible to study the evolution over time of an existing book of policies when underlying financial market conditions and mortality developments drastically change. This feature could be of particular interest for regulatory and supervisory authorities within their financial stability mandate, who could better evaluate micro- and macro-prudential policy interventions in light of the persistent low interest rate environment.
Low interest rates are becoming a threat to the stability of the life insurance industry, especially in countries such as Germany, where products with relatively high guaranteed returns sold in the past still represent a prominent share of the total portfolio. This contribution aims to assess and quantify the effects of the current low interest rate phase on the balance sheet of a representative German life insurer, given the current asset allocation and the outstanding liabilities. To do so, we generate a stochastic term structure of interest rates as well as stock market returns to simulate investment returns of a stylized life insurance business portfolio in a multi-period setting. Based on empirically calibrated parameters, we can observe the evolution of the life insurers' balance sheet over time with a special focus on their solvency situation. To account for different scenarios and in order to check the robustness of our findings, we calibrate different capital market settings and different initial situations of capital endowment. Our results suggest that a prolonged period of low interest rates would markedly affect the solvency situation of life insurers, leading to a relatively high cumulative probability of default for less capitalized companies.
Low interest rates are becoming a threat to the stability of the life insurance industry, especially in countries such as Germany, where products with relatively high guaranteed returns sold in the past still represent a prominent share of the total portfolio. This contribution aims to assess and quantify the effects of the current low interest rate phase on the balance sheet of a representative German life insurer, given the current asset allocation and the outstanding liabilities. To do so, we generate a stochastic term structure of interest rates as well as stock market returns to simulate investment returns of a stylized life insurance business portfolio in a multi-period setting. Based on empirically calibrated parameters, we can observe the evolution of the life insurers’ balance sheet over time with a special focus on their solvency situation. To account for different scenarios and in order to check the robustness of our findings, we calibrate different capital market settings and different initial situations of capital endowment. Our results suggest that a prolonged period of low interest rates would markedly affect the solvency situation of life insurers, leading to a relatively high cumulative probability of default, especially for less capitalized companies. In addition, the new reform of the German life insurance regulation has a beneficial effect on the cumulative probability of default, as a direct consequence of the reduction of the payouts to policyholders.
This paper investigates the effects of a rise in interest rate and lapse risk of endowment life insurance policies on the liquidity and solvency of life insurers. We model the book and market value balance sheet of an average German life insurer, subject to both GAAP and Solvency II regulation, featuring an existing back book of policies and an existing asset allocation calibrated by historical data. The balance sheet is then projected forward under stochastic financial markets. Lapse rates are modeled stochastically and depend on the granted guaranteed rate of return and the prevailing level of interest rates. Our results suggest that in the case of a sharp increase in interest rates, policyholders sharply increase lapses and the solvency position of the insurer deteriorates in the short run. This result is particularly driven by the interaction between a reduction in the market value of assets, large guarantees for existing policies, and a very slow adjustment of asset returns to interest rates. A sharp or gradual rise in interest rates is associated with substantial and persistent liquidity needs that are particularly driven by lapse rates.
A stochastic forward-looking model to assess the profitability and solvency of European insurers
(2016)
In this paper, we develop an analytical framework for conducting forward-looking assessments of profitability and solvency of the main euro area insurance sectors. We model the balance sheet of an insurance company encompassing both life and non-life business and calibrate it using country-level data to make it representative of the major euro area insurance markets. We then project this representative balance sheet forward under stochastic capital markets, stochastic mortality developments and stochastic claims. The model highlights the potential threats to insurers' solvency and profitability stemming from a sustained period of low interest rates, particularly in those markets which are largely exposed to reinvestment risks due to relatively high guarantees and generous profit participation schemes. The model also shows how the resilience of insurers to adverse financial developments heavily depends on the diversification of their business mix. Finally, the model identifies potential negative spillovers between life and non-life business through the redistribution of capital within groups.