This paper compares the dynamics of the financial integration process as described by different empirical approaches. To this end, a wide range of measures accounting for several dimensions of integration is employed. In addition, we evaluate the performance of each measure by relying on an established international finance result, i.e., increasing financial integration leads to declining international portfolio diversification benefits. Using monthly equity market data for three different country groups (i.e., developed markets, emerging markets, developed plus emerging markets) and a dynamic indicator of international portfolio diversification benefits, we find that (i) all measures give rise to a very similar long-run integration pattern; (ii) the standard correlation explains variations in diversification benefits as well as or better than more sophisticated measures. These findings are robust to a battery of robustness checks.
In this study, we unpack the ESG ratings of four prominent agencies in Europe and find that (i) each single E, S, G pillar explains the overall ESG score differently, (ii) there is a low co-movement between the three E, S, G pillars and (iii) there are specific ESG Key Performance Indicators (KPIs) that are driving these ratings more than others. We argue that such discrepancies might mislead firms about their actual ESG status, potentially leading to cherry-picking areas for improvement, thus raising questions about the accuracy and effectiveness of ESG evaluations in both explaining sustainability and driving capital toward sustainable companies.
We delve into the EU's regulatory changes aimed at boosting transparency in sustainable investments. By examining disparities among ESG rating agencies, we assess how these differences challenge standardization and consensus. Our analysis underscores the critical need for clearer ESG assessments to guide the sustainable investment landscape.
The centrality of the United States in the global financial system is taken for granted, but its response to recent political and epidemiological events has suggested that China now holds a comparable position. Using minute-by-minute data from 2012 to 2020 on the financial performance of twelve country-specific exchange-traded funds, we construct daily snapshots of the global financial network and analyze them for the centrality and connectedness of each country in our sample. We find evidence that the U.S. was central to the global financial system into 2018, but that the U.S.-China trade war of 2018–2019 diminished its centrality, and the Covid-19 outbreak of 2019–2020 increased the centrality of China. These indicators may be the first signals that the global financial system is moving from a unipolar to a bipolar world.
In the aftermath of the global financial crisis, both resolution planning, i.e. contingency planning by both regulated institutions and public authorities in order to prepare their actions in a financial crisis, and concepts for structural bank reform have been identified as possible solutions to ending “Too Big To Fail” and fostering market discipline among bank owners, bank managers and investors in bank debt. Both concepts thus complement the global quest for reliable procedures and tools for bank resolution that would minimise systemic implications once large and complex financial institutions have reached the stage of insolvency. Given the complex task of orchestrating swift and effective resolution actions, especially with regard to cross-border banking groups and financial conglomerates, planning ahead in good times has since been widely recognised as crucial for enhancing resolvability. At least part of the impediments to resolution will be found in organisational, financial and legal complexity that has evolved in banks and groups over time. To remove these impediments, interference with existing corporate and group structures is all but inevitable. However, in both international standard setting and at the European Union level, issues related to resolution planning (within the context of bank resolution reform) and structural banking reforms have to date been discussed rather separately. This lack of consistency is questionable, given the obvious need to reconcile both approaches in order to facilitate effective implementation and enforcement, especially with regard to large, complex banking groups. Based on an analysis of both the Bank Recovery and Resolution Directive and the SRM Regulation, this paper explores how these problems could be dealt with within the context of the European Banking Union.
The creation of the Banking Union is likely to come with substantial implications for the governance of Eurozone banks. The European Central Bank, in its capacity as supervisory authority for systemically important banks, as well as the Single Resolution Board, under the EU Regulations establishing the Single Supervisory Mechanism and the Single Resolution Mechanism, have been provided with a broad mandate and corresponding powers that allow for far-reaching interference with the relevant institutions’ organisational and business decisions. Starting with an overview of the relevant powers, the present paper explores how these could – and should – be exercised against the backdrop of the fundamental policy objectives of the Banking Union. The relevant aspects directly relate to a fundamental question associated with the reallocation of the supervisory landscape, namely: Will the centralisation of supervisory powers, over time, also lead to the streamlining of business models, corporate and group structures of banks across the Eurozone?
Panel Sample Selection Models
The empirical evidence currently available in the literature regarding the effects of a country's IMF program participation on its output growth is rather inconclusive. In this paper we propose and estimate a panel data sample selection model featuring state dependence. As in this model the output growth effects of program participation can be conditional on the realization of a state variable (conditional pooling), our framework may reconcile previous empirical evidence based on models without state-dependent effects. We find that the effects of IMF program participation on output growth vary systematically with an index reflecting a country's institutional record, and that output growth effects of program participation are significantly positive only if the program participation is coupled with sufficient improvement of the institutional record.
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions, and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions as compared to earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the Euro Area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
In this paper we revisit medium- to long-run exchange rate determination, focusing on the role of international investment positions. To do so, we develop a new econometric framework accounting for conditional long-run homogeneity in heterogeneous dynamic panel data models. In particular, in our model the long-run relationship between effective exchange rates and domestic as well as weighted foreign prices is a homogeneous function of a country’s international investment position. We find rather strong support for purchasing power parity in environments of limited negative net foreign asset to GDP positions, but not outside such environments. We thus argue that the purchasing power parity hypothesis holds conditionally, but not unconditionally, and that international investment positions are an essential component to characterizing this conditionality. Finally, we adduce evidence that whether deterioration of a country’s net foreign asset to GDP position leads to a depreciation of that country’s effective exchange rate depends on its rate of inflation relative to the rate of inflation abroad as well as its exposure to global shocks. JEL Classification: F31, F37, C23
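One stylized way to write down the conditional long-run relation described above, in notation of my own choosing rather than the paper's, is to let the long-run PPP coefficients be functions of the net foreign asset position:

```latex
% Stylized representation (illustrative notation, not the paper's model).
% e_{it}: log effective exchange rate; p_{it}, p^{*}_{it}: log domestic and
% trade-weighted foreign price levels; b_{it}: net foreign assets to GDP.
% Signs depend on the quotation convention for the exchange rate.
e_{it} \;=\; \beta_{1}(b_{it})\, p_{it} \;+\; \beta_{2}(b_{it})\, p^{*}_{it} \;+\; u_{it},
\qquad
\text{PPP holds where } \beta_{1}(b_{it}) = 1,\ \beta_{2}(b_{it}) = -1 .
```

The paper's finding can then be read as: the homogeneity restriction is satisfied in environments of limited negative \(b_{it}\), but not outside them.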
The European Central Bank
(2007)
The establishment of the ECB and with it the launch of the euro has arguably been a unique endeavor in economic history, representing an important experiment in central banking. This note aims to summarize some of the main lessons learned from this experiment and sketch some of the prospects for the ECB. It is written for "The New Palgrave Dictionary of Economics", 2nd edition. JEL Classification: E52, E58
Recently, a new class of systems for shared and collaborative data management has gained more and more traction. In contrast to classical database management systems (DBMS), systems for shared data need to provide additional guarantees to ensure the integrity of data and transaction execution. In this paper, we present TrustDBle, a new DBMS that extends the ACID properties (i.e., atomicity, consistency, isolation, durability) used by classical DBMSs with a new verifiability component to address these new requirements.
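The abstract does not describe TrustDBle's actual verifiability mechanism, but the general idea of making a transaction log tamper-evident can be illustrated with a hash chain, where each entry commits to the hash of its predecessor. This is a minimal stdlib sketch, not the system's design:

```python
# Illustrative only: a hash-chained transaction log. Any modification of an
# earlier entry invalidates every subsequent hash, so tampering is detectable.
import hashlib
import json

def append_tx(log, tx):
    """Append a transaction, chaining it to the hash of the previous entry."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = {"tx": tx, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({"tx": tx, "prev": prev, "hash": digest})
    return log

def verify(log):
    """Recompute every hash; return False if the chain is broken anywhere."""
    prev = "0" * 64
    for entry in log:
        body = {"tx": entry["tx"], "prev": prev}
        expect = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["hash"] != expect or entry["prev"] != prev:
            return False
        prev = entry["hash"]
    return True

log = []
append_tx(log, {"from": "A", "to": "B", "amount": 10})
append_tx(log, {"from": "B", "to": "C", "amount": 4})
```

Verifiability in a real shared DBMS must also cover transaction execution, not just the stored log, which is where it goes well beyond this sketch.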
The loan impairment rules recently introduced by IFRS 9 require banks to estimate their future credit losses by using forward-looking information. We use supervisory loan-level data from Germany to investigate how banks apply their reporting discretion and adjust their lending upon the announcement of the new rules. Our identification strategy exploits a cut-off for the level of provisions at the investment grade threshold based on banks’ internal rating of a borrower. We find that banks required to adopt the new rules assign better internal ratings to exactly the same borrowers compared to banks that do not apply IFRS 9 around this cut-off. This pattern is consistent with a strategic use of the increased reporting discretion that is inherent to rules requiring forward-looking loss estimation. At the same time, banks also reduce their lending exposure to exactly those borrowers at the highest risk of experiencing a rating downgrade below the cut-off. These loans would be associated with additional provisions in future periods, at both the intensive and extensive margins. The lending change thus mitigates some of the negative effects of increased reporting opportunism on banks’ crisis resilience. However, when these firms with internal ratings around the investment grade cut-off obtain less external funding through banks, the introduction of IFRS 9 will likely also be associated with real economic effects.
European banks have substantial investments in assets that are measured without directly observable market prices (mark-to-model). Financial disclosures of these value estimates lack standardization and are hard to compare across banks. These comparability concerns are concentrated in large European banks that extensively rely on level 3 estimates with the most unobservable inputs. Although the relevant balance sheet positions represent only a small fraction of these large banks’ total assets (2.9%), their value equals a significant fraction of core equity tier 1 (48.9%). Incorrect valuations thus have the potential to impact financial stability. 85% of these bank assets are under direct ECB supervision. Prudential regulation requires value adjustments that are apt to shield capital against valuation risk. Yet stringent enforcement is critical for achieving this objective.
This document was provided by the Economic Governance Support Unit at the request of the ECON Committee.
Accounting for financial stability: Bank disclosure and loss recognition in the financial crisis
(2020)
This paper examines banks’ disclosures and loss recognition in the financial crisis and identifies several core issues for the link between accounting and financial stability. Our analysis suggests that, going into the financial crisis, banks’ disclosures about relevant risk exposures were relatively sparse. Such disclosures came later after major concerns about banks’ exposures had arisen in markets. Similarly, the recognition of loan losses was relatively slow and delayed relative to prevailing market expectations. Among the possible explanations for this evidence, our analysis suggests that banks’ reporting incentives played a key role, which has important implications for bank supervision and the new expected loss model for loan accounting. We also provide evidence that shielding regulatory capital from accounting losses through prudential filters can dampen banks’ incentives for corrective actions. Overall, our analysis reveals several important challenges if accounting and financial reporting are to contribute to financial stability.
This paper investigates what we can learn from the financial crisis about the link between accounting and financial stability. The picture that emerges ten years after the crisis is substantially different from the picture that dominated the accounting debate during and shortly after the crisis. Widespread claims about the role of fair-value (or mark-to-market) accounting in the crisis have been debunked. However, we identify several other core issues for the link between accounting and financial stability. Our analysis suggests that, going into the financial crisis, banks’ disclosures about relevant risk exposures were relatively sparse. Such disclosures came later after major concerns about banks’ exposures had arisen in markets. Similarly, banks delayed the recognition of loan losses. Banks’ incentives seem to drive this evidence, suggesting that reporting discretion and enforcement deserve careful consideration. In addition, bank regulation through its interlinkage with financial accounting may have dampened banks’ incentives for corrective actions. Our analysis illustrates that a number of serious challenges remain if accounting and financial reporting are to contribute to financial stability.
Against the backdrop of increasing integration of property and financial markets, the real estate industry is subject to soaring internationalization processes. Since international institutional investors appeared, transnational real estate investments have increased tremendously. In recent years, Central and Eastern European countries have become more attractive to institutional investors and are therefore being integrated into international market structures. Within these countries, Warsaw has emerged as the most dynamic and important real estate market. But what are the mechanisms and practices through which the real estate market of Warsaw becomes international? Which networks, intermediaries and frames are necessary to constitute a mature real estate market? The article argues that international real estate consultants play a crucial role in the underlying internationalization process. They act at the interface between investors, developers, construction companies and tenants and are therefore becoming a crucial hinge between real estate actors. With the example of the Warsaw real estate market, we argue that international real estate consultancies are key drivers of the transformation process from a local to a global market. They transfer global knowledge, competence and practices and implement transparent and professional structures in the emerging Warsaw real estate market.
We develop a utility based model of fluctuations, with nominal rigidities, and unemployment. In doing so, we combine two strands of research: the New Keynesian model with its focus on nominal rigidities, and the Diamond-Mortensen-Pissarides model, with its focus on labor market frictions and unemployment. In developing this model, we proceed in two steps. We first leave nominal rigidities aside. We show that, under a standard utility specification, productivity shocks have no effect on unemployment in the constrained efficient allocation. We then focus on the implications of alternative real wage setting mechanisms for fluctuations in unemployment. We then introduce nominal rigidities in the form of staggered price setting by firms. We derive the relation between inflation and unemployment and discuss how it is influenced by the presence of real wage rigidities. We show the nature of the tradeoff between inflation and unemployment stabilization, and we draw the implications for optimal monetary policy. JEL Classification: E32, E50
Bank instability seemingly could push borrowers to use crowdfunding as a source of external finance. We construct a novel, hand-collected data set of ventures' uses of equity crowdfunding in Germany, their relationships with banks, and various venture traits since 2011. By observing venture-bank relationships, we can identify whether ventures connected to shocked banks are more likely to use crowdfunding in an attempt to substitute for contracting bank credit supply. Our results show that crowdfunding is significantly more likely for new ventures that interact with stressed banks. Innovative funding sources are thus particularly relevant in times of stress among conventional financiers. But crowdfunded ventures are generally also more opaque and risky than new ventures that do not use crowdfunding.
Under laissez-faire regulation, regulators choose not to interfere because they seek to stimulate innovation and protect enterprises from the costs imposed by regulatory compliance. Yet, empirical evidence regarding the ability of laissez-faire regulation to ensure consumer protection is lacking. This article tests empirically whether the current laissez-faire regulation of price advertising claims on the most popular reward-based crowdfunding platform, Kickstarter, is sufficient to protect consumers.
Entrepreneurs often overestimate the likelihood of success when planning their own activities, which usually results in excess market entry. Since entrepreneurs consider their project as unique, the forecasts of the future outcome are often anchored on the case at hand rather than on past results of comparable projects – a phenomenon known as the planning fallacy. We investigate whether entrepreneurs suffer from a planning fallacy bias when provided with historical outcomes of comparable projects, and its consequences.
The purpose of the data presented in this article is to use it in ex post estimations of interest rate decisions by the European Central Bank (ECB), as it is done by Bletzinger and Wieland (2017) [1]. The data is of quarterly frequency from 1999 Q1 until 2013 Q2 and consists of the ECB's policy rate, inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data consists of expectations about future inflation and output dynamics. While potential output is constructed based on data from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication date of the collected data to ensure a real-time dataset only consisting of information which was available to the decision makers at the time of the decision.
A number of contributions to research on monetary policy have suggested that policy should be asymmetric near the lower bound on nominal interest rates. As inflation and economic activity decline, policy should ease more aggressively than it would in the absence of the lower bound. As activity recovers and inflation picks up, the central bank should act to keep interest rates lower for longer than without the bound. In this note, we investigate to what extent the policy easing implemented by the ECB since summer 2013 mirrors the rate recommendations of a simple policy rule or deviates from it in a way that indicates a “lower for longer” approach to policy near zero interest rates.
On July 4, 2013 the ECB Governing Council provided more specific forward guidance than in the past by stating that it expects ECB interest rates to remain at present or lower levels for an extended period of time. As explained by ECB President Mario Draghi this expectation is based on the Council’s medium-term outlook for inflation conditional on economic activity and money and credit. Draghi also stressed that there is no precise deadline for this extended period of time, but that a reasonable period can be estimated by extracting a reaction function. In this note, we use such a reaction function, namely the interest rate rule from Orphanides and Wieland (2013) that matches past ECB interest rate decisions quite well, to project the rate path consistent with inflation and growth forecasts from the survey of professional forecasters published by the ECB on August 8, 2013. This evaluation suggests an increase in ECB interest rates by May 2014 at the latest. We also use the Eurosystem staff projection from June 6, 2013 for comparison. While it would imply a longer period of low rates, it does not match past ECB decisions as well as the reaction function with SPF forecasts.
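A first-difference interest rate rule of the kind discussed in the two notes above can be sketched in a few lines. The coefficients (0.5 on both gaps), the 1.9% inflation target, the potential growth rate and the forecast path below are all illustrative placeholders, not the Orphanides–Wieland (2013) estimates:

```python
# Stylized first-difference policy rule in the spirit of Orphanides and
# Wieland (2013). All numbers are illustrative, not the paper's estimates.
def rule_rate(prev_rate, infl_forecast, growth_forecast,
              infl_target=1.9, growth_potential=1.5):
    """New policy rate from the previous rate and year-ahead forecasts (%)."""
    change = 0.5 * (infl_forecast - infl_target) \
           + 0.5 * (growth_forecast - growth_potential)
    return max(prev_rate + change, 0.0)  # respect the zero lower bound

# Project a rate path from hypothetical SPF-style forecast pairs
# (inflation, growth), starting from a policy rate of 0.5%.
path, rate = [], 0.5
for infl, growth in [(1.3, 0.4), (1.5, 1.0), (1.8, 1.6), (2.0, 1.9)]:
    rate = rule_rate(rate, infl, growth)
    path.append(round(rate, 2))
```

With these made-up forecasts the projected rate sits at the lower bound while the outlook is weak and lifts off only once inflation and growth forecasts recover, which is the qualitative pattern such rule-based projections are used to date.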
In this paper, we investigate how bank mergers affect bank revenues and present empirical evidence that mergers among banks have a substantial and persistent negative impact on merging banks’ revenues. We refer to merger-related negative effects on banks’ revenues as dissynergies and suggest that they result from organizational diseconomies, the loss of customers and the temporary distraction of management from day-to-day operations by effecting the merger. For our analyses we draw on a proprietary data set with detailed financials of all 457 regional savings banks in Germany, which were involved in 212 mergers between 1994 and 2006. We find that the negative impact of a merger on net operating revenues amounts to 3% of the pro-forma consolidated banks’ operating profits and persists not only for the year of the merger but for up to four years post-merger. Only thereafter do merged banks exhibit significantly superior performance compared to their respective pre-merger performance or the performance of their non-merging peers. The magnitude and persistence of merger-related revenue dissynergies highlight their economic relevance. Previous research on post-merger performance mainly focuses on the effects of mergers on banks’ (cost) efficiency and profitability but fails to provide clear and consistent results. We are the first, to our knowledge, to examine the post-merger performance of banks’ net operating revenues and to empirically verify significant negative implications of mergers for banks’ net operating revenues. We propose that our finding of negative merger-related effects on banks’ operating revenues is the reason why previous research fails to show merger-related gains.
In this paper, we examine the impact of mergers among German savings banks on the extent to which these savings banks engage in small business lending. The ongoing consolidation in the banking industry has sparked concerns about the continuous availability of credit to small businesses, which have been further fueled by empirical studies that partly confirm a reduction in small business lending in the aftermath of mergers. However, using a proprietary data set of German savings banks, we find strong evidence that in Germany merging savings banks do not significantly change the extent to which they lend to small businesses compared to prior to the merger or compared to the contemporaneous lending by non-merging banks. We investigate the merger-related effects on small business lending in Germany from a bank-level perspective. Furthermore, we estimate small business lending and its continuous adjustment process simultaneously using recent Generalized Method of Moments (GMM) techniques for panel data as proposed by Arellano and Bond (1991).
Almost 20% of the German population currently holds a consumer loan. Despite its obvious importance for a private household’s balance sheet, we know surprisingly little about it. One purpose of this study is to give an overview of the German consumer/installment loan market. We compared five widely accepted data sources and found differences in the loan participation rate, even on an aggregated nationwide level. In a second step, we try to find reasons for extraordinarily high growth rates among seniors’ debt participation rates.
In a unifying framework generalizing established theories, we characterize the conditions under which Joint Ownership of assets creates the best cooperation incentives in a partnership. We endogenise renegotiation costs and assume that they weakly increase with additional assets. A salient sufficient condition for optimal cooperation incentives among patient partners is that Joint Ownership be a Strict Coasian Institution, for which transaction costs impede an efficient asset reallocation after a breakdown. In contrast to Halonen (2002), the logic behind our results is that Joint Ownership maximizes the value of the relationship and the costs of renegotiating ownership after a broken relationship.
This paper provides an overview of how to use "big data" for economic research. We investigate the performance and ease of use of different Spark applications running on a distributed file system to enable the handling and analysis of data sets which were previously not usable due to their size. More specifically, we explain how to use Spark to (i) explore big data sets which exceed retail-grade computers' memory and (ii) run typical econometric tasks, including microeconometric, panel data and time series regression models, which are prohibitively expensive to evaluate on stand-alone machines. By bridging the gap between the abstract concept of Spark and ready-to-use examples which can easily be altered to suit the researcher's needs, we provide economists, and social scientists more generally, with the theory and practice to handle the ever-growing datasets available. The ease of reproducing the examples in this paper makes this guide a useful reference for researchers with a limited background in data handling and distributed computing.
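The core trick that makes regression feasible on partitioned big data, in Spark or elsewhere, is that OLS only needs the small moment matrices X'X and X'y, which can be accumulated partition by partition and then combined. This plain-Python sketch (my own toy data and chunking, not the paper's Spark code) shows the idea:

```python
# Out-of-core OLS via chunked normal equations: each "chunk" plays the role
# of a Spark partition; only k-by-k summaries ever need to be combined.

def accumulate(chunks, k):
    """Sum X'X and X'y over chunks of rows (x_1, ..., x_k, y)."""
    xtx = [[0.0] * k for _ in range(k)]
    xty = [0.0] * k
    for chunk in chunks:
        for *x, y in chunk:
            for i in range(k):
                xty[i] += x[i] * y
                for j in range(k):
                    xtx[i][j] += x[i] * x[j]
    return xtx, xty

def solve(xtx, xty):
    """Gaussian elimination with partial pivoting on the (small) system."""
    k = len(xty)
    a = [row[:] + [xty[i]] for i, row in enumerate(xtx)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(k):
            if r != col and a[col][col]:
                f = a[r][col] / a[col][col]
                a[r] = [v - f * w for v, w in zip(a[r], a[col])]
    return [a[i][k] / a[i][i] for i in range(k)]

# Toy data generated from y = 2 + 3x, split across two "partitions";
# the intercept enters as a constant regressor x_0 = 1.
chunk1 = [(1.0, 0.0, 2.0), (1.0, 1.0, 5.0)]
chunk2 = [(1.0, 2.0, 8.0), (1.0, 3.0, 11.0)]
beta = solve(*accumulate([chunk1, chunk2], k=2))
```

Spark's distributed regression routines implement essentially this map-then-reduce pattern at scale, with the heavy lifting done across cluster nodes.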
This paper outlines a new method for using qualitative information to analyze the monetary policy strategy of central banks. Quantitative assessment indicators that are extracted from a central bank's public statements via the balance statistic approach are employed to estimate a Taylor-type rule. This procedure allows us to directly capture a policymaker's assessments of macroeconomic variables that are relevant to its decision-making process. As an application of the proposed method, the monetary policy of the Bundesbank is re-investigated with a new dataset. One distinctive feature of the Bundesbank's strategy consisted of targeting growth in monetary aggregates. The analysis using the proposed method provides evidence that the Bundesbank indeed took into consideration monetary aggregates but also real economic activity and inflation developments in its monetary policy strategy since 1975. JEL Classification: E52, E58, N14 Keywords: Monetary Policy Rule, Statement Indicators, Bundesbank, Monetary Targeting
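A balance statistic, in its simplest form, is the share of statements coding a variable as rising minus the share coding it as falling. The sketch below illustrates only this arithmetic; the codings are hypothetical, and classifying real central bank statements is of course the hard part of the method:

```python
# Hedged sketch of a balance statistic over qualitative assessment codings:
# '+' = variable assessed as rising, '-' = falling, '0' = neutral.
def balance(assessments):
    """Share of '+' codings minus share of '-' codings."""
    n = len(assessments)
    up = assessments.count("+") / n
    down = assessments.count("-") / n
    return up - down

# Hypothetical codings of inflation assessments across a run of reports.
inflation_codes = ["+", "+", "0", "-", "+", "0", "+", "-"]
b = balance(inflation_codes)  # 4/8 - 2/8 = 0.25
```

A series of such statistics, one per variable per period, is what then enters the Taylor-type rule as a regressor.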
This thesis consists of three chapters, each of which investigates a topic from financial and monetary economics. In the first chapter a novel method to analyze the monetary policy of central banks is presented. In the second chapter (joint work with Professor Michael Binder, Goethe-University Frankfurt) the effects of conditional loan programs of the International Monetary Fund (IMF) on participating countries' output growth are investigated. In the third chapter (joint work with Professor Jan Pieter Krahnen, Goethe-University Frankfurt) a network model of interconnected bank balance sheets which gives rise to systemic risk is developed and used to analyze the implications of a bank levy related to banks' contribution to systemic risk. All three chapters give important insights into the policy design of macroeconomic institutions such as central banks, the IMF, and agencies charged with macroprudential supervision.
The interbank market is important for the efficient functioning of the financial system, the transmission of monetary policy and therefore ultimately the real economy. In particular, it facilitates banks' liquidity management. This paper aims at extending the literature which views interbank markets as a mutual liquidity insurance mechanism by taking into account the persistence of liquidity shocks. Following a theory of long-term interbank funding, we develop a financial system modeled as a micro-founded, agent-based complex network interacting with a real economic sector. The model features interbank funding as an over-the-counter phenomenon and realistically replicates the financial system phenomena of network formation, monetary policy transmission and endogenous money creation. The framework is used to carry out an optimal policy analysis in which the policymaker maximizes real activity by choosing the optimal interest rate in a trade-off between loan supply and financial fragility. It is shown that the interbank market renders the financial system more efficient relative to a setting without mutual insurance against persistent liquidity shocks and therefore plays a crucial role for welfare.
We develop a dynamic network model with heterogeneous banks which undertake optimizing portfolio decisions subject to liquidity and capital constraints and trade in the interbank market, whose equilibrium is governed by a tâtonnement process. Due to the micro-founded structure of the decisional process as well as the iterative dynamic adjustment taking place in the market, the links in the network structures are endogenous and evolve dynamically. We use the model to assess the diffusion of systemic risk (measured as default probability), the contribution of each bank to it as well as the evolution of the network in response to financial shocks and across different prudential policy regimes.
We develop a dynamic network model whose links are governed by banks' optimizing decisions and by an endogenous tâtonnement market adjustment. Banks in our model can default and engage in firesales: risk is transmitted through direct and cascading counterparty defaults as well as through indirect pecuniary externalities triggered by firesales. We use the model to assess the evolution of the network configuration under various prudential policy regimes, to measure banks' contribution to systemic risk (through Shapley values) in response to shocks, and to analyze the effects of systemic risk charges. We complement the analysis by introducing the possibility of central bank liquidity provision.
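The tâtonnement adjustment referenced in the abstracts above can be illustrated with a minimal sketch, not taken from the papers themselves: the interbank rate moves in the direction of excess demand until the market clears. The demand and supply schedules, initial rate, and step size below are all hypothetical.

```python
def tatonnement(excess_demand, r0=0.02, step=1e-4, tol=1e-8, max_iter=10_000):
    """Iteratively adjust the interbank rate r until excess demand vanishes.

    excess_demand(r) > 0 means borrowing demand exceeds lending supply,
    so the rate is pushed up; < 0 pushes it down.
    """
    r = r0
    for _ in range(max_iter):
        z = excess_demand(r)
        if abs(z) < tol:
            break
        r += step * z  # classic tatonnement: move the price with excess demand
    return r

# Hypothetical linear interbank market: demand falls and supply rises in r,
# so the market clears where 150 - 1000 r = 50 + 1000 r, i.e. r = 0.05.
demand = lambda r: 150 - 1000 * r
supply = lambda r: 50 + 1000 * r
r_star = tatonnement(lambda r: demand(r) - supply(r))
```

With a linear excess-demand function the iteration contracts geometrically as long as the step size is small relative to the slope; too large a step makes the adjustment overshoot and diverge, which is the classic stability caveat of tâtonnement dynamics.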
This paper makes a conceptual contribution to the effect of monetary policy on financial stability. We develop a microfounded network model with endogenous network formation to analyze the impact of central banks' monetary policy interventions on systemic risk. Banks choose their portfolio, including their borrowing and lending decisions on the interbank market, to maximize profit subject to regulatory constraints in an asset-liability framework. Systemic risk arises in the form of multiple bank defaults driven by common shock exposure on asset markets, direct contagion via the interbank market, and firesale spirals. The central bank injects or withdraws liquidity on the interbank markets to achieve its desired interest rate target. A tension arises between the beneficial effects of stabilized interest rates and increased loan volume and the detrimental effects of higher risk-taking incentives. We find that central bank supply of liquidity quite generally increases systemic risk.
This paper analyzes the emergence of systemic risk in a network model of interconnected bank balance sheets. Given a shock to the asset values of one or several banks, systemic risk in the form of multiple bank defaults depends on the strength of balance sheets and asset market liquidity. The price of bank assets on the secondary market is endogenous in the model, thereby relating funding liquidity to expected solvency - an important stylized fact of banking crises. Based on the concept of a system value at risk, Shapley values are used to define the systemic risk charge levied upon individual banks. Using a parallelized simulated annealing algorithm, the properties of an optimal charge are derived. Among other things, we find that there is not necessarily a correspondence between a bank's contribution to systemic risk - which determines its risk charge - and the capital that is optimally injected into it to make the financial system more resilient to systemic risk. The analysis has policy implications for the design of optimal bank levies. JEL Classification: G01, G18, G33. Keywords: Systemic Risk, Systemic Risk Charge, Systemic Risk Fund, Macroprudential Supervision, Shapley Value, Financial Network
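The Shapley-value attribution used above to define risk charges can be sketched in a few lines: each bank's charge is its average marginal contribution to the system loss across all orderings of banks. The three-bank loss function below, with a contagion surcharge when banks A and B are jointly present, is purely illustrative and not the paper's network model.

```python
from itertools import permutations
from math import factorial

def shapley_values(players, cost):
    """Exact Shapley values of `cost` over all orderings of `players`.

    cost(frozenset_of_players) -> total loss of that coalition.
    Each player's value is its average marginal contribution.
    """
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = cost(frozenset(coalition))
            coalition.add(p)
            phi[p] += cost(frozenset(coalition)) - before
    n_fact = factorial(len(players))
    return {p: v / n_fact for p, v in phi.items()}

# Hypothetical system loss: stand-alone losses plus a contagion surcharge
# of 3.0 that kicks in once banks A and B are jointly in the system.
standalone = {"A": 4.0, "B": 2.0, "C": 1.0}
def loss(coalition):
    base = sum(standalone[b] for b in coalition)
    return base + (3.0 if {"A", "B"} <= coalition else 0.0)

charges = shapley_values(["A", "B", "C"], loss)
```

By symmetry the surcharge is split equally between A and B (1.5 each), so the charges are A: 5.5, B: 3.5, C: 1.0, and they sum exactly to the total system loss of 10.0 - the efficiency property that makes Shapley values attractive for allocating a systemic risk fund. The exact computation is factorial in the number of banks; large systems require sampling orderings instead.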
Derivatives usage in risk management by U.S. and German non-financial firms : a comparative survey
(1998)
This paper is a comparative study of the responses to the 1995 Wharton School survey of derivatives usage among US non-financial firms and a 1997 companion survey of German non-financial firms. Rather than merely juxtaposing the results of the two studies, we draw a subsample of firms from the US study matched to the German sample on both size and industry composition. We find that German firms are more likely to use derivatives than US firms: 78% of German firms use derivatives, compared to 57% of US firms. Aside from this higher overall usage, the general pattern of usage across industry and size groupings is comparable across the two countries. In both countries, foreign currency derivative usage is most common, followed closely by interest rate derivatives, with commodity derivatives a distant third. Usage rates across all three classes of derivatives are higher for German firms than for US firms. In contrast to these similarities, firms in the two countries differ notably on issues such as the primary goal of hedging, the choice of instruments, and the influence of their market view when taking derivative positions. These differences appear to be driven by the greater importance of financial accounting statements in Germany than in the US and by stricter German corporate policies of control over derivative activities within the firm. German firms also indicate significantly less concern about derivative-related issues than US firms, which appears to arise from a more basic and simple strategy for using derivatives. Finally, among derivative non-users, German firms tend to cite reasons suggesting derivatives are not needed, whereas US firms tend to cite reasons suggesting a possible role for derivatives but a hesitation to use them.
We introduce a copula-based dynamic model for multivariate processes of (non-negative) high-frequency trading variables revealing time-varying conditional variances and correlations. Modeling the variables’ conditional mean processes using a multiplicative error model, we map the resulting residuals into a Gaussian domain using a Gaussian copula. Based on high-frequency volatility, cumulative trading volumes, trade counts and market depth of various stocks traded at the NYSE, we show that the proposed copula-based transformation is supported by the data and allows capturing (multivariate) dynamics in higher order moments. The latter are modeled using a DCC-GARCH specification. We suggest estimating the model by composite maximum likelihood, which is sufficiently flexible to be applicable in high dimensions. Strong empirical evidence for time-varying conditional (co-)variances in trading processes supports the usefulness of the approach. Taking these higher-order dynamics explicitly into account significantly improves the goodness-of-fit of the multiplicative error model and allows capturing time-varying liquidity risks.
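The core mapping in the abstract above - sending non-negative residuals into a Gaussian domain via a copula - can be sketched with a probability integral transform. As a simplification, the empirical (rank-based) CDF stands in here for the estimated parametric marginal; the toy residuals are hypothetical.

```python
import statistics

def to_gaussian(residuals):
    """Map positive-valued residuals into the Gaussian domain.

    Probability integral transform using the empirical CDF (a rank-based
    stand-in for an estimated parametric marginal), followed by the
    standard normal quantile function.
    """
    n = len(residuals)
    norm = statistics.NormalDist()
    ranks = {x: r for r, x in enumerate(sorted(residuals), start=1)}  # assumes no ties
    # (rank - 0.5) / n keeps u strictly inside (0, 1) so inv_cdf stays finite
    return [norm.inv_cdf((ranks[x] - 0.5) / n) for x in residuals]

# Toy positive residuals, e.g. from a multiplicative error model
eps = [0.4, 1.1, 0.7, 2.3, 0.9, 1.6]
z = to_gaussian(eps)
```

The transform is monotone, so the ordering of the residuals is preserved, and the resulting approximately standard-normal series can then be handed to a Gaussian-domain dynamic model such as the DCC-GARCH specification mentioned in the abstract.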
I analyze the most powerful shareholders in Germany to illustrate the concentration of control over listed corporations. Compared to other developed economies, the German stock market is dominated by large shareholders. I show that 77% of the median firm’s voting rights are controlled by large blockholders. This corresponds to 47% of the market value of all firms listed in Germany’s official markets. About two thirds of this amount is controlled by banks, industrial firms, holdings, and insurance companies. I show that, due to current legislation, for neither group is it clear who ultimately exerts control over the shareholding firm itself. For the remaining blockholders, only blocks controlled by voting pools and individuals can be traced back to the highest level of ownership. In the aggregate, both groups control only 5.6% of all reported blocks. The German government controls 8%, and it is not clear who is ultimately responsible for the consequences of decisions.
Using a unique data set on trade credit defaults among French firms, we investigate whether and how trade credit is used to relax financial constraints. We show that firms that face idiosyncratic liquidity shocks are more likely to default on trade credit, especially when the shocks are unexpected, firms have little liquidity, are likely to be credit constrained or are close to their debt capacity. We estimate that credit constrained firms pass more than one fourth of the liquidity shocks they face on to their suppliers down the trade credit chain. The evidence is consistent with the idea that firms provide liquidity insurance to each other and that this mechanism is able to alleviate the consequences of credit constraints. In addition, we show that the chain of defaults stops when it reaches firms that are large, liquid, and have access to financial markets. This suggests that liquidity is allocated from large firms with access to outside finance to small, credit constrained firms through trade credit chains.
This paper uses factor-augmented vector autoregressions (FAVAR) estimated on a large data set to disentangle fluctuations in disaggregated consumer and producer prices which are due to macroeconomic factors from those due to sectoral conditions. This allows us to provide consistent estimates of the effects of US monetary policy on disaggregated prices. While sectoral prices respond quickly to sector-specific shocks, we find that for a large number of price series, there is a significant delay in the response of prices to monetary policy shocks. In addition, price responses display little evidence of a “price puzzle,” contrary to existing studies based on traditional VARs. The observed dispersion in the reaction of producer prices is relatively well explained by the degree of market power, as predicted by models with monopolistic competition. JEL Classification: E32, E52
The recent wave of randomized trials in development economics has provoked criticisms regarding external validity. We investigate two concerns—heterogeneity across beneficiaries and implementers—in a randomized trial of contract teachers in Kenyan schools. The intervention, previously shown to raise test scores in NGO-led trials in Western Kenya and parts of India, was replicated across all Kenyan provinces by an NGO and the government. Strong effects of short-term contracts produced in controlled experimental settings are lost in weak public institutions: NGO implementation produces a positive effect on test scores across diverse contexts, while government implementation yields zero effect. The data suggest that the stark contrast in success between the government and NGO arm can be traced back to implementation constraints and political economy forces put in motion as the program went to scale.
A large empirical literature has shown that user fees significantly deter public service utilization in developing countries. While most of these results reflect partial equilibrium analysis, we find that the nationwide abolition of public school fees in Kenya in 2003 led to no increase in net public enrollment rates, but rather a dramatic shift toward private schooling. Results suggest this divergence between partial- and general-equilibrium effects is partially explained by social interactions: the entry of poorer pupils into free education contributed to the exit of their more affluent peers.