We introduce a new measure of systemic risk, the change in the conditional joint probability of default, which assesses the effects of interdependence in the financial system on the overall default risk of sovereign debtors. We apply our measure to examine the fragility of the European financial system during the ongoing sovereign debt crisis. Our analysis documents an increase in systemic risk contributions in the euro area during the post-Lehman global recession and especially after the beginning of the euro area sovereign debt crisis. We also find considerable potential for cascade effects from small to large euro area sovereigns. When we investigate the effect of a sovereign default on the European Union banking system, we find that bigger banks and banks with riskier activities, poorer asset quality, and tighter funding and liquidity constraints tend to be more vulnerable to a sovereign default. Surprisingly, an increase in leverage does not seem to influence systemic vulnerability.
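The conditional joint default probability at the heart of this measure can be illustrated with a small Monte Carlo sketch. This is not the paper's estimator: the Gaussian-copula setup, the function name, and all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Illustrative sketch (not the paper's estimator): joint and conditional
# default probabilities for two sovereigns whose latent credit factors
# follow a Gaussian copula with correlation rho.
def default_probabilities(pd_a, pd_b, rho, n=500_000, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n, 2))
    x_a = z[:, 0]
    x_b = rho * z[:, 0] + np.sqrt(1.0 - rho**2) * z[:, 1]
    # A sovereign defaults when its latent factor falls below the
    # threshold implied by its marginal default probability.
    d_a = x_a < norm.ppf(pd_a)
    d_b = x_b < norm.ppf(pd_b)
    joint = np.mean(d_a & d_b)
    cond_b_given_a = joint / np.mean(d_a)  # P(B defaults | A defaults)
    return joint, cond_b_given_a

joint, cond = default_probabilities(pd_a=0.05, pd_b=0.05, rho=0.6)
# With interdependence (rho > 0), the conditional PD exceeds the marginal
# PD; that gap is the kind of effect a systemic risk measure captures.
```

Raising `rho` and re-evaluating shows how interdependence drives the conditional probability, and hence measured systemic risk, upward.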
Financing asset growth
(2012)
We document the existence of a debt anomaly in addition to the asset growth anomaly: for a given asset growth rate, firms that issue more debt, as well as firms that retire more debt, have lower stock returns in the 12 months starting 6 months after the calendar year of asset growth. Exploring the reasons for debt issuance, we find that managers of firms for which analyst expectations are more over-optimistic, whose investment profitability is declining, and whose earnings-price ratios are relatively high are inclined to rely more heavily on debt financing. On the other hand, firms that retire more debt for a given asset growth rate tend to have improving profitability but to be overpriced. We also find that the financing decision is influenced by the prior debt ratio, the asset growth rate, profitability, and CEO pay sensitivity. We interpret our results in terms of managerial incentives, signaling, and market timing.
We outline a procedure for consistent estimation of marginal and joint default risk in the euro area financial system. We interpret the latter risk as the intrinsic financial system fragility and derive several systemic fragility indicators for euro area banks and sovereigns, based on CDS prices. Our analysis documents that although the fragility of the euro area banking system had started to deteriorate before Lehman Brothers' bankruptcy filing, investors did not expect the crisis to affect euro area sovereigns' solvency until September 2008. Since then, and especially after November 2009, joint sovereign default risk has outpaced the rise of systemic risk within the banking system.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact. We consider two types of dynamic stochastic general equilibrium models: a neoclassical growth model and more complicated models with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the initial model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run.
We build on previous work on the operational performance of private equity portfolio companies, as we are able at least partially to open the black box of restructuring tools these investors use and their impact on portfolio companies. Beyond answering whether private equity improves operating efficiency, we identify which of the typical restructuring tools drive it. Using a set of over 300 international leveraged buyout transactions from the last thirty years, we find that while there is vast improvement in operational efficiency, these gains vary considerably. Our top-performing transactions are subject to strong equity incentives, frequent asset restructuring, and tight control by the investor. Furthermore, investor experience has a positive, and financial leverage a negative, influence on operational performance.
We develop a dynamic network model with heterogeneous banks which undertake optimizing portfolio decisions subject to liquidity and capital constraints and trade in the interbank market, whose equilibrium is governed by a tâtonnement process. Due to the micro-founded structure of the decision process as well as the iterative dynamic adjustment taking place in the market, the links in the network are endogenous and evolve dynamically. We use the model to assess the diffusion of systemic risk, the contribution of each bank to it, as well as the evolution of the network in response to financial shocks and across different prudential policy regimes.
We develop a dynamic network model with heterogeneous banks which undertake optimizing portfolio decisions subject to liquidity and capital constraints and trade in the interbank market, whose equilibrium is governed by a tâtonnement process. Due to the micro-founded structure of the decision process as well as the iterative dynamic adjustment taking place in the market, the links in the network are endogenous and evolve dynamically. We use the model to assess the diffusion of systemic risk (measured as default probability), the contribution of each bank to it, as well as the evolution of the network in response to financial shocks and across different prudential policy regimes.
This paper presents a theory that explains why it is beneficial for banks to engage in circular lending activities on the interbank market. Using a simple network structure, it shows that if there is a non-zero bailout probability, banks can significantly increase the expected repayment of uninsured creditors by entering into cyclical liabilities on the interbank market before investing in loan portfolios. Therefore, banks are better able to attract funds from uninsured creditors. Our results show that implicit government guarantees incentivize banks to have large interbank exposures, to be highly interconnected, and to invest in highly correlated, risky portfolios. This can serve as an explanation for the observed high interconnectedness between banks and their investment behavior in the run-up to the subprime mortgage crisis.
The paper analyzes the mutual influence of a bank's capital structure and investment decision, as well as the incentive effects of bank executives' compensation schemes on these decisions. If the government implicitly or explicitly insures deposits and/or the bank's debt, banks are incentivized to invest in risky assets and to operate with high leverage. Capital regulation could potentially solve this excessive risk-taking problem, but only if the regulator can observe and properly measure the bank's investment risks, which was called into question during the 2008-09 financial crisis. Hence, we propose a regulatory approach that can also implement first-best risk-taking by the bank but does not require the regulator to know the bank's investment risk: capital requirements that are made contingent on management compensation.
Option-implied information and predictability of extreme returns : [Version 24 September 2012]
(2012)
We study whether the option-implied conditional expectation of market loss due to tail events, or tail loss measure, contains information about future returns, especially negative ones. Our tail loss measure predicts future market returns as well as the magnitude and probability of market crashes, beyond and above other option-implied variables. A stock-specific tail loss measure predicts individual expected returns and the magnitude of realized stock-specific crashes in the cross-section of stocks. An investor, especially one who cares about the left tail of her wealth distribution (e.g., a disappointment-averse investor), benefits from using the tail loss measure as an information variable to construct managed portfolios of a risk-free asset and the market index. The tail loss measure is motivated by results from extreme value theory and is computed from observed prices of out-of-the-money puts as the risk-neutral expected value of a loss beyond a given relative threshold.
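The last sentence describes backing the measure out of put prices. A minimal numerical sketch of the underlying identity, using Black-Scholes prices only as smooth stand-in data (function names and parameters are illustrative assumptions, not the paper's estimator): with zero rates, the put price equals E^Q[(K − S_T)^+], its strike derivative approximates Q(S_T < K), and their ratio is the expected loss below K conditional on ending up below it.

```python
import numpy as np
from scipy.stats import norm

# Black-Scholes put prices serve only as stand-in data for a smooth
# OTM put curve; real applications would use observed market quotes.
def bs_put(s, k, t, sigma, r=0.0):
    d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
    d2 = d1 - sigma * np.sqrt(t)
    return k * np.exp(-r * t) * norm.cdf(-d2) - s * norm.cdf(-d1)

def conditional_tail_loss(put_prices, strikes, i):
    # dPut/dK by central difference approximates Q(S_T < K_i); with
    # r = 0 the put price itself is E^Q[(K_i - S_T)^+], so the ratio
    # is the expected loss below K_i conditional on landing there.
    prob_below = (put_prices[i + 1] - put_prices[i - 1]) / (
        strikes[i + 1] - strikes[i - 1])
    return put_prices[i] / prob_below

strikes = np.linspace(60.0, 95.0, 71)  # OTM put strikes, spot = 100
puts = bs_put(100.0, strikes, t=0.25, sigma=0.3)
ctl = conditional_tail_loss(puts, strikes, i=35)  # threshold K = 77.5
```

With real option quotes, the same finite difference would be taken across observed strikes; the Black-Scholes inputs merely generate a well-behaved put curve for the sketch.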
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed’s Greenbook projections. Point forecasts of some models are of similar accuracy as the forecasts of nonstructural large dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections for medium term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
This paper investigates the accuracy of forecasts from four DSGE models for inflation, output growth and the federal funds rate using a real-time dataset synchronized with the Fed's Greenbook projections. Conditioning the model forecasts on the Greenbook nowcasts leads to forecasts that are as accurate as the Greenbook projections for output growth and the federal funds rate. Only for inflation are the model forecasts dominated by the Greenbook projections. A comparison with forecasts from Bayesian VARs shows that the economic structure of the DSGE models, which is useful for the interpretation of forecasts, does not lower forecast accuracy. Combining forecasts of several DSGE models increases precision relative to individual model forecasts. Comparing density forecasts with the actual distribution of observations shows that DSGE models overestimate the uncertainty around point forecasts.
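The gain from combining model forecasts reported in these abstracts has a simple mechanical core: averaging forecasts whose errors are imperfectly correlated shrinks the error variance. A toy sketch with synthetic stand-ins for the four model forecast series (all numbers are made up):

```python
import numpy as np

# Synthetic illustration: four forecast series with independent errors
# around the same actuals; the equal-weight combination has a lower
# RMSE than any single model. All series are made-up stand-ins.
rng = np.random.default_rng(1)
actual = 2.0 + 0.1 * rng.standard_normal(200).cumsum()  # e.g. inflation

models = np.stack([actual + 0.8 * rng.standard_normal(200)
                   for _ in range(4)])
combined = models.mean(axis=0)  # simple average of model forecasts

def rmse(forecast):
    return float(np.sqrt(np.mean((forecast - actual) ** 2)))

rmse_individual = [rmse(m) for m in models]
rmse_combined = rmse(combined)  # smaller than each individual RMSE
```

With four independent error series of equal variance, the combination's error standard deviation is halved; correlated errors weaken, but rarely eliminate, the gain.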
How can the common currency project restore its credibility? Otmar Issing argues that a common currency without political union can only function with the no-bail-out principle. At the same time, he warns against pushing ahead with political union merely as a means of crisis management.
Großer Beifall
(2012)
The German Corporate Governance Code (Deutscher Corporate Governance Kodex) is intended to make the German corporate governance system transparent and comprehensible. The Code presents statutory provisions on the management and supervision of German listed companies and contains internationally recognised standards of good and responsible corporate governance. This statement addresses the amendments proposed by the Government Commission on the German Corporate Governance Code.
This text summarises a study, written for the Federal Ministry of Food, Agriculture and Consumer Protection, on the customer value of investment advice. It highlights the considerable potential of investment advice whose incentives are aligned with clients' interests and criticises the currently low transparency of performance in the market. It recommends introducing a standardised vocabulary for portfolio risk and ensuring that all investors have access to easily understandable and comparable information on historical portfolio risk and historical portfolio returns. The study focuses on securities advice and thus primarily on the subset of consumers who hold investable assets. Its core ideas on performance transparency and a standardised risk vocabulary can, however, also be transferred to other settings, for example the retirement provision market.
In the event of a Greek exit from the Eurozone, the stronger members of the monetary union, especially Germany, face at least two risks. First, the debt of the Greek National Bank vis-à-vis the Eurosystem of central banks will most likely be lost. Second, the large flow of capital from Greece and other periphery countries to Germany will accelerate inflation.
This article discusses the recently published draft bill of a Financial Stability Act (Finanzstabilitätsgesetz), under which the Bundesbank is to receive a mandate for macroprudential supervision from 2013. Decisions on its proposals, which are likely to be addressed primarily to the Federal Government or to the Federal Financial Supervisory Authority (BaFin), will be taken by a new Financial Stability Committee at the Federal Ministry of Finance. The author examines the composition of this committee and the instruments available for implementing its recommendations.
The idea of appointing a non-national as Central Bank Governor remains surprisingly controversial. Nevertheless, given the skills required by the Governor in order to manage what no doubt are increasingly complex institutions, considering non-nationals makes good sense for at least two reasons. First, increasing the pool of candidates to include those with broader skills and backgrounds makes it easier to find a suitable person for the job. Second, non-nationals are less likely to be beholden to domestic pressure groups and could help better insulate the central bank from political pressures.
The exceptional circumstances in which the ECB has been operating in the past years are testing not only the currency union itself, but also its institutional design. While the Governing Council of the ECB was designed to mainly set interest rates optimally for the union as a whole, the recent crisis has expanded the tools of the ECB to include unconventional monetary policy actions that potentially increase the risk exposure of its balance sheet. Since each country would contribute to the losses according to its capital key, a different voting mechanism that takes into account the single country’s contribution to the ECB’s capital could be advisable.
After initial temporary measures in support of Greece proved insufficient to end the sovereign debt crisis, extensive countermeasures ensued. Over the course of the past two years, the heads of state of the euro group have agreed on permanent support mechanisms, and the European Central Bank (ECB) has become involved in the assistance program. The article provides an overview of the various support mechanisms installed and cautions about the legal problems connected with them.
In 2011, the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel was awarded to the US economists Thomas J. Sargent of New York University and Christopher A. Sims of Princeton University. German newspaper commentaries in particular frequently criticised the researchers for using "unrealistic" assumptions such as utility maximisation and rational expectations. This criticism fails to recognise Sargent's and Sims's seminal contributions to the development of modern macroeconomics. Their empirical methods are now standard tools of academic research and are also used by economists at central banks, finance ministries, and international organisations. They have made fundamental new insights possible, for example into the transmission of monetary and fiscal policy.
In its decision of December 13, 2011, the Constitutional Court of the state of North Rhine-Westphalia ruled that the constitution grants a State Court of Auditors broad powers to audit not only the immediate state administration but also entities outside it, insofar as they exercise financial responsibility for the state. This ruling may have serious implications for the capital guarantees extended by EU Member States to newly established institutions at the European level, such as the European Stability Mechanism (ESM).
Schlechte Erfahrungen
(2012)
A transaction tax on financial trades would raise less money than many of its supporters hope, and it carries serious economic and legal risks. The Federal Government should be aware of the burdens a financial transaction tax entails and should not introduce one without the participation of the world's leading financial centres. International agreement on stricter capital requirements for banks must take priority.
This contribution draws on two recent publications in which the macroeconomic model data base (www.macromodelbase.com) is employed for model comparisons. The comparative approach is used to base policy analysis on a systematic evaluation of the different implications that a certain economic policy can have when submitted to different modeling approaches. In this manner, policy recommendations are more robust to modeling uncertainty. By extending the comparative approach to forecasting, the authors investigate the accuracy of different forecasting models and obtain more reliable mean forecasts.
How can the anthropological foundation of ordoliberalism and the Social Market Economy be described? What premises underlie it? Is such a conception of man still appropriate today? What dangers is it exposed to, and what institutional precautions can be taken? These and other questions are at the centre of the following essay, which is structured as follows: the second chapter analyses the anthropological foundations of ordoliberalism, with an emphasis on the Kantian "programme of freedom as autonomy". The third chapter addresses potential threats to this ordoliberal conception of man, examining in particular the concentration of economic power and the instrumentalisation and functionalisation of science. The penultimate chapter critically evaluates the institutional precautions that are frequently proposed; important contributions here come from Röpke (clercs and nobilitas naturalis) and Hayek (his two-chamber constitutional model).
As recent newspaper headlines show, the topic of patents and patent law remains heavily disputed. In this paper I approach this topic from a theoretical-historical perspective and from the history of economic thought, linking the nineteenth-century patent controversy with Walter Eucken's ordoliberalism, a German version of neoliberalism. The paper is structured as follows: the second chapter provides a historical introduction, centred on the nineteenth-century European controversy over patent laws and the arguments for and against presented by the anti-patent/free-trade movement and by the advocates of patent protection. The focus is on the struggle over the protection of inventions and innovations in nineteenth-century Germany, since Walter Eucken, the main representative of the Freiburg School of Law and Economics, picks up the counter-arguments presented in the national debate, in particular by the Kongress deutscher Volkswirthe. The third chapter deals intensively with the question of whether patent laws are just "nonsense upon stilts" from an ordoliberal perspective; here, Eucken's arguments against the patent system of his time are elaborated in detail. The paper ends with a summary of the main findings.
Central banks today are not tasked with controlling the money supply. Their job is to stabilise the value of money, and with it the price of goods in the respective currency. But how is price stability best achieved? Must the money supply nevertheless be kept in view? Among monetary economists there is an ongoing scholarly debate about this.
The latest appointment to the ECB's Executive Board initiated a political dispute between the European Parliament and the Euro Group over the representation of women on the Executive Board and the Governing Council of the ECB. The dispute has raised awareness of the fact that a culture of equality and equal opportunity must be built from the ground up. A long-term plan that helps talented women emerge and prepares them to take on increasing responsibility is necessary to ensure a growing pool of qualified female candidates.
This comment suggests an amendment to the proposal for a directive of the European Parliament and of the Council establishing a framework for the recovery and resolution of credit institutions and investment firms. The current proposal focuses on bail-in but does not sufficiently take into account the pressure exerted on central bankers, supervisors, and politicians by the fear of interbank contagion. The only way out of this hold-up type of situation is bail-in bonds: dedicated loss-absorbing debt instruments whose status of being first in line in the event of default is clearly communicated from day one.
This note proposes a new concept for a European deposit insurance scheme that takes account of the strong political reservations against mutualising liability for bank deposits. The three-tier deposit insurance model sketched here retains existing national deposit insurance schemes, provides loss compensation at the European level, and prevents excessive risk-taking at the expense of the international community.
In this paper, we develop a state-dependent sensitivity value-at-risk (SDSVaR) approach that enables us to quantify the direction, size, and duration of risk spillovers among financial institutions as a function of the state of financial markets (tranquil, normal, and volatile). Within a system of quantile regressions for four sets of major financial institutions (commercial banks, investment banks, hedge funds, and insurance companies) we show that while small during normal times, equivalent shocks lead to considerable spillover effects in volatile market periods. Commercial banks and, especially, hedge funds appear to play a major role in the transmission of shocks to other financial institutions. Using daily data, we can trace out the spillover effects over time in a set of impulse response functions and find that they reach their peak after 10 to 15 days.
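The quantile regressions behind the SDSVaR approach minimise the "check" (pinball) loss. As a stand-in for a proper quantile-regression solver (not the paper's implementation; data and names are synthetic assumptions), this toy grid search recovers the slope of the 5% quantile of one institution's return on another's:

```python
import numpy as np

# Pinball ("check") loss: its minimiser over a linear predictor is the
# conditional quantile, which is what spillover quantile regressions
# estimate.
def pinball(u, q):
    return np.mean(np.maximum(q * u, (q - 1.0) * u))

rng = np.random.default_rng(2)
x = rng.standard_normal(2_000)            # return of the spillover source
y = 0.5 * x + rng.standard_normal(2_000)  # return of the receiving firm

q = 0.05                                  # the 5% (VaR) quantile
alphas = np.linspace(-3.0, 0.0, 61)
betas = np.linspace(0.0, 1.0, 51)         # assumes a non-negative slope
alpha, beta = min(
    ((a, b) for a in alphas for b in betas),
    key=lambda ab: pinball(y - ab[0] - ab[1] * x, q),
)
# beta estimates the marginal effect of a shock in x on the 5% quantile
# (the VaR) of y; here it should land near the true slope of 0.5.
```

A production estimator would use a linear-programming or interior-point quantile-regression routine rather than a grid, but the objective being minimised is the same.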
An analyst who works in Germany is more likely to publish a high (low) price target for a DAX30 stock if other Germany-based analysts are also optimistic (pessimistic) about the same stock. This finding is not driven by the fact that DAX30 companies are headquartered in Germany. In bull markets, the price targets of analysts who regularly exchange opinions are more highly correlated than those of other analysts; this effect vanishes in a bearish market environment. This suggests that communication among analysts indeed plays an important role. However, analysts' incentives induce them not to deviate too much from the overall average during an economic downturn.
In this paper, I propose a simple asset pricing model that accounts for the influence of social interaction. Investors are assumed to form their view of an asset's price based on a forecasting strategy and its past profitability, as well as on the contemporaneous expectations of other market participants. Empirically analysing stocks of the DAX30 index, I provide evidence that social interaction tends to destabilise financial markets; at the very least, it does not have a stabilising effect.
In this paper, I analyse the reciprocal social influence on investment decisions within an international group of roughly 2000 mutual fund managers that invested in companies of the DAX30. Using a robust estimation procedure, I provide empirical evidence that, on average, a fund manager puts 0.69% more portfolio weight on a particular stock if other fund managers increase the corresponding position by 1%. The dynamics of this influence on portfolio weights suggest that fund managers adjust their behaviour according to the prevailing market situation and are more strongly influenced by others in times of an economic downturn. Analysing the working locations of the fund managers, I conclude that more than 90% of the magnitude of influence is due to pure observation. While this form of influence varies considerably over time, the magnitude of influence resulting from the exchange of opinion is more or less constant.
This thesis studies the behavior of banks in the financial markets they frequently use to obtain short-term as well as long-term financing. In the first chapter we incorporate an interbank market for collateralized lending among banks into a dynamic stochastic general equilibrium (DSGE) framework to analyze the impact of variations in the expected value of collateral on the interbank lending volume. We find that a central bank that lowers the haircut on eligible collateral in repurchase agreements is able to stimulate interbank markets. In the second chapter a microeconomic model of bank behavior on the interbank market is set up to analyze how the risk-taking of interbank borrowers and uncertainty about their balance sheet quality affect the behavior of interbank lenders. We find that disruptions in the interbank market are the result of optimal behavior on the part of lending banks in response to uncertainty about the balance sheet quality of a borrowing bank. In the third chapter we regress monthly data on German bank bond spreads on bank-specific risk factors to assess the degree of market discipline in the German bank bond market. The results for the whole market show no signs of market discipline in bond spreads. However, a structural break analysis reveals that since the beginning of the financial crisis the German bank bond market has exhibited at least a weak form of market discipline for bonds issued by medium-sized and large banks.
We test whether investor mood affects trading with data on all stock market transactions in Finland, utilizing variation in daylight and local weather. We find some evidence that environmental mood variables (local weather, length of day, daylight saving and lunar phase) affect investors’ direction of trade and volume. The effect magnitudes are roughly comparable to those of classical seasonals, such as the Monday effect. The statistical significance of the mood variables is weak in many cases, however. Only very little of the day-to-day variation in trading is collectively explained by all mood variables and calendar effects, but lower frequency variation seems connected to holiday seasons.
Management Summary: Conducted within the project “Economic Implications of New Models for Information Supply for Science and Research in Germany”, the Houghton Report for Germany provides a general cost and benefit analysis for scientific communication in Germany comparing different scenarios according to their specific costs and explicitly including the German National License Program (NLP).
Based on the scholarly lifecycle process model outlined by Björk (2007), the study compared the following scenarios according to their accounted costs:
- Traditional subscription publishing,
- Open access publishing (Gold Open Access; refers primarily to journal publishing where access is free of charge to readers, while the authors or funding organisations pay for publication)
- Open Access self-archiving (authors deposit their work in online open access institutional or subject-based repositories, making it freely available to anyone with Internet access; further divided into (i) 'Green Open Access' self-archiving operating in parallel with subscription publishing; and (ii) the 'overlay services' model in which self-archiving provides the foundation for overlay services (e.g. peer review, branding and quality control services))
- the NLP.
Within all scenarios, five core activity elements (fund research and research communication; perform research and communicate the results; publish scientific and scholarly works; facilitate dissemination, retrieval and preservation; study publications and apply the knowledge) were modeled and priced together with all their constituent activities.
Modelling the impacts of an increase in accessibility and efficiency resulting from more open access on returns to R&D over a 20 year period and then comparing costs and benefits, we find that the benefits of open access publishing models are likely to substantially outweigh the costs and, while smaller, the benefits of the German NLP also exceed the costs.
This analysis of the potential benefits of more open access to research findings suggests that different publishing models can make a material difference to the benefits realised, as well as the costs faced. It seems likely that more Open Access would have substantial net benefits in the longer term and, while net benefits may be lower during a transitional period, they are likely to be positive both for 'author-pays' Open Access publishing and the 'overlay journals' alternatives ('Gold Open Access') and for parallel subscription publishing and self-archiving ('Green Open Access'). The NLP returns substantial benefits and savings at a modest cost, yielding one of the highest benefit/cost ratios available from unilateral national policies during a transitional period (second to 'Green Open Access' self-archiving). Whether 'Green Open Access' self-archiving in parallel with subscriptions is a sustainable model over the longer term is debatable, and the impact the NLP may have on the take-up of Open Access alternatives is also an important consideration, as is the potential for developments in Open Access or other scholarly publishing business models to significantly change the relative cost-benefit of the NLP over time.
The results are comparable to those of previous studies from the UK and the Netherlands. Green Open Access in parallel with the traditional model yields the best benefit/cost ratio. Beyond its benefit/cost ratio, the case for the NLP rests on its enforceability. The true cost of toll-access publishing (beside the 'buyback' of information) is that it bars society's access to research and knowledge.
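The 20-year cost-benefit comparison described above can be sketched as a discounted comparison of benefit and cost streams per scenario. All figures below are made-up placeholders, not the report's numbers, and the scenario names are merely labels for the models discussed.

```python
# Stylised sketch of the cost-benefit logic: discount 20-year benefit
# and cost streams per scenario and compare benefit/cost ratios. All
# figures are made-up placeholders, not the report's numbers.
def npv(stream, rate=0.035):
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(stream, start=1))

YEARS = 20
scenarios = {
    # scenario: (annual benefit, annual cost), illustrative units only
    "gold_oa": (120.0, 60.0),
    "green_oa": (100.0, 20.0),
    "nlp": (40.0, 15.0),
}
ratios = {name: npv([b] * YEARS) / npv([c] * YEARS)
          for name, (b, c) in scenarios.items()}
# A ratio above 1 means discounted benefits exceed discounted costs;
# ranking scenarios by this ratio mirrors the study's comparison.
```

In a fuller model the benefit stream would itself be derived from returns to R&D under increased accessibility and efficiency, rather than assumed constant.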
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact, focusing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
We examine both the degree and the structural stability of inflation persistence at different quantiles of the conditional inflation distribution. Previous research focused exclusively on persistence at the conditional mean of the inflation rate. Economic theory, however, provides various reasons (for example, downward wage rigidities or menu costs) to expect higher inflation persistence at the upper than at the lower tail of the conditional inflation distribution.
Based on post-war US data, we indeed find slower mean reversion in response to positive than to negative shocks. We find robust evidence for a structural break in persistence at all quantiles of the inflation process in the early 1980s. Inflation persistence has decreased and become more homogeneous across quantiles. Persistence at the conditional mean has become more informative about the degree of persistence across the entire conditional inflation distribution. While prior to the 1980s inflation was not mean reverting in response to large positive shocks, our evidence strongly suggests that since the end of the Volcker disinflation the unit root can be rejected at every quantile, including the upper tail of the conditional inflation distribution.
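The idea of quantile-specific persistence can be sketched with a toy quantile autoregression. The AR(1) data, the grid-search estimator, and all parameter values below are illustrative assumptions, not the paper's estimator or data; with symmetric shocks the slope is similar across quantiles, whereas the asymmetries discussed above would appear as a larger slope in the upper tail.

```python
import random

random.seed(1)

# Simulate a toy AR(1) "inflation" series: pi_t = rho * pi_{t-1} + eps_t
rho_true = 0.8
pi = [0.0]
for _ in range(4000):
    pi.append(rho_true * pi[-1] + random.gauss(0.0, 1.0))

def check_loss(u, tau):
    # Koenker-Bassett check function rho_tau(u)
    return u * (tau - (1.0 if u < 0 else 0.0))

def quantile_persistence(series, tau):
    """Grid search for the slope b minimizing the tau-quantile check loss of
    pi_t = a + b * pi_{t-1} + u_t; for each candidate b the intercept a is
    profiled out as the tau-quantile of the residuals."""
    x, y = series[:-1], series[1:]
    best_b, best_loss = None, float("inf")
    for i in range(121):               # b on a grid from 0.00 to 1.20
        b = i / 100.0
        resid = sorted(yi - b * xi for xi, yi in zip(x, y))
        a = resid[int(tau * len(resid))]
        loss = sum(check_loss(yi - a - b * xi, tau) for xi, yi in zip(x, y))
        if loss < best_loss:
            best_b, best_loss = b, loss
    return best_b

for tau in (0.1, 0.5, 0.9):
    print(tau, quantile_persistence(pi, tau))
```

In applied work one would use a proper quantile regression routine (e.g. interior-point or simplex based) rather than this grid search, but the profiled check-loss objective is the same object being minimized.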
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated, and forecasts derived, successively from historical U.S. data vintages synchronized with the Fed's Greenbook projections. Point forecasts of some models are similar in accuracy to the forecasts of nonstructural large-dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections at medium-term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
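Why a simple average of model forecasts can beat the individual models is easy to illustrate with simulated point forecasts; the three "models" and their error processes below are purely hypothetical and unrelated to the DSGE models in the paper.

```python
import random

random.seed(7)

T = 500
# Hypothetical inflation outcomes
actual = [random.gauss(2.0, 1.0) for _ in range(T)]

# Three hypothetical model forecasts: truth plus independent idiosyncratic errors
forecasts = [[a + random.gauss(0.0, s) for a in actual] for s in (0.8, 1.0, 1.2)]

def mse(pred, actual):
    return sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual)

# Equal-weight combination: simple average of the individual point forecasts
combined = [sum(f[t] for f in forecasts) / len(forecasts) for t in range(T)]

individual_mses = [mse(f, actual) for f in forecasts]
print("individual MSEs:", [round(m, 3) for m in individual_mses])
print("combined MSE:  ", round(mse(combined, actual), 3))
```

By convexity of the squared error, the mean squared error of the equal-weight average can never exceed the average of the individual MSEs; when the models' errors are imperfectly correlated, the combination typically beats even the best single model, which is the mechanism behind the gains from forecast combination noted above.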
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides that aim to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model database to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust, as long as it responds to current outcomes of these variables.
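A first-difference rule of the kind described adjusts the *change* in the policy rate in response to the inflation gap and output growth, so no estimate of the output gap level is needed. The sketch below is illustrative only: the 0.5 coefficients, the 2 percent targets, and the sample path are assumptions, not the calibration used in the paper.

```python
def difference_rule(i_prev, inflation, output_growth,
                    pi_star=2.0, alpha=0.5, beta=0.5, growth_star=2.0):
    """One step of an illustrative first-difference interest rate rule:
    the rate is adjusted from its previous level in response to the current
    inflation gap and the deviation of output growth from trend. Because no
    output *level* gap appears, the rule sidesteps the output gap
    mismeasurement problem discussed above."""
    return i_prev + alpha * (inflation - pi_star) + beta * (output_growth - growth_star)

# Hypothetical path: inflation 1pp above target, growth at trend ->
# the rate rises by 0.5pp per period until inflation comes back down.
rate = 2.0
for _ in range(3):
    rate = difference_rule(rate, inflation=3.0, output_growth=2.0)
print(rate)
```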
Throughout his life, Goethe engaged with economic theories; in his literary work he sketched economic visions that the late 19th and early 20th centuries overlooked. His positive vision of capitalism is permeated by a morality that restrains extreme forms of acquisitiveness and exploitation. Against this stands Goethe's nightmare image of a capitalism that appears modern to us, as conjured up in »Faust«.
Numerous cuts to the pension system have massively increased the importance of private retirement provision in recent years. Alongside home ownership, life insurance, and state-subsidized private pension schemes, self-directed retirement saving through securities accounts has also become established, so that the number of private securities accounts has risen from 8.0 to 27.9 million over the past 25 years. Against this background, the question of how well investors invest their money is of central importance.
Hardly any other industry has been shaped as profoundly by the advent of information technology over the past two decades as securities trading. Traditional business models have changed fundamentally. The classic trading floor has given way almost entirely to electronic trading platforms.
The bill always comes at the end, and paying it is rarely a pleasure. But if we are reluctant to pay in the first place, the payment method itself should at least be simple, universally available, and very secure. Security in particular is a key issue for payments in the 21st century, one addressed by interdisciplinary research from economics, computer science, and law in a rapidly evolving environment.
The focus of this cumulative dissertation is the study of the management of public-private information technology (IT) partnerships. This chapter therefore first explains the relevance of research on public-private partnerships (PPPs) in the IT domain and then the central research questions under investigation. Finally, the structure of this thesis is briefly outlined to show how the central research questions were addressed.
...
As described in the introduction, the central motivation for this cumulative dissertation was the lack of research on conceptualizing public and private organizational cultures and on analyzing their effects on the successful realization of IT PPPs. The case studies therefore aimed to examine how IT PPPs can be made successful despite initial difficulties, and how an IT PPP can be established and sustained over time.
The first central research question accordingly focused on analyzing the differences between public and private organizational cultures. Article 1 answers this question by conceptualizing public and private organizational cultures (comprising divergent mindsets, knowledge bases, and organizational structures) and provides first insights into their effects on public-private collaboration and on the success of IT PPPs. Article 1 thereby contributes to the IT PPP research domain. It also contributes to the theoretical domain of organizational cultural differences by revealing the influence of public and private norms and values on organization-specific behavior. In sum, Article 1 illustrates that establishing a sustainable IT PPP requires awareness and understanding of the differences between public and private organizational cultures in order to negotiate a mode of cooperation.
Articles 2, 3, and 4 extend the findings on how differing organizational cultures affect public-private collaboration in IT PPPs by analyzing the reasons for, and approaches to, making an IT PPP successful despite initial difficulties. These articles thus answer the second central research question. Article 2 contributes to the research domain by examining the role of divergent understandings and expectations, whose effects are amplified by the PPP context, in the failure of IT PPP projects. Its theoretical contribution lies in identifying the causes of psychological contract violation at the individual level in IT PPPs. Building on these findings, Article 2 illustrates the importance of maintaining informal relationships for realizing IT projects in the PPP context. Article 3 builds on the results of Article 2 and illustrates the organizational preconditions and management practices (likewise shaped by the specific PPP context) for restoring a sustainable partnership in failing IT PPP projects. Article 3 thereby contributes to the IT PPP research domain. From a theoretical perspective, Article 3 extends the existing literature on boundary spanning by examining the necessary preconditions and activities of boundary spanning at the organizational level for bridging the cultural gap in inter-organizational collaborations. Based on these findings, Article 3 shows that turning around the negative trajectory of an IT PPP project requires establishing an unencumbered public-private relationship and continuously cultivating both the partnership and the relationships with stakeholders. Article 4 likewise analyzes the reasons for, and approaches to, successful IT PPPs. Building on Article 1's conceptualization of public and private organizational cultures, Article 4 extends that analysis by detailing how organization-specific behaviors are rooted and how they impede public-private collaboration. In addition, Article 4 extends the findings of the IT PPP research domain by illustrating how differences between organizational cultures can be balanced in order to establish a sustainable partnership and to make IT PPPs successful from administrative, political, and business perspectives. With regard to the theoretical domain of organizational cultural differences, Article 4 provides in-depth insights into the influence of the norms and routines of public and private logics on the characteristics of public and private organizational cultures, and explains in detail how cultural differences affect organization-specific behavior. Finally, Article 4 discusses the hierarchy of IT PPP success criteria from administrative, political, and business perspectives and illustrates that the success of IT PPPs depends on negotiating compromises on shared partnership goals and procedures across the parties' diverse interests.
Article 5 integrates the results of Articles 1 through 4 and examines how an IT PPP is established and sustained over time, thereby answering the third central research question. By analyzing the partnership development process in IT PPP projects, which is exposed to the challenge of colliding public and private organizational cultures, Article 5 explains the three phases of IT PPP development and the events that trigger the transitions between phases. In addition to these contributions to the research domain, Article 5 illustrates the interplay of competing institutional logics over time and documents the gradual replacement of competing logics by a new, dominant logic. In sum, Article 5 shows that establishing and sustaining a durable, bilateral partnership between public and private organizations requires negotiating a mode of cooperation by softening public and private institutional norms and principles and by bringing the parties' mindsets closer together.
Overall, this cumulative dissertation provides insights into the differences between public and private organizational cultures, into how these differences impede public-private collaboration in IT PPPs, into the management practices for establishing and sustaining a durable partnership over time, and into how IT PPPs can be made successful. It thus creates a starting point for deeper research on IT PPP management, for analysis of the specific PPP context, and for improving the realization of IT PPP projects.
This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, which is in contrast to the existing literature that applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth to be a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant by itself, but also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
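As background for the dynamic programming approach, the unconstrained problem rests on the standard Merton-type Hamilton-Jacobi-Bellman equation; the notation below is generic (wealth W, risky fraction π, consumption c) and is not taken from the paper.

```latex
% V(t,W): value function; r: risk-free rate; mu, sigma: risky asset drift
% and volatility; u(.): utility of consumption; U(.): terminal utility.
V_t + \max_{\pi,\, c}\Big[ \big(rW + \pi W(\mu - r) - c\big) V_W
      + \tfrac{1}{2}\,\pi^2 \sigma^2 W^2 V_{WW} + u(c) \Big] = 0,
\qquad V(T, W) = U(W).
```

The paper's construction can then be read as modifying the terminal condition: the optimal constrained terminal wealth is conjectured to be a contingent claim on the optimal unconstrained terminal wealth, which is what makes the resulting value function non-separable.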
This chapter aims to provide a hands-on approach to New Keynesian models and their uses for macroeconomic policy analysis. It starts by reviewing the origins of the New Keynesian approach, the key model ingredients and representative models. Building blocks of current-generation dynamic stochastic general equilibrium (DSGE) models are discussed in detail. These models address the famous Lucas critique by deriving behavioral equations systematically from the optimizing and forward-looking decision-making of households and firms subject to well-defined constraints. State-of-the-art methods for solving and estimating such models are reviewed and presented in examples. The chapter goes beyond the mere presentation of the most popular benchmark model by providing a framework for model comparison along with a database that includes a wide variety of macroeconomic models. Thus, it offers a convenient approach for comparing new models to available benchmarks and for investigating whether particular policy recommendations are robust to model uncertainty. Such robustness analysis is illustrated by evaluating the performance of simple monetary policy rules across a range of recently estimated models, including some with financial market imperfections, and by reviewing recent comparative findings regarding the magnitude of government spending multipliers. The chapter concludes with a discussion of important objectives for ongoing and future research using the New Keynesian framework.
Debt-induced crises, including the subprime, are usually attributed exclusively to supply-side factors. We examine the role of social influences on debt culture, emanating from perceived average income of peers. Utilizing unique information from a household survey representative of the Dutch population, that circumvents the issue of defining the social circle, we consider collateralized, consumer, and informal loans. We find robust social effects on borrowing, especially among those who consider themselves poorer than their peers; and on indebtedness, suggesting a link to financial distress. We employ a number of approaches to rule out spurious associations and to handle correlated effects.
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks thereby fostering a comparative instead of insular approach to model development.
We investigate the decisions of listed firms to go private again. We begin by showing that while a significant number of firms that go public are VC-backed, a disproportionate share of these VC-backed firms later go private (after staying on the exchange for an average of 8.5 years). We interpret this very robust pattern as indicating that IPOs of VC-backed firms are to a large extent a temporary rather than a permanent feature of these firms' corporate governance. We investigate several potential hypotheses for why VCs seem able to bring marginal firms to the exchange by relating going-private decisions to various characteristics of the IPO market as well as to VC characteristics. We find strong support for the certification ability of VCs: more experienced and reputable VCs are better able to bring marginal firms to public exchanges via an IPO. These marginal firms backed by more reputable and experienced VCs are more likely to go private later on. Hence, our analysis suggests that IPOs backed by experienced VCs are most likely a temporary rather than the final stage in the life of the portfolio firm. We find no support for the notion that reputable VCs underprice their IPO exits more, implying that they need not leave more money on the table to take marginal firms public.
This thesis deals with continuous-time portfolio optimization and with topics in credit risk. The goal of portfolio optimization is to find the best possible consumption and investment strategies for a given initial capital. This thesis focuses in particular on the influence of income on these decisions. Since the future income stream is random and no financial products exist that can replicate it, incorporating income into portfolio optimization poses a major problem: the assumptions of a complete market no longer hold, so the standard solution methods cannot be applied. This thesis analyzes several variants of this problem and discusses various solution approaches. Furthermore, this study examines the influence of a firm's credit risk on its stock return, with particular reference to an anomaly that has been discussed extensively in the literature: firms with high default probabilities earn lower returns than firms with lower default probabilities. A further question in the area of credit risk is the extent to which models are able to price and hedge structured products. This thesis attempts to provide answers to these questions.