Working Paper
This paper employs individual bidding data to analyze the empirical performance of the longer-term refinancing operations (LTROs) of the European Central Bank (ECB). We investigate how banks' bidding behavior is related to a series of exogenous variables, such as collateral costs, interest rate expectations, and market volatility, and to individual bank characteristics such as country of origin, size, and experience. Panel regressions reveal that a bank's bidding depends on bank characteristics. Yet different bidding behavior generally does not translate into differences in bidder success. In contrast to the ECB's main refinancing operations, we find evidence of a winner's curse effect in LTROs. Our results indicate that LTROs lead neither to market distortions nor to unfair auction outcomes. JEL classification: E52, D44
Even though tourism is recognised today as an important field for transnational research, there have been few attempts to place tourism in the context of transnational theories or to think about transnationalism from the perspective of tourists. I argue that researching tourist practices can add important aspects to transnational approaches. The prerequisites of mobility and interaction, for example, are the features backpackers choose to describe what their Round-The-World-Trip is about. A form of tourism is adopted, or created, that itself confronts many aspects of globalisation. First, there is the immense dynamism involved: backpackers try to cover as many places and experiences as possible, travelling at high speed. They take in all kinds of tourist experiences, ranging from beach to adventure to culture tourism. They do not focus on a specific area or country but travel the world, crossing national borders perpetually. Additionally, they form a transnational network in which they interact with strangers of similar backgrounds (other backpackers, tourism professionals). This network helps them interact with people from different backgrounds (the so-called hosts or locals). My research suggests that backpackers forge a certain identity from these transnational practices, which I call 'globedentity'. Globedentity denotes a type of identity construction that not only refers to the individual (I) but reflects the world (globe) in this identity. This globedentity is not fixed but is perpetually re-created and re-defined. It also embraces the growing popular awareness of globalisation, with which backpackers, coming from highly educated middle-class backgrounds, have particularly identified. Owing to their constant awareness of the latest global social, cultural and economic developments, these educated milieus know exactly which tools to use to become successful members of their societies.
Taxation and tax policy reform appear on the political agenda in most advanced welfare states in Europe and North America. Of course, studies of taxation and tax policy are nothing new and have existed ever since people have paid taxes. The current work is situated in the context of the future of the welfare state and the reinforced international economic and political integration referred to as "globalization." The purpose of this paper is to analyze how globalization is affecting tax policy in advanced welfare states. In comparing the evolution of tax policy in Canada with that in the United States, Germany and Sweden from 1960 to 1995, I review the conventional anti-globalization thesis, i.e., that globalization leads to a "race to the bottom" in revenue and expenditure policies, or, as others have called it, a "beggar the neighbour policy" (Tanzi and Bovenberg 1990, 187). ... Conclusion: The empirical data and theoretical models clearly show that globalization is one relatively minor factor among many that explain tax policy reforms. And even that limited influence is mediated by domestic political systems, institutions and constellations of actors. As the data have shown, the conventional globalization thesis of a race to the bottom is not borne out. Tax rates and tax revenues are still increasing, despite the ongoing trend toward international trade integration. Countervailing pressures, such as the high cost of welfare programs, different parties in government, strong labour unions, and institutional veto players, counteract the pressure of globalization on tax policy. As for the future of taxation in Canada, it is more likely to be one of gradual evolution than radical change.
Although the data don’t show any downward pressure on tax rates and tax revenues comparatively speaking, there are at least four key factors in Canada that are likely to put pressure on future tax rates, although regional political dynamics and the workings of fiscal federalism suggest that tax reductions will be a higher priority in some provinces than others (Hale 2002). First, neoliberalism will continue to shape fiscal and tax policy, including the role of the tax system in delivering social policies and programs in most parts of Canada. Second, governments that seek to define their own economic and social priorities rather than simply react to events beyond their borders will have to exercise centralized control over budgetary policies and spending levels if they hope to foster the economic growth needed to finance social services in the context of Canada’s changing demographics. Third, the ability of governments to combine the promotion of economic growth and higher living standards will be closely linked to their ability to develop a workable division of responsibilities among federal and provincial governments and with other national governments. Finally, the diffusion of new technologies will continue to transform national and regional economies while giving individuals greater opportunity to avoid government and tax regulations that run contrary to their perceived interests and values. This discussion of determinants that shape tax policy reform has shown that successful management of fiscal and tax policy requires a capacity to set priorities; adapt to changing circumstances; and build a consensus that enables competing economic, social, regional and ideological interests to identify their own well-being in the broader political and economic environment. Tax policy is shaped by many political, economic and social determinants. 
As Geoffrey Hale correctly concludes, "it should not be surprising if the tax system stubbornly refuses to confirm either economic theories or political ideologies, but reflects past decisions and the policy tradeoffs of the political process" (2002, 71). The notion of tax policy being driven by globalization and forces associated with globalization (both positive and negative) is simply not borne out by the facts.
This study analyzes the social policy reforms in the United States and Canada during the 1990s from a comparative perspective. Particular attention is paid to the role of tax policy instruments in these reforms, and to the question of whether a new type of welfare state is emerging here. The first part of the paper sketches the model of the liberal welfare state established in comparative welfare state research, against which background the reforms in the United States and Canada are examined and compared. The paper then analyzes the output performance of the two welfare states from a broader comparative perspective. The primary normative criterion is the redistributive function of social policy instruments, understood here chiefly as income redistribution.
At the beginning of the 21st century, the state of US democracy is a matter of controversy. While some observers claim to detect an excessive responsiveness of the political system to the demands of its citizens, and therefore speak of demosclerosis and a hyperdemocracy in which the will of the people has been elevated to an untouchable, quasi-divine rank, others conclude that the Founding Fathers, guided by their fear of a "tyranny of the majority", did a thorough job and created a nearly insurmountable system of veto positions that structurally favors particular interests and therefore translates citizens' majority preferences into policy only in exceptional situations. In short: the Federalists' fear of a "tyranny of the majority" has opened the door to a "tyranny of the minority". The article attempts to locate the United States within this spectrum. Its aim is to problematize the quality of American democracy at the beginning of the 21st century, taking into account developments after September 11.
This study evaluates the effects of job creation schemes (Arbeitsbeschaffungsmaßnahmen, ABM) in Germany on the participants' individual probabilities of integration into regular employment. The analysis draws on a comprehensive and informative data set from the sources of the Federal Employment Agency (Bundesagentur für Arbeit, BA), which makes it possible to examine programme effects differentiated by individual characteristics of the participants and with regard to the heterogeneous structure of the labour market. The data set contains information on all participants in ABM who started their programmes in February 2000, and on a control group of non-participants who were unemployed in January 2000 and did not enter the programmes in February 2000. Using information from the employment statistics, it is possible for the first time to study transitions into regular employment on the basis of administrative data. The observation period extends to December 2002. Using matching methods based on the potential-outcomes approach, the effects of ABM are estimated with regional differentiation and for particular problem and target groups of the labour market. Although the results show clear differences in the effects across subgroups, the empirical findings on the whole indicate that the goal of integration into regular, unsubsidised employment was largely not achieved by ABM. JEL: C40, C13, J64, H43, J68
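The matching estimator described above can be sketched as follows. This is a minimal, self-contained illustration on synthetic data, not the study's actual BA data; the selection model, the covariate, and the "true" effect of -0.3 are invented for illustration:

```python
# Minimal sketch of nearest-neighbour propensity-score matching in the
# potential-outcomes framework. All data below are synthetic.
import math
import random

random.seed(0)

def logistic(v):
    return 1.0 / (1.0 + math.exp(-v))

# Simulate individuals: covariate x, treatment d (programme participation),
# binary outcome y (re-employment indicator).
n = 2000
data = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    d = 1 if random.random() < logistic(0.5 * x) else 0
    y = 1 if random.gauss(x - 0.3 * d, 1.0) > 0 else 0
    data.append((x, d, y))

# In practice the propensity score would be estimated (e.g. by a logit);
# here we simply reuse the known selection model.
def pscore(x):
    return logistic(0.5 * x)

treated = [(pscore(x), y) for x, d, y in data if d == 1]
controls = [(pscore(x), y) for x, d, y in data if d == 0]

# Average treatment effect on the treated (ATT): match every participant
# to the non-participant with the closest score, with replacement.
def att(treated, controls):
    diffs = []
    for p_t, y_t in treated:
        _, y_c = min(controls, key=lambda c: abs(c[0] - p_t))
        diffs.append(y_t - y_c)
    return sum(diffs) / len(diffs)

effect = att(treated, controls)
print(f"estimated ATT: {effect:+.3f}")
```

The study's regional and subgroup estimates would correspond to running such an estimator separately within each stratum of the data.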
A version of this paper was originally written for a plenary session about "The Futures of Ethnography" at the 1998 EASA conference in Frankfurt/Main. In the preparation of the paper, I sent out some questions to my former fellow researchers by e-mail. I thank Douglas Anthony, Jan-Patrick Heiß, Alaine Hutson, Matthias Krings, and Brian Larkin for their answers.
To resolve the IPO underpricing puzzle it is essential to analyze who knows what when during the issuing process. In Germany, broker-dealers make a market in IPOs during the subscription period. We examine these pre-issue prices and find that they are highly informative. They are closer to the first price subsequently established on the exchange than both the midpoint of the bookbuilding range and the offer price. The pre-issue prices explain a large part of the underpricing left unexplained by other variables. The results imply that information asymmetries are much lower than the observed variance of underpricing suggests.
Open source projects produce goods or standards that do not allow those who contribute to their production to appropriate private returns. In this paper we analyze why programmers nevertheless invest their time and effort to write open source software. We argue that the particular way in which open source projects are managed, and especially how contributions are attributed to individual agents, allows the best programmers to create a signal that more mediocre programmers cannot achieve. By setting themselves apart, they can turn this signal into monetary rewards that correspond to their superior capabilities. Given this incentive, they forgo the immediate rewards they could earn in software companies that produce proprietary software and restrict access to the source code of their products. Whenever institutional arrangements are in place that enable the acquisition of such a signal and its subsequent conversion into monetary rewards, contributing to open source projects and producing the resulting public good is a feasible outcome that can be explained by standard economic theory.
The paper is a follow-up to an article published in Technique Financière et Developpement in 2000 (see the appendix to the hardcopy version), which portrayed the first results of a new strategy in the field of development finance implemented in South-East Europe. This strategy consists in creating microfinance banks as greenfield investments, that is, building up new banks which specialise in providing credit and other financial services to micro and small enterprises, instead of transforming existing credit-granting NGOs into formal banks, which had been the dominant approach in the 1990s. The present paper shows that this strategy has, in the course of the last five years, led to the emergence of a network of microfinance banks operating in several parts of the world. After discussing why financial sector development is a crucial determinant of general social and economic development, and contrasting the new strategy with former approaches in the area of development finance, the paper provides information about the shareholder composition and the investment portfolio of what is at present the world's largest and most successful network of microfinance banks. This network is a good example of a well-functioning "private public partnership". The paper then provides performance figures and discusses why the creation of such a network seems to be a particularly promising approach to creating financially self-sustaining financial institutions with a clear developmental objective.
This paper provides an in-depth analysis of the properties of popular tests for the existence and the sign of the market price of volatility risk. These tests are frequently based on the fact that for some option pricing models under continuous hedging the sign of the market price of volatility risk coincides with the sign of the mean hedging error. Empirically, however, these tests suffer from both discretization error and model mis-specification. We show that these two problems may cause the test to be either no longer able to detect additional priced risk factors or to be unable to identify the sign of their market prices of risk correctly. Our analysis is performed for the model of Black and Scholes (1973) (BS) and the stochastic volatility (SV) model of Heston (1993). In the model of BS, the expected hedging error for a discrete hedge is positive, leading to the wrong conclusion that the stock is not the only priced risk factor. In the model of Heston, the expected hedging error for a hedge in discrete time is positive when the true market price of volatility risk is zero, leading to the wrong conclusion that the market price of volatility risk is positive. If we further introduce model mis-specification by using the BS delta in a Heston world we find that the mean hedging error also depends on the slope of the implied volatility curve and on the equity risk premium. Under parameter scenarios which are similar to those reported in many empirical studies the test statistics tend to be biased upwards. The test often does not detect negative volatility risk premia, or it signals a positive risk premium when it is truly zero. The properties of this test furthermore strongly depend on the location of current volatility relative to its long-term mean, and on the degree of moneyness of the option. 
As a consequence, tests reported in the literature may suffer from the problem that, in a time-series framework, the researcher cannot repeatedly draw hedging errors from the same distribution. This implies that there is no guarantee that the empirically computed t-statistic has the assumed distribution. JEL: G12, G13. Keywords: stochastic volatility, volatility risk premium, discretization error, model error
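The discretization error discussed above can be made concrete with a small Monte Carlo experiment: delta-hedging a sold call at discrete dates in a Black–Scholes world and inspecting the resulting terminal hedging error. This is an illustrative sketch, not the paper's test statistic, and all parameter values (S0 = 100, 13 rebalancing dates, drift 8%, etc.) are assumptions:

```python
# Discretely rebalanced Black-Scholes delta hedge of a short call.
# With continuous trading the hedging error would be zero almost surely;
# with discrete trading it is a random variable whose mean the tests
# discussed above try to exploit.
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_delta(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    return norm_cdf(d1)

random.seed(1)
S0, K, T, r, sigma, mu = 100.0, 100.0, 0.25, 0.02, 0.2, 0.08
n_steps, n_paths = 13, 5000          # roughly weekly rebalancing
dt = T / n_steps

errors = []
for _ in range(n_paths):
    S = S0
    cash = bs_call(S0, K, T, r, sigma)   # premium received for the call
    delta = bs_delta(S0, K, T, r, sigma)
    cash -= delta * S0                    # buy the initial hedge position
    for i in range(1, n_steps + 1):
        z = random.gauss(0.0, 1.0)
        S *= math.exp((mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
        cash *= math.exp(r * dt)          # cash account earns the risk-free rate
        tau = T - i * dt
        if tau > 1e-9:
            new_delta = bs_delta(S, K, tau, r, sigma)
        else:
            new_delta = 1.0 if S > K else 0.0
        cash -= (new_delta - delta) * S   # self-financing rebalancing
        delta = new_delta
    payoff = max(S - K, 0.0)
    errors.append(cash + delta * S - payoff)

mean_err = sum(errors) / n_paths
print(f"mean hedging error over {n_paths} paths: {mean_err:.4f}")
```

The point of the analysis above is precisely that the sign and size of `mean_err` depend on the trading frequency, the drift, and the (possibly mis-specified) model, so reading a risk premium off this number is delicate.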
In a framework closely related to Diamond and Rajan (2001), we characterize different financial systems and analyze the welfare implications of different lender-of-last-resort (LOLR) policies in these financial systems. We show that in a bank-dominated financial system it is less likely that a LOLR policy that follows the Bagehot rules is preferable. In financial systems with rather illiquid assets, discretionary individual liquidity assistance might be welfare-improving, while in market-based financial systems, with rather liquid assets on banks' balance sheets, emergency liquidity assistance provided freely to the market at a penalty rate is likely to be efficient. Thus, a "one size fits all" approach that does not take the differences between financial systems into account is misguided. JEL classification: D52, E44, G21, E52, E58
When options are traded, one can use their prices and price changes to draw inference about the set of risk factors and their risk premia. We analyze tests for the existence and the sign of the market prices of jump risk that are based on option hedging errors. We derive a closed-form solution for the option hedging error and its expectation in a stochastic jump model under continuous trading and correct model specification. Jump risk is structurally different from, e.g., stochastic volatility: there is one market price of risk for each jump size (and not just "the" market price of jump risk). Thus, the expected hedging error cannot identify the exact structure of the compensation for jump risk. Furthermore, we derive closed-form solutions for the expected option hedging error under discrete trading and model mis-specification. Compared to the ideal case, the sign of the expected hedging error can change, so that empirical tests based on simplifying assumptions about trading frequency and the model may lead to incorrect conclusions.
This paper deals with the superhedging of derivatives and with the corresponding price bounds. A static superhedge results in trivial and fully nonparametric price bounds, which can be tightened if there exists a cheaper superhedge in the class of dynamic trading strategies. We focus on European path-independent claims and show under which conditions such an improvement is possible. For a stochastic volatility model with unbounded volatility, we show that a static superhedge is always optimal, and that, additionally, there may be infinitely many dynamic superhedges with the same initial capital. The trivial price bounds are thus the tightest ones. In a model with stochastic jumps or non-negative stochastic interest rates either a static or a dynamic superhedge is optimal. Finally, in a model with unbounded short rates, only a static superhedge is possible.
Empirical evidence suggests that even those firms presumably most in need of monitoring-intensive financing (young, small, and innovative firms) have a multitude of bank lenders, of which one may be special in the sense of relationship lending. However, theory does not tell us much about the economic rationale for relationship lending in the context of multiple bank financing. To fill this gap, we analyze the optimal debt structure in a model that allows for multiple but asymmetric bank financing. The optimal debt structure balances the risk of lender coordination failure from multiple lending against the bargaining power of a pivotal relationship bank. We show that firms with low expected cash flows or low interim liquidation values of assets prefer asymmetric financing, while firms with high expected cash flows or high interim liquidation values of assets tend to finance without a relationship bank. JEL classification: G21, G78, G33
This paper suggests a motive for bank mergers that goes beyond alleged and typically unverifiable scale economies: the preemptive resolution of banks' financial distress. Such "distress mergers" can be a significant motivation for mergers because they can foster reorganizations, realize diversification gains, and avoid public attention. However, since none of these potential benefits comes without a cost, the overall assessment of distress mergers is unclear. We conduct an empirical analysis to provide evidence on the consequences of distress mergers. The analysis is based on comprehensive data from Germany's savings bank and cooperative bank sectors over the period 1993 to 2001. During this period both sectors faced significant structural problems, and superordinate institutions (associations) presumably engaged in coordinated actions to manage distress mergers. The data comprise 3640 banks and 1484 mergers. Our results suggest that bank mergers as a means of preemptive distress resolution have moderate costs in terms of their economic impact on performance. We do find strong evidence consistent with diversification gains. Thus, distress mergers seem to have benefits without adversely affecting systemic stability.
Tests for the existence and the sign of the volatility risk premium are often based on expected option hedging errors. When the hedge is performed under the ideal conditions of continuous trading and correct model specification, the sign of the premium is the same as the sign of the mean hedging error for a large class of stochastic volatility option pricing models. We show, however, that the problems of discrete trading and model mis-specification, which are necessarily present in any empirical study, may cause the standard test to yield unreliable results.
The determination of risk-adjusted discount rates is of central importance in business valuation. If the CAPM is used for this purpose in practical applications, risk-free interest rates and risk premia have to be determined, which in turn requires expected returns of the market portfolio and beta factors as measures of systematic risk. To match the expected surpluses being valued, the required returns used for discounting should also reflect the future returns of comparable investments expected at the valuation date. Most contributions on operationalising the CAPM, however, derive required returns from historical capital market returns. In this paper we show how expected future returns can be derived in a forward-looking manner from observable quantities, above all term structures of interest rates and observable analyst forecasts. This permits a conceptually more consistent valuation of the future surpluses expected at the valuation date with the future returns expected at the same time.
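A forward-looking required return of the kind described above can be illustrated by backing an implied cost of equity out of the current price and analyst forecasts, rather than averaging historical returns. The price, dividend forecasts, growth rate and risk-free rate below are invented, and the simple two-stage dividend discount model is just one possible specification, not the paper's method:

```python
# Back out the discount rate r that equates the present value of forecast
# dividends (plus a constant-growth terminal value) with the observed price.
# The implied r is a forward-looking expected return; subtracting a
# risk-free rate taken from the term structure gives an implied premium.

def implied_cost_of_equity(price, dividends, terminal_growth):
    """Solve price = sum_t D_t/(1+r)^t + D_n*(1+g)/((r-g)*(1+r)^n) by bisection."""
    def pv(r):
        n = len(dividends)
        value = sum(d / (1 + r) ** (t + 1) for t, d in enumerate(dividends))
        value += dividends[-1] * (1 + terminal_growth) / (
            (r - terminal_growth) * (1 + r) ** n)
        return value

    lo, hi = terminal_growth + 1e-6, 1.0   # pv is decreasing in r on this range
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if pv(mid) > price:
            lo = mid                        # discount rate too low, raise it
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative inputs: analyst dividend forecasts for three years, then 2% growth.
r_implied = implied_cost_of_equity(price=100.0,
                                   dividends=[4.0, 4.3, 4.6],
                                   terminal_growth=0.02)
r_f = 0.035                                # e.g. read off the term structure
print(f"implied expected return: {r_implied:.3%}, "
      f"implied risk premium: {r_implied - r_f:.3%}")
```

In the spirit of the paper, both inputs are observable at the valuation date, so the resulting discount rate is consistent in time with the expected surpluses it is applied to.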