Building on a literature review, the current state of technical development in the recovery of phosphate and nitrogen compounds from domestic wastewater is outlined: besides (chemical) recovery from the wastewater itself, the use of anaerobic processes, and recovery from sewage sludge, irrigation with wastewater, composting, and the fractionation of wastewater ("yellow water") also offer ways to make better use of the wastewater's nutrient content. The resulting overview of the current state of nutrient recovery served to identify possible development tasks that appear urgent (in particular for solving global problems, e.g. ending resource scarcity) and whose solution, at the same time, demands especially innovative work. The development tasks were sharpened into theses so that they could subsequently be tested in a Delphi survey.
Building on a literature review, the current state of technical development in greywater recycling is outlined. Alongside mechanical-biological plants there are occasional membrane filtration plants, but also "low-tech" plants. The overview helped to identify possible development tasks that appear urgent (in particular for solving future water quantity problems) and whose solution, at the same time, demands especially innovative work. The development tasks were sharpened into theses so that they could subsequently be tested in a Delphi survey.
Building on a literature review, the current state of technical development in energy recovery from municipal wastewater is outlined. Besides heat recovery, which is possible both in the sewer network and decentrally in buildings, the discussion covered biogas production both at aerobic treatment plants and in anaerobic plants, the subsequent upgrading of the sewage gases to natural gas quality, and the use of sludge as fuel. The account of the current state of development helped to identify possible development tasks that, on the one hand, urgently promise to allow wastewater to be treated as an energy resource in the future and, on the other hand, whose solution demands especially innovative work. The development tasks were sharpened into theses so that they could subsequently be tested in a Delphi survey.
This study is devoted to the phonetic motivation of phonological palatalization processes in which front vocoids trigger the palatalization (or affrication) of preceding plosives. Acoustic analyses of German and Bulgarian voiceless alveolar and velar stops are used to examine the influence of following front vocoids and of the low vowel /a/ on the noise-like phase after the release of the consonants' closure. In order to test a hierarchy of the likely input candidates for palatalization, formulated according to universal phonological principles, acoustic measurements of the duration and the spectral properties of the consonantal segment in word-initial consonant-vocoid sequences are presented. The results of the study only partially support the proposed hierarchy hypothesis and show that language-specific characteristics influence the ordering of the elements of the hierarchy.
We investigate methods and tools for analyzing translations between programming languages with respect to observational semantics. The behavior of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
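To fix notation, the adequacy condition mentioned above can be stated schematically as follows (a sketch in standard notation; the precise observation predicates vary by calculus):

$$\tau \text{ is adequate} \iff \forall s,t:\ \tau(s) \sim_{obs} \tau(t) \;\Rightarrow\; s \sim_{obs} t,$$

where $s \sim_{obs} t$ holds iff $C[s]$ and $C[t]$ agree on may- and must-convergence for every program context $C$; adequacy is thus exactly the reflection of program equivalence along the translation $\tau$.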
This paper proves several generic variants of context lemmas and thus contributes to improving the tools for observational semantics of deterministic and non-deterministic higher-order calculi that use a small-step reduction semantics. The generic (sharing) context lemmas are provided for may- as well as two variants of must-convergence, which hold in a broad class of extended process- and extended lambda calculi, if the calculi satisfy certain natural conditions. As a guideline, the proofs of the context lemmas are valid in call-by-need calculi, in call-by-value calculi if substitution is restricted to variable-by-variable, and in process calculi like variants of the π-calculus. For calculi employing beta-reduction using a call-by-name or call-by-value strategy or similar reduction rules, some iu-variants of ciu-theorems are obtained from our context lemmas. Our results reestablish several context lemmas already proved in the literature, and also provide some new context lemmas as well as some new variants of the ciu-theorem. To make the results widely applicable, we use a higher-order abstract syntax that allows untyped calculi as well as certain simple typing schemes. The approach may lead to a unifying view of higher-order calculi, reduction, and observational equality.
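For orientation, a may-convergence context lemma typically has the following shape (a sketch; the admissible class of contexts and the side conditions are exactly what the generic variants above parameterize):

$$\big(\forall R:\ R[s]\downarrow \Rightarrow R[t]\downarrow\big) \;\Longrightarrow\; \big(\forall C:\ C[s]\downarrow \Rightarrow C[t]\downarrow\big),$$

where $R$ ranges over reduction contexts only and $C$ over all contexts, so the contextual preorder can be established by testing the much smaller class $R$.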
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in form of a single-step rewriting system that defines a (non-deterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop different proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the ordering $\le_c$ is also analysed: Ω is not a least element, and $\le_c$ already implies contextual equivalence w.r.t. may-convergence.
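In the notation used here, the contextual preorder combining both observations reads (a sketch, with ↓ denoting may- and ⇓ must-convergence):

$$s \le_c t \;:\Longleftrightarrow\; \forall C:\ \big(C[s]\downarrow \Rightarrow C[t]\downarrow\big) \wedge \big(C[s]\Downarrow \Rightarrow C[t]\Downarrow\big),$$

and contextual equivalence is $s \sim_c t :\Longleftrightarrow s \le_c t \wedge t \le_c s$. The final result above states that $s \le_c t$ alone already forces $C[s]$ and $C[t]$ to agree on may-convergence in both directions.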
This paper discusses the effect of capital regulation on the risk taking behavior of commercial banks. We first show theoretically that capital regulation works differently in different market structures of banking sectors. In weakly concentrated markets, capital regulation is effective in mitigating risk taking behavior because banks' franchise values are low and banks have incentives to pursue risky strategies in order to increase their franchise values. If franchise values are high, on the other hand, the effect of capital regulation on bank risk taking is ambiguous as banks lack those incentives. We then test the model predictions on a cross-country sample including 421 commercial banks from 61 countries. We find that capital regulation is effective in mitigating risk taking only in markets with a low degree of concentration. The results remain robust after accounting for financial sector development, legal system efficiency, and other country- and bank-specific characteristics. Keywords: Banks, market structure, risk shifting, franchise value, capital regulation
CONTENTS
Preamble
1. Concept and Drivers of Globalization
  1.0 A Brief Historical Perspective
  1.1 Concept of Globalization
  1.2 Economic Globalization
  1.3 Drivers of Economic Globalization
2. Globalization and Markets
  2.1 The Free Market System
  2.2 Markets and the Solution of Economic Problems
  2.3 African Markets and “Getting the Prices Right”
  2.4 Implications of the Imperfect Market System
  2.5 Government’s Inevitable Role
  2.6 The International Environment/Markets
3. Globalization and Trade Liberalisation
  3.1 The Experience of the Developing Countries
  3.2 Nigeria’s Experience with Trade Liberalisation
4. Global Economic Integration and Sub-Saharan Africa
  4.1 Global Economic Integration
  4.2 Africa’s Integration with the World Economy
  4.3 The Benefits of Economic Globalization and Sub-Saharan Africa
  4.4 Why has Africa Lagged?
5. Nigeria and the Global Economy
  5.1 Openness of the Economy and Integration with the World Economy
  5.2 Globalization and Nigeria’s Trade
  5.3 Globalization and Foreign Capital Flows to Nigeria
  5.4 Foreign Capital Flows and Debt Accumulation
  5.5 Globalization, Growth and Development
6. Appropriate Policy Responses and Lessons
7. Concluding Remarks
8. Appreciation
9. Annex
10. References
In his short essay, Heiner Boehncke describes the development of the cultural project >Literaturland Hessen<, which has meanwhile become an established name and stands, well beyond Hesse, for successful cultural cooperation. Today the project >Literaturland Hessen< is a cooperation of Hessischer Rundfunk with the Hessian Ministry for Science and Culture, ADAC Hessen/Thüringen, the Hessian Literature Council and the Kulturstiftung der Sparkassen Hessen/Thüringen.
This paper discusses the implications of transnational media production and diasporic networks for the cultural politics of migrant minorities. How are fields of cultural politics transformed if Hirschman’s famous options ‘exit’ and ‘voice’ no longer constitute mutually exclusive responses to dissent within a nation-state, but are modes of action that can combine and build upon each other in the context of migration and diasporic media activism? Two case studies are discussed in more detail, relating to Alevi amateur television production in Germany and to a Kurdish satellite television station that reaches out to a diaspora across Europe and the Middle East. Keywords: migrant media, transnationalism, Alevis, Kurds, Turkey, Germany
Hong Kong’s Linked Exchange Rate System (LERS) has been in operation for twenty-five years, during which time many other fixed exchange rate systems have succumbed to shocks and/or speculative attacks. This fact alone suggests that the LERS is a robust system which enjoys a large measure of credibility in financial markets. This paper investigates whether this is indeed the case, and whether it has been the case throughout its 25-year history. In particular, we use the tools of modern finance to extract information from financial asset prices about market expectations that are related to the credibility of the LERS. The main focus is on how market participants ‘judged’ the various changes made to the LERS, such as the ‘seven technical measures’ introduced in September 1998 and the ‘three refinements’ made in May 2005. These changes have been characterized as making the system less discretionary over time, and we hypothesize that they have also made it more credible, as revealed in the prices of exchange-rate-related assets. We also investigate the relationship between interest rates and exchange rates in the current system in light of modern models of target-zone exchange rate systems. We examine whether the intramarginal intervention in November 2007 changed the dynamic properties of the exchange rate as suggested by such models.
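For context, the canonical target-zone model invoked here (a Krugman-style setup, stated as background rather than as this paper's specification) links the log exchange rate $s$ to fundamentals $f$ and its own expected change:

$$s_t = f_t + \alpha\,\frac{E_t[ds_t]}{dt}, \qquad \alpha > 0.$$

In a credible band the exchange rate is expected to revert toward the centre, so via interest parity the interest differential should move systematically against the rate's position in the band; this is the kind of interest rate / exchange rate relationship examined above.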
Contents: Prof. Dr. Helmut Siekmann: Statement for the public hearing of the Budget Committee on the bill of the SPD and Bündnis 90/Die Grünen parliamentary groups for an act amending the Hessian state budget code (Hessische Landeshaushaltsordnung, LHO): Drucksache 17/265. List of those invited to be heard in the Budget Committee on 17.09.2008 concerning Drucksache 17/265.
Ensuring financial stability : financial structure and the impact of monetary policy on asset prices
(2008)
This paper studies the responses of residential property and equity prices, inflation and economic activity to monetary policy shocks in 17 countries, using data spanning 1986-2006. We estimate VARs for individual economies and panel VARs in which we distinguish between groups of countries on the basis of the characteristics of their financial systems. The results suggest that using monetary policy to offset asset price movements in order to guard against financial instability may have large effects on economic activity. Furthermore, while financial structure influences the impact of policy on asset prices, its importance appears limited. Keywords: asset prices, monetary policy, panel VAR. JEL Number: C23, E52
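As a sketch of the country-level estimation step (illustrative only: synthetic data, assumed variable names, and a recursive ordering given by the column order; the paper's dataset and identification details are not reproduced here), using Python's statsmodels:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical quarterly series for one economy: policy rate, residential
# property prices, equity prices, CPI, real activity (names are assumptions).
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(84, 5)),
                    columns=["rate", "property", "equity", "cpi", "gdp"])

model = VAR(data)
res = model.fit(maxlags=4, ic="aic")   # choose lag order by AIC
irf = res.irf(12)                      # impulse responses over 12 quarters

# Response of property prices to a one-s.d. policy-rate shock
# (orthogonalized via Cholesky, using the column order above):
print(irf.orth_irfs[:, data.columns.get_loc("property"),
                    data.columns.get_loc("rate")])
```

A panel VAR as in the paper would pool such country blocks within each financial-structure group; the single-country fit above shows only the mechanical core of the exercise.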
Contents: Prof. Dr. Helmut Siekmann: Using the Föderalismuskommission II for a future-proof design of the financial systems. Statement for the expert hearing of the Budget and Finance Committee of the Landtag of North Rhine-Westphalia on 14.02.2008: Stellungnahme 14/1785. Motion of the BÜNDNIS 90/Die Grünen parliamentary group in the Landtag of North Rhine-Westphalia: Drucksache 14/4338. Catalogue of questions for the expert hearing of the Budget and Finance Committee and the Main Committee on 14.02.2008.
Recently, the Bank of Japan outlined a “two perspectives” approach to the conduct of monetary policy that focuses on risks to price stability over different time horizons. Interpreting this as pertaining to different frequency bands, we use band spectrum regression to study the determination of inflation in Japan. We find that inflation is related to money growth and real output growth at low frequencies and the output gap at higher frequencies. Moreover, this relationship reflects Granger causality from money growth and the output gap to inflation in the relevant frequency bands. Keywords: spectral regression, frequency domain, Phillips curve, quantity theory. JEL Numbers: C22, E3, E5
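The band spectrum regression used here can be sketched in a few lines (a minimal implementation under simple assumptions: OLS on the Fourier ordinates of a chosen frequency band, with synthetic monthly data; the paper's exact bands and specifications are not reproduced):

```python
import numpy as np

def band_spectrum_ols(y, X, low, high):
    """OLS restricted to a frequency band: Fourier-transform y and X,
    keep only ordinates with frequency (cycles per observation) in
    [low, high), and regress on the stacked real and imaginary parts."""
    n = len(y)
    freqs = np.fft.rfftfreq(n)                 # 0 .. 0.5 cycles/observation
    keep = (freqs >= low) & (freqs < high)
    fy = np.fft.rfft(y)[keep]
    fX = np.fft.rfft(X, axis=0)[keep]
    Y = np.concatenate([fy.real, fy.imag])
    Z = np.concatenate([fX.real, fX.imag])
    beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return beta

# Illustrative synthetic monthly data; "low frequency" here means
# cycles longer than about 8 years (an assumption for the example).
rng = np.random.default_rng(1)
money_growth = rng.normal(size=480)
output_gap = rng.normal(size=480)
inflation = 0.5 * money_growth + 0.3 * output_gap + rng.normal(size=480)
X = np.column_stack([money_growth, output_gap])
print(band_spectrum_ols(inflation, X, 0.0, 1 / 96))   # low-frequency band
print(band_spectrum_ols(inflation, X, 1 / 96, 0.5))   # higher frequencies
```

Running the same regression on disjoint bands, as above, is what allows money growth to matter at low frequencies and the output gap at higher ones.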
We study the effect of randomness in the adversarial queueing model. All proofs of instability for deterministic queueing strategies exploit a finespun strategy of insertions by an adversary. If the local queueing decisions in the network are subject to randomness, it is far from obvious that an adversary can still trick the network into instability. We show that uniform queueing is unstable even against an oblivious adversary. Consequently, randomizing the queueing decisions made to operate a network is not in itself a suitable fix for poor network performance due to packet pileups.
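To make the model concrete, here is a toy harness for uniform queueing (whenever a queue forwards, the packet is chosen uniformly at random among those waiting). The random injection pattern is a stand-in for an adversary schedule, not the instability construction from the paper; the network, a directed ring, is likewise only illustrative:

```python
import random

def simulate_uniform_queueing(n_nodes, steps, inject_rate, seed=0):
    """Toy adversarial-queueing simulation on a directed ring.
    Each step: inject a packet with probability inject_rate at a random
    node; every non-empty queue forwards one uniformly chosen packet to
    the next edge; packets leave after traversing n_nodes edges.
    Returns the backlog (packets in flight) over time."""
    rng = random.Random(seed)
    queues = [[] for _ in range(n_nodes)]   # queues[i]: packets at edge i
    backlog = []
    for _ in range(steps):
        if rng.random() < inject_rate:
            queues[rng.randrange(n_nodes)].append(n_nodes)  # hops remaining
        moved = []
        for i, q in enumerate(queues):
            if q:
                pkt = q.pop(rng.randrange(len(q)))          # uniform queueing
                if pkt > 1:
                    moved.append(((i + 1) % n_nodes, pkt - 1))
        for j, pkt in moved:
            queues[j].append(pkt)
        backlog.append(sum(len(q) for q in queues))
    return backlog

print(simulate_uniform_queueing(8, 1000, 0.9)[-5:])
```

Instability in the paper's sense means the backlog can be driven to grow without bound under an admissible injection rate; the harness only illustrates the queueing discipline itself.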
Purim and parodies
(2008)
This text is intended as a brief presentation of the phonological tones encountered in the Bantu languages spoken in Gabon. What is new here, compared with what is known about the analysis of tone in Bantu languages in general, is that intonation is taken into account in explaining certain lexical-level tonal modifications for which the lexical tones (fixed or floating) alone cannot account.
Preface: Climate is of public as well as scientific interest above all because it is variable and because such changes can have grave ecological and socio-economic consequences. In detail, however, climate changes exhibit complicated temporal and spatial structures whose detection and interpretation are anything but simple. Among the temporal structures, attention rightly focuses on relatively long-term trends, because they express systematic climate change, and on extreme events, because of their particularly severe impacts. Our working group has repeatedly dealt with both aspects in depth; on extreme events and extreme value statistics see, for example, institute reports nos. 1, 2 and 5 and the literature cited there.

This report is once again concerned with climate trends, and especially with their spatial structures: relatively long-term, and thus systematic, climate change proceeds very differently from region to region, which is best conveyed by trend maps. Such regional, sometimes very small-scale, peculiarities are particularly pronounced for precipitation, and the spatial trend structures also differ strongly between seasons and months. In our working group, Dr. Jörg Rapp studied this problem intensively in his diploma thesis and above all in his doctoral thesis, which led to the publication of the „Atlas der Niederschlags- und Temperaturtrends in Deutschland 1891-1990“ (Rapp and Schönwiese, 2nd ed. 1996) and the „Climate Trend Atlas of Europe – Based on Observations 1891-1990“ (Schönwiese and Rapp, 1997). The wide attention these works received had long made an update appear necessary. This was first done for the German climate trend atlas, which is now available for the interval 1901-2000 (institute report no. 4, 2005). Here a corresponding update for Europe is presented, based on the computations that Reinhard Janoschitz carried out in his diploma thesis. There is a close link to the project VASClimO (Variability Analysis of Surface Climate Observations), which was kindly funded by the Federal Ministry of Education and Research (BMBF) within DEKLIM (the German climate research programme); see institute report no. 6, into which a few European climate trend maps were already incorporated in advance.

With the publication of the present „Klima-Trendatlas Europa 1901-2000“, comprehensive information on climate change in Europe is again made available, in a total of 261 maps (17 of them integrated into the text in colour). The maps rest predominantly on linear trend analyses of near-surface air temperature and precipitation for the period 1901-2000 and for the sub-intervals 1951-2000, 1961-1990 and 1971-2000, in each case based on annual, seasonal and monthly observational data. The significance of the trends is marked by hatching in the (black-and-white) map section. Since the analysis closely follows the above-cited work of Schönwiese and Rapp (1997), where detailed textual explanations can be found (as also in Rapp, 2000), the text section here has been kept very brief.
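As a rough illustration of the per-grid-point computation behind such trend maps (a sketch under simple assumptions: an ordinary least-squares trend with a t-test on synthetic data, not the atlas's exact methodology):

```python
import numpy as np
from scipy import stats

def linear_trend(series, years):
    """OLS trend (units per decade) and its two-sided p-value for one
    station or grid-point series."""
    res = stats.linregress(years, series)
    return res.slope * 10.0, res.pvalue

# Illustrative: annual precipitation anomalies at one grid point, 1901-2000.
rng = np.random.default_rng(2)
years = np.arange(1901, 2001)
precip = 0.3 * (years - 1950) / 10.0 + rng.normal(scale=15.0, size=100)

trend, p = linear_trend(precip, years)
print(f"trend: {trend:+.2f} per decade, significant at 5%: {p < 0.05}")
```

Repeating this for every grid point and season yields the trend fields; the significance flag corresponds to the hatching in the map section.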
We investigate, using the 2002 US Health and Retirement Study, the factors influencing individuals’ insecurity and expectations about terrorism, and study the effects the latter have on households’ portfolio choices and spending patterns. We find that females, the religiously devout, those equipped with a better memory, the less educated, and those living close to where the events of September 2001 took place worry a lot about their safety. In addition, fear of terrorism discourages households from investing in stocks, mostly through the high levels of insecurity felt by females. Insecurity due to terrorism also makes single men less likely to own a business. Finally, we find evidence of expenditure shifting away from recreational activities that can potentially leave one exposed to a terrorist attack and towards goods that might help one cope with the consequences of terrorism materially (increased use of the car and spending on the house) or psychologically (spending on personal care products by females in couples).
We document significant and robust empirical relationships in cross-country panel data between government size or social expenditure on the one hand, and trade and financial development indicators on the other. Across countries, deeper economic integration is associated with more intense government redistribution, but more developed financial markets weaken that relationship. Over time, controlling for country-specific effects, public social expenditure appears to be eroded by globalization trends where financial market development can more easily substitute for it.
The paper provides novel insights on the effect of a firm’s risk management objective on the optimal design of risk transfer instruments. I analyze the interrelation between the structure of the optimal insurance contract and the firm’s objective to minimize the required equity it has to hold to accommodate losses in the presence of multiple risks and moral hazard. In contrast to the case of risk aversion and moral hazard, the optimal insurance contract involves a joint deductible on aggregate losses in the present setting.
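Formally, a joint deductible on aggregate losses means an indemnity of the form (stated here only to fix the term; $d$ denotes the deductible and $L_1,\dots,L_n$ the individual losses):

$$I(L_1,\dots,L_n) = \max\Big(\sum_{i=1}^{n} L_i - d,\ 0\Big),$$

so coverage is triggered by the aggregate loss rather than by separate per-risk thresholds.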
This paper analyzes liquidity in an order driven market. We not only investigate the best limits in the limit order book, but also take into account the book behind these inside prices. When subsequent prices are close to the best ones and depth at them is substantial, larger orders can be executed without an extensive price impact and without deterring liquidity. We develop and estimate several econometric models, based on depth and prices in the book, as well as on the slopes of the limit order book. The dynamics of different dimensions of liquidity are analyzed: prices, depth at and beyond the best prices, as well as resiliency, i.e. how fast the different liquidity measures recover after a liquidity shock. Our results show a somewhat less favorable image of liquidity than often found in the literature. After a liquidity shock (in the spread or depth or in the book beyond the best limits), several dimensions of liquidity deteriorate at the same time. Not only does the inside spread increase and depth at the best prices decrease; the difference between subsequent bid and ask prices may also become larger, and the depth provided at them decreases. The impacts are both econometrically and economically significant. Also, our findings point to an interaction between different measures of liquidity, between liquidity at the best prices and beyond in the book, and between the ask and bid side of the market.
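One simple way to operationalize a "slope of the book", sketched below, is the fitted slope of cumulative depth against distance from the midquote on one side of the book (this particular definition is an assumption for illustration; the paper's exact slope measures are not reproduced):

```python
import numpy as np

def book_slope(prices, depths, mid):
    """OLS slope of cumulative depth against absolute distance from the
    midquote for one side of the book. A steeper slope means more depth
    concentrated close to the best prices."""
    dist = np.abs(np.asarray(prices, dtype=float) - mid)
    cum_depth = np.cumsum(depths)
    slope, _intercept = np.polyfit(dist, cum_depth, 1)
    return slope

# Illustrative ask side: five price levels above a midquote of 100.
ask_prices = [100.02, 100.04, 100.07, 100.11, 100.16]
ask_depths = [500, 300, 400, 250, 600]   # shares at each level
print(book_slope(ask_prices, ask_depths, mid=100.0))
```

Tracking such a slope over time, alongside the spread and depth at the best quotes, gives the multi-dimensional picture of liquidity and resiliency described above.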
Previous evidence suggests that less liquid stocks entail higher average returns. Using NYSE data, we present evidence that both the sensitivity of returns to liquidity and liquidity premia have significantly declined over the past four decades to levels that we cannot statistically distinguish from zero. Furthermore, the profitability of trading strategies based on buying illiquid stocks and selling liquid stocks has declined over the past four decades, rendering such strategies virtually unprofitable. Our results are robust to several conventional liquidity measures related to volume. When using a liquidity measure that is not related to volume, we find only weak evidence of a liquidity premium even in the early periods of our sample. The gradual introduction and proliferation of index funds and exchange traded funds is a possible explanation for these results.
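A minimal sketch of the kind of long-short strategy described (using Amihud's |return|/dollar-volume ratio as the volume-based illiquidity proxy, an assumption for illustration; the paper's exact measures and sample construction are not reproduced):

```python
import numpy as np
import pandas as pd

# Synthetic daily data: returns and dollar volume for 200 stocks.
rng = np.random.default_rng(3)
n_stocks, n_days = 200, 250
returns = pd.DataFrame(rng.normal(0, 0.02, (n_days, n_stocks)))
dollar_volume = pd.DataFrame(rng.lognormal(15, 1, (n_days, n_stocks)))

# Per-stock Amihud illiquidity, then decile sort.
amihud = (returns.abs() / dollar_volume).mean()
decile = pd.qcut(amihud, 10, labels=False)

long_leg = returns.loc[:, decile == 9].mean(axis=1)    # most illiquid decile
short_leg = returns.loc[:, decile == 0].mean(axis=1)   # most liquid decile
strategy = long_leg - short_leg
print(f"annualized mean spread return: {strategy.mean() * 250:.2%}")
```

The paper's finding is that the mean of this spread return, computed on real data, has drifted toward zero over the past four decades.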
This paper addresses and resolves the issue of microstructure noise when measuring the relative importance of home and U.S. market in the price discovery process of Canadian interlisted stocks. In order to avoid large bounds for information shares, previous studies applying the Cholesky decomposition within the Hasbrouck (1995) framework had to rely on high frequency data. However, due to the considerable amount of microstructure noise inherent in return data at very high frequencies, these estimators are distorted. We offer a modified approach that identifies unique information shares based on distributional assumptions and thereby enables us to control for microstructure noise. Our results indicate that the role of the U.S. market in the price discovery process of Canadian interlisted stocks has been underestimated so far. Moreover, we suggest that rather than stock specific factors, market characteristics determine information shares.
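For reference, the information share of market $j$ in the Hasbrouck (1995) framework discussed here is, in standard notation,

$$IS_j = \frac{\big([\psi F]_j\big)^2}{\psi \Omega \psi'},$$

where $\psi$ is the common row of the long-run impact matrix of the VECM, $\Omega$ the covariance matrix of the price innovations, and $F$ its Cholesky factor ($\Omega = FF'$). The dependence of $F$ on the chosen variable ordering is what produces the upper and lower bounds at low sampling frequencies; the modified approach above replaces the Cholesky step with distributional identification so that a unique share obtains.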
Innovative automated execution strategies like Algorithmic Trading gain significant market share on electronic market venues worldwide, although their impact on market outcome has not yet been investigated in depth. In order to assess the impact of such concepts, e.g. effects on price formation or on the volatility of prices, a simulation environment is presented that provides stylized implementations of algorithmic trading behavior and allows for modeling latency. As simulations can reproduce exactly the same basic situation, the impact of an algorithmic trading model on different characteristics of market outcome can be assessed by comparing simulation runs that include and runs that exclude a trader following that model. The results indicate that the larger the volume the algorithmic trader has to execute, the greater its impact on market prices. On the other hand, lower latency appears to lower market volatility.
Macro announcements change the equilibrium risk-free rate. We find that Treasury prices reflect part of the impact instantaneously, but intermediaries rely on their customer order flow in the 15 minutes after the announcement to discover the full impact. We show that this customer flow informativeness is strongest at times when analyst forecasts of macro variables are highly dispersed. We study 30-year Treasury futures to identify the customer flow. We further show that intermediaries appear to benefit from privately recognizing informed customer flow, as, in the cross-section, their own-account trade profitability correlates with access to customer orders, controlling for volatility, competition, and the announcement surprise. These results suggest that intermediaries learn about equilibrium risk-free rates through customer orders.
We report evidence that the presence of hidden liquidity is associated with greater liquidity in the order books, greater trading volume, and smaller price impact. Limit and market order submission behavior changes when hidden liquidity is present, consistent with at least some traders being able to detect hidden liquidity. We estimate a model of liquidity provision that allows us to measure variations in the marginal and total payoffs from liquidity provision in states with and without hidden liquidity. Our estimates of the expected surplus to providers of visible and hidden liquidity are positive and typically of the order of one-half to one basis point per trade. The positive liquidity provider surpluses combined with the increased trading volume when hidden liquidity is present are both consistent with liquidity externalities.
This paper considers a trading game in which sequentially arriving liquidity traders either opt for a market order or for a limit order. One class of traders is considered to have an extended trading horizon, implying their impatience is linked to their trading orientation. More specifically, sellers are considered to have a trading horizon of two periods, whereas buyers only have a single-period trading scope (the extended buyer-horizon case is completely symmetric). Clearly, as the life span of their submitted limit orders is longer, this setting implies sellers are granted a natural advantage in supplying liquidity. This benefit is hampered, however, by the direct competition arising between consecutively arriving sellers. Closed-form characterizations of the order submission strategies are obtained when solving for the equilibrium of this dynamic game. These allow us to examine how these forces affect traders’ order placement decisions. Further, the analysis yields insight into the dynamic process of price formation and into the market clearing process of a non-intermediated, order driven market.
In the last few years, many of the world’s largest financial exchanges have converted from mutual, not-for-profit organizations to publicly-traded, for-profit firms. In most cases, these exchanges have substantial responsibilities with respect to enforcing various regulations that protect investors from dishonest agents. We examine how the incentives to enforce such regulations change as an exchange converts from mutual to for-profit status. In contrast to oft-stated concerns, we find that, in many circumstances, an exchange that maximizes shareholder (rather than member) income has a greater incentive to aggressively enforce these types of regulations.
The execution, clearing, and settlement of financial transactions are all subject to substantial scale and scope economies which make each of these complementary functions a natural monopoly. Integration of trade, execution, and settlement in an exchange improves efficiency by economizing on transactions costs. When scope economies in clearing are more extensive than those in execution, integration is more costly, and efficient organization involves a trade-off of scope economies and transactions costs. A properly organized clearing cooperative can eliminate double marginalization problems and exploit scope economies, but can result in opportunism and underinvestment. Moreover, a clearing cooperative may exercise market power. Vertical integration and tying can foreclose entry, but foreclosure can be efficient because market power rents attract excessive entry. Integration of trading and post-trade services is the modal form of organization in financial markets, which is consistent with the hypothesis that transactional efficiencies explain organizational arrangements in these markets.
Central counterparties
(2008)
Central counterparties (CCPs) have increasingly become a cornerstone of financial markets infrastructure. We present a model where trades are time-critical, liquidity is limited and there is limited enforcement of trades. We show a CCP novating trades implements efficient trading behaviour. It is optimal for the CCP to face default losses to achieve the efficient level of trade. To cover these losses, the CCP optimally uses margin calls, and, as the default problem becomes more severe, also requires default funds and then imposes position limits.