The emergence of capitalism is said always to lead to extreme changes in the structure of a society. This view implies that capitalism is a universal and unique concept that requires an explicit institutional framework and does not distinguish between, say, a German and a US capitalism. In contrast, this work argues that the 'ideal type' of capitalism in a Weberian sense does not exist. It will be demonstrated that capitalism is not a concept that shapes a uniform institutional framework within every society, constructing a specific economic system. Rather, depending on the institutional environment - family structures in particular - different forms of capitalism arise. To exemplify this, the networking (Guanxi) capitalism of contemporary China will be presented, where social institutions known from the past were reinforced for successful development. It will be argued that the change, destruction and creation of family and kinship structures in particular are key factors that determined the further development and success of the Chinese economy and the type of capitalism arising there. In contrast to Weber, it will be argued that capitalism does not necessarily lead to the destruction of traditional structures and to large-scale enterprises under rational, bureaucratic management, leaving no space for socio-cultural structures such as family businesses. Flexible global production increasingly favours small businesses over larger corporations. Small Chinese family firms are able to respond to rapidly changing market conditions and to motivate maximum effort for modest pay. The structure of the Chinese family has proved very persistent over time, able to accommodate diverse economic and political environments while maintaining its core identity. This implies that Chinese capitalism may be an entirely new economic system, based on Guanxi and the family.
Context unification is a variant of second-order unification and also a generalization of string unification. Currently it is not known whether context unification is decidable. An expressive fragment of context unification is stratified context unification. Recently, it turned out that stratified context unification and one-step rewrite constraints are equivalent. This paper contains a description of a decision algorithm SCU for stratified context unification together with a proof of its correctness, which shows decidability of stratified context unification as well as of satisfiability of one-step rewrite constraints.
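The notion of a context equation can be made concrete with a toy example. The sketch below is an illustration only, not the paper's SCU algorithm: it solves a single equation C[t] = s, where C is a context variable, by enumerating the occurrences of t inside s; each occurrence yields a candidate context (a term with one hole).

```python
# A toy illustration (not the paper's SCU algorithm): solving one context
# equation C[t] = s by enumerating occurrences of t inside s.
# Terms are nested tuples: ('f', ('a',), ('g', ('a',))) represents f(a, g(a)).

def occurrences(s, t, path=()):
    """Yield the paths at which subterm t occurs inside s."""
    if s == t:
        yield path
    for i, arg in enumerate(s[1:]):
        yield from occurrences(arg, t, path + (i,))

def context_at(s, path):
    """Rebuild s with a hole '•' at the given path — one candidate for C."""
    if not path:
        return '•'
    i, rest = path[0], path[1:]
    return (s[0],) + tuple(
        context_at(arg, rest) if j == i else arg
        for j, arg in enumerate(s[1:])
    )

t = ('a',)
s = ('f', ('a',), ('g', ('a',)))
solutions = [context_at(s, p) for p in occurrences(s, t)]
# two candidate contexts: f(•, g(a)) and f(a, g(•))
```

The hard part of context unification, which this toy omits entirely, is that both sides of an equation may contain context and first-order variables, so solutions cannot be found by simple enumeration.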
In the EU there are longstanding and ongoing pressures towards a tax that is levied at the EU level to substitute for national contributions. We discuss conditions under which such a transition can make sense, starting from what we call a "decentralization theorem of taxation", analogous to Oates's (1972) famous result that in the absence of spill-over effects and economies of scale, decentralized public good provision weakly dominates central provision. We then drop assumptions that turn out to be unnecessary for this result. While spill-over effects of taxation may call for central rules for taxation, as long as spill-over effects do not depend on the intra-regional distribution of the tax burden, decentralized taxation plus tax coordination is found to be superior to a union-wide tax.
The merchant language of the Georgian Jews deserves scholarly attention for several reasons. The political and social developments of the last fifty years have caused the extinction of this very interesting form of communication, as most Georgian Jews have emigrated to Israel. In a natural interaction, the type of language described in this article can be found very rarely, if at all. Records of this communication have been preserved in various contexts and received different levels of scholarly attention. Our interest concerns the linguistic aspects as well as the classification.
In the following paper we argue that the specific merchant language of Georgian Jews belongs to the pragmatic phenomenon of “very indirect language.” The use of mostly Hebrew lexemes in Georgian conversation leads to an unfounded assumption that the speakers are equally competent in Hebrew and Georgian. It is reported that a high level of linguistic competence in Hebrew does not guarantee understanding of the Jewish merchant language. In the Georgian context, the decisive factors are membership in the professional interest group of merchants and residential membership in the Jewish community. These factors seem to be equivalent, because Jewish members of other professional groups (and those from outside the particular urban residential area) have difficulties in following the language that are similar to those of the Georgian majority. We describe the pragmatic structure of interactions conducted with the help of the merchant language and take into account the purpose of the language’s use or the intention of the speakers. Relevant linguistic examples are analysed and their sociocultural contexts explained.
This paper deals with the proposed use of sovereign credit ratings in the "Basel Accord on Capital Adequacy" (Basel II) and considers its potential effect on emerging market financing. As a first attempt, it investigates the consequences of the planned revisions for the two central aspects of international bank credit flows: the impact on capital costs and the volatility of credit supply across the risk spectrum of borrowers. The empirical findings cast doubt on the usefulness of credit ratings in determining commercial banks' capital adequacy ratios, since the standardized approach to credit risk would lead to more divergence rather than convergence between investment-grade and speculative-grade borrowers. This conclusion is based on the lateness and cyclicality of credit rating agencies' sovereign risk assessments and the continuing incentives for short-term rather than long-term interbank lending ingrained in the proposed Basel II framework.
This paper examines optimal environmental policy when external financing is costly for firms. We introduce emission externalities and industry equilibrium into the Holmström and Tirole (1997) model of corporate finance. While a cap-and-trade system optimally governs both firms' abatement activities (the internal emission margin) and industry size (the external emission margin) when firms have sufficient internal funds, external financing constraints introduce a wedge between these two objectives. When a sector is financially constrained in the aggregate, the optimal cap is strictly above the Pigouvian benchmark and emission allowances should be allocated below market prices. When a sector is not financially constrained in the aggregate, a cap below the Pigouvian benchmark optimally shifts market share to less polluting firms and, moreover, there should be no "grandfathering" of emission allowances. With financial constraints and heterogeneity across firms or sectors, a uniform policy, such as a single cap-and-trade system, is typically not optimal.
Unquestionably (or: undoubtedly), every competent speaker has at some point been in doubt about which of two or more almost identical competing variants of words, word forms, or sentence and phrase structures is correct or appropriate and should be used (in the standard language) (e.g. German "Pizzas/Pizzen/Pizze" 'pizzas', Dutch "de drie mooiste/mooiste drie stranden" 'the three most beautiful/most beautiful three beaches', Swedish "större än jag/mig" 'taller than I/me'). Such linguistic uncertainties or "cases of doubt" (cf. i.a. Klein 2003, 2009, 2018; Müller & Szczepaniak 2017; Schmitt, Szczepaniak & Vieregge 2019; Stark 2019 as well as the useful data collections of Duden vol. 9, Taaladvies.net, Språkriktighetsboken etc.) also occur systematically among native speakers and do not necessarily coincide with the difficulties of second-language learners. In present-day German, most grammatical uncertainties occur in the domains of inflection (nominal plural formation, genitive singular allomorphy of strong masc./neut. nouns, inflectional variation of weak masc. nouns, strong/weak adjectival inflection and comparison forms, strong/weak verb forms, perfect auxiliary selection) and word-formation (linking elements in compounds, separability of complex verbs). As for syntax, doubts often arise in connection with case choice (pseudo-partitive constructions, prepositional case government) and agreement (especially due to coordination or appositional structures). This contribution presents a contrastive approach to morphological and syntactic uncertainties in contemporary Germanic languages (mostly German, Dutch, and Swedish) in order to obtain a broader and more fine-grained typology of grammatical instabilities and their causes.
As will be discussed, most doubts of competent speakers - a problem also for general linguistic theory - can be attributed to processes of language change in progress, to language or variety contact, to gaps and rule conflicts in the grammar of every language, or to psycholinguistic conditions of language processing. Our main concerns are which (kinds of) critical areas are shared or differ within Germanic (and, conversely, in which areas no doubts arise), which of the established (cross-linguistically valid) explanatory approaches apply to which phenomena, and, ultimately, whether the new data reveal further lines of explanation for the empirically observable (standard) variation.
In this paper we analyze the semantics of a higher-order functional language with concurrent threads, monadic IO and synchronizing variables as in Concurrent Haskell. To assure declarativeness of concurrent programming we extend the language by implicit, monadic, and concurrent futures. As semantic model we introduce and analyze the process calculus CHF, which represents a typed core language of Concurrent Haskell extended by concurrent futures. Evaluation in CHF is defined by a small-step reduction relation. Using contextual equivalence based on may- and should-convergence as program equivalence, we show that various transformations preserve program equivalence. We establish a context lemma easing those correctness proofs. An important result is that call-by-need and call-by-name evaluation are equivalent in CHF, since they induce the same program equivalence. Finally we show that the monad laws hold in CHF under mild restrictions on Haskell’s seq-operator, which for instance justifies the use of the do-notation.
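The futures idea described above can be illustrated outside the calculus. The sketch below is a Python analogue, not CHF itself: a future is a computation started in a concurrent thread whose result is forced only when it is actually needed, which is the behaviour the implicit, monadic, concurrent futures add to the language.

```python
# A sketch of the futures idea in Python (an illustration, not the CHF
# calculus): submitting spawns a concurrent computation; calling .result()
# forces the future and synchronizes with it.
from concurrent.futures import ThreadPoolExecutor

def expensive(x):
    return x * x

with ThreadPoolExecutor() as pool:
    fut = pool.submit(expensive, 7)   # spawn the thread; do not block here
    # ... other work can proceed concurrently ...
    result = fut.result()             # forcing the future synchronizes

# result == 49
```

In CHF the futures are implicit, so the programmer never writes the force explicitly; demand for the value triggers the synchronization, preserving a declarative style.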
This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective method for strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully covers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda-calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest-fixpoint semantics. The operational semantics is a small-step semantics, and equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences; the main measure is the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda-calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
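The operational notion of strictness underlying the analysis can be shown in miniature. The toy below is not Nöcker's analysis; it only demonstrates the defining property: a function is strict in an argument exactly when passing a diverging argument makes the whole call diverge, modelling bottom as a thunk that raises when forced.

```python
# Strictness illustrated operationally (a toy, not Nöcker's analysis):
# f is strict in an argument iff f(bottom) "diverges", which we model
# by an exception raised when the bottom thunk is forced.

class Bottom(Exception):
    """Stands in for non-termination."""

def bottom():
    raise Bottom

def f(x):          # strict: always forces its argument
    return x() + 1

def g(x):          # non-strict: never forces its argument
    return 0

def is_strict(fn):
    """Check strictness by feeding the bottom thunk."""
    try:
        fn(bottom)
        return False
    except Bottom:
        return True

# is_strict(f) is True, is_strict(g) is False
```

A real analysis such as SAL must of course establish this property statically, by abstract reduction with set constants and cycle detection, rather than by running the program.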
Extending the data set used in Beyer (2009) to 2017, we estimate I(1) and I(2) money demand models for euro area M3. After including two broken trends and a few dummies to account for shifts in the variables following the global financial crisis and the ECB's non-standard monetary policy measures, we find that the money demand and the real wealth relations identified in Beyer (2009) have remained remarkably stable throughout the extended sample period. Testing for price homogeneity in the I(2) model we find that the nominal-to-real transformation is not rejected for the money relation whereas the wealth relation cannot be expressed in real terms.
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (non-deterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop different proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the ordering <=c is also analysed: Ω is not a least element, and <=c already implies contextual equivalence w.r.t. may-convergence.
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb, which is locally bottom-avoiding. We use a small-step operational semantics in the form of a normal order reduction. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into an arbitrary program context, their termination behaviour is the same. We use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We develop different proof tools for proving correctness of program transformations. We provide a context lemma for may- as well as must-convergence which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. In contrast to other approaches, our syntax as well as our semantics does not make use of a heap for sharing expressions. Instead we represent these expressions explicitly via letrec-bindings.
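The bottom-avoiding behaviour of amb described in these abstracts can be imitated concretely. The sketch below is a Python analogue, not the calculus: both argument computations are run concurrently and the result of whichever converges first is returned, so a diverging branch cannot poison the whole expression.

```python
# A sketch of a locally bottom-avoiding amb in Python (an illustration,
# not the calculus): run both computations concurrently and return the
# first result that arrives, avoiding a diverging branch.
import threading
import time

def amb(f, g, timeout=5.0):
    result = []
    done = threading.Event()

    def run(h):
        try:
            r = h()
        except Exception:
            return              # a "failing" branch contributes nothing
        result.append(r)
        done.set()

    for h in (f, g):
        threading.Thread(target=run, args=(h,), daemon=True).start()
    done.wait(timeout)
    return result[0] if result else None

def diverge():
    time.sleep(60)              # stands in for non-termination

# amb(diverge, lambda: 42) avoids the diverging branch and yields 42
```

Note that this operational trick gives only may-convergence; the paper's contextual equivalence additionally tracks must-convergence, which no single test run can observe.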
A call on art investments
(2010)
The art market has seen boom and bust in recent years and, despite the downturn, has received increasing attention from investors given the low-interest-rate environment following the financial crisis. However, participation has been reserved for a few investors, and the hedging of exposures remains difficult. This paper proposes to overcome these problems by introducing a call option on an art index derived from one of the most comprehensive data sets of art market transactions. The option allows investors to optimize their exposure to art. For pricing purposes, the non-tradability of the art index is acknowledged, and option prices are derived in an equilibrium setting as well as by replication arguments. In the former, option prices depend on the attractiveness of gaining exposure to a previously non-traded risk. This setting further overcomes the problem of art market exposures being difficult to hedge. Results in the replication case are primarily driven by the ability to reduce residual hedging risk. Even if this is not entirely possible, the replication approach serves as a pricing benchmark for investors who are significantly exposed to art and try to hedge their exposure by selling a derivative. JEL Classification: G11, G13, Z11
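As a point of reference for the replication benchmark mentioned above, the sketch below prices a plain European call with the standard Black-Scholes formula. This is an illustration only, not the paper's model (which must handle the non-tradability of the index), and all parameter values are made up.

```python
# Standard Black-Scholes call price as a hedged reference point (not the
# paper's equilibrium pricing of a non-traded art index); the parameter
# values below are hypothetical.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """European call: spot S, strike K, maturity T (years),
    risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# An at-the-money call on a hypothetical art index level of 100.
price = bs_call(S=100.0, K=100.0, T=1.0, r=0.02, sigma=0.25)
```

Because the art index cannot be traded continuously, perfect replication fails and such a formula can only serve as a benchmark, which is exactly the role the paper assigns to its replication approach.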
I present a new business cycle model in which decision making follows a simple mental process motivated by neuroeconomics. Decision makers first compute the value of two different options and then choose the option offering the highest value, but with errors. The resulting model is highly tractable and intuitive. A demand function in levels replaces the traditional Euler equation. As a result, even liquid consumers can have a large marginal propensity to consume. The interest rate affects consumption through the cost of borrowing and not through intertemporal substitution. I discuss the implications for stimulus policies.
In May 2008, cyclone Nargis swept over Myanmar/Burma, killing 140,000 people. The autocratically governed country, however, rejected disaster relief as internal interference and refused the import of medicine and food. In view of this situation, the French foreign minister Kouchner urged the UN to act on the basis of the Responsibility to Protect (R2P).
This act of securitization, however, stands in contrast to the media coverage, as Gabi Schlag examines in this paper. The visual material from the disaster area in particular tells a different story. The photos in the BBC.com coverage of the event form a visual narrative that suggests not helplessness but a controlled, level-headed response by local forces. This contrast points to the proverbial power of images, which pre-structure the respective conditions of possible action.
Consumers purchase energy in many forms. Sometimes energy goods are consumed directly, for instance, in the form of gasoline used to operate a vehicle, electricity to light a home, or natural gas to heat a home. At other times, the cost of energy is embodied in the prices of goods and services that consumers buy, say when purchasing an airline ticket or when buying online garden furniture made from plastic to be delivered by mail. Previous research has focused on quantifying the pass-through of the price of crude oil or the price of motor gasoline to U.S. inflation. Neither approach accounts for the fact that percent changes in refined product prices need not be proportionate to the percent change in the price of oil, that not all energy is derived from oil, and that the correlation of price shocks across energy markets is far from one. This paper develops a vector autoregressive model that quantifies the joint impact of shocks to several energy prices on headline and core CPI inflation. Our analysis confirms that focusing on gasoline price shocks alone will underestimate the inflationary pressures emanating from the energy sector, but not enough to overturn the conclusion that much of the observed increase in headline inflation in 2021 and 2022 reflected non-energy price shocks.
We introduce a regularization and blocking estimator for well-conditioned high-dimensional daily covariances using high-frequency data. Using the Barndorff-Nielsen, Hansen, Lunde, and Shephard (2008a) kernel estimator, we estimate the covariance matrix block-wise and regularize it. A data-driven grouping of assets of similar trading frequency ensures the reduction of data loss due to refresh time sampling. In an extensive simulation study mimicking the empirical features of the S&P 1500 universe we show that the 'RnB' estimator yields efficiency gains and outperforms competing kernel estimators for varying liquidity settings, noise-to-signal ratios, and dimensions. An empirical application of forecasting daily covariances of the S&P 500 index confirms the simulation results.
This policy letter provides an overview of the strengths, weaknesses, risks and opportunities of the upcoming comprehensive risk assessment, a euro area-wide evaluation of bank balance sheets and business models. If carried out properly, the 2014 comprehensive assessment will lead the euro area into a new era of banking supervision. Policy makers in euro area countries are now under severe pressure to define a credible backstop framework for banks. This framework, as the author argues, needs to be a broad, quasi-European system of mutually reinforcing backstops.
We collect data on the size distribution of all U.S. corporate businesses for 100 years. We document that corporate concentration (e.g., asset share or sales share of the top 1%) has increased persistently over the past century. Rising concentration was stronger in manufacturing and mining before the 1970s, and stronger in services, retail, and wholesale after the 1970s. Furthermore, rising concentration in an industry aligns closely with investment intensity in research and development and information technology. Industries with higher increases in concentration also exhibit higher output growth. The long-run trends of rising corporate concentration indicate increasingly stronger economies of scale.
June 4th, 2013 marks the formal launch of the third generation of the Equator Principles (EP III) and the tenth anniversary of the EPs – reason enough to evaluate the EPs initiative from an economic ethics and business ethics perspective. In particular, this essay deals with the following questions: What are the EPs and where are they going? What has been achieved so far by the EPs? What are the strengths and weaknesses of the EPs? Which reform steps need to be adopted in order to further strengthen the EPs framework? Can the EPs be regarded as a role model in the field of sustainable finance and CSR? The paper is structured as follows: the first chapter defines the term EPs and introduces the keywords related to the EPs framework. The second chapter gives a brief overview of the history of the EPs. The third chapter discusses the Equator Principles Association, the governing, administering, and managing institution behind the EPs. The fourth chapter summarizes the main features and characteristics of the newly released third generation of the EPs. The fifth chapter critically evaluates EP III from an economic ethics and business ethics perspective. The paper concludes with a summary of the main findings.
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates and term premia, is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
Motivated by the U.S. events of the 2000s, we address whether a too low for too long interest rate policy may generate a boom-bust cycle. We simulate anticipated and unanticipated monetary policies in state-of-the-art DSGE models and in a model with bond financing via a shadow banking system, in which the bond spread is calibrated for normal and optimistic times. Our results suggest that the U.S. boom-bust was caused by the combination of (i) too low for too long interest rates, (ii) excessive optimism and (iii) a failure of agents to anticipate the extent of the abnormally favorable conditions.
This paper examines the Chilean pension reform thoroughly, first giving an overview of the mandatory saving plan, the relevant institutions, and the rules for the transition from the old to the new system. The main part of the paper contains a critical evaluation of the reform; in particular, the macroeconomic performance with respect to capital formation and growth and the effects on the savings rate, the rates of return and the labor market are discussed. Furthermore, the development of capital markets is reviewed. A short critique is presented with respect to intergenerational distribution and risk sharing as well as the social consequences. This paper is the result of a CFS-sponsored research project. A preliminary version was presented at the meeting of the Committee on Social Policy of the Verein fuer Socialpolitik, May 1999, and at the 55th Congress of the IIPF, 23-26 August 1999, in Moscow.
In this paper we analyze an economy with two heterogeneous investors who both exhibit misspecified filtering models for the unobservable expected growth rate of the aggregate dividend. A key result of our analysis with respect to long-run investor survival is that there are degrees of model misspecification on the part of one investor for which the other investor's deficiency offers no compensation. The main finding with respect to the asset pricing properties of our model is that the two dimensions of asset pricing and survival are basically independent: in scenarios where the investors are more similar with respect to their expected consumption shares, return volatilities can nevertheless be higher than in cases where they are very different.
After narrowly losing the election in July 2006, Andrés Manuel López Obrador and his Coalición claimed fraud and asserted that unfair conditions during the campaign had diminished his chances of winning the presidency. The paper investigates this latter allegation, centering on a perceived campaign of hate, unequal access to campaign resources and malicious treatment by the mass media. It further analyzes the mass media's performance during the conflictual post-electoral period until the final decision of the Federal Electoral Tribunal on September 5th, 2006. While the media's performance during the campaign tells us about their compliance with the fair-coverage mechanisms implemented by electoral reforms in the 1990s, the mass media were unconstrained by such measures after the election. Thus, their mode of coverage of the post-electoral conflicts allows us to "test" the mass media's transformation into a more unbiased, socially responsible "fourth estate". Finally, the paper scrutinizes whether the claims of fraud and the protests by the leftist movement resulted in lower levels of institutional trust and democratic support. The analysis of media performance is based on data provided by the Federal Electoral Institute (IFE). Its Media Monitor encompassed more than 150 TV stations, 240 radio stations and 200 press publications. However, no comparable data are available for the post-electoral period. Interviews with Mexican media experts, which the author conducted during the post-electoral period, serve as the empirical basis for the second part. Data on the opinions and attitudes of Mexican citizens are taken from the 2007 Latinobarometro, the 2006 Encuesta Nacional and several polls conducted by Grupo Reforma. The results do not support López Obrador's claims.
Even though a strong party bias is characteristic of the Mexican media system, all findings point to a continuity of balanced campaign coverage and fair access to mass media publicity. Coverage during the post-electoral period was more polarized, yet both sides remained at least partially open to oppositional views. The claims of fraud, mass protest mobilization and anti-institutional discourse by López Obrador's leftist movement seem not to have caused a significant loss of institutional trust, support for democracy, or satisfaction with it, even though these levels remain quite low.
On April 24, 2001 the European Commission presented a proposal for a Directive introducing supplementary supervision of financial conglomerates (the Proposed Directive). The Proposed Directive requires closer coordination among supervisory authorities of different sectors of the financial industry and leads to changes in a number of existing Directives relating to the supervision of credit institutions, insurance undertakings and investment firms.
The present article explores perceptions and cultural constructions of the terms capitalism or the capitalist West among ex-Soviet, highly qualified Jewish migrants from Russia and Ukraine after their emigration to Germany between 1990 and 1996. Migration seems to offer migrants a unique opportunity to become aware of knowledge that is normally taken for granted, of behavioural schemes and values, and to reflect on them. How do they acquire such presumed capitalist knowledge of the new society and new social world, how do they create it, and with what concrete contents do they connect the illusion of monolithic cultural, economic and political capital, an illusion which contributes to group formation and serves as an orientation for action? As my research shows, immigrants tend to disparage much of what appeared to them in the Soviet Union as normative, right and appropriate; now they often act according to categories which were defined in the previous context as "capitalist" and interpreted as immoral. Without exact ideas or knowledge about the behaviour codes, unspoken norms and silent values of the new society, many immigrants orient themselves towards the opposite of what counted as morally proper in the society of origin. Simultaneously, they revive the old system by establishing and developing a Russian-language enclave. Nevertheless, this enclave is not located in a vacuum of "dusty" memories of the past, but builds a transnational cross-border space connected and corresponding to the processes of today's CIS and to the life of those relatives and friends who still live there, and with whom the emigrants maintain intensive social networks.
This note discusses the basic economics of central clearing for derivatives and the need for proper regulation, supervision and resolution of central counterparty clearing houses (CCPs). New regulation in the U.S. and in Europe renders the involvement of a central counterparty mandatory for standardized OTC derivatives trading and sets higher capital and collateral requirements for non-centrally cleared derivatives.
From a macrofinance perspective, CCPs provide a trade-off between reduced contagion risk in the financial industry and the creation of a significant systemic risk. However, so far, regulation and supervision of CCPs are fragmented and limited, and ignore two important aspects: the risk of consolidation of CCPs on the one hand and the competition among CCPs on the other. i) As the economies of scale of CCP operations in risk and cost reduction can be large, they provide an argument in favor of consolidation, leading in the extreme to a monopoly CCP that poses the ultimate default risk – a systemic risk for the entire financial sector. Since a systemic risk event would require a government bailout, there is a public policy issue here. ii) As long as no monopoly CCP exists, there is competition for market share among existing CCPs. Such competition may undermine the stability of the entire financial system because it induces “predatory margining”: a reduction of margin requirements to increase market share.
The policy lesson from our analysis emphasizes the importance of a single authority supervising all competing CCPs, as well as of a specific regulation and resolution framework for CCPs. Our general recommendations can be applied to the current situation in Europe and to the proposed merger between Deutsche Börse and the London Stock Exchange.
The concept of length, the concept is synonymous, the concept is nothing more than, the proper definition of a concept ... Forget programs and visions; the operational approach refers specifically to concepts, and in a very specific way: it describes the process whereby concepts are transformed into a series of operations—which, in their turn, allow one to measure all sorts of objects. Operationalizing means building a bridge from concepts to measurement, and then to the world. In our case: from the concepts of literary theory, through some form of quantification, to literary texts.
We present a simple model of personal finance in which an incumbent lender has an information advantage vis-à-vis both potential competitors and households. In order to extract more consumer surplus, a lender with sufficient market power may engage in "irresponsible" lending, approving credit even if this is knowingly against a household’s best interest. Unless rival lenders are equally well informed, competition may reduce welfare. This holds, in particular, if less informed rivals can free ride on the incumbent’s superior screening ability.
It is an established policy in the United States to separate commercial banking (the business of taking deposits and making commercial loans) from other commercial activities. The separation of banking and commercial activities is achieved by federal and state banking laws, which enumerate the powers that banks may exercise, the activities that banks may engage in, and the investments that banks may lawfully make, and expressly exclude banks from certain activities or relationships. Some of these provisions could be circumvented if a nonbank company could carry on banking activities through a banking subsidiary and nonbanking activities either itself or through a nonbanking subsidiary.
We assess the relationship between finance and growth over the period 1980-2014. We estimate a cross-country growth regression for 48 countries during 20 rolling periods of 15 years, starting in 1980 (to 1995) and ending in 1999 (to 2014). We use OLS and IV estimations and we find that: 1) overall financial development had a positive effect on economic growth during all periods of our sample, i.e., we confirm that from 1980 to 2014 the financial services provided by the various financial systems were significant (to varying degrees) for firm creation, industrial expansion and economic growth; but that 2) the structure of financial markets was particularly relevant for economic growth until the financial crisis; while 3) the structure of the banking sector has played a major role since then; and finally that 4) the legal system is the primary determinant of the effectiveness of the overall financial system in facilitating innovation and growth in (almost) all of our sample period. Hence, overall our results suggest that the relationship between finance and growth matters, but also that it varies over time in strength and in sectoral origin.
JEL Classification: O16, G16, G20.
The experience in the period during and after the Asian crisis of 1997-98 has provoked an extensive debate about the credit rating agencies' evaluation of sovereign risk in emerging markets lending. This study analyzes the role of credit rating agencies in international financial markets, particularly whether sovereign credit ratings have an impact on financial stability in emerging market economies. The event study and panel regression results indicate that credit rating agencies have substantial influence on the size and volatility of emerging markets lending. The empirical results are significantly stronger for government downgrades and imminent negative sovereign credit rating actions, such as credit watches and rating outlooks, than for positive adjustments by the credit rating agencies, while sovereign credit rating changes already anticipated by market participants have a smaller impact on financial markets in emerging economies.
German Expressionist cinema is a movement that began in 1919. Expressionist film is marked by distinct visual features and performance styles that rebel against prior realist art movements. More than 20 years before the Expressionist movement, Sigmund Freud published "The Interpretation of Dreams" in 1899, a groundbreaking study that links dreams to unconscious impulses. This thesis argues that the unexplained dream-like imagery found in two Expressionist films, The Cabinet of Dr. Caligari (Robert Wiene, 1920) and Dr. Mabuse, the Gambler (Fritz Lang, 1922), can be seen in terms of Freud's model of dreaming.
This paper uses laboratory experiments to provide a systematic analysis of how different presentation formats affect individuals’ investment decisions. The results indicate that the type of presentation as well as personal characteristics influence both the consistency of decisions and the riskiness of investment choices. However, while personal characteristics have a larger impact on consistency, the chosen risk level is determined more by framing effects. On the level of personal characteristics, participants’ decisions show that better financial literacy and a better understanding of the presentation format enhance consistency and thus decision quality. Moreover, female participants on average make less consistent decisions and tend to prefer less risky alternatives. On the level of framing dimensions, subjects choose riskier investments when possible outcomes are shown in absolute values rather than rates of return and when the loss potential is less obvious. In particular, reducing the emphasis on downside risk and upside potential simultaneously leads to a substantial increase in risk taking.
This paper examines the political-economy and cultural dynamics and discourses underlying the emergence of the Palestinian Hamas and the Algerian Islamic Salvation Front. Both movements emerged in the late 1980s as responses to continuing (neo)colonial conditions in their countries. I explore to what extent the various processes commonly referred to as “globalization” — both the worldwide economic transformations epitomized by post-Fordism on the macro/system level and neo-liberal structural adjustment programs within countries, and, perhaps more important, its cultural dynamics — contributed to the rise and power of both movements. I examine the socio-economic situation in Algeria and Palestine-Israel during the 1980s and link it to the political developments in both countries. Next, I review the events behind the founding of both movements and the main components of their ideologies and strategies. Finally, I explore their arguments to determine whether the political-economic or the cultural pressures unleashed by globalization were the determining factor in their emergence and ideological development. I conclude by comparing the two case studies to determine whether there are common threads that can serve as the basis for a region-wide investigation of the role of globalization in the emergence and/or rise to social hegemony of Islamist movements in other MENA countries.