Towards correctness of program transformations through unification and critical pair computation
(2010)
Correctness of program transformations in extended lambda-calculi with a contextual semantics is usually based on reasoning about the operational semantics, which is a rewrite semantics. A successful approach is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules, which results in so-called complete sets of diagrams. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We explore cases where the computation of these overlaps can be done in a first-order way by variants of critical pair computation that use unification algorithms. As a case study of an application, we describe a finitary and decidable unification algorithm for the combination of the equational theory of left-commutativity modelling multi-sets, context variables, and many-sorted unification. Sets of equations are restricted to be almost linear, i.e., every variable and context variable occurs at most once, with one exception: variables of a sort without ground terms may occur several times. Every context variable must have an argument-sort in the free part of the signature. We also extend the unification algorithm with the treatment of binding-chains in let- and letrec-environments and with context-classes. This results in a unification algorithm that can be applied to all overlaps of normal-order reductions and transformations in an extended lambda calculus with letrec, which we use as a case study.
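The critical-pair approach above rests on unification. As a hedged illustration, here is a minimal first-order syntactic unification routine; it covers only the free-theory case and omits the paper's extensions (left-commutativity, context variables, sorts, binding-chains), and the term encoding used here is an assumption of this sketch, not the paper's.

```python
# Minimal Robinson-style first-order unification (illustrative sketch).
# Encoding assumption: variables are strings starting with an uppercase
# letter; compound terms are (functor, [args]) tuples.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Chase variable bindings in the substitution."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear in term t under subst?"""
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, subst) for a in t[1])
    return False

def unify(s, t, subst=None):
    """Return a most general unifier as a dict, or None on failure."""
    if subst is None:
        subst = {}
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if is_var(s):
        if occurs(s, t, subst):
            return None
        return {**subst, s: t}
    if is_var(t):
        return unify(t, s, subst)
    if isinstance(s, tuple) and isinstance(t, tuple):
        f, xs = s
        g, ys = t
        if f != g or len(xs) != len(ys):
            return None
        for x, y in zip(xs, ys):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# f(X, g(a)) unified with f(b, g(Y)) yields {X: b, Y: a}.
mgu = unify(("f", ["X", ("g", [("a", [])])]),
            ("f", [("b", []), ("g", ["Y"])]))
```

Computing overlaps between a transformation rule and a reduction rule then amounts to unifying their left-hand sides, which is where the paper's equational and context-variable extensions come in.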
Measuring confidence and uncertainty during the financial crisis: evidence from the CFS survey
(2010)
The CFS survey covers the individual situations of banks and other companies of the financial sector during the financial crisis. This provides a rare opportunity to analyze appraisals, expectations, and forecast errors of the sector at the core of the recent turmoil. Following standard ways of aggregating individual survey data, we first present and introduce the CFS survey by comparing CFS indicators of confidence and predicted confidence to the ifo and ZEW indicators. The major contribution is the analysis of several indicators of uncertainty. In addition to well-established concepts, we introduce innovative measures based on the skewness of forecast errors and on the share of ‘no response’ replies. Results show that the uncertainty indicators fit quite well with the patterns of real and financial time series over the period 2007 to 2010. Keywords: Business Sentiment, Financial Crisis, Survey Indicator, Uncertainty
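Two of the uncertainty measures mentioned above, the skewness of forecast errors and the share of ‘no response’ replies, can be sketched as follows. The data and the exact estimator (population-moment skewness) are illustrative assumptions, not taken from the CFS survey.

```python
# Illustrative sketch of two survey-based uncertainty measures.

def skewness(xs):
    """Moment-based skewness: m3 / m2^(3/2)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def no_response_share(replies):
    """Share of None entries ('no response') among all replies."""
    return sum(r is None for r in replies) / len(replies)

# Made-up example data: a few forecast errors with one large positive
# surprise, and survey replies where None marks 'no response'.
errors = [-0.4, -0.1, 0.0, 0.1, 0.2, 1.5]
replies = [1, 2, None, 3, None, 2, 1, 2]
```

A strongly right-skewed error distribution signals that respondents were surprised mostly in one direction, which is the intuition behind using skewness as an uncertainty indicator.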
This paper provides theory as well as empirical results for pre-averaging estimators of the daily quadratic variation of asset prices. We derive jump-robust inference for pre-averaging estimators, corresponding feasible central limit theorems, and an explicit test on serial dependence in microstructure noise. Using transaction data of different stocks traded at the NYSE, we analyze the estimators’ sensitivity to the choice of the pre-averaging bandwidth and suggest an optimal interval length. Moreover, we investigate the dependence of pre-averaging-based inference on the sampling scheme, the sampling frequency, microstructure noise properties, as well as the occurrence of jumps. As a result of a detailed empirical study we provide guidance for optimal implementation of pre-averaging estimators and discuss potential pitfalls in practice. Keywords: Quadratic Variation, Market Microstructure Noise, Pre-averaging, Sampling Schemes, Jumps
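As an illustration of the pre-averaging idea, the following simplified sketch locally averages noisy returns with the triangular weight g(x) = min(x, 1 − x) before squaring, which damps i.i.d. microstructure noise. The bias correction, bandwidth choice, and feasible CLT from the paper are omitted, and the scaling is a textbook-style assumption, not the paper's exact estimator.

```python
# Simplified pre-averaging estimator of quadratic variation (sketch).

def pre_averaged_rv(returns, kn):
    """Sum of squared pre-averaged returns, scaled by the weight mass."""
    # Triangular pre-averaging weights g(j/kn) for j = 1..kn-1.
    g = [min(j / kn, 1 - j / kn) for j in range(1, kn)]
    psi2 = sum(w * w for w in g) / kn  # discrete proxy for the integral of g(x)^2
    total = 0.0
    for i in range(len(returns) - kn + 1):
        bar_y = sum(w * returns[i + j] for j, w in enumerate(g))
        total += bar_y * bar_y
    return total / (psi2 * kn)
```

On a pure-noise series of alternating returns, the local averaging cancels most of the noise, so the estimate comes out far below the naive sum of squared returns; that contrast is what makes pre-averaging robust to microstructure noise.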
SUMMARY RECOMMENDATIONS
1. One of the major lessons from the current financial crisis is the systemic dimension of financial risk, which had been almost completely neglected by bankers and supervisors in the pre-2007 years.
2. Accordingly, the change most needed in financial regulation, in order to avoid a repetition of such a crisis, is to influence individual bank behaviour so that systemic risk is decreased. This objective is new and distinct from what Basle II was intended to achieve.
3. It is important, therefore, to evaluate proposed new regulatory instruments on whether they contribute to a reduction or containment of systemic risk. We see two new regulatory measures of paramount importance: the introduction of a Systemic Risk Charge (SRC) and the implementation of a transparent bank resolution regime. The two measures complement each other, so both have to be realized to be effective.
4. We propose a Systemic Risk Charge (SRC), a levy capturing the contribution of any individual bank to overall systemic risk, which is distinct from the institution’s own default risk. The SRC is set up so that the more systemic risk a bank contributes, the higher the cost it has to bear. The SRC therefore serves to internalize the cost of systemic risk, which up to now has been borne by the taxpayer.
5. Major details of our SRC concern the use of debt that may be converted into equity when systemic risk threatens the stability of the banking system. The SRC also raises some revenue for the government.
6. The SRC has to be compared with several bank levies currently debated. The Financial Transaction Tax (FTT) does not directly address systemic risk and is therefore inferior to an SRC. Nevertheless, an FTT may offer the opportunity to subsidize on-exchange trading at the expense of off-exchange (over-the-counter, OTC) transactions, thereby enhancing financial market stability. The Financial Activity Tax (FAT) is similar to a VAT on financial services. It is the least adequate instrument among those discussed here to limit systemic risk.
7. Bank resolution regime: No instrument to contain systemic risk can be effective unless the restructuring of bank debt, and the ensuing loss given default to creditors, is a real possibility. As the crisis has taught, bank restructuring is very difficult in light of contagion risk between major banks. We therefore need a regulatory procedure that allows winding down banks, even large banks, on short notice. Among other things, the procedure will require distinguishing systemically relevant exposures from those that are irrelevant. Only the former will be saved with government money, and it will then be the task of the supervisor to ensure a sufficient amount of non-systemically relevant debt on the balance sheet of all banks.
8. Further issues discussed in this policy paper and its appendices concern the necessity of a global level playing field, or the lack thereof, for these new regulatory measures; the convergence of our SRC proposal with the expected long-term outcome of the Basle III discussions; and the role of global imbalances.
Many studies show that most people are not financially literate and are unfamiliar with even the most basic economic concepts. However, the evidence on the determinants of economic literacy is scant. This paper uses international panel data on 55 countries from 1995 to 2008, merging indicators of economic literacy with a large set of macroeconomic and institutional variables. Results show that there is substantial heterogeneity of financial and economic competence across countries, and that human capital indicators (PISA test scores and college attendance) are positively correlated with economic literacy. Furthermore, inhabitants of countries with more generous social security systems are generally less literate, lending support to the hypothesis that the incentives to acquire economic literacy are related to the amount of resources available for private accumulation. JEL Classification: E2, D8, G1
This paper investigates the accuracy and heterogeneity of output growth and inflation forecasts during the current and the four preceding NBER-dated U.S. recessions. We generate forecasts from six different models of the U.S. economy and compare them to professional forecasts from the Federal Reserve’s Greenbook and the Survey of Professional Forecasters (SPF). The model parameters and model forecasts are derived from historical data vintages so as to ensure comparability to historical forecasts by professionals. The mean model forecast comes surprisingly close to the mean SPF and Greenbook forecasts in terms of accuracy even though the models only make use of a small number of data series. Model forecasts compare particularly well to professional forecasts at a horizon of three to four quarters and during recoveries. The extent of forecast heterogeneity is similar for model and professional forecasts but varies substantially over time. Thus, forecast heterogeneity constitutes a potentially important source of economic fluctuations. While the particular reasons for diversity in professional forecasts are not observable, the diversity in model forecasts can be traced to different modeling assumptions, information sets and parameter estimates. JEL Classification: G14, G15, G24
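A small, hedged illustration of why a mean model forecast can be competitive: by convexity of the root-mean-square, the RMSE of the average of several forecasts is never worse than the average of their individual RMSEs. The numbers below are made up for illustration and are not from the paper.

```python
import math

def rmse(forecasts, outcomes):
    """Root mean squared forecast error."""
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecasts, outcomes))
                     / len(outcomes))

# Illustrative (made-up) outcomes and two individual forecast paths.
outcomes = [2.0, 1.5, -0.5, 0.8]
model_a = [2.5, 1.0, 0.5, 1.0]
model_b = [1.0, 2.0, -1.5, 0.2]

# The combined (mean) forecast averages the two paths point by point.
mean_fc = [(a + b) / 2 for a, b in zip(model_a, model_b)]
```

When the individual forecast errors partly offset each other, as in this example, the combined forecast is strictly more accurate than the average individual forecast.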
Price pressures
(2010)
We study price pressures in stock prices—price deviations from fundamental value that arise when a risk-averse intermediary supplies liquidity to asynchronously arriving investors. Empirically, twelve years of daily New York Stock Exchange intermediary data reveal economically large price pressures. A $100,000 inventory shock causes an average price pressure of 0.28% with a half-life of 0.92 days. Price pressure causes average transitory volatility in daily stock returns of 0.49%. Price pressure effects are substantially larger, and last longer, in smaller stocks. Theoretically, in a simple dynamic inventory model the ‘representative’ intermediary uses price pressure to control risk through inventory mean reversion. She trades off the revenue loss due to price pressure against the price risk associated with remaining in a nonzero inventory state. The model’s closed-form solution identifies the intermediary’s relative risk aversion and the distribution of investors’ private values for trading from the observed time-series patterns. These allow us to estimate the social costs—deviations from constrained Pareto efficiency—due to price pressure, which average 0.35 basis points of the value traded. JEL Classification: G12, G14, D53, D61
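The reported half-life can be connected to a mean-reversion coefficient under a simple AR(1) assumption for price pressure, p_t = phi * p_{t-1} + eps_t. This parametrization is an illustrative assumption, not the paper's model; under it, the 0.92-day half-life corresponds to a daily phi of roughly 0.47.

```python
import math

def half_life(phi):
    """Half-life of AR(1) price pressure p_t = phi * p_{t-1} + eps, 0 < phi < 1."""
    return math.log(0.5) / math.log(phi)

def phi_from_half_life(h):
    """Mean-reversion coefficient implying a half-life of h periods."""
    return 0.5 ** (1.0 / h)

# Under the AR(1) assumption, the 0.92-day half-life above maps to
# phi = 0.5 ** (1 / 0.92), i.e. roughly 0.47 at a daily frequency.
phi_daily = phi_from_half_life(0.92)
```

The two functions are inverses, so a fitted daily autocorrelation can be read off directly as a half-life and vice versa.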
This paper presents a model to analyze the consequences of competition in order flow between a profit-maximizing stock exchange and an alternative trading platform for the decisions concerning trading fees and listing requirements. Listing requirements, set by the exchange, provide public information on listed firms and contribute to better liquidity on all trading venues. It is sometimes asserted that competition induces the exchange to lower its listing standards compared to a situation in which it is a monopolist, because the trading platform can free-ride on this regulatory activity and compete more aggressively on trading fees. The present analysis shows that this is not always true and depends on the existence and size of gains related to multi-market trading. These gains relax competition on trading fees. The higher these gains are, the more the exchange can increase its revenue from listing and trading when it raises its listing standards. For large enough gains from multi-market trading, the exchange is not induced to lower its listing standards when a competing trading platform appears. As a second result, this analysis also reveals a cross-subsidization effect between the listing and the trading activity when listing is not competitive. The model yields implications for the fee structures on stock markets, the regulation of listings, and the social optimality of competition for volume. JEL Classification: G10, G18, G12
This paper proposes the Shannon entropy as an appropriate one-dimensional measure of behavioural trading patterns in financial markets. The concept is applied to the illustrative example of algorithmic vs. non-algorithmic trading and empirical data from Deutsche Börse's electronic cash equity trading system, Xetra. The results reveal pronounced differences between algorithmic and non-algorithmic traders. In particular, trading patterns of algorithmic traders exhibit a medium degree of regularity while non-algorithmic trading tends towards either very regular or very irregular trading patterns. JEL Classification: C40, D0, G14, G15, G20
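The Shannon entropy of a discretized trading pattern can be sketched as follows. The mapping from raw order flow to symbols is an illustrative assumption; the paper's exact encoding of trading patterns may differ.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """H = -sum p_i * log2(p_i) over the empirical symbol distribution."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly regular pattern has zero entropy; a uniform pattern over
# four symbols attains the maximum of log2(4) = 2 bits.
regular = ["a"] * 16
irregular = ["a", "b", "c", "d"] * 4
```

On this scale, the abstract's finding reads as algorithmic traders clustering at intermediate entropy, while non-algorithmic traders populate both extremes.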
How ordinary consumers make complex economic decisions: financial literacy and retirement readiness
(2010)
This paper explores who is financially literate, whether people accurately perceive their own economic decision-making skills, and where these skills come from. Self-assessed and objective measures of financial literacy can be linked to consumers’ efforts to plan for retirement in the American Life Panel, and causal relationships with retirement planning examined by exploiting information about respondent financial knowledge acquired in school. Results show that those with more advanced financial knowledge are those more likely to be retirement-ready.