330 Economics
We analytically characterize optimal monetary policy for an augmented New Keynesian model with a housing sector. In a setting where the private sector has rational expectations about future housing prices and inflation, optimal monetary policy can be characterized without making reference to housing price developments: commitment to a 'target criterion' that refers to inflation and the output gap only is optimal, as in the standard model without a housing sector. When the policymaker is concerned with potential departures of private sector expectations from rational ones and seeks to choose a policy that is robust against such possible departures, then the optimal target criterion must also depend on housing prices. In the empirically realistic case where housing is subsidized and where monopoly power causes output to fall short of its optimal level, the robustly optimal target criterion requires the central bank to 'lean against' housing prices: following unexpected housing price increases, policy should adopt a stance that is projected to undershoot its normal targets for inflation and the output gap, and similarly aim to overshoot those targets in the case of unexpected declines in housing prices. The robustly optimal target criterion does not require that policy distinguish between 'fundamental' and 'non-fundamental' movements in housing prices.
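For reference, in the standard New Keynesian model without a housing sector, a "target criterion" of the kind the abstract refers to takes the familiar flexible-inflation-targeting form (notation here follows the textbook treatment and is illustrative, not taken from the paper):

```latex
\pi_t + \frac{\lambda}{\kappa}\left(x_t - x_{t-1}\right) = 0
```

where $\pi_t$ is inflation, $x_t$ the output gap, $\kappa$ the slope of the Phillips curve, and $\lambda$ the relative weight on output-gap stabilization. The paper's point is that the robustly optimal criterion augments such a condition with a housing-price term.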
In the secondary art market, artists play no active role. This allows us to isolate cultural influences on the demand for female artists’ work from supply-side factors. Using 1.5 million auction transactions in 45 countries, we document a 47.6% gender discount in auction prices for paintings. The discount is higher in countries with greater gender inequality. In experiments, participants are unable to guess the gender of an artist simply by looking at a painting and they vary in their preferences for paintings associated with female artists. Women's art appears to sell for less because it is made by women.
In this paper, we develop a state-dependent sensitivity value-at-risk (SDSVaR) approach that enables us to quantify the direction, size, and duration of risk spillovers among financial institutions as a function of the state of financial markets (tranquil, normal, and volatile). Within a system of quantile regressions for four sets of major financial institutions (commercial banks, investment banks, hedge funds, and insurance companies) we show that while small during normal times, equivalent shocks lead to considerable spillover effects in volatile market periods. Commercial banks and, especially, hedge funds appear to play a major role in the transmission of shocks to other financial institutions. Using daily data, we can trace out the spillover effects over time in a set of impulse response functions and find that they reach their peak after 10 to 15 days.
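The state-dependent quantile regressions at the heart of the SDSVaR approach can be sketched with a generic quantile-regression routine. Below is a minimal self-contained implementation via the linear-programming formulation of quantile regression; the data, quantiles, and single-regressor design are illustrative and not the paper's specification.

```python
import numpy as np
from scipy.optimize import linprog

def quantile_regression(X, y, q):
    """Quantile regression via its linear-programming formulation:
    minimize the asymmetrically weighted sum of absolute residuals
    (pinball loss). X: (n, k) design matrix including a constant; y: (n,).
    Returns the coefficient vector beta of length k."""
    n, k = X.shape
    # variables: beta (k, free), u_plus (n, >=0), u_minus (n, >=0)
    # residual r = y - X @ beta is split as r = u_plus - u_minus
    c = np.concatenate([np.zeros(k), q * np.ones(n), (1 - q) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * k + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:k]
```

Estimating the same regression at a low, middle, and high quantile then corresponds loosely to the tranquil, normal, and volatile market states of the SDSVaR framework.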
Credit boom detection methodologies (such as the threshold method) lack robustness as they are based on univariate detrending analysis and resort to ratios of credit to real activity. I propose a quantitative indicator to detect atypical behavior of credit from a multivariate system - a monetary VAR. This methodology explicitly accounts for endogenous interactions between credit, asset prices and real activity, and detects atypical credit expansions and contractions in the Euro Area, Japan and the U.S. robustly and in a timely manner. The analysis also proves useful in real time.
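For contrast with the multivariate VAR indicator proposed here, a univariate threshold-style detector of the kind the abstract criticizes can be sketched in a few lines; the window length and z-score cutoff below are arbitrary illustrative choices.

```python
import numpy as np

def atypical_credit(credit, window=20, z_thresh=1.5):
    """Flag periods where credit deviates from its rolling mean by more
    than z_thresh rolling standard deviations (a simplified univariate
    threshold detector, not the paper's VAR-based indicator)."""
    flags = np.zeros(len(credit), dtype=bool)
    for t in range(window, len(credit)):
        hist = credit[t - window:t]          # trailing window only: usable in real time
        mu, sd = hist.mean(), hist.std()
        if sd > 0 and abs(credit[t] - mu) > z_thresh * sd:
            flags[t] = True
    return flags
```

Because it ignores interactions with asset prices and real activity, such a rule is exactly the kind of detrending-based detector whose lack of robustness motivates the VAR approach.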
This paper investigates the risk channel of monetary policy on the asset side of banks’ balance sheets. We use a factor-augmented vector autoregression (FAVAR) model to show that aggregate lending standards of U.S. banks, such as their collateral requirements for firms, are significantly loosened in response to an unexpected decrease in the Federal Funds rate. Based on this evidence, we reformulate the costly state verification (CSV) contract to allow for an active financial intermediary, embed it in a New Keynesian dynamic stochastic general equilibrium (DSGE) model, and show that – consistent with our empirical findings – an expansionary monetary policy shock implies a temporary increase in bank lending relative to borrower collateral. In the model, this is accompanied by a higher default rate of borrowers.
We find that on average consumers chose the contract that ex post minimized their net costs. A substantial fraction of consumers (about 40%) still chose the ex post sub-optimal contract, with some incurring hundreds of dollars of avoidable interest costs. Nonetheless, the probability of choosing the sub-optimal contract declines with the dollar magnitude of the potential error, and consumers with larger errors were more likely to subsequently switch to the optimal contract. Thus most of the errors appear not to have been very costly, with the exception that a small minority of consumers persists in holding substantially sub-optimal contracts without switching. JEL Classification: G11, G21, E21, E51
The reaction of consumer spending and debt to tax rebates – evidence from consumer credit data
(2008)
We use a new panel dataset of credit card accounts to analyze how consumers responded to the 2001 Federal income tax rebates. We estimate the monthly response of credit card payments, spending, and debt, exploiting the unique, randomized timing of the rebate disbursement. We find that, on average, consumers initially saved some of the rebate, by increasing their credit card payments and thereby paying down debt. But soon afterwards their spending increased, counter to the canonical Permanent-Income model. Spending rose most for consumers who were initially most likely to be liquidity constrained, whereas debt declined most (so saving rose most) for unconstrained consumers. More generally, the results suggest that there can be important dynamics in consumers’ response to “lumpy” increases in income like tax rebates, working in part through balance sheet (liquidity) mechanisms.
Even as online advertising continues to grow, a central question remains: Who to target? Yet, advertisers know little about how to select from the hundreds of audience segments for targeting (and combinations thereof) for a profitable online advertising campaign. Utilizing insights from a field experiment on Facebook (Study 1), we develop a model that helps advertisers solve the cold-start problem of selecting audience segments for targeting. Our model enables advertisers to calculate the break-even performance of an audience segment to make a targeted ad campaign at least as profitable as an untargeted one. Advertisers can use this novel model to decide whether to test specific audience segments in their campaigns (e.g., in randomized controlled trials). We apply our model to data from the Spotify ad platform to study the profitability of different audience segments (Study 2). Approximately half of those audience segments require the click-through rate to double compared to an untargeted campaign, which is unrealistically high for most ad campaigns. Our model also shows that narrow segments require a lift that is likely not attainable, specifically when the data quality of these segments is poor. We confirm this theoretical finding in an empirical study (Study 3): A decrease in data quality due to Apple’s introduction of the App Tracking Transparency (ATT) framework more negatively affects the click-through rate of narrow (versus broad) audience segments.
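The break-even logic described above can be illustrated with a toy calculation. Under the simplifying assumptions of this sketch (impression-based pricing and equal value per click; these are not the paper's full model), a targeted campaign matches the cost per click of an untargeted one exactly when its click-through-rate lift equals its cost premium per impression:

```python
def breakeven_ctr_lift(cpm_untargeted, cpm_targeted):
    """Minimum multiplicative lift in click-through rate that a targeted
    campaign needs so its cost per click does not exceed the untargeted
    campaign's. Derivation: cost per click = (CPM / 1000) / CTR, so equal
    cost per click requires CTR_t / CTR_u = CPM_t / CPM_u."""
    return cpm_targeted / cpm_untargeted
```

For example, a targeted segment priced at twice the untargeted CPM needs its click-through rate to double just to break even, which matches the abstract's observation that roughly half of the studied segments require an unrealistically high doubling of the CTR.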
Despite ample evidence that customers exhibit higher discount rates than firms, it is not clear how differences in discount rates affect optimal prices, profits, and welfare of complementary products (which could be goods or services). We show for complementary products that higher discount rates of customers do not increase profit or consumer surplus. Firms, including banks, would be advised to seek to reduce excessive discount rates among consumers.
Modeling short-term interest rates as following regime-switching processes has become increasingly popular. Theoretically, regime-switching models are able to capture rational expectations of infrequently occurring discrete events. Technically, they allow for potential time-varying stationarity. After discussing both aspects with reference to the recent literature, this paper provides estimations of various univariate regime-switching specifications for the German three-month money market rate and bivariate specifications additionally including the term spread. However, the main contribution is a multi-step out-of-sample forecasting competition. It turns out that forecasts are improved substantially when allowing for state-dependence. Particularly, the informational content of the term spread for future short rate changes can be exploited optimally within a multivariate regime-switching framework.
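Regime-switching models of the kind estimated here are typically evaluated with Hamilton's filter, which produces the probability of each regime given the data observed so far. A minimal two-state sketch follows; the model (switching mean and variance of a Gaussian) and all parameter values are illustrative, not the paper's estimates.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Gaussian density, vectorized over states."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def hamilton_filter(y, mu, sigma, P):
    """Filtered regime probabilities for a 2-state Markov-switching
    Gaussian model. y: observations; mu, sigma: length-2 arrays of
    state means and standard deviations; P: 2x2 transition matrix with
    P[i, j] = Prob(s_t = j | s_{t-1} = i). Returns an (n, 2) array of
    P(s_t = j | y_1..y_t)."""
    n = len(y)
    xi = np.zeros((n, 2))
    # initialize with the ergodic (stationary) distribution of P
    pred = np.array([P[1, 0], P[0, 1]])
    pred = pred / pred.sum()
    for t in range(n):
        lik = normal_pdf(y[t], mu, sigma)   # state-conditional densities
        post = pred * lik
        xi[t] = post / post.sum()           # Bayes update: filtered probabilities
        pred = xi[t] @ P                    # one-step-ahead regime prediction
    return xi
```

State-dependent forecasts of the kind the abstract describes are then probability-weighted averages of the state-conditional predictions, with weights taken from the last row of the filter output.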
This study uses Markov-switching models to evaluate the informational content of the term structure as a predictor of recessions in eight OECD countries. The empirical results suggest that for all countries the term spread is sensibly modelled as a two-state regime-switching process. Moreover, our simple univariate model turns out to be a filter that accurately transforms term spread changes into turning point predictions. The term structure is confirmed to be a reliable recession indicator. However, the results of probit estimations show that the Markov-switching filter does not significantly improve the forecasting ability of the spread.
In this study a regime switching approach is applied to estimate the chartist and fundamentalist (c&f) exchange rate model originally proposed by Frankel and Froot (1986). The c&f model is tested against alternative regime switching specifications applying likelihood ratio tests. Nested atheoretical models like the popular segmented trends model suggested by Engel and Hamilton (1990) are rejected in favour of the multi-agent model. Moreover, the c&f regime switching model seems to describe the data much better than a competing regime switching GARCH(1,1) model. Finally, our findings turn out to be relatively robust when the model is estimated in subsamples. The empirical results suggest that the model is able to explain daily DM/Dollar forward exchange rate dynamics from 1982 to 1998.
A common prediction of macroeconomic models of credit market frictions is that the tightness of financial constraints is countercyclical. As a result, theory implies a negative collateralizability premium; that is, capital that can be used as collateral to relax financial constraints provides insurance against aggregate shocks and commands a lower risk compensation compared with non-collateralizable assets. We show that a long-short portfolio constructed using a novel measure of asset collateralizability generates an average excess return of around 8% per year. We develop a general equilibrium model with heterogeneous firms and financial constraints to quantitatively account for the collateralizability premium.
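The sorting step behind such a long-short portfolio can be sketched as a single-period characteristic sort. Quintile breakpoints, equal weighting, and the long-low / short-high direction (consistent with the negative premium described above) are illustrative choices of this sketch, not details taken from the paper.

```python
import numpy as np

def long_short_return(characteristic, returns, n_groups=5):
    """Equal-weighted return of a portfolio that is long the lowest-
    characteristic quintile and short the highest, for one period.
    characteristic, returns: aligned 1-D arrays over assets."""
    order = np.argsort(characteristic)
    k = len(order) // n_groups        # assets per quintile
    low, high = order[:k], order[-k:]
    return returns[low].mean() - returns[high].mean()
```

In an empirical application this sort would be repeated each period on lagged characteristics, and the premium reported would be the time-series average of these spread returns.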
Most insurers in the European Union determine their regulatory capital requirements based on the standard formula of Solvency II. However, there is evidence that the standard formula inaccurately reflects insurers’ risk situation and may provide misleading steering incentives. In the second pillar, Solvency II requires insurers to perform a so-called “Own Risk and Solvency Assessment” (ORSA). In their ORSA, insurers must establish their own risk measurement approaches, including those based on scenarios, in order to derive suitable risk assessments and address shortcomings of the standard formula. The idea of this paper is to identify scenarios in such a way that the standard formula in connection with the ORSA provides a reliable basis for risk management decisions. Using an innovative method for scenario identification, our approach allows for a simple but relatively precise assessment of marginal and even non-marginal portfolio changes. We numerically evaluate the proposed approach in the context of market risk employing an internal model from the academic literature and the Solvency Capital Requirement (SCR) calculation under Solvency II.
Gradient capital allocation, also known as Euler allocation, is a technique used to redistribute diversified capital requirements among different segments of a portfolio. The method is commonly employed to identify dominant risks, assess the risk-adjusted profitability of segments, and install limit systems. However, capital allocation can be misleading in all these applications because it only accounts for the current portfolio composition and ignores how diversification effects may change with a portfolio restructuring. This paper proposes enhancing the gradient capital allocation by adding “orthogonal convexity scenarios” (OCS). OCS identify risk concentrations that potentially drive portfolio risk and become relevant after restructuring. OCS have strong ties with principal component analysis (PCA), but they are a more general concept and compatible with common empirical patterns of risk drivers being fat-tailed and increasingly dependent in market downturns. We illustrate possible applications of OCS in terms of risk communication and risk limits.
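For a standard-deviation risk measure, gradient (Euler) allocation has a closed form: each segment's contribution is its covariance with the portfolio loss divided by the portfolio's standard deviation, and by Euler's theorem the contributions add up to total portfolio risk. A minimal simulation-based sketch (the choice of risk measure is illustrative; in practice VaR- or expected-shortfall-based variants are common):

```python
import numpy as np

def euler_allocation(losses):
    """Euler capital allocation under a standard-deviation risk measure.
    losses: (n_scenarios, n_segments) array of simulated losses.
    Returns per-segment contributions summing to the portfolio std
    (the 'full allocation' property)."""
    total = losses.sum(axis=1)
    sd = total.std()
    # derivative of std(S) w.r.t. segment weight i equals cov(X_i, S) / std(S)
    return np.array([np.cov(losses[:, i], total, bias=True)[0, 1] / sd
                     for i in range(losses.shape[1])])
```

The paper's critique applies directly: these contributions are derivatives at the current portfolio, so they say nothing about how diversification changes under a non-marginal restructuring, which is what the proposed OCS are meant to capture.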
We show that the presence of high frequency trading (HFT) has significantly mitigated the frequency and severity of end-of-day price dislocation, counter to recent concerns expressed in the media. The effect of HFT is more pronounced on days when end-of-day price dislocation is more likely to result from market manipulation: option expiry dates and month-ends. Moreover, the effect of HFT is more pronounced than the role of trading rules, surveillance, enforcement, and legal conditions in curtailing the frequency and severity of end-of-day price dislocation. We show our findings are robust to different proxies for the start of HFT by trade size, cancellation of orders, and co-location.
We examine the impact of stock exchange trading rules and surveillance on the frequency and severity of suspected insider trading cases in 22 stock exchanges around the world over the period January 2003 through June 2011. Using new indices for market manipulation, insider trading, and broker-agency conflict based on the specific provisions of the trading rules of each stock exchange, along with surveillance to detect non-compliance with such rules, we show that more detailed exchange trading rules and surveillance over time and across markets significantly reduce the number of cases, but increase the profits per case.
In this paper we assess the implications of sunk costs and product differentiation for the pricing decisions of multinational firms. For this purpose we use a modified version of Salop's spatial competition model. The model yields clear-cut predictions regarding the effects of exchange rate shocks on market structure and on pass-through. The main results are as follows: shocks within the band of inaction do not affect market structure. The upper bound of this range rises as the industry's ratio of sunk to fixed costs increases. As fixed costs and product heterogeneity jointly increase, the lower bound drops. Outside of the range, depreciations cause one or several of the foreign brands closest to the home brand to exit. This decreases the overall responsiveness of prices to exchange rate shocks. Large appreciations induce entry and increase the elasticity of prices. This asymmetry implies larger positive than negative PPP deviations. When accounting for price changes in foreign markets, strategic pricing behaviour is no longer sufficient to generate real exchange rate variability. Incomplete pass-through obtains if and only if the domestic firms have a smaller market share abroad. With large nominal exchange rate shocks a hysteresis result obtains if and only if sunk costs are non-zero. JEL Classification: C33, E31
Under a conventional policy rule, a central bank adjusts its policy rate linearly according to the gap between inflation and its target, and the gap between output and its potential. Under "the opportunistic approach to disinflation" a central bank controls inflation aggressively when inflation is far from its target, but concentrates more on output stabilization when inflation is close to its target, allowing supply shocks and unforeseen fluctuations in aggregate demand to move inflation within a certain band. We use stochastic simulations of a small-scale rational expectations model to contrast the behavior of output and inflation under opportunistic and linear rules. JEL Classification: E31, E52, E58, E61. July 2005.
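The contrast between the two rules can be illustrated with a stylized reaction function: inside an inflation band the opportunistic rule responds only to the output gap, outside it the rule responds to inflation as well. All coefficients and the band width below are illustrative choices, not values from the paper's simulations.

```python
def policy_rate(inflation, target, output_gap, band=1.0,
                phi_pi=1.5, phi_y=0.5, neutral=2.0):
    """Stylized opportunistic reaction function. Inside the band around
    the inflation target, respond only to the output gap; outside it,
    respond (Taylor-style) to the inflation gap as well. Setting band=0
    recovers a conventional linear rule."""
    gap = inflation - target
    if abs(gap) <= band:
        return neutral + phi_y * output_gap        # inflation drifts within the band
    return neutral + phi_pi * gap + phi_y * output_gap
```

The kink at the band edge is what distinguishes the opportunistic rule from the linear one: small inflation deviations draw no policy response, while large ones are met aggressively.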
Recent empirical research has found that the strong short-term relationship between monetary aggregates and US real output and inflation, as outlined in the classical study by M. Friedman and Schwartz, mostly disappeared after the early 1980s. In the light of the B. Friedman and Kuttner (1992) information value approach, we reevaluate the vanishing relationship between US monetary aggregates and these macroeconomic fundamentals by taking into account the international currency feature of the US dollar. Using official US data for foreign flows constructed by Porter and Judson (1996), we find that domestic money (the currency component of M1 corrected for foreign holdings of dollars) contains valuable information about future movements of US real output and inflation. The statistical evidence provided here thus suggests that Friedman and Schwartz's stylized facts can be reestablished once the focus of analysis is back on domestic monetary aggregates. This version: August 2001. JEL Classification: E3, E4, E5