Highlights
• Six Newton methods for solving matrix quadratic equations in linear DSGE models.
• Compared to QZ using 99 different DSGE models including Smets and Wouters (2007).
• Newton methods more accurate than QZ with comparable computation burden.
• Apt for refining solutions from alternative methods or nearby parameterizations.
Abstract
This paper presents and compares Newton-based methods from the applied mathematics literature for solving the matrix quadratic that underlies the recursive solution of linear DSGE models. The methods are compared using nearly 100 different models from the Macroeconomic Model Data Base (MMB) and, iteratively, different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). We find that Newton-based methods compare favorably in solving DSGE models, providing higher accuracy as measured by the forward error of the solution at a comparable computation burden. The methods, however, suffer from their inability to guarantee convergence to a particular (e.g., the unique stable) solution, but their iterative procedures lend themselves to refining solutions from different methods or parameterizations.
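As a rough, hedged sketch of what such an iteration looks like (a generic Newton scheme, not one of the paper's six variants, and all matrices below are synthetic): the matrix quadratic takes the form A X² + B X + C = 0, and each Newton step solves the linearized generalized Sylvester equation (A X + B) H + A H X = −F(X), here via Kronecker vectorization.

```python
import numpy as np

def newton_matrix_quadratic(A, B, C, X0, tol=1e-12, max_iter=50):
    """Newton iteration for F(X) = A X^2 + B X + C = 0.

    Each step solves the linearized (generalized Sylvester) equation
    (A X + B) H + A H X = -F(X) by Kronecker vectorization.
    Illustrative only -- not one of the paper's six specific variants.
    """
    n = A.shape[0]
    X = X0.copy()
    I = np.eye(n)
    for _ in range(max_iter):
        F = A @ X @ X + B @ X + C
        if np.linalg.norm(F, "fro") < tol:
            break
        # vec((AX+B) H + A H X) = [I kron (AX+B) + X^T kron A] vec(H)
        J = np.kron(I, A @ X + B) + np.kron(X.T, A)
        H = np.linalg.solve(J, -F.flatten(order="F")).reshape((n, n), order="F")
        X = X + H
    return X

# Construct a problem with a known root: pick X_true, back out C.
rng = np.random.default_rng(0)
n = 4
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))
X_true = 0.5 * rng.standard_normal((n, n))
C = -(A @ X_true @ X_true + B @ X_true)

X = newton_matrix_quadratic(A, B, C, X0=X_true + 0.01 * rng.standard_normal((n, n)))
residual = np.linalg.norm(A @ X @ X + B @ X + C, "fro")
```

The Frobenius norm of F(X) at the returned solution corresponds to the forward-error idea in the abstract; as the abstract notes, nothing in the iteration itself guarantees convergence to the unique stable solution.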
The hierarchical feature regression (HFR) is a novel graph-based regularized regression estimator, which mobilizes insights from the domains of machine learning and graph theory to estimate robust parameters for a linear regression. The estimator constructs a supervised feature graph that decomposes parameters along its edges, adjusting first for common variation and successively incorporating idiosyncratic patterns into the fitting process. The graph structure has the effect of shrinking parameters towards group targets, where the extent of shrinkage is governed by a hyperparameter, and group compositions as well as shrinkage targets are determined endogenously. The method offers rich resources for the visual exploration of the latent effect structure in the data, and demonstrates good predictive accuracy and versatility when compared to a panel of commonly used regularization techniques across a range of empirical and simulated regression tasks.
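To make the idea of "shrinking parameters towards group targets" concrete, here is a deliberately simplified sketch: a ridge-type penalty that pulls each coefficient toward its group mean. Unlike the HFR, the grouping here is fixed a priori rather than determined endogenously from a supervised feature graph; all data and groupings are synthetic.

```python
import numpy as np

def group_shrinkage_ridge(X, y, groups, lam):
    """Ridge-type estimator shrinking coefficients toward their group means.

    Penalty: lam * sum_j (beta_j - mean(beta over group of j))^2.
    A toy illustration of group-target shrinkage -- NOT the HFR estimator,
    which determines group compositions and targets endogenously.
    """
    p = X.shape[1]
    # P projects a coefficient vector onto its group-wise means.
    P = np.zeros((p, p))
    for g in np.unique(groups):
        idx = np.flatnonzero(groups == g)
        P[np.ix_(idx, idx)] = 1.0 / len(idx)
    M = np.eye(p) - P  # penalizes deviations from the group means
    return np.linalg.solve(X.T @ X + lam * M, X.T @ y)

rng = np.random.default_rng(1)
n, p = 200, 6
groups = np.array([0, 0, 0, 1, 1, 1])                     # hypothetical grouping
beta_true = np.array([1.0, 1.1, 0.9, -2.0, -2.1, -1.9])   # clustered effects
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.5 * rng.standard_normal(n)

b_ols = group_shrinkage_ridge(X, y, groups, lam=0.0)      # plain OLS
b_shrunk = group_shrinkage_ridge(X, y, groups, lam=1e6)   # strong shrinkage
```

With a large hyperparameter the coefficients within each group collapse toward a common group target, while the between-group structure survives; the HFR layers this idea along the edges of an estimated feature graph.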
In a unifying framework generalizing established theories, we characterize the conditions under which Joint Ownership of assets creates the best cooperation incentives in a partnership. We endogenise renegotiation costs and assume that they weakly increase with additional assets. A salient sufficient condition for optimal cooperation incentives among patient partners is that Joint Ownership be a Strict Coasian Institution, for which transaction costs impede an efficient asset reallocation after a breakdown. In contrast to Halonen (2002), the logic behind our results is that Joint Ownership maximizes both the value of the relationship and the costs of renegotiating ownership after a broken relationship.
Highlights
• Pathways for a circular economy towards the EU goals require policy support that, in turn, requires legitimacy.
• Legitimacy is often contested in the public discourse at all phases in the technological innovation system.
• Legitimacy remains poorly understood for ‘in-between’ technologies that struggle to move from the formative to the growth stage.
• The article explores legitimacy for chemical recycling primarily based on evidence from the UK, Germany, and Italy.
Abstract
The European Commission aims to increase the recycling of plastic packaging to 60% by 2025, requiring fundamental changes towards a more circular economy. Pathways for this transition require policy support that largely depends on their legitimacy in the public discourse. These normative aspects remain poorly understood for ‘in-between’ technologies, i.e., technologies that are no longer novel but struggle to move to the growth phase within the technological innovation system. Therefore, we ask: How do discourses shape technology legitimacy for in-between technologies? Drawing on the empirical example of chemical recycling, the analysis renders two principal findings. First, legitimising and delegitimising storylines present contesting views on in-between technologies regarding their technological aspects, environmental and social impacts, and economic and policy implications. Second, how discourses contribute to technology legitimacy depends on the actors and interests that drive the prevalent storylines in particular contexts.
When estimating misspecified linear factor models for the cross-section of expected returns using GMM, the explanatory power of these models can be spuriously high when the estimated factor means are allowed to deviate substantially from the sample averages. In fact, by shifting the weights on the moment conditions, any level of cross-sectional fit can be attained. The mathematically correct global minimum of the GMM objective function can be obtained at a parameter vector that is far from the true parameters of the data-generating process. This property is not restricted to small samples, but rather holds in population. It is a feature of the GMM estimation design and applies to both strong and weak factors, as well as to all types of test assets.
Does political conflict with another country influence domestic consumers' daily consumption choices? We exploit the volatile US-China relations in 2018 and 2019 to analyze whether US consumers reduce their visits to Chinese restaurants when bilateral relations deteriorate. We measure the degree of political conflict through negativity in media reports and rely on smartphone location data to measure daily visits to over 190,000 US restaurants. A deterioration in US-China relations induces a significant decline in visits not only to Chinese but also to other foreign ethnic restaurants, while visits to typical American restaurants increase. We identify consumers' age, race, and cultural openness as moderators of the strength of this ethnocentric effect.
The 2011 Arab Spring marked the opening of the Central Mediterranean Route for irregular border crossings between Libya and Italy, which produced heterogeneous reductions of bilateral smuggling distances between country pairs in the Mediterranean region. We exploit this source of spatial and temporal variation in bilateral distance along land and sea routes to estimate the elasticity of irregular migration intentions for African and Near East countries. We estimate an elasticity of migration intentions to smuggling distances exceeding −3, mainly driven by countries with weak rule of law and high internet penetration. Our findings are consistent across irregular migration measures at both the aggregate and individual levels. We show that the irregular migration elasticity is higher for the young, for relatively skilled individuals, and for those with an informational advantage (a social network abroad or a mobile phone).
This paper studies discrete time finite horizon life-cycle models with arbitrary discount functions and iso-elastic per period power utility with concavity parameter θ. We distinguish between the savings behavior of a sophisticated versus a naive agent. Although both agent types have identical preferences, they solve different utility maximization problems whenever the model is dynamically inconsistent. Pollak (1968) shows that the savings behavior of both agent types is nevertheless identical for logarithmic utility (θ = 1). We generalize this result by showing that the sophisticated agent saves in every period a greater fraction of her wealth than the naive agent if and only if θ ≥ 1. While this result goes through for model extensions that preserve linearity of the consumption policy function, it breaks down for non-linear model extensions.
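The Pollak (1968) result and its generalization can be illustrated numerically in a minimal three-period β-δ model (parameters are arbitrary, the gross return is one, and there is no income after the first period; all numbers are purely illustrative): the naive self believes its period-2 successor consumes like an exponential discounter, while the sophisticated self foresees the actual present-biased rule.

```python
import numpy as np

def u(c, theta):
    """Iso-elastic per-period utility; log utility at theta = 1."""
    return np.log(c) if theta == 1 else c ** (1 - theta) / (1 - theta)

def first_period_saving(theta, beta=0.7, delta=0.95, w=1.0):
    """Savings fraction of the first of three selves in a beta-delta model.

    Actual period-2 behavior: c2 = w2 / (1 + (beta*delta)**(1/theta)).
    The naive self instead believes period 2 behaves exponentially:
    c2 = w2 / (1 + delta**(1/theta)).
    """
    grid = np.linspace(1e-6, w - 1e-6, 20000)  # candidate c1 values

    def value(c1, k):
        w2 = w - c1                            # wealth handed to period 2
        c2, c3 = k * w2, (1 - k) * w2
        return u(c1, theta) + beta * (delta * u(c2, theta)
                                      + delta ** 2 * u(c3, theta))

    k_actual = 1.0 / (1.0 + (beta * delta) ** (1.0 / theta))  # foreseen by sophisticate
    k_belief = 1.0 / (1.0 + delta ** (1.0 / theta))           # naif's (wrong) belief
    c1_soph = grid[np.argmax(value(grid, k_actual))]
    c1_naive = grid[np.argmax(value(grid, k_belief))]
    return 1 - c1_soph / w, 1 - c1_naive / w

s_soph_log, s_naive_log = first_period_saving(theta=1)  # equal (Pollak 1968)
s_soph_2, s_naive_2 = first_period_saving(theta=2)      # sophisticate saves more
```

For θ = 1 the two objectives differ only by an additive constant, so the savings fractions coincide; for θ = 2 the sophisticated agent saves a strictly greater fraction, in line with the paper's θ ≥ 1 characterization.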
Detailed feedback on exercises helps learners become proficient but is time-consuming for educators and, thus, hardly scalable. This manuscript evaluates how well Generative Artificial Intelligence (AI) provides automated feedback on complex multimodal exercises requiring coding, statistics, and economic reasoning. Besides providing this technology through an easily accessible web application, this article evaluates the technology’s performance by comparing the quantitative feedback (i.e., points achieved) from Generative AI models with human expert feedback for 4,349 solutions to marketing analytics exercises. The results show that automated feedback produced by Generative AI (GPT-4) provides almost unbiased evaluations that correlate highly with human evaluations (r = 0.94) and deviate from them by only 6 %. GPT-4 performs best among seven Generative AI models, albeit at the highest cost. Comparing the models’ performance with costs shows that GPT-4, Mistral Large, Claude 3 Opus, and Gemini 1.0 Pro dominate three other Generative AI models (Claude 3 Sonnet, GPT-3.5, and Gemini 1.5 Pro). Expert assessment of the qualitative feedback (i.e., the AI’s textual response) indicates that it is mostly correct, sufficient, and appropriate for learners. A survey of marketing analytics learners shows that they highly recommend the app and its Generative AI feedback. An advantage of the app is its subject-agnosticism: it does not require any subject- or exercise-specific training. Thus, it is immediately usable for new exercises in marketing analytics and other subjects.
Highlights
• The 1986 Immigration Reform and Control Act legalized millions of Hispanic migrants.
• IRCA receipt significantly increases state-to-county fiscal transfers.
• Electoral incentives of the state governor drive the fiscal response to the IRCA.
• Legalization increases Hispanic turnout and political engagement.
Abstract
We study the impact of immigrant legalization on fiscal transfers from state to local governments in the United States, exploiting variation in legal status from the 1986 Immigration Reform and Control Act (IRCA). State governments allocate more resources to IRCA counties, an allocation that is responsive to the electoral incentives of the governor. Importantly, the effect emerges prior to the enfranchisement of the IRCA migrants and we argue it is driven by the IRCA’s capacity to politically empower already legal Hispanic migrants in mixed legal status communities. The IRCA increases turnout in large Hispanic communities as well as Hispanic political engagement, without detectably triggering anti-migrant sentiment.
The recent COVID-19 pandemic represents an unprecedented worldwide event to study the influence of related news on the financial markets, especially during the early stage of the pandemic when information on the new threat came rapidly and was complex for investors to process. In this paper, we investigate whether the flow of news on COVID-19 had an impact on forming market expectations. We analyze 203,886 online articles dealing with COVID-19 and published on three news platforms (MarketWatch.com, NYTimes.com, and Reuters.com) in the period from January to June 2020. Using machine learning techniques, we extract the news sentiment through a financial market-adapted BERT model that recognizes the context of each word in a given item. Our results show that there is a statistically significant and positive relationship between sentiment scores and S&P 500 market returns. Furthermore, we provide evidence that sentiment components and news categories on NYTimes.com were differently related to market returns.
This paper proposes tests for out-of-sample comparisons of interval forecasts based on parametric conditional quantile models. The tests rank the distance between actual and nominal conditional coverage with respect to the set of conditioning variables from all models, for a given loss function. We propose a pairwise test to compare two models for a single predictive interval. The set-up is then extended to a comparison across multiple models and/or intervals. The limiting distribution varies depending on whether models are strictly non-nested or overlapping. In the latter case, degeneracy may occur. We establish the asymptotic validity of wild bootstrap based critical values across all cases. An empirical application to Growth-at-Risk (GaR) uncovers situations in which a richer set of financial indicators are found to outperform a commonly-used benchmark model when predicting downside risk to economic activity.
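The wild bootstrap used for the critical values can be sketched in its simplest textbook form, here for the toy null of a zero slope in a linear regression rather than for the paper's interval-forecast statistics; Rademacher weights preserve conditional heteroskedasticity in the resampled residuals. All data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

def wild_bootstrap_pvalue(x, y, n_boot=999):
    """Wild bootstrap p-value for H0: slope = 0 in y = a + b*x + e.

    Null-restricted residuals are reweighted with Rademacher draws,
    preserving heteroskedasticity. A generic textbook sketch -- not the
    paper's interval-forecast comparison tests, whose statistics differ.
    """
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    slope = np.linalg.lstsq(X, y, rcond=None)[0][1]
    e0 = y - y.mean()                        # residuals imposing the null
    hits = 0
    for _ in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=n)  # Rademacher weights
        y_star = y.mean() + w * e0
        slope_star = np.linalg.lstsq(X, y_star, rcond=None)[0][1]
        hits += abs(slope_star) >= abs(slope)
    return (1 + hits) / (n_boot + 1)

# Heteroskedastic data with a genuine slope: the test should reject.
x = rng.standard_normal(200)
y = 1.0 + 0.5 * x + (0.5 + 0.5 * np.abs(x)) * rng.standard_normal(200)
p_val = wild_bootstrap_pvalue(x, y)
```

In the paper's setting the bootstrapped statistic is the loss-based comparison of conditional coverage distances, and the wild scheme is what delivers validity across the non-nested, overlapping, and degenerate cases.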
Market risks account for an integral part of insurers' risk profiles. We explore market risk sensitivities of insurers in the United States and Europe. Based on panel regression models and daily market data from 2012 to 2018, we find that sensitivities are particularly driven by insurers' product portfolios. The influence of interest rate movements on stock returns is 60% larger for US than for European life insurers. For the former, interest rate risk is a dominant market risk with an effect that is five times larger than that of corporate credit risk. For European life insurers, the sensitivity to interest rate changes is only 44% larger than the sensitivity to credit default swaps on government bonds, underlining the relevance of sovereign credit risk.
In recent years, European regulators have debated restricting the time an online tracker can track a user to better protect consumer privacy. Despite the significance of these debates, there has been a noticeable absence of any comprehensive cost-benefit analysis. This article fills this gap on the cost side by suggesting an approach to estimate the economic consequences of lifetime restrictions on cookies for publishers. The empirical study on cookies of 54,127 users who received ∼128 million ad impressions over ∼2.5 years yields an average cookie lifetime of 279 days, with an average value of €2.52 per cookie. Only ∼13 % of all cookies increase their daily value over time, but their average value is about four times larger than the average value of all cookies. Restricting cookies’ lifetime to one year (two years) could potentially decrease their lifetime value by ∼25 % (∼19 %), which represents a potential decrease in the value of all cookies of ∼9 % (∼5 %). Most cookies, however, would not be affected by lifetime restrictions of 12 or 24 months, as 72 % (85 %) of the users delete their cookies within 12 (24) months. In light of the €10.60 billion cookie-based display ad revenue in Europe, such restrictions would endanger €904 million (€576 million) annually, equivalent to €2.08 (€1.33) per EU internet user. The article discusses the marketing strategy challenges and opportunities these results imply for advertisers and publishers.
Using a field study at a German brokerage, we investigate advised individual investors’ behavior and outcomes after self-selecting into a flat-fee scheme (percentage of portfolio value) for mutual funds. In a difference-in-differences setting, we compare 699 switchers to propensity-score-matched advisory clients who remained in the commission-based scheme. Switchers increase their portfolio values, improve portfolio diversification, and increase their portfolio performance. They also demand more financial advice and follow more advisor recommendations. We argue that switchers attribute a higher quality to the unchanged advisory services.
We estimate the causal effect of shared e-scooter services on traffic accidents by exploiting the variation in the availability of e-scooter services induced by the staggered rollout across 93 cities in six countries. Police-reported accidents involving personal injuries in the average month increased by around 8.2% after shared e-scooters were introduced. Effects are large during summer and insignificant during winter. Further heterogeneity analysis reveals the largest estimated effects for cities with limited cycling infrastructure, while no effects are detectable in cities with high bike-lane density. This difference suggests that public policy can play a crucial role in mitigating accidents related to e-scooters and, more generally, to changes in urban mobility.
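A staggered-rollout design of this kind is typically estimated with two-way fixed effects. The sketch below simulates a city-by-month panel with a homogeneous additive effect (loosely sized after the paper's 8.2% figure; all numbers are hypothetical, not the paper's 93-city data) and recovers it by double demeaning; note that heterogeneous effects in staggered designs require more care than this sketch.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated city-month panel with staggered introduction dates.
n_cities, n_months, effect = 30, 24, 0.082
city_fe = rng.standard_normal(n_cities)[:, None]
month_fe = rng.standard_normal(n_months)[None, :]
rollout = rng.integers(6, 20, size=n_cities)[:, None]        # adoption month per city
D = (np.arange(n_months)[None, :] >= rollout).astype(float)  # treatment dummy
Y = city_fe + month_fe + effect * D + 0.1 * rng.standard_normal((n_cities, n_months))

def twoway_demean(M):
    """Within transformation: remove city and month means, add back grand mean."""
    return M - M.mean(axis=1, keepdims=True) - M.mean(axis=0, keepdims=True) + M.mean()

# Two-way fixed-effects estimate: OLS slope of demeaned Y on demeaned D.
Yd, Dd = twoway_demean(Y), twoway_demean(D)
beta_hat = (Dd * Yd).sum() / (Dd * Dd).sum()
```

With a constant treatment effect, the double-demeaned regression recovers it up to sampling noise; the paper's heterogeneity analysis (season, cycling infrastructure) goes beyond this homogeneous baseline.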
Even as online advertising continues to grow, a central question remains: Whom to target? Yet, advertisers know little about how to select from the hundreds of audience segments for targeting (and combinations thereof) for a profitable online advertising campaign. Utilizing insights from a field experiment on Facebook (Study 1), we develop a model that helps advertisers solve the cold-start problem of selecting audience segments for targeting. Our model enables advertisers to calculate the break-even performance of an audience segment to make a targeted ad campaign at least as profitable as an untargeted one. Advertisers can use this novel model to decide whether to test specific audience segments in their campaigns (e.g., in randomized controlled trials). We apply our model to data from the Spotify ad platform to study the profitability of different audience segments (Study 2). Approximately half of those audience segments require the click-through rate to double compared to an untargeted campaign, which is unrealistically high for most ad campaigns. Our model also shows that narrow segments require a lift that is likely not attainable, specifically when the data quality of these segments is poor. We confirm this theoretical finding in an empirical study (Study 3): A decrease in data quality due to Apple’s introduction of the App Tracking Transparency (ATT) framework more negatively affects the click-through rate of narrow (versus broad) audience segments.
Goal setting is vital in learning sciences, but the scientific evaluation of optimal learning goals is underexplored. This study proposes a novel methodological approach to determine optimal learning goals. The data in this study comes from a gamified learning app implemented in an undergraduate accounting course at a large German university. With a combination of decision trees and regression analyses, the goals connected to the badges implemented in the app are evaluated. The results show that the initial badge set already motivated learning strategies that led to better grades on the exam. However, the results indicate that the levels of the goals could be improved, and additional badges could be implemented. In addition to new goal levels, new goal types are also discussed. The findings show that learning goals initially determined by the instructors need to be evaluated to offer an optimal motivational effect. The new methodological approach used in this study can be easily transferred to other learning data sets to provide further insights.
In this article, we examine anti-refugee hate crime in the wake of the large influx of refugees to Germany in 2014 and 2015. By exploiting institutional features of the assignment of refugees to German regions, we estimate the impact of unexpected and sudden large-scale immigration on hate crime against refugees. Results indicate that it is not simply the size of local refugee inflows which drives the increase in hate crime, but rather the combination of refugee arrivals and latent anti-refugee sentiment. We show that ethnically homogeneous areas, areas which experienced hate crimes in the 1990s, and areas with high support for the Nazi party in the Weimar Republic, are more prone to respond to the arrival of refugees with incidents of hate crime against this group. Our results highlight the importance of regional anti-immigration sentiment in the analysis of the incumbent population’s reaction to immigration.
Do required minimum distribution 401(k) rules matter, and for whom? Insights from a lifecycle model
(2023)
Tax-qualified vehicles have helped U.S. private-sector workers accumulate $33 trillion in retirement plans. An important but often-overlooked institutional feature shaping decumulations from these plans is the “Required Minimum Distribution” (RMD) regulation requiring retirees to withdraw a minimum fraction from their retirement accounts or pay excise taxes on withdrawal shortfalls. Our calibrated lifecycle model measures the impact of RMD rules on heterogeneous households’ financial behavior during their work lives and in retirement. The model shows that reforms delaying or eliminating the RMD rules have little effect on consumption profiles, but they would influence withdrawals and tax payments for households with bequest motives.
While the COVID-19 pandemic had a large and asymmetric impact on firms, many countries quickly enacted massive business rescue programs which are specifically targeted to smaller firms. Little is known about the effects of such policies on business entry and exit, investment, factor reallocation, and macroeconomic outcomes. This paper builds a general equilibrium model with heterogeneous and financially constrained firms in order to evaluate the short- and long-term consequences of small firm rescue programs in a pandemic recession. We calibrate the stationary equilibrium and the pandemic shock to the U.S. economy, taking into account the actual Paycheck Protection Program (PPP) as a specific policy. We find that the policy has only a modest impact on aggregate output and employment because (i) jobs are saved predominantly in the smallest firms that account for a minor share of employment and (ii) the grant reduces the reallocation of resources towards larger and less impacted firms. Much of the reallocation effects occur in the aftermath of the pandemic episode. By preventing inefficient liquidations, the policy dampens the long-term declines of aggregate consumption and of the real wage, thus delivering small welfare gains.
Homeownership rates differ widely across European countries. We document that part of this variation is driven by differences in the fraction of adults co-residing with their parents. Comparing Germany and Italy, we show that in contrast to homeownership rates per household, homeownership rates per individual are very similar during the first part of the life cycle. To understand these patterns, we build an overlapping-generations model where individuals face uninsurable income risk and make consumption-saving and housing tenure decisions. We embed an explicit intergenerational link between children and parents to capture the three-way trade-off between owning, renting, and co-residing. Calibrating the model to Germany we explore the role of income profiles, housing policies, and the taste for independence and show that a combination of these factors goes a long way in explaining the differential life-cycle patterns of living arrangements between the two countries.
This paper examines rent sharing in private investments in public equity (PIPEs) between newly public firms and private investors. The evidence suggests highly asymmetric rent sharing. Newly public firms earn a negative return of up to −15% in the first post-PIPE year, while investors benefit due to the ability to dictate transaction terms. The results are economically relevant because newly public firms are, at least in recent years, more likely to tap private rather than public markets for follow-on financing shortly after the initial public offering (IPO), and because the results for newly public firms contrast with those for the broad PIPE market in Lim et al. (2021). The study also contributes to the PIPE literature by offering an integrative view of competing theories of the cross-section of post-PIPE stock returns. We simultaneously test proxies for corporate governance, asymmetric information, bargaining power, and managerial entrenchment. While all explanations have univariate predictive power for the post-PIPE performance, only the proxies for corporate governance and asymmetric information are robust in ceteris-paribus tests.
Questionable research practices have generated considerable recent interest throughout and beyond the scientific community. We subsume such practices involving secret data snooping that influences subsequent statistical inference under the term MESSing (manipulating evidence subject to snooping) and discuss, illustrate and quantify the possibly dramatic effects of several forms of MESSing using an empirical and a simple theoretical example. The empirical example uses numbers from the most popular German lottery, which seem to suggest that 13 is an unlucky number.
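The snooping mechanism is easy to demonstrate by simulation: draw from a fair lottery, then test only the most extreme of the 49 number frequencies as if it had been the single pre-registered hypothesis. The nominal 5% test then rejects far more often than 5%. (This is a generic illustration with simulated fair draws, not the paper's empirical lottery data.)

```python
import numpy as np

rng = np.random.default_rng(42)

def snooped_test(n_draws=10000, n_numbers=49):
    """Draw lottery numbers uniformly, then 'discover' the most extreme one.

    Snooping: run a two-sided z-test on every number's frequency and report
    only the most extreme statistic, as if it had been the single
    pre-registered hypothesis. The simulated lottery is perfectly fair.
    """
    draws = rng.integers(1, n_numbers + 1, size=n_draws)
    counts = np.bincount(draws, minlength=n_numbers + 1)[1:]
    p = 1.0 / n_numbers
    z = (counts - n_draws * p) / np.sqrt(n_draws * p * (1 - p))
    return np.max(np.abs(z)) > 1.96  # nominal 5% two-sided cutoff

# Fraction of simulations in which snooping yields a "significant" number.
rate = np.mean([snooped_test() for _ in range(300)])
```

An honest single pre-registered test rejects about 5% of the time; picking the most extreme of 49 numbers after looking at the data rejects in the vast majority of simulations, which is exactly how a fair lottery can appear to have an "unlucky" 13.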
We use census data to show that structural transformation reflects a fundamental reallocation of labour from goods to services, instead of a relabelling that occurs when goods-producing firms outsource their in-house service production. The novelty of our approach is that it categorizes labour by occupations, which are invariant to outsourcing. We find that the reallocation of labour from goods-producing to service-producing occupations is a robust feature in censuses from around the world and different time periods. To understand the underlying forces, we propose a tractable model in which uneven occupation-specific technological change generates structural transformation of occupation employment.
We propose a novel approach to the study of international trade based on a theory of country integration that embodies a broad systemic viewpoint on the relationship between trade and growth. Our model leads to an indicator of country openness that measures a country's level of integration through the full architecture of its connections in the trade network. We apply our methodology to a sample of 204 countries and find a sizable and significant positive relationship between our integration measure and a country's growth rate, while that of the traditional measures of outward orientation is only minor and statistically insignificant.
The present study investigates the moderating effect of usage intensity of the social networking site (SNS) Instagram (IG) on the influence of advertisement disclosure types on advertising performance. A national sample (N = 566) participated in a randomized online experiment including a real influencer and followers in order to investigate how different advertisement disclosure types affect advertising performance and how usage intensity moderates this effect. We find that disclosing an influencer’s postings with “#ad” increases the trustworthiness of the influencer and the general credibility of the posting for heavy users, but not for light users. A user’s followership strongly improves all researched variables (attitude toward product placement, trustworthiness of the spokesperson, and general credibility of the posting). This study is the first to distinguish between heavy and light usage intensity, and between followers and non-followers of an IG user, when examining the effects of advertisement disclosure types on advertising performance. To conclude, we present a number of recommendations regarding how advertisers, influencers, and SNS providers should develop strategies for monitoring, understanding, and responding to different social media users, e.g., closely monitoring an influencer’s audience to identify heavy users and optimally target them.
A person's intelligence level positively influences his or her professional success. Gifted and highly intelligent individuals should therefore be successful in their careers. However, previous findings on the occupational situation of gifted adults are mainly known from popular scientific sources in the fields of coaching and self-help groups and confirm prevailing stereotypes that gifted people have difficulties at work. Reliable studies are scarce. This systematic literature review examines 40 studies with a total of 22 job-related variables. Results are shown in general for (a) the employment situation and more specifically for the occupational aspects of (b) career, (c) personality and behavior, (d) satisfaction, (e) organization, and (f) the influence of giftedness on the profession. Moreover, possible differences between female and male gifted individuals and between gifted and non-gifted individuals are analyzed. Based on these findings, implications for practice as well as further research are discussed.
The crowdfunding of altruism
(2022)
This paper introduces a machine learning approach to quantify altruism from the linguistic style of textual documents. We apply our method to a central question in (social) entrepreneurship: How does altruism impact entrepreneurial success? Specifically, we examine the effects of altruism on crowdfunding outcomes in Initial Coin Offerings (ICOs). The main result suggests that altruism and ICO firm valuation are negatively related. We then explore several channels to shed some light on whether the negative altruism-valuation relation is causal. Our findings suggest that it is not altruism that causes lower firm valuation; rather, low-quality entrepreneurs select into altruistic projects, while the marginal effect of altruism on high-quality entrepreneurs is actually positive. Altruism increases the funding amount in ICOs in the presence of high-quality projects, low asymmetric information, and strong corporate governance.
The importance of agile methods has increased in recent years, not only to manage IT projects but also to establish flexible and adaptive organisational structures, which are essential to deal with disruptive changes and build successful digital business strategies. This paper takes an industry-specific perspective by analysing the dissemination, objectives and relative popularity of agile frameworks in the German banking sector. The data provides insights into expectations and experiences associated with agile methods and indicates possible implementation hurdles and success factors. Our research provides the first comprehensive analysis of agile methods in the German banking sector. The comparison with a selected number of fintechs has revealed some differences between banks and fintechs. We found that almost all banks and fintechs apply agile methods in IT projects. However, fintechs have relatively more experience with agile methods than banks and use them more intensively. Scrum is the most relevant framework used in practice. Scaled agile frameworks are so far negligible in the German banking sector. Acceleration of projects is apparently the most important objective of deploying agile methods. In addition, agile methods can contribute to cost savings and lead to improved quality and innovation performance, though for banks it is evidently more challenging to reach their respective targets than for fintechs. Overall, our findings suggest that German banks are still in a maturing process of becoming more agile and that there is room for an accelerated adoption of agile methods in general and scaled agile frameworks in particular.
With adequate support for the learner, errors can have high learning potential. This study investigates rather unsuitable action patterns of teachers in dealing with errors. Teachers rarely investigate the causes that evoke the occurrence of individual students’ errors, but instead often change addressees immediately after an error occurs. Such behavior is frequent in the classroom, leaving unexploited, yet important potential to learn from errors. It has remained unexplained why teachers act the way they do in error situations. Using video-stimulated recalls, I investigate the reasons for teachers’ behavior in students’ error situations by confronting them with recorded episodes from their own teaching. Error situations are analyzed (within-case) and teachers’ beliefs are classified in an explanatory model (cross-case) to illustrate patterns across teachers. Results show that teachers refer to an interaction of student attributes, their own attributes, and error attributes when reasoning about their own behavior. I find that reference to specific attributes varies depending on the situation, and so do the described reasons that led to a particular behavior as a spontaneous or more reflective decision.
A common element of market structure analysis is the spatial representation of firms’ competitive positions on maps. Such maps typically capture static snapshots in time. Yet, competitive positions tend to change. Embedded in such changes are firms’ trajectories, that is, the series of changes in firms’ positions over time relative to all other firms in a market. Identifying these trajectories contributes to market structure analysis by providing a forward-looking perspective on competition, revealing firms’ (re)positioning strategies and indicating strategy effectiveness. To unlock these insights, we propose EvoMap, a novel dynamic mapping framework that identifies firms’ trajectories from high-frequency and potentially noisy data. We validate EvoMap via extensive simulations and apply it empirically to study the trajectories of more than 1,000 publicly listed firms over 20 years. We find substantial changes in several firms’ positioning strategies, including Apple, Walmart, and Capital One. Because EvoMap accommodates a wide range of mapping methods, analysts can easily apply it in other empirical settings and to data from various sources.
Regulators worldwide have been implementing different privacy laws. They vary in their impact on the value for advertisers, publishers and users, but not much is known about these differences. This article focuses on three important privacy laws (i.e., General Data Protection Regulation [GDPR], California Consumer Privacy Act [CCPA] and Personal Information Protection Law [PIPL]) and compares their impact on the value for the three primary actors of the online advertising market, namely, advertisers, publishers and users. This article first compares these three privacy laws by developing a legal strictness score. It then uses the existing literature to derive the effects of the legal strictness of each privacy law on each actor’s value. Finally, it quantifies the three privacy laws’ impact on each actor’s value. The results show that GDPR and PIPL are similar and stricter than CCPA. Stricter privacy laws bring larger negative changes to the value for actors. As a result, both GDPR and PIPL decrease the actors’ value more substantially than CCPA. These value declines are the largest for publishers and are rather similar for users and advertisers. Scholars and practitioners can use our findings to explore ways to create value for multiple actors under various privacy laws.
Sample-based longitudinal discrete choice experiments: preferences for electric vehicles over time
(2021)
Discrete choice experiments have emerged as the state-of-the-art method for measuring preferences, but they are mostly used in cross-sectional studies. In seeking to make them applicable for longitudinal studies, our study addresses two common challenges: working with different respondents and handling altering attributes. We propose a sample-based longitudinal discrete choice experiment in combination with a covariate-extended hierarchical Bayes logit estimator that allows one to test the statistical significance of changes. We showcase this method’s use in studies about preferences for electric vehicles over six years and empirically observe that preferences develop in an unpredictable, non-monotonous way. We also find that inspecting only the absolute differences in preferences between samples may result in misleading inferences. Moreover, surveying a new sample produced similar results as asking the same sample of respondents over time. Finally, we experimentally test how adding or removing an attribute affects preferences for the other attributes.
Crowdfunding platforms offer project initiators the opportunity to acquire funds from the Internet crowd and, therefore, have become a valuable alternative to traditional sources of funding. However, some processes on crowdfunding platforms cause undesirable external effects that influence the funding success of projects. In this context, we focus on the phenomenon of project overfunding. Massively overfunded projects have been argued to overshadow other crowdfunding projects, which in turn receive less funding. We propose a funding redistribution mechanism to internalize these overfunding externalities and to improve overall funding results. To evaluate this concept, we develop and deploy an agent-based model (ABM). This ABM is based on a multi-attribute decision-making approach and is suitable for simulating the dynamic funding processes on a crowdfunding platform. Our evaluation provides evidence that such modifications of the crowdfunding mechanism offer the chance to improve funding results and to alleviate existing flaws.
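The redistribution idea lends itself to a compact sketch. The following Python toy is not the paper's agent-based model; the funding goals, pledges, and the proportional transfer rule are invented purely to illustrate how overfunding surplus could be rerouted to underfunded projects:

```python
import numpy as np

def redistribute_overfunding(pledges, goals, share=0.5):
    """Move a share of each project's overfunding to the still-underfunded
    projects, proportional to their remaining funding gaps.
    Illustrative mechanism only, not the paper's exact design."""
    surplus = np.clip(pledges - goals, 0, None) * share
    gaps = np.clip(goals - pledges, 0, None)
    if gaps.sum() == 0 or surplus.sum() == 0:
        return pledges
    transfer = surplus.sum() * gaps / gaps.sum()
    return pledges - surplus + transfer

# hypothetical platform state: project 0 is massively overfunded
goals = np.array([100.0, 100.0, 100.0])
pledges = np.array([250.0, 60.0, 40.0])
print(redistribute_overfunding(pledges, goals))  # total pledges are conserved
```

Because the mechanism only reallocates pledges, the platform-wide funding total is unchanged; in the toy example the two underfunded projects move much closer to their goals.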
We analyze the extent to which individual audit partners influence the audited narrative disclosures in their clients’ financial reports. Using a sample of 3,281,423 private and public client firm-pairs, we find that the similarity among audited narrative disclosures is higher when two client firms share the same audit partner. Specifically, we find that the wording similarity of management reports (notes) increases by 30 (48) percent, the content similarity by 29 (49) percent, and the structure similarity by 48 (121) percent. Moreover, we find that audit partners in particular are relevant for their clients’ narrative disclosures because the increase in narrative disclosure similarity when sharing the same audit partner is nine (four) times greater than when sharing the same audit firm (audit office). We show that this influence of audit partners goes beyond adding boilerplate statements and, using novel field evidence, we shed light on the underlying mechanisms. Our findings are economically relevant because a stronger involvement of audit partners with their clients’ narratives is associated with a higher quality of narrative disclosures, which helps users better predict the future profitability of client firms.
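The paper's similarity measures are not specified in the abstract, but a common building block for comparing the wording of two disclosures is the cosine similarity of term-frequency vectors. A minimal sketch (illustrative only; the report snippets are invented and the authors' actual measures may differ):

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Cosine similarity of term-frequency vectors, a common proxy
    for the wording similarity of two narrative disclosures."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# invented management-report snippets differing in a single word
report_1 = "the company expects stable revenue growth in the coming year"
report_2 = "the company expects moderate revenue growth in the coming year"
print(round(cosine_similarity(report_1, report_2), 3))
```

A convenient design choice here is that `Counter` returns 0 for absent terms, so the dot product only needs to iterate over one document's vocabulary.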
Consider two independent random walks. By chance, there will be spells of association between them where the two processes move in the same direction, or in opposite directions. We compute the probabilities of the length of the longest spell of such random association for a given sample size, and discuss measures like mean and mode of the exact distributions. We observe that long spells (relative to small sample sizes) of random association occur frequently, which explains why nonsense correlation between short independent random walks is the rule rather than the exception. The exact figures are compared with approximations. Our finite sample analysis as well as the approximations rely on two older results popularized by Révész (Statistical Papers 31:95–101, 1990). Moreover, we consider spells of association between correlated random walks. Approximate probabilities are compared with finite sample Monte Carlo results.
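The phenomenon is easy to reproduce by simulation. A small Monte Carlo sketch (assuming simple ±1 random walks; the sample size and replication count are arbitrary choices, not the paper's) shows that long spells of same-direction movement between two independent walks are common:

```python
import numpy as np

rng = np.random.default_rng(0)

def longest_association_spell(steps_a, steps_b):
    """Length of the longest run of steps in which the two walks
    move in the same direction."""
    same = np.sign(steps_a) == np.sign(steps_b)
    longest = run = 0
    for s in same:
        run = run + 1 if s else 0
        longest = max(longest, run)
    return longest

# Monte Carlo over short independent +/-1 walks
n, reps = 50, 2000
spells = [
    longest_association_spell(rng.choice([-1, 1], size=n),
                              rng.choice([-1, 1], size=n))
    for _ in range(reps)
]
print(np.mean(spells))  # typically around 5 for n = 50
```

Even at fifty steps, the average longest spell covers roughly a tenth of the sample, which is exactly the raw material of nonsense correlation.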
The mobile games business is an ever-increasing sub-sector of the entertainment industry. Due to its high profitability but also high risk and competitive atmosphere, game publishers need to develop strategies that allow them to release new products at a high rate, but without compromising the already short lifespan of the firms' existing games. Successful game publishers must enlarge their user base by continually releasing new and entertaining games, while simultaneously motivating the current user base of existing games to remain active for more extended periods. Since the core-component reuse strategy has proven successful in other software products, this study investigates the advantages and drawbacks of this strategy in mobile games. Drawing on the widely accepted Product Life Cycle concept, the study investigates whether the introduction of a new mobile game built with core-components of an existing mobile game curtails the incumbent's product life cycle. Based on real and granular data on the gaming activity of a popular mobile game, the authors find that by promoting multi-homing (i.e., by smartly interlinking the incumbent and new product with each other so that users start consuming both games in parallel), the core-component reuse strategy can prolong the lifespan of the incumbent game.
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates, and term premia is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
Contemporary information systems make widespread use of artificial intelligence (AI). While AI offers various benefits, it can also be subject to systematic errors, whereby people from certain groups (defined by gender, age, or other sensitive attributes) experience disparate outcomes. In many AI applications, disparate outcomes confront businesses and organizations with legal and reputational risks. To address these, technologies for so-called “AI fairness” have been developed, by which AI is adapted such that mathematical constraints for fairness are fulfilled. However, the financial costs of AI fairness are unclear. Therefore, the authors develop AI fairness for a real-world use case from e-commerce, where coupons are allocated according to clickstream sessions. In their setting, the authors find that AI fairness successfully manages to adhere to fairness requirements, while reducing the overall prediction performance only slightly. However, they find that AI fairness also results in an increase in financial cost. In this way, the paper’s findings contribute to designing information systems on the basis of AI fairness.
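One common mathematical fairness constraint of the kind referred to here is demographic parity, which requires similar positive-decision rates across groups. The abstract does not say which constraint the paper adopts, so the following Python sketch, with invented coupon decisions and group labels, is only an illustration of how such a constraint is measured:

```python
import numpy as np

def demographic_parity_gap(predictions, group):
    """Absolute difference in positive-decision rates (e.g. coupon
    allocations) between two groups: one common fairness criterion."""
    rate_a = predictions[group == 0].mean()
    rate_b = predictions[group == 1].mean()
    return abs(rate_a - rate_b)

# toy data: binary coupon decisions for eight sessions from two user groups
preds = np.array([1, 0, 1, 1, 0, 1, 0, 0])
group = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(demographic_parity_gap(preds, group))  # 0.5: group 0 receives coupons far more often
```

Enforcing a small gap typically means changing some otherwise profit-maximizing decisions, which is the source of the financial cost the paper quantifies.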
This study simulates three income tax scenarios in a Mirrleesian setting for 24 EU countries using data from the 2014 Structure of Earnings Survey. In scenario 1, each country individually maximizes its own welfare (benchmark). In scenarios 2 and 3, total welfare in the EU is maximized over a common budget constraint. Unlike scenario 2, the social planner of scenario 3 differentiates taxes by country of residence. If a common tax and transfer system were implemented in the EU, countries with a relatively higher mean wage rate—particularly those in Western and some of the Northern European countries—would transfer resources to the others. Scenario 2 implies increased labor distortions for almost all countries and, hence, leads to a contraction in total output. Scenario 3 produces higher (lower) marginal taxes for high- (low-) mean countries compared to the benchmark. The change in total output depends on the income effects on labor supply. Overall, total welfare is higher for the scenarios involving a European tax and transfer system despite more than two thirds of all the agents becoming worse off relative to the benchmark. A politically more feasible integrated tax system improves the well-being of almost half of all agents in the EU but considerably reduces the aggregate welfare benefits.
This paper uses historical monthly temperature level data for a panel of 114 countries to identify the effects of within year temperature level variability on productivity growth in five different macro regions, i.e., (1) Africa, (2) Asia, (3) Europe, (4) North America and (5) South America. We find two primary results. First, higher intra-annual temperature variability reduces (increases) productivity in Europe and North America (Asia). Second, higher intra-annual temperature variability has no significant effects on productivity in Africa and South America. Additional empirical tests also indicate the following: (1) rising intra-annual temperature variability reduces productivity (even though less significantly) in both tropical and non-tropical regions, (2) inter-annual temperature variability reduces (increases) productivity in North America (Europe) and (3) winter and summer inter-annual temperature variability generates a drop in productivity in both Europe and North America. Taken together, these findings indicate that temperature variability shocks tend to have stronger adverse economic effects among richer economies. In a production economy featuring long-run productivity and temperature volatility shocks, we quantify these negative impacts and find welfare losses of 2.9% (1%) in Europe (North America).
Solving High-Dimensional Dynamic Portfolio Choice Models with Hierarchical B-Splines on Sparse Grids
(2021)
Discrete time dynamic programming to solve dynamic portfolio choice models has three immanent issues: firstly, the curse of dimensionality prohibits more than a handful of continuous states. Secondly, in higher dimensions, even regular sparse grid discretizations need too many grid points for sufficiently accurate approximations of the value function. Thirdly, the models usually require continuous control variables, and hence gradient-based optimization with smooth approximations of the value function is necessary to obtain accurate solutions to the optimization problem. For the first time, we enable accurate and fast numerical solutions with gradient-based optimization while still allowing for spatial adaptivity using hierarchical B-splines on sparse grids. When compared to the standard linear bases on sparse grids or finite difference approximations of the gradient, our approach saves an order of magnitude in total computational complexity for a representative dynamic portfolio choice model with varying state space dimensionality, stochastic sample space, and choice variables.
Strict environmental regulation may deter foreign direct investment (FDI). The paper develops the hypothesis that regulation predominantly discourages FDI that is conducted as Greenfield investment rather than mergers and acquisitions (M&A). The hypothesis is tested with German firm-level FDI data. Empirically, stricter regulation reduces new Greenfield projects in polluting industries, but has a much smaller impact on the number of M&As. This significant difference is compatible with the fact that existing operations often benefit from grandfathering rules, which provide softer regulation for pre-existing plants, and with the expectation that for M&As part of the regulation is capitalized in the purchase price. The heterogeneous effects help explain mixed results in previous studies that have neglected the mode of entry.
Device-to-device (D2D) communication is an innovative solution for improving wireless network performance to efficiently handle the ever-increasing mobile data traffic. Communication takes place directly between two devices that are in each other’s transmission range. So far, research has focused on the technical challenges of implementing this technology and assumes a user’s general willingness to participate as forwarder in this technology. However, this simplifying assumption is not realistic, as willingness to participate in D2D communication can vary depending on the user. In this work, we consider the scenario that a user can act as a forwarder for a receiver who is not directly or insufficiently reached by the base station and accordingly has no or poor Internet connection. We take a user-centric approach and investigate the willingness to provide an Internet connection as a forwarder. We are the first to investigate user preferences for D2D communication using a choice-based conjoint analysis. Our results, based on a representative sample of potential users (N=181), show that the social relationship between the potential forwarder and the receiver has the greatest impact on the potential forwarder’s decision to provide an Internet connection to the receiver, accepting sacrifices in terms of additional battery consumption and reduced own service performance. In a detailed segment analysis, we observe significant preference differences depending on smartphone usage behavior and user age. Taking the corresponding preferences into account when matching forwarders and receivers can further increase technology adoption.
We analyze the joint dynamics of prices, productivity, and employment across firms, building a dynamic equilibrium model of heterogeneous firms who compete for workers and customers in frictional labor and product markets. Using panel data on prices and output for German manufacturing firms, the model is calibrated to evaluate the quantitative contributions of productivity and demand for the labor market. Product market frictions decisively dampen the firms' employment adjustments to productivity shocks. We further analyze the impact of aggregate shocks to the first and second moments of productivity and demand and relate them to business-cycle features in our data.
Vehicle registrations have been shown to strongly react to tax reforms aimed at reducing CO2 emissions from passenger cars, but are the effects equally strong for positive and negative tax changes? The literature on asymmetric reactions to price and tax changes has documented asymmetries for everyday goods but has not yet considered durables. We leverage multiple vehicle registration tax (VRT) reforms in Norway and estimate their impact on within-car-model substitutions. We estimate stronger effects for cars receiving tax cuts and rebates than for those affected by tax increases. The corresponding estimated elasticity is −1.99 for VRT decreases and 0.77 for increases. As consumers may also substitute across car models, our estimates represent a lower bound.
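The reported magnitudes can be illustrated with a simple arc-elasticity calculation. The registration and tax figures below are invented; they are chosen only so that the decrease-side elasticity matches the reported −1.99, and sign conventions for the increase side may differ from the paper's:

```python
def tax_elasticity(q0, q1, t0, t1):
    """Arc elasticity of registrations with respect to the vehicle
    registration tax: percent change in registrations divided by
    percent change in the tax."""
    return ((q1 - q0) / q0) / ((t1 - t0) / t0)

# invented figures: a 10% VRT cut that raises registrations of the
# affected car models by 19.9% implies the reported magnitude
print(tax_elasticity(100.0, 119.9, 100.0, 90.0))  # close to -1.99
```

An elasticity of −1.99 means registrations of the affected models respond roughly twice as strongly, in percentage terms, as the tax change itself.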
This study explores the implications of rising markups for optimal Mirrleesian income and profit taxation. Using a stylized model with two individuals, the main forces shaping welfare-optimal policies are analytically characterized. Although a higher profit tax has redistributive benefits, it adversely affects market competition, leading to a greater equilibrium cost-of-living. Rising markups directly contribute to a decline in optimal marginal taxes on labor income. The optimal policy response to higher markups includes increasingly relying on the profit tax to fund redistribution. Declining optimal marginal income taxes assists the redistributive function of the profit tax by contributing to the expansion of the profit tax base. This response alone considerably increases the equilibrium cost-of-living. Nevertheless, a majority of the individuals become better off with the optimal policy. If it is not possible to tax profits optimally, due, for example, to profit shifting, increasing redistribution via income taxes is not optimal; every individual is worse off relative to the scenario with optimal profit taxation.