Renewed interest in fiscal policy has increased the use of quantitative models to evaluate policy. Because of modeling uncertainty, it is essential that policy evaluations be robust to alternative assumptions. We find that models currently being used in practice to evaluate fiscal policy stimulus proposals are not robust. Government spending multipliers in an alternative empirically estimated and widely cited new Keynesian model are much smaller than in these old Keynesian models; the estimated stimulus is extremely small, with GDP and employment effects only one-sixth as large.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact, focusing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
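For a sense of the magnitudes involved, here is a minimal sketch of the consolidation path described above: a linear glide of the federal outlays-to-GDP ratio from its post-crisis level (about 24 percent) back to its pre-crisis level (about 19.5 percent). The ten-year horizon is an assumption for illustration only, not the paper's calibration.

```python
# Hypothetical linear glide path for the outlays-to-GDP ratio; the horizon
# (10 years) is an illustrative assumption, not the paper's calibration.
def spending_path(start=0.240, target=0.195, years=10):
    step = (start - target) / years
    return [round(start - step * t, 4) for t in range(years + 1)]

print(spending_path())  # [0.24, 0.2355, 0.231, ..., 0.195]
```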
Recently, we evaluated a fiscal consolidation strategy for the United States that would bring the government budget into balance by gradually reducing government spending relative to GDP to the ratio that prevailed prior to the crisis (Cogan et al., JEDC 2013). Specifically, we published an analysis of the macroeconomic consequences of the 2013 Budget Resolution that was passed by the U.S. House of Representatives in March 2012. In this note, we provide an update of our research that evaluates this year's budget reform proposal that is to be discussed and voted on in the House of Representatives in March 2013. Contrary to the views voiced by critics of fiscal consolidation, we show that such a reduction in government purchases and transfer payments can increase GDP immediately and permanently relative to a policy without spending restraint. Our research makes use of a modern structural model of the economy that incorporates the long-standing essential features of economics: opportunity costs, efficiency, foresight and incentives. GDP rises because households take into account that spending restraint helps avoid future increases in tax rates. Lower taxes imply less distorted incentives for work, investment and production relative to a scenario without fiscal consolidation and lead to higher growth.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact. We consider two types of dynamic stochastic general equilibrium models: a neoclassical growth model and more complicated models with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the initial model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run.
White matter abnormalities across different epilepsy syndromes in adults: an ENIGMA Epilepsy study
(2019)
The epilepsies are commonly accompanied by widespread abnormalities in cerebral white matter. ENIGMA-Epilepsy is a large quantitative brain imaging consortium, aggregating data to investigate patterns of neuroimaging abnormalities in common epilepsy syndromes, including temporal lobe epilepsy, extratemporal epilepsy, and genetic generalized epilepsy. Our goal was to rank the most robust white matter microstructural differences across and within syndromes in a multicentre sample of adult epilepsy patients. Diffusion-weighted MRI data were analyzed from 1,069 non-epileptic controls and 1,249 patients: temporal lobe epilepsy with hippocampal sclerosis (N=599), temporal lobe epilepsy with normal MRI (N=275), genetic generalized epilepsy (N=182) and nonlesional extratemporal epilepsy (N=193). A harmonized protocol using tract-based spatial statistics was used to derive skeletonized maps of fractional anisotropy and mean diffusivity for each participant, and fiber tracts were segmented using a diffusion MRI atlas. Data were harmonized to correct for scanner-specific variations in diffusion measures using a batch-effect correction tool (ComBat). Analyses of covariance, adjusting for age and sex, examined differences between each epilepsy syndrome and controls for each white matter tract (Bonferroni corrected at p<0.001). Across “all epilepsies” lower fractional anisotropy was observed in most fiber tracts with small to medium effect sizes, especially in the corpus callosum, cingulum and external capsule. Less robust effects were seen with mean diffusivity. Syndrome-specific fractional anisotropy and mean diffusivity differences were most pronounced in patients with hippocampal sclerosis in the ipsilateral parahippocampal cingulum and external capsule, with smaller effects across most other tracts. Those with temporal lobe epilepsy and normal MRI showed a similar pattern of greater ipsilateral than contralateral abnormalities, but less marked than those in patients with hippocampal sclerosis. Patients with generalized and extratemporal epilepsies had pronounced differences in fractional anisotropy in the corpus callosum, corona radiata and external capsule, and in mean diffusivity of the anterior corona radiata. Earlier age of seizure onset and longer disease duration were associated with a greater extent of microstructural abnormalities in patients with hippocampal sclerosis. We demonstrate microstructural abnormalities across major association, commissural, and projection fibers in a large multicentre study of epilepsy. Overall, epilepsy patients showed white matter abnormalities in the corpus callosum, cingulum and external capsule, with differing severity across epilepsy syndromes. These data further define the spectrum of white matter abnormalities in common epilepsy syndromes, yielding new insights into pathological substrates that may be used to guide future therapeutic and genetic studies.
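The per-tract statistics described above follow a standard pattern: an analysis of covariance for each tract, adjusting for age and sex, with a Bonferroni correction across tracts. The sketch below shows a minimal Python version of that step, assuming a tidy data frame with hypothetical column names; it omits the TBSS skeletonization and ComBat harmonization stages of the actual pipeline.

```python
# Minimal sketch of a per-tract ANCOVA (group + age + sex) with Bonferroni
# correction; column names (group, age, sex, per-tract FA columns) are
# hypothetical placeholders, not the consortium's actual variable names.
import pandas as pd
import statsmodels.formula.api as smf

def compare_tracts(df: pd.DataFrame, tracts: list, alpha: float = 0.05):
    """df: one row per participant, with 'group' ('control' or a syndrome
    label), 'age', 'sex', and one fractional-anisotropy column per tract."""
    threshold = alpha / len(tracts)  # Bonferroni-corrected significance level
    results = {}
    for tract in tracts:
        fit = smf.ols(f"{tract} ~ C(group, Treatment('control')) + age + sex",
                      data=df).fit()
        # keep each syndrome-vs-control p-value and whether it survives
        pvals = {name: p for name, p in fit.pvalues.items() if "group" in name}
        results[tract] = {k: (p, p < threshold) for k, p in pvals.items()}
    return results
```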
Investigators in the cognitive neurosciences have turned to Big Data to address persistent replication and reliability issues by increasing sample sizes, statistical power, and representativeness of data. While there is tremendous potential to advance science through open data sharing, these efforts unveil a host of new questions about how to integrate data arising from distinct sources and instruments. We focus on the most frequently assessed area of cognition, memory testing, and demonstrate a process for reliable data harmonization across three common measures. We aggregated raw data from 53 studies from around the world which administered at least one of three distinct verbal learning tasks, totaling N = 10,505 healthy and brain-injured individuals. A mega-analysis was conducted using empirical Bayes harmonization to isolate and remove site effects, followed by linear models which adjusted for common covariates. After corrections, a continuous item response theory (IRT) model estimated each individual subject's latent verbal learning ability while accounting for item difficulties. Harmonization significantly reduced inter-site variance by 37% while preserving covariate effects. The effects of age, sex, and education on scores were found to be highly consistent across memory tests. IRT methods for equating scores across the auditory verbal learning tests (AVLTs) agreed with held-out data from dually administered tests, and these tools are made available for free online. This work demonstrates that large-scale data sharing and harmonization initiatives can offer opportunities to address reproducibility and integration challenges across the behavioral sciences.
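As a toy illustration of the IRT step, the sketch below estimates a single subject's latent ability under a Rasch (one-parameter logistic) model with known item difficulties, using a maximum a posteriori estimate with a standard-normal prior. The study used a more elaborate continuous IRT model, so this is only a simplified analogue.

```python
# Simplified stand-in for the IRT scoring step: MAP estimate of latent
# ability under a Rasch model with fixed, known item difficulties.
import numpy as np
from scipy.optimize import minimize_scalar

def rasch_ability(responses: np.ndarray, difficulties: np.ndarray) -> float:
    """responses: 0/1 item scores; difficulties: known item parameters."""
    def neg_log_posterior(theta: float) -> float:
        p = 1.0 / (1.0 + np.exp(-(theta - difficulties)))  # P(correct | theta)
        loglik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
        return -(loglik - 0.5 * theta ** 2)                # N(0, 1) prior
    return minimize_scalar(neg_log_posterior, bounds=(-6, 6), method="bounded").x
```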
The nucleosynthesis of elements beyond iron is dominated by neutron captures in the s and r processes. However, 32 stable, proton-rich isotopes cannot be formed during those processes, because they are shielded from the s-process flow and r-process β-decay chains. These nuclei are attributed to the p and rp processes.
For all of these processes, current research in nuclear astrophysics addresses the need for more precise reaction data involving radioactive isotopes. Depending on the particular reaction, direct or inverse kinematics and the forward or time-reversed reaction direction are investigated to determine, or at least constrain, the desired reaction cross sections.
The Facility for Antiproton and Ion Research (FAIR) will offer unique, unprecedented opportunities to investigate many of the important reactions. The high yield of radioactive isotopes, even far away from the valley of stability, allows the investigation of isotopes involved in processes as exotic as the r or rp processes.
Diagnosing and treating acute severe and recurrent antivenom-related anaphylaxis (ARA) is challenging and reported experience is limited. Herein, we describe our experience of severe ARA in patients with neurotoxic snakebite envenoming in Nepal. Patients were enrolled in a randomised, double-blind trial of high vs. low dose antivenom, given by intravenous (IV) push, followed by infusion. Training in ARA management emphasised stopping antivenom and giving intramuscular (IM) adrenaline, IV hydrocortisone, and IV chlorphenamine at the first sign(s) of ARA. Later, IV adrenaline infusion (IVAI) was introduced for patients with antecedent ARA requiring additional antivenom infusions. Preantivenom subcutaneous adrenaline (SCAd) was introduced in the second study year (2012). Of 155 envenomed patients who received ≥ 1 antivenom dose, 13 (8.4%), comprising three children (aged 5–11 years) and 10 adults (18–52 years), developed clinical features consistent with severe ARA, including six with overlapping signs of severe envenoming. Four and nine patients received low and high dose antivenom, respectively, and six had received SCAd. Principal signs of severe ARA were dyspnoea alone (n=5 patients), dyspnoea with wheezing (n=3), hypotension (n=3), shock (n=3), restlessness (n=3), respiratory/cardiorespiratory arrest (n=7), and early (n=1) and late laryngeal oedema (n=1); rash was associated with severe ARA in 10 patients. Four patients were given IVAI. Of the 8 (5.1%) deaths, three occurred in transit to hospital. Severe ARA was common and recurrent and had overlapping signs with severe neurotoxic envenoming. Optimising the management of ARA at different health system levels needs more research. This trial is registered under NCT01284855.
Ribosome biogenesis in eukaryotes requires the participation of a large number of ribosome assembly factors. The highly conserved eukaryotic nucleolar protein Nep1 has an essential but unknown function in 18S rRNA processing and ribosome biogenesis. In Saccharomyces cerevisiae the malfunction of a temperature-sensitive Nep1 protein (nep1-1ts) was suppressed by the addition of S-adenosylmethionine (SAM). This suggests the participation of Nep1 in a methyltransferase reaction during ribosome biogenesis. In addition, yeast Nep1 binds to a 6-nt RNA-binding motif also found in 18S rRNA and facilitates the incorporation of ribosomal protein Rps19 during the formation of pre-ribosomes. Here, we present the X-ray structure of the Nep1 homolog from the archaebacterium Methanocaldococcus jannaschii in its free form (2.2 Å resolution) and bound to the S-adenosylmethionine analog S-adenosylhomocysteine (SAH, 2.15 Å resolution) and the antibiotic and general methyltransferase inhibitor sinefungin (2.25 Å resolution). The structure reveals a fold which is very similar to the conserved core fold of the SPOUT-class methyltransferases but contains a novel extension of this common core fold. SAH and sinefungin bind to Nep1 at a preformed binding site that is topologically equivalent to the cofactor-binding site in other SPOUT-class methyltransferases. Therefore, our structures together with previous genetic data suggest that Nep1 is a genuine rRNA methyltransferase.
In this paper we investigate the comparative properties of empirically estimated monetary models of the U.S. economy. We make use of a new database of models designed for such investigations. We focus on three representative models: the Christiano, Eichenbaum, and Evans (2005) model, the Smets and Wouters (2007) model, and the Taylor (1993a) model. Although the three models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, the optimal monetary policy responses to other sources of economic fluctuations differ widely across the models. We show that simple optimal policy rules that respond to the growth rate of output and smooth the interest rate are not robust. In contrast, policy rules with no interest rate smoothing and no response to the growth rate, as distinct from the level, of output are more robust. Robustness can be improved further by optimizing rules with respect to the average loss across the three models.
In this paper we investigate the comparative properties of empirically estimated monetary models of the U.S. economy using a new database of models designed for such investigations. We focus on three representative models due to Christiano, Eichenbaum, and Evans (2005), Smets and Wouters (2007), and Taylor (1993a). Although these models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, optimized monetary policy rules differ across models and lack robustness. Model averaging offers an effective strategy for improving the robustness of policy rules.
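The model-averaging idea lends itself to a small worked example: choose rule coefficients that minimize the average loss across several candidate models. In the sketch below, the three backward-looking two-equation economies are hypothetical stand-ins for the estimated models, and the loss is the sum of inflation and output-gap variances; none of the parameter values come from the papers above.

```python
# Toy illustration of choosing a policy rule by average loss across models.
# The "models" and all parameter values are hypothetical, not the papers'.
import numpy as np
from scipy.optimize import minimize

def simulate_loss(phi_pi, phi_x, kappa, beta_x, T=2000, seed=0):
    """var(inflation) + var(output gap) under i_t = phi_pi*pi_t + phi_x*x_t."""
    rng = np.random.default_rng(seed)  # common random numbers across calls
    pi = x = 0.0
    pis, xs = [], []
    for _ in range(T):
        i = phi_pi * pi + phi_x * x                            # policy rule
        x = 0.8 * x - beta_x * (i - pi) + rng.normal(0, 0.5)   # IS curve
        pi = 0.8 * pi + kappa * x + rng.normal(0, 0.5)         # Phillips curve
        if abs(x) > 1e6 or abs(pi) > 1e6:                      # penalize instability
            return 1e12
        pis.append(pi)
        xs.append(x)
    return np.var(pis) + np.var(xs)

models = [(0.10, 0.50), (0.30, 0.80), (0.05, 0.30)]  # hypothetical (kappa, beta_x)
avg_loss = lambda p: np.mean([simulate_loss(p[0], p[1], k, b) for k, b in models])
best = minimize(avg_loss, x0=[1.5, 0.5], method="Nelder-Mead")
print("rule coefficients minimizing the average loss:", best.x)
```

Because each model's loss is evaluated under common random numbers, the average-loss objective is deterministic and can be handed directly to a derivative-free optimizer.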
Recently there has been an explosion of research on whether the equilibrium real interest rate has declined, an issue with significant implications for monetary policy. A common finding is that the rate has declined. In this paper we provide evidence that contradicts this finding. We show that the perceived decline may well be due to shifts in regulatory policy and monetary policy that have been omitted from the research. In developing the monetary policy implications, it is promising that much of the research approaches the policy problem through the framework of monetary policy rules, as uncertainty in the equilibrium real rate is not a reason to abandon rules in favor of discretion. But the results are still inconclusive and too uncertain to incorporate into policy rules in the ways that have been suggested.