Mathematical models of virus dynamics have not previously acknowledged spatial resolution at the intracellular level, despite substantial arguments favoring the consideration of intracellular spatial dependence. The replication of hepatitis C virus (HCV) viral RNA (vRNA) occurs within special replication complexes formed from membranes derived from the endoplasmic reticulum (ER). These regions, termed membranous webs, are generated primarily through specific interactions between nonstructural virus-encoded proteins (NSPs) and host cellular factors. The NSPs are responsible for the replication of the vRNA, and their movement is restricted to the ER surface. Therefore, in this study we developed fully spatio-temporally resolved models of the vRNA replication cycle of HCV. Our simulations are performed upon realistic reconstructed cell structures—namely the ER surface and the membranous webs—based on data derived from immunostained cells replicating HCV vRNA. We visualized 3D simulations that reproduced dynamics resulting from the interplay of the different components of our models (vRNA, NSPs, and a host factor), and we present an evaluation of the concentrations of the components within different regions of the cell. Thus far, our model is restricted to an internal portion of a hepatocyte and is qualitative rather than quantitative. For a quantitative adaptation to complete cells, various additional parameters will have to be determined through further in vitro cell biology experiments, which can be stimulated by the results described in the present study.
Variants resistant to compounds specifically targeting HCV are observed in clinical trials. A multi-variant viral dynamic model was developed to quantify the evolution and in vivo fitness of variants in subjects dosed with monotherapy of an HCV protease inhibitor, telaprevir. Variant fitness was estimated using a model in which variants were selected by competition for shared limited replication space. Fitness was represented in the absence of telaprevir by different variant production rate constants and in the presence of telaprevir by additional antiviral blockage by telaprevir. Model parameters, including rate constants for viral production, clearance, and effective telaprevir concentration, were estimated from 1) plasma HCV RNA levels of subjects before, during, and after dosing, 2) post-dosing prevalence of plasma variants from subjects, and 3) sensitivity of variants to telaprevir in the HCV replicon. The model provided a good fit to plasma HCV RNA levels observed both during and after telaprevir dosing, as well as to variant prevalence observed after telaprevir dosing. After an initial sharp decline in HCV RNA levels during dosing with telaprevir, HCV RNA levels increased in some subjects. The model predicted this increase to be caused by pre-existing variants with sufficient fitness to expand once available replication space increased due to rapid clearance of wild-type (WT) virus. The average replicative fitness estimates in the absence of telaprevir ranged from 1% to 68% of WT fitness. Compared to the relative fitness method, the in vivo estimates from the viral dynamic model corresponded more closely to in vitro replicon data, as well as to qualitative behaviors observed in both on-dosing and long-term post-dosing clinical data. The modeling fitness estimates were robust in sensitivity analyses in which the restoration dynamics of replication space and assumptions of HCV mutation rates were varied.
Background: The development of robotic systems has provided an alternative to frame-based stereotactic procedures. The aim of this experimental phantom study was to compare the mechanical accuracy of the Robotic Surgery Assistant (ROSA) and the Leksell stereotactic frame by reducing clinical and procedural factors to a minimum.
Methods: To precisely compare mechanical accuracy, a stereotactic system was chosen as reference for both methods. A thin layer CT scan with an acrylic phantom fixed to the frame and a localizer enabling the software to recognize the coordinate system was performed. For each of the five phantom targets, two different trajectories were planned, resulting in 10 trajectories. A series of five repetitions was performed, each time based on a new CT scan. Hence, 50 trajectories were analyzed for each method. X-rays of the final cannula position were fused with the planning data. The coordinates of the target point and the endpoint of the robot- or frame-guided probe were visually determined using the robotic software. The target point error (TPE) was calculated applying the Euclidian distance. The depth deviation along the trajectory and the lateral deviation were separately calculated.
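The error metrics defined above (Euclidean TPE, signed depth deviation along the trajectory, and lateral deviation) can be sketched as follows; the function name and coordinate convention are illustrative assumptions, not part of the robotic or planning software:

```python
import numpy as np

def probe_deviations(target, tip, entry):
    """Decompose the probe tip error into TPE, depth, and lateral components.

    target: planned target point, tip: actual cannula endpoint, entry: planned
    entry point (all 3D coordinates in mm). Illustrative only; names and the
    coordinate convention are assumptions, not taken from the study software.
    """
    target, tip, entry = map(np.asarray, (target, tip, entry))
    u = (target - entry) / np.linalg.norm(target - entry)  # unit vector along planned trajectory
    err = tip - target
    tpe = float(np.linalg.norm(err))                   # target point error: Euclidean distance
    depth = float(np.dot(err, u))                      # signed deviation along the trajectory
    lateral = float(np.linalg.norm(err - depth * u))   # perpendicular (lateral) deviation
    return tpe, depth, lateral
```

By construction, TPE² = depth² + lateral², so the two reported deviation components fully account for the overall target point error.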
Results: Robotics was significantly more accurate, with an arithmetic TPE mean of 0.53 mm (95% CI 0.41–0.55 mm) compared to 0.72 mm (95% CI 0.63–0.8 mm) in stereotaxy (p < 0.05). In robotics, the mean depth deviation along the trajectory was −0.22 mm (95% CI −0.25 to −0.14 mm). The mean lateral deviation was 0.43 mm (95% CI 0.32–0.49 mm). In frame-based stereotaxy, the mean depth deviation amounted to −0.20 mm (95% CI −0.26 to −0.14 mm), the mean lateral deviation to 0.65 mm (95% CI 0.55–0.74 mm).
Conclusion: Both the robotic and frame-based approach proved accurate. The robotic procedure showed significantly higher accuracy. For both methods, procedural factors occurring during surgery might have a more relevant impact on overall accuracy.
Background: Acoustic Radiation Force Impulse (ARFI)-imaging is an ultrasound-based elastography method enabling quantitative measurement of tissue stiffness. The aim of the present study was to evaluate sensitivity and specificity of ARFI-imaging for differentiation of thyroid nodules and to compare it to the well evaluated qualitative real-time elastography (RTE).
Methods: ARFI-imaging involves the mechanical excitation of tissue using acoustic pulses to generate localized displacements resulting in shear-wave propagation, which is tracked using correlation-based methods and recorded in m/s. Inclusion criteria were: nodules ≥ 5 mm, and cytological/histological assessment. All patients received conventional ultrasound, real-time elastography (RTE) and ARFI-imaging.
Results: One-hundred-fifty-eight nodules in 138 patients were available for analysis. One-hundred-thirty-seven nodules were benign on cytology/histology, and twenty-one nodules were malignant. The median velocity of ARFI-imaging in the healthy thyroid tissue, as well as in benign and malignant thyroid nodules was 1.76 m/s, 1.90 m/s, and 2.69 m/s, respectively. While no significant difference in median velocity was found between healthy thyroid tissue and benign thyroid nodules, a significant difference was found between malignant thyroid nodules on the one hand and healthy thyroid tissue (p = 0.0019) or benign thyroid nodules (p = 0.0039) on the other hand. No significant difference of diagnostic accuracy for the diagnosis of malignant thyroid nodules was found between RTE and ARFI-imaging (0.74 vs. 0.69, p = 0.54). The combination of RTE with ARFI did not improve diagnostic accuracy.
Conclusions: ARFI can be used as an additional tool in the diagnostic work up of thyroid nodules with high negative predictive value and comparable results to RTE.
Acute cholecystitis – a cohort study in a real-world clinical setting (REWO study, NCT02796443)
(2018)
Background: For decades, the optimal timing of surgery for acute cholecystitis has been controversial. Recent meta-analyses and population-based studies favor early surgery. One recent large randomized trial has demonstrated that a delayed approach increases morbidity and cost compared to early surgery within 24 hours of hospital admission. Since cases of severe cholecystitis were excluded from this trial, we argue that these results do not reflect real-world clinical situations. From our point of view, these results contrasted with the clinical experience with our patients, so we decided to critically analyze all our patients with the null hypothesis that patients treated with a delayed cholecystectomy after acute cholecystitis have a similar or even better outcome than those treated with an early operative approach.
Patients and methods: We retrospectively analyzed clinical data from all patients with cholecystectomies in the period between January 2006 and September 2015. A total of 1,723 patients were categorized into four groups: early (n=138): urgent surgery of patients with acute cholecystitis within the first 72 hours of the onset of symptoms; intermediate (n=297): surgery of patients with acute cholecystitis within an average of 10 days after the onset of symptoms; delayed (n=427): initial non-surgical treatment of acute cholecystitis with surgery performed within 6–12 weeks of the onset of symptoms; and elective (n=868): cholecystectomy within a symptom-free interval of choice in patients with symptomatic cholecystolithiasis without signs of acute cholecystitis.
Results: In a real-world scenario, early/intermediate cholecystectomy in acute cholecystitis was associated with a significant increase in morbidity and mortality (Clavien–Dindo score) compared to a delayed approach with surgery performed 6–12 weeks after the onset of symptoms. The adjusted linear rank statistics showed a decrease in the complication score, with values of 2.29 in the early group, 0.48 in the intermediate group, –0.26 in the delayed group and –2.12 in the elective group. The results translate into a continuous decrease of the complication score from the early group through the intermediate and delayed groups to the elective group.
Conclusion: These results demonstrate that delayed cholecystectomy can be performed safely. In cases with severe cholecystitis, early and/or intermediate approaches still have a relatively high risk of morbidity and mortality.
Hepatitis C virus (HCV) naturally infects only humans and chimpanzees. The determinants responsible for this narrow species tropism are not well defined. Virus cell entry involves human scavenger receptor class B type I (SR-BI), CD81, claudin-1 and occludin. Among these, at least CD81 and occludin are utilized in a highly species-specific fashion, thus contributing to the narrow host range of HCV. We adapted HCV to mouse CD81 and identified three envelope glycoprotein mutations which together enhance infection of cells with mouse or other rodent receptors approximately 100-fold. These mutations enhanced interaction with human CD81 and increased exposure of the binding site for CD81 on the surface of virus particles. These changes were accompanied by augmented susceptibility of adapted HCV to neutralization by E2-specific antibodies, indicative of major conformational changes of virus-resident E1/E2-complexes. Neutralization with CD81-, SR-BI- and claudin-1-specific antibodies and knockdown of occludin expression by siRNAs indicate that the adapted virus remains dependent on these host factors but apparently utilizes CD81, SR-BI and occludin with increased efficiency. Importantly, adapted E1/E2 complexes mediate HCV cell entry into mouse cells in the absence of human entry factors. These results further our knowledge of HCV receptor interactions and indicate that three glycoprotein mutations are sufficient to overcome the species-specific restriction of HCV cell entry into mouse cells. Moreover, these findings should contribute to the development of an immunocompetent small animal model fully permissive to HCV.
The hepatitis C virus (HCV) RNA replication cycle is a dynamic intracellular process occurring in three-dimensional space (3D), which is difficult both to capture experimentally and to visualize conceptually. HCV-generated replication factories are housed within virus-induced intracellular structures termed membranous webs (MW), which are derived from the endoplasmic reticulum (ER). Recently, we published 3D spatiotemporally resolved diffusion–reaction models of the HCV RNA replication cycle by means of surface partial differential equation (sPDE) descriptions. We distinguished between the basic components of the HCV RNA replication cycle, namely HCV RNA, non-structural viral proteins (NSPs), and a host factor. In particular, we evaluated the sPDE models upon realistic reconstructed intracellular compartments (ER/MW). In this paper, we propose a significant extension of the model based upon two additional features: different aggregate states of HCV RNA and NSPs, and population-dynamics-inspired diffusion and reaction coefficients instead of multilinear ones. The combination of both aspects enables realistic modeling of viral replication at all scales. Specifically, we describe a replication complex state consisting of HCV RNA together with a defined amount of NSPs. As a result of the combination of spatial resolution and different aggregate states, the new model mimics a cis requirement for HCV RNA replication. We used heuristic parameters for our simulations, which were run only on a subsection of the ER. Nevertheless, this was sufficient to allow the fitting of core aspects of virus reproduction, at least qualitatively. Our findings should help stimulate new model approaches and experimental directions for virology.
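The flavor of such diffusion-reaction dynamics can be conveyed by a deliberately minimal one-dimensional sketch: diffusion plus a logistic ("population dynamics") reaction term with a carrying capacity. This is a crude stand-in for the actual sPDE computations on reconstructed ER surfaces; all parameter values below are invented for illustration:

```python
import numpy as np

def simulate(n=100, steps=500, D=0.1, r=0.05, K=1.0, dx=1.0, dt=1.0):
    """Explicit 1D diffusion-reaction scheme: du/dt = D * d2u/dx2 + r*u*(1 - u/K).

    u could stand for the concentration of one viral component; a periodic 1D
    grid is a toy substitute for the reconstructed ER surface. Illustrative
    only, not the published sPDE model.
    """
    u = np.zeros(n)
    u[n // 2] = 0.5                                            # localized initial source
    for _ in range(steps):
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2  # periodic Laplacian
        u = u + dt * (D * lap + r * u * (1 - u / K))            # diffusion + logistic growth
    return u
```

With these parameters the initial bump spreads as a travelling front and saturates at the carrying capacity K (the classic Fisher-KPP behavior), qualitatively illustrating how a locally seeded component can propagate along a membrane while its local abundance remains bounded.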
Background/aims: Hepatocellular carcinoma (HCC) is a leading indication for liver transplantation (LT) worldwide. Early identification of patients at risk for HCC recurrence is of paramount importance since early treatment of recurrent HCC after LT may be associated with increased survival. We evaluated incidence of and predictors for HCC recurrence, with a focus on the course of AFP levels.
Methods: We performed a retrospective, single-center study of 99 HCC patients who underwent LT between January 28th, 1997 and May 11th, 2016. A multi-stage proportional hazards model with three stages was used to evaluate potential predictive markers, both by univariate and multivariable analysis, for influences on 1) recurrence after transplantation, 2) mortality without HCC recurrence, and 3) mortality after recurrence.
Results: 19/99 HCC patients showed recurrence after LT. Waiting time was not associated with overall HCC recurrence (HR = 1, p = 0.979). Similarly, waiting time did not affect mortality in LT recipients both with (HR = 0.97, p = 0.282) or without (HR = 0.99, p = 0.685) HCC recurrence. Log10-transformed AFP values at the time of LT (HR 1.75, p = 0.023) as well as after LT (HR 2.07, p = 0.037) were significantly associated with recurrence. Median survival in patients with a ratio (AFP at recurrence divided by AFP 3 months before recurrence) of 0.5 was greater than 70 months, as compared to a median of only 8 months in patients with a ratio of 5.
Conclusion: A rise in AFP levels rather than an absolute threshold could help to identify patients at short-term risk for HCC recurrence post LT, which may allow intensification of the surveillance strategy on an individualized basis.
Estimating the age of the developmental stages of the blow fly Calliphora vicina (Diptera: Calliphoridae) is of forensic relevance for the determination of the minimum post-mortem interval (PMImin). Fly eggs and larvae can be aged using anatomical and morphological characters and their modification during development; however, such methods are of limited use for aging fly pupae. A previous study described age estimation of C. vicina pupae using gene expression, but only for pupae reared at constant temperatures, whereas fluctuating temperatures represent a more realistic scenario at a crime scene. Therefore, age-dependent gene expression of C. vicina pupae was compared at three fluctuating and three constant temperatures, the latter representing the mean values of the fluctuating profiles. The chosen marker genes showed uniform expression patterns during metamorphosis of C. vicina pupae bred at different temperature conditions (constant or fluctuating) but the same mean temperature (e.g. constant 10 °C vs. fluctuating 5–15 °C). We present an R-based statistical tool which enables estimation of the age of an examined pupa based on the analysed gene expression data.
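The published tool is R-based and statistically more elaborate, but the underlying idea of matching an observed marker-gene profile against a reference time course recorded at the same mean temperature can be sketched as follows; the reference data and expression profiles here are invented:

```python
import numpy as np

def estimate_age(profile, reference_ages, reference_profiles):
    """Return the reference age whose expression profile is closest (Euclidean).

    profile: measured marker-gene expression vector for one pupa.
    reference_ages / reference_profiles: time course at the matching mean
    temperature. A hypothetical nearest-profile sketch, not the published tool.
    """
    d = np.linalg.norm(np.asarray(reference_profiles, dtype=float)
                       - np.asarray(profile, dtype=float), axis=1)
    return reference_ages[int(np.argmin(d))]
```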
Altered mucosal immune response after acute lung injury in a murine model of Ataxia Telangiectasia
(2014)
Background: Ataxia telangiectasia (A-T) is a rare but devastating and progressive disorder characterized by cerebellar dysfunction, lymphoreticular malignancies and recurrent sinopulmonary infections. In A-T, disease of the respiratory system causes significant morbidity and is a frequent cause of death.
Methods: We used a self-limited murine model of hydrochloric acid-induced acute lung injury (ALI) to determine the inflammatory response to mucosal injury in Atm (A-T mutated)-deficient mice (Atm−/−).
Results: ATM deficiency increased peak lung inflammation as demonstrated by bronchoalveolar lavage fluid (BALF) neutrophils and lymphocytes and increased levels of BALF pro-inflammatory cytokines (e.g. IL-6, TNF). Furthermore, bronchial epithelial damage after ALI was increased in Atm−/− mice. ATM deficiency increased airway resistance and tissue compliance before ALI was performed.
Conclusions: Together, these findings indicate that ATM plays a key role in inflammatory response after airway mucosal injury.
Background: To compare the effect of aprotinin with the effect of lysine analogues (tranexamic acid and ε-aminocaproic acid) on early mortality in three subgroups of patients: low, intermediate and high risk of cardiac surgery.
Methods and Findings: We performed a meta-analysis of randomised controlled trials and observational studies with the following data sources: Medline, Cochrane Library, and reference lists of identified articles. The primary outcome measure was early (in-hospital/30-day) mortality. The secondary outcome measures were any transfusion of packed red blood cells within 24 hours after surgery, any re-operation for bleeding or massive bleeding, and acute renal dysfunction or failure, as reported in the included publications.
Out of 328 search results, 31 studies (15 trials and 16 observational studies) comprising 33,501 patients were included. Early mortality was significantly increased after aprotinin vs. lysine analogues, with pooled risk ratios (95% CI) of 1.58 (1.13–2.21), p<0.001, in the low-risk subgroup (n = 14,297) and 1.42 (1.09–1.84), p<0.001, in the intermediate-risk subgroup (n = 14,427). In contrast, in the subgroup of high-risk patients (n = 4,777), the risk for mortality did not differ significantly between aprotinin and lysine analogues (1.03 (0.67–1.58), p = 0.90).
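Pooled risk ratios such as those reported above are conventionally obtained by inverse-variance weighting of per-study log risk ratios. A minimal fixed-effect sketch (any 2x2 counts fed to it here are made up, not data from the included studies):

```python
import numpy as np

def pooled_risk_ratio(studies):
    """Fixed-effect inverse-variance pooling of risk ratios.

    studies: list of (events_a, n_a, events_b, n_b) counts per study
    (e.g. a = aprotinin arm, b = lysine-analogue arm; illustrative labels).
    Returns the pooled RR and its 95% confidence interval.
    """
    log_rr, weights = [], []
    for ea, na, eb, nb in studies:
        rr = (ea / na) / (eb / nb)
        var = 1 / ea - 1 / na + 1 / eb - 1 / nb   # variance of log(RR)
        log_rr.append(np.log(rr))
        weights.append(1 / var)
    log_rr, weights = np.array(log_rr), np.array(weights)
    pooled = np.sum(weights * log_rr) / np.sum(weights)
    se = np.sqrt(1 / np.sum(weights))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return float(np.exp(pooled)), ci
```

A random-effects model (e.g. DerSimonian–Laird) would additionally inflate the per-study variances by a between-study heterogeneity term; meta-analyses combining trials and observational studies often report both.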
Conclusion: Aprotinin may be associated with an increased risk of mortality in low- and intermediate-risk cardiac surgery, but may have no effect on early mortality in high-risk cardiac surgery compared to lysine analogues. Thus, decisions to re-license aprotinin in lower-risk patients should be critically debated. In contrast, aprotinin may be beneficial in high-risk cardiac surgery, as it reduces the risk of transfusion and bleeding complications.
Background: Liver fibrosis in human immunodeficiency virus (HIV)-infected individuals is mostly attributable to co-infection with hepatitis B or C. The impact of other risk factors, including prolonged exposure to combined antiretroviral therapy (cART) is poorly understood. Our aim was to determine the prevalence of liver fibrosis and associated risk factors in HIV-infected individuals based on non-invasive fibrosis assessment using transient elastography (TE) and serum biomarkers (Fibrotest [FT]).
Methods: In 202 consecutive HIV-infected individuals (159 men; mean age 47 ± 9 years; 35 with hepatitis-C-virus [HCV] co-infection), TE and FT were performed. Repeat TE examinations were conducted 1 and 2 years after study inclusion.
Results: Significant liver fibrosis was present in 16% and 29% of patients, respectively, when assessed by TE (≥ 7.1 kPa) and FT (> 0.48). A combination of TE and FT predicted significant fibrosis in 8% of all patients (31% in HIV/HCV co-infected and 3% in HIV mono-infected individuals). Chronic ALT, AST and γ-GT elevation was present in 29%, 20% and 51% of all cART-exposed patients and in 19%, 8% and 45.5% of HIV mono-infected individuals. Overall, factors independently associated with significant fibrosis as assessed by TE (OR, 95% CI) were co-infection with HCV (7.29, 1.95-27.34), chronic AST (6.58, 1.30-33.25) and γ-GT (5.17, 1.56-17.08) elevation and time on dideoxynucleoside therapy (1.01, 1.00-1.02). In 68 HIV mono-infected individuals who had repeat TE examinations, TE values did not differ significantly during a median follow-up time of 24 months (median intra-patient changes at last TE examination relative to baseline: -0.2 kPa, p = 0.20).
Conclusions: Chronic elevation of liver enzymes was observed in up to 45.5% of HIV mono-infected patients on cART. However, only a small subset had significant fibrosis as predicted by TE and FT. There was no evidence for fibrosis progression during follow-up TE examinations.
The coronavirus pandemic continues to challenge global healthcare. Severely affected patients are often in need of high doses of analgesics and sedatives. Sedation requirements were studied in critically ill coronavirus disease 2019 (COVID-19) patients in this prospective monocentric analysis. COVID-19 acute respiratory distress syndrome (ARDS) patients admitted between 1 April and 1 December 2020 were enrolled in the study. A statistical analysis of impeded sedation using mixed-effect linear regression models was performed. Overall, 114 patients were enrolled, requiring unusually high doses of sedatives. During 67.9% of the observation period, a combination of sedatives was required in addition to continuous analgesia. During ARDS therapy, 85.1% (n = 97) underwent prone positioning. Veno-venous extracorporeal membrane oxygenation (vv-ECMO) was required in 20.2% (n = 23) of all patients. vv-ECMO patients showed significantly higher sedation needs (p < 0.001). Patients with hepatic (p = 0.01) or renal (p = 0.01) dysfunction showed significantly lower sedation requirements. Except for patient age (p = 0.01), we could not find any significant influence of pre-existing conditions. Age, vv-ECMO therapy and additional organ failure were demonstrated to be factors influencing sedation needs. Young patients and those receiving vv-ECMO usually require increased sedation for intensive care therapy. However, further studies are needed to elucidate the causes and mechanisms of impeded sedation.
Augmenting LTP-like plasticity in human motor cortex by spaced paired associative stimulation
(2015)
Paired associative stimulation (PASLTP) of the human primary motor cortex (M1) can induce LTP-like plasticity by increasing corticospinal excitability beyond the stimulation period. Previous studies showed that two consecutive PASLTP protocols interact through homeostatic metaplasticity, but animal experiments provided evidence that LTP can be augmented by repeated stimulation protocols spaced by ~30 min. Here we tested, in twelve healthy subjects preselected as PASLTP responders, whether LTP-like plasticity can be augmented in the human M1 by systematically varying the interval between two consecutive PASLTP protocols. The first PASLTP protocol (PAS1) induced strong LTP-like plasticity lasting for 30–60 min. The effect of a second identical PASLTP protocol (PAS2) critically depended on the time between PAS1 and PAS2. At 10 min, PAS2 prolonged the PAS1-induced LTP-like plasticity. At 30 min, PAS2 augmented the LTP-like plasticity induced by PAS1, increasing both its magnitude and duration. At 60 min and 180 min, PAS2 had no effect on corticospinal excitability. The cumulative LTP-like plasticity after PAS1 and PAS2 at 30 min significantly exceeded the effect of PAS1 alone, as well as the cumulative PAS1 and PAS2 effects at 60 min and 180 min. In summary, consecutive PASLTP protocols interact in human M1 in a time-dependent manner. If spaced by 30 min, two consecutive PASLTP sessions can augment LTP-like plasticity in human M1. These findings may inspire further research on optimized therapeutic applications of non-invasive brain stimulation in neurological and psychiatric diseases.
Background and Aims: In patients with advanced liver cirrhosis due to chronic hepatitis C virus (HCV) infection, antiviral therapy with peginterferon and ribavirin is feasible only in selected cases due to potentially life-threatening side effects. However, predictive factors associated with hepatic decompensation during antiviral therapy are poorly defined.
Methods: In a retrospective cohort study, 68 patients with HCV-associated liver cirrhosis (mean MELD score 9.18±2.72) were treated with peginterferon and ribavirin. Clinical events indicating hepatic decompensation (onset of ascites, hepatic encephalopathy, upper gastrointestinal bleeding, hospitalization) as well as laboratory data were recorded at baseline and during a follow-up period of 72 weeks after initiation of antiviral therapy. To monitor long-term sequelae of end-stage liver disease, an extended follow-up for HCC development, transplantation and death was applied (mean 240 ± 136 weeks).
Results: Eighteen patients (26.5%) achieved a sustained virologic response. During the observational period a hepatic decompensation was observed in 36.8%. Patients with hepatic decompensation had higher MELD scores (10.84 vs. 8.23, p<0.001) and higher mean bilirubin levels (26.74 vs. 14.63 µmol/l, p<0.001), as well as lower serum albumin levels (38.2 vs. 41.1 g/l, p = 0.015), mean platelets (102.64 vs. 138.95/nl, p = 0.014) and mean leukocytes (4.02 vs. 5.68/nl, p = 0.002) at baseline as compared to those without decompensation. In the multivariate analysis the MELD score remained independently associated with hepatic decompensation (OR 1.56, 1.18–2.07; p = 0.002). When the patients were grouped according to their baseline MELD scores, hepatic decompensation occurred in 22%, 59%, and 83% of patients with MELD scores of 6–9, 10–13, and >14, respectively. Baseline MELD score was significantly associated with the risk for transplantation/death (p<0.001).
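For context, the MELD scores used for the risk groups above follow the standard (pre-2016, non-sodium) formula based on creatinine, bilirubin, and INR; the sketch below states that general formula and is not code from the study:

```python
import math

def meld(bilirubin_mg_dl, inr, creatinine_mg_dl, dialysis=False):
    """Standard (pre-2016, non-sodium) MELD score.

    Lab values below 1 are set to 1; creatinine is capped at 4 mg/dl (or set
    to 4 for dialysis patients). General background, not the study's code.
    """
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    creat = 4.0 if dialysis else min(max(creatinine_mg_dl, 1.0), 4.0)
    score = 10 * (0.957 * math.log(creat)
                  + 0.378 * math.log(bili)
                  + 1.120 * math.log(inr)
                  + 0.643)
    return round(score)
```

Note that this formula takes bilirubin in mg/dl, whereas the abstract reports bilirubin in µmol/l; a unit conversion (1 mg/dl ≈ 17.1 µmol/l) would be needed before applying it to those values.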
Conclusions: Our data suggest that the baseline MELD score predicts the risk of hepatic decompensation during antiviral therapy and thus contributes to decision making when antiviral therapy is discussed in HCV patients with advanced liver cirrhosis.
Background: The aim of this meta-analysis was to evaluate efficacy and safety of first-line chemotherapy with or without a monoclonal antibody in elderly patients ( ≥ 70 years) with metastatic colorectal cancer (mCRC), since they are frequently underrepresented in clinical trials.
Results: Individual data from 10 studies were included. From a total of 3271 patients, 604 patients (18%) were ≥ 70 years (median 73 years, range 70–88). Of these, 335 patients were treated with a bevacizumab-based first-line regimen and 265 were treated with chemotherapy only. The median PFS was 8.2 vs. 6.5 months and the median OS was 16.7 vs. 13.0 months in patients treated with and without bevacizumab, respectively. The safety profile of bevacizumab in combination with first-line chemotherapy did not differ from published clinical trials.
Materials and Methods: PubMed and Cochrane Library searches were performed on 29 April 2013 and studies published to this date were included. Authors were contacted to request progression-free survival (PFS), overall survival (OS) data, patient data on treatment regimens, age, sex and potential signs of toxicity in patients ≥ 70 years of age.
Conclusions: This meta-analysis suggests that the addition of bevacizumab to standard first-line chemotherapy improves clinical outcome in elderly patients with mCRC and is well tolerated.
CD4+ T cell lymphopenia predicts mortality from Pneumocystis pneumonia in kidney transplant patients
(2020)
Background: Pneumocystis jirovecii pneumonia (PcP) remains a life-threatening opportunistic infection after solid organ transplantation, even in the era of Pneumocystis prophylaxis. The association between the risk of developing PcP and low CD4+ T cell counts is well established. However, it is unknown whether lymphopenia in the context of post-renal-transplant PcP increases the risk of mortality.
Methods: We carried out a retrospective analysis of a cohort of kidney transplant patients with PcP (n = 49) to determine the risk factors for mortality associated with PcP. We correlated clinical and demographic data with the outcome of the disease. For CD4+ T cell counts, we used the Wilcoxon rank sum test for in-hospital mortality and a Cox proportional-hazards regression model for 60-day mortality.
Results: In univariate analyses, high CRP, high neutrophils, CD4+ T cell lymphopenia, mechanical ventilation, and high acute kidney injury network stage were associated with in-hospital mortality following presentation with PcP. In a receiver operating characteristic (ROC) analysis, an optimum cutoff of ≤200 CD4+ T cells/µL predicted in-hospital mortality; CD4+ T cell lymphopenia also remained a risk factor in a Cox regression model.
Conclusions: A low CD4+ T cell count in kidney transplant recipients is a biomarker for disease severity and a risk factor for in-hospital mortality following presentation with PcP.
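A ROC-derived cutoff such as ≤200 CD4+ T cells/µL is typically chosen by maximizing Youden's J (sensitivity + specificity − 1) across candidate thresholds; below is a sketch of that standard procedure with synthetic counts, not the study cohort:

```python
import numpy as np

def youden_cutoff(values, died):
    """Find the biomarker cutoff maximizing Youden's J = sensitivity + specificity - 1.

    values: CD4+ T cell counts; died: 1 for in-hospital death, else 0.
    Low counts predict death, so the test is 'value <= cutoff'. Synthetic
    illustration of the generic ROC-cutoff procedure, not the study's analysis.
    """
    values, died = np.asarray(values), np.asarray(died)
    best_cutoff, best_j = None, -1.0
    for c in np.unique(values):
        pred = values <= c                   # predicted deaths at this cutoff
        sens = np.mean(pred[died == 1])      # true positive rate
        spec = np.mean(~pred[died == 0])     # true negative rate
        j = sens + spec - 1
        if j > best_j:
            best_cutoff, best_j = c, j
    return best_cutoff, best_j
```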
Rationale: The clinical relevance of sensitization to Aspergillus (A) fumigatus in cystic fibrosis (CF) is unclear. Some researchers propose that specific A fumigatus IgE is an innocent bystander, whereas others describe it as the major cause of TH‐2‐driven asthma‐like disease.
Objectives: Lung function parameters in mild CF may differ between patients with and without A fumigatus sensitization. We aimed to ascertain whether allergen exposure to A fumigatus by bronchial allergen provocation (BAP) induces TH‐2 inflammation comparable to an asthma‐like disease.
Methods: A total of 35 patients, aged 14.8 ± 8.5 years, and 20 healthy controls were investigated prospectively. The patients were divided into two groups: group 1 (n = 18): specific (s)IgE negative, and group 2 (n = 17): sIgE positive (≥0.7 KU/L) for A fumigatus. Lung function, exhaled NO, and induced sputum were analysed. All sensitized patients with an FEV1 > 75% (n = 13) underwent BAP with A fumigatus, and cell counts, and the expression of IL‐5, IL‐13, INF‐γ, and IL‐8 as well as transcription factors T‐bet, GATA‐3, and FoxP3, were measured.
Results: Lung function parameters decreased significantly compared to controls, but not within the CF patient group. After BAP, 8 of 13 patients (61%) had a significant asthmatic response and increased eNO 24 hours later. In addition, marked TH‐2‐mediated inflammation involving eosinophils, IL‐5, IL‐13, and FoxP3 became apparent in induced sputum cells.
Conclusion: Our study demonstrated the clinical relevance of A fumigatus for the majority of sensitized CF patients. A distinct IgE/TH‐2‐dominated inflammation was found in induced sputum after A fumigatus exposure.
Estimating intraoperative blood loss is one of the daily challenges for clinicians. Despite the known inaccuracy of visual estimation by anaesthetists and surgeons, it remains the mainstay of estimating surgical blood loss. This review aims to highlight the strengths and weaknesses of currently used measurement methods. A systematic review of studies on the estimation of blood loss was carried out. Studies investigating the accuracy of techniques for quantifying blood loss in vivo and in vitro were included. We excluded nonhuman trials and studies using only monitoring parameters to estimate blood loss. A meta-analysis was performed to evaluate systematic measurement errors of the different methods; only studies compared with a validated reference, e.g. a haemoglobin extraction assay, were included. Ninety studies met the inclusion criteria for systematic review and were analyzed. Six studies were included in the meta-analysis, as only these used a validated reference. The mixed-effect meta-analysis showed the highest correlation to the reference for colorimetric methods (0.93, 95% CI 0.91–0.96), followed by gravimetric (0.77, 95% CI 0.61–0.93) and finally visual methods (0.61, 95% CI 0.40–0.82). The bias for estimated blood loss (ml) was lowest for colorimetric methods (57.59, 95% CI 23.88–91.3) compared to the reference, followed by gravimetric (326.36, 95% CI 201.65–450.86) and visual methods (456.51, 95% CI 395.19–517.83). Of the many studies included, only a few were compared with a validated reference; the majority chose known imprecise procedures as the method of comparison. Colorimetric methods offer the highest degree of accuracy in blood loss estimation, and systems that use colorimetric techniques have a significant advantage in the real-time assessment of blood loss.
Background: FibroTest (FT) is the most frequently used serum fibrosis marker and consists of an algorithm of five fibrosis markers (alpha2-macroglobulin, apolipoprotein A1, haptoglobin, GGT, bilirubin). The Enhanced Liver Fibrosis (ELF) test consists of an algorithm of three fibrosis markers (hyaluronic acid, amino-terminal propeptide of type III collagen, tissue inhibitor of matrix metalloproteinase-1). While a systematic review has shown comparable results for the individual markers, there has been no direct comparison of the two tests. Methods: In the present study, the ELF test was analyzed retrospectively in patients with chronic liver disease who received a liver biopsy, transient elastography (TE) and the FibroTest, using histology as the reference method. Histology was classified according to METAVIR and Ludwig's classification (F0-F4) for patients with chronic hepatitis C and B virus (HCV, HBV) infection and primary biliary cirrhosis (PBC), respectively. Results: Seventy-four patients were analysed: 36 with HCV, 10 with HBV, and 28 with PBC. The accuracy (AUROC) for the diagnosis of significant fibrosis (F≥2) for ELF and FibroTest was 0.78 (95% CI: 0.67-0.89) and 0.69 (95% CI: 0.57-0.82), respectively (difference not statistically significant, n.s.). The AUROC for the diagnosis of liver cirrhosis was 0.92 (95% CI: 0.83-1.00) and 0.91 (95% CI: 0.83-0.99), respectively (n.s.). For 66 patients with reliable TE measurements, the AUROC for the diagnosis of significant fibrosis (cirrhosis) for TE, ELF and FT was 0.80 (0.94), 0.76 (0.92), and 0.67 (0.91), respectively (n.s.). Conclusion: FibroTest and ELF can be performed with comparable diagnostic accuracy for the non-invasive staging of liver fibrosis. Serum tests are informative in a higher proportion of patients than transient elastography.
Purpose: The aim of the study was to compare three different elastography methods, namely Strain Elastography (SE), Point Shear-Wave Elastography (pSWE) using Acoustic Radiation Force Impulse (ARFI)-Imaging and 2D-Shear Wave Elastography (2D-SWE), in the same study population for the differentiation of thyroid nodules.
Materials and methods: All patients received a conventional ultrasound scan, SE and 2D-SWE, and all patients except for two received ARFI-Imaging. Cytology/histology of thyroid nodules was used as a reference method. SE measures the relative stiffness within the region of interest (ROI) using the surrounding tissue as reference tissue. ARFI mechanically excites the tissue at the ROI using acoustic pulses to generate localized tissue displacements. 2D-SWE measures tissue elasticity using the velocity of many shear waves as they propagate through the tissue.
Results: 84 nodules (73 benign and 11 malignant) in 62 patients were analyzed. Sensitivity, specificity and NPV of SE were 73%, 70% and 94%, respectively. Sensitivity, specificity and NPV of ARFI and 2D-SWE were 90%, 79%, 98% and 73%, 67%, 94%, respectively, using a cut-off value of 1.98 m/s for ARFI and 2.65 m/s (21.07 kPa) for 2D-SWE. The AUROC (area under the receiver operating characteristic curve) of SE, ARFI and 2D-SWE for the diagnosis of malignant thyroid nodules was 52%, 86% and 71%, respectively. A significant difference in AUROC was found between SE and ARFI (p = 0.008), while no significant difference was found between ARFI and SWE (86% vs. 71%, p = 0.31), or SWE and SE (71% vs. 52%, p = 0.26).
Conclusion: pSWE using ARFI and 2D-SWE showed comparable results for the differentiation of thyroid nodules. ARFI was superior to elastography using SE.
Aim: It can be challenging to distinguish COVID-19 in children from other common infections. We set out to determine the rate at which children consulting a primary care paediatrician with an acute infection are infected with SARS-CoV-2 and to compare distinct findings. Method: In seven out-patient clinics, children aged 0–13 years with any new respiratory or gastrointestinal symptoms and presumed infection were invited to be tested for SARS-CoV-2. Factors that were correlated with testing positive were determined. Samples were collected from 25 January 2021 to 01 April 2021. Results: Seven hundred and eighty-three children participated in the study (median age 3 years and 0 months, range 1 month to 12 years and 11 months). Three hundred and fifty-eight were female (45.7%). SARS-CoV-2 RNA was detected in 19 (2.4%). The most common symptoms in children with as well as without detectable SARS-CoV-2 RNA were rhinitis, fever and cough. Known recent exposure to a case of COVID-19 was significantly correlated with testing positive, but symptoms or clinical findings were not. Conclusion: COVID-19 among the children with symptoms of an acute infection was uncommon, and the clinical presentation did not differ significantly between children with and without evidence of an infection with SARS-CoV-2.
Six dentin adhesives were tested in vitro for their cytotoxicity on human fibroblasts. The adhesives Hybrid Bond, One-up Bond F Plus, AdheSE, Clearfil SE Bond, Optibond Solo Plus and Syntac were eluted with culture medium, as single or sequentially applied adhesive components, for 24 h. 75 Petri dishes were produced per group. They were evaluated in a triangulated manner, comprising a quantitative evaluation (105 dishes) to count "viable", "dead" and "debris" cells with a cell counter, and a qualitative assessment (420 dishes) from which the reactivity index was also determined. One-up Bond F Plus, AdheSE and Clearfil SE Bond showed a statistically significant difference in viable cells compared with the cell control. For One-up Bond F Plus, statistically significant differences compared with Hybrid Bond and Syntac were also found. All adhesives except One-up Bond F Plus showed significant differences between single and sequentially applied components in the quantitative evaluation. The test materials showed a moderate grade of cytotoxicity. In conclusion, a statistically significant difference in cytotoxicity between the self-etch and etch-and-rinse adhesives could not be demonstrated with respect to the qualitative evaluation and the reactivity index, but differences between sequentially and singly applied components could be demonstrated.
Aim: Patients with advanced systolic chronic heart failure frequently suffer from progressive functional mitral regurgitation. We report our initial experience in patients with an implanted pulmonary artery pressure (PAP) sensor, who developed severe mitral regurgitation, which was treated with the MitraClip system. We non‐invasively compared changes in PAP values in patients after MitraClip with PAP changes in patients without MitraClip.
Methods and results: Among 28 patients with New York Heart Association III heart failure with an implanted PAP sensor for haemodynamic telemonitoring from a single centre, four patients (age 66 ± 6 years, left ventricular ejection fraction 21 ± 3%, and cardiac index 1.8 ± 0.3) received a MitraClip procedure and were compared in a descriptive manner with 24 patients (age 72 ± 8 years, left ventricular ejection fraction 26 ± 9.9%, and cardiac index 2.0 ± 1.0) without MitraClip procedure. Ambulatory PAP values were followed for 90 days in both groups. In comparison with the PAP values 4 weeks before the MitraClip procedure, PAP was profoundly reduced in all four patients after 30 days (ΔPAPmean −11 ± 5, ΔPAPdiast −7 ± 3 mmHg, P < 0.02) as well as after 90 days (ΔPAPmean −6.3 ± 6, ΔPAPdiast −1 ± 3 mmHg). Reductions in PAP were accompanied by a profound reduction in N-terminal pro-brain natriuretic peptide (NT-proBNP) as well as by clinical and echocardiographic improvement. When analysing the dynamics with a regression model, reductions in all PAP values were significantly greater after MitraClip compared with conservative haemodynamic monitoring (P < 0.001).
Conclusions: The efficacy of the interventional MitraClip procedure on clinical symptoms can be confirmed by haemodynamic telemonitoring. Thus, daily non‐invasive haemodynamic telemonitoring allows, for the first time, for a continuous assessment of the haemodynamic efficacy of novel therapies in patients with chronic heart failure.
Background: Epileptic seizures are common clinical features in patients with acute subdural hematoma (aSDH); however, diagnostic feasibility and therapeutic monitoring remain limited. Surface electroencephalography (EEG) is the major diagnostic tool for the detection of seizures, but it might not be sensitive enough to detect all subclinical or nonconvulsive seizures or status epilepticus. Therefore, we have planned a clinical trial to evaluate a novel treatment modality by perioperatively implanting subdural EEG electrodes to diagnose seizures; we will then treat the seizures under therapeutic monitoring and analyze the clinical benefit.
Methods: In a prospective nonrandomized trial, we aim to include 110 patients with aSDH. Only patients undergoing surgical removal of aSDH will be included; one arm will be treated according to the guidelines of the Brain Trauma Foundation, while the other arm will additionally receive a subdural grid electrode. The study's primary outcome is the comparison of incidence of seizures and time-to-seizure between the interventional and control arms. Invasive therapeutic monitoring will guide treatment with antiseizure drugs (ASDs). The secondary outcome will be the functional outcome for both groups as assessed via the Glasgow Outcome Scale and modified Rankin Scale both at discharge and during 6 months of follow-up. The tertiary outcome will be the evaluation of chronic epilepsy within 2-4 years of follow-up.
Discussion: The implantation of a subdural EEG grid electrode in patients with aSDH is expected to be effective in diagnosing seizures in a timely manner, facilitating treatment with ASDs and monitoring of treatment success. Moreover, the occurrence of epileptiform discharges prior to the manifestation of seizure patterns could be evaluated in order to identify high-risk patients who might benefit from prophylactic treatment with ASDs.
Trial registration: ClinicalTrials.gov identifier no. NCT04211233.
Introduction: Colorectal cancers (CRCs) deficient in the DNA mismatch repair protein MutL homolog 1 (MLH1) display distinct clinicopathological features and require a different therapeutic approach compared to CRCs with MLH1 proficiency. However, the molecular basis of this fundamental difference remains elusive. Here, we report that MLH1-deficient CRCs exhibit reduced levels of the cytoskeletal scaffolding protein non-erythroid spectrin αII (SPTAN1), and that tumor progression and metastasis of CRCs correlate with SPTAN1 levels.
Methods and results: To investigate the link between MLH1 and SPTAN1 in cancer progression, a cohort of 189 patients with CRC was analyzed by immunohistochemistry. Compared with the surrounding normal mucosa, SPTAN1 expression was reduced in MLH1-deficient CRCs, whereas MLH1-proficient CRCs showed a significant upregulation of SPTAN1. Overall, we identified a strong correlation between MLH1 status and SPTAN1 expression. When comparing TNM classification and SPTAN1 levels, we found higher SPTAN1 levels in stage I CRCs, while stages II to IV showed a gradual reduction of SPTAN1 expression. In addition, SPTAN1 expression was lower in metastatic compared with non-metastatic CRCs. Knockdown of SPTAN1 in CRC cell lines demonstrated decreased cell viability, impaired cellular mobility and reduced cell-cell contact formation, indicating that SPTAN1 plays an important role in cell growth and cell attachment. The observed weakened cell-cell contact of SPTAN1 knockdown cells might indicate that tumor cells expressing low levels of SPTAN1 detach from their primary tumor and metastasize more easily.
Conclusion: Taken together, we demonstrate that MLH1 deficiency, low SPTAN1 expression, and tumor progression and metastasis are in close relation. We conclude that SPTAN1 is a candidate molecule explaining the tumor progression and metastasis of MLH1-deficient CRCs. The detailed analysis of SPTAN1 is now mandatory to substantiate its relevance and its potential value as a candidate protein for targeted therapy, and as a predictive marker of cancer aggressiveness.
Background: Postoperative complication rates using 3D visualization are rarely reported. The primary aim of our study was to detect a possible effect of 3D visualization on postoperative complication rates in a real-world setting.
Method: Based on a sample size calculation for a medium effect size (to detect whether 3D significantly reduces postoperative complications), data of 287 patients with 3D visualization and 832 with a 2D procedure were screened. The groups underwent exact propensity score matching to ensure comparability. The comprehensive complication index (CCI) was calculated for every procedure, and operation time was determined.
Results: Of the 1078 patients included in the study, 213 exact propensity score-matched pairs were established. Concerning overall CCI (3D: 5.70 ± 13.63 vs. 2D: 3.37 ± 9.89; p = 0.076) and operation time (3D: 103.98 ± 93.26 min vs. 2D: 88.60 ± 69.32 min; p = 0.2569), there was no significant difference between the groups.
Conclusion: Our study shows no advantage of 3D over 2D laparoscopy regarding postoperative complications in a real-world setting. The second endpoint, operation time, was likewise not influenced by 3D.
Keywords: 3D laparoscopy; Comprehensive complication index; Propensity score matching
Seroconversion rates following influenza vaccination in patients with hematologic malignancies after hematopoietic stem cell transplantation (HSCT) are known to be lower compared to healthy adults. The aim of our diagnostic study was to determine the rate of seroconversion after 1 or 2 doses of a novel split-virion, inactivated, AS03-adjuvanted pandemic H1N1 influenza vaccine (A/California/7/2009) in HSCT recipients (ClinicalTrials.gov Identifier: NCT01017172). Blood samples were taken before and 21 days after a first dose and 21 days after a second dose of the vaccine. Antibody (AB) titers were determined by hemagglutination inhibition assay. Seroconversion was defined by either an AB titer of ≤1:10 before and ≥1:40 after vaccination, or ≥1:10 before and a ≥4-fold increase in AB titer 21 days after vaccination. Seventeen patients (14 allogeneic, 3 autologous HSCT) received 1 dose and 11 of these patients 2 doses of the vaccine. The rate of seroconversion was 41.2% (95% confidence interval [CI] 18.4-67.1) after the first and 81.8% (95% CI 48.2-97.7) after the second dose. Patients who failed to seroconvert after 1 dose of the vaccine were more likely to receive an immunosuppressive agent (P = .003), whereas time elapsed since HSCT, type of HSCT, age, sex, and chronic graft-versus-host disease did not differ compared to patients with seroconversion. In patients with hematologic malignancies after HSCT, the rate of seroconversion after a first dose of an adjuvanted H1N1 influenza A vaccine was poor, but increased after a second dose.
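The two-pronged seroconversion criterion above can be sketched as a small helper function (a hypothetical illustration using reciprocal titers, not the study's actual analysis code):

```python
def seroconverted(pre_titer: int, post_titer: int) -> bool:
    """Seroconversion per the stated definition: pre <= 1:10 and post >= 1:40,
    or pre >= 1:10 with a >= 4-fold titer increase. Titers are passed as
    reciprocals (e.g. a titer of 1:40 is passed as 40)."""
    if pre_titer <= 10 and post_titer >= 40:
        return True
    if pre_titer >= 10 and post_titer >= 4 * pre_titer:
        return True
    return False
```

For example, a rise from 1:20 to 1:80 qualifies (4-fold increase), while 1:20 to 1:40 does not.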
Eosinophilic cholangitis is a potentially underdiagnosed etiology in indeterminate biliary stricture
(2017)
AIM: To investigate presence and extent of eosinophilic cholangitis (EC) as well as IgG4-related disease in patients with indeterminate biliary stricture (IBS).
METHODS: All patients with a diagnosis of sclerosing cholangitis (SC) and histopathological samples such as biopsies or surgical specimens at University Hospital Frankfurt from 2005-2015 were included. Histopathological diagnoses as well as the further clinical course were reviewed. Tissue samples of patients without a definite diagnosis after complete diagnostic work-up were reviewed for the presence of eosinophilic infiltration and IgG4-positive plasma cells. Eosinophilic infiltration was also assessed in a control group of liver transplant donors and patients with primary sclerosing cholangitis.
RESULTS: One hundred and thirty-five patients with SC were included. In 10/135 (7.4%) patients, no potential cause of IBS could be identified after complete diagnostic work-up and further clinical course. After histopathological review, a post-hoc diagnosis of EC was established in three patients, resulting in a prevalence of 2.2% (3/135) among all patients with SC and of 30% (3/10) among patients in whom no cause of IBS was identified. Two of the three patients with a post-hoc diagnosis of EC underwent surgical resection for suspicion of malignancy. A diagnosis of IgG4-related cholangitis was observed in 7/135 patients (5.1%), of which 3 cases were discovered in the post-hoc analysis. 6/7 cases with IgG4-related cholangitis (85.7%) presented with eosinophilic infiltration in addition to IgG4-positive plasma cells. There was no patient with eosinophilic infiltration in the control groups of liver transplant donors (n = 27) and patients with primary sclerosing cholangitis (n = 14).
CONCLUSION: EC is an underdiagnosed benign etiology of SC and IBS, which has to be considered in the differential diagnosis of IBS.
Standard monitoring of heart rate, blood pressure and arterial oxygen saturation during endoscopy is recommended by current guidelines on procedural sedation. A number of studies indicated a reduction of hypoxic (arterial oxygenation < 90% for > 15 s) and severe hypoxic events (arterial oxygenation < 85%) through the additional use of capnography. Therefore, the U.S. and European guidelines state that additional capnography monitoring can be considered in long or deep sedation. The Integrated Pulmonary Index® (IPI) is an algorithm-based monitoring parameter that combines oxygenation measured by pulse oximetry (arterial oxygenation, heart rate) and ventilation measured by capnography (respiratory rate, apnea > 10 s, partial pressure of end-tidal carbon dioxide [PetCO2]). The aim of this paper was to analyze the value of the IPI as a parameter to monitor the respiratory status of patients receiving propofol sedation during the PEG procedure. Patients presenting for PEG placement under sedation were randomized 1:1 into either a standard monitoring group (SM) or a capnography monitoring group including IPI (IM). Heart rate, blood pressure and arterial oxygen saturation were monitored in SM. In IM, PetCO2, respiratory rate and IPI were additionally monitored. Capnography and IPI values were recorded for all patients but were only visible to the endoscopic team for the IM group. IPI values range between 1 and 10 (10 = normal; 8–9 = within normal range; 7 = close to normal range, requires attention; 5–6 = requires attention and may require intervention; 3–4 = requires intervention; 1–2 = requires immediate intervention). Results on capnography versus standard monitoring in the same study population were published previously. A total of 147 patients (74 in SM and 73 in IM) were included in the present study. Hypoxic events occurred in 62 patients (42%) and severe hypoxic events in 44 patients (29%). Baseline characteristics were equally distributed in both groups.
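The IPI categories listed above can be expressed as a simple lookup (an illustrative sketch of the published category bands only; the proprietary algorithm that computes the IPI value itself is not reproduced here):

```python
def ipi_category(ipi: int) -> str:
    """Map an Integrated Pulmonary Index value (1-10) to its stated category."""
    if not 1 <= ipi <= 10:
        raise ValueError("IPI is defined on the range 1-10")
    if ipi == 10:
        return "normal"
    if ipi >= 8:
        return "within normal range"
    if ipi == 7:
        return "close to normal range, requires attention"
    if ipi >= 5:
        return "requires attention and may require intervention"
    if ipi >= 3:
        return "requires intervention"
    return "requires immediate intervention"
```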
IPI = 1 and IPI < 7, as well as the parameters PetCO2 = 0 mmHg and apnea > 10 s, had a high sensitivity for hypoxic and severe hypoxic events, respectively (IPI = 1: 81%/81% [hypoxic/severe hypoxic event], IPI < 7: 82%/88%, PetCO2: 69%/68%, apnea > 10 s: 84%/84%). All four parameters had a low specificity for both hypoxic and severe hypoxic events (IPI = 1: 13%/12%, IPI < 7: 7%/7%, PetCO2: 29%/27%, apnea > 10 s: 7%/7%). In multivariate analysis, only SM and PetCO2 = 0 mmHg were independent risk factors for hypoxia. The IPI (IPI = 1 and IPI < 7) as well as the individual parameters PetCO2 = 0 mmHg and apnea > 10 s allow a fast and convenient assessment of patients' respiratory status in a morbid patient population. Sensitivity is good for most parameters, but specificity is poor. In conclusion, the IPI can be a useful metric to assess respiratory status during propofol sedation for PEG placement. However, the IPI was not superior to PetCO2 and apnea > 10 s.
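Sensitivity and specificity as reported above follow the standard 2x2 contingency-table definitions; a minimal sketch with made-up counts (not the study's data):

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for an alarm parameter screened against hypoxic events:
sens, spec = sens_spec(tp=81, fn=19, tn=12, fp=88)  # -> 0.81, 0.12
```

A parameter like this would flag most true events (high sensitivity) while producing many false alarms (low specificity), mirroring the pattern reported above.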
Introduction Patients undergoing heart valve surgery are predominantly transferred postoperatively to the intensive care unit (ICU) under continuous sedation. Volatile anaesthetics are an increasingly used alternative to intravenous substances in the ICU. Owing to their inhalational uptake and elimination, the resulting pharmacological benefits have been repeatedly demonstrated. Therefore, volatile anaesthetics appear suitable to meet the growing demands of fast-track cardiac surgery. However, their use requires special preparation at the bedside and trained medical and nursing staff, which might limit the pharmacological benefits. The aim of our work is to assess whether the temporal advantages of recovery under volatile sedation outweigh the higher effort of special preparation.
Methods and analysis The study is designed to evaluate the differences between intravenous sedatives (n=48) and volatile sedatives (n=48) in continued intensive care sedation. It will be conducted as a prospective, randomised, controlled, single-blinded, monocentre trial in consenting adult patients undergoing heart valve surgery at a German university hospital. The study will examine the necessary preparation time, staff consultation and overall feasibility of the chosen sedation method. For this purpose, the continuation of sedation in the ICU with volatile sedatives is considered as one study arm and with intravenous sedatives as the comparison group. Due to rapid elimination and quick awakening after the termination of sedation, closer consultation between the attending physician and the ICU nursing staff is required, in addition to a prolonged setup time. The study analysis will include the required setup time, the time from admission to extubation as the primary outcome, and neurocognitive assessability. In addition, operation-specific factors (blood loss, complications), treatment parameters (catecholamine dosages, lung function) and laboratory results (acute kidney injury, acid-base balance (lactataemia), liver failure) will be collected as possible influencing factors. The study-relevant data will be extracted from the continuous digital records of the patient data management system after the patient has been discharged from the ICU. For statistical evaluation, 95% CIs will be calculated for the median time to extubation and neurocognitive assessability, and the association will be assessed with a Cox regression model. In addition, secondary binary outcome measures will be evaluated using Fisher’s exact tests. Further descriptive and exploratory statistical analyses are also planned.
Ethics and dissemination The study was approved by the Institutional Ethics Board of the University of Frankfurt, Germany (#20-1050). Informed consent of all individual patients will be obtained before randomisation. Results will be disseminated via publication in peer-reviewed journals.
Background: Subdural hematoma (SDH) is a common disease associated with high morbidity, which is becoming more prominent due to its increasing incidence. The decision for surgical evacuation is made depending on the clinical presentation and the volume of the SDH; it is therefore important to have a simple ‘bedside’ method to measure and compare SDH volume.
Objective: The aim of the study was to verify the accuracy of the simplified ABC/2 volumetric formula to determine a valuable tool for the clinical practice.
Methods: Preoperative CT-scans of 83 patients with SDHs were used for the computer-assisted volumetric measurement via BrainLab® as well as the ABC/2 volumetric measurement. A = largest length (anterior to posterior) of the SDH; B = maximum width (lateral to midline) 90° to A; C = maximum height (coronal plane or multiplication of slices) of the hematoma. These measurements were performed by two independent clinicians in a blinded fashion. Both volumes were compared by linear regression analysis of Pearson and Bland-Altman regression analysis.
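The ABC/2 estimate defined in the methods is half the volume of the hematoma's bounding cuboid, which approximates a half ellipsoid (ellipsoid volume = π/6·A·B·C, and π/6 ≈ 1/2). A minimal sketch of the formula:

```python
def abc_over_2(a_cm: float, b_cm: float, c_cm: float) -> float:
    """ABC/2 volume estimate in cm^3: A = largest length (anterior-posterior),
    B = maximum width perpendicular to A, C = maximum height, each in cm."""
    return a_cm * b_cm * c_cm / 2.0
```

For a hypothetical hematoma measuring 10 cm x 2 cm x 10 cm, the estimate is 100 cm^3.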
Results: Among 100 SDHs, 53% were under and 47% were over 100 cm³, showing an even distribution of hematoma sizes. There was an excellent correlation between computer-assisted volumetric measurement and ABC/2 (R2 = 0.947, p<0.0001), and no undesirable deviation or trend was detected (p = 0.101; p = 0.777). The 95% tolerance region of the ratios of both methods was [0.805–1.201].
Conclusion: The ABC/2 method is a simple and fast bedside formula for the timely measurement of SDH volume, which may replace computer-assisted volumetric measurement in clinical practice and research. The likely reason for its good accuracy is the spherical form of the SDH, which resembles a half ellipsoid.
Introduction: Recommendations for venous thromboembolism and deep venous thrombosis (DVT) prophylaxis using graduated compression stockings (GCS) are historically based and have been critically examined in recent publications. Existing guidelines are inconclusive as to whether to recommend the general use of GCS.
Patients/Methods: 24 273 in-patients (general surgery and orthopedic patients) undergoing surgery between 2006 and 2016 were included in a retrospective single-center analysis. From January 2006 to January 2011, perioperative GCS were employed in addition to drug prophylaxis, and from February 2011 to March 2016 patients received drug prophylaxis alone. According to German guidelines, all patients received venous thromboembolism prophylaxis with weight-adapted LMWH. Risk stratification (low risk, moderate risk, high risk) was based on the guideline of the American College of Chest Physicians. Data analysis was performed before and after propensity matching (PM). The defined primary endpoint was the incidence of symptomatic or fatal pulmonary embolism (PE). A secondary endpoint was the incidence of deep venous thrombosis (DVT).
Results: After risk stratification (low risk n = 16 483; moderate risk n = 4464; high risk n = 3326), a total of 24 273 patients were analyzed. Prior to PM, the relative risk for the occurrence of a PE or DVT was not increased by abstaining from GCS. After PM, two groups of 11 312 patients each, one with and one without GCS application, were formed. When comparing the two groups, the relative risk (RR) for the occurrence of a pulmonary embolism was: low risk 0.99 [95% CI 0.998–1.000]; moderate risk 0.999 [95% CI 0.95–1.003]; high risk 0.996 [95% CI 0.992–1.000] (p > 0.05). The incidence of PE in the LMWH-alone group was 0.1% (n = 16); in the LMWH + GCS group it was 0.3% (n = 29). The RR after PM was 0.999 [95% CI 0.998–1.00].
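Relative risks with 95% confidence intervals of the kind reported here are conventionally computed on the log scale; a generic sketch with hypothetical counts (not a reconstruction of the study's propensity-matched analysis):

```python
import math

def relative_risk(events_a: int, n_a: int, events_b: int, n_b: int):
    """Relative risk of group A vs. group B with a 95% CI from the
    log-normal approximation."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi
```

For example, 10 events in 100 patients versus 20 events in 100 patients yields RR = 0.5 with a CI spanning that estimate.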
Conclusion: In contrast to prior studies with only small numbers of patients, our trial of a large group of patients at moderate and high risk of developing VTE supports the view that abstaining from GCS use does not increase the incidence of symptomatic or fatal PE or of symptomatic DVT.
Background: Chronic hepatitis C virus (HCV) infections are causally linked with metabolic comorbidities such as insulin resistance, hepatic steatosis, and dyslipidemia. However, the clinical impact of HCV eradication achieved by direct-acting antivirals (DAAs) on glucose and lipid homeostasis is still controversial. The study aimed to prospectively investigate whether antiviral therapy of HCV with DAAs alters glucose and lipid parameters. Methods: 50 patients with chronic HCV who were treated with DAAs were screened, and 49 were enrolled in the study. Biochemical and virological data, as well as noninvasive liver fibrosis parameters, were prospectively collected at baseline, at the end of treatment (EOT) and 12 and 24 weeks post-treatment. Results: 45 of 46 patients achieved sustained virologic response (SVR). The prevalence of insulin resistance (HOMA-IR) after HCV clearance was significantly lower compared to baseline (5.3 ± 6.1 to 2.5 ± 1.9, p < 0.001), which is primarily attributable to a significant decrease in fasting insulin levels (18.9 ± 17.3 to 11.7 ± 8.7; p = 0.002). In contrast, HCV eradication resulted in a significant increase in cholesterol levels (total cholesterol, low-density lipoprotein cholesterol (LDL-C), and high-density lipoprotein cholesterol (HDL-C) levels) and the Controlled Attenuation Parameter (CAP), although BMI did not change significantly over time (p = 0.95). Moreover, HOMA-IR correlated significantly with noninvasive liver fibrosis measurements at baseline and during follow-up (TE: r = 0.45; p = 0.003, pSWE: r = 0.35; p = 0.02, APRI: r = 0.44; p = 0.003, FIB-4: r = 0.41; p < 0.001). Conclusion: Viral eradication following DAA therapy may have beneficial effects on glucose homeostasis, whereas the lipid profile seems to worsen.
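HOMA-IR as used above is conventionally derived from fasting glucose and fasting insulin; a sketch of the standard formula (the patient values in the example are hypothetical, not study data):

```python
def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    """HOMA-IR = fasting glucose (mg/dL) x fasting insulin (uU/mL) / 405.
    Equivalently, glucose in mmol/L x insulin / 22.5."""
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

# Hypothetical patient: glucose 90 mg/dL, insulin 18.9 uU/mL -> HOMA-IR 4.2
score = homa_ir(90.0, 18.9)
```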
Objectives: Regarding reactogenicity and immunogenicity, heterologous COVID-19 vaccination regimens are considered as an alternative to conventional immunization schemes.
Methods: Individuals receiving either heterologous (ChAdOx1-S [AstraZeneca, Cambridge, UK]/BNT162b2 [Pfizer-BioNTech, Mainz, Germany]; n = 306) or homologous (messenger RNA [mRNA]-1273 [Moderna, Cambridge, Massachusetts, USA]; n = 139) vaccination were asked to participate when receiving their second dose. Reactogenicity was assessed after 1 month, immunogenicity after 1, 3, and/or 6 months, including a third dose, through SARS-CoV-2 antispike immunoglobulin G, surrogate virus neutralization test, and a plaque reduction neutralization test against the Delta (B.1.167.2) and Omicron (B.1.1.529; BA.1) variants of concern.
Results: The overall reactogenicity was lower after heterologous vaccination. In both cohorts, SARS-CoV-2 antispike immunoglobulin G concentrations waned over time, with the heterologous vaccination demonstrating higher neutralizing activity than homologous mRNA vaccination after 3 months, declining to low neutralizing levels in the Delta plaque reduction neutralization test after 6 months. At this point, 3.2% of the heterologous and 11.4% of the homologous cohort yielded low neutralizing activity against Omicron. After a third dose of an mRNA vaccine, ≥99% of vaccinees demonstrated positive neutralizing activity against Delta. Depending on the vaccination scheme, 60% to 87.5% of vaccinees demonstrated positive neutralizing activity against Omicron.
Conclusion: ChAdOx1-S/BNT162b2 vaccination demonstrated an acceptable reactogenicity and immunogenicity profile. A third dose of an mRNA vaccine is necessary to maintain neutralizing activity against SARS-CoV-2. However, variants of concern-adapted versions of the vaccines would be desirable.
High sedation needs of critically ill COVID-19 ARDS patients - a monocentric observational study
(2021)
Background: Therapy of severely affected coronavirus patients requiring intubation and sedation is still challenging. Recently, difficulties in sedating these patients have been discussed. This study aims to describe sedation practices in patients with 2019 coronavirus disease (COVID-19)-induced acute respiratory distress syndrome (ARDS). Methods: We performed a retrospective monocentric analysis of sedation regimens in critically ill intubated patients with respiratory failure who required sedation in our mixed 32-bed university intensive care unit. All mechanically ventilated adults with COVID-19-induced ARDS requiring continuously infused sedative therapy admitted between April 4, 2020, and June 30, 2020 were included. We recorded demographic data, sedative dosages, prone positioning, sedation levels and duration. Descriptive data analysis was performed; for additional analysis, a logistic regression with mixed effects was used. Results: In total, 56 patients (mean age 67 (±14) years) were included. The mean observed sedation period was 224 (±139) hours. To achieve the prescribed sedation level, two or three sedatives were needed in 48.7% and 12.8% of the cases, respectively. In cases with a triple sedation regimen, the combination of clonidine, esketamine and midazolam was observed most often (75.7%). Analgesia was achieved using sufentanil in 98.6% of the cases. The analysis showed that the majority of COVID-19 patients required unusually high sedation doses compared to those reported in the literature. Conclusion: The global pandemic continues to severely affect patients requiring ventilation and sedation, but optimal sedation strategies are still lacking. The findings of our observation suggest unusually high dosages of sedatives in mechanically ventilated patients with COVID-19.
Prescribed sedation levels appear to be achievable only with combinations of several sedatives in most critically ill patients suffering from COVID-19-induced ARDS, and a potential association with the frequently required advanced critical care measures, including prone positioning and ECMO treatment, seems conceivable.
Intrahepatic cholangiocarcinoma (iCCA) is the most frequent subtype of cholangiocarcinoma (CCA), and the incidence has globally increased in recent years. In contrast to surgically treated iCCA, data on the impact of fibrosis on survival in patients undergoing palliative chemotherapy are missing. We retrospectively analyzed the cases of 70 patients diagnosed with iCCA between 2007 and 2020 in our tertiary hospital. Histopathological assessment of fibrosis was performed by an expert hepatobiliary pathologist. Additionally, the fibrosis-4 score (FIB-4) was calculated as a non-invasive surrogate marker for liver fibrosis. For overall survival (OS) and progression-free survival (PFS), Kaplan–Meier curves and Cox-regression analyses were performed. Subgroup analyses revealed a median OS of 21 months (95% CI = 16.7–25.2 months) and 16 months (95% CI = 7.6–24.4 months) for low and high fibrosis, respectively (p = 0.152). In non-cirrhotic patients, the median OS was 21.8 months (95% CI = 17.1–26.4 months), compared with 9.5 months (95% CI = 4.6–14.3 months) in cirrhotic patients (p = 0.007). In conclusion, patients with iCCA and cirrhosis receiving palliative chemotherapy have decreased OS rates, while fibrosis has no significant impact on OS or PFS. These patients should not be prevented from state-of-the-art first-line chemotherapy.
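The FIB-4 score used above as a non-invasive surrogate for liver fibrosis follows a standard published formula. A minimal illustrative sketch (not code from the study; the cut-offs in the comment are the commonly cited ones):

```python
import math

def fib4_score(age_years: float, ast_u_l: float, alt_u_l: float,
               platelets_10e9_l: float) -> float:
    """FIB-4 = (age [years] * AST [U/L]) / (platelets [10^9/L] * sqrt(ALT [U/L]))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Commonly used cut-offs: < 1.45 suggests a low risk of advanced fibrosis,
# > 3.25 a high risk; values in between are indeterminate.
```

For example, a 61-year-old with AST 60 U/L, ALT 40 U/L, and platelets 150 x 10^9/L scores about 3.86, i.e. above the high-risk threshold.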
In-line filtration of intravenous infusion may reduce organ dysfunction of adult critical patients
(2019)
Background: The potential harmful effects of particle-contaminated infusions for critically ill adult patients are as yet unclear. So far, a significantly improved outcome with in-line filters has been demonstrated only in critically ill children and newborns; for adult patients, evidence is still missing.
Methods: This single-centre, retrospective controlled cohort study assessed the effect of in-line filtration of intravenous fluids with finer 0.2 or 1.2 μm vs 5.0 μm filters in critically ill adult patients. From a total of n = 3215 adult patients, n = 3012 patients were selected by propensity score matching (adjusting for sex, age, and surgery group) and assigned to either a fine filter cohort (with 0.2/1.2 μm filters, n = 1506, time period from February 2013 to January 2014) or a control filter cohort (with 5.0 μm filters, n = 1506, time period from April 2014 to March 2015). The cohorts were compared regarding the occurrence of severe vasoplegia, organ dysfunctions (lung, kidney, and brain), inflammation, in-hospital complications (myocardial infarction, ischemic stroke, pneumonia, and sepsis), in-hospital mortality, and length of ICU and hospital stay.
Results: Comparing the fine filter vs the control filter cohort, respiratory dysfunction (Horowitz index 206 (119–290) vs 191 (104.75–280); P = 0.04), pneumonia (11.4% vs 14.4%; P = 0.02), sepsis (9.6% vs 12.2%; P = 0.03), interleukin-6 (471.5 (258.8–1062.8) ng/l vs 540.5 (284.5–1147.5) ng/l; P = 0.01), and length of ICU stay (1.2 (0.6–4.9) vs 1.7 (0.8–6.9) days; P < 0.01) and hospital stay (14.0 (9.2–22.2) vs 14.8 (10.0–26.8) days; P = 0.01) were reduced. The rates of severe vasoplegia (21.0% vs 19.6%; P > 0.20) and acute kidney injury (11.8% vs 13.7%; P = 0.11) were not significantly different between the cohorts.
Conclusions: In-line filtration with finer 0.2 and 1.2 μm filters may be associated with less organ dysfunction and less inflammation in critically ill adult patients.
Trial registration: The study was registered at ClinicalTrials.gov (number: NCT02281604).
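The propensity score matching described in the methods pairs each patient in one cohort with the closest-scoring patient in the other. As an illustration only (the study's actual covariates, software, and matching algorithm are not reported here), a greedy 1:1 nearest-neighbour sketch on hypothetical precomputed scores:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, controls: dicts mapping patient id -> propensity score.
    Returns a list of (treated_id, control_id) pairs; each control is
    used at most once, and candidate pairs farther apart than `caliper`
    on the score scale are rejected.
    """
    available = dict(controls)
    pairs = []
    # Match treated patients in descending score order (a common heuristic,
    # since high-score patients have the fewest suitable controls).
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs
```

In practice the scores themselves would come from a logistic regression of cohort membership on the matching covariates (here sex, age, and surgery group); the sketch assumes they are already computed.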
Introduction: The number of individuals requesting medical treatment for gender dysphoria has increased significantly in recent years. Our purpose was to examine current biographic and socio-demographic characteristics and aspects of legal gender reassignment.
Design: Medical files from n = 350 individuals of a German endocrine outpatient clinic were collected from 2009 to 2017 and analysed retrospectively.
Results: The ratio of transwomen to transmen was 1:1.89, with a remarkable increase in transmen from 2013 onward, showing a reversal of the gender distribution compared with previous studies for the first time. Use of illegal substances or self-initiated hormone therapy was rare (4.6% and 2.1%). Satisfaction with gender-affirming hormone therapy was significantly higher in transmen than in transwomen (100% vs 96.2%, P = .005). Use of antidepressants declined significantly after onset of hormone treatment in transmen (13% vs 7%; P = .007). The number of individuals with a graduation diploma was only about half as high as in the general population (14.3% vs 27.3%), whereas the unemployment rate was more than twice as high (14% vs 6.9%). Median latency between application for legal gender reassignment and the definitive court decision was 9 months.
Conclusions: Our data provide possible indications of a decline in psychosocial burden among individuals diagnosed with gender dysphoria in recent years. However, affected individuals are still limited in their occupational and financial opportunities, as well as by a complex and expensive procedure of legal gender reassignment in Germany.
Influence of antibiotic-regimens on intensive-care unit-mortality and liver-cirrhosis as risk factor
(2016)
AIM: To assess the rate of infection, appropriateness of antimicrobial therapy and mortality in the intensive care unit (ICU), with a special focus on patients with liver cirrhosis.
METHODS: The study was approved by the local ethics committee. All patients admitted to the internal medicine ICU between April 1, 2007 and December 31, 2009 were included. Data on infection, microbiological laboratory reports, diagnosis and therapy were extracted retrospectively from patient charts and electronic documentation. Because of the large hepatology department and liver transplantation center, special interest was placed on the subgroup of patients with liver cirrhosis. The primary statistical endpoint was the evaluation of the influence of appropriate versus inappropriate antimicrobial therapy on in-hospital mortality.
RESULTS: Charts of 1979 patients were available. The overall infection rate was 53%. Multiresistant bacteria were present in 23% of patients with infection and were associated with increased mortality (P < 0.000001). Patients with infection had significantly increased in-hospital mortality (34% vs 17%, P < 0.000001). Only 9% of patients with infection received inappropriate initial antimicrobial therapy; no influence on mortality was observed. Independent risk factors for in-hospital mortality were the presence of septic shock, prior chemotherapy for malignoma and infection with Pseudomonas spp. Infection and mortality rates among the 175 patients with liver cirrhosis were significantly higher than in patients without liver cirrhosis. Infection increased mortality 2.24-fold in patients with cirrhosis. Patients with liver cirrhosis were also at increased risk of receiving inappropriate initial antimicrobial therapy.
CONCLUSION: The results of the present study indicate the successful implementation of early goal-directed therapy. Patients with liver cirrhosis are at increased risk of infection, mortality and of receiving inappropriate therapy. Multiresistant bacteria are an increasing burden.
Interleukin-22 predicts severity and death in advanced liver cirrhosis: a prospective cohort study
(2012)
Background: Interleukin-22 (IL-22), recently identified as a crucial parameter of pathology in experimental liver damage, may determine survival in clinical end-stage liver disease. Systematic analysis of serum IL-22 in relation to morbidity and mortality of patients with advanced liver cirrhosis has not been performed so far.
Methods: This is a prospective cohort study including 120 liver cirrhosis patients and 40 healthy donors to analyze systemic levels of IL-22 in relation to survival and hepatic complications.
Results: A total of 71% of patients displayed liver cirrhosis-related complications at study inclusion. A total of 23% of the patients died during a mean follow-up of 196 ± 165 days. Systemic IL-22 was detectable in 74% of patients but only in 10% of healthy donors (P < 0.001). Elevated levels of IL-22 were associated with ascites (P = 0.006), hepatorenal syndrome (P < 0.0001), and spontaneous bacterial peritonitis (P = 0.001). Patients with elevated IL-22 (>18 pg/ml, n = 57) showed significantly reduced survival compared to patients with regular (≤18 pg/ml) levels of IL-22 (321 days versus 526 days, P = 0.003). Other factors associated with overall survival were high CRP (≥2.9 mg/dl, P = 0.005, hazard ratio (HR) 0.314, confidence interval (CI) (0.141 to 0.702)), elevated serum creatinine (P = 0.05, HR 0.453, CI (0.203 to 1.012)), presence of liver-related complications (P = 0.028, HR 0.258, CI (0.077 to 0.862)), model for end-stage liver disease (MELD) score ≥20 (P = 0.017, HR 0.364, CI (0.159 to 0.835)) and age (P = 0.011, HR 1.047, CI (1.011 to 1.085)). Adjusted multivariate Cox proportional-hazards analysis identified elevated systemic IL-22 levels as an independent predictor of reduced survival (P = 0.007, HR 0.218, CI (0.072 to 0.662)).
Conclusions: In patients with liver cirrhosis, elevated systemic IL-22 levels are predictive for reduced survival independently from age, liver-related complications, CRP, creatinine and the MELD score. Thus, processes that lead to a rise in systemic interleukin-22 may be relevant for prognosis of advanced liver cirrhosis.
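The survival comparisons in the cirrhosis studies above rest on Kaplan–Meier estimates. For illustration, a minimal product-limit estimator in its textbook form (not the studies' analysis code, which in practice would use a statistical package):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times  : observed follow-up times
    events : 1 = death observed, 0 = censored at that time
    Returns a list of (time, S(t)) pairs at each event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e == 1)
    n_at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        # Everyone observed at time t (event or censored) leaves the risk set.
        n_at_risk -= sum(1 for x in times if x == t)
    return curve
```

Censored patients reduce the number at risk without dropping the curve, which is why median survival remains estimable despite incomplete follow-up.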
Background: Thyroid Imaging Reporting and Data System (TIRADS) was developed to improve patient management and cost-effectiveness by avoiding unnecessary fine needle aspiration biopsy (FNAB) in patients with thyroid nodules. However, its clinical use is still very limited. Strain elastography (SE) enables the determination of tissue elasticity and has shown promising results for the differentiation of thyroid nodules.
Methods: The aim of the present study was to evaluate the interobserver agreement (IA) of TIRADS developed by Horvath et al. and SE. Three blinded observers independently scored stored images of TIRADS and SE in 114 thyroid nodules (114 patients). Cytology and/or histology was available for all benign nodules (n = 99) and histology for all malignant nodules (n = 15).
Results: The IA between the 3 observers was only fair for TIRADS categories 2–5 (Cohen's kappa (κ) = 0.27, p = 0.000001) and TIRADS categories 2/3 versus 4/5 (κ = 0.25, p = 0.0020). The IA was substantial for SE scores 1–4 (κ = 0.66, p < 0.000001) and very good for SE scores 1/2 versus 3/4 (κ = 0.81, p < 0.000001). 92–100% of patients with TIRADS-2 had benign lesions, while 28–42% with TIRADS-5 had malignant cytology/histology. The negative predictive value (NPV) for the diagnosis of malignancy was 92–100% for TIRADS using categories 4 and 5, and 96–98% for SE using scores ES-3 and ES-4, respectively. However, only 11–42% of nodules were in TIRADS categories 2 and 3, as compared to 58–60% with ES-1 and ES-2.
Conclusions: IA of TIRADS developed by Horvath et al. is only fair. TIRADS and SE have high NPV for excluding malignancy in the diagnostic work-up of thyroid nodules.
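Interobserver agreement here is quantified with Cohen's kappa; a minimal two-rater implementation of the standard formula (illustrative, not the study's statistics code):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters.

    p_o is the observed agreement rate; p_e is the agreement expected
    by chance from the raters' marginal category frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: sum over categories of the product of marginal rates.
    p_e = sum(count_a[c] * count_b.get(c, 0) for c in count_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)
```

The qualitative labels in the abstract ("fair", "substantial") correspond to the widely used Landis and Koch benchmarks: 0.21–0.40 fair, 0.61–0.80 substantial, above 0.80 almost perfect.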
Triple therapy of chronic hepatitis C virus (HCV) infection with boceprevir (BOC) or telaprevir (TVR) leads to virologic failure in many patients, which is often associated with the selection of resistance-associated variants (RAVs). These resistance profiles are of importance for the selection of potential rescue treatment options. In this study, we performed population-based sequencing of baseline NS3 RAVs and investigated the sensitivity of NS3 phenotypes in an HCV replicon assay, together with clinical factors, for a prediction of treatment response in a cohort of 165 German and Swiss patients treated with a BOC- or TVR-based triple therapy. Overall, the prevalence of baseline RAVs was low, although the frequency of RAVs was higher in patients with virologic failure compared to those who achieved a sustained virologic response (SVR) (7% versus 1%, P = 0.06). The occurrence of RAVs was associated with a resistant NS3 quasispecies phenotype (P < 0.001), but the sensitivity of phenotypes was not associated with treatment outcome (P = 0.2). The majority of single viral and host predictors of SVR were only weakly associated with treatment response. In multivariate analyses, low AST levels, female sex and an IFNL4 CC genotype were independently associated with SVR. However, a combined analysis of negative predictors revealed a significantly lower overall number of negative predictors in patients with SVR in comparison to individuals with virologic failure (P < 0.0001), and the presence of 2 or fewer negative predictors was indicative of SVR. These results demonstrate that most single baseline viral and host parameters have a weak influence on the response to triple therapy, whereas the overall number of negative predictors has a high predictive value for SVR.
Rapid immune reconstitution (IR) following stem cell transplantation (SCT) is essential for a favorable outcome. The optimization of graft composition should not only enable sufficient IR but also improve graft-versus-leukemia/tumor effects, overcome infectious complications and, finally, improve patient survival. Especially in haploidentical SCT, the optimization of graft composition is controversial. Therefore, we analyzed the influence of graft manipulation on IR in 40 patients with acute leukemia in remission. We examined cell recovery after haploidentical SCT in patients receiving a CD34+-selected or CD3/CD19-depleted graft, considering the applied conditioning regimen. We used joint model analysis for overall survival (OS) and analyzed the dynamics of age-adjusted leukocytes; lymphocytes; monocytes; CD3+, CD3+CD4+, and CD3+CD8+ T cells; natural killer (NK) cells; and B cells over the course of time after SCT. Lymphocytes, NK cells, and B cells expanded more rapidly after SCT with CD34+-selected grafts (P = 0.036, P = 0.002, and P < 0.001, respectively). In contrast, CD3+CD4+ helper T cells recovered with a delay in the CD34+-selected group (P = 0.026). Furthermore, reduced-intensity conditioning facilitated faster immune recovery of lymphocytes and T cells and their subsets (P < 0.001). However, immune recovery of NK cells and B cells was comparable for patients who received reduced-intensity or full preparative regimens. The dynamics of all cell types had a significant influence on OS, which did not differ between patients receiving CD34+-selected and those receiving CD3/CD19-depleted grafts. In conclusion, cell reconstitution dynamics showed complex diversity with regard to the graft manufacturing procedure and conditioning regimen.
The free radical theory of aging suggests reactive oxygen species (ROS) as a main reason for the accumulation of damage events that eventually lead to aging. Nox4, a member of the family of NADPH oxidases, constitutively produces ROS and therefore has the potential to be a main driver of aging. Herein we analyzed the lifespan of Nox4-deficient mice and found no difference when compared to their wildtype littermates. Accordingly, neither Tert expression nor telomere length was different in cells isolated from those animals. In fact, Nox4 mRNA expression in lungs of wildtype mice dropped with age. We conclude that Nox4 has no influence on the lifespan of healthy mice.
Background: In recent months, Omicron variants of SARS-CoV-2 have become dominant in many regions of the world, and case numbers with Omicron subvariants BA.1 and BA.2 continue to increase. Due to numerous mutations in the spike protein, the efficacy of currently available vaccines, which are based on Wuhan-Hu 1 isolate of SARS-CoV-2, is reduced, leading to breakthrough infections. Efficacy of monoclonal antibody therapy is also likely impaired.
Methods: In our in vitro study using A549-AT cells constitutively expressing ACE2 and TMPRSS2, we determined and compared the neutralizing capacity of vaccine-elicited sera, convalescent sera and monoclonal antibodies against authentic SARS-CoV-2 Omicron BA.1 and BA.2 compared with Delta.
Findings: Almost no neutralisation of Omicron BA.1 and BA.2 was observed using sera from individuals vaccinated with two doses 6 months earlier, regardless of the type of vaccine taken. Shortly after the booster dose, most sera from triple BNT162b2-vaccinated individuals were able to neutralise both Omicron variants. In line with waning antibody levels three months after the booster, only weak residual neutralisation was observed for BA.1 (26%, n = 34, median NT50 of 0) and BA.2 (44%, n = 34, median NT50 of 0). In addition, BA.1 but not BA.2 was resistant to the neutralising monoclonal antibodies casirivimab/imdevimab, while BA.2 exhibited almost complete evasion from the neutralisation induced by sotrovimab.
Interpretation: Both SARS-CoV-2 Omicron subvariants BA.1 and BA.2 escape antibody-mediated neutralisation elicited by vaccination, previous infection with SARS-CoV-2, and monoclonal antibodies. Waning immunity renders the majority of tested sera obtained three months after booster vaccination negative in BA.1 and BA.2 neutralisation. Omicron subvariant-specific resistance to the monoclonal antibodies casirivimab/imdevimab and sotrovimab emphasizes the importance of genotype surveillance and guided application.
Funding: This study was supported in part by the Goethe-Corona-Fund of the Goethe University Frankfurt (M.W.) and the Federal Ministry of Education and Research (COVIDready; grant 02WRS1621C) (M.W.).
Rationale: Postinfectious bronchiolitis obliterans (PIBO) is a rare, chronic respiratory condition that follows an acute insult due to a severe infection of the lower airways. Objectives: The objective of this study was to investigate the long-term course of bronchial inflammation and pulmonary function testing in children with PIBO. Methods: Medical charts of 21 children with PIBO were analyzed retrospectively at the Children's University Hospital Frankfurt/Main, Germany. Pulmonary function tests (PFTs) with an interval of at least 1 month were studied between 2002 and 2019. A total of 382 PFTs were analyzed retrospectively; per year, the two best PFTs (217 in total) were evaluated. Additionally, 56 sputum analyses were assessed and sputum neutrophils were evaluated. Results: The evaluation of the 217 PFTs showed a decrease in FEV1 with a loss of 1.07% and a loss in z score of −0.075 per year. FEV1/FVC decreased by 1.44 per year. FVC remained stable, showing a nonsignificant increase of 0.006 in z score per year. However, FEV1 and FVC in litres increased significantly with height (FEV1: 0.032 L/cm; FVC: 0.048 L/cm). Sputum neutrophils showed a significant increase of 2.12% per year. Conclusion: Our results demonstrate that pulmonary function in patients with PIBO decreased significantly, showing persistent obstruction over an average follow-up period of 8 years. However, persistent lung growth was revealed. In addition, pulmonary inflammation persisted, with a clearly increasing proportion of neutrophils in induced sputum. Patients did not present with a general susceptibility to respiratory infections.
The immune response is known to wane after vaccination with BNT162b2, but the role of age, morbidity and body composition is not well understood. We conducted a cross-sectional study in long-term care facilities (LTCFs) for the elderly. All study participants had completed two-dose vaccination with BNT162b2 five to seven months before sample collection. In 298 residents (median age 86 years, range 75–101), anti-SARS-CoV-2 receptor-binding domain IgG antibody (anti-RBD-IgG) concentrations were low and inversely correlated with age (mean 51.60 BAU/ml). We compared the results to health care workers (HCWs) aged 18–70 years (n = 114, median age: 53 years), who had a higher mean anti-RBD-IgG concentration of 156.99 BAU/ml. Neutralization against the Delta variant was low in both groups (9.5% in LTCF residents and 31.6% in HCWs). The Charlson Comorbidity Index was inversely correlated with anti-RBD-IgG, but the body mass index (BMI) was not. A control group of 14 LTCF residents with known breakthrough infection had significantly higher antibody concentrations (mean 3,199.65 BAU/ml), and 85.7% had detectable neutralization against the Delta variant. Our results demonstrate low but recoverable markers of immunity in LTCF residents five to seven months after vaccination.
Purpose: The prevalence of "local allergic rhinitis" within individuals suffering from perennial rhinitis remains uncertain, and patients are usually diagnosed with non-allergic rhinitis. The aim of this study was to evaluate the prevalence of a potential "local allergic rhinitis" in subjects suffering from non-allergic rhinitis in a non-selected group of young students.
Methods: 131 students (age 25.0 ± 5.1 years) with possible allergic rhinitis and 25 non-allergic controls without rhinitis symptoms (age 22.0 ± 2.0 years) were recruited by public postings. 97 of the 131 students with rhinitis tested positive (≥3 mm) on prick testing with 17 frequent allergens at visit 1. Twenty-four subjects with a house dust mite (HDM) allergy, 21 subjects with non-allergic rhinitis, and 18 non-allergic controls were further investigated at visit 2. Blood samples were taken, and nasal secretion was examined. In addition, all groups performed a nasal provocation test with HDM.
Results: In serum and nasal secretion, total IgE and house dust mite-specific IgE differed significantly between HDM-positive subjects and controls. However, no differences between non-allergic rhinitis subjects and control subjects were quantifiable. Neither the nasal provocation test nor nasal IgE to HDM allergens showed a measurable positive response in any of the non-allergic rhinitis subjects or the healthy controls, while being positive in 13 subjects with HDM allergy.
Conclusions: Nasal IgE is present in subjects with HDM allergy, but not in non-allergic rhinitis. In the investigated non-selected population, exclusive local production of IgE is absent. By implication, therefore, our findings challenge the emerging concept of local allergic rhinitis.
Study identifier at ClinicalTrials.gov: NCT02810535.