Cholinesterase alterations in delirium after cardiosurgery: a German monocentric prospective study
(2020)
Objectives: Postoperative delirium (POD) is a common complication after elective cardiac surgery. Recent evidence indicates that a disruption in the normal activity of the cholinergic system may be associated with delirium.
Design: Prospective observational study.
Setting: Single-centre at a European academic hospital.
Primary and secondary outcome measures: The enzyme activities of acetylcholinesterase (AChE) and butyrylcholinesterase (BChE) were determined preoperatively as well as on the first and second postoperative days. The Confusion Assessment Method for the Intensive Care Unit was used to screen patients for the presence of POD.
Results: A total of 114 patients were included in the study. POD was associated with a decrease in BChE activity on postoperative day 1 (p=0.03). In addition, patients who developed POD had significantly lower preoperative AChE activity than patients without POD (p<0.01). Multivariate analysis identified preoperatively decreased AChE activity (OR 3.1; 95% CI 1.14 to 8.46), anticholinergic treatment (OR 5.09; 95% CI 1.51 to 17.23), an elevated European System for Cardiac Operative Risk Evaluation score (OR 3.68; 95% CI 1.04 to 12.99) and age (OR 3.02; 95% CI 1.06 to 8.62) as independently associated with the development of POD.
Conclusions: We conclude that a reduction in the acetylcholine hydrolysing enzyme activity in patients undergoing cardiac surgery may correlate with the development of POD.
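The multivariate odds ratios reported above come from a logistic regression model. As a simplified illustration of where such effect sizes come from, the following Python sketch computes an unadjusted odds ratio with a Wald 95% CI from a 2×2 table; the counts are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20 of 50 low-AChE patients developed POD
# vs. 10 of 64 patients with normal AChE activity.
or_, lo, hi = odds_ratio_ci(20, 30, 10, 54)
```

The adjusted ORs in the abstract additionally control for the other covariates, which a 2×2 table cannot do; this sketch only shows the basic arithmetic behind an OR and its confidence interval.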
Background: Macrophage migration inhibitory factor (MIF) is highly elevated after cardiac surgery and impacts postoperative inflammation. The aim of this study was to analyze whether the polymorphisms CATT5–7 (rs5844572/rs3063368, “-794”) and the G>C single-nucleotide polymorphism (rs755622, “-173”) in the MIF gene promoter are related to postoperative outcome. Methods: In 1116 patients undergoing cardiac surgery, the MIF gene polymorphisms were analyzed, and serum MIF was measured by ELISA in 100 patients. Results: Patients with at least one extended repeat allele (CATT7) had a significantly higher risk of acute kidney injury (AKI) compared to others (23% vs. 13%; OR 2.01 (1.40–2.88), p = 0.0001). Carriers of CATT7 were also at higher risk of death (1.8% vs. 0.4%; OR 5.12 (0.99–33.14), p = 0.026). The GC genotype was associated with AKI (20% vs. GG/CC: 13%, OR 1.71 (1.20–2.43), p = 0.003). Multivariate analyses identified CATT7 as predictive for AKI (OR 2.13 (1.46–3.09), p < 0.001) and death (OR 5.58 (1.29–24.04), p = 0.021). CATT7 was also associated with higher serum MIF before surgery (79.2 vs. 50.4 ng/mL, p = 0.008). Conclusion: The CATT7 allele is associated with a higher risk of AKI and death after cardiac surgery, which might be related to chronically elevated serum MIF. Polymorphisms in the MIF gene may constitute a predisposition for postoperative complications, and their assessment may improve risk stratification and therapeutic guidance.
BACKGROUND: Recent findings support the idea that interleukin (IL)-22 serum levels are related to disease severity in end-stage liver disease. Existing scoring systems--the Model for End-Stage Liver Disease (MELD), Survival Outcomes Following Liver Transplantation (SOFT) and Pre-allocation SOFT (P-SOFT)--are well established for appraising survival rates with or without liver transplantation. We tested the hypothesis that IL-22 serum levels at the transplantation date correlate with survival and may have value as a predictive factor.
MATERIAL AND METHODS: MELD, SOFT, and P-SOFT scores were calculated to estimate post-transplantation survival. Serum levels of IL-22, IL-6, IL-10, C-reactive protein (CRP), and procalcitonin (PCT) were collected prior to transplantation in 41 patients. Outcomes were assessed at 3 months, 1 year, and 3 years after transplantation.
RESULTS: IL-22 significantly correlated with the MELD, P-SOFT, and SOFT scores (Rs 0.35, 0.63, and 0.56, respectively; p<0.05) and with the discrimination of post-transplantation survival. IL-6 showed a heterogeneous pattern (Rs 0.40, 0.63, and 0.57, respectively; p<0.05); CRP and PCT did not correlate. We therefore added IL-22 serum values to the existing scoring systems in a generalized linear model (GLM), resulting in a significantly improved outcome prediction in 58% of the cases for both the P-SOFT (p<0.01) and SOFT scores (p<0.001).
CONCLUSIONS: Further studies are needed to address the concept that IL-22 serum values at the time of transplantation provide valuable information about survival rates following orthotopic liver transplantation.
Introduction: Organ dysfunction or failure after the first days of intensive care unit (ICU) treatment, and the subsequent mortality with respect to the type of ICU admission, are poorly elucidated. We therefore analyzed the association of ICU mortality with admission as a medical (M), scheduled surgery (ScS) or unscheduled surgery (US) patient, mirrored by the occurrence of organ dysfunction/failure (OD/OF) after the first 72 h of ICU stay.
Methods: For this retrospective cohort study (23,795 patients from the registry of the German Interdisciplinary Association for Intensive Care Medicine (DIVI)), organ dysfunction or failure was derived from the Sequential Organ Failure Assessment (SOFA) score (excluding the Glasgow Coma Scale). SOFA scores were collected on admission to the ICU and 72 h later. For patients with a length of stay of at least five days, a multivariate analysis was performed for individual OD/OF on day three.
Results: M patients had the lowest prevalence of cardiovascular failure (M 31%; ScS 35%; US 38%) and the highest prevalence of respiratory (M 24%; ScS 13%; US 17%) and renal failure (M 10%; ScS 6%; US 7%). The risk of death was highest in M and ScS patients with respiratory failure (OR: M 2.4; ScS 2.4; US 1.4) and in surgical patients with renal failure (OR: M 1.7; ScS 2.7; US 2.4).
Conclusion: The dynamic evolution of OD/OF within 72h after ICU admission and mortality differed between patients depending on their types of admission. This has to be considered to exclude a systematic bias during multi-center trials.
Introduction: The triggering receptor expressed on myeloid cells-1 (TREM-1) is known to be expressed during bacterial infections. We investigated whether TREM-1 is also expressed in non-infectious inflammation following traumatic lung contusion.
Methods: In a study population of 45 adult patients with multiple trauma and lung contusion, we obtained bronchoalveolar lavage (BAL) fluid (blind suctioning of 20 ml NaCl (0.9%) via jet catheter) and collected blood samples at two time points (16 hours and 40 hours) after trauma. Post hoc, patients were assigned to one of four groups, radiologically classified according to the severity of lung contusion on the initial chest tomography. The concentration of soluble TREM-1 (sTREM-1) and bacterial growth were determined in the BAL fluid. sTREM-1, IL-6, IL-10, lipopolysaccharide-binding protein, procalcitonin, C-reactive protein and leukocyte count were assessed in blood samples. Pulmonary function was evaluated by the paO2/FiO2 ratio.
Results: Three patients were excluded due to positive bacterial growth in the initial BAL. In 42 patients the severity of lung contusion correlated with the levels of sTREM-1 16 hours and 40 hours after trauma. sTREM-1 levels were significantly (P < 0.01) elevated in patients with severe contusion (2,184 pg/ml (620 to 4,000 pg/ml)) in comparison with patients with mild (339 pg/ml (135 to 731 pg/ml)) or no (217 pg/ml (97 to 701 pg/ml)) contusion 40 hours following trauma. At both time points the paO2/FiO2 ratio correlated negatively with sTREM-1 levels (Spearman correlation coefficient = -0.446, P < 0.01).
Conclusions: sTREM-1 levels are elevated in the BAL of patients following pulmonary contusion. Furthermore, the levels of sTREM-1 in the BAL correlate well with both the severity of radiological pulmonary tissue damage and functional impairment of gas exchange (paO2/FiO2 ratio).
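The correlation between sTREM-1 and the paO2/FiO2 ratio above is reported as a Spearman coefficient. As an illustration only (not the authors' analysis code), Spearman's rho is simply the Pearson correlation of the two rank vectors, which can be sketched in plain Python:

```python
def _ranks(xs):
    """1-based average ranks; tied values share the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of tied values
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

A negative rho, as found here (-0.446), means higher sTREM-1 ranks go with lower paO2/FiO2 ranks; significance testing would need an additional step not shown in this sketch.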
Background: Numerous cases of swine-origin 2009 H1N1 influenza A virus (H1N1)-associated acute respiratory distress syndrome (ARDS) bridged by extracorporeal membrane oxygenation (ECMO) therapy have been reported; however, complication rates are high. We present our experience with H1N1-associated ARDS and successful bridging of lung function using superimposed high-frequency jet ventilation (SHFJV) in combination with continuous positive airway pressure/assisted spontaneous breathing (CPAP/ASB).
Methods: We admitted five patients with H1N1 infection and ARDS to our intensive care unit. Although all patients received pure oxygen and controlled ventilation, oxygenation remained insufficient. We applied SHFJV/CPAP/ASB to improve oxygenation.
Results: The initial PaO2/FiO2 ratio prior to SHFJV was 58-79 mmHg. Successful oxygenation was achieved by SHFJV in all patients (PaO2/FiO2 ratio 105-306 mmHg within 24 h). Spontaneous breathing was established during the first hours after admission. SHFJV could be stopped after 39, 40, 72, 100, and 240 h, respectively. Concomitant pulmonary herpes simplex virus (HSV) infection was observed in all patients. Two patients were successfully discharged. The other three patients relapsed and died within 7 weeks, mainly due to combined HSV infection and, in two cases, recurring H1N1 infection.
Conclusions: SHFJV represents an alternative to bridge lung function successfully and improve oxygenation in the critically ill.
Background: Age and preoperative anaemia are risk factors for poor surgical outcome and blood transfusion. The aim of this study was to examine the effect of iron supplementation in iron-deficient (ID) elderly patients undergoing major surgery.
Method: In this single-centre observational study, patients ≥ 65 years undergoing major surgery were screened for anaemia and ID. Patients were assigned to the following groups: A− (no anaemia); A−,ID+,T+ (no anaemia, iron-deficient, intravenous iron supplementation); A+ (anaemia); and A+,ID+,T+ (anaemia, iron-deficient, intravenous iron supplementation).
Results: Of 4,381 patients screened at the anaemia walk-in clinic, 2,381 (54%) patients were ≥ 65 years old and 2,191 cases were included in analysis. The ID prevalence was 63% in patients with haemoglobin (Hb) < 8 g/dl, 47.2% in patients with Hb from 8.0 to 8.9 g/dl, and 44.3% in patients with Hb from 9 to 9.9 g/dl. In severely anaemic patients, an Hb increase of 0.6 (0.4; 1.2) and 1.2 (0.7; 1.6) g/dl was detected with iron supplementation 6–10 and > 10 days before surgery, respectively. Hb increased by 0 (-0.1; 0) g/dl with iron supplementation 1–5 days before surgery, 0.2 (-0.1; 0.5) g/dl with iron supplementation 6–10 days before surgery, and 0.2 (-0.2; 1.1) g/dl with supplementation > 10 days before surgery (p < 0.001 for 1–5 vs. 6–10 days). Overall, 58% of A+,ID+,T+ patients showed an Hb increase of > 0.5 g/dl. The number of transfused red blood cell units was significantly lower in patients supplemented with iron (0 (0; 3)) compared to non-treated anaemic patients (1 (0; 4)) (p = 0.03). Patients with iron supplementation > 6 days before surgery achieved mobility 2 days earlier than patients with iron supplementation < 6 days.
Conclusions: Intravenous iron supplementation increases Hb level and thereby reduces blood transfusion rate in elderly surgical patients with ID anaemia.
The scope of extracorporeal membrane oxygenation (ECMO) is expanding; nevertheless, the pharmacokinetics in patients receiving cardiorespiratory support are largely unknown, leading to unpredictable drug concentrations. Currently, there are no clear guidelines for antibiotic dosing during ECMO. This study aims to evaluate the pharmacokinetics (PK) of cefazolin in patients undergoing ECMO treatment. Total and unbound plasma cefazolin concentrations of critically ill patients on veno-arterial ECMO were determined. The observed PK was compared to dose recommendations calculated by a freely available online dosing software. Cefazolin concentrations varied broadly despite the same dosage in all patients. The mean total and unbound plasma concentrations were high, with a significantly (p = 5.89 × 10−9) greater unbound fraction compared to a standard patient. Cefazolin clearance was significantly (p = 0.009) higher in patients with preserved renal function than in those on continuous renal replacement therapy (CRRT). Based on the calculated clearance, the use of the dosing software would generally have led to lower but still sufficient cefazolin concentrations. Our study shows that a “one size fits all” dosing regimen leads to excessive unbound cefazolin concentrations in these patients. They exhibit high PK variability, and decreased cefazolin clearance on ECMO appears to compensate for ECMO- and critical-illness-related increases in the volume of distribution.
Background: Conditions during blood product storage and transportation should maintain quality. The aim of this in vitro study was to investigate the effect of interruption of agitation, temporary cooling (TC), and pneumatic tube system transportation (PTST) on the aggregation ability (AA) and mitochondrial function (MF) of platelet concentrates (PC).
Study Design and Methods: A PC was divided equally into four subunits, which were then allocated to four test groups. The control group (I) was stored as recommended (continuous agitation, 22 ± 2°C) for 4 days. The test groups were stored without agitation (II), stored as recommended albeit at 4°C for 60 minutes on day (d)2 (III), or subjected to PTST (IV). Aggregometry was measured using Multiplate (Roche AG; ADPtest, ASPItest, TRAPtest, COLtest) and MF using an Oxygraph‐2k (Oroboros Instruments). The basal and maximum mitochondrial respiratory rates (MMRR) were determined. AA and MF were measured daily in I and II, and AA was measured in III and IV on d2 after TC/PTST. Statistical analysis was performed using tests for matched observations.
Results: Eleven PCs were used. TRAP‐6-induced AA was significantly lower in II than in I on d4 (P = 0.015). In III, the ASPItest was significantly lower (P = 0.032). IV showed no significant differences. The basal and maximum mitochondrial respiratory rates were significantly reduced over the 4 days in I and II (for both rates in both groups: P < 0.0001). No significant differences occurred on d4 (P = 0.495).
Conclusion: Our results indicate that the ex vivo AA and MF of PCs are unaffected even under non-ideal storage and transport conditions with respect to agitation, temperature, and mechanical force.
A high incidence of thromboembolic events associated with high mortality has been reported in severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) infections with respiratory failure. The present study characterized post-transcriptional gene regulation by global microRNA (miRNA) expression in relation to activated coagulation and inflammation in 21 critically ill SARS-CoV-2 patients. The cohort consisted of patients with moderate respiratory failure (n = 11) and severe respiratory failure (n = 10) at an acute stage (day 0–3) and in the later course of the disease (>7 days). All patients needed supplemental oxygen, and severe patients were defined by the requirement of positive pressure ventilation (intubation). Levels of D-dimers, activated partial thromboplastin time (aPTT), C-reactive protein (CRP), and interleukin (IL)-6 were significantly higher in patients with severe compared with moderate respiratory failure. Concurrently, next-generation sequencing (NGS) analysis demonstrated increased dysregulation of miRNA expression with progression of disease severity, connected to extreme downregulation of miR-320a, miR-320b and miR-320c. Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway analysis revealed involvement in the Hippo signaling pathway, the transforming growth factor (TGF)-β signaling pathway and the regulation of adherens junctions. The expression of all miR-320 family members was significantly correlated with CRP, IL-6, and D-dimer levels. In conclusion, our analysis underlines the importance of thromboembolic processes in patients with respiratory failure and emphasizes the miR-320 family as potential biomarkers for severe progressive SARS-CoV-2 infection.
Early and adequate restoration of endothelial and tubular renal function is a substantial step during regeneration after ischemia reperfusion (IR) injury, occurring, e.g., in kidney transplantation, renal surgery, and sepsis. While tubular epithelial cell injury has long been of central importance, recent work also includes the renal vascular endothelium. In this regard, the fibrin cleavage product fibrinopeptide Bβ15-42 mitigates IR injury by stabilizing interendothelial junctions through its affinity to VE-cadherin. This study therefore focused on the effect of Bβ15-42 on post-acute physiological renal regeneration. Adult male C57BL/6 mice were exposed to 30 min of bilateral renal ischemia followed by reperfusion for 24 h or 48 h. Animals were randomized into a non-operative control group and two operative groups treated with i.v. administration of either saline or Bβ15-42 (2.4 mg/kg) immediately prior to reperfusion. Endothelial activation and the inflammatory response in renal tissue homogenates were attenuated by a single application of Bβ15-42, whereas Bβ15-42 did not affect acute kidney injury markers. Regarding the angiogenic players VEGF-A, angiopoietin-1 and angiopoietin-2, however, we observed significantly higher expression at the mRNA level and a trend toward higher protein levels in Bβ15-42-treated mice compared with saline-treated mice after 48 h of IR, pointing toward increased angiogenic activity. Similar dynamics were observed for the intermediate filament vimentin, the cytoprotective protein klotho, stathmin and the proliferating cell nuclear antigen, which were significantly up-regulated at the same points in time. These results suggest a beneficial effect of anatomically contiguous endothelial cells on tubular regeneration through stabilization of endothelial integrity. Bβ15-42 may therefore represent a novel pharmacological approach to the targeted therapy of acute renal failure in clinical practice.
High sedation needs of critically ill COVID-19 ARDS patients - a monocentric observational study
(2021)
Background: Therapy of severely affected coronavirus patients requiring intubation and sedation remains challenging. Recently, difficulties in sedating these patients have been discussed. This study aims to describe sedation practices in patients with 2019 coronavirus disease (COVID-19)-induced acute respiratory distress syndrome (ARDS). Methods: We performed a retrospective monocentric analysis of sedation regimens in critically ill intubated patients with respiratory failure who required sedation in our mixed 32-bed university intensive care unit. All mechanically ventilated adults with COVID-19-induced ARDS requiring continuously infused sedative therapy admitted between April 4, 2020, and June 30, 2020 were included. We recorded demographic data, sedative dosages, prone positioning, sedation levels and duration. Descriptive data analysis was performed; for additional analysis, a logistic regression with mixed effects was used. Results: In total, 56 patients (mean age 67 (±14) years) were included. The mean observed sedation period was 224 (±139) hours. To achieve the prescribed sedation level, two or three sedatives were needed in 48.7% and 12.8% of the cases, respectively. In cases with a triple sedation regimen, the combination of clonidine, esketamine and midazolam was observed most often (75.7%). Analgesia was achieved using sufentanil in 98.6% of the cases. The analysis showed that the majority of COVID-19 patients required unusually high sedation doses compared with those reported in the literature. Conclusion: The global pandemic continues to affect patients severely enough to require ventilation and sedation, but optimal sedation strategies are still lacking. The findings of our observation suggest unusually high dosages of sedatives in mechanically ventilated patients with COVID-19.
Prescribed sedation levels appear to be achievable only with combinations of several sedatives in most critically ill patients suffering from COVID-19-induced ARDS, and an association with the sophisticated critical care these patients often require, including prone positioning and ECMO treatment, seems conceivable.
Association of mortality and early tracheostomy in patients with COVID-19: a retrospective analysis
(2022)
COVID-19 adds to the complexity of optimal timing for tracheostomy. Over the course of the pandemic, with expanding knowledge of the disease, many centers have changed their operating procedures and performed tracheostomy earlier. We studied data on early versus delayed tracheostomy with regard to patient outcomes such as mortality. We performed a retrospective analysis of all tracheostomies performed at our institution in patients diagnosed with COVID-19 from March 2020 to June 2021. The time from intubation to tracheostomy and the mortality of early (≤ 10 days) vs. late (> 10 days) tracheostomy were the primary objectives of this study. We used mixed Cox regression models to calculate the effect of distinct variables on events. We studied 117 tracheostomies. The interval from intubation to tracheostomy shortened significantly (Spearman’s correlation coefficient; rho = −0.44, p ≤ 0.001) during the course of the pandemic. Early tracheostomy was associated with a significant increase in mortality in uni- and multivariate analyses (hazard ratio 1.83, 95% CI 1.07–3.17, p = 0.029). The timing of tracheostomy in COVID-19 patients thus has a potentially critical impact on mortality, and it changed during the pandemic, tending toward earlier procedures. Future prospective research is necessary to substantiate these results.
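The mortality comparison above relies on Cox regression, which is beyond a short sketch, but the descriptive step that usually accompanies such survival analyses is a Kaplan-Meier curve per group. A minimal Kaplan-Meier estimator in Python, on hypothetical follow-up data rather than the study's records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  follow-up time per patient
    events: 1 = death observed, 0 = censored at that time
    Returns [(t, S(t))] at each time where a death occurs."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_with_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            # multiply by the conditional survival at this event time
            s *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, s))
        n_at_risk -= n_with_t  # deaths and censored leave the risk set
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Hypothetical follow-up (days) for four patients; 0 = censored.
curve = kaplan_meier([5, 10, 10, 20], [1, 1, 0, 1])
```

Comparing two such curves (early vs. late tracheostomy) with a log-rank test or a Cox model, as the authors did, then quantifies the difference as a hazard ratio.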
The coronavirus pandemic continues to challenge global healthcare. Severely affected patients are often in need of high doses of analgesics and sedatives. Sedation requirements were studied in critically ill coronavirus disease 2019 (COVID-19) patients in this prospective monocentric analysis. COVID-19 acute respiratory distress syndrome (ARDS) patients admitted between 1 April and 1 December 2020 were enrolled in the study. A statistical analysis of impeded sedation using mixed-effect linear regression models was performed. Overall, 114 patients were enrolled, requiring unusually high levels of sedatives. During 67.9% of the observation period, a combination of sedatives was required in addition to continuous analgesia. During ARDS therapy, 85.1% (n = 97) underwent prone positioning. Veno-venous extracorporeal membrane oxygenation (vv-ECMO) was required in 20.2% (n = 23) of all patients. vv-ECMO patients showed significantly higher sedation needs (p < 0.001). Patients with hepatic (p = 0.01) or renal (p = 0.01) dysfunction showed significantly lower sedation requirements. Except for patient age (p = 0.01), we could not find any significant influence of pre-existing conditions. Age, vv-ECMO therapy and additional organ failure were demonstrated to be factors influencing sedation needs. Young patients and those receiving vv-ECMO usually require increased sedation for intensive care therapy. However, further studies are needed to elucidate the causes and mechanisms of impeded sedation.
Introduction Patients undergoing heart valve surgery are predominantly transferred postoperatively to the intensive care unit (ICU) under continuous sedation. Volatile anaesthetics are an increasingly used alternative to intravenous substances in the ICU. Owing to their inhalational uptake and elimination, their pharmacological benefits have been repeatedly demonstrated. Volatile anaesthetics therefore appear suitable to meet the growing demands of fast-track cardiac surgery. However, their use requires special preparation at the bedside and trained medical and nursing staff, which might limit the pharmacological benefits. The aim of our work is to assess whether the temporal advantages of recovery under volatile sedation outweigh the higher effort of this special preparation.
Methods and analysis The study is designed to evaluate the differences between intravenous sedatives (n=48) and volatile sedatives (n=48) in continued intensive care sedation. It will be conducted as a prospective, randomised, controlled, single-blinded, monocentre trial in consenting adult patients undergoing heart valve surgery at a German university hospital. The study will examine the necessary preparation time, staff consultation and overall feasibility of the chosen sedation method. For this purpose, the continuation of sedation in the ICU with volatile sedatives is considered one study arm, and sedation with intravenous sedatives the comparison group. Due to rapid elimination and quick awakening after the termination of sedation, closer consultation between the attending physician and the ICU nursing staff is required, in addition to a prolonged setup time. The study analysis will include the required setup time, the time from admission to extubation as the primary outcome, and neurocognitive assessability. In addition, possible operation-specific factors (blood loss, complications), treatment parameters (catecholamine dosages, lung function) and laboratory results (acute kidney injury, acid-base balance (lactataemia), liver failure) will be collected as influencing factors. The study-relevant data will be extracted from the continuous digital records of the patient data management system after the patient has been discharged from the ICU. For statistical evaluation, 95% CIs will be calculated for the median time to extubation and neurocognitive assessability, and the association will be assessed with a Cox regression model. In addition, secondary binary outcome measures will be evaluated using Fisher’s exact tests. Further descriptive and exploratory statistical analyses are also planned.
Ethics and dissemination The study was approved by the Institutional Ethics Board of the University of Frankfurt, Germany (#20-1050). Informed consent of all individual patients will be obtained before randomisation. Results will be disseminated via publication in peer-reviewed journals.
Epidural catheterization has become an indispensable part of modern pain therapy, for example in obstetrics, and mastering this skill is an important competency. Videos are among the information sources with the highest information content for learning such skills. The present study aims to analyze videos on epidural catheter placement provided on the YouTube platform based on a validated checklist. An expert workshop selected crucial items for learning epidural catheterization in obstetrics. Items were identified and optimized in a five-step testing process. Using this checklist, videos from YouTube were evaluated by eleven health care professionals. Sixteen videos were identified and analyzed. In the catheterization-specific part of the checklist, only two videos showed satisfactory quality. In the didactic part, eleven out of 21 items reached a mean score >50% of the points. Interrater reliability was substantial for the catheterization-specific checklist (Fleiss’ kappa = 0.610) and fair for the didactic part (Fleiss’ kappa = 0.401). Overall, standard monitoring and appropriate aseptic technique were followed in only 42% and 49% of the videos, respectively. There was a significant correlation between the runtime and the content quality (p < 0.001). No correlation could be found in terms of platform rating parameters. Video quality varied greatly in terms of the requirements of this practical skill. The majority appear unsuitable for self-study due to serious errors and deficiencies regarding patient safety, and there is no quality control on free platforms. Accordingly, it is difficult to identify suitable videos for educational purposes.
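Fleiss' kappa, used above to quantify agreement among the eleven raters, is computed from a subjects-by-categories count table. A minimal Python sketch of the standard formula (illustrative only, not the study's code; the tiny tables below are made up):

```python
def fleiss_kappa(table):
    """Fleiss' kappa for a subjects x categories count table.
    table[i][j] = number of raters assigning subject i to category j;
    every row must sum to the same number of raters n."""
    N = len(table)            # number of subjects
    n = sum(table[0])         # raters per subject
    k = len(table[0])         # number of categories
    # Observed per-subject agreement P_i, then its mean P-bar
    p_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in table]
    p_bar = sum(p_i) / N
    # Chance agreement P_e from overall category proportions
    p_j = [sum(row[j] for row in table) / (N * n) for j in range(k)]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)
```

Values around 0.61, as for the catheterization-specific checklist, fall in the range conventionally labelled "substantial" agreement (Landis and Koch).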
Objectives: The ongoing coronavirus pandemic is challenging, especially in severely affected patients who require intubation and sedation. Although the potential benefits of sedation with volatile anesthetics in coronavirus disease 2019 patients are currently being discussed, the use of isoflurane in patients with coronavirus disease 2019–induced acute respiratory distress syndrome has not yet been reported. Design: We performed a retrospective analysis of critically ill patients with hypoxemic respiratory failure requiring mechanical ventilation. Setting: The study was conducted with patients admitted between April 4 and May 15, 2020 to our ICU. Patients: We included five patients who were previously diagnosed with severe acute respiratory syndrome coronavirus 2 infection. Intervention: Even with high doses of several IV sedatives, the targeted level of sedation could not be achieved. Therefore, the sedation regimen was switched to inhalational isoflurane. Clinical data were recorded using a patient data management system. We recorded demographic data, laboratory results, ventilation variables, sedative dosages, sedation level, prone positioning, duration of volatile sedation and outcomes. Measurements & Main Results: The mean age (four men, one woman) was 53.0 (±12.7) years. The mean duration of isoflurane sedation was 103.2 (±66.2) hours. Our data demonstrate a substantial improvement in the oxygenation ratio when using isoflurane sedation. Deep sedation as assessed by the Richmond Agitation and Sedation Scale was rapidly and closely controlled in all patients, and the subsequent discontinuation of IV sedation was possible within the first 30 minutes. No adverse events were detected. Conclusions: Our findings demonstrate the feasibility of isoflurane sedation in five patients suffering from severe coronavirus disease 2019 infection. Volatile isoflurane was able to achieve the required deep sedation and reduced the need for IV sedation.
Background: Extracorporeal life support (ECLS) has become an integral part of modern intensive care. The choice of support mode depends largely on the indication. Patients with respiratory failure are predominantly treated with a venovenous (VV) approach. We hypothesized that the mortality of ECLS therapy in Germany did not differ from that previously reported in the literature.
Methods: Inpatient data from Germany from 2007 to 2018, provided by the Federal Statistical Office of Germany, were analysed. The International Statistical Classification of Diseases and Related Health Problems (ICD) codes and German procedure keys (OPS) for extracorporeal membrane oxygenation (ECMO) types, acute respiratory distress syndrome (ARDS) and hospital mortality were used.
Results: In total, 45,647 hospitalized patients treated with ECLS were analysed. In Germany, 231 hospitals provided ECLS therapy, with a median of 4 VV-ECMO and 9 VA-ECMO cases in 2018. Overall hospital mortality remained higher than the values reported in the literature. The number of VV-ECMO cases increased by 236%, from 825 in 2007 to 2768 in 2018. ARDS was the main indication for VV-ECMO in only 33% of the patients in the past, but that proportion increased to 60% in 2018. VA-ECMO support is of minor importance in the treatment of ARDS in Germany. The age distribution of patients undergoing ECLS has shifted towards an older population. In 2018, hospital mortality decreased in VV-ECMO patients and in VV-ECMO patients with ARDS, to 53.9% (n = 1493) and 54.4% (n = 926), respectively.
Conclusions: ARDS is a severe disease with a high mortality rate despite ECLS therapy. Although endpoints and timing of the evaluations differed from those of the CESAR and EOLIA studies and the Extracorporeal Life Support Organization (ELSO) Registry, the reported mortality in these studies was lower than in the present analysis. Further prospective analyses are necessary to evaluate outcomes in ECMO therapy at the centre volume level.
Background: Approximately every third surgical patient is anemic. The most common form, iron deficiency anemia, results from persisting iron‐deficient erythropoiesis (IDE). Zinc protoporphyrin (ZnPP) is a promising parameter for diagnosing IDE, hitherto requiring blood drawing and laboratory workup.
Study design and methods: Noninvasive ZnPP (ZnPP‐NI) measurements are compared to ZnPP reference determination of the ZnPP/heme ratio by high‐performance liquid chromatography (ZnPP‐HPLC) and the analytical performance in detecting IDE is evaluated against traditional iron status parameters (ferritin, transferrin saturation [TSAT], soluble transferrin receptor–ferritin index [sTfR‐F], soluble transferrin receptor [sTfR]), likewise measured in blood. The study was conducted at the University Hospitals of Frankfurt and Zurich.
Results: Limits of agreement between ZnPP‑NI and ZnPP‑HPLC measurements for 584 cardiac and noncardiac surgical patients equaled 19.7 μmol/mol heme (95% confidence interval, 18.0–21.3; acceptance criterion, 23.2 μmol/mol heme; absolute bias, 0 μmol/mol heme). Analytical performance for detecting IDE (inferred from the area under the receiver operating characteristic curve) of parameters measured in blood was: ZnPP‑HPLC (0.95), sTfR (0.92), sTfR‑F (0.89), TSAT (0.87), and ferritin (0.67). The noninvasive ZnPP‑NI measurement yielded an AUC of 0.90.
Conclusion: ZnPP‑NI appears well suited for an initial IDE screening, informing on the state of erythropoiesis at the point of care without blood drawing and laboratory analysis. Comparison with a multiparameter IDE test showed that ZnPP‑NI values of 40 μmol/mol heme or less allow exclusion of IDE, whereas at 65 μmol/mol heme or greater IDE is very likely, provided other causes of increased values are excluded. In these cases (77% of our patients) ZnPP‑NI may suffice for a diagnosis, while values in between require analysis of additional iron status parameters.
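The agreement statistics reported above (absolute bias and limits of agreement between ZnPP‑NI and ZnPP‑HPLC) follow the Bland-Altman approach. A minimal sketch in Python, using made-up paired readings rather than the study's data:

```python
import math

def limits_of_agreement(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement between two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n                       # mean difference (systematic bias)
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired ZnPP readings (umol/mol heme) -- illustrative values only,
# not the study's data.
znpp_ni = [42.0, 55.5, 38.2, 61.0, 47.3]
znpp_hplc = [40.5, 57.0, 36.8, 63.2, 46.0]
bias, lower, upper = limits_of_agreement(znpp_ni, znpp_hplc)
```

A new method is considered interchangeable with the reference when the limits of agreement fall within a clinically predefined acceptance criterion, as done in the study above.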
Estimating intraoperative blood loss is one of the daily challenges for clinicians. Despite the known inaccuracy of visual estimation by anaesthetists and surgeons, it remains the mainstay for estimating surgical blood loss. This review aims to highlight the strengths and weaknesses of currently used measurement methods. A systematic review of studies on the estimation of blood loss was carried out, including studies investigating the accuracy of techniques for quantifying blood loss in vivo and in vitro. We excluded nonhuman trials and studies using only monitoring parameters to estimate blood loss. A meta-analysis was performed to evaluate systematic measurement errors of the different methods; only studies compared against a validated reference, e.g. a haemoglobin extraction assay, were included. Ninety studies met the inclusion criteria for the systematic review and were analyzed. Six studies were included in the meta-analysis, as only these used a validated reference. The mixed-effect meta-analysis showed the highest correlation with the reference for colorimetric methods (0.93, 95% CI 0.91–0.96), followed by gravimetric (0.77, 95% CI 0.61–0.93) and finally visual methods (0.61, 95% CI 0.40–0.82). The bias for estimated blood loss (ml) was lowest for colorimetric methods (57.59, 95% CI 23.88–91.3) compared to the reference, followed by gravimetric (326.36, 95% CI 201.65–450.86) and visual methods (456.51, 95% CI 395.19–517.83). Of the many studies included, only a few were compared with a validated reference; the majority chose known imprecise procedures as the method of comparison. Colorimetric methods offer the highest degree of accuracy in blood loss estimation, and systems using colorimetric techniques have a significant advantage in the real-time assessment of blood loss.
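The review above pooled per-study correlations with a mixed-effect model. As a much simplified fixed-effect sketch, study correlations can be pooled on the Fisher z-scale with inverse-variance weights; the study values below are invented for illustration, not taken from the review:

```python
import math

def pooled_correlation(rs, ns):
    """Fixed-effect pooling of study correlations via the Fisher z-transform,
    weighting each study by n - 3 (the inverse variance of z)."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_bar)  # back-transform to the r scale

# Hypothetical per-study correlations and sample sizes (illustrative only).
r_pooled = pooled_correlation([0.93, 0.95, 0.91], [40, 60, 35])
```

A mixed-effect model, as used in the review, additionally estimates between-study variance and widens the weights accordingly; the z-transform step is the same.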
Background: The most common technique used worldwide to quantify blood loss during an operation is visual assessment by the attending intervention team. Every operating room is equipped with scaled suction canisters that collect fluids from the surgical field, and this scaling is commonly used by clinicians for the visual assessment of intraoperative blood loss. While many studies have been conducted to quantify and reduce the inaccuracy of visual estimation, research has focused on the estimation of blood volume in surgical drapes. The question of whether and how the scaling of canisters correlates with actual blood loss, and how accurately clinicians estimate blood loss in scaled canisters, has not been the focus of research to date.
Methods: A simulation study with four “bleeding” scenarios was conducted using expired whole blood donations. After diluting the blood donations with full electrolyte solution, the sample blood loss volume (SBL) was transferred into suction canisters. The study participants then had to estimate the blood loss in all four scenarios. The difference to the reference blood loss (RBL) per scenario was analyzed.
Results: Fifty-three anesthetists participated in the study. The median estimated blood loss was 500 ml (IQR 300/1150), compared to a median RBL of 281.5 ml (IQR 210.0/1022.0). Overestimations of up to 1233 ml and underestimations of up to 138 ml were observed. The visual estimate for canisters correlated with RBL (Spearman's rho: 0.818; p < 0.001). Univariate nonparametric confirmation statistics showed that the deviation of the visual estimate from the actual blood loss was significant (z = −10.95, p < 0.001, n = 220). Participants' experience level had no significant influence on the visually estimated blood loss (VEBL) (p = 0.402).
Conclusion: The discrepancies between the visual estimate of canisters and the actual blood loss are enormous despite the given scales. Therefore, we do not recommend estimating the blood loss visually in scaled suction canisters. Colorimetric blood loss estimation could be a more accurate option.
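Spearman's rho, the statistic used above to relate visual estimates to reference blood loss, is simply the Pearson correlation of average ranks. A self-contained sketch with invented volumes (not the study's measurements):

```python
def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of average ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # group tied values and assign them their average rank (1-based)
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical estimated vs. reference blood-loss volumes (ml), illustrative only.
rho = spearman_rho([150, 300, 500, 900, 1200], [200, 280, 610, 850, 1400])
```

Because it works on ranks, rho captures whether estimates rise with actual blood loss even when, as in the study above, their absolute values deviate substantially.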
Loss of vascular barrier function causes leakage of fluid and proteins into tissues; extensive leakage leads to shock and death. Barriers are largely formed by endothelial cell-cell contacts built up by VE-cadherin and are under the control of RhoGTPases. Here we show that a natural plasmin digest product of fibrin, the peptide Bβ15-42 (also called FX06), significantly reduces vascular leakage and mortality in animal models of Dengue shock syndrome. The ability of Bβ15-42 to preserve endothelial barriers is confirmed in rats injected i.v. with LPS. In endothelial cells, Bβ15-42 prevents thrombin-induced stress fiber formation, myosin light chain phosphorylation and RhoA activation. The molecular key to the protective effect of Bβ15-42 is the src kinase Fyn, which associates with VE-cadherin-containing junctions. Following exposure to Bβ15-42, Fyn dissociates from VE-cadherin and associates with p190RhoGAP, a known antagonist of RhoA activation. The role of Fyn in transducing the effects of Bβ15-42 is confirmed in Fyn-/- mice, in which the peptide is unable to reduce LPS-induced lung edema, whereas in wild-type littermates it significantly reduces leakage. Our results demonstrate a novel function for Bβ15-42: formerly considered mainly a degradation product occurring after fibrin inactivation, it must now be considered a signaling molecule. It stabilizes endothelial barriers and could thus be an attractive adjuvant in the treatment of shock.
The transcription factor NF-E2 p45-related factor 2 (Nrf2) is an established master regulator of the anti-oxidative and detoxifying cellular response. Thus, a role in inflammatory diseases associated with the generation of large amounts of reactive oxygen species (ROS) seems obvious. In line with this, data obtained in cell culture experiments and preclinical settings have shown that Nrf2 is important in regulating target genes that are necessary to ensure cellular redox balance. Additionally, Nrf2 is involved in the induction of phase II drug metabolizing enzymes, which are important both in degrading and converting drugs into active forms, and into putative carcinogens. Therefore, Nrf2 has also been implicated in tumorigenesis. This must be kept in mind when new therapy approaches are planned for the treatment of sepsis. Therefore, this review highlights the function of Nrf2 in sepsis with a special focus on the translation of rodent-based results into sepsis patients in the intensive care unit (ICU).
Background: Nicolaides-Baraitser syndrome (NCBRS) is a rare disease caused by mutations in the SMARCA2 gene, which affects chromatin remodelling and leads to a wide range of symptoms including microcephaly, distinct facial features, recurrent seizures, and severe mental retardation. Until now, fewer than 100 cases have been reported. Case presentation: A 22-month-old male infant with NCBRS underwent elective cleft palate surgery. The anaesthetists were challenged by the physiological condition of the patient: narrow face, very small mouth, mild tachypnea, slight sternal retractions, physical signs of partial monosomy 9p, plagiocephalus, midface hypoplasia, V-shaped cleft palate, pronounced muscular hypotonia, bilateral dysplastic kidneys (estimated GFR: approx. 40 ml/m2), nocturnal oxygen demand, and combined apnea. In addition, little information was available about the interaction of the NCBRS displayed by the patient with anaesthetic medications. Conclusions: The cleft palate was successfully closed using the bridge flap technique. Overall, we recommend performing a trial video-assisted laryngoscopy under deep inhalational anaesthesia with spontaneous breathing, before administering muscle relaxants, to detect any airway difficulties while spontaneous breathing and protective reflexes are preserved.
Background: The pro-inflammatory status of the elderly triggers most age-related diseases such as cancer and atherosclerosis. Atherosclerosis, the leading cause of morbidity and death worldwide, is an inflammatory disease influenced by lifestyle and genetic host factors. Stimuli such as oxLDL or microbial ligands have been proposed to trigger the inflammation leading to atherosclerosis. It has recently been shown that oxLDL activates immune cells via the Toll-like receptor (TLR) 4/6 complex. Several common single nucleotide polymorphisms (SNPs) of the TLR system have been associated with atherosclerosis. To investigate the role of TLR-6, we analyzed the association of the TLR-6 SNP Pro249Ser with atherogenesis.
Results: Genotyping of two independent groups with CAD, as well as of healthy controls revealed a significant association of the homozygous genotype with a reduced risk for atherosclerosis (odds ratio: 0.69, 95% CI 0.51-0.95, P = 0.02). In addition, we found a trend towards an association with the risk of restenosis after transluminal coronary angioplasty (odds ratio: 0.53, 95% CI 0.24-1.16, P = 0.12). In addition, first evidence is presented that the frequency of this protective genotype increases in a healthy population with age. Taken together, our results define a role for TLR-6 and its genetic variations in modulating the inflammatory response leading to atherosclerosis.
Conclusions: These results may lead to a better risk stratification, and potentially to an improved prophylactic treatment of high-risk populations. Furthermore, the protective effect of this polymorphism may lead to an increase of this genotype in the healthy elderly and may therefore be a novel genetic marker for the well-being during aging.
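The odds ratios and 95% confidence intervals reported in this genotyping study are standard 2x2-table statistics. A sketch of the computation (Woolf's log-scale interval) with hypothetical genotype counts, not the study's raw data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf (log-scale) 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical counts of the homozygous genotype in CAD cases vs. healthy
# controls (illustrative only).
or_, lower, upper = odds_ratio_ci(30, 70, 55, 88)
```

An odds ratio below 1 with a confidence interval excluding 1, as reported above for the homozygous genotype, indicates a statistically significant protective association.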
The administration of intravenous fluid to critically ill patients is one of the most common but also one of the most fiercely debated interventions in intensive care medicine. During the past decade, a number of important studies have been published which provide clinicians with improved knowledge regarding the timing, the type and the amount of fluid they should give to their critically ill patients. However, despite the fact that many thousands of patients have been enrolled in these trials of alternative fluid strategies, consensus remains elusive and practice is widely variable. Early adequate resuscitation of patients in shock followed by a restrictive strategy may be associated with better outcomes. Colloids such as modern hydroxyethyl starch are more effective than crystalloids in early resuscitation of patients in shock, and are safe when administered during surgery. However, these colloids may not be beneficial later in the course of intensive care treatment and should best be avoided in intensive care patients who have a high risk of developing acute kidney injury. Albumin has no clear benefit over saline and is associated with increased mortality in neurotrauma patients. Balanced fluids reduce the risk of hyperchloraemic acidosis and possibly kidney injury. The use of hypertonic fluids in patients with sepsis and acute lung injury warrants further investigation and should be considered experimental at this stage. Fluid therapy impacts relevant patient-related outcomes. Clinicians should adopt an individualized strategy based on the clinical scenario and best available evidence. One size does not fit all.
Endogenous nitro-fatty acids (NFA) are potent electrophilic lipid mediators that exert biological effects in vitro and in vivo via selective covalent modification of thiol-containing target proteins. The cytoprotective, anti-inflammatory, and anti-tumorigenic effects of NFA in animal models of disease caused by targeted protein nitroalkylation are a valuable basis for the development of future anti-phlogistic and anti-neoplastic drugs. Considering the complexity of diseases and accompanying comorbidities there is an urgent need for clinically effective multifunctional drugs. NFA are composed of a fatty acid backbone containing a nitroalkene moiety triggering Michael addition reactions. However, less is known about the target-specific structure–activity relationships and selectivities comparing different NFA targets. Therefore, we analyzed 15 NFA derivatives and compared them with the lead structure 9-nitro-oleic acid (9NOA) in terms of their effect on NF-κB (nuclear factor kappa B) signaling inhibition, induction of Nrf-2 (nuclear factor erythroid 2-related factor 2) gene expression, sEH (soluble epoxide hydrolase), LO (lipoxygenase), and COX-2 (cyclooxygenase-2) inhibition, and their cytotoxic effects on colorectal cancer cells. Minor modifications of the Michael acceptor position and variation of the chain length led to drugs showing increased target preference or enhanced multi-targeting, partly with higher potency than 9NOA. This study is a significant step forward to better understanding the biology of NFA and their enormous potential as scaffolds for designing future anti-inflammatory drugs.
Background: Nerve injury-induced protein 1 (Ninjurin 1, Ninj1) was first identified in Schwann cells and neurons, where it contributes to cell adhesion and nerve regeneration. Recently, Ninj1 has been linked to inflammatory processes in the central nervous system, where its functional repression reduced leukocyte infiltration and clinical disease activity during experimental autoimmune encephalomyelitis in mice [1]. But Ninj1 is also expressed outside the nervous system, in various organs such as the liver and kidney as well as on leukocytes [2,3]. We therefore hypothesized that Ninj1 contributes to inflammation in general, that is, also outside the nervous system, with special interest in the pathogenesis of sepsis.
Methods: Ninj1 was repressed by transfecting HMEC-1 cells, a human dermal microvascular endothelial cell line, with siRNA targeting Ninj1 (siNinj1) or a negative control (siC). Subsequently, cells were stimulated with 100 ng/ml LPS (TLR4 agonist), 3 μg/ml LTA (TLR2 agonist) or 100 n/ml poly(I:C) (TLR3 agonist) for 3 hours. The inflammatory response was analyzed by real-time PCR. In addition, transmigration of neutrophils across an HMEC-1 monolayer was measured using transwell plates (pore size 3 μm).
Results: Repression of Ninj1 by siRNA reduced Ninj1 mRNA expression in HMEC-1 cells by about 90% (Figure 1A). Reduced Ninj1 expression decreased neutrophil migration to 62.5% (Figure 1B) and attenuated TLR signaling. In detail, knockdown of Ninj1 significantly reduced TLR-2- and TLR-4-triggered expression of ICAM-1 and IL-6 (Figure 1C,D), while poly(I:C)-induced expression was only slightly reduced. To analyze a more specific TLR-3 target, we measured IP-10 mRNA expression, which was also significantly reduced in siNinj1-transfected cells (Figure 1E).
Conclusion: Our in vitro data strongly indicated that Ninj1 is involved in regulation of TLR signaling and therewith contributes to inflammation. In vivo experiments will clarify its impact on systemic inflammation.
Background and objectives: Preoperative anaemia is an independent risk factor for a higher morbidity and mortality, a longer hospitalization and increased perioperative transfusion rates. Managing preoperative anaemia is the first of three pillars of Patient Blood Management (PBM), a multidisciplinary concept to improve patient safety. While various studies provide medical information on (successful) anaemia treatment pathways, knowledge of organizational details of diagnosis and management of preoperative anaemia across Europe is scarce.
Materials and methods: To gain information on various aspects of preoperative anaemia management including organization, financing, diagnostics and treatment, we conducted a survey (74 questions) in ten hospitals from seven European nations within the PaBloE (Patient Blood Management in Europe) working group covering the year 2016.
Results: Organization and activity in the field of preoperative anaemia management were heterogeneous in the participating hospitals. Almost all hospitals had pathways for managing preoperative anaemia in place, however, only two nations had national guidelines. In six of the ten participating hospitals, preoperative anaemia management was organized by anaesthetists. Diagnostics and treatment focused on iron deficiency anaemia which, in most hospitals, was corrected with intravenous iron.
Conclusion: Implementation and approaches of preoperative anaemia management vary across Europe with a primary focus on treating iron deficiency anaemia. Findings of this survey motivated the hospitals involved to critically evaluate their practice and may also help other hospitals interested in PBM to develop action plans for diagnosis and management of preoperative anaemia.
5-Lipoxygenase (5-LO) is the key enzyme in the formation of pro-inflammatory leukotrienes (LT) which play an important role in a number of inflammatory diseases. Accordingly, 5-LO inhibitors are frequently used to study the role of 5-LO and LT in models of inflammation and cancer. Interestingly, the therapeutic efficacy of these inhibitors is highly variable. Here we show that the frequently used 5-LO inhibitors AA-861, BWA4C, C06, CJ-13,610 and the FDA approved compound zileuton as well as the pan-LO inhibitor nordihydroguaiaretic acid interfere with prostaglandin E2 (PGE2) release into the supernatants of cytokine-stimulated (TNFα/IL-1β) HeLa cervix carcinoma, A549 lung cancer as well as HCA-7 colon carcinoma cells with similar potencies compared to their LT inhibitory activities (IC50 values ranging from 0.1–9.1 µM). In addition, AA-861, BWA4C, CJ-13,610 and zileuton concentration-dependently inhibited bacterial lipopolysaccharide triggered prostaglandin (PG) release into human whole blood. Western Blot analysis revealed that inhibition of expression of enzymes involved in PG synthesis was not part of the underlying mechanism. Also, liberation of arachidonic acid which is the substrate for PG synthesis as well as PGH2 and PGE2 formation were not impaired by the compounds. However, accumulation of intracellular PGE2 was found in the inhibitor treated HeLa cells suggesting inhibition of PG export as major mechanism. Further, experiments showed that the PG exporter ATP-binding cassette transporter multidrug resistance protein 4 (MRP-4) is targeted by the inhibitors and may be involved in the 5-LO inhibitor-mediated PGE2 inhibition. In conclusion, the pharmacological effects of a number of 5-LO inhibitors are compound-specific and involve the potent inhibition of PGE2 export. 
Results from experimental models on the role of 5-LO in inflammation and pain using 5-LO inhibitors may be misleading and their use as pharmacological tools in experimental models has to be revisited. In addition, 5-LO inhibitors may serve as new scaffolds for the development of potent prostaglandin export inhibitors.
Background: Peritonitis is responsible for thousands of deaths annually in Germany alone. Even source control (SC) and antibiotic treatment often fail to prevent severe sepsis or septic shock, and this situation has hardly improved in the past two decades. Most experimental immunomodulatory therapeutics for sepsis have been aimed at blocking or dampening a specific pro-inflammatory immunological mediator. However, the patient collective is large and heterogeneous. There are therefore grounds for investigating the possibility of developing personalized therapies by classifying patients into groups according to biomarkers. This study aims to combine an assessment of the efficacy of treatment with a preparation of human immunoglobulins G, A, and M (IgGAM) with individual status of various biomarkers (immunoglobulin level, procalcitonin, interleukin 6, antigen D-related human leucocyte antigen (HLA-DR), transcription factor NF-κB1, adrenomedullin, and pathogen spectrum).
Methods/design: A total of 200 patients with sepsis or septic shock will receive standard-of-care treatment (SoC). Of these, 133 patients (selected by 1:2 randomization) will in addition receive infusions of IgGAM for 5 days. All patients will be followed for approximately 90 days and assessed by the multiple-organ failure (MOF) score, by the EQ QLQ 5D quality-of-life scale, and by measurement of vital signs, biomarkers (as above), and survival.
Discussion: This study is intended to provide further information on the efficacy and safety of treatment with IgGAM and to offer the possibility of correlating these with the biomarkers under study. Specifically, it will test (at a descriptive level) the hypothesis that patients receiving IgGAM who have a higher inflammation status (IL-6) and poorer immune status (low HLA-DR, low immunoglobulin levels) have a better outcome than patients who do not receive IgGAM. It is expected to provide information that will help close the knowledge gap concerning the association between the effect of IgGAM and the presence of various biomarkers, thus possibly opening the way to personalized medicine.
Trial registration: EudraCT, 2016–001788-34; ClinicalTrials.gov, NCT03334006. Registered on 17 Nov 2017.
Trial sponsor: RWTH Aachen University, represented by the Center for Translational & Clinical Research Aachen (contact Dr. S. Isfort).
Introduction: Acute kidney injury (AKI) can evolve quickly and clinical measures of function often fail to detect AKI at a time when interventions are likely to provide benefit. Identifying early markers of kidney damage has been difficult due to the complex nature of human AKI, in which multiple etiologies exist. The objective of this study was to identify and validate novel biomarkers of AKI.
Methods: We performed two multicenter observational studies in critically ill patients at risk for AKI - discovery and validation. The top two markers from discovery were validated in a second study (Sapphire) and compared to a number of previously described biomarkers. In the discovery phase, we enrolled 522 adults in three distinct cohorts including patients with sepsis, shock, major surgery, and trauma and examined over 300 markers. In the Sapphire validation study, we enrolled 744 adult subjects with critical illness and without evidence of AKI at enrollment; the final analysis cohort was a heterogeneous sample of 728 critically ill patients. The primary endpoint was moderate to severe AKI (KDIGO stage 2 to 3) within 12 hours of sample collection.
Results: Moderate to severe AKI occurred in 14% of Sapphire subjects. The two top biomarkers from discovery were validated. Urine insulin-like growth factor-binding protein 7 (IGFBP7) and tissue inhibitor of metalloproteinases-2 (TIMP-2), both inducers of G1 cell cycle arrest, a key mechanism implicated in AKI, together demonstrated an AUC of 0.80 (0.76 and 0.79 alone). Urine [TIMP-2]·[IGFBP7] was significantly superior to all previously described markers of AKI (P < 0.002), none of which achieved an AUC > 0.72. Furthermore, [TIMP-2]·[IGFBP7] significantly improved risk stratification when added to a nine-variable clinical model, whether analyzed using a Cox proportional hazards model, generalized estimating equations, integrated discrimination improvement or net reclassification improvement. Finally, in sensitivity analyses [TIMP-2]·[IGFBP7] remained significant and superior to all other markers regardless of changes in the reference creatinine method.
Conclusions: Two novel markers for AKI have been identified and validated in independent multicenter cohorts. Both markers are superior to existing markers, provide additional information over clinical variables and add mechanistic insight into AKI. Trial registration: ClinicalTrials.gov number NCT01209169.
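The AUC values reported for the TIMP-2 and IGFBP7 combination can be read via the Mann-Whitney interpretation of AUC: the probability that a randomly chosen case scores higher than a randomly chosen control. A sketch with invented biomarker values, not the Sapphire data:

```python
def auc_from_scores(cases, controls):
    """AUC as the probability that a randomly chosen case scores higher than a
    randomly chosen control (ties count one half): the Mann-Whitney U statistic
    divided by n_cases * n_controls."""
    wins = 0.0
    for s_case in cases:
        for s_ctrl in controls:
            if s_case > s_ctrl:
                wins += 1.0
            elif s_case == s_ctrl:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Hypothetical urine biomarker values for AKI vs. non-AKI patients
# (illustrative only, not study data).
auc = auc_from_scores([2.1, 1.4, 0.9, 3.0], [0.2, 0.5, 0.9, 0.1, 0.4])
```

An AUC of 0.5 corresponds to chance discrimination and 1.0 to perfect separation, which is why the 0.80 reported above for the combined marker outperforms comparators capped at 0.72.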
Background: Patient Blood Management (PBM) is a systematic quality improving clinical model to reduce anemia and avoid transfusions in all kinds of clinical settings. Here, we investigated the potential of PBM in oncologic surgery and hypothesized that PBM improves 2-year overall survival (OS).
Methods: Retrospective analysis of patients 2 years before and after PBM implementation. The primary endpoint was OS at 2 years after surgery. A sample size of 824 was calculated to detect a 10% improvement in survival in the PBM group.
Results: The analysis comprised 836 patients who underwent oncologic surgery, 389 before and 447 after PBM was implemented. Patients in the PBM+ group presented with normal hemoglobin values before surgery significantly more frequently than those in the PBM− group (56.6 vs. 35.7%; p < 0.001). The number of transfusions was significantly reduced from 5.5 ± 11.1 to 3.0 ± 6.9 units/patient (p < 0.001); moreover, the percentage of patients transfused during the hospital stay was significantly reduced from 62.4 to 40.9% (p < 0.001). Two-year OS was significantly better in the PBM+ group, increasing from 67.0 to 80.1% (p = 0.001). A normal hemoglobin value (> 12 g/dl in females and > 13 g/dl in males) before surgery (HR 0.43, 95% CI 0.29–0.65, p < 0.001) was the only independent predictive factor positively affecting survival.
Conclusions: PBM is a quality improvement tool that is associated with better mid-term surgical oncologic outcome. The root cause for improvement is the increase of patients entering surgery with normal hemoglobin values.
Background: SARS-CoV-2 has massively changed the care situation in hospitals worldwide. Although tumour care should not be affected, initial reports from European countries suggested a decrease in skin cancer cases during the first pandemic wave, and only limited data are available thereafter.
Objectives: The aim of this study was to investigate skin cancer cases and surgeries in a nationwide inpatient dataset in Germany.
Methods: Comparative analyses were performed in a prepandemic (18 March 2019 until 17 March 2020) and a pandemic cohort (18 March 2020 until 17 March 2021). Cases were identified and analysed using the WHO international classification of diseases codes (ICDs) and process key codes (OPSs).
Results: Comparing the first year of the pandemic with the same period 1 year before, a persistent decrease of 14% in skin cancer cases (n = 19 063) was observed. The largest decrease of 24% was seen in non-invasive in situ tumours (n = 1665), followed by non-melanoma skin cancer (NMSC) with a decrease of 16% (n = 15 310) and malignant melanoma (MM) with a reduction of 7% (n = 2088). Subgroup analysis showed significant differences in the distribution of sex, age, hospital carrier type and hospital volume. There was a decrease of 17% in surgical procedures (n = 22 548), which was more pronounced in minor surgical procedures with a decrease of 24.6% compared to extended skin surgery including micrographic surgery with a decrease of 15.9%.
Conclusions: Hospital admissions and surgical procedures for skin cancer patients have decreased persistently since the beginning of the pandemic in Germany. The larger decrease in NMSC cases compared to MM might reflect a prioritization effect. Further evidence from tumour registries is needed to investigate the consequences of therapy delay and to identify the upcoming challenges in skin cancer care.
The ongoing SARS-CoV-2 pandemic is characterized by poor outcomes and high mortality, especially in the older patient cohort. To date, there has been a lack of data characterising COVID-19 patients in Germany admitted to the intensive care unit (ICU) vs. non-ICU patients. German reimbursement inpatient data covering the period from January 1st, 2020 to December 31st, 2021 were analyzed. In total, 561,379 patients were hospitalized with COVID-19; 24.54% (n = 137,750) were admitted to the ICU. Overall hospital mortality was 16.69% (n = 93,668), and 33.36% (n = 45,947) in the ICU group. 28.66% (n = 160,881) of all patients suffered from cardiac arrhythmia and 17.98% (n = 100,926) developed renal failure. Obesity showed odds ratios ranging from 0.83 (0.79–0.87) for WHO grade I to 1.13 (1.08–1.19) for grade III. Mortality rates peaked in April 2020 and January 2021 at 21.23% (n = 4539) and 22.99% (n = 15,724), respectively. A third peak was observed in November and December 2021 (16.82%, n = 7173 and 16.54%, n = 9416). Hospitalized COVID-19 patient mortality in Germany is lower than previously shown in other studies. 24.54% of all patients had to be treated in the ICU, with a mortality rate of 33.36%. Congestive heart failure was associated with a higher risk of death, whereas low-grade obesity might have a protective effect on patient survival. High admission numbers are accompanied by a higher mortality rate.
Background: GLUT1 deficiency syndrome (G1DS) is an autosomal dominant genetic disorder caused by a mutation of the SLC2A1 gene. This mutation can lead to an encephalopathy due to abnormal glucose transport into the brain. G1DS is a rare disease, with an estimated incidence of 1:90,000.
Case report: We report a case of a 10-year-old female who presented with recurrent fever, headaches, and vertigo for more than 3 days within 2 weeks following pneumonia. A bilateral mastoiditis was proven by a cerebral magnetic resonance imaging and a cranial computed tomography scan. The patient had to undergo mastoidectomy and thus, her first general anesthesia. Half a year previously she was diagnosed with G1DS. According to the standard of care, a ketogenic diet had been administered since the patient’s diagnosis 6 months earlier. Our patient received a total intravenous anesthesia (TIVA) using propofol, fentanyl, and rocuronium administered without any incidents.
Conclusions: We recommend maintaining normoglycemia during the perioperative phase and avoiding glucose-based medication to preserve the patient's ketotic state. Our case highlights that TIVA, with the medication outlined in this case, was safe when the patient's ketotic state and periprocedural blood glucose were monitored continuously. Nevertheless, we would suggest using remifentanil instead of fentanyl for future TIVAs, owing to the smaller increase in blood glucose level observed in our patient.
Acute respiratory distress syndrome (ARDS) is a major cause of patient mortality in intensive care units (ICUs) worldwide. Considering that no causative treatment but only symptomatic care is available, it is obvious that there is a high unmet medical need for a new therapeutic concept. One reason for a missing etiologic therapy strategy is the multifactorial origin of ARDS, which leads to a large heterogeneity of patients. This review summarizes the various kinds of ARDS onset with a special focus on the role of reactive oxygen species (ROS), which are generally linked to ARDS development and progression. Taking a closer look at the data which already have been established in mouse models, this review finally proposes the translation of these results on successful antioxidant use in a personalized approach to the ICU patient as a potential adjuvant to standard ARDS treatment.
Background: Bacterial DNA containing motifs of unmethylated CpG dinucleotides (CpG-ODN) initiates an innate immune response mediated by the pattern recognition receptor Toll-like receptor 9 (TLR9). This leads in particular to the expression of proinflammatory mediators such as tumor necrosis factor (TNF-alpha) and interleukin-1beta (IL-1beta). TLR9 is expressed in human and murine pulmonary tissue, and induction of proinflammatory mediators has been linked to the development of acute lung injury. We therefore tested the hypothesis that CpG-ODN administration induces an inflammatory response in the lung via TLR9 in vivo. Methods: Wild-type (WT) and TLR9-deficient (TLR9-D) mice received CpG-ODN intraperitoneally (1668-Thioat, 1 nmol/g BW) and were observed for up to 6 hrs. Lung tissue and plasma samples were taken and various inflammatory markers were measured. Results: In WT mice, CpG-ODN induced strong activation of pulmonary NF-kappaB as well as a significant increase in pulmonary TNF-alpha and IL-1beta mRNA/protein. In addition, cytokine serum levels were significantly elevated in WT mice. Increased pulmonary myeloperoxidase (MPO) content was documented in WT mice following application of CpG-ODN. Bronchoalveolar lavage (BAL) revealed that CpG-ODN stimulation significantly increased the total cell number as well as the neutrophil count in WT animals. In contrast, the CpG-ODN-induced inflammatory response was abolished in TLR9-D mice. Conclusion: This study suggests that bacterial CpG-ODN causes lung inflammation via TLR9.
Introduction: Systemic inflammation (e.g. following surgery) involves Toll-like receptor (TLR) signaling and leads to an endocrine stress response. This study aims to investigate a possible influence of TLR2 and TLR4 single nucleotide polymorphisms (SNPs) on perioperative adrenocorticotropic hormone (ACTH) and cortisol regulation in serum of cardiac surgical patients. To investigate the link to systemic inflammation in this context, we additionally measured 10 different cytokines in the serum. Methods: 338 patients admitted for elective cardiac surgery were included in this prospective observational clinical cohort study. Genomic DNA of patients was screened for TLR2 and TLR4 SNPs. Serum concentrations of ACTH, cortisol, interferon (IFN)-gamma, interleukin (IL)-1beta, IL-2, IL-4, IL-5, IL-6, IL-8, IL-10, tumor necrosis factor (TNF)-alpha and granulocyte macrophage-colony stimulating factor (GM-CSF) were determined before surgery, immediately post surgery and on the first postoperative day. Results: 13 patients were identified as TLR2 SNP carriers, 51 as TLR4 SNP carriers and 274 patients as non-carriers. Basal levels of ACTH, cortisol and cytokines did not differ between groups. In all three groups a significant, transient perioperative rise of cortisol could be observed. However, only in the non-carrier group was this accompanied by a significant ACTH rise; TLR4 SNP carriers had significantly lower ACTH levels compared to non-carriers ((mean [95% confidence intervals]) non-carriers: 201.9 [187.7 to 216.1] pg/ml; TLR4 SNP carriers: 149.9 [118.4 to 181.5] pg/ml; TLR2 SNP carriers: 176.4 [110.5 to 242.3] pg/ml). 
Compared to non-carriers, TLR4 SNP carriers showed significantly lower serum IL-8, IL-10 and GM-CSF peaks ((mean [95% confidence intervals]): IL-8: non-carriers: 42.6 [36.7 to 48.5] pg/ml, TLR4 SNP carriers: 23.7 [10.7 to 36.8] pg/ml; IL-10: non-carriers: 83.8 [70.3 to 97.4] pg/ml, TLR4 SNP carriers: 54.2 [24.1 to 84.2] pg/ml; GM-CSF: non-carriers: 33.0 [27.8 to 38.3] pg/ml, TLR4 SNP carriers: 20.2 [8.6 to 31.8] pg/ml). No significant changes over time or between the groups were found for the other cytokines. Conclusions: Regulation of the immunoendocrine stress response during systemic inflammation is influenced by the presence of a TLR4 SNP. Cardiac surgical patients carrying this genotype showed decreased serum concentrations of ACTH, IL-8, IL-10 and GM-CSF. This finding might have an impact on interpreting previous, and designing future, trials on diagnosing and modulating immunoendocrine dysregulation (e.g. adrenal insufficiency) during systemic inflammation and sepsis.
Introduction: Hip fracture surgery is associated with high in-hospital and 30-day mortality rates and serious adverse patient outcomes. Evidence from randomised controlled trials regarding effectiveness of spinal versus general anaesthesia on patient-centred outcomes after hip fracture surgery is sparse.
Methods and analysis: The iHOPE study is a pragmatic national, multicentre, randomised controlled, open-label clinical trial with a two-arm parallel group design. In total, 1032 patients with hip fracture (>65 years) will be randomised in an intended 1:1 allocation ratio to receive spinal anaesthesia (n=516) or general anaesthesia (n=516). Outcome assessment will occur in a blinded manner in hospital and after hospital discharge. The primary endpoint will be assessed by telephone interview and comprises the time to the first occurring event of the binary composite outcome of all-cause mortality or new-onset serious cardiac and pulmonary complications within 30 postoperative days. In-hospital secondary endpoints, assessed via in-person interviews and medical record review, include mortality, perioperative adverse events, delirium, satisfaction, walking independently, length of hospital stay and discharge destination. Telephone interviews will be performed for long-term endpoints (all-cause mortality, independence in walking, chronic pain, ability to return home, cognitive function, and overall health and disability) at postoperative days 30±3, 180±45 and 365±60.
Ethics and dissemination: iHOPE has been approved by the leading Ethics Committee of the Medical Faculty of the RWTH Aachen University on 14 March 2018 (EK 022/18). Approval from all other involved local Ethics Committees was subsequently requested and obtained. The study started in April 2018 with a total recruitment period of 24 months. iHOPE will be disseminated via presentations at national and international scientific meetings and conferences and via publication in peer-reviewed international scientific journals.
Trial registration number: DRKS00013644; Pre-results
Introduction: It has been proposed that individual genetic variation contributes to the course of severe infections and sepsis. Recent studies of single nucleotide polymorphisms (SNPs) within the endotoxin receptor and its signaling system showed an association with the risk of disease development. This study examines the effect of genetic variations of TLR4, the receptor for bacterial LPS, and of a central intracellular signal transducer (TIRAP/Mal) on cytokine release and on susceptibility to, and course of, severe hospital-acquired infections in distinct patient populations. Methods: Three intensive care units in tertiary care university hospitals in Greece and Germany participated. 375 and 415 postoperative patients and 159 patients with ventilator-associated pneumonia (VAP) were included. TLR4 and TIRAP/Mal polymorphisms in 375 general surgical patients were associated with risk of infection, clinical course and outcome. In two prospective studies, 415 patients following cardiac surgery and 159 patients with newly diagnosed VAP predominantly caused by Gram-negative bacteria were studied for cytokine levels in vivo and after ex vivo monocyte stimulation, and for clinical course. Results: Patients simultaneously carrying polymorphisms in TIRAP/Mal and TLR4 and patients homozygous for the TIRAP/Mal SNP had a significantly higher risk of severe infections after surgery (odds ratio (OR) 5.5; confidence interval (CI): 1.34 - 22.64; P = 0.02 and OR 7.3; CI: 1.89 - 28.50; P < 0.01, respectively). Additionally, we found significantly lower circulating cytokine levels in double-mutant individuals with ventilator-associated pneumonia and reduced cytokine production in an ex vivo monocyte stimulation assay, but this difference was not apparent in TIRAP/Mal-homozygous patients. In cardiac surgery patients without infection, the cytokine release profiles did not differ between genotypes. 
Conclusions: Carriers of mutations in sequential components of the TLR signaling system may have an increased risk for severe infections. Patients with this genotype showed a decrease in cytokine release when infected, which was not apparent in patients with sterile inflammation following cardiac surgery.
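Odds ratios with 95% confidence intervals, as reported in the abstracts above, are typically derived from 2×2 contingency tables. A minimal sketch of the standard Woolf (log) method follows; the counts are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 8 of 40 double-SNP carriers infected vs. 20 of 335 others
print(odds_ratio_ci(8, 32, 20, 315))
```

A wide interval such as 1.34 to 22.64 around an OR of 5.5 is typical when, as here, the event counts in the exposed group are small.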
Background: Clonidine effectively decreases perioperative mortality by reducing sympathetic tone. However, clonidine might also restrict anaemia tolerance by impairing compensatory mechanisms. We therefore assessed the influence of clonidine-induced, short-term sympathicolysis on anaemia tolerance in anaesthetized pigs, and whether macrohaemodynamic alterations constrain the compensatory mechanisms of acute anaemia.
Methods: After governmental approval, 14 anaesthetized pigs of either sex (Deutsche Landrasse, weight (mean ± SD) 24.1 ± 2.4 kg) were randomly assigned to intravenous saline or clonidine treatment (bolus: 20 μg · kg−1, continuous infusion: 15 μg · kg−1 · h−1). Thereafter, the animals were haemodiluted by exchange of whole blood for 6% hydroxyethyl starch (MW 130,000/0.4) until the individual critical haemoglobin concentration (Hbcrit) was reached. Primary outcome parameters were Hbcrit and the exchangeable blood volume (EBV) until Hbcrit was reached.
Results: Hbcrit did not differ between the groups (values are median [interquartile range]: saline: 2.2 (2.0–2.5) g · dL−1 vs. clonidine: 2.1 (2.1–2.4) g · dL−1; n.s.). Furthermore, there was no difference in EBV between the groups (saline: 88 (76–106) mL · kg−1 vs. clonidine: 92 (85–95) mL · kg−1; n.s.).
Conclusion: Anaemia tolerance was not affected by clonidine-induced sympathicolysis. Consequently, perioperative clonidine administration probably need not be withheld out of concern for acute anaemia.
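For context, under the standard exponential model of isovolaemic haemodilution (an assumption; the abstract does not state the model used), the blood volume that must be exchanged to lower haemoglobin from Hb0 to Hbcrit is EBV = BV · ln(Hb0/Hbcrit). A minimal sketch, assuming a nominal porcine blood volume of 70 mL/kg and illustrative haemoglobin values:

```python
import math

def exchangeable_blood_volume(hb0, hb_crit, blood_volume_ml_per_kg=70.0):
    """Blood volume (mL/kg) exchanged isovolaemically until haemoglobin
    falls from hb0 to hb_crit, under the exponential dilution model."""
    return blood_volume_ml_per_kg * math.log(hb0 / hb_crit)

# e.g. haemodilution from Hb 8.0 to 2.2 g/dL with an assumed 70 mL/kg blood volume
print(exchangeable_blood_volume(8.0, 2.2))
```

With these assumed inputs the model yields roughly 90 mL/kg, in the same range as the EBV values reported above.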
Background: The use of cell salvage and autologous blood transfusion has become an important method of blood conservation. So far, there are no clinical data about the performance of the continuous autotransfusion device CATSmart.
Methods: In total, 74 patients undergoing either cardiac or orthopedic surgery were included in this prospective, bicentric, observational technical evaluation to validate the red cell separation process and washout quality of the CATSmart. The target for the red cell separation process was defined as a hematocrit of 55–75% in the packed red cell unit, and the target for washout quality as a removal ratio of 80–100%.
Results: Hematocrit values measured by the CATSmart and by laboratory analysis were 78.5% [71.3%; 84.0%] and 73.7% [67.5%; 75.5%], respectively. Removal ratios for platelets (94.7% [88.2%; 96.7%]), free hemoglobin (89.3% [85.2%; 94.9%]), albumin (97.9% [96.6%; 98.5%]), heparin (99.9% [99.9%; 100.0%]) and potassium (92.5% [90.8%; 95.0%]) were within the target range, while removal of white blood cells was slightly below target at 72.4% [57.9%; 87.3%].
Conclusion: The new autotransfusion device enables sufficient red cell separation and washout quality.
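The removal ratio used as the washout-quality target above is the fraction of a solute's mass eliminated during processing, where mass is concentration times volume. A minimal sketch with hypothetical concentrations and volumes (not the study's measurements):

```python
def removal_ratio(conc_in, vol_in, conc_out, vol_out):
    """Percent of a solute removed during washing:
    100 * (1 - mass_out / mass_in), with mass = concentration x volume."""
    return 100.0 * (1.0 - (conc_out * vol_out) / (conc_in * vol_in))

# Hypothetical heparin washout: 2.0 IU/mL in 1000 mL collected blood,
# 0.01 IU/mL in the 250 mL packed red cell unit after processing
print(removal_ratio(2.0, 1000, 0.01, 250))
```

Because the output volume is much smaller than the input volume, the removal ratio can be high even when the output concentration is not negligible.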
Background: Intensive care resources are heavily utilized during the COVID-19 pandemic. However, risk stratification and prediction of SARS-CoV-2 patient clinical outcomes upon ICU admission remain inadequate. This study aimed to develop a machine learning model, based on retrospective and prospective clinical data, to stratify patient risk and predict ICU survival and outcomes. Methods: A Germany-wide electronic registry was established to pseudonymously collect admission, therapeutic and discharge information of SARS-CoV-2 ICU patients retrospectively and prospectively. Machine learning approaches were evaluated for the accuracy and interpretability of predictions. The Explainable Boosting Machine approach was selected as the most suitable method. Individual, non-linear shape functions for predictive parameters and parameter interactions are reported. Results: 1039 patients were included in the Explainable Boosting Machine model, 596 collected retrospectively and 443 prospectively. The model for prediction of general ICU outcome was more reliable in predicting survival. Age, inflammatory and thrombotic activity, and severity of ARDS at ICU admission were predictive of ICU survival. Patients’ age, pulmonary dysfunction and transfer from an external institution were predictors for ECMO therapy. The interaction of patient age with D-dimer levels on admission, and of creatinine levels with SOFA score without GCS, were predictors for renal replacement therapy. Conclusions: Using Explainable Boosting Machine analysis, we confirmed and weighed previously reported, and identified novel, predictors for outcome in critically ill COVID-19 patients. Using this strategy, predictive modeling of COVID-19 ICU patient outcomes can be performed, overcoming the limitations of linear regression models. Trial registration: ClinicalTrials.gov, NCT04455451.
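An Explainable Boosting Machine fits an additive model of per-feature "shape functions" whose values can be read off directly, which is what makes the predictors above interpretable. As a toy illustration of the idea only (not the registry model; data, bins and smoothing are hypothetical), a single-feature shape function can be approximated by binning the predictor and taking each bin's log-odds relative to the overall log-odds:

```python
import math

def shape_function(xs, ys, edges):
    """Per-bin log-odds contribution of one feature relative to the overall
    log-odds: a crude, single-feature analogue of an EBM shape function.
    xs: feature values, ys: 0/1 outcomes, edges: bin boundaries [e0,...,en]."""
    def log_odds(events, total):
        p = (events + 0.5) / (total + 1.0)  # add-half smoothing
        return math.log(p / (1 - p))
    base = log_odds(sum(ys), len(ys))
    shape = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = [y for x, y in zip(xs, ys) if lo <= x < hi]
        shape.append(log_odds(sum(in_bin), len(in_bin)) - base)
    return shape

# Hypothetical ages and ICU deaths: risk rises with age
ages = [45, 50, 55, 62, 66, 71, 74, 78, 81, 85]
died = [0,  0,  0,  0,  1,  0,  1,  1,  1,  1]
print(shape_function(ages, died, [40, 60, 75, 90]))
```

The resulting list is negative for the youngest bin, near zero in the middle and positive for the oldest bin, i.e. a monotonically rising risk contribution of age as described in the Results.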
Introduction: Balanced fluid replacement solutions may reduce the risk of electrolyte and acid-base imbalances, and thus of renal failure. To assess the intraoperative change of base excess (BE) and serum chloride after treatment with either a balanced or a non-balanced gelatine/electrolyte solution, a prospective, controlled, randomized, double-blind, dual-centre phase III study was conducted in two tertiary care university hospitals in Germany.
Material and methods: 40 patients of both sexes, aged 18 to 90 years, scheduled to undergo elective abdominal surgery with an assumed intraoperative volume requirement of at least 15 mL/kg body weight of gelatine solution were included. The study drug was administered intravenously according to the patients’ needs. The trigger for volume replacement was a central venous pressure (CVP) minus positive end-expiratory pressure (PEEP) of <10 mmHg. The crystalloid:colloid ratio was 1:1 intra- and postoperatively. The targets for volume replacement were a CVP minus PEEP between 10 and 14 mmHg after treatment with a vasoactive agent and a mean arterial pressure (MAP) > 65 mmHg.
Results: The primary endpoints, the intraoperative changes of base excess (–2.59 ± 2.25 (median: –2.65) mmol/L in the balanced group vs. –4.79 ± 2.38 (median: –4.70) mmol/L in the non-balanced group) and of serum chloride (2.4 ± 1.9 (median: 3.0) mmol/L vs. 5.2 ± 3.1 (median: 5.0) mmol/L), were significantly different between groups (p = 0.0117 and p = 0.0045, respectively). In both groups (each n = 20) the administration of the investigational product in terms of volume and infusion rate was comparable throughout the course of the study, i.e. before, during and after surgery.
Discussion: Balanced gelatine solution 4% combined with a balanced electrolyte solution had a significantly smaller impact on the blood gas parameters of the primary endpoints, BE and serum chloride, than a non-balanced gelatine solution 4% combined with NaCl 0.9%. No marked treatment differences were observed with respect to haemodynamics, coagulation and renal function.
Trial registration: ClinicalTrials.gov (NCT01515397) and clinicaltrialsregister.eu, EudraCT number 2010-018524-58.
Genetic or pharmacological ablation of Toll-like receptor 2 (TLR2) protects against myocardial ischemia/reperfusion injury (MI/R). However, the endogenous ligand responsible for TLR2 activation has not yet been identified. The objective of this study was to identify HMGB1 as an activator of TLR2 signalling during MI/R. C57BL/6 wild-type (WT) or TLR2(-/-) mice were injected with vehicle, HMGB1, or the HMGB1 antagonist BoxA one hour before myocardial ischemia (30 min) and reperfusion (24 hrs). Infarct size, cardiac troponin T, leukocyte infiltration, HMGB1 release, and TLR4, TLR9 and RAGE expression were quantified. HMGB1 plasma levels were measured in patients undergoing coronary artery bypass graft (CABG) surgery. The HMGB1 antagonist BoxA reduced cardiomyocyte necrosis during MI/R in WT mice, accompanied by reduced leukocyte infiltration. Injection of HMGB1 did not, however, increase infarct size in WT animals. In TLR2(-/-) hearts, neither BoxA nor HMGB1 affected infarct size. No differences in RAGE and TLR9 expression could be detected, while TLR2(-/-) mice displayed increased TLR4 and HMGB1 expression. Plasma levels of HMGB1 were increased after MI/R in TLR2(-/-) mice and after CABG surgery in patients carrying a TLR2 polymorphism (Arg753Gln). We here provide evidence that the absence of TLR2 signalling abrogates the infarct-sparing effects of HMGB1 blockade.
Background: Cell salvage is commonly used as part of a blood conservation strategy. However, concerns exist among clinicians about the efficacy of transfusing washed salvaged blood.
Methods: We performed a meta-analysis of randomized controlled trials in which patients, scheduled for all types of surgery, were randomized to washed cell salvage or to a control group with no cell salvage. Data were independently extracted, and risk ratios (RR) and weighted mean differences (WMD) with 95% confidence intervals (CIs) were calculated. Data were pooled using a random effects model. The primary endpoint was the number of patients exposed to allogeneic red blood cell (RBC) transfusion.
Results: Out of 1140 search results, a total of 47 trials were included. Overall, the use of washed cell salvage reduced the rate of exposure to allogeneic RBC transfusion by a relative 39% (RR = 0.61; 95% CI 0.57 to 0.65; P < 0.001), resulting in an average saving of 0.20 units of allogeneic RBC per patient (WMD = -0.20; 95% CI -0.22 to -0.18; P < 0.001), reduced the risk of infection by 28% (RR = 0.72; 95% CI 0.54 to 0.97; P = 0.03), and reduced the length of hospital stay by 2.31 days (WMD = -2.31; 95% CI -2.50 to -2.11; P < 0.001), but did not significantly affect the risk of mortality (RR = 0.92; 95% CI 0.63 to 1.34; P = 0.66). No statistically significant difference could be observed in the number of patients exposed to re-operation, plasma or platelet transfusion, or in the rates of myocardial infarction and stroke.
Conclusions: Washed cell salvage is efficacious in reducing the need for allogeneic RBC transfusion and risk of infection in surgery.
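The random-effects pooling described in the Methods is commonly performed with the DerSimonian-Laird estimator, which combines per-study log risk ratios weighted by the inverse of their within-study plus between-study variance. A minimal sketch with hypothetical per-study values (not the 47 included trials):

```python
import math

def pool_random_effects(rrs, ses):
    """DerSimonian-Laird random-effects pooling of risk ratios.
    rrs: per-study risk ratios; ses: standard errors of ln(RR).
    Returns (pooled RR, lower 95% CI, upper 95% CI)."""
    y = [math.log(rr) for rr in rrs]
    w = [1 / se**2 for se in ses]                      # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)            # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical per-study RRs and standard errors of ln(RR)
print(pool_random_effects([0.55, 0.70, 0.62], [0.10, 0.12, 0.08]))
```

When between-study heterogeneity (tau²) is zero, the estimate collapses to the fixed-effect result; otherwise the weights are flattened and the confidence interval widens.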
Background: Mild therapeutic hypothermia following cardiac arrest is neuroprotective, but its effect on myocardial dysfunction, a critical issue following resuscitation, is not clear. This study sought to examine whether hypothermia and the combination of hypothermia and pharmacological postconditioning are cardioprotective in a model of cardiopulmonary resuscitation following acute myocardial ischemia. Methodology/Principal Findings: Thirty pigs (28–34 kg) were subjected to cardiac arrest following left anterior descending coronary artery ischemia. After 7 minutes of ventricular fibrillation and 2 minutes of basic life support, advanced cardiac life support was started according to the current AHA guidelines. After successful return of spontaneous circulation (n = 21), coronary perfusion was reestablished after 60 minutes of occlusion, and animals were randomized to either normothermia at 38°C, hypothermia at 33°C or hypothermia at 33°C combined with sevoflurane (each group n = 7) for 24 hours. The effects on cardiac damage, especially inflammation, apoptosis, and remodeling, were studied using cellular and molecular approaches. Five animals were sham operated. Animals treated with hypothermia had lower troponin T levels (p<0.01), reduced infarct size (34±7 versus 57±12%; p<0.05) and improved left ventricular function compared to normothermia (p<0.05). Hypothermia was associated with a reduction in: (i) immune cell infiltration, (ii) apoptosis, (iii) IL-1beta and IL-6 mRNA up-regulation, and (iv) IL-1beta protein expression (p<0.05). Moreover, decreased matrix metalloproteinase-9 activity was detected in the ischemic myocardium after treatment with mild hypothermia. Sevoflurane conferred additional protective effects, although statistical significance was not reached. 
Conclusions/Significance: Hypothermia reduced myocardial damage and dysfunction after cardiopulmonary resuscitation, possibly via a reduced rate of apoptosis and reduced pro-inflammatory cytokine expression.
Introduction: Hypothermia improves survival and neurological recovery after cardiac arrest. Pro-inflammatory cytokines have been implicated in focal cerebral ischemia/reperfusion injury. It is unknown whether cardiac arrest also triggers the release of cerebral inflammatory molecules, and whether therapeutic hypothermia alters this inflammatory response. This study sought to examine whether hypothermia or the combination of hypothermia with anesthetic postconditioning with sevoflurane affects the cerebral inflammatory response after cardiopulmonary resuscitation. Methods: Thirty pigs (28–34 kg) were subjected to cardiac arrest following temporary coronary artery occlusion. After 7 minutes of ventricular fibrillation and 2 minutes of basic life support, advanced cardiac life support was started according to the current AHA guidelines. Return of spontaneous circulation was achieved in 21 animals, who were randomized to either normothermia at 38°C, hypothermia at 33°C, or hypothermia at 33°C combined with sevoflurane (each group: n = 7) for 24 hours. The effects of hypothermia and the combination of hypothermia with sevoflurane on the cerebral inflammatory response after cardiopulmonary resuscitation were studied using tissue samples from the cerebral cortex of pigs euthanized after 24 hours, employing quantitative RT-PCR and ELISA techniques. Results: Global cerebral ischemia following resuscitation resulted in significant upregulation of cerebral tissue inflammatory cytokine mRNA expression (mean +/- SD; interleukin (IL)-1beta 8.7 +/- 4.0, IL-6 4.3 +/- 2.6, IL-10 2.5 +/- 1.6, tumor necrosis factor (TNF)alpha 2.8 +/- 1.8, intercellular adhesion molecule-1 (ICAM-1) 4.0 +/- 1.9-fold compared with sham control) and IL-1beta protein concentration (1.9 +/- 0.6-fold compared with sham control). 
Hypothermia was associated with a significant (P < 0.05 versus normothermia) reduction in cerebral inflammatory cytokine mRNA expression (IL-1beta 1.7 +/- 1.0, IL-6 2.2 +/- 1.1, IL-10 0.8 +/- 0.4, TNFalpha 1.1 +/- 0.6, ICAM-1 1.9 +/- 0.7-fold compared with sham control). These results were also confirmed for IL-1beta at the protein level. Experimental settings employing hypothermia in combination with sevoflurane showed that the volatile anesthetic did not confer additional anti-inflammatory effects compared with hypothermia alone. Conclusions: Mild therapeutic hypothermia resulted in decreased expression of typical cerebral inflammatory mediators after cardiopulmonary resuscitation. This may confer, at least in part, neuroprotection following global cerebral ischemia and resuscitation.