Introduction: Hypothermia improves survival and neurological recovery after cardiac arrest. Pro-inflammatory cytokines have been implicated in focal cerebral ischemia/reperfusion injury. It is unknown whether cardiac arrest also triggers the release of cerebral inflammatory molecules, and whether therapeutic hypothermia alters this inflammatory response. This study sought to examine whether hypothermia or the combination of hypothermia with anesthetic postconditioning with sevoflurane affects the cerebral inflammatory response after cardiopulmonary resuscitation. Methods: Thirty pigs (28–34 kg) were subjected to cardiac arrest following temporary coronary artery occlusion. After 7 minutes of ventricular fibrillation and 2 minutes of basic life support, advanced cardiac life support was started according to the current AHA guidelines. Return of spontaneous circulation was achieved in 21 animals, which were randomized to either normothermia at 38°C, hypothermia at 33°C, or hypothermia at 33°C combined with sevoflurane (each group: n = 7) for 24 hours. The effects of hypothermia and the combination of hypothermia with sevoflurane on the cerebral inflammatory response after cardiopulmonary resuscitation were studied using tissue samples from the cerebral cortex of pigs euthanized after 24 hours, employing quantitative RT-PCR and ELISA techniques. Results: Global cerebral ischemia following resuscitation resulted in significant upregulation of cerebral tissue inflammatory cytokine mRNA expression (mean ± SD; interleukin (IL)-1β 8.7 ± 4.0, IL-6 4.3 ± 2.6, IL-10 2.5 ± 1.6, tumor necrosis factor (TNF)-α 2.8 ± 1.8, intercellular adhesion molecule-1 (ICAM-1) 4.0 ± 1.9-fold compared with sham controls) and IL-1β protein concentration (1.9 ± 0.6-fold compared with sham controls).
Hypothermia was associated with a significant (P < 0.05 versus normothermia) reduction in cerebral inflammatory cytokine mRNA expression (IL-1β 1.7 ± 1.0, IL-6 2.2 ± 1.1, IL-10 0.8 ± 0.4, TNF-α 1.1 ± 0.6, ICAM-1 1.9 ± 0.7-fold compared with sham controls). These results were also confirmed for IL-1β at the protein level. Experiments combining hypothermia with sevoflurane showed that the volatile anesthetic did not confer additional anti-inflammatory effects compared with hypothermia alone. Conclusions: Mild therapeutic hypothermia resulted in decreased expression of typical cerebral inflammatory mediators after cardiopulmonary resuscitation. This may confer, at least in part, neuroprotection following global cerebral ischemia and resuscitation.
Introduction: Immune paralysis with massive T-cell apoptosis is a central pathogenic event during sepsis and correlates with septic patient mortality. Previous observations implied a crucial role of peroxisome proliferator-activated receptor gamma (PPARγ) during T-cell apoptosis.
Methods: To elucidate mechanisms of PPARγ-induced T-cell depletion, we used an endotoxin model as well as the caecal ligation and puncture sepsis model to imitate septic conditions in wild-type versus conditional PPARγ knockout (KO) mice.
Results: PPARγ KO mice showed a marked survival advantage compared with control mice. Their T cells were substantially protected against sepsis-induced death and showed a significantly higher expression of the pro-survival factor IL-2. Since PPARγ is described to repress nuclear factor of activated T cells (NFAT) transactivation and concomitant IL-2 expression, we propose inhibition of NFAT as the underlying mechanism allowing T-cell apoptosis. Corroborating our hypothesis, we observed up-regulation of the pro-apoptotic protein BIM and downregulation of the anti-apoptotic protein Bcl-2 in control mice, which are downstream effector proteins of IL-2 receptor signaling. Application of a neutralizing anti-IL-2 antibody reversed the pro-survival effect of PPARγ-deficient T cells and confirmed IL-2-dependent apoptosis during sepsis.
Conclusion: Antagonizing PPARγ in T cells might improve their survival during sepsis, which concomitantly enhances defence mechanisms and possibly increases the survival of septic patients.
Background: Bacterial DNA containing motifs of unmethylated CpG dinucleotides (CpG-ODN) initiates an innate immune response mediated by the pattern recognition receptor Toll-like receptor 9 (TLR9). This leads in particular to the expression of proinflammatory mediators such as tumor necrosis factor alpha (TNF-α) and interleukin-1β (IL-1β). TLR9 is expressed in human and murine pulmonary tissue, and induction of proinflammatory mediators has been linked to the development of acute lung injury. Therefore, we tested the hypothesis that CpG-ODN administration induces an inflammatory response in the lung via TLR9 in vivo. Methods: Wild-type (WT) and TLR9-deficient (TLR9-D) mice received CpG-ODN intraperitoneally (1668-Thioat, 1 nmol/g BW) and were observed for up to 6 hrs. Lung tissue and plasma samples were taken and various inflammatory markers were measured. Results: In WT mice, CpG-ODN induced a strong activation of pulmonary NF-κB as well as a significant increase in pulmonary TNF-α and IL-1β mRNA/protein. In addition, cytokine serum levels were significantly elevated in WT mice. Increased pulmonary myeloperoxidase (MPO) content was documented in WT mice following application of CpG-ODN. Bronchoalveolar lavage (BAL) revealed that CpG-ODN stimulation significantly increased total cell number as well as neutrophil count in WT animals. In contrast, the CpG-ODN-induced inflammatory response was abolished in TLR9-D mice. Conclusion: This study suggests that bacterial CpG-ODN causes lung inflammation via TLR9.
Loss of vascular barrier function causes leak of fluid and proteins into tissues; extensive leak leads to shock and death. Barriers are largely formed by endothelial cell-cell contacts built up by VE-cadherin and are under the control of Rho GTPases. Here we show that a natural plasmin digest product of fibrin, peptide Bβ15-42 (also called FX06), significantly reduces vascular leak and mortality in animal models of Dengue shock syndrome. The ability of Bβ15-42 to preserve endothelial barriers is confirmed in rats i.v.-injected with LPS. In endothelial cells, Bβ15-42 prevents thrombin-induced stress fiber formation, myosin light chain phosphorylation and RhoA activation. The molecular key to the protective effect of Bβ15-42 is the src kinase Fyn, which associates with VE-cadherin-containing junctions. Following exposure to Bβ15-42, Fyn dissociates from VE-cadherin and associates with p190RhoGAP, a known antagonist of RhoA activation. The role of Fyn in transducing the effects of Bβ15-42 is confirmed in Fyn-/- mice, where the peptide is unable to reduce LPS-induced lung edema, whereas in wild-type littermates the peptide significantly reduces leak. Our results demonstrate a novel function for Bβ15-42. Formerly considered mainly a degradation product occurring after fibrin inactivation, it must now also be considered a signaling molecule. It stabilizes endothelial barriers and thus could be an attractive adjuvant in the treatment of shock.
Background: During gram-negative sepsis, lipopolysaccharide (LPS) induces tissue factor expression on monocytes. The resulting disseminated intravascular coagulation leads to tissue ischemia and worsens the prognosis of septic patients. There are indications that fever reduces the mortality of sepsis, but its effect on monocyte tissue factor activity is unknown. Therefore, we investigated whether heat shock modulates LPS-induced tissue factor activity in human blood. Methods: Whole blood samples and leukocyte suspensions from healthy volunteers (n = 12) were incubated with LPS for 2 hours under heat shock conditions (43°C) or control conditions (37°C). After a further 3 hours of incubation at 37°C, the clotting time, a measure of tissue factor expression, was determined. Cell integrity was verified by trypan blue exclusion test and FACS analysis. Results: Incubation of whole blood samples with LPS for 5 hours at normothermia resulted in a significant shortening of clotting time from 357 ± 108 s to 82 ± 8 s compared to samples incubated without LPS (n = 12; p < 0.05). This LPS effect was mediated by tissue factor, as inhibition with active site-inhibited factor VIIa (ASIS) abolished the effect of LPS on clotting time. Blockade of protein synthesis using cycloheximide demonstrated that LPS exerted its procoagulatory effect via an induction of tissue factor expression. Upon heat shock treatment, the LPS effect was blunted: clotting times were 312 ± 66 s in the absence of LPS and 277 ± 65 s in the presence of LPS (n = 8; p > 0.05). Similarly, heat shock treatment of leukocyte suspensions abolished the LPS-induced tissue factor activity. Clotting time was 73 ± 31 s when cells were treated with LPS (100 ng/mL) under normothermic conditions, and 301 ± 118 s when treated with LPS (100 ng/mL) and heat shock (n = 8, p < 0.05). Control experiments excluded cell damage as a potential cause of the observed heat shock effect.
Conclusion Heat shock treatment inhibits LPS-induced tissue factor activity in human whole blood samples and isolated leukocytes.
Background and objectives: Preoperative anaemia is an independent risk factor for higher morbidity and mortality, longer hospitalization and increased perioperative transfusion rates. Managing preoperative anaemia is the first of the three pillars of Patient Blood Management (PBM), a multidisciplinary concept to improve patient safety. While various studies provide medical information on (successful) anaemia treatment pathways, knowledge of the organizational details of diagnosing and managing preoperative anaemia across Europe is scarce.
Materials and methods: To gain information on various aspects of preoperative anaemia management including organization, financing, diagnostics and treatment, we conducted a survey (74 questions) in ten hospitals from seven European nations within the PaBloE (Patient Blood Management in Europe) working group covering the year 2016.
Results: Organization and activity in the field of preoperative anaemia management were heterogeneous in the participating hospitals. Almost all hospitals had pathways for managing preoperative anaemia in place; however, only two nations had national guidelines. In six of the ten participating hospitals, preoperative anaemia management was organized by anaesthetists. Diagnostics and treatment focused on iron deficiency anaemia, which, in most hospitals, was corrected with intravenous iron.
Conclusion: Implementation and approaches of preoperative anaemia management vary across Europe with a primary focus on treating iron deficiency anaemia. Findings of this survey motivated the hospitals involved to critically evaluate their practice and may also help other hospitals interested in PBM to develop action plans for diagnosis and management of preoperative anaemia.
Background: Clonidine effectively decreases perioperative mortality by reducing sympathetic tone. However, clonidine might also restrict anaemia tolerance by impairing compensatory mechanisms. Therefore, the influence of clonidine-induced, short-term sympathicolysis on anaemia tolerance was assessed in anaesthetized pigs. We measured the effect of clonidine on anaemia tolerance and the potential of macrohaemodynamic alterations to constrain the compensatory mechanisms of acute anaemia.
Methods: After governmental approval, 14 anaesthetized pigs of either sex (Deutsche Landrasse, weight (mean ± SD) 24.1 ± 2.4 kg) were randomly assigned to intravenous saline or clonidine treatment (bolus: 20 μg · kg−1, continuous infusion: 15 μg · kg−1 · h−1). Thereafter, the animals were haemodiluted by exchange of whole blood for 6% hydroxyethyl starch (MW 130,000/0.4) until the individual critical haemoglobin concentration (Hbcrit) was reached. Primary outcome parameters were Hbcrit and the exchangeable blood volume (EBV) until Hbcrit was reached.
Results: Hbcrit did not differ between the groups (values are median (interquartile range): saline 2.2 (2.0–2.5) g · dL−1 vs. clonidine 2.1 (2.1–2.4) g · dL−1; n.s.). Furthermore, there was no difference in exchangeable blood volume (EBV) between the groups (saline 88 (76–106) mL · kg−1 vs. clonidine 92 (85–95) mL · kg−1; n.s.).
Conclusion: Anaemia tolerance was not affected by clonidine-induced sympathicolysis. Consequently, perioperative clonidine administration probably need not be withheld out of concern for acute anaemia.
Background: Extracorporeal life support (ECLS) has become an integral part of modern intensive therapy. The choice of support mode depends largely on the indication. Patients with respiratory failure are predominantly treated with a venovenous (VV) approach. We hypothesized that the mortality of ECLS therapy in Germany did not differ from that previously reported in the literature.
Methods: Inpatient data from Germany from 2007 to 2018 provided by the Federal Statistical Office of Germany were analysed. Codes from the International Statistical Classification of Diseases and Related Health Problems (ICD) and the German procedure classification (OPS) were used to identify extracorporeal membrane oxygenation (ECMO) types, acute respiratory distress syndrome (ARDS) and hospital mortality.
Results: In total, 45,647 hospitalized patients treated with ECLS were analysed. In Germany, 231 hospitals provided ECLS therapy, with a median of 4 VV-ECMO and 9 VA-ECMO cases in 2018. Overall hospital mortality remained higher than the values reported in the literature. The number of VV-ECMO cases increased by 236%, from 825 in 2007 to 2768 in 2018. ARDS was the main indication for VV-ECMO in only 33% of patients in the past, but that proportion increased to 60% in 2018. VA-ECMO support is of minor importance in the treatment of ARDS in Germany. The age distribution of patients undergoing ECLS has shifted towards an older population. In 2018, hospital mortality decreased to 53.9% (n = 1493) in VV-ECMO patients and 54.4% (n = 926) in VV-ECMO patients with ARDS.
Conclusions: ARDS is a severe disease with a high mortality rate despite ECLS therapy. Although endpoints and timing of the evaluations differed from those of the CESAR and EOLIA studies and the Extracorporeal Life Support Organization (ELSO) Registry, the reported mortality in these studies was lower than in the present analysis. Further prospective analyses are necessary to evaluate outcomes in ECMO therapy at the centre volume level.
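The reported growth in VV-ECMO case numbers can be reproduced directly from the counts given in the abstract; a minimal sketch (the rounding convention is assumed):

```python
# VV-ECMO case counts taken from the abstract (2007 and 2018).
cases_2007 = 825
cases_2018 = 2768

# Relative increase in percent; the abstract rounds to a whole number.
pct_increase = (cases_2018 - cases_2007) / cases_2007 * 100
print(round(pct_increase))  # -> 236
```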
Background: Every year, ~ 210,000 initial implantations of hip endoprostheses are carried out in Germany alone. The “bone cement implantation syndrome” (BCIS) is considered a severe peri- and early-postoperative complication when implanting cemented prostheses. The origin of the BCIS and its impact on the clinical outcome are still uncertain. This study investigates the clinical progression after BCIS cases in patients with cemented hemiarthroplasty. Risk factors for the occurrence of BCIS are evaluated.
Materials and methods: Clinical data were collected for all patients with a proximal femoral fracture who received a cemented hemiarthroplasty within a period of 9.5 years. BCIS (+) patients and BCIS (−) patients were compared with respect to their demographics and clinical outcome. Risk factors for the development of BCIS were identified.
Results: A total of 208 patients with complete data sets were included. The mean age was 81.1 ± 10.0 years. Overall, 37% of the patients showed symptoms of BCIS. In comparison to BCIS (−) patients, BCIS (+) patients had a significantly higher rate of cardiovascular complications (27.3% vs. 13.7%, p = 0.016) and a higher in-hospital mortality rate (15.6% vs. 4.6%, p = 0.006). Age, absence of a femoral borehole and ASA status were identified as statistically significant risk factors for BCIS.
Conclusion: BCIS is a frequently observed and, in some cases, severe complication. Therapy is exclusively symptomatic; identifying preventive measures might reduce the occurrence of BCIS.
The administration of intravenous fluid to critically ill patients is one of the most common but also one of the most fiercely debated interventions in intensive care medicine. During the past decade, a number of important studies have been published which provide clinicians with improved knowledge regarding the timing, the type and the amount of fluid they should give to their critically ill patients. However, despite the fact that many thousands of patients have been enrolled in these trials of alternative fluid strategies, consensus remains elusive and practice is widely variable. Early adequate resuscitation of patients in shock followed by a restrictive strategy may be associated with better outcomes. Colloids such as modern hydroxyethyl starch are more effective than crystalloids in early resuscitation of patients in shock, and are safe when administered during surgery. However, these colloids may not be beneficial later in the course of intensive care treatment and should best be avoided in intensive care patients who have a high risk of developing acute kidney injury. Albumin has no clear benefit over saline and is associated with increased mortality in neurotrauma patients. Balanced fluids reduce the risk of hyperchloraemic acidosis and possibly kidney injury. The use of hypertonic fluids in patients with sepsis and acute lung injury warrants further investigation and should be considered experimental at this stage. Fluid therapy impacts relevant patient-related outcomes. Clinicians should adopt an individualized strategy based on the clinical scenario and best available evidence. One size does not fit all.
This position paper is the second ESCMID Consensus Document on this subject and aims to provide intensivists, infectious disease specialists, and emergency physicians with a standardized approach to the management of serious travel-related infections in the intensive care unit (ICU) or the emergency department. This document is a cooperative effort between members of two European Society of Clinical Microbiology and Infectious Diseases (ESCMID) study groups and was coordinated by Hakan Leblebicioglu and Jordi Rello for ESGITM (ESCMID Study Group for Infections in Travellers and Migrants) and ESGCIP (ESCMID Study Group for Infections in Critically Ill Patients), respectively. A relevant expert on the subject of each section prepared the first draft which was then edited and approved by additional members from both ESCMID study groups. This article summarizes considerations regarding clinical syndromes requiring ICU admission in travellers, covering immunocompromised patients.
Introduction: Sepsis remains associated with a high mortality rate. Endotoxin has been shown to influence viscoelastic coagulation parameters, thus suggesting a link between endotoxin levels and the altered coagulation phenotype in septic patients. This study evaluated the effects of systemic polyspecific IgM-enriched immunoglobulin (IgM-IVIg) (Pentaglobin® [Biotest, Dreieich, Germany]) on endotoxin activity (EA), inflammatory markers, viscoelastic and conventional coagulation parameters.
Methods: Patients with severe sepsis were identified by daily screening in a tertiary, academic, surgical ICU. After the inclusion of 15 patients, the application of IgM-IVIg (5 mg/kg/d over three days) was integrated into the unit's standard operating procedure (SOP) to treat patients with severe sepsis, thereby generating "control" and "IgM-IVIg" groups. EA assays, thrombelastometry (ROTEM®) and impedance aggregometry (Multiplate®) were performed on whole blood. Furthermore, routine laboratory parameters were determined according to the unit's standards.
Results: Data from 26 patients were included. On day 1, EA was significantly decreased in the IgM-IVIg group following 6 and 12 hours of treatment (0.51 ± 0.06 vs. 0.26 ± 0.07, p < 0.05, and 0.51 ± 0.06 vs. 0.25 ± 0.04, p < 0.05) and differed significantly compared with the control group following 6 hours of treatment (0.26 ± 0.07 vs. 0.43 ± 0.07, p < 0.05). The platelet count was significantly higher in the IgM-IVIg group following four days of IgM-IVIg treatment (200/nl ± 43 vs. 87/nl ± 20, p < 0.05). The fibrinogen concentration was significantly lower in the control group on day 2 (311 mg/dl ± 37 vs. 475 mg/dl ± 47, p = 0.015) and day 4 (307 mg/dl ± 35 vs. 420 mg/dl ± 16, p = 0.017). No differences in thrombelastometric or aggregometric measurements, or inflammatory markers (interleukin-6 (IL-6), leukocyte count, lipopolysaccharide binding protein (LBP)) were observed.
Conclusion: Treatment with IgM-enriched immunoglobulin attenuates the EA levels in patients with severe sepsis and might have an effect on septic thrombocytopenia and fibrinogen depletion. Viscoelastic, aggregometric and inflammatory parameters were not influenced.
Introduction: Balanced fluid replacement solutions can possibly reduce the risks for electrolyte imbalances, for acid-base imbalances, and thus for renal failure. To assess the intraoperative change of base excess (BE) and chloride in serum after treatment with either a balanced gelatine/electrolyte solution or a non-balanced gelatine/electrolyte solution, a prospective, controlled, randomized, double-blind, dual centre phase III study was conducted in two tertiary care university hospitals in Germany.
Material and methods: Forty patients of both sexes, aged 18 to 90 years, who were scheduled to undergo elective abdominal surgery with an assumed intraoperative volume requirement of at least 15 mL/kg body weight of gelatine solution were included. The study drug was administered intravenously according to the patients' needs. The trigger for volume replacement was a central venous pressure (CVP) minus positive end-expiratory pressure (PEEP) < 10 mmHg (CVP < 10 mmHg). The crystalloid:colloid ratio was 1:1 intra- and postoperatively. The targets for volume replacement were a CVP between 10 and 14 mmHg minus PEEP after treatment with a vasoactive agent and a mean arterial pressure (MAP) > 65 mmHg.
Results: The primary endpoints differed significantly between groups: the intraoperative change of base excess was –2.59 ± 2.25 (median –2.65) mmol/L in the balanced group versus –4.79 ± 2.38 (median –4.70) mmol/L in the non-balanced group (p = 0.0117), and the change in serum chloride was 2.4 ± 1.9 (median 3.0) mmol/L versus 5.2 ± 3.1 (median 5.0) mmol/L (p = 0.0045). In both groups (each n = 20) the investigational product administration in terms of volume and infusion rate was comparable throughout the course of the study, i.e. before, during and after surgery.
Discussion: Balanced gelatine solution 4% combined with a balanced electrolyte solution demonstrated a significantly smaller impact on the blood gas parameters defined as primary endpoints, BE and serum chloride, than a non-balanced gelatine solution 4% combined with NaCl 0.9%. No marked treatment differences were observed with respect to haemodynamics, coagulation and renal function.
Trial registration: ClinicalTrials.gov (NCT01515397) and clinicaltrialsregister.eu, EudraCT number 2010-018524-58.
Background: Patient Blood Management (PBM) is a systematic, quality-improving clinical model to reduce anemia and avoid transfusions in all kinds of clinical settings. Here, we investigated the potential of PBM in oncologic surgery and hypothesized that PBM improves 2-year overall survival (OS).
Methods: Retrospective analysis of patients 2 years before and after PBM implementation. The primary endpoint was OS at 2 years after surgery. We calculated a required sample size of 824 to detect a 10% improvement in survival in the PBM group.
Results: The analysis comprised 836 patients who underwent oncologic surgery, 389 before and 447 after PBM was implemented. Patients in the PBM+ group presented with normal hemoglobin values before surgery significantly more frequently than those in the PBM− group (56.6 vs. 35.7%; p < 0.001). The number of transfusions was significantly reduced from 5.5 ± 11.1 to 3.0 ± 6.9 units/patient (p < 0.001); moreover, the percentage of patients transfused during the hospital stay was significantly reduced from 62.4 to 40.9% (p < 0.001). Two-year OS was significantly better in the PBM+ group, increasing from 67.0 to 80.1% (p = 0.001). A normal hemoglobin value (> 12 g/dl in females and > 13 g/dl in males) before surgery (HR 0.43, 95% CI 0.29–0.65, p < 0.001) was the only independent predictive factor positively affecting survival.
Conclusions: PBM is a quality improvement tool that is associated with better mid-term surgical oncologic outcome. The root cause for improvement is the increase of patients entering surgery with normal hemoglobin values.
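The haemoglobin cut-offs used in this abstract (> 12 g/dl in females, > 13 g/dl in males) can be expressed as a small helper; this is an illustrative sketch, and the function name is ours, not the study's:

```python
def hb_is_normal(hb_g_dl: float, sex: str) -> bool:
    """Pre-operative haemoglobin check using the abstract's cut-offs:
    > 12 g/dl for females, > 13 g/dl for males."""
    threshold = 12.0 if sex == "female" else 13.0
    return hb_g_dl > threshold

print(hb_is_normal(12.5, "female"))  # -> True
print(hb_is_normal(12.5, "male"))    # -> False
```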
Introduction: The triggering receptor expressed on myeloid cells-1 (TREM-1) is known to be expressed during bacterial infections. We investigated whether TREM-1 is also expressed in non-infectious inflammation following traumatic lung contusion.
Methods: In a study population of 45 adult patients with multiple trauma and lung contusion, we obtained bronchoalveolar lavage (BAL) fluid (blind suctioning of 20 ml NaCl 0.9% via jet catheter) and collected blood samples at two time points (16 hours and 40 hours) after trauma. Post hoc, patients were assigned to one of four groups classified radiologically according to the severity of lung contusion based on the initial chest tomography. The concentration of soluble TREM-1 (sTREM-1) and bacterial growth were determined in the BAL fluid. sTREM-1, IL-6, IL-10, lipopolysaccharide binding protein, procalcitonin, C-reactive protein and leukocyte count were assessed in blood samples. Pulmonary function was evaluated by the paO2/FiO2 ratio.
Results: Three patients were excluded due to positive bacterial growth in the initial BAL. In 42 patients the severity of lung contusion correlated with the levels of sTREM-1 16 hours and 40 hours after trauma. sTREM-1 levels were significantly (P < 0.01) elevated in patients with severe contusion (2,184 pg/ml (620 to 4,000 pg/ml)) in comparison with patients with mild (339 pg/ml (135 to 731 pg/ml)) or no (217 pg/ml (97 to 701 pg/ml)) contusion 40 hours following trauma. At both time points the paO2/FiO2 ratio correlated negatively with sTREM-1 levels (Spearman correlation coefficient = -0.446, P < 0.01).
Conclusions: sTREM-1 levels are elevated in the BAL of patients following pulmonary contusion. Furthermore, the levels of sTREM-1 in the BAL correlate well with both the severity of radiological pulmonary tissue damage and functional impairment of gas exchange (paO2/FiO2 ratio).
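The negative association reported above is a Spearman rank correlation. A minimal pure-Python sketch of the statistic (the tie-free shortcut formula), applied to invented illustrative values, not the study's data:

```python
def spearman_rho(x, y):
    """Spearman's rho for tie-free samples: 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical sTREM-1 (pg/ml) and paO2/FiO2 (mmHg) pairs, for illustration only.
strem1 = [217, 339, 620, 1500, 2184]
pao2_fio2 = [310, 280, 230, 250, 120]
print(round(spearman_rho(strem1, pao2_fio2), 2))  # -> -0.9
```

Higher sTREM-1 ranks pair with lower oxygenation ranks, giving a strongly negative rho, mirroring the direction of the correlation reported in the abstract.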
Introduction: Organ dysfunction or failure arising after the first days of ICU treatment, and the associated mortality with respect to the type of intensive care unit (ICU) admission, are poorly elucidated. Therefore, we analyzed the association between ICU mortality and admission for medical (M), scheduled surgery (ScS) or unscheduled surgery (US) patients, as mirrored by the occurrence of organ dysfunction/failure (OD/OF) after the first 72 h of ICU stay.
Methods: For this retrospective cohort study (23,795 patients; registry of the German Interdisciplinary Association for Intensive Care Medicine (DIVI)), organ dysfunction or failure was derived from the Sequential Organ Failure Assessment (SOFA) score (excluding the Glasgow Coma Scale). SOFA scores were collected on admission to the ICU and 72 h later. For patients with a length of stay of at least five days, a multivariate analysis was performed for individual OD/OF on day three.
Results: M patients had the lowest prevalence of cardiovascular failure (M 31%; ScS 35%; US 38%) and the highest prevalence of respiratory (M 24%; ScS 13%; US 17%) and renal failure (M 10%; ScS 6%; US 7%). Risk of death was highest for M and ScS patients with respiratory failure (OR: M 2.4; ScS 2.4; US 1.4) and for surgical patients with renal failure (OR: M 1.7; ScS 2.7; US 2.4).
Conclusion: The dynamic evolution of OD/OF within 72 h after ICU admission and the associated mortality differed between patients depending on their type of admission. This has to be considered to exclude a systematic bias in multi-center trials.
Background: Numerous cases of swine-origin 2009 H1N1 influenza A virus (H1N1)-associated acute respiratory distress syndrome (ARDS) bridged by extracorporeal membrane oxygenation (ECMO) therapy have been reported; however, complication rates are high. We present our experience with H1N1-associated ARDS and successful bridging of lung function using superimposed high-frequency jet ventilation (SHFJV) in combination with continuous positive airway pressure/assisted spontaneous breathing (CPAP/ASB).
Methods: We admitted five patients with H1N1 infection and ARDS to our intensive care unit. Despite pure oxygen and controlled ventilation in all patients, oxygenation remained insufficient. We applied SHFJV/CPAP/ASB to improve oxygenation.
Results: The initial PaO2/FiO2 ratio prior to SHFJV was 58-79 mmHg. In all patients, successful oxygenation was achieved by SHFJV (PaO2/FiO2 ratio 105-306 mmHg within 24 h). Spontaneous breathing was established during the first hours after admission. SHFJV could be stopped after 39, 40, 72, 100 and 240 h, respectively. Concomitant pulmonary herpes simplex virus (HSV) infection was observed in all patients. Two patients were successfully discharged. The other three patients relapsed and died within 7 weeks, mainly due to combined HSV infection and, in two cases, recurring H1N1 infection.
Conclusions: SHFJV represents an alternative to bridge lung function successfully and improve oxygenation in the critically ill.
Introduction: Acute kidney injury (AKI) can evolve quickly and clinical measures of function often fail to detect AKI at a time when interventions are likely to provide benefit. Identifying early markers of kidney damage has been difficult due to the complex nature of human AKI, in which multiple etiologies exist. The objective of this study was to identify and validate novel biomarkers of AKI.
Methods: We performed two multicenter observational studies in critically ill patients at risk for AKI - discovery and validation. The top two markers from discovery were validated in a second study (Sapphire) and compared to a number of previously described biomarkers. In the discovery phase, we enrolled 522 adults in three distinct cohorts including patients with sepsis, shock, major surgery, and trauma and examined over 300 markers. In the Sapphire validation study, we enrolled 744 adult subjects with critical illness and without evidence of AKI at enrollment; the final analysis cohort was a heterogeneous sample of 728 critically ill patients. The primary endpoint was moderate to severe AKI (KDIGO stage 2 to 3) within 12 hours of sample collection.
Results: Moderate to severe AKI occurred in 14% of Sapphire subjects. The two top biomarkers from discovery were validated. Urine insulin-like growth factor-binding protein 7 (IGFBP7) and tissue inhibitor of metalloproteinases-2 (TIMP-2), both inducers of G1 cell cycle arrest, a key mechanism implicated in AKI, together demonstrated an AUC of 0.80 (0.76 and 0.79 alone). Urine [TIMP-2]·[IGFBP7] was significantly superior to all previously described markers of AKI (P < 0.002), none of which achieved an AUC > 0.72. Furthermore, [TIMP-2]·[IGFBP7] significantly improved risk stratification when added to a nine-variable clinical model when analyzed using Cox proportional hazards model, generalized estimating equation, integrated discrimination improvement or net reclassification improvement. Finally, in sensitivity analyses [TIMP-2]·[IGFBP7] remained significant and superior to all other markers regardless of changes in reference creatinine method.
Conclusions: Two novel markers for AKI have been identified and validated in independent multicenter cohorts. Both markers are superior to existing markers, provide additional information over clinical variables and add mechanistic insight into AKI. Trial registration: ClinicalTrials.gov number NCT01209169.
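The AUC figures reported for [TIMP-2]·[IGFBP7] can be illustrated with a small rank-based sketch: the AUC equals the probability that a randomly chosen case scores higher than a randomly chosen control (the Mann-Whitney interpretation). All values below are invented for illustration, not Sapphire data.

```python
# Hypothetical sketch (invented values, not Sapphire data): the AUC of a
# combined marker such as [TIMP-2]*[IGFBP7] equals the probability that a
# randomly chosen case ranks above a randomly chosen control (Mann-Whitney).

def auc(cases, controls):
    """Rank-based AUC: fraction of case/control pairs ordered correctly,
    counting ties as half."""
    wins = 0.0
    for x in cases:
        for y in controls:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# invented [TIMP-2]*[IGFBP7] products (arbitrary units)
aki = [0.9, 1.4, 2.1, 0.7, 3.0]          # patients reaching KDIGO stage 2-3
no_aki = [0.2, 0.5, 0.8, 0.3, 0.6, 0.4]  # patients without moderate/severe AKI

print(round(auc(aki, no_aki), 3))  # → 0.967
```

An AUC of 0.5 corresponds to a marker with no discrimination, 1.0 to perfect separation of cases from controls.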
Genetic or pharmacological ablation of toll-like receptor 2 (TLR2) protects against myocardial ischemia/reperfusion injury (MI/R). However, the endogenous ligand responsible for TLR2 activation has not yet been detected. The objective of this study was to identify HMGB1 as an activator of TLR2 signalling during MI/R. C57BL/6 wild-type (WT) or TLR2(-/-) mice were injected with vehicle, HMGB1, or the HMGB1 antagonist BoxA one hour before myocardial ischemia (30 min) and reperfusion (24 hrs). Infarct size, cardiac troponin T, leukocyte infiltration, HMGB1 release, and TLR4, TLR9 and RAGE expression were quantified. HMGB1 plasma levels were measured in patients undergoing coronary artery bypass graft (CABG) surgery. The HMGB1 antagonist BoxA reduced cardiomyocyte necrosis during MI/R in WT mice, accompanied by reduced leukocyte infiltration. Injection of HMGB1, however, did not increase infarct size in WT animals. In TLR2(-/-) hearts, neither BoxA nor HMGB1 affected infarct size. No differences in RAGE and TLR9 expression could be detected, while TLR2(-/-) mice displayed increased TLR4 and HMGB1 expression. Plasma levels of HMGB1 were increased after MI/R in TLR2(-/-) mice and after CABG surgery in patients carrying a TLR2 polymorphism (Arg753Gln). We here provide evidence that the absence of TLR2 signalling abrogates the infarct-sparing effects of HMGB1 blockade.
In contrast to several smaller studies, which demonstrated that remote ischemic preconditioning (RIPC) reduces myocardial injury in patients undergoing cardiovascular surgery, the RIPHeart study failed to demonstrate beneficial effects on troponin release and clinical outcome in propofol-anesthetized cardiac surgery patients. Therefore, we addressed the potential biochemical mechanisms triggered by RIPC. This is a predefined prospective sub-analysis of the recently published randomized and controlled RIPHeart study in cardiac surgery patients (n = 40). Blood samples were drawn from patients prior to surgery, after RIPC consisting of four cycles of 5 min arm ischemia/5 min reperfusion (n = 19) or the sham procedure (n = 21), after connection to cardiopulmonary bypass (CPB), at the end of surgery, and 24 h and 48 h postoperatively, for the measurement of troponin T, macrophage migration inhibitory factor (MIF), stromal cell-derived factor 1 (CXCL12), IL-6, CXCL8, and IL-10. After RIPC, right atrial tissue samples were taken for the measurement of extracellular signal-regulated kinase (ERK1/2), protein kinase B (AKT), glycogen synthase kinase 3 (GSK-3β), protein kinase C (PKCε), and MIF content. RIPC did not significantly reduce troponin release when compared with the sham procedure. MIF serum levels increased intraoperatively, peaking at intensive care unit (ICU) admission (with an increase of 48.04%, p = 0.164, in RIPC and 69.64%, p = 0.023, over baseline in the sham procedure), and decreased back to baseline 24 h after surgery, with no differences between the groups. In the right atrial tissue, MIF content decreased after RIPC (1.040 ± 1.032 arbitrary units [au] in RIPC vs. 2.028 ± 1.631 [au] in the sham procedure, p < 0.05). CXCL12 serum levels increased significantly over baseline at the end of surgery, with no differences between the groups.
ERK1/2, AKT, GSK-3β, and PKCɛ phosphorylation in the right atrial samples did not differ between the groups. No difference was found in IL-6, CXCL8, and IL-10 serum levels between the groups. In this cohort of cardiac surgery patients who received propofol anesthesia, we could not show a release of potential signaling mediators, an effect on the inflammatory response, or an activation of well-established protein kinases after RIPC. Based on these data, we cannot exclude that confounding factors, such as propofol, may have interfered with RIPC.
Background: Remote ischemic preconditioning (RIPC) has been shown to enhance the tolerance of remote organs to cope with a subsequent ischemic event. We hypothesized that RIPC reduces postoperative neurocognitive dysfunction (POCD) in patients undergoing complex cardiac surgery.
Methods: We conducted a prospective, randomized, double-blind, controlled trial including 180 adult patients undergoing elective cardiac surgery with cardiopulmonary bypass. Patients were randomized either to the RIPC group or to the control group. The primary endpoint was postoperative neurocognitive dysfunction 5–7 days after surgery assessed by a comprehensive test battery. Cognitive change was assumed if the preoperative to postoperative difference exceeded one SD in 2 or more tasks assessing different cognitive domains (1 SD criterion), or if the combined Z score was 1.96 or greater (Z score criterion).
Results: According to the 1 SD criterion, 52% of control and 46% of RIPC patients had cognitive deterioration 5–7 days after surgery (p = 0.753). The summarized Z score showed a trend toward more cognitive decline in the control group (2.16±5.30) compared to the RIPC group (1.14±4.02; p = 0.228). Three months after surgery, the incidence and severity of neurocognitive dysfunction did not differ between control and RIPC. RIPC tended to decrease postoperative troponin T release at both 12 hours [0.60 (0.19–1.94) µg/L vs. 0.48 (0.07–1.84) µg/L] and 24 hours after surgery [0.36 (0.14–1.89) µg/L vs. 0.26 (0.07–0.90) µg/L].
Conclusions: We failed to demonstrate efficacy of a RIPC protocol with respect to incidence and severity of POCD and secondary outcome variables in patients undergoing a wide range of cardiac surgery. Therefore, definitive large-scale multicenter trials are needed.
Trial Registration: ClinicalTrials.gov NCT00877305
BACKGROUND: Transient episodes of ischemia in a remote organ or tissue (remote ischemic preconditioning, RIPC) can attenuate myocardial injury. Myocardial damage is associated with tissue remodeling and the matrix metalloproteinases 2 and 9 (MMP-2/9) are crucially involved in these events. Here we investigated the effects of RIPC on the activities of heart tissue MMP-2/9 and their correlation with serum concentrations of cardiac troponin T (cTnT), a marker for myocardial damage.
METHODS: In cardiosurgical patients with cardiopulmonary bypass (CPB), RIPC was induced by four 5-minute cycles of upper limb ischemia/reperfusion. Cardiac tissue was obtained before as well as after CPB, and serum cTnT concentrations were measured. Tissue derived from control patients (N = 17) with high cTnT concentrations (≥0.32 ng/ml) and RIPC patients (N = 18) with low cTnT (≤0.32 ng/ml) was subjected to gelatin zymography to quantify MMP-2/9 activities.
RESULTS: In cardiac biopsies obtained before CPB, activities of MMP-2/9 were attenuated in the RIPC group (MMP-2: Control, 1.13 ± 0.13 a.u.; RIPC, 0.71 ± 0.12 a.u.; P < 0.05. MMP-9: Control, 1.50 ± 0.16 a.u.; RIPC, 0.87 ± 0.14 a.u.; P < 0.01), while activities of the pro-MMPs were not altered (P > 0.05). In cardiac biopsies taken after CPB, activities of pro- and active MMP-2/9 did not differ between the groups (P > 0.05). Spearman's rank tests showed that MMP-2/9 activities in cardiac tissue obtained before CPB were positively correlated with postoperative cTnT serum levels (MMP-2, P = 0.016; MMP-9, P = 0.015).
CONCLUSIONS: Activities of MMP-2/9 in cardiac tissue obtained before CPB are attenuated by RIPC and are positively correlated with serum concentrations of cTnT. MMPs may represent potential targets for RIPC mediated cardioprotection.
TRIAL REGISTRATION: ClinicalTrials.gov identifier NCT00877305.
Introduction: Hip fracture surgery is associated with high in-hospital and 30-day mortality rates and serious adverse patient outcomes. Evidence from randomised controlled trials regarding effectiveness of spinal versus general anaesthesia on patient-centred outcomes after hip fracture surgery is sparse.
Methods and analysis: The iHOPE study is a pragmatic national, multicentre, randomised controlled, open-label clinical trial with a two-arm parallel group design. In total, 1032 patients with hip fracture (>65 years) will be randomised in an intended 1:1 allocation ratio to receive spinal anaesthesia (n=516) or general anaesthesia (n=516). Outcome assessment will occur in a blinded manner in-hospital and after hospital discharge. The primary endpoint will be assessed by telephone interview and comprises the time to the first occurring event of the binary composite outcome of all-cause mortality or new-onset serious cardiac and pulmonary complications within 30 postoperative days. In-hospital secondary endpoints, assessed via in-person interviews and medical record review, include mortality, perioperative adverse events, delirium, satisfaction, walking independently, length of hospital stay and discharge destination. Telephone interviews will be performed for long-term endpoints (all-cause mortality, independence in walking, chronic pain, ability to return home, cognitive function, and overall health and disability) at postoperative days 30±3, 180±45 and 365±60.
Ethics and dissemination: iHOPE was approved by the leading Ethics Committee of the Medical Faculty of the RWTH Aachen University on 14 March 2018 (EK 022/18). Approval from all other involved local Ethics Committees was subsequently requested and obtained. The study started in April 2018 with a total recruitment period of 24 months. iHOPE will be disseminated via presentations at national and international scientific meetings and conferences and via publication in peer-reviewed international scientific journals.
Trial registration number: DRKS00013644; Pre-results
Background: Macrophage Migration Inhibitory Factor (MIF) is highly elevated after cardiac surgery and impacts postoperative inflammation. The aim of this study was to analyze whether the polymorphisms CATT5–7 (rs5844572/rs3063368, “-794”) and the G>C single-nucleotide polymorphism (rs755622, -173) in the MIF gene promoter are related to postoperative outcome. Methods: In 1116 patients undergoing cardiac surgery, the MIF gene polymorphisms were analyzed; in 100 patients, serum MIF was measured by ELISA. Results: Patients with at least one extended repeat allele (CATT7) had a significantly higher risk of acute kidney injury (AKI) compared to others (23% vs. 13%; OR 2.01 (1.40–2.88), p = 0.0001). Carriers of CATT7 were also at higher risk of death (1.8% vs. 0.4%; OR 5.12 (0.99–33.14), p = 0.026). The GC genotype was associated with AKI (20% vs. GG/CC: 13%, OR 1.71 (1.20–2.43), p = 0.003). Multivariate analyses identified CATT7 as predictive for AKI (OR 2.13 (1.46–3.09), p < 0.001) and death (OR 5.58 (1.29–24.04), p = 0.021). CATT7 was associated with higher serum MIF before surgery (79.2 vs. 50.4 ng/mL, p = 0.008). Conclusion: The CATT7 allele is associated with a higher risk of AKI and death after cardiac surgery, which might be related to chronically elevated serum MIF. Polymorphisms in the MIF gene may constitute a predisposition for postoperative complications, and their assessment may improve risk stratification and therapeutic guidance.
Background: To compare the effect of aprotinin with that of the lysine analogues (tranexamic acid and ε-aminocaproic acid) on early mortality in three subgroups of cardiac surgery patients: low, intermediate and high risk.
Methods and Findings: We performed a meta-analysis of randomised controlled trials and observational studies with the following data sources: Medline, Cochrane Library, and reference lists of identified articles. The primary outcome measure was early (in-hospital/30-day) mortality. The secondary outcome measures were any transfusion of packed red blood cells within 24 hours after surgery, any re-operation for bleeding or massive bleeding, and acute renal dysfunction or failure, as defined in the included publications.
Out of 328 search results, 31 studies (15 trials and 16 observational studies) comprising 33,501 patients were included. Early mortality was significantly increased with aprotinin vs. lysine analogues, with pooled risk ratios (95% CI) of 1.58 (1.13–2.21), p<0.001, in the low-risk subgroup (n = 14,297) and 1.42 (1.09–1.84), p<0.001, in the intermediate-risk subgroup (n = 14,427). In contrast, in the subgroup of high-risk patients (n = 4,777), the risk of mortality did not differ significantly between aprotinin and lysine analogues (1.03 (0.67–1.58), p = 0.90).
Conclusion: Compared with lysine analogues, aprotinin may be associated with an increased risk of mortality in low- and intermediate-risk cardiac surgery, but presumably has no effect on early mortality in high-risk cardiac surgery. Thus, decisions to re-license aprotinin in lower-risk patients should be debated critically. In contrast, aprotinin might be beneficial in high-risk cardiac surgery as it reduces the risk of transfusion and bleeding complications.
Background: Cell salvage is commonly used as part of a blood conservation strategy. However, concerns exist among clinicians about the efficacy of transfusing washed salvaged blood.
Methods: We performed a meta-analysis of randomized controlled trials in which patients scheduled for any type of surgery were randomized to washed cell salvage or to a control group without cell salvage. Data were independently extracted, and risk ratios (RR) and weighted mean differences (WMD) with 95% confidence intervals (CIs) were calculated. Data were pooled using a random effects model. The primary endpoint was the number of patients exposed to allogeneic red blood cell (RBC) transfusion.
Results: Out of 1140 search results, a total of 47 trials were included. Overall, the use of washed cell salvage reduced the rate of exposure to allogeneic RBC transfusion by a relative 39% (RR = 0.61; 95% CI 0.57 to 0.65; P < 0.001), resulting in an average saving of 0.20 units of allogeneic RBC per patient (WMD = -0.20; 95% CI -0.22 to -0.18; P < 0.001), reduced the risk of infection by 28% (RR = 0.72; 95% CI 0.54 to 0.97; P = 0.03) and reduced the length of hospital stay by 2.31 days (WMD = -2.31; 95% CI -2.50 to -2.11; P < 0.001), but did not significantly affect the risk of mortality (RR = 0.92; 95% CI 0.63 to 1.34; P = 0.66). No statistical difference could be observed in the number of patients exposed to re-operation, plasma or platelet transfusion, or in the rates of myocardial infarction and stroke.
Conclusions: Washed cell salvage is efficacious in reducing the need for allogeneic RBC transfusion and risk of infection in surgery.
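The random-effects pooling used in meta-analyses like this one can be sketched with a minimal DerSimonian-Laird model on the risk-ratio scale. The 2x2 trial counts and the helper name `pooled_rr` below are invented for illustration, not data from the review.

```python
# Minimal DerSimonian-Laird random-effects pooling of risk ratios.
# Trial counts are invented for illustration, not data from the review.
import math

def pooled_rr(trials):
    """trials: list of (events_tx, n_tx, events_ctl, n_ctl) tuples.
    Returns the pooled risk ratio and its 95% confidence interval."""
    logs, variances = [], []
    for a, n1, c, n2 in trials:
        logs.append(math.log((a / n1) / (c / n2)))
        variances.append(1/a - 1/n1 + 1/c - 1/n2)  # variance of the log RR
    w = [1/v for v in variances]                   # fixed-effect weights
    mu_fe = sum(wi*li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi*(li - mu_fe)**2 for wi, li in zip(w, logs))  # heterogeneity Q
    tau2 = 0.0                                     # between-study variance
    if len(trials) > 1:
        tau2 = max(0.0, (q - (len(trials) - 1)) /
                        (sum(w) - sum(wi*wi for wi in w) / sum(w)))
    wr = [1/(v + tau2) for v in variances]         # random-effects weights
    mu = sum(wi*li for wi, li in zip(wr, logs)) / sum(wr)
    se = math.sqrt(1 / sum(wr))
    return math.exp(mu), (math.exp(mu - 1.96*se), math.exp(mu + 1.96*se))

rr, (lo, hi) = pooled_rr([(30, 100, 45, 100), (12, 80, 20, 80), (50, 200, 70, 200)])
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → RR 0.68 (95% CI 0.55-0.85)
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau² is zero and the random-effects estimate collapses to the fixed-effect one, as in this toy example.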
Background: The most common technique used worldwide to quantify intraoperative blood loss is visual assessment by the attending surgical team. Scaled suction canisters that collect fluids from the surgical field are present in every operating room, and their scaling is commonly used by clinicians to visually assess intraoperative blood loss. While many studies have quantified and attempted to improve the inaccuracy of the visual estimation method, research has focused on estimating blood volume in surgical drapes. Whether and how canister scaling correlates with actual blood loss, and how accurately clinicians estimate blood loss in scaled canisters, has not yet been the focus of research.
Methods: A simulation study with four “bleeding” scenarios was conducted using expired whole blood donations. After dilution of the blood donations with full electrolyte solution, the sample blood loss volume (SBL) was transferred into suction canisters. The study participants then had to estimate the blood loss in all four scenarios. The deviation from the reference blood loss (RBL) was analyzed per scenario.
Results: Fifty-three anesthetists participated in the study. The median visually estimated blood loss (VEBL) was 500 ml (IQR 300/1150) compared to an RBL median of 281.5 ml (IQR 210.0/1022.0). Overestimations of up to 1233 ml and underestimations of up to 138 ml were detected. The visual estimate for canisters correlated strongly with RBL (Spearman’s rho: 0.818; p < 0.001). Univariate nonparametric confirmatory statistics show that the deviation of the visual estimate from the actual blood loss is significant (z = −10.95, p < 0.001, n = 220). Participants’ experience level had no significant influence on VEBL (p = 0.402).
Conclusion: The discrepancies between the visual estimate of canisters and the actual blood loss are enormous despite the given scales. Therefore, we do not recommend estimating the blood loss visually in scaled suction canisters. Colorimetric blood loss estimation could be a more accurate option.
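The Spearman coefficient reported above is a correlation of ranks rather than raw volumes, which makes it robust to the large over- and underestimations seen here. A self-contained sketch, using invented volumes rather than study data:

```python
# Spearman rank correlation between visually estimated blood loss and the
# reference blood loss (RBL). All volumes below are invented examples.

def ranks(xs):
    """Ranks starting at 1, with ties receiving their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend the tie group
        avg = (i + j) / 2 + 1           # average rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx)**2 for a in rx) * sum((b - my)**2 for b in ry))**0.5
    return num / den

reference = [210, 280, 500, 1020, 150]  # ml, per scenario (invented)
estimated = [300, 500, 450, 1500, 200]  # clinicians' visual estimates (invented)

print(round(spearman(reference, estimated), 2))  # → 0.9
```

A high rho therefore only says that clinicians order the scenarios correctly; it says nothing about the size of the absolute estimation error, which is why the abstract reports the deviation separately.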
Background: Nicolaides-Baraitser syndrome (NCBRS) is a rare disease caused by mutations in the SMARCA2 gene, which affects chromatin remodelling and leads to a wide range of symptoms including microcephaly, distinct facial features, recurrent seizures, and severe mental retardation. Fewer than 100 cases have been reported to date. Case presentation: A 22-month-old male infant with NCBRS underwent elective cleft palate surgery. The anaesthetists were challenged by the physiological condition of the patient: a narrow face, very small mouth, mild tachypnea, slight sternal retractions, physical signs of partial monosomy 9p, plagiocephalus, midface hypoplasia, a V-shaped cleft palate, pronounced muscular hypotonia, dysplastic kidneys (bilateral, estimated GFR: approx. 40 ml/m2), nocturnal oxygen demand, and combined apnea. In addition, little information was available on interactions between the NCBRS displayed by the patient and anaesthetic medications. Conclusions: The cleft palate was successfully closed using the bridge flap technique. Overall, we recommend performing a trial of video-assisted laryngoscopy under deep inhalational anaesthesia with spontaneous breathing, before administering muscle relaxants, to detect any airway difficulties while spontaneous breathing and protective reflexes are preserved.
Purpose: Trauma is the leading cause of death in children. In adults, blood transfusion and fluid resuscitation protocols have changed over the past two decades, resulting in decreased morbidity and mortality. Here, transfusion and fluid resuscitation practices in severely injured children in Germany were analysed.
Methods: Severely injured children (maximum Abbreviated Injury Scale (AIS) ≥ 3) admitted to a certified trauma-centre (TraumaZentrum DGU®) between 2002 and 2017 and registered at the TraumaRegister DGU® were included and assessed regarding blood transfusion rates and fluid therapy.
Results: 5,118 children (aged 1–15 years) with a mean ISS of 22 were analysed. The rate of blood transfusion administered until ICU admission decreased from 18% (2002–2005) to 7% (2014–2017). Children who were transfused were increasingly severely injured: the mean ISS of transfused children aged 1–15 years increased from 27.7 (2002–2005) to 34.4 (2014–2017), whereas the mean ISS of non-transfused children decreased from 19.6 (2002–2005) to 17.6 (2014–2017). Mean prehospital fluid administration decreased from 980 to 549 ml without affecting hemodynamic stability.
Conclusion: Blood transfusion rates and the amount of fluid resuscitation in severely injured children decreased over a 16-year period in Germany. Restrictive blood transfusion and fluid management has become common practice in severely injured children. A prehospital restrictive fluid management strategy in severely injured children is not associated with a worsened hemodynamic state, abnormal coagulation or base excess, but leads to higher hemoglobin levels.
Transfusion of red blood cells (RBC) in patients undergoing major elective cranial surgery is associated with increased morbidity, mortality and prolonged hospital length of stay (LOS). This retrospective single-center study aimed to identify the clinical outcome of RBC transfusions in skull base and non-skull base meningioma patients, including the identification of risk factors for RBC transfusion. Between October 2009 and October 2016, 423 patients underwent primary meningioma resection. Of these, 68 (16.1%) received RBC transfusion and 355 (83.9%) did not receive RBC units. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) compared to patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, the postoperative complication rate as well as hospital LOS were significantly higher (p < 0.0001) compared to non-transfused patients. In multivariate analyses, risk factors for RBC transfusion were preoperative American Society of Anesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.0001). Kaplan-Meier curves revealed a significant influence on overall survival of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative KPS ≤ 60% and age (≥ 75 years). We conclude that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in multivariate analysis. Further studies evaluating the impact of preoperative anaemia management on the reduction of RBC transfusion are needed to improve the clinical outcome of meningioma patients.
Background. Tracheal intubation still represents the "gold standard" for securing the airway of unconscious patients in the prehospital setting. Video laryngoscopy has become increasingly relevant, especially in cases of restricted access to the patient.
Objectives. The aim of the study was to evaluate the performance and intubation success of four different video laryngoscopes, one optical laryngoscope, and a Macintosh blade while intubating from two different positions in a mannequin trial with difficult access to the patient.
Methods. A mannequin with a cervical collar was placed on the driver’s seat. Intubation was performed with six different laryngoscopes either through the driver’s window or from the backseat. Success, C/L score, time to best view (TTBV), time to intubation (TTI), and number of attempts were measured. All participants were asked to rate their favored device.
Results. Forty-two physicians participated. All intubations performed from the backseat were successful, whereas intubation through the driver’s window was less successful; only with the Airtraq® optical laryngoscope was a 100% success rate achieved. The best visualization (window C/L 2a; backseat C/L 2a) and shortest TTBV (window 4.7 s; backseat 4.1 s) were obtained with the D-Blade video laryngoscope, but this was not associated with higher success through the driver’s window. The fastest TTI was achieved through the window (14.2 s) with the C-MAC video laryngoscope and from the backseat (7.3 s) with a Macintosh blade.
Conclusions. Video laryngoscopy yielded better visualization but was not associated with higher success. Success depended on the approach and on familiarity with the device. We believe that video laryngoscopy is suitable for securing the airway of trapped accident victims. The choice of an optimal device is complicated and should be based upon experience and regular training with the device.
Characterization of neonates born to mothers with SARS-CoV-2 infection: review and meta-analysis
(2020)
Characterization of neonates born to mothers with SARS-CoV-2 infection has so far been carried out only partially. There has been no systematic review providing a holistic neonatal presentation including possible vertical transmission. A systematic literature search was performed using PubMed, Google Scholar and Web of Science up to June 6, 2020. Studies on neonates born to mothers with SARS-CoV-2 infection were included. A binary random effect model was used for prevalence and 95% confidence intervals. 32 studies involving 261 neonates were included in the meta-analysis. Most neonates born to infected mothers did not show any clinical abnormalities (80.4%). The most common clinical features were dyspnea in 11 newborns (42.3%) and fever in 9 (19.1%). Of 261 neonates, 120 were tested for infection, of whom 12 (10.0%) tested positive. Swabs from placenta, cord blood and vaginal secretions were negative. Neonates are mostly unaffected by the mother's SARS-CoV-2 infection, and the risk of vertical transmission is low.
Background: Point-of-care devices for targeted coagulation substitution in bleeding patients have become increasingly important in recent years. A new device on the market, the Quantra, uses sonorheometry (sonic estimation of elasticity via resonance), a novel ultrasound-based technology that measures the viscoelastic properties of whole blood. Several studies have already shown the comparability of the Quantra with devices established on the market, such as rotational thromboelastometry (ROTEM).
Objective: In contrast to existing studies, this is the first prospective interventional study using this new system in a cardiac surgical patient cohort. We will investigate the noninferiority of a new algorithm based on the Quantra system compared with an existing coagulation algorithm based on the ROTEM/Multiplate system for the treatment of coagulopathic cardiac surgical patients.
Methods: The study is divided into two phases. In an initial observation phase, whole blood samples of 20 patients obtained at three defined time points (prior to surgery, after completion of cardiopulmonary bypass, and on arrival in the intensive care unit) will be analyzed using both the ROTEM/Multiplate and Quantra systems. The obtained threshold values will be used to develop a novel algorithm for hemotherapy. In a second intervention phase, the new algorithm will be tested for noninferiority against an algorithm used routinely for years in our department.
Results: The primary outcome is cumulative blood loss within 24 hours after surgery. Statistical calculations based on the literature and in-house data suggest that the new algorithm is noninferior if the difference in cumulative blood loss is <150 mL/24 hours.
Conclusions: Given the comparability of the Quantra sonorheometry system with the ROTEM measurement methods, the existing hemotherapy treatment algorithm can be adapted to the Quantra device once noninferiority is demonstrated.
Trial Registration: ClinicalTrials.gov NCT03902275; https://clinicaltrials.gov/ct2/show/NCT03902275
International Registered Report Identifier (IRRID): DERR1-10.2196/17206
Health economics of Patient Blood Management: a cost‐benefit analysis based on a meta‐analysis
(2019)
Background and Objectives: Patient Blood Management (PBM) is the timely application of evidence‐based medical and surgical concepts designed to improve haemoglobin concentration, optimize haemostasis and minimize blood loss in an effort to improve patient outcomes. The focus of this cost‐benefit analysis is to analyse the economic benefit of widespread implementation of a multimodal PBM programme.
Materials and Methods: Based on a recent meta‐analysis including 17 studies (>235 000 patients) comparing PBM with control care and data from the University Hospital Frankfurt, a cost‐benefit analysis was performed. Outcome data were red blood cell (RBC) transfusion rate, number of transfused RBC units, and length of hospital stay (LOS). Costs were considered for the following three PBM interventions as examples: anaemia management including therapy of iron deficiency, use of cell salvage and tranexamic acid. For sensitivity analysis, a Monte Carlo simulation was performed.
Results: Iron supplementation was applied in 3.1%, cell salvage in 65% and tranexamic acid in 89% of the PBM patients. In total, applying these three PBM interventions cost €129.04 per patient. However, PBM was associated with a reduction in transfusion rate, transfused RBC units per patient and LOS, which yielded mean savings of €150.64 per patient. Thus, the overall benefit of PBM implementation was €21.60 per patient. In the Monte Carlo simulation, the cost savings on the outcome side exceeded the PBM costs in approximately two-thirds of all repetitions, and the total benefit was €1,878,000 in 100,000 simulated patients.
Conclusion: A multimodal PBM concept optimizing patient care and safety can be implemented cost-effectively.
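The Monte Carlo sensitivity analysis described above can be illustrated with a toy simulation. Only the per-patient point values (cost €129.04, savings €150.64) echo the text; the normal distributions, spreads and the helper name `simulate` are assumptions made purely for illustration.

```python
# Toy Monte Carlo sketch of a PBM cost-benefit analysis: draw per-patient
# intervention costs and savings from assumed distributions and report the
# mean net benefit. Distributions and spreads are illustrative assumptions;
# only the point values echo the study's figures.
import random

random.seed(1)  # reproducible draws

def simulate(n_patients=100_000):
    """Mean net benefit (savings minus PBM cost) per simulated patient, in EUR."""
    net = 0.0
    for _ in range(n_patients):
        cost = random.gauss(129.04, 20.0)     # assumed spread around PBM cost
        savings = random.gauss(150.64, 60.0)  # assumed spread around savings
        net += savings - cost
    return net / n_patients

print(f"mean net benefit: EUR {simulate():.2f} per patient")
```

With these assumed spreads, the per-repetition net benefit is frequently negative even though its mean is positive, which is exactly the behaviour the study's simulation probes (savings exceeded costs in roughly two-thirds of repetitions).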
Introduction: Cell salvage (CS) is an integral part of patient blood management (PBM) and aims to reduce allogeneic red blood cell (RBC) transfusion.
Material and methods: This observational study analysed patients scheduled for elective cardiac surgery requiring cardiopulmonary bypass (CPB) between November 2015 and October 2018. Patients were divided into a CS group (patients receiving CS) and a control group (no CS). Primary endpoints were the number of patients exposed to allogeneic RBC transfusions and the number of RBC units transfused per patient.
Results: A total of 704 patients undergoing cardiac surgery were analysed, of whom 338 underwent surgery with CS (CS group) and 366 without CS (control group). Intraoperatively, 152 patients (45%) in the CS group and 93 patients (25%) in the control group were exposed to allogeneic RBC transfusions (P < 0.001). Taking the amount of intraoperative blood loss into account, regression analysis revealed a significant association between blood loss and increased use of RBC units in control patients compared to the CS group (1000 mL: 1.0 vs. 0.6 RBC units; 2000 mL: 2.2 vs. 1.1 RBC units; 3000 mL: 3.4 vs. 1.6 RBC units). Thus, CS was significantly associated with a reduction in allogeneic RBC use of 40% at 1000 mL, 49% at 2000 mL, and 52% at 3000 mL of blood loss compared to patients without CS.
Conclusions: Cell salvage was significantly associated with a reduced number of allogeneic RBC transfusions. It supports the beneficial effect of CS in cardiac surgical patients as an individual measure in a comprehensive PBM program.
Introduction: Systemic inflammation (e.g. following surgery) involves Toll-like receptor (TLR) signaling and leads to an endocrine stress response. This study aims to investigate a possible influence of TLR2 and TLR4 single nucleotide polymorphisms (SNPs) on perioperative adrenocorticotropic hormone (ACTH) and cortisol regulation in the serum of cardiac surgical patients. To investigate the link to systemic inflammation in this context, we additionally measured 10 different cytokines in the serum. Methods: 338 patients admitted for elective cardiac surgery were included in this prospective observational clinical cohort study. Genomic DNA of patients was screened for TLR2 and TLR4 SNPs. Serum concentrations of ACTH, cortisol, interferon (IFN)-γ, interleukin (IL)-1β, IL-2, IL-4, IL-5, IL-6, IL-8, IL-10, tumor necrosis factor (TNF)-α and granulocyte macrophage-colony stimulating factor (GM-CSF) were determined before surgery, immediately after surgery and on the first postoperative day. Results: 13 patients were identified as TLR2 SNP carriers, 51 as TLR4 SNP carriers and 274 patients as non-carriers. Basal levels of ACTH, cortisol and cytokines did not differ between groups. In all three groups a significant, transient perioperative rise in cortisol could be observed. However, only in the non-carrier group was this accompanied by a significant ACTH rise; TLR4 SNP carriers had significantly lower ACTH levels compared to non-carriers (mean [95% confidence interval]: non-carriers 201.9 [187.7 to 216.1] pg/ml; TLR4 SNP carriers 149.9 [118.4 to 181.5] pg/ml; TLR2 SNP carriers 176.4 [110.5 to 242.3] pg/ml).
Compared to non-carriers, TLR4 SNP carriers showed significantly lower serum IL-8, IL-10 and GM-CSF peaks (mean [95% confidence interval]: IL-8: non-carriers 42.6 [36.7 to 48.5] pg/ml, TLR4 SNP carriers 23.7 [10.7 to 36.8] pg/ml; IL-10: non-carriers 83.8 [70.3 to 97.4] pg/ml, TLR4 SNP carriers 54.2 [24.1 to 84.2] pg/ml; GM-CSF: non-carriers 33.0 [27.8 to 38.3] pg/ml, TLR4 SNP carriers 20.2 [8.6 to 31.8] pg/ml). No significant changes over time or differences between the groups were found for the other cytokines. Conclusions: Regulation of the immunoendocrine stress response during systemic inflammation is influenced by the presence of a TLR4 SNP. Cardiac surgical patients carrying this genotype showed decreased serum concentrations of ACTH, IL-8, IL-10 and GM-CSF. This finding might have an impact on interpreting previous and designing future trials on diagnosing and modulating immunoendocrine dysregulation (e.g. adrenal insufficiency) during systemic inflammation and sepsis.
Introduction: It has been proposed that individual genetic variation contributes to the course of severe infections and sepsis. Recent studies of single nucleotide polymorphisms (SNPs) within the endotoxin receptor and its signaling system showed an association with the risk of disease development. This study examined the effects of genetic variations in TLR4, the receptor for bacterial lipopolysaccharide (LPS), and in a central intracellular signal transducer (TIRAP/Mal) on cytokine release, and on the susceptibility to and course of severe hospital-acquired infections in distinct patient populations. Methods: Three intensive care units in tertiary care university hospitals in Greece and Germany participated. In total, 375 general surgical patients, 415 cardiac surgical patients and 159 patients with ventilator-associated pneumonia (VAP) were included. TLR4 and TIRAP/Mal polymorphisms in the 375 general surgical patients were associated with risk of infection, clinical course and outcome. In two prospective studies, the 415 patients following cardiac surgery and the 159 patients with newly diagnosed VAP, predominantly caused by Gram-negative bacteria, were studied for cytokine levels in vivo and after ex vivo monocyte stimulation, and for clinical course. Results: Patients simultaneously carrying polymorphisms in TIRAP/Mal and TLR4, and patients homozygous for the TIRAP/Mal SNP, had a significantly higher risk of severe infections after surgery (odds ratio (OR) 5.5; confidence interval (CI): 1.34 - 22.64; P = 0.02 and OR 7.3; CI: 1.89 - 28.50; P < 0.01, respectively). Additionally, we found significantly lower circulating cytokine levels in double-mutant individuals with ventilator-associated pneumonia, and reduced cytokine production in an ex vivo monocyte stimulation assay, but this difference was not apparent in TIRAP/Mal-homozygous patients. In cardiac surgery patients without infection, the cytokine release profiles did not differ between genotypes. 
Conclusions: Carriers of mutations in sequential components of the TLR signaling system may have an increased risk of severe infections. Patients with this genotype showed a decrease in cytokine release when infected, which was not apparent in patients with sterile inflammation following cardiac surgery.
Background: Paediatric patients are vulnerable to blood loss, and even a small loss of blood can be associated with severe shock. In emergency situations, a red blood cell (RBC) transfusion may become unavoidable, although it is associated with various risks. The aim of this study was to identify independent risk factors for perioperative RBC transfusion in children undergoing surgery. Methods: To identify independent risk factors for perioperative RBC transfusion in children undergoing surgery and to assess RBC transfusion rates and in-hospital outcomes (e.g., length of stay, mortality, and typical postoperative complication rates), a monocentric, retrospective, observational study was conducted. Descriptive, univariate, and multivariate analyses were performed. Results: Between 1 January 2010 and 31 December 2019, data from n = 14,248 cases were identified at the centre. Analysis revealed an RBC transfusion rate of 10.1% (n = 1439) in the entire cohort. The independent predictors of RBC transfusion were the presence of preoperative anaemia (p < 0.001; OR = 15.10 with preoperative anaemia and OR = 2.40 without preoperative anaemia), younger age (p < 0.001; ORs between 0.14 and 0.28 for children older than 0 years), female gender (p = 0.036; OR = 1.19 compared to male gender), certain types of surgery (e.g., neurosurgery (p < 0.001; OR = 10.14), vascular surgery (p < 0.001; OR = 9.93), cardiac surgery (p < 0.001; OR = 4.79), gynaecology (p = 0.014; OR = 3.64), and visceral surgery (p < 0.001; OR = 2.48)), and the presence of postoperative complications (e.g., sepsis (p < 0.001; OR = 10.16), respiratory dysfunction (p < 0.001; OR = 7.56), cardiovascular dysfunction (p < 0.001; OR = 4.68), neurological dysfunction (p = 0.029; OR = 1.77), and renal dysfunction (p < 0.001; OR = 16.17)). 
Conclusion: Preoperative anaemia, younger age, female gender, certain types of surgery, and postoperative complications are independent predictors of RBC transfusion in children undergoing surgery. Future prospective studies are urgently required to identify, in detail, the potential risk factors and the impact of RBC transfusion in children.
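Odds ratios from a multivariate logistic regression, as reported above, are easily misread as relative risks. As a purely hypothetical illustration (using the cohort-wide transfusion rate of 10.1% as a stand-in baseline, not a covariate-adjusted figure from the study), an odds ratio can be converted to an implied risk like this:

```python
def apply_odds_ratio(baseline_risk: float, odds_ratio: float) -> float:
    """Convert a baseline risk to the risk implied by an odds ratio:
    form the odds p / (1 - p), scale them, convert back to a probability."""
    odds = baseline_risk / (1.0 - baseline_risk)
    adjusted = odds * odds_ratio
    return adjusted / (1.0 + adjusted)

# Hypothetical: cohort-wide transfusion rate (10.1%) combined with the
# reported OR of 15.10 for preoperative anaemia.
risk = apply_odds_ratio(0.101, 15.10)
```

Because the baseline risk here is well below 50%, the implied risk (about 63%) is far smaller than a naive "15-fold risk" reading would suggest, which is why odds ratios from logistic models should not be quoted as risk ratios.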
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during transcatheter aortic valve implantation (TAVI) in nonventilated patients and its impact on procedural outcome. Methods and Results: We measured ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (female 58%, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post RVP 62.6% vs. 53.9% during RVP, p < .01; pre RVP 61.6% vs. post RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and more often experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: During RVP, ScO2 declined significantly compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and appears to be a valid cut-off value for screening for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk of cerebrovascular complications and delirium.
Background: Intensive care resources have been heavily utilized during the COVID-19 pandemic. However, risk stratification and prediction of SARS-CoV-2 patient clinical outcomes upon ICU admission remain inadequate. This study aimed to develop a machine learning model, based on retrospective and prospective clinical data, to stratify patient risk and predict ICU survival and outcomes. Methods: A Germany-wide electronic registry was established to pseudonymously collect admission, therapeutic and discharge information of SARS-CoV-2 ICU patients retrospectively and prospectively. Machine learning approaches were evaluated for the accuracy and interpretability of predictions. The Explainable Boosting Machine approach was selected as the most suitable method. Individual, non-linear shape functions for predictive parameters and parameter interactions are reported. Results: A total of 1039 patients were included in the Explainable Boosting Machine model: 596 patients collected retrospectively and 443 patients collected prospectively. The model for prediction of general ICU outcome was shown to be more reliable in predicting survival. Age, inflammatory and thrombotic activity, and severity of ARDS at ICU admission were shown to be predictive of ICU survival. Patients' age, pulmonary dysfunction and transfer from an external institution were predictors of ECMO therapy. The interaction of patient age with D-dimer levels on admission, and of creatinine levels with SOFA score without GCS, were predictors of renal replacement therapy. Conclusions: Using Explainable Boosting Machine analysis, we confirmed and weighed previously reported predictors and identified novel predictors of outcome in critically ill COVID-19 patients. Using this strategy, predictive modeling of COVID-19 ICU patient outcomes can be performed while overcoming the limitations of linear regression models. Trial registration: ClinicalTrials.gov, NCT04455451.
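An Explainable Boosting Machine is a generalized additive model fitted by cyclic gradient boosting, so each feature accumulates its own inspectable shape function. The sketch below illustrates only that core idea in pure Python, with invented toy data and a crude median-split stump as the per-round update; the production implementation (e.g. the InterpretML library used in practice) adds binning, bagging and pairwise interaction terms, and none of the registry's real features or fitting details are reproduced here.

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def fit_additive_model(X, y, rounds=150, lr=0.5):
    """EBM-like GAM: cycle over features, adding one small shape-function
    update (a median-split stump fitted to the residuals) per feature per
    boosting round. Each feature's effect stays separable and plottable."""
    n, d = len(X), len(X[0])
    shapes = [[] for _ in range(d)]          # per-feature stump lists

    def raw(row):
        return sum(lv if row[j] <= t else rv
                   for j, stumps in enumerate(shapes)
                   for t, lv, rv in stumps)

    for _ in range(rounds):
        for j in range(d):                   # cyclic: one feature at a time
            resid = [yi - sigmoid(raw(xi)) for xi, yi in zip(X, y)]
            thr = sorted(row[j] for row in X)[n // 2]
            left = [resid[i] for i in range(n) if X[i][j] <= thr]
            right = [resid[i] for i in range(n) if X[i][j] > thr]
            lval = lr * sum(left) / len(left) if left else 0.0
            rval = lr * sum(right) / len(right) if right else 0.0
            shapes[j].append((thr, lval, rval))
    return lambda row: sigmoid(raw(row))

# Invented toy data: feature 0 plays the role of "age", feature 1 is noise.
X = [[40, 2], [45, 1], [50, 3], [55, 2], [70, 1], [75, 3], [80, 2], [85, 1]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
model = fit_additive_model(X, y)
```

Because the prediction is a sum of per-feature stump values passed through a sigmoid, the learned contribution of each feature can be read off directly, which is the interpretability property the abstract's "individual, non-linear shape functions" refers to.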
Introduction: In recent years, resource-saving handling of allogeneic blood products and a reduction of transfusion rates in adults have been observed. However, comparable published national data on transfusion practices in pediatric patients are currently not available. In this study, the transfusion rates for children and adolescents were analyzed based on data from the Federal Statistical Office of Germany collected during the past 2 decades. Methods: Data were queried via the database of the Federal Statistical Office (Destatis). The period covered was from 2005 to 2018, and the sample group comprised children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions and for procedures or interventions with increased transfusion risk were queried and evaluated in detail. Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total collective of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4-year-olds (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were most frequently transfused (in 2018, 40.2% of the children were cared for in hospital). Transfusion-associated procedures such as chemotherapy or machine ventilation and respiratory support for newborns and infants are on the rise. Conclusion: Transfusion rates are declining in children and adolescents overall, but the reasons for the increases in transfusion rates in some age groups are unclear. Prospective studies to evaluate transfusion rates and triggers in children are urgently needed.
Background: Trauma may be associated with significant, even life-threatening blood loss, which in turn may increase the risk of complications and death, particularly in the absence of adequate treatment. Hydroxyethyl starch (HES) solutions are used for volume therapy to treat hypovolemia due to acute blood loss in order to maintain or re-establish hemodynamic stability, with the ultimate goal of avoiding organ hypoperfusion and cardiovascular collapse. The current study compares a 6% HES 130 solution (Volulyte 6%) versus an electrolyte solution (Ionolyte) for volume replacement therapy in adult patients with traumatic injuries, as requested by the European Medicines Agency, to gain more insights into the safety and efficacy of HES in the setting of trauma care.
Methods: TETHYS is a pragmatic, prospective, randomized, controlled, double-blind, multicenter, multinational trial performed in two parallel groups. Eligible consenting adults ≥ 18 years, with an estimated blood loss of ≥ 500 ml, and in whom initial surgery is deemed necessary within 24 h after blunt or penetrating trauma, will be randomized to receive intravenous treatment at an individualized dose with either a 6% HES 130 or an electrolyte solution, for a maximum of 24 h or until reaching the maximum daily dose of 30 ml/kg body weight, whichever occurs first. The sample size is estimated as 175 patients per group, 350 patients in total (α = 0.025 one-tailed, power 1–β = 0.8). The composite primary endpoint, evaluated in an exploratory manner, will be 90-day mortality and 90-day renal failure, defined as AKIN stage ≥ 2, RIFLE injury/failure stage, or use of renal replacement therapy (RRT) during the first 3 months. Secondary efficacy and safety endpoints are fluid administration and balance, changes in vital signs and hemodynamic status, changes in laboratory parameters including renal function, coagulation, and inflammation biomarkers, incidence of adverse events during the treatment period, hospital and intensive care unit (ICU) length of stay, fitness for ICU or hospital discharge, and duration of mechanical ventilation and/or RRT.
Discussion: This pragmatic study will increase the evidence on the safety and efficacy of 6% HES 130 for the treatment of hypovolemia secondary to acute blood loss in trauma patients.
Trial registration: EudraCT No. 2016-002176-27 (21 April 2017); ClinicalTrials.gov ID NCT03338218 (9 November 2017).
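The stated sample size (175 per group at α = 0.025 one-tailed and power 1–β = 0.8) matches the usual normal-approximation calculation for comparing two proportions. A sketch using only Python's standard library; the event proportions below are illustrative placeholders, since the protocol's assumed effect size is not reported in this abstract:

```python
import math
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.025,
                power: float = 0.80) -> int:
    """Per-group sample size for a one-tailed two-proportion comparison,
    normal approximation: n = (z_alpha + z_beta)^2 * var / delta^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)   # one-tailed critical value
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Illustrative proportions only (not taken from the TETHYS protocol):
n = n_per_group(0.10, 0.20)
```

The formula shows the familiar trade-off: halving the detectable difference roughly quadruples the required sample, which is why composite endpoints are often used to keep event rates, and thus power, workable.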
Background: Acute bleeding requires fast and targeted therapy; therefore, knowledge of the patient's potential to form a clot is crucial. Point-of-care testing (POCT) provides fast and reliable information on coagulation. Structural circumstances, such as person-bound sample transport, can delay the reporting of results. The aim of the present study was to investigate the diagnostic quality and accuracy of point-of-care international normalized ratio (INR) diagnostics compared with standard laboratory analysis (SLA), as well as the time advantage of a pneumatic tube over a person-based transport system. Methods: Two groups of haemorrhagic patients (EG: emergency department; OG: delivery room; each n = 12) were examined in the context of bleeding emergencies using POCT and SLA. Samples were transported via a pneumatic tube system or by a personal transport service. Results: INR results from POCT and SLA showed a high and significant correlation (EG: p < 0.001; OG: p < 0.001). POCT results were reported significantly more quickly (EG: 1.1 vs. 39.6 min; OG: 2.0 vs. 75.0 min; p < 0.001) and required less time for analysis (EG: 0.3 vs. 24.0 min; OG: 0.5 vs. 45.0 min; p < 0.001) compared to SLA. The transport time with the pneumatic tube was significantly shorter (8.0 vs. 18.5 min; p < 0.001) than with the person-based transport system. Conclusion: The results of the present study suggest that POCT may be a suitable method for emergency diagnostics and may serve as a diagnostic element in haemotherapy algorithms to initiate targeted haemotherapy at an early point in time.
Objectives: The ongoing coronavirus pandemic is challenging, especially in severely affected patients who require intubation and sedation. Although the potential benefits of sedation with volatile anesthetics in coronavirus disease 2019 patients are currently being discussed, the use of isoflurane in patients with coronavirus disease 2019–induced acute respiratory distress syndrome has not yet been reported. Design: We performed a retrospective analysis of critically ill patients with hypoxemic respiratory failure requiring mechanical ventilation. Setting: The study was conducted with patients admitted to our ICU between April 4 and May 15, 2020. Patients: We included five patients who had previously been diagnosed with severe acute respiratory syndrome coronavirus 2 infection. Intervention: Even with high doses of several IV sedatives, the targeted level of sedation could not be achieved; therefore, the sedation regimen was switched to inhalational isoflurane. Clinical data were recorded using a patient data management system. We recorded demographic data, laboratory results, ventilation variables, sedative dosages, sedation level, prone positioning, duration of volatile sedation and outcomes. Measurements and Main Results: The mean age (four men, one woman) was 53.0 (± 12.7) years. The mean duration of isoflurane sedation was 103.2 (± 66.2) hours. Our data demonstrate a substantial improvement in the oxygenation ratio under isoflurane sedation. Deep sedation, as assessed by the Richmond Agitation-Sedation Scale, was rapidly and closely controlled in all patients, and the subsequent discontinuation of IV sedation was possible within the first 30 minutes. No adverse events were detected. Conclusions: Our findings demonstrate the feasibility of isoflurane sedation in five patients suffering from severe coronavirus disease 2019. Volatile isoflurane achieved the required deep sedation and reduced the need for IV sedation.
Background: Nerve injury-induced protein 1 (Ninjurin-1, Ninj1) was first identified in Schwann cells and neurons, where it contributes to cell adhesion and nerve regeneration. Recently, the role of Ninj1 has been linked to inflammatory processes in the central nervous system, where its functional repression reduced leukocyte infiltration and clinical disease activity during experimental autoimmune encephalomyelitis in mice [1]. However, Ninj1 is also expressed outside the nervous system, in various organs such as the liver and kidney as well as on leukocytes [2,3]. We therefore hypothesized that Ninj1 contributes to inflammation in general, that is, also outside the nervous system, with special interest in the pathogenesis of sepsis.
Methods: Ninj1 was repressed by transfecting HMEC-1 cells, a human dermal microvascular endothelial cell line, with siRNA targeting Ninj1 (siNinj1) or a negative control (siC). Subsequently, cells were stimulated with 100 ng/ml LPS (TLR4 agonist), 3 μg/ml LTA (TLR2 agonist) or 100 ng/ml poly(I:C) (TLR3 agonist) for 3 hours. The inflammatory response was analyzed by real-time PCR. In addition, transmigration of neutrophils across a HMEC-1 monolayer was measured using transwell plates (pore size 3 μm).
Results: Repression of Ninj1 by siRNA reduced Ninj1 mRNA expression in HMEC-1 cells by about 90% (Figure 1A). Reduced Ninj1 expression decreased neutrophil migration to 62.5% (Figure 1B) and attenuated TLR signaling. In detail, knockdown of Ninj1 significantly reduced TLR2- and TLR4-triggered expression of ICAM-1 and IL-6 (Figure 1C,D), while poly(I:C)-induced expression was only slightly reduced. To analyze a more specific TLR3 target, we measured IP-10 mRNA expression, which was also significantly reduced in siNinj1-transfected cells (Figure 1E).
Conclusion: Our in vitro data strongly indicate that Ninj1 is involved in the regulation of TLR signaling and thereby contributes to inflammation. In vivo experiments will clarify its impact on systemic inflammation.
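Relative mRNA expression from real-time PCR experiments like the knockdown above is conventionally computed with the 2^−ΔΔCt (Livak) method: the target gene's Ct is normalized to a reference gene in each condition, and the conditions are then compared. A sketch with invented Ct values chosen so the result lands near the ~90% reduction reported; the specific Ct numbers and the reference gene are illustrative assumptions, not data from this study:

```python
def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    """Relative expression via the 2^-ddCt (Livak) method: normalize the
    target Ct to a reference gene per condition, then compare conditions."""
    delta_treated = ct_target_treated - ct_ref_treated
    delta_control = ct_target_control - ct_ref_control
    return 2.0 ** -(delta_treated - delta_control)

# Invented Ct values: siNinj1-transfected vs. control-transfected cells,
# normalized to a hypothetical reference gene measured at Ct 15.0.
remaining = fold_change(28.0, 15.0, 24.68, 15.0)  # fraction of expression left
```

A ΔΔCt of about 3.3 cycles corresponds to roughly a 10-fold (i.e. ~90%) reduction, since each PCR cycle represents a doubling of product.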
BACKGROUND: Recent findings support the idea that interleukin (IL)-22 serum levels are related to disease severity in end-stage liver disease. Existing scoring systems, namely the Model for End-Stage Liver Disease (MELD), Survival Outcomes Following Liver Transplantation (SOFT) and Pre-allocation SOFT (P-SOFT), are well established in appraising survival rates with or without liver transplantation. We tested the hypothesis that IL-22 serum levels at the transplantation date correlate with survival and potentially have value as a predictive factor for survival.
MATERIAL AND METHODS: MELD, SOFT, and P-SOFT scores were calculated to estimate post-transplantation survival. Serum levels of IL-22, IL-6, IL-10, C-reactive protein (CRP), and procalcitonin (PCT) were collected prior to transplantation in 41 patients. Outcomes were assessed at 3 months, 1 year, and 3 years after transplantation.
RESULTS: IL-22 significantly correlated with the MELD, P-SOFT, and SOFT scores (Rs 0.35, 0.63, 0.56, respectively; p<0.05) and discriminated post-transplantation survival. IL-6 showed a heterogeneous pattern (Rs 0.40, 0.63, 0.57, respectively; p<0.05); CRP and PCT did not correlate. We therefore added IL-22 serum values to the existing scoring systems in a generalized linear model (GLM), resulting in a significantly improved outcome prediction in 58% of the cases for both the P-SOFT (p<0.01) and SOFT scores (p<0.001).
CONCLUSIONS: Further studies are needed to address the concept that IL-22 serum values at the time of transplantation provide valuable information about survival rates following orthotopic liver transplantation.
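The generalized linear model used in this kind of analysis, a binary survival outcome modeled from a clinical score plus a serum marker, is a logistic GLM. Below is a minimal single-predictor sketch fitted by gradient ascent on invented data; it is not the authors' model (which added IL-22 alongside the P-SOFT/SOFT scores and was fitted on real patient data), only an illustration of how such a model is estimated:

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit intercept and slope of a logistic GLM by gradient ascent on the
    log-likelihood (toy stand-in for a score-plus-biomarker model)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p            # gradient w.r.t. intercept
            g1 += (y - p) * x      # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Invented predictor/outcome pairs (e.g. a severity score vs. death):
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
```

Adding a second predictor (the biomarker) extends this to one more coefficient, and comparing the fit with and without it is the essence of testing whether the marker improves outcome prediction.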
Background: Approximately one in three patients suffers from preoperative anaemia. Even though haemoglobin is measured before surgery, anaemia management is not implemented in every hospital. Objective: Here, we demonstrate the implementation of an anaemia walk-in clinic at an orthopaedic university hospital. To improve the diagnosis of iron deficiency (ID), we examined whether reticulocyte haemoglobin (Ret-He) could be a useful additional parameter. Material and Methods: In August 2019, an anaemia walk-in clinic was established. Between September and December 2019, major orthopaedic surgical patients were screened for preoperative anaemia. The primary endpoint was the incidence of preoperative anaemia. Secondary endpoints included Ret-He level, red blood cell (RBC) transfusion rate, in-hospital length of stay and anaemia at hospital discharge. Results: A total of 104 patients were screened for anaemia. The preoperative anaemia rate was 20.6%. Intravenous iron was supplemented in 23 patients. Transfusion of RBC units per patient (1.7 ± 1.2 vs. 0.2 ± 0.9; p = 0.004) and hospital length of stay (13.1 ± 4.8 days vs. 10.6 ± 5.1 days; p = 0.068) were increased in anaemic patients compared to non-anaemic patients. Ret-He values were significantly lower in patients with ID anaemia (33.3 pg [28.6–40.2 pg]) compared to patients with ID (35.3 pg [28.9–38.6 pg]; p = 0.015) or patients without anaemia (35.4 pg [30.2–39.4 pg]; p = 0.001). Conclusion: Preoperative anaemia is common in orthopaedic patients. Our results demonstrate the feasibility of an anaemia walk-in clinic for managing preoperative anaemia. Furthermore, our analysis supports the use of Ret-He as an additional parameter for the diagnosis of ID in surgical patients.
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Depending on local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on the accuracy of estimation has not yet been a focus of research. Material and methods: A blood loss simulation study with four "bleeding" scenarios per sponge colour was created using expired whole-blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with full electrolyte solution. Study participants had to estimate the absorbed blood loss in the sponges in all scenarios. The difference from the reference blood loss (RBL) was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour. Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with reference blood loss in white (Spearman's rho: 0.521; p = 3.748 × 10⁻¹⁶) and green sponges (Spearman's rho: 0.452; p = 4.683 × 10⁻¹²). The median visually estimated blood loss was higher in white sponges (250 ml, IQR 150–412.5 ml) than in green sponges (150 ml, IQR 100–300 ml), compared to the reference blood loss (103 ml, IQR 86–162.8 ml). For both colour types of sponges, major under- and overestimation was observed. The multivariate analysis demonstrated that sponge colour has a significant influence on estimation (p = 3.04 × 10⁻¹⁰), as does the clinician's qualification level (p = 2.20 × 10⁻¹⁰, p = 1.54 × 10⁻⁸) and the amount of RBL to be estimated (p < 2 × 10⁻¹⁶). Conclusion: The deviation from the correct blood loss estimate was smaller with white surgical sponges than with green sponges. In general, deviations were so severe for both types of sponges that it appears advisable to refrain from visually estimating blood loss whenever possible and instead to use other techniques, such as colorimetric estimation.
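Spearman's rho, used above for the estimate-versus-reference comparison, is simply the Pearson correlation computed on ranks, with tied values assigned the average of their rank range. A self-contained sketch on invented estimate/reference pairs (the study's raw data are not reproduced here):

```python
import math

def average_ranks(values):
    """1-based ranks; tied values share the average of their rank range."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                             # extend over the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1  # average rank, 1-based
        i = j + 1
    return ranks

def spearman_rho(xs, ys):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = average_ranks(xs), average_ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Invented data: visually estimated blood loss (ml) vs. reference blood loss.
estimated = [250, 150, 400, 100, 300, 150]
reference = [103, 90, 160, 86, 140, 120]
rho = spearman_rho(estimated, reference)
```

Because only ranks enter the calculation, a rho near 0.5, as reported for both sponge colours, means estimates tend to rise with true blood loss but says nothing about their absolute accuracy, which is exactly why large under- and overestimation can coexist with a moderate correlation.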