Background: The use of cell salvage and autologous blood transfusion has become an important method of blood conservation. To date, however, there have been no clinical data on the performance of the continuous autotransfusion device CATSmart.
Methods: In total, 74 patients undergoing either cardiac or orthopedic surgery were included in this prospective, bicentric, observational technical evaluation to validate the red cell separation process and washout quality of the CATSmart. The target for the red cell separation process was defined as a hematocrit of 55–75% in the packed red cell unit, and the target for washout quality as a removal ratio of 80–100%.
Results: Hematocrit values measured by the CATSmart and by laboratory analysis were 78.5% [71.3%; 84.0%] and 73.7% [67.5%; 75.5%], respectively. Removal ratios for platelets (94.7% [88.2%; 96.7%]), free hemoglobin (89.3% [85.2%; 94.9%]), albumin (97.9% [96.6%; 98.5%]), heparin (99.9% [99.9%; 100.0%]) and potassium (92.5% [90.8%; 95.0%]) were within the target range, while removal of white blood cells was slightly lower (72.4% [57.9%; 87.3%]).
Conclusion: The new autotransfusion device enables sufficient red cell separation and washout quality.
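The washout quality above is quantified as a removal ratio, i.e. the fraction of a plasma constituent eliminated during washing. A minimal sketch of that calculation (a hypothetical helper written for illustration, not part of the study):

```python
def removal_ratio(amount_in, amount_out):
    """Percentage of a plasma constituent (e.g. heparin, free
    hemoglobin, potassium) removed during cell salvage washing."""
    if amount_in <= 0:
        raise ValueError("amount_in must be positive")
    return 100.0 * (1.0 - amount_out / amount_in)

# Example: 0.2 units of heparin remaining out of 100 units processed
print(removal_ratio(100.0, 0.2))  # ~99.8, within the 80-100% target
```

A constituent meets the study's washout target when this value falls in the 80–100% range.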
Background: Cerebral oxygen saturation (ScO2) can be measured non-invasively by near-infrared spectroscopy (NIRS) and correlates with cerebral perfusion. We investigated cerebral saturation during transfemoral transcatheter aortic valve implantation (TAVI) and its impact on outcome.
Methods and results: Cerebral oxygenation was measured continuously by NIRS in 173 analgo-sedated patients during transfemoral TAVI (female 47%, mean age 81 years) with self-expanding (39%) and balloon-expanding valves (61%). We investigated the periprocedural dynamics of cerebral oxygenation. Mean ScO2 at baseline without oxygen supply was 60%. During rapid ventricular pacing, ScO2 dropped significantly (before 64% vs. after 55%, p < 0.001). ScO2 at baseline correlated positively with baseline left-ventricular ejection fraction (0.230, p < 0.006) and hemoglobin (0.327, p < 0.001), and inversely with EuroSCORE-II (−0.285, p < 0.001) and length of in-hospital stay (−0.229, p < 0.01). Patients with ScO2 < 56% despite oxygen supply at baseline had impaired 1-year survival (log-rank test p < 0.01) and a prolonged in-hospital stay (p = 0.03). Furthermore, baseline ScO2 was found to be a predictor of 1-year survival independent of age and sex (multivariable adjusted Cox regression, p = 0.020, hazard ratio (HR) 0.94, 95% CI 0.90–0.99) and independent of overall perioperative risk estimated by EuroSCORE-II and hemoglobin (p = 0.03, HR 0.95, 95% CI 0.91–0.99).
Conclusions: A low baseline ScO2 not responding to oxygen supply might act as a surrogate for impaired cardiopulmonary function and is associated with worse 1-year survival and a prolonged in-hospital stay after transfemoral TAVI. ScO2 monitoring is an easy-to-implement diagnostic tool to screen for patients at risk of a worse outcome after TAVI.
Purpose: Trauma is the leading cause of death in children. In adults, blood transfusion and fluid resuscitation protocols have changed over the past two decades, resulting in a decrease in morbidity and mortality. Here, transfusion and fluid resuscitation practices in severely injured children in Germany were analysed.
Methods: Severely injured children (maximum Abbreviated Injury Scale (AIS) ≥ 3) admitted to a certified trauma-centre (TraumaZentrum DGU®) between 2002 and 2017 and registered at the TraumaRegister DGU® were included and assessed regarding blood transfusion rates and fluid therapy.
Results: 5,118 children (aged 1–15 years) with a mean ISS of 22 were analysed. The rate of blood transfusion administered up to ICU admission decreased from 18% (2002–2005) to 7% (2014–2017). Transfused children were increasingly severely injured: their mean ISS increased from 27.7 (2002–2005) to 34.4 (2014–2017), whereas the mean ISS of non-transfused children decreased from 19.6 (2002–2005) to 17.6 (2014–2017). Mean prehospital fluid administration decreased from 980 to 549 ml without increasing hemodynamic instability.
Conclusion: Blood transfusion rates and the amount of fluid resuscitation in severely injured children decreased over the 16-year period in Germany. Restrictive blood transfusion and fluid management have become common practice in severely injured children. A restrictive prehospital fluid management strategy in severely injured children is not associated with a worsened hemodynamic state, abnormal coagulation or base excess, but leads to higher hemoglobin levels.
Introduction: In recent years, resource-saving handling of allogeneic blood products and a reduction of transfusion rates in adults have been observed. However, comparable published national data on transfusion practice in pediatric patients are currently not available. In this study, transfusion rates for children and adolescents over the past two decades were analyzed based on data from the Federal Statistical Office of Germany.
Methods: Data were queried from the database of the Federal Statistical Office (Destatis). The period covered was 2005 to 2018, and the sample comprised children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions and for procedures or interventions with increased transfusion risk were queried and evaluated in detail.
Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total collective of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4-year-olds (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were most frequently transfused (in 2018, 40.2% of the children were cared for in hospital). Transfusion-associated procedures such as chemotherapy or mechanical ventilation and respiratory support for newborns and infants are on the rise.
Conclusion: Transfusion rates are declining in children and adolescents overall, but the reasons for the increases in some age groups are unclear. Prospective studies evaluating transfusion rates and triggers in children are urgently needed.
The ongoing SARS-CoV-2 pandemic is characterized by poor outcomes and high mortality, especially in the older patient cohort. Up to this point, there has been a lack of data characterizing COVID-19 patients in Germany admitted to an intensive care unit (ICU) vs. non-ICU patients. German inpatient reimbursement data covering the period from January 1st, 2020 to December 31st, 2021 were analyzed. 561,379 patients were hospitalized with COVID-19, of whom 24.54% (n = 137,750) were admitted to an ICU. Overall hospital mortality was 16.69% (n = 93,668), and 33.36% (n = 45,947) in the ICU group. 28.66% (n = 160,881) of all patients suffered from cardiac arrhythmia and 17.98% (n = 100,926) developed renal failure. Obesity showed an odds ratio ranging from 0.83 (0.79–0.87) for WHO grade I to 1.13 (1.08–1.19) for grade III. Mortality rates peaked in April 2020 and January 2021 at 21.23% (n = 4,539) and 22.99% (n = 15,724). A third peak was observed in November and December 2021 (16.82%, n = 7,173 and 16.54%, n = 9,416). Hospitalized COVID-19 patient mortality in Germany is lower than previously shown in other studies. 24.54% of all patients had to be treated in an ICU, with a mortality rate of 33.36%. Congestive heart failure was associated with a higher risk of death, whereas low-grade obesity might have a protective effect on patient survival. High admission numbers are accompanied by a higher mortality rate.
Characterization of neonates born to mothers with SARS-CoV-2 infection: review and meta-analysis
(2020)
Characterization of neonates born to mothers with SARS-CoV-2 infection has so far been only partial; no systematic review has provided a holistic neonatal presentation, including possible vertical transmission. A systematic literature search was performed using PubMed, Google Scholar and Web of Science up to June 6, 2020. Studies on neonates born to mothers with SARS-CoV-2 infection were included. A binary random-effects model was used to estimate prevalence with 95% confidence intervals. 32 studies involving 261 neonates were included in the meta-analysis. Most neonates born to infected mothers did not show any clinical abnormalities (80.4%). Clinical features were dyspnea in 11 newborns (42.3%) and fever in 9 (19.1%). Of 261 neonates, 120 were tested for infection, of whom 12 (10.0%) tested positive. Swabs from placenta, cord blood and vaginal secretions were negative. Neonates are mostly unaffected by the mother's SARS-CoV-2 infection; the risk of vertical transmission is low.
Cholinesterase alterations in delirium after cardiosurgery: a German monocentric prospective study
(2020)
Objectives: Postoperative delirium (POD) is a common complication after elective cardiac surgery. Recent evidence indicates that a disruption in the normal activity of the cholinergic system may be associated with delirium.
Design: Prospective observational study.
Setting: Single-centre at a European academic hospital.
Primary and secondary outcome measures: The enzyme activities of acetylcholinesterase (AChE) and butyrylcholinesterase (BChE) were determined preoperatively as well as on the first and second postoperative day. The Confusion Assessment Method for the Intensive Care Unit was used to screen patients for the presence of POD.
Results: A total of 114 patients were included in the study. POD was associated with a decrease in BChE activity on postoperative day 1 (p=0.03). In addition, patients who developed POD had significantly lower preoperative AChE activity than patients without POD (p<0.01). Multivariate analysis identified preoperatively decreased AChE activity (OR 3.1; 95% CI 1.14 to 8.46), anticholinergic treatment (OR 5.09; 95% CI 1.51 to 17.23), an elevated European System for Cardiac Operative Risk Evaluation score (OR 3.68; 95% CI 1.04 to 12.99) and age (OR 3.02; 95% CI 1.06 to 8.62) as independently associated with the development of POD.
Conclusions: We conclude that a reduction in the acetylcholine hydrolysing enzyme activity in patients undergoing cardiac surgery may correlate with the development of POD.
Transfusion of red blood cells (RBC) in patients undergoing major elective cranial surgery is associated with increased morbidity, mortality and prolonged hospital length of stay (LOS). This retrospective single-center study aims to determine the clinical outcome of RBC transfusion in patients with skull base and non-skull base meningiomas and to identify risk factors for RBC transfusion. Between October 2009 and October 2016, 423 patients underwent primary meningioma resection; of these, 68 (16.1%) received RBC transfusion and 355 (83.9%) did not. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) than in patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, postoperative complications and hospital LOS were significantly higher (p < 0.0001) than in non-transfused patients. In multivariate analyses, risk factors for RBC transfusion were preoperative American Society of Anesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.0001). Kaplan-Meier curves revealed a significant influence on overall survival of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative KPS ≤ 60% and age (elderly, ≥ 75 years). We conclude that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in multivariate analysis. Further studies evaluating the impact of preoperative anaemia management on the reduction of RBC transfusion are needed to improve the clinical outcome of meningioma patients.
Background. Tracheal intubation still represents the "gold standard" for securing the airway of unconscious patients in the prehospital setting. Especially in cases of restricted access to the patient, video laryngoscopy has become increasingly relevant.
Objectives. The aim of the study was to evaluate the performance and intubation success of four different video laryngoscopes, one optical laryngoscope, and a Macintosh blade while intubating from two different positions in a mannequin trial with difficult access to the patient.
Methods. A mannequin with a cervical collar was placed on the driver’s seat. Intubation was performed with six different laryngoscopes either through the driver’s window or from the backseat. Success, C/L score, time to best view (TTBV), time to intubation (TTI), and number of attempts were measured. All participants were asked to rate their favored device.
Results. Forty-two physicians participated. All intubations performed from the backseat were successful. Intubation through the driver's window was less successful; 100% success was achieved only with the Airtraq® optical laryngoscope. The best visualization (window C/L 2a; backseat C/L 2a) and the shortest TTBV (window 4.7 s; backseat 4.1 s) were obtained with the D-Blade video laryngoscope, but this was not associated with a higher success rate through the driver's window. The fastest TTI was achieved through the window (14.2 s) using the C-MAC video laryngoscope and from the backseat (7.3 s) using a Macintosh blade.
Conclusions. Video laryngoscopy yielded better visualization but was not associated with a higher success rate; success depended on the approach and on familiarity with the device. We believe that video laryngoscopy is suitable for securing the airway of trapped accident victims. The choice of an optimal device is complicated and should be based on experience and regular training with the device.
Estimating intraoperative blood loss is a daily challenge for clinicians. Despite the known inaccuracy of visual estimation by anaesthetists and surgeons, it is still the mainstay of estimating surgical blood loss. This review aims to highlight the strengths and weaknesses of currently used measurement methods. A systematic review of studies on the estimation of blood loss was carried out. Studies investigating the accuracy of techniques for quantifying blood loss in vivo and in vitro were included; non-human trials and studies using only monitoring parameters to estimate blood loss were excluded. A meta-analysis was performed to evaluate systematic measurement errors of the different methods; only studies compared against a validated reference, e.g. a haemoglobin extraction assay, were included. 90 studies met the inclusion criteria for the systematic review and were analyzed. Six studies were included in the meta-analysis, as only these were conducted with a validated reference. The mixed-effects meta-analysis showed the highest correlation with the reference for colorimetric methods (0.93, 95% CI 0.91–0.96), followed by gravimetric (0.77, 95% CI 0.61–0.93) and finally visual methods (0.61, 95% CI 0.40–0.82). The bias in estimated blood loss (ml) relative to the reference was lowest for colorimetric methods (57.59, 95% CI 23.88–91.3), followed by gravimetric (326.36, 95% CI 201.65–450.86) and visual methods (456.51, 95% CI 395.19–517.83). Of the many studies included, only a few were compared with a validated reference; the majority chose known imprecise procedures as the method of comparison. Colorimetric methods offer the highest degree of accuracy in blood loss estimation, and systems that use colorimetric techniques have a significant advantage in the real-time assessment of blood loss.
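The pooled correlations above come from a mixed-effects (random-effects) meta-analysis. One common way to pool correlation coefficients is DerSimonian–Laird weighting on Fisher-z-transformed values; the sketch below is an illustrative reimplementation of that standard technique, not the authors' code:

```python
import math

def pool_correlations_dl(r_values, n_values):
    """Random-effects (DerSimonian-Laird) pooling of correlation
    coefficients via Fisher's z transform. Illustrative sketch."""
    # Fisher z transform and its within-study variance 1/(n-3)
    z = [0.5 * math.log((1 + r) / (1 - r)) for r in r_values]
    v = [1.0 / (n - 3) for n in n_values]
    w = [1.0 / vi for vi in v]                      # fixed-effect weights
    z_fe = sum(wi * zi for wi, zi in zip(w, z)) / sum(w)
    # Heterogeneity: Q statistic and DL estimate of between-study variance
    q = sum(wi * (zi - z_fe) ** 2 for wi, zi in zip(w, z))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(z) - 1)) / c)
    # Random-effects weights and pooled estimate, back-transformed to r
    w_re = [1.0 / (vi + tau2) for vi in v]
    z_re = sum(wi * zi for wi, zi in zip(w_re, z)) / sum(w_re)
    return math.tanh(z_re)
```

For example, `pool_correlations_dl([0.93, 0.77, 0.61], [50, 40, 30])` (hypothetical study sizes) returns a pooled correlation strictly between the smallest and largest input correlation.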
Objective: Videolaryngoscopy has mainly been developed to facilitate difficult airway intubation. However, there is a lack of studies demonstrating this method's efficacy in pediatric patients. The aim of the present study was to compare the TruView infant EVO2 and the C-MAC videolaryngoscope with conventional direct Macintosh laryngoscopy in children with a bodyweight ≤10 kg in terms of intubation conditions and the time to intubation.
Methods: In total, 65 children with a bodyweight ≤10 kg (0-22 months) who had undergone elective surgery requiring endotracheal intubation were retrospectively analyzed. Our database was screened for intubations with the TruView infant EVO2, the C-MAC videolaryngoscope, and conventional direct Macintosh laryngoscopy. The intubation conditions, the time to intubation, and the oxygen saturation before and after intubation were monitored, and demographic data were recorded. Only children with a bodyweight ≤10 kg were included in the analysis.
Results: A total of 23 children were intubated using the C-MAC videolaryngoscope, and 22 children were intubated using the TruView EVO2. Additionally, 20 children were intubated using a standard Macintosh blade. The time required for tracheal intubation was significantly longer with the TruView EVO2 (52 s vs. 28 s for the C-MAC and 26 s for direct laryngoscopy). However, no significant difference in oxygen saturation was found after intubation.
Conclusion: All devices allowed excellent visualization of the vocal cords, but the time to intubation was prolonged when the TruView EVO2 was used. The absence of a decline in oxygen saturation may be due to apneic oxygenation via the TruView scope and may provide a margin of safety. In sum, the use of the TruView by a well-trained anesthetist may be an alternative for difficult airway management in pediatric patients.
Background: Bacterial DNA containing motifs of unmethylated CpG dinucleotides (CpG-ODN) initiates an innate immune response mediated by the pattern recognition receptor Toll-like receptor 9 (TLR9). This leads in particular to the expression of proinflammatory mediators such as tumor necrosis factor alpha (TNF-alpha) and interleukin-1beta (IL-1beta). TLR9 is expressed in human and murine pulmonary tissue, and induction of proinflammatory mediators has been linked to the development of acute lung injury. Therefore, we tested the hypothesis that CpG-ODN administration induces an inflammatory response in the lung via TLR9 in vivo.
Methods: Wild-type (WT) and TLR9-deficient (TLR9-D) mice received CpG-ODN intraperitoneally (1668-Thioat, 1 nmol/g BW) and were observed for up to 6 h. Lung tissue and plasma samples were taken and various inflammatory markers were measured.
Results: In WT mice, CpG-ODN induced a strong activation of pulmonary NF-κB as well as a significant increase in pulmonary TNF-alpha and IL-1beta mRNA/protein. In addition, cytokine serum levels were significantly elevated in WT mice. Increased pulmonary myeloperoxidase (MPO) content was documented in WT mice following application of CpG-ODN. Bronchoalveolar lavage (BAL) revealed that CpG-ODN stimulation significantly increased the total cell number as well as the neutrophil count in WT animals. In contrast, the CpG-ODN-induced inflammatory response was abolished in TLR9-D mice.
Conclusion: This study suggests that bacterial CpG-ODN causes lung inflammation via TLR9.
Coronavirus disease 2019 (COVID-19) is caused by the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) and can affect multiple organs, among which is the circulatory system. Inflammation and mortality risk markers were previously detected in COVID-19 plasma and red blood cells (RBCs) metabolic and proteomic profiles. Additionally, biophysical properties, such as deformability, were found to be changed during the infection. Based on such data, we aim to better characterize RBC functions in COVID-19. We evaluate the flow properties of RBCs in severe COVID-19 patients admitted to the intensive care unit by using microfluidic techniques and automated methods, including artificial neural networks, for an unbiased RBC analysis. We find strong flow and RBC shape impairment in COVID-19 samples and demonstrate that such changes are reversible upon suspension of COVID-19 RBCs in healthy plasma. Vice versa, healthy RBCs resemble COVID-19 RBCs when suspended in COVID-19 plasma. Proteomics and metabolomics analyses allow us to detect the effect of plasma exchanges on both plasma and RBCs and demonstrate a new role of RBCs in maintaining plasma equilibria at the expense of their flow properties. Our findings provide a framework for further investigations of clinical relevance for therapies against COVID-19 and possibly other infectious diseases.
Editor's evaluation
This report illustrates a comprehensive account detailing the marked alteration of red blood cell (RBC) morphology that occurs with COVID-19 infection. A particularly important result is the observation that RBC morphology is dramatically affected by plasma from COVID-19 patients and reversible with plasma from healthy donors. The claims of the manuscript are well supported by the data, and the approaches used are thoughtful and rigorous. The results are important for consideration of the broader pathophysiology of COVID-19, particularly with regard to the impact on vascular biology and will be of interest to the readership of eLife.
The main goal of adequate organ preservation is to avoid further cellular metabolism during the phase of ischemia, yet modern preservation solutions rarely achieve this target. In donor organs, hypoxia and ischemia induce a broad spectrum of pathologic molecular mechanisms favoring primary graft dysfunction (PGD) after transplantation. Increased hypoxia-induced transcriptional activity leads to increased vascular permeability, which in turn forms the basis of reperfusion edema and an enhanced pro-inflammatory response in the graft after reperfusion. We hypothesized that inhibition of the mitochondrial respiratory chain, and thus of hypoxia-induced mechanisms, might reduce reperfusion edema and consequently improve survival in vivo. In this study we demonstrate that the rotenoid deguelin dose-dependently reduces the expression of hypoxia-induced target genes, especially VEGF-A, in hypoxic human lung-derived cells. Furthermore, deguelin significantly suppresses the mRNA expression of the HIF target genes VEGF-A, the pro-inflammatory CXCR4 and ICAM-1 in ischemic lungs vs. control lungs. After lung transplantation, the VEGF-A-induced reperfusion edema is significantly lower in deguelin-treated animals than in controls, and deguelin-treated rats exhibit a significantly increased survival rate after transplantation. Additionally, a downregulation of the pro-inflammatory molecules ICAM-1 and CXCR4 and an increase in the recruitment of immunomodulatory monocytes (CD163+ and CD68+) to the transplanted organ involving the IL-4 pathway was observed. We therefore conclude that ischemic periods preceding reperfusion are mainly responsible for the increased vascular permeability via upregulation of VEGF, and that the resulting endothelial dysfunction also enhances inflammation and consequently lung dysfunction.
Deguelin significantly decreases VEGF-A-induced reperfusion edema, induces the recruitment of immunomodulatory monocytes and thus improves organ function and survival after lung transplantation by interfering with hypoxia-induced signaling.
Introduction: Acute kidney injury (AKI) can evolve quickly and clinical measures of function often fail to detect AKI at a time when interventions are likely to provide benefit. Identifying early markers of kidney damage has been difficult due to the complex nature of human AKI, in which multiple etiologies exist. The objective of this study was to identify and validate novel biomarkers of AKI.
Methods: We performed two multicenter observational studies in critically ill patients at risk for AKI: a discovery study and a validation study. The top two markers from discovery were validated in a second study (Sapphire) and compared to a number of previously described biomarkers. In the discovery phase, we enrolled 522 adults in three distinct cohorts, including patients with sepsis, shock, major surgery and trauma, and examined over 300 markers. In the Sapphire validation study, we enrolled 744 adult subjects with critical illness and without evidence of AKI at enrollment; the final analysis cohort was a heterogeneous sample of 728 critically ill patients. The primary endpoint was moderate to severe AKI (KDIGO stage 2 to 3) within 12 hours of sample collection.
Results: Moderate to severe AKI occurred in 14% of Sapphire subjects. The two top biomarkers from discovery were validated. Urine insulin-like growth factor-binding protein 7 (IGFBP7) and tissue inhibitor of metalloproteinases-2 (TIMP-2), both inducers of G1 cell cycle arrest, a key mechanism implicated in AKI, together demonstrated an AUC of 0.80 (0.76 and 0.79 alone). Urine [TIMP-2]·[IGFBP7] was significantly superior to all previously described markers of AKI (p < 0.002), none of which achieved an AUC > 0.72. Furthermore, [TIMP-2]·[IGFBP7] significantly improved risk stratification when added to a nine-variable clinical model, whether analyzed using a Cox proportional hazards model, generalized estimating equations, integrated discrimination improvement or net reclassification improvement. Finally, in sensitivity analyses [TIMP-2]·[IGFBP7] remained significant and superior to all other markers regardless of changes in the reference creatinine method.
Conclusions: Two novel markers for AKI have been identified and validated in independent multicenter cohorts. Both markers are superior to existing markers, provide additional information over clinical variables and add mechanistic insight into AKI. Trial registration: ClinicalTrials.gov number NCT01209169.
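The combined marker is simply the product of the two urinary concentrations. The sketch below illustrates that calculation; the division by 1000 (giving units of (ng/mL)²/1000) reflects common reporting practice and is an assumption, since the abstract does not state units:

```python
def timp2_igfbp7_index(timp2_ng_ml, igfbp7_ng_ml):
    """Urinary [TIMP-2]*[IGFBP7] cell-cycle-arrest index.
    The (ng/mL)^2 / 1000 scaling is an assumption based on common
    reporting practice, not stated in the abstract."""
    if timp2_ng_ml < 0 or igfbp7_ng_ml < 0:
        raise ValueError("concentrations must be non-negative")
    return timp2_ng_ml * igfbp7_ng_ml / 1000.0

# Hypothetical example values, not study data
print(timp2_igfbp7_index(10.0, 30.0))  # 0.3
```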
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Owing to local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on the accuracy of estimation has not yet been a focus of research.
Material and methods: A blood loss simulation study containing four "bleeding" scenarios per sponge colour was created using expired whole blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with a balanced electrolyte solution. Study participants had to estimate the absorbed blood loss in the sponges in all scenarios, and the difference from the reference blood loss (RBL) was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour.
Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with reference blood loss for white (Spearman's rho: 0.521; p = 3.748 × 10⁻¹⁶) and green sponges (Spearman's rho: 0.452; p = 4.683 × 10⁻¹²). The median visually estimated blood loss was higher for white sponges (250 ml, IQR 150–412.5 ml) than for green sponges (150 ml, IQR 100–300 ml), compared to the reference blood loss (103 ml, IQR 86–162.8 ml). For both sponge colours, major under- and overestimation was observed. The multivariate statistics demonstrate that fabric colour has a significant influence on estimation (p = 3.04 × 10⁻¹⁰), as do the clinician's qualification level (p = 2.20 × 10⁻¹⁰, p = 1.54 × 10⁻⁸) and the amount of RBL to be estimated (p < 2 × 10⁻¹⁶).
Conclusion: The deviation from the correct blood loss estimate was smaller with white surgical sponges than with green sponges. In general, deviations were so severe for both types of sponges that it appears advisable to refrain from visually estimating blood loss whenever possible and instead to use other techniques such as colorimetric estimation.
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during TAVI in non-ventilated patients and its impact on procedural outcome. Methods and Results: We measured the ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (female 58%, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post-RVP 62.6% vs. 53.9% during RVP, p < .01; pre-RVP 61.6% vs. post-RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and more often experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: During RVP, ScO2 declined significantly compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and may serve as a cut-off value to screen for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk for cerebrovascular complications and delirium.
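The >20% criterion used in this study amounts to computing the largest relative decline from the pre-procedural baseline over the NIRS trace. A minimal sketch (hypothetical helper and sample values, not study data):

```python
def max_relative_sco2_drop(baseline, readings):
    """Largest relative ScO2 decline (%) from baseline across a NIRS trace."""
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    return max(100.0 * (baseline - s) / baseline for s in readings)

trace = [60, 58, 45, 52, 59]                 # ScO2 readings (%) during RVP
drop = max_relative_sco2_drop(60, trace)     # 25.0
print(drop > 20)                             # True -> flags a >20% decline
```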
Background: Intraosseous (IO) access represents a reliable alternative to intravenous vascular access and is explicitly recommended in the current guidelines of the European Resuscitation Council when intravenous access is difficult or impossible. We therefore aimed to study the efficacy of the intraosseous needle driver EZ-IO® in the prehospital setting.
Methods: During a 24-month period, all cases of prehospital IO access using the EZ-IO® needle driver within three operational areas of emergency medical services were prospectively recorded by a standardized questionnaire that had to be filled out by the rescuer immediately after the mission and sent to the primary investigator. We determined the rate of successful insertion of the IO needle, the time required, immediate procedure-related complications, the level of previous experience with IO access, and the operators' subjective satisfaction with the device.
Results: 77 IO needle insertions were performed in 69 adults and five infants and children by emergency physicians (n=72 applications) and paramedics (n=5 applications). Needle placement was successful at the first attempt in all but 2 adults (one patient with unrecognized total knee arthroplasty, one case of needle obstruction after placement). The majority of users (92%) were relative novices with less than five previous IO needle placements. Of 22 responsive patients, 18 reported pain upon fluid administration via the needle. The rescuers' subjective rating regarding handling of the device and ease of needle insertion, as described by means of an analogue scale (0 = entirely unsatisfied, 10 = most satisfied), provided a median score of 10 (range 1-10).
Conclusions: The EZ-IO® needle driver was an efficient alternative for establishing immediate out-of-hospital vascular access. However, significant pain upon intramedullary infusion was observed in the majority of responsive patients.
Epidural catheterization has become an indispensable part of modern pain therapy, for example, in obstetrics. Learning how to master this skill is an important competency. Videos are among the information sources with the highest information content for learning such skills. The present study aims to analyze videos on epidural catheter placement provided on the YouTube platform based on a validated checklist. An expert workshop selected crucial items for learning epidural catheterization in obstetrics. Items were identified and optimized in a five-step testing process. Using this checklist, videos from YouTube were evaluated by eleven health care professionals. Sixteen videos were identified and analyzed. Concerning the catheterization-specific part of the checklist, only two videos showed satisfactory quality. In the didactic part, eleven out of 21 items reached a mean score >50% of the points. Regarding interrater reliability, agreement was substantial for the catheterization-specific checklist (Fleiss' kappa = 0.610) and fair for the didactic part (Fleiss' kappa = 0.401). Overall, standard monitoring and appropriate aseptic technique were followed in only 42% and 49% of the videos, respectively. There was a significant correlation between runtime and content quality (p < 0.001). No correlation could be found with platform rating parameters. The video quality varied highly in terms of the requirements of this practical skill. The majority of videos appear unsuitable for self-study due to serious errors and deficiencies regarding patient safety. However, there is no quality control on free platforms. Accordingly, it is difficult to identify suitable videos for educational purposes.
Introduction Patients undergoing heart valve surgery are predominantly transferred postoperatively to the intensive care unit (ICU) under continuous sedation. Volatile anaesthetics are an increasingly used alternative to intravenous substances in the ICU. Owing to their inhalational uptake and elimination, their pharmacological benefits have been repeatedly demonstrated. Volatile anaesthetics therefore appear suitable to meet the growing demands of fast-track cardiac surgery. However, their use requires special preparation at the bedside and trained medical and nursing staff, which might limit the pharmacological benefits. The aim of our work is to assess whether the temporal advantages of recovery under volatile sedation outweigh the higher effort of special preparation.
Methods and analysis The study is designed to evaluate the differences between intravenous sedatives (n=48) and volatile sedatives (n=48) in continued intensive care sedation. It will be conducted as a prospective, randomised, controlled, single-blinded, monocentre trial in consenting adult patients undergoing heart valve surgery at a German university hospital. The study will examine the necessary preparation time, staff consultation and overall feasibility of the chosen sedation method. For this purpose, continuation of sedation in the ICU with volatile sedatives is considered one study arm and with intravenous sedatives the comparison group. Owing to rapid elimination and quick awakening after the termination of sedation, closer consultation between the attending physician and the ICU nursing staff is required, in addition to a prolonged setup time. Study analysis will include the required setup time, time from admission to extubation (primary outcome) and neurocognitive assessability. In addition, possible operation-specific factors (blood loss, complications), treatment parameters (catecholamine dosages, lung function) and laboratory results (acute kidney injury, acid-base balance (lactataemia), liver failure) will be collected as influencing factors. The study-relevant data will be extracted from the continuous digital records of the patient data management system after the patient has been discharged from the ICU. For statistical evaluation, 95% CIs will be calculated for the median time to extubation and neurocognitive assessability, and the association will be assessed with a Cox regression model. In addition, secondary binary outcome measures will be evaluated using Fisher's exact tests. Further descriptive and exploratory statistical analyses are also planned.
Ethics and dissemination The study was approved by the Institutional Ethics Board of the University of Frankfurt, Germany (#20-1050). Informed consent of all individual patients will be obtained before randomisation. Results will be disseminated via publication in peer-reviewed journals.
The scope of extracorporeal membrane oxygenation (ECMO) is expanding; nevertheless, pharmacokinetics in patients receiving cardiorespiratory support are largely unknown, leading to unpredictable drug concentrations. Currently, there are no clear guidelines for antibiotic dosing during ECMO. This study aims to evaluate the pharmacokinetics (PK) of cefazolin in patients undergoing ECMO treatment. Total and unbound plasma cefazolin concentrations of critically ill patients on veno-arterial ECMO were determined. The observed PK was compared to dose recommendations calculated by freely available online dosing software. Cefazolin concentrations varied broadly despite identical dosing in all patients. The mean total and unbound plasma concentrations were high, with a significantly (p = 5.89*10−9) greater unbound fraction compared to a standard patient. Cefazolin clearance was significantly (p = 0.009) higher in patients with preserved renal function than in those on continuous renal replacement therapy (CRRT). Based upon the calculated clearance, the use of dosing software would in general have led to lower but still sufficient cefazolin concentrations. Our study shows that a "one size fits all" dosing regimen leads to excessive unbound cefazolin concentrations in these patients. They exhibit high PK variability, and decreased cefazolin clearance on ECMO appears to compensate for ECMO- and critical illness-related increases in volume of distribution.
BACKGROUND: In the context of the coronavirus disease 2019 (COVID-19) pandemic, many retrospective single-centre or specialised centre reports have shown promising mortality rates with the use of extracorporeal membrane oxygenation (ECMO) therapy. However, the mortality rate of an entire country throughout the COVID-19 pandemic remains unknown.
OBJECTIVES: The primary objective is to determine the hospital mortality in COVID-19 patients receiving venovenous ECMO (VV-ECMO) and veno-arterial ECMO (VA-ECMO) therapy. Secondary objectives are the chronological development of mortality during the pandemic, the analysis of comorbidities, age and complications.
DESIGN: Cohort study.
SETTING: Inpatient data from January 2020 to September 2021 of all hospitals in Germany were analysed.
PARTICIPANTS: All COVID-19-positive patients who received ECMO therapy were analysed according to the appropriate International Statistical Classification of Diseases and Related Health Problems codes (ICD) and German procedure classification codes (OPS).
MAIN OUTCOME MEASURES: The primary outcome was the hospital mortality.
RESULTS: In total, 4279 COVID-19-positive patients who received ECMO therapy were analysed. Among 404 patients treated with VA-ECMO and 3875 treated with VV-ECMO, the hospital mortality was high: 72% (n = 291) for VA-ECMO and 65.9% (n = 2552) for VV-ECMO. A total of 43.2% (n = 1848) of all patients were older than 60 years, with a hospital mortality rate of 72.7% (n = 172) for VA-ECMO and 77.6% (n = 1301) for VV-ECMO. Cardiopulmonary resuscitation (CPR) was performed in 44.1% (n = 178) of patients with VA-ECMO and 16.4% (n = 637) of patients with VV-ECMO. The mortality rates varied widely, from 48.1% to 84.4% in individual months, and worsened from March 2020 (59.2%) to September 2021 (78.4%).
CONCLUSION: In Germany, a large proportion of elderly patients with COVID-19 were treated with ECMO, with an unacceptably high hospital mortality. Considering these data, the unconditional use of ECMO therapy in COVID-19 must be carefully weighed, and advanced age should be regarded as a relative contraindication.
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells (RBC) is the mainstay to compensate for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be provided in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes compared to central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing compared to CL services. Methods: Data from the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®) against CL as the reference method. For that purpose, we analyzed pairs of Hb levels measured by CL and BGA within two hours. Furthermore, the impact of various confounding factors including age, gender, BMI, smoker status, transfusion of RBC, intraoperative hemodilution and co-medication was elucidated. To ensure adequate statistical analysis, only data from participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and GEM series® (−0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (−0.09 g/dl) but a greater median absolute deviation (± 0.45 g/dl). To avoid interference between the different standard deviations caused by the different analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms.
Smoker status showed the greatest reduction in bias (0.1 g/dl, p<0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p<0.001), statins (0.04 g/dl, p<0.05) and beta blockers (0.03 g/dl, p = 0.02) had a slight effect on bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions such as substitution of fluids, coagulation factors or RBC units rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference from CL, our data confirm that different BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that the bias decreased with smaller Hb values, which in turn are used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the LIBERAL trial performance.
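The Bland-Altman bias estimate used in this comparison can be sketched as follows. The paired Hb values are invented for illustration, and the 1.96-SD limits of agreement assume approximately normally distributed differences:

```python
import statistics

def bland_altman(reference, test):
    """Mean bias (test - reference) and 95% limits of agreement."""
    diffs = [t - r for r, t in zip(reference, test)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired Hb values in g/dl (central laboratory vs. BGA device):
cl_hb  = [10.0, 12.0, 9.0, 11.0]
bga_hb = [9.6, 11.7, 8.5, 10.6]
bias, lo, hi = bland_altman(cl_hb, bga_hb)
print(round(bias, 2))  # -0.4: the BGA device reads lower than CL on average here
```

A hospital assessing its individual bias, as suggested above, would run exactly this kind of calculation on its own CL/BGA pairs.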
Introduction: Systemic inflammation (e.g. following surgery) involves Toll-like receptor (TLR) signaling and leads to an endocrine stress response. This study aims to investigate a possible influence of TLR2 and TLR4 single nucleotide polymorphisms (SNPs) on perioperative adrenocorticotropic hormone (ACTH) and cortisol regulation in the serum of cardiac surgical patients. To investigate the link to systemic inflammation in this context, we additionally measured 10 different cytokines in the serum. Methods: 338 patients admitted for elective cardiac surgery were included in this prospective observational clinical cohort study. Genomic DNA of patients was screened for TLR2 and TLR4 SNPs. Serum concentrations of ACTH, cortisol, interferon (IFN)-γ, interleukin (IL)-1β, IL-2, IL-4, IL-5, IL-6, IL-8, IL-10, tumor necrosis factor (TNF)-α and granulocyte-macrophage colony-stimulating factor (GM-CSF) were determined before surgery, immediately after surgery and on the first postoperative day. Results: 13 patients were identified as TLR2 SNP carriers, 51 as TLR4 SNP carriers and 274 patients as non-carriers. Basal levels of ACTH, cortisol and cytokines did not differ between groups. In all three groups, a significant, transient perioperative rise of cortisol could be observed. However, only in the non-carrier group was this accompanied by a significant ACTH rise; TLR4 SNP carriers had significantly lower ACTH levels compared to non-carriers ((mean [95% confidence intervals]) non-carriers: 201.9 [187.7 to 216.1] pg/ml; TLR4 SNP carriers: 149.9 [118.4 to 181.5] pg/ml; TLR2 SNP carriers: 176.4 [110.5 to 242.3] pg/ml).
Compared to non-carriers, TLR4 SNP carriers showed significantly lower serum IL-8, IL-10 and GM-CSF peaks ((mean [95% confidence intervals]): IL-8: non-carriers: 42.6 [36.7 to 48.5] pg/ml, TLR4 SNP carriers: 23.7 [10.7 to 36.8] pg/ml; IL-10: non-carriers: 83.8 [70.3 to 97.4] pg/ml, TLR4 SNP carriers: 54.2 [24.1 to 84.2] pg/ml; GM-CSF: non-carriers: 33.0 [27.8 to 38.3] pg/ml, TLR4 SNP carriers: 20.2 [8.6 to 31.8] pg/ml). No significant changes over time or between the groups were found for the other cytokines. Conclusions: Regulation of the immunoendocrine stress response during systemic inflammation is influenced by the presence of a TLR4 SNP. Cardiac surgical patients carrying this genotype showed decreased serum concentrations of ACTH, IL-8, IL-10 and GM-CSF. This finding might have an impact on interpreting previous and designing future trials on diagnosing and modulating immunoendocrine dysregulation (e.g. adrenal insufficiency) during systemic inflammation and sepsis.
Health economics of Patient Blood Management: a cost‐benefit analysis based on a meta‐analysis
(2019)
Background and Objectives: Patient Blood Management (PBM) is the timely application of evidence‐based medical and surgical concepts designed to improve haemoglobin concentration, optimize haemostasis and minimize blood loss in an effort to improve patient outcomes. The focus of this cost‐benefit analysis is to analyse the economic benefit of widespread implementation of a multimodal PBM programme.
Materials and Methods: Based on a recent meta‐analysis including 17 studies (>235 000 patients) comparing PBM with control care and data from the University Hospital Frankfurt, a cost‐benefit analysis was performed. Outcome data were red blood cell (RBC) transfusion rate, number of transfused RBC units, and length of hospital stay (LOS). Costs were considered for the following three PBM interventions as examples: anaemia management including therapy of iron deficiency, use of cell salvage and tranexamic acid. For sensitivity analysis, a Monte Carlo simulation was performed.
Results: Iron supplementation was applied in 3·1%, cell salvage in 65% and tranexamic acid in 89% of the PBM patients. In total, applying these three PBM interventions cost €129·04 per patient. However, PBM was associated with a reduction in transfusion rate, transfused RBC units per patient and LOS, which yielded mean savings of €150·64 per patient. Thus, the overall benefit of PBM implementation was €21·60 per patient. In the Monte Carlo simulation, the cost savings on the outcome side exceeded the PBM costs in approximately two-thirds of all repetitions, and the total benefit was €1 878 000 in 100 000 simulated patients.
Conclusion: A multimodal PBM concept optimizing patient care and safety can be implemented cost-effectively.
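The per-patient arithmetic and the Monte Carlo check described above can be sketched as follows. The intervention cost and mean savings are taken from the abstract, while the spread of the savings distribution is an invented assumption for illustration (the study's actual distribution is not given here):

```python
import random

random.seed(1)  # reproducible illustration

PBM_COST = 129.04      # EUR per patient for the three interventions (from the abstract)
SAVINGS_MEAN = 150.64  # EUR mean savings per patient (from the abstract)
SAVINGS_SD = 40.0      # assumed spread of savings; not reported in the abstract

# Deterministic net benefit per patient:
net_benefit = SAVINGS_MEAN - PBM_COST
print(round(net_benefit, 2))  # 21.6 EUR per patient

# Monte Carlo: in what fraction of repetitions do savings exceed costs?
draws = [random.gauss(SAVINGS_MEAN, SAVINGS_SD) for _ in range(100_000)]
frac = sum(d > PBM_COST for d in draws) / len(draws)
print(round(frac, 2))  # roughly 0.7 with these assumed parameters
```

With the assumed spread, savings exceed costs in roughly two-thirds of repetitions, which mirrors the qualitative result of the published simulation.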
Background During gram-negative sepsis, lipopolysaccharide (LPS) induces tissue factor expression on monocytes. The resulting disseminated intravascular coagulation leads to tissue ischemia and worsens the prognosis of septic patients. There are indications that fever reduces the mortality of sepsis; its effect on tissue factor activity on monocytes is unknown. Therefore, we investigated whether heat shock modulates LPS-induced tissue factor activity in human blood. Methods Whole blood samples and leukocyte suspensions, respectively, from healthy volunteers (n = 12) were incubated with LPS for 2 hours under heat shock conditions (43°C) or control conditions (37°C). After a further 3 hours of incubation at 37°C, the clotting time, a measure of tissue factor expression, was determined. Cell integrity was verified by trypan blue exclusion test and FACS analysis. Results Incubation of whole blood samples with LPS for 5 hours at normothermia resulted in a significant shortening of clotting time from 357 ± 108 sec to 82 ± 8 sec compared to samples incubated without LPS (n = 12; p < 0.05). This LPS effect was mediated by tissue factor, as inhibition with active site-inhibited factor VIIa (ASIS) abolished the effect of LPS on clotting time. Blockade of protein synthesis using cycloheximide demonstrated that LPS exerted its procoagulant effect via an induction of tissue factor expression. Upon heat shock treatment, the LPS effect was blunted: clotting times were 312 ± 66 s in the absence of LPS and 277 ± 65 s in the presence of LPS (n = 8; p > 0.05). Similarly, heat shock treatment of leukocyte suspensions abolished the LPS-induced tissue factor activity. Clotting time was 73 ± 31 s when cells were treated with LPS (100 ng/mL) under normothermic conditions, and 301 ± 118 s when treated with LPS (100 ng/mL) and heat shock (n = 8, p < 0.05). Control experiments excluded cell damage as a potential cause of the observed heat shock effect.
Conclusion Heat shock treatment inhibits LPS-induced tissue factor activity in human whole blood samples and isolated leukocytes.
Background: Point-of-care devices for targeted coagulation substitution in bleeding patients have become increasingly important in recent years. New on the market is the Quantra, a device that uses sonorheometry (sonic estimation of elasticity via resonance), a novel ultrasound-based technology that measures the viscoelastic properties of whole blood. Several studies have already shown the comparability of the Quantra with devices established on the market, such as the rotational thromboelastometry (ROTEM) device.
Objective: In contrast to existing studies, this is the first prospective interventional study using this new system in a cardiac surgical patient cohort. We will investigate the noninferiority of a new algorithm based on the Quantra system compared with an existing coagulation algorithm based on the ROTEM/Multiplate system for the treatment of coagulopathic cardiac surgical patients.
Methods: The study is divided into two phases. In an initial observation phase, whole blood samples of 20 patients obtained at three defined time points (prior to surgery, after completion of cardiopulmonary bypass, and on arrival in the intensive care unit) will be analyzed using both the ROTEM/Multiplate and Quantra systems. The obtained threshold values will be used to develop a novel algorithm for hemotherapy. In a second intervention phase, the new algorithm will be tested for noninferiority against an algorithm used routinely for years in our department.
Results: The primary outcome of the study is the cumulative blood loss within 24 hours after surgery. Statistical calculations based on the literature and in-house data suggest that the new algorithm is noninferior if the difference in cumulative blood loss is <150 mL/24 hours.
Conclusions: Given the comparability of the Quantra sonorheometry system with the ROTEM measurement methods, the existing hemotherapy treatment algorithm can be adapted to the Quantra device once noninferiority is demonstrated.
Trial Registration: ClinicalTrials.gov NCT03902275; https://clinicaltrials.gov/ct2/show/NCT03902275
International Registered Report Identifier (IRRID): DERR1-10.2196/17206
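A normal-approximation check of the <150 mL/24 h noninferiority margin described above could look like the sketch below. The blood-loss samples are invented for illustration, and the trial's actual statistical analysis may differ:

```python
import statistics

MARGIN_ML = 150.0  # noninferiority margin for cumulative 24 h blood loss

def noninferior(new_alg, old_alg, margin=MARGIN_ML):
    """True if the upper 95% CI bound of the mean blood-loss difference
    (new - old) lies below the margin; simple normal approximation (z = 1.96)."""
    diff = statistics.mean(new_alg) - statistics.mean(old_alg)
    se = (statistics.variance(new_alg) / len(new_alg)
          + statistics.variance(old_alg) / len(old_alg)) ** 0.5
    return diff + 1.96 * se < margin

# Hypothetical cumulative blood loss (mL/24 h) under each algorithm:
quantra_arm = [400, 420, 410, 430, 415]
rotem_arm = [405, 418, 412, 425, 410]
print(noninferior(quantra_arm, rotem_arm))  # True: upper CI bound is well below 150 mL
```

The design choice here is the usual one for noninferiority: instead of testing for equality, the entire confidence interval of the difference must stay below the pre-specified margin.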
High sedation needs of critically ill COVID-19 ARDS patients - a monocentric observational study
(2021)
Background: Therapy of severely affected coronavirus patients requiring intubation and sedation is still challenging. Recently, difficulties in sedating these patients have been discussed. This study aims to describe sedation practices in patients with coronavirus disease 2019 (COVID-19)-induced acute respiratory distress syndrome (ARDS). Methods: We performed a retrospective monocentric analysis of sedation regimens in critically ill, intubated patients with respiratory failure who required sedation in our mixed 32-bed university intensive care unit. All mechanically ventilated adults with COVID-19-induced ARDS requiring continuously infused sedative therapy admitted between April 4, 2020 and June 30, 2020 were included. We recorded demographic data, sedative dosages, prone positioning, sedation levels and duration. Descriptive data analysis was performed; for additional analysis, a logistic regression with mixed effects was used. Results: In total, 56 patients (mean age 67 (±14) years) were included. The mean observed sedation period was 224 (±139) hours. To achieve the prescribed sedation level, we observed the need for two or three sedatives in 48.7% and 12.8% of the cases, respectively. In cases with a triple sedation regimen, the combination of clonidine, esketamine and midazolam was observed most often (75.7%). Analgesia was achieved using sufentanil in 98.6% of the cases. The analysis showed that the majority of COVID-19 patients required unusually high sedation doses compared to those reported in the literature. Conclusion: The global pandemic continues to affect patients severely requiring ventilation and sedation, but optimal sedation strategies are still lacking. The findings of our observation suggest unusually high dosages of sedatives in mechanically ventilated patients with COVID-19.
Prescribed sedation levels appear to be achievable only with combinations of several sedatives in most critically ill patients suffering from COVID-19-induced ARDS, and a potential association with the often-required sophisticated critical care, including prone positioning and ECMO treatment, seems conceivable.
Background: Mild therapeutic hypothermia following cardiac arrest is neuroprotective, but its effect on myocardial dysfunction, a critical issue following resuscitation, is not clear. This study sought to examine whether hypothermia and the combination of hypothermia and pharmacological postconditioning are cardioprotective in a model of cardiopulmonary resuscitation following acute myocardial ischemia. Methodology/Principal Findings: Thirty pigs (28–34 kg) were subjected to cardiac arrest following left anterior descending coronary artery ischemia. After 7 minutes of ventricular fibrillation and 2 minutes of basic life support, advanced cardiac life support was started according to the current AHA guidelines. After successful return of spontaneous circulation (n = 21), coronary perfusion was reestablished after 60 minutes of occlusion, and animals were randomized to either normothermia at 38°C, hypothermia at 33°C or hypothermia at 33°C combined with sevoflurane (each group n = 7) for 24 hours. The effects on cardiac damage, especially on inflammation, apoptosis and remodeling, were studied using cellular and molecular approaches. Five animals were sham operated. Animals treated with hypothermia had lower troponin T levels (p<0.01), reduced infarct size (34±7 versus 57±12%; p<0.05) and improved left ventricular function compared to normothermia (p<0.05). Hypothermia was associated with a reduction in: (i) immune cell infiltration, (ii) apoptosis, (iii) IL-1beta and IL-6 mRNA up-regulation, and (iv) IL-1beta protein expression (p<0.05). Moreover, decreased matrix metalloproteinase-9 activity was detected in the ischemic myocardium after treatment with mild hypothermia. Sevoflurane conferred additional protective effects, although statistical significance was not reached.
Conclusions/Significance: Hypothermia reduced myocardial damage and dysfunction after cardiopulmonary resuscitation, possibly via a reduced rate of apoptosis and pro-inflammatory cytokine expression.
Introduction: Sepsis remains associated with a high mortality rate. Endotoxin has been shown to influence viscoelastic coagulation parameters, thus suggesting a link between endotoxin levels and the altered coagulation phenotype in septic patients. This study evaluated the effects of systemic polyspecific IgM-enriched immunoglobulin (IgM-IVIg) (Pentaglobin® [Biotest, Dreieich, Germany]) on endotoxin activity (EA), inflammatory markers, viscoelastic and conventional coagulation parameters.
Methods: Patients with severe sepsis were identified by daily screening in a tertiary, academic, surgical ICU. After the inclusion of 15 patients, the application of IgM-IVIg (5 mg/kg/d over three days) was integrated into the unit’s standard operation procedure (SOP) to treat patients with severe sepsis, thereby generating “control” and “IgM-IVIg” groups. EA assays, thrombelastometry (ROTEM®) and impedance aggregometry (Multiplate®) were performed on whole blood. Furthermore, routine laboratory parameters were determined according to unit’s standards.
Results: Data from 26 patients were included. On day 1, EA was significantly decreased in the IgM-IVIg group following 6 and 12 hours of treatment (0.51 ±0.06 vs. 0.26 ±0.07, p<0.05 and 0.51 ±0.06 vs. 0.25 ±0.04, p<0.05) and differed significantly compared with the control group following 6 hours of treatment (0.26 ±0.07 vs. 0.43 ±0.07, p<0.05). The platelet count was significantly higher in the IgM-IVIg group following four days of IgM-IVIg treatment (200/nl ±43 vs. 87/nl ±20, p<0.05). The fibrinogen concentration was significantly lower in the control group on day 2 (311 mg/dl ±37 vs. 475 mg/dl ±47 (p = 0.015)) and day 4 (307 mg/dl ±35 vs. 420 mg/dl ±16 (p = 0.017)). No differences in thrombelastometric or aggregometric measurements, or inflammatory markers (interleukin-6 (IL-6), leukocyte, lipopolysaccharide binding protein (LBP)) were observed.
Conclusion: Treatment with IgM-enriched immunoglobulin attenuates the EA levels in patients with severe sepsis and might have an effect on septic thrombocytopenia and fibrinogen depletion. Viscoelastic, aggregometric or inflammatory parameters were not influenced.
Background: Transfusion of red blood cells (RBC) in patients undergoing major elective cranial surgery is associated with increased morbidity, mortality and prolonged hospital length of stay (LOS). This retrospective single-center study aims to identify the impact of RBC transfusions on skull-base and non-skull-base meningioma patients, including the identification of risk factors for RBC transfusion.
Methods: From October 2009 to October 2016, we retrospectively analyzed 423 patients undergoing surgery for primary meningioma resection at our department.
Results: Of these 423 patients, 68 (16.1%) received RBC transfusion and 355 (83.9%) did not receive RBC units. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) compared to patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, postoperative complications as well as hospital LOS were significantly higher (p < 0.0001) compared to non-transfused patients. After multivariate analysis, risk factors for RBC transfusion were preoperative American Society of Anesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.001). Kaplan-Meier curves revealed a significant influence on overall survival of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative Karnofsky Performance Status (KPS) ≤ 60% and age (elderly, ≥ 75 years).
Conclusion: We conclude that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in multivariate analysis. Further studies evaluating the impact of preoperative anaemia management on the reduction of RBC transfusion are needed to improve clinical outcomes of meningioma patients.
Introduction: Balanced fluid replacement solutions can possibly reduce the risks of electrolyte imbalances, acid-base imbalances and thus renal failure. To assess the intraoperative change of base excess (BE) and serum chloride after treatment with either a balanced gelatine/electrolyte solution or a non-balanced gelatine/electrolyte solution, a prospective, controlled, randomized, double-blind, dual-centre phase III study was conducted in two tertiary care university hospitals in Germany.
Material and methods: 40 patients of both sexes, aged 18 to 90 years, who were scheduled to undergo elective abdominal surgery with an assumed intraoperative volume requirement of at least 15 mL/kg body weight gelatine solution were included. The study drug was administered intravenously according to the patients' needs. The trigger for volume replacement was a central venous pressure (CVP) minus positive end-expiratory pressure (PEEP) <10 mmHg (CVP <10 mmHg). The crystalloid:colloid ratio was 1:1 intra- and postoperatively. The targets for volume replacement were a CVP minus PEEP between 10 and 14 mmHg after treatment with a vasoactive agent and a mean arterial pressure (MAP) > 65 mmHg.
Results: The primary endpoints, intraoperative changes of base excess (–2.59 ± 2.25 (median: –2.65) mmol/L in the balanced group and –4.79 ± 2.38 (median: –4.70) mmol/L in the non-balanced group) and serum chloride (2.4 ± 1.9 (median: 3.0) mmol/L and 5.2 ± 3.1 (median: 5.0) mmol/L, respectively), were significantly different (p = 0.0117 and p = 0.0045, respectively). In both groups (each n = 20), the administration of the investigational product in terms of volume and infusion rate was comparable throughout the course of the study, i.e. before, during and after surgery.
Discussion: Balanced gelatine solution 4% combined with a balanced electrolyte solution demonstrated a significantly smaller impact on blood gas analysis parameters in the primary endpoints BE and serum chloride when compared to a non-balanced gelatine solution 4% combined with NaCl 0.9%. No marked treatment differences were observed with respect to haemodynamics, coagulation and renal function.
Trial registration: ClinicalTrials.gov (NCT01515397) and clinicaltrialsregister.eu, EudraCT number 2010-018524-58.
Background: SARS-CoV-2 has massively changed the care situation in hospitals worldwide. Although tumour care should not be affected, initial reports from European countries suggested a decrease in skin cancer cases during the first pandemic wave, and only limited data are available for the period thereafter.
Objectives: The aim of this study was to investigate skin cancer cases and surgeries in a nationwide inpatient dataset in Germany.
Methods: Comparative analyses were performed in a prepandemic (18 March 2019 until 17 March 2020) and a pandemic cohort (18 March 2020 until 17 March 2021). Cases were identified and analysed using WHO International Classification of Diseases codes (ICD) and procedure classification codes (OPS).
Results: Comparing the first year of the pandemic with the same period 1 year before, a persistent decrease of 14% in skin cancer cases (n = 19 063) was observed. The largest decrease of 24% was seen in non-invasive in situ tumours (n = 1665), followed by non-melanoma skin cancer (NMSC) with a decrease of 16% (n = 15 310) and malignant melanoma (MM) with a reduction of 7% (n = 2088). Subgroup analysis showed significant differences in the distribution of sex, age, hospital carrier type and hospital volume. There was a decrease of 17% in surgical procedures (n = 22 548), which was more pronounced in minor surgical procedures with a decrease of 24.6% compared to extended skin surgery including micrographic surgery with a decrease of 15.9%.
Conclusions: Hospital admissions and surgical procedures decreased persistently since the beginning of the pandemic in Germany for skin cancer patients. The higher decrease in NMSC cases compared to MM might reflect a prioritization effect. Further evidence from tumour registries is needed to investigate the consequences of the therapy delay and identify the upcoming challenges in skin cancer care.
Background: Approximately one in three patients suffers from preoperative anaemia. Even though haemoglobin is measured before surgery, anaemia management is not implemented in every hospital. Objective: Here, we demonstrate the implementation of an anaemia walk-in clinic at an orthopaedic university hospital. To improve the diagnosis of iron deficiency (ID), we examined whether reticulocyte haemoglobin (Ret-He) could be a useful additional parameter. Material and Methods: In August 2019, an anaemia walk-in clinic was established. Between September and December 2019, major orthopaedic surgical patients were screened for preoperative anaemia. The primary endpoint was the incidence of preoperative anaemia. Secondary endpoints included Ret-He level, red blood cell (RBC) transfusion rate, in-hospital length of stay and anaemia at hospital discharge. Results: A total of 104 patients were screened for anaemia. The preoperative anaemia rate was 20.6%. Intravenous iron was supplemented in 23 patients. Transfusion of RBC units per patient (1.7 ± 1.2 vs. 0.2 ± 0.9; p = 0.004) and hospital length of stay (13.1 ± 4.8 days vs. 10.6 ± 5.1 days; p = 0.068) were increased in anaemic patients compared to non-anaemic patients. Ret-He values were significantly lower in patients with ID anaemia (33.3 pg [28.6–40.2 pg]) compared to patients with ID (35.3 pg [28.9–38.6 pg]; p = 0.015) or patients without anaemia (35.4 pg [30.2–39.4 pg]; p = 0.001). Conclusion: Preoperative anaemia is common in orthopaedic patients. Our results demonstrated the feasibility of an anaemia walk-in clinic to manage preoperative anaemia. Furthermore, our analysis supports the use of Ret-He as an additional parameter for the diagnosis of ID in surgical patients.
Background: Following elective craniotomy, patients routinely receive monitoring on the ICU. However, the benefit of ICU monitoring in these patients is controversial, and the current COVID-19 pandemic further limits ICU capacities. This study aimed to compare a ward-based postoperative strategy with standardized ICU management of post-craniotomy patients.
Methods: Two postoperative strategies were compared in a matched-pair analysis: the first cohort included patients treated between May and August 2021 according to the “No ICU – unless” concept (NIU group), in which patients were managed on the normal ward postoperatively. The second cohort contained patients routinely admitted to the ICU between February and April 2021 (control group). Outcome parameters included complications, length of stay, time to first postoperative mobilization, number of unplanned imaging studies, number/type of ICU interventions and pre- and postoperative mRS. Patient characteristics were analyzed using electronic medical records.
Results: The NIU group consisted of 96 patients, the control group of 75 patients. Complication rates were comparable in both cohorts (16% in NIU vs. 17% in control; p=0.123). The groups did not differ significantly in the number of imaging studies (10% in NIU vs. 13% in control; p=0.67), in the type of interventions on the ICU (antihypertensive therapy 5% (NIU) vs. 6% (control); p=0.825) or in the time to first postoperative mobilization (average 1.1 ± 1.6 days vs. 0.9 ± 1.2 days; p=0.402). Length of hospital stay was shorter in the NIU group without reaching statistical significance (average 5.8 vs. 6.8 days; p=0.481). There was no significant change in the distribution of preoperative (p=0.960) and postoperative (p=0.425) mRS scores.
Conclusion: Postoperative ICU management does not reduce postoperative complications and has no effect on the surgical outcome of elective craniotomies. The majority of postoperative complications are detected after a 24-hour observation period. This approach may represent a potential strategy to prevent overutilization of ICU capacities while maintaining sufficient postoperative care for neurosurgical patients.
Introduction: Hip fracture surgery is associated with high in-hospital and 30-day mortality rates and serious adverse patient outcomes. Evidence from randomised controlled trials regarding the effectiveness of spinal versus general anaesthesia on patient-centred outcomes after hip fracture surgery is sparse.
Methods and analysis: The iHOPE study is a pragmatic national, multicentre, randomised controlled, open-label clinical trial with a two-arm parallel group design. In total, 1032 patients with hip fracture (>65 years) will be randomised in an intended 1:1 allocation ratio to receive spinal anaesthesia (n=516) or general anaesthesia (n=516). Outcome assessment will occur in a blinded manner in-hospital and after hospital discharge. The primary endpoint will be assessed by telephone interview and comprises the time to the first occurring event of the binary composite outcome of all-cause mortality or new-onset serious cardiac and pulmonary complications within 30 postoperative days. In-hospital secondary endpoints, assessed via in-person interviews and medical record review, include mortality, perioperative adverse events, delirium, satisfaction, walking independently, length of hospital stay and discharge destination. Telephone interviews will be performed for long-term endpoints (all-cause mortality, independence in walking, chronic pain, ability to return home, cognitive function, and overall health and disability) at postoperative day 30±3, 180±45 and 365±60.
Ethics and dissemination: iHOPE has been approved by the leading Ethics Committee of the Medical Faculty of the RWTH Aachen University on 14 March 2018 (EK 022/18). Approval from all other involved local Ethics Committees was subsequently requested and obtained. The study started in April 2018 with a total recruitment period of 24 months. iHOPE will be disseminated via presentations at national and international scientific meetings or conferences and publication in peer-reviewed international scientific journals.
Trial registration number: DRKS00013644; Pre-results
In-line filtration of intravenous infusion may reduce organ dysfunction of adult critical patients
(2019)
Background: The potential harmful effects of particle-contaminated infusions for critically ill adult patients are still unclear. So far, a significantly improved outcome with in-line filters has been demonstrated only in critically ill children and newborns; for adult patients, evidence is still missing.
Methods: This single-centre, retrospective controlled cohort study assessed the effect of in-line filtration of intravenous fluids with finer 0.2 or 1.2 μm vs 5.0 μm filters in critically ill adult patients. From a total of n = 3215 adult patients, n = 3012 patients were selected by propensity score matching (adjusting for sex, age, and surgery group) and assigned to either a fine filter cohort (with 0.2/1.2 μm filters, n = 1506, time period from February 2013 to January 2014) or a control filter cohort (with 5.0 μm filters, n = 1506, time period from April 2014 to March 2015). The cohorts were compared regarding the occurrence of severe vasoplegia, organ dysfunctions (lung, kidney, and brain), inflammation, in-hospital complications (myocardial infarction, ischemic stroke, pneumonia, and sepsis), in-hospital mortality, and length of ICU and hospital stay.
Results: Comparing the fine filter vs the control filter cohort, respiratory dysfunction (Horowitz index 206 (119–290) vs 191 (104.75–280); P = 0.04), pneumonia (11.4% vs 14.4%; P = 0.02), sepsis (9.6% vs 12.2%; P = 0.03), interleukin-6 (471.5 (258.8–1062.8) ng/l vs 540.5 (284.5–1147.5) ng/l; P = 0.01), length of ICU stay (1.2 (0.6–4.9) vs 1.7 (0.8–6.9) days; P < 0.01) and length of hospital stay (14.0 (9.2–22.2) vs 14.8 (10.0–26.8) days; P = 0.01) were reduced. Rates of severe vasoplegia (21.0% vs 19.6%; P > 0.20) and acute kidney injury (11.8% vs 13.7%; P = 0.11) were not significantly different between the cohorts.
Conclusions: In-line filtration with finer 0.2 and 1.2 μm filters may be associated with less organ dysfunction and less inflammation in critically ill adult patients.
Trial registration: The study was registered at ClinicalTrials.gov (number: NCT02281604).
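The cohorts above were paired by propensity score matching (adjusting for sex, age, and surgery group). A common way to realize such a design is greedy 1:1 nearest-neighbour matching on a precomputed propensity score within a caliper; the sketch below is purely illustrative (the IDs, scores, and caliper are invented, not study data):

```python
# Minimal sketch of greedy 1:1 nearest-neighbour propensity matching.
# Scores would normally come from a logistic regression of treatment
# assignment on the covariates (sex, age, surgery group).

def greedy_match(treated, controls, caliper=0.05):
    """Pair each treated unit with the closest unused control.

    treated, controls: lists of (id, propensity_score) tuples.
    Returns a list of (treated_id, control_id) pairs; units with no
    control within the caliper stay unmatched.
    """
    available = dict(controls)          # id -> score, still unmatched
    pairs = []
    # Matching order can matter; here we go through treated units
    # sorted by score for determinism.
    for t_id, t_score in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

# Toy example: 3 treated and 4 control units.
treated = [("t1", 0.30), ("t2", 0.52), ("t3", 0.90)]
controls = [("c1", 0.28), ("c2", 0.55), ("c3", 0.60), ("c4", 0.10)]
print(greedy_match(treated, controls))  # -> [('t1', 'c1'), ('t2', 'c2')]
```

Note that "t3" remains unmatched because no control score lies within the caliper, which is exactly how caliper matching trades sample size for covariate balance.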
Background: Paediatric patients are vulnerable to blood loss, and even a small loss of blood can be associated with severe shock. In emergency situations, a red blood cell (RBC) transfusion may become unavoidable, although it is associated with various risks. The aim of this trial was to identify independent risk factors for perioperative RBC transfusion in children undergoing surgery. Methods: To identify independent risk factors for perioperative RBC transfusion in children undergoing surgery and to assess RBC transfusion rates and in-hospital outcomes (e.g., length of stay, mortality, and typical postoperative complication rates), a monocentric, retrospective, observational study was conducted. Descriptive, univariate, and multivariate analyses were performed. Results: Between 1 January 2010 and 31 December 2019, data from n = 14,248 cases were identified at the centre. Analysis revealed an RBC transfusion rate of 10.1% (n = 1439) in the entire cohort. The independent predictors of RBC transfusion were the presence of preoperative anaemia (p < 0.001; OR = 15.10 with preoperative anaemia and OR = 2.40 without preoperative anaemia), younger age (p < 0.001; ORs between 0.14 and 0.28 for children older than 0 years), female gender (p = 0.036; OR = 1.19 compared to male gender), certain types of surgery (e.g., neurosurgery (p < 0.001; OR = 10.14), vascular surgery (p < 0.001; OR = 9.93), cardiac surgery (p < 0.001; OR = 4.79), gynaecology (p = 0.014; OR = 3.64), and visceral surgery (p < 0.001; OR = 2.48)), and the presence of postoperative complications (e.g., sepsis (p < 0.001; OR = 10.16), respiratory dysfunction (p < 0.001; OR = 7.56), cardiovascular dysfunction (p < 0.001; OR = 4.68), neurological dysfunction (p = 0.029; OR = 1.77), and renal dysfunction (p < 0.001; OR = 16.17)).
Conclusion: Preoperative anaemia, younger age, female gender, certain types of surgery, and postoperative complications are independent predictors of RBC transfusion in children undergoing surgery. Future prospective studies are urgently required to identify, in detail, the potential risk factors and the impact of RBC transfusion in children.
Introduction: Organ dysfunction or failure after the first days of ICU treatment, and the subsequent mortality with respect to the type of ICU admission, is poorly elucidated. We therefore analysed the association between ICU mortality and admission type (medical (M), scheduled surgery (ScS) or unscheduled surgery (US)), as reflected by the occurrence of organ dysfunction/failure (OD/OF) after the first 72 h of ICU stay.
Methods: For this retrospective cohort study (23,795 patients from the registry of the German Interdisciplinary Association for Intensive Care Medicine (DIVI)), organ dysfunction or failure was derived from the Sequential Organ Failure Assessment (SOFA) score (excluding the Glasgow Coma Scale). SOFA scores were collected on admission to the ICU and 72 h later. For patients with a length of stay of at least five days, a multivariate analysis was performed for the individual OD/OF present on day three.
Results: M patients had the lowest prevalence of cardiovascular failure (M 31%; ScS 35%; US 38%) and the highest prevalence of respiratory (M 24%; ScS 13%; US 17%) and renal failure (M 10%; ScS 6%; US 7%). The risk of death was highest for M and ScS patients with respiratory failure (OR: M 2.4; ScS 2.4; US 1.4) and for surgical patients with renal failure (OR: M 1.7; ScS 2.7; US 2.4).
Conclusion: The dynamic evolution of OD/OF within 72 h after ICU admission, and the associated mortality, differed between patients depending on their type of admission. This has to be considered to exclude a systematic bias in multi-centre trials.
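The study above derives organ dysfunction/failure from the SOFA sub-scores (excluding the GCS). As an illustration, the respiratory component can be computed from the PaO2/FiO2 (Horowitz) ratio; the function below follows the standard SOFA cut-offs and is a hedged sketch, not the registry's actual scoring code:

```python
# Respiratory component of the SOFA score from the Horowitz index
# (PaO2/FiO2). Scores of 3-4 are commonly read as organ failure,
# 1-2 as organ dysfunction.

def sofa_respiratory(pao2_mmhg, fio2, ventilated=False):
    """Return the SOFA respiratory sub-score (0-4)."""
    ratio = pao2_mmhg / fio2  # Horowitz index in mmHg
    if ratio < 100 and ventilated:
        return 4
    if ratio < 200 and ventilated:
        return 3
    if ratio < 300:
        return 2
    if ratio < 400:
        return 1
    return 0

print(sofa_respiratory(80, 1.0, ventilated=True))   # prints 4 (failure)
print(sofa_respiratory(210, 0.5))                   # ratio 420, prints 0
```

The full SOFA score sums six such sub-scores (respiration, coagulation, liver, cardiovascular, CNS, renal); the study omitted the CNS (GCS) component.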
Purpose: Anaemia is one of the leading causes of death among severely injured patients. It is also known to increase the risk of death and prolong the length of hospital stay in various surgical groups. The main objective of this study was to analyse the anaemia rate on admission to the emergency department and the impact of anaemia on in-hospital mortality.
Methods: Data from the TraumaRegister DGU® (TR-DGU) between 2015 and 2019 were analysed. Inclusion criteria were age ≥ 16 years and most severe Abbreviated Injury Scale (AIS) score ≥ 3. Patients were divided into three anaemia subgroups: no or mild anaemia (NA), moderate anaemia (MA) and severe anaemia (SA). Pre-hospital data, patient characteristics, treatment in the emergency room (ER), outcomes, and differences between trauma centres were analysed.
Results: Of 67,595 patients analysed, 94.9% (n = 64,153) exhibited no or mild anaemia (Hb ≥ 9 g/dl), 3.7% (n = 2478) displayed moderate anaemia (Hb 7–8 g/dl) and 1.4% (n = 964) presented with severe anaemia (Hb < 7 g/dl). Haemoglobin (Hb) values ranged from 3 to 18 g/dl, with a mean of 12.7 g/dl. In surviving patients, anaemia was associated with a prolonged length of stay (LOS). Multivariate logistic regression analyses revealed moderate (p < 0.001; OR 1.88 (1.66–2.13)) and severe anaemia (p < 0.001; OR 4.21 (3.46–5.12)) to be independent predictors of mortality. Further significant predictors were ISS per point (OR 1.0), age 70–79 (OR 4.8), age > 80 (OR 12.0), severe pre-existing conditions (ASA 3/4) (OR 2.26), severe head injury (AIS 5/6) (OR 4.8), penetrating trauma (OR 1.8), unconsciousness (OR 4.8), shock (OR 2.2) and pre-hospital intubation (OR 1.6).
Conclusion: The majority of severely injured patients are admitted to the ER without anaemia. Injury-associated moderate and severe anaemia is an independent predictor of mortality in severely injured patients.
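For readers unfamiliar with the effect measures reported above: an unadjusted odds ratio with a 95% confidence interval can be computed from a 2x2 table as sketched below. The counts are invented for illustration; the study's ORs come from multivariate logistic regression, which additionally adjusts each estimate for the other predictors.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Wald 95% CI from a 2x2 table.

    a = exposed deaths, b = exposed survivors,
    c = unexposed deaths, d = unexposed survivors.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 40/100 deaths among anaemic patients,
# 100/600 among non-anaemic patients.
or_, lo, hi = odds_ratio_ci(40, 60, 100, 500)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 3.33 (95% CI 2.12-5.25)
```

An interval that excludes 1.0, as here and as in the study's estimates for moderate and severe anaemia, indicates a statistically significant association at the 5% level.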
Background: Clonidine effectively decreases perioperative mortality by reducing sympathetic tone. However, clonidine might also restrict anaemia tolerance by impairing compensatory mechanisms. We therefore assessed the influence of clonidine-induced, short-term sympathicolysis on anaemia tolerance in anaesthetized pigs and measured the potential of macrohaemodynamic alterations to constrain the compensatory mechanisms of acute anaemia.
Methods: After governmental approval, 14 anaesthetized pigs of either sex (Deutsche Landrasse, weight (mean ± SD) 24.1 ± 2.4 kg) were randomly assigned to intravenous saline or clonidine treatment (bolus: 20 μg · kg−1, continuous infusion: 15 μg · kg−1 · h−1). Thereafter, the animals were hemodiluted by exchange of whole blood for 6% hydroxyethyl starch (MW 130,000/0.4) until the individual critical haemoglobin concentration (Hbcrit) was reached. Primary outcome parameters were Hbcrit and the exchangeable blood volume (EBV) until Hbcrit was reached.
Results: Hbcrit did not differ between the groups (values are median [interquartile range]: saline: 2.2 (2.0–2.5) g · dL−1 vs. clonidine: 2.1 (2.1–2.4) g · dL−1; n.s.). Furthermore, there was no difference in exchangeable blood volume (EBV) between the groups (saline: 88 (76–106) mL · kg−1 vs. clonidine: 92 (85–95) mL · kg−1; n.s.).
Conclusion: Anaemia tolerance was not affected by clonidine-induced sympathicolysis. Consequently, perioperative clonidine administration probably does not need to be withheld in anticipation of acute anaemia.
Introduction: It has been proposed that individual genetic variation contributes to the course of severe infections and sepsis. Recent studies of single nucleotide polymorphisms (SNPs) within the endotoxin receptor and its signaling system showed an association with the risk of disease development. This study aimed to examine the effect of genetic variations in TLR4, the receptor for bacterial LPS, and in a central intracellular signal transducer (TIRAP/Mal) on cytokine release, and on susceptibility to and the course of severe hospital-acquired infections in distinct patient populations. Methods: Three intensive care units in tertiary care university hospitals in Greece and Germany participated. In total, 375 general surgical patients, 415 cardiac surgery patients and 159 patients with ventilator-associated pneumonia (VAP) were included. TLR4 and TIRAP/Mal polymorphisms in the 375 general surgical patients were tested for association with the risk of infection, clinical course and outcome. In two prospective studies, the 415 patients following cardiac surgery and the 159 patients with newly diagnosed VAP, predominantly caused by Gram-negative bacteria, were studied for cytokine levels in vivo and after ex vivo monocyte stimulation, and for clinical course. Results: Patients simultaneously carrying polymorphisms in TIRAP/Mal and TLR4 and patients homozygous for the TIRAP/Mal SNP had a significantly higher risk of severe infections after surgery (odds ratio (OR) 5.5; confidence interval (CI): 1.34–22.64; P = 0.02 and OR 7.3; CI: 1.89–28.50; P < 0.01, respectively). Additionally, we found significantly lower circulating cytokine levels in double-mutant individuals with ventilator-associated pneumonia and reduced cytokine production in an ex vivo monocyte stimulation assay, but this difference was not apparent in TIRAP/Mal-homozygous patients. In cardiac surgery patients without infection, the cytokine release profiles were not changed when comparing different genotypes.
Conclusions: Carriers of mutations in sequential components of the TLR signaling system may have an increased risk of severe infections. Patients with this genotype showed a decrease in cytokine release when infected, which was not apparent in patients with sterile inflammation following cardiac surgery.
5-Lipoxygenase (5-LO) is the key enzyme in the formation of pro-inflammatory leukotrienes (LTs), which play an important role in a number of inflammatory diseases. Accordingly, 5-LO inhibitors are frequently used to study the role of 5-LO and LTs in models of inflammation and cancer. Interestingly, the therapeutic efficacy of these inhibitors is highly variable. Here we show that the frequently used 5-LO inhibitors AA-861, BWA4C, C06, CJ-13,610 and the FDA-approved compound zileuton, as well as the pan-LO inhibitor nordihydroguaiaretic acid, interfere with prostaglandin E2 (PGE2) release into the supernatants of cytokine-stimulated (TNFα/IL-1β) HeLa cervix carcinoma, A549 lung cancer and HCA-7 colon carcinoma cells, with potencies similar to their LT-inhibitory activities (IC50 values ranging from 0.1–9.1 µM). In addition, AA-861, BWA4C, CJ-13,610 and zileuton concentration-dependently inhibited bacterial lipopolysaccharide-triggered prostaglandin (PG) release into human whole blood. Western blot analysis revealed that inhibition of the expression of enzymes involved in PG synthesis was not part of the underlying mechanism. Likewise, the liberation of arachidonic acid, the substrate for PG synthesis, as well as PGH2 and PGE2 formation, were not impaired by the compounds. However, accumulation of intracellular PGE2 was found in the inhibitor-treated HeLa cells, suggesting inhibition of PG export as the major mechanism. Further experiments showed that the PG exporter ATP-binding cassette transporter multidrug resistance protein 4 (MRP-4) is targeted by the inhibitors and may be involved in the 5-LO inhibitor-mediated PGE2 inhibition. In conclusion, the pharmacological effects of a number of 5-LO inhibitors are compound-specific and involve the potent inhibition of PGE2 export.
Results from experimental models of inflammation and pain that rely on 5-LO inhibitors may therefore be misleading, and the use of these compounds as pharmacological tools has to be revisited. In addition, 5-LO inhibitors may serve as new scaffolds for the development of potent prostaglandin export inhibitors.
Background: Extracorporeal life support (ECLS) has become an integral part of modern intensive care therapy. The choice of support mode depends largely on the indication. Patients with respiratory failure are predominantly treated with a venovenous (VV) approach. We hypothesized that mortality under ECLS therapy in Germany did not differ from previously reported values in the literature.
Methods: Inpatient data from Germany from 2007 to 2018, provided by the Federal Statistical Office of Germany, were analysed. The International Statistical Classification of Diseases and Related Health Problems (ICD) codes and procedure codes (OPS) for extracorporeal membrane oxygenation (ECMO) types, acute respiratory distress syndrome (ARDS) and hospital mortality were used.
Results: In total, 45,647 hospitalized patients treated with ECLS were analysed. In Germany, 231 hospitals provided ECLS therapy, with a median of 4 VV-ECMO and 9 VA-ECMO cases in 2018. Overall hospital mortality remained higher than the values reported in the literature. The number of VV-ECMO cases increased by 236%, from 825 in 2007 to 2768 in 2018. ARDS was the main indication for VV-ECMO in only 33% of patients in the past, but that proportion increased to 60% in 2018. VA-ECMO support is of minor importance in the treatment of ARDS in Germany. The age distribution of patients undergoing ECLS has shifted towards an older population. In 2018, hospital mortality decreased in VV-ECMO patients and in VV-ECMO patients with ARDS to 53.9% (n = 1493) and 54.4% (n = 926), respectively.
Conclusions: ARDS is a severe disease with a high mortality rate despite ECLS therapy. Although endpoints and timing of the evaluations differed from those of the CESAR and EOLIA studies and the Extracorporeal Life Support Organization (ELSO) Registry, the reported mortality in these studies was lower than in the present analysis. Further prospective analyses are necessary to evaluate outcomes in ECMO therapy at the centre volume level.
OBJECTIVE: The role of supraglottic airway devices in emergency airway management is highlighted in international airway management guidelines. We evaluated the application of the new generation laryngeal tube suction (LTS-II/LTS-D) in the management of in-hospital unexpected difficult airway and cardiopulmonary resuscitation.
METHODS: Over a seven-year period, we evaluated patients treated with a laryngeal tube who had an unexpected difficult airway (Cormack-Lehane grade 3-4) during routine anaesthesia, who underwent cardiopulmonary resuscitation, or who underwent cardiopulmonary resuscitation outside the operating room and had a difficult airway. Successful placement of the LTS-II/LTS-D, sufficient ventilation, time to placement, number of placement attempts, stomach content, the course of peripheral oxygen saturation and end-tidal carbon dioxide (SpO2/etCO2) over 5 minutes, subjective overall assessment and complications were recorded.
RESULTS: In total, 106 adult patients were treated using an LTS-II/LTS-D. The main indication for placement was a difficult airway (75%, n=80), followed by cardiopulmonary resuscitation (25%, n=26) or an overlap between both (18%, n=19). In 94% of patients (n=100), users placed the laryngeal tube during the first attempt. In 93% of patients (n=98), the tube was placed within 30 seconds. A significant increase in SpO2 from 97% (0-100) to 99% (5-100) was observed in the whole population and in cardiopulmonary resuscitation patients. The average initial etCO2 of 39.5 mmHg (0-100 mmHg) decreased significantly to an average of 38.4 mmHg (10-62 mmHg) after 5 minutes. A comparison of cardiopulmonary resuscitation patients with non-cardiopulmonary resuscitation patients regarding gastric contents showed no significant difference.
CONCLUSIONS: LTS-D/LTS-II use for in-hospital unexpected difficult airway management provides a secure method for primary airway management until other options such as video laryngoscopy or fiber optic intubation become available.
Background: Perioperative anaemia leads to impaired oxygen supply with a risk of vital organ ischaemia. In healthy and fit individuals, anaemia can be compensated by several mechanisms. Elderly patients, however, have fewer compensatory mechanisms because of multiple co-morbidities and an age-related decline of functional reserves. The purpose of the study is to evaluate whether elderly surgical patients may benefit from a liberal red blood cell (RBC) transfusion strategy compared to a restrictive transfusion strategy.
Methods: The LIBERAL Trial is a prospective, randomized, multicentre, controlled clinical phase IV trial randomising 2470 elderly (≥ 70 years) patients undergoing intermediate- or high-risk non-cardiac surgery. Registered patients will be randomised only if haemoglobin (Hb) reaches ≤ 9 g/dl during surgery or within 3 days after surgery, either to the LIBERAL group (transfusion of a single RBC unit when Hb ≤ 9 g/dl, with a target range for the post-transfusion Hb level of 9–10.5 g/dl) or to the RESTRICTIVE group (transfusion of a single RBC unit when Hb ≤ 7.5 g/dl, with a target range for the post-transfusion Hb level of 7.5–9 g/dl). The intervention per patient will be followed until hospital discharge or up to 30 days after surgery, whichever occurs first. The primary efficacy outcome is defined as a composite of all-cause mortality, acute myocardial infarction, acute ischaemic stroke, acute kidney injury (stage III), acute mesenteric ischaemia and acute peripheral vascular ischaemia within 90 days after surgery. Infections requiring intravenous antibiotics with re-hospitalisation are assessed as an important secondary endpoint. The primary endpoint will be analysed by logistic regression adjusting for age, cancer surgery (y/n) and type of surgery (intermediate- or high-risk), and incorporating centres as a random effect.
Discussion: The LIBERAL Trial will evaluate whether, compared to a restrictive strategy, a liberal transfusion strategy reduces the occurrence of major adverse events within 90 days after non-cardiac surgery in the geriatric population.
Trial registration: ClinicalTrials.gov (identifier: NCT03369210).
Background: Intensive care resources are heavily utilized during the COVID-19 pandemic. However, risk stratification and prediction of SARS-CoV-2 patient clinical outcomes upon ICU admission remain inadequate. This study aimed to develop a machine learning model, based on retrospective and prospective clinical data, to stratify patient risk and predict ICU survival and outcomes. Methods: A Germany-wide electronic registry was established to pseudonymously collect admission, therapeutic and discharge information of SARS-CoV-2 ICU patients retrospectively and prospectively. Machine learning approaches were evaluated for the accuracy and interpretability of predictions. The Explainable Boosting Machine approach was selected as the most suitable method. Individual, non-linear shape functions for predictive parameters and parameter interactions are reported. Results: 1039 patients were included in the Explainable Boosting Machine model, 596 collected retrospectively and 443 collected prospectively. The model for prediction of general ICU outcome proved more reliable in predicting “survival”. Age, inflammatory and thrombotic activity, and severity of ARDS at ICU admission were predictive of ICU survival. Patient age, pulmonary dysfunction and transfer from an external institution were predictors of ECMO therapy. The interaction of patient age with D-dimer level on admission, and of creatinine level with the SOFA score without GCS, were predictors of renal replacement therapy. Conclusions: Using Explainable Boosting Machine analysis, we confirmed and weighed previously reported predictors and identified novel predictors of outcome in critically ill COVID-19 patients. Using this strategy, predictive modeling of COVID-19 ICU patient outcomes can be performed while overcoming the limitations of linear regression models. Trial registration: ClinicalTrials.gov, NCT04455451.
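Conceptually, an Explainable Boosting Machine is a generalized additive model: the predicted log-odds are an intercept plus one learned shape function per feature (plus selected pairwise interactions), which is what makes the reported per-parameter shapes directly inspectable. The toy sketch below scores a patient with hand-made, purely illustrative shape functions; the features, bin edges, and contributions are invented, and the study itself used a boosted implementation that learns these shapes from data.

```python
import math

def shape(bins, contribs, x):
    """Piecewise-constant shape function: log-odds contribution of x.

    bins: ascending upper edges; contribs has len(bins)+1 entries,
    the last covering values above the final edge.
    """
    for edge, c in zip(bins, contribs):
        if x < edge:
            return c
    return contribs[-1]

def predict_risk(age, sofa, intercept=-2.0):
    # Each term is separately inspectable: this additive structure
    # is the model's "explanation" of its own prediction.
    age_term = shape([50, 65, 80], [-0.5, 0.0, 0.6, 1.1], age)
    sofa_term = shape([5, 10], [-0.4, 0.3, 1.0], sofa)
    logit = intercept + age_term + sofa_term
    return 1 / (1 + math.exp(-logit))   # probability of the outcome

print(round(predict_risk(age=45, sofa=3), 3))    # prints 0.052
print(round(predict_risk(age=82, sofa=12), 3))   # prints 0.525
```

Unlike a plain logistic regression, each shape function can be non-linear and non-monotonic, and learned pairwise interaction terms (such as age x D-dimer in the study) are added as two-dimensional lookup tables of the same kind.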