610 Medicine and Health
Introduction: Cell salvage (CS) is an integral part of patient blood management (PBM) and aims to reduce allogeneic red blood cell (RBC) transfusion.
Material and methods: This observational study analysed patients scheduled for elective cardiac surgery requiring cardiopulmonary bypass (CPB) between November 2015 and October 2018. Patients were divided into a CS group (patients receiving CS) and a control group (no CS). Primary endpoints were the number of patients exposed to allogeneic RBC transfusions and the number of RBC units transfused per patient.
Results: A total of 704 patients undergoing cardiac surgery were analysed, of whom 338 underwent surgery with CS (CS group) and 366 without CS (control group). Intraoperatively, 152 patients (45%) in the CS group and 93 patients (25%) in the control group were exposed to allogeneic RBC transfusions (P < 0.001). Accounting for the amount of intraoperative blood loss, regression analysis revealed that, for a given blood loss, significantly more RBC units were used in the control group than in the CS group (1000 mL: 1.0 vs. 0.6 RBC units; 2000 mL: 2.2 vs. 1.1 RBC units; 3000 mL: 3.4 vs. 1.6 RBC units). Thus, CS was significantly associated with a reduction in allogeneic RBC units of 40% at 1000 mL, 49% at 2000 mL, and 52% at 3000 mL of blood loss compared to patients without CS.
Conclusions: Cell salvage was significantly associated with a reduced number of allogeneic RBC transfusions. This supports the beneficial effect of CS in cardiac surgical patients as an individual measure within a comprehensive PBM program.
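For illustration, the relative reduction in transfused RBC units can be reproduced from the per-blood-loss unit counts quoted in the abstract above. This is a minimal sketch, not study code; small deviations from the quoted percentages reflect rounding of the underlying regression estimates.

```python
# Hypothetical illustration: relative reduction in allogeneic RBC units
# with cell salvage (CS), using the unit counts reported in the abstract.

def relative_reduction(control_units: float, cs_units: float) -> float:
    """Fractional reduction in RBC units in the CS group vs. control."""
    return (control_units - cs_units) / control_units

# Reported figures: (blood loss in mL, control-group units, CS-group units)
reported = [(1000, 1.0, 0.6), (2000, 2.2, 1.1), (3000, 3.4, 1.6)]

for loss, ctrl, cs in reported:
    print(f"{loss} mL: {relative_reduction(ctrl, cs):.0%} fewer RBC units")
```

At 1000 mL this reproduces the 40% figure exactly; at 2000 and 3000 mL the rounded point values give 50% and 53% versus the abstract's 49% and 52%, consistent with the abstract quoting unrounded regression coefficients.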
Coronavirus disease 2019 (COVID-19) is caused by the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) and can affect multiple organs, among which is the circulatory system. Inflammation and mortality risk markers were previously detected in COVID-19 plasma and red blood cells (RBCs) metabolic and proteomic profiles. Additionally, biophysical properties, such as deformability, were found to be changed during the infection. Based on such data, we aim to better characterize RBC functions in COVID-19. We evaluate the flow properties of RBCs in severe COVID-19 patients admitted to the intensive care unit by using microfluidic techniques and automated methods, including artificial neural networks, for an unbiased RBC analysis. We find strong flow and RBC shape impairment in COVID-19 samples and demonstrate that such changes are reversible upon suspension of COVID-19 RBCs in healthy plasma. Vice versa, healthy RBCs resemble COVID-19 RBCs when suspended in COVID-19 plasma. Proteomics and metabolomics analyses allow us to detect the effect of plasma exchanges on both plasma and RBCs and demonstrate a new role of RBCs in maintaining plasma equilibria at the expense of their flow properties. Our findings provide a framework for further investigations of clinical relevance for therapies against COVID-19 and possibly other infectious diseases.
Editor's evaluation
This report illustrates a comprehensive account detailing the marked alteration of red blood cell (RBC) morphology that occurs with COVID-19 infection. A particularly important result is the observation that RBC morphology is dramatically affected by plasma from COVID-19 patients and reversible with plasma from healthy donors. The claims of the manuscript are well supported by the data, and the approaches used are thoughtful and rigorous. The results are important for consideration of the broader pathophysiology of COVID-19, particularly with regard to the impact on vascular biology and will be of interest to the readership of eLife.
Introduction: In recent years, resource-saving handling of allogeneic blood products and a reduction of transfusion rates in adults have been observed. However, comparable published national data on transfusion practices in pediatric patients are currently not available. In this study, transfusion rates for children and adolescents were analyzed based on data from the Federal Statistical Office of Germany over the past two decades. Methods: Data were queried via the database of the Federal Statistical Office (Destatis). The period covered was 2005 to 2018, and the sample group comprised children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions and for procedures or interventions with increased transfusion risk were queried and evaluated in detail. Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total group of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4-year-olds (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were most frequently transfused (40.2% of the transfused children in inpatient care in 2018). Transfusion-associated procedures such as chemotherapy or mechanical ventilation and respiratory support for newborns and infants are on the rise. Conclusion: Transfusion rates are declining overall in children and adolescents, but the reasons for the increases in some age groups are unclear. Prospective studies to evaluate transfusion rates and triggers in children are urgently needed.
Background: Vasoplegic syndrome is frequently observed during cardiac surgery and represents a complication with high mortality and morbidity. There is a clinical need for therapy and prevention of vasoplegic syndrome during complex cardiac surgical procedures. Therefore, we investigated different strategies in a porcine model of vasoplegia.
Methods: We evaluated new medical therapies and prophylaxis to avoid vasoplegic syndrome in a porcine model. After induction of anesthesia, cardiopulmonary bypass was established through median sternotomy and central cannulation. Prolonged aortic cross-clamping (120 min) simulated a complex surgical procedure. The influence of sevoflurane-guided anesthesia (sevoflurane group) and the administration of glibenclamide (glibenclamide group) were compared to a control group, which received standard anesthesia using propofol. Online hemodynamic assessment was performed using PiCCO® measurements. In addition, blood and tissue samples were taken to evaluate hemodynamic effects and the degree of inflammatory response.
Results: Glibenclamide was able to break through early vasoplegic syndrome, raising blood pressure and systemic vascular resistance while reducing the need for norepinephrine. Sevoflurane reduced the occurrence of vasoplegic syndrome, reflected in more stable blood pressure and a lower need for norepinephrine.
Conclusion: Glibenclamide could serve as a potent drug to mitigate the effects of vasoplegic syndrome. Sevoflurane anesthesia during cardiopulmonary bypass was associated with a lower occurrence of vasoplegic syndrome and could therefore be used to prevent it in high-risk patients.
Clinical perspective: what is new?
* To our knowledge, this is the first randomized in vivo study evaluating the hemodynamic effects of glibenclamide after the onset of vasoplegic syndrome.
* Furthermore, based on a literature search, no study has yet shown the effect of sevoflurane-guided anesthesia on the occurrence of vasoplegic syndrome.
Clinical perspective: clinical implications?
* To achieve better outcomes after complex cardiac surgery, there is a need for optimized drug therapy and prevention of vasoplegic syndrome.
5-Lipoxygenase (5-LO) is the key enzyme in the formation of pro-inflammatory leukotrienes (LT), which play an important role in a number of inflammatory diseases. Accordingly, 5-LO inhibitors are frequently used to study the role of 5-LO and LT in models of inflammation and cancer. Interestingly, the therapeutic efficacy of these inhibitors is highly variable. Here we show that the frequently used 5-LO inhibitors AA-861, BWA4C, C06, CJ-13,610 and the FDA-approved compound zileuton, as well as the pan-LO inhibitor nordihydroguaiaretic acid, interfere with prostaglandin E2 (PGE2) release into the supernatants of cytokine-stimulated (TNFα/IL-1β) HeLa cervix carcinoma, A549 lung cancer, and HCA-7 colon carcinoma cells with potencies similar to their LT-inhibitory activities (IC50 values ranging from 0.1 to 9.1 µM). In addition, AA-861, BWA4C, CJ-13,610 and zileuton concentration-dependently inhibited bacterial lipopolysaccharide-triggered prostaglandin (PG) release into human whole blood. Western blot analysis revealed that inhibition of the expression of enzymes involved in PG synthesis was not part of the underlying mechanism. Likewise, liberation of arachidonic acid, the substrate for PG synthesis, as well as PGH2 and PGE2 formation, was not impaired by the compounds. However, accumulation of intracellular PGE2 was found in inhibitor-treated HeLa cells, suggesting inhibition of PG export as the major mechanism. Further experiments showed that the PG exporter ATP-binding cassette transporter multidrug resistance protein 4 (MRP-4) is targeted by the inhibitors and may be involved in the 5-LO inhibitor-mediated PGE2 inhibition. In conclusion, the pharmacological effects of a number of 5-LO inhibitors are compound-specific and involve the potent inhibition of PGE2 export.
Results from experimental models on the role of 5-LO in inflammation and pain using 5-LO inhibitors may be misleading and their use as pharmacological tools in experimental models has to be revisited. In addition, 5-LO inhibitors may serve as new scaffolds for the development of potent prostaglandin export inhibitors.
Background: Age and preoperative anaemia are risk factors for poor surgical outcome and blood transfusion. The aim of this study was to examine the effect of iron supplementation in iron-deficient (ID) elderly patients undergoing major surgery.
Method: In this single-centre observational study, patients ≥ 65 years undergoing major surgery were screened for anaemia and ID. Patients were assigned to the following groups: A− (no anaemia); A−,ID+,T+ (no anaemia, iron-deficient, intravenous iron supplementation); A+ (anaemia); and A+,ID+,T+ (anaemia, iron-deficient, intravenous iron supplementation).
Results: Of 4,381 patients screened at the anaemia walk-in clinic, 2,381 (54%) patients were ≥ 65 years old and 2,191 cases were included in analysis. The ID prevalence was 63% in patients with haemoglobin (Hb) < 8 g/dl, 47.2% in patients with Hb from 8.0 to 8.9 g/dl, and 44.3% in patients with Hb from 9 to 9.9 g/dl. In severely anaemic patients, an Hb increase of 0.6 (0.4; 1.2) and 1.2 (0.7; 1.6) g/dl was detected with iron supplementation 6–10 and > 10 days before surgery, respectively. Hb increased by 0 (-0.1; 0) g/dl with iron supplementation 1–5 days before surgery, 0.2 (-0.1; 0.5) g/dl with iron supplementation 6–10 days before surgery, and 0.2 (-0.2; 1.1) g/dl with supplementation > 10 days before surgery (p < 0.001 for 1–5 vs. 6–10 days). Overall, 58% of A+,ID+,T+ patients showed an Hb increase of > 0.5 g/dl. The number of transfused red blood cell units was significantly lower in patients supplemented with iron (0 (0; 3)) compared to non-treated anaemic patients (1 (0; 4)) (p = 0.03). Patients with iron supplementation > 6 days before surgery achieved mobility 2 days earlier than patients with iron supplementation < 6 days.
Conclusions: Intravenous iron supplementation increases Hb level and thereby reduces blood transfusion rate in elderly surgical patients with ID anaemia.
The ongoing SARS-CoV-2 pandemic is characterized by poor outcomes and high mortality, especially in the older patient cohort. To date, there has been a lack of data characterising COVID-19 patients in Germany admitted to the intensive care unit (ICU) vs. non-ICU patients. German reimbursement inpatient data covering the period from January 1st, 2020 to December 31st, 2021 were analyzed. 561,379 patients were hospitalized with COVID-19. 24.54% (n = 137,750) were admitted to the ICU. Overall hospital mortality was 16.69% (n = 93,668), and 33.36% (n = 45,947) in the ICU group. 28.66% (n = 160,881) of all patients suffered from cardiac arrhythmia and 17.98% (n = 100,926) developed renal failure. Obesity showed an odds ratio ranging from 0.83 (0.79–0.87) for WHO grade I to 1.13 (1.08–1.19) for grade III. Mortality rates peaked in April 2020 and January 2021 at 21.23% (n = 4539) and 22.99% (n = 15,724). A third peak was observed in November and December 2021 (16.82%, n = 7173 and 16.54%, n = 9416). Hospitalized COVID-19 patient mortality in Germany is lower than previously shown in other studies. 24.54% of all patients had to be treated in the ICU, with a mortality rate of 33.36%. Congestive heart failure was associated with a higher risk of death, whereas low-grade obesity might have a protective effect on patient survival. High admission numbers are accompanied by a higher mortality rate.
Background: Every year, ~ 210,000 initial implantations of hip endoprostheses are carried out in Germany alone. The “bone cement implantation syndrome” (BCIS) is considered a severe peri- and early-postoperative complication when implanting cemented prostheses. The origin of the BCIS and its impact on the clinical outcome are still uncertain. This study investigates the clinical progression after BCIS cases in patients with cemented hemiarthroplasty. Risk factors for the occurrence of BCIS are evaluated.
Material and methods: Clinical data of all patients with a proximal femur fracture who received a cemented hemiarthroplasty within a period of 9.5 years were collected. BCIS (+) patients and BCIS (−) patients were compared with respect to their demographics and clinical outcome. Risk factors for the development of BCIS were identified.
Results: A total of 208 patients could be included with complete data sets. The mean age was 81.1 ± 10.0 years. Overall, 37% of the patients showed symptoms of BCIS. In comparison to BCIS (−) patients there was a significantly higher rate of cardiovascular complications (27.3% vs. 13.7%, p = 0.016) and a higher in-hospital mortality rate (15.6% vs. 4.6%, p = 0.006) in BCIS (+) patients. Age, absence of a femoral borehole and ASA status were identified as statistically significant risk factors of BCIS.
Conclusion: BCIS is a frequently observed and, in some cases, severe complication. Therapy is exclusively symptomatic; identifying preventive measures might reduce the occurrence of BCIS.
Background: Trauma may be associated with significant to life-threatening blood loss, which in turn may increase the risk of complications and death, particularly in the absence of adequate treatment. Hydroxyethyl starch (HES) solutions are used for volume therapy to treat hypovolemia due to acute blood loss to maintain or re-establish hemodynamic stability with the ultimate goal to avoid organ hypoperfusion and cardiovascular collapse. The current study compares a 6% HES 130 solution (Volulyte 6%) versus an electrolyte solution (Ionolyte) for volume replacement therapy in adult patients with traumatic injuries, as requested by the European Medicines Agency to gain more insights into the safety and efficacy of HES in the setting of trauma care.
Methods: TETHYS is a pragmatic, prospective, randomized, controlled, double-blind, multicenter, multinational trial performed in two parallel groups. Eligible consenting adults ≥ 18 years, with an estimated blood loss of ≥ 500 ml, and in whom initial surgery is deemed necessary within 24 h after blunt or penetrating trauma, will be randomized to receive intravenous treatment at an individualized dose with either a 6% HES 130, or an electrolyte solution, for a maximum of 24 h or until reaching the maximum daily dose of 30 ml/kg body weight, whatever occurs first. Sample size is estimated as 175 patients per group, 350 patients total (α = 0.025 one-tailed, power 1–β = 0.8). Composite primary endpoint evaluated in an exploratory manner will be 90-day mortality and 90-day renal failure, defined as AKIN stage ≥ 2, RIFLE injury/failure stage, or use of renal replacement therapy (RRT) during the first 3 months. Secondary efficacy and safety endpoints are fluid administration and balance, changes in vital signs and hemodynamic status, changes in laboratory parameters including renal function, coagulation, and inflammation biomarkers, incidence of adverse events during treatment period, hospital, and intensive care unit (ICU) length of stay, fitness for ICU or hospital discharge, and duration of mechanical ventilation and/or RRT.
Discussion: This pragmatic study will increase the evidence on safety and efficacy of 6% HES 130 for treatment of hypovolemia secondary to acute blood loss in trauma patients.
Trial registration: Registered in EudraCT, No.: 2016-002176-27 (21 April 2017) and ClinicalTrials.gov, ID: NCT03338218 (09 November 2017).
Purpose: Trauma is the leading cause of death in children. In adults, blood transfusion and fluid resuscitation protocols have changed over the past two decades, resulting in a decrease in morbidity and mortality. Here, transfusion and fluid resuscitation practices were analysed in severely injured children in Germany.
Methods: Severely injured children (maximum Abbreviated Injury Scale (AIS) ≥ 3) admitted to a certified trauma-centre (TraumaZentrum DGU®) between 2002 and 2017 and registered at the TraumaRegister DGU® were included and assessed regarding blood transfusion rates and fluid therapy.
Results: 5,118 children (aged 1–15 years) with a mean ISS of 22 were analysed. Blood transfusion rates administered until ICU admission decreased from 18% (2002–2005) to 7% (2014–2017). Transfused children were increasingly severely injured: the mean ISS of transfused children aged 1–15 years rose from 27.7 (2002–2005) to 34.4 (2014–2017), while the mean ISS of non-transfused children in this age group decreased from 19.6 (2002–2005) to 17.6 (2014–2017). Mean prehospital fluid administration decreased from 980 to 549 ml without compromising hemodynamic stability.
Conclusion: Blood transfusion rates and the amount of fluid resuscitation decreased in severely injured children over a 16-year period in Germany. Restrictive blood transfusion and fluid management have become common practice in severely injured children. A prehospital restrictive fluid management strategy in severely injured children is not associated with a worsened hemodynamic state, abnormal coagulation, or base excess, but leads to higher hemoglobin levels.
Background: The most common technique used worldwide to quantify blood loss during an operation is visual assessment by the attending intervention team. Every operating room contains scaled suction canisters that collect fluids from the surgical field, and this scaling is commonly used by clinicians for visual assessment of intraoperative blood loss. While many studies have been conducted to quantify and reduce the inaccuracy of the visual estimation method, research has focused on the estimation of blood volume in surgical drapes. Whether and how the scaling of canisters correlates with actual blood loss, and how accurately clinicians estimate blood loss in scaled canisters, has not been the focus of research to date.
Methods: A simulation study with four “bleeding” scenarios was conducted using expired whole blood donations. After diluting the blood donations with full electrolyte solution, the sample blood loss volume (SBL) was transferred into suction canisters. The study participants then had to estimate the blood loss in all four scenarios. The difference to the reference blood loss (RBL) per scenario was analyzed.
Results: Fifty-three anesthetists participated in the study. The median estimated blood loss was 500 ml (IQR 300/1150) compared to the RBL median of 281.5 ml (IQR 210.0/1022.0). Overestimations of up to 1233 ml were detected, and underestimations of up to 138 ml were also observed. The visual estimate for canisters correlated strongly with RBL (Spearman's rho: 0.818; p < 0.001). Univariate nonparametric confirmatory statistics show that the deviation of the visual estimate of blood loss from RBL is significant (z = − 10.95, p < 0.001, n = 220). Participants' experience level had no significant influence on the visually estimated blood loss (VEBL) (p = 0.402).
Conclusion: The discrepancies between the visual estimate of canisters and the actual blood loss are enormous despite the given scales. Therefore, we do not recommend estimating the blood loss visually in scaled suction canisters. Colorimetric blood loss estimation could be a more accurate option.
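Spearman's rho, the statistic reported in the study above, is simply the Pearson correlation of the ranked values. The following plain-Python sketch shows the computation; the sample values are hypothetical and are not data from the study.

```python
# Pure-Python sketch of Spearman's rank correlation, the statistic used
# above to compare visually estimated and reference blood loss.

def ranks(values):
    """Return 1-based ranks, assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend the tie group while values are equal.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical estimated vs. reference blood loss values (ml), for illustration.
estimated = [500, 300, 1150, 800, 250]
reference = [281, 210, 1022, 700, 300]
print(round(spearman_rho(estimated, reference), 3))
```

In practice one would use `scipy.stats.spearmanr`, which also returns a p-value; the sketch only shows the rank-correlation idea behind the rho reported above.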
Background: Nicolaides-Baraitser syndrome (NCBRS) is a rare disease caused by mutations in the SMARCA2 gene, which affects chromatin remodelling and leads to a wide range of symptoms including microcephaly, distinct facial features, recurrent seizures, and severe mental retardation. To date, fewer than 100 cases have been reported. Case presentation: A 22-month-old male infant with NCBRS underwent elective cleft palate surgery. The anaesthetists were challenged by the physiological condition of the patient: narrow face, very small mouth, mild tachypnea, slight sternal retractions, physical signs of partial monosomy 9p, plagiocephalus, midface hypoplasia, V-shaped cleft palate, pronounced muscular hypotonia, dysplastic kidneys (bilateral, estimated GFR: approx. 40 ml/min/1.73 m2), nocturnal oxygen demand, and combined apnea. In addition, little information was available about the interaction of the NCBRS displayed by the patient with anaesthesia medications. Conclusions: The cleft palate was successfully closed using the bridge flap technique. Overall, we recommend performing a trial video-assisted laryngoscopy under deep inhalational anaesthesia with spontaneous breathing before administering muscle relaxants, in order to detect any airway difficulties while preserving spontaneous breathing and protective reflexes.
Background: Acute bleeding requires fast and targeted therapy, so knowledge of the patient's potential to form a clot is crucial. Point-of-care testing (POCT) provides fast and reliable information on coagulation, but structural circumstances, such as person-bound sample transport, can delay the reporting of results. The aim of the present study was to investigate the diagnostic quality and accuracy of POCT INR diagnostics compared with standard laboratory analysis (SLA), as well as the time advantage of a pneumatic tube over a person-based transport system. Methods: Two groups of haemorrhagic patients (EG: emergency department; OG: delivery room; each n = 12) were examined in the context of bleeding emergencies using POCT and SLA. Samples were transported via a pneumatic tube system or by a personal transport service. Results: INR results from POCT and SLA showed a high and significant correlation (EG: p < 0.001; OG: p < 0.001). POCT results were reported significantly more quickly (EG: 1.1 vs. 39.6 min; OG: 2.0 vs. 75.0 min; p < 0.001) and required less time for analysis (EG: 0.3 vs. 24.0 min; OG: 0.5 vs. 45.0 min; p < 0.001) compared to SLA. Transportation with the pneumatic tube was significantly faster (8.0 vs. 18.5 min; p < 0.001) than with the person-based transport system. Conclusion: The results of the present study suggest that POCT may be a suitable method for emergency diagnostics and may serve as a prognostic diagnostic element in haemotherapy algorithms to initiate targeted haemotherapy at an early point in time.
Background: Intensive care resources are heavily utilized during the COVID-19 pandemic. However, risk stratification and prediction of SARS-CoV-2 patient clinical outcomes upon ICU admission remain inadequate. This study aimed to develop a machine learning model, based on retrospective and prospective clinical data, to stratify patient risk and predict ICU survival and outcomes. Methods: A Germany-wide electronic registry was established to pseudonymously collect admission, therapeutic, and discharge information of SARS-CoV-2 ICU patients retrospectively and prospectively. Machine learning approaches were evaluated for the accuracy and interpretability of predictions. The Explainable Boosting Machine approach was selected as the most suitable method. Individual, non-linear shape functions for predictive parameters and parameter interactions are reported. Results: 1039 patients were included in the Explainable Boosting Machine model, 596 collected retrospectively and 443 prospectively. The model for prediction of general ICU outcome was shown to be more reliable in predicting "survival". Age, inflammatory and thrombotic activity, and severity of ARDS at ICU admission were shown to be predictive of ICU survival. Patients' age, pulmonary dysfunction, and transfer from an external institution were predictors of ECMO therapy. The interaction of patient age with D-dimer levels on admission, and of creatinine levels with SOFA score without GCS, were predictors of renal replacement therapy. Conclusions: Using Explainable Boosting Machine analysis, we confirmed and weighed previously reported predictors and identified novel predictors of outcome in critically ill COVID-19 patients. Using this strategy, predictive modeling of COVID-19 ICU patient outcomes can be performed, overcoming the limitations of linear regression models. Trial registration: "ClinicalTrials" (clinicaltrials.gov) under NCT04455451.
Background: Approximately one in three patients suffers from preoperative anaemia. Even though haemoglobin is measured before surgery, anaemia management is not implemented in every hospital. Objective: Here, we demonstrate the implementation of an anaemia walk-in clinic at an Orthopedic University Hospital. To improve the diagnosis of iron deficiency (ID), we examined whether reticulocyte haemoglobin (Ret-He) could be a useful additional parameter. Material and Methods: In August 2019, an anaemia walk-in clinic was established. Between September and December 2019, major orthopaedic surgical patients were screened for preoperative anaemia. The primary endpoint was the incidence of preoperative anaemia. Secondary endpoints included Ret-He level, red blood cell (RBC) transfusion rate, in-hospital length of stay and anaemia at hospital discharge. Results: A total of 104 patients were screened for anaemia. Preoperative anaemia rate was 20.6%. Intravenous iron was supplemented in 23 patients. Transfusion of RBC units per patient (1.7 ± 1.2 vs. 0.2 ± 0.9; p = 0.004) and hospital length of stay (13.1 ± 4.8 days vs. 10.6 ± 5.1 days; p = 0.068) was increased in anaemic patients compared to non-anaemic patients. Ret-He values were significantly lower in patients with ID anaemia (33.3 pg [28.6–40.2 pg]) compared to patients with ID (35.3 pg [28.9–38.6 pg]; p = 0.015) or patients without anaemia (35.4 pg [30.2–39.4 pg]; p = 0.001). Conclusion: Preoperative anaemia is common in orthopaedic patients. Our results proved the feasibility of an anaemia walk-in clinic to manage preoperative anaemia. Furthermore, our analysis supports the use of Ret-He as an additional parameter for the diagnosis of ID in surgical patients.
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during TAVI in nonventilated patients and its impact on procedural outcome. Methods and Results: We measured ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (female 58%, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post-RVP 62.6% vs. 53.9% during RVP, p < .01; pre-RVP 61.6% vs. post-RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and more often experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: During RVP, ScO2 declined significantly compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and is a valid cut-off value to screen for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk for cerebrovascular complications and delirium.
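The >20% intraprocedural ScO2-decline criterion used in the study above can be expressed as a simple relative-decline check. This is a hypothetical sketch; the function names and the example baseline/minimum values are illustrative only and not taken from the study data.

```python
# Hypothetical sketch of the >20% intraprocedural ScO2-decline criterion
# used above to flag patients at elevated risk of delirium and stroke.

def sco2_decline(baseline: float, minimum: float) -> float:
    """Relative ScO2 decline from baseline, as a fraction."""
    return (baseline - minimum) / baseline

def pathological(baseline: float, minimum: float, threshold: float = 0.20) -> bool:
    """True if the relative decline exceeds the 20% cut-off."""
    return sco2_decline(baseline, minimum) > threshold

# Illustrative example: baseline 61.6%, lowest intraprocedural value 45.0%
print(pathological(61.6, 45.0))  # decline ≈ 27% → True
```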
Background: Due to the coronavirus disease 2019 (COVID-19) pandemic, interventions in the upper airways are considered high-risk procedures for otolaryngologists and their colleagues. The purpose of this study was to evaluate limitations in hearing and communication when using a powered air-purifying respirator (PAPR) system to protect against severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) transmission and to assess the benefit of a headset. Methods: Acoustic properties of the PAPR system were measured using a head and torso simulator. Audiological tests (tone audiometry, Freiburg speech test, Oldenburg sentence test (OLSA)) were performed in normal-hearing subjects (n = 10) to assess hearing with PAPR. The audiological test setup also included simulation of conditions in which the target speaker used either a PAPR, a filtering face piece (FFP) 3 respirator, or a surgical face mask. Results: Audiological measurements revealed that sound insulation by the PAPR headtop and noise, generated by the blower-assisted respiratory protection system, resulted in significantly deteriorated hearing thresholds (4.0 ± 7.2 dB hearing level (HL) vs. 49.2 ± 11.0
A high incidence of thromboembolic events associated with high mortality has been reported in severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) infections with respiratory failure. The present study characterized post-transcriptional gene regulation by global microRNA (miRNA) expression in relation to activated coagulation and inflammation in 21 critically ill SARS-CoV-2 patients. The cohort consisted of patients with moderate respiratory failure (n = 11) and severe respiratory failure (n = 10) at an acute stage (day 0–3) and in the later course of the disease (>7 days). All patients needed supplemental oxygen and severe patients were defined by the requirement of positive pressure ventilation (intubation). Levels of D-dimers, activated partial thromboplastin time (aPTT), C-reactive protein (CRP), and interleukin (IL)-6 were significantly higher in patients with severe compared with moderate respiratory failure. Concurrently, next generation sequencing (NGS) analysis demonstrated increased dysregulation of miRNA expression with progression of disease severity connected to extreme downregulation of miR-320a, miR-320b and miR-320c. Kyoto encyclopedia of genes and genomes (KEGG) pathway analysis revealed involvement in the Hippo signaling pathway, the transforming growth factor (TGF)-β signaling pathway and in the regulation of adherens junctions. The expression of all miR-320 family members was significantly correlated with CRP, IL-6, and D-dimer levels. In conclusion, our analysis underlines the importance of thromboembolic processes in patients with respiratory failure and emphasizes miRNA-320s as potential biomarkers for severe progressive SARS-CoV-2 infection.
The scope of extracorporeal membrane oxygenation (ECMO) is expanding; nevertheless, the pharmacokinetics in patients receiving cardiorespiratory support are largely unknown, leading to unpredictable drug concentrations. Currently, there are no clear guidelines for antibiotic dosing during ECMO. This study aimed to evaluate the pharmacokinetics (PK) of cefazolin in patients undergoing ECMO treatment. Total and unbound plasma cefazolin concentrations of critically ill patients on veno-arterial ECMO were determined. The observed PK was compared with dose recommendations calculated by a freely available online dosing software. Cefazolin concentrations varied broadly despite identical dosing in all patients. The mean total and unbound plasma concentrations were high, with a significantly (p = 5.8913 × 10−9) greater unbound fraction compared with a standard patient. Cefazolin clearance was significantly (p = 0.009) higher in patients with preserved renal function than in those on continuous renal replacement therapy (CRRT). Based on the calculated clearance, use of the dosing software would in general have led to lower but still sufficient cefazolin concentrations. Our study shows that a "one size fits all" dosing regimen leads to excessive unbound cefazolin concentrations in these patients. They exhibit high PK variability, and decreased cefazolin clearance on ECMO appears to compensate for ECMO- and critical-illness-related increases in volume of distribution.
Objectives: The ongoing coronavirus pandemic is challenging, especially in severely affected patients who require intubation and sedation. Although the potential benefits of sedation with volatile anesthetics in coronavirus disease 2019 patients are currently being discussed, the use of isoflurane in patients with coronavirus disease 2019–induced acute respiratory distress syndrome has not yet been reported. Design: We performed a retrospective analysis of critically ill patients with hypoxemic respiratory failure requiring mechanical ventilation. Setting: The study was conducted with patients admitted between April 4 and May 15, 2020 to our ICU. Patients: We included five patients who were previously diagnosed with severe acute respiratory syndrome coronavirus 2 infection. Intervention: Even with high doses of several IV sedatives, the targeted level of sedation could not be achieved. Therefore, the sedation regimen was switched to inhalational isoflurane. Clinical data were recorded using a patient data management system. We recorded demographic data, laboratory results, ventilation variables, sedative dosages, sedation level, prone positioning, duration of volatile sedation and outcomes. Measurements & Main Results: Mean age was 53.0 (±12.7) years (four men, one woman). The mean duration of isoflurane sedation was 103.2 (±66.2) hours. Our data demonstrate a substantial improvement in the oxygenation ratio when using isoflurane sedation. Deep sedation as assessed by the Richmond Agitation and Sedation Scale was rapidly and closely controlled in all patients, and the subsequent discontinuation of IV sedation was possible within the first 30 minutes. No adverse events were detected. Conclusions: Our findings demonstrate the feasibility of isoflurane sedation in five patients suffering from severe coronavirus disease 2019 infection. Volatile isoflurane was able to achieve the required deep sedation and reduced the need for IV sedation.
High sedation needs of critically ill COVID-19 ARDS patients - a monocentric observational study
(2021)
Background: Therapy of severely affected coronavirus patients requiring intubation and sedation remains challenging. Recently, difficulties in sedating these patients have been discussed. This study aims to describe sedation practices in patients with coronavirus disease 2019 (COVID-19)-induced acute respiratory distress syndrome (ARDS). Methods: We performed a retrospective monocentric analysis of sedation regimens in critically ill intubated patients with respiratory failure who required sedation in our mixed 32-bed university intensive care unit. All mechanically ventilated adults with COVID-19-induced ARDS requiring continuously infused sedative therapy admitted between April 4, 2020, and June 30, 2020 were included. We recorded demographic data, sedative dosages, prone positioning, sedation levels and duration. Descriptive data analysis was performed; for additional analysis, a logistic regression with mixed effects was used. Results: In total, 56 patients (mean age 67 (±14) years) were included. The mean observed sedation period was 224 (±139) hours. To achieve the prescribed sedation level, two or three sedatives were needed in 48.7% and 12.8% of the cases, respectively. In cases with a triple sedation regimen, the combination of clonidine, esketamine and midazolam was used most often (75.7%). Analgesia was achieved using sufentanil in 98.6% of the cases. The analysis showed that the majority of COVID-19 patients required unusually high sedation doses compared with those reported in the literature. Conclusion: The global pandemic continues to severely affect patients requiring ventilation and sedation, but optimal sedation strategies are still lacking. Our observations suggest unusually high dosages of sedatives in mechanically ventilated patients with COVID-19.
Prescribed sedation levels appear to be achievable only with combinations of several sedatives in most critically ill patients suffering from COVID-19-induced ARDS, and an association with the sophisticated critical care these patients often require, including prone positioning and ECMO treatment, seems conceivable.
Background: Iron deficiency (ID) is one of the most common nutritional deficiencies in children worldwide and may result in iron deficiency anemia (IDA). The reticulocyte hemoglobin equivalent (Ret-He) provides information about the current availability of iron for erythropoiesis. This study aims to validate Ret-He as a screening marker for ID and IDA in children. Methods: Blood samples were retrospectively obtained from medical records. Anemia was defined according to the definition provided by the World Health Organization (WHO) for children. ID was defined by transferrin saturation (TSAT) < 20% and ferritin < 100 ng/mL. Children were classified into four groups: IDA, non-anemic iron deficiency (NAID), control and others. Results: Of 970 children, 332 (34.2%) had NAID and 278 (28.7%) presented with IDA. Analysis revealed that Ret-He significantly correlates with ferritin (rho = 0.41; p < 0.001), TSAT (rho = 0.66; p < 0.001) and soluble transferrin receptor (sTfR) (rho = −0.72; p < 0.001). In ROC analysis, the area under the curve (AUC) for Ret-He was 0.771 for detecting ID and 0.845 for detecting IDA. The cut-off value for Ret-He was 33.5 pg to diagnose ID (sensitivity 90.7%; specificity 35.8%) and 31.6 pg to diagnose IDA (sensitivity 90.6%; specificity 50.4%). Conclusions: The present study demonstrates that Ret-He is a suitable screening marker for ID and IDA in children. Furthermore, Ret-He can be used as a single screening parameter for ID and IDA in children without considering other iron parameters. Economically, the use of Ret-He is highly relevant, as it saves one blood tube per patient and reduces additional costs.
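A screening cut-off like the Ret-He thresholds above is characterized by its sensitivity and specificity. The following is a minimal illustrative sketch of that calculation; the cut-off (33.5 pg) comes from the abstract, but all patient values below are invented, since the study's individual-level data are not reported here.

```python
# Illustrative sketch: sensitivity/specificity of a "below cut-off"
# screening rule, as used for Ret-He above. All data are hypothetical.

def screen_at_cutoff(values_pg, has_condition, cutoff_pg):
    """Classify 'condition present' when the marker falls below the
    cut-off; return (sensitivity, specificity)."""
    tp = sum(1 for v, d in zip(values_pg, has_condition) if v < cutoff_pg and d)
    fn = sum(1 for v, d in zip(values_pg, has_condition) if v >= cutoff_pg and d)
    tn = sum(1 for v, d in zip(values_pg, has_condition) if v >= cutoff_pg and not d)
    fp = sum(1 for v, d in zip(values_pg, has_condition) if v < cutoff_pg and not d)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cohort: Ret-He values (pg) and iron-deficiency status
ret_he = [28.1, 30.5, 34.0, 36.0, 31.0, 38.4, 32.5, 35.1]
iron_def = [True, True, True, False, True, False, False, False]

sens, spec = screen_at_cutoff(ret_he, iron_def, cutoff_pg=33.5)
```

Note the trade-off visible in the abstract: lowering the cut-off raises specificity at the cost of sensitivity, which is why the ID and IDA thresholds differ.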
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Depending on local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on the accuracy of estimation has not yet been a focus of research. Material and methods: A blood loss simulation study with four "bleeding" scenarios per sponge colour was created using expired whole blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with full electrolyte solution. Study participants estimated the absorbed blood loss in the sponges in all scenarios, and the difference to the reference blood loss (RBL) was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour. Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with reference blood loss for white (Spearman's rho: 0.521; p = 3.748 × 10−16) and green sponges (Spearman's rho: 0.452; p = 4.683 × 10−12). The median visually estimated blood loss was higher for white sponges (250 ml, IQR 150–412.5 ml) than for green sponges (150 ml, IQR 100–300 ml), compared with the reference blood loss (103 ml, IQR 86–162.8 ml). For both sponge colours, major under- and overestimation was observed. The multivariate analysis demonstrated that fabric colour has a significant influence on estimation (p = 3.04 × 10−10), as do the clinician's qualification level (p = 2.20 × 10−10, p = 1.54 × 10−8) and the amount of RBL to be estimated (p < 2 × 10−16). Conclusion: The deviation from the correct blood loss was smaller with white surgical sponges than with green sponges. In general, deviations were so severe for both sponge types that it appears advisable to refrain from visually estimating blood loss whenever possible and to use other techniques, such as colorimetric estimation, instead.
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells is the mainstay to compensate for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be provided in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes than central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing as compared with CL services. Methods: Data of the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®), with CL as the reference method. For that, we analyzed pairs of Hb levels measured by CL and BGA within two hours. Furthermore, the impact of various confounding factors including age, gender, BMI, smoking status, RBC transfusion, intraoperative hemodilution, and co-medication was elucidated. To ensure adequate statistical analysis, only data of participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and GEM series® (−0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (−0.09 g/dl) but a greater median absolute deviation (±0.45 g/dl). To avoid interference between the different standard deviations caused by the different analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms.
Smoking status showed the greatest reduction in bias (0.1 g/dl, p < 0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p < 0.001), statins (0.04 g/dl, p < 0.05) and beta blockers (0.03 g/dl, p = 0.02) had a slight effect on the bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions such as the substitution of fluids, coagulation factors or RBC units rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference to CL, our data confirm that different BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that the bias decreased with smaller Hb values, which in turn are used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the LIBERAL trial performance.
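The Bland-Altman analysis referred to above reduces to a simple calculation: the mean of the paired differences (the bias) and limits of agreement at ±1.96 standard deviations. A minimal sketch, with invented paired Hb values rather than the trial's data:

```python
# Minimal Bland-Altman sketch for method comparison (BGA vs. CL Hb).
# The paired Hb values below are hypothetical, for illustration only.
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Return (bias, lower LoA, upper LoA) for paired measurements.
    Bias is the mean difference a - b; the limits of agreement are
    bias +/- 1.96 * SD of the differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired Hb values (g/dl): BGA vs. central laboratory
bga = [11.2, 9.8, 13.1, 10.4, 12.0]
cl = [11.5, 10.1, 13.4, 10.9, 12.3]
bias, loa_low, loa_high = bland_altman(bga, cl)
```

A negative bias, as in the abstract's −0.38 g/dl, means the BGA device reads systematically lower than the reference method.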
Background: Paediatric patients are vulnerable to blood loss, and even a small loss of blood can be associated with severe shock. In emergency situations, a red blood cell (RBC) transfusion may become unavoidable, although it is associated with various risks. The aim of this trial was to identify independent risk factors for perioperative RBC transfusion in children undergoing surgery. Methods: To identify independent risk factors for perioperative RBC transfusion in children undergoing surgery and to assess RBC transfusion rates and in-hospital outcomes (e.g., length of stay, mortality, and typical postoperative complication rates), a monocentric, retrospective, observational study was conducted. Descriptive, univariate, and multivariate analyses were performed. Results: Between 1 January 2010 and 31 December 2019, data from n = 14,248 cases were identified at the centre. Analysis revealed an RBC transfusion rate of 10.1% (n = 1439) in the entire cohort. The independent predictors of RBC transfusion were the presence of preoperative anaemia (p < 0.001; OR = 15.10 with preoperative anaemia and OR = 2.40 without preoperative anaemia), younger age (p < 0.001; ORs between 0.14 and 0.28 for children older than 0 years), female gender (p = 0.036; OR = 1.19 compared to male gender), certain types of surgery (e.g., neurosurgery (p < 0.001; OR = 10.14), vascular surgery (p < 0.001; OR = 9.93), cardiac surgery (p < 0.001; OR = 4.79), gynaecology (p = 0.014; OR = 3.64), visceral surgery (p < 0.001; OR = 2.48)), and the presence of postoperative complications (e.g., sepsis (p < 0.001; OR = 10.16), respiratory dysfunction (p < 0.001; OR = 7.56), cardiovascular dysfunction (p < 0.001; OR = 4.68), neurological dysfunction (p = 0.029; OR = 1.77), and renal dysfunction (p < 0.001; OR = 16.17)).
Conclusion: Preoperative anaemia, younger age, female gender, certain types of surgery, and postoperative complications are independent predictors of RBC transfusion in children undergoing surgery. Future prospective studies are urgently required to identify the potential risk factors in detail and to evaluate the impact of RBC transfusion in children.
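The odds ratios reported above come from multivariate logistic regression, but the underlying quantity is easiest to see on a 2x2 table. A sketch of the unadjusted version, with invented counts; the study's adjusted ORs additionally control for covariates, which this simple calculation does not:

```python
# Sketch of an unadjusted odds ratio from a 2x2 table, the building
# block behind adjusted ORs such as OR = 15.10 for preoperative
# anaemia above. All counts are hypothetical.
from math import exp, log, sqrt

def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
    """Return (OR, 95% CI low, 95% CI high) using the log-OR
    normal approximation for the confidence interval."""
    a, b, c, d = exposed_event, exposed_no_event, unexposed_event, unexposed_no_event
    or_value = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_value,
            exp(log(or_value) - 1.96 * se),
            exp(log(or_value) + 1.96 * se))

# Hypothetical counts: anaemic vs. non-anaemic children,
# transfused vs. not transfused
or_value, ci_low, ci_high = odds_ratio(60, 40, 50, 350)
```

An OR well above 1 with a confidence interval excluding 1, as here, is what the abstract's significant predictors correspond to.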
Transfusion of red blood cells (RBC) in patients undergoing major elective cranial surgery is associated with increased morbidity, mortality and prolonged hospital length of stay (LOS). This retrospective single-center study aims to identify the clinical outcome of RBC transfusions in skull base and non-skull base meningioma patients, including the identification of risk factors for RBC transfusion. Between October 2009 and October 2016, 423 patients underwent primary meningioma resection. Of these, 68 (16.1%) received RBC transfusion and 355 (83.9%) did not receive RBC units. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) than in patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, postoperative complications as well as hospital LOS were significantly higher (p < 0.0001) compared with non-transfused patients. In multivariate analyses, risk factors for RBC transfusion were preoperative American Society of Anesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.0001). Kaplan-Meier curves revealed a significant influence of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative KPS ≤ 60% and age (≥ 75 years) on overall survival. We conclude that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in the multivariate analysis. Further studies evaluating the impact of preoperative anaemia management for the reduction of RBC transfusion are needed to improve the clinical outcome of meningioma patients.
Background: Conditions during blood product storage and transportation should maintain quality. The aim of this in vitro study was to investigate the effect of interruption of agitation, temporary cooling (TC), and pneumatic tube system transportation (PTST) on the aggregation ability (AA) and mitochondrial function (MF) of platelet concentrates (PC).
Study Design and Methods: A PC was divided equally into four subunits allocated to four test groups. The control group (I) was stored as recommended (continuous agitation, 22 ± 2°C) for 4 days. The test groups were stored without agitation (II), stored as recommended but cooled to 4°C for 60 minutes on day (d)2 (III), or transported via PTST (IV). Aggregometry was measured using Multiplate (Roche AG; ADPtest, ASPItest, TRAPtest, COLtest) and MF using the Oxygraph-2k (Oroboros Instruments). The basal and maximum mitochondrial respiratory rates (MMRR) were determined. AA and MF were measured daily in I and II, and AA in III and IV on d2 after TC/PTST. Statistical analysis was performed using tests for matched observations.
Results: Eleven PCs were used. TRAP-6-induced AA was significantly lower in II compared with I on d4 (P = 0.015). In III, the ASPItest result was significantly lower (P = 0.032). IV showed no significant differences. The basal rate and MMRR were significantly reduced over the 4 days in I and II (for both rates in both groups: P < 0.0001). No significant differences occurred on d4 (P = 0.495).
Conclusion: Our results indicate that the ex vivo AA and MF of PCs remain largely unaffected even under non-ideal storage and transport conditions with respect to agitation, temperature, and mechanical force.
Background: Extracorporeal life support (ECLS) has become an integral part of modern intensive care. The choice of support mode depends largely on the indication; patients with respiratory failure are predominantly treated with a venovenous (VV) approach. We hypothesized that the mortality of ECLS therapy in Germany did not differ from that previously reported in the literature.
Methods: Inpatient data from Germany from 2007 to 2018, provided by the Federal Statistical Office of Germany, were analysed. The International Statistical Classification of Diseases and Related Health Problems (ICD) codes and procedure keys (OPS) for extracorporeal membrane oxygenation (ECMO) types, acute respiratory distress syndrome (ARDS) and hospital mortality were used.
Results: In total, 45,647 hospitalized patients treated with ECLS were analysed. In Germany, 231 hospitals provided ECLS therapy, with a median of 4 VV-ECMO and 9 VA-ECMO cases in 2018. Overall hospital mortality remained higher than the values reported in the literature. The number of VV-ECMO cases increased by 236%, from 825 in 2007 to 2768 in 2018. In the past, ARDS was the main indication for VV-ECMO in only 33% of the patients, but this proportion increased to 60% in 2018. VA-ECMO support is of minor importance in the treatment of ARDS in Germany. The age distribution of patients undergoing ECLS has shifted towards an older population. In 2018, hospital mortality decreased to 53.9% (n = 1493) in VV-ECMO patients and 54.4% (n = 926) in VV-ECMO patients with ARDS.
Conclusions: ARDS is a severe disease with a high mortality rate despite ECLS therapy. Although endpoints and timing of the evaluations differed from those of the CESAR and EOLIA studies and the Extracorporeal Life Support Organization (ELSO) Registry, the reported mortality in these studies was lower than in the present analysis. Further prospective analyses are necessary to evaluate outcomes in ECMO therapy at the centre volume level.
The transcription factor NF-E2 p45-related factor 2 (Nrf2) is an established master regulator of the anti-oxidative and detoxifying cellular response. Thus, a role in inflammatory diseases associated with the generation of large amounts of reactive oxygen species (ROS) seems obvious. In line with this, data obtained in cell culture experiments and preclinical settings have shown that Nrf2 is important in regulating target genes that are necessary to ensure cellular redox balance. Additionally, Nrf2 is involved in the induction of phase II drug-metabolizing enzymes, which are important both in degrading drugs and in converting them into active forms or putative carcinogens. Therefore, Nrf2 has also been implicated in tumorigenesis. This must be kept in mind when new therapeutic approaches are planned for the treatment of sepsis. This review therefore highlights the function of Nrf2 in sepsis, with a special focus on the translation of rodent-based results to sepsis patients in the intensive care unit (ICU).
Characterization of neonates born to mothers with SARS-CoV-2 infection: review and meta-analysis
(2020)
Characterization of neonates born to mothers with SARS-CoV-2 infection has so far been carried out only partially, and no systematic review has provided a holistic neonatal presentation including possible vertical transmission. A systematic literature search was performed using PubMed, Google Scholar and Web of Science up to June 6, 2020. Studies on neonates born to mothers with SARS-CoV-2 infection were included. A binary random-effects model was used to estimate prevalence with 95% confidence intervals. 32 studies involving 261 neonates were included in the meta-analysis. Most neonates born to infected mothers did not show any clinical abnormalities (80.4%). Clinical features were dyspnea in 11 newborns (42.3%) and fever in 9 (19.1%). Of the 261 neonates, 120 were tested for infection, of whom 12 (10.0%) tested positive. Swabs from placenta, cord blood and vaginal secretion were negative. Neonates are mostly unaffected by the mother's SARS-CoV-2 infection. The risk of vertical transmission is low.
Background and objectives: Preoperative anaemia is an independent risk factor for higher morbidity and mortality, longer hospitalization and increased perioperative transfusion rates. Managing preoperative anaemia is the first of the three pillars of Patient Blood Management (PBM), a multidisciplinary concept to improve patient safety. While various studies provide medical information on (successful) anaemia treatment pathways, knowledge of the organizational details of diagnosis and management of preoperative anaemia across Europe is scarce.
Materials and methods: To gain information on various aspects of preoperative anaemia management including organization, financing, diagnostics and treatment, we conducted a survey (74 questions) in ten hospitals from seven European nations within the PaBloE (Patient Blood Management in Europe) working group covering the year 2016.
Results: Organization and activity in the field of preoperative anaemia management were heterogeneous across the participating hospitals. Almost all hospitals had pathways for managing preoperative anaemia in place; however, only two nations had national guidelines. In six of the ten participating hospitals, preoperative anaemia management was organized by anaesthetists. Diagnostics and treatment focused on iron deficiency anaemia, which, in most hospitals, was corrected with intravenous iron.
Conclusion: Implementation and approaches of preoperative anaemia management vary across Europe with a primary focus on treating iron deficiency anaemia. Findings of this survey motivated the hospitals involved to critically evaluate their practice and may also help other hospitals interested in PBM to develop action plans for diagnosis and management of preoperative anaemia.
Health economics of Patient Blood Management: a cost‐benefit analysis based on a meta‐analysis
(2019)
Background and Objectives: Patient Blood Management (PBM) is the timely application of evidence‐based medical and surgical concepts designed to improve haemoglobin concentration, optimize haemostasis and minimize blood loss in an effort to improve patient outcomes. The focus of this cost‐benefit analysis is to analyse the economic benefit of widespread implementation of a multimodal PBM programme.
Materials and Methods: Based on a recent meta‐analysis including 17 studies (>235 000 patients) comparing PBM with control care and data from the University Hospital Frankfurt, a cost‐benefit analysis was performed. Outcome data were red blood cell (RBC) transfusion rate, number of transfused RBC units, and length of hospital stay (LOS). Costs were considered for the following three PBM interventions as examples: anaemia management including therapy of iron deficiency, use of cell salvage and tranexamic acid. For sensitivity analysis, a Monte Carlo simulation was performed.
Results: Iron supplementation was applied in 3.1%, cell salvage in 65% and tranexamic acid in 89% of the PBM patients. In total, applying these three PBM interventions cost €129.04 per patient. However, PBM was associated with a reduction in transfusion rate, transfused RBC units per patient, and LOS, which yielded mean savings of €150.64 per patient. Thus, the overall benefit of PBM implementation was €21.60 per patient. In the Monte Carlo simulation, the cost savings on the outcome side exceeded the PBM costs in approximately two-thirds of all repetitions, and the total benefit was €1,878,000 in 100,000 simulated patients.
Conclusion: Resources to implement a multimodal PBM concept optimizing patient care and safety can be allocated cost-effectively.
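The cost-benefit logic above is simple arithmetic (per-patient savings minus per-patient costs) plus a Monte Carlo simulation to probe uncertainty. A back-of-envelope sketch: the point estimates (€129.04 costs, €150.64 savings) are taken from the abstract, but the spread of the simulated savings is an assumption, not a study parameter.

```python
# Back-of-envelope sketch of the PBM cost-benefit analysis described
# above. Point estimates are from the abstract; the distribution of
# per-patient savings is an illustrative assumption.
import random

COST_PER_PATIENT = 129.04   # PBM intervention costs (EUR), from the abstract
MEAN_SAVINGS = 150.64       # mean outcome-side savings (EUR), from the abstract

def simulate_net_benefit(n_patients, savings_sd=30.0, seed=42):
    """Simulate per-patient net benefit with normally distributed
    savings. savings_sd is an assumed spread, not a study value."""
    rng = random.Random(seed)
    net = [rng.gauss(MEAN_SAVINGS, savings_sd) - COST_PER_PATIENT
           for _ in range(n_patients)]
    share_positive = sum(1 for x in net if x > 0) / n_patients
    return sum(net), share_positive

total_benefit, share_positive = simulate_net_benefit(100_000)
```

With the abstract's point estimates, the deterministic net benefit is €150.64 − €129.04 = €21.60 per patient; the simulation shows how often random variation in savings still leaves that net benefit positive.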
Background: Macrophage migration inhibitory factor (MIF) is highly elevated after cardiac surgery and impacts postoperative inflammation. The aim of this study was to analyze whether the polymorphisms CATT5–7 (rs5844572/rs3063368, "-794") and the G>C single-nucleotide polymorphism (rs755622, -173) in the MIF gene promoter are related to postoperative outcome. Methods: In 1116 patients undergoing cardiac surgery, the MIF gene polymorphisms were analyzed, and serum MIF was measured by ELISA in 100 patients. Results: Patients with at least one extended repeat allele (CATT7) had a significantly higher risk of acute kidney injury (AKI) than the others (23% vs. 13%; OR 2.01 (1.40–2.88), p = 0.0001). Carriers of CATT7 were also at higher risk of death (1.8% vs. 0.4%; OR 5.12 (0.99–33.14), p = 0.026). The GC genotype was associated with AKI (20% vs. GG/CC: 13%, OR 1.71 (1.20–2.43), p = 0.003). Multivariate analyses identified CATT7 as predictive of AKI (OR 2.13 (1.46–3.09), p < 0.001) and death (OR 5.58 (1.29–24.04), p = 0.021). CATT7 was associated with higher serum MIF before surgery (79.2 vs. 50.4 ng/mL, p = 0.008). Conclusion: The CATT7 allele is associated with a higher risk of AKI and death after cardiac surgery, which might be related to chronically elevated serum MIF. Polymorphisms in the MIF gene may constitute a predisposition for postoperative complications, and their assessment may improve risk stratification and therapeutic guidance.
Nitro fatty acids (NFAs) are endogenously generated lipid mediators deriving from reactions of unsaturated electrophilic fatty acids with reactive nitrogen species; Mediterranean diets can also be a source of NFAs. These highly electrophilic fatty acids can undergo Michael addition with cysteine residues, leading to post-translational modifications (PTM) of selected regulatory proteins. Such modifications are capable of changing target protein function during cell signaling or in biosynthetic pathways. NFA target proteins include the peroxisome proliferator-activated receptor γ (PPAR-γ), the pro-inflammatory and tumorigenic nuclear factor-κB (NF-κB) signaling pathway, the pro-inflammatory 5-lipoxygenase (5-LO) biosynthesis pathway as well as soluble epoxide hydrolase (sEH), which is essentially involved in the regulation of vascular tone. In several animal models of inflammation and cancer, the therapeutic efficacy of well-tolerated NFAs has been demonstrated. This has already led to clinical phase II studies investigating possible therapeutic effects of NFAs in subjects with pulmonary arterial hypertension. Although Michael acceptors feature a broad spectrum of bioactivity, they were long avoided as drug candidates owing to their presumed unselective reactivity and toxicity. However, targeted covalent modification of regulatory proteins by Michael acceptors became recognized as a promising approach to drug discovery with the FDA approvals of the cancer therapeutics afatinib (2013), ibrutinib (2013), and osimertinib (2015). Furthermore, the Michael acceptor neratinib, a dual inhibitor of the human epidermal growth factor receptor 2 and the epidermal growth factor receptor, was recently approved by the FDA (2017) and the EMA (2018) for the treatment of breast cancer. Finally, a number of further Michael acceptor drug candidates are currently under clinical investigation for the pharmacotherapy of inflammation and cancer.
In this review, we focus on the pharmacology of NFAs and other Michael acceptor drugs, summarizing their potential as an emerging class of future antiphlogistics and adjuvants in tumor therapy.
Background: Point-of-care devices for targeted coagulation substitution in bleeding patients have become increasingly important in recent years. New on the market is the Quantra, a device that uses sonorheometry (sonic estimation of elasticity via resonance), a novel ultrasound-based technology that measures the viscoelastic properties of whole blood. Several studies have already shown the comparability of the Quantra with devices established on the market, such as the rotational thromboelastometry (ROTEM) device.
Objective: In contrast to existing studies, this is the first prospective interventional study using this new system in a cardiac surgical patient cohort. We will investigate the noninferiority of a new algorithm based on the Quantra system compared with an existing coagulation algorithm based on the ROTEM/Multiplate system for the treatment of coagulopathic cardiac surgical patients.
Methods: The study is divided into two phases. In an initial observation phase, whole blood samples of 20 patients obtained at three defined time points (prior to surgery, after completion of cardiopulmonary bypass, and on arrival in the intensive care unit) will be analyzed using both the ROTEM/Multiplate and Quantra systems. The obtained threshold values will be used to develop a novel algorithm for hemotherapy. In a second intervention phase, the new algorithm will be tested for noninferiority against an algorithm used routinely for years in our department.
Results: The primary endpoint is the cumulative blood loss within 24 hours after surgery. Statistical calculations based on the literature and in-house data suggest that the new algorithm is noninferior if the difference in cumulative blood loss is <150 mL/24 hours.
Conclusions: Given the comparability of the Quantra sonorheometry system with the ROTEM measurement methods, the existing hemotherapy treatment algorithm can be adapted to the Quantra device with proof of noninferiority.
Trial Registration: ClinicalTrials.gov NCT03902275; https://clinicaltrials.gov/ct2/show/NCT03902275
International Registered Report Identifier (IRRID): DERR1-10.2196/17206
Cholinesterase alterations in delirium after cardiosurgery: a German monocentric prospective study
(2020)
Objectives: Postoperative delirium (POD) is a common complication after elective cardiac surgery. Recent evidence indicates that a disruption in the normal activity of the cholinergic system may be associated with delirium.
Design: Prospective observational study.
Setting: Single-centre at a European academic hospital.
Primary and secondary outcome measures: The enzyme activities of acetylcholinesterase (AChE) and butyrylcholinesterase (BChE) were determined preoperatively as well as on the first and second postoperative days. The confusion assessment method for the intensive care unit was used to screen patients for the presence of POD.
Results: A total of 114 patients were included in the study. POD was associated with a decrease in BChE activity on postoperative day 1 (p=0.03). In addition, patients who developed POD had significantly lower preoperative AChE activity than patients without POD (p<0.01). Multivariate analysis identified preoperatively decreased AChE activity (OR 3.1; 95% CI 1.14 to 8.46), anticholinergic treatment (OR 5.09; 95% CI 1.51 to 17.23), an elevated European System for Cardiac Operative Risk Evaluation score (OR 3.68; 95% CI 1.04 to 12.99) and age (OR 3.02; 95% CI 1.06 to 8.62) as independently associated with the development of POD.
Conclusions: We conclude that a reduction in the acetylcholine hydrolysing enzyme activity in patients undergoing cardiac surgery may correlate with the development of POD.
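The odds ratios with confidence intervals reported in the multivariate analysis above are conventionally derived from logistic-regression coefficients. A minimal sketch of that conversion follows; the function name and inputs are illustrative, not values from this study.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (beta) and its standard
    error (se) into an odds ratio with an approximate 95% confidence
    interval: OR = exp(beta), CI = exp(beta +/- z * se)."""
    odds_ratio = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return odds_ratio, lower, upper
```

Because the interval is symmetric on the log-odds scale, the reported confidence intervals (e.g. OR 3.1; 95% CI 1.14 to 8.46) are asymmetric around the odds ratio itself.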
Background: Approximately every third surgical patient is anemic. The most common form, iron deficiency anemia, results from persisting iron‐deficient erythropoiesis (IDE). Zinc protoporphyrin (ZnPP) is a promising parameter for diagnosing IDE, hitherto requiring blood drawing and laboratory workup.
Study design and methods: Noninvasive ZnPP (ZnPP‐NI) measurements were compared with reference determination of the ZnPP/heme ratio by high‐performance liquid chromatography (ZnPP‐HPLC), and analytical performance in detecting IDE was evaluated against traditional iron status parameters (ferritin, transferrin saturation [TSAT], soluble transferrin receptor–ferritin index [sTfR‐F], soluble transferrin receptor [sTfR]), likewise measured in blood. The study was conducted at the University Hospitals of Frankfurt and Zurich.
Results: Limits of agreement between ZnPP‐NI and ZnPP‐HPLC measurements for 584 cardiac and noncardiac surgical patients equaled 19.7 μmol/mol heme (95% confidence interval, 18.0–21.3; acceptance criterion, 23.2 μmol/mol heme; absolute bias, 0 μmol/mol heme). Analytical performance for detecting IDE, expressed as the area under the receiver operating characteristic curve, was: ZnPP‐HPLC (0.95), sTfR (0.92), sTfR‐F (0.89), TSAT (0.87), and ferritin (0.67). Noninvasively measured ZnPP‐NI yielded a value of 0.90.
Conclusion: ZnPP‐NI appears well suited for an initial IDE screening, informing on the state of erythropoiesis at the point of care without blood drawing and laboratory analysis. Comparison with a multiparameter IDE test revealed that ZnPP‐NI values of 40 μmol/mol heme or less allow exclusion of IDE, whereas at 65 μmol/mol heme or greater, IDE is very likely if other causes of increased values are excluded. In these cases (77% of our patients) ZnPP‐NI may suffice for a diagnosis, while values in between require analysis of additional iron status parameters.
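The limits-of-agreement analysis used to compare ZnPP-NI with the HPLC reference follows the standard Bland-Altman construction: mean bias of the paired differences plus or minus 1.96 standard deviations. A minimal sketch, with purely illustrative numbers rather than study data:

```python
import statistics

def limits_of_agreement(method_a, method_b, z=1.96):
    """Bland-Altman style agreement between paired measurements of the
    same quantity by two methods. Returns the mean bias and the lower and
    upper limits of agreement (bias +/- z * SD of the differences)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - z * sd, bias + z * sd
```

With an absolute bias of 0, as reported above, the limits are symmetric around zero, and agreement is judged by whether their half-width stays within a prespecified acceptance criterion.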
Background. Tracheal intubation still represents the "gold standard" for securing the airway of unconscious patients in the prehospital setting. Especially in cases of restricted access to the patient, video laryngoscopy has become increasingly relevant.
Objectives. The aim of the study was to evaluate the performance and intubation success of four different video laryngoscopes, one optical laryngoscope, and a Macintosh blade while intubating from two different positions in a mannequin trial with difficult access to the patient.
Methods. A mannequin with a cervical collar was placed on the driver’s seat. Intubation was performed with six different laryngoscopes either through the driver’s window or from the backseat. Success, C/L score, time to best view (TTBV), time to intubation (TTI), and number of attempts were measured. All participants were asked to rate their favored device.
Results. Forty-two physicians participated. All intubations performed from the backseat were successful; intubation through the driver's window was less successful, and only the Airtraq® optical laryngoscope achieved a 100% success rate there. Best visualization (window C/L 2a; backseat C/L 2a) and shortest TTBV (window 4.7 s; backseat 4.1 s) were obtained with the D-Blade video laryngoscope, but this was not associated with higher success through the driver's window. The fastest TTI was achieved through the window (14.2 s) with the C-MAC video laryngoscope and from the backseat (7.3 s) with a Macintosh blade.
Conclusions. Video laryngoscopy yielded better visualization but was not associated with a higher success rate. Success depended on the approach and familiarity with the device. We believe that video laryngoscopy is suitable for securing the airway of trapped accident victims. The choice of an optimal device is complicated and should be based on experience and regular training with the device.
In-line filtration of intravenous infusion may reduce organ dysfunction of adult critical patients
(2019)
Background: The potential harmful effects of particle-contaminated infusions in critically ill adult patients are still unclear. So far, significantly improved outcomes with in-line filters have been demonstrated only in critically ill children and newborns; for adult patients, evidence is still missing.
Methods: This single-centre, retrospective controlled cohort study assessed the effect of in-line filtration of intravenous fluids with finer 0.2 or 1.2 μm vs 5.0 μm filters in critically ill adult patients. From a total of n = 3215 adult patients, n = 3012 patients were selected by propensity score matching (adjusting for sex, age, and surgery group) and assigned to either a fine filter cohort (with 0.2/1.2 μm filters, n = 1506, time period from February 2013 to January 2014) or a control filter cohort (with 5.0 μm filters, n = 1506, time period from April 2014 to March 2015). The cohorts were compared regarding the occurrence of severe vasoplegia, organ dysfunctions (lung, kidney, and brain), inflammation, in-hospital complications (myocardial infarction, ischemic stroke, pneumonia, and sepsis), in-hospital mortality, and length of ICU and hospital stay.
Results: Comparing the fine filter vs the control filter cohort, respiratory dysfunction (Horowitz index 206 (119–290) vs 191 (104.75–280); P = 0.04), pneumonia (11.4% vs 14.4%; P = 0.02), sepsis (9.6% vs 12.2%; P = 0.03), interleukin-6 (471.5 (258.8–1062.8) ng/l vs 540.5 (284.5–1147.5) ng/l; P = 0.01), length of ICU stay (1.2 (0.6–4.9) vs 1.7 (0.8–6.9) days; P < 0.01), and length of hospital stay (14.0 (9.2–22.2) vs 14.8 (10.0–26.8) days; P = 0.01) were reduced. Rates of severe vasoplegia (21.0% vs 19.6%; P > 0.20) and acute kidney injury (11.8% vs 13.7%; P = 0.11) were not significantly different between the cohorts.
Conclusions: In-line filtration with finer 0.2 and 1.2 μm filters may be associated with less organ dysfunction and less inflammation in critically ill adult patients.
Trial registration: The study was registered at ClinicalTrials.gov (number: NCT02281604).
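The cohorts in the filtration study above were built by propensity score matching. The core idea can be sketched as greedy 1:1 nearest-neighbour matching on precomputed propensity scores; this toy version (function name, caliper value, and plain-float scores are assumptions for illustration) omits the score-estimation step and any covariate balance checks a real analysis would include.

```python
def greedy_match(treated_scores, control_scores, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.
    Each control is used at most once; a treated subject is left
    unmatched if no remaining control lies within the caliper.
    Returns a list of (treated_index, control_index) pairs."""
    available = dict(enumerate(control_scores))
    pairs = []
    for t_idx, t_score in enumerate(treated_scores):
        if not available:
            break
        # Closest still-available control by absolute score distance
        c_idx = min(available, key=lambda i: abs(available[i] - t_score))
        if abs(available[c_idx] - t_score) <= caliper:
            pairs.append((t_idx, c_idx))
            del available[c_idx]
    return pairs
```

Matching on the score rather than on the raw covariates (here sex, age, and surgery group) is what allows the two filter cohorts to be compared as if assignment were independent of those baseline characteristics.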
Background: GLUT1 deficiency syndrome (G1DS) is an autosomal dominant genetic disorder caused by a mutation of the SLC2A1 gene. This mutation can lead to an encephalopathy due to abnormal glucose transport in the brain. G1DS is a rare disease, with an estimated incidence of 1:90,000.
Case report: We report the case of a 10-year-old female who presented with recurrent fever, headaches, and vertigo for more than 3 days within 2 weeks following pneumonia. Bilateral mastoiditis was confirmed by cerebral magnetic resonance imaging and a cranial computed tomography scan. The patient had to undergo mastoidectomy and thus her first general anesthesia. She had been diagnosed with G1DS 6 months earlier and, according to the standard of care, had been on a ketogenic diet since the diagnosis. Our patient received total intravenous anesthesia (TIVA) using propofol, fentanyl, and rocuronium, administered without any incidents.
Conclusions: We recommend maintaining normoglycemia during the perioperative phase and avoiding glucose-based medication to preserve the patient's ketotic state. Our case highlights that TIVA, with the medication outlined here, was safe when the patient's ketotic state and periprocedural blood glucose were monitored continuously. Nevertheless, we would suggest using remifentanil instead of fentanyl for future TIVAs because of the smaller increase in blood glucose level observed in our patient.
Background: The use of cell salvage and autologous blood transfusion has become an important method of blood conservation. So far, there are no clinical data about the performance of the continuous autotransfusion device CATSmart.
Methods: In total, 74 patients undergoing either cardiac or orthopedic surgery were included in this prospective, bicenter, observational technical evaluation to validate the red cell separation process and washout quality of the CATSmart. The target for red cell separation was defined as a hematocrit of 55–75% in the packed red cell unit, and the target for washout quality as a removal ratio of 80–100%.
Results: Hematocrit values measured by the CATSmart and by laboratory analysis were 78.5% [71.3%; 84.0%] and 73.7% [67.5%; 75.5%], respectively. Removal ratios for platelets (94.7% [88.2%; 96.7%]), free hemoglobin (89.3% [85.2%; 94.9%]), albumin (97.9% [96.6%; 98.5%]), heparin (99.9% [99.9%; 100.0%]), and potassium (92.5% [90.8%; 95.0%]) were within the target range, while removal of white blood cells was slightly lower at 72.4% [57.9%; 87.3%].
Conclusion: The new autotransfusion device enables sufficient red cell separation and washout quality.
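The removal ratios reported above express how much of a substance the wash step eliminates relative to what entered the device. A deliberately simplified, concentration-based sketch follows; device validation studies typically correct for processed volume and hematocrit, which this toy function does not.

```python
def removal_ratio(pre_wash, post_wash):
    """Washout quality as a percentage: fraction of a substance
    (e.g. platelets, free hemoglobin, heparin, potassium) removed
    from salvaged blood, relative to the pre-wash amount.
    Simplified sketch; volume/hematocrit corrections are omitted."""
    return 100.0 * (pre_wash - post_wash) / pre_wash
```

Under this definition the 80–100% target range simply means that at most one fifth of the pre-wash amount may remain in the packed red cell unit.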
Background: Patient Blood Management (PBM) is a systematic, quality-improving clinical concept to reduce anemia and avoid transfusions in all kinds of clinical settings. Here, we investigated the potential of PBM in oncologic surgery and hypothesized that PBM improves 2-year overall survival (OS).
Methods: Retrospective analysis of patients treated in the 2 years before and the 2 years after PBM implementation. The primary endpoint was OS at 2 years after surgery. A sample size of 824 was calculated to detect a 10% improvement in survival in the PBM group.
Results: The analysis comprised 836 patients who underwent oncologic surgery, 389 before and 447 after PBM was implemented. Patients in the PBM+ group presented with normal hemoglobin values before surgery significantly more frequently than those in the PBM− group (56.6 vs. 35.7%; p < 0.001). The number of transfusions was significantly reduced from 5.5 ± 11.1 to 3.0 ± 6.9 units/patient (p < 0.001); moreover, the percentage of patients transfused during the hospital stay was significantly reduced from 62.4 to 40.9% (p < 0.001). Two-year OS was significantly better in the PBM+ group, increasing from 67.0 to 80.1% (p = 0.001). A normal hemoglobin value (>12 g/dl in females and >13 g/dl in males) before surgery (HR 0.43, 95% CI 0.29–0.65, p < 0.001) was the only independent predictive factor positively affecting survival.
Conclusions: PBM is a quality improvement tool that is associated with better mid-term surgical oncologic outcome. The root cause for improvement is the increase of patients entering surgery with normal hemoglobin values.
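Two-year overall survival figures like those above are conventionally read off a Kaplan-Meier curve, which handles patients censored before the 2-year mark. A minimal sketch of the estimator (illustrative only; the study's actual analysis and data are not reproduced here):

```python
def kaplan_meier(times, events):
    """Minimal Kaplan-Meier estimator. `times` are follow-up durations,
    `events` is 1 for death and 0 for censoring. Returns (time, survival)
    steps; 2-year OS is the survival value at the last step <= 2 years."""
    survival = 1.0
    curve = []
    ordered = sorted(zip(times, events))
    n = len(ordered)
    for i, (t, died) in enumerate(ordered):
        if died:
            # n - i subjects are still at risk just before time t
            survival *= 1.0 - 1.0 / (n - i)
            curve.append((t, survival))
    return curve
```

Censored patients contribute to the risk set until they drop out but never trigger a step down, which is why the estimator differs from a naive proportion of survivors.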
Objective. Evaluation of the C-MAC PM® in combination with a standard Macintosh blade size 3 for direct and indirect laryngoscopy, and with the D-Blade® for indirect laryngoscopy, in a simulated difficult airway. The primary outcome was defined as the best view of the glottic structures. Secondary endpoints were the subjective evaluation and assessment of the intubation process.
Methods. Prospective, monocentric, observational study of 48 adult patients without predictors of difficult laryngoscopy/tracheal intubation undergoing orthopedic surgery. Every participant preoperatively received a cervical collar to simulate a difficult airway. Direct and indirect laryngoscopy with and without the BURP maneuver using a standard Macintosh blade, and indirect laryngoscopy with and without the BURP maneuver using the D-Blade®, were performed to evaluate whether blade geometry and the BURP maneuver improve the glottic view as measured by the Cormack-Lehane score.
Results. Using a C-MAC PM® laryngoscope, D-Blade® yielded improved glottic views compared with the Macintosh blade used with either the direct or indirect technique. Changing from direct laryngoscopy using a Macintosh blade to indirect videolaryngoscopy using C-MAC PM® with D-Blade® improved the Cormack-Lehane score from IIb, III, or IV to I or II in 31 cases.
Conclusion. The combination of C-MAC PM® and D-Blade® significantly enhances the view of the glottis compared to direct laryngoscopy with a Macintosh blade in patients with a simulated difficult airway.
Trial Registration Number. This trial is registered under number NCT03403946.
Introduction: Balanced fluid replacement solutions may reduce the risk of electrolyte and acid–base imbalances and thus of renal failure. To assess the intraoperative change of base excess (BE) and serum chloride after treatment with either a balanced or a non-balanced gelatine/electrolyte solution, a prospective, controlled, randomized, double-blind, dual-centre phase III study was conducted in two tertiary care university hospitals in Germany.
Material and methods: 40 patients of both sexes, aged 18 to 90 years, scheduled to undergo elective abdominal surgery with an assumed intraoperative volume requirement of at least 15 mL/kg body weight of gelatine solution were included. The study drug was administered intravenously according to the patients' needs. The trigger for volume replacement was a central venous pressure (CVP) minus positive end-expiratory pressure (PEEP) of <10 mmHg. The crystalloid:colloid ratio was 1:1 intra- and postoperatively. The targets for volume replacement were a CVP minus PEEP between 10 and 14 mmHg after treatment with a vasoactive agent and a mean arterial pressure (MAP) >65 mmHg.
Results: The primary endpoints, the intraoperative changes of base excess (–2.59 ± 2.25 (median: –2.65) mmol/L in the balanced group vs. –4.79 ± 2.38 (median: –4.70) mmol/L in the non-balanced group) and of serum chloride (2.4 ± 1.9 (median: 3.0) mmol/L vs. 5.2 ± 3.1 (median: 5.0) mmol/L), differed significantly between the groups (p = 0.0117 and p = 0.0045, respectively). In both groups (each n = 20) the administration of the investigational product in terms of volume and infusion rate was comparable throughout the course of the study, i.e. before, during and after surgery.
Discussion: Balanced gelatine solution 4% combined with a balanced electrolyte solution had a significantly smaller impact on the blood gas parameters of the primary endpoints, BE and serum chloride, than non-balanced gelatine solution 4% combined with NaCl 0.9%. No marked treatment differences were observed with respect to haemodynamics, coagulation, and renal function.
Trial registration: ClinicalTrials.gov (NCT01515397) and clinicaltrialsregister.eu, EudraCT number 2010-018524-58.
Background: Peritonitis is responsible for thousands of deaths annually in Germany alone. Even source control (SC) and antibiotic treatment often fail to prevent severe sepsis or septic shock, and this situation has hardly improved in the past two decades. Most experimental immunomodulatory therapeutics for sepsis have been aimed at blocking or dampening a specific pro-inflammatory immunological mediator. However, the patient collective is large and heterogeneous. There are therefore grounds for investigating the possibility of developing personalized therapies by classifying patients into groups according to biomarkers. This study aims to combine an assessment of the efficacy of treatment with a preparation of human immunoglobulins G, A, and M (IgGAM) with individual status of various biomarkers (immunoglobulin level, procalcitonin, interleukin 6, antigen D-related human leucocyte antigen (HLA-DR), transcription factor NF-κB1, adrenomedullin, and pathogen spectrum).
Methods/design: A total of 200 patients with sepsis or septic shock will receive standard-of-care treatment (SoC). Of these, 133 patients (selected by 1:2 randomization) will in addition receive infusions of IgGAM for 5 days. All patients will be followed for approximately 90 days and assessed by the multiple-organ failure (MOF) score, by the EQ QLQ 5D quality-of-life scale, and by measurement of vital signs, biomarkers (as above), and survival.
Discussion: This study is intended to provide further information on the efficacy and safety of treatment with IgGAM and to offer the possibility of correlating these with the biomarkers to be studied. Specifically, it will test (at a descriptive level) the hypothesis that patients receiving IgGAM who have higher inflammation status (IL-6) and poorer immune status (low HLA-DR, low immunoglobulin levels) have a better outcome than patients who do not receive IgGAM. It is expected to provide information that will help to close the knowledge gap concerning the association between the effect of IgGAM and the presence of various biomarkers, thus possibly opening the way to personalized medicine.
Trial registration: EudraCT, 2016–001788-34; ClinicalTrials.gov, NCT03334006. Registered on 17 Nov 2017.
Trial sponsor: RWTH Aachen University, represented by the Center for Translational & Clinical Research Aachen (contact Dr. S. Isfort).
Background: Perioperative anaemia leads to impaired oxygen supply with a risk of vital organ ischaemia. In healthy and fit individuals, anaemia can be compensated by several mechanisms. Elderly patients, however, have fewer compensatory mechanisms because of multiple co-morbidities and an age-related decline of functional reserves. The purpose of the study is to evaluate whether elderly surgical patients may benefit from a liberal red blood cell (RBC) transfusion strategy compared to a restrictive transfusion strategy.
Methods: The LIBERAL Trial is a prospective, randomized, multicentre, controlled clinical phase IV trial randomising 2470 elderly (≥ 70 years) patients undergoing intermediate- or high-risk non-cardiac surgery. Registered patients will be randomised only if haemoglobin (Hb) reaches ≤9 g/dl during surgery or within 3 days after surgery, either to the LIBERAL group (transfusion of a single RBC unit when Hb ≤ 9 g/dl with a target range for the post-transfusion Hb level of 9–10.5 g/dl) or the RESTRICTIVE group (transfusion of a single RBC unit when Hb ≤ 7.5 g/dl with a target range for the post-transfusion Hb level of 7.5–9 g/dl). The intervention per patient will be followed until hospital discharge or up to 30 days after surgery, whichever occurs first. The primary efficacy outcome is defined as a composite of all-cause mortality, acute myocardial infarction, acute ischaemic stroke, acute kidney injury (stage III), acute mesenteric ischaemia and acute peripheral vascular ischaemia within 90 days after surgery. Infections requiring intravenous antibiotics with re-hospitalisation are assessed as an important secondary endpoint. The primary endpoint will be analysed by logistic regression adjusting for age, cancer surgery (y/n), and type of surgery (intermediate- or high-risk), and incorporating centres as a random effect.
Discussion: The LIBERAL-Trial will evaluate whether a liberal transfusion strategy reduces the occurrence of major adverse events after non-cardiac surgery in the geriatric population compared to a restrictive strategy within 90 days after surgery.
Trial registration: ClinicalTrials.gov (identifier: NCT03369210).
Introduction: Hip fracture surgery is associated with high in-hospital and 30-day mortality rates and serious adverse patient outcomes. Evidence from randomised controlled trials regarding effectiveness of spinal versus general anaesthesia on patient-centred outcomes after hip fracture surgery is sparse.
Methods and analysis: The iHOPE study is a pragmatic national, multicentre, randomised controlled, open-label clinical trial with a two-arm parallel group design. In total, 1032 patients with hip fracture (>65 years) will be randomised in an intended 1:1 allocation ratio to receive spinal anaesthesia (n=516) or general anaesthesia (n=516). Outcome assessment will occur in a blinded manner after hospital discharge and in-hospital. The primary endpoint will be assessed by telephone interview and comprises the time to the first occurring event of the binary composite outcome of all-cause mortality or new-onset serious cardiac and pulmonary complications within 30 postoperative days. In-hospital secondary endpoints, assessed via in-person interviews and medical record review, include mortality, perioperative adverse events, delirium, satisfaction, walking independently, length of hospital stay and discharge destination. Telephone interviews will be performed for long-term endpoints (all-cause mortality, independence in walking, chronic pain, ability to return home, cognitive function, and overall health and disability) at postoperative days 30±3, 180±45 and 365±60.
Ethics and dissemination: iHOPE has been approved by the leading Ethics Committee of the Medical Faculty of the RWTH Aachen University on 14 March 2018 (EK 022/18). Approval from all other involved local Ethical Committees was subsequently requested and obtained. Study started in April 2018 with a total recruitment period of 24 months. iHOPE will be disseminated via presentations at national and international scientific meetings or conferences and publication in peer-reviewed international scientific journals.
Trial registration number: DRKS00013644; Pre-results
In contrast to several smaller studies, which demonstrate that remote ischemic preconditioning (RIPC) reduces myocardial injury in patients who undergo cardiovascular surgery, the RIPHeart study failed to demonstrate beneficial effects on troponin release and clinical outcome in propofol-anesthetized cardiac surgery patients. Therefore, we addressed the potential biochemical mechanisms triggered by RIPC. This is a predefined prospective sub-analysis of the recently published randomized and controlled RIPHeart study in cardiac surgery patients (n = 40). Blood samples were drawn from patients prior to surgery, after RIPC of four cycles of 5 min arm ischemia/5 min reperfusion (n = 19) or the sham procedure (n = 21), after connection to cardiopulmonary bypass (CPB), at the end of surgery, 24 h postoperatively, and 48 h postoperatively for the measurement of troponin T, macrophage migration inhibitory factor (MIF), stromal cell-derived factor 1 (CXCL12), IL-6, CXCL8, and IL-10. After RIPC, right atrial tissue samples were taken for the measurement of extracellular-signal regulated kinase (ERK1/2), protein kinase B (AKT), glycogen synthase kinase 3 (GSK-3β), protein kinase C (PKCε), and MIF content. RIPC did not significantly reduce the troponin release when compared with the sham procedure. MIF serum levels increased intraoperatively, peaking at intensive care unit (ICU) admission (with an increase of 48.04%, p = 0.164, in RIPC and 69.64%, p = 0.023, over the baseline in the sham procedure), and decreased back to the baseline 24 h after surgery, with no differences between the groups. In the right atrial tissue, MIF content decreased after RIPC (1.040 ± 1.032 arbitrary units [au] in RIPC vs. 2.028 ± 1.631 [au] in the sham procedure, p < 0.05). CXCL12 serum levels increased significantly over the baseline at the end of surgery, with no differences between the groups.
ERK1/2, AKT, GSK-3β, and PKCɛ phosphorylation in the right atrial samples did not differ between the groups. No difference was found in IL-6, CXCL8, and IL-10 serum levels between the groups. In this cohort of cardiac surgery patients who received propofol anesthesia, we could not show a release of potential mediators of signaling, an effect on the inflammatory response, or an activation of well-established protein kinases after RIPC. Based on these data, we cannot exclude that confounding factors, such as propofol, may have interfered with RIPC.