Background: Cell salvage is commonly used as part of a blood conservation strategy. However, concerns exist among clinicians about the efficacy of transfusing washed salvaged blood.
Methods: We performed a meta-analysis of randomized controlled trials in which patients, scheduled for all types of surgery, were randomized to washed cell salvage or to a control group with no cell salvage. Data were independently extracted; risk ratios (RR) and weighted mean differences (WMD) with 95% confidence intervals (CIs) were calculated. Data were pooled using a random effects model. The primary endpoint was the number of patients exposed to allogeneic red blood cell (RBC) transfusion.
Results: Out of 1140 search results, a total of 47 trials were included. Overall, the use of washed cell salvage reduced the rate of exposure to allogeneic RBC transfusion by a relative 39% (RR = 0.61; 95% CI 0.57 to 0.65; P < 0.001), resulting in an average saving of 0.20 units of allogeneic RBC per patient (WMD = -0.20; 95% CI -0.22 to -0.18; P < 0.001), reduced risk of infection by 28% (RR = 0.72; 95% CI 0.54 to 0.97; P = 0.03), and reduced length of hospital stay by 2.31 days (WMD = -2.31; 95% CI -2.50 to -2.11; P < 0.001), but did not significantly affect risk of mortality (RR = 0.92; 95% CI 0.63 to 1.34; P = 0.66). No statistically significant difference was observed in the number of patients exposed to re-operation, plasma, or platelets, or in the rates of myocardial infarction and stroke.
Conclusions: Washed cell salvage is efficacious in reducing the need for allogeneic RBC transfusion and risk of infection in surgery.
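For readers less familiar with the effect measures above: the risk ratio and its 95% CI are conventionally computed on the log scale from per-trial 2×2 event counts. The sketch below is a minimal illustration using invented counts, not data from this meta-analysis.

```python
import math

def risk_ratio(events_tx, n_tx, events_ctrl, n_ctrl):
    """Risk ratio with a 95% CI via the normal approximation on log(RR).
    Inputs are event counts and group sizes from a 2x2 table."""
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    # Standard error of log(RR) for cumulative-incidence data
    se = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Invented counts: 30/100 transfused with cell salvage vs. 50/100 without
rr, lo, hi = risk_ratio(30, 100, 50, 100)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # RR = 0.60
```

Pooling across trials (the random-effects step reported in the abstract) additionally weights each trial's log(RR) by the inverse of its variance plus a between-trial variance component.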
Background: The most common technique worldwide for quantifying blood loss during an operation is visual assessment by the attending intervention team. Every operating room is equipped with scaled suction canisters that collect fluids from the surgical field, and this scaling is commonly used by clinicians for visual assessment of intraoperative blood loss. While many studies have been conducted to quantify and reduce the inaccuracy of the visual estimation method, research has focused on the estimation of blood volume in surgical drapes. Whether and how the scaling of canisters correlates with actual blood loss, and how accurately clinicians estimate blood loss in scaled canisters, has not yet been the focus of research.
Methods: A simulation study with four “bleeding” scenarios was conducted using expired whole blood donations. After diluting the blood donations with full electrolyte solution, the sample blood loss volume (SBL) was transferred into suction canisters. The study participants then had to estimate the blood loss in all four scenarios, and the difference from the reference blood loss (RBL) was analysed per scenario.
Results: Fifty-three anaesthetists participated in the study. The median estimated blood loss was 500 ml (IQR 300/1150) compared to the RBL median of 281.5 ml (IQR 210.0/1022.0). Overestimations of up to 1233 ml and underestimations of up to 138 ml were detected. The visual estimate for canisters correlated strongly with RBL (Spearman’s rho: 0.818; p < 0.001). Univariate nonparametric testing showed that the deviation of the visual estimate from the actual blood loss was significant (z = −10.95, p < 0.001, n = 220). Participants’ experience level had no significant influence on the visually estimated blood loss (p = 0.402).
Conclusion: The discrepancies between the visual estimates and the actual blood loss are substantial despite the given scales. We therefore do not recommend visual estimation of blood loss in scaled suction canisters. Colorimetric blood loss estimation could be a more accurate option.
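The rank correlation reported above (Spearman's rho) is the Pearson correlation of the rank-transformed estimates against the rank-transformed reference volumes. A stdlib-only sketch, using invented toy volumes rather than the study's data:

```python
def ranks(values):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = mean_rank
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy estimated vs. reference blood loss volumes in ml (invented)
est = [500, 300, 1150, 250, 900]
ref = [281, 210, 1022, 300, 700]
print(spearman_rho(est, ref))  # 0.7 for this toy data
```

In practice a library routine (e.g. SciPy's `spearmanr`) would be used, but the rank-then-correlate structure is the same.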
Background: Nicolaides-Baraitser syndrome (NCBRS) is a rare disease caused by mutations in the SMARCA2 gene, which affects chromatin remodelling and leads to a wide range of symptoms including microcephaly, distinct facial features, recurrent seizures, and severe intellectual disability. To date, fewer than 100 cases have been reported. Case presentation: A 22-month-old male infant with NCBRS underwent elective cleft palate surgery. The anaesthetists were challenged by the physiological condition of the patient: narrow face, very small mouth, mild tachypnea, slight sternal retractions, physical signs of partial monosomy 9p, plagiocephalus, midface hypoplasia, V-shaped cleft palate, pronounced muscular hypotonia, bilateral dysplastic kidneys (estimated GFR: approx. 40 ml/m2), nocturnal oxygen demand, and combined apnea. In addition, little information was available about the interaction of the NCBRS displayed by the patient with anaesthesia medications. Conclusions: The cleft palate was successfully closed using the bridge flap technique. Overall, we recommend performing a trial of video-assisted laryngoscopy under spontaneous breathing with deep inhalational anaesthesia before administering muscle relaxation, to detect airway difficulties while preserving spontaneous breathing and protective reflexes.
Purpose: Trauma is the leading cause of death in children. In adults, blood transfusion and fluid resuscitation protocols have changed over the past two decades, resulting in decreased morbidity and mortality. Here, transfusion and fluid resuscitation practices were analysed in severely injured children in Germany.
Methods: Severely injured children (maximum Abbreviated Injury Scale (AIS) ≥ 3) admitted to a certified trauma-centre (TraumaZentrum DGU®) between 2002 and 2017 and registered at the TraumaRegister DGU® were included and assessed regarding blood transfusion rates and fluid therapy.
Results: 5,118 children (aged 1–15 years) with a mean ISS of 22 were analysed. Blood transfusion rates administered until ICU admission decreased from 18% (2002–2005) to 7% (2014–2017). Transfused children were increasingly severely injured: the mean ISS of transfused children aged 1–15 years increased from 27.7 (2002–2005) to 34.4 (2014–2017), while the mean ISS of non-transfused children in this age group decreased from 19.6 (2002–2005) to 17.6 (2014–2017). Mean prehospital fluid administration decreased from 980 to 549 ml without increasing hemodynamic instability.
Conclusion: Blood transfusion rates and the amount of fluid resuscitation decreased in severely injured children over a 16-year period in Germany. Restrictive blood transfusion and fluid management have become common practice in severely injured children. A restrictive prehospital fluid management strategy in severely injured children is not associated with a worsened hemodynamic state, abnormal coagulation or base excess, but leads to higher hemoglobin levels.
Transfusion of red blood cells (RBC) in patients undergoing major elective cranial surgery is associated with increased morbidity, mortality and prolonged hospital length of stay (LOS). This retrospective single-center study aims to identify the clinical outcome of RBC transfusion in skull base and non-skull base meningioma patients, including the identification of risk factors for RBC transfusion. Between October 2009 and October 2016, 423 patients underwent primary meningioma resection. Of these, 68 (16.1%) received RBC transfusion and 355 (83.9%) did not receive RBC units. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) compared to patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, postoperative complication rates as well as hospital LOS were significantly higher (p < 0.0001) compared to non-transfused patients. After multivariate analyses, risk factors for RBC transfusion were preoperative American Society of Anaesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.0001). Kaplan-Meier curves revealed a significant influence on overall survival of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative KPS ≤ 60% and age (elderly ≥ 75 years). We conclude that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in multivariate analysis. Further studies evaluating the impact of preoperative anaemia management for the reduction of RBC transfusion are needed to improve the clinical outcome of meningioma patients.
Background. Tracheal intubation still represents the "gold standard" for securing the airway of unconscious patients in the prehospital setting. Video laryngoscopy has become increasingly relevant, especially in cases of restricted access to the patient.
Objectives. The aim of the study was to evaluate the performance and intubation success of four different video laryngoscopes, one optical laryngoscope, and a Macintosh blade while intubating from two different positions in a mannequin trial with difficult access to the patient.
Methods. A mannequin with a cervical collar was placed on the driver’s seat. Intubation was performed with six different laryngoscopes either through the driver’s window or from the backseat. Success, C/L score, time to best view (TTBV), time to intubation (TTI), and number of attempts were measured. All participants were asked to rate their favored device.
Results. Forty-two physicians participated. All intubations performed from the backseat were successful. Intubation through the driver’s window was less successful; only with the Airtraq® optical laryngoscope was 100% success achieved. Best visualization (window C/L 2a; backseat C/L 2a) and shortest TTBV (window 4.7 s; backseat 4.1 s) were obtained with the D-Blade video laryngoscope, but this was not associated with higher success through the driver’s window. The fastest TTI was achieved through the window (14.2 s) using the C-MAC video laryngoscope and from the backseat (7.3 s) using a Macintosh blade.
Conclusions. Video laryngoscopy revealed better results in visualization but was not associated with a higher success. Success depended on the approach and familiarity with the device. We believe that video laryngoscopy is suitable for securing airways in trapped accident victims. The decision for an optimal device is complicated and should be based upon experience and regular training with the device.
Characterization of neonates born to mothers with SARS-CoV-2 infection: review and meta-analysis
(2020)
Characterization of neonates born to mothers with SARS-CoV-2 infection has been partially carried out, and no systematic review has provided a holistic neonatal presentation including possible vertical transmission. A systematic literature search was performed using PubMed, Google Scholar and Web of Science up to June 6, 2020. Studies on neonates born to mothers with SARS-CoV-2 infection were included. A binary random effect model was used for prevalence and 95% confidence intervals. 32 studies involving 261 neonates were included in the meta-analysis. Most neonates born to infected mothers did not show any clinical abnormalities (80.4%). Clinical features were dyspnea in 11 (42.3%) and fever in 9 newborns (19.1%). Of 261 neonates, 120 were tested for infection, of whom 12 (10.0%) tested positive. Swabs from placenta, cord blood and vaginal secretion were negative. Neonates are mostly unaffected by the mother's SARS-CoV-2 infection. The risk of vertical transmission is low.
Introduction: Cell salvage (CS) is an integral part of patient blood management (PBM) and aims to reduce allogeneic red blood cell (RBC) transfusion.
Material and methods: This observational study analysed patients scheduled for elective cardiac surgery requiring cardiopulmonary bypass (CPB) between November 2015 and October 2018. Patients were divided into a CS group (patients receiving CS) and a control group (no CS). Primary endpoints were the number of patients exposed to allogeneic RBC transfusions and the number of RBC units transfused per patient.
Results: A total of 704 patients undergoing cardiac surgery were analysed, of whom 338 underwent surgery with CS (CS group) and 366 without CS (control group). Intraoperatively, 152 patients (45%) in the CS group and 93 patients (25%) in the control group were exposed to allogeneic RBC transfusions (P < 0.001). Taking the amount of intraoperative blood loss into account, regression analysis revealed a significant association between blood loss and increased use of RBC units in patients of the control group compared to the CS group (1000 mL: 1.0 vs. 0.6 RBC units; 2000 mL: 2.2 vs. 1.1 RBC units; 3000 mL: 3.4 vs. 1.6 RBC units). Thus, CS was associated with a reduction in allogeneic RBC use of 40% at 1000 mL, 49% at 2000 mL, and 52% at 3000 mL of blood loss compared to patients without CS.
Conclusions: Cell salvage was significantly associated with a reduced number of allogeneic RBC transfusions, supporting its beneficial effect in cardiac surgical patients as an individual measure within a comprehensive PBM program.
Introduction: Systemic inflammation (e.g. following surgery) involves Toll-like receptor (TLR) signaling and leads to an endocrine stress response. This study aims to investigate a possible influence of TLR2 and TLR4 single nucleotide polymorphisms (SNPs) on perioperative adrenocorticotropic hormone (ACTH) and cortisol regulation in the serum of cardiac surgical patients. To investigate the link to systemic inflammation in this context, we additionally measured 10 different cytokines in the serum. Methods: 338 patients admitted for elective cardiac surgery were included in this prospective observational clinical cohort study. Genomic DNA of patients was screened for TLR2 and TLR4 SNPs. Serum concentrations of ACTH, cortisol, interferon (IFN)-γ, interleukin (IL)-1β, IL-2, IL-4, IL-5, IL-6, IL-8, IL-10, tumor necrosis factor (TNF)-α and granulocyte macrophage-colony stimulating factor (GM-CSF) were determined before surgery, immediately after surgery and on the first postoperative day. Results: 13 patients were identified as TLR2 SNP carriers, 51 as TLR4 SNP carriers and 274 patients as non-carriers. Basal levels of ACTH, cortisol and cytokines did not differ between groups. In all three groups a significant, transient perioperative rise of cortisol was observed. However, only in the non-carrier group was this accompanied by a significant ACTH rise; TLR4 SNP carriers had significantly lower ACTH levels compared to non-carriers ((mean [95% confidence interval]) non-carriers: 201.9 [187.7 to 216.1] pg/ml; TLR4 SNP carriers: 149.9 [118.4 to 181.5] pg/ml; TLR2 SNP carriers: 176.4 [110.5 to 242.3] pg/ml).
Compared to non-carriers, TLR4 SNP carriers showed significantly lower serum IL-8, IL-10 and GM-CSF peaks ((mean [95% confidence interval]): IL-8: non-carriers: 42.6 [36.7 to 48.5] pg/ml, TLR4 SNP carriers: 23.7 [10.7 to 36.8] pg/ml; IL-10: non-carriers: 83.8 [70.3 to 97.4] pg/ml, TLR4 SNP carriers: 54.2 [24.1 to 84.2] pg/ml; GM-CSF: non-carriers: 33.0 [27.8 to 38.3] pg/ml, TLR4 SNP carriers: 20.2 [8.6 to 31.8] pg/ml). No significant changes over time or between the groups were found for the other cytokines. Conclusions: Regulation of the immunoendocrine stress response during systemic inflammation is influenced by the presence of a TLR4 SNP. Cardiac surgical patients carrying this genotype showed decreased serum concentrations of ACTH, IL-8, IL-10 and GM-CSF. This finding may have an impact on interpreting previous trials and designing future trials on diagnosing and modulating immunoendocrine dysregulation (e.g. adrenal insufficiency) during systemic inflammation and sepsis.
Introduction: It has been proposed that individual genetic variation contributes to the course of severe infections and sepsis. Recent studies of single nucleotide polymorphisms (SNPs) within the endotoxin receptor and its signaling system showed an association with the risk of disease development. This study aims to examine the effect of genetic variations of TLR4, the receptor for bacterial LPS, and of a central intracellular signal transducer (TIRAP/Mal) on cytokine release, and their role in susceptibility to and course of severe hospital-acquired infections in distinct patient populations. Methods: Three intensive care units in tertiary care university hospitals in Greece and Germany participated. 375 and 415 postoperative patients and 159 patients with ventilator-associated pneumonia (VAP) were included. TLR4 and TIRAP/Mal polymorphisms in 375 general surgical patients were associated with risk of infection, clinical course and outcome. In two prospective studies, 415 patients following cardiac surgery and 159 patients with newly diagnosed VAP predominantly caused by Gram-negative bacteria were studied for in-vivo cytokine levels, cytokine production after ex-vivo monocyte stimulation, and clinical course. Results: Patients simultaneously carrying polymorphisms in TIRAP/Mal and TLR4 and patients homozygous for the TIRAP/Mal SNP had a significantly higher risk of severe infections after surgery (odds ratio (OR) 5.5; confidence interval (CI): 1.34 - 22.64; P = 0.02 and OR: 7.3; CI: 1.89 - 28.50; P < 0.01, respectively). Additionally, we found significantly lower circulating cytokine levels in double-mutant individuals with ventilator-associated pneumonia and reduced cytokine production in an ex-vivo monocyte stimulation assay, but this difference was not apparent in TIRAP/Mal-homozygous patients. In cardiac surgery patients without infection, the cytokine release profiles did not differ between genotypes.
Conclusions: Carriers of mutations in sequential components of the TLR signaling system may have an increased risk for severe infections. Patients with this genotype showed a decrease in cytokine release when infected which was not apparent in patients with sterile inflammation following cardiac surgery.
Background: Paediatric patients are vulnerable to blood loss, and even a small loss of blood can be associated with severe shock. In emergency situations, a red blood cell (RBC) transfusion may become unavoidable, although it is associated with various risks. The aim of this trial was to identify independent risk factors for perioperative RBC transfusion in children undergoing surgery. Methods: To identify independent risk factors for perioperative RBC transfusion in children undergoing surgery and to assess RBC transfusion rates and in-hospital outcomes (e.g., length of stay, mortality, and typical postoperative complication rates), a monocentric, retrospective, observational study was conducted. Descriptive, univariate, and multivariate analyses were performed. Results: Between 1 January 2010 and 31 December 2019, data from n = 14,248 cases were identified at the centre. Analysis revealed an RBC transfusion rate of 10.1% (n = 1439) in the entire cohort. The independent predictors of RBC transfusion were the presence of preoperative anaemia (p < 0.001; OR = 15.10 with preoperative anaemia and OR = 2.40 without preoperative anaemia), younger age (p < 0.001; ORs between 0.14 and 0.28 for children older than 0 years), female gender (p = 0.036; OR = 1.19 compared to male gender), certain types of surgery (e.g., neurosurgery (p < 0.001; OR = 10.14), vascular surgery (p < 0.001; OR = 9.93), cardiac surgery (p < 0.001; OR = 4.79), gynaecology (p = 0.014; OR = 3.64), and visceral surgery (p < 0.001; OR = 2.48)), and the presence of postoperative complications (e.g., sepsis (p < 0.001; OR = 10.16), respiratory dysfunction (p < 0.001; OR = 7.56), cardiovascular dysfunction (p < 0.001; OR = 4.68), neurological dysfunction (p = 0.029; OR = 1.77), and renal dysfunction (p < 0.001; OR = 16.17)).
Conclusion: Preoperative anaemia, younger age, female gender, certain types of surgery, and postoperative complications are independent predictors of RBC transfusion in children undergoing surgery. Future prospective studies are urgently required to identify, in detail, the potential risk factors and the impact of RBC transfusion in children.
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during transcatheter aortic valve implantation (TAVI) in nonventilated patients and its impact on procedural outcome. Methods and Results: We measured ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (female 58%, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post RVP 62.6% vs. 53.9% during RVP, p < .01; pre 61.6% vs. post RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and more often experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: During RVP, ScO2 declined significantly compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and is a valid cut-off value to screen for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk for cerebrovascular complications and delirium.
Introduction: In recent years, resource-saving handling of allogeneic blood products and a reduction of transfusion rates in adults have been observed. However, comparable published national data on transfusion practices in pediatric patients are currently not available. In this study, transfusion rates for children and adolescents were analyzed based on data from the Federal Statistical Office of Germany during the past two decades. Methods: Data were queried via the database of the Federal Statistical Office (Destatis). The period covered was 2005 to 2018, and the sample comprised children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions and for procedures or interventions with increased transfusion risk were queried and evaluated in detail. Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total collective of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4-year-olds (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were most frequently transfused (in 2018, 40.2% of the children were cared for in hospital). Transfusion-associated procedures such as chemotherapy or mechanical ventilation and respiratory support for newborns and infants are on the rise. Conclusion: Transfusion rates are declining in children and adolescents overall, but the reasons for increases in transfusion rates in some age groups are unclear. Prospective studies to evaluate transfusion rates and triggers in children are urgently needed.
Background: Trauma may be associated with significant to life-threatening blood loss, which in turn may increase the risk of complications and death, particularly in the absence of adequate treatment. Hydroxyethyl starch (HES) solutions are used for volume therapy to treat hypovolemia due to acute blood loss to maintain or re-establish hemodynamic stability with the ultimate goal to avoid organ hypoperfusion and cardiovascular collapse. The current study compares a 6% HES 130 solution (Volulyte 6%) versus an electrolyte solution (Ionolyte) for volume replacement therapy in adult patients with traumatic injuries, as requested by the European Medicines Agency to gain more insights into the safety and efficacy of HES in the setting of trauma care.
Methods: TETHYS is a pragmatic, prospective, randomized, controlled, double-blind, multicenter, multinational trial performed in two parallel groups. Eligible consenting adults ≥ 18 years, with an estimated blood loss of ≥ 500 ml, and in whom initial surgery is deemed necessary within 24 h after blunt or penetrating trauma, will be randomized to receive intravenous treatment at an individualized dose with either a 6% HES 130 or an electrolyte solution, for a maximum of 24 h or until reaching the maximum daily dose of 30 ml/kg body weight, whichever occurs first. The sample size is estimated as 175 patients per group, 350 patients in total (α = 0.025 one-tailed, power 1–β = 0.8). The composite primary endpoint, evaluated in an exploratory manner, will be 90-day mortality and 90-day renal failure, defined as AKIN stage ≥ 2, RIFLE injury/failure stage, or use of renal replacement therapy (RRT) during the first 3 months. Secondary efficacy and safety endpoints are fluid administration and balance, changes in vital signs and hemodynamic status, changes in laboratory parameters including renal function, coagulation, and inflammation biomarkers, incidence of adverse events during the treatment period, hospital and intensive care unit (ICU) length of stay, fitness for ICU or hospital discharge, and duration of mechanical ventilation and/or RRT.
Discussion: This pragmatic study will increase the evidence on safety and efficacy of 6% HES 130 for treatment of hypovolemia secondary to acute blood loss in trauma patients.
Trial registration: Registered in EudraCT, No. 2016-002176-27 (21 April 2017) and ClinicalTrials.gov, ID: NCT03338218 (09 November 2017).
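As a rough cross-check of the stated sample size (175 per group at one-tailed α = 0.025 and power 0.8), the standard normal-approximation formula for comparing two proportions can be sketched as below. The event proportions used are purely illustrative, since the protocol's planning assumptions are not given in the abstract.

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.025, power=0.8):
    """Per-group sample size for a two-proportion comparison
    (normal approximation, one-sided alpha)."""
    z = NormalDist().inv_cdf
    z_a, z_b = z(1 - alpha), z(power)
    numerator = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))
    return math.ceil(numerator / (p1 - p2) ** 2)

# Illustrative proportions only (not the TETHYS planning assumptions)
print(n_per_group(0.30, 0.15))  # 118
```

A smaller assumed difference between groups drives the required n up quadratically, which is why trials powered for composite endpoints often land in the hundreds per arm.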
Background: Acute bleeding requires fast and targeted therapy; therefore, knowledge of the patient's potential to form a clot is crucial. Point-of-care testing (POCT) provides fast and reliable information on coagulation, whereas structural circumstances, such as person-based sample transport, can delay the reporting of results. The aim of the present study was to investigate the diagnostic quality and accuracy of POCT INR diagnostics compared with standard laboratory analysis (SLA), as well as the time advantage of a pneumatic tube over a person-based transport system. Methods: Two groups of haemorrhagic patients (EG: emergency department; OG: delivery room; each n = 12) were examined in the context of bleeding emergencies using POCT and SLA. Samples were transported via a pneumatic tube system or by a personal transport service. Results: INR results from POCT and SLA showed a high and significant correlation (EG: p < 0.001; OG: p < 0.001). POCT results were reported significantly more quickly (EG: 1.1 vs. 39.6 min; OG: 2.0 vs. 75.0 min; p < 0.001) and required less time for analysis (EG: 0.3 vs. 24.0 min; OG: 0.5 vs. 45.0 min; p < 0.001) compared to SLA. Transportation via the pneumatic tube was significantly faster (8.0 vs. 18.5 min; p < 0.001) than via the personal transport service. Conclusion: These results suggest that POCT may be a suitable method for emergency coagulation diagnostics and may serve as a prognostic element in haemotherapy algorithms to initiate targeted haemotherapy at an early point in time.
Background: Nerve injury induced protein 1 (Ninjurin 1 (Ninj1)) was first identified in Schwann cells and neurons, contributing to cell adhesion and nerve regeneration. Recently, the role of Ninj1 has been linked to inflammatory processes in the central nervous system, where functional repression reduced leukocyte infiltration and clinical disease activity during experimental autoimmune encephalomyelitis in mice [1]. However, Ninj1 is also expressed outside the nervous system, in various organs such as the liver and kidney as well as on leukocytes [2,3]. We therefore hypothesized that Ninj1 contributes to inflammation in general, that is, also outside the nervous system, with special interest in the pathogenesis of sepsis.
Methods: Ninj1 was repressed by transfecting HMEC-1 cells, a human dermal microvascular endothelial cell line, with siRNA targeting Ninj1 (siNinj1) or a negative control (siC). Subsequently, cells were stimulated with 100 ng/ml LPS (TLR4 agonist), 3 μg/ml LTA (TLR2 agonist) or 100 ng/ml poly(I:C) (TLR3 agonist) for 3 hours. The inflammatory response was analyzed by real-time PCR. In addition, transmigration of neutrophils across an HMEC-1 monolayer was measured using transwell plates (pore size 3 μm).
Results: Repression of Ninj1 by siRNA reduced Ninj1 mRNA expression in HMEC-1 cells by about 90% (Figure 1A). Reduced Ninj1 expression decreased neutrophil migration to 62.5% (Figure 1B) and attenuated TLR signaling. In detail, knockdown of Ninj1 significantly reduced TLR2- and TLR4-triggered expression of ICAM-1 and IL-6 (Figure 1C,D), while poly(I:C)-induced expression was only slightly reduced. To analyze a more specific TLR3 target, we measured IP-10 mRNA expression, which was also significantly reduced in siNinj1-transfected cells (Figure 1E).
Conclusion: Our in vitro data strongly indicate that Ninj1 is involved in the regulation of TLR signaling and thereby contributes to inflammation. In vivo experiments will clarify its impact on systemic inflammation.
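The ~90% knockdown above was quantified by real-time PCR. A common way to express such relative mRNA levels is the 2^-ΔΔCt method; note this is an assumption for illustration, as the abstract does not state which quantification method the authors used, and the Ct values below are invented.

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^(-ddCt) method: normalize the target
    gene's Ct to a reference gene in both sample and control, then
    express the sample relative to the control."""
    ddct = (ct_target_sample - ct_ref_sample) - (ct_target_ctrl - ct_ref_ctrl)
    return 2 ** (-ddct)

# Invented Ct values: a shift of ~3.3 cycles corresponds to ~10% residual
# expression, i.e. roughly the 90% knockdown reported above
print(round(fold_change(28.0, 20.0, 24.68, 20.0), 2))  # 0.1
```

Because Ct is a log2-scale quantity, each additional cycle of delay halves the inferred expression, which is why a ~3.3-cycle shift maps to a ~10-fold reduction.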
BACKGROUND: Recent findings support the idea that interleukin (IL)-22 serum levels are related to disease severity in end-stage liver disease. Existing scoring systems, the Model for End-Stage Liver Disease (MELD), Survival Outcomes Following Liver Transplantation (SOFT) and Pre-allocation SOFT (P-SOFT), are well established for appraising survival rates with or without liver transplantation. We tested the hypothesis that IL-22 serum levels at the transplantation date correlate with survival and potentially have value as a predictive factor for survival.
MATERIAL AND METHODS: MELD, SOFT, and P-SOFT scores were calculated to estimate post-transplantation survival. Serum levels of IL-22, IL-6, IL-10, C-reactive protein (CRP), and procalcitonin (PCT) were collected prior to transplantation in 41 patients. Outcomes were assessed at 3 months, 1 year, and 3 years after transplantation.
RESULTS: IL-22 correlated significantly with MELD, P-SOFT, and SOFT scores (Rs = 0.35, 0.63, and 0.56, respectively; p < 0.05) and with the discrimination of post-transplantation survival. IL-6 showed a heterogeneous pattern (Rs = 0.40, 0.63, and 0.57, respectively; p < 0.05); CRP and PCT did not correlate. We therefore added IL-22 serum values to the existing scoring systems in a generalized linear model (GLM), resulting in a significantly improved outcome prediction in 58% of cases for both the P-SOFT (p < 0.01) and SOFT scores (p < 0.001).
CONCLUSIONS: Further studies are needed to address the concept that IL-22 serum values at the time of transplantation provide valuable information about survival rates following orthotopic liver transplantation.
Background: Approximately one in three patients suffers from preoperative anaemia. Even though haemoglobin is measured before surgery, anaemia management is not implemented in every hospital. Objective: Here, we demonstrate the implementation of an anaemia walk-in clinic at an orthopaedic university hospital. To improve the diagnosis of iron deficiency (ID), we examined whether reticulocyte haemoglobin (Ret-He) could be a useful additional parameter. Material and Methods: In August 2019, an anaemia walk-in clinic was established. Between September and December 2019, major orthopaedic surgical patients were screened for preoperative anaemia. The primary endpoint was the incidence of preoperative anaemia. Secondary endpoints included Ret-He level, red blood cell (RBC) transfusion rate, in-hospital length of stay and anaemia at hospital discharge. Results: A total of 104 patients were screened for anaemia. The preoperative anaemia rate was 20.6%. Intravenous iron was supplemented in 23 patients. Transfusion of RBC units per patient (1.7 ± 1.2 vs. 0.2 ± 0.9; p = 0.004) and hospital length of stay (13.1 ± 4.8 days vs. 10.6 ± 5.1 days; p = 0.068) were increased in anaemic compared to non-anaemic patients. Ret-He values were significantly lower in patients with ID anaemia (33.3 pg [28.6–40.2 pg]) compared to patients with ID (35.3 pg [28.9–38.6 pg]; p = 0.015) or patients without anaemia (35.4 pg [30.2–39.4 pg]; p = 0.001). Conclusion: Preoperative anaemia is common in orthopaedic patients. Our results demonstrate the feasibility of an anaemia walk-in clinic for managing preoperative anaemia. Furthermore, our analysis supports the use of Ret-He as an additional parameter for the diagnosis of ID in surgical patients.
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Depending on local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on the accuracy of estimation has not yet been a focus of research. Material and methods: A blood loss simulation study with four "bleeding" scenarios per sponge colour was created using expired whole blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with full electrolyte solution. Study participants had to estimate the blood loss absorbed in the sponges in all scenarios, and the difference to the reference blood loss (RBL) was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour. Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with reference blood loss in white (Spearman's rho: 0.521; p = 3.748*10−16) and green sponges (Spearman's rho: 0.452; p = 4.683*10−12). The median visually estimated blood loss was higher in white sponges (250 ml, IQR 150–412.5 ml) than in green sponges (150 ml, IQR 100–300 ml), compared with the reference blood loss (103 ml, IQR 86–162.8 ml). For both sponge colours, major under- and overestimation was observed. Multivariate statistics demonstrated that fabric colour has a significant influence on estimation (p = 3.04*10−10), as do the clinician's qualification level (p = 2.20*10−10, p = 1.54*10−08) and the amount of RBL to be estimated (p < 2*10−16). Conclusion: The deviation from the correct blood loss was smaller with white surgical sponges than with green sponges. In general, however, deviations were so severe for both sponge types that it appears advisable to refrain from visually estimating blood loss whenever possible and to use other techniques instead, such as colorimetric estimation.
The scope of extracorporeal membrane oxygenation (ECMO) is expanding; nevertheless, pharmacokinetics in patients receiving cardiorespiratory support are largely unknown, leading to unpredictable drug concentrations. Currently, there are no clear guidelines for antibiotic dosing during ECMO. This study aimed to evaluate the pharmacokinetics (PK) of cefazolin in patients undergoing ECMO treatment. Total and unbound plasma cefazolin concentrations of critically ill patients on veno-arterial ECMO were determined. The observed PK was compared to dose recommendations calculated by freely available online dosing software. Cefazolin concentrations varied broadly despite identical dosing in all patients. The mean total and unbound plasma concentrations were high, with a significantly (p = 5.89*10−9) greater unbound fraction compared to a standard patient. Cefazolin clearance was significantly (p = 0.009) higher in patients with preserved renal function than in those on continuous renal replacement therapy (CRRT). Based upon the calculated clearance, the use of dosing software would in general have led to lower but still sufficient cefazolin concentrations. Our study shows that a "one size fits all" dosing regimen leads to excessive unbound cefazolin concentrations in these patients. They exhibit high PK variability, and decreased cefazolin clearance on ECMO appears to compensate for ECMO- and critical-illness-related increases in volume of distribution.
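The clearance-based reasoning above rests on elementary PK relations; a minimal sketch (the infusion rate, clearance, and unbound fraction below are illustrative assumptions, not study values or the cited software's algorithm):

```python
def steady_state_concentration(infusion_rate_mg_per_h, clearance_l_per_h):
    """At steady state a continuous infusion satisfies C_ss = rate / CL,
    the basic relation that clearance-based dosing software builds on."""
    return infusion_rate_mg_per_h / clearance_l_per_h

def unbound_concentration(total_mg_per_l, unbound_fraction):
    """Unbound (microbiologically active) concentration from the measured
    total and an assumed unbound fraction; cefazolin is highly protein
    bound in healthy subjects but less so in critical illness."""
    return total_mg_per_l * unbound_fraction

# Illustration: 2 g q8h ~ 250 mg/h; assumed clearance 2.5 L/h
c_ss = steady_state_concentration(250, 2.5)   # 100 mg/L total
c_u = unbound_concentration(c_ss, 0.4)        # 40 mg/L unbound
```

A lower clearance (e.g. under CRRT) raises C_ss proportionally, which is why a "one size fits all" regimen overshoots in some patients.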
Background: The pro-inflammatory status of the elderly triggers most age-related diseases such as cancer and atherosclerosis. Atherosclerosis, the leading cause of morbidity and death worldwide, is an inflammatory disease influenced by lifestyle and genetic host factors. Stimuli such as oxLDL or microbial ligands have been proposed to trigger the inflammation leading to atherosclerosis. It has recently been shown that oxLDL activates immune cells via the Toll-like receptor (TLR) 4/6 complex. Several common single nucleotide polymorphisms (SNPs) of the TLR system have been associated with atherosclerosis. To investigate the role of TLR-6, we analyzed the association of the TLR-6 SNP Pro249Ser with atherogenesis.
Results: Genotyping of two independent groups with coronary artery disease (CAD), as well as of healthy controls, revealed a significant association of the homozygous genotype with a reduced risk for atherosclerosis (odds ratio: 0.69, 95% CI 0.51-0.95, P = 0.02). We also found a trend towards an association with the risk of restenosis after transluminal coronary angioplasty (odds ratio: 0.53, 95% CI 0.24-1.16, P = 0.12). Furthermore, first evidence is presented that the frequency of this protective genotype increases with age in a healthy population. Taken together, our results define a role for TLR-6 and its genetic variations in modulating the inflammatory response leading to atherosclerosis.
Conclusions: These results may lead to better risk stratification and potentially to improved prophylactic treatment of high-risk populations. Furthermore, the protective effect of this polymorphism may lead to an enrichment of this genotype among the healthy elderly, so that it may serve as a novel genetic marker for well-being during aging.
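Odds ratios with 95% confidence intervals of the kind reported above can be computed from a 2×2 genotype-by-disease table with the standard Woolf logit method; a minimal sketch (the counts below are hypothetical and chosen only to land near the reported OR, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf logit method) for a 2x2 table:
    a = carrier cases, b = non-carrier cases,
    c = carrier controls, d = non-carrier controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical genotype counts, for illustration only
or_, lower, upper = odds_ratio_ci(60, 140, 90, 145)
```

The same calculation with the restenosis counts would reproduce the second, wider interval; the width of the CI is driven by the smallest cell count.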
Background: Due to the coronavirus disease 2019 (COVID-19) pandemic, interventions in the upper airways are considered high-risk procedures for otolaryngologists and their colleagues. The purpose of this study was to evaluate limitations in hearing and communication when using a powered air-purifying respirator (PAPR) system to protect against severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) transmission and to assess the benefit of a headset. Methods: Acoustic properties of the PAPR system were measured using a head and torso simulator. Audiological tests (tone audiometry, Freiburg speech test, Oldenburg sentence test (OLSA)) were performed in normal-hearing subjects (n = 10) to assess hearing with PAPR. The audiological test setup also included simulation of conditions in which the target speaker used either a PAPR, a filtering face piece (FFP) 3 respirator, or a surgical face mask. Results: Audiological measurements revealed that sound insulation by the PAPR headtop and noise, generated by the blower-assisted respiratory protection system, resulted in significantly deteriorated hearing thresholds (4.0 ± 7.2 dB hearing level (HL) vs. 49.2 ± 11.0
Objective. Evaluation of C-MAC PM® in combination with a standard Macintosh blade size 3 in direct and indirect laryngoscopy and D-Blade® in indirect laryngoscopy in a simulated difficult airway. Primary outcome was defined as the best view of the glottic structures. Secondary endpoints were subjective evaluation and assessment of the intubation process.
Methods. Prospective, monocentric, observational study of 48 adult patients without predictors of difficult laryngoscopy/tracheal intubation undergoing orthopedic surgery. Every participant preoperatively received a cervical collar to simulate a difficult airway. Direct and indirect laryngoscopy with and without the BURP maneuver using a standard Macintosh blade, and indirect laryngoscopy with and without the BURP maneuver using the D-Blade®, were performed to evaluate whether blade geometry and the BURP maneuver improve the glottic view as measured by the Cormack-Lehane score.
Results. Using a C-MAC PM® laryngoscope, D-Blade® yielded improved glottic views compared with the Macintosh blade used with either the direct or indirect technique. Changing from direct laryngoscopy using a Macintosh blade to indirect videolaryngoscopy using C-MAC PM® with D-Blade® improved the Cormack-Lehane score from IIb, III, or IV to I or II in 31 cases.
Conclusion. The combination of C-MAC PM® and D-Blade® significantly enhances the view of the glottis compared to direct laryngoscopy with a Macintosh blade in patients with a simulated difficult airway.
Trial Registration Number. This trial is registered under number NCT03403946.
Background: Iron deficiency (ID) is one of the most common nutritional deficiencies in children worldwide and may result in iron deficiency anemia (IDA). The reticulocyte hemoglobin equivalent (Ret-He) provides information about the current availability of iron for erythropoiesis. This study aimed to validate Ret-He as a screening marker for ID and IDA in children. Methods: Blood samples were retrospectively obtained from medical records. Anemia was defined according to the World Health Organization (WHO) definition for children. ID was defined by transferrin saturation (TSAT) < 20% and ferritin < 100 ng/mL. Children were classified into four groups: IDA, non-anemic iron deficiency (NAID), control, and others. Results: Of 970 children, 332 (34.2%) had NAID and 278 (28.7%) presented with IDA. Analysis revealed that Ret-He correlates significantly with ferritin (rho = 0.41; p < 0.001), TSAT (rho = 0.66; p < 0.001) and the soluble transferrin receptor (sTfR) (rho = −0.72; p < 0.001). In ROC analysis, the area under the curve (AUC) for Ret-He was 0.771 for detecting ID and 0.845 for detecting IDA. The cut-off value for Ret-He was 33.5 pg for diagnosing ID (sensitivity 90.7%; specificity 35.8%) and 31.6 pg for diagnosing IDA (sensitivity 90.6%; specificity 50.4%). Conclusions: The present study demonstrates that Ret-He is a suitable screening marker for ID and IDA in children and can be used as a single screening parameter without considering other iron parameters. Economically, the use of Ret-He is highly relevant, as it can save one blood tube per patient and the associated costs.
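The AUCs and cut-offs reported above follow from standard ROC logic; a minimal stdlib-only sketch, assuming that lower Ret-He values indicate disease (the values below are illustrative, not the study cohort):

```python
def auc_lower_indicates_disease(diseased, healthy):
    """Mann-Whitney estimate of the ROC AUC when lower marker values
    indicate disease (as with Ret-He in iron deficiency); ties count 0.5."""
    wins = sum((d < h) + 0.5 * (d == h) for d in diseased for h in healthy)
    return wins / (len(diseased) * len(healthy))

def youden_cutoff(diseased, healthy):
    """Cutoff maximizing Youden's J = sensitivity + specificity - 1,
    classifying values <= cutoff as diseased."""
    best = None
    for c in sorted(set(diseased) | set(healthy)):
        sens = sum(d <= c for d in diseased) / len(diseased)
        spec = sum(h > c for h in healthy) / len(healthy)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

# Illustrative Ret-He values in pg (not the study's data)
ida = [28.1, 29.5, 30.2, 31.0, 33.4]
ctrl = [33.9, 35.2, 36.0, 37.4, 34.8]
auc = auc_lower_indicates_disease(ida, ctrl)
j, cutoff, sens, spec = youden_cutoff(ida, ctrl)
```

In practice a screening cut-off is often chosen for high sensitivity rather than maximal J, which is why the study's 33.5 pg threshold trades specificity for 90.7% sensitivity.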
Background: Following elective craniotomy, patients are routinely monitored on the intensive care unit (ICU). However, the benefit of ICU monitoring in these patients is controversial, and the current COVID-19 pandemic further limits ICU capacities. This study aimed to compare a strategy of postoperative management on the normal ward with standardized management of post-craniotomy patients on the ICU.
Methods: Two postoperative strategies were compared in a matched-pair analysis: The first cohort included patients treated between May-August 2021 according to the “No ICU – unless” concept (NIU group), where patients were managed on the normal ward postoperatively. The second cohort contained patients routinely admitted to the ICU between February-April 2021 (control group). Outcome parameters contained complications, length of stay, duration to first postoperative mobilization, number of unplanned imaging, number/type of ICU interventions and pre- and postoperative mRS. Patient characteristics were analyzed using electronic medical records.
Results: The NIU group consisted of 96 patients, the control group of 75 patients. Complication rates were comparable in both cohorts (16% in NIU vs. 17% in control; p=0.123). The groups did not differ significantly in the number of unplanned imaging studies (10% in NIU vs. 13% in control; p=0.67), in the type of ICU interventions (antihypertensive therapy: 5% (NIU) vs. 6% (control); p=0.825) or in the time to first postoperative mobilization (average 1.1 ± 1.6 days vs. 0.9 ± 1.2 days; p=0.402). Length of hospital stay was shorter in the NIU group without reaching statistical significance (average 5.8 vs. 6.8 days; p=0.481). There was no significant change in the distribution of preoperative (p=0.960) and postoperative (p=0.425) mRS scores.
Conclusion: Postoperative ICU management does not reduce postoperative complications and has no effect on the surgical outcome of elective craniotomies. The majority of postoperative complications are detected after a 24-hour observation period. This approach may represent a potential strategy to prevent overutilization of ICU capacities while maintaining sufficient postoperative care for neurosurgical patients.
Acute kidney injury (AKI) is one of the most important complications in hospitalized patients, and its pathomechanisms are not completely elucidated. We hypothesized that signaling via Toll-like receptor (TLR)-3, a receptor activated upon binding of double-stranded nucleotides, plays a crucial role in the pathogenesis of AKI following ischemia and reperfusion (IR). Male adult C57Bl6 wild-type (wt) mice and TLR-3 knock-out (-/-) mice were subjected to 30 minutes of bilateral selective clamping of the renal artery followed by reperfusion for 30 minutes, 2.5 hours, or 23.5 hours, or to sham procedures. TLR-3 downstream signaling was activated within 3 h of ischemia and reperfusion in post-ischemic kidneys of wt mice, leading to impaired blood perfusion followed by a strong pro-inflammatory response with significant neutrophil invasion. In contrast, this effect was absent in TLR-3-/- mice. Moreover, the rapid TLR-3 activation resulted in kidney damage that was histomorphologically associated with significantly increased apoptosis and necrosis rates in the renal tubules of wt mice. This finding was confirmed by increased levels of the kidney injury marker NGAL in wt mice and by a better preserved renal perfusion after IR in TLR-3-/- mice than in wt mice. Overall, the absence of TLR-3 is associated with lower cumulative kidney damage and maintained renal blood perfusion within the first 24 hours of reperfusion. We therefore conclude that TLR-3 appears to participate in the pathogenesis of early acute kidney injury.
Background: Perioperative anaemia leads to impaired oxygen supply with a risk of vital organ ischaemia. In healthy and fit individuals, anaemia can be compensated by several mechanisms. Elderly patients, however, have fewer compensatory mechanisms because of multiple co-morbidities and the age-related decline of functional reserves. The purpose of this study is to evaluate whether elderly surgical patients may benefit from a liberal red blood cell (RBC) transfusion strategy compared to a restrictive one.
Methods: The LIBERAL trial is a prospective, randomized, multicentre, controlled clinical phase IV trial randomising 2470 elderly (≥ 70 years) patients undergoing intermediate- or high-risk non-cardiac surgery. Registered patients will be randomised only if haemoglobin (Hb) reaches ≤ 9 g/dl during surgery or within 3 days after surgery, either to the LIBERAL group (transfusion of a single RBC unit when Hb ≤ 9 g/dl, with a target range for the post-transfusion Hb level of 9–10.5 g/dl) or to the RESTRICTIVE group (transfusion of a single RBC unit when Hb ≤ 7.5 g/dl, with a target range for the post-transfusion Hb level of 7.5–9 g/dl). The intervention per patient will be followed until hospital discharge or up to 30 days after surgery, whichever occurs first. The primary efficacy outcome is defined as a composite of all-cause mortality, acute myocardial infarction, acute ischaemic stroke, acute kidney injury (stage III), acute mesenteric ischaemia and acute peripheral vascular ischaemia within 90 days after surgery. Infections requiring intravenous antibiotics with re-hospitalisation are assessed as an important secondary endpoint. The primary endpoint will be analysed by logistic regression adjusting for age, cancer surgery (y/n) and type of surgery (intermediate- or high-risk), and incorporating centres as a random effect.
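The trigger/target logic of the two arms can be sketched as a small decision function (thresholds and target ranges are taken from the abstract; the function itself is only an illustration, not trial software):

```python
def transfusion_decision(hb_g_dl, arm):
    """Single-unit trigger/target logic as described for the LIBERAL
    trial arms: transfuse one RBC unit when Hb is at or below the arm's
    trigger, aiming for the arm's post-transfusion target range."""
    triggers = {"LIBERAL": 9.0, "RESTRICTIVE": 7.5}
    targets = {"LIBERAL": (9.0, 10.5), "RESTRICTIVE": (7.5, 9.0)}
    units = 1 if hb_g_dl <= triggers[arm] else 0
    return {"transfuse_units": units, "target_range": targets[arm]}
```

For example, a patient with Hb 8.8 g/dl would be transfused in the LIBERAL arm but observed in the RESTRICTIVE arm, which is exactly the between-arm exposure difference the trial exploits.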
Discussion: The LIBERAL-Trial will evaluate whether a liberal transfusion strategy reduces the occurrence of major adverse events after non-cardiac surgery in the geriatric population compared to a restrictive strategy within 90 days after surgery.
Trial registration: ClinicalTrials.gov (identifier: NCT03369210).
Cholinesterase alterations in delirium after cardiosurgery: a German monocentric prospective study
(2020)
Objectives: Postoperative delirium (POD) is a common complication after elective cardiac surgery. Recent evidence indicates that a disruption in the normal activity of the cholinergic system may be associated with delirium.
Design: Prospective observational study.
Setting: Single-centre at a European academic hospital.
Primary and secondary outcome measures: The enzyme activities of acetylcholinesterase (AChE) and butyrylcholinesterase (BChE) were determined preoperatively as well as on the first and second postoperative day. The Confusion Assessment Method for the Intensive Care Unit was used to screen patients for the presence of POD.
Results: A total of 114 patients were included in the study. POD was associated with a decrease in BChE activity on postoperative day 1 (p=0.03). In addition, patients who developed POD, had significantly lower preoperative AChE activity than patients without POD (p<0.01). Multivariate analysis identified a preoperatively decreased AChE activity (OR 3.1; 95% CI 1.14 to 8.46), anticholinergic treatment (OR 5.09; 95% CI 1.51 to 17.23), elevated European System for Cardiac Operative Risk Evaluation (OR 3.68; 95% CI 1.04 to 12.99) and age (OR 3.02; 95% CI 1.06 to 8.62) to be independently associated with the development of POD.
Conclusions: We conclude that a reduction in the acetylcholine hydrolysing enzyme activity in patients undergoing cardiac surgery may correlate with the development of POD.
Background: Approximately every third surgical patient is anemic. The most common form, iron deficiency anemia, results from persisting iron‐deficient erythropoiesis (IDE). Zinc protoporphyrin (ZnPP) is a promising parameter for diagnosing IDE, hitherto requiring blood drawing and laboratory workup.
Study design and methods: Noninvasive ZnPP (ZnPP‐NI) measurements were compared to reference determination of the ZnPP/heme ratio by high‐performance liquid chromatography (ZnPP‐HPLC), and the analytical performance in detecting IDE was evaluated against traditional iron status parameters likewise measured in blood (ferritin, transferrin saturation [TSAT], soluble transferrin receptor–ferritin index [sTfR‐F], soluble transferrin receptor [sTfR]). The study was conducted at the University Hospitals of Frankfurt and Zurich.
Results: Limits of agreement between ZnPP‐NI and ZnPP‐HPLC measurements for 584 cardiac and noncardiac surgical patients equaled 19.7 μmol/mol heme (95% confidence interval, 18.0–21.3; acceptance criterion, 23.2 μmol/mol heme; absolute bias, 0 μmol/mol heme). The analytical performance for detecting IDE (inferred from the area under the receiver operating characteristic curve) of the parameters measured in blood was: ZnPP‐HPLC (0.95), sTfR (0.92), sTfR‐F (0.89), TSAT (0.87), and ferritin (0.67). Noninvasively measured ZnPP‐NI yielded an AUC of 0.90.
Conclusion: ZnPP‐NI appears well suited for an initial IDE screening, informing on the state of erythropoiesis at the point of care without blood drawing and laboratory analysis. Comparison with a multiparameter IDE test revealed that ZnPP‐NI values of 40 μmol/mol heme or less allow exclusion of IDE, whereas at 65 μmol/mol heme or greater, IDE is very likely if other causes of increased values are excluded. In these cases (77% of our patients) ZnPP‐NI may suffice for a diagnosis, while values in between require analysis of additional iron status parameters.
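The limits-of-agreement comparison described above is the classic Bland-Altman calculation for paired methods; a minimal sketch (the paired values below are illustrative, not the study's measurements):

```python
import statistics as st

def limits_of_agreement(method_a, method_b, z=1.96):
    """Bland-Altman bias and 95% limits of agreement for paired
    measurements from two methods (e.g. ZnPP-NI vs ZnPP-HPLC):
    bias = mean difference, LoA = bias +/- 1.96 * SD of differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)
    return bias, (bias - z * sd, bias + z * sd)

# Illustrative paired readings
bias, (loa_low, loa_high) = limits_of_agreement(
    [10.0, 12.0, 14.0, 16.0],
    [9.0, 13.0, 13.0, 17.0],
)
```

A method pair "agrees" when the LoA are narrower than a pre-specified acceptance criterion, which is how the 19.7 vs 23.2 μmol/mol heme comparison above should be read.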
In-line filtration of intravenous infusion may reduce organ dysfunction of adult critical patients
(2019)
Background: The potential harmful effects of particle-contaminated infusions for critically ill adult patients are as yet unclear. So far, significantly improved outcomes with in-line filters have been demonstrated only in critically ill children and newborns; for adult patients, evidence is still missing.
Methods: This single-centre, retrospective controlled cohort study assessed the effect of in-line filtration of intravenous fluids with finer 0.2 or 1.2 μm vs 5.0 μm filters in critically ill adult patients. From a total of n = 3215 adult patients, n = 3012 patients were selected by propensity score matching (adjusting for sex, age, and surgery group) and assigned to either a fine filter cohort (with 0.2/1.2 μm filters, n = 1506, time period from February 2013 to January 2014) or a control filter cohort (with 5.0 μm filters, n = 1506, time period from April 2014 to March 2015). The cohorts were compared regarding the occurrence of severe vasoplegia, organ dysfunctions (lung, kidney, and brain), inflammation, in-hospital complications (myocardial infarction, ischemic stroke, pneumonia, and sepsis), in-hospital mortality, and length of ICU and hospital stay.
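The cohort-matching step can be illustrated with a greedy 1:1 nearest-neighbour match on a precomputed propensity score; this is a sketch of the general idea only, since the abstract does not specify the study's exact matching algorithm (IDs, scores, and the caliper below are hypothetical):

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching without replacement.
    treated, controls: lists of (id, propensity_score) pairs.
    A pair is accepted only if the score distance is within the caliper."""
    pool = dict(controls)
    pairs = []
    for tid, ts in sorted(treated, key=lambda x: x[1]):
        if not pool:
            break
        cid = min(pool, key=lambda c: abs(pool[c] - ts))
        if abs(pool[cid] - ts) <= caliper:
            pairs.append((tid, cid))
            del pool[cid]  # match without replacement
    return pairs

# Hypothetical scores (e.g. from a logistic model of sex, age, surgery group)
pairs = greedy_match(
    [("t1", 0.30), ("t2", 0.50)],
    [("c1", 0.31), ("c2", 0.52), ("c3", 0.90)],
)
```

Matching without replacement and a caliper keeps the two cohorts balanced on the modelled covariates, at the cost of discarding unmatched patients (203 of 3215 in the study above).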
Results: Comparing the fine filter with the control filter cohort, respiratory dysfunction (Horowitz index 206 (119–290) vs 191 (104.75–280); P = 0.04), pneumonia (11.4% vs 14.4%; P = 0.02), sepsis (9.6% vs 12.2%; P = 0.03), interleukin-6 (471.5 (258.8–1062.8) ng/l vs 540.5 (284.5–1147.5) ng/l; P = 0.01), length of ICU stay (1.2 (0.6–4.9) vs 1.7 (0.8–6.9) days; P < 0.01) and length of hospital stay (14.0 (9.2–22.2) vs 14.8 (10.0–26.8) days; P = 0.01) were reduced. The rates of severe vasoplegia (21.0% vs 19.6%; P > 0.20) and acute kidney injury (11.8% vs 13.7%; P = 0.11) did not differ significantly between the cohorts.
Conclusions: In-line filtration with finer 0.2 and 1.2 μm filters may be associated with less organ dysfunction and less inflammation in critically ill adult patients.
Trial registration: The study was registered at ClinicalTrials.gov (number: NCT02281604).
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells is the mainstay to compensate for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be provided in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes than central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing compared to CL services. Methods: Data from the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®) against CL as the reference method. For this, we analyzed pairs of Hb levels measured by CL and BGA within two hours of each other. Furthermore, the impact of various confounding factors including age, gender, BMI, smoker status, transfusion of RBC, intraoperative hemodilution, and co-medication was elucidated. To ensure adequate statistical analysis, only data from participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and GEM series® (−0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (−0.09 g/dl) but a greater median absolute deviation (± 0.45 g/dl). To avoid interference between the different standard deviations caused by the different analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms.
Smoker status showed the greatest reduction in bias (0.1 g/dl, p<0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p<0.001), statins (0.04 g/dl, p<0.05) and beta blockers (0.03 g/dl, p = 0.02) had a slight effect on bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions, such as the substitution of fluids, coagulation factors or RBC units, rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference to CL, our data confirm that BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that the bias decreased at smaller Hb values, which in turn are the values used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the performance of the LIBERAL trial.
Background: The use of cell salvage and autologous blood transfusion has become an important method of blood conservation. So far, there are no clinical data about the performance of the continuous autotransfusion device CATSmart.
Methods: In total, 74 patients undergoing either cardiac or orthopedic surgery were included in this prospective, bicentric, observational technical evaluation to validate the red cell separation process and washout quality of the CATSmart. The target for the red cell separation process was defined as a hematocrit of 55–75% in the packed red cell unit, and for washout quality as a removal ratio of 80–100%.
Results: Hematocrit values measured by the CATSmart and by laboratory analysis were 78.5% [71.3%; 84.0%] and 73.7% [67.5%; 75.5%], respectively. Removal ratios for platelets (94.7% [88.2%; 96.7%]), free hemoglobin (89.3% [85.2%; 94.9%]), albumin (97.9% [96.6%; 98.5%]), heparin (99.9% [99.9%; 100.0%]) and potassium (92.5% [90.8%; 95.0%]) were within the target range, while removal of white blood cells was slightly below target at 72.4% [57.9%; 87.3%].
Conclusion: The new autotransfusion device enables sufficient red cell separation and washout quality.
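A removal ratio like those reported above expresses the percentage of an unwanted solute eliminated during washing; a minimal sketch under the assumption of a mass-based definition (the abstract does not state the exact formula used in the study, and the values below are illustrative):

```python
def removal_ratio(c_reservoir, v_reservoir, c_product, v_product):
    """Percent of a solute's mass removed during cell salvage washing:
    compares the mass present in the collected wound blood (reservoir)
    with the mass remaining in the packed red cell product."""
    mass_in = c_reservoir * v_reservoir
    mass_out = c_product * v_product
    return 100.0 * (1.0 - mass_out / mass_in)

# Illustration: heparin at 100 U/ml in 1.0 l reservoir blood,
# 5 U/ml remaining in a 0.4 l packed red cell unit
rr = removal_ratio(100.0, 1.0, 5.0, 0.4)  # well above the 80% target
```

Comparing masses rather than raw concentrations matters because washing concentrates the red cells, which would otherwise distort a concentration-only comparison.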
Disruption of renal endothelial integrity is pivotal for the development of a vascular leak, tissue edema and, consequently, acute kidney injury. Kidney ischemia amplifies endothelial activation and the up-regulation of pro-inflammatory mechanisms. After a sufficient blood flow is restored, the kidney is damaged through complex pathomechanisms classically referred to as ischemia and reperfusion injury, in which the disruption of inter-endothelial connections appears to be a crucial step. Acting on this molecular cell-cell interaction, the fibrinopeptide Bβ15–42 prevents vascular leakage by stabilizing the inter-endothelial junctions: it associates with vascular endothelial cadherin, preserving blood perfusion and limiting edema formation, and thereby prevents early kidney dysfunction. We intended to demonstrate the early therapeutic benefit of intravenously administered Bβ15–42 in a mouse model of renal ischemia and reperfusion. After 30 minutes of ischemia, the fibrinopeptide Bβ15–42 was administered intravenously before reperfusion was commenced for 1 or 3 hours. We show that Bβ15–42 alleviates early functional and morphological kidney damage as soon as 1 h and 3 h after ischemia and reperfusion. Mice treated with Bβ15–42 displayed a significantly reduced loss of VE-cadherin, indicating a conserved endothelial barrier leading to less neutrophil infiltration, which in turn resulted in significantly reduced structural renal damage. The significant reduction in tissue and serum neutrophil gelatinase-associated lipocalin levels reinforced our findings. Moreover, renal perfusion analysis by color duplex sonography revealed that Bβ15–42 treatment preserved resistive indices and even improved blood velocity. Our data demonstrate the efficacy of early therapeutic intervention with the fibrinopeptide Bβ15–42 in the treatment of acute kidney injury resulting from ischemia and reperfusion.
In this context, Bβ15–42 may act as a potent renoprotective agent by preserving endothelial and vascular integrity.
Background: Mild therapeutic hypothermia following cardiac arrest is neuroprotective, but its effect on myocardial dysfunction, a critical issue following resuscitation, is not clear. This study sought to examine whether hypothermia, and the combination of hypothermia and pharmacological postconditioning, are cardioprotective in a model of cardiopulmonary resuscitation following acute myocardial ischemia. Methodology/Principal Findings: Thirty pigs (28–34 kg) were subjected to cardiac arrest following left anterior descending coronary artery ischemia. After 7 minutes of ventricular fibrillation and 2 minutes of basic life support, advanced cardiac life support was started according to the current AHA guidelines. After successful return of spontaneous circulation (n = 21), coronary perfusion was reestablished after 60 minutes of occlusion, and animals were randomized for 24 hours to either normothermia at 38°C, hypothermia at 33°C, or hypothermia at 33°C combined with sevoflurane (each group n = 7). The effects on cardiac damage, especially on inflammation, apoptosis, and remodeling, were studied using cellular and molecular approaches. Five animals were sham operated. Animals treated with hypothermia had lower troponin T levels (p<0.01), reduced infarct size (34±7 versus 57±12%; p<0.05) and improved left ventricular function compared to normothermia (p<0.05). Hypothermia was associated with a reduction in: (i) immune cell infiltration, (ii) apoptosis, (iii) IL-1beta and IL-6 mRNA up-regulation, and (iv) IL-1beta protein expression (p<0.05). Moreover, decreased matrix metalloproteinase-9 activity was detected in the ischemic myocardium after treatment with mild hypothermia. Sevoflurane conferred additional protective effects, although statistical significance was not reached.
Conclusions/Significance: Hypothermia reduced myocardial damage and dysfunction after cardiopulmonary resuscitation, possibly via a reduced rate of apoptosis and reduced pro-inflammatory cytokine expression.
Background: During systemic inflammation, the innate immune system releases excessive pro-inflammatory mediators, which can ultimately lead to organ failure. Pattern recognition receptors (PRRs), such as Toll-like receptors (TLRs) and NOD-like receptors (NLRs), form the interface between bacterial and viral toxins and innate immunity. During sepsis, patients with diagnosed adrenal gland insufficiency are at high risk of developing a multiorgan dysfunction syndrome, which dramatically increases the risk of mortality. To date, little is known about the mechanisms leading to adrenal dysfunction under septic conditions. Here, we investigated the sepsis-related activation of the PRRs, cell inflammation, and apoptosis within the adrenal glands.
Methods: Two sepsis models were performed: the polymicrobial sepsis model (caecal ligation and puncture (CLP)) and the LTA-induced intoxication model. All experiments received institutional approval by the Regierungspräsidium Darmstadt. CLP was performed as previously described [1], wherein one-third of the caecum was ligated and punctured with a 20-gauge needle. For LTA-induced systemic inflammation, TLR2 knockout (TLR2-/-) and WT mice were injected intraperitoneally with pure LTA (pLTA; 1 mg/kg) or PBS for 2 hours. To detect potential direct adrenal dysfunction, mice were additionally injected with adrenocorticotropic hormone (ACTH; 100 μg/kg) 1 hour after pLTA or PBS. Adrenals and plasma samples were taken. Gene expressions in the adrenals (rt-PCR), cytokine release (multiplex assay), and the apoptosis rate (TUNEL assay) within the adrenals were determined.
Results: In both models, the adrenals showed increased mRNA expression of TLR2 and TLR4, various NLRs, cytokines as well as inflammasome components, NADPH oxidase subunits, and nitric oxide synthases (data not shown). In WT mice, ACTH alone had no effect on inflammation, while pLTA or pLTA/ACTH administration increased the levels of the cytokines IL-1β, IL-6, and TNFα. TLR2-/- mice, as expected, showed no response (Figure 1, left). Interestingly, surviving CLP mice showed no inflammatory adrenal response, whereas nonsurvivors had elevated cytokine levels (Figure 1, right). Additionally, we identified a marked increase in apoptosis of both chromaffin and steroid-producing cells in adrenal glands obtained from mice with sepsis as compared with their controls (Figure 2).
...
Conclusion: Taken together, sepsis-induced activation of the PRRs may contribute to adrenal impairment by enhancing tissue inflammation and oxidative stress, culminating in cellular apoptosis, while mortality seems to be associated with adrenal inflammation.
Background: Peritonitis is responsible for thousands of deaths annually in Germany alone. Even source control (SC) and antibiotic treatment often fail to prevent severe sepsis or septic shock, and this situation has hardly improved in the past two decades. Most experimental immunomodulatory therapeutics for sepsis have aimed at blocking or dampening a specific pro-inflammatory immunological mediator. However, the patient population is large and heterogeneous. There are therefore grounds for investigating the possibility of developing personalized therapies by classifying patients into groups according to biomarkers. This study aims to combine an assessment of the efficacy of treatment with a preparation of human immunoglobulins G, A, and M (IgGAM) with the individual status of various biomarkers (immunoglobulin levels, procalcitonin, interleukin 6, human leucocyte antigen D-related (HLA-DR), transcription factor NF-κB1, adrenomedullin, and pathogen spectrum).
Methods/design: A total of 200 patients with sepsis or septic shock will receive standard-of-care treatment (SoC). Of these, 133 patients (selected by 1:2 randomization) will additionally receive infusions of IgGAM for 5 days. All patients will be followed for approximately 90 days and assessed by the multiple-organ failure (MOF) score, the EQ-5D quality-of-life scale, and measurement of vital signs, biomarkers (as above), and survival.
Discussion: This study is intended to provide further information on the efficacy and safety of treatment with IgGAM and to offer the possibility of correlating these with the biomarkers to be studied. Specifically, it will test (at a descriptive level) the hypothesis that patients receiving IgGAM who have higher inflammation status (IL-6) and poorer immune status (low HLA-DR, low immunoglobulin levels) have a better outcome than patients who do not receive IgGAM. It is expected to provide information that will help to close the knowledge gap concerning the association between the effect of IgGAM and the presence of various biomarkers, thus possibly opening the way to a personalized medicine.
Trial registration: EudraCT, 2016-001788-34; ClinicalTrials.gov, NCT03334006. Registered on 17 Nov 2017.
Trial sponsor: RWTH Aachen University, represented by the Center for Translational & Clinical Research Aachen (contact Dr. S. Isfort).
Acute respiratory distress syndrome (ARDS) is a major cause of patient mortality in intensive care units (ICUs) worldwide. Considering that no causative treatment is available and only symptomatic care can be offered, there is clearly a high unmet medical need for a new therapeutic concept. One reason for the lack of an etiologic therapy strategy is the multifactorial origin of ARDS, which leads to large heterogeneity among patients. This review summarizes the various kinds of ARDS onset with a special focus on the role of reactive oxygen species (ROS), which are generally linked to ARDS development and progression. Taking a closer look at the data already established in mouse models, this review finally proposes translating these results on successful antioxidant use into a personalized approach to the ICU patient as a potential adjuvant to standard ARDS treatment.
Background: Cerebral oxygen saturation (ScO2) can be measured non-invasively by near-infrared spectroscopy (NIRS) and correlates with cerebral perfusion. We investigated cerebral saturation during transfemoral transcatheter aortic valve implantation (TAVI) and its impact on outcome.
Methods and results: Cerebral oxygenation was measured continuously by NIRS in 173 analgo-sedated patients during transfemoral TAVI (female 47%, mean age 81 years) with self-expanding (39%) and balloon-expanding valves (61%). We investigated the periprocedural dynamics of cerebral oxygenation. Mean ScO2 at baseline without oxygen supply was 60%. During rapid ventricular pacing, ScO2 dropped significantly (before 64% vs. after 55%, p < 0.001). ScO2 at baseline correlated positively with baseline left-ventricular ejection fraction (0.230, p < 0.006) and hemoglobin (0.327, p < 0.001), and inversely with EuroSCORE-II (−0.285, p < 0.001) and length of in-hospital stay (−0.229, p < 0.01). Patients with ScO2 < 56% despite oxygen supply at baseline had impaired 1-year survival (log-rank test p < 0.01) and prolonged in-hospital stay (p = 0.03). Furthermore, baseline ScO2 was found to be a predictor of 1-year survival independent of age and sex (multivariable adjusted Cox regression, p = 0.020, hazard ratio (HR) 0.94, 95% CI 0.90–0.99) and independent of overall perioperative risk estimated by EuroSCORE-II and hemoglobin (p = 0.03, HR 0.95, 95% CI 0.91–0.99).
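Hazard ratios reported per unit of a continuous predictor, such as the HR of about 0.94 per percentage point of baseline ScO2 above, scale multiplicatively under the usual proportional-hazards interpretation. A minimal Python sketch (for illustration only, not part of the study's analysis):

```python
def compound_hr(per_unit_hr: float, units: float) -> float:
    """Compound a per-unit hazard ratio over a difference of `units`
    in the predictor (standard proportional-hazards scaling)."""
    return per_unit_hr ** units

# Hypothetical example: a 10-point difference in baseline ScO2
# between two patients, using the per-point HR of ~0.94 reported above.
print(round(compound_hr(0.94, 10), 2))  # → 0.54
```

So a 10-point higher baseline ScO2 would roughly halve the estimated hazard, which is why a per-point HR close to 1 can still be clinically meaningful.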
Conclusions: Low baseline ScO2 not responding to oxygen supply might act as a surrogate for impaired cardiopulmonary function and is associated with worse 1-year survival and prolonged in-hospital stay after transfemoral TAVI. ScO2 monitoring is an easy-to-implement diagnostic tool to screen for patients at risk of impaired recovery and worse outcome after TAVI.
The transcription factor NF-E2 p45-related factor 2 (Nrf2) is an established master regulator of the anti-oxidative and detoxifying cellular response. A role in inflammatory diseases associated with the generation of large amounts of reactive oxygen species (ROS) therefore seems obvious. In line with this, data obtained in cell culture experiments and preclinical settings have shown that Nrf2 is important in regulating target genes necessary to ensure cellular redox balance. Additionally, Nrf2 is involved in the induction of phase II drug-metabolizing enzymes, which are important both in degrading drugs and in converting them into active forms or into putative carcinogens. Therefore, Nrf2 has also been implicated in tumorigenesis. This must be kept in mind when new therapy approaches are planned for the treatment of sepsis. This review therefore highlights the function of Nrf2 in sepsis, with a special focus on the translation of rodent-based results to sepsis patients in the intensive care unit (ICU).
Purpose: Anaemia is one of the leading causes of death among severely injured patients. It is also known to increase the risk of death and prolong the length of hospital stay in various surgical groups. The main objective of this study is to analyse the anaemia rate on admission to the emergency department and the impact of anaemia on in-hospital mortality.
Methods: Data from the TraumaRegister DGU® (TR-DGU) between 2015 and 2019 were analysed. Inclusion criteria were age ≥ 16 years and most severe Abbreviated Injury Scale (AIS) score ≥ 3. Patients were divided into three anaemia subgroups: no or mild anaemia (NA), moderate anaemia (MA) and severe anaemia (SA). Pre-hospital data, patient characteristics, treatment in the emergency room (ER), outcomes, and differences between trauma centres were analysed.
Results: Of 67,595 patients analysed, 94.9% (n = 64,153) exhibited no or mild anaemia (Hb ≥ 9 g/dl), 3.7% (n = 2478) displayed moderate anaemia (Hb 7–8 g/dl) and 1.4% (n = 964) presented with severe anaemia (Hb < 7 g/dl). Haemoglobin (Hb) values ranged from 3 to 18 g/dl, with a mean of 12.7 g/dl. In surviving patients, anaemia was associated with prolonged length of stay (LOS). Multivariate logistic regression analyses revealed moderate anaemia (OR 1.88, 95% CI 1.66–2.13; p < 0.001) and severe anaemia (OR 4.21, 95% CI 3.46–5.12; p < 0.001) to be independent predictors of mortality. Further significant predictors were ISS score per point (OR 1.0), age 70–79 (OR 4.8), age > 80 (OR 12.0), severe pre-existing conditions (ASA 3/4) (OR 2.26), severe head injury (AIS 5/6) (OR 4.8), penetrating trauma (OR 1.8), unconsciousness (OR 4.8), shock (OR 2.2) and pre-hospital intubation (OR 1.6).
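For readers reconstructing such regression results: an odds ratio and its 95% CI are conventionally obtained by exponentiating the logistic-regression coefficient and its Wald interval. A hedged Python sketch; the coefficient and standard error below are assumed values chosen only to reproduce the moderate-anaemia OR reported above, not numbers from the study:

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient (log-odds) and its
    standard error into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Assumed inputs: beta ≈ 0.63, SE ≈ 0.064 happen to reproduce the
# OR 1.88 (95% CI 1.66–2.13) reported for moderate anaemia.
or_, lo, hi = odds_ratio_ci(0.63, 0.064)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → OR 1.88 (95% CI 1.66-2.13)
```

The same transformation applies to every OR in the list above; only the coefficient and its standard error differ per predictor.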
Conclusion: The majority of severely injured patients are admitted to the ER without anaemia. Injury-associated moderate and severe anaemia is an independent predictor of mortality in severely injured patients.
Genetic or pharmacological ablation of toll-like receptor 2 (TLR2) protects against myocardial ischemia/reperfusion injury (MI/R). However, the endogenous ligand responsible for TLR2 activation has not yet been identified. The objective of this study was to identify HMGB1 as an activator of TLR2 signalling during MI/R. C57BL/6 wild-type (WT) or TLR2(-/-) mice were injected with vehicle, HMGB1, or HMGB1 BoxA one hour before myocardial ischemia (30 min) and reperfusion (24 hrs). Infarct size, cardiac troponin T, leukocyte infiltration, HMGB1 release, and TLR4, TLR9, and RAGE expression were quantified. HMGB1 plasma levels were measured in patients undergoing coronary artery bypass graft (CABG) surgery. The HMGB1 antagonist BoxA reduced cardiomyocyte necrosis during MI/R in WT mice, accompanied by reduced leukocyte infiltration. Injection of HMGB1, however, did not increase infarct size in WT animals. In TLR2(-/-) hearts, neither BoxA nor HMGB1 affected infarct size. No differences in RAGE and TLR9 expression could be detected, while TLR2(-/-) mice displayed increased TLR4 and HMGB1 expression. Plasma levels of HMGB1 were increased after MI/R in TLR2(-/-) mice and after CABG surgery in patients carrying a TLR2 polymorphism (Arg753Gln). We here provide evidence that the absence of TLR2 signalling abrogates the infarct-sparing effects of HMGB1 blockade.
In contrast to several smaller studies, which demonstrated that remote ischemic preconditioning (RIPC) reduces myocardial injury in patients undergoing cardiovascular surgery, the RIPHeart study failed to demonstrate beneficial effects on troponin release and clinical outcome in propofol-anesthetized cardiac surgery patients. We therefore addressed the potential biochemical mechanisms triggered by RIPC. This is a predefined prospective sub-analysis of the recently published randomized, controlled RIPHeart study in cardiac surgery patients (n = 40). Blood samples were drawn from patients prior to surgery, after RIPC of four cycles of 5 min arm ischemia/5 min reperfusion (n = 19) or the sham procedure (n = 21), after connection to cardiopulmonary bypass (CPB), at the end of surgery, and 24 h and 48 h postoperatively for the measurement of troponin T, macrophage migration inhibitory factor (MIF), stromal cell-derived factor 1 (CXCL12), IL-6, CXCL8, and IL-10. After RIPC, right atrial tissue samples were taken for the measurement of extracellular signal-regulated kinase (ERK1/2), protein kinase B (AKT), glycogen synthase kinase 3 (GSK-3β), protein kinase C (PKCε), and MIF content. RIPC did not significantly reduce troponin release compared with the sham procedure. MIF serum levels increased intraoperatively, peaking at intensive care unit (ICU) admission (an increase over baseline of 48.04%, p = 0.164, in RIPC; and 69.64%, p = 0.023, in the sham procedure), and returned to baseline 24 h after surgery, with no differences between the groups. In the right atrial tissue, MIF content decreased after RIPC (1.040 ± 1.032 arbitrary units [au] in RIPC vs. 2.028 ± 1.631 [au] in the sham procedure, p < 0.05). CXCL12 serum levels increased significantly over baseline at the end of surgery, with no differences between the groups.
ERK1/2, AKT, GSK-3β, and PKCɛ phosphorylation in the right atrial samples did not differ between the groups. No difference was found in IL-6, CXCL8, and IL-10 serum levels between the groups. In this cohort of cardiac surgery patients receiving propofol anesthesia, we could show neither a release of potential signaling mediators, nor an effect on the inflammatory response, nor an activation of well-established protein kinases after RIPC. Based on these data, we cannot exclude that confounding factors, such as propofol, may have interfered with RIPC.
Background: Remote ischemic preconditioning (RIPC) has been shown to enhance the tolerance of remote organs to a subsequent ischemic event. We hypothesized that RIPC reduces postoperative neurocognitive dysfunction (POCD) in patients undergoing complex cardiac surgery.
Methods: We conducted a prospective, randomized, double-blind, controlled trial including 180 adult patients undergoing elective cardiac surgery with cardiopulmonary bypass. Patients were randomized either to the RIPC or the control group. The primary endpoint was postoperative neurocognitive dysfunction 5–7 days after surgery, assessed by a comprehensive test battery. Cognitive change was assumed if the preoperative-to-postoperative difference in 2 or more tasks assessing different cognitive domains exceeded one SD (1 SD criterion) or if the combined Z score was 1.96 or greater (Z score criterion).
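The two deterioration criteria described above can be sketched in Python. This is an illustration only: the scoring orientation (higher = better), the source of the per-task SDs, and the way task z-scores are combined (sum divided by the square root of the number of tasks, one common convention) are assumptions, not the study's exact algorithm:

```python
def pocd_flags(pre, post, task_sds, z_threshold=1.96):
    """Apply the 1 SD and combined-Z deterioration criteria to one
    patient's per-task scores (sketch; details are assumptions).

    pre/post: lists of per-task scores, higher = better;
    task_sds: per-task SDs used to standardize change scores.
    """
    # Standardized decline per task: positive z = postoperative worsening.
    z = [(pr - po) / sd for pr, po, sd in zip(pre, post, task_sds)]
    # 1 SD criterion: at least two tasks declined by more than one SD.
    one_sd_criterion = sum(zi > 1.0 for zi in z) >= 2
    # Combined Z: sum of task z-scores scaled by sqrt(number of tasks).
    combined_z = sum(z) / len(z) ** 0.5
    return one_sd_criterion, combined_z >= z_threshold

# Toy patient: two of three tasks decline by 2 SD each.
print(pocd_flags([10, 10, 10], [8, 8, 10], [1, 1, 1]))  # → (True, True)
```

A patient is classified as showing cognitive deterioration if either flag is true, mirroring the "1 SD criterion or Z score criterion" wording above.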
Results: According to the 1 SD criterion, 52% of control and 46% of RIPC patients had cognitive deterioration 5–7 days after surgery (p = 0.753). The summarized Z score showed a trend toward more cognitive decline in the control group (2.16±5.30) compared to the RIPC group (1.14±4.02; p = 0.228). Three months after surgery, the incidence and severity of neurocognitive dysfunction did not differ between the control and RIPC groups. RIPC tended to decrease postoperative troponin T release at both 12 hours [0.60 (0.19–1.94) µg/L vs. 0.48 (0.07–1.84) µg/L] and 24 hours after surgery [0.36 (0.14–1.89) µg/L vs. 0.26 (0.07–0.90) µg/L].
Conclusions: We failed to demonstrate efficacy of a RIPC protocol with respect to incidence and severity of POCD and secondary outcome variables in patients undergoing a wide range of cardiac surgery. Therefore, definitive large-scale multicenter trials are needed.
Trial Registration: ClinicalTrials.gov NCT00877305
BACKGROUND: Transient episodes of ischemia in a remote organ or tissue (remote ischemic preconditioning, RIPC) can attenuate myocardial injury. Myocardial damage is associated with tissue remodeling and the matrix metalloproteinases 2 and 9 (MMP-2/9) are crucially involved in these events. Here we investigated the effects of RIPC on the activities of heart tissue MMP-2/9 and their correlation with serum concentrations of cardiac troponin T (cTnT), a marker for myocardial damage.
METHODS: In cardiosurgical patients with cardiopulmonary bypass (CPB), RIPC was induced by four 5-minute cycles of upper limb ischemia/reperfusion. Cardiac tissue was obtained before as well as after CPB, and serum cTnT concentrations were measured. Tissue derived from control patients (N = 17) with high cTnT concentrations (≥0.32 ng/ml) and RIPC patients (N = 18) with low cTnT (≤0.32 ng/ml) was subjected to gelatin zymography to quantify MMP-2/9 activities.
RESULTS: In cardiac biopsies obtained before CPB, activities of MMP-2/9 were attenuated in the RIPC group (MMP-2: Control, 1.13 ± 0.13 a.u.; RIPC, 0.71 ± 0.12 a.u.; P < 0.05. MMP-9: Control, 1.50 ± 0.16 a.u.; RIPC, 0.87 ± 0.14 a.u.; P < 0.01), while activities of the pro-MMPs were not altered (P > 0.05). In cardiac biopsies taken after CPB, activities of pro- and active MMP-2/9 were not different between the groups (P > 0.05). Spearman's rank tests showed that MMP-2/9 activities in cardiac tissue obtained before CPB were positively correlated with postoperative cTnT serum levels (MMP-2, P = 0.016; MMP-9, P = 0.015).
CONCLUSIONS: Activities of MMP-2/9 in cardiac tissue obtained before CPB are attenuated by RIPC and are positively correlated with serum concentrations of cTnT. MMPs may represent potential targets for RIPC mediated cardioprotection.
TRIAL REGISTRATION: ClinicalTrials.gov identifier NCT00877305.
The lung is, more than other solid organs, susceptible to ischemia-reperfusion injury after orthotopic transplantation. Corticosteroids are known to potently suppress pro-inflammatory processes when given in the post-operative setting or during rejection episodes. Whereas their use has been approved for these clinical indications, no study has investigated their potential as a preservation additive for preventing vascular damage already during the phase of ischemia. To investigate these effects, we performed orthotopic lung transplantations (LTX) in the rat. Prednisolone was either added to the perfusion solution for lung preservation or omitted, and rats were followed for 48 hours after LTX. Prednisolone preconditioning significantly increased survival and diminished reperfusion edema. Hypoxia-induced vasoactive cytokines such as VEGF were reduced. Markers of leukocyte invasiveness such as matrix metalloprotease (MMP)-2, and common pro-inflammatory molecules such as the CXCR4 receptor and the chemokine (C-C motif) ligand (CCL)-2, were downregulated by prednisolone. Neutrophil recruitment to the grafts was increased only in Perfadex-treated lungs. In line with this, prednisolone-treated animals displayed significantly reduced lung protein levels of neutrophil chemoattractants such as CINC-1, CINC-2α/β and LIX, and upregulated tissue inhibitor of matrix metalloproteinase (TIMP)-1. Interestingly, lung macrophage invasion was increased in both Perfadex- and prednisolone-treated grafts, as measured by MMP-12 or RM4. Markers of anti-inflammatory macrophage transdifferentiation such as MRC-1, IL-13, IL-4 and CD163 significantly correlated with prednisolone treatment. These observations lead to the conclusion that prednisolone, as an additive to the perfusion solution, protects from hypoxia-triggered danger signals already in the phase of ischemia and thus reduces graft edema in the phase of reperfusion.
Additionally, prednisolone preconditioning might also lead to macrophage polarization as a beneficial long-term effect.
The main goal of adequate organ preservation is to avoid further cellular metabolism during the phase of ischemia. However, modern preservation solutions rarely achieve this target. In donor organs, hypoxia and ischemia induce a broad spectrum of pathologic molecular mechanisms favoring primary graft dysfunction (PGD) after transplantation. Increased hypoxia-induced transcriptional activity leads to increased vascular permeability, which in turn lays the ground for reperfusion edema and an enhanced pro-inflammatory response in the graft after reperfusion. We hypothesized that inhibition of the mitochondrial respiratory chain, and thus of the hypoxia-induced mechanisms, might reduce reperfusion edema and consequently improve survival in vivo. In this study we demonstrate that the rotenoid deguelin dose-dependently reduces the expression of hypoxia-induced target genes, especially VEGF-A, in hypoxic human lung-derived cells. Furthermore, deguelin significantly suppresses the mRNA expression of the HIF target genes VEGF-A, the pro-inflammatory CXCR4, and ICAM-1 in ischemic vs. control lungs. After lung transplantation, the VEGF-A-induced reperfusion edema is significantly lower in deguelin-treated animals than in controls. Deguelin-treated rats exhibit a significantly increased survival rate after transplantation. Additionally, we observed downregulation of the pro-inflammatory molecules ICAM-1 and CXCR4 and increased recruitment of immunomodulatory monocytes (CD163+ and CD68+) to the transplanted organ involving the IL-4 pathway. We therefore conclude that the ischemic period preceding reperfusion is mainly responsible for the increased vascular permeability via upregulation of VEGF. Together with this, the resulting endothelial dysfunction also enhances inflammation and consequently lung dysfunction.
Deguelin significantly decreases the VEGF-A-induced reperfusion edema, induces the recruitment of immunomodulatory monocytes, and thus improves organ function and survival after lung transplantation by interfering with hypoxia-induced signaling.
Background: Age and preoperative anaemia are risk factors for poor surgical outcome and blood transfusion. The aim of this study was to examine the effect of iron supplementation in iron-deficient (ID) elderly patients undergoing major surgery.
Method: In this single-centre observational study, patients ≥ 65 years undergoing major surgery were screened for anaemia and ID. Patients were assigned to the following groups: A− (no anaemia); A−,ID+,T+ (no anaemia, iron-deficient, intravenous iron supplementation); A+ (anaemia); and A+,ID+,T+ (anaemia, iron-deficient, intravenous iron supplementation).
Results: Of 4,381 patients screened at the anaemia walk-in clinic, 2,381 (54%) were ≥ 65 years old, and 2,191 cases were included in the analysis. The ID prevalence was 63% in patients with haemoglobin (Hb) < 8 g/dl, 47.2% in patients with Hb from 8.0 to 8.9 g/dl, and 44.3% in patients with Hb from 9 to 9.9 g/dl. In severely anaemic patients, an Hb increase of 0.6 (0.4; 1.2) and 1.2 (0.7; 1.6) g/dl was detected with iron supplementation 6–10 and > 10 days before surgery, respectively. Hb increased by 0 (-0.1; 0) g/dl with iron supplementation 1–5 days before surgery, 0.2 (-0.1; 0.5) g/dl with supplementation 6–10 days before surgery, and 0.2 (-0.2; 1.1) g/dl with supplementation > 10 days before surgery (p < 0.001 for 1–5 vs. 6–10 days). Overall, 58% of A+,ID+,T+ patients showed an Hb increase of > 0.5 g/dl. The number of transfused red blood cell units was significantly lower in patients supplemented with iron (0 (0; 3)) than in non-treated anaemic patients (1 (0; 4)) (p = 0.03). Patients with iron supplementation > 6 days before surgery achieved mobility 2 days earlier than patients with iron supplementation < 6 days.
Conclusions: Intravenous iron supplementation increases Hb level and thereby reduces blood transfusion rate in elderly surgical patients with ID anaemia.
Objective: Videolaryngoscopy has mainly been developed to facilitate difficult airway intubation. However, there is a lack of studies demonstrating this method's efficacy in pediatric patients. The aim of the present study was to compare the TruView infant EVO2 and the C-MAC videolaryngoscope with conventional direct Macintosh laryngoscopy in children with a bodyweight ≤10 kg in terms of intubation conditions and the time to intubation.
Methods: In total, 65 children with a bodyweight ≤10 kg (0-22 months) who had undergone elective surgery requiring endotracheal intubation were retrospectively analyzed. Our database was screened for intubations with the TruView infant EVO2, the C-MAC videolaryngoscope, and conventional direct Macintosh laryngoscopy. The intubation conditions, the time to intubation, and the oxygen saturation before and after intubation were monitored, and demographic data were recorded. Only children with a bodyweight ≤10 kg were included in the analysis.
Results: A total of 23 children were intubated using the C-MAC videolaryngoscope, 22 using the TruView EVO2, and 20 using a standard Macintosh blade. The time required for tracheal intubation was significantly longer with the TruView EVO2 (52 sec) than with the C-MAC (28 sec) or direct laryngoscopy (26 sec). However, no significant difference in oxygen saturation was found after intubation.
Conclusion: All devices allowed excellent visualization of the vocal cords, but the time to intubation was prolonged when the TruView EVO2 was used. The absence of a decline in oxygen saturation may be due to apneic oxygenation via the TruView scope and may provide a margin of safety. In sum, the use of the TruView by a well-trained anesthetist may be an alternative for difficult airway management in pediatric patients.