Background: Transfusion of red blood cells (RBC) in patients undergoing major elective cranial surgery is associated with increased morbidity, mortality and prolonged hospital length of stay (LOS). This retrospective single center study aims to identify the impact of RBC transfusions on skull-base and non-skull-base meningioma patients including the identification of risk factors for RBC transfusion.
Methods: Between October 2009 and October 2016, we retrospectively analyzed 423 patients undergoing primary meningioma resection at our department.
Results: Of these 423 patients, 68 (16.1%) received RBC transfusion and 355 (83.9%) did not receive RBC units. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) compared to patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, postoperative complications as well as hospital LOS were significantly higher (p < 0.0001) compared to non-transfused patients. After multivariate analyses, risk factors for RBC transfusion were preoperative American Society of Anaesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.0001). Kaplan-Meier curves revealed a significant influence on overall survival of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative KPS ≤ 60% and age (elderly ≥ 75 years).
Conclusion: We concluded that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in multivariate analysis. Further studies evaluating the impact of preoperative anaemia management on the reduction of RBC transfusion are needed to improve clinical outcomes of meningioma patients.
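Several of the survival analyses above rest on Kaplan-Meier curves. As a minimal illustration of the product-limit estimator behind such curves (the follow-up times below are invented, not patient data):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns (time, survival probability) pairs at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at time t
        n_at_t = sum(1 for tt, _ in data if tt == t)   # leave risk set at t
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t
        i += n_at_t
    return curve

# synthetic follow-up (months) for 7 patients, mixing deaths and censoring
curve = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0])
```

Group comparisons (log-rank tests) and the multivariate analyses reported above go beyond this sketch; in practice one would use a validated package such as `lifelines` or R's `survival`.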
Introduction: Hypothermia improves survival and neurological recovery after cardiac arrest. Pro-inflammatory cytokines have been implicated in focal cerebral ischemia/reperfusion injury. It is unknown whether cardiac arrest also triggers the release of cerebral inflammatory molecules, and whether therapeutic hypothermia alters this inflammatory response. This study sought to examine whether hypothermia or the combination of hypothermia with anesthetic postconditioning with sevoflurane affects the cerebral inflammatory response after cardiopulmonary resuscitation.
Methods: Thirty pigs (28–34 kg) were subjected to cardiac arrest following temporary coronary artery occlusion. After 7 minutes of ventricular fibrillation and 2 minutes of basic life support, advanced cardiac life support was started according to the current AHA guidelines. Return of spontaneous circulation was achieved in 21 animals, which were randomized to either normothermia at 38°C, hypothermia at 33°C, or hypothermia at 33°C combined with sevoflurane (each group: n = 7) for 24 hours. The effects of hypothermia and the combination of hypothermia with sevoflurane on the cerebral inflammatory response after cardiopulmonary resuscitation were studied using tissue samples from the cerebral cortex of pigs euthanized after 24 hours, employing quantitative RT-PCR and ELISA techniques.
Results: Global cerebral ischemia following resuscitation resulted in significant upregulation of cerebral tissue inflammatory cytokine mRNA expression (mean ± SD; interleukin (IL)-1β 8.7 ± 4.0, IL-6 4.3 ± 2.6, IL-10 2.5 ± 1.6, tumor necrosis factor (TNF)α 2.8 ± 1.8, intercellular adhesion molecule-1 (ICAM-1) 4.0 ± 1.9-fold compared with sham control) and IL-1β protein concentration (1.9 ± 0.6-fold compared with sham control). Hypothermia was associated with a significant (P < 0.05 versus normothermia) reduction in cerebral inflammatory cytokine mRNA expression (IL-1β 1.7 ± 1.0, IL-6 2.2 ± 1.1, IL-10 0.8 ± 0.4, TNFα 1.1 ± 0.6, ICAM-1 1.9 ± 0.7-fold compared with sham control). These results were also confirmed for IL-1β at the protein level. Experimental settings employing hypothermia in combination with sevoflurane showed that the volatile anesthetic did not confer additional anti-inflammatory effects compared with hypothermia alone.
Conclusions: Mild therapeutic hypothermia resulted in decreased expression of typical cerebral inflammatory mediators after cardiopulmonary resuscitation. This may confer, at least in part, neuroprotection following global cerebral ischemia and resuscitation.
Background and objectives: Preoperative anaemia is an independent risk factor for a higher morbidity and mortality, a longer hospitalization and increased perioperative transfusion rates. Managing preoperative anaemia is the first of three pillars of Patient Blood Management (PBM), a multidisciplinary concept to improve patient safety. While various studies provide medical information on (successful) anaemia treatment pathways, knowledge of organizational details of diagnosis and management of preoperative anaemia across Europe is scarce.
Materials and methods: To gain information on various aspects of preoperative anaemia management including organization, financing, diagnostics and treatment, we conducted a survey (74 questions) in ten hospitals from seven European nations within the PaBloE (Patient Blood Management in Europe) working group covering the year 2016.
Results: Organization and activity in the field of preoperative anaemia management were heterogeneous in the participating hospitals. Almost all hospitals had pathways for managing preoperative anaemia in place; however, only two nations had national guidelines. In six of the ten participating hospitals, preoperative anaemia management was organized by anaesthetists. Diagnostics and treatment focused on iron deficiency anaemia which, in most hospitals, was corrected with intravenous iron.
Conclusion: Implementation and approaches of preoperative anaemia management vary across Europe with a primary focus on treating iron deficiency anaemia. Findings of this survey motivated the hospitals involved to critically evaluate their practice and may also help other hospitals interested in PBM to develop action plans for diagnosis and management of preoperative anaemia.
Background: Invasive off- or on-pump cardiac surgery (elective and emergency procedures, excluding transplants) is routinely performed to treat complications of ischaemic heart disease. Randomised controlled trials (RCTs) evaluate the effectiveness of treatments in the setting of cardiac surgery. However, the impact of RCTs is weakened by heterogeneity in outcome measurement and reporting, which hinders comparison across trials. Core outcome sets (COS; a set of outcomes that should be measured and reported, as a minimum, in clinical trials for a specific clinical field) help reduce this problem. In light of the above, we developed a COS for cardiac surgery effectiveness trials.
Methods: Potential core outcomes were identified a priori by analysing data on 371 RCTs of 58,253 patients. We reached consensus on core outcomes in an international three-round eDelphi exercise. Outcomes for which at least 60% of the participants chose the response option "no" and less than 20% chose the response option "yes" were excluded.
Results: Eighty-six participants from 23 different countries, including adult cardiac patients, cardiac surgeons, anaesthesiologists, nursing staff and researchers, contributed to this eDelphi. The panel reached consensus on four core outcomes to be included in adult cardiac surgery trials: 1) measure of mortality, 2) measure of quality of life, 3) measure of hospitalisation and 4) measure of cerebrovascular complication.
Conclusion: This study used robust research methodology to develop a minimum core outcome set for clinical trials evaluating the effectiveness of treatments in the setting of cardiac surgery. As a next step, appropriate outcome measurement instruments have to be selected.
Introduction: Sepsis remains associated with a high mortality rate. Endotoxin has been shown to influence viscoelastic coagulation parameters, thus suggesting a link between endotoxin levels and the altered coagulation phenotype in septic patients. This study evaluated the effects of systemic polyspecific IgM-enriched immunoglobulin (IgM-IVIg) (Pentaglobin® [Biotest, Dreieich, Germany]) on endotoxin activity (EA), inflammatory markers, viscoelastic and conventional coagulation parameters.
Methods: Patients with severe sepsis were identified by daily screening in a tertiary, academic, surgical ICU. After the inclusion of 15 patients, the application of IgM-IVIg (5 mg/kg/d over three days) was integrated into the unit’s standard operating procedure (SOP) to treat patients with severe sepsis, thereby generating “control” and “IgM-IVIg” groups. EA assays, thrombelastometry (ROTEM®) and impedance aggregometry (Multiplate®) were performed on whole blood. Furthermore, routine laboratory parameters were determined according to the unit’s standards.
Results: Data from 26 patients were included. On day 1, EA was significantly decreased in the IgM-IVIg group following 6 and 12 hours of treatment (0.51 ±0.06 vs. 0.26 ±0.07, p<0.05 and 0.51 ±0.06 vs. 0.25 ±0.04, p<0.05) and differed significantly compared with the control group following 6 hours of treatment (0.26 ±0.07 vs. 0.43 ±0.07, p<0.05). The platelet count was significantly higher in the IgM-IVIg group following four days of IgM-IVIg treatment (200/nl ±43 vs. 87/nl ±20, p<0.05). The fibrinogen concentration was significantly lower in the control group on day 2 (311 mg/dl ±37 vs. 475 mg/dl ±47 (p = 0.015)) and day 4 (307 mg/dl ±35 vs. 420 mg/dl ±16 (p = 0.017)). No differences in thrombelastometric or aggregometric measurements, or inflammatory markers (interleukin-6 (IL-6), leukocyte, lipopolysaccharide binding protein (LBP)) were observed.
Conclusion: Treatment with IgM-enriched immunoglobulin attenuates the EA levels in patients with severe sepsis and might have an effect on septic thrombocytopenia and fibrinogen depletion. Viscoelastic, aggregometric or inflammatory parameters were not influenced.
Background: Hemodynamic instability is frequent and outcome-relevant in critical illness. The understanding of complex hemodynamic disturbances and their monitoring and management plays an important role in the treatment of intensive care patients. An increasing number of treatment recommendations and guidelines in intensive care medicine emphasize hemodynamic goals that go beyond the measurement of blood pressures. Yet, it is not known to what extent the infrastructural prerequisites for extended hemodynamic monitoring are given in intensive care units (ICUs) and how hemodynamic management is performed in clinical practice. Furthermore, it remains unclear which factors trigger the use of extended hemodynamic monitoring.
Methods: In this multicenter, 1-day (November 7, 2013, and the preceding 24 h) cross-sectional study, we retrieved data on patient monitoring from ICUs in Germany, Austria, and Switzerland by means of a web-based case report form. One hundred and sixty-one intensive care units contributed detailed information on the availability of hemodynamic monitoring. In addition, detailed information on the hemodynamic monitoring of 1789 patients treated on the study day was collected, and independent factors triggering the use of extended hemodynamic monitoring were identified by multivariate analysis.
Results: Besides basic monitoring with electrocardiography (ECG), pulse oximetry, and blood pressure monitoring, the majority of patients received invasive arterial (77.9%) and central venous catheterization (55.2%). Overall, additional extended hemodynamic monitoring for assessment of cardiac output was performed in only 12.3% of patients, while echocardiographic examination was used in only 1.9%. The strongest independent predictors for the use of extended hemodynamic monitoring of any kind were mechanical ventilation, the need for catecholamine therapy, and treatment backed by protocols. In 71.6% of patients in whom extended hemodynamic monitoring was added during the study period, this extension led to changes in treatment.
Conclusions: Extended hemodynamic monitoring, which goes beyond the measurement of blood pressures, to date plays a minor role in the surveillance of critically ill patients in German, Austrian, and Swiss ICUs. This also includes diagnostic and monitoring applications recommended by consensus, such as echocardiography and cardiac output monitoring. Mechanical ventilation, the use of catecholamines, and treatment backed by protocols were identified as factors independently associated with higher use of extended hemodynamic monitoring.
Introduction: Balanced fluid replacement solutions may reduce the risk of electrolyte and acid-base imbalances, and thus of renal failure. To assess the intraoperative change of base excess (BE) and serum chloride after treatment with either a balanced gelatine/electrolyte solution or a non-balanced gelatine/electrolyte solution, a prospective, controlled, randomized, double-blind, dual-centre phase III study was conducted in two tertiary care university hospitals in Germany.
Material and methods: 40 patients of both sexes, aged 18 to 90 years, scheduled to undergo elective abdominal surgery with an assumed intraoperative volume requirement of at least 15 mL/kg body weight of gelatine solution were included. The study drug was administered intravenously according to the patients' needs. The trigger for volume replacement was a central venous pressure (CVP) minus positive end-expiratory pressure (PEEP) <10 mmHg (CVP <10 mmHg). The crystalloid:colloid ratio was 1:1 intra- and postoperatively. The targets for volume replacement were a CVP minus PEEP between 10 and 14 mmHg after treatment with a vasoactive agent and a mean arterial pressure (MAP) > 65 mmHg.
Results: The primary endpoints, the intraoperative changes of base excess (–2.59 ± 2.25 (median: –2.65) mmol/L in the balanced group vs. –4.79 ± 2.38 (median: –4.70) mmol/L in the non-balanced group) and of serum chloride (2.4 ± 1.9 (median: 3.0) mmol/L vs. 5.2 ± 3.1 (median: 5.0) mmol/L), differed significantly between groups (p = 0.0117 and p = 0.0045, respectively). In both groups (each n = 20), administration of the investigational product in terms of volume and infusion rate was comparable throughout the course of the study, i.e. before, during and after surgery.
Discussion: Balanced gelatine solution 4% combined with a balanced electrolyte solution had a significantly smaller impact on the blood gas parameters defined as primary endpoints (BE and serum chloride) than a non-balanced gelatine solution 4% combined with NaCl 0.9%. No marked treatment differences were observed with respect to haemodynamics, coagulation or renal function.
Trial registration: ClinicalTrials.gov (NCT01515397) and clinicaltrialsregister.eu, EudraCT number 2010-018524-58.
Background: This study assessed the ability of mid-regional proadrenomedullin (MR-proADM) in comparison to conventional biomarkers (procalcitonin (PCT), lactate, C-reactive protein) and clinical scores to identify disease severity in patients with sepsis.
Methods: This is a secondary analysis of a randomised controlled trial in patients with severe sepsis or septic shock across 33 German intensive care units. The association of biomarkers and clinical scores with mortality was assessed by Cox regression analysis, area under the receiver operating characteristic curve and Kaplan-Meier curves. Patients were stratified into three severity groups (low, intermediate, high) for all biomarkers and scores based on cutoffs with either a 90% sensitivity or specificity.
Results: 1089 patients with a 28-day mortality rate of 26.9% were analysed. According to the Sepsis-3 definition, 41.2% and 58.8% fulfilled the criteria for sepsis and septic shock, with respective mortality rates of 20.0% and 32.1%. MR-proADM had the strongest association with mortality across all Sepsis-1 and Sepsis-3 subgroups and could facilitate a more accurate classification of low (e.g. MR-proADM vs. SOFA: N = 265 vs. 232; 9.8% vs. 13.8% mortality) and high (e.g. MR-proADM vs. SOFA: N = 161 vs. 155; 55.9% vs. 41.3% mortality) disease severity. Patients with decreasing PCT concentrations of either ≥ 20% (baseline to day 1) or ≥ 50% (baseline to day 4) but continuously high MR-proADM concentrations had a significantly increased mortality risk (HR (95% CI): 19.1 (8.0–45.9) and 43.1 (10.1–184.0)).
Conclusions: MR-proADM identifies disease severity and treatment response more accurately than established biomarkers and scores, adding additional information to facilitate rapid clinical decision-making and improve personalised sepsis treatment.
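The stratification described above fixes biomarker cutoffs at either 90% sensitivity or 90% specificity. A hedged sketch of how such a sensitivity-based cutoff can be derived from marker values (the numbers are synthetic, not trial data):

```python
import numpy as np

def cutoff_at_sensitivity(values, outcomes, target_sens=0.90):
    """Return the highest threshold that classifies at least target_sens
    of positive cases (outcome == 1) as high risk (value >= threshold)."""
    pos = values[outcomes == 1]
    for thr in np.sort(pos)[::-1]:            # candidate thresholds, highest first
        if np.mean(pos >= thr) >= target_sens:
            return thr                        # first hit = highest valid cutoff
    return None

# synthetic marker values: 5 survivors (lower), 10 non-survivors (higher)
vals = np.array([0.4, 0.6, 0.8, 1.1, 1.3,
                 1.0, 1.2, 1.5, 2.0, 2.2, 2.5, 3.0, 3.5, 4.0, 5.0])
outc = np.array([0] * 5 + [1] * 10)
thr = cutoff_at_sensitivity(vals, outc)
```

A specificity-based cutoff would be derived analogously from the negative (surviving) group, requiring the chosen fraction of negatives to fall below the threshold.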
Background & aims: Recent studies indicate that vitamin D deficiency is associated with increased morbidity and mortality in critically ill patients. Knowledge about the functional role and clinical relevance of vitamin D for patients undergoing cardiac surgery is sparse. Therefore, we investigated the clinical significance of vitamin D levels on outcome of cardiac surgery patients.
Methods: 92 patients undergoing elective cardiac surgery with cardiopulmonary bypass were included in this prospective observational pilot study. 25-hydroxyvitamin D (25OHD) and 1,25-dihydroxyvitamin D (1,25(OH)2D) levels were measured prior to surgery, immediately postoperatively as well as 6, 12 and 24 h after surgery. We assessed postoperative organ dysfunctions, infections and death until hospital discharge.
Results: The serum concentration of 1,25(OH)2D significantly decreased intraoperatively by 29.3% (p < 0.001) and was significantly lower at every postoperative time point compared to baseline values, whereas 25OHD levels did not show significant changes during the observation period. Coronary artery bypass graft (CABG) patients had significantly higher baseline 1,25(OH)2D values than patients with valve surgery (39.7 ± 13.9 ng/l vs. 30.1 ± 14.1 ng/l, p = 0.010) or CABG + valve surgery (39.7 ± 13.9 ng/l vs. 32.6 ± 11.8 ng/l, p = 0.044).
In multivariable analysis adjusted for preoperative illness and demographics, the odds of developing postoperative organ dysfunction (OR 0.95; p = 0.009) and of PCT levels ≥5 μg/l (OR 0.94; p = 0.046) decreased significantly for every ng/l increment in 1,25(OH)2D. In addition, multivariable-adjusted analyses revealed that patients stayed significantly shorter in the ICU (−0.21 h; p = 0.001) and in hospital (−2.6 days; p = 0.009) for every ng/l increment in 1,25(OH)2D.
Conclusion: Our data provide important evidence on the clinical significance of 1,25(OH)2D levels in cardiac surgery patients. Higher levels were associated with significantly fewer postoperative organ dysfunctions, fewer elevated PCT levels, lower mortality and shorter hospital stays. 1,25(OH)2D levels decreased significantly intra- and postoperatively, while serum levels of 25OHD did not.
Trial registration: clinicaltrials.gov (NCT02488876), registered May 1, 2015.
Background: Patient Blood Management (PBM) is a systematic, quality-improving clinical model to reduce anemia and avoid transfusions in all kinds of clinical settings. Here, we investigated the potential of PBM in oncologic surgery and hypothesized that PBM improves 2-year overall survival (OS).
Methods: Retrospective analysis of patients 2 years before and after PBM implementation. The primary endpoint was OS at 2 years after surgery. A sample size of 824 was calculated to detect a 10% improvement in survival in the PBM group.
Results: The analysis comprised 836 patients who underwent oncologic surgery, 389 before and 447 after PBM was implemented. Patients in the PBM+ group presented significantly more frequently with normal hemoglobin values before surgery than those in the PBM− group (56.6 vs. 35.7%; p < 0.001). The number of transfusions was significantly reduced from 5.5 ± 11.1 to 3.0 ± 6.9 units/patient (p < 0.001); moreover, the percentage of patients transfused during the hospital stay was significantly reduced from 62.4 to 40.9% (p < 0.001). Two-year OS was significantly better in the PBM+ group, increasing from 67.0 to 80.1% (p = 0.001). A normal hemoglobin value (> 12 g/dl in females and > 13 g/dl in males) before surgery (HR 0.43, 95% CI 0.29–0.65, p < 0.001) was the only independent predictive factor positively affecting survival.
Conclusions: PBM is a quality improvement tool that is associated with better mid-term surgical oncologic outcome. The root cause for improvement is the increase of patients entering surgery with normal hemoglobin values.
Predicting the requirement for renal replacement therapy in intensive care patients with sepsis
(2018)
Sepsis is one of the most frequent causes of acute kidney injury (AKI) in critically ill patients, with initial organ impairment often followed by dysfunction in other systems. Renal dysfunction may therefore represent one facet in the evolution towards multiple organ dysfunction syndrome (MODS) or, alternatively, may be indicative of system-wide endothelial damage caused by hyperinflammation and a positive fluid balance. Whilst numerous biomarkers have been investigated to predict renal replacement therapy (RRT) requirement, including NGAL, TIMP-2 and IGFBP-7, mid-regional proadrenomedullin (MR-proADM) may also be of interest due to its involvement in capillary leakage, endothelial dysfunction and the initial stages of multiple organ failure development. ...
An early identification of sepsis patients likely to progress towards multiple organ failure is crucial in order to initiate targeted therapeutic strategies to decrease mortality. Our recent publication highlighted the greater accuracy of mid-regional proadrenomedullin (MR-proADM) compared with conventional biomarkers and clinical scores in predicting 28-day mortality in patients with initially low (≤7 points; N = 240) or moderate (8–13 points; N = 653) Sepsis-related Organ Failure Assessment (SOFA) scores, thus confirming results from smaller investigations. This additional post hoc analysis aimed to further describe the non-surviving patient population of both subgroups and identify those likely to progress towards sepsis-related multiple organ failure. ...
Uncalibrated semi-invasive continuous monitoring of cardiac index (CI) has recently gained increasing interest. The aim of the present study was to compare the accuracy of CI determination based on arterial waveform analysis with transpulmonary thermodilution. Fifty patients scheduled for elective coronary surgery were studied after induction of anaesthesia and before and after cardiopulmonary bypass (CPB), respectively. Each patient was monitored with a central venous line, the PiCCO system, and the FloTrac/Vigileo system. Measurements included CI derived by transpulmonary thermodilution and by uncalibrated semi-invasive pulse contour analysis. Percentage changes of CI were calculated. There was a moderate but significant correlation between pulse contour CI and thermodilution CI both before (r² = 0.72, P < 0.0001) and after (r² = 0.62, P < 0.0001) CPB, with a percentage error of 31% and 25%, respectively. Changes in pulse contour CI showed a significant correlation with changes in thermodilution CI both before (r² = 0.52, P < 0.0001) and after (r² = 0.67, P < 0.0001) CPB. Our findings demonstrate that the uncalibrated semi-invasive monitoring system was able to measure CI reliably compared with transpulmonary thermodilution in patients undergoing elective coronary surgery. Furthermore, the semi-invasive monitoring device was able to track haemodynamic changes and trends.
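The percentage error quoted above is conventionally computed from the Bland-Altman bias and the mean cardiac index of both methods (the Critchley and Critchley criterion regards methods as interchangeable below roughly 30%). A small sketch with invented paired measurements, not the study's data:

```python
import numpy as np

def percentage_error(reference, test):
    """Bland-Altman percentage error: twice the SD of the bias,
    divided by the mean CI across both methods, in percent."""
    bias = test - reference
    mean_ci = np.mean(np.concatenate([reference, test]))
    return 2 * np.std(bias, ddof=1) / mean_ci * 100

# invented paired CI readings (L/min/m2)
td = np.array([2.8, 3.1, 3.5, 4.0, 4.4])   # transpulmonary thermodilution
pc = np.array([2.6, 3.3, 3.4, 4.3, 4.2])   # uncalibrated pulse contour
pe = percentage_error(td, pc)
```

With these synthetic values the percentage error falls well under the 30% interchangeability limit; the study's reported values of 31% and 25% straddle that threshold.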
Introduction: The occurrence of organ dysfunction or failure after the first days of ICU treatment, and the subsequent mortality, in relation to the type of intensive care unit (ICU) admission is poorly elucidated. We therefore analyzed the association of ICU mortality with admission for medical (M), scheduled surgery (ScS) or unscheduled surgery (US) patients, as mirrored by the occurrence of organ dysfunction/failure (OD/OF) after the first 72 h of ICU stay.
Methods: For this retrospective cohort study (23,795 patients; DIVI registry; German Interdisciplinary Association for Intensive Care Medicine (DIVI)), organ dysfunction or failure was derived from the Sequential Organ Failure Assessment (SOFA) score (excluding the Glasgow Coma Scale). SOFA scores were collected on admission to the ICU and 72 h later. For patients with a length of stay of at least five days, a multivariate analysis was performed for each individual OD/OF on day three.
Results: M patients had the lowest prevalence of cardiovascular failure (M 31%; ScS 35%; US 38%), and the highest prevalence of respiratory (M 24%; ScS 13%; US 17%) and renal failure (M 10%; ScS 6%; US 7%). The risk of death was highest in M and ScS patients with respiratory failure (OR: M 2.4; ScS 2.4; US 1.4) and in surgical patients with renal failure (OR: M 1.7; ScS 2.7; US 2.4).
Conclusion: The dynamic evolution of OD/OF within 72h after ICU admission and mortality differed between patients depending on their types of admission. This has to be considered to exclude a systematic bias during multi-center trials.
Vasoplegia is a severe complication after cardiac surgery. In recent years, administration of the nitric oxide synthase inhibitor methylene blue (MB) has become a new therapeutic strategy. Our aim was to investigate the role of MB in the transendothelial migration of circulating blood cells, the potential role of cGMP, eNOS and iNOS in this process, and the influence of MB on endothelial cell apoptosis. Human vascular endothelial cells (HuMEC-1) were treated for 30 minutes or 2 hours with different concentrations of MB. Inflammation was mimicked by LPS stimulation before and after MB. Transmigration of PBMCs and T-lymphocytes through the treated endothelial cells was investigated. The influence of MB upon the different subsets of PBMCs (granulocytes, T- and B-lymphocytes, and monocytes) was assessed after transmigration by means of flow cytometry. The effect of MB on cell apoptosis was evaluated using Annexin-V and propidium iodide staining. Expression of cGMP, eNOS and iNOS was analyzed by means of RT-PCR and Western blot. Results were analyzed using the unpaired Student's t-test. Analysis of endothelial cell apoptosis indicated a dose-dependent increase of apoptotic cells after MB. We observed time- and dose-dependent effects of MB on the transendothelial migration of PBMCs. Prophylactic administration of MB led to an increase of transendothelial migration of PBMCs but not of Jurkat cells. Furthermore, HuMEC-1 secretion of cGMP correlated with iNOS expression after MB administration but not with eNOS expression. Expression of these molecules was reduced at the protein level after MB administration. This study clearly reveals that the endothelial response to MB is dose- and especially time-dependent. MB has different effects on circulating blood cell subtypes and modifies the release patterns of eNOS, iNOS and cGMP. Transendothelial migration is modulated after treatment with MB. Furthermore, MB provokes apoptosis of endothelial cells in a dose- and time-dependent manner.
Introduction: Electrical impedance tomography (EIT) is an emerging clinical tool for monitoring ventilation distribution in mechanically ventilated patients, for which many image reconstruction algorithms have been suggested. We propose an experimental framework to assess such algorithms with respect to their ability to correctly represent well-defined physiological changes. We defined a set of clinically relevant ventilation conditions and induced them experimentally in 8 pigs by controlling three ventilator settings (tidal volume, positive end-expiratory pressure and the fraction of inspired oxygen). In this way, large and discrete shifts in global and regional lung air content were elicited.
Methods: We used the framework to compare twelve 2D EIT reconstruction algorithms, including backprojection (the original and still most frequently used algorithm), GREIT (a more recent consensus algorithm for lung imaging), truncated singular value decomposition (TSVD), several variants of the one-step Gauss-Newton approach and two iterative algorithms. We considered the effects of using a 3D finite element model, assuming non-uniform background conductivity, noise modeling, reconstructing for electrode movement, total variation (TV) reconstruction, robust error norms, smoothing priors, and using difference vs. normalized difference data.
Results and Conclusions: Our results indicate that, while variation in the appearance of images reconstructed from the same data is not negligible, clinically relevant parameters do not vary considerably among the advanced algorithms. Among the analysed algorithms, several advanced algorithms perform well, while some others fare significantly worse. Given its vintage and ad hoc formulation, backprojection works surprisingly well, supporting the validity of previous studies in lung EIT.
Background: Macrophage migration inhibitory factor (MIF) is highly elevated after cardiac surgery and impacts postoperative inflammation. The aim of this study was to analyze whether the polymorphisms CATT5–7 (rs5844572/rs3063368, "-794") and the G>C single-nucleotide polymorphism (rs755622, -173) in the MIF gene promoter are related to postoperative outcome.
Methods: In 1116 patients undergoing cardiac surgery, the MIF gene polymorphisms were analyzed, and serum MIF was measured by ELISA in 100 patients.
Results: Patients with at least one extended repeat allele (CATT7) had a significantly higher risk of acute kidney injury (AKI) compared to others (23% vs. 13%; OR 2.01 (1.40–2.88), p = 0.0001). Carriers of CATT7 were also at higher risk of death (1.8% vs. 0.4%; OR 5.12 (0.99–33.14), p = 0.026). The GC genotype was associated with AKI (20% vs. GG/CC: 13%, OR 1.71 (1.20–2.43), p = 0.003). Multivariate analyses identified CATT7 as predictive for AKI (OR 2.13 (1.46–3.09), p < 0.001) and death (OR 5.58 (1.29–24.04), p = 0.021). CATT7 was associated with higher serum MIF before surgery (79.2 vs. 50.4 ng/mL, p = 0.008).
Conclusion: The CATT7 allele is associated with a higher risk of AKI and death after cardiac surgery, which might be related to chronically elevated serum MIF. Polymorphisms in the MIF gene may constitute a predisposition for postoperative complications, and their assessment may improve risk stratification and therapeutic guidance.
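The odds ratios with 95% confidence intervals reported above follow the standard 2x2-table calculation with a log-normal (Wald) approximation; a sketch with invented counts chosen to give an OR near 2, not the study's genotype data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a, b = events / non-events in carriers;
    c, d = events / non-events in non-carriers.
    Returns the odds ratio with its 95% Wald confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# invented 2x2 counts (AKI vs. no AKI in carriers / non-carriers)
or_, lower, upper = odds_ratio_ci(46, 154, 65, 435)
```

The multivariate (logistic regression) odds ratios in the abstract additionally adjust for covariates, which this unadjusted sketch does not capture.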
Background: To compare the effect of aprotinin with that of lysine analogues (tranexamic acid and ε-aminocaproic acid) on early mortality in three subgroups of patients undergoing cardiac surgery: low, intermediate and high risk.
Methods and Findings: We performed a meta-analysis of randomised controlled trials and observational studies using the following data sources: Medline, the Cochrane Library, and reference lists of identified articles. The primary outcome measure was early (in-hospital/30-day) mortality. The secondary outcome measures were any transfusion of packed red blood cells within 24 hours after surgery, any re-operation for bleeding or massive bleeding, and acute renal dysfunction or failure, as reported in the included publications.
Out of 328 search results, 31 studies (15 trials and 16 observational studies) comprising 33,501 patients were included. Early mortality was significantly increased with aprotinin vs. lysine analogues in the low-risk subgroup (pooled risk ratio (95% CI) 1.58 (1.13–2.21), p<0.001; n = 14,297) and in the intermediate-risk subgroup (1.42 (1.09–1.84), p<0.001; n = 14,427). In contrast, in the subgroup of high-risk patients (n = 4,777), the risk of mortality did not differ significantly between aprotinin and lysine analogues (1.03 (0.67–1.58), p = 0.90).
Conclusion: Aprotinin may be associated with an increased risk of mortality in low- and intermediate-risk cardiac surgery, but appears to have no effect on early mortality in high-risk cardiac surgery compared to lysine analogues. Thus, decisions to re-license aprotinin in lower-risk patients should be critically debated. In contrast, aprotinin may be beneficial in high-risk cardiac surgery, as it reduces the risk of transfusion and bleeding complications.
Background: Cell salvage is commonly used as part of a blood conservation strategy. However, concerns exist among clinicians about the efficacy of transfusing washed salvaged blood.
Methods: We performed a meta-analysis of randomized controlled trials in which patients scheduled for all types of surgery were randomized to washed cell salvage or to a control group with no cell salvage. Data were independently extracted; risk ratios (RR) and weighted mean differences (WMD) with 95% confidence intervals (CIs) were calculated. Data were pooled using a random-effects model. The primary endpoint was the number of patients exposed to allogeneic red blood cell (RBC) transfusion.
Results: Out of 1140 search results, a total of 47 trials were included. Overall, the use of washed cell salvage reduced the rate of exposure to allogeneic RBC transfusion by a relative 39% (RR = 0.61; 95% CI 0.57 to 0.65; P < 0.001), resulting in an average saving of 0.20 units of allogeneic RBC per patient (WMD = -0.20; 95% CI -0.22 to -0.18; P < 0.001), reduced the risk of infection by 28% (RR = 0.72; 95% CI 0.54 to 0.97; P = 0.03) and reduced the length of hospital stay by 2.31 days (WMD = -2.31; 95% CI -2.50 to -2.11; P < 0.001), but did not significantly affect the risk of mortality (RR = 0.92; 95% CI 0.63 to 1.34; P = 0.66). No statistically significant difference was observed in the number of patients exposed to re-operation, plasma or platelets, or in the rates of myocardial infarction and stroke.
Conclusions: Washed cell salvage is efficacious in reducing the need for allogeneic RBC transfusion and risk of infection in surgery.
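Pooled risk ratios like those reported in the meta-analyses above are commonly obtained with a random-effects model. The sketch below implements the DerSimonian-Laird approach on study-level estimates; the inputs are invented for illustration, not the trial data behind either meta-analysis:

```python
import math

def dl_pool(rrs, ci_los, ci_his, z=1.96):
    """DerSimonian-Laird random-effects pooling of risk ratios.

    Each study contributes a point estimate and its 95% CI, from which
    the standard error of log(RR) is back-calculated.
    """
    y = [math.log(r) for r in rrs]                      # log risk ratios
    se = [(math.log(h) - math.log(l)) / (2 * z) for l, h in zip(ci_los, ci_his)]
    w = [1 / s**2 for s in se]                          # fixed-effect weights
    k = len(y)
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe)**2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]              # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return math.exp(y_re), math.exp(y_re - z * se_re), math.exp(y_re + z * se_re)

# Illustrative study-level RRs with their 95% CI bounds (invented):
pooled, lo, hi = dl_pool([1.4, 1.8, 1.5], [1.0, 1.1, 0.9], [1.96, 2.94, 2.5])
```

When the heterogeneity statistic Q is below its degrees of freedom, tau² collapses to zero and the pooled result coincides with the fixed-effect estimate, as happens with these toy inputs.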
Purpose: Trauma is the leading cause of death in children. In adults, blood transfusion and fluid resuscitation protocols have changed over the past two decades, resulting in decreased morbidity and mortality. Here, transfusion and fluid resuscitation practices were analysed in severely injured children in Germany.
Methods: Severely injured children (maximum Abbreviated Injury Scale (AIS) ≥ 3) admitted to a certified trauma-centre (TraumaZentrum DGU®) between 2002 and 2017 and registered at the TraumaRegister DGU® were included and assessed regarding blood transfusion rates and fluid therapy.
Results: 5,118 children (aged 1–15 years) with a mean ISS of 22 were analysed. The rate of blood transfusions administered up to ICU admission decreased from 18% (2002–2005) to 7% (2014–2017). Transfused children were increasingly severely injured: mean ISS in transfused children aged 1–15 years rose from 27.7 (2002–2005) to 34.4 (2014–2017), whereas mean ISS in non-transfused children decreased from 19.6 (2002–2005) to 17.6 (2014–2017). Mean prehospital fluid administration decreased from 980 to 549 ml without increasing hemodynamic instability.
Conclusion: Blood transfusion rates and the amount of fluid resuscitation in severely injured children decreased over a 16-year period in Germany. Restrictive blood transfusion and fluid management have become common practice in severely injured children. A restrictive prehospital fluid management strategy in severely injured children is not associated with a worsened hemodynamic state, abnormal coagulation or base excess, but leads to higher hemoglobin levels.
Introduction: Cell salvage (CS) is an integral part of patient blood management (PBM) and aims to reduce allogeneic red blood cell (RBC) transfusion.
Material and methods: This observational study analysed patients scheduled for elective cardiac surgery requiring cardiopulmonary bypass (CPB) between November 2015 and October 2018. Patients were divided into a CS group (patients receiving CS) and a control group (no CS). Primary endpoints were the number of patients exposed to allogeneic RBC transfusions and the number of RBC units transfused per patient.
Results: A total of 704 patients undergoing cardiac surgery were analysed, of whom 338 underwent surgery with CS (CS group) and 366 without CS (control group). Intraoperatively, 152 patients (45%) in the CS group and 93 patients (25%) in the control group were exposed to allogeneic RBC transfusions (P < 0.001). Considering the amount of intraoperative blood loss, regression analysis revealed a significant association between blood loss and increased use of RBC units in the control group compared to the CS group (1000 mL: 1.0 vs. 0.6 RBC units; 2000 mL: 2.2 vs. 1.1 RBC units; 3000 mL: 3.4 vs. 1.6 RBC units). Thus, CS was significantly associated with a reduction in allogeneic RBC use of 40% at 1000 mL, 49% at 2000 mL, and 52% at 3000 mL of blood loss compared to patients without CS.
Conclusions: Cell salvage was significantly associated with a reduced number of allogeneic RBC transfusions. It supports the beneficial effect of CS in cardiac surgical patients as an individual measure in a comprehensive PBM program.
Background: Paediatric patients are vulnerable to blood loss, and even a small loss of blood can be associated with severe shock. In emergency situations, a red blood cell (RBC) transfusion may become unavoidable, although it is associated with various risks. The aim of this trial was to identify independent risk factors for perioperative RBC transfusion in children undergoing surgery. Methods: To identify independent risk factors for perioperative RBC transfusion in children undergoing surgery and to assess RBC transfusion rates and in-hospital outcomes (e.g., length of stay, mortality, and typical postoperative complication rates), a monocentric, retrospective, observational study was conducted. Descriptive, univariate, and multivariate analyses were performed. Results: Between 1 January 2010 and 31 December 2019, data from n = 14,248 cases were identified at the centre. Analysis revealed an RBC transfusion rate of 10.1% (n = 1439) in the entire cohort. The independent predictors of RBC transfusion were the presence of preoperative anaemia (p < 0.001; OR = 15.10 with preoperative anaemia and OR = 2.40 without preoperative anaemia), younger age (p < 0.001; ORs between 0.14 and 0.28 for children older than 0 years), female gender (p = 0.036; OR = 1.19 compared to male gender), certain types of surgery (e.g., neurosurgery (p < 0.001; OR = 10.14), vascular surgery (p < 0.001; OR = 9.93), cardiac surgery (p < 0.001; OR = 4.79), gynaecology (p = 0.014; OR = 3.64), and visceral surgery (p < 0.001; OR = 2.48)), and the presence of postoperative complications (e.g., sepsis (p < 0.001; OR = 10.16), respiratory dysfunction (p < 0.001; OR = 7.56), cardiovascular dysfunction (p < 0.001; OR = 4.68), neurological dysfunction (p = 0.029; OR = 1.77), and renal dysfunction (p < 0.001; OR = 16.17)).
Conclusion: Preoperative anaemia, younger age, female gender, certain types of surgery, and postoperative complications are independent predictors of RBC transfusion in children undergoing surgery. Future prospective studies are urgently required to identify, in detail, the potential risk factors and the impact of RBC transfusion in children.
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during transcatheter aortic valve implantation (TAVI) in nonventilated patients and its impact on procedural outcome. Methods and Results: We measured ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (58% female, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post-RVP 62.6% vs. 53.9% during RVP, p < .01; pre-RVP 61.6% vs. post-RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and more often experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: ScO2 declined significantly during RVP compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and is a valid cut-off value to screen for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk for cerebrovascular complications and delirium.
Introduction: In recent years, resource-saving use of allogeneic blood products and a reduction in transfusion rates in adults have been observed. However, comparable published national data on transfusion practices in pediatric patients are currently not available. In this study, transfusion rates for children and adolescents were analyzed based on data from the Federal Statistical Office of Germany over the past two decades. Methods: Data were queried via the database of the Federal Statistical Office (Destatis). The period covered was 2005 to 2018, and the sample comprised children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions and for procedures or interventions with an increased transfusion risk were queried and evaluated in detail. Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total collective of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4-year-olds (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were most frequently transfused (in 2018, 40.2% of the children were cared for in hospital). Transfusion-associated procedures such as chemotherapy or mechanical ventilation and respiratory support for newborns and infants are on the rise. Conclusion: Transfusion rates are declining in children and adolescents overall, but the reasons for the increases in some age groups are unclear. Prospective studies to evaluate transfusion rates and triggers in children are urgently needed.
Background: Approximately one in three patients suffers from preoperative anaemia. Even though haemoglobin is measured before surgery, anaemia management is not implemented in every hospital. Objective: Here, we demonstrate the implementation of an anaemia walk-in clinic at an Orthopedic University Hospital. To improve the diagnosis of iron deficiency (ID), we examined whether reticulocyte haemoglobin (Ret-He) could be a useful additional parameter. Material and Methods: In August 2019, an anaemia walk-in clinic was established. Between September and December 2019, major orthopaedic surgical patients were screened for preoperative anaemia. The primary endpoint was the incidence of preoperative anaemia. Secondary endpoints included Ret-He level, red blood cell (RBC) transfusion rate, in-hospital length of stay and anaemia at hospital discharge. Results: A total of 104 patients were screened for anaemia. The preoperative anaemia rate was 20.6%. Intravenous iron was supplemented in 23 patients. RBC units transfused per patient (1.7 ± 1.2 vs. 0.2 ± 0.9; p = 0.004) and hospital length of stay (13.1 ± 4.8 days vs. 10.6 ± 5.1 days; p = 0.068) were higher in anaemic than in non-anaemic patients. Ret-He values were significantly lower in patients with ID anaemia (33.3 pg [28.6–40.2 pg]) than in patients with ID (35.3 pg [28.9–38.6 pg]; p = 0.015) or patients without anaemia (35.4 pg [30.2–39.4 pg]; p = 0.001). Conclusion: Preoperative anaemia is common in orthopaedic patients. Our results demonstrate the feasibility of an anaemia walk-in clinic for managing preoperative anaemia. Furthermore, our analysis supports the use of Ret-He as an additional parameter for the diagnosis of ID in surgical patients.
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Owing to local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on estimation accuracy has not previously been a focus of research. Material and methods: A blood loss simulation study containing four "bleeding" scenarios per sponge colour was created using expired whole blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with full electrolyte solution. Study participants had to estimate the blood loss absorbed in the sponges in all scenarios. The difference from the reference blood loss (RBL) was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour. Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with reference blood loss for white (Spearman's rho: 0.521; p = 3.748*10−16) and green sponges (Spearman's rho: 0.452; p = 4.683*10−12). The median visually estimated blood loss was higher for white sponges (250 ml, IQR 150–412.5 ml) than for green sponges (150 ml, IQR 100–300 ml), compared to the reference blood loss (103 ml, IQR 86–162.8 ml). For both sponge colours, major under- and overestimation was observed. Multivariate statistics demonstrated that fabric colour has a significant influence on estimation (p = 3.04*10−10), as do the clinician's qualification level (p = 2.20*10−10, p = 1.54*10−08) and the amount of RBL to be estimated (p < 2*10−16). Conclusion: The deviation from the correct blood loss estimate was smaller with white surgical sponges than with green sponges. In general, deviations were so severe for both types of sponges that it appears advisable to refrain from visually estimating blood loss whenever possible and instead to use other techniques, such as colorimetric estimation.
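Spearman's rho, used above to compare estimated with reference blood loss, is the Pearson correlation of the ranks; a tie-free stdlib sketch with hypothetical estimate/reference pairs (not the study's data):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation (no tie correction, for illustration)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx)**2 for a in rx) * sum((b - my)**2 for b in ry)) ** 0.5
    return num / den

# Hypothetical visually estimated (ml) vs reference blood loss (ml):
est = [250, 150, 400, 100, 300]
ref = [103, 90, 160, 86, 140]
rho = spearman_rho(est, ref)  # 1.0 for this perfectly monotone toy data
```

Because the coefficient depends only on ranks, the systematic overestimation visible in the toy data does not lower rho; that is why the study could see moderate correlation alongside severe absolute deviations.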
Background: Perioperative anaemia leads to impaired oxygen supply with a risk of vital organ ischaemia. In healthy and fit individuals, anaemia can be compensated by several mechanisms. Elderly patients, however, have fewer compensatory mechanisms because of multiple co-morbidities and an age-related decline in functional reserves. The purpose of the study is to evaluate whether elderly surgical patients may benefit from a liberal red blood cell (RBC) transfusion strategy compared to a restrictive transfusion strategy.
Methods: The LIBERAL Trial is a prospective, randomized, multicentre, controlled clinical phase IV trial randomising 2470 elderly (≥ 70 years) patients undergoing intermediate- or high-risk non-cardiac surgery. Registered patients will be randomised only if haemoglobin (Hb) reaches ≤9 g/dl during surgery or within 3 days after surgery, either to the LIBERAL group (transfusion of a single RBC unit when Hb ≤ 9 g/dl, with a target range for the post-transfusion Hb level of 9–10.5 g/dl) or to the RESTRICTIVE group (transfusion of a single RBC unit when Hb ≤ 7.5 g/dl, with a target range for the post-transfusion Hb level of 7.5–9 g/dl). The intervention per patient will be followed until hospital discharge or up to 30 days after surgery, whichever occurs first. The primary efficacy outcome is defined as a composite of all-cause mortality, acute myocardial infarction, acute ischaemic stroke, acute kidney injury (stage III), acute mesenteric ischaemia and acute peripheral vascular ischaemia within 90 days after surgery. Infections requiring intravenous antibiotics with re-hospitalisation are assessed as an important secondary endpoint. The primary endpoint will be analysed by logistic regression adjusting for age, cancer surgery (y/n) and type of surgery (intermediate- or high-risk), and incorporating centres as a random effect.
Discussion: The LIBERAL-Trial will evaluate whether a liberal transfusion strategy reduces the occurrence of major adverse events after non-cardiac surgery in the geriatric population compared to a restrictive strategy within 90 days after surgery.
Trial registration: ClinicalTrials.gov (identifier: NCT03369210).
Background: Approximately every third surgical patient is anemic. The most common form, iron deficiency anemia, results from persisting iron‐deficient erythropoiesis (IDE). Zinc protoporphyrin (ZnPP) is a promising parameter for diagnosing IDE, hitherto requiring blood drawing and laboratory workup.
Study design and methods: Noninvasive ZnPP (ZnPP‐NI) measurements are compared to ZnPP reference determination of the ZnPP/heme ratio by high‐performance liquid chromatography (ZnPP‐HPLC) and the analytical performance in detecting IDE is evaluated against traditional iron status parameters (ferritin, transferrin saturation [TSAT], soluble transferrin receptor–ferritin index [sTfR‐F], soluble transferrin receptor [sTfR]), likewise measured in blood. The study was conducted at the University Hospitals of Frankfurt and Zurich.
Results: Limits of agreement between ZnPP‐NI and ZnPP‐HPLC measurements for 584 cardiac and noncardiac surgical patients equaled 19.7 μmol/mol heme (95% confidence interval, 18.0–21.3; acceptance criterion, 23.2 μmol/mol heme; absolute bias, 0 μmol/mol heme). Analytical performance for detecting IDE, inferred from the area under the receiver operating characteristic curve (AUC), of parameters measured in blood was: ZnPP‐HPLC (0.95), sTfR (0.92), sTfR‐F (0.89), TSAT (0.87), and ferritin (0.67). Noninvasively measured ZnPP‐NI yielded an AUC of 0.90.
Conclusion: ZnPP‐NI appears well suited for initial IDE screening, informing on the state of erythropoiesis at the point of care without blood drawing and laboratory analysis. Comparison with a multiparameter IDE test revealed that ZnPP‐NI values of 40 μmol/mol heme or less allow exclusion of IDE, whereas at 65 μmol/mol heme or greater, IDE is very likely if other causes of increased values are excluded. In these cases (77% of our patients), ZnPP‐NI may suffice for a diagnosis, while values in between require analysis of additional iron status parameters.
In-line filtration of intravenous infusion may reduce organ dysfunction of adult critical patients
(2019)
Background: The potential harmful effects of particle-contaminated infusions in critically ill adult patients are still unclear. So far, significantly improved outcomes with in-line filters have been demonstrated only in critically ill children and newborns; for adult patients, evidence is still missing.
Methods: This single-centre, retrospective controlled cohort study assessed the effect of in-line filtration of intravenous fluids with finer 0.2 or 1.2 μm vs 5.0 μm filters in critically ill adult patients. From a total of n = 3215 adult patients, n = 3012 patients were selected by propensity score matching (adjusting for sex, age, and surgery group) and assigned to either a fine filter cohort (with 0.2/1.2 μm filters, n = 1506, time period from February 2013 to January 2014) or a control filter cohort (with 5.0 μm filters, n = 1506, time period from April 2014 to March 2015). The cohorts were compared regarding the occurrence of severe vasoplegia, organ dysfunctions (lung, kidney, and brain), inflammation, in-hospital complications (myocardial infarction, ischemic stroke, pneumonia, and sepsis), in-hospital mortality, and length of ICU and hospital stay.
Results: Comparing the fine filter vs the control filter cohort, respiratory dysfunction (Horowitz index 206 (119–290) vs 191 (104.75–280); P = 0.04), pneumonia (11.4% vs 14.4%; P = 0.02), sepsis (9.6% vs 12.2%; P = 0.03), interleukin-6 (471.5 (258.8–1062.8) ng/l vs 540.5 (284.5–1147.5) ng/l; P = 0.01), length of ICU stay (1.2 (0.6–4.9) vs 1.7 (0.8–6.9) days; P < 0.01) and length of hospital stay (14.0 (9.2–22.2) vs 14.8 (10.0–26.8) days; P = 0.01) were reduced. Rates of severe vasoplegia (21.0% vs 19.6%; P > 0.20) and acute kidney injury (11.8% vs 13.7%; P = 0.11) were not significantly different between the cohorts.
Conclusions: In-line filtration with finer 0.2 and 1.2 μm filters may be associated with less organ dysfunction and less inflammation in critically ill adult patients.
Trial registration: The study was registered at ClinicalTrials.gov (number: NCT02281604).
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells is the mainstay to compensate for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be available in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes than central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing compared to CL services. Methods: Data from the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®) against CL as the reference method. For this, we analyzed pairs of Hb levels measured by CL and BGA within two hours of each other. Furthermore, the impact of various confounding factors, including age, gender, BMI, smoking status, RBC transfusion, intraoperative hemodilution and co-medication, was examined. To ensure adequate statistical analysis, only data from participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and the GEM series® (-0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (-0.09 g/dl) but a greater median absolute deviation (± 0.45 g/dl). To avoid interference between the different standard deviations of the different analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms.
Smoking status showed the greatest reduction in bias (0.1 g/dl, p<0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p<0.001), statins (0.04 g/dl, p<0.05) and beta blockers (0.03 g/dl, p = 0.02) had a slight effect on bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions, such as the substitution of fluids, coagulation factors or RBC units, rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference from CL, our data confirm that different BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that bias decreased at smaller Hb values, which in turn are used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the LIBERAL trial performance.
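The Bland-Altman analysis referred to above reduces to the mean of the paired differences (the bias) and its 95% limits of agreement; a minimal sketch with hypothetical BGA/CL Hb pairs, not the trial data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement between two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)         # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired Hb values (g/dl): BGA device vs central laboratory
bga = [9.1, 10.4, 8.2, 11.0, 7.6, 12.3]
cl = [9.5, 10.8, 8.5, 11.3, 8.1, 12.6]
bias, loa_lo, loa_hi = bland_altman(bga, cl)
```

A negative bias, as in this toy example, corresponds to the BGA device reading systematically lower than the central laboratory, matching the direction reported for two of the three devices.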
Background: The use of cell salvage and autologous blood transfusion has become an important method of blood conservation. So far, there are no clinical data about the performance of the continuous autotransfusion device CATSmart.
Methods: In total, 74 patients undergoing either cardiac or orthopedic surgery were included in this prospective, bicenter, observational technical evaluation to validate the red cell separation process and washout quality of the CATSmart. The target for the red cell separation process was defined as a hematocrit value of 55–75% in the packed red cell unit, and for washout quality as a removal ratio of 80–100%.
Results: Hematocrit values measured by the CATSmart and by laboratory analysis were 78.5% [71.3%; 84.0%] and 73.7% [67.5%; 75.5%], respectively. Removal ratios for platelets (94.7% [88.2%; 96.7%]), free hemoglobin (89.3% [85.2%; 94.9%]), albumin (97.9% [96.6%; 98.5%]), heparin (99.9% [99.9%; 100.0%]) and potassium (92.5% [90.8%; 95.0%]) were within the target range, while removal of white blood cells, at 72.4% [57.9%; 87.3%], was slightly below target.
Conclusion: The new autotransfusion device enables sufficient red cell separation and washout quality.
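Washout quality of a cell saver is typically expressed as a removal ratio. The sketch below uses one common simplification that corrects the post-wash concentration for haemoconcentration via the plasma fractions; both the formula choice and the numbers are illustrative assumptions, not the protocol used in the CATSmart evaluation:

```python
def removal_ratio(c_in, c_out, hct_in, hct_out):
    """Removal ratio (%) of a plasma-borne substance after cell washing.

    Corrects the post-wash concentration for haemoconcentration via the
    plasma fractions before/after washing -- a common simplification,
    not necessarily the formula used in the study above.
    """
    corrected_out = c_out * (1 - hct_in) / (1 - hct_out)
    return 100 * (1 - corrected_out / c_in)

# Hypothetical pre/post-wash albumin concentrations (g/l) and hematocrits:
ratio = removal_ratio(c_in=20.0, c_out=1.5, hct_in=0.25, hct_out=0.70)
```

Without the plasma-fraction correction, the sharp rise in hematocrit during washing would overstate the removal ratio, since the remaining contaminant is concentrated into a much smaller plasma volume.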
Background: Mild therapeutic hypothermia following cardiac arrest is neuroprotective, but its effect on myocardial dysfunction, a critical issue following resuscitation, is unclear. This study sought to examine whether hypothermia, and the combination of hypothermia and pharmacological postconditioning, are cardioprotective in a model of cardiopulmonary resuscitation following acute myocardial ischemia. Methodology/Principal Findings: Thirty pigs (28–34 kg) were subjected to cardiac arrest following left anterior descending coronary artery ischemia. After 7 minutes of ventricular fibrillation and 2 minutes of basic life support, advanced cardiac life support was started according to the current AHA guidelines. After successful return of spontaneous circulation (n = 21), coronary perfusion was re-established after 60 minutes of occlusion, and animals were randomized for 24 hours to either normothermia at 38°C, hypothermia at 33°C, or hypothermia at 33°C combined with sevoflurane (n = 7 per group). The effects on cardiac damage, especially on inflammation, apoptosis and remodeling, were studied using cellular and molecular approaches. Five animals were sham operated. Animals treated with hypothermia had lower troponin T levels (p<0.01), reduced infarct size (34±7 versus 57±12%; p<0.05) and improved left ventricular function compared to normothermia (p<0.05). Hypothermia was associated with a reduction in: (i) immune cell infiltration, (ii) apoptosis, (iii) IL-1beta and IL-6 mRNA up-regulation, and (iv) IL-1beta protein expression (p<0.05). Moreover, decreased matrix metalloproteinase-9 activity was detected in the ischemic myocardium after treatment with mild hypothermia. Sevoflurane conferred additional protective effects, although statistical significance was not reached.
Conclusions/Significance: Hypothermia reduced myocardial damage and dysfunction after cardiopulmonary resuscitation, possibly via a reduced rate of apoptosis and pro-inflammatory cytokine expression.
In contrast to several smaller studies demonstrating that remote ischemic preconditioning (RIPC) reduces myocardial injury in patients who undergo cardiovascular surgery, the RIPHeart study failed to demonstrate beneficial effects on troponin release and clinical outcome in propofol-anesthetized cardiac surgery patients. Therefore, we addressed the potential biochemical mechanisms triggered by RIPC. This is a predefined prospective sub-analysis of the recently published randomized and controlled RIPHeart study in cardiac surgery patients (n = 40). Blood samples were drawn from patients prior to surgery, after RIPC consisting of four cycles of 5 min arm ischemia/5 min reperfusion (n = 19) or the sham procedure (n = 21), after connection to cardiopulmonary bypass (CPB), at the end of surgery, and 24 h and 48 h postoperatively, for the measurement of troponin T, macrophage migration inhibitory factor (MIF), stromal cell-derived factor 1 (CXCL12), IL-6, CXCL8 and IL-10. After RIPC, right atrial tissue samples were taken for the measurement of extracellular signal-regulated kinase (ERK1/2), protein kinase B (AKT), glycogen synthase kinase 3 (GSK-3β), protein kinase C (PKCε) and MIF content. RIPC did not significantly reduce troponin release compared with the sham procedure. MIF serum levels increased intraoperatively, peaking at intensive care unit (ICU) admission (an increase over baseline of 48.04%, p = 0.164, with RIPC and 69.64%, p = 0.023, with the sham procedure), and returned to baseline 24 h after surgery, with no differences between the groups. In the right atrial tissue, MIF content decreased after RIPC (1.040 ± 1.032 arbitrary units [au] in RIPC vs. 2.028 ± 1.631 [au] in the sham procedure, p < 0.05). CXCL12 serum levels increased significantly over baseline at the end of surgery, with no differences between the groups.
ERK1/2, AKT, GSK-3β and PKCɛ phosphorylation in the right atrial samples did not differ between the groups. No differences were found in IL-6, CXCL8 and IL-10 serum levels between the groups. In this cohort of cardiac surgery patients who received propofol anesthesia, we could not show a release of potential signaling mediators, an effect on the inflammatory response, or an activation of well-established protein kinases after RIPC. Based on these data, we cannot exclude that confounding factors, such as propofol, may have interfered with RIPC.
Background: The ability of stroke volume variation (SVV), pulse pressure variation (PPV) and global end-diastolic volume (GEDV) to predict fluid responsiveness in the presence of pleural effusion is unknown. The aim of the present study was to challenge the ability of SVV, PPV and GEDV to predict fluid responsiveness in a porcine model of pleural effusion.
Methods: Pigs were studied at baseline and after fluid loading with 8 ml kg−1 of 6% hydroxyethyl starch. After withdrawal of 8 ml kg−1 of blood and induction of pleural effusion up to 50 ml kg−1 on either side, measurements at baseline and after fluid loading were repeated. Cardiac output, stroke volume, central venous pressure (CVP) and pulmonary artery occlusion pressure (PAOP) were obtained by pulmonary artery thermodilution, whereas GEDV was determined by transpulmonary thermodilution. SVV and PPV were monitored continuously by pulse contour analysis.
Results: Pleural effusion was associated with significant changes in lung compliance, peak airway pressure and stroke volume in both responders and non-responders. At baseline, SVV, PPV and GEDV reliably predicted fluid responsiveness (area under the curve 0.85 (p<0.001), 0.88 (p<0.001) and 0.77 (p = 0.007), respectively). After induction of pleural effusion, the ability of SVV, PPV and GEDV to predict fluid responsiveness was well preserved, and PAOP also became predictive. Threshold values for SVV and PPV increased in the presence of pleural effusion.
Conclusions: In this porcine model, bilateral pleural effusion did not affect the ability of SVV, PPV and GEDV to predict fluid responsiveness.
Nutrition support is a necessary therapy for critically ill cardiac surgery patients. However, conclusive evidence from well-conducted clinical trials is lacking for this population. To clarify optimal strategies to improve outcomes, an international multidisciplinary group of 25 experts from different clinical specialties in Germany, Canada, Greece, the USA and Russia discussed potential approaches to identify patients who may benefit from nutrition support, when best to initiate nutrition support, and the potential use of pharmaco-nutrition to modulate the inflammatory response to cardiopulmonary bypass. Despite conspicuous gaps in knowledge and evidence, a rational nutrition support therapy is presented to benefit patients undergoing cardiac surgery.
Background: Remote ischemic preconditioning (RIPC) has been shown to enhance the tolerance of remote organs to cope with a subsequent ischemic event. We hypothesized that RIPC reduces postoperative neurocognitive dysfunction (POCD) in patients undergoing complex cardiac surgery.
Methods: We conducted a prospective, randomized, double-blind, controlled trial including 180 adult patients undergoing elective cardiac surgery with cardiopulmonary bypass. Patients were randomized to either the RIPC or the control group. The primary endpoint was postoperative neurocognitive dysfunction 5–7 days after surgery, assessed by a comprehensive test battery. Cognitive change was assumed if the preoperative-to-postoperative difference exceeded one SD in 2 or more tasks assessing different cognitive domains (1 SD criterion), or if the combined Z score was 1.96 or greater (Z score criterion).
Results: According to the 1 SD criterion, 52% of control and 46% of RIPC patients had cognitive deterioration 5–7 days after surgery (p = 0.753). The summarized Z score showed a trend toward more cognitive decline in the control group (2.16±5.30) compared to the RIPC group (1.14±4.02; p = 0.228). Three months after surgery, the incidence and severity of neurocognitive dysfunction did not differ between control and RIPC. RIPC tended to decrease postoperative troponin T release at both 12 hours [0.60 (0.19–1.94) µg/L vs. 0.48 (0.07–1.84) µg/L] and 24 hours after surgery [0.36 (0.14–1.89) µg/L vs. 0.26 (0.07–0.90) µg/L].
Conclusions: We failed to demonstrate efficacy of a RIPC protocol with respect to incidence and severity of POCD and secondary outcome variables in patients undergoing a wide range of cardiac surgery. Therefore, definitive large-scale multicenter trials are needed.
Trial Registration: ClinicalTrials.gov NCT00877305
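The two dichotomous POCD criteria described in the trial above can be sketched in code. The sketch below uses hypothetical task scores and simplifies direction handling (a positive Z means deterioration) and the normalisation of the combined Z score, so it illustrates the logic of the criteria rather than the exact study algorithm.

```python
# Sketch of the "1 SD" and "combined Z score" criteria for postoperative
# cognitive dysfunction (POCD). All scores and SDs are hypothetical.

def task_z(pre, post, control_sd):
    # Deterioration Z score for one task (higher = more decline)
    return (pre - post) / control_sd

def pocd_one_sd(z_scores, n_tasks=2, threshold=1.0):
    # Decline if the pre-to-post difference exceeds 1 SD in >= 2 tasks
    return sum(z > threshold for z in z_scores) >= n_tasks

def pocd_combined_z(z_scores, threshold=1.96):
    # Combined Z: sum of task Zs, renormalised assuming (for simplicity)
    # independent tasks with unit variance in the reference sample
    combined = sum(z_scores) / len(z_scores) ** 0.5
    return combined >= threshold

# Hypothetical patient: five tasks with pre/post scores and reference SDs
pre_scores = [30, 55, 12, 48, 25]
post_scores = [24, 50, 11, 40, 24]
control_sds = [4.0, 5.0, 2.0, 6.0, 3.0]

z = [task_z(a, b, s) for a, b, s in zip(pre_scores, post_scores, control_sds)]
print("decline (1 SD criterion):", pocd_one_sd(z))
print("decline (Z criterion):   ", pocd_combined_z(z))
```

Because the two criteria dichotomize differently, a patient can be flagged by one and not the other, which is one reason reported POCD incidences vary with the definition used.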
BACKGROUND: Transient episodes of ischemia in a remote organ or tissue (remote ischemic preconditioning, RIPC) can attenuate myocardial injury. Myocardial damage is associated with tissue remodeling and the matrix metalloproteinases 2 and 9 (MMP-2/9) are crucially involved in these events. Here we investigated the effects of RIPC on the activities of heart tissue MMP-2/9 and their correlation with serum concentrations of cardiac troponin T (cTnT), a marker for myocardial damage.
METHODS: In cardiosurgical patients undergoing cardiopulmonary bypass (CPB), RIPC was induced by four 5-minute cycles of upper-limb ischemia/reperfusion. Cardiac tissue was obtained before as well as after CPB, and serum cTnT concentrations were measured. Tissue derived from control patients (N = 17) with high cTnT concentrations (≥0.32 ng/ml) and RIPC patients (N = 18) with low cTnT (≤0.32 ng/ml) was subjected to gelatin zymography to quantify MMP-2/9 activities.
RESULTS: In cardiac biopsies obtained before CPB, activities of MMP-2/9 were attenuated in the RIPC group (MMP-2: Control, 1.13 ± 0.13 a.u.; RIPC, 0.71 ± 0.12 a.u.; P < 0.05. MMP-9: Control, 1.50 ± 0.16 a.u.; RIPC, 0.87 ± 0.14 a.u.; P < 0.01), while activities of the pro-MMPs were not altered (P > 0.05). In cardiac biopsies taken after CPB, activities of pro- and active MMP-2/9 did not differ between the groups (P > 0.05). Spearman's rank tests showed that MMP-2/9 activities in cardiac tissue obtained before CPB were positively correlated with postoperative cTnT serum levels (MMP-2, P = 0.016; MMP-9, P = 0.015).
CONCLUSIONS: Activities of MMP-2/9 in cardiac tissue obtained before CPB are attenuated by RIPC and are positively correlated with serum concentrations of cTnT. MMPs may represent potential targets for RIPC mediated cardioprotection.
TRIAL REGISTRATION: ClinicalTrials.gov identifier NCT00877305.
Background: In intensive care units (ICUs), octogenarians have become a routine patient group in whom therapeutic and diagnostic decision-making is aggravated. Given the increased mortality and reduced quality of life in this high-risk population, medical decision-making requires optimal risk stratification all the more. Recently, the VIP-1 trial prospectively observed that the clinical frailty scale (CFS) performed well in ICU patients for overall-survival and short-term outcome prediction. However, the healthcare systems of the 21 countries contributing to the VIP-1 trial are known to differ. Hence, our main focus was to investigate whether the CFS is usable for risk stratification in octogenarians admitted to diversified and high-tech German ICUs.
Methods: This multicentre prospective cohort study analyses very old patients admitted to 20 German ICUs as a sub-analysis of the VIP-1 trial. Three hundred and eight patients aged 80 years or older were admitted consecutively to the participating ICUs. CFS, cause of admission, APACHE II, SAPS II and SOFA scores, use of ICU resources and ICU and 30-day mortality were recorded. Multivariate logistic regression analysis was used to identify factors associated with 30-day mortality.
Results: Patients had a median age of 84 [IQR 82–87] years and a mean CFS of 4.75 (± 1.6 standard deviation) points. More than half of the patients (53.6%) were classified as frail (CFS ≥ 5). ICU mortality was 17.3% and 30-day mortality was 31.2%. The cause of admission (planned vs. unplanned; OR 5.74) and the CFS (OR 1.44 per point increase) were independent predictors of 30-day survival.
Conclusions: The CFS is an easily determined, valuable tool for predicting 30-day survival in octogenarians admitted to the ICU; thus, it may facilitate decision-making for intensive care physicians in Germany.
Trial registration: The VIP-1 study was retrospectively registered on ClinicalTrials.gov (ID: NCT03134807) on May 1, 2017.
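The per-point odds ratios reported above can be translated into predicted risks. The sketch below shows how, under a logistic model, an OR of 1.44 per CFS point and 5.74 for unplanned admission scale the odds of 30-day death; the baseline odds value is a hypothetical placeholder, not a fitted coefficient from the study.

```python
# Sketch of how per-unit odds ratios from a logistic regression translate
# into predicted probabilities. ORs are taken from the abstract above;
# BASELINE_ODDS is a hypothetical value chosen only for illustration.

OR_CFS = 1.44          # odds ratio per one-point CFS increase (reported)
OR_UNPLANNED = 5.74    # odds ratio for unplanned vs. planned admission (reported)
BASELINE_ODDS = 0.10   # hypothetical odds of 30-day death at CFS = 1, planned

def predicted_risk(cfs, unplanned):
    # Odds multiply by OR_CFS for every CFS point above the reference
    odds = BASELINE_ODDS * OR_CFS ** (cfs - 1)
    if unplanned:
        odds *= OR_UNPLANNED
    return odds / (1 + odds)  # convert odds back to a probability

for cfs in (2, 4, 6, 8):
    print(f"CFS {cfs}: planned {predicted_risk(cfs, False):.2f}, "
          f"unplanned {predicted_risk(cfs, True):.2f}")
```

This multiplicative-odds structure is why a fixed OR implies increasingly large absolute risk differences as baseline risk rises toward 50%.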
Introduction: Observational studies have demonstrated an association between vitamin D deficiency and increased risk of morbidity and mortality in critically ill patients. Cohort studies and pilot trials have suggested promising beneficial effects of vitamin D replacement in the critically ill, at least in patients with severe vitamin D deficiency. As vitamin D is a simple, low-cost and safe intervention, it has the potential to improve survival in critically ill patients.
Methods and analysis: In this randomised, placebo-controlled, double-blind, multicentre, international trial, 2400 adult patients with severe vitamin D deficiency (25-hydroxyvitamin D≤12 ng/mL) will be randomised in a 1:1 ratio by www.randomizer.at to receive a loading dose of 540 000 IU cholecalciferol within 72 hours after intensive care unit (ICU) admission, followed by 4000 IU daily for 90 days or placebo. Hypercalcaemia may occur as a side effect, but is monitored by regular checks of the calcium level. The primary outcome is all-cause mortality at 28 days after randomisation. Secondary outcomes are: ICU, hospital, 90-day and 1-year mortality; hospital and ICU length of stay, change in organ dysfunction on day 5 as measured by Sequential Organ Function Assessment (SOFA) score, number of organ failures; hospital and ICU readmission until day 90; discharge destination, self-reported infections requiring antibiotics until day 90 and health-related quality of life. Recruitment status is ongoing.
Ethics and dissemination: National ethical approval was obtained from the Ethics Committee of the University of Graz for Austria, Erasme University Brussels (Belgium) and University Hospital Frankfurt (Germany); further approvals will be obtained according to the individual national processes. On completion, the results will be published in a peer-reviewed scientific journal. The study findings will be presented at national and international meetings, with abstracts available online.
Trial registration: NCT03188796, EudraCT-No: 2016-002460-13.
Background: Age and preoperative anaemia are risk factors for poor surgical outcome and blood transfusion. The aim of this study was to examine the effect of iron supplementation in iron-deficient (ID) elderly patients undergoing major surgery.
Method: In this single-centre observational study, patients ≥ 65 years undergoing major surgery were screened for anaemia and ID. Patients were assigned to the following groups: A− (no anaemia); A−,ID+,T+ (no anaemia, iron-deficient, intravenous iron supplementation); A+ (anaemia); and A+,ID+,T+ (anaemia, iron-deficient, intravenous iron supplementation).
Results: Of 4,381 patients screened at the anaemia walk-in clinic, 2,381 (54%) patients were ≥ 65 years old and 2,191 cases were included in analysis. The ID prevalence was 63% in patients with haemoglobin (Hb) < 8 g/dl, 47.2% in patients with Hb from 8.0 to 8.9 g/dl, and 44.3% in patients with Hb from 9 to 9.9 g/dl. In severely anaemic patients, an Hb increase of 0.6 (0.4; 1.2) and 1.2 (0.7; 1.6) g/dl was detected with iron supplementation 6–10 and > 10 days before surgery, respectively. Hb increased by 0 (-0.1; 0) g/dl with iron supplementation 1–5 days before surgery, 0.2 (-0.1; 0.5) g/dl with iron supplementation 6–10 days before surgery, and 0.2 (-0.2; 1.1) g/dl with supplementation > 10 days before surgery (p < 0.001 for 1–5 vs. 6–10 days). Overall, 58% of A+,ID+,T+ patients showed an Hb increase of > 0.5 g/dl. The number of transfused red blood cell units was significantly lower in patients supplemented with iron (0 (0; 3)) compared to non-treated anaemic patients (1 (0; 4)) (p = 0.03). Patients with iron supplementation > 6 days before surgery achieved mobility 2 days earlier than patients with iron supplementation < 6 days.
Conclusions: Intravenous iron supplementation increases Hb level and thereby reduces blood transfusion rate in elderly surgical patients with ID anaemia.
Patients at risk of ischemic injury, e.g. during circulatory arrest in cardiac surgery or after resuscitation, are subjected to therapeutic hypothermia. For aortic surgery, the body is traditionally cooled down to 18 °C and then rewarmed to body temperature. The role of hypothermia and the subsequent rewarming process in leukocyte-endothelial interactions and the expression of junctional adhesion molecules (JAMs) has not yet been clarified. Thus, in an in-vitro model, we investigated the influence of temperature modulation during activation and transendothelial migration of leukocytes through human endothelial cells. Additionally, we investigated the expression of JAMs in the rewarming phase. Exposure to low temperatures alone during transmigration scarcely affects leukocyte extravasation, whereas hypothermia during treatment and transendothelial migration improves leukocyte-endothelial interactions. Rewarming causes a significant up-regulation of transmigration with falling temperatures. JAM-A is significantly modulated during rewarming. Our data suggest that transendothelial migration of leukocytes is not only modulated by cell activation itself; the activation temperatures and the rewarming process are essential. Continued hypothermia significantly inhibits transendothelial migration, whereas the rewarming process strongly enhances transmigration. The expression of JAMs, especially JAM-A, is strongly modulated during the rewarming process. Endothelial protection prior to warm reperfusion, and mild hypothermic conditions reducing the difference between hypothermia and rewarming temperatures, should be considered.
More than 30% of the world's population is anemic, with serious economic consequences including reduced work capacity and other obstacles to national welfare and development. Red blood cell transfusion is the mainstay to correct anemia, but it is also 1 of the top 5 overused procedures. Patient blood management (PBM) is a proactive, patient-centered, and multidisciplinary approach to manage anemia, optimize hemostasis, minimize iatrogenic blood loss, and harness tolerance to anemia. Although the World Health Organization endorsed PBM in 2010, many hospitals still seek guidance on implementing PBM in clinical routine. Using proven change management principles, we propose simple, cost-effective measures enabling any hospital to reduce both anemia and red blood cell transfusions in surgical and medical patients. This article provides comprehensive bundles of PBM components encompassing 107 different PBM measures, divided into 6 bundle blocks, acting as a working template from which institutions can develop their individual PBM practices, whether beginning a program or trying to improve an existing one. A stepwise selection of the most feasible measures will facilitate the implementation of PBM. In this manner, PBM represents a new quality and safety standard.
Background: Intraoperative blood salvage (IBS) is regarded as an alternative to allogeneic blood transfusion that excludes the risks associated with allogeneic blood. Currently, IBS is generally avoided in tumor surgery due to concern about potential metastasis caused by residual tumor cells in the erythrocyte concentrate.
Methods: The feasibility, efficacy and safety of the newly developed Catuvab procedure, which uses the bispecific trifunctional antibody Catumaxomab, were investigated in an ex-vivo pilot study aiming to remove residual EpCAM-positive tumor cells from the autologous erythrocyte concentrates (EC) of various cancer patients, generated by an IBS device.
Results: Tumor cells were detected in the intraoperative blood of 10 of 16 patient samples, in the range of 69 to 2.6 × 10⁵ cells, but no residual malignant cells were found in the final erythrocyte concentrates after the Catuvab procedure. IL-6 and IL-8, pro-inflammatory cytokines released during surgery, were lowered on average 28-fold and 52-fold, respectively, during the Catuvab procedure, whereas Catumaxomab antibody was detected in 8 of 16 of the final EC products at a considerably decreased and uncritical residual amount (mean 37 ng).
Conclusion: These preliminary results indicate the efficacy and feasibility of the new medical device Catuvab, potentially allowing the reinfusion of autologous erythrocyte concentrates (EC) produced by an IBS device during oncological surgery with high blood loss. An open-label, multicenter clinical study on the removal of EpCAM-positive tumor cells from blood collected during tumor surgery using the Catuvab device has been initiated to validate these encouraging results.
Background: Intensive care resources are heavily utilized during the COVID-19 pandemic. However, risk stratification and prediction of SARS-CoV-2 patient clinical outcomes upon ICU admission remain inadequate. This study aimed to develop a machine learning model, based on retrospective and prospective clinical data, to stratify patient risk and predict ICU survival and outcomes.
Methods: A Germany-wide electronic registry was established to pseudonymously collect admission, therapeutic and discharge information of SARS-CoV-2 ICU patients retrospectively and prospectively. Machine learning approaches were evaluated for the accuracy and interpretability of predictions. The Explainable Boosting Machine approach was selected as the most suitable method. Individual, non-linear shape functions for predictive parameters and parameter interactions are reported.
Results: 1039 patients were included in the Explainable Boosting Machine model: 596 patients collected retrospectively and 443 prospectively. The model for prediction of general ICU outcome proved more reliable in predicting "survival". Age, inflammatory and thrombotic activity, and severity of ARDS at ICU admission were predictive of ICU survival. Patient age, pulmonary dysfunction and transfer from an external institution were predictors for ECMO therapy. The interaction of patient age with D-dimer levels on admission, and of creatinine levels with SOFA score without GCS, were predictors for renal replacement therapy.
Conclusions: Using Explainable Boosting Machine analysis, we confirmed and weighted previously reported predictors and identified novel predictors for outcome in critically ill COVID-19 patients. With this strategy, predictive modeling of COVID-19 ICU patient outcomes can be performed while overcoming the limitations of linear regression models.
Trial registration: "ClinicalTrials" (clinicaltrials.gov) under NCT04455451.
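The Explainable Boosting Machine used above is a generalized additive model: the log-odds score is an intercept plus one shape-function contribution per feature, plus optional pairwise interaction terms, passed through a sigmoid. The minimal sketch below illustrates only that additive structure; the shape functions, interaction term, and intercept are hypothetical placeholders, not the fitted functions from the registry model.

```python
# Minimal sketch of the additive (GAM-style) structure behind an
# Explainable Boosting Machine. All shape functions below are invented
# for illustration; they are not the study's fitted model.

import math

def shape_age(age):
    # hypothetical piecewise contribution of age to the log-odds of death
    if age < 50:
        return -0.8
    if age < 65:
        return -0.2
    if age < 75:
        return 0.4
    return 1.0

def shape_ddimer(dd):
    # hypothetical contribution of admission D-dimer (mg/L)
    return min(0.3 * math.log1p(dd), 1.2)

def interaction_age_ddimer(age, dd):
    # hypothetical pairwise term: high D-dimer matters more in the elderly
    return 0.3 if (age >= 75 and dd > 2.0) else 0.0

def predict_death_risk(age, dd, intercept=-1.5):
    score = (intercept + shape_age(age) + shape_ddimer(dd)
             + interaction_age_ddimer(age, dd))
    return 1.0 / (1.0 + math.exp(-score))  # sigmoid -> probability

print(f"young, low D-dimer:  {predict_death_risk(45, 0.5):.2f}")
print(f"elderly, high D-dimer: {predict_death_risk(80, 4.0):.2f}")
```

Because each feature's contribution is an explicit function, the model's predictions can be decomposed term by term, which is what makes this approach interpretable compared with opaque ensemble methods.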
Health economics of Patient Blood Management: a cost‐benefit analysis based on a meta‐analysis (2019)
Background and Objectives: Patient Blood Management (PBM) is the timely application of evidence‐based medical and surgical concepts designed to improve haemoglobin concentration, optimize haemostasis and minimize blood loss in an effort to improve patient outcomes. The focus of this cost‐benefit analysis is to analyse the economic benefit of widespread implementation of a multimodal PBM programme.
Materials and Methods: Based on a recent meta‐analysis including 17 studies (>235 000 patients) comparing PBM with control care and data from the University Hospital Frankfurt, a cost‐benefit analysis was performed. Outcome data were red blood cell (RBC) transfusion rate, number of transfused RBC units, and length of hospital stay (LOS). Costs were considered for the following three PBM interventions as examples: anaemia management including therapy of iron deficiency, use of cell salvage and tranexamic acid. For sensitivity analysis, a Monte Carlo simulation was performed.
Results: Iron supplementation was applied in 3·1%, cell salvage in 65% and tranexamic acid in 89% of the PBM patients. In total, applying these three PBM interventions costs €129·04 per patient. However, PBM was associated with a reduction in transfusion rate, transfused RBC units per patient, and LOS, which yielded mean savings of €150·64 per patient. Thus, the overall benefit of PBM implementation was €21·60 per patient. In the Monte Carlo simulation, the cost savings on the outcome side exceeded the PBM costs in approximately two-thirds of all repetitions, and the total benefit was €1 878 000 in 100 000 simulated patients.
Conclusion: Resources to implement a multimodal PBM concept optimizing patient care and safety can be allocated cost-effectively.
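A Monte Carlo sensitivity analysis of the kind described above can be sketched in a few lines: per-patient costs and savings are drawn from assumed distributions and we count how often savings exceed costs. The point estimates (€129·04 cost, €150·64 savings) come from the abstract; the spreads and distributional form are invented solely for illustration.

```python
# Sketch of a Monte Carlo cost-benefit sensitivity analysis.
# Mean cost/saving are the abstract's point estimates; the standard
# deviations and the normality assumption are illustrative only.

import random

random.seed(42)

N_RUNS = 10_000
MEAN_COST, SD_COST = 129.04, 25.0       # EUR per patient (SD assumed)
MEAN_SAVING, SD_SAVING = 150.64, 60.0   # EUR per patient (SD assumed)

net_benefits = []
for _ in range(N_RUNS):
    cost = random.gauss(MEAN_COST, SD_COST)
    saving = random.gauss(MEAN_SAVING, SD_SAVING)
    net_benefits.append(saving - cost)

share_positive = sum(b > 0 for b in net_benefits) / N_RUNS
mean_benefit = sum(net_benefits) / N_RUNS
print(f"savings exceed costs in {share_positive:.0%} of runs")
print(f"mean net benefit: EUR {mean_benefit:.2f} per patient")
```

With these assumed spreads, savings exceed costs in roughly two-thirds of repetitions, mirroring the proportion reported in the abstract; in practice the input distributions would be derived from the meta-analysis data rather than assumed.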
Coronavirus disease 2019 (COVID-19) is caused by the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) and can affect multiple organs, among which is the circulatory system. Inflammation and mortality risk markers were previously detected in COVID-19 plasma and in the metabolic and proteomic profiles of red blood cells (RBCs). Additionally, biophysical properties, such as deformability, were found to be changed during the infection. Based on such data, we aim to better characterize RBC functions in COVID-19. We evaluate the flow properties of RBCs of severe COVID-19 patients admitted to the intensive care unit by using in vitro microfluidic techniques and automated methods, including artificial neural networks, for an unbiased RBC analysis. We find strong flow and RBC shape impairment in COVID-19 samples and demonstrate that such changes are reversible upon suspension of COVID-19 RBCs in healthy plasma. Vice versa, healthy RBCs immediately resemble COVID-19 RBCs when suspended in COVID-19 plasma. Proteomics and metabolomics analyses allow us to detect the effect of plasma exchanges on both plasma and RBCs and demonstrate a new role of RBCs in maintaining plasma equilibria at the expense of their flow properties. Our findings provide a framework for further investigations of clinical relevance for therapies against COVID-19 and possibly other infectious diseases.
Estimating intraoperative blood loss is a daily challenge for clinicians. Despite the known inaccuracy of visual estimation by anaesthetists and surgeons, it remains the mainstay of estimating surgical blood loss. This review aims to highlight the strengths and weaknesses of currently used measurement methods. A systematic review of studies on the estimation of blood loss was carried out, including studies investigating the accuracy of techniques for quantifying blood loss in vivo and in vitro. We excluded non-human trials and studies using only monitoring parameters to estimate blood loss. A meta-analysis was performed to evaluate systematic measurement errors of the different methods; only studies compared with a validated reference, e.g. a haemoglobin extraction assay, were included. Ninety studies met the inclusion criteria for the systematic review and were analyzed. Six studies were included in the meta-analysis, as only these were conducted with a validated reference. The mixed-effect meta-analysis showed the highest correlation with the reference for colorimetric methods (0.93, 95% CI 0.91–0.96), followed by gravimetric (0.77, 95% CI 0.61–0.93) and finally visual methods (0.61, 95% CI 0.40–0.82). The bias in estimated blood loss (ml) was lowest for colorimetric methods (57.59, 95% CI 23.88–91.3) compared to the reference, followed by gravimetric (326.36, 95% CI 201.65–450.86) and visual methods (456.51, 95% CI 395.19–517.83). Of the many studies included, only a few used a validated reference; the majority chose known imprecise procedures as the method of comparison. Colorimetric methods offer the highest degree of accuracy in blood loss estimation, and systems using colorimetric techniques have a significant advantage for real-time assessment of blood loss.
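The gravimetric approach compared in the review above can be illustrated with a short calculation: swabs are weighed before and after use, the mass gain is converted to volume via the density of whole blood, and the suction canister volume is added net of irrigation fluid. The density value (~1.06 g/ml) is a common approximation, and all sample numbers are invented for illustration.

```python
# Sketch of a gravimetric intraoperative blood-loss estimate.
# Density and all sample values are illustrative assumptions.

BLOOD_DENSITY_G_PER_ML = 1.06  # approximate density of whole blood

def gravimetric_blood_loss(dry_swab_g, wet_swab_g, suction_ml, irrigation_ml):
    # Mass gained by swabs, converted to volume, plus net suction volume
    swab_ml = (wet_swab_g - dry_swab_g) / BLOOD_DENSITY_G_PER_ML
    return swab_ml + suction_ml - irrigation_ml

# Example case: swabs gained 212 g, 600 ml in canister, 250 ml irrigation used
ebl = gravimetric_blood_loss(dry_swab_g=80.0, wet_swab_g=292.0,
                             suction_ml=600.0, irrigation_ml=250.0)
print(f"estimated blood loss: {ebl:.0f} ml")
```

The sizeable bias reported for gravimetric methods in the meta-analysis stems largely from what this formula cannot capture: evaporation, blood retained in drapes, and irrigation fluid absorbed by swabs.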