Uncalibrated semi-invasive continuous monitoring of cardiac index (CI) has recently gained increasing interest. The aim of the present study was to compare the accuracy of CI determination based on arterial waveform analysis with transpulmonary thermodilution. Fifty patients scheduled for elective coronary surgery were studied after induction of anaesthesia and before and after cardiopulmonary bypass (CPB), respectively. Each patient was monitored with a central venous line, the PiCCO system, and the FloTrac/Vigileo system. Measurements included CI derived by transpulmonary thermodilution and uncalibrated semi-invasive pulse contour analysis. Percentage changes of CI were calculated. There was a moderate, but significant correlation between pulse contour CI and thermodilution CI both before (r(2) = 0.72, P < 0.0001) and after (r(2) = 0.62, P < 0.0001) CPB, with a percentage error of 31% and 25%, respectively. Changes in pulse contour CI showed a significant correlation with changes in thermodilution CI both before (r(2) = 0.52, P < 0.0001) and after (r(2) = 0.67, P < 0.0001) CPB. Our findings demonstrate that the uncalibrated semi-invasive monitoring system was able to measure CI reliably compared with transpulmonary thermodilution in patients undergoing elective coronary surgery. Furthermore, the semi-invasive monitoring device was able to track haemodynamic changes and trends.
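The percentage error reported above is conventionally computed following Critchley and Critchley as 1.96 times the standard deviation of the bias divided by the mean of the reference CI. A minimal sketch of that calculation, using invented CI pairs rather than the study's data:

```python
import statistics

def percentage_error(reference, test):
    """Critchley percentage error: 1.96 * SD(bias) / mean(reference) * 100."""
    bias = [t - r for r, t in zip(reference, test)]
    sd_bias = statistics.stdev(bias)          # sample SD of the differences
    mean_ref = statistics.mean(reference)
    return 1.96 * sd_bias / mean_ref * 100

# Illustrative cardiac index pairs (l/min/m^2), NOT the trial's measurements
ref = [2.5, 3.0, 3.4, 2.8, 3.1]   # transpulmonary thermodilution
pc = [2.7, 2.9, 3.6, 2.6, 3.3]    # pulse contour analysis
print(round(percentage_error(ref, pc), 1))  # 12.9
```

Values below the commonly cited 30% threshold are usually read as clinically acceptable agreement.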
Background: Invasive off- or on-pump cardiac surgery (elective and emergency procedures, excluding transplants) is routinely performed to treat complications of ischaemic heart disease. Randomised controlled trials (RCTs) evaluate the effectiveness of treatments in the setting of cardiac surgery. However, the impact of RCTs is weakened by heterogeneity in outcome measurement and reporting, which hinders comparison across trials. Core outcome sets (COS; a set of outcomes that should be measured and reported, as a minimum, in clinical trials for a specific clinical field) help reduce this problem. In light of the above, we developed a COS for cardiac surgery effectiveness trials.
Methods: Potential core outcomes were identified a priori by analysing data on 371 RCTs of 58,253 patients. We reached consensus on core outcomes in an international three-round eDelphi exercise. Outcomes for which at least 60% of the participants chose the response option "no" and less than 20% chose the response option "yes" were excluded.
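The exclusion rule stated above (at least 60% "no" and less than 20% "yes") can be expressed as a simple vote filter. A hypothetical sketch with invented vote counts, not the actual eDelphi responses:

```python
def is_excluded(votes):
    """Exclude an outcome if >=60% voted 'no' AND <20% voted 'yes'."""
    total = sum(votes.values())
    no_share = votes.get("no", 0) / total
    yes_share = votes.get("yes", 0) / total
    return no_share >= 0.60 and yes_share < 0.20

# Invented tallies for two candidate outcomes (86 panellists)
outcomes = {
    "measure of mortality": {"yes": 80, "no": 4, "unsure": 2},
    "length of ICU stay":   {"yes": 10, "no": 60, "unsure": 16},
}
excluded = [name for name, v in outcomes.items() if is_excluded(v)]
print(excluded)  # ['length of ICU stay']
```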
Results: Eighty-six participants from 23 different countries, including adult cardiac patients, cardiac surgeons, anaesthesiologists, nursing staff and researchers, contributed to this eDelphi. The panel reached consensus on four core outcomes to be included in adult cardiac surgery trials: 1) measure of mortality, 2) measure of quality of life, 3) measure of hospitalisation and 4) measure of cerebrovascular complication.
Conclusion: This study used robust research methodology to develop a minimum core outcome set for clinical trials evaluating the effectiveness of treatments in the setting of cardiac surgery. As a next step, appropriate outcome measurement instruments have to be selected.
BACKGROUND: Transient episodes of ischemia in a remote organ or tissue (remote ischemic preconditioning, RIPC) can attenuate myocardial injury. Myocardial damage is associated with tissue remodeling and the matrix metalloproteinases 2 and 9 (MMP-2/9) are crucially involved in these events. Here we investigated the effects of RIPC on the activities of heart tissue MMP-2/9 and their correlation with serum concentrations of cardiac troponin T (cTnT), a marker for myocardial damage.
METHODS: In cardiosurgical patients undergoing cardiopulmonary bypass (CPB), RIPC was induced by four 5-minute cycles of upper limb ischemia/reperfusion. Cardiac tissue was obtained before as well as after CPB, and serum cTnT concentrations were measured. Tissue derived from control patients (N = 17) with high cTnT concentrations (≥0.32 ng/ml) and from RIPC patients (N = 18) with low cTnT (≤0.32 ng/ml) was subjected to gelatin zymography to quantify MMP-2/9 activities.
RESULTS: In cardiac biopsies obtained before CPB, activities of MMP-2/9 were attenuated in the RIPC group (MMP-2: Control, 1.13 ± 0.13 a.u.; RIPC, 0.71 ± 0.12 a.u.; P < 0.05. MMP-9: Control, 1.50 ± 0.16 a.u.; RIPC, 0.87 ± 0.14 a.u.; P < 0.01), while activities of the pro-MMPs were not altered (P > 0.05). In cardiac biopsies taken after CPB activities of pro- and active MMP-2/9 were not different between the groups (P > 0.05). Spearman's rank tests showed that MMP-2/9 activities in cardiac tissue obtained before CPB were positively correlated with postoperative cTnT serum levels (MMP-2, P = 0.016; MMP-9, P = 0.015).
CONCLUSIONS: Activities of MMP-2/9 in cardiac tissue obtained before CPB are attenuated by RIPC and are positively correlated with serum concentrations of cTnT. MMPs may represent potential targets for RIPC mediated cardioprotection.
TRIAL REGISTRATION: ClinicalTrials.gov identifier NCT00877305.
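The Spearman's rank tests used above amount to a Pearson correlation computed on ranks. A minimal sketch with invented MMP activity and cTnT values (the perfect monotone relation in the example is illustrative only):

```python
def rank(values):
    """1-based average ranks; tied values share the mean of their ranks."""
    sorted_vals = sorted(values)
    return [sum(i + 1 for i, v in enumerate(sorted_vals) if v == x)
            / sorted_vals.count(x) for x in values]

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented MMP-2 activities (a.u.) and postoperative cTnT levels (ng/ml)
mmp2 = [0.7, 1.1, 0.9, 1.5, 1.3]
ctnt = [0.12, 0.35, 0.20, 0.48, 0.41]
print(round(spearman_rho(mmp2, ctnt), 2))  # 1.0
```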
Background: To compare the effect of aprotinin with that of lysine analogues (tranexamic acid and ε-aminocaproic acid) on early mortality in three subgroups of cardiac surgery patients: low, intermediate and high risk.
Methods and Findings: We performed a meta-analysis of randomised controlled trials and observational studies with the following data sources: Medline, Cochrane Library, and reference lists of identified articles. The primary outcome measure was early (in-hospital/30-day) mortality. The secondary outcome measures were any transfusion of packed red blood cells within 24 hours after surgery, any re-operation for bleeding or massive bleeding, and acute renal dysfunction or failure, as reported in the included publications.
Out of 328 search results, 31 studies (15 trials and 16 observational studies) comprising 33,501 patients were included. Early mortality was significantly increased after aprotinin vs. lysine analogues, with a pooled risk ratio (95% CI) of 1.58 (1.13–2.21), p<0.001, in the low-risk subgroup (n = 14,297) and of 1.42 (1.09–1.84), p<0.001, in the intermediate-risk subgroup (n = 14,427), respectively. In contrast, in the subgroup of high-risk patients (n = 4,777), the risk of mortality did not differ significantly between aprotinin and lysine analogues (1.03 (0.67–1.58), p = 0.90).
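A pooled risk ratio of this kind is commonly obtained by inverse-variance weighting of the per-study log risk ratios. A fixed-effect sketch with invented 2×2 counts, not the meta-analysis data (the actual analysis may have used a random-effects model):

```python
import math

def log_rr_and_var(a, n1, c, n2):
    """Log risk ratio and its variance for events a/n1 vs c/n2."""
    log_rr = math.log((a / n1) / (c / n2))
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2
    return log_rr, var

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooled risk ratio."""
    weights, logs = [], []
    for a, n1, c, n2 in studies:
        lrr, v = log_rr_and_var(a, n1, c, n2)
        weights.append(1 / v)  # weight = inverse of the variance
        logs.append(lrr)
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    return math.exp(pooled_log)

# Invented studies: (deaths_aprotinin, n_aprotinin, deaths_lysine, n_lysine)
studies = [(12, 300, 8, 310), (20, 500, 14, 480)]
print(round(pooled_rr(studies), 2))  # 1.43
```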
Conclusion: Aprotinin may be associated with an increased risk of mortality in low- and intermediate-risk cardiac surgery, but appears to have no effect on early mortality in the high-risk cardiac surgery subgroup compared to lysine analogues. Thus, decisions to re-license aprotinin in lower-risk patients should be critically debated. In contrast, aprotinin may be beneficial in high-risk cardiac surgery, as it reduces the risk of transfusion and bleeding complications.
Background: The use of cell salvage and autologous blood transfusion has become an important method of blood conservation. So far, there are no clinical data about the performance of the continuous autotransfusion device CATSmart.
Methods: In total, 74 patients undergoing either cardiac or orthopedic surgery were included in this prospective, bicenter, observational technical evaluation to validate the red cell separation process and washout quality of the CATSmart. The target for the red cell separation process was defined as a hematocrit of 55–75% in the packed red cell unit, and for washout quality as a removal ratio of 80–100%.
Results: Hematocrit values measured by CATSmart and laboratory analysis were 78.5% [71.3%; 84.0%] and 73.7% [67.5%; 75.5%], respectively. Removal ratios for platelets (94.7% [88.2%; 96.7%]), free hemoglobin (89.3% [85.2%; 94.9%]), albumin (97.9% [96.6%; 98.5%]), heparin (99.9% [99.9%; 100.0%]) and potassium (92.5% [90.8%; 95.0%]) were within the target range, while removal of white blood cells was slightly worse (72.4% [57.9%; 87.3%]).
Conclusion: The new autotransfusion device enables sufficient red cell separation and washout quality.
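The removal ratios above are commonly computed from the solute amounts (concentration × volume) before and after washing. A sketch under that assumption, with invented numbers rather than study measurements:

```python
def removal_ratio(conc_in, vol_in, conc_out, vol_out):
    """Percent of a solute removed during washing: 100 * (1 - amount_out / amount_in)."""
    amount_in = conc_in * vol_in    # total solute entering the wash
    amount_out = conc_out * vol_out  # total solute left in the product
    return 100.0 * (1.0 - amount_out / amount_in)

# Hypothetical heparin example: 2.0 IU/ml in 800 ml reservoir blood,
# 0.008 IU/ml in 250 ml of washed packed red cells
print(round(removal_ratio(2.0, 800, 0.008, 250), 1))  # 99.9
```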
Purpose: Trauma is the leading cause of death in children. In adults, blood transfusion and fluid resuscitation protocols have changed over the past two decades, resulting in decreased morbidity and mortality. Here, transfusion and fluid resuscitation practices were analysed in severely injured children in Germany.
Methods: Severely injured children (maximum Abbreviated Injury Scale (AIS) ≥ 3) admitted to a certified trauma-centre (TraumaZentrum DGU®) between 2002 and 2017 and registered at the TraumaRegister DGU® were included and assessed regarding blood transfusion rates and fluid therapy.
Results: 5,118 children (aged 1–15 years) with a mean ISS of 22 were analysed. Blood transfusion rates administered until ICU admission decreased from 18% (2002–2005) to 7% (2014–2017). Transfused children were increasingly severely injured: the mean ISS of transfused children aged 1–15 years increased from 27.7 (2002–2005) to 34.4 (2014–2017), while the mean ISS of non-transfused children in this age group decreased from 19.6 (2002–2005) to 17.6 (2014–2017). Mean prehospital fluid administration decreased from 980 to 549 ml without increasing hemodynamic instability.
Conclusion: Blood transfusion rates and the amount of fluid resuscitation decreased in severely injured children over a 16-year period in Germany. Restrictive blood transfusion and fluid management has become common practice in severely injured children. A restrictive prehospital fluid management strategy in severely injured children is not associated with a worsened hemodynamic state, abnormal coagulation or base excess, but leads to higher hemoglobin levels.
Introduction: In recent years, resource-saving handling of allogeneic blood products and a reduction of transfusion rates in adults have been observed. However, comparable published national data for transfusion practices in pediatric patients are currently not available. In this study, transfusion rates for children and adolescents were analyzed based on data from the Federal Statistical Office of Germany during the past two decades. Methods: Data were queried via the database of the Federal Statistical Office (Destatis). The period covered was from 2005 to 2018, and the sample group comprised children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions, procedures, or interventions with increased transfusion risk were queried and evaluated in detail. Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total collective of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4-year-olds (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were most frequently transfused (in 2018, 40.2% of the children were cared for in hospital). Transfusion-associated procedures such as chemotherapy or mechanical ventilation and respiratory support for newborns and infants are on the rise. Conclusion: Transfusion rates are declining in children and adolescents overall, but the reasons for increases in transfusion rates in some age groups are unclear. Prospective studies to evaluate transfusion rates and triggers in children are urgently needed.
Background: In intensive care units (ICUs), octogenarians have become a routine patient group, with more difficult therapeutic and diagnostic decision-making. Due to increased mortality and a reduced quality of life in this high-risk population, medical decision-making particularly requires optimal risk stratification. Recently, the VIP-1 trial prospectively observed that the clinical frailty scale (CFS) performed well in ICU patients in overall survival and short-term outcome prediction. However, the healthcare systems of the 21 countries contributing to the VIP-1 trial are known to differ. Hence, our main focus was to investigate whether the CFS is usable for risk stratification in octogenarians admitted to diversified and high-tech German ICUs.
Methods: This multicentre prospective cohort study analysed very old patients admitted to 20 German ICUs as a sub-analysis of the VIP-1 trial. Three hundred and eight patients aged 80 years or older were admitted consecutively to the participating ICUs. CFS, cause of admission, APACHE II, SAPS II and SOFA scores, use of ICU resources and ICU and 30-day mortality were recorded. Multivariate logistic regression analysis was used to identify factors associated with 30-day mortality.
Results: Patients had a median age of 84 [IQR 82–87] years and a mean CFS of 4.75 (± 1.6 SD) points. More than half of the patients (53.6%) were classified as frail (CFS ≥ 5). ICU mortality was 17.3% and 30-day mortality was 31.2%. The cause of admission (planned vs. unplanned; OR 5.74) and the CFS (OR 1.44 per point increase) were independent predictors of 30-day survival.
Conclusions: The CFS is an easily determined, valuable tool for predicting 30-day survival in ICU octogenarians, and may thus facilitate decision-making for intensive care teams in Germany.
Trial registration: The VIP-1 study was retrospectively registered on ClinicalTrials.gov (ID: NCT03134807) on May 1, 2017.
Transfusion of red blood cells (RBCs) in patients undergoing major elective cranial surgery is associated with increased morbidity, mortality and prolonged hospital length of stay (LOS). This retrospective single-center study aims to identify the clinical outcome of RBC transfusion in skull base and non-skull base meningioma patients, including the identification of risk factors for RBC transfusion. Between October 2009 and October 2016, 423 patients underwent primary meningioma resection. Of these, 68 (16.1%) received RBC transfusion and 355 (83.9%) did not receive RBC units. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) than in patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, postoperative complication rates and hospital LOS were significantly higher (p < 0.0001) than in non-transfused patients. In multivariate analyses, risk factors for RBC transfusion were preoperative American Society of Anaesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.0001). Kaplan-Meier curves revealed a significant influence on overall survival of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative KPS ≤ 60% and age (≥ 75 years). We conclude that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in multivariate analysis. Further studies evaluating the impact of preoperative anaemia management on the reduction of RBC transfusion are needed to improve the clinical outcome of meningioma patients.
Estimating intraoperative blood loss is one of the daily challenges for clinicians. Despite the known inaccuracy of visual estimation by anaesthetists and surgeons, it is still the mainstay of estimating surgical blood loss. This review aims to highlight the strengths and weaknesses of currently used measurement methods. A systematic review of studies on the estimation of blood loss was carried out. We included studies investigating the accuracy of techniques for quantifying blood loss in vivo and in vitro, and excluded non-human trials and studies using only monitoring parameters to estimate blood loss. A meta-analysis was performed to evaluate systematic measurement errors of the different methods; only studies compared against a validated reference (e.g. a haemoglobin extraction assay) were included. Ninety studies met the inclusion criteria for the systematic review and were analyzed. Six studies were included in the meta-analysis, as only these were conducted with a validated reference. The mixed-effect meta-analysis showed the highest correlation to the reference for colorimetric methods (0.93, 95% CI 0.91–0.96), followed by gravimetric (0.77, 95% CI 0.61–0.93) and finally visual methods (0.61, 95% CI 0.40–0.82). The bias for estimated blood loss (ml) was lowest for colorimetric methods (57.59, 95% CI 23.88–91.3) compared to the reference, followed by gravimetric (326.36, 95% CI 201.65–450.86) and visual methods (456.51, 95% CI 395.19–517.83). Of the many studies included, only a few were compared with a validated reference; the majority chose known imprecise procedures as the method of comparison. Colorimetric methods offer the highest degree of accuracy in blood loss estimation, and systems that use colorimetric techniques have a significant advantage in the real-time assessment of blood loss.
Coronavirus disease 2019 (COVID-19) is caused by the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) and can affect multiple organs, among which is the circulatory system. Inflammation and mortality risk markers were previously detected in COVID-19 plasma and in red blood cell (RBC) metabolic and proteomic profiles. Additionally, biophysical properties, such as deformability, were found to be changed during the infection. Based on such data, we aim to better characterize RBC functions in COVID-19. We evaluate the flow properties of RBCs in severe COVID-19 patients admitted to the intensive care unit by using in vitro microfluidic techniques and automated methods, including artificial neural networks, for an unbiased RBC analysis. We find strong flow and RBC shape impairment in COVID-19 samples and demonstrate that such changes are reversible upon suspension of COVID-19 RBCs in healthy plasma. Vice versa, healthy RBCs immediately resemble COVID-19 RBCs when suspended in COVID-19 plasma. Proteomics and metabolomics analyses allow us to detect the effect of plasma exchanges on both plasma and RBCs and demonstrate a new role of RBCs in maintaining plasma equilibria at the expense of their flow properties. Our findings provide a framework for further investigations of clinical relevance for therapies against COVID-19 and possibly other infectious diseases.
Editor's evaluation
This report provides a comprehensive account of the marked alteration of red blood cell (RBC) morphology that occurs with COVID-19 infection. A particularly important result is the observation that RBC morphology is dramatically affected by plasma from COVID-19 patients and reversed by plasma from healthy donors. The claims of the manuscript are well supported by the data, and the approaches used are thoughtful and rigorous. The results are important for consideration of the broader pathophysiology of COVID-19, particularly with regard to the impact on vascular biology, and will be of interest to the readership of eLife.
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Depending on local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on the accuracy of estimation has not yet been a focus of research. Material and methods: A blood loss simulation study comprising four "bleeding" scenarios per sponge colour was created using expired whole blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with full electrolyte solution. Study participants had to estimate the absorbed blood loss in the sponges in all scenarios, and the difference to the reference blood loss (RBL) was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour. Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with reference blood loss in white (Spearman's rho: 0.521; p = 3.748*10−16) and green sponges (Spearman's rho: 0.452; p = 4.683*10−12). The median visually estimated blood loss was higher in white sponges (250 ml, IQR 150–412.5 ml) than in green sponges (150 ml, IQR 100–300 ml), compared to the reference blood loss (103 ml, IQR 86–162.8 ml). For both sponge colours, major under- and overestimation was observed. Multivariate statistics demonstrated that fabric colour has a significant influence on estimation (p = 3.04*10−10), as do the clinician's qualification level (p = 2.20*10−10, p = 1.54*10−08) and the amount of RBL to be estimated (p < 2*10−16). Conclusion: The deviation from the correct blood loss estimate was smaller with white surgical sponges than with green sponges. In general, deviations were so severe for both types of sponges that it appears advisable to refrain from visually estimating blood loss whenever possible and instead to use other techniques such as colorimetric estimation.
Background: The ability of stroke volume variation (SVV), pulse pressure variation (PPV) and global end-diastolic volume (GEDV) to predict fluid responsiveness in the presence of pleural effusion is unknown. The aim of the present study was to challenge the ability of SVV, PPV and GEDV to predict fluid responsiveness in a porcine model with pleural effusions.
Methods: Pigs were studied at baseline and after fluid loading with 8 ml kg−1 6% hydroxyethyl starch. After withdrawal of 8 ml kg−1 blood and induction of pleural effusion of up to 50 ml kg−1 on either side, measurements at baseline and after fluid loading were repeated. Cardiac output, stroke volume, central venous pressure (CVP) and pulmonary artery occlusion pressure (PAOP) were obtained by pulmonary artery thermodilution, whereas GEDV was determined by transpulmonary thermodilution. SVV and PPV were monitored continuously by pulse contour analysis.
Results: Pleural effusion was associated with significant changes in lung compliance, peak airway pressure and stroke volume in both responders and non-responders. At baseline, SVV, PPV and GEDV reliably predicted fluid responsiveness (area under the curve 0.85 (p<0.001), 0.88 (p<0.001) and 0.77 (p = 0.007), respectively). After induction of pleural effusion, the ability of SVV, PPV and GEDV to predict fluid responsiveness was well preserved, and PAOP was also predictive. Threshold values for SVV and PPV increased in the presence of pleural effusion.
Conclusions: In this porcine model, bilateral pleural effusion did not affect the ability of SVV, PPV and GEDV to predict fluid responsiveness.
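The predictive ability reported above is quantified as the area under the ROC curve, which equals the probability that a randomly chosen responder has a higher predictor value than a randomly chosen non-responder (Mann–Whitney formulation). A minimal sketch with invented baseline SVV values, not the study's animals:

```python
def auc(responders, non_responders):
    """ROC AUC via pairwise comparison: fraction of responder/non-responder
    pairs in which the responder's value is higher (ties count half)."""
    pairs = 0
    wins = 0.0
    for r in responders:
        for n in non_responders:
            pairs += 1
            if r > n:
                wins += 1.0
            elif r == n:
                wins += 0.5
    return wins / pairs

# Invented baseline SVV (%) values for fluid responders vs non-responders
resp = [14, 16, 12, 18]
non_resp = [9, 11, 13, 8]
print(auc(resp, non_resp))  # 0.9375
```

An AUC of 0.5 means no discrimination; values approaching 1.0 indicate reliable prediction.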
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during transcatheter aortic valve implantation (TAVI) in nonventilated patients and its impact on procedural outcome. Methods and Results: We measured ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (female 58%, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post-RVP 62.6% vs. 53.9% during RVP, p < .01; pre-RVP 61.6% vs. post-RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and more often experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: During RVP, ScO2 declined significantly compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and is a valid cut-off value to screen for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk for cerebrovascular complications and delirium.
Introduction: Observational studies have demonstrated an association between vitamin D deficiency and increased risk of morbidity and mortality in critically ill patients. Cohort studies and pilot trials have suggested promising beneficial effects of vitamin D replacement in the critically ill, at least in patients with severe vitamin D deficiency. As vitamin D is a simple, low-cost and safe intervention, it has the potential to improve survival in critically ill patients.
Methods and analysis: In this randomised, placebo-controlled, double-blind, multicentre, international trial, 2400 adult patients with severe vitamin D deficiency (25-hydroxyvitamin D ≤ 12 ng/mL) will be randomised in a 1:1 ratio via www.randomizer.at to receive a loading dose of 540 000 IU cholecalciferol within 72 hours after intensive care unit (ICU) admission, followed by 4000 IU daily for 90 days, or placebo. Hypercalcaemia may occur as a side effect but is monitored by regular checks of calcium levels. The primary outcome is all-cause mortality at 28 days after randomisation. Secondary outcomes are: ICU, hospital, 90-day and 1-year mortality; hospital and ICU length of stay; change in organ dysfunction on day 5 as measured by the Sequential Organ Failure Assessment (SOFA) score; number of organ failures; hospital and ICU readmission until day 90; discharge destination; self-reported infections requiring antibiotics until day 90; and health-related quality of life. Recruitment is ongoing.
Ethics and dissemination: National ethical approval was obtained from the Ethics Committee of the University of Graz for Austria, from Erasme University Brussels (Belgium) and from University Hospital Frankfurt (Germany); further approvals will be obtained according to individual national processes. On completion, results will be published in a peer-reviewed scientific journal. The study findings will be presented at national and international meetings, with abstracts online.
Trial registration: NCT03188796, EudraCT-No: 2016-002460-13.
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells is the mainstay of compensating for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be provided in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes than central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing compared to CL services. Methods: Data from the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®) against CL measurement as the reference method. For that, we analyzed pairs of Hb levels measured by CL and BGA within two hours of each other. Furthermore, the impact of various confounding factors, including age, gender, BMI, smoker status, transfusion of RBCs, intraoperative hemodilution, and co-medication, was examined. To ensure adequate statistical analysis, only data from participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and GEM series® (-0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (-0.09 g/dl) but a greater median absolute deviation (± 0.45 g/dl). To avoid interference between the different standard deviations of the different analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). Bland-Altman analysis and a LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms.
Smoker status showed the greatest reduction in bias (0.1 g/dl, p<0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p<0.001), statins (0.04 g/dl, p<0.05) and beta blockers (0.03 g/dl, p = 0.02) showed only slight effects on the bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions, such as the substitution of fluids, coagulation factors or RBC units, rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference to CL, our data confirm that different BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that the bias decreased at smaller Hb values, which in turn are used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the performance of the LIBERAL trial.
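The bias figures above follow the Bland–Altman approach: the mean of the paired differences, with 95% limits of agreement at the mean ± 1.96 standard deviations. A sketch with invented paired Hb values, not LIBERAL trial data:

```python
import statistics

def bland_altman(bga, cl):
    """Mean bias (BGA - CL) and 95% limits of agreement (bias +/- 1.96 SD)."""
    diffs = [b - c for b, c in zip(bga, cl)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired hemoglobin measurements (g/dl)
bga_hb = [10.1, 12.3, 9.0, 11.2, 13.4]  # blood gas analyzer
cl_hb = [10.5, 12.6, 9.4, 11.5, 13.9]   # central laboratory reference
bias, (lo, hi) = bland_altman(bga_hb, cl_hb)
print(round(bias, 2))  # -0.38
```

Plotting each difference against the pair's mean (the Bland–Altman plot) then reveals whether the bias depends on the Hb level, as the LOWESS analysis above suggests.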