The scope of extracorporeal membrane oxygenation (ECMO) is expanding; nevertheless, the pharmacokinetics in patients receiving cardiorespiratory support are largely unknown, leading to unpredictable drug concentrations. Currently, there are no clear guidelines for antibiotic dosing during ECMO. This study aims to evaluate the pharmacokinetics (PK) of cefazolin in patients undergoing ECMO treatment. Total and unbound plasma cefazolin concentrations of critically ill patients on veno-arterial ECMO were determined. The observed PK was compared to dose recommendations calculated by freely available online dosing software. Cefazolin concentrations varied broadly despite the same dosage in all patients. The mean total and unbound plasma concentrations were high, showing a significantly (p = 5.8913 × 10−9) greater unbound fraction compared to a standard patient. Cefazolin clearance was significantly (p = 0.009) higher in patients with preserved renal function than in patients on continuous renal replacement therapy (CRRT). Based upon the calculated clearance, the use of dosing software would have led to lower but still sufficient cefazolin concentrations in general. Our study shows that a “one size fits all” dosing regimen leads to excessive unbound cefazolin concentrations in these patients. They exhibit high PK variability, and decreased cefazolin clearance on ECMO appears to compensate for ECMO- and critical-illness-related increases in the volume of distribution.
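The compensating interplay of clearance and volume of distribution described in this abstract can be sketched with a simple one-compartment IV bolus model. This is an illustration only; the dose and parameter values below are hypothetical and not taken from the study:

```python
import math

def concentration(dose_mg, vd_l, cl_l_per_h, t_h):
    """Plasma concentration after an IV bolus in a one-compartment model:
    C(t) = (dose / Vd) * exp(-(CL / Vd) * t)."""
    k_e = cl_l_per_h / vd_l  # elimination rate constant (1/h)
    return (dose_mg / vd_l) * math.exp(-k_e * t_h)

# Hypothetical scenario: critical illness/ECMO increases Vd (lower peak),
# while decreased clearance slows elimination, partly offsetting the effect.
c_standard = concentration(dose_mg=2000, vd_l=10, cl_l_per_h=4, t_h=8)
c_ecmo = concentration(dose_mg=2000, vd_l=18, cl_l_per_h=2, t_h=8)
```

With these invented numbers, the trough after 8 h is higher in the ECMO scenario: the reduced clearance more than offsets the larger distribution volume, mirroring the compensation the authors describe.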
Objectives: The ongoing coronavirus pandemic is challenging, especially in severely affected patients who require intubation and sedation. Although the potential benefits of sedation with volatile anesthetics in coronavirus disease 2019 patients are currently being discussed, the use of isoflurane in patients with coronavirus disease 2019-induced acute respiratory distress syndrome has not yet been reported. Design: We performed a retrospective analysis of critically ill patients with hypoxemic respiratory failure requiring mechanical ventilation. Setting: The study was conducted with patients admitted to our ICU between April 4 and May 15, 2020. Patients: We included five patients who had previously been diagnosed with severe acute respiratory syndrome coronavirus 2 infection. Intervention: Even with high doses of several IV sedatives, the targeted level of sedation could not be achieved; therefore, the sedation regimen was switched to inhalational isoflurane. Clinical data were recorded using a patient data management system. We recorded demographic data, laboratory results, ventilation variables, sedative dosages, sedation level, prone positioning, duration of volatile sedation and outcomes. Measurements & Main Results: The mean age (four men, one woman) was 53.0 (± 12.7) years. The mean duration of isoflurane sedation was 103.2 (± 66.2) hours. Our data demonstrate a substantial improvement in the oxygenation ratio under isoflurane sedation. Deep sedation, as assessed by the Richmond Agitation and Sedation Scale, was achieved rapidly and controlled closely in all patients, and IV sedation could subsequently be discontinued within the first 30 minutes. No adverse events were detected. Conclusions: Our findings demonstrate the feasibility of isoflurane sedation in five patients suffering from severe coronavirus disease 2019 infection. Volatile isoflurane achieved the required deep sedation and reduced the need for IV sedation.
High sedation needs of critically ill COVID-19 ARDS patients - a monocentric observational study
(2021)
Background: The therapy of severely affected coronavirus patients requiring intubation and sedation is still challenging. Recently, difficulties in sedating these patients have been discussed. This study aims to describe sedation practices in patients with coronavirus disease 2019 (COVID-19)-induced acute respiratory distress syndrome (ARDS). Methods: We performed a retrospective monocentric analysis of sedation regimens in critically ill, intubated patients with respiratory failure who required sedation in our mixed 32-bed university intensive care unit. All mechanically ventilated adults with COVID-19-induced ARDS requiring continuously infused sedative therapy admitted between April 4, 2020, and June 30, 2020 were included. We recorded demographic data, sedative dosages, prone positioning, sedation levels and duration. Descriptive data analysis was performed; for additional analysis, a logistic regression with mixed effects was used. Results: In total, 56 patients (mean age 67 (±14) years) were included. The mean observed sedation period was 224 (±139) hours. To achieve the prescribed sedation level, two or three sedatives were needed in 48.7% and 12.8% of the cases, respectively. In cases with a triple sedation regimen, the combination of clonidine, esketamine and midazolam was observed most often (75.7%). Analgesia was achieved using sufentanil in 98.6% of the cases. The analysis showed that the majority of COVID-19 patients required unusually high sedation doses compared to those reported in the literature. Conclusion: The global pandemic continues to severely affect patients who require ventilation and sedation, but optimal sedation strategies are still lacking. Our observations suggest unusually high dosages of sedatives in mechanically ventilated patients with COVID-19. Prescribed sedation levels appear to be achievable only with combinations of several sedatives in most critically ill patients suffering from COVID-19-induced ARDS, and a potential association with the frequently required sophisticated critical care, including prone positioning and ECMO treatment, seems conceivable.
Background: Iron deficiency (ID) is one of the most common nutritional deficiencies in children worldwide and may result in iron deficiency anemia (IDA). The reticulocyte hemoglobin equivalent (Ret-He) provides information about the current availability of iron for erythropoiesis. This study aims to validate Ret-He as a screening marker for ID and IDA in children. Methods: Blood sample results were retrospectively obtained from medical records. Anemia was defined according to the World Health Organization (WHO) definition for children. ID was defined by transferrin saturation (TSAT) < 20% and ferritin < 100 ng/mL. Children were classified into four groups: IDA, non-anemic iron deficiency (NAID), control and others. Results: Of 970 children, 332 (34.2%) had NAID and 278 (28.7%) presented with IDA. Analysis revealed that Ret-He correlates significantly with ferritin (rho = 0.41; p < 0.001), TSAT (rho = 0.66; p < 0.001) and the soluble transferrin receptor (sTfR) (rho = −0.72; p < 0.001). In ROC analysis, the area under the curve (AUC) was 0.771 for Ret-He detecting ID and 0.845 for detecting IDA. The cut-off value for Ret-He was 33.5 pg (sensitivity 90.7%; specificity 35.8%) to diagnose ID and 31.6 pg (sensitivity 90.6%; specificity 50.4%) to diagnose IDA. Conclusions: The present study demonstrates that Ret-He can serve as a screening marker for ID and IDA in children, and that it can be used as a single screening parameter without considering other iron parameters. Economically, the use of Ret-He is highly relevant, as it can save one blood tube per patient and additional costs.
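Since lower Ret-He values indicate iron deficiency, the reported AUC corresponds to the probability that a deficient child shows a lower Ret-He than a control, and the cut-offs trade sensitivity against specificity. A minimal rank-based sketch of both quantities, using made-up values rather than study data:

```python
def auc_lower_is_positive(pos, neg):
    """Rank-based ROC AUC when LOWER scores indicate the positive class
    (here: lower Ret-He suggests iron deficiency). Ties count as 0.5."""
    wins = sum((p < n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(pos, neg, cutoff):
    """Sensitivity and specificity when values below the cut-off are
    classified as positive."""
    sens = sum(p < cutoff for p in pos) / len(pos)
    spec = sum(n >= cutoff for n in neg) / len(neg)
    return sens, spec

# Hypothetical Ret-He values (pg); not from the study.
deficient = [27.1, 29.4, 30.8, 32.5]
controls = [31.9, 33.6, 34.2, 35.0]
auc = auc_lower_is_positive(deficient, controls)
```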
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Depending on local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on the accuracy of estimation has not yet been a focus of research. Material and methods: A blood loss simulation study containing four “bleeding” scenarios per sponge colour was created using expired whole blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with full electrolyte solution. Study participants had to estimate the absorbed blood loss in the sponges in all scenarios. The difference to the reference blood loss (RBL) was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour. Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with the RBL in white (Spearman's rho: 0.521; p = 3.748 × 10−16) and green sponges (Spearman's rho: 0.452; p = 4.683 × 10−12). The median visually estimated blood loss was higher in white sponges (250 ml, IQR 150–412.5 ml) than in green sponges (150 ml, IQR 100–300 ml), compared to the reference blood loss (103 ml, IQR 86–162.8 ml). For both sponge colours, major under- and overestimation was observed. The multivariate analysis demonstrated that fabric colour has a significant influence on estimation (p = 3.04 × 10−10), as do the clinician's qualification level (p = 2.20 × 10−10, p = 1.54 × 10−8) and the amount of RBL to be estimated (p < 2 × 10−16). Conclusion: The deviation from the correct blood loss estimate was smaller with white surgical sponges than with green sponges. In general, deviations were so severe for both sponge types that it appears advisable to refrain from visually estimating blood loss whenever possible and instead to use other techniques, such as colorimetric estimation.
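The Spearman correlations reported above rank both variables and then correlate the ranks. A minimal sketch for data without ties (the values are invented; real analyses should handle ties and use a statistics library):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via 1 - 6*sum(d^2)/(n*(n^2-1));
    valid for untied data, shown here for illustration only."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Invented estimated vs. reference blood losses (ml); not study data.
estimated = [250, 150, 400, 90, 300]
reference = [103, 88, 160, 60, 140]
rho = spearman_rho(estimated, reference)
```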
Background: Anemia is the most important complication during major surgery, and the transfusion of red blood cells is the mainstay to compensate for life-threatening blood loss. Therefore, an accurate measurement of the hemoglobin (Hb) concentration should be provided in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes than central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing compared to CL services. Methods: Data from the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias for Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®), with CL as the reference method. For that purpose, we analysed pairs of Hb levels measured by CL and BGA within two hours of each other. Furthermore, the impact of various confounding factors, including age, gender, BMI, smoker status, transfusion of RBC, intraoperative hemodilution and co-medication, was elucidated. To ensure adequate statistical analysis, only data from participating centres providing more than 200 Hb pairs were used. Results: In total, three centres including 963 patients with 1,814 pairs of Hb measurements were analysed. The mean bias was comparable between the ABL800 Flex analyzer® and the GEM series® (−0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (−0.09 g/dl) but a greater median absolute deviation (± 0.45 g/dl). To avoid interference between the different standard deviations caused by the different analytic devices, we focused on two centres using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and a LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms. Smoker status showed the greatest reduction in bias (0.1 g/dl, p < 0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p < 0.001), statins (0.04 g/dl, p < 0.05) and beta blockers (0.03 g/dl, p = 0.02) had a slight effect on the bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions, such as the substitution of fluids, coagulation factors or RBC units, rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference to CL, our data confirm that different BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that the bias decreased with smaller Hb values, which in turn are used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the performance of the LIBERAL trial.
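The Bland-Altman bias referred to above is simply the mean of the paired differences between the two methods, with 95% limits of agreement at ±1.96 standard deviations. A minimal sketch with invented Hb pairs (g/dl), not trial data:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement between two measurement
    methods applied to the same samples (e.g. BGA vs. central lab Hb)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)  # sample SD of the differences
    return bias, bias - spread, bias + spread

# Invented paired Hb values (g/dl); not data from the LIBERAL trial.
bga = [10.1, 11.4, 8.9, 12.0, 9.6]
lab = [10.5, 11.8, 9.2, 12.3, 10.0]
bias, lower, upper = bland_altman(bga, lab)
```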
Background: paediatric patients are vulnerable to blood loss, and even a small loss of blood can be associated with severe shock. In emergency situations, a red blood cell (RBC) transfusion may become unavoidable, although it is associated with various risks. The aim of this trial was to identify independent risk factors for perioperative RBC transfusion in children undergoing surgery. Methods: to identify independent risk factors for perioperative RBC transfusion in children undergoing surgery and to assess RBC transfusion rates and in-hospital outcomes (e.g., length of stay, mortality, and typical postoperative complication rates), a monocentric, retrospective, observational study was conducted. Descriptive, univariate, and multivariate analyses were performed. Results: between 1 January 2010 and 31 December 2019, data from n = 14,248 cases were identified at the centre. Analysis revealed an RBC transfusion rate of 10.1% (n = 1439) in the entire cohort. The independent predictors of RBC transfusion were the presence of preoperative anaemia (p < 0.001; OR = 15.10 with preoperative anaemia and OR = 2.40 without preoperative anaemia), younger age (p < 0.001; ORs between 0.14 and 0.28 for children older than 0 years), female gender (p = 0.036; OR = 1.19 compared to male gender), certain types of surgery (e.g., neurosurgery (p < 0.001; OR = 10.14), vascular surgery (p < 0.001; OR = 9.93), cardiac surgery (p < 0.001; OR = 4.79), gynaecology (p = 0.014; OR = 3.64), and visceral surgery (p < 0.001; OR = 2.48)), and the presence of postoperative complications (e.g., sepsis (p < 0.001; OR = 10.16), respiratory dysfunction (p < 0.001; OR = 7.56), cardiovascular dysfunction (p < 0.001; OR = 4.68), neurological dysfunction (p = 0.029; OR = 1.77), and renal dysfunction (p < 0.001; OR = 16.17)). Conclusion: preoperative anaemia, younger age, female gender, certain types of surgery, and postoperative complications are independent predictors of RBC transfusion in children undergoing surgery. Future prospective studies are urgently required to identify, in detail, the potential risk factors and the impact of RBC transfusion in children.
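The odds ratios reported in this abstract come from exponentiating logistic-regression coefficients; a 95% confidence interval exponentiates beta ± 1.96·SE. A minimal sketch (the standard error below is invented, and the coefficient is chosen only to reproduce an OR of the reported magnitude):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio with 95% confidence interval from a logistic-regression
    coefficient `beta` and its standard error `se`."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical example: beta = ln(15.10) yields an OR of the magnitude
# reported for preoperative anaemia; the SE of 0.2 is invented.
or_, lo, hi = odds_ratio_ci(math.log(15.10), 0.2)
```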
The transfusion of red blood cells (RBC) in patients undergoing major elective cranial surgery is associated with increased morbidity and mortality and a prolonged hospital length of stay (LOS). This retrospective single-centre study aims to identify the clinical outcome of RBC transfusions in skull base and non-skull base meningioma patients, including the identification of risk factors for RBC transfusion. Between October 2009 and October 2016, 423 patients underwent primary meningioma resection. Of these, 68 (16.1%) received an RBC transfusion and 355 (83.9%) did not receive RBC units. The preoperative anaemia rate was significantly higher in transfused patients (17.7%) than in patients without RBC transfusion (6.2%; p = 0.0015). In transfused patients, postoperative complications as well as hospital LOS were significantly higher (p < 0.0001) than in non-transfused patients. In multivariate analyses, the risk factors for RBC transfusion were the preoperative American Society of Anesthesiologists (ASA) physical status score (p = 0.0247), tumor size (p = 0.0006), surgical time (p = 0.0018) and intraoperative blood loss (p < 0.0001). Kaplan-Meier curves revealed a significant influence of preoperative anaemia, RBC transfusion, smoking, cardiovascular disease, preoperative KPS ≤ 60% and age (elderly, ≥ 75 years) on overall survival. We conclude that blood loss due to large tumors or localization near large vessels is the main trigger for RBC transfusion in meningioma patients, paired with a potential preselection that masks the effect of preoperative anaemia in the multivariate analysis. Further studies evaluating the impact of preoperative anaemia management on the reduction of RBC transfusion are needed to improve the clinical outcome of meningioma patients.
Background: Conditions during blood product storage and transportation should maintain quality. The aim of this in vitro study was to investigate the effect of interruption of agitation, temporary cooling (TC), and pneumatic tube system transportation (PTST) on the aggregation ability (AA) and mitochondrial function (MF) of platelet concentrates (PC).
Study Design and Methods: A PC was divided equally into four subunits, which were allocated to four test groups. The control group (I) was stored as recommended (continuous agitation, 22 ± 2°C) for 4 days. The test groups were stored without agitation (II), stored as recommended but at 4°C for 60 minutes on day (d)2 (III), or transported via PTST (IV). Aggregometry was measured using Multiplate (Roche AG; ADPtest, ASPItest, TRAPtest, COLtest) and MF using the Oxygraph-2k (Oroboros Instruments). The basal and maximum mitochondrial respiratory rates (MMRR) were determined. AA and MF were measured daily in I and II, and AA was measured in III and IV on d2 after TC/PTST. Statistical analysis was performed using tests for matched observations.
Results: Eleven PCs were used. TRAP-6-induced AA was significantly lower in II than in I on d4 (P = 0.015*). In III, the ASPItest was significantly lower (P = 0.032*). IV showed no significant differences. The basal and maximum mitochondrial respiratory rates were significantly reduced over the 4 days in I and II (for both rates in both groups: P < 0.0001*). No significant differences occurred on d4 (P = 0.495).
Conclusion: Our results indicate that the ex vivo AA and MF of PCs are unaffected even under non-ideal storage and transport conditions with respect to agitation, temperature, and mechanical force.
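The "tests for matched observations" mentioned in the methods compare paired measurements of the same PC subunits under different conditions. For normally distributed differences this can be a paired t-test; a minimal sketch of its test statistic, with illustrative values that are not study data (the actual test used may differ, e.g. a Wilcoxon signed-rank test):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t statistic: mean of the pairwise differences divided by
    its standard error (degrees of freedom: n - 1)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Invented aggregation readings for paired subunits (arbitrary units).
control = [95.0, 102.0, 88.0, 110.0, 99.0]
no_agitation = [90.0, 96.0, 85.0, 101.0, 93.0]
t_stat = paired_t(control, no_agitation)
```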
Background: Extracorporeal life support (ECLS) has become an integral part of modern intensive care. The choice of support mode depends largely on the indication; patients with respiratory failure are predominantly treated with a venovenous (VV) approach. We hypothesized that the mortality of ECLS therapy in Germany did not differ from that previously reported in the literature.
Methods: Inpatient data from Germany from 2007 to 2018, provided by the Federal Statistical Office of Germany, were analysed. The International Statistical Classification of Diseases and Related Health Problems (ICD) codes and German procedure classification (OPS) codes for extracorporeal membrane oxygenation (ECMO) types, acute respiratory distress syndrome (ARDS) and hospital mortality were used.
Results: In total, 45,647 hospitalized patients treated with ECLS were analysed. In Germany, 231 hospitals provided ECLS therapy, with a median of 4 VV-ECMO and 9 VA-ECMO cases in 2018. The overall hospital mortality remained higher than the values reported in the literature. The number of VV-ECMO cases increased by 236%, from 825 in 2007 to 2768 in 2018. ARDS was the main indication for VV-ECMO in only 33% of the patients in the past, but that proportion increased to 60% in 2018. VA-ECMO support is of minor importance in the treatment of ARDS in Germany. The age distribution of patients undergoing ECLS has shifted towards an older population. In 2018, hospital mortality decreased to 53.9% (n = 1493) in VV-ECMO patients and to 54.4% (n = 926) in VV-ECMO patients with ARDS.
Conclusions: ARDS is a severe disease with a high mortality rate despite ECLS therapy. Although endpoints and timing of the evaluations differed from those of the CESAR and EOLIA studies and the Extracorporeal Life Support Organization (ELSO) Registry, the reported mortality in these studies was lower than in the present analysis. Further prospective analyses are necessary to evaluate outcomes in ECMO therapy at the centre volume level.