Background and objectives: Preoperative anaemia is an independent risk factor for higher morbidity and mortality, longer hospitalization and increased perioperative transfusion rates. Managing preoperative anaemia is the first of the three pillars of Patient Blood Management (PBM), a multidisciplinary concept to improve patient safety. While various studies provide medical information on (successful) anaemia treatment pathways, knowledge of the organizational details of diagnosing and managing preoperative anaemia across Europe is scarce.
Materials and methods: To gain information on various aspects of preoperative anaemia management including organization, financing, diagnostics and treatment, we conducted a survey (74 questions) in ten hospitals from seven European nations within the PaBloE (Patient Blood Management in Europe) working group covering the year 2016.
Results: Organization and activity in the field of preoperative anaemia management were heterogeneous in the participating hospitals. Almost all hospitals had pathways for managing preoperative anaemia in place; however, only two nations had national guidelines. In six of the ten participating hospitals, preoperative anaemia management was organized by anaesthetists. Diagnostics and treatment focused on iron deficiency anaemia, which, in most hospitals, was corrected with intravenous iron.
Conclusion: Implementation and approaches of preoperative anaemia management vary across Europe with a primary focus on treating iron deficiency anaemia. Findings of this survey motivated the hospitals involved to critically evaluate their practice and may also help other hospitals interested in PBM to develop action plans for diagnosis and management of preoperative anaemia.
Introduction: Balanced fluid replacement solutions may reduce the risk of electrolyte imbalances, acid-base imbalances, and thus renal failure. To assess the intraoperative change of base excess (BE) and serum chloride after treatment with either a balanced or a non-balanced gelatine/electrolyte solution, a prospective, controlled, randomized, double-blind, dual-centre phase III study was conducted in two tertiary care university hospitals in Germany.
Material and methods: 40 patients of both sexes, aged 18 to 90 years, scheduled for elective abdominal surgery with an assumed intraoperative volume requirement of at least 15 mL/kg body weight of gelatine solution were included. The study drug was administered intravenously according to the patients' needs. The trigger for volume replacement was a central venous pressure (CVP) minus positive end-expiratory pressure (PEEP) of <10 mmHg. The crystalloid:colloid ratio was 1:1 intra- and postoperatively. The targets for volume replacement were a CVP minus PEEP between 10 and 14 mmHg after treatment with a vasoactive agent and a mean arterial pressure (MAP) > 65 mmHg.
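The trigger and target described above reduce to two simple predicates. The following sketch only restates those thresholds for clarity; the function names and the idea of encoding them in code are illustrative, not part of the study protocol:

```python
def needs_volume(cvp_mmhg: float, peep_mmhg: float) -> bool:
    """Trigger for volume replacement: CVP minus PEEP below 10 mmHg."""
    return (cvp_mmhg - peep_mmhg) < 10.0

def targets_reached(cvp_mmhg: float, peep_mmhg: float, map_mmhg: float) -> bool:
    """Targets: CVP minus PEEP between 10 and 14 mmHg and MAP above 65 mmHg."""
    return 10.0 <= (cvp_mmhg - peep_mmhg) <= 14.0 and map_mmhg > 65.0
```

For example, a CVP of 12 mmHg at a PEEP of 5 mmHg (effective CVP 7 mmHg) would trigger volume replacement, while an effective CVP of 12 mmHg with a MAP of 70 mmHg would satisfy the targets.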
Results: The primary endpoints, the intraoperative changes of base excess (–2.59 ± 2.25 (median: –2.65) mmol/L in the balanced group vs –4.79 ± 2.38 (median: –4.70) mmol/L in the non-balanced group) and of serum chloride (2.4 ± 1.9 (median: 3.0) mmol/L vs 5.2 ± 3.1 (median: 5.0) mmol/L), differed significantly between the groups (p = 0.0117 and p = 0.0045, respectively). In both groups (each n = 20), the administration of the investigational product in terms of volume and infusion rate was comparable throughout the course of the study, i.e. before, during and after surgery.
Discussion: Balanced gelatine solution 4% combined with a balanced electrolyte solution demonstrated a significantly smaller impact on the blood gas parameters defined as primary endpoints, BE and serum chloride, than a non-balanced gelatine solution 4% combined with NaCl 0.9%. No marked treatment differences were observed with respect to haemodynamics, coagulation and renal function.
Trial registration: ClinicalTrials.gov (NCT01515397) and clinicaltrialsregister.eu, EudraCT number 2010-018524-58.
Background. Tracheal intubation still represents the "gold standard" for securing the airway of unconscious patients in the prehospital setting. Especially in cases of restricted access to the patient, video laryngoscopy has become increasingly relevant.
Objectives. The aim of the study was to evaluate the performance and intubation success of four different video laryngoscopes, one optical laryngoscope, and a Macintosh blade while intubating from two different positions in a mannequin trial with difficult access to the patient.
Methods. A mannequin with a cervical collar was placed on the driver’s seat. Intubation was performed with six different laryngoscopes either through the driver’s window or from the backseat. Success, C/L score, time to best view (TTBV), time to intubation (TTI), and number of attempts were measured. All participants were asked to rate their favored device.
Results. Forty-two physicians participated. All intubations performed from the backseat were successful. Intubation through the driver’s window was less successful; a 100% success rate was achieved only with the Airtraq® optical laryngoscope. The best visualization (window C/L 2a; backseat C/L 2a) and shortest TTBV (window 4.7 s; backseat 4.1 s) were obtained with the D-Blade video laryngoscope, but this was not associated with higher success through the driver’s window. The fastest TTI was achieved through the window (14.2 s) with the C-MAC video laryngoscope and from the backseat (7.3 s) with a Macintosh blade.
Conclusions. Video laryngoscopy yielded better visualization but was not associated with a higher success rate. Success depended on the approach and on familiarity with the device. We believe that video laryngoscopy is suitable for securing the airway of trapped accident victims. The decision for an optimal device is complicated and should be based upon experience and regular training with the device.
Objective. Evaluation of C-MAC PM® in combination with a standard Macintosh blade size 3 in direct and indirect laryngoscopy and D-Blade® in indirect laryngoscopy in a simulated difficult airway. Primary outcome was defined as the best view of the glottic structures. Secondary endpoints were subjective evaluation and assessment of the intubation process.
Methods. Prospective, monocentric, observational study of 48 adult patients without predictors of difficult laryngoscopy/tracheal intubation undergoing orthopedic surgery. Every participant preoperatively received a cervical collar to simulate a difficult airway. Direct and indirect laryngoscopy with and without the BURP maneuver using a standard Macintosh blade, and indirect laryngoscopy with and without the BURP maneuver using the D-Blade®, were performed to evaluate whether blade geometry and the BURP maneuver improve the glottic view as measured by the Cormack-Lehane score.
Results. Using a C-MAC PM® laryngoscope, D-Blade® yielded improved glottic views compared with the Macintosh blade used with either the direct or indirect technique. Changing from direct laryngoscopy using a Macintosh blade to indirect videolaryngoscopy using C-MAC PM® with D-Blade® improved the Cormack-Lehane score from IIb, III, or IV to I or II in 31 cases.
Conclusion. The combination of C-MAC PM® and D-Blade® significantly enhances the view of the glottis compared to direct laryngoscopy with a Macintosh blade in patients with a simulated difficult airway.
Trial Registration Number. This trial is registered under number NCT03403946.
Background: Perioperative anaemia leads to impaired oxygen supply with a risk of vital organ ischaemia. In healthy and fit individuals, anaemia can be compensated for by several mechanisms. Elderly patients, however, have fewer compensatory mechanisms because of multiple co-morbidities and the age-related decline of functional reserves. The purpose of the study is to evaluate whether elderly surgical patients may benefit from a liberal red blood cell (RBC) transfusion strategy compared to a restrictive transfusion strategy.
Methods: The LIBERAL Trial is a prospective, randomized, multicentre, controlled clinical phase IV trial randomising 2470 elderly (≥ 70 years) patients undergoing intermediate- or high-risk non-cardiac surgery. Registered patients will be randomised only if haemoglobin (Hb) reaches ≤ 9 g/dl during surgery or within 3 days after surgery, either to the LIBERAL group (transfusion of a single RBC unit when Hb ≤ 9 g/dl, with a target range for the post-transfusion Hb level of 9–10.5 g/dl) or to the RESTRICTIVE group (transfusion of a single RBC unit when Hb ≤ 7.5 g/dl, with a target range for the post-transfusion Hb level of 7.5–9 g/dl). The intervention per patient will be followed until hospital discharge or up to 30 days after surgery, whichever occurs first. The primary efficacy outcome is defined as a composite of all-cause mortality, acute myocardial infarction, acute ischaemic stroke, acute kidney injury (stage III), acute mesenteric ischaemia and acute peripheral vascular ischaemia within 90 days after surgery. Infections requiring intravenous antibiotics with re-hospitalisation are assessed as an important secondary endpoint. The primary endpoint will be analysed by logistic regression adjusting for age, cancer surgery (y/n) and type of surgery (intermediate- or high-risk), and incorporating centres as a random effect.
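As a reading aid, the two transfusion triggers stated above can be expressed as a small decision function. This is a sketch of the published thresholds only, not trial software, and the function name is illustrative:

```python
def rbc_unit_indicated(hb_g_dl: float, arm: str) -> bool:
    """Return True if a single RBC unit is indicated under the given
    study arm: 'liberal' triggers at Hb <= 9 g/dl, 'restrictive' at
    Hb <= 7.5 g/dl (thresholds from the trial protocol above)."""
    thresholds = {"liberal": 9.0, "restrictive": 7.5}
    return hb_g_dl <= thresholds[arm]
```

A patient with an intraoperative Hb of 8.6 g/dl would thus receive a unit in the liberal arm but not in the restrictive arm.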
Discussion: The LIBERAL Trial will evaluate whether, compared to a restrictive strategy, a liberal transfusion strategy reduces the occurrence of major adverse events within 90 days after non-cardiac surgery in the geriatric population.
Trial registration: ClinicalTrials.gov (identifier: NCT03369210).
Background: Approximately every third surgical patient is anemic. The most common form, iron deficiency anemia, results from persisting iron‐deficient erythropoiesis (IDE). Zinc protoporphyrin (ZnPP) is a promising parameter for diagnosing IDE, hitherto requiring blood drawing and laboratory workup.
Study design and methods: Noninvasive ZnPP (ZnPP‐NI) measurements are compared to ZnPP reference determination of the ZnPP/heme ratio by high‐performance liquid chromatography (ZnPP‐HPLC) and the analytical performance in detecting IDE is evaluated against traditional iron status parameters (ferritin, transferrin saturation [TSAT], soluble transferrin receptor–ferritin index [sTfR‐F], soluble transferrin receptor [sTfR]), likewise measured in blood. The study was conducted at the University Hospitals of Frankfurt and Zurich.
Results: Limits of agreement between ZnPP‐NI and ZnPP‐HPLC measurements for 584 cardiac and noncardiac surgical patients equaled 19.7 μmol/mol heme (95% confidence interval, 18.0–21.3; acceptance criterion, 23.2 μmol/mol heme; absolute bias, 0 μmol/mol heme). Analytical performance for detecting IDE (area under the receiver operating characteristic curve) of the parameters measured in blood was: ZnPP‐HPLC (0.95), sTfR (0.92), sTfR‐F (0.89), TSAT (0.87), and ferritin (0.67). The noninvasive ZnPP‐NI measurement yielded an AUC of 0.90.
Conclusion: ZnPP‐NI appears well suited for an initial IDE screening, informing on the state of erythropoiesis at the point of care without blood drawing and laboratory analysis. Comparison with a multiparameter IDE test revealed that ZnPP‐NI values of 40 μmol/mol heme or less allow exclusion of IDE, whereas for values of 65 μmol/mol heme or greater, IDE is very likely if other causes of increased values are excluded. In these cases (77% of our patients) ZnPP‐NI may suffice for a diagnosis, while values in between require analysis of additional iron status parameters.
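The three-zone screening rule above (exclude IDE at ≤ 40, call IDE very likely at ≥ 65, otherwise test further) can be sketched as follows; the cut-offs come from the study, while the function name and zone labels are illustrative wording:

```python
def znpp_screen(znpp_umol_per_mol_heme: float) -> str:
    """Three-zone IDE screening rule based on the study's ZnPP-NI cut-offs
    (40 and 65 umol/mol heme)."""
    if znpp_umol_per_mol_heme <= 40:
        return "IDE excluded"
    if znpp_umol_per_mol_heme >= 65:
        return "IDE very likely (if other causes of elevation are excluded)"
    return "indeterminate: additional iron status parameters required"
```

Only the middle zone, covering about 23% of the patients in this cohort, would require the additional laboratory workup.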
In-line filtration of intravenous infusion may reduce organ dysfunction of adult critical patients
(2019)
Background: The potentially harmful effects of particle-contaminated infusions in critically ill adult patients are still unclear. So far, significantly improved outcomes with in-line filters have been demonstrated only in critically ill children and newborns; for adult patients, evidence is still missing.
Methods: This single-centre, retrospective controlled cohort study assessed the effect of in-line filtration of intravenous fluids with finer 0.2 or 1.2 μm vs 5.0 μm filters in critically ill adult patients. From a total of n = 3215 adult patients, n = 3012 patients were selected by propensity score matching (adjusting for sex, age, and surgery group) and assigned to either a fine filter cohort (with 0.2/1.2 μm filters, n = 1506, time period from February 2013 to January 2014) or a control filter cohort (with 5.0 μm filters, n = 1506, time period from April 2014 to March 2015). The cohorts were compared regarding the occurrence of severe vasoplegia, organ dysfunctions (lung, kidney, and brain), inflammation, in-hospital complications (myocardial infarction, ischemic stroke, pneumonia, and sepsis), in-hospital mortality, and length of ICU and hospital stay.
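The cohort assignment above relies on propensity score matching. As a minimal sketch of the underlying idea only (the study presumably used dedicated statistical software; the greedy 1:1 nearest-neighbour variant and the function name here are assumptions for illustration), matching on precomputed propensity scores can look like this:

```python
def greedy_match(treated_scores, control_scores):
    """Greedy 1:1 nearest-neighbour matching on precomputed propensity
    scores; each control is used at most once. Returns (treated, control)
    index pairs."""
    unused = set(range(len(control_scores)))
    pairs = []
    for t_idx, t_ps in enumerate(treated_scores):
        # pick the closest still-unused control by absolute score distance
        c_idx = min(unused, key=lambda c: abs(control_scores[c] - t_ps))
        pairs.append((t_idx, c_idx))
        unused.remove(c_idx)
    return pairs
```

In practice the scores themselves would come from a logistic regression of cohort membership on the matching covariates (sex, age, surgery group).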
Results: Comparing the fine filter vs the control filter cohort, respiratory dysfunction (Horowitz index 206 (119–290) vs 191 (104.75–280); P = 0.04), pneumonia (11.4% vs 14.4%; P = 0.02), sepsis (9.6% vs 12.2%; P = 0.03), interleukin-6 (471.5 (258.8–1062.8) ng/l vs 540.5 (284.5–1147.5) ng/l; P = 0.01), and length of ICU stay (1.2 (0.6–4.9) vs 1.7 (0.8–6.9) days; P < 0.01) and hospital stay (14.0 (9.2–22.2) vs 14.8 (10.0–26.8) days; P = 0.01) were reduced. The rates of severe vasoplegia (21.0% vs 19.6%; P > 0.20) and acute kidney injury (11.8% vs 13.7%; P = 0.11) did not differ significantly between the cohorts.
Conclusions: In-line filtration with finer 0.2 and 1.2 μm filters may be associated with less organ dysfunction and less inflammation in critically ill adult patients.
Trial registration: The study was registered at ClinicalTrials.gov (number: NCT02281604).
Background: Peritonitis is responsible for thousands of deaths annually in Germany alone. Even source control (SC) and antibiotic treatment often fail to prevent severe sepsis or septic shock, and this situation has hardly improved in the past two decades. Most experimental immunomodulatory therapeutics for sepsis have been aimed at blocking or dampening a specific pro-inflammatory immunological mediator. However, the patient collective is large and heterogeneous. There are therefore grounds for investigating the possibility of developing personalized therapies by classifying patients into groups according to biomarkers. This study aims to combine an assessment of the efficacy of treatment with a preparation of human immunoglobulins G, A, and M (IgGAM) with individual status of various biomarkers (immunoglobulin level, procalcitonin, interleukin 6, antigen D-related human leucocyte antigen (HLA-DR), transcription factor NF-κB1, adrenomedullin, and pathogen spectrum).
Methods/design: A total of 200 patients with sepsis or septic shock will receive standard-of-care treatment (SoC). Of these, 133 patients (selected by 1:2 randomization) will in addition receive infusions of IgGAM for 5 days. All patients will be followed for approximately 90 days and assessed by the multiple-organ failure (MOF) score, by the EQ QLQ 5D quality-of-life scale, and by measurement of vital signs, biomarkers (as above), and survival.
Discussion: This study is intended to provide further information on the efficacy and safety of treatment with IgGAM and to offer the possibility of correlating these with the biomarkers to be studied. Specifically, it will test (at a descriptive level) the hypothesis that patients receiving IgGAM who have higher inflammation status (IL-6) and poorer immune status (low HLA-DR, low immunoglobulin levels) have a better outcome than patients who do not receive IgGAM. It is expected to provide information that will help to close the knowledge gap concerning the association between the effect of IgGAM and the presence of various biomarkers, thus possibly opening the way to a personalized medicine.
Trial registration: EudraCT, 2016–001788-34; ClinicalTrials.gov, NCT03334006. Registered on 17 Nov 2017.
Trial sponsor: RWTH Aachen University, represented by the Center for Translational & Clinical Research Aachen (contact Dr. S. Isfort).
Background: GLUT1 deficiency syndrome (G1DS) is an autosomal dominant genetic disorder caused by a mutation of the SLC2A1 gene. This mutation can lead to an encephalopathy due to abnormal glucose transport in the brain. G1DS is a rare disease, with an estimated incidence of 1:90 000.
Case report: We report the case of a 10-year-old female who presented with recurrent fever, headaches, and vertigo for more than 3 days within 2 weeks following pneumonia. Bilateral mastoiditis was confirmed by cerebral magnetic resonance imaging and a cranial computed tomography scan. The patient had to undergo mastoidectomy and thus her first general anesthesia. She had been diagnosed with G1DS half a year previously and, according to the standard of care, had been on a ketogenic diet since the diagnosis. Our patient received total intravenous anesthesia (TIVA) using propofol, fentanyl, and rocuronium without incident.
Conclusions: We recommend normoglycemia during the perioperative phase and avoidance of glucose-based medication to maintain the patient's ketotic state. Our case highlights that TIVA, with the medication outlined in this case, was safe when the patient's ketotic state and periprocedural blood glucose were monitored continuously. Nevertheless, we would suggest using remifentanil instead of fentanyl for future TIVAs because of the smaller increase in blood glucose level observed in our patient.
Health economics of Patient Blood Management: a cost‐benefit analysis based on a meta‐analysis
(2019)
Background and Objectives: Patient Blood Management (PBM) is the timely application of evidence‐based medical and surgical concepts designed to improve haemoglobin concentration, optimize haemostasis and minimize blood loss in an effort to improve patient outcomes. The focus of this cost‐benefit analysis is to analyse the economic benefit of widespread implementation of a multimodal PBM programme.
Materials and Methods: Based on a recent meta‐analysis including 17 studies (>235 000 patients) comparing PBM with control care and data from the University Hospital Frankfurt, a cost‐benefit analysis was performed. Outcome data were red blood cell (RBC) transfusion rate, number of transfused RBC units, and length of hospital stay (LOS). Costs were considered for the following three PBM interventions as examples: anaemia management including therapy of iron deficiency, use of cell salvage and tranexamic acid. For sensitivity analysis, a Monte Carlo simulation was performed.
Results: Iron supplementation was applied in 3·1%, cell salvage in 65% and tranexamic acid in 89% of the PBM patients. In total, applying these three PBM interventions cost €129·04 per patient. However, PBM was associated with a reduction in transfusion rate, transfused RBC units per patient and LOS, which yielded mean savings of €150·64 per patient. Thus, the overall benefit of PBM implementation was €21·60 per patient. In the Monte Carlo simulation, the cost savings on the outcome side exceeded the PBM costs in approximately two-thirds of all repetitions, and the total benefit was €1 878 000 in 100 000 simulated patients.
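The shape of such a Monte Carlo sensitivity analysis can be sketched in a few lines. Only the point estimates (savings €150·64, costs €129·04 per patient) come from the analysis above; the normal uncertainty model and the standard deviations used here are assumptions for illustration, not the published simulation:

```python
import random
import statistics

def simulate_net_benefit(n_sim: int = 10_000, seed: int = 1):
    """Illustrative Monte Carlo for the per-patient net benefit of PBM.
    Savings and costs are drawn from assumed normal distributions around
    the point estimates reported above (sd values are illustrative)."""
    rng = random.Random(seed)
    net = [rng.gauss(150.64, 40.0) - rng.gauss(129.04, 20.0)
           for _ in range(n_sim)]
    mean_benefit = statistics.mean(net)
    share_savings_exceed_costs = sum(x > 0 for x in net) / n_sim
    return mean_benefit, share_savings_exceed_costs
```

Under these assumed spreads, the mean simulated benefit stays near the €21·60 point estimate, and the savings exceed the costs in a majority of repetitions.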
Conclusion: Resources to implement a multimodal PBM concept optimizing patient care and safety can be allocated cost-effectively.