Estimating intraoperative blood loss is one of the daily challenges for clinicians. Despite the known inaccuracy of visual estimation by anaesthetists and surgeons, it remains the mainstay of estimating surgical blood loss. This review aims to highlight the strengths and weaknesses of currently used measurement methods. A systematic review of studies on the estimation of blood loss was carried out. Studies investigating the accuracy of techniques for quantifying blood loss in vivo and in vitro were included; nonhuman trials and studies using only monitoring parameters to estimate blood loss were excluded. A meta-analysis was performed to evaluate systematic measurement errors of the different methods; only studies compared with a validated reference, e.g. a haemoglobin extraction assay, were included. Ninety studies met the inclusion criteria for the systematic review and were analyzed. Six studies were included in the meta-analysis, as only these used a validated reference. The mixed-effect meta-analysis showed the highest correlation with the reference for colorimetric methods (0.93, 95% CI 0.91–0.96), followed by gravimetric (0.77, 95% CI 0.61–0.93) and finally visual methods (0.61, 95% CI 0.40–0.82). The bias for estimated blood loss (ml) was lowest for colorimetric methods (57.59, 95% CI 23.88–91.3) compared to the reference, followed by gravimetric (326.36, 95% CI 201.65–450.86) and visual methods (456.51, 95% CI 395.19–517.83). Of the many studies included, only a few were compared with a validated reference; the majority chose known imprecise procedures as the method of comparison. Colorimetric methods offer the highest degree of accuracy in blood loss estimation, and systems that use colorimetric techniques have a significant advantage in the real-time assessment of blood loss.
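The pooled correlations above come from a mixed-effect meta-analysis. As a rough illustration of how per-study correlations are pooled (not the authors' actual model), the following sketch applies fixed-effect inverse-variance pooling via Fisher's z transform; the study data are entirely hypothetical:

```python
import math

def pool_correlations(studies, z_crit=1.96):
    """Fixed-effect inverse-variance pooling of Pearson correlations.

    studies: list of (r, n) pairs -- per-study correlation and sample size.
    Each r is Fisher z-transformed, weighted by n - 3 (the inverse variance
    of z), pooled, and back-transformed.
    Returns (r_pooled, ci_low, ci_high) for a 95% CI when z_crit = 1.96.
    """
    pairs = [(math.atanh(r), n - 3) for r, n in studies]
    w_total = sum(w for _, w in pairs)
    z_bar = sum(z * w for z, w in pairs) / w_total
    se = 1.0 / math.sqrt(w_total)
    lo, hi = z_bar - z_crit * se, z_bar + z_crit * se
    return math.tanh(z_bar), math.tanh(lo), math.tanh(hi)

# Hypothetical per-study correlations for a colorimetric method
r_pooled, ci_low, ci_high = pool_correlations([(0.94, 50), (0.91, 80), (0.95, 40)])
```

A mixed-effect model additionally estimates between-study variance, but the inverse-variance weighting shown here is the common core of both approaches.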
Introduction: Dysphagia is a common and severe symptom of traumatic brain injury (TBI), affecting up to 78% of patients. It is associated with pneumonia, increased morbidity, and mortality. Although subdural hematoma (SDH) accounts for over 50% of TBI cases, the occurrence of dysphagia in this subtype has not yet been investigated.
Methods: All patients with SDH admitted to the authors' institution between 2007 and 2020 were included in the study. Patients with SDH and clinical suspicion of dysphagia received a clinical swallowing assessment by a speech and language pathologist (SLP). Furthermore, the severity of dysphagia was rated according to a swallowing disorder scale. Functional outcome was evaluated by the Glasgow outcome scale (GOS).
Results: Out of 545 patients with SDH, 71 patients had dysphagia (13%). The prevalence of dysphagia was significantly lower in the surgical arm compared to the conservative arm (11.8 vs. 21.8%; OR 0.23; p = 0.02). Independent predictors for dysphagia were GCS < 13 at admission (OR 4.17; p < 0.001), cardiovascular disease (OR 2.29; p = 0.002), and pneumonia (OR 2.88; p = 0.002), whereas the operation was a protective factor (OR 0.2; p < 0.001). In a subgroup analysis, right-sided SDH was an additional predictor for dysphagia (OR 2.7; p < 0.001). Overall, patients with dysphagia improved significantly under the SLP treatment from the initial diagnosis to hospital discharge (p < 0.01). However, a subgroup of patients with the most severe grade of dysphagia showed no significant improvement. Patients with dysphagia had significantly worse outcomes (GOS 1–3) compared to those without dysphagia (48.8 vs. 26.4%; p < 0.001).
Conclusion: Dysphagia is a frequent symptom in SDH, and early identification of dysphagia is crucial for the initiation of treatment and for functional outcome. Surgery is effective in preventing dysphagia and should be considered in high-risk patients.
Background: The development of robotic systems has provided an alternative to frame-based stereotactic procedures. The aim of this experimental phantom study was to compare the mechanical accuracy of the Robotic Surgery Assistant (ROSA) and the Leksell stereotactic frame by reducing clinical and procedural factors to a minimum.
Methods: To precisely compare mechanical accuracy, a stereotactic system was chosen as the reference for both methods. A thin-slice CT scan was performed with an acrylic phantom fixed to the frame and a localizer enabling the software to recognize the coordinate system. For each of the five phantom targets, two different trajectories were planned, resulting in 10 trajectories. A series of five repetitions was performed, each time based on a new CT scan; hence, 50 trajectories were analyzed for each method. X-rays of the final cannula position were fused with the planning data. The coordinates of the target point and the endpoint of the robot- or frame-guided probe were visually determined using the robotic software. The target point error (TPE) was calculated as the Euclidean distance between them. The depth deviation along the trajectory and the lateral deviation were calculated separately.
Results: Robotics was significantly more accurate, with an arithmetic TPE mean of 0.53 mm (95% CI 0.41–0.55 mm) compared to 0.72 mm (95% CI 0.63–0.8 mm) in stereotaxy (p < 0.05). In robotics, the mean depth deviation along the trajectory was −0.22 mm (95% CI −0.25 to −0.14 mm). The mean lateral deviation was 0.43 mm (95% CI 0.32–0.49 mm). In frame-based stereotaxy, the mean depth deviation amounted to −0.20 mm (95% CI −0.26 to −0.14 mm), the mean lateral deviation to 0.65 mm (95% CI 0.55–0.74 mm).
Conclusion: Both the robotic and the frame-based approach proved accurate, with the robotic procedure showing significantly higher accuracy. For both methods, procedural factors occurring during surgery might have a more relevant impact on overall accuracy than the mechanical accuracy itself.
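The accuracy metrics of the phantom study above can be sketched as follows. The coordinates are hypothetical, and the decomposition into a signed along-trajectory (depth) component and a perpendicular (lateral) component is an assumption consistent with the description in the methods, not the study's actual software:

```python
import math

def target_point_error(target, tip):
    """TPE: Euclidean distance between planned target and probe tip (mm)."""
    return math.dist(target, tip)

def decompose_deviation(entry, target, tip):
    """Split the tip deviation into a signed depth component along the
    planned trajectory (negative = tip stopped short of the target) and
    a lateral component perpendicular to it. Returns (depth_mm, lateral_mm)."""
    axis = [t - e for e, t in zip(entry, target)]
    norm = math.sqrt(sum(a * a for a in axis))
    axis = [a / norm for a in axis]                # unit vector entry -> target
    dev = [p - t for p, t in zip(tip, target)]     # tip relative to target
    depth = sum(d * a for d, a in zip(dev, axis))  # signed projection on axis
    lateral = math.sqrt(max(sum(d * d for d in dev) - depth ** 2, 0.0))
    return depth, lateral

# Hypothetical coordinates (mm): tip 0.2 mm short of target, 0.4 mm off-axis
entry, target, tip = (0.0, 0.0, 0.0), (0.0, 0.0, 60.0), (0.4, 0.0, 59.8)
tpe = target_point_error(target, tip)
depth, lateral = decompose_deviation(entry, target, tip)
```

By Pythagoras, TPE² = depth² + lateral², which is why the two components can be reported separately without losing the overall error.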
Chimeric antigen receptor (CAR) T cell therapy is a potent new treatment option for relapsed or refractory hematologic malignancies. As the monitoring of CAR T cell kinetics can provide insights into the activity of the therapy, appropriate CAR T cell detection methods are essential. Here, we report on the comprehensive validation of a flow cytometric assay for peripheral blood CD19 CAR T cell detection. Further, a retrospective analysis (n = 30) of CAR T cell and B cell levels over time has been performed, and CAR T cell phenotypes have been characterized. Serial dilution experiments demonstrated precise and linear quantification down to 0.05% of T cells or 22 CAR T cell events. The calculated detection limit of 13 events was confirmed with CAR T cell negative control samples. Inter-method comparison with real-time PCR showed appreciable correlation. Stability testing revealed diminished CAR T cell values as early as one day after sample collection. While we found long-term CAR T cell detectability and B cell aplasia in most patients (12/17), some patients (5/17) experienced B cell recovery. In three of these patients, the coexistence of CAR T cells and regenerating B cells was observed. Repeat CAR T cell infusions led to detectable but limited re-expansions. Comparison of CAR T cell subsets with their counterparts among all T cells showed a significantly higher percentage of effector memory T cells and a significantly lower percentage of naïve T cells and T EMRA cells among CAR T cells. In conclusion, flow cytometric CAR T cell detection is a reliable method for monitoring CAR T cells, provided that measurements are started without delay and sufficient T cell counts are available.
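As a minimal illustration of the quantification logic in the validation above (22 CAR T events corresponding to 0.05% of T cells, with a 13-event detection limit), the sketch below uses assumed event counts; it is not the validated assay's actual analysis software:

```python
def car_t_percentage(car_t_events, t_cell_events, lod_events=13):
    """CAR T cells as a percentage of T cells from flow cytometry event
    counts. Results based on fewer CAR T events than the detection limit
    (13 events in the validation above) are returned as None, i.e. not
    reliably quantifiable."""
    if car_t_events < lod_events:
        return None
    return 100.0 * car_t_events / t_cell_events

# Assumed event counts: 22 CAR T events among 44,000 T cells -> 0.05%
pct = car_t_percentage(22, 44_000)
below_lod = car_t_percentage(10, 44_000)  # under the 13-event limit
```

Gating on a minimum number of events before reporting a percentage is what makes the "sufficient T cell counts" caveat in the conclusion operational: the same 0.05% frequency is only quantifiable if enough total T cells were acquired.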
Background: The inclusion of immune checkpoint inhibitors (ICIs) in therapeutic algorithms has led to significant survival benefits in patients with various metastatic cancers. Concurrently, an increasing number of neurological immune-related adverse events (irAEs) has been observed. In this retrospective analysis, we examine the incidence of ICI-induced cerebral pseudoprogression and propose a classification system.
Methods: We screened our hospital information system to identify patients who received any in-house ICI treatment for any tumor disease during the years 2007–2019. All patients with cerebral MR imaging (cMRI) of sufficient diagnostic quality were included. cMRIs were retrospectively analyzed according to the immunotherapy Response Assessment in Neuro-Oncology (iRANO) criteria.
Results: We identified 12 cases of cerebral pseudoprogression in 123 patients treated with ICIs and sufficient MRI. These patients were receiving ICI therapy for lung cancer (n=5), malignant melanoma (n=4), glioblastoma (n=1), hepatocellular carcinoma (n=1) or lymphoma (n=1) when cerebral pseudoprogression was detected. Median time from the start of ICI treatment to pseudoprogression was 5 months. All but one patient developed neurological symptoms. Three different patterns of cerebral pseudoprogression could be distinguished: new or increasing contrast-enhancing lesions, new or increasing T2 predominant lesions and cerebral vasculitis type pattern.
Conclusion: Cerebral pseudoprogression followed three distinct patterns and was detectable in 3.2% of all patients during ICI treatment and in 9.75% of the patients with sufficient brain imaging follow-up. The fact that all but one of the affected patients developed neurological symptoms, which would be classified as progressive disease according to the iRANO criteria, mandates vigilance in the diagnosis and treatment of ICI-induced cerebral lesions.
Background: To determine the correlation between urine loss in a PAD-test after catheter removal and early urinary continence (UC) in patients treated with radical prostatectomy (RP). Methods: Urine loss was measured using a standardized, validated PAD-test within 24 h after removal of the transurethral catheter and was grouped as a loss of <1, 1–10, 11–50, and >50 g of urine, respectively. Early UC (median: 3 months) was defined as the use of no pad or one safety pad. Uni- and multivariable logistic regression models tested the correlation between PAD-test results and early UC. Covariates consisted of age, BMI, nerve-sparing approach, prostate volume, and extraprostatic extension of the tumor. Results: From 01/2018 to 03/2021, 100 patients undergoing RP with data available for the PAD-test and early UC were retrospectively identified. Ultimately, 24%, 47%, 15%, and 14% of patients had a urine loss of <1 g, 1–10 g, 11–50 g, and >50 g in the PAD-test, respectively. Additionally, 59% of patients reported being continent. In multivariable logistic regression models, urine loss in the PAD-test predicted early UC (OR: 0.21 vs. 0.09 vs. 0.03 for urine loss 1–10 g vs. 11–50 g vs. >50 g; reference: <1 g; all p < 0.05). Conclusions: Urine loss after catheter removal correlated strongly with early continence as well as with the severity of urinary incontinence.
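The odds ratios above come from multivariable logistic regression. For a simple illustration of what such an OR expresses, the sketch below computes an unadjusted odds ratio from a 2×2 table; the counts are invented for the example and are not from the study:

```python
def odds_ratio_2x2(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a/b = events/non-events in the exposed group,
    c/d = events/non-events in the reference group."""
    return (a / b) / (c / d)

# Hypothetical counts (not from the study): continence after >50 g urine
# loss (2 continent, 12 not) vs. the <1 g reference group (20 continent,
# 4 not). An OR well below 1 means heavy early urine loss is associated
# with much lower odds of early continence.
or_high_loss = odds_ratio_2x2(2, 12, 20, 4)
```

A multivariable model reports the same kind of ratio, but adjusted for the listed covariates (age, BMI, nerve-sparing approach, prostate volume, extraprostatic extension) rather than taken from raw counts.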
Background: Duodenal obstruction is a rare cause of congenital bowel obstruction. Prenatal ultrasound could be suggestive of duodenal atresia if polyhydramnios and the double bubble sign are visible. Prenatal diagnosis should prompt respective prenatal care, including surgery. The aim of this study was to investigate the rate and importance of prenatally diagnosed duodenal obstruction, comparing incomplete and complete duodenal obstruction. Methods: A retrospective, single-center study was performed using data from patients operated on for duodenal obstruction between 2004 and 2019. Prenatal ultrasound findings were obtained from maternal logbooks and directly from the investigating obstetricians. Postnatal data were obtained from electronic charts, including imaging, operative notes and follow-up. Results: A total of 33/64 parents of respective patients agreed to provide information on prenatal diagnostics. In total, 11/15 patients with complete duodenal obstruction and 0/18 patients with incomplete duodenal obstruction showed typical prenatal features. Prenatal diagnosis prompted immediate surgical treatment after birth. Conclusion: Prenatal diagnosis of congenital duodenal obstruction is only achievable in cases of complete congenital duodenal obstruction by sonographic detection of the pathognomonic double bubble sign. Patients with incomplete duodenal obstruction showed no sign of duodenal obstruction on prenatal scans and thus were diagnosed and treated later.
Macrophages are plastic and heterogeneous immune cells that adopt pro- or anti-inflammatory phenotypes upon exposure to different stimuli. Even though there has been evidence supporting a crosstalk between coagulation and innate immunity, the way in which protein components of the hemostasis pathway influence macrophages remains unclear. We investigated the effect of thrombin on macrophage polarization. On the basis of gene expression and cytokine secretion, our results suggest that polarization with thrombin induces an anti-inflammatory, M2-like phenotype. In functional studies, thrombin polarization promoted oxLDL phagocytosis by macrophages, and conditioned medium from the same cells increased endothelial cell proliferation. There were, however, clear differences between the classical M2a polarization and the effects of thrombin on gene expression. Finally, the deletion and inactivation of secreted modular Ca2+-binding protein 1 (SMOC1) attenuated phagocytosis by thrombin-stimulated macrophages, a phenomenon reversed by the addition of recombinant SMOC1. Manipulation of SMOC1 levels also had a pronounced impact on the expression of TGF-β-signaling-related genes. Taken together, our results show that thrombin induces an anti-inflammatory macrophage phenotype with similarities as well as differences to the classical alternatively activated M2 polarization states, highlighting the importance of tissue levels of SMOC1 in modifying thrombin-induced macrophage polarization.
Behind the Wall - Compartment-Specific Neovascularisation during Post-Stroke Recovery in Mice (2022)
Ischemic stroke is a highly prevalent vascular disease leading to oxygen and glucose deprivation in the brain. In response, ischemia-induced neovascularization occurs, which is supported by circulating CD34+ endothelial progenitor cells. Here, we used the transient middle cerebral artery occlusion (tMCAO) mouse model to characterize the spatio-temporal alterations within the ischemic core from the acute to the chronic phase, using multiple-epitope-ligand cartography (MELC) for sequential immunohistochemistry. We found that around 14 days post-stroke, significant angiogenesis occurs in the ischemic core, as determined by the presence of CD31+/CD34+ double-positive endothelial cells. This neovascularization was accompanied by the recruitment of CD4+ T-cells and dendritic cells as well as IBA1+ and IBA1− microglia. Neighborhood analysis identified, besides pericytes, only T-cells and dendritic cells as having a statistically significant distribution as direct neighbors of CD31+/CD34+ endothelial cells, suggesting a role for these cells in aiding angiogenesis. This process was distinct from neovascularization of the peri-infarct area, as the two were separated by a broad astroglial scar. At day 28 post-stroke, the scar had extended towards the cortical periphery, which appears to allow neuronal regeneration within the peri-infarct area. Meanwhile, the ischemic core had condensed to a highly vascularized subpial region adjacent to the leptomeningeal compartment. In conclusion, in the course of chronic post-stroke regeneration, the astroglial scar serves as a seal between two immunologically active compartments, the peri-infarct area and the ischemic core, which exhibit distinct processes of neovascularization as a central feature of post-stroke tissue remodeling. Based on our findings, we propose that neovascularization of the ischemic core comprises arteriogenesis as well as angiogenesis originating from the leptomeningeal vasculature.
Multiple myeloma (MM) is the second most common hematologic malignancy and is characterized by clonal proliferation of neoplastic plasma cells in the bone marrow. This microenvironment is characterized by low oxygen levels (1–6% O2), known as hypoxia. For MM cells, hypoxia is a physiologic feature that has been described to promote an aggressive phenotype and to confer drug resistance. However, studies on hypoxia are scarce and show little consistency. Here, we analyzed the mRNA expression of previously determined hypoxia markers to define the temporal adaptation of MM cells to chronic hypoxia. Subsequent analyses of the global proteome in MM cells and the stromal cell line HS-5 revealed hypoxia-dependent regulation of proteins that directly or indirectly upregulate glycolysis. In addition, chronic hypoxia led to MM-specific regulation of nine distinct proteins. One of these proteins is the cysteine protease legumain (LGMN), the depletion of which led to a significant growth disadvantage of MM cell lines that is enhanced under hypoxia. Thus, herein, we report a methodologic strategy to examine MM cells under physiologic hypoxic conditions in vitro and to decipher and study previously masked hypoxia-specific therapeutic targets such as the cysteine protease LGMN.