Memory Concerns, Memory Performance and Risk of Dementia in Patients with Mild Cognitive Impairment
(2014)
Background: Concerns about worsening memory (“memory concerns”; MC) and impairment in memory performance are both predictors of Alzheimer's dementia (AD). However, the relationship between the two in predicting dementia at the pre-dementia disease stage is not well explored. A refined understanding of the respective contributions of MC and memory performance to dementia prediction is crucial for defining at-risk populations. We examined the risk of incident AD by MC and memory performance in patients with mild cognitive impairment (MCI).
Methods: We analyzed data from 417 MCI patients enrolled in a longitudinal multicenter observational study. Patients were classified based on the presence (n = 305) vs. absence (n = 112) of MC. Risk of incident AD was estimated with Cox proportional hazards regression models.
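For illustration, the survival analysis described in the Methods, including the MC × memory-performance interaction reported below, could be set up roughly as follows. This is a hedged sketch using the lifelines package; the synthetic data and column names are hypothetical, and the study's actual covariates and coding may differ.

```python
# Minimal sketch of a Cox proportional hazards analysis with an interaction
# term, using the lifelines package. Data and column names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 417  # cohort size reported in the Methods
df = pd.DataFrame({
    "years_to_event": rng.exponential(3.0, n) + 0.1,  # follow-up time
    "incident_ad": rng.integers(0, 2, n),             # 1 = converted to AD
    "memory_concerns": rng.integers(0, 2, n),         # 1 = MC present
    "memory_score": rng.normal(0.0, 1.0, n),          # memory performance
    "apoe4": rng.integers(0, 2, n),                   # ApoE4 carrier status
})
# Interaction term lets the predictive power of MC vary with memory performance
df["mc_x_memory"] = df["memory_concerns"] * df["memory_score"]

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_event", event_col="incident_ad")
cph.print_summary()  # hazard ratios with 95% CIs for each covariate
```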
Results: Risk of incident AD was increased by MC (HR = 2.55, 95%CI: 1.33–4.89), lower memory performance (HR = 0.63, 95%CI: 0.56–0.71) and ApoE4-genotype (HR = 1.89, 95%CI: 1.18–3.02). An interaction effect between MC and memory performance was observed. The predictive power of MC was greatest for patients with very mild memory impairment and decreased with increasing memory impairment.
Conclusions: Our data suggest that the power of MC as a predictor of future dementia at the MCI stage varies with the patients' level of cognitive impairment. While MC are predictive at early stage MCI, their predictive value at more advanced stages of MCI is reduced. This suggests that loss of insight related to AD may occur at the late stage of MCI.
Background: Clinical practice guidelines for patients with primary biliary cholangitis (PBC) have recently been revised, implementing well-established response criteria to standard first-line ursodeoxycholic acid (UDCA) therapy, assessed 12 months after treatment initiation, for the early identification of high-risk patients with an inadequate treatment response who may require treatment modification. However, only very limited data exist on the real-world clinical management of patients with PBC in Germany.
Objective: The aim of this retrospective multicenter study was to evaluate response rates to standard first-line UDCA therapy and subsequent second-line treatment regimens in a large cohort of well-characterized patients with PBC from 10 independent hepatological referral centers in Germany, prior to the introduction of obeticholic acid as a licensed second-line treatment option.
Methods: Diagnostic confirmation of PBC, standard first-line UDCA treatment regimens, and response rates at 12 months according to the Paris-I, Paris-II, and Barcelona criteria, the follow-up cut-off alkaline phosphatase (ALP) ≤ 1.67 × upper limit of normal (ULN), and normalization of bilirubin (bilirubin ≤ 1 × ULN) were retrospectively examined between June 1986 and March 2017. The management of patients with an inadequate response to UDCA, the second-line treatment regimens hitherto applied, and subsequent response rates at 12 months were also evaluated.
Results: Overall, 480 patients with PBC were included in this study. The median UDCA dosage was 13.2 mg UDCA/kg bodyweight (BW)/d. Adequate UDCA treatment responses according to the Paris-I, Paris-II, and Barcelona criteria were observed in 91%, 71.3%, and 61.3% of patients, respectively. ALP ≤ 1.67 × ULN was achieved in 83.8% of patients. A total of 116 patients (24.2%) showed an inadequate response to UDCA according to at least one criterion. The diverse second-line treatment regimens applied led to significantly higher response rates according to the Paris-II criteria (35 vs. 60%, p = 0.005), Barcelona criteria (13 vs. 34%, p = 0.0005), and ALP ≤ 1.67 × ULN with bilirubin ≤ 1 × ULN (52.1 vs. 75%, p = 0.002). The addition of bezafibrate appeared to induce the strongest beneficial effect in this cohort (Paris-II: 24 vs. 74%, p = 0.004; Barcelona: 50 vs. 84%, p = 0.046; ALP < 1.67 × ULN and bilirubin ≤ 1 × ULN: 33 vs. 86%, p = 0.001).
Conclusion: Our large retrospective multicenter study confirms high response rates to standard first-line UDCA treatment in patients with PBC and highlights the need for close monitoring and early treatment modification in high-risk patients with an insufficient response to UDCA, since early treatment modification significantly increases subsequent response rates in these patients.
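For orientation, the two fully specified laboratory-based response criteria quoted above (the ALP cut-off and bilirubin normalization) can be operationalized in a few lines. The sketch below uses illustrative function names and expresses values as multiples of the ULN; the Paris and Barcelona criteria involve additional parameters and are not implemented here.

```python
# Hedged sketch: classify the 12-month biochemical response to UDCA using
# the two laboratory criteria quoted in the abstract. Inputs are expressed
# as multiples of the upper limit of normal (ULN); names are illustrative.
def alp_response(alp_x_uln: float) -> bool:
    """Follow-up cut-off: ALP <= 1.67 x ULN."""
    return alp_x_uln <= 1.67

def alp_and_bilirubin_response(alp_x_uln: float, bilirubin_x_uln: float) -> bool:
    """Combined criterion: ALP <= 1.67 x ULN and bilirubin <= 1 x ULN."""
    return alp_x_uln <= 1.67 and bilirubin_x_uln <= 1.0

# Example: ALP at 1.4 x ULN, bilirubin at 0.9 x ULN -> adequate on both
print(alp_response(1.4), alp_and_bilirubin_response(1.4, 0.9))  # True True
```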
The nuclear factor kappa B (NFκB) signaling pathway plays an important role in liver homeostasis and cancer development. Tax1-binding protein 1 (Tax1BP1) is a regulator of the NFκB signaling pathway, but its role in the liver and in hepatocellular carcinoma (HCC) is presently unknown. Here, we investigated the role of Tax1BP1 in liver cells and in murine models of HCC and liver fibrosis. We applied the diethylnitrosamine (DEN) model of experimental hepatocarcinogenesis in Tax1BP1+/+ and Tax1BP1−/− mice. The amount and subsets of non-parenchymal liver cells in Tax1BP1+/+ and Tax1BP1−/− mice were determined, and activation of NFκB and stress-induced signaling pathways was assessed. Differential expression of mRNA and miRNA was determined. Tax1BP1−/− mice showed increased numbers of inflammatory cells in the liver. Furthermore, sustained activation of the NFκB signaling pathway was found in hepatocytes, as well as increased transcription of proinflammatory cytokines in Kupffer cells isolated from Tax1BP1−/− mice. Several mRNAs and miRNAs that regulate inflammation or are involved in cancer development or progression were differentially expressed in livers of Tax1BP1−/− mice. Furthermore, Tax1BP1−/− mice developed more HCCs than their Tax1BP1+/+ littermates. We conclude that Tax1BP1 protects from liver cancer development by limiting proinflammatory signaling.
Introduction: In the emergency department, the majority of pediatric trauma patients present with minor injuries. The aim of this study was to evaluate temporal changes in age-related injury patterns, trauma mechanisms, and surgeries in pediatric patients. Methods: This retrospective study included patients < 18 years of age treated for trauma between 01/2009 and 12/2018 at a level I trauma center. They were divided into two groups: group A (01/2009 to 12/2013) and group B (01/2014 to 12/2018). Injury mechanism, injury pattern, and surgeries were analyzed. Fractures, dislocations, and organ injuries were defined as major injuries; contusions and superficial wounds as minor injuries. Results: 23,582 patients were included (58% male, median age 8.2 years). There was a slight increase in patients from A (n = 11,557) to B (n = 12,025), with no difference in demographic characteristics. Significantly more patients were admitted to the resuscitation room (A: 1.9%; B: 2.4%), although the number of multiply injured patients did not differ significantly. Major injuries occurred significantly less frequently in A (25.5%) than in B (27.0%); minor injuries occurred equally often. Extremity fractures were significantly more frequent in B (21.5%) than in A (20.2%), peaking at 8–12 years. Most trauma mechanisms remained constant in both groups, with a rise in sports injuries at 8–12 years. Conclusion: Although the number of patients increased only slightly over the decade, there was a clear increase in major injuries, particularly extremity fractures, peaking at 8–12 years. Sports accidents also increased significantly at this age. Finally, resuscitation room admissions rose, but without an increase in multiply injured patients.
Background: Polytraumatized patients undergo strong immunological stress upon insult. Phagocytes (granulocytes and monocytes) play a substantial role in the immunological defense against bacteria, fungi, and yeast, and in the clearance of cellular debris after tissue injury. We have previously reported reduced monocyte phagocytic activity early after porcine polytrauma. However, it is unknown whether both phagocyte types undergo such functional alterations, and whether there is pathogen-specific phagocytic behavior. Here, we characterized the phagocytic activity and capacity of granulocytes and monocytes after polytrauma.
Methods: Eight pigs (Sus scrofa) underwent polytrauma consisting of lung contusion, liver laceration, tibial fracture, and hemorrhagic shock, with fluid resuscitation and fracture fixation with an external fixator. Intensive care treatment, including mechanical ventilation, followed for 72 h. Phagocytic activity and capacity were investigated using ex vivo whole-blood stimulation phagocytosis assays before trauma, after surgery, and 24, 48, and 72 h after trauma. Blood samples were stimulated with phorbol-12-myristate-13-acetate and incubated with FITC-labeled E. coli, S. aureus, or S. cerevisiae for phagocytosis assessment by flow cytometry.
Results: The early polytrauma-induced significant increase in granulocyte and monocyte counts declined to baseline values within 24 h. The percentage of E. coli-phagocytizing granulocytes decreased significantly after polytrauma and during further intensive care treatment, while their capacity significantly increased. Interestingly, both the phagocytic activity and the capacity of granulocytes for S. aureus decreased significantly after trauma; a recovery was observed after 24 h, followed by another decrease. The percentage of S. cerevisiae-phagocytizing granulocytes increased significantly after 24 h, whereas their capacity was impaired after surgery and at 72 h. The percentage of E. coli-phagocytizing monocytes did not change, while their capacity increased after 24–72 h. After a significant decrease in S. aureus-phagocytizing monocytes after surgery, a significant increase was observed after 24 and 48 h, without alterations in capacity. No significant changes in S. cerevisiae-phagocytizing monocytes occurred, but their capacity dropped at 48 and 72 h.
Conclusion: The phagocytic activity and capacity of granulocytes and monocytes follow different patterns and change significantly within 72 h after polytrauma. Both activity and capacity show significantly different alterations depending on the pathogen, potentially pointing to particular, and possibly more relevant, causes of infection after polytrauma.
In bone tissue engineering (BTE), autologous bone-regenerative cells are combined with a scaffold for large bone defect treatment (LBDT). Microporous polylactic acid (PLA) scaffolds have shown good healing results in small animals. However, transfer to large animal models is not achieved simply by upscaling the design: increasing diffusion distances have a negative impact on cell survival and nutrient supply, leading to cell death and ultimately implant failure. Here, a novel scaffold architecture was designed to meet all requirements for an advanced bone substitute. Biofunctional, porous subunits within a load-bearing, compression-resistant frame structure characterize this approach. An open macro- and microporous internal architecture (100 µm–2 mm pores) optimizes conditions for oxygen and nutrient supply to the implant's inner areas by diffusion. A prototype was 3D-printed from PLA using fused filament fabrication. After incubation with Saos-2 (sarcoma osteogenic) cells for 14 days, cell morphology, cell distribution, cell survival (fluorescence microscopy and an LDH-based cytotoxicity assay), metabolic activity (MTT test), and osteogenic gene expression were determined. The adherent cells showed colonization properties, proliferation potential, and osteogenic differentiation. The innovative design, with its porous structure, is a promising matrix for cell settlement and proliferation. The modular design allows easy upscaling and offers a solution for LBDT.
Background: Conversion from calcineurin inhibitor (CNI) therapy to a mammalian target of rapamycin (mTOR) inhibitor following kidney transplantation may help to preserve graft function. Data are sparse, however, concerning the impact of conversion on posttransplant diabetes mellitus (PTDM) or the progression of pre-existing diabetes.
Methods: PTDM and other diabetes-related parameters were assessed post hoc in two large open-label multicenter trials. Kidney transplant recipients were randomized (i) at month 4.5 to switch to everolimus or remain on a standard cyclosporine (CsA)-based regimen (ZEUS, n = 300), or (ii) at month 3 to switch to everolimus, remain on standard CNI therapy or convert to everolimus with reduced-exposure CsA (HERAKLES, n = 497).
Results: There were no significant differences in the incidence of PTDM between treatment groups (log rank p = 0.97 [ZEUS], p = 0.90 [HERAKLES]). The mean change in random blood glucose from randomization to month 12 was also similar between treatment groups in both trials for patients with or without PTDM, and with or without pre-existing diabetes. The change in eGFR from randomization to month 12 showed a benefit for everolimus versus comparator groups in all subpopulations, but only reached significance in larger subgroups (no PTDM or no pre-existing diabetes).
Conclusions: Within the restrictions of this post hoc analysis, including non-standardized diagnostic criteria and limited glycemia laboratory parameters, these data do not indicate any difference in the incidence or severity of PTDM with early conversion from a CsA-based regimen to everolimus, or in the progression of pre-existing diabetes.
Trial registration: clinicaltrials.gov, NCT00154310 (registered September 2005) and NCT00514514 (registered August 2007); EudraCT (2006-007021-32 and 2004-004346-40).
Delayed wound repair in sepsis is associated with reduced local pro-inflammatory cytokine expression
(2013)
Sepsis is one of the main causes of morbidity and mortality in hospitalized patients. Moreover, sepsis-associated complications involving impaired wound healing are common. Septic patients often require surgical interventions that, in turn, may lead to further complications caused by impaired wound healing. We established a mouse model to study delayed wound healing during sepsis distant from the septic focus. To this end, cecal ligation and puncture (CLP) was combined with the creation of a superficial wound on the mouse ear. Control animals underwent the same procedure without CLP. Epithelialization was measured every second day by direct microscopic visualization until complete closure of the wound. As the interplay of TNF-α, TGF-β, matrix metalloproteinases (MMP), and tissue inhibitors of metalloproteinases (TIMP) is important for wound healing in general, TNF-α, TGF-β, MMP7, and TIMP1 were assessed immunohistochemically in samples of wounded ears harvested on days 2, 6, 10, and 16 after wounding. After induction of sepsis, animals showed a significant delay in wound epithelialization from day 2 to 12 compared with control animals. Complete wound healing was attained after a mean of 12.2 ± 3.0 (standard deviation, SD) days in septic animals compared with 8.7 ± 1.7 days in the control group. Septic animals showed significantly reduced local levels of the pro-inflammatory cytokine TNF-α on days 2 and 6, as well as reduced TGF-β expression on day 2, in wounds. Significantly lower expression of MMP7 and TIMP1 was also observed on day 2 after wounding. Thus, the induction of sepsis impairs wound healing distant from the septic focus, and we could demonstrate that the expression of cytokines important for wound repair is deregulated after induction of sepsis. Restoring a normal local cytokine response in wounds could therefore be a promising strategy to enhance wound repair in sepsis.
Epoxyeicosatrienoic acids (EETs) facilitate regeneration in different tissues, and their benefit in dermal wound healing has been proven under normal conditions. In this study, we investigated the effect of 11,12 EET on dermal wound healing in diabetes. We induced diabetes by i.p. injection of streptozotocin 2 weeks prior to wound creation on the dorsal side of the mouse ear. 11,12 EET was applied to the wound every second day, whereas the control groups received only solvent. Epithelialization was monitored intravitally every second day up to wound closure. Wounds were stained for VEGF, CD31, TGF-β, TNF-α, SDF-1α, NF-κB, and Ki-67, and fibroblasts were counted after hematoxylin-eosin staining on days 3, 6, 9, and 16 after wounding. After induction of diabetes, wounds closed on day 13.00 ± 2.20 (standard deviation, SD). Local 11,12 EET application improved wound closure significantly, to day 8.40 ± 1.39 SD. EET treatment enhanced VEGF and CD31 expression in wounds on day 3. It also appeared to raise TNF-α levels on all days investigated, as well as TGF-β levels on days 3 and 6, and a decrease in NF-κB could be observed on days 9 and 16 after EET application; the latter findings were not significant. SDF-1α expression was not influenced by EET application, and Ki-67 was significantly lower in the EET group on day 9 after EET application. The number of fibroblasts was significantly increased on day 9 after 11,12 EET application. In conclusion, 11,12 EET improves impaired wound healing in diabetes by enhancing neoangiogenesis, especially in the early phase of wound healing. Furthermore, it contributes to the resolution of the initial inflammatory reaction, allowing the crucial transition from the inflammatory to the proliferative phase of wound healing.
Introduction: Stem cell transplantation is one of the most promising strategies to improve healing in chronic wounds, as systemic administration of endothelial progenitor cells (EPC) enhances healing by promoting neovascularization and homing, though a large number of cells is needed. In the following study, we analysed whether local application can reduce the number of EPC needed to achieve the same beneficial effect on wound healing.
Material and Methods: Wound healing after local or systemic treatment with EPC was monitored in vivo by creating standardized wounds on the dorsum of hairless mice and measuring wound closure every second day. The systemic group received 2 × 10⁶ EPC i.v., and the locally treated group received 2 × 10⁵ EPC injected locally. PBS injections administered in the same way served as controls. Expression of CD31, VEGF, CD90, and SDF-1α was analysed immunohistochemically to evaluate neovascularization and amelioration of homing.
Results: Local (7.1 ± 0.45 SD days) as well as systemic (6.1 ± 0.23 SD days) EPC transplantation led to significantly accelerated wound closure compared with controls (local PBS: 9.7 ± 0.5 SD days; systemic PBS: 10.9 ± 0.38 SD days). Systemic application enhanced CD31 expression on day 6 after wounding, and local EPC on days 6 and 9, in comparison with controls. VEGF expression was not significantly affected. Systemic and local EPC treatment resulted in significantly enhanced SDF-1α and CD90 expression on all days investigated.
Conclusion: Local as well as systemic EPC treatment enhances wound healing. Moreover, comparable beneficial effects are obtained with a tenfold lower number of EPC when they are applied locally. Thus, local EPC treatment might be the more convenient way to enhance wound healing, as the number of available progenitor cells is limited.
Introduction: Epoxyeicosatrienoic acids (EETs) are able to enhance angiogenesis and regulate inflammation, which is especially important in wound healing under ischemic conditions. We therefore evaluated the effect of local EET application on ischemic wounds in mice.
Methods: Ischemia was induced by cauterization of two of the three vessels supplying the mouse ear. Wounding of the ear was performed three days later. Wounds were treated with either 11,12 or 14,15 EET and compared with untreated ischemic controls and normal wounds. Epithelialization was measured every second day. VEGF, TNF-α, TGF-β, matrix metalloproteinases (MMP), tissue inhibitors of metalloproteinases (TIMP), Ki67, and SDF-1α were evaluated immunohistochemically in wounds on days 3, 6, and 9.
Results: Ischemia delayed wound closure (12.8 ± 1.9 (standard deviation, SD) days with ischemia vs. 8.0 ± 0.94 SD days for controls). Application of 11,12 and 14,15 EET ameliorated the deteriorated wound healing on ischemic ears (7.6 ± 1.3 SD days for 11,12 EET and 9.2 ± 1.4 SD days for 14,15 EET). Ischemia did not change VEGF, TNF-α, TGF-β, SDF-1α, TIMP, MMP7, or MMP9 levels significantly compared with controls. Local application of 11,12 as well as 14,15 EET induced a significant elevation of VEGF, TGF-β, and SDF-1α expression, as well as of proliferation, during the whole course of wound healing compared with control and ischemia alone.
Conclusion: In summary, EETs improve the impaired wound healing caused by ischemia, as they enhance neovascularization and alter the inflammatory response in wounds. Thus, elevating the levels of lipid mediators such as 11,12 and 14,15 EET in wounds might be a successful strategy for ameliorating deranged wound healing under ischemia.
The emerging disciplines of lipidomics and metabolomics show great potential for the discovery of diagnostic biomarkers, but appropriate pre-analytical sample-handling procedures are critical because several analytes are prone to ex vivo distortions during sample collection. To test how the intermediate storage temperature and storage period of plasma samples from K3EDTA whole-blood collection tubes affect analyte concentrations, we assessed samples from non-fasting healthy volunteers (n = 9) for a broad spectrum of metabolites, including lipids and lipid mediators, using a well-established LC-MS-based platform. We used a fold change-based approach as a relative measure of analyte stability to evaluate 489 analytes, employing a combination of targeted LC-MS/MS and LC-HRMS screening. The concentrations of many analytes were found to be reliable, often justifying less strict sample handling; however, certain analytes were unstable, supporting the need for meticulous processing. We make four data-driven recommendations for sample-handling protocols with varying degrees of stringency, based on the maximum number of analytes and the feasibility of routine clinical implementation. These protocols also enable the simple evaluation of biomarker candidates based on their analyte-specific vulnerability to ex vivo distortions. In summary, pre-analytical sample handling has a major effect on the suitability of certain metabolites as biomarkers, including several lipids and lipid mediators. Our sample-handling recommendations will increase the reliability and quality of samples when such metabolites are necessary for routine clinical diagnosis.
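A minimal sketch of the fold change-based stability assessment described above might look as follows; the data layout and the 1.5-fold tolerance are illustrative assumptions, not the study's exact pipeline.

```python
# Minimal sketch of a fold change-based stability measure: the mean
# concentration of each analyte after a given storage condition is compared
# with its baseline (immediately processed) value. Layout and the 1.5-fold
# tolerance are illustrative assumptions.
import numpy as np
import pandas as pd

baseline = pd.Series({"analyte_A": 10.0, "analyte_B": 0.5})   # t = 0
stored = pd.Series({"analyte_A": 10.8, "analyte_B": 1.9})     # e.g. 24 h storage

log2_fc = np.log2(stored / baseline)
unstable = log2_fc.abs() > np.log2(1.5)  # flag >1.5-fold ex vivo distortion
print(log2_fc.round(2).to_dict())   # {'analyte_A': 0.11, 'analyte_B': 1.93}
print(unstable.to_dict())           # {'analyte_A': False, 'analyte_B': True}
```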
Background & Aims: In patients with acute-on-chronic liver failure (ACLF), adequate risk stratification is essential, especially for liver transplant allocation, since ACLF is associated with high short-term mortality. The CLIF-C ACLF score is the best prognostic model for predicting outcome in ACLF patients. While lung failure is generally regarded as a signum malum in ICU care, this study aims to evaluate and quantify the impact of pulmonary impairment on outcome in ACLF patients.
Methods: In this retrospective study, 498 patients with liver cirrhosis admitted to the IMC/ICU were included. ACLF was defined according to the EASL-CLIF criteria. Pulmonary impairment was classified into three groups: unimpaired ventilation, need for mechanical ventilation, and defined pulmonary failure. These factors were analysed in different cohorts, including a propensity score-matched ACLF cohort.
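As an illustration of the propensity score matching mentioned in the Methods, a minimal 1:1 nearest-neighbour sketch is shown below; the covariates, caliper, and function name are assumptions, since the abstract does not specify the matching variables.

```python
# Hedged sketch of 1:1 nearest-neighbour propensity score matching without
# replacement, as could be used to build a matched cohort. Covariates,
# caliper, and names are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_cohort(df: pd.DataFrame, treatment: str, covariates: list,
                 caliper: float = 0.05) -> pd.DataFrame:
    # Propensity score: probability of group membership given the covariates
    ps = LogisticRegression(max_iter=1000).fit(
        df[covariates], df[treatment]).predict_proba(df[covariates])[:, 1]
    df = df.assign(ps=ps)
    treated = df[df[treatment] == 1]
    control = df[df[treatment] == 0].copy()
    matched_idx = []
    for idx, row in treated.iterrows():
        if control.empty:
            break
        dist = (control["ps"] - row["ps"]).abs()
        best = dist.idxmin()
        if dist[best] <= caliper:          # accept only matches within caliper
            matched_idx += [idx, best]
            control = control.drop(best)   # without replacement
    return df.loc[matched_idx]
```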
Results: Mechanical ventilation and pulmonary failure were identified as independent risk factors for increased short-term mortality. Among matched ACLF patients, those with pulmonary failure showed the highest 28-day mortality (83.7%), whereas mortality rates in ACLF with mechanical ventilation (67.3%) and ACLF without pulmonary impairment (38.8%) were considerably lower (p < .001). Especially in patients with pulmonary impairment, the CLIF-C ACLF score showed poor predictive accuracy; adjusting it for the grade of pulmonary impairment improved prediction significantly.
Conclusions: This study highlights that not only pulmonary failure but also mechanical ventilation is associated with worse prognosis in ACLF patients. The grade of pulmonary impairment should be considered in the risk assessment in ACLF patients. The new score may be useful in the selection of patients for liver transplantation.
Purpose: The primary treatment goals for advanced-stage thumb carpometacarpal (CMC) joint osteoarthritis are complete pain relief and restoration of thumb strength. The purpose of the present study was to introduce a variation of the abductor pollicis longus (APL) suspension arthroplasty using a single looping of a radial slip from the APL tendon around the flexor carpi radialis (FCR) tendon combined with RegJoint™ interposition and to determine its efficacy in the treatment of thumb CMC joint osteoarthritis.
Methods: Between 2015 and 2017, 21 patients were included. The average age was 60.8 years (range 48–79). The mean follow-up was 27.7 months (range 8–50). Evaluation included pain, radial and palmar abduction, tip pinch and grip strength, and Disabilities of the Arm, Shoulder, and Hand (DASH) score.
Results: Pain averaged 0.3 (range 0–4) at rest and 1.4 (range 0–4) on exertion. Radial and palmar abduction reached 97% and 99% of the contralateral side, respectively. Tip pinch and grip strength were 4.1 kg (range 3–6.5) and 22 kg (range 13.3–40), respectively. The DASH score averaged 18.5 (range 0.8–41.7).
Conclusion: The modified APL suspension interposition arthroplasty was an efficient and simplified option for the treatment of thumb CMC joint osteoarthritis, with results comparable to or better than those of other published procedures. The APL suspension technique was easy to perform, avoiding difficult bone tunneling and incision of the FCR tendon. The RegJoint™ interposition spacer prevented impingement of the first metacarpal base on the second metacarpal base or the trapezoid bone.
Background: The combination of intermediate-dose cytarabine plus mitoxantrone (IMA) can induce high complete remission rates with acceptable toxicity in elderly patients with acute myeloid leukemia (AML). We present the final results of a randomized controlled trial comparing IMA with the standard 7 + 3 induction regimen consisting of continuous-infusion cytarabine plus daunorubicin (DA).
Patients and methods: Patients >60 years with newly diagnosed AML were randomized to receive either intermediate-dose cytarabine (1000 mg/m² twice daily on days 1, 3, 5, and 7) plus mitoxantrone (10 mg/m² on days 1–3) (IMA) or standard induction therapy with cytarabine (100 mg/m² continuously on days 1–7) plus daunorubicin (45 mg/m² on days 3–5) (DA). Patients in complete remission after DA received intermediate-dose cytarabine plus amsacrine as consolidation treatment, whereas patients in complete remission after IMA were consolidated with standard-dose cytarabine plus mitoxantrone.
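As a quick plausibility check of the term "dose escalation" used in the Conclusion, the cumulative induction cytarabine doses implied by the two regimens above can be computed directly (a simple back-of-the-envelope sketch; values taken from the protocol quoted above):

```python
# Cumulative cytarabine dose per induction course, computed from the
# regimens quoted above (all values in mg/m^2 of body surface area).
ima_cytarabine = 1000 * 2 * 4  # twice daily on days 1, 3, 5 and 7 -> 8 doses
da_cytarabine = 100 * 7        # continuous infusion, days 1-7
print(f"IMA: {ima_cytarabine} mg/m^2 vs. DA: {da_cytarabine} mg/m^2")
# IMA: 8000 mg/m^2 vs. DA: 700 mg/m^2
```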
Results: Between February 2005 and October 2009, 485 patients were randomized, 241 to treatment arm DA and 244 to IMA; 76% of patients were >65 years. The complete response rate after DA was 39% [95% confidence interval (95% CI): 33–45] versus 55% (95% CI: 49–61) after IMA (odds ratio 1.89, P = 0.001). The 6-week early-death rate was 14% in both arms. Relapse-free survival curves were superimposable in the first year but separated afterwards, resulting in 3-year relapse-free survival rates of 29% versus 14% in the DA and IMA arms, respectively (P = 0.042). The median overall survival was 10 months in both arms (P = 0.513).
Conclusion: Dose escalation of cytarabine in induction therapy led to improved remission rates in elderly AML patients. This did not translate into a survival advantage, most likely owing to differences in consolidation treatment. Thus, effective consolidation strategies need to be explored further. In combination with an effective consolidation strategy, the use of intermediate-dose cytarabine in induction may improve curative treatment for elderly AML patients.
Small molecule biomarker discovery: Proposed workflow for LC-MS-based clinical research projects
(2023)
Mass spectrometry focusing on small endogenous molecules has become an integral part of biomarker discovery in the pursuit of an in-depth understanding of the pathophysiology of various diseases, ultimately enabling the application of personalized medicine. While LC-MS methods allow researchers to gather vast amounts of data from hundreds or thousands of samples, the successful execution of a study as part of clinical research also requires knowledge transfer with clinicians, involvement of data scientists, and interactions with various stakeholders.
The initial planning phase of a clinical research project involves specifying the scope and design, and engaging relevant experts from different fields. Enrolling subjects and designing trials rely largely on the overall objective of the study and epidemiological considerations, while proper pre-analytical sample handling has immediate implications on the quality of analytical data. Subsequent LC-MS measurements may be conducted in a targeted, semi-targeted, or non-targeted manner, resulting in datasets of varying size and accuracy. Data processing further enhances the quality of data and is a prerequisite for in-silico analysis. Nowadays, the evaluation of such complex datasets relies on a mix of classical statistics and machine learning applications, in combination with other tools, such as pathway analysis and gene set enrichment. Finally, results must be validated before biomarkers can be used as prognostic or diagnostic decision-making tools. Throughout the study, quality control measures should be employed to enhance the reliability of data and increase confidence in the results.
The aim of this graphical review is to provide an overview of the steps to be taken when conducting an LC-MS-based clinical research project to search for small molecule biomarkers.
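To make the data-evaluation step sketched above more concrete, the following minimal example shows a typical cross-validated classification analysis of an LC-MS feature matrix. It uses simulated data and is a generic illustration, not a pipeline prescribed by the review.

```python
# Illustrative sketch of an in-silico evaluation step: an LC-MS feature
# matrix (samples x metabolites) is log-transformed, scaled, and assessed
# with a cross-validated classifier. Simulated data; generic example only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.lognormal(mean=2.0, sigma=1.0, size=(60, 200))  # peak intensities
y = rng.integers(0, 2, 60)                              # case/control labels

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, np.log2(X), y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f}")  # ~0.5 for random labels
```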
Different treatment options exist for acetabular fractures in the elderly and in nonagenarians; a consistent guideline has not yet been established. The purpose of this study is to give an overview of how these fractures can be managed and to compare two different surgical treatment methods.
A total of 89 patients ≥ 18 years with acetabular fractures treated in our department between 2016 and 2021 received surgical intervention, either plate fixation via the Stoppa approach or total hip arthroplasty with a Burch–Schneider ring and integrated cup. Sixty patients ≥ 65 years were compared in two groups: 29 patients aged 65–79 years and 31 patients ≥ 80 years. For comparison, data on operation times, hospitalization, complications during the operation and hospital stay, blood loss, and postoperative mobilization were collected.
Characteristic indications for operative osteosynthesis or endoprosthetics could be identified based on the X-ray analysis. There was a tendency to treat simple fractures with osteosynthesis. Patients between 65 and 79 years treated with osteosynthesis had benefits in almost every comparison. Patients ≥ 80 years with plate fixation had advantages in the categories of postoperative complications, blood loss, and transfusion of erythrocyte concentrates. Statistically significant differences were noted in both groups regarding operation time. Patients between 65 and 79 years with osteosynthesis had significant benefits regarding postoperative complications, hospitalization, number of blood transfusions, and postoperative mobilization.
Finding the best treatment option is difficult, and decision-making must respect fracture patterns and individual risk factors. This study shows that plate fixation via the Stoppa approach has some benefits.
Background & Aims: Spontaneous portosystemic shunts (SPSS) frequently develop in liver cirrhosis. Recent data suggested that the presence of a single large SPSS is associated with complications, especially overt hepatic encephalopathy (oHE). However, the presence of >1 SPSS is common. This study evaluates the impact of total cross-sectional SPSS area (TSA) on outcomes in patients with liver cirrhosis.
Methods: In this retrospective international multicentric study, CT scans of 908 cirrhotic patients with SPSS were evaluated for TSA. Clinical and laboratory data were recorded. The radius of each detected SPSS was measured and the TSA calculated. One-year survival was the primary endpoint, and acute decompensation (oHE, variceal bleeding, ascites) was the secondary endpoint.
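The TSA computation described here reduces to summing the circular cross-sections implied by the measured radii. A minimal sketch follows (the function name is illustrative; the 83 mm² cut-off is the one reported in the Results):

```python
# Minimal sketch of the TSA computation: each detected shunt's radius is
# converted to a cross-sectional area and the areas are summed.
import math

def total_shunt_area(radii_mm: list) -> float:
    """Total cross-sectional SPSS area (mm^2) from per-shunt radii (mm)."""
    return sum(math.pi * r ** 2 for r in radii_mm)

# Example: two shunts with radii 4 mm and 3 mm
tsa = total_shunt_area([4.0, 3.0])
print(f"TSA = {tsa:.1f} mm^2 -> {'L-TSA' if tsa > 83 else 'S-TSA'}")
# TSA = 78.5 mm^2 -> S-TSA
```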
Results: A total of 301 patients (169 male) were included in the training cohort. Thirty percent of all patients presented with >1 SPSS. A TSA cut-off of 83 mm² was used to classify patients as having a small or large TSA (S-/L-TSA). Patients with L-TSA presented with a higher model for end-stage liver disease score (11 vs. 14) and more commonly had a history of oHE (12% vs. 21%, p <0.05). During follow-up, patients with L-TSA experienced more oHE episodes (33% vs. 47%, p <0.05) and had lower 1-year survival than those with S-TSA (84% vs. 69%, p <0.001). Multivariate analysis identified L-TSA (hazard ratio 1.66; 95% CI 1.02–2.70, p <0.05) as an independent predictor of mortality. An independent multicentric validation cohort of 607 patients confirmed that patients with L-TSA had lower 1-year survival (77% vs. 64%, p <0.001) and more oHE development (35% vs. 49%, p <0.001) than those with S-TSA.
Conclusion: This study suggests that a TSA >83 mm² increases the risk of oHE and mortality in patients with cirrhosis. Our results support the clinical use of TSA/SPSS for risk stratification and decision-making in the management of patients with cirrhosis.
Lay summary: The prevalence of spontaneous portosystemic shunts (SPSS) is higher in patients with more advanced chronic liver disease. The presence of more than 1 SPSS is common in advanced chronic liver disease and is associated with the development of hepatic encephalopathy. This study shows that total cross-sectional SPSS area (rather than diameter of the single largest SPSS) predicts survival in patients with advanced chronic liver disease. Our results support the clinical use of total cross-sectional SPSS area for risk stratification and decision-making in the management of SPSS.
Background & Aims: Acute‐on‐chronic liver failure (ACLF) is characterized by high short‐term mortality and systemic inflammation (SI). Recently, different cardiodynamic states were shown to independently predict outcomes in cirrhosis. The relationship between cardiodynamic states, SI, and portal hypertension and their impact on ACLF development remains unclear. The aim of this study was therefore to evaluate the interplay of cardiodynamic state and SI on fatal ACLF development in cirrhosis.
Methods and Results: At inclusion, hemodynamic measures, including cardiac index (CI) and hepatic venous pressure gradient, were obtained in 208 patients. Patients were followed prospectively for fatal ACLF development (primary endpoint). SI was assessed by proinflammatory markers such as interleukins (ILs) 6 and 8 and soluble IL-33 receptor (sIL-33R). Patients were divided according to CI (<3.2; 3.2–4.2; >4.2 L/min/m²) into hypodynamic (n = 84), normodynamic (n = 69), and hyperdynamic (n = 55) groups. After a median follow-up of 3 years, the highest risk of fatal ACLF was seen in hyperdynamic (35%) and hypodynamic (25%) patients compared with normodynamic patients (14%) (P = .011). Hyperdynamic patients showed the highest rate of SI. A detectable IL-6 level was an independent predictor of fatal ACLF development.
Conclusions: Cirrhotic patients with hyperdynamic or hypodynamic circulation have a higher risk of fatal ACLF. Furthermore, the cardiodynamic state is strongly associated with SI, which is an independent predictor of fatal ACLF development.