Background: Clinical practice guidelines for patients with primary biliary cholangitis (PBC) have recently been revised to implement well-established response criteria for standard first-line ursodeoxycholic acid (UDCA) therapy at 12 months after treatment initiation, allowing early identification of high-risk patients with an inadequate treatment response who may require treatment modification. However, only very limited data are available on the real-world clinical management of patients with PBC in Germany.
Objective: The aim of this retrospective multicenter study was to evaluate response rates to standard first-line UDCA therapy and subsequent second-line treatment regimens in a large cohort of well-characterized patients with PBC from 10 independent hepatological referral centers in Germany, prior to the introduction of obeticholic acid as a licensed second-line treatment option.
Methods: Diagnostic confirmation of PBC, standard first-line UDCA treatment regimens, and response rates at 12 months according to Paris-I, Paris-II, and Barcelona criteria, the follow-up cut-off alkaline phosphatase (ALP) ≤ 1.67 × upper limit of normal (ULN), and normalization of bilirubin (bilirubin ≤ 1 × ULN) were retrospectively examined between June 1986 and March 2017. The management of patients with an inadequate response to UDCA, the second-line treatment regimens applied to date, and the subsequent response rates at 12 months were also evaluated.
Results: Overall, 480 patients with PBC were included in this study. The median UDCA dosage was 13.2 mg UDCA/kg bodyweight (BW)/d. Adequate UDCA treatment responses according to Paris-I, Paris-II, and Barcelona criteria were observed in 91%, 71.3%, and 61.3% of patients, respectively. ALP ≤ 1.67 × ULN was achieved in 83.8% of patients. A total of 116 patients (24.2%) showed an inadequate response to UDCA according to at least one criterion. The diverse second-line treatment regimens applied led to significantly higher response rates according to Paris-II (35 vs. 60%, p = 0.005), Barcelona (13 vs. 34%, p = 0.0005), and ALP ≤ 1.67 × ULN together with bilirubin ≤ 1 × ULN (52.1 vs. 75%, p = 0.002). The addition of bezafibrate appeared to induce the strongest beneficial effect in this cohort (Paris-II: 24 vs. 74%, p = 0.004; Barcelona: 50 vs. 84%, p = 0.046; ALP ≤ 1.67 × ULN and bilirubin ≤ 1 × ULN: 33 vs. 86%, p = 0.001).
Conclusion: Our large retrospective multicenter study confirms high response rates to standard first-line UDCA treatment in patients with PBC and highlights the need for close monitoring and early treatment modification in high-risk patients with an insufficient response to UDCA, since early treatment modification significantly increases the subsequent response rates of these patients.
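The follow-up cut-off used in this study (ALP ≤ 1.67 × ULN together with normal bilirubin) is a simple arithmetic rule on two lab values. A minimal sketch, with a hypothetical helper name and hypothetical lab values for illustration only:

```python
def adequate_udca_response(alp, bilirubin, alp_uln, bili_uln):
    """Check the follow-up cut-off used in the study:
    ALP <= 1.67 x ULN and bilirubin <= 1 x ULN.
    All values and ULNs are in the same units (e.g. U/l, mg/dl)."""
    return alp <= 1.67 * alp_uln and bilirubin <= bili_uln

# Hypothetical lab values (illustration only):
print(adequate_udca_response(alp=180, bilirubin=0.9,
                             alp_uln=120, bili_uln=1.2))  # True
```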
Using granular supervisory data from Germany, we investigate the impact of unconventional monetary policies via central banks’ purchase of corporate bonds. While this policy results in a loosening of credit market conditions as intended by policy makers, we document two unintended side effects. First, banks that are more exposed to borrowers benefiting from the bond purchases now lend more to high-risk firms with no access to bond markets. Since more loan write-offs arise from these firms and banks are not compensated for this risk by higher interest rates, we document a drop in bank profitability. Second, the policy affects the allocation of loans among industries: affected banks reallocate loans from investment-grade firms active on bond markets mainly to real estate firms without an investment-grade rating. Overall, our findings suggest that central banks’ quantitative easing via the corporate bond markets has the potential to contribute to both banking sector instability and real estate bubbles.
The transcription factor Meis1 drives myeloid leukemogenesis in the context of Hox gene overexpression but is currently considered undruggable. We therefore investigated whether myeloid progenitor cells transformed by Hoxa9 and Meis1 become addicted to targetable signaling pathways. A comprehensive (phospho)proteomic analysis revealed that Meis1 increased Syk protein expression and activity. Syk upregulation occurs through a Meis1-dependent feedback loop. By dissecting this loop, we show that Syk is a direct target of miR-146a, whose expression is indirectly regulated by Meis1 through the transcription factor PU.1. In the context of Hoxa9 overexpression, Syk signaling induces Meis1, recapitulating several leukemogenic features of Hoxa9/Meis1-driven leukemia. Finally, Syk inhibition disrupts the identified regulatory loop, prolonging survival of mice with Hoxa9/Meis1-driven leukemia.
Background: IL28B gene polymorphism is the best baseline predictor of response to interferon alfa-based antiviral therapies in chronic hepatitis C. Recently, a new IFN-L4 polymorphism was identified as the first potential functional variant for the induction of IL28B expression. Individualization of interferon alfa-based therapies based on a combination of IL28B/IFN-L4 polymorphisms may help to optimize virologic outcome and the use of economic resources.
Methods: Optimization of treatment outcome prediction was assessed by combination of different IL28B and IFN-L4 polymorphisms in patients with chronic HCV genotype 1 (n = 385), 2/3 (n = 267), and 4 (n = 220) infection treated with pegylated interferon alfa (PEG-IFN) and ribavirin with (n = 79) or without telaprevir. Healthy people from Germany (n = 283) and Egypt (n = 96) served as controls.
Results: Frequencies of beneficial IL28B rs12979860 C/C genotypes were lower in HCV genotype 1/4 infected patients than in controls (20–35% vs. 46–47%); the same was true for ss469415590 TT/TT (20–35% vs. 45–47%). Single interferon-lambda SNPs (rs12979860, rs8099917, ss469415590) correlated with sustained virologic response (SVR) in genotype 1, 3, and 4 infected patients, while no association was observed for genotype 2. Interestingly, in genotype 3 infected patients, the best SVR prediction was based on the IFN-L4 genotype. Prediction of SVR with high accuracy (71–96%) was possible in genotype 1, 2, 3, and 4 infected patients who received PEG-IFN/ribavirin combination therapy by selecting for the beneficial IL28B rs12979860 C/C and/or ss469415590 TT/TT genotypes (p<0.001). For triple therapy with first-generation protease inhibitors (PIs) (boceprevir, telaprevir), prediction of high SVR rates (90%) was based on the presence of at least one beneficial genotype among the 3 IFN-lambda SNPs.
Conclusion: IFN-L4 appears to be the best single predictor of SVR in genotype 3 infected patients. For optimized prediction of SVR under dual combination therapy or first-generation PI triple therapy, grouping of interferon-lambda haplotypes may be helpful, with positive predictive values of 71–96%.
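The positive predictive values quoted above follow from simple counts of genotype-predicted responders versus actual responders. A minimal sketch, with a hypothetical function name and hypothetical counts for illustration only:

```python
def positive_predictive_value(true_pos, false_pos):
    """PPV = TP / (TP + FP): the fraction of patients predicted to
    respond (e.g. carrying a beneficial IFN-lambda genotype) who
    actually achieve SVR."""
    return true_pos / (true_pos + false_pos)

# Hypothetical counts (illustration only): of 96 predicted
# responders, 77 achieved SVR.
ppv = positive_predictive_value(77, 19)
print(round(ppv, 2))  # 0.8
```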
Background & Aims: Genetic variations near the interferon lambda 3 gene (IFNL3, IL28B) are the most powerful predictors for sustained virologic response (SVR) in patients with chronic hepatitis C virus (HCV) infection, compared to other biochemical or histological baseline parameters. We evaluated whether the interplay of both IFNL3 polymorphisms rs12979860 and rs8099917 together with non-genetic clinical factors contributes to the predictive role of these genetic variants.
Methods: The cohort comprised 1,402 patients of European descent with chronic HCV type 1 infection. 1,298 patients received interferon-based antiviral therapy, and 719 (55%) achieved SVR. The IFNL3 polymorphisms were genotyped by polymerase chain reaction and melting curve analysis.
Results: A significant correlation was found between the IFNL3 polymorphisms and biochemical as well as virologic predictors of treatment outcome such as ALT, GGT, cholesterol, and HCV RNA levels. In multivariate regression analysis, the IFNL3 SNPs, HCV RNA level, and the GGT/ALT ratio were independent predictors of SVR. Depending on the GGT/ALT ratio and on the HCV RNA concentration, significant variations in the likelihood of achieving SVR were observed in carriers of both responder and non-responder alleles.
Conclusions: Our data support a clear association between IFNL3 genotypes and baseline parameters known to impact interferon responsiveness. Improved treatment outcome prediction was achieved when these predictors were considered in combination with the IFNL3 genotype.
Background: Vitamin D insufficiency has been associated with the occurrence of various types of cancer, but causal relationships remain elusive. We therefore aimed to determine the relationship between genetic determinants of vitamin D serum levels and the risk of developing hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC).
Methodology/Principal Findings: Associations between CYP2R1, GC, and DHCR7 genotypes that are determinants of reduced 25-hydroxyvitamin D (25[OH]D3) serum levels and the risk of HCV-related HCC development were investigated in 1279 chronic hepatitis C patients with HCC and 4325 without HCC. The well-known associations between CYP2R1 (rs1993116, rs10741657), GC (rs2282679), and DHCR7 (rs7944926, rs12785878) genotypes and 25(OH)D3 serum levels were also apparent in patients with chronic hepatitis C. The same genotypes of these single nucleotide polymorphisms (SNPs) that are associated with reduced 25(OH)D3 serum levels were found to be associated with HCV-related HCC (P = 0.07 [OR = 1.13, 95% CI = 0.99–1.28] for CYP2R1, P = 0.007 [OR = 1.56, 95% CI = 1.12–2.15] for GC, P = 0.003 [OR = 1.42, 95% CI = 1.13–1.78] for DHCR7; ORs for risk genotypes). In contrast, no association between these genetic variations and liver fibrosis progression rate (P>0.2 for each SNP) or outcome of standard therapy with pegylated interferon-α and ribavirin (P>0.2 for each SNP) was observed, suggesting a specific influence of the genetic determinants of 25(OH)D3 serum levels on hepatocarcinogenesis.
Conclusions/Significance: Our data suggest a relatively weak but functionally relevant role for vitamin D in the prevention of HCV-related hepatocarcinogenesis.
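The genotype–HCC associations above are reported as odds ratios with 95% confidence intervals. A minimal sketch of how an OR and its log-based (Woolf) CI are computed from a 2×2 table of genotype counts; the function name and counts are hypothetical, for illustration only:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
    a = cases with the risk genotype,    b = cases without,
    c = controls with the risk genotype, d = controls without.
    The CI uses the standard error of the log-OR (Woolf method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts (illustration only):
or_, lower, upper = odds_ratio_ci(120, 1159, 300, 4025)
```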
In-depth analyses of cancer cell proteomes are needed to elucidate oncogenic pathomechanisms, as well as to identify potential drug targets and diagnostic biomarkers. However, methods for quantitative proteomic characterization of patient-derived tumors and in particular their cellular subpopulations are largely lacking. Here we describe an experimental set-up that allows quantitative analysis of proteomes of cancer cell subpopulations derived from either liquid or solid tumors. This is achieved by combining cellular enrichment strategies with quantitative Super-SILAC-based mass spectrometry followed by bioinformatic data analysis. To enrich specific cellular subsets, liquid tumors are first immunophenotyped by flow cytometry followed by FACS-sorting; for solid tumors, laser-capture microdissection is used to purify specific cellular subpopulations. In a second step, proteins are extracted from the purified cells and subsequently combined with a tumor-specific, SILAC-labeled spike-in standard that enables protein quantification. The resulting protein mixture is subjected to either gel electrophoresis or Filter Aided Sample Preparation (FASP) followed by tryptic digestion. Finally, tryptic peptides are analyzed using a hybrid quadrupole-orbitrap mass spectrometer, and the data obtained are processed with bioinformatic software suites including MaxQuant. By means of the workflow presented here, up to 8,000 proteins can be identified and quantified in patient-derived samples, and the resulting protein expression profiles can be compared among patients to identify diagnostic proteomic signatures or potential drug targets.
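The Super-SILAC quantification step in the workflow above reduces to ratios against the shared heavy spike-in standard: because every sample is mixed with the same standard, light/heavy ratios are directly comparable across patients (a ratio of ratios). A minimal sketch of that idea, with hypothetical helper names, a hypothetical protein, and made-up intensities; real pipelines read these values from MaxQuant output tables:

```python
def silac_ratio(light_intensity, heavy_intensity):
    """Relative abundance of a protein: light (patient-derived)
    signal over the heavy Super-SILAC spike-in standard."""
    return light_intensity / heavy_intensity

def fold_change(sample_a, sample_b, protein):
    """Ratio of ratios: both samples are quantified against the same
    spike-in standard, so their L/H ratios are directly comparable."""
    return silac_ratio(*sample_a[protein]) / silac_ratio(*sample_b[protein])

# Hypothetical (light, heavy) intensities for one protein in two patients:
patient1 = {"PROT1": (4.0e6, 2.0e6)}  # L/H = 2.0
patient2 = {"PROT1": (1.0e6, 2.0e6)}  # L/H = 0.5
print(fold_change(patient1, patient2, "PROT1"))  # 4.0
```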
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells is the mainstay of compensating for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be available in real time. Blood Gas Analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes compared to central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing as compared to CL services. Methods: Data of the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series®, and RapidPoint 500®) against CL as the reference method. For that, we analyzed pairs of Hb levels measured by CL and BGA within two hours. Furthermore, the impact of various confounding factors, including age, gender, BMI, smoker status, transfusion of RBC, intraoperative hemodilution, and co-medication, was elucidated. To ensure adequate statistical analysis, only data from participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and GEM series® (−0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (−0.09 g/dl) but a greater median absolute deviation (± 0.45 g/dl). To avoid interference from the different standard deviations of the analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and a LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms.
Smoker status showed the greatest reduction in bias (0.1 g/dl, p<0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p<0.001), statins (0.04 g/dl, p<0.05), and beta blockers (0.03 g/dl, p = 0.02) had a slight effect on the bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions, such as the substitution of fluids, coagulation factors, or RBC units, rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference from CL, our data confirm that different BGA devices are associated with different biases. We therefore suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that the bias decreased at smaller Hb values, which in turn are the values used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the performance of the LIBERAL trial.
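The mean bias and its spread reported above come from a Bland-Altman analysis of paired measurements: the bias is the mean of the per-pair differences, and the 95% limits of agreement are the bias ± 1.96 standard deviations. A minimal sketch, with a hypothetical function name and made-up Hb pairs for illustration only:

```python
import statistics

def bland_altman(bga, cl):
    """Mean bias and 95% limits of agreement between paired
    hemoglobin measurements (BGA device minus central laboratory)."""
    diffs = [b - c for b, c in zip(bga, cl)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical Hb pairs in g/dl (illustration only):
bga = [9.8, 11.2, 12.5, 10.1, 13.0]
cl = [10.2, 11.5, 12.9, 10.6, 13.3]
bias, lower, upper = bland_altman(bga, cl)
print(round(bias, 2))  # -0.38
```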
Maintenance therapy after allogeneic hematopoietic stem cell transplantation (HSCT) for acute myeloid leukemia (AML) or myelodysplastic syndrome (MDS) is conceptually attractive to prevent relapse, but has been hampered by the limited number of suitable anti-leukemic agents. The deacetylase inhibitor (DACi) panobinostat demonstrated moderate anti-leukemic activity in a small subset of patients with advanced AML and high-risk MDS in phase I/II trials [1, 2]. It also displays immunomodulatory activity [3] that may enhance leukemia-specific cytotoxicity [4] and mitigate graft-versus-host disease (GvHD), but conversely could impair T- and NK-cell function [5, 6]. We conducted this open-label, multi-center phase I/II trial (NCT01451268) to assess the feasibility and preliminary efficacy of prolonged prophylactic administration of panobinostat after HSCT for AML or MDS. The study protocol was approved by an independent ethics committee and conducted in compliance with the Declaration of Helsinki. All patients provided written informed consent. ...
Background: Different parameters have been determined for the prediction of treatment outcome in hepatitis C virus genotype 1 infected patients undergoing pegylated interferon/ribavirin combination therapy. Results on the importance of vitamin D levels are conflicting. In the present study, a comprehensive analysis of vitamin D levels before and during therapy, together with single nucleotide polymorphisms involved in vitamin D metabolism, was performed in the context of other known treatment predictors.
Methods: In a well-characterized prospective cohort of 398 genotype 1 infected patients treated with pegylated interferon-α and ribavirin for 24–72 weeks (INDIV-2 study), 25-OH-vitamin D levels and different single nucleotide polymorphisms were analyzed together with known biochemical parameters for a correlation with virologic treatment outcome.
Results: Fluctuations of more than 5 ng/ml (10 ng/ml) in 25-OH-vitamin D levels were observed in 66% (39%) of patients during the course of antiviral therapy, and neither pretreatment nor on-treatment 25-OH-vitamin D levels were associated with treatment outcome. The DHCR7 TT polymorphism within the 7-dehydrocholesterol reductase gene showed a significant association (P = 0.031) with sustained viral response in univariate analysis. Among the numerous further parameters analyzed, we found that age (OR = 1.028, CI = 1.002–1.056, P = 0.035), cholesterol (OR = 0.983, CI = 0.975–0.991, P<0.001), ferritin (OR = 1.002, CI = 1.000–1.004, P = 0.033), gGT (OR = 1.467, CI = 1.073–2.006, P = 0.016), and IL28B genotype (OR = 2.442, CI = 1.271–4.695, P = 0.007) constituted the strongest predictors of treatment response.
Conclusions: While 25-OH-vitamin D levels show considerable variation during the long-lasting course of antiviral therapy, they do not show any significant association with treatment outcome in genotype 1 infected patients.