Fasting during Ramadan is known to influence patients' medication adherence, but data on patients' oral anticoagulant (OAC) intake behavior during Ramadan are lacking. We aimed to determine patient-guided modifications of the OAC medication regimen during Ramadan and to evaluate their consequences. This multicenter cross-sectional study was conducted in Saudi Arabia; data were collected shortly after Ramadan 2019. Participants were patients who fasted during Ramadan and were on long-term anticoagulation. Patient-guided medication changes during Ramadan, relative to the regular intake schedule before Ramadan, were recorded, and modification behavior was compared between twice-daily (BID) and once-daily (QD) treatment regimens. Rates of hospital admission during Ramadan were determined. We included 808 patients. During Ramadan, 53.1% modified their intake schedule (31.1% adjusted intake time, 13.2% skipped intakes, 2.2% took double doses). Patient-guided modification was more frequent among patients on a BID regimen than among those on a QD regimen. During Ramadan, 11.3% of patients were admitted to hospital, and patient-guided modification was a strong predictor of hospital admission. Patient-guided modification of OAC intake during Ramadan is thus common, particularly in patients on a BID regimen, and it increases the risk of hospital admission during Ramadan. Planning of OAC intake during Ramadan and patient education on the risks of poor adherence are advisable.
Searching for new strategies to trigger apoptosis in rhabdomyosarcoma (RMS), we investigated the effect of two novel classes of apoptosis-targeting agents: monoclonal antibodies against TNF-related apoptosis-inducing ligand (TRAIL) receptor 1 (mapatumumab) and TRAIL receptor 2 (lexatumumab), and small-molecule inhibitors of inhibitor of apoptosis (IAP) proteins. Here, we report that IAP inhibitors cooperated with lexatumumab, but not with mapatumumab, to reduce cell viability and to induce apoptosis in several RMS cell lines in a highly synergistic manner (combination index <0.1). Cotreatment-induced apoptosis was accompanied by enhanced activation of caspase-8, -9, and -3; loss of mitochondrial membrane potential; and caspase-dependent apoptosis. In addition, IAP inhibitor and lexatumumab cooperated to stimulate the assembly of a cytosolic complex containing RIP1, FADD, and caspase-8. Importantly, knockdown of RIP1 by RNA interference prevented the formation of the RIP1·FADD·caspase-8 complex and inhibited subsequent activation of caspase-8, -9, and -3; loss of mitochondrial membrane potential; and apoptosis upon treatment with IAP inhibitor and lexatumumab. In addition, RIP1 silencing rescued clonogenic survival of cells treated with the combination of lexatumumab and IAP inhibitor, thus underscoring the critical role of RIP1 in cotreatment-induced apoptosis. By comparison, the TNFα-blocking fusion protein Enbrel (etanercept) had no effect on IAP inhibitor/lexatumumab-induced apoptosis, indicating that an autocrine TNFα loop is dispensable. By demonstrating that IAP inhibitors and lexatumumab synergistically trigger apoptosis in a RIP1-dependent but TNFα-independent manner in RMS cells, our findings substantially advance our understanding of IAP inhibitor-mediated regulation of TRAIL-induced cell death.
Background & Aims: HBV genotype G (HBV/G) is mainly found in co-infections with other HBV genotypes and was identified as an independent risk factor for liver fibrosis. This study aimed to analyse the prevalence of HBV/G co-infections in healthy European HBV carriers and to characterize the crosstalk of HBV/G with other genotypes.
Methods: A total of 560 European HBV carriers were tested via HBV/G-specific PCR for HBV/G co-infections. Quasispecies distribution was analysed via deep sequencing, and the clinical phenotype was characterized with regard to qHBsAg and HBV-DNA levels and frequent mutations. Replicative capacity and expression of HBsAg/core were studied in hepatoma cells co-expressing HBV/G with either HBV/A, HBV/D or HBV/E using bicistronic vectors.
Results: Although no HBV/G co-infection was found by routine genotyping PCR, HBV/G was detected by specific PCR in 4%-8% of patients infected with either HBV/A or HBV/E, but only infrequently in other genotypes. In contrast to HBV/E, HBV/G was found as the quasispecies major variant in co-infections with HBV/A. No differences in the clinical phenotype were observed for HBV/G co-infections. In vitro, RNA and DNA levels were comparable among all genotypes, but expression and release of HBsAg were reduced upon co-expression of HBV/G with HBV/E. In co-expression with HBV/A and HBV/E, expression of HBV/G-specific core was enhanced, while core expression from the corresponding genotype was markedly diminished.
Conclusions: HBV/G co-infections are common in European inactive carriers with HBV/A and HBV/E infection, but reliable detection depends strongly on the assay used. HBV/G-regulated core expression might play a critical role in the survival of HBV/G in co-infections.
Transdiagnostic comparison of visual working memory capacity in bipolar disorder and schizophrenia
(2021)
Background: Impaired working memory is a core cognitive deficit in both bipolar disorder and schizophrenia. Its study might yield crucial insights into the underpinnings of both disorders at the cognitive and neurophysiological levels. Visual working memory capacity is a particularly promising construct for such translational studies, but it has not yet been investigated across the full spectrum of both disorders. The aim of our study was to compare the degree of reduction in visual working memory capacity in patients with bipolar disorder (PBD) and patients with schizophrenia (PSZ) using a paradigm well established in cognitive neuroscience.
Methods: 62 PBD, 64 PSZ, and 70 healthy controls (HC) completed a canonical visual change detection task. Participants had to encode the color of four circles and indicate after a short delay whether the color of one of the circles had changed or not. We estimated working memory capacity using Pashler’s K.
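For context, Pashler's K converts performance in this type of change detection task into an item-capacity estimate from the set size S (here, four circles), the hit rate H, and the false-alarm rate F. The formula below is the standard one from the cognitive literature and is not quoted from the abstract:

$$K = S \cdot \frac{H - F}{1 - F}$$

With S = 4, a participant who detects 80% of changes with a 20% false-alarm rate would be credited with K = 4 × (0.8 − 0.2)/(1 − 0.2) = 3 items.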
Results: Working memory capacity was significantly reduced in both PBD and PSZ compared to HC. We observed a small effect size (r = .202) for the difference between HC and PBD and a medium effect size (r = .370) for the difference between HC and PSZ. Working memory capacity in PSZ was also significantly reduced compared to PBD with a small effect size (r = .201). Thus, PBD showed an intermediate level of impairment.
Conclusions: These findings provide evidence for a gradient of reduced working memory capacity in bipolar disorder and schizophrenia, with PSZ showing the strongest degree of impairment. This underscores the importance of disturbed information processing for both bipolar disorder and schizophrenia. Our results are compatible with the cognitive manifestation of a neurodevelopmental gradient affecting bipolar disorder to a lesser degree than schizophrenia. They also highlight the relevance of visual working memory capacity for the development of both behavior- and brain-based transdiagnostic biomarkers.
Since the survival rates of pediatric patients undergoing cancer treatment or hematopoietic stem cell transplantation (HSCT) have increased rapidly in recent decades, the late effects of treatment are now an important focus of patient care. Access to fertility preservation (FP) procedures, as well as their financing, differs considerably across Europe. However, some European countries have recently changed the legal basis for financing FP procedures; implementing appropriate structures is therefore mandatory to give patients access to FP. In this prospective cohort study, we characterized the process of establishing pediatric fertility counseling, including the development of an in-house standard procedure for FP recommendations in the context of potentially gonadotoxic treatment, and evaluated data from all FP counseling sessions. All data concerning patient characteristics (pubertal status, disease group) and recommended FP measures were prospectively collected, and the adoption of FP measures was analyzed. Prior to the establishment of a structured FP process in our pediatric oncology and stem cell transplantation center, there was no standardized FP counseling. We demonstrate that, with an in-house standard procedure in place, it is possible to provide consistent yet individualized FP counseling to approximately 90% of patients facing gonadotoxic treatment; more than 200 patients were counseled between 2017 and 2019. This pilot approach could be adapted by other pediatric hematology, oncology, and stem cell transplantation centers to allow more standardized FP counseling for all patients facing gonadotoxic treatment.
Inhibition of the soluble epoxide hydrolase (sEH) has beneficial effects on vascular inflammation and hypertension, indicating that the enzyme may be a promising target for drug development. As the enzymatic core of the hydrolase domain of the human sEH contains two tyrosine residues (Tyr383 and Tyr466) that are theoretically crucial for enzymatic activity, we addressed the hypothesis that the activity of the sEH may be affected by nitrosative stress. Epoxide hydrolase activity was detected in human and murine endothelial cells as well as in HEK293 cells and could be inhibited by either authentic peroxynitrite (ONOO−) or the ONOO− generator 3-morpholinosydnonimine (SIN-1). Protection of the enzymatic core with 1-adamantyl-3-cyclohexylurea in vitro decreased sensitivity to SIN-1. Both ONOO− and SIN-1 elicited tyrosine nitration of the sEH protein, and mass spectrometry analysis of tryptic fragments revealed nitration of several tyrosine residues, including Tyr383 and Tyr466. Mutation of the latter residues to phenylalanine was sufficient to abrogate epoxide hydrolase activity. In vivo, streptozotocin-induced diabetes resulted in tyrosine nitration of the sEH in murine lungs and a significant decrease in its activity. Taken together, these data indicate that the activity of the sEH can be regulated by tyrosine nitration of the protein. Moreover, nitrosative stress would be expected to potentiate the physiological actions of arachidonic acid epoxides by preventing their metabolism to the corresponding diols.
Simple Summary: Currently, it is unclear which kind of axillary staging surgery breast cancer patients with lymph node metastasis should receive after neoadjuvant chemotherapy. For decades, these patients have been treated with full axillary lymph node dissection, even if they converted to clinical node negativity. However, the removal of a large number of lymph nodes during the procedure can increase arm morbidity and impair quality of life. Therefore, several studies have investigated less radical surgical strategies in this setting, such as sentinel lymph node biopsy or targeted axillary dissection, i.e., removal of a previously marked node combined with sentinel node removal. In this review, we summarize current evidence on the different surgical techniques and compare national and international recommendations. We show that many questions regarding the oncological safety of the different surgery types and the optimal marking technique remain unanswered, and we present the multinational prospective cohort study AXSANA, which will address these open issues.
Abstract: In the last two decades, surgical methods for axillary staging in breast cancer patients have become less extensive, and full axillary lymph node dissection (ALND) is confined to selected patients. In initially node-positive patients undergoing neoadjuvant chemotherapy, however, the optimal management remains unclear, and current guidelines vary widely, endorsing different strategies. We performed a literature review on axillary staging strategies and their place in international recommendations. This overview defines knowledge gaps associated with specific procedures, summarizes currently ongoing clinical trials that address these unsolved issues, and provides the rationale for further research. While some guidelines have already implemented surgical de-escalation, replacing ALND with, e.g., sentinel lymph node biopsy (SLNB) or targeted axillary dissection (TAD) in cN+ patients converting to clinical node negativity, others still recommend ALND. Numerous techniques are in use for tagging lymph node metastases, but many questions regarding the marking technique, i.e., the optimal time for marker placement and the number of marked nodes, remain unanswered. The optimal number of SLNs to be excised also remains a matter of debate, and data on oncological safety and quality of life following the different staging procedures are lacking. These knowledge gaps provide the rationale for the multinational prospective cohort study AXSANA initiated by EUBREAST, which started enrollment in June 2020 and aims to recruit 3000 patients in 20 countries (NCT04373655; funded by AGO-B, Claudia von Schilling Foundation for Breast Cancer Research, AWOgyn, EndoMag, Mammotome, and MeritMedical).
Patients with acute myeloid leukemia (AML) are often exposed to broad-spectrum antibiotics and are thus at high risk of Clostridioides difficile infection (CDI). As bacterial infections are a common cause of treatment-related mortality in these patients, we conducted a retrospective study to analyze the incidence of CDI and to evaluate risk factors for CDI in a large, uniformly treated AML cohort. A total of 415 AML patients undergoing intensive induction chemotherapy between 2007 and 2019 were included in this retrospective analysis. Patients presenting with diarrhea and positive stool testing for toxin-producing Clostridioides difficile were defined as having CDI. CDI was diagnosed in 37 (8.9%) of 415 AML patients, with lower CDI rates in 2013-2019 than in 2007-2012. Days with fever as well as exposure to carbapenems and glycopeptides were significantly associated with CDI in AML patients. Clinical endpoints such as length of hospital stay, admission to ICU, response rates, and survival were not adversely affected. We identified febrile episodes and exposure to carbapenems and glycopeptides as risk factors for CDI in AML patients undergoing induction chemotherapy, thereby highlighting the importance of interdisciplinary antibiotic stewardship programs guiding treatment strategies in AML patients with infectious complications to carefully balance the risks and benefits of anti-infective agents.
Treatment-related complications contribute substantially to morbidity and mortality in acute myeloid leukemia (AML) patients undergoing induction chemotherapy. Although AML patients are susceptible to fluid overload (FO) (e.g., in the context of chemotherapy protocols, during sepsis treatment, or for the prevention of tumor lysis syndrome), little attention has been paid to its role in AML patients undergoing induction chemotherapy. AML patients receiving induction chemotherapy between 2014 and 2019 were included in this study. FO was defined as ≥5% weight gain on day 7 of induction chemotherapy compared to the baseline weight determined on the day of admission. We found FO in 23 (12%) of 187 AML patients undergoing induction chemotherapy. Application of >100 ml of crystalloid fluid per kg body weight up to day 7 of induction chemotherapy was identified as an independent risk factor for FO. AML patients with FO suffered from a significantly increased 90-day mortality rate, and FO was demonstrated to be an independent risk factor for 90-day mortality. Our data suggest an individualized, weight-adjusted calculation of crystalloid fluids in order to prevent FO-related morbidity and mortality in AML patients during induction chemotherapy. Prospective trials are required to determine adequate fluid management in this patient population.
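As a minimal illustration of the FO definition and the crystalloid-volume threshold above, a short Python sketch (the function and variable names are illustrative, not taken from the study):

```python
def has_fluid_overload(admission_kg: float, day7_kg: float) -> bool:
    """FO per the study definition: >=5% weight gain on day 7 vs. admission weight."""
    return (day7_kg - admission_kg) / admission_kg >= 0.05

def high_crystalloid_exposure(total_crystalloid_ml: float, admission_kg: float) -> bool:
    """The reported independent risk factor: >100 ml crystalloid per kg through day 7."""
    return total_crystalloid_ml / admission_kg > 100.0

# Example: a patient admitted at 70 kg who weighs 74 kg on day 7 (~5.7% gain)
# meets the FO definition; 7500 ml of crystalloids (~107 ml/kg) exceeds the threshold.
print(has_fluid_overload(70.0, 74.0))           # True
print(high_crystalloid_exposure(7500.0, 70.0))  # True
```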
Acute kidney injury (AKI) complicates the clinical course of hospitalized patients by increasing the need for intensive care treatment and mortality. There are only limited data on its impact on AML patients undergoing intensive induction chemotherapy. In this study, we analyzed the incidence of and risk factors for AKI development and its impact on the clinical course of AML patients undergoing induction chemotherapy. We retrospectively analyzed data from 401 AML patients undergoing induction chemotherapy between 2007 and 2019. AKI was defined and stratified according to KDIGO criteria, referring to a defined baseline serum creatinine measured on day 1 of induction chemotherapy. Seventy-two of 401 (18%) AML patients suffered from AKI during induction chemotherapy. AML patients with AKI had more days with fever (7 vs. 5, p = 0.028) and were more often treated in the intensive care unit (45.8% vs. 10.6%, p < 0.001). AML patients with AKI had a significantly lower complete remission rate after induction chemotherapy and a significantly shorter median overall survival (OS) of 402 days (median OS was not reached for AML patients without AKI). In this study, we demonstrate that the KDIGO classification allows mortality risk stratification for AML patients undergoing induction chemotherapy. Even relatively mild AKI episodes affect the clinical course of these patients and can lead to chronic impairment of kidney function. We therefore recommend incorporating risk factors for AKI into decision-making regarding nutrition, fluid management, and the choice of potentially nephrotoxic medication in order to decrease the incidence of AKI.
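For orientation, the creatinine-based part of KDIGO staging, as it could be applied to a baseline value from day 1 of induction chemotherapy, can be sketched as follows. This is a simplified illustration, not the study's code: the urine-output criteria and the 48-hour window for the absolute-increase rule are omitted, and all names are hypothetical.

```python
def kdigo_stage(baseline_scr: float, current_scr: float, on_rrt: bool = False) -> int:
    """Simplified KDIGO AKI staging from serum creatinine (SCr, mg/dl) alone.

    Omits the urine-output criteria and the 48-hour timing rule for the
    0.3 mg/dl absolute increase; a full implementation must include both.
    Returns 0 (no AKI by SCr criteria) through 3.
    """
    ratio = current_scr / baseline_scr
    if on_rrt or ratio >= 3.0 or current_scr >= 4.0:
        return 3  # stage 3: >=3x baseline, SCr >=4.0 mg/dl, or renal replacement therapy
    if ratio >= 2.0:
        return 2  # stage 2: 2.0-2.9x baseline
    if ratio >= 1.5 or current_scr - baseline_scr >= 0.3:
        return 1  # stage 1: 1.5-1.9x baseline or >=0.3 mg/dl absolute increase
    return 0

# Example: baseline 0.8 mg/dl on day 1 of induction, later value 1.7 mg/dl
# -> ratio ~2.1 -> stage 2.
print(kdigo_stage(0.8, 1.7))  # 2
```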