Medizin
The incidence of FIX inhibitors in severe hemophilia B (SHB) is not well defined. Frequencies of 3-5% have been reported, but most studies to date were small, included patients with different disease severities, and lacked prospective follow-up for inhibitor incidence. The objective of this study was to investigate inhibitor incidence in patients with SHB followed for up to 500 exposure days (ED), the frequency of allergic reactions, and the relationship with genotype. Consecutive previously untreated patients (PUPs) with SHB enrolled into the PedNet cohort were included. Detailed data were collected for the first 50 ED, followed by annual collection of inhibitor status and allergic reactions. Presence of inhibitors was defined by at least two consecutive positive samples. Additionally, data on the factor IX gene mutation were collected. In total, 154 PUPs with SHB were included; 75% were followed until 75 ED and 43% until 500 ED. Inhibitors developed in 14 patients (7 high-titre). The median number of ED at inhibitor manifestation was 11 (IQR 6.5-36.5). Cumulative inhibitor incidence was 9.3% (95% CI 4.4-14.1) at 75 ED and 10.2% (95% CI 5.1-15.3) at 500 ED. Allergic reactions occurred in 4 (28.6%) inhibitor patients. Missense mutations were the most frequent mutation type overall (46.8%) but were not associated with inhibitors. Nonsense mutations and deletions with large structural changes accounted for all mutations among inhibitor patients and were associated with inhibitor risks of 26.9% and 33.3%, respectively. In this unselected, well-defined cohort of PUPs with SHB, the cumulative inhibitor incidence was 10.2% at 500 ED. Nonsense mutations and large deletions were strongly associated with the risk of inhibitor development. The PedNet Registry is registered at clinicaltrials.gov (identifier: NCT02979119).
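The cumulative incidence figures above are time-to-event estimates with exposure days (ED) as the time axis. As a minimal illustration of how such estimates are computed, the following Python sketch derives a Kaplan-Meier-style cumulative incidence from hypothetical per-patient follow-up data (the tuples below are invented for illustration and are not PedNet data):

def cumulative_incidence(follow_up, horizon):
    # follow_up: list of (exposure_days, inhibitor_developed) pairs, where
    # inhibitor_developed is True if an inhibitor manifested at that ED count
    # and False if the patient was censored while still inhibitor-free.
    event_times = sorted({t for t, ev in follow_up if ev and t <= horizon})
    survival = 1.0
    for t in event_times:
        at_risk = sum(1 for u, _ in follow_up if u >= t)   # still followed at t
        events = sum(1 for u, ev in follow_up if ev and u == t)
        survival *= 1.0 - events / at_risk
    return 1.0 - survival                                  # cumulative incidence

# Hypothetical cohort: three inhibitor cases, two censored patients.
patients = [(6, True), (11, True), (36, True), (75, False), (500, False)]
print(f"Cumulative incidence at 75 ED: {cumulative_incidence(patients, 75):.1%}")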
Introduction: In recent years, resource-saving handling of allogeneic blood products and a reduction of transfusion rates in adults have been observed. However, comparable published national data on transfusion practices in pediatric patients are currently not available. In this study, transfusion rates for children and adolescents were analyzed based on data from the Federal Statistical Office of Germany covering the past two decades. Methods: Data were queried via the database of the Federal Statistical Office (Destatis). The period covered was 2005 to 2018, and the sample group comprised children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions and for procedures or interventions with increased transfusion risk were queried and evaluated in detail. Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total collective of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4-year-olds (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were transfused most frequently (accounting for 40.2% of transfused children in inpatient care in 2018). Transfusion-associated procedures such as chemotherapy or mechanical ventilation and respiratory support for newborns and infants are on the rise. Conclusion: Overall, transfusion rates are declining in children and adolescents, but the reasons for the increases in transfusion rates in specific age groups are unclear. Prospective studies to evaluate transfusion rates and triggers in children are urgently needed.
Background: Plasma transfusions are most commonly used therapeutically for bleeding or prophylactically in non-bleeding patients prior to invasive procedures or surgery. Although plasma transfusion generally seems to be declining, plasma usage persists for indications that lack evidence of efficacy. Summary: There is wide international, interinstitutional, and interindividual variance in compliance with guidelines that are based on published references and supported by appropriate testing. Furthermore, for most indications, including massive bleeding, there is a profound lack of evidence from randomized controlled trials comparing the effect of plasma transfusion with that of other therapeutic interventions. The expected benefit of a plasma transfusion needs to be balanced carefully against the associated risk of adverse events. In light of the heterogeneous nature of bleeding conditions and their rapid evolution over time, fibrinogen and factor concentrate therapy, directed at specific phases of coagulation identified by alternative laboratory assays, may offer advantages over conventional blood product ratio-driven resuscitation. However, this outcome benefit has not been demonstrated in well-powered prospective trials. This systematic review details the current evidence base for plasma transfusion in adult surgical patients.
Management of decompensated cirrhosis is currently geared towards the treatment of complications once they occur. To date, there is no established disease-modifying therapy aimed at halting progression of the disease and preventing the development of complications in patients with decompensated cirrhosis. The design of clinical trials to investigate new therapies for patients with decompensated cirrhosis is complex. The population of patients with decompensated cirrhosis is heterogeneous (i.e., it spans different etiologies, comorbidities, and disease severities), leading to the inclusion of diverse populations in clinical trials. In addition, the primary endpoints selected for trials that include patients with decompensated cirrhosis are not homogeneous and at times may not be appropriate, which makes it difficult to compare results obtained from different trials. Against this background, the LiverHope Consortium organized a meeting of experts with the goal of developing recommendations for the design of clinical trials and defining appropriate endpoints, both for trials aimed at modifying the natural history and preventing progression of decompensated cirrhosis and for trials aimed at managing the individual complications of cirrhosis.
Objectives: An increasing number of treatment-determining biomarkers has been identified in non-small cell lung cancer (NSCLC) and molecular testing is recommended to enable optimal individualized treatment. However, data on implementation of these recommendations in the “real-world” setting are scarce. This study presents comprehensive details on the frequency, methodology and results of biomarker testing of advanced NSCLC in Germany.
Patients and methods: This analysis included 3,717 patients with advanced NSCLC (2,921 non-squamous; 796 squamous) recruited into the CRISP registry at the start of systemic therapy by 150 German sites between December 2015 and June 2019. The molecular biomarkers EGFR, ALK, ROS1, BRAF, KRAS, MET, TP53, RET, and HER2, as well as PD-L1 expression, were evaluated.
Results: In total, 90.5% of the patients were tested for biomarkers. Testing rates were 92.2% (non-squamous) and 70.7% (squamous), and increased from 83.2% in 2015/16 to 94.2% in 2019. Overall testing rates for EGFR, ALK, ROS1, and BRAF in non-squamous tumors were 72.5%, 74.5%, 66.1%, and 53.0%, respectively. Testing rates for PD-L1 expression were 64.5% (non-squamous) and 58.5% (squamous). The most common testing methods were immunohistochemistry (68.5% non-squamous, 58.3% squamous) and next-generation sequencing (38.7% non-squamous, 14.4% squamous). Reasons for not testing were insufficient tumor material or, in squamous tumors, the lack of a guideline recommendation. No alteration was found in 37.8% (non-squamous) and 57.9% (squamous) of tested patients. The most common alterations in non-squamous tumors (all patients/all patients tested for the respective biomarker) were KRAS (17.3%/39.2%), TP53 (14.1%/51.4%), and EGFR (11.0%/15.1%); in squamous tumors, they were TP53 (7.0%/69.1%), MET (1.5%/11.1%), and EGFR (1.1%/4.4%). Median PFS (non-squamous) was 8.7 months (95% CI 7.4–10.4) with druggable EGFR mutations and 8.0 months (95% CI 3.9–9.2) with druggable ALK alterations.
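To make the dual denominators explicit: each alteration frequency above is reported once per all patients and once per patients actually tested for that biomarker. A short Python sketch illustrates the arithmetic, using counts back-calculated from the reported non-squamous KRAS and EGFR percentages (approximate values for illustration only, not raw CRISP registry counts):

all_patients = 2921                       # non-squamous cohort size (from the text)
tested   = {"KRAS": 1289, "EGFR": 2126}   # approximate, back-calculated counts
positive = {"KRAS": 505,  "EGFR": 321}    # approximate, back-calculated counts

for marker in positive:
    pct_all = 100 * positive[marker] / all_patients       # share of all patients
    pct_tested = 100 * positive[marker] / tested[marker]  # share of tested patients
    print(f"{marker}: {pct_all:.1f}% of all / {pct_tested:.1f}% of tested")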
Conclusion: Nationwide testing rates in Germany are high and acceptable in international comparison, but they still leave out a significant portion of patients who could potentially benefit. Thus, specific measures are needed to increase implementation.
Human lymph nodes play a central part in the immune defense against infectious agents and tumor cells. Lymphoid follicles are spherical compartments of the lymph node that are mainly filled with B cells, cellular components of the adaptive immune system. In the course of a specific immune response, lymphoid follicles pass through different morphological differentiation stages. The morphology and spatial distribution of lymphoid follicles can sometimes be associated with a particular causative agent and the developmental stage of a disease. We report a new approach for the automatic detection of follicular regions in histological whole-slide images of tissue sections immunostained for actin. The method is divided into two phases: (1) shock filter-based detection of transition points and (2) segmentation of follicular regions. Follicular regions in 10 whole-slide images were manually annotated by visual inspection, and sample surveys were conducted by an expert pathologist. The results of our method were validated by comparison with the manual annotations. On average, we achieved a Zijdenbos similarity index of 0.71, with a standard deviation of 0.07.
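The Zijdenbos similarity index used for this validation is defined over binary masks as ZSI = 2|A ∩ B| / (|A| + |B|), i.e., it is equivalent to the Dice coefficient. A minimal Python sketch with hypothetical masks (not the study's data):

import numpy as np

def zijdenbos_similarity(mask_a, mask_b):
    # ZSI = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks.
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    total = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / total if total else 1.0

# Hypothetical 4x4 manual annotation vs. automatic detection:
manual = np.array([[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
auto   = np.array([[0, 1, 1, 1], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
print(round(zijdenbos_similarity(manual, auto), 2))  # 0.89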
Mental imagery provides an essential simulation tool for remembering the past and planning the future, and its strength affects both cognition and mental health. Research suggests that neural activity spanning prefrontal, parietal, temporal, and visual areas supports the generation of mental images. Exactly how this network controls the strength of visual imagery remains unknown. Here, brain imaging and transcranial magnetic stimulation (TMS) phosphene data show that lower resting activity and excitability levels in early visual cortex (V1-V3) predict stronger sensory imagery. Further, electrically decreasing visual cortex excitability using tDCS increases imagery strength, demonstrating a causal role of visual cortex excitability in controlling visual imagery. Together, these data suggest a neurophysiological mechanism of cortical excitability involved in controlling the strength of mental images.
Background: Berotralstat (BCX7353) is an oral, once-daily inhibitor of plasma kallikrein in development for the prophylaxis of hereditary angioedema (HAE) attacks.
Objective: Our aim was to determine the efficacy, safety, and tolerability of berotralstat in patients with HAE over a 24-week treatment period (the phase 3 APeX-2 trial).
Methods: APeX-2 was a double-blind, parallel-group study that randomized patients 1:1:1 at 40 sites in 11 countries to receive once-daily berotralstat at a dose of 110 mg or 150 mg, or placebo (ClinicalTrials.gov identifier NCT03485911). Patients aged 12 years or older with HAE due to C1 inhibitor deficiency and at least 2 investigator-confirmed HAE attacks in the first 56 days of a prospective run-in period were eligible. The primary efficacy end point was the rate of investigator-confirmed HAE attacks during the 24-week treatment period.
Results: A total of 121 patients were randomized; 120 of them received at least 1 dose of the study drug (n = 41, 40, and 39 in the berotralstat 110-mg, berotralstat 150-mg, and placebo groups, respectively). Berotralstat demonstrated a significant reduction in attack rate relative to placebo (2.35 attacks per month) at both 110 mg (1.65 attacks per month; P = .024) and 150 mg (1.31 attacks per month; P < .001). The most frequent treatment-emergent adverse events occurring more often with berotralstat than with placebo were abdominal pain, vomiting, diarrhea, and back pain. No drug-related serious treatment-emergent adverse events occurred.
Conclusion: Both the 110-mg and 150-mg doses of berotralstat reduced HAE attack rates compared with placebo and were safe and generally well tolerated. The most favorable benefit-to-risk profile was observed at a dose of 150 mg per day.
The in vivo firing patterns of ventral midbrain dopamine neurons are controlled by afferent and intrinsic activity to generate the sensory cue and prediction error signals that are essential for reward-based learning. However, in vivo intracellular recordings have been lacking for the last three decades, so the subthreshold membrane potential events that cause changes in dopamine neuron firing patterns remain unknown. To address this, we established in vivo whole-cell recordings and obtained data from over 100 spontaneously active, immunocytochemically defined midbrain dopamine neurons in isoflurane-anaesthetized adult mice. We identified a repertoire of subthreshold membrane potential signatures associated with distinct in vivo firing patterns. Dopamine neuron activity in vivo deviated from single-spike pacemaking through phasic increases in firing rate via two qualitatively distinct biophysical mechanisms: (1) a prolonged hyperpolarization preceding rebound bursts, accompanied by a hyperpolarizing shift in action potential threshold; and (2) a transient depolarization leading to high-frequency plateau bursts, associated with a depolarizing shift in action potential threshold. Our findings define a mechanistic framework for the biophysical implementation of dopamine neuron firing patterns in the intact brain.