Besides mosquitoes, ticks are well-known vectors of various human pathogens. In the Northern Hemisphere, Lyme borreliosis (Eurasia, LB) or Lyme disease (North America, LD) is the most commonly occurring vector-borne infectious disease; it is caused by bacteria of the genus Borrelia, which are transmitted by hard ticks of the genus Ixodes. The reported incidence of LB in Europe is about 22.6 cases per 100,000 inhabitants annually, with a broad range depending on the geographical area analyzed. However, the epidemiological data are largely incomplete, because LB is not notifiable in all European countries. Furthermore, not only do reporting procedures differ between countries, but case definitions and diagnostic procedures also vary. Lyme borreliosis is caused by several species of the Borrelia (B.) burgdorferi sensu lato (s.l.) complex, which are maintained in complex networks involving ixodid ticks and different reservoir hosts. Vector and host influence each other and are affected by multiple factors, including climate, that have a major impact on their habitats and ecology. To classify factors that influence the risk of transmission of B. burgdorferi s.l. to its various vertebrate hosts as well as to humans, we briefly summarize current knowledge about the pathogens, including their astonishing ability to overcome various host immune responses, about the main European vector, Ixodes ricinus, and about the disease caused by borreliae. The research shows that greater standardization of case definitions and diagnostic procedures, together with standardized long-term surveillance systems across Europe, is necessary to improve clinical and epidemiological data.
The sympathetic nervous system (SNS) is a major regulatory mediator connecting the brain and the immune system that accordingly influences inflammatory processes throughout the entire body. In the periphery, the SNS exerts its effects mainly via its neurotransmitters norepinephrine (NE) and epinephrine (E), which are released by peripheral nerve endings in lymphatic organs and other tissues. Depending on their concentration, NE and E bind to specific α- and β-adrenergic receptor subtypes and can cause both pro- and anti-inflammatory cellular responses. The co-transmitters neuropeptide Y and adenosine triphosphate, as well as its metabolite adenosine, are also mediators of the SNS. Local pro-inflammatory processes due to injury or pathogens lead to an activation of the SNS, which in turn induces several immunoregulatory mechanisms with either pro- or anti-inflammatory effects, depending on neurotransmitter concentration or pathological context. In chronic inflammatory diseases, the activity of the SNS is persistently elevated and can trigger detrimental pathological processes. Recently, the sympathetic contribution to mild chronic inflammatory diseases like osteoarthritis (OA) has attracted growing interest. OA is a whole-joint disease and is characterized by mild chronic inflammation in the joint. In this narrative article, we summarize the underlying mechanisms of the sympathetic influence on inflammation during OA pathogenesis. In addition, OA comorbidities that are also accompanied by mild chronic inflammation, such as hypertension, obesity, diabetes, and depression, will be reviewed. Finally, the potential of SNS-based therapeutic options for the treatment of OA will be discussed.
Highlights
• Reduced evoked theta activity in the deaf.
• Reduced theta-gamma and alpha-gamma cross-frequency couplings in the deaf.
• Stronger delta-alpha coupling in the deaf.
Abstract
Neurons within a neuronal network can be grouped by bottom-up and top-down influences using synchrony in neuronal oscillations. This creates the representation of perceptual objects from sensory features. Oscillatory activity can be differentiated into stimulus-phase-locked (evoked) and non-phase-locked (induced). The former is mainly determined by sensory input, the latter by higher-level (cortical) processing. Effects of auditory deprivation on cortical oscillations have been studied in congenitally deaf cats (CDCs) using cochlear implant (CI) stimulation. CI-induced alpha, beta, and gamma activity were compromised in the auditory cortex of CDCs. Furthermore, top-down information flow between secondary and primary auditory areas in hearing cats, conveyed by induced alpha oscillations, was lost in CDCs. Here we used the matching pursuit algorithm to assess components of such oscillatory activity in local field potentials recorded in primary field A1. In addition to the loss of induced alpha oscillations, we also found a loss of evoked theta activity in CDCs. The loss of theta and alpha activity in CDCs can be directly related to reduced high-frequency (gamma-band) activity via cross-frequency coupling. Here we quantified such cross-frequency coupling in adult 1) hearing-experienced, acoustically stimulated cats (aHCs), 2) hearing-experienced cats following acute pharmacological deafening and subsequent CIs, thus in electrically stimulated cats (eHCs), and 3) electrically stimulated CDCs. We found significant cross-frequency coupling in all animal groups in > 70% of auditory-responsive sites. The predominant coupling in aHCs and eHCs was between theta/alpha phase and gamma power. In CDCs such coupling was lost and replaced by alpha oscillations coupling to delta/theta phase. Thus, alpha/theta oscillations synchronize high-frequency gamma activity only in hearing-experienced cats.
The absence of induced alpha and theta oscillations contributes to the loss of induced gamma power in CDCs, thereby signifying impaired local network activity.
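The phase-amplitude coupling quantified above can be illustrated with a minimal sketch. This is a toy example on synthetic data, not the matching pursuit pipeline used in the study: gamma-band amplitude is binned by the phase of a slower (theta) oscillation, and a Tort-style modulation index is computed as the normalized divergence of that binned distribution from a uniform one.

```python
import math

def modulation_index(phases, amplitudes, n_bins=18):
    """Tort-style modulation index: KL divergence of the phase-binned
    amplitude distribution from uniform, normalized so that 0 means
    no phase-amplitude coupling and values approaching 1 mean all
    amplitude is concentrated in a single phase bin."""
    bins = [0.0] * n_bins
    for ph, amp in zip(phases, amplitudes):
        idx = int(((ph % (2 * math.pi)) / (2 * math.pi)) * n_bins) % n_bins
        bins[idx] += amp
    total = sum(bins)
    p = [b / total for b in bins]
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (math.log(n_bins) - entropy) / math.log(n_bins)

# Synthetic 10 s recording at 1 kHz: a 6 Hz "theta" phase; in the
# coupled case, gamma amplitude peaks at a preferred theta phase.
n = 10000
theta_phase = [2 * math.pi * 6 * t / 1000.0 for t in range(n)]
coupled_gamma = [1.0 + 0.8 * math.cos(ph) for ph in theta_phase]
flat_gamma = [1.0] * n  # no coupling: amplitude independent of phase

mi_coupled = modulation_index(theta_phase, coupled_gamma)
mi_flat = modulation_index(theta_phase, flat_gamma)
```

In real data the phase and amplitude time series would be extracted by band-pass filtering and an analytic-signal (Hilbert) transform; the coupled signal yields a clearly nonzero index, whereas the flat one stays near zero.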
The effect of race/ethnicity on cancer-specific mortality after salvage radical prostatectomy
(2022)
Background: To test the effect of race/ethnicity on cancer-specific mortality (CSM) after salvage radical prostatectomy (SRP).
Material and methods: We relied on the Surveillance, Epidemiology and End Results database (SEER, 2004–2016) to identify SRP patients of all race/ethnicity background. Univariate and multivariate Cox regression models addressed CSM according to race/ethnicity.
Results: Of 426 assessable SRP patients, Caucasians accounted for 299 (69.9%) vs. 68 (15.9%) African-Americans vs. 39 (9.1%) Hispanics vs. 20 (4.7%) Asians. At diagnosis, African-Americans (64 years) were younger than Caucasians (66 years), but not younger than Hispanics (66 years) and Asians (67 years). PSA at diagnosis was significantly higher in African-Americans (13.2 ng/ml), Hispanics (13.0 ng/ml), and Asians (12.2 ng/ml) than in Caucasians (7.8 ng/ml, p = 0.01). Moreover, the distribution of African-Americans (10.3%–36.6%) and Hispanics (0%–15.8%) varied according to SEER region. The 10-year CSM was 46.5% in African-Americans vs. 22.4% in Caucasians vs. 15.4% in Hispanics vs. 15.0% in Asians. After multivariate adjustment (for age, clinical T stage, lymph node dissection status), African-American race/ethnicity was an independent predictor of higher CSM (HR: 2.2, p < 0.01), but not Hispanic or Asian race/ethnicity. The independent effect of African-American race/ethnicity did not persist after further adjustment for PSA.
Conclusion: African-Americans treated with SRP are at higher risk of CSM than other racial/ethnic groups and also exhibit the highest baseline PSA. The independent effect of African-American race/ethnicity on higher CSM no longer applied after PSA adjustment, since higher PSA represents a distinguishing feature of African-American patients.
Purpose: To evaluate intermediate and long-term visual outcomes and safety of a phakic intraocular posterior chamber lens with a central hole (ICL V4c) for myopic eyes.
Methods: Retrospective, consecutive case study of patients who uneventfully received an ICL V4c for myopia correction, with a 5-year postoperative follow-up, at the Department of Ophthalmology, Goethe University Frankfurt, Germany.
Results: Of 241 eyes that underwent ICL implantation, we included 45 eyes with a mean age at surgery of 33 ± 6 years (18–48 years) and a 5-year follow-up. CDVA improved from 0.05 ± 0.15 logMAR preoperatively to −0.00 ± 0.07 logMAR at 5 years and did not change significantly from 3 to 5 years (p = 0.266). The mean spherical equivalent (SE) improved from −10.13 ± 3.39 D to −0.45 ± 0.69 D. The endothelial cell count showed a mean decrease of 1.9% per year throughout the follow-up. The safety and efficacy indices were 1.16 and 0.78, respectively. Cataract formation was seen in 2 of 241 eyes (0.8%), but in none of the 45 eyes that completed the 5-year follow-up.
Conclusions: Our data show good intermediate- and long-term stability, efficacy, and safety of ICL V4c phakic lenses in myopic eyes, comparable to the published literature.
Background: Prostate cancer is a major health concern in aging men. In an aging society, prostate cancer prevalence increases, emphasizing the need for efficient diagnostic algorithms.
Methods: Retrospectively, 106 prostate tissue samples from 48 patients (mean age, 66 ± 6.6 years) were included in the study. Patients suffered from prostate cancer (n = 38) or benign prostatic hyperplasia (n = 10) and were treated with radical prostatectomy or Holmium laser enucleation of the prostate, respectively. We constructed tissue microarrays (TMAs) comprising representative malignant (n = 38) and benign (n = 68) tissue cores. TMAs were processed into histological slides, stained, digitized, and assessed for the applicability of machine learning strategies and open-source tools in the diagnosis of prostate cancer. We used the software QuPath to extract shape, stain-intensity, and texture features from the TMA cores for three stainings: H&E, ERG, and PIN-4. Three machine learning algorithms, neural network (NN), support vector machine (SVM), and random forest (RF), were trained and cross-validated with 100 Monte Carlo random splits into a 70% training set and a 30% test set. We determined AUC values for single color channels, with and without optimization of hyperparameters by exhaustive grid search. We applied recursive feature elimination to feature sets of multiple color transforms.
Results: Mean AUC was above 0.80. PIN-4 stainings yielded higher AUC than H&E and ERG. For PIN-4 with the color transform saturation, NN, RF, and SVM yielded AUCs of 0.93 ± 0.04, 0.91 ± 0.06, and 0.92 ± 0.05, respectively. Optimization of hyperparameters improved the AUC only slightly, by 0.01. Feature selection resulted in no increase in AUC for H&E but in an increase of 0.02–0.06 for ERG and PIN-4.
Conclusions: Automated pipelines may be able to discriminate with high accuracy between malignant and benign tissue. We found PIN-4 staining best suited for classification. Further bioinformatic analysis of larger data sets would be crucial to evaluate the reliability of automated classification methods for clinical practice and to assess the potential discrimination of cancer aggressiveness, paving the way to automated precision medicine.
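The AUC values reported above have a simple probabilistic reading that can be sketched without any ML framework. As a minimal illustration (toy scores, not the study's QuPath features or trained classifiers), the AUC equals the Mann-Whitney probability that a randomly chosen malignant core receives a higher classifier score than a randomly chosen benign one:

```python
def auc(pos_scores, neg_scores):
    """Mann-Whitney estimate of the AUC: the fraction of
    (positive, negative) score pairs ranked correctly; ties count 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy example: hypothetical classifier scores for malignant vs. benign cores.
malignant = [0.9, 0.8, 0.75, 0.6]
benign = [0.7, 0.4, 0.3, 0.2]
example_auc = auc(malignant, benign)  # → 0.9375 (15 of 16 pairs ranked correctly)
```

The study's 100 Monte Carlo cross-validation rounds would simply repeat this evaluation on fresh random 70/30 train/test splits and average the resulting AUCs, which stabilizes the estimate on small cohorts.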
This prospective study sought to evaluate potential savings of radiation dose to medical staff using real-time dosimetry coupled with visual radiation dose feedback during angiographic interventions. For this purpose, we analyzed a total of 214 angiographic examinations, consisting of chemoembolizations and several other types of therapeutic interventions. The Unfors RaySafe i2 dosimeter was worn by the interventionalist at chest height over the lead protection. A total of 110 interventions were performed with real-time radiation dosimetry, allowing the interventionalist to react to higher x-ray exposure, and 104 examinations served as the comparative group without real-time radiation monitoring. By using the real-time display during interventions, the overall mean operator radiation dose decreased from 3.67 (IQR, 0.95–23.01) to 2.36 μSv (IQR, 0.52–12.66) (−36%; p = 0.032), while operator exposure time was simultaneously reduced by 4.5 min (p = 0.071). Dividing interventions into chemoembolizations and other types of therapeutic interventions, radiation dose decreased from 1.31 (IQR, 0.46–3.62) to 0.95 μSv (IQR, 0.53–3.11) and from 24.39 (IQR, 12.14–63.0) to 10.37 μSv (IQR, 0.85–36.84), respectively, using live-screen dosimetry (p ≤ 0.005). Radiation dose reductions were also observed for the participating assistants, indicating that they could also benefit from real-time visual feedback dosimetry during interventions (−30%; p = 0.039). Integration of real-time dosimetry into clinical processes might be useful in reducing occupational radiation exposure during angiographic interventions. The real-time visual feedback raised the awareness of interventionalists and their assistants of the potential danger of prolonged radiation exposure, leading to the adoption of radiation-sparing practices. Therefore, it might create a safer environment for the medical staff by keeping the applied radiation exposure as low as possible.