The correction of valgus leg malalignment in children using implant-mediated growth guidance is widely used and effective. Despite the minimally invasive character of the procedure, a relevant number of patients sustain prolonged pain and limited mobility after temporary hemiepiphysiodesis. Our aim was to investigate implant-associated risk factors (such as implant position and screw angulation), surgical or anesthesia-related risk factors (such as type of anesthesia, and the use, duration, and pressure of the tourniquet), and duration of surgery for these complications. Thirty-four skeletally immature patients with idiopathic valgus deformities undergoing hemiepiphysiodesis plating between October 2018 and July 2022 were enrolled in this retrospective study. Participants were divided into groups with and without prolonged complications (persistent pain, limited mobility of the operated knee between five weeks and six months after surgery). Twenty-two patients (65%) had no notable complications, while twelve patients (35%) had prolonged complications. Both groups differed significantly in plate position relative to the physis (p = 0.049). In addition, both groups showed significant differences in the distribution of implant location (p = 0.016). Group 1 had a shorter duration of surgery than group 2 (32 min vs. 38 min, p = 0.032) and a lower tourniquet pressure (250 mmHg vs. 270 mmHg, p = 0.019). In conclusion, simultaneous plate implantation at the femur and tibia and metaphyseal plate positioning resulted in prolonged pain and delayed recovery of function. In addition, tourniquet pressure and duration of surgery may play a role.
Background: Malalignments of the lower extremity are common reasons for orthopedic consultation, as malalignment may lead to osteoarthritis in adulthood. An accurate and reliable radiological assessment of lower limb alignment in children and adolescents is essential for clinical decision-making on treatment of limb deformities and for regular follow-up after a surgical intervention.
Objective: First, does the analysis of full-length standing anteroposterior radiographs show good intra- and interobserver reliability? Second, which parameter is most susceptible to observer-dependent errors? Third, what is the Standard Error of Measurement (SEM95%) of the absolute femoral and tibial length?
Methods: Two observers evaluated digital radiographs of 144 legs from 36 children and adolescents with pathological valgus alignment before a temporary hemiepiphysiodesis and before implant removal. Parameters included Mechanical Femorotibial Angle (MFA), Mechanical Axis Deviation (MAD), mechanical Lateral Distal Femoral Angle (mLDFA), mechanical Medial Proximal Tibial Angle (mMPTA), mechanical Lateral Proximal Femoral Angle (mLPFA), mechanical Lateral Distal Tibial Angle (mLDTA), Joint Line Convergence Angle (JLCA), femoral length, and tibial length. Intra- and interobserver reliability (ICC2,1), SEM95%, and proportional errors were calculated.
Results: The intra- and interobserver reliability for almost all measurements was found to be good to excellent (Intra-ICC2,1: 0.849–0.999; Inter-ICC2,1: 0.864–0.996). The SEM95% of both observers was found to be ± 1.39° (MFA), ± 3.31 mm (MAD), ± 1.06° (mLDFA) and ± 1.29° (mMPTA). The proportional error of MAD and MFA is comparable (47.29% vs. 46.33%). The relevant knee joint surface angles show a lower proportional error for mLDFA (42.40%) than for mMPTA (51.60%). JLCA has a proportional error of 138%. Furthermore, the SEM95% for the absolute values of the femoral and tibial length was 4.53 mm for the femur and 3.12 mm for the tibia.
Conclusions: In conclusion, a precise malalignment measurement and knowledge of the SEM95% of the respective parameters are crucial for correct surgical or nonsurgical treatment. The susceptibility to error must be considered both when interpreting malalignment analyses and when planning a surgical intervention. The results of the present study elucidate that MAD and MFA are equally susceptible to observer-dependent errors. This study shows good to excellent intra- and interobserver ICCs for all leg alignment parameters and joint surface angles, except for JLCA.
Trial registration: This study was registered with DRKS (German Clinical Trials Register) under the number DRKS00015053.
Level of evidence
I, Diagnostic Study.
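The SEM95% values reported in this abstract are conventionally derived from the between-subject standard deviation and the reliability coefficient as SEM = SD·√(1 − ICC), scaled by 1.96 for the 95% level. A minimal sketch of that relationship follows; the SD and ICC values used are illustrative assumptions, not data from the study.

```python
import math

def sem95(sd: float, icc: float) -> float:
    """95% standard error of measurement: 1.96 * SD * sqrt(1 - ICC)."""
    return 1.96 * sd * math.sqrt(1.0 - icc)

# Illustrative inputs only (not study data): SD of 2.0 deg, ICC = 0.91
print(round(sem95(sd=2.0, icc=0.91), 3))  # 1.176
```

As the formula shows, even an excellent ICC leaves a nonzero measurement band, which is why the abstract reports SEM95% alongside the reliability coefficients.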
Background
Lennox–Gastaut syndrome (LGS) is a severe developmental and epileptic encephalopathy characterized by drug-resistant epilepsy with multiple seizure types starting in childhood, a typical slow spike-wave pattern on electroencephalogram, and cognitive dysfunction.
Methods
We performed a systematic literature review according to the PRISMA guidelines to identify, synthesize and appraise the burden of illness in LGS (including “probable” LGS). Studies were identified by searching MEDLINE, Embase, APA PsycInfo, the Cochrane Database of Systematic Reviews, and Epistemonikos. The outcomes were epidemiology (incidence, prevalence or mortality), direct and indirect costs, healthcare resource utilization, and patient and caregiver health-related quality of life (HRQoL).
Results
The search identified 22 publications evaluating epidemiology (n = 10), direct costs and resource use (n = 10) and/or HRQoL (n = 5). No studies reporting on indirect costs were identified. With no specific ICD code for LGS in many regions, several studies had to rely upon indirect methods to identify their patient populations (e.g., algorithms to search insurance claims databases to identify “probable” LGS). There was heterogeneity between studies in how LGS was defined, the size of the populations, ages of the patients and length of the follow-up period. The prevalence varied from 4.2 to 60.8 per 100,000 people across studies for probable LGS and 2.9–28 per 100,000 for a confirmed/narrow definition of LGS. LGS was associated with high mortality rates compared to the general population and epilepsy population. Healthcare resource utilization and direct costs were substantial across all studies. Mean annual direct costs per person varied from $24,048 to $80,545 across studies, and home-based care and inpatient care were significant cost drivers. Studies showed that the HRQoL of patients and caregivers was adversely affected, although only a few studies were identified. In addition, studies suggested that seizure events were associated with higher costs and worse HRQoL. The risk of bias was low or moderate in most studies.
Conclusions
LGS is associated with a significant burden of illness featuring resistant seizures associated with higher costs and worse HRQoL. More research is needed, especially in evaluating indirect costs and caregiver burden, where there is a notable lack of studies.
Rationale: Attention deficit/hyperactivity disorder (ADHD) is common in alcohol use disorder (AUD). Continuous performance tests (CPTs) make it possible to measure ADHD-related deficits in a laboratory setting. Most studies on this topic focused on CPTs measuring inattention or impulsivity, disregarding hyperactivity as one of the core symptoms of ADHD.
Methods: We examined N = 47 in three groups (ADHD N = 19; AUD N = 16; ADHD + AUD N = 12) with questionnaires on ADHD core symptoms, executive functioning (EF), mind wandering, and quality of life (QoL). N = 46 (ADHD N = 16; AUD N = 16; ADHD + AUD N = 14) were examined with a CPT (QbTest®) that also measures motor activity objectively.
Results: Inattention and impulsivity were significantly increased in AUD vs. ADHD and in AUD vs. ADHD + AUD. Hyperactivity was significantly higher in ADHD + AUD vs. ADHD and ADHD + AUD vs. AUD, but not in ADHD vs. AUD. EF was lower in both ADHD groups vs. AUD. Mind wandering was increased in both ADHD groups vs. AUD. QoL was significantly lower in ADHD + AUD compared to AUD. In contrast, results of the QbTest were not significantly different between groups.
Conclusion: Questionnaires are more useful in assessing ADHD core symptoms than the QbTest®. Hyperactivity appears to be a relevant symptom in ADHD + AUD, suggesting a possible pathway from ADHD to AUD. The lower QoL in ADHD + AUD emphasizes the need for routine screening, diagnostic procedures and treatment strategies for this patient group.
Highlights:
• Assessment of body composition parameters in a large cohort of patients with HCC undergoing TACE.
• Fully automated artificial intelligence-based quantitative 3D volumetry of abdominal cavity tissue composition.
• Skeletal muscle volume and related parameters were independent prognostic factors in patients with HCC undergoing TACE.
Background & Aims: Body composition assessment (BCA) parameters have recently been identified as relevant prognostic factors for patients with hepatocellular carcinoma (HCC). Herein, we aimed to investigate the role of BCA parameters for prognosis prediction in patients with HCC undergoing transarterial chemoembolization (TACE).
Methods: This retrospective multicenter study included a total of 754 treatment-naïve patients with HCC who underwent TACE at six tertiary care centers between 2010–2020. Fully automated artificial intelligence-based quantitative 3D volumetry of abdominal cavity tissue composition was performed to assess skeletal muscle volume (SM), total adipose tissue (TAT), intra- and intermuscular adipose tissue, visceral adipose tissue, and subcutaneous adipose tissue (SAT) on pre-intervention computed tomography scans. BCA parameters were normalized to the slice number of the abdominal cavity. We assessed the influence of BCA parameters on median overall survival and performed multivariate analysis including established estimates of survival.
Results: Univariate survival analysis revealed that impaired median overall survival was predicted by low SM (p <0.001), high TAT volume (p = 0.013), and high SAT volume (p = 0.006). In multivariate survival analysis, SM remained an independent prognostic factor (p = 0.039), while TAT and SAT volumes no longer showed predictive ability. This predictive role of SM was confirmed in a subgroup analysis of patients with BCLC stage B.
Conclusions: SM is an independent prognostic factor for survival prediction. Thus, the integration of SM into novel scoring systems could potentially improve survival prediction and clinical decision-making. Fully automated approaches are needed to foster the implementation of this imaging biomarker into daily routine.
Impact and implications: Body composition assessment parameters, especially skeletal muscle volume, have been identified as relevant prognostic factors for many diseases and treatments. In this study, skeletal muscle volume has been identified as an independent prognostic factor for patients with hepatocellular carcinoma undergoing transarterial chemoembolization. Therefore, skeletal muscle volume as a metaparameter could play a role as an opportunistic biomarker in holistic patient assessment and be integrated into decision support systems. Workflow integration with artificial intelligence is essential for automated, quantitative body composition assessment, enabling broad availability in multidisciplinary case discussions.
Highlights
• A proteomic analysis of the mandibular glands of Shinisaurus crocodilurus and Corucia zebrata was performed.
• Scanning electron microscopy of S. crocodilurus' teeth revealed a sharp ridge on the anterior surface, but no grooves.
• Scanning electron microscopy of C. zebrata teeth showed a flattened crown with a pointed cusp.
• Proteomic analysis of gland extracts of S. crocodilurus and C. zebrata showed absence of venom-derived peptides or proteins.
• Our results strongly support the non-venomous character of both S. crocodilurus and C. zebrata.
Abstract
Based on its phylogenetic relationship to monitor lizards (Varanidae), Gila monsters (Heloderma spp.), and the earless monitor Lanthanotus borneensis, the Chinese crocodile lizard, Shinisaurus crocodilurus, has been assigned to the Toxicofera clade, which comprises venomous reptiles. However, no data on the composition and biological activities of its oral secretion have been reported. In the present study, a proteomic analysis of the mandibular gland of S. crocodilurus and, for comparison, of the herbivorous Solomon Island skink Corucia zebrata, was performed. Scanning electron microscopy (SEM) of the teeth from S. crocodilurus revealed a sharp ridge on the anterior surface, but no grooves, whereas those of C. zebrata possess a flattened crown with a pointed cusp. Proteomic analysis of their gland extracts provided no evidence of venom-derived peptides or proteins, strongly supporting the non-venomous character of these lizards. Data are available via ProteomeXchange with identifier PXD039424.
Highlights
• Proteomic analyses of submandibular gland extracts of two alligator lizards of the Anguidae family are reported.
• A conserved set of putative toxins was found in the submandibular gland extracts of Abronia lythrochila and A. graminea.
• Toxins evolved in oral secretions of paleo- and neoanguimorpha over more than 100 million years of Anguimorpha cladogenesis.
• Electron microscopy of pleurodont teeth of A. lythrochila showed no sign of groove, external opening or striations.
• Assessing the role toxins play in the ecology of extant anguimorph lizards deserves functional studies in natural prey.
Abstract
A useful approach to deepen our knowledge about the origin and evolution of venom systems in Reptilia has been exploring the vast biodiversity of this clade of vertebrates in search of orally produced proteins with toxic actions, as well as their corresponding delivery systems. The occurrence of toxins in anguimorph lizards has been demonstrated experimentally or inferred from reports of the toxic effects of the oral secretions of taxa within the Varanidae and Helodermatidae families. In the present study, we have focused on two alligator lizards of the Anguidae family, the Mexican alligator lizard, Abronia graminea, and the red-lipped arboreal alligator lizard, A. lythrochila. In addition, the fine morphology of teeth of the latter species is described. The presence of a conserved set of proteins, including B-type natriuretic peptides, cysteine-rich secretory proteins, group III phospholipase A2, and kallikrein, in submandibular gland extracts was demonstrated for both Abronia species. These proteins belong to toxin families found in oral gland secretions of venomous reptile species. This finding, along with previous demonstration of toxin-producing taxa in both paleo- and neoanguimorpha clades, provides further support for the existence of a handful of conserved toxin families in oral secretions across the 100+ million years of Anguimorpha cladogenesis.
Introduction: Patients undergoing left atrial appendage closure (LAAC) are often severely anemic and close to the transfusion threshold. The aim was to investigate the prevalence of severe anemia in this cohort and whether procedural safety is compromised compared with non-anemic patients.
Methods and results: Comparison of severely anemic patients (Hb < 80 g/l) vs. non-severely anemic patients in the prospective, multicentre observational LAARGE registry of patients undergoing LAAC. A total of 638 patients (anemic 22.3% vs. non-anemic 77.7%) were included. Anemic patients were older (77.1 years ± 7.9 vs 75.6 years ± 7.9, p = 0.014), had more comorbidities, higher CHA2DS2-VASc (4.8 vs 4.4, p = 0.017) and higher HAS-BLED (4.3 vs 3.8, p < 0.001) scores. Implant success was not influenced by anemia (99.3% vs 97.2%). Severe in-hospital (0.7% vs 5.6%, p = 0.01) and overall complications (8.5% vs 13.7%, p = 0.11) were less common in patients with anemia, driven by fewer pericardial effusions. Mortality was higher in anemic patients and associated with an increased hazard ratio, albeit not significantly (16.0% vs 10.3%, HR 1.61 (95%-CI: 0.97–2.67), p = 0.06). In the one-year follow-up, the composite outcome of death, stroke or systemic embolism occurred in 22/142 anemic and in 54/496 non-anemic patients with an adjusted HR of 1.04 (95%-CI 0.62–1.73, p = 0.89).
Conclusion: Severe anemia close to the transfusion threshold is common in patients undergoing LAAC. However, this does not influence in-hospital complications or implant success. One-year mortality is higher in anemic patients, mainly driven by comorbidities.
Key Teaching Points
• Wearables such as smartwatches can monitor beyond heart rate and heart rhythm.
• Specific smartwatches provide reliable measurements of electrocardiographic intervals (eg, QT interval).
• Correct analysis and interpretation of the QT interval in an individual with previously unknown long QT syndrome facilitated the diagnosis.
Aim: The aim of this study was to evaluate the relationship between coronary artery calcification (CAC) assessed by multi-detector computed tomography (MDCT) and myocardial perfusion assessed by cardiac magnetic resonance imaging (CMR) in a group of symptomatic patients.
Method: Retrospective analysis of 120 patients (age 65.1 ± 8.9 years, 88 males) who presented with atypical chest pain to Bethanien Hospital, Frankfurt, Germany, between 2007 and 2010 and who underwent CAC scoring using MDCT, CMR, and conventional coronary angiography. Patients were divided into those with high-grade (HG) stenosis (n = 67, age 65.1 ± 9.4 years) and those with no-HG stenosis (n = 53, age 65.1 ± 8.6 years).
Results: There were more males with HG stenosis (82.1% vs. 62.3%, p = 0.015), in whom the percentage and number of abnormal perfusion segments were higher at rest (37.3% vs. 17%, p = 0.014) but not different with stress (p = 0.83) from those with no-HG stenosis. Thirty-four patients had myocardial perfusion abnormalities at rest and 26 patients developed perfusion defects with stress. Stress-induced myocardial perfusion defects were 22.4% sensitive and 79.2% specific for detecting HG stenosis. The CAC score was lower in patients with no-HG stenosis compared to those with HG stenosis (p < 0.0001). On the ROC curve, a CAC score of 293 had a sensitivity of 71.6% and specificity of 83% in predicting HG stenosis (AUC 0.80, p < 0.0001). A CAC score of 293 or the presence of at least 1 segment myocardial perfusion abnormality was 74.6% sensitive and 71.7% specific in detecting HG stenosis, the respective values for the 2 abnormalities combined being 19.4% and 90.6%. The severity of CAC correlated with the extent of myocardial perfusion in the patient group as a whole with stress (r = 0.22, p = 0.015), particularly in those with no-HG stenosis (r = 0.31, p = 0.022). A CAC score of 293 was 31.6% sensitive and 87.3% specific in detecting myocardial perfusion abnormalities.
Conclusion: In a group of patients with exertional angina, coronary calcification is more accurate in detecting high-grade luminal stenosis than myocardial perfusion defects. In addition, in patients with no stenosis, the incremental relationship between coronary calcium score and the extent of myocardial perfusion suggests coronary wall hardening as an additional mechanism for stress-induced angina other than luminal narrowing. These preliminary findings might have a clinical impact on management strategies of these patients other than conventional therapy.
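The sensitivity and specificity reported for the CAC cut-off of 293 can be reproduced from the stated group sizes (HG n = 67, no-HG n = 53). The confusion-matrix counts below are back-calculated assumptions consistent with the reported percentages, not values given in the abstract:

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Return (sensitivity, specificity) in percent from confusion-matrix counts."""
    sensitivity = 100 * tp / (tp + fn)
    specificity = 100 * tn / (tn + fp)
    return sensitivity, specificity

# Assumed counts, back-calculated from n=67 HG and n=53 no-HG patients
# at the CAC-score cut-off of 293: 48 TP, 19 FN, 44 TN, 9 FP
sens, spec = sens_spec(tp=48, fn=19, tn=44, fp=9)
print(f"sensitivity {sens:.1f}%, specificity {spec:.1f}%")  # 71.6%, 83.0%
```

This kind of back-calculation is a useful sanity check when reading diagnostic-accuracy abstracts that report percentages without raw counts.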
Rationale and Objectives: Lumbar disk degeneration is a common condition contributing significantly to back pain. The objective of the study was to evaluate the potential of dual-energy CT (DECT)-derived collagen maps for the assessment of lumbar disk degeneration.
Patients and Methods: We conducted a retrospective analysis of 127 patients who underwent dual-source DECT and MRI of the lumbar spine between 07/2019 and 10/2022. The level of lumbar disk degeneration was categorized by three radiologists as follows: no/mild (Pfirrmann 1&2), moderate (Pfirrmann 3&4), and severe (Pfirrmann 5). Recall (sensitivity) and accuracy of DECT collagen maps were calculated. Intraclass correlation coefficient (ICC) was used to evaluate inter-reader reliability. Subjective evaluations were performed using 5-point Likert scales for diagnostic confidence and image quality.
Results: We evaluated a total of 762 intervertebral disks from 127 patients (median age, 69.7 years; range, 23.0–93.7 years; 56 women). MRI identified 230 non/mildly degenerated disks (30.2%), 484 moderately degenerated disks (63.5%), and 48 severely degenerated disks (6.3%). DECT collagen maps yielded an overall accuracy of 85.5% (1955/2286). Recall (sensitivity) was 79.3% (547/690) for the detection of no/mild lumbar disk degeneration, 88.7% (1288/1452) for the detection of moderate disk degeneration, and 83.3% (120/144) for the detection of severe disk degeneration (ICC = 0.9). Subjective evaluations of DECT collagen maps showed high diagnostic confidence (median 4) and good image quality (median 4).
Conclusion: The use of DECT collagen maps to distinguish different stages of lumbar disk degeneration may have clinical significance in the early diagnosis of disk-related pathologies in patients with contraindications for MRI or in cases of unavailability of MRI.
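The pooled accuracy and per-grade recall values in the DECT collagen-map abstract follow directly from the stated counts (762 disks rated by three readers gives 2,286 ratings; 230, 484, and 48 disks per grade give 690, 1,452, and 144 ratings). A quick arithmetic check:

```python
# Counts as reported in the abstract: correct ratings / total ratings
overall = 1955 / 2286          # pooled accuracy across all three readers
recall_mild = 547 / 690        # no/mild degeneration (230 disks x 3 readers)
recall_moderate = 1288 / 1452  # moderate degeneration (484 disks x 3 readers)
recall_severe = 120 / 144      # severe degeneration (48 disks x 3 readers)

for name, value in [("accuracy", overall),
                    ("recall no/mild", recall_mild),
                    ("recall moderate", recall_moderate),
                    ("recall severe", recall_severe)]:
    print(f"{name}: {100 * value:.1f}%")
# accuracy: 85.5%, recall no/mild: 79.3%, recall moderate: 88.7%, recall severe: 83.3%
```

The denominators confirm internal consistency: each grade's rating count is exactly three times its disk count, so the reported percentages are per-rating, not per-disk.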
Highlights
• Early reconstruction of injured cruciate ligaments improves functional outcomes.
• Modern CT imaging can be used to rapidly identify patients with injury to the cruciate ligaments and streamline therapeutic pathways.
• Dual-energy CT demonstrates superior diagnostic accuracy compared to single-energy CT.
Abstract
Background: This study aimed to evaluate the clinical utility of modern single- and dual-energy computed tomography (CT) for assessing the integrity of the cruciate ligaments in patients who sustained acute trauma.
Methods: Patients who underwent single- or dual-energy CT followed by 3 Tesla magnetic resonance imaging (MRI) or knee joint arthroscopy between 01/2016 and 12/2022 were included in this retrospective, monocentric study. Three radiologists specialized in musculoskeletal imaging independently evaluated all CT images for the presence of injury to the cruciate ligaments. An MRI consensus reading of two experienced readers and arthroscopy provided the reference standard. Diagnostic accuracy parameters and area under the receiver operator characteristic curve (AUC) were the primary metrics for diagnostic performance.
Results: CT images of 204 patients (median age, 49 years; IQR 36–64; 113 males) were evaluated. Dual-energy CT yielded significantly higher diagnostic accuracy and AUC for the detection of injury to the anterior (94% [240/255] vs 75% [266/357] and 0.89 vs 0.66) and posterior cruciate ligaments (95% [243/255] vs 87% [311/357] and 0.90 vs 0.61) compared to single-energy CT (all parameters, p <.005). Diagnostic confidence and image quality were significantly higher in dual-energy CT compared to single-energy CT (all parameters, p <.005).
Conclusions: Modern dual-energy CT is readily available and can serve as a screening tool for detecting or excluding cruciate ligament injuries in patients with acute trauma. Accurate diagnosis of cruciate ligament injuries is crucial to prevent adverse outcomes, including delayed treatment, chronic instability, or long-term functional limitations.
Highlights
• Assessment of coronary artery plaque burden according to the CAC-DRS Score correlated well with pulmonary involvement of SARS-CoV-2 pneumonia (min. r=0.81, 95% CI 0.76 to 0.86).
• Visual and quantitative CAC-DRS Score of coronary artery plaque burden provided independent prognostic information on all-cause mortality in patients with SARS-CoV-2 pneumonia (p=0.0016 and p<0.0001, respectively).
• Incorporating CAC-DRS Score and pulmonary involvement into clinical decision making revealed great potential to discriminate patients with fatal outcomes from a mild course of disease (AUC 0.938, 95% CI 0.89 to 0.97) and the need for intensive care treatment (AUC 0.801, 95% CI 0.77 to 0.83).
Purpose: To assess and correlate pulmonary involvement and outcome of SARS-CoV-2 pneumonia with the degree of coronary plaque burden based on the CAC-DRS classification (Coronary Artery Calcium Data and Reporting System).
Methods: This retrospective study included 142 patients with confirmed SARS-CoV-2 pneumonia (58 ± 16 years; 57 women) who underwent non-contrast CT between January 2020 and August 2021 and were followed up for 129 ± 72 days. One experienced blinded radiologist analyzed CT series for the presence and extent of calcified plaque burden according to the visual and quantitative HU-based CAC-DRS Score. Pulmonary involvement was automatically evaluated with a dedicated software prototype by another two experienced radiologists and expressed as Opacity Score.
Results: CAC-DRS Scores derived from visual and quantitative image evaluation correlated well with the Opacity Score (r=0.81, 95% CI 0.76-0.86, and r=0.83, 95% CI 0.77-0.89, respectively; p<0.0001) with higher correlation in severe than in mild stage SARS-CoV-2 pneumonia (p<0.0001). Combined, CAC-DRS and Opacity Scores revealed great potential to discriminate fatal outcomes from a mild course of disease (AUC 0.938, 95% CI 0.89-0.97), and the need for intensive care treatment (AUC 0.801, 95% CI 0.77-0.83). Visual and quantitative CAC-DRS Scores provided independent prognostic information on all-cause mortality (p=0.0016 and p<0.0001, respectively), both in univariate and multivariate analysis.
Conclusions: Coronary plaque burden is strongly correlated to pulmonary involvement, adverse outcome, and death due to respiratory failure in patients with SARS-CoV-2 pneumonia, offering great potential to identify individuals at high risk.
Background: Dual-energy CT (DECT)-derived bone mineral density (BMD) of the distal radius and other CT-derived metrics related to bone health have been suggested for opportunistic osteoporosis screening and risk evaluation for sustaining distal radius fractures (DRFs).
Methods: The distal radius of patients who underwent DECT between 01/2016 and 08/2021 was retrospectively analyzed. Cortical Hounsfield Unit (HU), trabecular HU, cortical thickness, and DECT-based BMD were acquired from a non-fractured, metaphyseal area in all examinations. Receiver-operating characteristic (ROC) analysis was conducted to determine the area under the curve (AUC) values for predicting DRFs based on DECT-derived BMD, HU values, and cortical thickness. Logistic regression models were then employed to assess the associations of these parameters with the occurrence of DRFs.
Results: In this study, 263 patients (median age: 52 years; interquartile range: 36–64; 132 women; 192 fractures) were included. ROC curve analysis revealed a higher area under the curve (AUC) value for DECT-derived BMD compared to cortical HU, trabecular HU, and cortical thickness (0.91 vs. 0.61, 0.64, and 0.69, respectively; p <.001). Logistic regression models confirmed the association between lower DECT-derived BMD and the occurrence of DRFs (Odds Ratio, 0.83; p <.001); however, no influence was observed for cortical HU, trabecular HU, or cortical thickness.
Conclusions: DECT can be used to assess the BMD of the distal radius without dedicated equipment such as calibration phantoms to increase the detection rates of osteoporosis and stratify the individual risk to sustain DRFs. In contrast, assessing HU-based values and cortical thickness does not provide clinical benefit.
Rationale and Objectives: Bone non-union is a serious complication of distal radius fractures (DRF) that can result in functional limitations and persistent pain. However, no accepted method has yet been established to identify patients at risk of developing bone non-union. This study aimed to compare various CT-derived metrics for bone mineral density (BMD) assessment to identify predictive values for the development of bone non-union.
Materials and Methods: CT images of 192 patients with DRFs who underwent unenhanced dual-energy CT (DECT) of the distal radius between 03/2016 and 12/2020 were retrospectively identified. Available follow-up imaging and medical health records were evaluated to determine the occurrence of bone non-union. DECT-based BMD, trabecular Hounsfield unit (HU), cortical HU and cortical thickness ratio were measured in normalized non-fractured segments of the distal radius.
Results: Patients who developed bone non-union were significantly older (median age 72 years vs. 54 years) and had a significantly lower DECT-based BMD (median 68.1 mg/cm3 vs. 94.6 mg/cm3, p < 0.001). Other metrics (cortical thickness ratio, cortical HU, trabecular HU) showed no significant differences. ROC and PR curve analyses confirmed the highest diagnostic accuracy for DECT-based BMD with an area under the curve (AUC) of 0.83 for the ROC curve and an AUC of 0.46 for the PR curve. In logistic regression models, DECT-based BMD was the sole metric significantly associated with bone non-union.
Conclusion: DECT-derived metrics can accurately predict bone non-union in patients who sustained DRF. The diagnostic performance of DECT-based BMD is superior to that of HU-based metrics and cortical thickness ratio.
Highlights
• Piriform cortex and amygdala can be separated based on their distinct structural connectivity.
• Similar to histological findings, the connectivity of the piriform cortex suggests posterior frontal and temporal subregions.
• Subregions of the piriform cortex have distinct connectivity profiles.
• Anterior PC extended into ventrotemporal PC posteriorly, which has not been described before and requires further investigation.
• All parcellations were made publicly available.
Abstract
The anatomy of the human piriform cortex (PC) is poorly understood. We used a bimodal connectivity-based-parcellation approach to investigate subregions of the PC and its connectional differentiation from the amygdala.
One hundred (55% female) genetically unrelated subjects from the Human Connectome Project were included. A region of interest (ROI) was delineated bilaterally covering PC and amygdala, and functional and structural connectivity of this ROI with the whole gray matter was computed. Spectral clustering was performed to obtain bilateral parcellations at granularities of k = 2–10 clusters, and combined bimodal parcellations were computed. Validity of parcellations was assessed via their mean individual-to-group similarity per adjusted rand index (ARI).
Individual-to-group similarity was higher than chance in both modalities and in all clustering solutions. The amygdala was clearly distinguished from PC in structural parcellations, and olfactory amygdala was connectionally more similar to amygdala than to PC. At higher granularities, an anterior and ventrotemporal and a posterior frontal cluster emerged within PC, as well as an additional temporal cluster at their boundary. Functional parcellations also showed a frontal piriform cluster, and similar temporal clusters were observed with less consistency. Results from bimodal parcellations were similar to the structural parcellations. Consistent results were obtained in a validation cohort.
Distinction of the human PC from the amygdala, including its olfactory subregions, is possible based on its structural connectivity alone. The canonical fronto-temporal boundary within PC was reproduced in both modalities and with consistency. All obtained parcellations are freely available.
Purpose: To describe a novel surgical technique of a combined implantation of an artificial iris and a scleral fixated intraocular lens (IOL) using flanged IOL haptics (“Yamane” technique).
Observations: The suturelessly implanted artificial iris-IOL-sandwich was stable with good functional as well as aesthetic results. However, our case showed a postoperative intraocular pressure rise.
Conclusions: The presented case demonstrates that visual as well as cosmetic rehabilitation seems to be possible even after severe, penetrating ocular trauma with profound iris defects.
Importance: The sutureless IOL scleral fixation technique can also be used in combination with a sutureless artificial iris implantation. Further studies are needed to evaluate the long-term safety profile and rates of postoperative complications.
Purpose: The IC-8® Apthera™ (AcuFocus Inc.™, Irvine, California, USA) is the first small aperture intraocular lens (IOL) to receive FDA approval for presbyopia correction in the summer of 2022. It is a single-piece hydrophobic acrylic monofocal lens, which is placed in the capsular bag. In its center it carries a black circular mask (FilterRing™) with a diameter of 3.23 mm consisting of polyvinylidene fluoride and carbon black nanoparticles. In the center of this mask sits a 1.36 mm wide aperture. Owing to this pinhole effect, the IC-8® serves as an extended-depth-of-focus (EDOF) IOL and can be used in presbyopia correction.
This report describes the case of a patient with an IC-8® implant who underwent Nd:YAG laser capsulotomy for posterior capsule opacification (PCO). The post-laser checkup showed a dark central optical change within the IOL, and the patient described optical phenomena as well as blurred central vision, which is why he underwent an IOL exchange. The explanted IC-8® was sent to the Intermountain Ocular Research Center at the University of Utah for further analysis.
Observations: A 56-year-old male underwent cataract surgery with implantation of a non-diffractive EDOF-IOL on the right and the IC-8® small aperture IOL on the left eye. On the left eye, the patient had received penetrating keratoplasty seven years prior to the cataract operation due to posttraumatic corneal scarring. The early checkups after cataract surgery showed a corrected distance visual acuity (CDVA) in the left eye of +0.1 logMAR in the first month. About 5 months after the operation, PCO was first described on the left eye, leading to a decrease in visual acuity to +0.4 logMAR (CDVA). Due to PCO, Nd:YAG laser capsulotomy was conducted 5 months after the cataract operation on the left eye; twelve shots were applied at 2.7 mJ. The following appointments showed a continuously reduced visual acuity of +1.3 logMAR (uncorrected) on the left eye, and the patient described blurry and ‘swirled’ central vision. By slightly tilting his head and thus not using the center of his optic axis, he was able to see more sharply. Slit lamp examination showed a small optical change inside the IC-8® IOL not resembling a pit but believed to be a small pocket of air. Due to the ongoing symptoms as well as the reduced VA, the seemingly damaged small aperture IOL was exchanged for a three-piece hydrophobic acrylic monofocal lens, which was also placed in the posterior chamber. The explanted IC-8® was sent to the Intermountain Ocular Research Center at the University of Utah for further analysis. Results from gross and light microscopic analysis showed that the change caused by the Nd:YAG laser application consisted of a localized optical area containing carbon black nanoparticles used for the circular mask within the IOL.
Conclusions and importance: When dealing with PCO and performing Nd:YAG laser capsulotomy in eyes with an IC-8® IOL implant, the laser shots should be applied either inside the aperture or outside of the black circular mask of the IOL. Otherwise, the Nd:YAG laser can lead to bursts of carbon nanoparticles within the IOL which may cause optical phenomena as well as decreased visual acuity possibly resulting in an IOL exchange.