Purpose: To investigate the diagnostic performance of noise-optimized virtual monoenergetic images (VMI+) in dual-energy CT (DECT) of portal vein thrombosis (PVT) compared to standard reconstructions. Method: This retrospective, single-center study included 107 patients (68 men; mean age, 60.1 ± 10.7 years) with malignant or cirrhotic liver disease and suspected PVT who had undergone contrast-enhanced portal-phase DECT of the abdomen. Linearly blended (M_0.6) and virtual monoenergetic images were calculated using both standard VMI and noise-optimized VMI+ algorithms in 20 keV increments from 40 to 100 keV. Quantitative measurements were performed in the portal vein for objective contrast-to-noise ratio (CNR) calculation. The image series showing the greatest CNR were further assessed for subjective image quality and diagnostic accuracy of PVT detection by two blinded radiologists. Results: PVT was present in 38 subjects. VMI+ reconstructions at 40 keV revealed the best objective image quality (CNR, 9.6 ± 4.3) compared to all other image reconstructions (p < 0.01). In the standard VMI series, CNR peaked at 60 keV (CNR, 4.7 ± 2.1). Qualitative image parameters showed the highest image quality rating scores for the 60 keV VMI+ series (median, 4) (p ≤ 0.03). The greatest diagnostic accuracy for the diagnosis of PVT was found for the 40 keV VMI+ series (sensitivity, 96%; specificity, 96%) compared to M_0.6 images (sensitivity, 87%; specificity, 92%), 60 keV VMI (sensitivity, 87%; specificity, 97%), and 60 keV VMI+ reconstructions (sensitivity, 92%; specificity, 97%) (p ≤ 0.01). Conclusions: Low-keV VMI+ reconstructions resulted in significantly improved diagnostic performance for the detection of PVT compared to other DECT reconstruction algorithms.
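The contrast-to-noise ratio reported above is conventionally computed as the attenuation difference between the target vessel and a background region, divided by the image noise; the abstract does not spell out its exact formula, so the following is a generic sketch with illustrative HU values, not study data:

```python
def contrast_to_noise_ratio(roi_hu, background_hu, noise_sd):
    """Contrast-to-noise ratio: attenuation difference between the target
    ROI and a background ROI, normalized by image noise (the standard
    deviation of a homogeneous background region)."""
    if noise_sd <= 0:
        raise ValueError("noise SD must be positive")
    return (roi_hu - background_hu) / noise_sd

# Illustrative values only: portal vein ~180 HU, perivascular fat ~-90 HU,
# background noise SD ~28 HU
cnr = contrast_to_noise_ratio(180.0, -90.0, 28.0)
```

Low-keV VMI+ series raise iodine attenuation faster than they raise noise, which is why the 40 keV reconstruction can yield the highest CNR despite its noisier appearance.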
This prospective study sought to evaluate potential savings of radiation dose to medical staff using real-time dosimetry coupled with visual radiation dose feedback during angiographic interventions. For this purpose, we analyzed a total of 214 angiographic examinations consisting of chemoembolizations and several other types of therapeutic interventions. The Unfors RaySafe i2 dosimeter was worn by the interventionalist at chest height over the lead protection. A total of 110 interventions were performed with real-time radiation dosimetry, allowing the interventionalist to react to elevated x-ray exposure, and 104 examinations served as the comparative group without real-time radiation monitoring. By using the real-time display during interventions, the overall mean operator radiation dose decreased from 3.67 (IQR, 0.95–23.01) to 2.36 μSv (IQR, 0.52–12.66) (−36%; p = 0.032), with operator exposure time simultaneously reduced by 4.5 min (p = 0.071). Dividing interventions into chemoembolizations and other types of therapeutic interventions, radiation dose decreased from 1.31 (IQR, 0.46–3.62) to 0.95 μSv (IQR, 0.53–3.11) and from 24.39 (IQR, 12.14–63.0) to 10.37 μSv (IQR, 0.85–36.84), respectively, using live-screen dosimetry (p ≤ 0.005). Radiation dose reductions were also observed for the participating assistants, indicating that they, too, could benefit from real-time visual feedback dosimetry during interventions (−30%; p = 0.039). Integration of real-time dosimetry into clinical processes might be useful in reducing occupational radiation exposure during angiographic interventions. The real-time visual feedback raised the awareness of interventionalists and their assistants of the potential danger of prolonged radiation exposure, leading to the adoption of radiation-sparing practices. It might therefore create a safer environment for the medical staff by keeping radiation exposure as low as possible.
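The percentage dose reductions quoted above follow from simple relative-change arithmetic; a minimal sketch reproducing the −36% figure from the reported overall means (function name is illustrative, not from the paper):

```python
def percent_reduction(before, after):
    """Relative reduction in percent between two dose measurements."""
    return 100.0 * (before - after) / before

# Overall mean operator dose reported in the study: 3.67 -> 2.36 microSv
overall = percent_reduction(3.67, 2.36)  # ~35.7, i.e. about -36%
```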
Purpose: To assess the diagnostic precision of three different workstations for measuring thoracic aortic aneurysms (TAAs) in vivo and ex vivo using either pre-interventional computed tomography angiography scans (CTA) or a specifically designed phantom model.
Methods: This retrospective study included 23 patients with confirmed TAA on routinely performed CTAs. In addition to phantom tube diameters, one experienced blinded radiologist evaluated the dimensions of TAAs on three different workstations in two separate rounds. Precision was assessed by calculating measurement errors. In addition, correlation analysis was performed using Pearson correlation.
Results: Measurements acquired at the Siemens workstation deviated by 3.54% (range, 2.78–4.03%; p = 0.14) from the true size, those at General Electric by 4.05% (range, 1.46–7.09%; p < 0.0001), and those at TeraRecon by 4.86% (range, 3.22–6.45%; p < 0.0001). Accordingly, the Siemens workstation was the most precise, although it also showed the most fluctuating values (scattering of 4.46%). TeraRecon had the smallest fluctuation (scattering of 2.83%) but the largest deviation from the true size of the phantom. The General Electric workstation showed a scattering of 2.94%. The highest overall correlation between the 1st and 2nd rounds was observed with measurements from Siemens (r = 0.898), followed by TeraRecon (r = 0.799) and General Electric (r = 0.703). Repetition of measurements reduced processing times by 40% with General Electric, by 20% with Siemens, and by 18% with TeraRecon.
Conclusions: All three workstations facilitated precise assessment of TAA dimensions in the majority of cases with simultaneously high reproducibility, ensuring accurate pre-interventional planning of thoracic endovascular aortic repair.
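The percent deviations from true size above are measurement errors relative to the known phantom dimension; a minimal sketch of that calculation (the 40 mm tube size and 41.45 mm reading are hypothetical, chosen only to land near the reported error range):

```python
def percent_error(measured_mm, true_mm):
    """Absolute measurement error relative to the true size, in percent."""
    return 100.0 * abs(measured_mm - true_mm) / true_mm

# Hypothetical example: a 40 mm phantom tube measured as 41.45 mm
err = percent_error(41.45, 40.0)  # 3.625% deviation
```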
Purpose: To identify transjugular intrahepatic portosystemic shunt (TIPS) thrombosis in abdominal CT scans applying quantitative image analysis.
Materials and methods: We retrospectively screened 184 patients to include 20 patients (male, 8; female, 12; mean age, 60.7 ± 8.87 years) with (case, n = 10) and without (control, n = 10) in-TIPS thrombosis who underwent clinically indicated contrast-enhanced and unenhanced abdominal CT followed by conventional TIPS-angiography between 08/2014 and 06/2020. First, images were scored visually. Second, region of interest (ROI) based quantitative measurements of CT attenuation were performed in the inferior vena cava (IVC), portal vein and in four TIPS locations. Minimum, maximum and average Hounsfield unit (HU) values were used as absolute and relative quantitative features. We analyzed the features with univariate testing.
Results: Subjective scores identified in-TIPS thrombosis in contrast-enhanced scans with an accuracy of 0.667–0.833. Patients with in-TIPS thrombosis had significantly lower average (p < 0.001), minimum (p < 0.001) and maximum HU (p = 0.043) in contrast-enhanced images. The in-TIPS / IVC ratio in contrast-enhanced images was significantly lower in patients with in-TIPS thrombosis (p < 0.001). No significant differences were found for unenhanced images. Analyzing the visually most suspicious ROI with consecutive calculation of its ratio to the IVC, all patients with a ratio < 1 suffered from in-TIPS thrombosis (p < 0.001, sensitivity and specificity = 100%).
Conclusion: Quantitative analysis of abdominal CT scans facilitates the stratification of in-TIPS thrombosis. In contrast-enhanced scans, an in-TIPS / IVC ratio < 1 could non-invasively stratify all patients with in-TIPS thrombosis.
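The in-TIPS / IVC decision rule above reduces to a single attenuation ratio against a threshold of 1; a minimal sketch with hypothetical ROI means (the HU values below are illustrative, not from the study cohort):

```python
def tips_ivc_ratio(tips_hu, ivc_hu, threshold=1.0):
    """Return the in-TIPS / IVC attenuation ratio and whether it falls
    below the decision threshold (< 1 indicated thrombosis in the study)."""
    ratio = tips_hu / ivc_hu
    return ratio, ratio < threshold

# Hypothetical ROI means: suspicious in-TIPS ROI 95 HU, IVC 160 HU
ratio, suspected = tips_ivc_ratio(95.0, 160.0)
```

Using the IVC as an in-scan reference normalizes for inter-patient differences in contrast timing, which is why the ratio stratifies better than raw HU values alone.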
Highlights
• Assessment of coronary artery plaque burden according to the CAC-DRS Score correlated well with pulmonary involvement of SARS-CoV-2 pneumonia (min. r=0.81, 95% CI 0.76 to 0.86).
• Visual and quantitative CAC-DRS Score of coronary artery plaque burden provided independent prognostic information on all-cause mortality in patients with SARS-CoV-2 pneumonia (p=0.0016 and p<0.0001, respectively).
• Incorporating CAC-DRS Score and pulmonary involvement into clinical decision making revealed great potential to discriminate patients with fatal outcomes from a mild course of disease (AUC 0.938, 95% CI 0.89 to 0.97) and the need for intensive care treatment (AUC 0.801, 95% CI 0.77 to 0.83).
Purpose: To assess and correlate pulmonary involvement and outcome of SARS-CoV-2 pneumonia with the degree of coronary plaque burden based on the CAC-DRS classification (Coronary Artery Calcium Data and Reporting System).
Methods: This retrospective study included 142 patients with confirmed SARS-CoV-2 pneumonia (58 ± 16 years; 57 women) who underwent non-contrast CT between January 2020 and August 2021 and were followed up for 129 ± 72 days. One experienced blinded radiologist analyzed CT series for the presence and extent of calcified plaque burden according to the visual and quantitative HU-based CAC-DRS Score. Pulmonary involvement was automatically evaluated with a dedicated software prototype by another two experienced radiologists and expressed as Opacity Score.
Results: CAC-DRS Scores derived from visual and quantitative image evaluation correlated well with the Opacity Score (r=0.81, 95% CI 0.76-0.86, and r=0.83, 95% CI 0.77-0.89, respectively; p<0.0001) with higher correlation in severe than in mild stage SARS-CoV-2 pneumonia (p<0.0001). Combined, CAC-DRS and Opacity Scores revealed great potential to discriminate fatal outcomes from a mild course of disease (AUC 0.938, 95% CI 0.89-0.97), and the need for intensive care treatment (AUC 0.801, 95% CI 0.77-0.83). Visual and quantitative CAC-DRS Scores provided independent prognostic information on all-cause mortality (p=0.0016 and p<0.0001, respectively), both in univariate and multivariate analysis.
Conclusions: Coronary plaque burden is strongly correlated to pulmonary involvement, adverse outcome, and death due to respiratory failure in patients with SARS-CoV-2 pneumonia, offering great potential to identify individuals at high risk.
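The AUC values quoted above summarize how well a combined score separates outcome groups. As an illustration only (not the authors' code), the empirical AUC equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative case:

```python
def empirical_auc(pos_scores, neg_scores):
    """Empirical AUC: probability that a randomly chosen positive case
    scores higher than a randomly chosen negative case (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy risk scores (not study data): fatal-outcome group vs. mild-course group
auc = empirical_auc([3, 4, 5], [1, 2, 3])
```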
Objectives: To assess the impact of noise-optimised virtual monoenergetic imaging (VMI+) on image quality and diagnostic evaluation in abdominal dual-energy CT scans with impaired portal-venous contrast.
Methods: We screened 11,746 patients who underwent portal-venous abdominal dual-energy CT for cancer staging between 08/2014 and 11/2019 and identified those with poor portal-venous contrast.
Standard linearly-blended image series and VMI+ image series at 40, 50, and 60 keV were reconstructed. Signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) of abdominal organs and vascular structures were calculated. Image noise, image contrast and overall image quality were rated by three radiologists using a 5-point Likert scale.
Results: 452 of 11,746 (4%) exams were poorly opacified. We excluded 190 cases due to incomplete datasets or multiple exams of the same patient, leaving a final study group of 262. Highest CNR values in all abdominal organs (liver, 6.4 ± 3.0; kidney, 17.4 ± 7.5; spleen, 8.0 ± 3.5) and vascular structures (aorta, 16.0 ± 7.3; intrahepatic vein, 11.3 ± 4.7; portal vein, 15.5 ± 6.7) were measured at 40 keV VMI+ with significantly superior values compared to all other series. In subjective analysis, highest image contrast was seen at 40 keV VMI+ (4.8 ± 0.4), whereas overall image quality peaked at 50 keV VMI+ (4.2 ± 0.5) with significantly superior results compared to all other series (p < 0.001).
Conclusions: Image reconstruction using the VMI+ algorithm at 50 keV significantly improves image contrast and image quality of originally poorly opacified abdominal CT scans and reduces the number of non-diagnostic scans.
Advances in knowledge: We validated the impact of VMI+ reconstructions in poorly attenuated DECT studies of the abdomen in a large-scale cohort.
Highlights
• MRI and ultrasound provided significant correlations between findings suggestive of vasculitis and the final diagnosis.
• Careful selection of available imaging techniques is warranted considering the time course, location, and clinical history.
• Considering its moderate diagnostic power to distinguish tracer uptake, a holistic view of PET/CT findings is essential.
Abstract
Purpose: To assess the diagnostic value of different imaging modalities in distinguishing systemic vasculitis from other internal and immunological diseases.
Methods: This retrospective study included 134 patients with suspected vasculitis who underwent ultrasound, magnetic resonance imaging (MRI), or 18F-fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) between 01/2010 and 01/2019, finally consisting of 70 individuals with vasculitis. The main study parameter was the confirmation of the diagnosis using one of the three different imaging modalities, with the adjudicated clinical and histopathological diagnosis as the gold standard. A secondary parameter was the morphological appearance of the vessel affected by vasculitis.
Results: Patients with systemic vasculitis had myriad clinical manifestations with joint pain as the most common symptom. We found significant correlations between different imaging findings suggestive of vasculitis and the final adjudicated clinical diagnosis. In this context, on MRI, vessel wall thickening, edema, and diameter differed significantly between vasculitis and non-vasculitis groups (p < 0.05). Ultrasound revealed different findings that may serve as red flags in identifying patients with vasculitis, such as vascular occlusion or halo sign (p = 0.02 vs. non-vasculitis group). Interestingly, comparing maximal standardized uptake values from PET/CT examinations with vessel wall thickening or vessel diameter did not result in significant differences (p > 0.05).
Conclusions: We observed significant correlations between different imaging findings suggestive of vasculitis on ultrasound or MRI and the final adjudicated diagnosis. While ultrasound and MRI were considered suitable imaging methods for detecting and discriminating typical vascular changes, 18F-FDG PET/CT requires careful timing and patient selection given its moderate diagnostic accuracy.
Background: This prospective randomized trial was designed to compare the performance of conventional transarterial chemoembolization (cTACE) using Lipiodol only with the additional use of degradable starch microspheres (DSM) for hepatocellular carcinoma (HCC) in BCLC stage B, based on metric tumor response. Methods: Sixty-one patients (44 men; 17 women; age range, 44–85 years) with HCC were evaluated in this IRB-approved, HIPAA-compliant study. The treatment protocol included three TACE sessions at 4-week intervals, in all cases with Mitomycin C as the chemotherapeutic agent. Multiparametric magnetic resonance imaging (MRI) was performed prior to the first and 4 weeks after the last TACE. Two treatment groups were determined using a randomization sheet: in 30 patients, TACE was performed using Lipiodol only (group 1); in 31 cases, Lipiodol was combined with DSM (group 2). Response according to tumor volume, diameter, mRECIST criteria, and the development of necrotic areas was analyzed and compared using the Mann–Whitney U test, Kruskal–Wallis H test, and Spearman's rho. Survival data were analyzed using the Kaplan–Meier estimator. Results: A mean overall tumor volume reduction of 21.45% (± 62.34%) was observed, with an average tumor volume reduction of 19.95% in group 1 vs. 22.95% in group 2 (p = 0.653). Mean diameter reduction was 6.26% (± 34.75%), 11.86% in group 1 vs. 4.06% in group 2 (p = 0.678). Regarding mRECIST criteria, group 1 versus group 2 showed complete response in 0 versus 3 cases, partial response in 2 versus 7 cases, stable disease in 21 versus 17 cases, and progressive disease in 3 versus 1 cases (p = 0.010). Estimated mean overall survival was 33.4 months (95% CI 25.5–41.4) for cTACE with Lipiodol plus DSM and 32.5 months (95% CI 26.6–38.4) for cTACE with Lipiodol only (p = 0.844).
Conclusions: The additional application of DSM during cTACE showed a significant benefit in tumor response according to mRECIST compared to cTACE with Lipiodol-only. No benefit in survival time was observed.
Objectives: To compare radiation dose and image quality of single-energy (SECT) and dual-energy (DECT) head and neck CT examinations performed with second- and third-generation dual-source CT (DSCT) in matched patient cohorts. Methods: 200 patients (mean age, 55.1 ± 16.9 years) who underwent venous-phase head and neck CT with a vendor-preset protocol were retrospectively divided into four equal groups (n = 50) matched by gender and BMI: second-generation DSCT (Group A, SECT, 100 kV; Group B, DECT, 80/Sn140 kV) and third-generation DSCT (Group C, SECT, 100 kV; Group D, DECT, 90/Sn150 kV). Assessment of radiation dose was performed for an average scan length of 27 cm. Contrast-to-noise ratio measurements and dose-independent figure-of-merit calculations of the submandibular gland, thyroid, internal jugular vein, and common carotid artery were analyzed quantitatively. Qualitative image parameters were evaluated regarding overall image quality, artifacts, and reader confidence using 5-point Likert scales. Results: Effective radiation dose (ED) was not significantly different between SECT and DECT acquisition for each scanner generation (p = 0.10). Significantly lower effective radiation dose values (p < 0.01) were observed for third-generation DSCT groups C (1.1 ± 0.2 mSv) and D (1.0 ± 0.3 mSv) compared to second-generation DSCT groups A (1.8 ± 0.1 mSv) and B (1.6 ± 0.2 mSv). Figure-of-merit/contrast-to-noise ratio analysis revealed superior results for third-generation DECT Group D compared to all other groups. Qualitative image parameters showed non-significant differences between all groups (p > 0.06). Conclusion: Contrast-enhanced head and neck DECT can be performed with second- and third-generation DSCT systems without a radiation penalty or impaired image quality compared with SECT, while third-generation DSCT is the most dose-efficient acquisition method.
Advances in knowledge: Differences in radiation dose between SECT and DECT of the dose-vulnerable head and neck region using DSCT systems have not been evaluated so far. Therefore, this study directly compares radiation dose and image quality of standard SECT and DECT protocols of second- and third-generation DSCT platforms.
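The dose-independent figure of merit used above is commonly defined as CNR squared per unit effective dose; the abstract does not state its exact formula, so this sketch assumes that common definition, with hypothetical numbers:

```python
def figure_of_merit(cnr, effective_dose_msv):
    """Dose-independent figure of merit, here taken as CNR squared per
    unit effective dose (a common definition; assumed, not stated in
    the abstract)."""
    if effective_dose_msv <= 0:
        raise ValueError("effective dose must be positive")
    return cnr ** 2 / effective_dose_msv

# Hypothetical comparison: the same CNR of 8 achieved at 1.0 mSv vs. 1.6 mSv
fom_third_gen = figure_of_merit(8.0, 1.0)
fom_second_gen = figure_of_merit(8.0, 1.6)
```

Normalizing by dose lets scanners with different ED levels be compared on image-quality efficiency rather than raw CNR.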
Objectives: To evaluate the predictive value of volumetric bone mineral density (BMD) assessment of the lumbar spine derived from phantomless dual-energy CT (DECT)-based volumetric material decomposition as an indicator of the 2-year occurrence risk of osteoporosis-associated fractures. Methods: L1 of 92 patients (46 men, 46 women; mean age, 64 years; range, 19–103 years) who had undergone third-generation dual-source DECT between 01/2016 and 12/2018 was retrospectively analyzed. For phantomless BMD assessment, dedicated DECT postprocessing software using material decomposition was applied. Digital patient records were reviewed for 2 years following DECT to obtain the incidence of osteoporotic fractures. Receiver operating characteristic (ROC) analysis was used to calculate cut-off values, and logistic regression models were used to determine associations of BMD, sex, and age with the occurrence of osteoporotic fractures. Results: A DECT-derived BMD cut-off of 93.70 mg/cm3 yielded 85.45% sensitivity and 89.19% specificity for predicting one or more osteoporosis-associated fractures within 2 years after BMD measurement. DECT-derived BMD was significantly associated with the occurrence of new fractures (odds ratio, 0.8710; 95% CI, 0.091–0.9375; p < .001), indicating a protective effect of increased DECT-derived BMD values. Overall AUC was 0.9373 (CI, 0.867–0.977; p < .001) for the differentiation of patients who sustained osteoporosis-associated fractures within 2 years of BMD assessment. Conclusions: Retrospective DECT-based volumetric BMD assessment can accurately predict the 2-year risk of sustaining an osteoporosis-associated fracture in at-risk patients without requiring a calibration phantom. Lower DECT-based BMD values are strongly associated with an increased risk of sustaining fragility fractures.
Key Points: Dual-energy CT–derived assessment of bone mineral density can identify patients at risk of sustaining osteoporosis-associated fractures with a sensitivity of 85.45% and a specificity of 89.19%. The DECT-derived BMD threshold for identification of at-risk patients lies above the American College of Radiology (ACR) QCT guideline threshold for the identification of osteoporosis (93.70 mg/cm3 vs 80 mg/cm3).
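The cut-off classification behind these sensitivity and specificity figures can be sketched as a simple threshold rule evaluated against observed outcomes; the toy cohort below is hypothetical, only the 93.70 mg/cm3 cut-off comes from the study:

```python
def diagnostic_performance(bmd_values, fractured, cutoff=93.70):
    """Classify patients as at risk when BMD < cutoff and compute
    sensitivity and specificity against observed fracture outcomes."""
    tp = fn = tn = fp = 0
    for bmd, frac in zip(bmd_values, fractured):
        at_risk = bmd < cutoff
        if frac:
            tp += at_risk
            fn += not at_risk
        else:
            fp += at_risk
            tn += not at_risk
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Toy cohort (not study data): BMD in mg/cm3 and 2-year fracture outcomes
sens, spec = diagnostic_performance([80.0, 95.0, 100.0, 110.0],
                                    [True, True, False, False])
```

In ROC analysis, the reported cut-off is the threshold that best balances exactly these two quantities across the cohort.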