Aims: Patients with cardiovascular comorbidities have a significantly increased risk of a critical course of COVID-19. As SARS-CoV-2 enters cells via the angiotensin-converting enzyme 2 (ACE2) receptor, drugs that interact with the renin-angiotensin-aldosterone system (RAAS) were suspected to influence disease severity.
Methods and results: We analyzed 1946 consecutive patients with cardiovascular comorbidities or hypertension enrolled in one of the largest European COVID-19 registries, the Lean European Open Survey on SARS-CoV-2 (LEOSS) registry. Here, we show that angiotensin II receptor blocker intake is associated with decreased mortality in patients with COVID-19 [OR 0.75 (95% CI 0.59–0.96; p = 0.013)]. This effect was mainly driven by patients who presented in an early phase of COVID-19 at baseline [OR 0.64 (95% CI 0.43–0.96; p = 0.029)]. Kaplan-Meier analysis revealed a significantly lower incidence of death in patients on an angiotensin receptor blocker (ARB) (n = 33/318; 10.4%) compared to patients using an angiotensin-converting enzyme inhibitor (ACEi) (n = 60/348; 17.2%) or patients who received neither an ACEi nor an ARB at baseline in the uncomplicated phase (n = 90/466; 19.3%; p < 0.034). Patients taking an ARB reached the mortality-predicting thresholds for leukocytes (p < 0.001), neutrophils (p = 0.002) and the inflammatory markers CRP (p = 0.021), procalcitonin (p = 0.001) and IL-6 (p = 0.049) significantly less frequently. ACE2 expression levels in human lung samples were not altered in patients taking RAAS modulators.
Conclusion: These data suggest a beneficial effect of ARBs on disease severity in patients with cardiovascular comorbidities and COVID-19, which is linked to dampened systemic inflammatory activity.
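The mortality comparison above can be illustrated with a crude odds ratio and Wald confidence interval computed directly from the raw event counts (33/318 deaths with an ARB vs. 90/466 without an ARB or ACEi). This is a minimal sketch for illustration only; the OR of 0.75 reported above comes from the registry's adjusted analysis and will differ from the unadjusted figure produced here.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = deaths in exposed, b = survivors in exposed,
    c = deaths in unexposed, d = survivors in unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Raw counts from the Kaplan-Meier comparison above: 33/318 deaths
# under ARB vs. 90/466 deaths with neither ARB nor ACEi.
or_, lo, hi = odds_ratio_ci(33, 318 - 33, 90, 466 - 90)
print(f"crude OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because this crude estimate ignores covariate adjustment, it should not be read as a replication of the registry result, only as a demonstration of the calculation.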
Background: Myocardial perfusion with cardiovascular magnetic resonance (CMR) imaging is an established diagnostic test for the evaluation of myocardial ischaemia. For quantification purposes, the 16-segment American Heart Association (AHA) model has limitations in extracting relevant information on the extent and severity of ischaemia, as perfusion deficits do not always fall within an individual segment; this reduces its diagnostic value and makes an accurate assessment of outcome data, or a comparison of results across studies, difficult. We hypothesised that division of the myocardial segments into epi- and endocardial layers and a further circumferential subdivision, resulting in a total of 96 segments, would improve the accuracy of detecting myocardial hypoperfusion. Higher (sub-)subsegmental recording of perfusion abnormalities, which are defined relative to the normal reference using the (sub-)subsegment with the highest value, may improve the spatial encoding of myocardial blood flow based on a single stress perfusion acquisition. Objective: A proof-of-concept comparison of subsegmentation approaches based on transmural segments (16 AHA and 48 segments) vs. subdivision into epi- and endocardial (32) subsegments vs. further circumferential subdivision into 96 (sub-)subsegments, assessing diagnostic accuracy against invasively defined obstructive coronary artery disease (CAD). Methods: Thirty patients with obstructive CAD and 20 healthy controls underwent stress perfusion CMR imaging at 3 T during maximal adenosine vasodilation and a dual-bolus injection of 0.1 mmol/kg gadobutrol. Using Fermi deconvolution for blood flow estimation, (sub-)subsegmental values were expressed relative to the (sub-)subsegment with the highest flow. In addition, endo-/epicardial flow ratios were calculated based on 32 and 96 (sub-)subsegments.
A receiver operating characteristic (ROC) curve analysis was performed to compare the diagnostic performance of discriminating between patients with CAD and healthy controls. Observer reproducibility was assessed using Bland-Altman approaches. Results: Subdivision into more and smaller segments revealed greater accuracy for #32, #48 and #96 compared to the standard #16 approach (area under the curve (AUC): 0.937, 0.973 and 0.993 vs. 0.820, p < 0.05). The #96-based endo-/epicardial ratio was superior to the #32 endo-/epicardial ratio (AUC 0.979 vs. 0.932, p < 0.05). Measurements for the #16 model showed marginally better reproducibility compared to #32, #48 and #96 (mean difference ± standard deviation: 2.0 ± 3.6 vs. 2.3 ± 4.0 vs. 2.5 ± 4.4 vs. 4.1 ± 5.6). Conclusions: Subsegmentation of the myocardium improves diagnostic accuracy and facilitates an objective, cutoff-based description of hypoperfusion, including the extent and severity of myocardial ischaemia. Quantification based on a single (stress-only) pass reduces the overall amount of gadolinium contrast agent required and the length of the overall diagnostic study.
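The AUC values reported above can be understood through the Mann-Whitney formulation of the ROC area: the probability that a randomly chosen patient with CAD has a more abnormal perfusion score than a randomly chosen control. A minimal sketch with hypothetical scores (not the study data):

```python
def auc_mann_whitney(cases, controls):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen case scores higher than
    a randomly chosen control (ties count as half)."""
    n_pairs = len(cases) * len(controls)
    wins = sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in cases for y in controls)
    return wins / n_pairs

# Hypothetical relative-perfusion abnormality scores (illustration only):
cad = [0.8, 0.7, 0.9, 0.6, 0.85]
healthy = [0.3, 0.5, 0.4, 0.6, 0.2]
print(auc_mann_whitney(cad, healthy))  # -> 0.98
```

An AUC of 0.5 corresponds to chance-level discrimination, 1.0 to perfect separation, which is why the step from 0.820 (#16) towards 0.993 (#96) represents a substantial gain.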
We sought to determine the effects of the use of a Bioengineered Combo Dual-Therapy CD34 Antibody-Covered Sirolimus-Eluting Coronary Stent (Combo® DTS) in patients with chronic total occlusion (CTO) by evaluating clinical outcomes and by performing an optical coherence tomography (OCT) analysis. We retrospectively analyzed data from 39 patients who had successfully undergone OCT-guided revascularization of a CTO treated with a Combo® DTS. Clinical assessment, angiography (with quantitative coronary angiography analysis) and OCT examination were performed at baseline and at follow-up. The median follow-up period was 189 days, ranging from 157 to 615 days. At follow-up, revascularization was required due to angiographic restenosis in 40% (14 of 35) of patients. OCT analysis detected neointima proliferation in 23 (76.6%) patients. Neointima formation was often associated with microvessels (18 patients, 60%). Neoatherosclerosis was observed in 2 (6.6%) patients. Malapposition was found in 4 patients (13.3%), and stent fractures were found in 11 patients (36.6%). The strut coverage rate was 96.3% at follow-up. In conclusion, the implantation of a Combo® DTS after successful CTO recanalization was associated with a restenosis rate of 40% despite good stent implantation at baseline, as proven by OCT. Neointima formation was found to be a main contributor to restenosis. Nevertheless, we observed a low rate of major cardiovascular events during follow-up.
Myocardial fibrosis and inflammation by CMR predict cardiovascular outcome in people living with HIV
(2021)
Objectives: The goal of this study was to examine prognostic relationships between cardiac imaging measures and cardiovascular outcome in people living with human immunodeficiency virus (HIV) (PLWH) on highly active antiretroviral therapy (HAART).
Background: PLWH have a higher prevalence of cardiovascular disease and heart failure (HF) compared with the noninfected population. The pathophysiological drivers of myocardial dysfunction and worse cardiovascular outcome in HIV remain poorly understood.
Methods: This prospective observational longitudinal study included consecutive PLWH on long-term HAART undergoing cardiac magnetic resonance (CMR) examination for assessment of myocardial volumes and function, T1 and T2 mapping, perfusion, and scar. Time-to-event analysis was performed from the index CMR examination to the first single event per patient. The primary endpoint was an adjudicated adverse cardiovascular event (cardiovascular mortality, nonfatal acute coronary syndrome, an appropriate device discharge, or a documented HF hospitalization).
Results: A total of 156 participants (62% male; age [median, interquartile range]: 50 years [42 to 57 years]) were included. During a median follow-up of 13 months (9 to 19 months), 24 events were observed (4 HF deaths, 1 sudden cardiac death, 2 nonfatal acute myocardial infarctions, 1 appropriate device discharge, and 16 HF hospitalizations). Patients with events had higher native T1 (median [interquartile range]: 1,149 ms [1,115 to 1,163 ms] vs. 1,110 ms [1,075 to 1,138 ms]); native T2 (40 ms [38 to 41 ms] vs. 37 ms [36 to 39 ms]); left ventricular (LV) mass index (65 g/m2 [49 to 77 g/m2] vs. 57 g/m2 [49 to 64 g/m2]); and N-terminal pro–B-type natriuretic peptide (109 pg/l [25 to 337 pg/l] vs. 48 pg/l [23 to 82 pg/l]) (all p < 0.05). In multivariable analyses, native T1 was independently predictive of adverse events (chi-square test, 15.9; p < 0.001; native T1 [10 ms] hazard ratio [95% confidence interval]: 1.20 [1.08 to 1.33]; p = 0.001), followed by a model that also included LV mass (chi-square test, 17.1; p < 0.001). Traditional cardiovascular risk scores were not predictive of the adverse events.
Conclusions: Our findings reveal important prognostic associations of diffuse myocardial fibrosis and LV remodeling in PLWH. These results may support development of personalized approaches to screening and early intervention to reduce the burden of HF in PLWH (International T1 Multicenter Outcome Study; NCT03749343).
Background. Transcatheter aortic valve implantation (TAVI) is currently recommended for patients with severe aortic stenosis at intermediate or high surgical risk. The decision process during TAVI evaluation includes a thorough benefit-risk assessment, and knowledge about long-term benefits and outcomes may improve patients’ expectation management. Objective. To evaluate patients’ perceived health status and self-reported long-term outcome more than 5 years after TAVI. Methods and Results. Demographic and procedure data were obtained from all patients treated with TAVI at our institution from 2006 to 2012. A cross-sectional survey was conducted on the patients alive, measuring health status, including the EQ-5D-5L questionnaire, and clinical outcomes. 103 patients (22.8%) were alive at a median follow-up of 7 years (5.4–9.8). Of these, 99 (96%) were included in the final analysis. The mean age at follow-up was 86.5 ± 8.0 years, and 56.6% were female. Almost all patients (93.9%) described an improvement in their quality of life after receiving TAVI. At late follow-up, the mean utility index and EQ-VAS score were 0.80 ± 0.20 and 58.49 ± 11.49, respectively. Mobility was the most frequently reported limitation (85.4%), while anxiety/depression was the least frequently reported (19.8%). With respect to functional class, 64.7% were in New York Heart Association (NYHA) class III or IV, compared to 67.0% prior to TAVI (p = 0.51). Self-reported long-term outcomes revealed mainly low long-term complication rates. A total of 74 hospitalizations were reported after TAVI, 43% of which were for cardiovascular reasons. Among cardiovascular rehospitalizations, new pacemaker implantations were the most frequently reported (18.9%), followed by cardiac decompensation and coronary heart disease (15.6%). Conclusion. The majority of the patients described an improvement in health status after TAVI.
More than five years after TAVI, the patients’ perceived health status was satisfactory, and the incidence of clinical events and hospitalizations was very low.
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during transcatheter aortic valve implantation (TAVI) in nonventilated patients and its impact on procedural outcome. Methods and Results: We measured ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (female 58%, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post-RVP 62.6% vs. 53.9% during RVP, p < .01; pre-RVP 61.6% vs. post-RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and more often experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: During RVP, ScO2 declined significantly compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and is a valid cut-off value to screen for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk of cerebrovascular complications and delirium.
Sleep-disordered breathing (SDB) is a frequent comorbidity in patients with cardiac disease. Nevertheless, the prevalence of SDB and its relationship to severe primary mitral regurgitation (PMR) have not been well investigated to date. Methods: A cohort of 121 patients with significant PMR undergoing mitral valve surgery was prospectively enrolled and received a cardiorespiratory single-night polygraphy screening using ApneaLink before surgery. Eighty-two of them underwent a follow-up examination, including a follow-up single-night sleep study, 3 months after surgery. Results: The mean age of patients was 65.3 ± 12.0 years. Sixty patients (49.6%) were female. The mean EuroSCORE II was 2.5 ± 2.4%. Initially, 91 (75.2%) patients presented with SDB, among whom 50.4% (46 patients, 38.0% of the total cohort) were classified as moderate to severe. These patients tended to require significantly longer postoperative intensive care and mechanical ventilation. Among the 82 patients who completed follow-up exams, mitral valve surgery led to a significant reduction in relevant SDB (20.7%). The apnea-hypopnea index (from 11/h [4;18] to 4/h [3;14], p = 0.04), the oxygen desaturation index (from 8/h [3;18] to 5/h [3;12], p = 0.008) and the saturation time below 90% (from 32 min [13;86] to 18 min [5;36], p = 0.005) all improved significantly. Conclusion: The prevalence of SDB is very high in patients with severe primary mitral regurgitation and may contribute to postoperative complications and prolonged intensive care. A significantly reduced but still high prevalence of SDB was observed 3 months after mitral valve surgery, highlighting the bidirectional relationship between SDB and heart failure.
Improved integration of single cell transcriptome data demonstrated on heart failure in mice and men
(2023)
Biomedical research frequently uses murine models to study disease mechanisms. However, the translation of these findings to human disease remains a significant challenge. In order to improve the comparability of mouse and human data, we present a cross-species integration pipeline for single-cell transcriptomic assays.
The pipeline merges expression matrices and assigns unambiguous orthologous relationships. Starting from Ensembl ortholog assignments, we allocated 82% of mouse genes to unique orthologs by using additional publicly available resources such as the UniProt and NCBI databases. For genes with multiple matches, we employed Needleman-Wunsch global alignment on either amino acid or nucleotide sequences to identify the ortholog with the highest degree of similarity.
The workflow was tested for functionality and efficiency by integrating scRNA-seq datasets from heart failure patients with the corresponding mouse model. We were able to assign unique human orthologs to up to 80% of the mouse genes, utilizing the 17,492 known orthologous pairs. Notably, the integration process enabled the identification of both common and unique regulatory pathways between species in heart failure.
In conclusion, our pipeline streamlines the integration process, enhances gene nomenclature alignment and simplifies the translation of mouse models to human disease. We have made the OrthoIntegrate R-package accessible on GitHub (https://github.com/MarianoRuzJurado/OrthoIntegrate), which includes the assignment of ortholog definitions for human and mouse, as well as the pipeline for integrating single cells.
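The ortholog-disambiguation step described above (choosing, among several candidate orthologs, the one whose sequence aligns best to the mouse gene) can be sketched with a minimal Needleman-Wunsch implementation. This is an illustrative toy with a simple match/mismatch/gap scoring scheme; OrthoIntegrate's actual scoring and substitution matrices may differ, and the gene names and sequences below are hypothetical.

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Global alignment score (Needleman-Wunsch) between two sequences."""
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):        # leading gaps in b
        score[i][0] = i * gap
    for j in range(1, cols):        # leading gaps in a
        score[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]

def best_ortholog(mouse_seq, human_candidates):
    """Among multiple (name, sequence) candidates, keep the one whose
    sequence aligns best to the mouse sequence."""
    return max(human_candidates,
               key=lambda cand: needleman_wunsch(mouse_seq, cand[1]))[0]

# Hypothetical example: two candidate human orthologs for one mouse gene.
picked = best_ortholog("MKVLAA", [("GENE_A", "MKVLAT"), ("GENE_B", "MKVLAA")])
print(picked)  # -> GENE_B
```

In practice one would align full protein or transcript sequences and fall back to nucleotide alignment where no protein sequence is available, as described above.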
Background: Patients with chronic kidney disease (CKD) are at high risk of myocardial infarction. Cardiac troponins are the biomarkers of choice for the diagnosis of acute myocardial infarction (AMI) without ST-segment elevation (NSTE). In patients with CKD, troponin levels are often chronically elevated, which reduces their diagnostic utility when NSTE-AMI is suspected. The aim of this study was to derive a diagnostic algorithm for serial troponin measurements in patients with CKD and suspected NSTE-AMI.
Methods and Results: Two cohorts, 1494 patients from a prospective cohort study with high-sensitivity troponin I (hs-cTnI) measurements and 7059 cases from a clinical registry with high-sensitivity troponin T (hs-cTnT) measurements, were analyzed. The prospective cohort comprised 280 CKD patients (estimated glomerular filtration rate <60 mL/min/1.73 m2). The registry data set contained 1581 CKD patients. In both cohorts, CKD patients were more likely to have adjudicated NSTE-AMI than non-CKD patients. The specificities of hs-cTnI and hs-cTnT to detect NSTE-AMI were reduced with CKD (0.82 versus 0.91 for hs-cTnI and 0.26 versus 0.73 for hs-cTnT) but could be restored by applying optimized cutoffs to either the first or a second measurement after 3 hours. The best diagnostic performance was achieved with an algorithm that incorporates serial measurements and rules in or out AMI in 69% (hs-cTnI) and 55% (hs-cTnT) of CKD patients.
Conclusions: The diagnostic performance of high-sensitivity cardiac troponins in patients with CKD with suspected NSTE-AMI is improved by use of an algorithm based on admission troponin and dynamic changes in troponin concentration.
Background: The treatment of patients presenting with possible acute myocardial infarction (AMI) is based on timely diagnosis and proper risk stratification aided by biomarkers. We aimed to evaluate the predictive value of GDF-15 in patients presenting with symptoms suggestive of AMI.
Methods: Consecutive patients presenting with suspected AMI were enrolled in three study centers. Cardiovascular events were assessed during a follow-up period of 6 months with a combined endpoint of death or MI.
Results: Of the 1818 enrolled patients (m/f = 1208/610), 413 (22.7%) had an acute MI and 63 patients reached the combined endpoint. Patients with MI and patients with an adverse outcome had higher GDF-15 levels compared with non-MI patients (967.1 pg/mL vs. 692.2 pg/mL, p < 0.001) and with event-free patients (1660 pg/mL vs. 756.6 pg/mL, p < 0.001), respectively. GDF-15 levels were lower in patients with a SYNTAX score ≤ 22 (797.3 pg/mL vs. 947.2 pg/mL, p = 0.036). Increased GDF-15 levels on admission were associated with a hazard ratio for death or MI of 2.1 (95% CI: 1.67–2.65, p < 0.001) in a model adjusted for age and sex, and of 1.57 (1.13–2.19, p = 0.008) when adjusted for the GRACE score variables. GDF-15 provided relevant reclassification beyond the GRACE score, with an overall net reclassification index (NRI) of 12.5% and an integrated discrimination improvement (IDI) of 14.56% (p = 0.006).
Conclusion: GDF-15 is an independent predictor of future cardiovascular events in patients presenting with suspected MI. GDF-15 levels correlate with the severity of CAD and can identify and risk-stratify patients who need coronary revascularization.
The use of cardiac troponins (cTn) is the gold standard for diagnosing myocardial infarction. Independent of myocardial infarction (MI), however, sex, age and kidney function affect cTn levels. Here we developed a method to adjust cTnI levels for age, sex and renal function while maintaining a unified cut-off value such as the 99th percentile. A total of 4587 individuals enrolled in a prospective longitudinal study were used to develop a model for the adjustment of cTn. cTnI levels correlated with age and estimated glomerular filtration rate (eGFR) in males/females with r(age) = 0.436/0.518 and r(eGFR) = −0.142/−0.207. For adjustment, these variables served as covariates in a linear regression model with cTnI as the dependent variable. This adjustment model was then applied to a real-world cohort of 1789 patients with suspected acute MI (AMI), 407 of whom had AMI. Adjusting cTnI showed no relevant loss of diagnostic information, as evidenced by comparable areas under the receiver operating characteristic curves for identifying AMI in males and females with adjusted and unadjusted cTnI. In specific patient groups, such as elderly females, adjusting cTnI improved specificity for AMI compared with unadjusted cTnI. Specificity was also improved in patients with renal dysfunction by using the adjusted cTnI values. Thus, the adjustments improved the diagnostic ability of cTnI to identify AMI in elderly patients and in patients with renal dysfunction. The interpretation of cTnI values in complex emergency cases is facilitated by our method, which maintains a single diagnostic cut-off value for all patients.
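The adjustment idea can be sketched as follows: fit a linear regression of cTnI on the covariates in a reference population, then subtract each patient's modeled covariate effect relative to a reference subject, so that a single cut-off applies to everyone. This is a simplified illustration on synthetic data, assuming a linear model with age and eGFR only (the published model also accounts for sex); all numbers, coefficients and reference values below are hypothetical.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting.
    Each row of X is [1, age, eGFR]."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                       # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):             # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def adjust_ctni(value, age, egfr, beta, ref_age=50, ref_egfr=90):
    """Remove the modeled age/eGFR effect relative to a reference subject,
    so one 99th-percentile cut-off can be applied to every patient."""
    return value - beta[1] * (age - ref_age) - beta[2] * (egfr - ref_egfr)

# Synthetic reference population: cTnI driven linearly by age and eGFR
# (hypothetical coefficients, for illustration only).
ages_egfr = [(30, 100), (40, 70), (50, 90), (60, 60),
             (70, 80), (45, 95), (65, 65), (55, 75)]
X = [[1.0, a, e] for a, e in ages_egfr]
y = [2.0 + 0.1 * a - 0.05 * e for a, e in ages_egfr]
beta = fit_linear(X, y)   # recovers [2.0, 0.1, -0.05] on noiseless data
```

After fitting, `adjust_ctni(6.0, 70, 60, beta)` maps an elderly patient with reduced eGFR onto the scale of the reference subject, where the unified cut-off is applied.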
Background: Common ECG criteria such as ST-segment changes are of limited value in patients with suspected acute myocardial infarction (AMI) and bundle branch block or a wide QRS complex. A large proportion of these patients do not suffer from an AMI, whereas those with an ST-elevation myocardial infarction (STEMI)-equivalent AMI benefit from aggressive treatment. The aim of the present study was to evaluate the diagnostic information of cardiac troponin I (cTnI) in hemodynamically stable patients with a wide QRS complex and suspected AMI.
Methods: A prolonged QRS duration was observed in 417 of 1818 patients with suspected AMI presenting consecutively between 01/2007 and 12/2008 in a prospective multicenter observational study. Of these, n = 117 showed significant obstructive coronary artery disease (CAD), which was used as the diagnostic outcome variable. cTnI was determined at admission.
Results: Patients with significant CAD had higher cTnI levels compared to individuals without (median 250 ng/L vs. 11 ng/L; p < 0.01). To identify patients needing a coronary intervention, cTnI yielded an area under the receiver operating characteristic curve of 0.849. Optimized cut-offs with respect to a sensitivity-driven rule-out and a specificity-driven rule-in strategy were established (40 ng/L and 96 ng/L, respectively). Application of the specificity-optimized cut-off value led to a positive predictive value of 71%, compared to 59% when using the 99th percentile cut-off. The sensitivity-optimized cut-off value was associated with a negative predictive value of 93%, compared to 89% provided by application of the 99th percentile threshold.
Conclusion: cTnI determined in hemodynamically stable patients with suspected AMI and wide QRS complex using optimized diagnostic thresholds improves rule-in and rule-out with respect to presence of a significant obstructive CAD.
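The rule-in/rule-out logic above amounts to computing predictive values at two different thresholds on the same admission cTnI measurement. A minimal sketch with hypothetical patient data, reusing the abstract's cut-offs of 40 ng/L (sensitivity-driven rule-out) and 96 ng/L (specificity-driven rule-in):

```python
def predictive_values(values, has_cad, cutoff):
    """Positive and negative predictive values for a single cut-off
    (positive test = value above cut-off).
    values: admission cTnI in ng/L; has_cad: 1 if significant obstructive CAD."""
    tp = sum(1 for v, d in zip(values, has_cad) if v > cutoff and d)
    fp = sum(1 for v, d in zip(values, has_cad) if v > cutoff and not d)
    fn = sum(1 for v, d in zip(values, has_cad) if v <= cutoff and d)
    tn = sum(1 for v, d in zip(values, has_cad) if v <= cutoff and not d)
    return {"ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

# Hypothetical admission cTnI values (ng/L) and CAD status (illustration only):
ctni = [250, 120, 30, 10, 500, 15, 80, 5]
cad = [1, 1, 0, 0, 1, 0, 1, 0]

rule_in = predictive_values(ctni, cad, 96)   # high cut-off: trust positives
rule_out = predictive_values(ctni, cad, 40)  # low cut-off: trust negatives
```

Raising the cut-off makes a positive result more trustworthy (higher PPV), lowering it makes a negative result more trustworthy (higher NPV); the two optimized thresholds exploit both directions at once.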
Background: The introduction of modern troponin assays has facilitated the diagnosis of acute myocardial infarction due to improved sensitivity, with a corresponding loss of specificity. Atrial fibrillation (AF) is associated with elevated troponin levels. The aim of the present study was to evaluate the diagnostic performance of troponin I in patients with suspected acute coronary syndrome and chronic AF.
Methods: Contemporary sensitive troponin I was assayed in a derivation cohort of 90 patients with suspected acute coronary syndrome and chronic AF to establish diagnostic cut-offs. These thresholds were validated in an independent cohort of 314 patients with suspected myocardial infarction and AF upon presentation. Additionally, changes in troponin I concentration within 3 hours were used.
Results: In the derivation cohort, optimized thresholds with respect to a rule-out strategy with high sensitivity and a rule-in strategy with high specificity were established. In the validation cohort, application of the rule-out cut-off led to a negative predictive value of 97%. The rule-in cut-off was associated with a positive predictive value of 88%, compared with 71% when using the 99th percentile cut-off. In patients with troponin I levels above the specificity-optimized threshold, additional use of the 3-hour change in absolute/relative concentration resulted in a further improved positive predictive value of 96%/100%.
Conclusions: Troponin I concentration and the 3-hour change in its concentration provide valid diagnostic information in patients with suspected myocardial infarction and chronic AF. With regard to AF-associated elevation of troponin levels, application of diagnostic cut-offs other than the 99th percentile might be beneficial.