In the past decades, coronary imaging has evolved into a valuable adjunct to angiography, providing scientific insights into vascular biology and practical guidance through direct visualization of atherosclerosis and other pathologic conditions within the vessel wall. With intravascular ultrasound (IVUS) in particular, the signal penetrates below the luminal surface, so the entire cross-section of an artery, including the complete thickness of the plaque, can be imaged in real time. Optical coherence tomography (OCT), in turn, offers higher image resolution of both the plaque and the luminal surface. These technologies offer the opportunity to gather diagnostic information about the process of atherosclerosis and to directly observe the effects of various interventions on the plaque and arterial wall. IVUS has proven to be a practical and useful tool in the evaluation and optimal guidance of interventional vascular medicine. In this review, we detail the current modalities of coronary imaging and their usefulness in the diagnosis and management of patients with high-risk coronary plaques.
Subclinical leaflet thrombosis (SLT) following transcatheter aortic valve replacement (TAVR) has been increasingly recognized. SLT has the hallmark feature of hypo-attenuated leaflet thickening (HALT) on multidetector computed tomography (MDCT), which may result in hypoattenuation affecting motion (HAM). The actual prevalence of this condition is uncertain, with only limited observational registries available. SLT has caught the attention of the cardiovascular community because of concerns regarding its clinical sequelae, specifically a potential increased incidence of cerebrovascular events. Available, albeit sparse, data suggest that when left untreated, SLT may lead to valve deterioration with potential hemodynamic compromise and potentially clinically overt prosthesis thrombosis. Some clinicians have opted to treat patients with SLT with anticoagulation. Although anticoagulation may be a rational treatment option, few data exist on the safety and efficacy of this treatment. This is particularly important considering that TAVR patients also have a higher bleeding risk than the general population. In this review, we aim to summarize the current evidence on SLT, explore its pathophysiological mechanism, and discuss the current treatment options and future trials that may clarify the optimal antithrombotic strategies for SLT.
Background:Patients with concomitant atrial fibrillation (AF) and coronary stenting are at high risk for both cardiovascular and bleeding events. We aimed to evaluate the influence of sex on long-term clinical outcomes in this patient subset.
Methods and Results:We identified 1,450 patients with AF and coronary stenting in a patient-level pooled database from 3 Japanese studies, and compared 3-year clinical outcomes between men (n=1,075) and women (n=375). The cumulative 3-year incidence of all-cause death was significantly higher in women than in men (26.5% vs. 17.2%, log-rank P<0.001), although after adjusting for confounders, the excess mortality risk of women relative to men was no longer significant (hazard ratio (HR): 1.12, 95% confidence interval (CI): 0.85–1.46, P=0.42). There were no significant differences in the adjusted 3-year risks for myocardial infarction or stroke between men and women (HR: 1.25, 95% CI: 0.62–2.40, P=0.52, and HR: 1.15, 95% CI: 0.75–1.74, P=0.52, respectively). However, both the cumulative 3-year incidence of and adjusted risk for major bleeding were significantly higher in women than in men (17.0% vs. 11.3%, log-rank P=0.002, and HR: 1.47, 95% CI: 1.03–2.07, P=0.03).
Conclusions:Among patients with concomitant AF and coronary stenting, there were no significant differences in the adjusted 3-year risks for all-cause death, myocardial infarction, and stroke between men and women. However, women had an excess adjusted risk for major bleeding compared with men.
Background:Although the prevalence of both atrial fibrillation (AF) and metabolic syndrome (MetS) has been increasing in East Asia, the association between them is uncertain.
Methods and Results:A total of 24,741 middle-aged Korean men without baseline AF were enrolled in a health screening program from January 2003 to December 2008. Among them, 21,981 subjects were evaluated to determine the risk of AF based on baseline MetS status through December 2016. At every visit, the subjects were evaluated for AF using ECG. MetS was defined using the criteria of the International Diabetes Federation and was present in 2,529 subjects (11.5%). Mean (±standard deviation) age was 45.9±5.3 years. During a mean follow-up of 8.7 years, 168 subjects (0.8%) were diagnosed with AF. The age-adjusted and multivariate-adjusted hazard ratios (HR) for AF associated with MetS were 1.62 (P=0.02) and 1.57 (P=0.03), respectively. Among the components of MetS, central obesity (age-adjusted HR 1.62, P<0.01) and raised blood pressure (age-adjusted HR 1.43, P=0.02) were associated with an increased risk of AF.
Conclusions:MetS is associated with an increased risk of AF in middle-aged East Asian men. Of the components of MetS, central obesity is the most potent risk factor for the development of AF in this population.
Background:Atrial fibrillation (AF) frequently coexists with heart failure (HF) with reduced ejection fraction (EF). This meta-analysis compared AF control strategies, that is, rhythm vs. rate, and catheter ablation (CA) vs. anti-arrhythmic drugs (AAD) in patients with AF combined with HF.
Methods and Results:The MEDLINE, EMBASE, and CENTRAL databases were searched, and 13 articles from 11 randomized controlled trials with 5,256 patients were included in this meta-analysis. The outcomes were echocardiographic parameters (left ventricular EF [LVEF], left atrial [LA] size, and left ventricular end-systolic volume [LVESV]), clinical outcomes (mortality, hospitalization, and thromboembolism), exercise capacity, and quality of life (QOL). In a random effects model, rhythm control was associated with higher LVEF, better exercise capacity, and better QOL than rate control. When the 2 rhythm control strategies were compared (CA vs. AAD), the CA group had significantly decreased LA size and LVESV and improved LVEF and 6-min walk distance; however, mortality, hospitalization, and thromboembolism rates did not differ between the rhythm and rate control groups.
Conclusions:In AF combined with HF, even though mortality, hospitalization and thromboembolism rates were similar, a rhythm control strategy was superior to rate control in terms of improvement in LVEF, exercise capacity, and QOL. In particular, the CA group was superior to the AAD group for reversal of cardiac remodeling.
Background:Although increasing evidence suggests that epicardial adipose tissue volume (EATV) is associated with atrial fibrillation (AF), it is controversial whether there is a dose-response relationship of increasing EATV along the continuum of AF. We evaluated the effect of the EATV on the prevalence of paroxysmal AF (PAF) and persistent AF (PeAF) and the relationships with cardiac structure and functional remodeling.
Methods and Results:Subjects who underwent multidetector computed tomography (MDCT) coronary angiography because of symptoms suggestive of coronary artery disease were divided into sinus rhythm (SR) (n=112), PAF (n=133), and PeAF (n=71) groups. The EATV index (EATV/body surface area, mL/m2) was strongly associated with the prevalence of PAF and PeAF in a model adjusted for known AF risk factors. The effect of the EATV index on the prevalence of PeAF, but not on that of PAF, was modified by the left atrial (LA) dimension, suggesting that enlargement of the LA dimension is related to EATV expansion in PeAF. The cutoff value of the EATV index for the presence of AF was higher in PeAF than in PAF (64 vs. 55 mL/m2, P<0.01).
Conclusions:The EATV index is associated with the prevalence of PAF and PeAF, and its cutoff values are predictive for PAF and PeAF development independently of other AF risk factors.
Background:The effect of remote ischemic preconditioning (RIPC) on periprocedural myocardial damage (pMD) in patients undergoing percutaneous coronary intervention (PCI) is controversial. The aim of this study was to investigate the effect of RIPC or intravenous nicorandil on pMD following elective PCI in a subgroup of patients with complex coronary lesions from a multicenter randomized controlled trial.
Methods and Results:Patients with stable angina who underwent elective PCI were assigned to 1 of 3 groups: control, upper-limb RIPC, or intravenous nicorandil. The major outcome was the incidence of pMD following PCI, with pMD defined as an elevated level of high-sensitivity cardiac troponin T or creatine kinase myocardial band at 12 or 24 h after PCI. A total of 171 patients with complex coronary lesions (ACC-AHA coronary classification type B2 or C) were analyzed. The incidence of pMD following PCI was significantly lower in the RIPC group than in the control group (44.4% vs. 66.1%; P=0.023). The adjusted odds ratio (95% confidence interval) for pMD in the RIPC group vs. controls was 0.41 (0.18−0.94). The incidence of pMD in the nicorandil group was not significantly reduced compared with the control group.
Conclusions:This substudy suggested that RIPC prior to PCI prevented pMD in patients with complex coronary lesions. Further investigation in a multicenter prospective study is needed to confirm these results.
Background:Minor ST-T changes are frequently observed on the electrocardiogram (ECG), but the risk of stroke associated with such changes is unclear.
Methods and Results:In 10,642 subjects from the Japanese general population, we evaluated minor and major ST-T changes (major: ST depression ≥0.1 mV) on ECGs obtained at annual health examinations. At baseline, minor ST-T changes were found in 10.7% of the subjects and major ST-T changes in 0.5%. Minor ST-T changes were associated with older age, female gender, higher systolic blood pressure, presence of hyperlipidemia, and use of antihypertensive medication. There were 375 stroke events during the follow-up period (128.7±28.1 months). In all subjects, minor ST-T changes (HR, 2.10; 95% CI: 1.57–2.81) and major ST-T changes (HR, 8.64; 95% CI: 4.44–16.82) were associated with an increased risk of stroke, but the stroke risk associated with minor ST-T changes had borderline significance after adjustment for conventional risk factors (P=0.055). In subgroup analysis, the risk of stroke was significantly associated with minor ST-T changes in subjects with hyperlipidemia (HR, 1.75; 95% CI: 1.15–2.67) compared with those without hyperlipidemia (HR, 1.01; 95% CI: 0.64–1.59; P for interaction=0.016), even after adjustment for ECG-diagnosed left ventricular hypertrophy.
Conclusions:Minor ST-T changes were particularly associated with a higher risk of stroke in subjects with hyperlipidemia and this association was independent of electrocardiographic left ventricular hypertrophy.
Background:Predicting future coronary artery disease (CAD) risk by model-based approaches can facilitate identification of high-risk individuals for prevention and management. Therefore, we compared the consistency and performance of various CAD models for primary prevention using 1 external validation dataset from a national representative cohort in Taiwan.
Methods and Results:The 10 CAD prediction models were assessed in a validation cohort of 3,559 participants (≥35 years old, 53.5% women) from a Taiwanese national representative cohort that was followed up for a median of 9.70 (interquartile range, 9.63–9.74) years; 63 participants were documented as developing CAD events. The overall κ value was 0.51 for all 10 models, and was higher for women than for men (0.53 vs. 0.40). In addition, the areas under the receiver operating characteristic curves ranged from 0.804 (95% confidence interval, 0.758–0.851) to 0.847 (95% confidence interval, 0.805–0.889). Non-significant chi-square values for all models indicated good calibration.
Conclusions:Our study demonstrated that these 10 CAD prediction models for primary prevention are feasible and valid for use in Taiwanese subjects. Further studies of screening and management are warranted.
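Several abstracts in this collection summarize model discrimination with the area under the receiver operating characteristic curve (AUC), as in the validation above. As a rough illustration (not code from any of these studies), the AUC can be computed directly from its rank interpretation: the probability that a randomly chosen case is assigned a higher predicted risk than a randomly chosen non-case, with ties counted as half. All scores below are hypothetical.

```python
def auc(case_scores, noncase_scores):
    """AUC as the fraction of case/non-case pairs ranked correctly,
    counting tied scores as half a correct ranking (Mann-Whitney form)."""
    wins = 0.0
    for c in case_scores:
        for n in noncase_scores:
            if c > n:
                wins += 1.0
            elif c == n:
                wins += 0.5
    return wins / (len(case_scores) * len(noncase_scores))

# Hypothetical predicted risks for 3 subjects who developed CAD (cases)
# and 4 who did not (non-cases)
cases = [0.30, 0.22, 0.15]
noncases = [0.10, 0.12, 0.22, 0.05]
print(auc(cases, noncases))  # 10.5 of 12 pairs ranked correctly -> 0.875
```

An AUC of 0.5 corresponds to a model no better than chance, while the 0.80–0.85 range reported above indicates good discrimination.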
Background:QRS duration (QRSd) and morphology are established response predictors of cardiac resynchronization therapy (CRT). However, evidence in Japanese populations is lacking.
Methods and Results:We retrospectively analyzed the Japanese multicenter CRT database. We divided patients according to their intrinsic QRSd and morphology, and assessed echocardiographic responses and clinical outcomes. The primary endpoint was a composite of all-cause death or hospitalization because of heart failure. A total of 510 patients were enrolled: 200 (39%) had left bundle branch block (LBBB) with QRSd ≥150 ms; 80 (16%) had LBBB with QRSd 120–149 ms; 61 (12%) had non-LBBB (NLBBB) with QRSd ≥150 ms; 54 (11%) had NLBBB with QRSd 120–149 ms; and 115 (23%) had a narrow QRS (<120 ms). The proportion of echocardiographic responders was highest in the LBBB (QRSd ≥150 ms) group [74% vs. 51% vs. 38% vs. 52% vs. 50% for LBBB (QRSd ≥150 ms) vs. LBBB (QRSd 120–149 ms) vs. NLBBB (QRSd ≥150 ms) vs. NLBBB (QRSd 120–149 ms) vs. narrow, respectively; P<0.001]. During follow-up (3.2±1.5 years), the incidence of the primary endpoint was lowest in the LBBB (QRSd ≥150 ms) group (28.6% vs. 42.3% vs. 45.9% vs. 55.6% vs. 55.3%, respectively; P<0.001). This difference remained significant after adjusting for other baseline characteristics.
Conclusions:In this Japanese patient population, LBBB intrinsic QRS morphology and prolonged QRSd (≥150 ms) exhibited the best response to CRT.
Background:Acute decompensated heart failure (ADHF) is often accompanied by liver congestion through increased right atrial pressure (RAP). Liver stiffness (LS), assessed non-invasively using transient elastography, is related to increased RAP and liver congestion in the general HF population. We investigated the relationships of LS with clinical and echocardiographic variables and outcomes in patients with ADHF.
Methods and Results:The subjects were 105 patients with ADHF admitted to hospital between October 2016 and June 2017. Patients were divided into 2 groups based on median LS at admission (low LS <8.8 kPa [n=52] vs. high LS ≥8.8 kPa [n=53]). Death from cardiovascular disease and readmission for HF were the primary endpoints. Total bilirubin and γ-glutamyl transpeptidase levels, MELD-XI score, diameters of the inferior vena cava and right ventricle, and severity of tricuspid regurgitation were greater in the high LS group (all P<0.05). During a median (interquartile range) follow-up period of 153 (83–231) days, cardiac events occurred in 29 patients (54%) in the high LS group and in 13 (25%) in the low LS group (P=0.001). After adjusting for variables that influence organ congestion, high LS (≥8.8 kPa) remained significantly associated with cardiac events (all P<0.05).
Conclusions:Increased LS measured by transient elastography reflects RAP elevation, hepatic congestion, and hepatic dysfunction. LS upon admission may be a useful prognostic marker in patients with ADHF.
Background:The qualitative and quantitative vascular responses to second- and third-generation drug-eluting stents (2G- and 3G-DES, respectively) were assessed prospectively on coronary angioscopy (CAS).
Methods and Results:The Multicenter study on Intra-Coronary AngioScopy After Stent (MICASA) is a multicenter CAS registry. A total of 107 DES (71 2G- and 36 3G-DES) were prospectively observed on CAS 8.7±2.7 months after percutaneous coronary intervention. Neointimal coverage (NC) grade was evaluated using a 4-point grading scale, from 0 (no coverage) to 3 (complete coverage). Plaque yellow color (YC) was also assessed using a 4-point grading system, from 0 (white) to 3 (bright yellow). Max-NC (2G-DES vs. 3G-DES: 2.14±0.68 vs. 2.44±0.73, P=0.023), min-NC (1.07±0.48 vs. 1.39±0.60, P=0.002), and dominant-NC (1.57±0.69 vs. 2.08±0.84, P=0.002) were significantly higher, and the YC grade (1.23±0.82 vs. 0.86±0.76, P=0.031) significantly lower, in the 3G-DES group than in the 2G-DES group. There was no significant difference in the presence of thrombus (28.2% vs. 22.2%, P=0.51) between the 2G- and 3G-DES groups.
Conclusions:The higher NC grade and lower YC grade in 3G-DES than in 2G-DES might be associated with better long-term clinical outcome, which remains to be determined in future studies.
Background:Cardiac size measurements require indexing to body size. Allometric indexing has been investigated in Caucasian populations but a range of different values for the so-called allometric power exponent (b) have been proposed, with uncertainty as to whether allometry offers clinical utility above body surface area (BSA)-based indexing. We derived optimal values for b in normal echocardiograms and validated them externally in cardiac patients.
Methods and Results:Values for b were derived in healthy adult Chinese males (n=1,541), with an optimal b for left ventricular mass (LVM) of 1.66 (95% confidence interval 1.41–1.92). LV hypertrophy (LVH), defined as indexed LVM >75 g/m1.66, was associated with adverse outcomes in an external validation cohort (n=738) of patients with acute coronary syndrome (odds ratio for reinfarction: 2.4 [1.1–5.4]). In contrast, LVH defined by BSA-based indexing or allometry using exponent 2.7 exhibited no significant association with outcomes (P=NS for both). Cardiac longitudinal function also varied with body size: septal and RV free wall s’, TAPSE, and lateral e’ all scaled allometrically (b=0.3–0.9).
Conclusions:An optimal b of 1.66 for LVM in healthy Chinese subjects was found to validate well, with clinical utility superior both to BSA-based indexing and to b=2.7. The effect of allometric indexing of cardiac function requires further study.
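As a rough numerical illustration of the allometric indexing described above (not code from the study), indexed LVM divides LVM by a body-size measure raised to the exponent b, and LVH is flagged when the indexed value exceeds 75 g/m^1.66. It is assumed here, by analogy with the conventional exponent of 2.7, that height in metres is the indexing variable; the patient values are hypothetical.

```python
def indexed_lvm(lvm_g: float, height_m: float, b: float = 1.66) -> float:
    """Allometrically indexed LVM: LVM divided by height raised to b."""
    return lvm_g / height_m ** b

def has_lvh(lvm_g: float, height_m: float,
            cutoff: float = 75.0, b: float = 1.66) -> bool:
    """Flag LV hypertrophy using the abstract's cutoff of 75 g/m^1.66."""
    return indexed_lvm(lvm_g, height_m, b) > cutoff

# Hypothetical patient: LVM 160 g, height 1.70 m
print(round(indexed_lvm(160, 1.70), 1))  # approximately 66.3 g/m^1.66
print(has_lvh(160, 1.70))                # below the 75 g/m^1.66 cutoff
```

Because the exponent enters as a power of height, small changes in b shift the indexed value, and hence the LVH classification, substantially for very short or very tall patients, which is why the choice between b=1.66 and b=2.7 matters clinically.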
Background:Limitations of coronary computed tomography angiography (CTA) include false-positive stenosis at calcified lesions and limited assessability of in-stent patency. A prototype ultra-high resolution computed tomography system (U-HRCT: 1,792 channels and 0.25-mm slice thickness×128 rows) with improved spatial resolution was developed. We assessed the diagnostic accuracy of U-HRCT for coronary artery stenosis.
Methods and Results:Seventy-nine consecutive patients who underwent CTA using U-HRCT were prospectively included. Coronary artery stenosis was graded from 0 (no plaque) to 5 (occlusion). Stenosis grading at 102 calcified lesions was compared between U-HRCT and conventional-resolution CT (CRCT: 896 channels and 0.5-mm slice thickness×320 rows). Median stenosis grade at calcified plaque was significantly improved on U-HRCT compared with CRCT (1 [IQR, 1–2] vs. 2 [IQR, 1–3]; P<0.0001). Assessability of the in-stent lumen was evaluated on U-HRCT in 79 stents, and stent strut thickness and luminal diameter were quantitatively compared between U-HRCT and CRCT. Of the 79 stents, 83.5% were assessable on U-HRCT; 80% of stents with diameter 2.5 mm were regarded as assessable. On U-HRCT, stent struts were significantly thinner (median, 0.78 mm [IQR, 0.70–0.83 mm] vs. 0.83 mm [IQR, 0.75–0.92 mm]; P=0.0036), and in-stent lumens significantly larger (median, 2.08 mm [IQR, 1.55–2.51 mm] vs. 1.74 mm [IQR, 1.31–2.06 mm]; P<0.0001), than on CRCT.
Conclusions:U-HRCT with improved spatial resolution visualized calcified lesions with fewer artifacts. The in-stent lumen of stents with diameter ≥2.5 mm was assessable on U-HRCT.
Background:The Hyogo Prefectural Government has been enforcing a smoking ban ordinance since April 2013. The present survey was conducted to determine the extent to which the smoking ban has been successfully implemented in eating establishments in Kobe City and Amagasaki City.
Methods and Results:The Health and Welfare Department of the Hyogo Prefectural Government provided a list of eating establishments in Kobe and Amagasaki City. From these, we chose 1,300 from each city using random number generation. Responses were obtained from 310 establishments in Kobe City (response rate: 23.8%) and 297 in Amagasaki City (22.8%). Overall, 58.1% of the establishments surveyed in Kobe City were aware of the ordinance, a recognition rate significantly higher than that of Amagasaki City, where only 45.5% of eateries were aware of the ordinance (P=0.003). Of the Kobe City eateries, 31.7% had succeeded in implementing a complete ban on smoking. In Amagasaki City, the rate was significantly lower, at just 13.4% (P<0.001). A logistic regression analysis showed that coffee shops, Japanese-style taverns, bars, and eating establishments that served alcohol were the independent significant predictors of low compliance. Kobe City restaurants, women, and families were the independent significant predictors of high compliance with the complete smoking ban.
Conclusions:The rates of recognition and implementation of the complete smoking ban were significantly lower in Amagasaki City than in Kobe City. A strong, continuous public awareness campaign is needed to promote the ordinance.
Background:Few studies have documented changes in myocardial blood flow (MBF) after percutaneous coronary intervention (PCI). Phase-contrast cine cardiovascular MRI (PC-CCMR) of the coronary sinus (CS) is a promising approach to quantify MBF. The aim of this study was to quantify CS flow (CSF) on PC-CCMR as a measure of volumetric MBF before and after elective PCI.
Methods and Results:We prospectively studied 34 patients with stable angina undergoing elective PCI for a single de novo lesion. Breath-hold PC-CCMR of the CS was acquired to assess CSF and coronary flow reserve (CFR) at rest and during maximum hyperemia, both before and after PCI (median, 3 days before and 10 days after PCI, respectively). Overall, hyperemic CSF increased significantly after PCI (median, 2.3 mL/min/g [IQR, 1.5–3.2 mL/min/g] before PCI vs. 3.0 [1.8–3.7] mL/min/g after PCI), although 13 patients (38.2%) had a decrease despite successful PCI and improvement in fractional flow reserve (FFR). Global CFR also increased significantly, from a median of 2.5 (IQR, 1.5–3.5) to 3.4 (IQR, 2.1–4.2), whereas 12 patients had decreased CFR after PCI. Pre-PCI hyperemic CSF was the only independent predictor of the change in CSF following PCI.
Conclusions:Serial PC-CCMR of CS as a measure of change in absolute MBF is feasible. Uncomplicated PCI does not necessarily increase hyperemic global MBF, despite regional FFR improvement.
Background:There is little information regarding comparison of ticagrelor and prasugrel in patients with ST-segment elevation myocardial infarction (STEMI). We sought to compare clinical outcomes between ticagrelor and prasugrel in STEMI.
Methods and Results:A total of 1,440 patients with STEMI who underwent successful primary percutaneous coronary intervention were analyzed; the data were obtained from the Korea Acute Myocardial Infarction Registry-National Institutes of Health. Of the patients, 963 received ticagrelor, and 477 received prasugrel. The primary study endpoint was 12-month major adverse cardiac events (MACE), including cardiac death, myocardial infarction (MI), and target vessel revascularization (TVR). MACE occurred in 91 patients (6.3%) over the 1-year follow-up, and there were no differences in the incidence of MACE (hazard ratio [HR] 1.20, 95% confidence interval [CI] 0.76–1.91, P=0.438) between the 2 groups. Analysis by propensity score matching (429 pairs) did not significantly affect the results. The incidence of in-hospital major bleeding events was still comparable between the 2 groups (2.4% vs. 2.5%, odds ratio 0.75, 95% CI 0.30–1.86, P=0.532), and there was no significant difference in the incidence of MACE (5.4% vs. 5.8%, HR 0.98, 95% CI 0.56–1.74, P=0.951) after matching.
Conclusions:Ticagrelor and prasugrel showed similar efficacy and safety profiles for treating STEMI in this Korean multicenter registry.
Background:Data on bleeding events in Japanese patients with acute coronary syndrome (ACS) are insufficient. In addition, the efficacy and safety of a maintenance dose of prasugrel 2.5 mg/day in high bleeding risk patients are unknown.
Methods and Results:We prospectively enrolled 1,167 consecutive patients with suspected ACS undergoing percutaneous coronary intervention. A maintenance dose of prasugrel 2.5 mg/day was prescribed for patients with low body weight (≤50 kg), advanced age (≥75 years), or renal insufficiency (eGFR ≤30 mL/min/1.73 m2). In-hospital events were assessed in 992 ACS patients treated with drug-eluting stents. Excluding 29 in-hospital deaths, out-of-hospital events were assessed in 963 ACS patients. The primary safety outcome measure was major bleeding (Bleeding Academic Research Consortium types 3 and 5). The incidence of in-hospital major bleeding was 3.4%. Multivariate analysis showed that advanced age, low body weight, renal insufficiency, stroke history, femoral approach, and use of mechanical support were independent predictors of in-hospital major bleeding. The cumulative 1-year incidence of out-of-hospital major bleeding was not significantly different between the prasugrel 2.5 mg/day (n=284) and 3.75 mg/day (n=487) groups (1.6% vs. 0.7%, log-rank P=0.24). The incidence of out-of-hospital definite or probable stent thrombosis was 0% in both groups.
Conclusions:An adjusted maintenance dose of prasugrel 2.5 mg/day seems to be a viable option for ACS patients at high bleeding risk.
Background:Accurate risk stratification of non-ST segment elevation myocardial infarction (NSTEMI) patients is important due to great variability in mortality risk, but, to date, no prediction model has been available. The aim of this study was therefore to establish a risk score to predict in-hospital mortality risk in NSTEMI patients.
Methods and Results:We enrolled 5,775 patients diagnosed with NSTEMI from the China Acute Myocardial Infarction (CAMI) registry and extracted relevant data. Patients were divided into a derivation cohort (n=4,332) to develop a multivariable logistic regression risk prediction model, and a validation cohort (n=1,443) to test the model. Eleven variables independently predicted in-hospital mortality and were included in the model: age, body mass index, systolic blood pressure, Killip classification, cardiac arrest, electrocardiographic ST-segment depression, serum creatinine, white blood cells, smoking status, previous angina, and previous percutaneous coronary intervention. In the derivation cohort, the areas under the curve (AUC) for the CAMI-NSTEMI risk model and risk score were 0.81 and 0.79, respectively. In the validation cohort, the score also showed good discrimination (AUC, 0.86). The diagnostic performance of the CAMI-NSTEMI risk score was superior to that of the GRACE risk score (AUC, 0.81 vs. 0.72; P<0.01).
Conclusions:The CAMI-NSTEMI score is able to accurately predict the risk of in-hospital mortality in NSTEMI patients.
Background:Xanthine oxidoreductase (XOR) is an enzyme that catalyzes the formation of uric acid from hypoxanthine and xanthine, leading to an increase in superoxide and reactive oxygen species. Activation of XOR promotes oxidative stress-related tissue injury. We investigated the associations between metabolic parameters and plasma XOR activity, measured by a sensitive and accurate assay combining liquid chromatography with triple quadrupole mass spectrometry to detect [13C2,15N2]-uric acid produced from [13C2,15N2]-xanthine as the substrate.
Methods and Results:A total of 627 Japanese subjects (M/F, 292/335) from the Tanno-Sobetsu Study, a population-based cohort, were recruited. Plasma XOR activity was significantly higher in males than in females, and habitual smoking was associated with elevated activity. Plasma XOR activity was positively correlated with body mass index (BMI; r=0.323, P<0.001), waist circumference, blood pressure, levels of liver enzymes including alanine transaminase (r=0.694, P<0.001), uric acid (r=0.249, P<0.001), triglycerides (r=0.312, P<0.001), hemoglobin A1c, fasting glucose, insulin, and HOMA-R (r=0.238, P<0.001), a marker of insulin resistance, and was negatively correlated with high-density lipoprotein cholesterol level. On stepwise and multivariate regression analyses, BMI, smoking, and levels of alanine transaminase, uric acid, triglycerides, and HOMA-R were independent predictors of plasma XOR activity after adjustment for age and gender.
Conclusions:Plasma XOR activity is a novel biomarker of metabolic disorders in a general population.
Background:There are limited data comparing the outcomes of subintimal vs. intraluminal approach in the treatment of long femoropopliteal artery occlusions. The objective of this study was to investigate the efficacy and safety of the subintimal approach for long femoropopliteal artery occlusions.
Methods and Results:From a multicenter retrospective registry cohort, we included a total of 461 patients with 487 femoropopliteal artery occlusions classified as Trans-Atlantic Inter-Society Consensus for the Management of Peripheral Arterial Disease (TASC) II class C/D in this analysis, and compared the immediate and mid-term outcomes of the subintimal vs. intraluminal approaches. There were 228 patients with 243 limbs in the subintimal group, and 233 patients with 244 limbs in the intraluminal group. Baseline clinical and lesion characteristics were comparable between the 2 groups. The technical success rate was significantly higher in the subintimal group than in the intraluminal group (95.1% vs. 89.8%, P=0.041). Clinical primary patency (67.5% vs. 73.4% at 12 months, 54.0% vs. 61.3% at 24 months; P=0.086) and target lesion revascularization (TLR)-free survival (89.5% vs. 86.3% at 12 months, 77.6% vs. 76.0% at 24 months; P=0.710) did not differ significantly between the subintimal and intraluminal groups.
Conclusions:In long femoropopliteal occlusions, the subintimal approach achieved a higher technical success rate, and similar mid-term primary patency and TLR-free survival, compared with the intraluminal approach.
Background:Recent randomized trials have shown the treatment benefits of use of a drug-coated balloon (DCB) over conventional percutaneous transluminal angioplasty (PTA) in patients with femoropopliteal disease. However, the effectiveness and safety of DCB for dialysis patients remain unclear.
Methods and Results:Consecutive dialysis patients who underwent PTA or DCB for femoropopliteal disease were assessed retrospectively via 2:1 propensity score matching. Effectiveness and safety endpoints, including binary restenosis, clinically driven target lesion revascularization (CD-TLR), amputation, major adverse cardiac events (MACE), and death, were compared between groups. A total of 278 dialysis patients with 339 limbs were eligible for matching: 84 limbs from 77 patients treated with PTA and 46 limbs from 37 patients treated with DCB were compared after matching. Baseline patient and lesion characteristics did not differ between groups. Patients treated with DCB had significantly higher rates of freedom from binary restenosis (52.4% vs. 18.6%, P<0.001) and from CD-TLR (56.4% vs. 25.9%, P=0.001) at 2 years compared with patients treated with PTA. Both groups had similar outcomes for amputation, MACE, and death. Cox proportional hazards analysis showed that treatment with DCB was independently associated with reductions in binary restenosis (hazard ratio [HR] 0.368, P=0.001) and CD-TLR (HR 0.390, P=0.004).
Conclusions:This study suggested superior 2-year outcomes using DCB compared with PTA and similar safety profiles in dialysis patients with femoropopliteal disease.
Background: The present study was performed to clarify whether preoperative clinical symptoms indicating endovascular therapy (EVT) can predict death and cardiovascular prognosis after EVT in Japanese patients with peripheral artery disease (PAD), including acute disease.
Methods and Results: The TOkyo taMA peripheral vascular intervention research COmraDE (Toma-Code) Registry is a Japanese prospective cohort of 2,321 consecutive patients with PAD treated with EVT in 34 hospitals in the Kanto and Kōshin’etsu regions from August 2014 to August 2016. In total, 2,173 symptomatic patients were followed up for a median of 10.4 months, including 1,370 with claudication, 719 with critical limb ischemia (CLI), and 84 with acute limb ischemia (ALI) as the indication for EVT. The all-cause death rates per 100 person-years for claudication, CLI, and ALI were 3.5, 26.2, and 24.5, respectively. Similarly, the major adverse cardiac and cerebrovascular event (MACCE) rates per 100 person-years for claudication, CLI, and ALI were 5.2, 31.2, and 29.7, respectively. After adjusting for the predictors of all-cause death and MACCE, namely, age, body mass index <18 kg/m², diabetes mellitus, dialysis, cerebrovascular disease, and low left ventricular ejection fraction, the preoperative indication for EVT remained strongly associated with all-cause death and MACCE.
Conclusions: Preoperative clinical symptoms can predict the prognosis of patients with PAD undergoing EVT.
Background: Peripheral artery disease (PAD) is a risk factor for the development of cardiovascular disease and death. Surfactant protein-D (SP-D) is a 43-kDa protein secreted from type II pneumocytes in the lungs. Recent studies have demonstrated that circulating SP-D plays a key role in the development of atherosclerosis and is related to clinical outcomes in patients with ischemic heart disease. However, it remains unclear whether circulating SP-D is associated with clinical outcomes in patients with PAD.
Methods and Results: We enrolled 364 patients with PAD who underwent endovascular therapy. We measured serum levels of SP-D and Krebs von den Lungen-6 (KL-6). During a median follow-up period of 974 days, there were 69 major adverse cardiovascular and leg events (MACLE), including 48 major adverse cardiovascular events (MACE). Kaplan-Meier analysis demonstrated that patients with high SP-D (≥110 ng/mL) had higher rates of MACE and MACLE than those with low SP-D. Multivariate Cox proportional hazards regression analysis demonstrated that SP-D, but not KL-6, was an independent predictor of MACE and MACLE. The addition of SP-D to known risk factors significantly improved the C index and net reclassification index. The circulating SP-D level was affected by sex, diabetes mellitus, and cilostazol prescription.
Conclusions: Circulating SP-D was associated with clinical outcomes in patients with PAD, suggesting that it may be a new therapeutic target in these patients.
Background: We determined the 2-year long-term risk-benefit profile in patients with stroke or transient ischemic attack (TIA) receiving warfarin or direct oral anticoagulants (DOACs) for nonvalvular atrial fibrillation (NVAF) using a prospective, multicenter, observational registry in Japan.
Methods and Results: NVAF patients within 7 days after onset of ischemic stroke/TIA were enrolled in 18 stroke centers. Outcome measures included ischemic and bleeding events and death in the 2-year follow-up period. We enrolled 1,116 patients taking either warfarin (650 patients) or DOACs (466 patients) at acute hospital discharge. DOAC users were younger and had lower National Institutes of Health Stroke Scale, CHADS2, and discharge modified Rankin Scale scores than warfarin users (P<0.0001 each). Incidences of stroke/systemic embolism (adjusted hazard ratio, 1.07; 95% CI, 0.66–1.72), all ischemic events (1.13; 0.72–1.75), and ischemic stroke/TIA (1.58; 0.95–2.62) were similar between groups. Risks of intracranial hemorrhage (0.32; 0.09–0.97) and death (0.41; 0.26–0.63) were significantly lower for DOAC users. Infection was the leading cause of death, accounting for 40% of deaths among warfarin users.
Conclusions: Stroke/TIA patients receiving DOACs for secondary prevention were younger and had lower stroke severity and risk indices than those receiving warfarin. Estimated cumulative incidences of stroke and systemic embolism within 2 years were similar between warfarin and DOAC users, but those of death and intracranial hemorrhage were significantly lower among DOAC users.
Background: The clinical robustness of contrast-videodensitometric (VD) assessment of aortic regurgitation (AR) after transcatheter aortic valve implantation (TAVI) has been demonstrated. Correct acquisition of aortic root angiography for VD assessment, however, is hampered by the opacified descending aorta and by individual anatomic peculiarities. The aim of this study was to use preprocedural multi-slice computed tomography (MSCT) to optimize the angiographic projection in order to improve the feasibility of VD assessment.
Methods and Results: In 92 consecutive patients, post-TAVI AR (i.e., left ventricular outflow tract [LVOT] AR) was assessed on aortic root angiograms using VD software. The patients were divided into 2 groups: the first group of 54 patients was investigated prior to the introduction of the standardized acquisition protocol; the second group of 38 consecutive patients was investigated after implementation of the standardized acquisition protocol, involving MSCT planning of the optimal angiographic projection. Optimal projection planning dramatically improved the feasibility of VD assessment, from 57.4% before the standardized acquisition protocol to 100% after its implementation. In the 69 analyzable aortograms (69/92; 75%), LVOT-AR ranged from 3% to 28% with a median of 12%. Inter-observer agreement was high (mean difference±SD, 1±2%), and the 2 observers’ measurements were highly correlated (r=0.94, P<0.0001).
Conclusions: Introduction of computed tomography-guided angiographic image acquisition significantly improved the analyzability of the angiographic VD assessment of post-TAVI AR.
Background: The introduction of transcatheter aortic valve implantation (TAVI) into Japan was strictly controlled to optimize patient outcomes. The goal of this study was to assess whether increasing experience during the introduction of this procedure was associated with outcomes.
Methods and Results: The initial 1,752 patients registered in the Japanese national TAVI registry were included in the study. The association between operator procedure number and incidence of the early safety endpoint at 30 days (ESE30), as defined in the Valve Academic Research Consortium-2 consensus document, was evaluated. Patients were divided into 4 groups by quartiles of procedure count (Groups I–IV in order of increasing number of procedures). Median patient age was 85 years, and 30.5% of patients were male. The 30-day mortality rate was 1.4% (n=24), and 78 patients (7.9%) experienced a total of 95 ESE30 events. Among the variables included in the model, ESE30 was associated with a non-transfemoral approach (P=0.004), renal dysfunction (Cr >2.0 mg/dL) (P=0.01), and NYHA class III/IV (P=0.04). ESE30 incidence was not significantly different between Groups I–III and Group IV. Spline plots demonstrated that experience of 15–20 cases in total was needed to achieve a consistently low risk of ESE30.
Conclusions: Increasing experience was associated with better outcomes, but to a lesser degree than in previous reports. Our findings suggest that the risks associated with the learning curve were appropriately mitigated.
Background: This study aimed to investigate the effect and safety of sodium glucose cotransporter 2 inhibitors (SGLT2-Is) in patients with drug-refractory heart failure (HF).
Methods and Results: In 12 diabetic patients with advanced HF, SGLT2-Is were added to the treatment regimen. At 6 months after administration, improvements in New York Heart Association class and reductions in B-type natriuretic peptide levels were observed, particularly in patients with high right atrial pressure. During follow-up, the patients experienced neither cardiac events nor adverse side effects.
Conclusions: SGLT2-Is may be useful and safe in diabetic patients with drug-refractory HF, particularly in those with accompanying right-sided HF.