Prevention of sudden cardiac death (SCD) has become an important issue in today’s cardiovascular field, together with various developments in the secondary prevention of underlying cardiac diseases. The importance of the implantable cardioverter defibrillator (ICD) is now widely accepted because it has produced significant improvement in patients’ prognoses in ischemic and non-ischemic cardiovascular diseases. However, there is a non-negligible gap between the ICD indications in the guidelines and real-world patients at high risk of SCD, especially in the acute recovery phase of cardiac injury. Although various studies have demonstrated a clinical benefit of defibrillation devices, studies of immediate ICD use in the acute recovery phase have failed to show a benefit from the point of view of a decrease in total deaths. To bridge this gap, the wearable cardioverter defibrillator (WCD) provides a safer observation period in the acute phase and eliminates inappropriate overuse of the ICD in the subacute phase. Here, we discuss the usefulness of the WCD and the current understanding of its indications based on various clinical data. In conclusion, the WCD is a feasible bridge to therapy and/or safe observation for patients at high risk of SCD, especially in the acute recovery phase of cardiac diseases.
A prospective randomized clinical trial showed that the BioFreedom stent (Biosensors International), a polymer-free and carrier-free drug-coated stent, was significantly superior to a bare-metal stent (BMS) in patients at high bleeding risk who were receiving a 1-month course of dual antiplatelet therapy (DAPT). However, the stent thrombosis rate (2.01% for BioFreedom vs. 2.20% for BMS) was 4–6-fold higher than that of approved drug-eluting stents based on real-world data in Japan. Furthermore, the frequency of stent thrombosis at more than 1 month with the BioFreedom stent was slightly higher than that at less than 1 month. This result suggested that it would not be acceptable to stop DAPT universally at 1 month. Thus, the target patients for the BioFreedom stent in Japan are unspecified patients at high bleeding risk who need to continue DAPT for as long as necessary. Therefore, based on the regulatory principle of balancing pre- and post-marketing requirements for medical devices, regulatory approval was given for unspecified patients conditionally upon real-world data collection of 2,000 patients with a Use-Results Survey, instead of conducting additional pre-marketing clinical trial(s). The Use-Results Survey System is part of a strategy to expedite patients’ access to innovative medical devices and to accelerate the development of medical devices.
The 67th Annual Scientific Session and Expo of the American College of Cardiology (ACC) was held at the Orange County Convention Center, Orlando, from March 10 to 12, 2018. The meeting offered 2,700 accepted abstracts presented in oral and poster sessions by 2,100 experts, as well as 37 Late-Breaking Clinical Trials and Featured Clinical Research presentations. This report introduces the key presentations and highlights from the ACC 2018 Scientific Session.
Background:Patients with reduced-function CYP2C19 genotypes on dual antiplatelet therapy (DAPT) with aspirin and clopidogrel show higher clinical risk for acute myocardial infarction (AMI). We investigated the effect of CYP2C19 genotype-tailored adjunctive cilostazol therapy on treatment of AMI.
Methods and Results:The study group of 138 patients with suspected AMI was screened for CYP2C19 genotype immediately after percutaneous coronary intervention (PCI) using a SPARTAN RX point-of-care device. Carriers of the CYP2C19 reduced-function allele were randomized into DAPT (Carrier/DAPT) and DAPT plus 14-day cilostazol (Carrier/DAPT+Cilostazol) groups, while noncarriers were treated with DAPT (Noncarrier/DAPT). After exclusion of 10 patients, the remaining 128 patients were analyzed for P2Y12 reaction units (PRU) using the VerifyNow® P2Y12 system, and for levels of biomarkers immediately after, and 1, 14, and 28 days after PCI. DAPT+Cilostazol reduced PRU in carriers (n=46) to the levels found in the Noncarrier/DAPT group (n=40), significantly lower than those of the Carrier/DAPT group (n=42) at 14 days post-PCI. Discontinuation of cilostazol after the 14-day course was associated with a significant rise in PRU to the levels of the Carrier/DAPT group at 28 days post-PCI. Plasma B-type natriuretic peptide levels at 14 days post-PCI were lower in Carrier/DAPT+Cilostazol than in the other 2 groups, and the levels increased to those of the other groups at 28 days post-PCI after withdrawal of cilostazol.
Conclusions:Adjunctive cilostazol therapy tailored to CYP2C19 genotype seemed useful in AMI patients with the CYP2C19 reduced-function allele.
Background:Tissue engineering has advanced the technique of decellularization of heart valves. After implantation, the decellularized valve is reseeded with the patient’s own cells while immunologic reactions are suppressed. The same advantage has been reported for fresh decellularized heart valves, with more than 10 years of excellent outcomes. We began performing such heart valve implantations in 2013 as part of a clinical study at Osaka University. We report our evaluation of the safety and efficacy of these implantations.
Methods and Results:Human pulmonary valves from the German Society for Tissue Transplantation (n=2) or from the hearts of Japanese heart transplant recipients (n=4) were used to make decellularized heart valves; the decellularization process was the same as that used in Europe. Valves were implanted in 5 adults with pulmonary valve insufficiency after tetralogy of Fallot repair and in 1 infant with a double-outlet right ventricle with pulmonary stenosis. Postoperative echocardiography and cardiac magnetic resonance imaging revealed that valve and ventricular function were significantly improved and maintained postoperatively.
Conclusions:Decellularized heart valves could serve as a new material for artificial heart valves. Pulmonary allografts derived from the hearts of heart transplant recipients are a useful source for decellularized heart valves. Applying these valves in Japanese clinical practice, using the hearts of heart transplant recipients as a source, is considered highly significant.
Background:Although the specific characteristics of heart failure with preserved ejection fraction (HFpEF) have been demonstrated predominantly in registries from Western countries, important international differences exist in patient characteristics, management and medical infrastructure between Western and Asian countries.
Methods and Results:We performed nationwide registration of consecutive Japanese hospitalized HFpEF patients with left ventricular EF ≥50% from 15 sites between November 2012 and March 2015. Follow-up data were obtained up to 2 years post-discharge. A total of 535 patients were registered. The median age was 80 years and 50% were female. The most common comorbid conditions were hypertension (77%) and atrial fibrillation (AF: 62%), but body mass index was relatively low. In-hospital mortality rate was 1.3% and the median length of hospitalization was 16 days. By 2 years post-discharge, 40.8% of patients had all-cause death or HF hospitalization. Approximately one-half of deaths had a cardiac cause. Lower serum albumin on admission was one of the strongest independent determinants of worse clinical outcome.
Conclusions:Japanese HFpEF patients were less obese, but had a substantially higher prevalence of AF and lower incidence of subsequent events compared with previous reports. Our findings indicated that specific preventative and therapeutic strategies focusing on AF and nutritional status might need to be considered for Japanese hospitalized patients with HFpEF.
Background:The entirely subcutaneous implantable cardioverter defibrillator (S-ICD) was introduced as a new alternative to conventional transvenous ICD (TV-ICD) in Japan in February 2016, but its safety and efficacy are unclear.
Methods and Results:A total of 60 patients (48 men, median age, 60 years; IQR, 44–67 years; primary prevention, n=24) underwent S-ICD implantation between February 2016 and August 2017. The device pocket was formed in the intermuscular space between the serratus anterior muscle and the latissimus dorsi muscle, and the parasternal S-ICD lead was placed according to pre-implant screening. Defibrillation test was performed in 56 patients (93%). Ventricular fibrillation (VF) was induced in 55 patients and terminated by a single 65-J shock in all patients. The median time to shock therapy was 13.4 s (IQR, 12.1–14.9 s) and the median post-shock impedance of the S-ICD lead was 64 Ω (IQR, 58–77 Ω). There were no operation-related complications or subsequent infectious complications. During follow-up (median, 275 days; IQR, 107–421 days), 1 patient (1.7%) had appropriate shock for VF with successful termination, whereas 5 patients (8.3%) had inappropriate shock due to oversensing of myopotential (n=3) or T-wave (n=1), and detection of supraventricular tachycardia (n=1).
Conclusions:S-ICD is a safe and effective alternative to conventional TV-ICD. The long-term safety and efficacy of the S-ICD need further investigation.
Background:Periprocedural anticoagulation is important in catheter ablation (CA) of atrial fibrillation (AF), and there is increasing evidence that uninterrupted vitamin K antagonist (VKA) therapy is superior to interrupted anticoagulation strategies. Since the emergence of direct oral anticoagulants (DOACs), numerous studies have shown promising results for their use in uninterrupted strategies. However, further studies are needed to define the efficacy and safety of performing AF ablation with uninterrupted factor Xa inhibitors or direct thrombin inhibitors.
Methods and Results:We have performed CA of AF without discontinuation of either VKA or DOAC therapy since April 2014. A total of 376 patients with AF underwent CA including pulmonary vein isolation. The patients were divided into 2 groups (uninterrupted VKA or uninterrupted DOACs). Anticoagulation with DOACs was associated with fewer complications than uninterrupted VKA therapy (P=0.04). However, there were significant differences between the groups in the rate of congestive heart failure and in left ventricular ejection fraction, body weight, estimated glomerular filtration rate, and CHADS2, CHA2DS2-VASc and HAS-BLED scores. Therefore, we also analyzed the results using propensity score matching. We found no significant difference in periprocedural complications between uninterrupted VKA and DOAC therapy (P=0.65).
Conclusions:CA of AF without discontinuation of DOACs is not inferior to CA without discontinuation of a VKA, with regard to ischemic or hemorrhagic complications.
Background:The incidence of pulmonary vein stenosis (PVS) after AF ablation following contemporary procedures remains unclear. We compared the incidence of PVS/narrowing (PVS/N) after PV isolation (PVI) for (1) 3-D mapping-guided wide-area encircling irrigated radiofrequency current (RFC) ablation; (2) first–third-generation big cryoballoon (CB1–3) ablation; and (3) laser balloon (LB) ablation.
Methods and Results:All patients undergoing a second procedure between January 2012 and November 2016 were subgrouped according to index ablation (PVI): RFC; CB; or LB. PVS/N was classified using PV diameter ratio (second/index procedure) on selective PV angiogram performed before ablation: mild, 25–49%; moderate, 50–74%; or severe, ≥75%. A total of 344 patients (1,362 PV) were analyzed (RFC, n=211; 840 PV; CB1, n=21; 82 PV; CB2,3, n=64; 250 PV; LB, n=48; 190 PV). In the LB group, 45 patients (94%) were treated with dose ≥8.5 W. Second procedures were performed on average 14.9±14.1 months after the index procedure. Mild PVS/N was observed in 18.4%, 9.5% and 3.6% of PV in the LB, RFC and CB groups, respectively (P<0.01). Moderate PVS was recognized in 2 PV (0.1%; RFC, LB). Severe PVS was never observed, and no PV intervention/surgery was required.
Conclusions:The risk for significant PVS is low after RFC/CB. The incidence of mild PVS/N was highest after standard-dose LB ablation and lowest after high-dose CB ablation.
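The PVS/N grading thresholds quoted in the Methods can be sketched as a small helper. This is a minimal illustration assuming the grade is assigned from the percent reduction in PV diameter between the index and second procedures; the function and variable names are hypothetical, not from the study:

```python
def classify_pvsn(index_diameter_mm, second_diameter_mm):
    """Grade pulmonary vein stenosis/narrowing from the percent reduction
    in PV diameter between the index and second procedures (hypothetical
    helper illustrating the thresholds quoted in the abstract:
    mild 25-49%, moderate 50-74%, severe >=75%)."""
    reduction_pct = (1 - second_diameter_mm / index_diameter_mm) * 100
    if reduction_pct >= 75:
        return "severe"
    if reduction_pct >= 50:
        return "moderate"
    if reduction_pct >= 25:
        return "mild"
    return "none"
```

For example, a PV that narrows from 18 mm at the index procedure to 10 mm at the second procedure (a 44% reduction) would be graded mild under these cutoffs.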
Background:Everolimus-eluting stents (EES) have equivalent short-term angiographic and clinical outcomes to sirolimus-eluting stents (SES), but EES may be superior to SES with regard to long-term clinical safety. We report the 3-year clinical outcomes of EES and SES from the prospective EXCELLENT Randomized Trial (NCT00698607).
Methods and Results:We randomly assigned 1,443 patients undergoing percutaneous coronary intervention in a 3:1 ratio to receive EES or SES. We investigated endpoints including target lesion failure (TLF) and individual clinical outcomes including stent thrombosis (ST) at 3 years. The TLF rate was 4.82% for EES and 4.12% for SES (risk ratio [RR], 1.16; 95% CI: 0.65–2.06; P=0.62). Results were similar for other efficacy endpoints including target lesion revascularization. For safety endpoints, the rate of all-cause death was significantly lower for EES (1.67%) than SES (3.57%; RR, 0.46; 95% CI: 0.23–0.94; P=0.03), while the incidence of cardiac death or myocardial infarction was numerically lower for EES. On 1-year landmark analysis, rates of all-cause death and major adverse cardiovascular events were significantly lower for EES than SES. Definite or probable ST was numerically 3-fold higher for SES (1.37%) compared with EES (0.46%).
Conclusions:EES and SES had similar efficacy with regard to 3-year outcomes in the EXCELLENT trial, while delayed safety events all trended to favor EES.
Background:Development of methods for accurate reconstruction of bioresorbable scaffolds (BRS) and assessing local hemodynamics is crucial for investigation of vascular healing after BRS implantation.
Methods and Results:Patients with a BRS crossing over a coronary bifurcation were included for analysis. Reconstructions of the coronary lumen and BRS were performed by fusion of optical coherence tomography and coronary angiography, generating a tree model (TM) and a hybrid model with BRS (TM-BRS). A virtual BRS model with thinner struts was created, and all 3 models were analyzed using computational fluid dynamics to derive: (1) time-average shear stress (TASS), (2) TASS gradient (TASSG), which represents SS heterogeneity, and (3) fractional flow reserve (FFR). Reconstruction of the BRS was successful in all 10 patients. TASS and TASSG were both higher with TM-BRS than with TM in main vessels (difference 0.27±4.30 Pa and 10.18±27.28 Pa/mm, P<0.001), with a remarkable difference at side branch ostia (difference 13.51±17.40 Pa and 81.65±105.19 Pa/mm, P<0.001). With thinner struts, TASS was lower on the strut surface but higher at the inter-strut zones, whereas TASSG was lower in both regions (P<0.001 for all). Computational FFR was lower with TM-BRS than with TM for both main vessels and side branches (P<0.001).
Conclusions:Neglecting BRS reconstruction leads to significantly lower SS and SS heterogeneity, which is most pronounced at side branch ostia. Thinner struts can marginally reduce SS heterogeneity.
Background:Whether the short-term effect of cardiac rehabilitation (CR) in elderly patients with heart failure (HF) is influenced by nutritional status is uncertain, so the present study investigated the effect of nutritional status on functional recovery after CR in elderly HF inpatients.
Methods and Results:We enrolled 145 patients admitted for treatment of HF who were aged ≥65 years and had a low functional status defined as a Barthel index (BI) score ≤85 points at the commencement of CR. Nutritional status was assessed by the Mini Nutritional Assessment Short Form (MNA-SF) and total energy intake per day. The primary endpoint was functional status determined by the BI score at discharge. The median CR period was 20 days (interquartile range: 14–34 days), and 87 patients (60%) were functionally dependent (BI score ≤85) at discharge. Multivariate logistic regression analysis showed that MNA-SF score (odds ratio [OR]: 0.76, P=0.02) and total energy intake at the commencement of CR (OR: 0.91, P=0.02) were independent predictors of functional dependence after CR. MNA-SF score ≤7 and total energy intake ≤24.5 kcal/kg/day predicted functional dependence at discharge with moderate sensitivity and specificity.
Conclusions:MNA-SF score and total energy intake at the commencement of CR are novel predictors of the extent of functional recovery of elderly HF inpatients after in-hospital CR.
Background:The hospital mortality rate in >80-year-old patients undergoing surgical aortic valve replacement (SAVR) is reportedly satisfactory, but how such patients’ functional status both at discharge and during the postoperative hospitalization period might affect their quality of life and medical costs remains unclear.
Methods and Results:Adverse events in 161 patients aged >80 years who underwent SAVR with or without coronary artery bypass grafting were retrospectively investigated. Adverse events were defined as hospital death, a long hospital stay (>60 days) attributable to major complications or requirement for rehabilitation, or a depressed status at discharge (modified Rankin scale score >4). A total of 18.6% of patients developed adverse events, and the hospital mortality rate was 4.3%. Logistic regression analysis revealed that a perfusion time >3 h (P=0.0331; odds ratio, 2.685) and EuroSCORE II >10% (P<0.0001; odds ratio, 8.232) were significant risk factors for adverse events. The average medical cost was approximately 1.5-fold higher in patients with adverse events (¥8,360,880 vs. ¥5,234,660, P=0.0016).
Conclusions:The rate of adverse events, assessed from status at discharge and during the postoperative hospitalization after SAVR in patients aged >80 years, was relatively high compared with hospital mortality alone, especially in patients with a longer perfusion time and a high EuroSCORE II. Further studies are necessary to define the indications for SAVR in patients aged >80 years in the era of transcatheter AVR.
Background:The effect of postprandial glucose on the risk of cardiovascular disease has been emphasized, but it is controversial whether nonfasting glucose is related to incident stroke and its types.
Methods and Results:We investigated the associations of nonfasting glucose with incident stroke and its types among 7,198 participants aged 40–74 years from the Circulatory Risk in Communities Study, enrolled in 1995–2000. We estimated multivariable hazard ratios (HR) using Cox proportional hazard models. Over a median follow-up of 14.1 years, 291 cases of total stroke (ischemic strokes: 191 including 109 lacunar infarctions) were identified. Nonfasting glucose concentration was associated with greater risk of incident total stroke, ischemic stroke and lacunar infarction when modeled categorically (for prediabetic type: 7.8–11.0 mmol/L vs. normal type: <7.8 mmol/L among all subjects, HR for lacunar infarction was 2.02, 95% confidence interval (CI): 1.19, 3.43) or continuously (per one standard deviation increment among all subjects, HR for lacunar infarction was 1.29, 95% CI: 1.15, 1.45). Diabetic type showed similar results. Population attributable fractions of nonfasting hyperglycemia were 13.2% for ischemic stroke and 17.4% for lacunar infarction.
Conclusions:Nonfasting glucose concentration, whether categorized as prediabetic or diabetic type or modeled as a continuous variable, was an independent predictor of incident total stroke, especially ischemic stroke and lacunar infarction, in the general population.
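The glucose categories used above can be written out as a simple classifier. The normal (<7.8 mmol/L) and prediabetic (7.8–11.0 mmol/L) cutoffs are from the abstract; treating values >11.0 mmol/L as diabetic type is an assumption inferred from the stated ranges, and the function name is illustrative:

```python
def nonfasting_glucose_type(glucose_mmol_l):
    """Categorize nonfasting glucose (mmol/L) using the cutoffs quoted
    in the abstract: normal <7.8, prediabetic 7.8-11.0. Treating
    >11.0 mmol/L as diabetic type is an assumption."""
    if glucose_mmol_l < 7.8:
        return "normal"
    if glucose_mmol_l <= 11.0:
        return "prediabetic"
    return "diabetic"
```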
Background:To understand the recent management status in Japan, we determined the low-density lipoprotein cholesterol (LDL-C) goal attainment (GA) rate of patients initiating statin monotherapy for dyslipidemia.
Methods and Results:Dyslipidemic patients undergoing either primary prevention with high cardiovascular risk or secondary prevention (defined by 2012 Japan Atherosclerosis Society Guidelines) were retrospectively analyzed from a hospital-based claims database. In both groups, the LDL-C levels and GA rates of patients treated with intensive or standard statin monotherapy for ≥4 weeks (January 2012–August 2016) were evaluated. Among 1,501,013 dyslipidemic patients, 11,695 and 9,642 were included in the primary and secondary prevention groups, respectively. A total of 94% of patients underwent statin monotherapy as the initial lipid-lowering therapy, of which most (≥80%) took intensive statins. The proportions of patients in the primary prevention group who achieved an LDL-C goal <120 mg/dL by intensive and standard statins were 81.1% and 61.2%, respectively, and the proportions of those who achieved a goal <100 mg/dL in the secondary prevention group were 73.3% and 48.1%, respectively. The GA rates were similar regardless of disease complications.
Conclusions:Most patients (>70%) in both groups achieved LDL-C management goals using intensive statin monotherapy. Further treatment approaches are required for high-risk patients not achieving LDL-C goals by initial statin monotherapy. Continuous efforts are crucial for adherence and persistence of lipid-lowering therapies.
Background:The geriatric nutritional risk index (GNRI) is a simple and objective nutritional assessment tool for elderly patients. Lower GNRI values are associated with a worse prognosis in patients with heart failure (HF). However, few data are available regarding the prognostic effect of the GNRI value for risk stratification in patients at risk for HF.
Methods and Results:We retrospectively investigated 1,823 consecutive patients at risk for HF (Stage A/B) enrolled in the IMPACT-ABI Study. GNRI on admission was calculated as follows: 14.89×serum albumin (g/dL)+41.7×body mass index/22. Patients were divided into 2 groups according to the median GNRI value (107.1). The study endpoint was a composite of cardiovascular (CV) events, including CV death and hospitalization for worsening HF. Over a 4.7-year median follow-up, CV events occurred in 130 patients. In the Kaplan-Meier analysis, patients with low GNRI (<107.1, n=904) showed worse prognoses than those with high GNRI (≥107.1, n=919) (20.2% vs. 12.4%, P<0.001). In the multivariable Cox proportional hazards analysis, low GNRI was significantly associated with the incidence of CV events (hazard ratio: 1.48, 95% confidence interval: 1.02–2.14; P=0.040).
Conclusions:The simple and practical assessment of GNRI may be useful for predicting CV events in patients with Stage A/B HF.
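The GNRI formula quoted in the Methods can be written out directly. The sketch below follows the abstract's formula verbatim (without the cap at an ideal-weight ratio of 1 used in some GNRI definitions) and dichotomizes at the study's median of 107.1; the names are illustrative, not from the study:

```python
MEDIAN_GNRI = 107.1  # median value in this cohort, per the abstract

def gnri(albumin_g_dl, bmi):
    """Geriatric nutritional risk index as defined in the abstract:
    14.89 x serum albumin (g/dL) + 41.7 x body mass index / 22."""
    return 14.89 * albumin_g_dl + 41.7 * bmi / 22

def gnri_group(albumin_g_dl, bmi):
    """Dichotomize at the cohort median (107.1); the low-GNRI group
    carried the worse prognosis in this study."""
    return "low" if gnri(albumin_g_dl, bmi) < MEDIAN_GNRI else "high"
```

For example, a patient with serum albumin 4.0 g/dL and BMI 22 has GNRI = 14.89×4.0 + 41.7 = 101.26, falling in the low-GNRI (higher-risk) group.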
Background:Although hepatitis C virus (HCV) is a known risk factor for cardiovascular disease, whether antiviral therapy (AVT) can reduce heart failure (HF) hospitalizations is unknown.
Methods and Results:In this population-based cohort study, we used data from the Taiwan National Health Insurance Research Database to evaluate the effect of interferon-based therapy (IBT) on cardiovascular events in patients with chronic HCV infection. Clinical outcomes evaluated included HF hospitalizations; a composite of acute myocardial infarction, ischemic stroke, and peripheral artery disease; all-cause death; and cardiovascular death. Of 83,229 eligible patients with chronic HCV infection, we compared 16,284 patients who received IBT with untreated subjects after propensity score matching. Patients who received IBT were less likely to be hospitalized for HF compared with untreated subjects (incidence density [ID], 0.9 vs. 1.5 events per 1,000 person-years; hazard ratio [HR], 0.58; 95% confidence interval [CI], 0.42–0.79; P=0.001). Compared with untreated subjects, the treated group had significantly lower risk of composite vascular events (ID, 3.7 vs. 5.0 events per 1,000 person-years; P<0.001), all-cause death (ID, 5.6 vs. 17.2 events per 1,000 person-years; P<0.001), and cardiovascular death (ID, 0.2 vs. 0.6 events per 1,000 person-years; P=0.001).
Conclusions:AVT for chronic HCV infection might offer protection against HF hospitalizations, critical vascular events, and cardiovascular death beyond known beneficial effects.
Background:Indwelling urethral catheters (IUC) are routinely inserted for the purpose of monitoring urine output in patients with acute heart failure (AHF). The benefit of IUC in patients capable of complying with urine collection protocols is unclear, and IUC carry multiple risks. This study describes the impact of IUC on AHF treatment.
Methods and Results:A total of 540 records were retrospectively analyzed. After exclusion criteria were applied, 316 patients were propensity matched to establish groups of 100 AHF patients who either did (IUC(+)) or did not receive an IUC (IUC(−)) upon admission. Hospital length of stay (9 vs. 7 days), in-hospital urinary complications (24 vs. 5%), and 1-year urinary tract infection rate (17 vs. 6%; HR, 3.145; 95% CI: 1.240–7.978) were significantly higher in the IUC(+) group (P<0.05 for all). There were no differences in 30-day rehospitalization (6 vs. 6%; HR, 0.981; 95% CI: 0.318–3.058; P=0.986) or major adverse cardiac/cerebrovascular events at 1 year (37 vs. 32%, HR, 1.070; 95% CI: 0.636–1.799; P=0.798).
Conclusions:Based on this retrospective analysis, the routine use of IUC may increase length of stay and UTI complications in AHF patients without reducing the risk for major cardiovascular and cerebrovascular events or 30-day rehospitalization rate.
Background:Research suggests that heart failure with reduced ejection fraction (HFrEF) is a state of systemic inflammation that may be triggered by microbial products passing into the bloodstream through a compromised intestinal barrier. However, whether the intestinal microbiota exhibits dysbiosis in HFrEF patients is largely unknown.
Methods and Results:Twenty-eight non-ischemic HFrEF patients and 19 healthy controls were assessed by 16S rRNA analysis of bacterial DNA extracted from stool samples. After processing of sequencing data, bacteria were taxonomically classified, diversity indices were used to examine microbial ecology, and relative abundances of common core genera were compared between groups. Furthermore, we predicted gene carriage for bacterial metabolic pathways and inferred microbial interaction networks on multiple taxonomic levels. Bacterial communities of both groups were dominated by the Firmicutes and Bacteroidetes phyla. The most abundant genus in both groups was Bacteroides. Although α diversity did not differ between groups, ordination by β diversity metrics revealed a separation of the groups across components of variation. Streptococcus and Veillonella were enriched in the common core microbiota of patients, while SMB53 was depleted. Gene families in amino acid, carbohydrate, vitamin, and xenobiotic metabolism showed significant differences between groups. Interaction networks revealed a higher degree of correlations between bacteria in patients.
Conclusions:Non-ischemic HFrEF patients exhibited multidimensional differences in intestinal microbial communities compared with healthy subjects.
Background:Diastolic function is an independent predictor of death in heart failure (HF), but the effect of a change in diastolic function during hospitalization on clinical outcomes in patients with hypertensive HF (HHF) has been poorly studied. Therefore, the aim of this study was to investigate the effect of predischarge diastolic functional recovery (DFR) on future clinical outcomes in hospitalized patients with a first diagnosis of HHF.
Methods and Results:A total of 175 hospitalized patients with HHF were divided into 2 groups according to the change in diastolic function on predischarge echocardiography compared with baseline echocardiography: DFR group (n=74; 54.2±17.1 years; 55 males) vs. no-DFR group (n=101; 59.1±16.8 years; 72 males). During 66.5±37 months of clinical follow-up, major adverse cardiac events (MACE) occurred in 89 patients: 85 HF rehospitalizations, 4 deaths, and no myocardial infarctions. The incidence of MACE was significantly higher in the no-DFR group than in the DFR group (61.6% vs. 32.4%, P<0.001). Predischarge systolic functional recovery was not a predictor of MACE, but impaired DFR was an independent predictor of MACE (RR, 2.952; 95% confidence interval, 1.878–6.955; P=0.010).
Conclusions:Impaired predischarge DFR, regardless of the type of HF or predischarge systolic functional recovery, is an independent predictor of future MACE in HHF. Changes in diastolic function should be carefully monitored and would be useful in risk stratification of HHF.
Byung Jin Kim, Dae Chul Seo, Bum Soo Kim, Jin Ho Kang
Type: ORIGINAL ARTICLE
Subject area: Hypertension and Circulatory Control
2018 Volume 82 Issue 6
Published: May 25, 2018 (advance publication released February 28, 2018)
Background:The relationship between chronic smoking and hypertension (HTN) is inconclusive in previous studies, which were mainly based on self-reported smoking status. The aim of this study was to evaluate the association of cotinine-verified smoking status with incident HTN.
Methods and Results:A total of 74,743 participants (43,104 men; age 38±5.4 years) were included in the study, with a mean follow-up period of 29 months. Individuals were divided into 4 groups on the basis of their cotinine-verified smoking status at baseline and at follow-up (never-smoking, new-smoking, former-smoking, and sustained-smoking). The incidence rate of HTN in the never-smoking, new-smoking, former-smoking, and sustained-smoking groups was 8.2%, 7.6%, 10.1%, and 8.7% for men and 1.8%, 2.5%, 1.5%, and 2.2% for women, respectively. In a multivariate Cox proportional hazards regression analysis adjusted for the variables with a univariate relationship, new-smoking and sustained-smoking had decreased relative risks (RRs) for incident HTN compared with never-smoking (RR [95% CI], 0.75 [0.58, 0.96] for new-smoking and 0.82 [0.74, 0.90] for sustained-smoking). Cotinine-verified current smoking at baseline was also inversely associated with incident HTN compared with cotinine-verified never-smoking at baseline (0.91 [0.84, 0.98]). These results remained significant only in men, although there was no sex interaction.
Conclusions:This longitudinal study showed that cotinine-verified new-smoking and sustained-smoking decreased the risk for incident HTN, especially in men, compared with never-smoking.
Background:There are few reports examining regional differences between rural prefectures and metropolitan areas in the management of acute myocardial infarction (AMI) in Japan.
Methods and Results:In the Rural AMI registry, a prospective, multi-prefectural registry of AMI in 4 rural prefectures (Ishikawa, Aomori, Ehime and Mie), a total of 1,695 consecutive AMI patients were registered in 2013. Among them, 1,313 patients who underwent primary percutaneous coronary intervention (PPCI) within 24 h of onset were enrolled in this study (Rural group), and compared with the cohort data from the Tokyo CCU Network registry for AMI in the same period (Metropolitan group, 2,075 patients). The prevalence of direct ambulance transport to PCI-capable facilities in the Rural group was significantly lower than that in the Metropolitan group (43.8% vs. 60.3%, P<0.01), which resulted in a longer onset-to-balloon time (OTB: 225 vs. 210 min, P=0.02) and lower prevalence of PPCI in a timely fashion (OTB ≤2 h: 11.5% vs. 20.7%, P<0.01) in the Rural group. Multivariate analysis revealed that direct ambulance transport was the strongest predictor for PPCI in a timely fashion (odds ratio=4.13, P<0.001).
Conclusions:AMI patients in rural areas were less likely to be transported directly to PCI-capable facilities, resulting in time delay to PPCI compared with those in metropolitan areas.
Background:Several studies have reported a relationship between clinical outcomes and the ankle-brachial index (ABI) in different populations. However, the relationship in Japanese patients or in patients undergoing percutaneous coronary intervention (PCI) has not been examined well.
Methods and Results: The subjects were 1,857 patients who underwent PCI from July 2007 to May 2010 and in whom the carotid arteries, renal arteries, and abdominal aorta were examined by ultrasonography and the ABI was measured at the same time. We investigated the relationship between ABI and major adverse cardiovascular events (MACE: all-cause death, myocardial infarction, and stroke). The median follow-up was 1,322 days (interquartile range: 1,092–1,566 days). Patients with low (<0.9), borderline (0.9–1.0), and high (>1.4) ABI had a significantly higher incidence of MACE at 4 years than the normal ABI group (1.0≤ABI≤1.4) (31%, 15%, 10%, and 29% for the low, borderline, normal, and high groups, respectively; log-rank P<0.0001), as well as higher all-cause mortality at 4 years (22%, 12%, 6.9%, and 29%, respectively; P<0.0001). The adjusted hazard ratios for MACE were 2.35 (1.72–3.20), 1.27 (0.89–1.80), and 1.87 (0.81–3.79) for low, borderline, and high ABI, respectively.
Conclusions: This study suggests that the ABI provides additional information for cardiovascular disease risk stratification in Japanese patients undergoing PCI, even when the ABI is borderline.
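The ABI cutoffs above amount to a simple classification rule. A minimal sketch follows; note that the study's borderline (0.9–1.0) and normal (1.0≤ABI≤1.4) ranges share the boundary value 1.0, so assigning exactly 1.0 to the normal group here is an assumption:

```python
def abi_category(abi: float) -> str:
    """Classify an ankle-brachial index value using the study's cutoffs:
    low <0.9, borderline 0.9 to <1.0, normal 1.0-1.4, high >1.4.
    The boundary value 1.0 is assigned to 'normal' (assumption)."""
    if abi < 0.9:
        return "low"
    if abi < 1.0:
        return "borderline"
    if abi <= 1.4:
        return "normal"
    return "high"
```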
Background: Pulmonary hypertension (PH) progresses more rapidly in patients with trisomy 21. However, the pulmonary arteriopathic lesions in these patients have not been fully characterized histopathologically.
Methods and Results: A retrospective review of a lung biopsy registry identified 282 patients: 188 with trisomy 21 (Group D) and 94 without (Group N). The mean age at lung biopsy was 3 months in Group D and 7 months in Group N (P<0.0001). Pulmonary arterial pressure (PAP) and pulmonary vascular resistance were similar between the 2 groups. There were no significant differences between the groups in the proportion of patients with irreversible intimal lesions or in the index of pulmonary vascular disease (IPVD; a measure of the degree of progression of pulmonary arteriopathy). In addition, after propensity score matching for patient background (n=43 in each group), there were no significant differences in IPVD (P=0.29) or in the ratio of irreversible intimal changes between Groups D and N (P=0.39). Multivariate analysis identified age (P<0.0001) and PAP (P=0.03) as the only risk factors for progression of pulmonary arteriopathy.
Conclusions: Histopathologically, early progression of pulmonary arteriopathy in patients with trisomy 21 was not demonstrated compared with patients without trisomy 21. Although we cannot exclude the possibility of selection bias among the Group D and N patients who were slated for lung biopsy, factors other than pulmonary arteriopathy may underlie the marked clinical progression of PH in trisomy 21 patients.
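Propensity score matching, as used above to balance patient background between Groups D and N, pairs each patient in one group with the closest-scoring patient in the other. A minimal greedy 1:1 nearest-neighbour sketch on precomputed propensity scores (illustrative only; the study's actual matching algorithm and caliper are not specified):

```python
def greedy_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.
    Returns a list of (treated_index, control_index) pairs; each
    control is used at most once, and pairs farther apart than the
    caliper are discarded."""
    available = dict(enumerate(control_ps))  # unmatched controls
    pairs = []
    for i, ps in sorted(enumerate(treated_ps), key=lambda x: x[1]):
        if not available:
            break
        # nearest remaining control by absolute score distance
        j, dist = min(((j, abs(ps - cps)) for j, cps in available.items()),
                      key=lambda x: x[1])
        if dist <= caliper:
            pairs.append((i, j))
            del available[j]
    return pairs
```

In practice the propensity scores themselves would first be estimated, e.g. by logistic regression of group membership on the background covariates.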
Background: The therapeutic efficacy of bone marrow mononuclear cell (BM-MNC) autotransplantation in critical limb ischemia (CLI) has been reported. Variable proportions of circulating monocytes express low levels of CD34 (CD14+CD34low cells) and behave in vitro as endothelial progenitor cells (EPCs). The aim of the present randomized clinical trial was to compare the safety and therapeutic effects of enriched circulating EPCs (ECEPCs) with those of BM-MNC administration.
Methods and Results: ECEPCs (obtained from non-mobilized peripheral blood by immunomagnetic selection of CD14+ and CD34+ cells) or BM-MNC were injected into the gastrocnemius of the affected limb in 23 and 17 patients, respectively. After a mean follow-up of 25.2±18.6 months, both groups showed significant and progressive improvement in muscle perfusion (primary endpoint), rest pain, consumption of analgesics, pain-free walking distance, wound healing, quality of life, ankle-brachial index, toe-brachial index, and transcutaneous PO2. In ECEPC-treated patients, there was a positive correlation between the injected CD14+CD34low cell count and the increase in muscle perfusion. The safety profile was comparable between the ECEPC and BM-MNC treatment arms. In both groups, the numbers of deaths and major amputations were lower than in eligible untreated patients and historical reference patients.
Conclusions: This study supports previous trials showing the efficacy of BM-MNC autotransplantation in CLI patients and demonstrates comparable therapeutic efficacy between BM-MNC and ECEPCs.
Background: Abnormalities in the left atrium (LA) detected on transesophageal echocardiography (TEE) are reliable predictors of thromboembolism in patients with atrial fibrillation (AF). Cardiac troponin I, a marker of subclinical myocardial damage, may also be a predictor of thromboembolic events in patients with AF. The relationship between cardiac troponin I and thromboembolic risk on TEE, however, remains unclear.
Methods and Results: TEE and laboratory data, including high-sensitivity cardiac troponin I (hs-cTnI) and CHA2DS2-VASc score, were analyzed in 199 patients with non-valvular AF (NVAF). Patients were stratified into those with or without LA abnormality, defined as an LA appendage flow velocity <20 cm/s or dense spontaneous echo contrast. On multiple logistic regression analysis of the clinical variables, hs-cTnI was associated with LA abnormality (95% CI: 1.0003–1.020, P=0.034). The area under the curve for LA abnormality increased with the addition of hs-cTnI to the CHA2DS2-VASc score. The incidence rate of ischemic stroke was higher in the high hs-cTnI group than in the low hs-cTnI group (log-rank test, P<0.05).
Conclusions: Elevated hs-cTnI was independently associated with LA abnormality in NVAF patients. hs-cTnI level may be a useful biomarker for risk stratification of thromboembolism in NVAF patients.
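The CHA2DS2-VASc score referenced above is a standard additive clinical risk score for stroke in AF. A minimal sketch of the standard scoring rule (the function and parameter names are illustrative, not from the paper):

```python
def cha2ds2_vasc(age: int, female: bool, chf: bool, hypertension: bool,
                 diabetes: bool, stroke_tia: bool, vascular: bool) -> int:
    """Standard CHA2DS2-VASc score: congestive heart failure +1,
    hypertension +1, age >=75 years +2, diabetes +1, prior
    stroke/TIA/thromboembolism +2, vascular disease +1,
    age 65-74 years +1, female sex +1."""
    score = chf + hypertension + diabetes + vascular + female
    if stroke_tia:
        score += 2
    if age >= 75:
        score += 2
    elif age >= 65:
        score += 1
    return score
```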
Background: Although minimally invasive mitral valve repair (MIMVR) is increasingly being performed, only a few clinical studies from Japanese institutions have been reported.
Methods and Results: From 2006 to 2017, 387 consecutive patients (135 females; mean age 56±13 years) underwent initial isolated MIMVR through a right minithoracotomy. The mitral etiology was degenerative in 348 cases, functional in 22, and endocarditis in 13. Repair techniques included leaflet resection/plication in 280 patients, chordal reconstruction in 109, and annuloplasty alone in 24; concomitant procedures included tricuspid valve repair in 70 patients (18.1%) and atrial fibrillation ablation in 78 (20.2%). The hospital mortality rate was 0.26%, and 2 patients (0.5%) required intraoperative conversion to median sternotomy. Perioperative morbidity included stroke (1.3%), reoperation for bleeding (0.8%), prolonged ventilation (0.5%), and permanent pacemaker implantation (2.1%). The transfusion rate was 14.7%, and the median ventilation time was 4 hours. Overall 5-year survival was 96.9%. For patients with degenerative mitral regurgitation (MR), 5-year freedom from reoperation or severe recurrent MR was 94.7%, and freedom from ≥moderate MR was 82.2%. Repair of anterior mitral leaflet prolapse and the initial 30 cases were associated with a higher occurrence of recurrent MR.
Conclusions: MIMVR can be performed safely with low mortality and morbidity, and provides sufficient repair durability. A learning curve exists in terms of repair durability, especially for anterior mitral leaflet repair.