Circulation Reports
Online ISSN : 2434-0790
Advance online publication
Displaying 1-35 of 35 articles from this issue
  • Takuya Nishino, Katsuhito Kato, Shuhei Tara, Daisuke Hayashi, Tomohisa ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Heart Failure
    Article ID: CR-25-0337
    Published: March 10, 2026
    Advance online publication: March 10, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: The number of patients with heart failure (HF) is increasing with the aging of the population, resulting in a shift in care from hospitals to community settings. Although predicting medium-term prognosis after discharge could improve community-based management and reduce readmissions, no established model has integrated structured multidimensional assessments into HF prognostic modeling.

    Methods and Results: This multicenter study developed and validated machine learning (ML) models (i.e., logistic regression, random forest, extreme gradient boosting, and light gradient boosting) to predict 180-day mortality or emergency hospitalization in 4,904 patients with HF. Patients were randomly divided into training and validation sets (8:2). Nursing care needs, derived from structured nursing assessments that capture patients’ physical status and care dependency, were included as a predictive feature. All models demonstrated acceptable discriminative performance based on the area under the precision-recall curve, favorable calibration assessed by the calibration slope and Brier score, and effective risk stratification. The Shapley additive explanations algorithm identified nursing care needs as an important prognostic factor, alongside established laboratory variables for HF prognosis.

    Conclusions: ML models incorporating nursing care needs effectively predicted the 180-day prognosis of patients with HF. The prominent contribution of nursing care needs underscores the value of incorporating structured multidimensional care-related information into prognostic modeling and highlights the importance of team-based post-discharge HF management.
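    The evaluation steps named in this abstract (a random 8:2 split and the Brier score for calibration) can be illustrated with a minimal pure-Python sketch. This is illustrative only: the study's actual pipeline is not described at this level of detail, and `brier_score` and `train_validation_split` are hypothetical helper names, not from the paper.

    ```python
    import random

    def brier_score(y_true, y_prob):
        """Brier score: mean squared error between predicted probabilities
        and observed binary outcomes (lower means better calibration)."""
        return sum((p - y) ** 2 for y, p in zip(y_true, y_prob)) / len(y_true)

    def train_validation_split(records, train_frac=0.8, seed=0):
        """Random split into training and validation sets, here 8:2 as in the abstract."""
        rng = random.Random(seed)
        shuffled = records[:]
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * train_frac)
        return shuffled[:cut], shuffled[cut:]
    ```

    For intuition: a model that predicts every outcome perfectly scores 0, while predicting 0.5 for every case scores 0.25.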

  • Chiaki Mizuno, Hiroaki Hiraiwa, Shinya Yokoyama, Takanori Ito, Kazuki ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Heart Failure
    Article ID: CR-25-0323
    Published: March 07, 2026
    Advance online publication: March 07, 2026
    Supplementary material

    Background: Locomotive syndrome (LS), an early predictor of physical frailty, is frequently evaluated during health screenings in Japan. However, its relationship with early cardiac dysfunction remains unclear. We investigated the association between LS severity and N-terminal pro B-type natriuretic peptide (NT-proBNP) concentration, a myocardial stress biomarker, in a general health screening population.

    Methods and Results: This cross-sectional study analyzed 8,593 individuals who underwent health checkups between 2018 and 2023 in Aichi, Japan. LS was assessed using the Short Test Battery for LS, including physical tests and a 25-item questionnaire, and NT-proBNP was measured from blood samples. Individuals were categorized by LS stage (non-LS and stages 1–3) and NT-proBNP concentration (<55, 55–124, 125–299, ≥300 pg/mL). Logistic regression was used to evaluate the association between LS severity and elevated NT-proBNP (cut-off ≥125 pg/mL). A stepwise increase in NT-proBNP was observed with higher LS stage (P for trend <0.001). Compared with non-LS, LS stages 2 (odds ratio [OR] 2.24; 95% confidence interval [CI] 1.35–3.72) and 3 (OR 3.51; 95% CI 2.02–6.12) were independently associated with elevated NT-proBNP.

    Conclusions: The increase in NT-proBNP with LS severity suggests a link between myocardial stress progression and physical function decline in asymptomatic individuals. LS assessment may help identify early-stage cardiovascular dysfunction, although the causal relationship between LS and cardiac dysfunction remains to be clarified.
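    The NT-proBNP banding described in this abstract maps directly to a threshold lookup. A minimal sketch, with function names that are illustrative rather than from the study:

    ```python
    def ntprobnp_category(pg_ml):
        """NT-proBNP band (pg/mL) as defined in the abstract."""
        if pg_ml < 55:
            return "<55"
        if pg_ml < 125:
            return "55-124"
        if pg_ml < 300:
            return "125-299"
        return ">=300"

    def elevated_ntprobnp(pg_ml):
        """Elevated NT-proBNP at the abstract's cut-off of >=125 pg/mL."""
        return pg_ml >= 125
    ```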

  • Satoshi Oka, Koji Miyamoto, Chisa Asahina, Toshihiro Nakamura, Akinori ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Arrhythmia/Electrophysiology
    Article ID: CR-25-0340
    Published: March 07, 2026
    Advance online publication: March 07, 2026
    Supplementary material

    Background: Significant atrial low-voltage zones (LVZs), indicative of advanced atrial fibrillation (AF), are associated with atrial arrhythmia (AA) recurrence following catheter ablation. Although preoperative prediction remains challenging, a low plasma atrial natriuretic peptide (ANP) level relative to B-type natriuretic peptide (BNP) reflects atrial fatigue with impaired ANP secretion and may indicate advanced atrial remodeling and LVZs.

    Methods and Results: We retrospectively evaluated 166 consecutive patients with persistent AF who underwent initial catheter ablation using a 3-dimensional mapping system. The optimal ANP/BNP ratio cut-off for predicting LVZ presence was determined using receiver operating characteristic curve analysis. The primary outcome was AA recurrence. An ANP/BNP ratio of 0.7 was optimal for predicting LVZ presence (area under the curve 0.76; sensitivity 81%; specificity 60%). Patients with an ANP/BNP ratio ≤0.7 (n=91) had a significantly higher prevalence of LVZs (52% vs. 15%; P<0.001) and higher AA recurrence risk following initial pulmonary vein isolation (log-rank P=0.025; hazard ratio 1.85; 95% confidence interval 1.09–3.14; median follow-up period 583 days).

    Conclusions: Serum ANP/BNP ratio is a useful surrogate biomarker for predicting advanced atrial remodeling with significant LVZs and AA recurrence following catheter ablation. ANP secretion assessment may help in candidate selection among patients with persistent AF who can benefit from catheter ablation.

  • Daichi Kobayashi, Masakazu Saitoh, Kentaro Hori, Shinya Tajima, Kotaro ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiac Rehabilitation
    Article ID: CR-25-0341
    Published: March 07, 2026
    Advance online publication: March 07, 2026
    Supplementary material

    Background: Transcatheter aortic valve implantation (TAVI) has enhanced outcomes in patients with severe aortic stenosis (AS). However, the additional effect of introducing outpatient cardiac rehabilitation (OCR) remains unclear. In this study, we investigated how OCR participation is associated with post-discharge all-cause mortality among patients who underwent TAVI.

    Methods and Results: A retrospective cohort study involving 1,446 patients with AS who underwent elective TAVI was conducted. The patients were classified into the OCR participation group (n=100) and the non-participation group (n=1,346) based on whether they participated in OCR after discharge. Propensity score matching was conducted to adjust for confounding factors. The mean follow-up period was 2.9±2.0 years. Patients undergoing OCR experienced a lower all-cause mortality rate (log-rank test P=0.001). Multivariate analysis showed that OCR participation was independently associated with lower all-cause mortality after discharge, even after adjusting for known prognostic factors.

    Conclusions: OCR participation after TAVI in patients with AS is an independent prognostic factor for survival. Cardiac rehabilitation teams should actively encourage patients to participate in OCR.

  • Koki Hanamoto, Junya Tanabe, Tadashi Takasaki, Kazuhiro Yamazaki, Kazu ...
    Article type: IMAGES IN CARDIOVASCULAR MEDICINE
    Article ID: CR-25-0225
    Published: March 06, 2026
    Advance online publication: March 06, 2026
  • Asaki Saijo, Hidetaka Itoh, Yuko Tanabe, Chinatsu Komiyama, Ayako Hari ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiac Rehabilitation
    Article ID: CR-25-0331
    Published: March 05, 2026
    Advance online publication: March 05, 2026

    Background: Despite growing interest in cardio-oncology rehabilitation (CORE), data on cardiopulmonary exercise testing (CPX/CPET) in Japanese cancer patients remain scarce.

    Methods and Results: We reviewed 440 CPX examinations at Toranomon Hospital (2018–2023) and identified 37 tests from 28 patients with active cancer and cardiovascular problems. CPX parameters included peak oxygen uptake (peak V̇O2), anaerobic threshold (AT), ventilatory efficiency (V̇E vs. V̇CO2 slope), and metabolic equivalents (METs). Patients were classified into those with cancer therapy-related cardiac dysfunction (CTRCD or subclinical CTRCD) and those without. Various anticancer agents had been used, with anthracycline exposure more frequent in the CTRCD group. No patient received rehabilitation before CPX. Median age was 60 years; 68% were female. Cancers included breast (n=17), lymphoma (n=5), leukemia (n=3), and others. Cardiovascular problems comprised CTRCD/subclinical CTRCD (n=15), ischemic heart disease (n=2), and others. Median peak V̇O2 was 14.7 mL/kg/min (63% of predicted), with 43% below the prognostic threshold of 14 mL/kg/min. Median AT was 10.9 mL/kg/min and the V̇E vs. V̇CO2 slope 30.7, indicating reduced cardiorespiratory function. No significant differences were observed between the CTRCD and non-CTRCD groups. In 4 patients with serial CPX, changes in exercise capacity did not always parallel left ventricular ejection fraction.

    Conclusions: In cancer patients with cardiovascular problems, CPX revealed reduced exercise tolerance beyond cardiac function. These findings highlight the need for individualized rehabilitation and may inform future CORE protocols.

  • Tetsuya Matsuyama, Takayuki Okamura, Tatsuhiro Fujimura, Yosuke Miyaza ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Ischemic Heart Disease
    Article ID: CR-26-0006
    Published: March 05, 2026
    Advance online publication: March 05, 2026

    Background: Periprocedural myocardial injury (PMI) is a common complication of percutaneous coronary intervention (PCI). Elevated post-PCI index of microcirculatory resistance (IMR) has been linked to PMI. The angiography-derived IMR (angio-IMR) serves as a pressure-wire-free method to assess coronary microvascular function. This study aimed to establish the association between post-PCI angio-IMR and PMI.

    Methods and Results: We retrospectively analyzed 101 consecutive elective PCI cases where PMI diagnosis and post-PCI angio-IMR calculation were feasible. Angio-IMR was computed using computational flow and pressure simulations. Patients were categorized into 2 groups based on PMI status: PMI (n=33), and non-PMI (n=68). The PMI group had significantly higher post-PCI angio-IMR values than the non-PMI group (31.8±5.9 vs. 23.8±6.0; P<0.001). Both univariate and multivariate logistic regression analyses revealed an association between post-PCI angio-IMR and PMI. Patients with post-PCI angio-IMR ≥29 had a significantly higher incidence of PMI (67.6% vs. 12.5%; P<0.001).

    Conclusions: Increased post-PCI angio-IMR values were strongly associated with PMI. Post-PCI angio-IMR might serve as a useful non-invasive predictor of PMI following elective PCI.

  • Naoto Murakami, Kenichi Ishizu, Masaomi Hayashi, Shinichi Shirai
    Article type: ORIGINAL ARTICLE
    Subject Area: Ischemic Heart Disease
    Article ID: CR-25-0227
    Published: February 27, 2026
    Advance online publication: February 27, 2026
    Supplementary material

    Background: The relationship between the reduction in aortic valve pressure gradient (AVG) after transcatheter aortic valve implantation (TAVI) and improvements in left ventricular ejection fraction (LVEF) or long-term survival remains unclear.

    Methods and Results: We retrospectively analyzed 121 patients with aortic stenosis (AS) and LVEF <50% who underwent TAVI. Transthoracic echocardiography (TTE) was performed before and after TAVI, and the difference in mean AVG was defined as delta-AVG. LVEF improvement was defined as a ≥10% increase from baseline at 1 year. Among 82 patients with 1-year TTE evaluation, LVEF improvement was observed in 37 (45.1%) patients. A higher delta-AVG was identified as an independent predictor of LVEF improvement (odds ratio 1.04; 95% confidence interval [CI] 1.01–1.07; P=0.002), and receiver operating characteristic analysis indicated an optimal cut-off of 20.5 mmHg (sensitivity 48.9%; specificity 89.2%; area under the curve 0.723; P=0.002). During a mean follow-up of 1,042.6±577.5 days, 5 cardiac deaths occurred. Cox proportional hazards analysis identified a low delta-AVG as an independent predictor of cardiac death (hazard ratio 0.91; 95% CI 0.7–0.99; P=0.023). Additionally, a greater delta-AVG was significantly associated with a larger increase in delta-stroke volume from post-TAVI to 1 year (r=0.255; P=0.021).

    Conclusions: In patients with severe AS and reduced LVEF, delta-AVG could be a novel predictor of 1-year LVEF improvement and long-term survival after TAVI.

  • Yuka Odate, Yuki Nakano, Mayumi Nagasaka, Yukiko Hirose, Misao Suzuki, ...
    Article type: The 30th Japanese Association of Cardiac Rehabilitation Annual Meeting (2024)
    Article ID: CR-26-0012
    Published: February 26, 2026
    Advance online publication: February 26, 2026

    Background: Early detection of heart failure (HF) relies on community-based interventions supported by seamless coordination. In this study, we examined the challenges and opportunities in developing an HF support system, drawing on hospital-led community collaboration activities and a participant survey.

    Methods and Results: A cross-sectional survey of 31 participants in a regional multidisciplinary meeting yielded 13 responses (41.9%). Non-medical professionals demonstrated lower baseline knowledge but higher satisfaction and learning effectiveness. Principal component analysis revealed occupational differences.

    Conclusions: Foundational education was effective, particularly for non-medical professionals, and highlighted the need to address disparities in knowledge and tool utilization.

  • Yoko M. Nakao, Atsushi Takayama, Koji Kawakami
    Article type: PROTOCOL PAPER
    Article ID: CR-25-0314
    Published: February 14, 2026
    Advance online publication: February 14, 2026
    Supplementary material

    Background: Day-to-day home blood pressure variability (BPV) is associated with cardiovascular risk and influenced by environmental conditions. However, it is unclear whether short-term increases in day-to-day BPV can be predicted from personal sensor data. In this study, our aim is to develop and validate a machine-learning prediction model for short-term increases in day-to-day BPV using personal sensor data on behavioral and environmental exposure.

    Methods and Results: We will conduct a 30-day monitoring study in community-dwelling adults. Participants will measure home BP twice daily, while a portable sensor and an activity tracker record environmental conditions and physical activity. The primary outcome is an episode of increased systolic day-to-day BPV, defined as a rolling 5-day coefficient of variation ≥11.0%. Candidate predictors will be derived from the preceding 5-day exposure window. We will construct window-level data, allocate participants to training and test sets, and train machine-learning models with participant-level cross-validation. We will evaluate performance using the area under the receiver operating characteristic curve, calibration, Brier score, and decision-curve analysis, and interpret the XGBoost model with Shapley additive explanations to quantify the predictor contributions.

    Conclusions: This protocol outlines a framework for predicting short-term increases in day-to-day BPV from personally experienced environmental exposure and behaviors, supporting future personalized interventions targeting modifiable environmental and behavioral factors.
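    The protocol's primary-outcome definition (a rolling 5-day coefficient of variation of systolic home BP ≥11.0%) can be sketched in pure Python. Assumptions: the sample standard deviation is used (the abstract does not specify sample vs. population SD), and the function names are illustrative, not from the protocol.

    ```python
    import statistics

    def rolling_cv(values, window=5):
        """Rolling coefficient of variation (%) = 100 * SD / mean over each window."""
        return [
            100 * statistics.stdev(values[i - window + 1 : i + 1])
            / statistics.mean(values[i - window + 1 : i + 1])
            for i in range(window - 1, len(values))
        ]

    def bpv_episode_flags(systolic_bp, window=5, threshold=11.0):
        """Flag each window whose day-to-day variability meets the episode definition."""
        return [cv >= threshold for cv in rolling_cv(systolic_bp, window)]
    ```

    For example, five readings of 100, 110, 120, 130, 140 mmHg give a CV of about 13.2%, which would count as an episode at the 11.0% threshold.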

  • Saeko Iikura, Yuki Ikeda, Shohei Nakahara, Yuki Watanabe, Yosuke Haruk ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Critical Care
    Article ID: CR-25-0283
    Published: February 21, 2026
    Advance online publication: February 21, 2026

    Background: The clinical differences between intra-aortic balloon pumping (IABP) and a microaxial flow pump (Impella) for left ventricular (LV) unloading in patients with fulminant myocarditis (FM) supported with venoarterial extracorporeal membrane oxygenation (VA-ECMO) remain unclear.

    Methods and Results: In this single-center, retrospective cohort study, we analyzed 27 consecutive patients with lymphocytic FM who received VA-ECMO support. Patients were stratified by the LV unloading device that was used: IABP (n=15); or Impella (n=12). The primary endpoint was a composite of all-cause mortality or implantation of an extracorporeal ventricular assist device (exVAD) within 30 days of VA-ECMO initiation. Temporal changes in laboratory and hemodynamic parameters during the first 7 days of support were also assessed. Baseline characteristics, including LV ejection fraction (IABP 16% vs. Impella 18%; P=0.814) and QRS duration (139 vs. 105 ms; P=0.805), were comparable between groups. Nine patients met the primary endpoint (mortality [n=7]; exVAD implantation [n=2]). Kaplan-Meier analysis revealed a significantly lower incidence of the primary endpoint in the Impella group (log-rank P=0.018). The Impella group also showed a significantly greater improvement in cardiac power output (group×time interaction, P=0.040). Hemolysis, elevated total bilirubin, and increased serum creatinine were more pronounced in the Impella group.

    Conclusions: In patients with FM requiring VA-ECMO, LV unloading with Impella was associated with improved short-term clinical outcomes compared with IABP.

  • Tomohito Gohda, Nozomu Kamei, Marenao Tanaka, Masato Furuhashi, Tatsuy ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Renal Disease
    Article ID: CR-25-0345
    Published: February 21, 2026
    Advance online publication: February 21, 2026
    Supplementary material

    Background: The original cardiovascular–kidney–metabolic (CKM) staging system uniformly categorized all individuals with type 2 diabetes (T2D) as Stage 2. We aimed to improve the prognostic accuracy for chronic kidney disease (CKD) progression by incorporating the Kidney Disease: Improving Global Outcomes risk categories – based on estimated glomerular filtration rate (eGFR) and urinary albumin-to-creatinine ratio (UACR) – into CKM Stage 2.

    Methods and Results: This study included 600 individuals with T2D from Kure Medical Center and Chugoku Cancer Center. The primary outcome was CKD progression, defined as a ≥30% decline in eGFR. The refined system significantly improved risk stratification for CKD progression compared with the original system, showing a higher area under the receiver operating characteristic curve and greater integrated discrimination improvement. The risk of CKD progression, reflected by hazard ratios derived from the Fine–Gray subdistribution hazard models, increased progressively across the refined CKM stages after adjustment for potential confounders, including baseline eGFR. However, the independent prognostic value of the refined system was attenuated when baseline UACR was additionally included in the model.

    Conclusions: Integrating eGFR and UACR into the original CKM staging system enhances the prognostic performance for CKD progression in individuals with T2D. This refined system, incorporating these renal biomarkers, provides superior risk stratification compared with the original system, and serves as a more robust tool for clinical prognostic assessment.

  • Kohei Shiota, Masakazu Saitoh, Kotaro Iwatsu, Tomoyuki Morisawa, Tetsu ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiac Rehabilitation
    Article ID: CR-25-0267
    Published: February 20, 2026
    Advance online publication: February 20, 2026
    Supplementary material

    Background: Falls are a serious medical problem. With the aging of patients with cardiovascular disease (CVD), falls have become an important clinical outcome. However, evidence regarding falls in this population is limited, and the impact of cardiac rehabilitation (CR) remains unclear. This study investigated the incidence of falls and examined the association between outpatient CR (OCR) and falls among older patients with CVD.

    Methods and Results: This single-center prospective cohort study included 110 patients with CVD aged ≥65 years who participated in early phase II CR (mean age 77±6 years; 36% women). The occurrence and frequency of falls within 1 year of discharge were assessed using a mailed self-reported questionnaire. Participants were divided into non-OCR and OCR groups. The overall incidence rate of falls was 20.9%. The non-OCR group had a significantly higher occurrence and frequency of falls than the OCR group. Negative binomial and modified Poisson regression analyses demonstrated that OCR participation was significantly associated with a lower fall rate (adjusted incidence rate ratio 0.42; 95% confidence interval [CI] 0.23–0.76; P<0.01) and risk (adjusted risk ratio 0.39; 95% CI 0.18–0.89; P=0.02).

    Conclusions: Among older patients with CVD, approximately 20% experienced a fall within 1 year after hospital discharge. Patients who participated in the OCR program had significantly lower fall rates and risks.

  • Tetsuya Takahashi, Taiga Ishigaki, Wataru Katawaki, Taku Toshima, Yu K ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Ischemic Heart Disease
    Article ID: CR-25-0320
    Published: February 20, 2026
    Advance online publication: February 20, 2026
    Supplementary material

    Background: Drug-coated balloon (DCB) is a novel treatment option for percutaneous coronary intervention (PCI). The presence of heart failure (HF) in patients with coronary artery disease (CAD) is associated with a poor prognosis. However, the clinical significance of DCB-based PCI in CAD patients with HF is unknown.

    Methods and Results: This was a retrospective analysis of a prospective, single-center registry covering 2015 to 2024. We enrolled 258 CAD patients with chronic HF who underwent PCI with DCB alone or in combination with a drug-eluting stent (DES). Propensity score matching analysis was performed between the DCB-based PCI and DES-only PCI groups. The primary endpoint of this study was all-cause mortality. Baseline clinical characteristics were comparable between the groups. The total number and length of DES were significantly lower in patients with DCB-based PCI than in those with DES-only PCI. Kaplan-Meier analysis revealed that the DCB-based PCI group had a significantly lower rate of all-cause mortality compared with the DES-only group (log-rank test, P=0.04).

    Conclusions: In CAD patients with chronic HF, DCB-based PCI was associated with a lower risk of mortality compared with DES-only PCI.

  • Ayano Yoshida, Takuma Takada, Eiji Shibahashi, Takuro Abe, Kensuke Shi ...
    Article type: RESEARCH LETTER
    Article ID: CR-25-0306
    Published: February 19, 2026
    Advance online publication: February 19, 2026

    Background: The prognosis of acute myocarditis (AM) is difficult to predict due to its variable presentation. We investigated prognostic factors in AM patients requiring hospitalization.

    Methods and Results: We conducted a multicenter observational study including 80 hospitalized AM patients. The primary endpoint was a composite of all-cause death, heart transplantation, and implantation of ventricular assist devices during the index hospitalization. Thirteen (16%) patients reached the endpoint. Longer QRS duration at admission independently predicted adverse outcomes, with an optimal cut-off of 130 ms.

    Conclusions: Prolonged QRS duration at admission might predict in-hospital prognosis in AM patients regardless of whether or not it was fulminant.

  • Mayuka Masuda, Hiroyuki Yamamoto, Shinsuke Nakano, Nobuyuki Takahashi, ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiovascular Intervention
    Article ID: CR-25-0322
    Published: February 17, 2026
    Advance online publication: February 17, 2026
    Supplementary material

    Background: Drug-coated balloon (DCB) angioplasty has shown short-term feasibility for large-vessel coronary artery disease (CAD; ≥3 mm); however, long-term outcomes remain unclear. This study aimed to evaluate the 5-year cardiovascular outcomes of DCB angioplasty vs. drug-eluting stents (DES) for de novo large-vessel CAD.

    Methods and Results: This single-center retrospective study analyzed patients undergoing percutaneous coronary intervention (PCI) with either DCB (SeQuent Please) or DES (Xience Alpine) between January 2016 and December 2018. The primary outcomes were cardiovascular events (CVE), defined as a composite of cardiac death, non-fatal myocardial infarction, and target lesion revascularization (TLR). Secondary outcomes included minimal lumen diameter (MLD), diameter stenosis (DS), and late lumen loss (LLL), assessed at the index PCI and at the 1-year angiographic follow-up. Overall, 114 patients (122 lesions) in the DCB group and 269 patients (293 lesions) in the DES group were analyzed, with similar median follow-up durations (1,678 vs. 1,825 days; P=0.687). At 5 years, TLR and CVE rates were comparable between the DCB and DES groups (7.9% vs. 4.5%, P=0.239; and 11.4% vs. 10.4%, P=0.773, respectively). No significant differences in MLD, DS, or LLL were observed between the groups at the 1-year follow-up.

    Conclusions: With careful lesion selection and preparation, DCB angioplasty could serve as a feasible treatment option for de novo large-vessel CAD in clinical practice.

  • Naoto Yabu, Tomoyuki Minami, Shota Yasuda, Yoshiyuki Kobayashi, Aya Sa ...
    Article type: RESEARCH LETTER
    Article ID: CR-25-0335
    Published: February 10, 2026
    Advance online publication: February 10, 2026

    Background: Postoperative atrial fibrillation (POAF) after cardiac surgery requires prompt intervention. We compared landiolol with verapamil for POAF treatment.

    Methods and Results: This randomized trial enrolled 179 patients; 45 developed POAF. Landiolol achieved a higher rate of sinus rhythm conversion at 8 h (73.3% vs. 16.7%; P=0.0016), but not at 12 h. No differences were observed in recurrence, adverse events, or intensive care unit stay.

    Conclusions: Landiolol facilitates earlier rhythm conversion without clear short-term clinical benefit.

  • Rie Aoyama, Tatsuki Ugawa, Shinichi Okino, Shigeru Fukuzawa
    Article type: RESEARCH LETTER
    Article ID: CR-26-0005
    Published: February 14, 2026
    Advance online publication: February 14, 2026

    Background: The Kumamoto criteria have been proposed for predicting 99mTc-pyrophosphate (PYP) uptake, but their real-world performance is uncertain.

    Methods and Results: We reviewed 102 consecutive patients who underwent PYP scintigraphy; grade ≥2 was considered positive and grade 1 equivocal/negative. Of these, 15 were positive; 11 patients with Kumamoto scores of 0–1 were nevertheless PYP positive, whereas 8 with a score of 2 were negative. Adding age, sex, PR interval, and atrial fibrillation improved the area under the curve (AUC) from 0.598 to 0.866; excluding sex yielded an AUC of 0.842.

    Conclusions: Using the Kumamoto criteria alone showed limited discrimination; combining routine variables may help select patients for PYP scintigraphy.

  • Tomohiro Kato, Yuta Ozaki, Shigefumi Honda, Yusuke Uemura, Kenji Takem ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiac Rehabilitation
    Article ID: CR-25-0299
    Published: February 13, 2026
    Advance online publication: February 13, 2026
    Supplementary material

    Background: Postoperative declines in activities of daily living (ADL) are concerning in older adults undergoing cardiovascular surgeries. Sarcopenia represents a determinant of such adverse outcomes. We examined whether preoperative sarcopenia and its components predicted postoperative ADL decline in older patients who underwent elective cardiovascular surgeries.

    Methods and Results: This retrospective cohort study included 589 patients aged ≥65 years who underwent elective coronary artery bypass grafting, heart valve surgery, or thoracic aortic surgery. Sarcopenia was defined according to the Asian Working Group for Sarcopenia 2019 criteria. ADLs were assessed using the Barthel Index, with in-hospital ADL decline being defined as a ≥10-point reduction. Thirty-three (5.6%) patients had sarcopenia preoperatively. ADL decline was significantly higher in the patients with sarcopenia compared with those without (15.2% vs. 5.0%; P=0.014). Multivariable logistic regression analyses demonstrated that sarcopenia was independently associated with ADL decline (odds ratio 3.094; 95% confidence interval 1.067–8.968; P=0.038). Each sarcopenia component – low muscle mass, low muscle strength, and slow gait speed – was also independently associated with ADL decline (all P<0.050). Age-adjusted receiver operating characteristic analyses showed that sarcopenia demonstrated moderate discrimination for predicting postoperative ADL decline, with an area under the curve of 0.707.

    Conclusions: Preoperative sarcopenia and its individual components independently predicted in-hospital ADL decline following cardiovascular surgery. Preoperative assessments may help identify high-risk patients.

  • Kenichi Sasaki, Shingo Kuwata, Masaki Izumo, Yukio Sato, Takahiko Kai, ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Valvular Heart Disease
    Article ID: CR-25-0249
    Published: February 07, 2026
    Advance online publication: February 07, 2026
    Supplementary material

    Background: The clinical impact of left QRS axis deviation (LAD) during new-onset left bundle branch block (LBBB) after transcatheter aortic valve replacement (TAVR) remains unclear.

    Methods and Results: This single-center retrospective study analyzed 254 patients who developed new-onset LBBB during hospitalization after TAVR. Clinical and echocardiographic outcomes were compared between patients with LBBB and LAD (LBBB-LAD) and those with LBBB and a normal QRS axis (LBBB-NA). Ninety-six patients (38%) had LBBB-LAD, defined as a QRS axis <−30°. A more leftward preprocedural QRS axis independently predicted LBBB-LAD (odds ratio 1.20 per 10° decrement; 95% confidence interval (CI) 1.09–1.33; P<0.01). At 3 years, there were no significant differences between groups in all-cause death (28% vs. 19%; P=0.14), cardiovascular death (6% vs. 5%; P=0.73), or heart failure rehospitalization (18% vs. 10%; P=0.07). However, LBBB-LAD was associated with a higher incidence of permanent pacemaker implantation (PPI) for atrioventricular conduction disorder (16% vs. 6%; P=0.02) and remained an independent predictor of PPI (Cox hazard ratio 2.46; 95% CI 1.06–5.73; P=0.04). Echocardiographic measures, including left ventricular ejection fraction, chamber size, and mitral regurgitation severity, showed no significant longitudinal differences between groups.

    Conclusions: Compared with post-TAVR LBBBNA, post-TAVR LBBBLAD is associated with an increased need for PPI, but not with adverse mortality or heart failure outcomes at 3-year follow-up. Closer and extended rhythm monitoring may be warranted in this subgroup.

  • Kazuya Kito, Masakazu Saitoh, Yuji Mori, Keita Fujiyama, Masahiro Toda ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Heart Failure
    Article ID: CR-25-0274
    Published: February 06, 2026
    Advance online publication: February 06, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: Renal dysfunction (RD) is common at admission for acute heart failure (AHF), but there is limited evidence focusing on older adults and considering the influence of physical function. We evaluated the prognostic significance of admission RD severity as a risk factor for adverse outcomes in older patients with AHF, while considering the potential modifying effect of physical function.

    Methods and Results: This multicenter prospective cohort study enrolled 710 patients aged ≥65 years with an estimated glomerular filtration rate (eGFR) <60 mL/min/1.73 m². Admission RD was stratified into 4 severity classes: mild RD (eGFR 45–59), moderate RD (eGFR 30–44), severe RD (eGFR 15–29), and kidney failure (eGFR <15). The primary outcome was a composite of HF readmission and all-cause death within 1 year post-discharge. Subgroup analyses assessed potential effect modification by physical function and other variables. After multivariable adjustment, severe RD or kidney failure was significantly associated with a higher risk of the composite outcome compared with mild RD (adjusted hazard ratio: 1.529; 95% confidence interval: 1.005–2.326). A possible interaction was observed between moderate RD and the Short Physical Performance Battery score at discharge (P for interaction=0.093).

    Conclusions: Severe RD or kidney failure at admission independently predicted 1-year HF readmission and all-cause death. In moderate RD, physical function may modify RD prognostic impact.

  • Takayuki Gyoten, Yuta Kanazawa, Yu Kumagai, Takayuki Akatsu, Yuko Gata ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiovascular Surgery
    Article ID: CR-25-0307
    Published: February 06, 2026
    Advance online publication: February 06, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION

    Background: Data on the clinical outcomes and hemodynamic performance of the smallest commercially available bioprostheses (19 mm) in Japan for aortic valve replacement (AVR) remain limited.

    Methods and Results: We analyzed the data of 187 adults (median age, 76 [interquartile range (IQR): 73–80] years; 165 women [88%]; median follow-up, 65 [IQR: 32–95] months) with symptomatic aortic valve stenosis, regurgitation, and valve deterioration who underwent surgical AVR between January 2015 and July 2024 with the Avalus (n=7), Magna (n=77), Epic (n=26), Inspiris (n=58), or Mosaic (n=27) bioprosthesis because of having small aortic annuli. The primary and secondary endpoints were all-cause death and major adverse cardiac events, respectively. Moderate-to-severe prosthesis-patient mismatch occurred in 53 patients (28%). The overall survival rates (95% confidence interval [CI]) at 1, 3, and 5 years after valve replacement were 93.0% (88.3–95.9%), 87.0% (81.0–91.2%), and 85.7% (79.5–90.1%), respectively. The rates of freedom from major adverse cardiac and cerebrovascular events (95% CI) at 1, 3, and 5 years were 96.2% (92.1–98.2%), 90.2% (84.5–93.9%), and 88.7% (82.5–92.7%), respectively. Four patients required re-intervention (3 re-AVR, 1 medication). No significant differences were observed in outcomes or hemodynamics among the different aortic bioprostheses.

    Conclusions: Surgical replacement with 19-mm third-generation aortic valve bioprostheses for small aortic annuli is feasible with favorable early and mid-term hemodynamics.

  • Tetsuya Kimura, Yugo Yamashita, Yasutaka Ihara, Megumi Mizutani, Ryota ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Onco-Cardiology
    Article ID: CR-25-0167
    Published: January 31, 2026
    Advance online publication: January 31, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: Non-small cell lung cancer (NSCLC) is associated with a high risk of venous thromboembolism (VTE). However, data on specific risk factors for VTE in patients with advanced NSCLC remain limited.

    Methods and Results: Using a Japanese nationwide administrative database, we analyzed 20,206 patients aged ≥18 years with advanced NSCLC who received first-line chemotherapy between January 2016 and January 2023. VTE events were identified through International Classification of Diseases, Tenth Revision codes and imaging studies. Risk factors were evaluated using Cox proportional hazards models with time-dependent covariates. The cumulative incidence of VTE was 4.2% and 6.1% at 365 and 730 days after the first date of chemotherapy for NSCLC, respectively. Several significant risk factors for VTE were identified, including female sex (hazard ratio [HR] 1.374; 95% confidence interval [CI] 1.157–1.631), higher body mass index (HR 1.029 per 1-kg/m² increase; 95% CI 1.009–1.048), previous VTE (HR 2.707; 95% CI 1.907–3.843), platinum-based chemotherapy (HR 1.217; 95% CI 1.051–1.410), anti-vascular endothelial growth factor agent (HR 1.763; 95% CI 1.458–2.132), heart failure (HR 1.677; 95% CI 1.432–1.965), and stroke/transient ischemic attack (HR 1.296; 95% CI 1.055–1.593).

    Conclusions: This large-scale study identified several significant risk factors for VTE in patients with advanced NSCLC. The findings suggest the need for risk-stratified monitoring and prophylactic strategies to reduce VTE-related complications in high-risk patients.

  • Norihiro Kogame, Yoshihisa Nakagawa, Ken Kozuma, Raisuke Iijima, Anna ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiovascular Intervention
    Article ID: CR-25-0298
    Published: January 30, 2026
    Advance online publication: January 30, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: In patients at high bleeding risk (HBR), short dual antiplatelet therapy (DAPT) after percutaneous coronary intervention (PCI) reduces bleeding without increasing ischemic events. However, the sex-based differences in the effects of short DAPT strategy followed by prasugrel monotherapy compared with conventional DAPT strategy remain unclear.

    Methods and Results: The 24-month outcomes from 2 multicenter, non-interventional, prospective registries, PENDULUM mono (n=872; short DAPT strategy followed by prasugrel monotherapy) and an HBR subset of the PENDULUM registry (n=1,553; conventional DAPT strategy), were analyzed using the inverse probability of treatment weighting method. Primary endpoints were major adverse cardiovascular and cerebrovascular events (MACCE) and clinically relevant bleeding (CRB: Bleeding Academic Research Consortium [BARC] types 2, 3, and 5). In women, short DAPT strategy was associated with numerically lower rates of MACCE (8.2% vs. 12.3%; hazard ratio [HR] 0.71, 95% confidence interval [CI] 0.42–1.20; P=0.197) and CRB (4.7% vs. 7.0%; HR 0.68, 95% CI 0.35–1.32; P=0.258). In men, similar trends were observed for MACCE (8.8% vs. 11.0%; HR 0.86, 95% CI 0.62–1.21; P=0.388) and CRB (7.0% vs. 8.1%; HR 0.87, 95% CI 0.60–1.26; P=0.460). No significant interaction between treatment and sex was found for MACCE (P=0.599) or CRB (P=0.537).

    Conclusions: In HBR patients undergoing PCI, a short DAPT strategy followed by prasugrel monotherapy had numerically fewer ischemic and bleeding events than conventional DAPT strategy, without evidence of sex-based heterogeneity.

  • Masato Uchida, Satoshi Yoshimura, Kanna Arimoto, Hirotoshi Nishikita, ...
    Article type: RESEARCH LETTER
    Article ID: CR-25-0301
    Published: January 30, 2026
    Advance online publication: January 30, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION

    Background: The associations between symptom characteristics and patients’ interpretations and sources of knowledge about acute coronary syndrome (ACS) remain unclear.

    Methods and Results: We enrolled 81 patients with ACS. Patients who misinterpreted their symptoms more frequently reported atypical features such as tenderness (13.3% vs. 0%; P=0.028) and syncope (11.8% vs. 0%; P=0.011). Common knowledge sources among patients who correctly interpreted their symptoms included television, healthcare professionals, and the internet without social media.

    Conclusions: Patients experiencing atypical symptoms often misinterpret them. Most knowledge sources for those who interpreted correctly were traditional.

  • Masahiro Koide, Kan Zen, Tomotsugu Seki, Kento Fukui, Kazuaki Takamats ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Cardiovascular Intervention
    Article ID: CR-25-0219
    Published: January 29, 2026
    Advance online publication: January 29, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: Percutaneous coronary intervention (PCI) for calcified coronary lesions without stent implantation remains a challenging therapeutic strategy. The efficacy of drug-coated balloon (DCB) therapy in relation to specific calcified plaque morphologies has not been previously investigated.

    Methods and Results: We conducted a retrospective analysis of 150 lesions in 136 patients who underwent optical coherence tomography (OCT)-guided PCI using DCB for angiographically moderate-to-severe calcified lesions. Based on the OCT findings, target lesions were categorized into 3 groups: superficial calcific sheet (SC) group; calcific protrusion (CP) group; and eruptive calcified nodule (eCN) group. Long-term clinical outcomes, including clinically driven target lesion revascularization (CD-TLR), myocardial infarction (MI), cardiac death, and the composite endpoint of major adverse cardiovascular events (MACE), were assessed over a median follow-up of 2.6 years. No significant differences in rates of CD-TLR, MI, cardiac death, or MACE were observed between the SC and CP groups. In contrast, the eCN group showed significantly higher incidences of MI (P<0.01 vs. SC; P<0.05 vs. CP), cardiac death (P<0.01 vs. SC and CP), and MACE (P<0.01 vs. SC and CP) compared with the other 2 groups.

    Conclusions: In moderate-to-severe calcified lesions where adequate vessel preparation was achieved, DCB therapy was associated with favorable outcomes in lesions with SC or CP morphologies. In contrast, lesions involving eCN were linked to significantly worse clinical outcomes.

  • Ryusuke Hamada, Kyohei Onishi, Masakazu Yasuda, Kosuke Fujita, Naoko S ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Valvular Heart Disease
    Article ID: CR-25-0243
    Published: January 29, 2026
    Advance online publication: January 29, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION

    Background: Trans-subclavian access transcatheter aortic valve implantation (TAVI), typically from the left side, is feasible. However, right subclavian artery access is technically challenging because of the anatomical orientation, resulting in malalignment of the transcatheter heart valve within the aortic annular plane.

    Methods and Results: We aimed to evaluate procedural outcomes, device–annulus alignment, and clinical efficacy of right trans-subclavian (RtTS) TAVI. Of 423 consecutive patients who underwent TAVI, 32 cases performed via right or left subclavian access were analyzed. Implanted device depth and the device–annulus angle were measured angiographically. Fifteen of 22 patients were treated with a balloon-expandable valve, and 7 patients received a self-expanding valve, via RtTS. Procedural success was achieved in all cases. Compared with femoral and left subclavian approaches, RtTS led to a significantly larger device–annulus angle (6.0° vs. 8.7°; P<0.05), with deep left coronary cusp implantation (2.4 vs. 4.4 mm; P=0.05). Post-procedural transcatheter heart valve function was comparable across the groups, and no patients had greater than moderate paravalvular leakage. However, symptomatic stroke occurred in 2 patients in the RtTS group (9.1%; P=0.21).

    Conclusions: RtTS TAVI is a feasible alternative access route, with comparable procedural and clinical outcomes to those of conventional approaches, albeit with a higher risk of stroke.

  • Erika Yamamoto, Takao Kato, Takeshi Morimoto, Hidenori Yaku, Yasutaka ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Heart Failure
    Article ID: CR-25-0316
    Published: January 29, 2026
    Advance online publication: January 29, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: The residential environment may influence access to care and prognosis in patients with heart failure (HF). Evidence on the impact of geographic factors in Japan is limited. We investigated the association of home‐to‐hospital distance and residential population density with 1‐year clinical outcomes in patients hospitalized for acute decompensated HF.

    Methods and Results: We used the Kyoto Congestive Heart Failure registry to analyze 3,616 patients who were discharged alive after their first hospitalization. Home‐to‐hospital distance was calculated using road travel distance and dichotomized by the median (8.0 km). Residential density was classified as urban (densely inhabited districts [DID]) or suburban (non-DID). The primary outcome was all‐cause death at 1 year, assessed using hospital‐stratified Cox proportional hazards models. The median home‐to‐hospital distance was 8.0 km (interquartile range 4.1–14.5 km); 1,797 (49.7%) patients were in the long‐distance group. The long‐distance group had a higher risk of all‐cause death than the short‐distance group (adjusted hazard ratio [HR] 1.19; 95% confidence interval [CI] 1.02–1.39; P=0.02). As a continuous variable, each doubling of distance was associated with increased all‐cause death (HR 1.06; 95% CI 1.02–1.10). Suburban residence was not significantly associated with the primary outcome compared with urban residence (adjusted HR 1.18; 95% CI 0.99–1.44; P=0.06).

    Conclusions: In Japanese patients hospitalized for acute decompensated HF, longer home-to-hospital distance, but not residential population density, was associated with a higher risk of 1-year all-cause death.

  • Kanna Nakamura, Tomohiko Taniguchi, Aoi Omori, Hirotoshi Nishi, Gakuto ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Valvular Heart Disease
    Article ID: CR-25-0281
    Published: January 27, 2026
    Advance online publication: January 27, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: The impact of coexisting malnutrition and sarcopenia on survival after transcatheter aortic valve replacement (TAVR) has not been fully studied.

    Methods and Results: Among 513 consecutive patients undergoing TAVR between February 2014 and June 2023, 340 with available preoperative Geriatric Nutritional Risk Index (GNRI) and Short Physical Performance Battery (SPPB) data were categorized into 4 groups based on malnutrition (GNRI <98) and sarcopenia (SPPB ≤9) status: malnutrition and sarcopenia (N=98); malnutrition without sarcopenia (N=69); no malnutrition with sarcopenia (N=83); neither malnutrition nor sarcopenia (N=90, reference). The primary outcome measure was all-cause death. Patients with both malnutrition and sarcopenia were older and had a higher prevalence of anemia compared with the reference group. The cumulative 5-year mortality rate was significantly higher in this group. After adjusting for confounders, the coexistence of malnutrition and sarcopenia was associated with a significantly higher risk of all-cause death (hazard ratio [HR] 3.15; 95% confidence interval [CI]: 1.68–5.89; P<0.001). In contrast, malnutrition without sarcopenia (HR 1.36; 95% CI 0.64–2.90; P=0.42) and no malnutrition with sarcopenia (HR 1.86; 95% CI 0.92–3.79; P=0.08) were not associated with increased mortality.

    Conclusions: The coexistence of malnutrition and sarcopenia significantly increased mortality risk after TAVR, which highlights the importance of integrating both nutritional and sarcopenia assessments into preoperative risk stratification to optimize outcomes in patients undergoing TAVR.

  • Takahiro Kuno, Yoshiaki Ohyama, Yoko Sumita, Koshiro Kanaoka, Yoshihir ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Epidemiology
    Article ID: CR-25-0232
    Published: January 24, 2026
    Advance online publication: January 24, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION

    Background: Infective endocarditis (IE) is a life-threatening condition with high mortality. The coronavirus disease 2019 (COVID-19) pandemic disrupted healthcare systems, potentially affecting IE management and outcomes. However, its impact in Japan remains unclear. This study aimed to evaluate the impact of the COVID-19 pandemic on in-hospital mortality and the rate of valve surgery among patients with IE in Japan.

    Methods and Results: We conducted a retrospective analysis using the Japanese registry of all Cardiac and Vascular Diseases Diagnostic Procedure Combination (JROAD-DPC) nationwide database, including 19,077 adult patients hospitalized with IE between April 2016 and March 2022. The study period was divided into pre-COVID-19 (n=12,419) and post-COVID-19 (n=6,658) periods. Patient baseline characteristics were well-balanced after 1 : 1 propensity score matching (6,652 pairs). Before matching, crude total in-hospital mortality was higher in the post-COVID-19 period (15.7% vs. 13.9%; P<0.001). However, after matching, there were no significant differences in total in-hospital mortality (15.7% vs. 15.3%, P=0.60). The rate of valve surgery did not differ significantly between the groups after matching (26.4% vs. 25.5%; P=0.22). The incidence of stroke was higher in the post-COVID-19 period (8.3% vs. 7.3%; P=0.049).

    Conclusions: This nationwide study showed that risk-adjusted in-hospital mortality in patients with IE was not different during the COVID-19 pandemic, although unadjusted mortality was higher in the post-COVID-19 period in Japan.

  • Shiori Iwane, Masayuki Tanaka, Tomoyoshi Miyamoto, Kentaro Nishida, Sh ...
    Article type: ORIGINAL ARTICLE
    Subject Area: Heart Failure
    Article ID: CR-25-0208
    Published: January 21, 2026
    Advance online publication: January 21, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material

    Background: The early detection and treatment of cancer have led to an aging population of cancer survivors, and mortality rates from cardiovascular diseases are increasing. The incidence of heart failure (HF) after treatment with paclitaxel (PTX), a microtubule polymerization promoter and cardiotoxic anticancer agent, is low, and the risk factors for post-PTX HF remain unclear. A history of heart disease has been suggested as a potential cardiovascular risk factor in cancer survivors. Using the JMDC database of real-world medical data in Japan, we investigated whether heart- and lifestyle-related diseases affect the onset of HF after PTX treatment.

    Methods and Results: Patients who underwent PTX treatment were identified in the JMDC database, and the occurrence of HF was determined to analyze associations between heart- and lifestyle-related diseases and the occurrence of HF after PTX administration. Of the patients who received PTX, 17.7% developed HF. The results of multivariable Cox proportional hazards analysis indicated that comorbidities such as ischemic heart disease, atrial fibrillation, pericarditis, pulmonary embolism, and hypertension were associated with the onset of HF in patients receiving PTX.

    Conclusions: Although the incidence of HF after PTX administration is not high, patients with specific medical histories or comorbidities may be at increased risk, and careful monitoring is warranted to detect potential cardiovascular complications.

  • Keisuke Okano, Kouki Sano, Yo Mukai, Yusuke Seto, Ritsu Nisimura, Tets ...
    Article type: RESEARCH LETTER
    Article ID: CR-25-0305
    Published: January 20, 2026
    Advance online publication: January 20, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION

    Background: Dynapenia is characterized by normal muscle mass with low muscle function, but its response to cardiac rehabilitation is not fully understood. We aimed to clarify the clinical characteristics of cardiovascular disease (CVD) patients with dynapenia and evaluate the effects of a 3-month outpatient physical therapy program.

    Methods and Results: Data from 62 CVD patients who completed the outpatient physical therapy program were analyzed; 12 (19.4%) met the diagnostic criteria for dynapenia. After 3 months, Short Physical Performance Battery scores increased from 10.4±1.3 to 11.2±1.9, knee extension strength from 0.43±0.19 to 0.46±0.19 kgf/kg, and Mini Nutritional Assessment® scores from 22.4±3.2 to 25.5±2.8 (all P<0.05).

    Conclusions: Outpatient physical therapy may improve physical function, muscle strength, and nutritional status in CVD patients with dynapenia.

  • Hiroki Okamoto, Hidemitsu Miyatake, Noritsugu Matsutani, Naoto Shiomi, ...
    Article type: IMAGES IN CARDIOVASCULAR MEDICINE
    Article ID: CR-25-0303
    Published: January 17, 2026
    Advance online publication: January 17, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION
    Supplementary material
  • Shinsuke Miyazaki
    Article type: REVIEW
    Article ID: CR-25-0284
    Published: December 25, 2025
    Advance online publication: December 25, 2025
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION

    Catheter ablation of atrial fibrillation (AF) is an established therapeutic strategy, with pulmonary vein isolation as the cornerstone of ablative therapy. Although radiofrequency and cryothermal energies have been the main energy sources, the past decade has witnessed remarkable scientific progress and growing interest in pulsed field ablation (PFA) as a novel energy modality, leading to the recent clinical adoption of PFA technologies for AF treatment. In Japan, PFA was introduced into clinical practice in 2024 and has been rapidly accepted. Unlike traditional thermal energies, PFA uses pulsed electric fields to induce irreversible electroporation, selectively targeting myocardial tissue while preserving adjacent structures from thermal or mechanical injury. Offering procedural efficacy comparable to conventional thermal ablation, PFA distinguishes itself by enabling shorter procedure times and reducing the risk of complications. This review summarizes the mechanisms of PFA, currently available systems in Japan, reported clinical outcomes and complications, as well as limitations and future perspectives.

  • Tadashi Hoshiyama, Kenichi Tsujita, Yuko Inoue, Masanobu Ishii, Koichi ...
    Article type: RESEARCH LETTER
    Article ID: CR-25-0310
    Published: December 24, 2025
    Advance online publication: December 24, 2025
    JOURNAL OPEN ACCESS FULL-TEXT HTML ADVANCE PUBLICATION

    Background: Age-specific differences in the association between sleep duration and atrial fibrillation (AF) remain uncertain.

    Methods and Results: Using the estimated sleep duration derived from accelerometer data embedded in the Holter electrocardiogram, the association between sleep duration and AF was explored among individuals in their 50s (working age) and 70s (retirement age). In the overall population, AF risk decreased with longer sleep, but the benefit diminished with excessively long sleep (P=0.03). A consistent risk reduction with increasing sleep was observed in the 50s age group (P=0.02), but not in the 70s group.

    Conclusions: Inadequate sleep may be associated with AF, particularly among middle-aged individuals.
