Circulation Reports
Online ISSN : 2434-0790
Current issue
Reviews
  • Shinsuke Miyazaki
    Article type: REVIEW
    2026 Volume 8 Issue 4 Pages 531-536
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: December 25, 2025
    JOURNAL OPEN ACCESS FULL-TEXT HTML

    Catheter ablation of atrial fibrillation (AF) is an established therapeutic strategy, with pulmonary vein isolation as the cornerstone of ablative therapy. Although radiofrequency and cryothermal energies have been the main energy sources, the past decade has witnessed remarkable scientific progress and growing interest in pulsed field ablation (PFA) as a novel energy modality, leading to the recent clinical adoption of PFA technologies for AF treatment. In Japan, PFA was introduced into clinical practice in 2024 and has been rapidly accepted. Unlike traditional thermal energies, PFA uses pulsed electric fields to induce irreversible electroporation, selectively targeting myocardial tissue while preserving adjacent structures from thermal or mechanical injury. Offering procedural efficacy comparable to conventional thermal ablation, PFA distinguishes itself by enabling shorter procedure times and reducing the risk of complications. This review summarizes the mechanisms of PFA, currently available systems in Japan, reported clinical outcomes and complications, as well as limitations and future perspectives.

Original Articles
Cardiac Rehabilitation
  • Tomohiro Kato, Yuta Ozaki, Shigefumi Honda, Yusuke Uemura, Kenji Takem ...
    Article type: ORIGINAL ARTICLE
    Subject area: Cardiac Rehabilitation
    2026 Volume 8 Issue 4 Pages 537-543
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 13, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: Postoperative declines in activities of daily living (ADL) are concerning in older adults undergoing cardiovascular surgeries. Sarcopenia represents a determinant of such adverse outcomes. We examined whether preoperative sarcopenia and its components predicted postoperative ADL decline in older patients who underwent elective cardiovascular surgeries.

    Methods and Results: This retrospective cohort study included 589 patients aged ≥65 years who underwent elective coronary artery bypass grafting, heart valve surgery, or thoracic aortic surgery. Sarcopenia was defined according to the Asian Working Group for Sarcopenia 2019 criteria. ADLs were assessed using the Barthel Index, with in-hospital ADL decline defined as a ≥10-point reduction. Thirty-three (5.6%) patients had sarcopenia preoperatively. The incidence of ADL decline was significantly higher in patients with sarcopenia than in those without (15.2% vs. 5.0%; P=0.014). Multivariable logistic regression analyses demonstrated that sarcopenia was independently associated with ADL decline (odds ratio 3.094; 95% confidence interval 1.067–8.968; P=0.038). Each sarcopenia component (low muscle mass, low muscle strength, and slow gait speed) was also independently associated with ADL decline (all P<0.05). Age-adjusted receiver operating characteristic analyses showed that sarcopenia had moderate discrimination for predicting postoperative ADL decline, with an area under the curve of 0.707.

    Conclusions: Preoperative sarcopenia and its individual components independently predicted in-hospital ADL decline following cardiovascular surgery. Preoperative assessments may help identify high-risk patients.

Cardiovascular Intervention
  • Masahiro Koide, Kan Zen, Tomotsugu Seki, Kento Fukui, Kazuaki Takamats ...
    Article type: ORIGINAL ARTICLE
    Subject area: Cardiovascular Intervention
    2026 Volume 8 Issue 4 Pages 544-553
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 29, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: Percutaneous coronary intervention (PCI) for calcified coronary lesions without stent implantation remains a challenging therapeutic strategy. The efficacy of drug-coated balloon (DCB) therapy in relation to specific calcified plaque morphologies has not been previously investigated.

    Methods and Results: We conducted a retrospective analysis of 150 lesions in 136 patients who underwent optical coherence tomography (OCT)-guided PCI using DCB for angiographically moderate-to-severe calcified lesions. Based on the OCT findings, target lesions were categorized into 3 groups: superficial calcific sheet (SC); calcific protrusion (CP); and eruptive calcified nodule (eCN). Long-term clinical outcomes, including clinically driven target lesion revascularization (CD-TLR), myocardial infarction (MI), cardiac death, and the composite endpoint of major adverse cardiovascular events (MACE), were assessed over a median follow-up of 2.6 years. No significant differences in rates of CD-TLR, MI, cardiac death, or MACE were observed between the SC and CP groups. In contrast, the eCN group showed significantly higher incidences of MI (P<0.01 vs. SC; P<0.05 vs. CP), cardiac death (P<0.01 vs. SC and CP), and MACE (P<0.01 vs. SC and CP).

    Conclusions: In moderate-to-severe calcified lesions where adequate vessel preparation was achieved, DCB therapy was associated with favorable outcomes in lesions with SC or CP morphologies. In contrast, lesions involving eCN were linked to significantly worse clinical outcomes.

  • Norihiro Kogame, Yoshihisa Nakagawa, Ken Kozuma, Raisuke Iijima, Anna ...
    Article type: ORIGINAL ARTICLE
    Subject area: Cardiovascular Intervention
    2026 Volume 8 Issue 4 Pages 554-563
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 30, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: In patients at high bleeding risk (HBR), short dual antiplatelet therapy (DAPT) after percutaneous coronary intervention (PCI) reduces bleeding without increasing ischemic events. However, sex-based differences in the effects of a short DAPT strategy followed by prasugrel monotherapy compared with a conventional DAPT strategy remain unclear.

    Methods and Results: The 24-month outcomes from 2 multicenter, non-interventional, prospective registries, PENDULUM mono (n=872; short DAPT strategy followed by prasugrel monotherapy) and an HBR subset of the PENDULUM registry (n=1,553; conventional DAPT strategy), were analyzed using the inverse probability of treatment weighting method. Primary endpoints were major adverse cardiovascular and cerebrovascular events (MACCE) and clinically relevant bleeding (CRB; Bleeding Academic Research Consortium [BARC] types 2, 3, and 5). In women, the short DAPT strategy was associated with numerically lower rates of MACCE (8.2% vs. 12.3%; hazard ratio [HR] 0.71, 95% confidence interval [CI] 0.42–1.20; P=0.197) and CRB (4.7% vs. 7.0%; HR 0.68, 95% CI 0.35–1.32; P=0.258). In men, similar trends were observed for MACCE (8.8% vs. 11.0%; HR 0.86, 95% CI 0.62–1.21; P=0.388) and CRB (7.0% vs. 8.1%; HR 0.87, 95% CI 0.60–1.26; P=0.460). No significant interaction between treatment and sex was found for MACCE (P=0.599) or CRB (P=0.537).

    Conclusions: In HBR patients undergoing PCI, a short DAPT strategy followed by prasugrel monotherapy was associated with numerically fewer ischemic and bleeding events than a conventional DAPT strategy, without evidence of sex-based heterogeneity.

  • Mayuka Masuda, Hiroyuki Yamamoto, Shinsuke Nakano, Nobuyuki Takahashi, ...
    Article type: ORIGINAL ARTICLE
    Subject area: Cardiovascular Intervention
    2026 Volume 8 Issue 4 Pages 564-571
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 17, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: Drug-coated balloon (DCB) angioplasty has shown short-term feasibility for large-vessel coronary artery disease (CAD; ≥3 mm); however, long-term outcomes remain unclear. This study aimed to evaluate the 5-year cardiovascular outcomes of DCB angioplasty vs. drug-eluting stents (DES) for de novo large-vessel CAD.

    Methods and Results: This single-center retrospective study analyzed patients undergoing percutaneous coronary intervention (PCI) with either DCB (SeQuent Please) or DES (Xience Alpine) between January 2016 and December 2018. The primary outcome was cardiovascular events (CVE), defined as a composite of cardiac death, non-fatal myocardial infarction, and target lesion revascularization (TLR). Secondary outcomes included minimal lumen diameter (MLD), diameter stenosis (DS), and late lumen loss (LLL), assessed at the index PCI and at the 1-year angiographic follow-up. Overall, 114 patients (122 lesions) in the DCB group and 269 patients (293 lesions) in the DES group were analyzed, with similar median follow-up durations (1,678 vs. 1,825 days; P=0.687). At 5 years, TLR and CVE rates were comparable between the DCB and DES groups (7.9% vs. 4.5%, P=0.239; and 11.4% vs. 10.4%, P=0.773, respectively). No significant differences in MLD, DS, or LLL were observed between the groups at the 1-year follow-up.

    Conclusions: With careful lesion selection and preparation, DCB angioplasty could serve as a feasible treatment option for de novo large-vessel CAD in clinical practice.

Cardiovascular Surgery
  • Takayuki Gyoten, Yuta Kanazawa, Yu Kumagai, Takayuki Akatsu, Yuko Gata ...
    Article type: ORIGINAL ARTICLE
    Subject area: Cardiovascular Surgery
    2026 Volume 8 Issue 4 Pages 572-579
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 06, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML

    Background: Data on the clinical outcomes and hemodynamic performance of the smallest commercially available bioprostheses (19 mm) in Japan for aortic valve replacement (AVR) remain limited.

    Methods and Results: We analyzed the data of 187 adults (median age, 76 [interquartile range (IQR): 73–80] years; 165 women [88%]; median follow-up, 65 [IQR: 32–95] months) with symptomatic aortic valve stenosis, regurgitation, or valve deterioration who underwent surgical AVR between January 2015 and July 2024 with the Avalus (n=7), Magna (n=77), Epic (n=26), Inspiris (n=58), or Mosaic (n=27) bioprosthesis because of small aortic annuli. The primary and secondary endpoints were all-cause death and major adverse cardiac events, respectively. Moderate-to-severe prosthesis-patient mismatch occurred in 53 patients (28%). The overall survival rates (95% confidence interval [CI]) at 1, 3, and 5 years after valve replacement were 93.0% (88.3–95.9%), 87.0% (81.0–91.2%), and 85.7% (79.5–90.1%), respectively. The rates of freedom from major adverse cardiac and cerebrovascular events (95% CI) at 1, 3, and 5 years were 96.2% (92.1–98.2%), 90.2% (84.5–93.9%), and 88.7% (82.5–92.7%), respectively. Four patients required re-intervention (re-AVR in 3 and medication in 1). No significant differences were observed in outcomes or hemodynamics among the different aortic bioprostheses.

    Conclusions: Surgical replacement with 19-mm third-generation aortic valve bioprostheses for small aortic annuli is feasible with favorable early and mid-term hemodynamics.

Critical Care
  • Saeko Iikura, Yuki Ikeda, Shohei Nakahara, Yuki Watanabe, Yosuke Haruk ...
    Article type: ORIGINAL ARTICLE
    Subject area: Critical Care
    2026 Volume 8 Issue 4 Pages 580-588
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 21, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML

    Background: The clinical differences between intra-aortic balloon pumping (IABP) and a microaxial flow pump (Impella) for left ventricular (LV) unloading in patients with fulminant myocarditis (FM) supported with venoarterial extracorporeal membrane oxygenation (VA-ECMO) remain unclear.

    Methods and Results: In this single-center, retrospective cohort study, we analyzed 27 consecutive patients with lymphocytic FM who received VA-ECMO support. Patients were stratified by the LV unloading device used: IABP (n=15) or Impella (n=12). The primary endpoint was a composite of all-cause mortality or implantation of an extracorporeal ventricular assist device (exVAD) within 30 days of VA-ECMO initiation. Temporal changes in laboratory and hemodynamic parameters during the first 7 days of support were also assessed. Baseline characteristics, including LV ejection fraction (IABP 16% vs. Impella 18%; P=0.814) and QRS duration (139 vs. 105 ms; P=0.805), were comparable between groups. Nine patients met the primary endpoint (mortality, n=7; exVAD implantation, n=2). Kaplan-Meier analysis revealed a significantly lower incidence of the primary endpoint in the Impella group (log-rank P=0.018). The Impella group also showed a significantly greater improvement in cardiac power output (group×time interaction, P=0.040). However, hemolysis, elevated total bilirubin, and increased serum creatinine were more pronounced in the Impella group.

    Conclusions: In patients with FM requiring VA-ECMO, LV unloading with Impella was associated with improved short-term clinical outcomes compared with IABP.

Epidemiology
  • Takahiro Kuno, Yoshiaki Ohyama, Yoko Sumita, Koshiro Kanaoka, Yoshihir ...
    Article type: ORIGINAL ARTICLE
    Subject area: Epidemiology
    2026 Volume 8 Issue 4 Pages 589-594
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 24, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML

    Background: Infective endocarditis (IE) is a life-threatening condition with high mortality. The coronavirus disease 2019 (COVID-19) pandemic disrupted healthcare systems, potentially affecting IE management and outcomes. However, its impact in Japan remains unclear. This study aimed to evaluate the impact of the COVID-19 pandemic on in-hospital mortality and the rate of valve surgery among patients with IE in Japan.

    Methods and Results: We conducted a retrospective analysis using the Japanese registry of all Cardiac and Vascular Diseases Diagnostic Procedure Combination (JROAD-DPC) nationwide database, including 19,077 adult patients hospitalized with IE between April 2016 and March 2022. The study period was divided into pre-COVID-19 (n=12,419) and post-COVID-19 (n=6,658) periods. Patient baseline characteristics were well balanced after 1:1 propensity score matching (6,652 pairs). Before matching, crude total in-hospital mortality was higher in the post-COVID-19 period (15.7% vs. 13.9%; P<0.001). However, after matching, there were no significant differences in total in-hospital mortality (15.7% vs. 15.3%; P=0.60). The rate of valve surgery did not differ significantly between the groups after matching (26.4% vs. 25.5%; P=0.22). The incidence of stroke was higher in the post-COVID-19 period (8.3% vs. 7.3%; P=0.049).

    Conclusions: This nationwide study showed that risk-adjusted in-hospital mortality in patients with IE was not different during the COVID-19 pandemic, although unadjusted mortality was higher in the post-COVID-19 period in Japan.

Heart Failure
  • Shiori Iwane, Masayuki Tanaka, Tomoyoshi Miyamoto, Kentaro Nishida, Sh ...
    Article type: ORIGINAL ARTICLE
    Subject area: Heart Failure
    2026 Volume 8 Issue 4 Pages 595-602
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 21, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: The early detection and treatment of cancer have led to an aging population of cancer survivors, among whom mortality from cardiovascular diseases is increasing. The incidence of heart failure (HF) after treatment with paclitaxel (PTX), a microtubule polymerization promoter and cardiotoxic anticancer agent, is low, and the risk factors for post-PTX HF remain unclear. A history of heart disease has been suggested as a potential cardiovascular risk factor in cancer survivors. Using the JMDC database of real-world medical data in Japan, we investigated whether heart- and lifestyle-related diseases affect the onset of HF after PTX treatment.

    Methods and Results: Patients who underwent PTX treatment were identified in the JMDC database, and the subsequent occurrence of HF was analyzed for associations with heart- and lifestyle-related diseases. Of the patients who received PTX, 17.7% developed HF. Multivariable Cox proportional hazards analysis indicated that comorbidities such as ischemic heart disease, atrial fibrillation, pericarditis, pulmonary embolism, and hypertension were associated with the onset of HF in patients receiving PTX.

    Conclusions: Although the incidence of HF after PTX administration is not high, patients with specific medical histories or comorbidities may be at increased risk, and careful monitoring is warranted to detect potential cardiovascular complications.

  • Erika Yamamoto, Takao Kato, Takeshi Morimoto, Hidenori Yaku, Yasutaka ...
    Article type: ORIGINAL ARTICLE
    Subject area: Heart Failure
    2026 Volume 8 Issue 4 Pages 603-615
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 29, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: The residential environment may influence access to care and prognosis in patients with heart failure (HF). Evidence on the impact of geographic factors in Japan is limited. We investigated the association of home‐to‐hospital distance and residential population density with 1‐year clinical outcomes in patients hospitalized for acute decompensated HF.

    Methods and Results: We used the Kyoto Congestive Heart Failure registry to analyze 3,616 patients who were discharged alive after their first hospitalization. Home‐to‐hospital distance was calculated using road travel distance and dichotomized by the median (8.0 km). Residential density was classified as urban (densely inhabited districts [DID]) or suburban (non-DID). The primary outcome was all‐cause death at 1 year, assessed using hospital‐stratified Cox proportional hazards models. The median home‐to‐hospital distance was 8.0 km (interquartile range 4.1–14.5 km); 1,797 (49.7%) patients were in the long‐distance group. The long‐distance group had a higher risk of all‐cause death than the short‐distance group (adjusted hazard ratio [HR] 1.19; 95% confidence interval [CI] 1.02, 1.39; P=0.02). As a continuous variable, each doubling of distance was associated with increased all‐cause death (HR 1.06; 95% CI 1.02, 1.10). Suburban residence was not significantly associated with the primary outcome compared with urban residence (adjusted HR 1.18; 95% CI 0.99, 1.44; P=0.06).

    Conclusions: In Japanese patients hospitalized for acute decompensated HF, longer home-to-hospital distance, but not residential population density, was associated with a higher risk of 1-year all-cause death.

  • Kazuya Kito, Masakazu Saitoh, Yuji Mori, Keita Fujiyama, Masahiro Toda ...
    Article type: ORIGINAL ARTICLE
    Subject area: Heart Failure
    2026 Volume 8 Issue 4 Pages 616-625
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 06, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: Renal dysfunction (RD) is common at admission for acute heart failure (AHF), but there is limited evidence focusing on older adults and considering the influence of physical function. We evaluated the prognostic significance of admission RD severity as a risk factor for adverse outcomes in older patients with AHF, while considering the potential modifying effect of physical function.

    Methods and Results: This multicenter prospective cohort study enrolled 710 patients aged ≥65 years with an estimated glomerular filtration rate (eGFR) <60 mL/min/1.73 m². Admission RD was stratified into 4 severity classes: mild RD (eGFR 45–59), moderate RD (eGFR 30–44), severe RD (eGFR 15–29), and kidney failure (eGFR <15). The primary outcome was a composite of HF readmission and all-cause death within 1 year post-discharge. Subgroup analyses assessed potential effect modification by physical function and other variables. After multivariable adjustment, severe RD or kidney failure was significantly associated with a higher risk of the composite outcome compared with mild RD (adjusted hazard ratio 1.529; 95% confidence interval 1.005–2.326). A possible interaction was observed between moderate RD and the Short Physical Performance Battery score at discharge (P for interaction=0.093).

    Conclusions: Severe RD or kidney failure at admission independently predicted 1-year HF readmission and all-cause death. In moderate RD, physical function may modify RD prognostic impact.

Ischemic Heart Disease
  • Tetsuya Takahashi, Taiga Ishigaki, Wataru Katawaki, Taku Toshima, Yu K ...
    Article type: ORIGINAL ARTICLE
    Subject area: Ischemic Heart Disease
    2026 Volume 8 Issue 4 Pages 626-633
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 20, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: Drug-coated balloon (DCB) is a novel treatment option for percutaneous coronary intervention (PCI). The presence of heart failure (HF) in patients with coronary artery disease (CAD) is associated with a poor prognosis. However, the clinical significance of DCB-based PCI in CAD patients with HF is unknown.

    Methods and Results: This was a retrospective analysis of a prospective, single-center registry cohort from 2015 to 2024. We enrolled 258 CAD patients with chronic HF who underwent PCI with DCB or in combination with a drug-eluting stent (DES). Propensity score matching was performed between the DCB-based PCI and DES-only PCI groups. The primary endpoint of this study was all-cause mortality. Baseline clinical characteristics were comparable between the groups. The total DES number and length were significantly lower in patients undergoing DCB-based PCI than in those undergoing DES-only PCI. Kaplan-Meier analysis revealed that the DCB-based PCI group had a significantly lower rate of all-cause mortality compared with the DES-only group (log-rank test, P=0.04).

    Conclusions: In CAD patients with chronic HF, DCB-based PCI was associated with a lower risk of mortality compared with DES-only PCI.

Onco-Cardiology
  • Tetsuya Kimura, Yugo Yamashita, Yasutaka Ihara, Megumi Mizutani, Ryota ...
    Article type: ORIGINAL ARTICLE
    Subject area: Onco-Cardiology
    2026 Volume 8 Issue 4 Pages 634-641
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 31, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: Non-small cell lung cancer (NSCLC) is associated with a high risk of venous thromboembolism (VTE). However, data on specific risk factors for VTE in patients with advanced NSCLC remain limited.

    Methods and Results: Using a Japanese nationwide administrative database, we analyzed 20,206 patients aged ≥18 years with advanced NSCLC who received first-line chemotherapy between January 2016 and January 2023. VTE events were identified through International Classification of Diseases, Tenth Revision codes and imaging studies. Risk factors were evaluated using Cox proportional hazards models with time-dependent covariates. The cumulative incidence of VTE was 4.2% and 6.1% at 365 and 730 days after the first date of chemotherapy for NSCLC, respectively. Several significant risk factors for VTE were identified, including female sex (hazard ratio [HR] 1.374; 95% confidence interval [CI] 1.157–1.631), higher body mass index (HR 1.029 per 1-kg/m² increase; 95% CI 1.009–1.048), previous VTE (HR 2.707; 95% CI 1.907–3.843), platinum-based chemotherapy (HR 1.217; 95% CI 1.051–1.410), anti-vascular endothelial growth factor agent (HR 1.763; 95% CI 1.458–2.132), heart failure (HR 1.677; 95% CI 1.432–1.965), and stroke/transient ischemic attack (HR 1.296; 95% CI 1.055–1.593).

    Conclusions: This large-scale study identified several significant risk factors for VTE in patients with advanced NSCLC. The findings suggest the need for risk-stratified monitoring and prophylactic strategies to reduce VTE-related complications in high-risk patients.

Valvular Heart Disease
  • Kanna Nakamura, Tomohiko Taniguchi, Aoi Omori, Hirotoshi Nishi, Gakuto ...
    Article type: ORIGINAL ARTICLE
    Subject area: Valvular Heart Disease
    2026 Volume 8 Issue 4 Pages 642-649
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 27, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: The impact of coexisting malnutrition and sarcopenia on survival after transcatheter aortic valve replacement (TAVR) has not been fully studied.

    Methods and Results: Among 513 consecutive patients undergoing TAVR between February 2014 and June 2023, 340 with available preoperative Geriatric Nutritional Risk Index (GNRI) and Short Physical Performance Battery (SPPB) data were categorized into 4 groups based on malnutrition (GNRI <98) and sarcopenia (SPPB ≤9) status: malnutrition and sarcopenia (N=98); malnutrition without sarcopenia (N=69); sarcopenia without malnutrition (N=83); and neither malnutrition nor sarcopenia (N=90, reference). The primary outcome measure was all-cause death. Patients with both malnutrition and sarcopenia were older and had a higher prevalence of anemia compared with the reference group, and their cumulative 5-year mortality rate was significantly higher. After adjusting for confounders, the coexistence of malnutrition and sarcopenia was associated with a significantly higher risk of all-cause death (hazard ratio [HR] 3.15; 95% confidence interval [CI] 1.68–5.89; P<0.001). In contrast, malnutrition without sarcopenia (HR 1.36; 95% CI 0.64–2.90; P=0.42) and sarcopenia without malnutrition (HR 1.86; 95% CI 0.92–3.79; P=0.08) were not associated with increased mortality.

    Conclusions: The coexistence of malnutrition and sarcopenia significantly increased mortality risk after TAVR, which highlights the importance of integrating both nutritional and sarcopenia assessments into preoperative risk stratification to optimize outcomes in patients undergoing TAVR.

  • Ryusuke Hamada, Kyohei Onishi, Masakazu Yasuda, Kosuke Fujita, Naoko S ...
    Article type: ORIGINAL ARTICLE
    Subject area: Valvular Heart Disease
    2026 Volume 8 Issue 4 Pages 650-656
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: January 29, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML

    Background: Trans-subclavian access transcatheter aortic valve implantation (TAVI), typically performed from the left side, is feasible. However, right subclavian artery access is technically challenging because of the anatomical orientation, which can result in malalignment of the transcatheter heart valve within the aortic annular plane.

    Methods and Results: We aimed to evaluate the procedural outcomes, device–annulus alignment, and clinical efficacy of right trans-subclavian (RtTS) TAVI. Of 423 consecutive patients who underwent TAVI, 32 treated via right or left subclavian access were analyzed. Implanted device depth and the device–annulus angle were measured angiographically. Of the 22 patients treated via RtTS, 15 received a balloon-expandable valve and 7 a self-expanding valve. Procedural success was achieved in all cases. Compared with the femoral and left subclavian approaches, RtTS was associated with a significantly larger device–annulus angle (6.0° vs. 8.7°; P<0.05) and deeper implantation at the left coronary cusp (2.4 vs. 4.4 mm; P=0.05). Post-procedural transcatheter heart valve function was comparable across the groups, and no patient had greater than moderate paravalvular leakage. However, symptomatic stroke occurred in 2 patients in the RtTS group (9.1%; P=0.21).

    Conclusions: RtTS TAVI is a feasible alternative access route, with comparable procedural and clinical outcomes to those of conventional approaches, albeit with a higher risk of stroke.

  • Kenichi Sasaki, Shingo Kuwata, Masaki Izumo, Yukio Sato, Takahiko Kai, ...
    Article type: ORIGINAL ARTICLE
    Subject area: Valvular Heart Disease
    2026 Volume 8 Issue 4 Pages 657-667
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 07, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: The clinical impact of left QRS axis deviation (LAD) during new-onset left bundle branch block (LBBB) after transcatheter aortic valve replacement (TAVR) remains unclear.

    Methods and Results: This single-center retrospective study analyzed 254 patients who developed new-onset LBBB during hospitalization after TAVR. Clinical and echocardiographic outcomes were compared between patients with LBBB and LAD (LBBBLAD) and those with LBBB and a normal QRS axis (LBBBNA). Overall, 96 patients (38%) had LBBBLAD, defined as a QRS axis <−30°. A more leftward preprocedural QRS axis independently predicted LBBBLAD (odds ratio 1.20 per 10° decrement; 95% confidence interval [CI] 1.09–1.33; P<0.01). At 3 years, there were no significant differences between groups in all-cause death (28% vs. 19%; P=0.14), cardiovascular death (6% vs. 5%; P=0.73), or heart failure rehospitalization (18% vs. 10%; P=0.07). However, LBBBLAD was associated with a higher incidence of permanent pacemaker implantation (PPI) for atrioventricular conduction disorder (16% vs. 6%; P=0.02) and remained an independent predictor of PPI (Cox hazard ratio 2.46; 95% CI 1.06–5.73; P=0.04). Echocardiographic measures, including left ventricular ejection fraction, chamber size, and mitral regurgitation severity, showed no significant longitudinal differences between groups.

    Conclusions: Compared with post-TAVR LBBBNA, post-TAVR LBBBLAD is associated with an increased need for PPI, but not with worse mortality or heart failure outcomes at 3-year follow-up. Closer and extended rhythm monitoring may be warranted in this subgroup.

Research Letters
Protocol Papers
  • Yoko M. Nakao, Atsushi Takayama, Koji Kawakami
    Article type: PROTOCOL PAPER
    2026 Volume 8 Issue 4 Pages 683-687
    Published: April 10, 2026
    Released on J-STAGE: April 10, 2026
    Advance online publication: February 14, 2026
    JOURNAL OPEN ACCESS FULL-TEXT HTML
    Supplementary material

    Background: Day-to-day home blood pressure variability (BPV) is associated with cardiovascular risk and influenced by environmental conditions. However, it is unclear whether short-term increases in day-to-day BPV can be predicted from personal sensor data. In this study, our aim is to develop and validate a machine-learning prediction model for short-term increases in day-to-day BPV using personal sensor data on behavioral and environmental exposure.

    Methods and Results: We will conduct a 30-day monitoring study in community-dwelling adults. Participants will measure home BP twice daily, while a portable sensor and an activity tracker record environmental conditions and physical activity. The primary outcome is an episode of increased systolic day-to-day BPV, defined as a rolling 5-day coefficient of variation ≥11.0%. Candidate predictors will be derived from the preceding 5-day exposure window. We will construct window-level data, allocate participants to training and test sets, and train machine-learning models with participant-level cross-validation. We will evaluate performance using the area under the receiver operating characteristic curve, calibration, Brier score, and decision-curve analysis, and interpret the XGBoost model with Shapley additive explanations to quantify the predictor contributions.

    Conclusions: This protocol outlines a framework for predicting short-term increases in day-to-day BPV from personally experienced environmental exposure and behaviors, supporting future personalized interventions targeting modifiable environmental and behavioral factors.
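    The protocol's outcome definition has a simple computational form. As a minimal sketch (not the authors' code — the function name and example readings are hypothetical; only the 5-day window and the ≥11.0% coefficient-of-variation threshold come from the abstract), an episode of increased day-to-day systolic BP variability could be flagged as follows:

```python
# Sketch: flag episodes of increased day-to-day systolic BP variability,
# defined here (per the protocol above) as a rolling 5-day coefficient of
# variation (CV = sample SD / mean * 100) of >= 11.0%.

def rolling_cv_episodes(sbp, window=5, threshold=11.0):
    """Return (day_index, cv) pairs where the CV of the trailing `window`
    daily systolic BP readings meets or exceeds `threshold` (in %)."""
    episodes = []
    for i in range(window - 1, len(sbp)):
        win = sbp[i - window + 1 : i + 1]
        mean = sum(win) / window
        sd = (sum((x - mean) ** 2 for x in win) / (window - 1)) ** 0.5  # sample SD
        cv = 100.0 * sd / mean
        if cv >= threshold:
            episodes.append((i, round(cv, 1)))
    return episodes

# Hypothetical example: a stable week followed by volatile readings
daily_sbp = [120, 122, 118, 121, 119, 135, 150, 112]
print(rolling_cv_episodes(daily_sbp))  # → [(7, 11.9)]
```

    In the planned analysis, such window-level flags would serve as labels, with predictors derived from the preceding 5-day exposure window of sensor data.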

Images in Cardiovascular Medicine