2025 Volume 58 Issue 2 Pages 59-67
Endoscopic ultrasound-guided fine-needle aspiration/biopsy (EUS-FNA/B) is critical for determining treatment strategies for patients with pancreatic cancer. However, conventional pathological examination using hematoxylin and eosin (H&E) staining is time-consuming. Microscopy with ultraviolet surface excitation (MUSE) enables rapid pathological diagnosis without requiring slide preparation. This study explores the potential of combining MUSE imaging with a cycle-consistent generative adversarial network (CycleGAN), an image generation algorithm capable of learning image-to-image translation without paired training images, to enhance diagnostic workflows for pancreatic EUS-FNA/B. Thirty-five pancreatic specimens were stained with Terbium/Hoechst 33342, and deep ultraviolet (DUV) fluorescence images were captured by exciting the tissue surface. These fluorescence images, along with H&E-stained formalin-fixed, paraffin-embedded (FFPE) sections from the same specimens, were divided into 256 × 256-pixel segments for CycleGAN training. The trained algorithm was then used to generate pseudo-H&E images from MUSE test images. The pseudo-H&E images generated by the CycleGAN showed improved inter-observer agreement among three pathologists compared with the original MUSE images. We established a technique for performing MUSE imaging on small pancreatic samples obtained through EUS-FNA/B and confirmed that H&E-style translation using CycleGAN simplified interpretation for pathologists. Integrating MUSE imaging with CycleGAN has the potential to offer a rapid, cost-effective, and accurate diagnostic tool.
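The preprocessing step described above, dividing whole images into 256 × 256-pixel segments before CycleGAN training, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function name and the choice to drop edge remainders are assumptions.

```python
def tile_boxes(width, height, tile=256):
    """Return (left, upper, right, lower) crop boxes that cover an image
    with non-overlapping tile x tile patches; partial edge tiles are dropped.
    Illustrative only -- the study's actual tiling details are not stated."""
    return [(x, y, x + tile, y + tile)
            for y in range(0, height - tile + 1, tile)
            for x in range(0, width - tile + 1, tile)]

# Example: a 1024 x 768 fluorescence image yields 4 x 3 = 12 patches.
boxes = tile_boxes(1024, 768)
```

Each resulting box could then be cropped from both the MUSE fluorescence images and the corresponding H&E-stained FFPE section images to build the two unpaired training sets that CycleGAN requires.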