2025, Vol. 58, No. 2, pp. 59-67
Endoscopic ultrasound-guided fine-needle aspiration/biopsy (EUS-FNA/B) is critical for determining treatment strategies for patients with pancreatic cancer. However, conventional pathological examination using hematoxylin and eosin (H&E) staining is time-consuming. Microscopy with ultraviolet surface excitation (MUSE) enables rapid pathological diagnosis without requiring slide preparation. This study explores the potential of combining MUSE imaging with a cycle-consistent generative adversarial network (CycleGAN), an image generation algorithm capable of learning translations without paired images, to enhance diagnostic workflows for pancreatic EUS-FNA/B. Thirty-five pancreatic specimens were stained with terbium (Tb)/Hoechst 33342, and deep ultraviolet (DUV) fluorescence images were captured by exciting the tissue surface. These fluorescence images, along with H&E-stained formalin-fixed, paraffin-embedded (FFPE) sections from the same specimens, were divided into 256 × 256-pixel patches for CycleGAN training. The trained model was then used to translate MUSE test images into pseudo-H&E images. The pseudo-H&E images generated by the CycleGAN showed improved inter-pathologist agreement among three pathologists compared with the original MUSE images. We established a technique to perform MUSE imaging on small pancreatic samples obtained through EUS-FNA/B and confirmed that H&E-style translation using CycleGAN simplified interpretation for pathologists. Integrating MUSE imaging with CycleGAN has the potential to offer a rapid, cost-effective, and accurate diagnostic tool.
Pancreatic cancer is among the most lethal malignancies worldwide, with a 5-year survival rate of < 10%, due to late diagnosis and rapid disease progression [1, 2]. Early and accurate pathological diagnosis is crucial to improve patient outcomes by enabling timely and personalized therapeutic interventions. Endoscopic ultrasound-guided fine-needle aspiration/biopsy (EUS-FNA/B) is an essential tool for obtaining tissue samples from pancreatic lesions, offering a minimally invasive approach with high sensitivity and specificity [3, 4]. However, the conventional histopathological evaluation of EUS-FNA/B samples typically requires time-consuming processes, including formalin fixation, paraffin embedding, and staining. This can lead to diagnostic delays, hindering prompt clinical decision-making, particularly for patients with aggressive cancers such as pancreatic ductal adenocarcinoma (PDAC).
Advances in optical imaging techniques have created new possibilities for rapid pathological assessment. Microscopy with ultraviolet surface excitation (MUSE) is a promising technique for obtaining high-resolution images of fresh tissue specimens without extensive preparation [5]. MUSE uses deep ultraviolet (DUV) light to excite tissue surfaces, enabling the detailed visualization of cellular and tissue architecture using a slide-free histology method [6]. MUSE has mainly been investigated for its role in rapid intraoperative diagnosis [7, 8]. However, its application to small biopsy samples, such as those obtained from the gastrointestinal tract or EUS-FNA/B procedures, remains underexplored. Expanding the use of MUSE to these smaller specimens could help reduce the workload of pathological examinations, particularly in resource-limited medical facilities, where access to comprehensive pathology services is often constrained. In our previous study, we combined terbium (Tb) ions as RNA tags with Hoechst dye to develop a novel staining method suitable for MUSE imaging. This approach enhanced the fluorescence signals and improved tissue characterization under DUV excitation [9]. Building on these advancements, we explored the application of deep learning techniques to enhance diagnostic capabilities further. We demonstrated that MUSE combined with deep convolutional neural networks can effectively detect lymph node metastasis in patients with gastric cancer [10]. In a subsequent study, we integrated cycle-consistent generative adversarial network (CycleGAN)-assisted image translation with MUSE images, which significantly enhanced the precision of lymph node metastasis detection [11]. These studies highlight the potential of combining advanced imaging techniques with artificial intelligence (AI) to overcome the limitations of conventional pathology examinations.
AI-driven pathology diagnostic systems are continually evolving; however, a definitive diagnosis requires confirmation by pathologists. Moreover, interpreting MUSE images is challenging for pathologists because they differ significantly from conventional hematoxylin and eosin (H&E)-stained slides in appearance and color. To address this limitation, we employed CycleGAN [12], a deep learning-based generative adversarial network, to translate MUSE images into pseudo-H&E images. This approach uses neural networks to translate the unique fluorescence patterns of MUSE into a format that closely resembles that of traditional H&E staining, facilitating easier interpretation by pathologists. By combining the rapid imaging capabilities of MUSE, enhanced by our novel Tb/Hoechst staining method, with the interpretability of CycleGAN-generated pseudo-H&E images, this study evaluated the feasibility and diagnostic concordance of these technologies for the immediate pathological diagnosis of pancreatic cancer from EUS-FNA/B samples. We hypothesize that this method will provide a rapid and reliable alternative to conventional histopathology, offering timely diagnostic information to support clinical decision-making in pancreatic cancer.
All clinical experiments were conducted with the approval of the Ethics Committee of Kyoto Prefectural University of Medicine (approval No. ERB-C-1852) and in accordance with guidelines from the committees and regional laws related to clinical research. We obtained the pancreatic tissues used in this study from 35 patients who underwent EUS-FNA/B at the Kyoto Prefectural University of Medicine (Kyoto, Japan) between April 2022 and May 2023. Written informed consent was obtained from all participants. Before EUS-FNA/B, the patients had been diagnosed with pancreatic cancer (n = 31), metastatic pulmonary cancer (n = 1), pancreatic neuroendocrine tumor (PanNET) (n = 1), and autoimmune pancreatitis (n = 2). None of the patients received radiotherapy or chemotherapy before undergoing EUS-FNA/B. After MUSE imaging, the pancreatic samples were fixed in 10% formalin and embedded in paraffin. Formalin-fixed, paraffin-embedded (FFPE) thin-sectioned H&E specimens were prepared, and pathological diagnosis was performed as part of the routine workflow. After EUS-FNA/B, the patients were diagnosed with PDAC (n = 30), adenosquamous carcinoma (n = 2), PanNET (n = 1), metastatic pulmonary small cell carcinoma (n = 1), and autoimmune pancreatitis (n = 1).
Staining protocol and MUSE imaging

As pancreatic tissues obtained by EUS-FNA/B often contain large amounts of blood, FNA/B samples were hemolyzed using CytoRich RED (Becton, Dickinson and Company) (Fig. 1). After hemolysis treatment, FNA/B samples were immersed in 50% ethanol for 1 min, rinsed with HEPES-buffered solution (10 mM HEPES, pH 7.38 adjusted with NaOH), stained in 100% deuterium oxide (D2O) HEPES buffer containing 20 mM TbCl3 (TBH03XB, Kojundo Chemical Laboratory) and 10 μg/mL Hoechst 33342 (Dojindo Molecular Technologies) for 3 min, and then rinsed with 100% D2O HEPES buffer. All staining procedures were performed on ice at 4°C.
Fig. 1. A schematic view representing the protocol for fluorescence staining and MUSE imaging of pancreatic EUS-FNA/B samples. In total, 247 MUSE images at 5,472 × 3,648 pixels acquired from 35 cases were used in this study. For CycleGAN training datasets, 35 H&E WSIs were obtained from FFPE thin-sectioned specimens of the corresponding EUS-FNA/B samples following MUSE imaging. CycleGAN, cycle-consistent generative adversarial network; DUV, deep ultraviolet; EUS-FNA/B, endoscopic ultrasound-guided fine-needle aspiration/biopsy; FFPE, formalin-fixed, paraffin-embedded; H&E, hematoxylin and eosin; MUSE, microscopy with ultraviolet surface excitation; Tb, terbium; WSI, whole slide image.
Stained samples were covered with a silica coverslip (SF-S-D12, Fine Plus International) and placed on our custom-built inverted microscope described previously [10]. Briefly, the sample surface was illuminated with 285 nm DUV light emitted from an LED (VPS-164, Nikkiso). The fluorescence emitted from the sample was collimated using an objective lens (UPLFLN 10×, Olympus) and imaged using a CMOS camera (BFS-U3-200S6C-C, FLIR) to capture MUSE images at 5,472 × 3,648 pixels. An optical filter (FF01-464/547-25, Semrock) was placed between the objective and imaging lenses to attenuate the DUV light. The brightness of each acquired MUSE image was normalized so that the larger of the 99.99th-percentile intensity values of the green and blue channels was scaled to the maximum value.
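For clarity, this normalization can be expressed in a few lines of NumPy. The following is a minimal sketch rather than the published code; the function name and the assumption of an 8-bit RGB input are illustrative.

```python
import numpy as np

def normalize_muse_brightness(img: np.ndarray) -> np.ndarray:
    """Scale a MUSE RGB image so that the larger of the green- and
    blue-channel 99.99th-percentile intensities maps to full scale.

    img: uint8 array of shape (H, W, 3) in RGB order (an assumption).
    """
    green_p = np.percentile(img[..., 1], 99.99)
    blue_p = np.percentile(img[..., 2], 99.99)
    scale = 255.0 / max(green_p, blue_p, 1.0)  # guard against division by zero
    out = np.clip(img.astype(np.float32) * scale, 0, 255)
    return out.astype(np.uint8)
```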
Image dataset preparation for CycleGAN

We developed the CycleGAN models using 247 MUSE images obtained from the pancreatic FNA/B samples of 35 patients (6–8 images per patient). H&E whole-slide images (WSIs) of the FFPE thin-sectioned specimens were obtained from the same pancreatic FNA/B samples using a digital slide scanner (NanoZoomer S360, Hamamatsu Photonics) in 40× objective lens mode.
Image translation by the CycleGAN model

The details of CycleGAN training are described in a previous study by Sato et al. [11]. CycleGAN is an unsupervised learning algorithm that learns image translation between two unpaired image datasets [12]. Two generator–discriminator pairs perform the domain translations and learn bidirectional translation; by training these pairs simultaneously, the CycleGAN can restyle images from one domain to the other. The MUSE images were trained and translated into H&E style using five-fold cross-validation (Fig. 2A) [13]. The MUSE images were randomly assigned to five folds; one fold was used as the test data, while the remaining four folds served as the training data. Images from the same patient were never split between the training and test data. The MUSE images were divided into small patches of 256 × 256 pixels, yielding 57,918–58,212 patches per fold (Fig. 2B). FFPE H&E image patches of 256 × 256 pixels were cropped from the H&E WSIs at the same scale as the MUSE image patches, and 79,680 patches were used in every training run (Fig. 2B). The pronounced differences between the domain of MUSE images (fluorescence images) and that of FFPE H&E images (bright-field images) can destabilize CycleGAN training. Therefore, the MUSE images were intensity-inverted to bring their domain closer to that of the FFPE H&E images, and these inverted images were used in the CycleGAN model [14]. The model was then trained for 40 epochs. Image translation for the test datasets was performed using a sliding window process (Fig. 2C).
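The training code itself is not reproduced here; the sketch below merely illustrates the patient-level fold separation, intensity inversion, and patch extraction described above. scikit-learn's GroupKFold is one way to guarantee that a patient's images never cross the training/test boundary (the article randomized fold assignment; GroupKFold only enforces the separation), and all function and variable names are illustrative.

```python
import numpy as np
from sklearn.model_selection import GroupKFold

def patient_level_folds(image_paths, patient_ids, n_splits=5):
    """Split MUSE images into folds so that no patient's images appear
    in both the training and test data of any fold."""
    gkf = GroupKFold(n_splits=n_splits)
    indices = np.arange(len(image_paths))
    return list(gkf.split(indices, groups=patient_ids))

def invert_muse(img: np.ndarray) -> np.ndarray:
    """Invert an 8-bit MUSE fluorescence image (bright signal on a dark
    background) so its appearance is closer to bright-field H&E."""
    return 255 - img

def extract_patches(img: np.ndarray, size: int = 256):
    """Tile an image into non-overlapping size x size patches."""
    h, w = img.shape[:2]
    return [img[y:y + size, x:x + size]
            for y in range(0, h - size + 1, size)
            for x in range(0, w - size + 1, size)]
```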
Fig. 2. H&E-style translation by the CycleGAN model. (A) Five-fold cross-validation for H&E-style image translation using CycleGAN. The MUSE images were randomly assigned to five folds on a per-patient basis; one fold was used as test data, and the remaining four folds served as training data. All MUSE images were utilized as test data and translated into pseudo-H&E images. (B) A bidirectional translation model was trained on 256 × 256-pixel patches extracted from MUSE images and FFPE H&E images. Bars = 10 μm. (C) The transfer model translated MUSE test images into H&E style using a sliding window process. CycleGAN, cycle-consistent generative adversarial network; FFPE, formalin-fixed, paraffin-embedded; H&E, hematoxylin and eosin; MUSE, microscopy with ultraviolet surface excitation.
Three pathologists reviewed the MUSE images obtained from pancreatic EUS-FNA/B and the pseudo-H&E images translated using CycleGAN. The images were classified into three categories: (1) benign, (2) indeterminate, and (3) malignant. For each case, the highest category among the image classifications was assigned as the category for that case. We calculated the inter-rater agreement among the three pathologists using Krippendorff's alpha coefficient (α) [15] and the pairwise agreement between each pair of pathologists using the weighted Cohen's kappa coefficient (κ), comparing the original MUSE images and the pseudo-H&E images. Moreover, we employed the bootstrap method to estimate the 95% confidence intervals (CIs) for Krippendorff's α between MUSE and pseudo-H&E images, at both the image and case levels [16]. Specifically, we performed 1,000 iterations of resampling with replacement from the original dataset to generate bootstrap samples separately for each comparison. For each bootstrap sample, Krippendorff's α was recalculated, and the 95% CIs were determined using the bootstrap percentile method with the 2.5th and 97.5th percentiles of the bootstrap distributions as the lower and upper bounds, respectively. To statistically evaluate the difference in Krippendorff's α between MUSE and pseudo-H&E images, we conducted a bootstrap hypothesis test. The null hypothesis assumed no difference in Krippendorff's α between the two groups. The P-value was computed as the proportion of bootstrap resampled differences that were less than zero, and the statistical significance was set at P < 0.05.
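As a concrete illustration of this procedure, the following sketch computes the percentile-bootstrap CIs and the one-sided P-value using the third-party krippendorff package. The data layout and function names are assumptions, not the authors' analysis code.

```python
import numpy as np
import krippendorff  # third-party package: pip install krippendorff

def bootstrap_alpha(ratings_muse, ratings_pseudo, n_boot=1000, seed=0):
    """Percentile bootstrap for Krippendorff's alpha and a one-sided test
    of the difference (pseudo-H&E minus original MUSE).

    ratings_*: arrays of shape (3 raters, n units) holding the ordinal
    codes 1 = benign, 2 = indeterminate, 3 = malignant.
    """
    rng = np.random.default_rng(seed)
    n = ratings_muse.shape[1]
    alphas_muse, alphas_pseudo = [], []
    for _ in range(n_boot):
        # Resample units with replacement, separately for each image type,
        # following the description in the text.
        idx_m = rng.integers(0, n, size=n)
        idx_p = rng.integers(0, n, size=n)
        alphas_muse.append(krippendorff.alpha(
            reliability_data=ratings_muse[:, idx_m],
            level_of_measurement="ordinal"))
        alphas_pseudo.append(krippendorff.alpha(
            reliability_data=ratings_pseudo[:, idx_p],
            level_of_measurement="ordinal"))
    diffs = np.array(alphas_pseudo) - np.array(alphas_muse)
    ci_muse = np.percentile(alphas_muse, [2.5, 97.5])
    ci_pseudo = np.percentile(alphas_pseudo, [2.5, 97.5])
    p_value = np.mean(diffs < 0)  # proportion of resampled differences < 0
    return ci_muse, ci_pseudo, p_value
```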
Computer and software used for calculations

For all calculations of the CycleGAN, we used a ready-made PC (GALLERIA UA9C-R49, Thirdwave) with a CPU (Core i9-14900KF, 3.20 GHz, Intel) and a GPU (GeForce RTX 4090, 24 GB, NVIDIA). The installed OS was Ubuntu 22.04 LTS. The CycleGAN was built using TensorFlow version 2.14.0 and Keras version 2.14.0.
Because FNA/B samples were obtained by puncturing pancreatic lesions with a fine needle, the imaging targets were already exposed on the sample surface. Therefore, we stained the samples with Tb/Hoechst 33342 without sectioning and obtained DUV-excitation images (Fig. 1). A brief treatment with a high concentration of ethanol before fluorescence staining was essential to enhance the staining properties of Tb. However, excessive ethanol treatment hardened the small pancreatic tissues, interfering with DUV-excitation imaging. Therefore, we treated them with 50% ethanol, a concentration lower than that previously reported for staining lymph nodes [11]. Unlike thin-sectioned FFPE specimens, FNA/B samples have uneven surfaces. Hence, we flattened the surface for imaging by sandwiching the sample between coverslips and applying pressure with a weight on the non-imaging side. Fig. 3 shows representative MUSE images of normal pancreatic tissue and PDAC obtained using pancreatic EUS-FNA/B, along with the corresponding H&E images of FFPE specimens prepared from the same FNA/B samples. When excited by DUV, green fluorescence from Tb was observed from the RNA-rich cytoplasm, and blue fluorescence was observed from the Hoechst-bound nuclei [9]. The acinar architecture of the pancreatic acinar cells was clearly visible in the MUSE images of the normal pancreas, similar to that in the FFPE H&E images. In contrast, cancer glands invading the fibrous stroma were observed in the MUSE images of PDAC, which was consistent with the FFPE H&E images.
Fig. 3. Representative MUSE and corresponding FFPE H&E images of pancreatic EUS-FNA/B. MUSE images (A, B) and corresponding H&E image (C) of normal pancreatic acinar cells. MUSE images (D, E) and corresponding H&E image (F) of pancreatic invasive ductal adenocarcinoma. Enlarged views of the regions outlined by red squares in images (A, D) are shown to the right as images (B, E), respectively. Bars = 200 μm (A, D) and 50 μm (B, C, E, F). EUS-FNA/B, endoscopic ultrasound-guided fine-needle aspiration/biopsy; FFPE, formalin-fixed, paraffin-embedded; H&E, hematoxylin and eosin; MUSE, microscopy with ultraviolet surface excitation.
We translated MUSE images into H&E style using CycleGAN to facilitate pathologists' interpretation of MUSE images obtained from pancreatic EUS-FNA/B samples. After training the CycleGAN to perform bidirectional image translation between the MUSE and FFPE H&E images (Fig. 2B), we translated the MUSE test images into pseudo-H&E images using a sliding window approach (Fig. 2C). Because the CycleGAN was trained on 256 × 256-pixel patches, we initially translated the 5,472 × 3,648-pixel MUSE images by sliding a 256 × 256-pixel window in 256-pixel increments. However, the patches translated by the CycleGAN model often contained artifacts at their edges. To address this, we discarded the outer 48 pixels of each translated patch, retained the central 160 × 160 pixels, and tiled these central regions by sliding the window in 160-pixel increments (Supplementary Fig. S1). Figs. 4 and 5 show representative pseudo-H&E images translated from the MUSE images using CycleGAN. Normal pancreatic duct epithelia arranged in sheet-like structures and PDAC with fused glandular formations were faithfully translated into pseudo-H&E images. While seams were slightly visible in out-of-focus areas in high-power fields (Fig. 5D), they did not noticeably affect the overall image assessment (Fig. 5C). Fine intracellular structures, such as nucleoli, were also accurately translated into the pseudo-H&E images.
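A sliding-window routine of this kind can be sketched as follows. This is an illustrative reconstruction under stated assumptions (reflect padding at the image borders, a Keras generator taking 256 × 256 RGB inputs), not the authors' implementation.

```python
import numpy as np

PATCH, CORE = 256, 160     # network input size and retained central region
PAD = (PATCH - CORE) // 2  # 48-pixel border discarded on each side

def translate_whole_image(img, generator):
    """Translate a large inverted MUSE image into H&E style with a sliding
    window, keeping only the central CORE x CORE pixels of each translated
    patch to suppress edge artifacts.

    img: float32 array (H, W, 3) scaled to the generator's input range.
    generator: trained MUSE-to-H&E generator mapping
               (1, 256, 256, 3) -> (1, 256, 256, 3).
    """
    h, w = img.shape[:2]
    # Reflect-pad so every core region has full 256x256 context around it.
    padded = np.pad(img, ((PAD, PAD + PATCH), (PAD, PAD + PATCH), (0, 0)),
                    mode="reflect")
    out = np.zeros_like(padded)
    for y in range(0, h, CORE):
        for x in range(0, w, CORE):
            window = padded[y:y + PATCH, x:x + PATCH]
            pred = generator.predict(window[None])[0]
            # Keep only the artifact-free center and tile it into place.
            out[y + PAD:y + PAD + CORE, x + PAD:x + PAD + CORE] = \
                pred[PAD:PAD + CORE, PAD:PAD + CORE]
    return out[PAD:PAD + h, PAD:PAD + w]
```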
Fig. 4. Representative MUSE images (A, B) and pseudo-H&E images (C, D) of normal pancreatic duct epithelia obtained by EUS-FNA/B. Enlarged views of the regions outlined by red squares in images (A, C) are shown to the right as images (B, D), respectively. Bars = 200 μm (A, C) and 50 μm (B, D). EUS-FNA/B, endoscopic ultrasound-guided fine-needle aspiration/biopsy; H&E, hematoxylin and eosin; MUSE, microscopy with ultraviolet surface excitation.
Fig. 5. Representative MUSE images (A, B) and pseudo-H&E images (C, D) of pancreatic invasive ductal adenocarcinoma obtained by EUS-FNA/B. Enlarged views of the regions outlined by red squares in images (A, C) are shown to the right as images (B, D), respectively. Bars = 200 μm (A, C) and 50 μm (B, D). EUS-FNA/B, endoscopic ultrasound-guided fine-needle aspiration/biopsy; H&E, hematoxylin and eosin; MUSE, microscopy with ultraviolet surface excitation.
The three pathologists conducted pathological reviews to validate the H&E-style translation of the MUSE images of pancreatic EUS-FNA/B. Supplementary Table S1 presents the results of this pathological review. At the image level (n = 247), Krippendorff's α improved from 0.230 for the original MUSE images to 0.464 for pseudo-H&E images (Table 1). The bootstrap estimates of Krippendorff's α increased significantly from a mean of 0.230 for the original MUSE images to a mean of 0.461 for pseudo-H&E images (P < 0.001) (Supplementary Fig. S2A). At the case level (n = 35), Krippendorff's α improved from 0.053 for the original MUSE images to 0.594 for pseudo-H&E images (Table 1). The bootstrap estimates of Krippendorff's α increased significantly from a mean of 0.039 for the original MUSE images to a mean of 0.567 for pseudo-H&E images (P = 0.001) (Supplementary Fig. S2B). Compared to the original MUSE images, weighted Cohen's κ also improved with pseudo-H&E images across all pairings, at both the image and case levels (Table 1). These findings indicate that H&E-style translation significantly improved inter-rater agreement compared with the original MUSE images, suggesting its potential to increase the consistency of pathological assessments.
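For reference, the pairwise agreement and the case-level aggregation used in this review can be computed as in the sketch below. scikit-learn's cohen_kappa_score supports weighted kappa; because the weighting scheme is not stated in the text, linear weights are an assumption here, as are the function names.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

def case_level_category(image_scores, case_ids):
    """Assign each case the highest category among its image-level scores
    (1 = benign, 2 = indeterminate, 3 = malignant)."""
    best = {}
    for score, cid in zip(image_scores, case_ids):
        best[cid] = max(best.get(cid, 0), score)
    return np.array([best[cid] for cid in sorted(best)])

def pairwise_weighted_kappa(rater1, rater2):
    """Weighted Cohen's kappa between two raters; linear weights are an
    assumption, as the weighting scheme is not specified in the text."""
    return cohen_kappa_score(rater1, rater2, weights="linear")
```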
Table 1. Diagnostic concordance of three pathologists (A, B, and C) at the image and case levels
| Coefficient | Original MUSE (image, n = 247) | Pseudo-H&E (image, n = 247) | Original MUSE (case, n = 35) | Pseudo-H&E (case, n = 35) |
| --- | --- | --- | --- | --- |
| α (A, B, and C) | 0.230 | 0.464 | 0.053 | 0.594 |
| α (bootstrap estimates)a | 0.230 (0.146, 0.312) | 0.461 (0.380, 0.538) | 0.039 (−0.146, 0.250) | 0.567 (0.249, 0.811) |
| κ (A and B) | 0.268 | 0.475 | 0.146 | 0.664 |
| κ (A and C) | 0.490 | 0.611 | 0.091 | 0.607 |
| κ (B and C) | 0.075 | 0.339 | 0.060 | 0.475 |
Abbreviations: H&E, hematoxylin and eosin; MUSE, microscopy with ultraviolet surface excitation; α, Krippendorff’s alpha coefficient; κ, weighted Cohen’s kappa coefficient.
a Data are presented as the mean value (95% confidence interval).
In this study, we established a technique to capture DUV-excitation fluorescence images from pancreatic EUS-FNA/B samples and translate them into pseudo-H&E images using CycleGAN, thereby enhancing the interpretability of MUSE images for pathologists. MUSE enables optical sectioning of tissue surfaces without requiring thin-section processing, allowing rapid acquisition of pathological tissue images with minimal staining [6]. MUSE has been applied to intraoperative diagnostics for brain tumors and to the assessment of surgical margins of breast and skin tumors [7, 8, 17]. To build further on the effectiveness of MUSE for rapid and convenient pathology with minimal effort, we explored whether diagnostic-quality histopathological images could be derived from pancreatic EUS-FNA/B samples.
Recently, EUS-FNA/B has become the standard diagnostic approach for pancreatic tissue collection [3, 4]. EUS-FNA/B uses fine needles (19–25 gauge, with an inner diameter of 0.26–0.70 mm) to collect very small tissue samples. Regions of diagnostic interest are typically located on the surface of EUS-FNA/B samples, enabling MUSE imaging with even less preprocessing than is typically required. Furthermore, all EUS-FNA/B samples can be prepared into FFPE blocks after MUSE imaging, resulting in less tissue loss than conventional rapid diagnostic techniques such as the frozen section method [18] and rapid on-site cytological evaluation (ROSE) [19, 20]. DUV light is high-energy electromagnetic radiation that can damage nucleic acids in irradiated tissues [21]. However, in MUSE imaging, the DUV light interacts only with the tissue surface, allowing nucleic acid extraction from samples after imaging [6, 8]. This capability supports subsequent genomic analyses, aligning with the growing need for genomic insights in personalized medicine.
However, MUSE produces fluorescence images in which nuclei and cytoplasm are visualized against a dark background. Although most pathologists are highly skilled in interpreting H&E-stained specimens, they are less accustomed to diagnosing from fluorescence images. In this study, we attempted to improve diagnostic workflows for pancreatic cancer by translating MUSE images into pseudo-H&E images using CycleGAN [12]. CycleGAN, a deep-learning-based image translation algorithm, can learn translations without paired images and does not require manual tuning of complex parameters. Previous studies have explored translating histopathological images from various modalities into H&E-like images [22, 23]. Our study showed higher inter-pathologist agreement for pseudo-H&E images than for the original MUSE images, indicating that CycleGAN can translate MUSE images into a pathologist-friendly format that is easier to interpret.
We observed that artifacts frequently appeared at the edges of the 256 × 256-pixel patches translated by CycleGAN. These artifacts consistently appeared in the same locations within individual epochs, although their patterns varied across epochs, and they were independent of the image content. These observations suggest that the artifacts are inherent to the image training process in convolutional neural networks, although the precise cause remains unclear. To mitigate this issue, we cropped the central 160 × 160-pixel region from each patch, effectively reducing the impact of edge artifacts (Supplementary Fig. S1).
ROSE for pancreatic EUS-FNA/B is desirable in clinical practice in terms of time and cost [19, 20] because real-time feedback between the endoscopist and the pathologist or cytotechnologist enables more effective specimen collection for diagnosis, reduces the number of unnecessary punctures, and increases examination efficiency. However, the number of pathologists and cytotechnologists is limited, and it is often difficult for them to be present at the time of examination. In the future, MUSE imaging with CycleGAN may be combined with online pathology services to improve the efficiency of EUS-FNA/B examinations.
This study has certain limitations, including its single-center design, a small sample size of 35 cases, and limited representation of benign cases due to the clinical focus of EUS-FNA/B in patients with a high suspicion of malignancy. Consequently, we could not comprehensively evaluate the diagnostic accuracy of pancreatic cancer.
In conclusion, our findings demonstrate that pancreatic EUS-FNA/B specimens can be rapidly stained and imaged using MUSE and that translating these images into pseudo-H&E images using CycleGAN improves inter-pathologist agreement. In addition, MUSE is more cost-effective and has a simpler system configuration compared to other optical imaging techniques such as confocal laser microscopy and stimulated Raman microscopy [24, 25]. MUSE simplifies sample handling and enables automated staining and imaging, thereby facilitating integration with remote diagnostic systems and machine learning. Maintaining consistency in MUSE image quality remains a challenge, and the reliability of CycleGAN-based image translation needs further validation. Conducting multi-institutional studies will be essential to enhance the reproducibility and generalizability of the imaging and diagnostic workflow.
T.T. received a research grant from Terasaki Electric Co., Ltd.
Y.K., R.N., H. Niioka, and T.T. conceived the project and wrote the manuscript; Y.K. performed the fluorescence imaging experiments; Y.K., H.Y., and Y.I. contributed to pancreatic sample collection and preparation; R.N., J.S., and H. Niioka performed CycleGAN analysis; M.H., O.I., N.T., Y.M., and E.K. contributed to histopathological reviews; Y.K., R.N., J.S., M.H., Y.H., H.T., H. Nagahara, H. Niioka, and T.T. discussed the results. All authors reviewed the manuscript.
We would like to thank Satoki Yamane, Keisuke Takemura, Yuki Sawai, Hayato Miyake, Toshifumi Doi, and Yoshio Sogame for their contributions to the collection and preparation of pancreatic samples. This study was partially supported by Terasaki Electric Co., Ltd. We would like to thank Editage (www.editage.jp) for English language editing.