The Official Journal of the Japanese Society of Interventional Radiology
Online ISSN : 2185-6451
Print ISSN : 1340-4520
ISSN-L : 1340-4520
Volume 36, Issue 4
Displaying 1-9 of 9 articles from this issue
State of the Art
Potential of Augmented Reality and Mixed Reality for IVR
  • Satoru Morita
    2022 Volume 36 Issue 4 Pages 325
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Download PDF (501K)
  • Maki Sugimoto, Takuya Sueyoshi
    2022 Volume 36 Issue 4 Pages 326-334
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    In order to perform IVR safely and accurately, preoperative medical images of the individual patient are indispensable. However, even when the image data are reconstructed in 3D, a two-dimensional display can convey only a pseudo-3D impression through shading, transparency, and lighting cues derived from the X, Y, and Z information. We have developed a system that supports accurate, safe, and reliable IVR by superimposing the medical image data of individual patients onto virtual and real space and presenting it in 3D, like a hologram, using virtual reality (VR), augmented reality (AR), and mixed reality (MR), collectively termed extended reality (XR) technology.
    Image-assisted IVR using XR and the metaverse is a natural and intuitive method of image support with a high degree of freedom: the operator can appreciate the depth and positional relationships of organs and pathology that a flat display cannot convey, and can share that experience while moving their own hands. The ability to spatially observe and confirm an individual patient's anatomy, pathology, and planned procedure in advance should make the practitioner more comfortable with the procedure and contribute to shorter procedure times and accident prevention. (A sketch of the kind of image preprocessing such systems depend on follows this entry.)
    Download PDF (3900K)
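As context for the hologram pipeline described above, here is a minimal Python sketch of one common preprocessing step: extracting a surface mesh from a CT volume so an XR viewer can anchor it in space as a hologram. The file names, voxel spacing, and bone threshold are illustrative assumptions, not details from the article.

```python
import numpy as np
from skimage import measure  # marching cubes for iso-surface extraction
import trimesh               # mesh export to XR-friendly formats

# Hypothetical CT volume as a (z, y, x) array in Hounsfield units;
# in practice this would be reconstructed from the patient's DICOM series.
volume = np.load("ct_volume.npy")
spacing = (1.0, 0.7, 0.7)  # assumed slice thickness / pixel spacing in mm

# Extract an iso-surface at a bone-like threshold (~300 HU);
# with spacing supplied, the vertices come out in millimeters.
verts, faces, normals, _ = measure.marching_cubes(volume, level=300.0,
                                                  spacing=spacing)

# Export as glTF so a headset viewer can place the mesh as a hologram.
mesh = trimesh.Trimesh(vertices=verts, faces=faces, vertex_normals=normals)
mesh.export("bone_surface.glb")
```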
  • Wataru Narita
    2022 Volume 36 Issue 4 Pages 335-341
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Owing to recent advances in computer technology, almost all medical images are now handled by computers, and the performance of the diagnostic and therapeutic devices that use them has come to depend on the software that controls them. In this context, a new revolution is taking place in the technology that feeds the results of computation into human vision. Virtual reality (VR), augmented reality (AR), and mixed reality (MR) are representative of these technologies, and their spread has been remarkable. The author has applied them to spine surgery, developing a simulator for pedicle screw insertion and an intraoperative image-support system that allows the user to confirm the anatomy of the individual patient during surgery. The simulator represents the operating table and the three-dimensional anatomy in a virtual space; the user can insert a screw while observing the necessary anatomy in any cross section and moving freely within the space. The intraoperative image-support system allows the surgeon to set the screw trajectory before surgery and project that trajectory onto the surgical field. These techniques not only improved the surgeon's three-dimensional understanding of the anatomy and spatial awareness, but were also useful for medical education in surgical technique. In addition, online conferencing using XR (the metaverse) made it possible to share implant and bone data in a virtual space, with AR markers used to share avatar positions, while operating and conferencing with remote sites. (A sketch of the trajectory-projection step follows this entry.)
    Download PDF (3446K)
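To make the trajectory-projection step concrete, the sketch below projects a planned screw trajectory (two 3D points in patient coordinates) into a camera image using a pinhole model. Every numeric value (points, pose, intrinsics) is a placeholder, and OpenCV's projectPoints stands in for whatever rendering path the actual system uses.

```python
import cv2
import numpy as np

# Planned trajectory in patient (CT) coordinates, mm; illustrative values.
entry = [35.0, 60.0, 120.0]   # skin/bone entry point
target = [15.0, 95.0, 118.0]  # target deep in the vertebra
trajectory = np.array([entry, target], dtype=np.float32)

# Pose of the patient coordinate system relative to the camera, as
# produced by registration (e.g. from AR markers); placeholders here.
rvec = np.zeros(3)
tvec = np.array([0.0, 0.0, 500.0])
camera_matrix = np.array([[700.0, 0.0, 640.0],
                          [0.0, 700.0, 360.0],
                          [0.0, 0.0, 1.0]])

# Project both endpoints into the image and draw the trajectory line.
pts, _ = cv2.projectPoints(trajectory, rvec, tvec, camera_matrix, None)
p0, p1 = [tuple(int(v) for v in p.ravel()) for p in pts]
frame = np.zeros((720, 1280, 3), np.uint8)  # stand-in for the live view
cv2.line(frame, p0, p1, (0, 255, 0), 2)
```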
  • Atsushi Komemushi, Shogo Takashima, Atsushi Nagai, Masakatsu Usui, Mas ...
    2022 Volume 36 Issue 4 Pages 342-345
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Background: Virtual reality technology is advancing rapidly, and in recent years the technology for merging the virtual world with the real world (mixed reality, MR) has become available at the individual-developer level. The Microsoft HoloLens is a head-mounted device with a transparent display that creates MR by superimposing computer graphics on the real scene. It is already being used in the medical field to display a 3D hologram of the anatomical position of each organ near the surgical field as a reference image during surgery.
    For mixed reality (MR) in interventional radiology, the 3D hologram should serve not merely as a reference image but as a direct guide for the actual procedure. The tolerable error in a real procedure is less than a few millimeters, so fusing the real and virtual worlds requires highly accurate superimposition. However, there is no clear reference point on the real patient for superimposing the 3D hologram, and accurate superimposition requires skill and know-how.
    Objective: To evaluate the accuracy of mixed reality (MR)-guided puncture.
    Method: A tungsten carbide sphere 1 mm in diameter was embedded in an EVA resin block to make a puncture phantom. The phantom was imaged with CT, and 3D hologram data were generated from the CT images. The holographic data were transferred to the MR headset (Microsoft HoloLens), and the phantom was punctured with an 18-G needle using the 3D holographic puncture guide. Fifteen punctures were performed. The distance between the needle and the target was measured on a workstation, using the post-puncture CT images, by a radiologist blinded to the punctures.
    Results: Mixed reality-guided puncture with a stereoscopic hologram was possible in 15/15 attempts (100%). The error relative to the puncture target was 4.1 ± 2.4 mm (range: 0–7 mm). (A worked sketch of this error computation follows the entry.)
    Conclusion: Mixed reality (MR) puncture guidance needs to be improved in terms of accuracy as of 2018.
    Download PDF (1313K)
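The error metric here is a plain Euclidean distance measured on post-puncture CT, summarized as mean ± SD. A minimal sketch of that arithmetic follows; the coordinates are illustrative only, not the study's measurements.

```python
import numpy as np

def target_error_mm(needle_tip, target_center):
    """Euclidean distance (mm) between the needle tip and the center
    of the 1-mm target sphere, both read off the post-puncture CT."""
    return float(np.linalg.norm(np.subtract(needle_tip, target_center)))

# Illustrative tip positions (mm) for three punctures; the study had 15.
tips = [(102.3, 87.6, 45.0), (99.0, 92.1, 44.5), (104.8, 88.9, 46.2)]
target = (100.0, 90.0, 44.2)

errors = [target_error_mm(t, target) for t in tips]
# Mean ± sample SD, the form in which the paper reports 4.1 ± 2.4 mm.
print(f"{np.mean(errors):.1f} ± {np.std(errors, ddof=1):.1f} mm "
      f"(range: {min(errors):.1f}-{max(errors):.1f} mm)")
```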
  • Satoru Morita, Kazufumi Suzuki, Takahiro Yamamoto, Shuji Sakai
    2022 Volume 36 Issue 4 Pages 346-351
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Augmented reality (AR) superimposes digital information on the real world through a two-dimensional screen. Mixed reality (MR) further increases interactivity by anchoring movable three-dimensional (3D) digital objects in physical space using a head-mounted display. AR/MR has been used for navigation and simulation in various fields, including surgery. In interventional radiology, promising results have been reported using AR/MR for needle guidance in phantom experiments. These approaches require preprocedural image reconstruction to produce 3D objects that must be imported into the device in advance, as well as image registration between the AR/MR scene and the phantom using additional markers or specialized software. We developed an MR needle-guidance application for HoloLens 2, “MR Puncture”, which avoids preprocedural image reconstruction and import by manually matching the spatial and MR coordinate systems. We also developed an iPhone app, “AR Puncture/AR Needle Guide”, to assist in CT-guided biopsies: it measures the angle of the biopsy needle contactlessly in real time and displays the angle in a coordinate system based on the CT scanner or the patient’s body axis. We introduce our approach to developing AR/MR applications and discuss the potential and current problems of AR/MR. (A sketch of such an angle computation follows this entry.)
    Download PDF (1975K)
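The abstract does not disclose AR Puncture's internals, but reporting a needle angle in a CT-based coordinate system can be sketched as follows: given two tracked 3D points on the needle, simple trigonometry yields the in-plane angle and the cranio-caudal tilt. The axis convention and point values are assumptions for illustration.

```python
import numpy as np

def needle_angles(hub_xyz, tip_xyz):
    """Angles of a needle defined by two tracked 3D points (mm) in a
    CT-style patient frame: x = left-right, y = anterior-posterior,
    z = cranio-caudal. Returns (in-plane angle, cranio-caudal tilt)
    in degrees."""
    v = np.asarray(tip_xyz, float) - np.asarray(hub_xyz, float)
    v /= np.linalg.norm(v)
    # In-plane angle: rotation within the axial (x-y) plane,
    # measured from the anterior-posterior axis.
    in_plane = np.degrees(np.arctan2(v[0], v[1]))
    # Tilt: elevation of the needle out of the axial plane.
    tilt = np.degrees(np.arcsin(v[2]))
    return in_plane, tilt

# Hypothetical tracked points (mm), e.g. from an AR framework:
print(needle_angles((0.0, 0.0, 0.0), (10.0, 80.0, 15.0)))
```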
  • Taku Yasumoto, Hiroya Shiomi, Koichi Yamada, Hakketsu Koh, Daisaku Tat ...
    2022 Volume 36 Issue 4 Pages 352-357
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Computed tomography (CT) fluoroscopy is widely used for various interventional radiology (IVR) procedures. To visualize the needle clearly on the axial images, needle insertion is often restricted in position and orientation. In this study, a novel navigation system for CT-guided IVR using multiple cameras was developed. In-house software was built with Qt and open-source libraries including OpenCV, OpenGL, and ArUco, a camera pose estimation library. 3D positions were computed from matrix-type 2D pattern markers, and multiple cameras were used so that tracking remained flexible and free of blind spots. Organ contours were delineated on pre-acquired CT images, and a DICOM-RT structure set was imported before the procedure so the 3D organ contours could be overlaid on the images. Multi-planar images with arbitrarily selected cross sections were reconstructed from the CT data, and the needle was drawn on them in real time. Because the positions of the patient’s body and the needle were computed in 3D coordinates, the system could render views from any direction, such as the operator’s viewpoint or the needle’s. By overlaying the organ contours, the organs could be shown in a “needle’s-eye view” to support safe and accurate operation. The usefulness of this novel navigation system is discussed. (A minimal marker-pose sketch follows this entry.)
    Download PDF (1868K)
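The marker tracking underlying this system can be pictured with OpenCV's ArUco module: detect a matrix-type 2D marker in a camera frame and solve its 3D pose relative to that camera. The sketch below assumes OpenCV ≥ 4.7's ArucoDetector API; the calibration values and marker size are placeholders, not parameters from the paper.

```python
import cv2
import numpy as np

# Placeholder intrinsics; a real system loads per-camera calibration data.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

MARKER_SIZE_MM = 40.0  # assumed side length of the printed marker
h = MARKER_SIZE_MM / 2.0
# 3D corners of the marker in its own coordinate frame (top-left first,
# clockwise), matching the corner order returned by the detector.
object_points = np.array([[-h,  h, 0.0], [ h,  h, 0.0],
                          [ h, -h, 0.0], [-h, -h, 0.0]])

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("camera_frame.png")  # stand-in for a live capture
corners, ids, _ = detector.detectMarkers(frame)

if ids is not None:
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        # Rotation/translation of the marker relative to this camera.
        ok, rvec, tvec = cv2.solvePnP(object_points,
                                      marker_corners.reshape(4, 2),
                                      camera_matrix, dist_coeffs)
        if ok:
            print(f"marker {marker_id}: position (mm) = {tvec.ravel()}")
```

With two or more calibrated cameras observing overlapping markers, such per-camera poses can be chained into a common room coordinate system, which is presumably how the multi-camera setup described here avoids dead areas.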
Medical Staff Corner