日本インターベンショナルラジオロジー学会雑誌 (Journal of the Japanese Society of Interventional Radiology)
Online ISSN : 2185-6451
Print ISSN : 1340-4520
ISSN-L : 1340-4520
Volume 36, Issue 4
Review / Special Feature
The Potential of Augmented Reality and Mixed Reality in IVR
  • 森田 賢
    2022 Volume 36 Issue 4 p. 325
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
  • 杉本 真樹, 末吉 巧弥
    2022 Volume 36 Issue 4 p. 326-334
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    In order to perform IVR safely and accurately, preoperatively acquired medical images of the individual patient are indispensable. However, even when the image data are reconstructed in 3D, a 2D display can only show a pseudo-3D effect produced by shading, transparency, and light reflection derived from the X, Y, and Z information. We have developed a system that supports accurate, safe, and reliable IVR by superimposing each patient's medical image data onto virtual and real space and presenting it in 3D, like a hologram, applying virtual reality (VR), augmented reality (AR), and mixed reality (MR), together termed extended reality (XR) technology.
    Image-assisted IVR using XR and the metaverse is a natural and intuitive method of image support with a high degree of freedom, allowing the practitioner to experience the depth and positional relationships of organs and pathological conditions that cannot be conveyed by a flat display, and to share that experience while moving their own hands. The ability to spatially observe and confirm the individual patient's anatomy, pathology, and the planned procedure in advance should make the practitioner more comfortable with the procedure, and contribute to shorter procedure times and accident prevention.
  • 成田 渉
    2022 Volume 36 Issue 4 p. 335-341
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Due to recent advances in computer-related technology, almost all medical images are now handled by computers, and the performance of the diagnostic and therapeutic devices that use them has come to depend on the software that controls them. In this context, a new revolution is taking place in the technology that feeds the results of computer computations into human vision. Virtual reality (VR), augmented reality (AR), and mixed reality (MR) are representative of these technologies, and their spread has been remarkable. The author has applied these technologies to spine surgery, developing a simulator for pedicle screw insertion and an image-assist device that allows the user to confirm the individual patient's anatomy during surgery. The simulator represents the operating table and three-dimensional anatomy in a virtual space, and the user can insert a screw while observing the necessary anatomy in any cross section and moving freely through the space. The intraoperative image support system allows the surgeon to set the screw trajectory before surgery and project that trajectory onto the surgical field. These techniques not only improved the surgeon's three-dimensional understanding of the anatomy and spatial awareness, but were also useful for medical education on surgical technique. In addition, online conferencing (the metaverse) using XR made it possible to share implant and bone data in a virtual space, using AR markers to share the avatar's position while performing surgery and conferencing with remote sites.
  • 米虫 敦, 髙島 章伍, 永井 淳, 碓井 太雄, 福田 将啓, 中谷 幸, 小野 泰之, 丸山 拓士, 狩谷 秀治, 宇都宮 啓太, 谷川 ...
    2022 Volume 36 Issue 4 p. 342-345
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Background: Virtual reality technology is advancing rapidly, and in recent years the technology for merging the virtual world with the real world (mixed reality, MR) has become available at the individual-developer level. The Microsoft HoloLens is a headset with a transparent display that creates MR by superimposing computer graphics on the real-world background. It is already being used in the medical field to display a 3D hologram of each organ's anatomical position near the surgical field as a reference image during surgery.
    For mixed reality (MR) in the field of interventional radiology, the 3D hologram should serve not merely as a reference image but as a direct guide for the actual procedure. The tolerable error in a real procedure is less than a few millimeters, so fusing the real and virtual worlds requires highly accurate superimposition. However, there is no clear reference point on the real patient when superimposing the 3D hologram, and accurate superimposition requires skill and know-how.
    Objective: To evaluate the accuracy of mixed reality (MR)-guided puncture.
    Method: A tungsten carbide sphere 1 mm in diameter was embedded in an EVA resin block to make a puncture phantom. The phantom was imaged by CT, and 3D hologram data were generated from the CT images. The holographic data were transferred to the MR headset (Microsoft HoloLens), and the phantom was punctured with an 18-G needle using the 3D holographic puncture guide. Fifteen punctures were performed. On post-puncture CT images, a radiologist blinded to the punctures measured the distance between the needle and the target on a workstation.
    Results: Mixed reality-guided puncture with a stereoscopic hologram was possible in 15/15 attempts (100%). The error relative to the puncture target was 4.1 ± 2.4 mm (range: 0–7 mm).
    Conclusion: As of 2018, mixed reality (MR) puncture guidance still needs to be improved in terms of accuracy.
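The accuracy analysis above reduces to computing, for each puncture, the 3D Euclidean distance between the needle tip and the target on the post-puncture CT, then summarizing as mean ± SD. A minimal sketch with hypothetical coordinates (illustrative values only, not the study's data):

```python
import math
from statistics import mean, stdev

def puncture_error_mm(tip, target):
    """3D Euclidean distance (mm) between needle tip and target centre,
    both given as (x, y, z) coordinates measured on the post-puncture CT."""
    return math.dist(tip, target)

# Hypothetical tip and target coordinates in mm (illustrative only).
target = (0.0, 0.0, 0.0)
tips = [(3.0, 0.0, 0.0), (0.0, 4.0, 3.0), (1.0, 1.0, 1.0)]

errors = [puncture_error_mm(t, target) for t in tips]
print(round(mean(errors), 2), round(stdev(errors), 2))  # 3.24 1.65
```

The study reports mean ± sample standard deviation over 15 punctures; the same two calls, applied to the 15 measured distances, give the "4.1 ± 2.4 mm" summary format.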
  • 森田 賢, 鈴木 一史, 山本 敬洋, 坂井 修二
    2022 Volume 36 Issue 4 p. 346-351
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Augmented reality (AR) superimposes digital information on the real world through a two-dimensional screen. Mixed reality (MR) further increases interactivity by anchoring movable digital three-dimensional (3D) objects in physical space using a head-mounted display. AR/MR has been used in various fields, including surgery, for navigation and simulation. For interventional radiology, there have been some promising reports of AR/MR needle guidance in phantom experiments. These approaches require pre-procedural image reconstruction to produce 3D objects that must be imported into the device in advance; image registration between the AR/MR device and the phantom is also required, using additional markers or specialized software. We developed an MR needle guidance application for HoloLens 2, "MR Puncture", which avoids pre-procedural image reconstruction and import by manually matching the spatial and MR coordinate systems. We also developed an iPhone app, "AR Puncture/AR Needle Guide", to assist in CT-guided biopsies. It measures the angle of the biopsy needle in a non-contact, real-time process and displays the angle in a coordinate system based on the CT scanner or the patient's body axis. We introduce our way of developing AR/MR applications and discuss the potential and current problems of AR/MR.
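Measuring a needle angle against a CT-based coordinate system, as the apps described above do, amounts to taking the angle between the needle shaft (defined by two tracked 3D points) and a reference axis. A minimal sketch, assuming the reference axis is the CT table's longitudinal axis (the apps' actual implementation is not described in the abstract):

```python
import math

def needle_angle_deg(hub, tip, axis=(0.0, 0.0, 1.0)):
    """Angle in degrees between the needle shaft, defined by two tracked
    3D points (hub and tip), and a reference axis -- here assumed to be
    the CT table's longitudinal (z) axis."""
    v = [t - h for h, t in zip(hub, tip)]
    norm_v = math.sqrt(sum(c * c for c in v))
    norm_a = math.sqrt(sum(c * c for c in axis))
    cos_theta = sum(vc * ac for vc, ac in zip(v, axis)) / (norm_v * norm_a)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

# A needle tilted 45 degrees from the z-axis in the y-z plane (illustrative).
print(round(needle_angle_deg((0, 0, 0), (0, 10, 10)), 1))  # 45.0
```

Switching the `axis` argument to a patient-body axis instead of the scanner axis reproduces the app's two display modes described above.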
  • 保本 卓, 塩見 浩也, 山田 広一, 黄 博傑, 辰己 大作, 呉 隆進
    2022 Volume 36 Issue 4 p. 352-357
    Published: 2022
    Released on J-STAGE: December 08, 2022
    JOURNAL RESTRICTED ACCESS
    Computed tomography (CT) fluoroscopy is widely used for various interventional radiology (IVR) procedures. To visualize the needle clearly on the axial images, needle insertion is often limited in position and orientation. In this study, a novel navigation system for CT-guided IVR using multiple cameras was developed. In-house software was built with Qt and open-source libraries including OpenCV, OpenGL, and ArUco, a camera pose estimation library. The 3D positions were analyzed from matrix-type 2D pattern markers, and multiple cameras were used to achieve flexible analysis without dead areas. Organ contours were delineated on pre-acquired CT images, and a DICOM-RT structure set was imported before the procedure to overlay the 3D organ contours on the images. Multi-planar images with arbitrarily selected cross-sections were reconstructed from the CT images, and the needle was successfully drawn on them in real time. Because the patient's body and the needle were computed in 3D coordinates, the system could render views from any direction, such as the operator's view or the view along the needle. By overlaying the organ contours, the organs were shown in this "needle's eye view" to ensure safe and accurate operation. The usefulness of this novel navigation system is discussed.
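Once the ArUco markers yield a camera pose, mapping a tracked needle or body point into the CT coordinate frame is a rigid transform (rotation plus translation). A minimal sketch of that step with hypothetical pose values (the system's actual code is not available; the rotation and translation below are illustrative):

```python
def transform_point(R, t, p):
    """Apply a rigid transform (3x3 rotation matrix R, translation t) to
    point p, e.g. mapping a needle-tip position from camera coordinates
    into the CT coordinate frame recovered from the markers."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                 for i in range(3))

# Hypothetical pose: 90-degree rotation about z plus a translation (mm).
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
t = [100.0, 50.0, 0.0]

print(transform_point(R, t, (10.0, 0.0, 5.0)))  # (100.0, 60.0, 5.0)
```

In a multi-camera setup like the one described, each camera contributes its own (R, t) estimate from the markers it sees, which is what removes the dead areas a single camera would have.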
Medical Staff Corner
Guidelines of the Japanese Society of Interventional Radiology
Next Issue Preview / Editor's Postscript / Colophon