Freehand three-dimensional ultrasound systems allow intra-operative imaging of pathological tissue for image-guided surgical procedures. Accurate registration of ultrasound images for surgical navigation requires a calibration procedure that is typically time-consuming and tedious. We present a novel automatic and robust calibration method that requires no phantom. In the proposed method, a needle equipped with an electromagnetic (EM) tracking sensor is moved in the vicinity of the imaging plane of a fixed ultrasound probe, which is also equipped with an EM tracking sensor. The ultrasound images and the physical coordinates of the tracker and the needle tip are recorded simultaneously. In each ultrasound image, the needle tip is recognized and its image coordinates are identified automatically. A point-based registration between the needle tip's image and physical coordinates is performed to estimate the calibration matrix, and RANSAC is applied to minimize the registration error. Experiments were performed to verify the accuracy of the new calibration method. The results show that the needle tip can be accurately recognized in ultrasound images and that the average fiducial registration error is 1.2 mm, making the method applicable to freehand ultrasound calibration.
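The point-based registration with RANSAC described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses the standard Kabsch least-squares rigid fit on corresponding point sets, and the minimal-set size, inlier threshold, and iteration count are assumptions.

```python
import numpy as np

def rigid_register(p, q):
    """Least-squares rigid transform (R, t) mapping point set p onto q (Kabsch)."""
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    H = (p - cp).T @ (q - cq)                      # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def ransac_register(p, q, n_iter=500, thresh=1.5, seed=None):
    """RANSAC wrapper: fit on random minimal sets (3 point pairs), keep the
    consensus with the most inliers, then refit on all inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(len(p), size=3, replace=False)
        R, t = rigid_register(p[idx], q[idx])
        err = np.linalg.norm(p @ R.T + t - q, axis=1)   # per-point residuals
        inliers = err < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    R, t = rigid_register(p[best_inliers], q[best_inliers])
    return R, t, best_inliers
```

In the calibration setting, `p` would hold the detected needle-tip positions in image coordinates (scaled to millimeters) and `q` the corresponding EM-tracked physical coordinates; the RANSAC step discards frames where the tip was mis-detected.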
Imaging modalities such as ultrasound, MRI, and CT are widely used for surgical planning and navigation. However, surgeons must observe the affected area while switching their gaze between the patient and the medical images on a monitor, which makes it difficult to grasp the surgical target position intuitively and to exploit the accurate positional information in the medical images. To solve these problems, we developed an image overlay system consisting of a tablet PC, which is increasingly widespread and easily available, a 3-D optical tracking system, an image-processing PC, and a wireless communication system. The image-processing PC constructs appropriate overlay images of a 3-D CG model from pre-acquired MRI (or CT) images using the tablet PC's position and orientation measured by the 3-D optical tracking system. The overlay images are transmitted from the image-processing PC to the tablet PC via the wireless communication system and are superimposed onto the video images of the surgical field on the tablet PC's monitor. In a phantom experiment, the maximum image overlay error was 1.0±0.5 mm, suggesting that the proposed system enables surgeons to plan the surgical approach and navigate to the surgical target intuitively and accurately.
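The core of such an overlay is a chain of coordinate transforms: model points are brought from the registered model frame into the tracked tablet-camera frame and projected through the camera intrinsics. The sketch below illustrates this pipeline under standard pinhole-camera assumptions; the function names, frame conventions, and intrinsics are hypothetical, not taken from the paper.

```python
import numpy as np

def to_hom(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def overlay_points(pts_model, T_world_model, T_world_tablet, K):
    """Project 3-D model points (model frame) into tablet-camera pixels.

    T_world_model  : model -> tracker-world transform (from patient registration)
    T_world_tablet : tablet-camera -> tracker-world transform (from optical tracking)
    K              : 3x3 pinhole intrinsics of the tablet camera
    """
    T_tablet_model = np.linalg.inv(T_world_tablet) @ T_world_model
    pts_h = np.c_[pts_model, np.ones(len(pts_model))]      # homogeneous coords
    cam = (T_tablet_model @ pts_h.T).T[:, :3]              # points in camera frame
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]                          # perspective divide
```

At each tracking update, `T_world_tablet` is refreshed and the projected 2-D points are drawn over the tablet's live video, so the CG model appears anchored to the patient.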
Introduction: Augmented reality is an innovative technology that superimposes virtual images onto the real world. It can visualize an affected lesion hidden behind normal structures. We introduced this computer-aided technique into arthroscopic surgery. The purpose of this study is to examine the usefulness of an augmented reality navigation system in arthroscopic surgery. Materials & Methods: The navigation system for arthroscopic surgery was built with a newly designed arthroscope for navigation and a modified point-to-point registration using a customized reference stay with optical markers. We designed 10 models of osteochondritis dissecans (OCD) knees with Sawbones®. 3-D OCD knee models were made from CT knee data sets using a 3-D reconstruction algorithm. CT was performed with skin fiducial markers. Registration was performed using the skin markers and a Polaris® optical tracking system. The accuracy of registration and of image overlay using the new devices was evaluated. Results: The mean registration accuracy using the new devices was 0.94±0.74 mm. The mean overlay accuracy was 3.51±2.64 mm on the 2-D monitor and 5.76±3.91 mm in 3-D space. Conclusion: Augmented reality would be useful in arthroscopic surgery, not only for treating OCD lesions but also for treating ACL injuries and for visualizing tumor sites and important vessels.
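The accuracies reported across these abstracts are mean ± standard deviation of per-fiducial (or per-target) residuals. A minimal sketch of that evaluation, assuming a rigid registration (R, t) and paired marker positions (the function names are illustrative, not from the papers):

```python
import numpy as np

def registration_errors(fid_source, fid_target, R, t):
    """Per-fiducial residual distances after applying the registration (R, t)."""
    residual = fid_target - (fid_source @ R.T + t)
    return np.linalg.norm(residual, axis=1)

def summarize(errors):
    """Mean ± sample SD string in the style of the reported accuracies."""
    return f"{np.mean(errors):.2f}±{np.std(errors, ddof=1):.2f} mm"
```

For example, residuals of 0.2, 0.9, and 1.7 mm over three markers would be reported as their mean and sample standard deviation, matching the "x.xx±y.yy mm" format used above.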