Title
Augmented Reality Guidance with Multimodality Imaging Data and Depth-Perceived Interaction for Robot-Assisted Surgery.
Abstract
Image-guided surgical procedures are challenged by single-modality imaging, two-dimensional anatomical guidance, and non-intuitive human-machine interaction. Introducing tablet-based augmented reality (AR) into surgical robots may help surgeons overcome these problems. In this paper, we proposed and developed a robot-assisted surgical system with interactive surgical guidance, using tablet-based AR with a Kinect sensor for three-dimensional (3D) localization of patient anatomical structures and intraoperative 3D surgical tool navigation. Depth data acquired from the Kinect sensor were visualized in cone-shaped layers for 3D AR-assisted navigation. Virtual visual cues generated by the tablet were overlaid on images of the surgical field for spatial reference. We evaluated the proposed system, and the experimental results showed that the tablet-based visual guidance system could assist surgeons in locating internal organs, with errors between 1.74 and 2.96 mm. We also demonstrated that the system can provide mobile augmented guidance and interaction for surgical tool navigation.
Year
2017
DOI
10.3390/robotics6020013
Venue
ROBOTICS
Keywords
image-guided surgery, augmented reality, augmented interaction, tablet computer, image registration
Field
Sensory cue, Visual guidance, Computer vision, Multimodality, Image-guided surgery, Augmented reality, Artificial intelligence, Anatomical structures, Engineering, Robot, Image registration
DocType
Journal
Volume
6
Issue
2
Citations
1
PageRank
0.37
References
10
Authors
3
Name            Order  Citations  PageRank
Rong Wen        1      33         4.73
Chin-Boon Chng  2      25         4.49
Chee-Kong Chui  3      245        38.34