A Multiparametric Magnetic Resonance Imaging-Based Virtual Reality Surgical System for Robotic-Assisted Laparoscopic Radical Prostatectomy: A Feasibility Study
S. Mehralivand1,2,3, A. Kolagunda4, C. Kambhamettu4, K. Hammerich5, K. Cobb6, V. Valera Romero2, J. Bloom2, G.
Pena Lagrave2, V. Sabarwal6, S. Gold2, G. Hale2, K. Rayn2, M. Czarniecki3, B. Wood7, P. Choyke3, B. Turkbey3, P.
1Universitätsmedizin der Johannes Gutenberg Universität, Urologische Klinik und Poliklinik, Mainz, Germany, 2National Cancer Institute (NCI), National Institutes of Health (NIH), Urologic Oncology Branch, Bethesda, United States of America, 3National Cancer Institute (NCI), National Institutes of Health (NIH), Molecular Imaging Program, Bethesda, United States of America, 4University of Delaware, Department of Computer and Information Sciences, Newark, United States of America, 5Bristol Hospital Multi-Specialty Group, Bristol, United States of America, 6George Washington University, Department of Urology, Washington, D.C., United States of America, 7National Cancer Institute (NCI), National Institutes of Health (NIH), Center for Interventional Oncology, Bethesda, United States of America
Introduction Robotic-assisted radical prostatectomy (RARP) has become an increasingly common approach for treating prostate cancer (PCa). Multiparametric magnetic resonance imaging (mpMRI) is increasingly used in the pre-operative stage. In this study, we evaluated the use of mpMRI data and intraoperative stereo captures to create 3D virtual reality (VR) models for surgical guidance during RARP.
Material and Methods The prostate, bladder, rectum, neurovascular bundles (NVB), seminal vesicles, urethra, and PCa lesions were manually contoured on pre-operative T2-weighted MRI. The contours were then used to create 3D mesh models for a commercially available VR platform with a head-mounted display (HMD) and touch controllers. During RARP, stereo images of the laparoscopic views were extracted to create live 3D mesh models. These models were then aligned with the MRI models by an automated alignment algorithm. When needed, surgeons withdrew from the console and interacted with the 3D models using the HMD. All surgeons (1 experienced urologic surgeon, 3 clinical fellows, 3 urology residents) were surveyed on the usability of the system.
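The abstract does not specify the automated alignment algorithm used to register the intraoperative stereo-derived meshes to the pre-operative MRI meshes. A common building block for such mesh-to-mesh registration is rigid alignment of corresponding vertex sets, e.g. via the Kabsch algorithm. The sketch below is purely illustrative (the function name `rigid_align` and the synthetic vertex data are ours, not from the study) and assumes point correspondences are already known; a full pipeline such as the one described would typically embed this inside an iterative scheme like ICP.

```python
import numpy as np

def rigid_align(source, target):
    """Kabsch algorithm: estimate rotation R and translation t that map
    `source` points onto corresponding `target` points (N x 3 arrays)."""
    src_c = source.mean(axis=0)          # centroid of source vertices
    tgt_c = target.mean(axis=0)          # centroid of target vertices
    H = (source - src_c).T @ (target - tgt_c)  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Demo on synthetic mesh vertices: recover a known pose.
rng = np.random.default_rng(0)
src = rng.standard_normal((100, 3))
angle = np.pi / 6
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 1.0])
tgt = src @ R_true.T + t_true
R, t = rigid_align(src, tgt)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # → True True
```

In practice the correspondences between laparoscopic stereo reconstructions and MRI-derived meshes are unknown and the prostate deforms, so the actual system likely combines correspondence estimation (and possibly non-rigid refinement) with a rigid step of this kind.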
Results A video introduction is attached. The VR system was used during 9 RARPs. The time from stereo image capture to full availability of the aligned model was approximately 3 minutes. The VR system was utilized at 4 surgical steps; it was used for about 1 minute at each step and did not interfere with the routine workflow. All surgeons found the system useful for improving spatial awareness. The main criticism was the need to switch between the console and the HMD.
Conclusion Our study demonstrates the feasibility of interactive visualization of 3D mpMRI data during RARP. Future goals include using the robot's stereo image viewer instead of the HMD and improving the MRI-stereo image alignment.