UBC Theses and Dissertations
Augmented reality guidance for robot-assisted laparoscopic surgery
Kalia, Megha
Abstract
The most common treatment for organ-confined prostate cancer is radical prostatectomy (RP), in which the cancerous prostate is surgically excised. Today this is most often performed as robot-assisted laparoscopic radical prostatectomy (RALRP) with the da Vinci system, which provides improved dexterity and significantly faster patient recovery times. However, as in open surgery, RALRP has a high rate of reported positive surgical margins. This is potentially due to the surgeon's effort to remove the cancer while preserving healthy tissue when the cancerous and non-cancerous boundaries are indistinguishable in the endoscopic view. The objective of this thesis is therefore to clearly display these boundaries and tumors using Augmented/Mixed Reality (AR/MR) technology, drawing on imaging data such as magnetic resonance imaging and ultrasound (US). Successful intra-operative AR/MR depends primarily on two things: first, surgically compatible calibration steps that map the imaging data correctly into the camera image; second, visualization of the co-located data that conveys reliable depth of subsurface structures, which is imperative for patient safety. Unlike existing methods, our first work proposes a method that performs the required hand-eye and camera calibrations without using external markers during surgery. To further streamline the process, another work uses an optimization scheme that combines both calibrations in a single step, allowing the robotic data to be registered to the camera within minutes. Additionally, we present an evaluation of a full AR system that registers phantom US to the camera image. Next, we address the well-known problem of occlusion when visualizing the overlaid imaging data: our deep-learning-based method segments surgical instruments in human RALRP videos without using any labelled data. In two further works we explore color and motion parallax as depth cues to provide reliable depth judgement.
The usefulness of these methods is validated through user studies showing significantly better depth perception. In conclusion, this thesis presents multiple methods to make AR/MR guidance feasible for RALRP by addressing two pressing challenges in surgical AR: surgically compatible calibration and reliable visualization of the registered data.
Item Metadata
Title |
Augmented reality guidance for robot-assisted laparoscopic surgery
|
Creator | |
Supervisor | |
Publisher |
University of British Columbia
|
Date Issued |
2023
|
Description | |
Genre | |
Type | |
Language |
eng
|
Date Available |
2023-05-29
|
Provider |
Vancouver : University of British Columbia Library
|
Rights |
Attribution-NonCommercial-NoDerivatives 4.0 International
|
DOI |
10.14288/1.0432737
|
URI | |
Degree | |
Program | |
Affiliation | |
Degree Grantor |
University of British Columbia
|
Graduation Date |
2023-11
|
Campus | |
Scholarly Level |
Graduate
|
Rights URI | |
Aggregated Source Repository |
DSpace
|