Image-based Guidance for Prostate Interventions

by

Siavash Khallaghi

B.Sc., Amirkabir University of Technology, 2008
M.Sc., Queen's University, 2010

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY in The Faculty of Graduate and Postdoctoral Studies (Electrical and Computer Engineering)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

October 2015

© Siavash Khallaghi 2015

Abstract

Prostate biopsy is the gold standard for cancer diagnosis. The procedure is guided using a 2D transrectal ultrasound (TRUS) probe. Unfortunately, early stage tumors are not visible in ultrasound, and prostate motion and deformation make targeting challenging. This results in a high number of false negatives, and patients are often required to repeat the procedure.

Fusion of magnetic resonance images (MRI) into the workspace of a prostate biopsy has the potential to detect tumors invisible in TRUS. This allows the radiologist to better target early stage cancerous lesions. However, due to different body positions and imaging settings, the prostate undergoes motion and deformation between the biopsy coordinate system and the MRI. Furthermore, due to variable probe pressure, the prostate moves and deforms during the biopsy as well. This introduces additional targeting errors. A biopsy system that compensates for these sources of error has the potential to improve targeting accuracy and maintain a 3D record of biopsy locations.

The goal of this thesis is to provide the necessary tools to perform freehand MR-TRUS fusion for prostate biopsy using a 3D guidance system. To this end, we have developed two novel surface-based registration methods for incorporating the MRI into the biopsy workspace. The proposed methods are the first for MR-TRUS fusion that are robust to missing surface regions (up to 30% missing surface points).
We have validated these fusion techniques on 19 biopsy, 10 prostatectomy and 11 brachytherapy patients.

In this thesis, we have also developed methods that combine intensity-based information with biomechanical constraints to compensate for prostate motion and deformation during the biopsy. To this end, we have developed a novel 2D-3D registration framework, which was validated on an additional 10 biopsy patients. Our results suggest that accurate 2D-3D registration for freehand biopsy is feasible.

The results presented suggest that accurate registration of MR and TRUS data in the presence of partially missing data is feasible. Moreover, we demonstrate that, in the presence of variable probe pressure during freehand biopsy, a combination of intensity-based and biomechanically constrained 2D-3D registration can enable accurate alignment of pre-procedure TRUS with real-time 2D TRUS images.

Preface

This thesis is primarily based on five publications, resulting from collaboration between multiple researchers. All publications have been modified to make the thesis coherent. Ethics approval for conducting this research has been provided by the Clinical Research Ethics Board, certificate number H11-01789.

The study in Chapter 2 has been accepted for publication in:

• Siavash Khallaghi, C. Antonio Sánchez, Abtin Rasoulian, Yue Sun, Farhad Imani, Amir Khojaste Galesh Khale, Orcun Goksel, Cesare Romagnoli, Hamidreza Abdi, Silvia Chang, Parvin Mousavi, Aaron Fenster, Aaron Ward, Sidney Fels and Purang Abolmaesumi, "Biomechanically Constrained Surface Registration: Application to MR-TRUS Fusion for Prostate Interventions", IEEE Transactions on Medical Imaging, 2015.

The contribution of the author was in developing, implementing and evaluating the method. C. Antonio Sánchez contributed to the implementation, biomechanical modeling and mathematical derivation of the registration method. Yue Sun, Dr. Imani and Amir Khojaste Galesh Khale helped with the data collection. Drs.
Romagnoli, Abdi and Chang (clinical collaborators) helped with data collection and fiducial identification to validate the registration method. All co-authors contributed to the editing of the manuscript.

The study in Chapter 3 has been published in:

• Andriy Fedorov, Siavash Khallaghi, C. Antonio Sánchez, Andras Lasso, Sidney Fels, Kemal Tuncali, Emily Neubauer Sugar, Tina Kapur, Chenxi Zhang, William Wells, Paul Nguyen, Purang Abolmaesumi and Clare Tempany, "Open-source Image Registration for MRI-TRUS Fusion Guided Prostate Interventions", International Journal of Computer Assisted Radiology and Surgery, Springer, Berlin, Heidelberg, pp. 1-10, 2015.

This chapter is the open-source release of the method published in Chapter 2 and entails further validation of our method on 11 brachytherapy patients. Dr. Fedorov developed and implemented one of the two registration methods used in this manuscript, designed the study and conducted the experiments. The contribution of the author was in providing the second registration method, writing parts of the manuscript and editing. Dr. Tuncali performed the clinical study, collected the data and selected fiducials for quantitative validation of the registration methods. All co-authors contributed to the editing of the manuscript.

The study in Chapter 4 has been accepted for publication in:

• Siavash Khallaghi, C. Antonio Sánchez, Abtin Rasoulian, Saman Nouranian, Cesare Romagnoli, Hamidreza Abdi, Silvia D. Chang, Peter C. Black, Larry Goldenberg, William J. Morris, Ingrid Spadinger, Aaron Fenster, Aaron Ward, Sidney Fels and Purang Abolmaesumi, "Statistical Biomechanical Surface Registration: Application to MR-TRUS Fusion for Prostate Interventions", IEEE Transactions on Medical Imaging, 2015.

The contribution of the author was in developing, implementing and evaluating the method. C. Antonio Sánchez contributed to the implementation and biomechanical modeling.
Saman Nouranian collected the brachytherapy dataset in collaboration with Drs. Morris and Spadinger. Dr. Rasoulian provided the code for the construction of the statistical shape model. Drs. Romagnoli, Abdi and Chang (clinical collaborators) helped with data collection and fiducial identification to validate the registration method. All co-authors contributed to the editing of the manuscript.

The study in Chapter 5 has been accepted for publication in:

• Siavash Khallaghi, C. Antonio Sánchez, Saman Nouranian, Samira Sojoudi, Lindsay Machan, Allison Harris, Silvia Chang, Peter Black, Martin Gleave, Larry Goldenberg, Sidney Fels and Purang Abolmaesumi, "A 2D-3D Registration Framework for Freehand TRUS-Guided Prostate Biopsy", Medical Image Computing and Computer Assisted Interventions, Springer, Berlin, Heidelberg, 2015.

The contribution of the author was in developing the method, collecting data, designing experiments and writing the manuscript. C. Antonio Sánchez helped with biomechanical modeling and mathematical derivation of the method. Saman Nouranian and Samira Sojoudi contributed to software development and data collection. Drs. Machan, Harris, Chang, Black, Gleave and Goldenberg were the clinicians involved in data collection. All co-authors contributed to the editing of the manuscript.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
List of Abbreviations
Acknowledgments

1 Introduction
  1.1 Anatomy and Histology of the Prostate
  1.2 Epidemiology of Prostate Cancer
  1.3 PCa Screening
    1.3.1 Digital Rectal Examination
    1.3.2 Prostate Specific Antigen
  1.4 PCa Imaging
    1.4.1 X-Ray and Computed Tomography
    1.4.2 Magnetic Resonance
    1.4.3 Ultrasound Examination of the Prostate
  1.5 PCa Diagnosis
  1.6 3D Prostate Biopsy Systems
    1.6.1 In-bore MR
    1.6.2 MR-TRUS Fusion
    1.6.3 Commercial 3D-Biopsy Systems
  1.7 Required Targeting Accuracy for Prostate Biopsy Systems
  1.8 Objective
  1.9 Contributions
  1.10 Thesis Outline

2 Biomechanically Constrained Surface Registration
  2.1 Introduction
    2.1.1 Surface-based Registration
    2.1.2 Contributions
  2.2 Method
    2.2.1 GMM-FEM Registration
    2.2.2 Related Registration Methods Used for Comparison
  2.3 Experiments and Results
    2.3.1 Data
    2.3.2 Registration to Full and Partial Surfaces
    2.3.3 Sensitivity to Biomechanical Parameters
    2.3.4 Sensitivity to Number of Surface Points
    2.3.5 Registration Time and Computational Complexity
  2.4 Discussion and Conclusions

3 Open-source Image Registration for MRI-TRUS Fusion
  3.1 Introduction
    3.1.1 Contributions
  3.2 Methods
    3.2.1 Image Acquisition and Pre-processing
    3.2.2 Registration of Signed Distance Maps with B-spline Regularization
    3.2.3 GMM-FEM Registration
    3.2.4 Evaluation Setup
  3.3 Results
  3.4 Discussion
  3.5 Conclusions

4 Statistical Biomechanical Surface Registration
  4.1 Introduction
    4.1.1 Related Work
    4.1.2 Contributions
  4.2 Method
    4.2.1 Clinical Workflow
    4.2.2 Partial Surface to Partial Surface Registration
    4.2.3 Probabilistic SSM-FEM
    4.2.4 Registration Methods Used for Comparison
  4.3 Experiments and Results
    4.3.1 Data
    4.3.2 Registration of Full and Partial Surfaces
    4.3.3 Sensitivity to Errors in Segmentation
    4.3.4 Sensitivity to Missing Surface Points
    4.3.5 Sensitivity to Regularization Parameters
  4.4 Discussion and Conclusions
5 2D-3D Registration for Freehand Prostate Biopsies
  5.1 Introduction
    5.1.1 Related Work
  5.2 Methods
    5.2.1 Data Acquisition
    5.2.2 2D-3D Registration
  5.3 Experiments and Results
  5.4 Discussion and Conclusions

6 Conclusions and Future Work
  6.1 Contributions
  6.2 Future Work

Bibliography

Appendices
  A Derivation of Shape Parameters
  B Derivation of FE Nodal Displacements
  C Derivative of the Interpolation Matrix

List of Tables

1.1 Commercial 3D-TRUS biopsy systems
2.1 Mathematical notations
2.2 The Dice coefficient for prostatectomy and biopsy data following affine/rigid and non-rigid registration
2.3 TRE for prostatectomy and biopsy data following affine/rigid and non-rigid registration
2.4 P-values from Wilcoxon signed-rank tests between different registration methods
2.5 P-values from Wilcoxon signed-rank tests between different registration methods for full vs. partial data
2.6 Time (seconds) required for each component of GMM-FEM registration reported across all patients
3.1 Volumes of the prostate gland segmentation in MRI and US
3.2 Summary statistics of the initial TRE
4.1 Mathematical notations
4.2 TRE for registration algorithms
4.3 TRE at base, mid-gland and apex for SSM-FEM registration for full and partial data
4.4 Wilcoxon signed-rank tests between methods
4.5 Total registration time
5.1 Registration results and p-values

List of Figures

1.1 Prostate anatomy [65]
1.2 Prostate zones: peripheral, central and transitional [65]
1.3 TRUS-guided prostate biopsy [76]
1.4 Sagittal view of the 3D reconstructed volume
2.1 GMM-FEM registration workflow for MR-TRUS fusion
2.2 Prostatectomy data collection protocol
2.3 An example of prostate biopsy data
2.4 GMM-FEM registration for prostatectomy data
2.5 Typical results for a prostatectomy dataset
2.6 GMM-FEM registration for prostate biopsy data
2.7 Typical results for a prostate biopsy dataset
2.8 Examples of fiducial pairs in prostatectomy images
2.9 Robustness of GMM-FEM registration for different regularization weights
2.10 Robustness of GMM-FEM registration for different surface resolutions
3.1 Example of the registration result for case 10 using the BWH method
3.2 Visualization of the deformation field for a typical patient
3.3 Example of the surface registration result using the GMM-FEM method
3.4 Summary of the TREs for the datasets used in the evaluation
4.1 Overview of the MR-TRUS fusion workflow using SSM-FEM registration
4.2 SSM modes of variation
4.3 A visual representation of fine statistical sampling
4.4 Axial slices from MR and TRUS with their corresponding segmentations
4.5 An example of SSM-FEM registration to full surfaces
4.6 An example of SSM-FEM registration to partial surfaces
4.7 Examples of fiducial pairs in MR and TRUS images
4.8 The effect of Gaussian noise on TRE
4.9 Distribution of robustness in the presence of missing data
4.10 Robustness of SSM-FEM registration for different regularization weights
5.1 Clinical workflow for freehand TRUS-guided prostate biopsy
5.2 Example of a calcification and a cyst on the slice and volume
5.3 Typical 2D-3D registration results
List of Abbreviations

2D       2-dimensional
3D       3-dimensional
AUA      American Urological Association
AUC      Area Under Curve
BPH      Benign Prostatic Hyperplasia
BWH      Brigham and Women's Hospital
CPD      Coherent Point Drift
CT       Computed Tomography
DCE      Dynamic Contrast Enhanced
DRE      Digital Rectal Exam
DWI      Diffusion Weighted Imaging
EM       Expectation Maximization
ESUR     European Society of Urogenital Radiology
FE       Finite Element
FEM      FE Model
FLE      Fiducial Localization Error
GMM      Gaussian Mixture Model
Gd       Gadolinium
GPU      Graphics Processing Unit
HARDI    High Angular Resolution Diffusion Imaging
ICP      Iterative Closest Point
ICP-FEM  ICP-based FEM
IGI      Image-Guided Intervention
ITK      Insight Toolkit
MD       Mean Diffusivity
MR       Magnetic Resonance
MRI      Magnetic Resonance Imaging
mp-MR    Multi-Parametric MR
O2D      Orthogonal 2D
PCa      Prostate Cancer
Pre-op   Pre-operative
PSA      Prostate Specific Antigen
RAS      Right-Anterior-Superior
RF       Radio Frequency
SDM      Statistical Deformation Model
SSM      Statistical Shape Model
SSD      Sum-of-Squared Differences
TPS      Thin-Plate Spline
TPS-RPM  TPS Robust Point Matching
TRE      Target Registration Error
TRUS     Transrectal Ultrasound
UA       Urethra at Apex
UB       Urethra at Base
UBC      University of British Columbia
US       Ultrasound
VM       Verumontanum

Acknowledgments

First and foremost, I would like to express my gratitude to my advisor, Dr. Purang Abolmaesumi, for his support and guidance throughout my PhD. He is not just an amazing researcher, he is a wonderful person. He genuinely cares about his students and treats them like family. I am eternally grateful to him for his positive role in my professional and personal life. I would also like to thank Drs. Fels, Goksel, Rohling, Mousavi, Fenster and Ward, for their generous advice and helpful discussions.

I would like to thank our clinical collaborators, Drs. Machan, Chang, Harris, Abdi, Black, Gleave and Goldenberg, without whom this thesis would not have been feasible.

I would also like to express my gratitude to Antonio for sharing his knowledge of biomechanical modeling.
He is not only a competent researcher, with a solid mathematical knowledge, but also a great friend.

Special thanks to Saman, Samira and Nika. You feel more like family than friends or colleagues.

I am also thankful to current and former members of the RCL at UBC (when you have been in the same group for five years, the list tends to become long!): Jeff, Troy, Raoul, Kyle, Weiqi, Greg, Omid, Shekoofeh, Julio, Hussam, Yasmin, Parmida, Angelica, Ali, Arthur, Farhad, Narges, Mohammad (Honarvar and Najafi), Alex, Caitlin, Tom, Abtin, Hamid, Phillip, Julie, Delaram, Hedyeh, Mahdi, Golnoosh, and last but not least, Guy.

I would like to thank my loving parents, Ali and Leili. Without your love and support, I would never have had the good fortune of being where I am today.

Finally, I would like to thank Elmira for our ten days together in Vancouver: "So it was that he saw her whom few mortals had yet seen; Arwen, daughter of Elrond, in whom it was said that the likeness of Lúthien had come on earth again; and she was called Undómiel, for she was the Evenstar of her people." -The Fellowship of the Ring, J. R. R. Tolkien

Chapter 1
Introduction

1.1 Anatomy and Histology of the Prostate

Figure 1.1: Prostate anatomy [65].

The prostate is a walnut-shaped organ that completely surrounds the urethra and is part of the male reproductive system (Figure 1.1). The physiological function of the prostate is the production of seminal fluid, which combines with sperm from the testes. Since seminal fluid is alkaline, it neutralizes the acidity of the vaginal tract, prolonging the lifespan of sperm. The prostate gland is composed of fibromuscular and glandular tissues, and contracts during ejaculation in order to secrete. It is oriented with its base angled posteriorly toward the bladder neck and its apex angled anteriorly in continuity with the urethra. The prostate is fixed to the pubic bone via puboprostatic ligaments and sits anterior to the rectum.
The prostate is composed of three primary regions along the urethra. The superior region proximal to the bladder neck, the inferior region near the urogenital diaphragm, and the region in between are referred to as the base, apex and mid-gland, respectively.

Figure 1.2: Prostate zones: peripheral, central and transitional [65].

Histologically, the prostate is composed of three primary tissue regions: the peripheral, central and transitional zones (Figure 1.2). The peripheral region is the largest zone and encompasses the posterior of the gland. The central zone surrounds the ejaculatory ducts, near the base. The transitional zone contains the remainder of the prostate and encapsulates the proximal urethra.

1.2 Epidemiology of Prostate Cancer

Prostate Cancer (PCa) is the most commonly diagnosed noncutaneous cancer in Canadian men, with 1 in 8 males expected to be diagnosed with PCa in their lifetime. It is the leading cancer for Canadian males, with 23,600 (24% of all) new cases expected in 2014 [14]. The risk of developing PCa increases with age: beginning at 8% for men in their 20s, until 80% of men in their 70s harbour invasive PCa [69]. Of the 40,000 expected cancer deaths in 2014, PCa is the third most common cause of mortality (10%) in males [14]. Despite its prevalence, over the past 40 years, early detection has decreased mortality rates for PCa patients without substantial changes in surgical or radiation treatment strategies [66, 128]. When PCa is diagnosed at an early and localized stage, it can be managed with hormone therapy, radiation treatment or surgical resection, to improve the patient's survival and quality of life [115].

1.3 PCa Screening

1.3.1 Digital Rectal Examination

In a digital rectal examination (DRE), the urologist checks for growths, enlargements or suspicious hard areas by palpating the prostate through the rectum.
Since 80-85% of all cancers arise in the peripheral zone [96], which is adjacent to the rectum, about 18% of patients with PCa can be diagnosed using a DRE [152]. Due to its lack of sensitivity to early stage cancer, the American Urological Association (AUA) does not recommend the DRE for first-line screening [24]. However, this test can be used in conjunction with other diagnostic tools when screening for PCa [24].

1.3.2 Prostate Specific Antigen

The prostate specific antigen (PSA) is a protein produced by the prostate gland which can be used as a biological marker for tumors. While the antigen is present in small quantities in the serum of men with healthy prostates, elevated levels often indicate the presence of PCa. However, while PSA screening is sensitive to PCa, it lacks specificity: only a quarter of men with high PSA actually have PCa [5]. Some men have naturally elevated PSA levels, and the antigen also rises in the presence of clinically insignificant tumors and benign prostatic hyperplasia (BPH), a benign growth of the prostate [123]. As a result, urologists evaluate PSA levels with caution to confirm or reject the cancer hypothesis.

1.4 PCa Imaging

1.4.1 X-Ray and Computed Tomography

X-ray and computed tomography (CT) have no diagnostic value in PCa, as separation from surrounding muscle is poor and the anatomy interior to the prostate is not well defined. The primary role of CT is in the detection of metastases and in planning radiation-based therapies in patients with confirmed PCa [76].

1.4.2 Magnetic Resonance

Magnetic resonance (MR) imaging provides excellent soft tissue contrast. It has shown promise as a local staging modality for identifying and localizing potential PCa lesions within the prostate [78, 99, 145]. Due to low contrast, the basic T1-weighted pulse sequence has limited use for PCa imaging. T2-weighted images, however, can clearly differentiate prostate zonal anatomy [15].
In these images, healthy tissue in the peripheral zone appears bright, whereas cancerous regions appear dark. The shortcoming of T2-weighted imaging is that the signal patterns of BPH and prostatitis (infection of the prostate) mimic malignancies in the transitional and peripheral zones, respectively [93]. Due to these factors, PCa detection using T2-weighted images alone is challenging. T2-weighted MR is therefore often used in conjunction with other MR modalities to improve detection rates [50, 120, 150].

Diffusion weighted imaging (DWI) is a functional imaging technique used to map and characterize the three-dimensional diffusion of water as a function of spatial location. Diffusion is typically formulated as a 2nd-order tensor, which describes the magnitude, anisotropy and orientation of fluid movement. While this formulation is the first and most popular, its shortcomings are known in regions that contain fiber crossings. High angular resolution diffusion imaging (HARDI) techniques overcome this limitation using new reconstruction techniques based on Q-ball imaging [148] and higher-order tensors [106].

Regardless of the reconstruction method used in DWI, this imaging technique is commonly used to calculate the mean diffusivity (MD), which characterizes the average random motion of hydrogen nuclei within the prostate. Healthy glandular tissue typically exhibits a high MD, indicating relatively unconstrained fluid movement within the glandular structures. PCa, however, tends to destroy these structures and densely pack the tissue, which restricts diffusion. As a result, PCa appears darker than the surrounding healthy tissue on MD maps [80]. However, since BPH and prostatitis also exhibit lower MD values, DWI is best used in combination with other MR modalities [77].

Dynamic contrast enhanced (DCE) imaging measures blood flow through the acquisition of serial T1-weighted images before, during and after the administration of a gadolinium (Gd) contrast agent.
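The mean diffusivity described above is simply the average of the diffusion tensor's three eigenvalues, i.e. one third of its trace. A minimal sketch (the function name and tensor values are illustrative only, chosen to mirror the high-MD/low-MD contrast discussed; they are not clinical values):

```python
import numpy as np

def mean_diffusivity(D):
    """MD of a 3x3 diffusion tensor: the mean of its eigenvalues,
    which equals trace(D) / 3."""
    return np.trace(np.asarray(D, dtype=float)) / 3.0

# Made-up tensors in units of 10^-3 mm^2/s:
healthy = np.diag([1.8, 1.6, 1.7])  # relatively free water movement -> high MD
tumor = np.diag([0.9, 0.8, 0.7])    # restricted diffusion -> lower MD (darker on MD maps)

assert mean_diffusivity(healthy) > mean_diffusivity(tumor)
```

Because the trace is invariant under rotation, MD does not depend on the tensor's orientation, only on its eigenvalues.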
Since PCa tumors are vascular, the presence of asymmetrical contrast uptake followed by a rapid wash-out is indicative of cancer [44].

The combination of T2-weighted images with functional imaging techniques is promising for high-sensitivity, high-specificity PCa staging. The collective information provided by multiple MR techniques is known as multi-parametric MR (mpMR). The recent European Society of Urogenital Radiology (ESUR) report [4] recommends the use of T2-weighted imaging and at least two functional acquisitions for screening patients suspected of PCa. However, the reported sensitivity and specificity of mpMR vary widely across the literature, with area under curve (AUC) values ranging between 0.66-0.90 for different combinations of T2, DCE and DWI [46, 75, 151]. As a result, as of yet, mpMR cannot be used for the definitive diagnosis of PCa. However, it can be used to identify suspicious cancerous lesions to be targeted during a biopsy session (see Section 1.5).

1.4.3 Ultrasound Examination of the Prostate

Since the prostate is anterior to the rectum, it can be imaged using a transrectal ultrasound (TRUS) probe through the rectal wall. There are three main types of TRUS probes used in the clinic: 1) end-firing; 2) side-firing; and 3) 3D probes. End-firing and side-firing probes are commonly used in clinics for live guidance in the diagnosis and treatment of PCa. 3D probes come with enhanced functionality: they can simultaneously acquire both axial and sagittal imaging planes, and they can also generate a full 3D volume using a motorized angular sweep. However, 3D-TRUS image formation is lengthier than 2D-TRUS, so the frame rate of these probes is typically lower than that of their 2D counterparts.

In the peripheral zone, PCa is typically considered to appear darker than the medium-echogenicity background in TRUS [137]. In the transitional zone, PCa does not show a specific echo pattern.
Unfortunately, PCa detection is challenging using TRUS alone. TRUS-based screening has been reported with low sensitivity (35-91%) and specificity (24-81%) values for PCa detection [34, 53, 83, 104, 127].

1.5 PCa Diagnosis

Figure 1.3: TRUS-guided prostate biopsy [76].

Histopathological analysis of tissue samples, harvested using biopsy, is considered the gold standard for PCa diagnosis. The biopsy is an out-patient procedure, performed using an end-firing 2D-TRUS probe to manually image the prostate and guide the biopsy needle through the rectal wall (Figure 1.3). The current sextant protocol entails sampling of the prostate by collecting eight to 12 cores, depending on its volume, with additional cores at suspicious regions. The biopsy is done systematically, i.e., the cores are taken from predefined anatomical zones and are not tailored to the patient's anatomy. Unfortunately, it is estimated that the current biopsy regimen suffers from a high (30%) false-negative rate [107]. Using more samples may increase cancer yield; however, there is a clinical trade-off between the number of cores, pain, risk of infection and procedure time/cost, which makes 12 biopsy cores the plateau zone of cancer detection [33].

The low sensitivity of prostate biopsy is mainly due to the isoechogenic nature of small tumors. It is estimated that 40-70% of PCa tumors are invisible in TRUS, and therefore may not be sampled during the biopsy session [121]. Furthermore, 2D-TRUS images do not provide a clear spatial context of the probe with respect to the prostate, and as a result, their interpretation requires a certain level of expertise for accurate needle placement. In addition, the prostate moves and deforms during the procedure, which requires the radiologist to mentally adjust for tracking errors during systematic sampling.
The cumulative effect of occult tumors and targeting errors contributes to the high number of false negatives for prostate biopsies.

When there is strong suspicion of cancer, patients with elevated PSA levels and a negative biopsy are frequently required to repeat the procedure. However, since the initial procedure was performed under 2D guidance, only a coarse record of previous biopsy locations is available for the re-biopsy. In the absence of 3D information, the radiologist is unable to maximize the cancer yield by avoiding areas that have been shown to be cancer-free and by re-biopsying regions prone to develop invasive cancer.

Information from prior biopsy procedures or from other 3D imaging modalities, such as MR, could be used to target suspicious areas directly. This information is best presented to the radiologist through a 3D-biopsy system that enables the radiologist to plan and record biopsy locations in 3D. Such an approach aims to combine image-specific information for improved guidance within the same systematic biopsy regimen.

1.6 3D Prostate Biopsy Systems

1.6.1 In-bore MR

In-bore systems allow the interventional radiologist to acquire MR images directly during the biopsy procedure. The architecture of the MR imaging system is either closed [35, 79, 142] or open bore [27, 31, 54]. There is a trade-off between the strength of the magnetic field and the interventional workspace for closed vs. open bore systems. Closed bore systems produce higher quality images for biopsy guidance due to the stronger magnetic field; however, the interventional workspace is much more confined. Open bore systems provide a larger dexterous workspace than closed bore systems, at the expense of lower quality images.

The clinical workflow for in-bore biopsy systems entails acquiring a diagnostic MR image, which is used to identify and delineate suspicious regions to be targeted during the biopsy.
These potential targets are mapped to the interventional space by registering the diagnostic MR to a 3D MR image, acquired just before the start of the procedure. Throughout the biopsy, intra-procedure MR slices are used to ensure correct needle placement.

A comparative study of in-bore systems has shown an overall improvement of 88% vs. 55% over systematic TRUS-guided biopsy [52]. However, in-bore systems are unlikely to replace TRUS-based biopsy. Several key disadvantages make an in-bore approach infeasible as the standard-of-care for PCa diagnosis. Since MR image formation is lengthy, the biopsy takes longer compared to the conventional TRUS-based procedure. As a result, patients are often sedated using general anesthesia to avoid involuntary motion due to discomfort. Considering the annual number of prostate biopsies, lengthy procedures and longer patient recovery time impose a huge cost burden on the healthcare system. Furthermore, using MR for image guidance requires new equipment, renovating biopsy rooms and retraining, which makes integration difficult for most clinics.

1.6.2 MR-TRUS Fusion

A more prudent approach to allow integration of MR in the procedure room involves the registration of the diagnostic MR to real-time TRUS images, thus exploiting the advantages of each modality. Typically, the clinical workflow of MR-TRUS fusion systems involves the following steps. First, a diagnostic MR is acquired prior to the biopsy session. The radiologist studies the MR image to identify potential targets for biopsy.

Table 1.1: Commercial 3D-TRUS biopsy systems.

System                       Probe  Tracking     3D-TRUS Generation        Fusion         Registration
Uronav (In Vivo, USA)        2D     Magnetic     Freehand Axial Sweep      Manual         Rigid
Artemis (Eigen, USA)         2D     Mechanical   Rotational Sweep          Surface-based  Rigid
Urostation (Koelis, France)  3D     Image-based  Panoramic Reconstruction  Surface-based  Rigid and Deformable
Just prior to the start of the procedure, a 3D-TRUS image is acquired to define the systematic sextant interventional space [6, 7, 158]. Targets from the MR image are mapped to this 3D-TRUS image using multi-modality intensity-based [141] or surface-based registration [25, 90]. This registration needs to account for the change of coordinates between the MR and TRUS acquisitions, as well as deformations induced by probe pressure and endorectal coils. During the procedure, tracked 2D-TRUS images are used for real-time guidance with respect to the reference 3D-TRUS.

1.6.3 Commercial 3D-Biopsy Systems

Over the years, several solutions have been developed for MR-TRUS fusion in 3D-guided prostate biopsies. Table 1.1 is a short list of the commercially available transrectal systems and contains the probe type, tracking scheme, image acquisition, fusion technique and registration used to compensate for prostate motion and deformations during the procedure. I have excluded the Biopsee system (Pi Medical, Greece) from this list as they follow a transperineal approach, which is only used by 2% of urologists in North America [130].

3D-TRUS Generation

The Uronav and Artemis systems are designed to retrofit the same 2D-TRUS probe used in the conventional biopsy. In the Uronav system, a base-to-apex axial sweep of the prostate is acquired just before the start of the procedure [158]. The magnetically tracked 2D-TRUS images are then used to reconstruct a 3D-TRUS volume. The main disadvantage of this method of 3D-TRUS generation is the inconsistent probe pressure during the sweep. Since the prostate is an elastic organ and the probe is hand-held, the radiologist may not exert uniform probe pressure as he/she sweeps the prostate.

Figure 1.4: Sagittal view of the 3D reconstructed volume following inconsistent (a) and consistent (b) probe pressure during a freehand sweep of a prostate biopsy patient.
As a result, reconstructed 3D-TRUS volumes frequently suffer from deformation artifacts, which introduce additional errors in 3D guidance. Figure 1.4a is an extreme case of inconsistent probe pressure during a freehand sweep, while in Figure 1.4b, the radiologist was instructed to maintain uniform probe pressure on the same patient. In practice, the quality of 3D-TRUS images will depend on the dexterity of the radiologist, tracking errors and involuntary patient motion due to discomfort.

3D-TRUS acquisition in the Artemis system is performed using a rotational sweep of the 2D-TRUS probe around its axis [7]. Since the probe is mechanically stabilized using a robotic arm, the prostate is less likely to deform during 3D-TRUS generation. The main drawback of the Artemis system is that its solution requires an additional mechanical arm, which is not part of the standard clinical hardware. As a result, it does not allow for freehand guidance as part of the current standard-of-care.

The Koelis system supports direct 3D acquisition using a special 3D probe. Since the average prostate is larger than the probe beam-width, three 3D volumes are typically acquired and fused together to create a panoramic volume [6]. This 3D generation method requires only three contact points and, as a result, the radiologist needs to hold his/her hand steady for only a limited time. This means that deformation artifacts are less likely to appear in the panoramic 3D-TRUS.

Fusion Scheme

Once the 3D-TRUS volume is generated, the radiologist needs to bring the MR image into the workspace of the 3D-TRUS. However, the MR is typically collected weeks in advance, and usually in a different body position (supine vs. lateral). As a result, the fusion technique needs to account for a change of coordinates. Furthermore, MR images are frequently acquired in the presence of an endorectal coil, which causes the appearance of the prostate to change due to biomechanical pressure.
Finally, the 3D-TRUS is generated using an endocavity probe, which also causes the prostate to change in appearance. Fusion systems account for these effects through multi-modality rigid and deformable registration.

The fusion technique in the Uronav system is based on manual alignment of the MR and 3D-TRUS [158]. The main disadvantage of this approach, as with all manual registration techniques, is user variability. Furthermore, since the registration transform is rigid, it does not account for prostate deformations between MR and 3D-TRUS acquisitions.

In the Artemis system, fusion is performed using a surface-based approach. Assuming that the MR has already been segmented, a quick segmentation of the 3D-TRUS is performed using a semi-automatic dynamic contour propagation approach [81]. The MR and 3D-TRUS surfaces are first aligned using a manual landmark-based approach. To account for deformations, a thin-plate spline (TPS) surface-based registration approach is used [12]. In addition to the drawback of user variability due to semi-automatic segmentation and manual alignment, fusion accuracy has been reported to decrease following TPS registration [25]. This result suggests that their method does not accurately account for prostate deformations between MR and 3D-TRUS acquisitions.

The Koelis system follows a surface-based approach for fusion as well. In this system, the MR and TRUS volumes are segmented using a statistical model constructed from previously segmented prostate MR images. They compensate for the rigid change of coordinates as well as prostate shape deformations using an elastic registration approach. However, they only provide surface errors for fusion accuracy [90]. It is unclear how accurately this method can map potential targets inside the prostate from the MR into TRUS.
They provide no values for the run-time of the registration algorithm.

One of the main shortcomings of current surface-based fusion techniques [25, 90, 154] is that these methods do not account for inaccuracies in segmentation. Manual prostate segmentation is notoriously difficult, and there is often large variability even among experts [138]. One approach to mitigate this is to use an intensity-based approach [141], thereby avoiding the need for segmentation entirely. However, this would still require a significant amount of correspondence in appearance between MR and 3D-TRUS images. This may be challenging in cases where the local appearance of the anatomy is different between the two modalities. For example, the boundary of the seminal vesicle is clearly visible in MR but not in TRUS.

Intra-procedure Registration

Once biopsy planning is completed in 3D-TRUS space, tracking the motion of the probe can be used to map the current 2D plane to the pre-procedure 3D-TRUS. The Uronav system uses magnetic sensors to track the probe [158], while the Artemis system records the 3D position and orientation of the transducer tip via the joint encoders of the mechanical arm [7]. The Koelis system does not require an additional tracker; it uses image information to track the prostate [6].

The prostate undergoes constant motion and deformation due to probe pressure during biopsy, which introduces additional errors into the tracking environment. As a result, rigid and deformable registration is required to maintain alignment of the 2D imaging plane with respect to the pre-procedure 3D-TRUS. The Uronav system accounts for the gross motion of the prostate using multi-slice intensity-based rigid registration [158]. However, Uronav does not compensate for prostate deformations even though the probe is hand-held. As a result, when the prostate deformation is large, the radiologist may miss potential targets that were identified during the planning phase.
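Conceptually, mapping the live 2D-TRUS plane into the pre-procedure 3D-TRUS reduces to composing homogeneous transforms: pixel coordinates to millimetres in the image plane, the image plane to the tracked sensor, and the sensor to the 3D-TRUS reference frame. A minimal sketch of this chain, with hypothetical frame names and placeholder calibration values (not any vendor's API):

```python
import numpy as np

def to_h(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def map_pixel_to_3dtrus(px, py, spacing, T_image_to_sensor, T_sensor_to_ref):
    """Map a 2D-TRUS pixel into pre-procedure 3D-TRUS coordinates (illustrative).

    Chain: pixel -> mm in the image plane (z = 0) -> tracked sensor frame ->
    3D-TRUS reference frame. All transforms here are hypothetical placeholders
    standing in for the probe calibration and tracker readings.
    """
    p_image = np.array([px * spacing[0], py * spacing[1], 0.0, 1.0])
    return (T_sensor_to_ref @ T_image_to_sensor @ p_image)[:3]

# Example: identity calibration, tracker reporting a pure translation.
T_cal = to_h(np.eye(3), np.zeros(3))
T_trk = to_h(np.eye(3), np.array([10.0, 0.0, 5.0]))
p = map_pixel_to_3dtrus(100, 50, (0.2, 0.2), T_cal, T_trk)  # -> [30., 10., 5.]
```

Residual prostate motion and deformation then show up as error in this chain, which is what the rigid and deformable intra-procedure registrations discussed here must correct.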
Furthermore, the targeting accuracy of this system has only been reported on phantom data [158].

In the Artemis system, motion compensation is performed using an intensity-based slice-to-volume registration scheme [136]. Since the probe is mechanically stabilized, the prostate is less likely to undergo deformations due to probe pressure, and a rigid registration suffices. As a result, the prostate can be approximated as a rigid organ, and its motion can even be learned based on probe positions/orientations [134]. However, it has been shown that the accuracy of their system degrades with probe pressure [136]. This indicates that for freehand biopsies, which typically induce a larger probe pressure, rigid registration may not be sufficient to compensate for intra-procedure prostate motion/deformation. While the targeting accuracy of the Artemis system with motion correction is within clinical requirements, mechanical systems are not used in clinics. A solution for freehand 3D guidance using 2D-TRUS probes, as part of the current standard-of-care, is highly desirable.

The Koelis system does not use any method to track ultrasound probe motion; therefore, it relies only on image-based registration for tracking. The live 3D-TRUS volume is registered in three steps. First, a global three degree-of-freedom (DOF) rigid registration, constrained using a manually delineated bounding ellipsoid that approximates the prostate shape, is performed. This registration ensures that the probe tip is in contact with the rectal wall. Then, the rigid registration is fine-tuned using a local 6-DOF rigid registration. These two steps account for the gross motion of the prostate. Finally, an elastic registration is performed to account for prostate deformations [6].

The drawback of the Koelis system is that 3D information is only available when the probe is used in 3D mode.
With a long volume acquisition time of up to 5 seconds (depending on the image quality) and a total registration time of up to 10 seconds [6], this system cannot be used for live 3D guidance throughout the procedure. However, the registration time is still sufficiently low to provide feedback for targeting and to record the 3D location of biopsy cores.

1.7 Required Targeting Accuracy for Prostate Biopsy Systems

In order to avoid over-diagnosis and over-treatment, only PCa tumors larger than 0.5 cm3 are considered clinically significant [36, 111]. This translates into a spherical tumor with a radius of approximately 5 mm. If the target registration error (TRE) of the 3D biopsy system is below 2.5 mm, it can be shown that 95.4% of the smallest clinically significant PCa tumors can be targeted using this system [71]. Throughout this thesis, we shall use this bound as the clinical requirement for a 3D biopsy system.

1.8 Objective

The objective of this thesis is to facilitate the integration of MR data in 3D freehand prostate biopsy systems to decrease the number of false negatives. Maintaining the freehand workflow and imaging tools of the current 2D-TRUS procedure are the primary guidelines when designing such a system. To this end, we investigate and develop registration techniques that can bring MR data into the prostate biopsy framework. We also provide registration solutions to compensate for prostate motion and deformations during the procedure.

1.9 Contributions

This thesis is an attempt to develop techniques that are essential for MR-TRUS guided freehand 3D prostate biopsies. In the course of achieving this objective, the following contributions were made:

• Developing a novel technique for registration of two surfaces that accounts for missing points in the target surface. This is an essential step for MR-TRUS fusion in prostate biopsies, since it allows the fusion algorithm to ignore regions in the 3D-TRUS where the prostate contour has poor visibility.
We denote this method as full-to-partial surface registration. We validate the method using data obtained from prostate biopsy and radical prostatectomy patients.

• Validation of full-to-partial surface registration on data obtained from prostate brachytherapy patients. The former and latter contributions suggest that the fusion method is invariant with respect to the prostate intervention and patient cohort used for validation.

• Developing a novel technique for registration of two surfaces that accounts for missing points in both MR and 3D-TRUS surfaces. This is an improvement on our full-to-partial fusion scheme, since it does not require either the MR or 3D-TRUS surface to be fully segmented. We denote this method as partial-to-partial surface registration.

• Developing a novel 2D-3D registration technique for intra-procedure guidance during freehand prostate biopsies. This is a fundamental requirement for accurate needle placement with respect to planned targets from MR-TRUS fusion or sextant protocols.

1.10 Thesis Outline

The rest of this thesis is subdivided into five chapters as outlined below:

CHAPTER 2: BIOMECHANICALLY CONSTRAINED SURFACE REGISTRATION

In surface-based registration for image-guided interventions, the presence of missing data can be a significant issue. This often arises with real-time imaging modalities such as ultrasound, where poor contrast can make tissue boundaries difficult to distinguish from surrounding tissue. Missing data poses two challenges: ambiguity in establishing correspondences, and extrapolation of the deformation field to the missing regions. To address these, we present a novel non-rigid registration method. For establishing correspondences, we use a probabilistic framework based on a Gaussian mixture model (GMM) that treats one surface as a potentially partial observation. To extrapolate and constrain the deformation field, we incorporate biomechanical prior knowledge in the form of a finite element model (FEM).
We validate the algorithm, referred to as GMM-FEM, in the context of prostate interventions. Our method leads to a significant reduction in TRE compared to similar state-of-the-art registration algorithms, with a mean TRE of approximately 2.6 mm. We also analyze the robustness of our approach, showing that GMM-FEM is a practical and reliable solution for surface-based registration.

CHAPTER 3: OPEN-SOURCE IMAGE REGISTRATION FOR MRI-TRUS FUSION

The goal of this chapter is to provide an independent validation of the GMM-FEM registration approach that was described in Chapter 2. Furthermore, this chapter validates the GMM-FEM registration against a distance-based metric regularized by a B-spline transform. Our ultimate goal is to develop an open-source solution to support MRI-TRUS fusion image guidance of prostate interventions, such as targeted biopsy for prostate cancer detection and focal therapy. It is widely hypothesized that image registration is an essential component in such systems. The two non-rigid registration methods are: 1) a deformable registration of the prostate segmentation distance maps with B-spline regularization, and 2) a finite-element-based deformable registration of the segmentation surfaces in the presence of partial data. We evaluate the methods retrospectively using clinical patient image data collected during standard clinical procedures. Computation time and the TRE calculated at expert-identified anatomical landmarks were used as quantitative measures for the evaluation. The presented image registration tools were capable of completing deformable registration computation within 5 minutes. The average TRE was approximately 3 mm for both methods, which is comparable with the slice thickness in our MRI data. Both tools are available under a non-restrictive open-source license. We release open-source tools that may be used for registration during MRI-TRUS guided prostate interventions.
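The TRE used as a quantitative measure above is simply the Euclidean distance between registered landmark pairs, averaged over the landmark set. A minimal sketch (the `transform` callable is an illustrative placeholder for whichever registration is being evaluated):

```python
import numpy as np

def target_registration_error(src_landmarks, tgt_landmarks, transform):
    """Mean TRE: Euclidean distance between mapped source landmarks and the
    corresponding target landmarks. `transform` maps an (K, 3) array of
    points to an (K, 3) array; here it stands in for any registration."""
    mapped = transform(np.asarray(src_landmarks, dtype=float))
    return float(np.linalg.norm(mapped - np.asarray(tgt_landmarks, dtype=float),
                                axis=1).mean())

# Example: a registration that misses every landmark by the vector (3, 0, 4)
# yields a mean TRE of exactly 5 mm.
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
tgt = src + np.array([3.0, 0.0, 4.0])
tre = target_registration_error(src, tgt, lambda p: p)  # -> 5.0
```

Comparing this number against the 2.5 mm clinical bound of Section 1.7 is how registration accuracy is judged throughout the thesis.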
Our tools implement novel registration approaches and produce acceptable registration results. We believe these tools will lower the barriers in the development and deployment of interventional research solutions, and facilitate comparison with similar tools.

CHAPTER 4: STATISTICAL BIOMECHANICAL SURFACE REGISTRATION

A common issue that arises when performing surface-based registration of images is whether or not the surfaces accurately represent the boundary of the region of interest. Image segmentation may be difficult in some regions due to poor contrast, low slice resolution, or ambiguities. To address these concerns, we present a novel non-rigid surface registration method designed to register two partial surfaces, capable of ignoring regions where the anatomical boundary is unclear. Our approach incorporates prior geometric information in the form of a statistical shape model (SSM), as well as physical knowledge in the form of a finite element model, in a probabilistic framework. We validate results in the context of prostate interventions by registering pre-operative magnetic resonance imaging to 3D TRUS, both acquired from patients undergoing prostate biopsies. We show that both the geometric and physical priors are required in order to decrease the net TRE. Our registration approach leads to a TRE of 2.35 mm and 2.81 mm for full and partial surfaces, respectively. We investigate the robustness of the technique by varying the tunable parameters, and by removing sections from the segmented prostate surface in areas where the prostate boundary is typically difficult to discern.
Results demonstrate that the proposed surface registration method is an efficient, robust, and effective technique for fusing data from multiple modalities, particularly when dealing with missing or ambiguous data.

CHAPTER 5: 2D-3D REGISTRATION FOR FREEHAND PROSTATE BIOPSIES

We present a 2D to 3D registration framework to compensate for prostate motion and deformations during freehand prostate biopsies. The framework has two major components: 1) to compensate for the gross motion of the prostate, we use the trajectory of a tracked ultrasound probe to limit the solution to the location of the live imaging plane with respect to the pre-procedure 3D ultrasound volume; 2) to compensate for residual deformations, we develop a non-rigid registration method that is constrained using a finite element model of the prostate and the surrounding tissue. We validate the proposed framework on 10 prostate biopsy patients and demonstrate a mean TRE of 4.63 mm and 3.15 mm for the rigid and FEM-based components, respectively.

CHAPTER 6: CONCLUSION AND FUTURE WORK

This chapter includes a short summary followed by a discussion of the integration of the registration algorithms into the clinical workflow. It also includes suggestions for future work.

Chapter 2

Biomechanically Constrained Surface Registration

2.1 Introduction

The goal of an image-guided intervention (IGI) is to localize and track the position of a surgical tool with respect to a plan during the procedure. Surgical planning often requires pre-operative (pre-op) images, captured by either CT or MRI. For practicality, a different modality is often used during the procedure, typically ultrasound. One of the drawbacks of ultrasound, however, is its poor image contrast. This can lead to difficulties in distinguishing tissue boundaries of the organ of interest. A second complication in IGIs involving soft tissue is that the tissue is flexible.
Pre-operative images are usually acquired weeks in advance, and in a different body position. Thus, there can be large changes in the shape and position of the anatomy between acquisitions. Most navigational assistance systems account for these changes through a combination of rigid and non-rigid image registration. However, efficient and accurate multi-modality registration is challenging, and intensity-based methods such as [58, 141, 157] still require a significant amount of correspondence in appearance between the images. These methods may fail if the local appearance differs significantly between the two modalities. For example, the seminal vesicle boundary is clearly visible in MRI but not in TRUS.

To avoid the issue of image dissimilarity between modalities, we can instead rely on clinical expertise. Both the pre-operative and intra-operative images can be segmented during the clinical workflow and used in a surface-based method. However, segmentation of anatomical boundaries can also be difficult in a number of applications, including prostate interventions, where there is high variability even among experts [138]. Part of the boundary of the anatomy may not even be visible in one or both of the images, for example the support region during liver resection [16, 125], or the base and apex regions of the prostate during biopsies [95]. Therefore, a method that is robust to this variability, or that can handle missing data in regions where there is no clear anatomical boundary, would be highly valuable. Addressing this is the major goal of this chapter. We propose a general solution to the problem of registering two volumes when: a) the anatomy of interest undergoes mainly biomechanical deformations; and b) there is a lack of visibility in some regions of the tissue of interest. We apply our method in the context of prostate interventions through MR-TRUS fusion.
In regions where the tissue boundary is not clear, we ignore the segmentation, treating these as areas where data is missing.

2.1.1 Surface-based Registration

The original surface-based registration techniques relied on manual landmark selection [26]. This is a tedious, time-consuming process and is subject to user variability. To automatically identify corresponding pairs of markers, a number of methods have been proposed that rely on the iterative closest point (ICP) algorithm [9, 165] and its variations [9, 124, 125, 165]. Unfortunately, ICP-based algorithms are very sensitive to initialization, noise, and outliers [48, 92, 98, 116, 156]. One approach to mitigate this is to map the two surfaces into a topologically equivalent space [61, 95, 161]. This was attempted by Moradi et al. [95] for MR-TRUS fusion; however, its accuracy was found to still be highly sensitive to contiguous regions of missing data, such as around the apex, where the prostate contour has poor visibility [95]. To overcome the limitations of ICP due to binary correspondences, some approaches compute probabilistic (soft) correspondences using a Gaussian mixture model [21, 48, 67, 92, 98, 116, 156] or use a particle filtering approach [101]. These methods convert the registration into a probability-density estimation problem, maximizing the likelihood that one set of points is drawn from a probability distribution governed by the other set of points. This approach also directly accounts for partial observations, since there is no requirement to sample any particular coverage of points.

Due to the large space of possible solutions allowed by non-rigid deformations, many algorithms require constraints on the deformation field to converge. One method of constraining deformations is with a statistical deformation model (SDM). An SDM describes the allowable set of transformations based on statistics derived from an initial population.
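For illustration, the point-to-point ICP scheme discussed earlier in this section can be sketched in a few lines. This is a generic textbook variant with hard nearest-neighbour correspondences and a closed-form (Kabsch) rigid fit, not the implementation of any cited method; its binary correspondences are exactly what makes ICP sensitive to initialization, noise, and outliers:

```python
import numpy as np

def icp_rigid(source, target, iters=50):
    """Minimal point-to-point ICP (illustrative sketch).

    source: (N, 3) points; target: (M, 3) points.
    Returns R (3x3), t (3,) such that source @ R.T + t approximates target.
    """
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = source @ R.T + t
        # Hard (binary) correspondences: nearest target point per source point.
        d = np.linalg.norm(moved[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        # Closed-form rigid fit (Kabsch) between the matched pairs.
        mu_s, mu_m = source.mean(0), matched.mean(0)
        H = (source - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = mu_m - R @ mu_s
    return R, t
```

With a small initial misalignment the nearest-neighbour matches are correct and the Kabsch step recovers the transform exactly; with a poor initialization or many outliers, the hard assignments lock onto wrong matches, which motivates the soft-correspondence GMM formulations described above.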
Parameters of the transform are restricted to a linear combination of those that are present during training [60, 94]. The drawback of SDMs is that they require an additional training step, and registration results are highly dependent on the quality of the training data. To generate a diverse training set, Hu et al. [60] use finite element simulations. Applied to MR-TRUS fusion for prostate interventions, they require a detailed segmentation of not only the prostate, but also the bladder and pelvic bone to create a personalized model. Subsequently, they use this model to generate an SDM to be used in a model-to-image registration framework.

Another class of constraints comes in the form of regularizers, where prior knowledge is incorporated as a penalty term in a minimization problem. This approach has been followed by many to limit surface bending of a model [17, 21, 62, 71, 98, 126, 163, 166]. In the coherent point drift (CPD) algorithm [98], for example, nearby points are constrained to move "coherently" as a group by introducing a penalty on high spatial frequencies. The coherence term, however, can only penalize local variations in the deformation field; it does not consider volumetric properties such as Poisson effects or incompressibility. Other regularizers have been introduced to constrain volumetric deformation within a closed surface. These are typically based on splines, radial basis functions, or finite element (FE) techniques [16, 42, 95, 102, 103, 125, 144]. The choice of regularizer is especially important for deformable organs: in addition to guiding the minimization search, it directly governs the deformation field inside the anatomy, particularly in regions where data is missing. In many applications, the interior region is the workspace during an IGI. One drawback of surface-based regularizers is that the deformation field away from the surface is not considered during the course of registration.
Instead, internal deformation is recovered via post-processing in an interpolation step.

In this chapter, we use a finite element model for regularization, since the nature of deformations is known to be mainly biomechanical during many prostate interventions, including biopsies, low-dose brachytherapy and prostatectomies. Such a model is applicable as long as there is no substantial loss of mass between the two images. Deformation of the prostate is driven by pressure from surrounding organs, the pubic bone, and the TRUS probe or endorectal MR coil acting through the rectal wall. To the best of our knowledge, most existing FE-based registration techniques use explicit surface forces applied to the model to drive the deformation [16, 42, 95, 103, 125]. With the exception of [102, 144], the common thread in FE-based methods is to estimate boundary forces in a local neighborhood. These forces are typically formulated so that corresponding points are brought closer to each other. Correspondences are usually found based on a proximity search. This search is typically done using a variation of ICP [42, 95, 125], making it susceptible to the drawbacks of local-search techniques. For a more detailed description of the correspondence problem, the reader is referred to a recent survey paper [149]. It is possible to use a region-based metric [102, 144] for the data-driven term to avoid the correspondence problem; however, this would require a suitable connected geometry on the target surface. This might be difficult to achieve when the target surface exhibits missing surface points. Rather than applying explicit forces based on ICP, or using a region-based approach, we combine the correspondence search and force calculation into a single framework using a global probabilistic approach.
Forces arise implicitly during the minimization, and the single objective function allows for an efficient implementation.

2.1.2 Contributions

The major contribution of this work is the development of a novel registration method, GMM-FEM, that combines the ability to handle missing data using a soft-correspondence approach (GMM) with a biomechanical regularizer supplied by an FEM. We validate our registration approach on MR-TRUS images acquired from two sets of patients: one group who underwent a prostatectomy, and the other a prostate biopsy.

We compare our registration approach to three other similar surface-based methods: thin-plate spline robust point matching (TPS-RPM) [21], CPD [98], and ICP-based FEM (ICP-FEM) [42]. TPS-RPM and CPD are two popular registration methods that use soft correspondences based on a GMM. The major difference is that TPS-RPM constrains surface deformations using thin-plate splines, whereas CPD uses Gaussian kernels. By comparing to TPS-RPM and CPD, we isolate and evaluate the need for the FEM component of our method, since both methods use an identical correspondence scheme. ICP-FEM is a surface registration method which uses ICP to estimate surface correspondences, then applies elastic forces on the surface of an FEM to drive the deformation. This is the point-cloud equivalent to Ferrant et al. [42]. By comparing to ICP-FEM, we isolate the GMM component of our method, since both methods use an identical FEM regularizer. This isolates the need for soft correspondences when dealing with missing data.

2.2 Method

An outline of our proposed GMM-FEM registration approach is shown in Figure 2.1. The inputs are two surfaces, referred to as the source (MR) and

Figure 2.1: Overview of the registration framework. The pre-operative MR is captured before the procedure, and is segmented to create a surface representation of the anatomy (in this case, the prostate). This surface is then used to create an FE model of the volume.
At the beginning of the procedure, an intra-operative (3D-TRUS) image is acquired and the visible parts of the anatomy (midgland) are segmented. The GMM-FEM registration maps targets from the surface in the pre-operative plan to that of the intra-operative space.

target (TRUS). We model the surface-to-surface registration as an expectation-maximization (EM) problem. One surface is used to construct a probability density function that defines the boundary of the structure. The other surface is considered a set of observations, which may be incomplete (a partial observation). We wish to find the deformation field that maximizes the likelihood that the observations are drawn from a transformed probability distribution.

To define the probability distribution representing the complete source surface, we use a Gaussian mixture model. These are widely used to establish soft correspondences [67, 98, 119]. A GMM is a parametric probability density function represented as a weighted sum of Gaussian densities. Vertices of the source surface are taken to be the centroids of the Gaussian components, each represented by a mean and a variance. For simplicity, we take the variance to be isotropic in all directions for all components. This implies that the corresponding target point for a given Gaussian centroid is equally likely to appear in all directions. This assumption might not be correct in general; however, the method can be extended to use anisotropic GMMs [59] (also see Section 2.4). In this work, the source surface is extracted from the MRI, since the segmentation in this space is assumed to be reliable, and the target surface (i.e.
observations) is extracted from the TRUS.

Table 2.1: Mathematical notation

  N, M, J              Number of observations, GMM centroids and FEM nodes
  x_n                  n-th observation point
  y_m                  m-th GMM centroid
  ~x (3N×1)            Vector of observations
  ~y (3M×1)            Vector of GMM centroids
  ~u (3J×1)            Vector of FEM node displacements
  Φ (3M×3J)            FEM interpolation matrix
  K                    Stiffness matrix
  E                    Young's modulus
  ν                    Poisson's ratio
  P (M×N)              GMM posterior probabilities
  σ²                   Variance of the Gaussian components
  w, 0 ≤ w < 1         Estimate of noise/outliers
  diag(~v)             Diagonal matrix of a vector ~v
  P̃ = kron(P, I_3×3)   Kronecker product of the matrices P and I
  1                    Column vector of all ones

2.2.1 GMM-FEM Registration

In the derivations that follow, we use the notation listed in Table 2.1. For non-rigid registration, the Gaussian centres of the GMM, {y_m}, move with displacements {v_m}. We then wish to maximize the likelihood that the target surface (partial observation) is drawn from this deformed probability distribution by minimizing the negative log-likelihood function:

  E(θ, σ²) = − Σ_{n=1}^{N} log Σ_{m=1}^{M} P(y_m + v_m) P(x_n | y_m + v_m),   (2.1)

where θ is a set of parameters controlling the deformation, P(·) denotes the GMM probability density function with variance σ² [98], P(· | y_m + v_m) is a Gaussian distribution centered at (y_m + v_m), and v_m represents the non-rigid displacement of the m-th point on the source surface. Similar to [98], we model mis-assignments due to noise and outliers as a uniform distribution added to the mixture model:

  P(x) = w/N + (1 − w) Σ_{m=1}^{M} (1/M) P(x | y_m + v_m).   (2.2)

This uniform spatial distribution accounts for points present in the observations that do not correspond well to any point on the source. The parameter w indicates the probability of an observed point being classified as an outlier. A uniform distribution is selected to prevent biasing the classification of outliers toward any region of space.

The deformation field still needs to be constrained. We use a finite element model, so the parameters of the deformation are the FEM node displacements u_j, where j ∈ {1, . . .
, J} is the node index. Displacements can be concatenated into a vector ~u = [u_11, u_12, u_13, . . . , u_J1, u_J2, u_J3]ᵀ. Motion is constrained by adding a penalty term to the log-likelihood function, leading to the following objective function to be minimized:

  Q(~u, σ²) = − Σ_{n=1}^{N} log Σ_{m=1}^{M} P(y_m + v_m) P(x_n | y_m + v_m) + (β/2) ~uᵀ K ~u.   (2.3)

The penalty term represents the volumetric strain energy of the FEM, scaled by a Tikhonov weight β. It is derived from a linear stress-strain relationship, with linear stiffness matrix K [11]. K is a large sparse matrix that can be systematically constructed from the FEM mesh and a linear material model. The volumetric mesh is generated from the source surface, segmented from MR. This can be done with most FE-meshing tools, such as TetGen [131]. To compute the stiffness matrix, we additionally need a constitutive material law. For linear materials, K depends on two parameters: Young's modulus, E, and Poisson's ratio, ν. These are inputs to the registration.

Note that other GMM-based registration methods use a similar formulation for adding spatial regularization [49, 162]. While their formulations are similar in notation, they do not use a biomechanical regularizer to constrain the deformation field inside the surface. The method by Habert et al. [49] is identical to CPD registration in its use of Gaussian regularizers. The method by Zettinig et al. [162] is similar to TPS-RPM, since it uses a TPS kernel to regularize the deformation field.

Since Equation (2.1) only involves displacements on the surface of the model, we need to relate surface locations to FEM node displacements with the use of an interpolation matrix:

  ~v = Φ ~u.   (2.4)

In our implementation, all points on the surface correspond to nodes, and internal nodes are appended to the end of the node list for the FEM. As a result, the interpolation matrix has the form

  Φ = [ I_3M×3M   0 ].
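For the node-coincident case above, the selection form of Φ and the strain-energy penalty of Equation (2.3) take only a few lines. The following is an illustrative NumPy/SciPy sketch; the helper names are ours (the thesis implementation is in Matlab), and K is assumed to be any sparse 3J × 3J stiffness matrix.

```python
import numpy as np
import scipy.sparse as sp

def interpolation_matrix(M, J):
    """Build the 3M-by-3J matrix Phi = [I 0] that selects the surface
    displacements from the full FEM displacement vector, assuming the
    M surface nodes are listed before the J - M internal nodes."""
    return sp.hstack([sp.identity(3 * M, format="csr"),
                      sp.csr_matrix((3 * M, 3 * (J - M)))]).tocsr()

def strain_energy_penalty(u, K, beta):
    """Evaluate the Tikhonov penalty (beta / 2) * u^T K u of Eq. (2.3)."""
    return 0.5 * beta * float(u @ (K @ u))
```

Multiplying Φ by ~u then simply returns the first 3M components, matching ~v = Φ~u in Equation (2.4).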
If surface points do not correspond to nodes, Φ can be derived directly from the shape functions of the FEM elements.

The unknowns, σ² and ~u, are computed using an EM algorithm. Initially, the variance is estimated from the data as in Myronenko and Song [98]:

  σ² = (1 / 3NM) Σ_{n=1}^{N} Σ_{m=1}^{M} ‖x_n − y_m‖².   (2.5)

In the expectation step, we compute how likely it is that an observation corresponds to a GMM centroid by calculating the posterior probability

  P(y_m + v_m | x_n) = exp(−‖x_n − y_m − v_m‖² / 2σ²) / [ Σ_{j=1}^{M} exp(−‖x_n − y_j − v_j‖² / 2σ²) + c ],   (2.6)

where c = (2πσ²)^{3/2} (w / (1 − w)) (M / N) is the contribution related to outliers. The weight 0 ≤ w < 1 is set to zero if the observations do not exhibit outliers, or raised toward one if there is little correspondence between the observations and the GMM centroids. Ignoring constants independent of ~u and σ², we can rewrite the maximization step of Equation (2.3) as

  Q(~u, σ²) = (1 / 2σ²) Σ_{m,n=1}^{M,N} P(y_m + v_m | x_n) ‖x_n − y_m − v_m‖² + (3N_P / 2) log(σ²) + (β/2) ~uᵀ K ~u,   (2.7)

where N_P = Σ_{m,n=1}^{M,N} P(y_m + v_m | x_n). Following [98], we evaluate P(· | x_n) based on the previously computed displacements. This allows us to separate the expectation and maximization steps in the EM algorithm. The new optimal FEM node displacements are then obtained by minimizing Equation (2.7) with respect to the vector ~u. This yields the following sparse linear system:

  [ Φᵀ diag(P̃ 1) Φ + β σ² K ] ~u = Φᵀ P̃ ~x − Φᵀ diag(P̃ 1) ~y,   (2.8)

where ~x and ~y are the concatenated vectors of all observations and GMM centroids, respectively. The correspondence probability matrix P is constructed by evaluating Equation (2.6) using the previous FEM node displacements. A tilde represents taking the Kronecker product, P̃ = kron(P, I_3×3), so that the matrix can be multiplied by a concatenated vector of positions. Now that we have an updated estimate of FE nodal displacements (and hence locations), we update the GMM probability distribution. The algorithm iterates between the expectation step (updating σ² and P̃) and the maximization step (updating ~u) until the variance drops below a threshold.
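A single iteration of this EM scheme, for the special case Φ = I (every surface point is an FEM node and there are no internal nodes), can be sketched as follows. This is an illustrative NumPy/SciPy sketch, not the thesis' Matlab implementation; K is any sparse 3M × 3M stiffness matrix supplied by the caller, and the variance update uses the posterior-weighted form that follows from minimizing Equation (2.7).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def em_iteration(x, y, u, K, sigma2, beta, w=0.0):
    """One GMM-FEM EM iteration with Phi = I. x: (N, 3) observations,
    y: (M, 3) GMM centroids, u: (M, 3) current displacements,
    K: sparse (3M, 3M) stiffness matrix."""
    N, M = len(x), len(y)
    # E-step: posteriors P(y_m + v_m | x_n), Equation (2.6)
    d2 = ((x[None, :, :] - (y + u)[:, None, :]) ** 2).sum(-1)      # (M, N)
    num = np.exp(-0.5 * d2 / sigma2)
    c = (2.0 * np.pi * sigma2) ** 1.5 * (w / (1.0 - w)) * M / N    # outlier term
    P = num / (num.sum(axis=0, keepdims=True) + c)
    # M-step: sparse solve of Equation (2.8) with Phi = I
    Pt1 = P.sum(axis=1)                                            # diag(P~ 1), per node
    A = sp.diags(np.repeat(Pt1, 3)) + beta * sigma2 * K
    b = (P @ x - Pt1[:, None] * y).ravel()
    u_new = spsolve(sp.csc_matrix(A), b).reshape(M, 3)
    # posterior-weighted variance update (cf. Equation (2.7))
    r2 = ((x[None, :, :] - (y + u_new)[:, None, :]) ** 2).sum(-1)
    sigma2_new = float((P * r2).sum() / (3.0 * Pt1.sum()))
    return u_new, sigma2_new
```

Driving the loop until `sigma2` falls below a threshold reproduces the stopping rule described above; with w = 0 the outlier term c vanishes, recovering the pure mixture of Equation (2.1).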
The updated variance at each iteration is

  σ² = (1 / 3N_P) Σ_{m,n=1}^{M,N} P(y_m + v_m | x_n) ‖x_n − (y_m + Φ_m ~u)‖²,   (2.9)

where the subscripted Φ_m refers to only the rows of Φ that correspond to surface point m. The correspondence probabilities, P, can then be updated using the new FEM displacements ~u and variance σ².

For the biomechanical material properties, we apply a homogeneous, linearly elastic material with a constant Young's modulus to all elements. For the prostate, we use a Young's modulus of E = 5 kPa and a Poisson's ratio of ν = 0.49, which are in the range of values in [72]. For other applications, material parameters should be chosen appropriately. Note that for linear materials, it can be shown that the Young's modulus can be factored out of the stiffness matrix, allowing it to be combined with the Tikhonov weight β. This combined parameter controls the flexibility of the model, and can be tuned to allow an appropriate level of deformation for the application. It can also be viewed as tuning the magnitude of the "implicit forces" that arise when updating the displacement field in Equation (2.8). These forces are driven by proximity between the soft correspondences; they do not represent physical forces in the system. Reducing β is equivalent to increasing the force of attraction between the two surfaces.

In total, there are three free parameters: w, which controls the fraction of outliers in the observation surface; ν, the Poisson's ratio of the material; and β, which controls model flexibility. For relatively smooth and reliable observations, we find that w = 0.1 works well in rejecting the few outliers present while avoiding local minima. The Poisson's ratio should be near but not equal to 0.5 for most soft tissues, ν = 0.49 being a stable option. The final parameter, β, is tunable to fit the application. It is up to the user to decide how much force is reasonable given the context, subject to the trade-off between surface-fitting and volume-restoring regularization. We find that a value of β = 0.1
allows sufficient flexibility without over-fitting the surfaces in the context of prostate motion. If this results in an unrealistic amount of deformation, β should be increased.

2.2.2 Related Registration Methods Used for Comparison

In this section, we provide a brief description of the three registration methods used as a basis for comparison. These were selected because they are the most relevant works in the field, and because they allow us to isolate the importance of the two major components of our objective function. For further details on the methods, the reader is referred to [21, 42, 98].

TPS-RPM

This method uses a similar EM algorithm for the registration. Deterministic annealing is used in the expectation step, decreasing both a temperature, T, and regularization weights, {λ1, λ2}. The temperature in their framework corresponds to the variance of the GMM centroids in ours. The weights λ1 and λ2 control the contributions of the affine and non-rigid components, respectively, in their objective function. An implementation is freely available (http://noodle.med.yale.edu/~chui/tps-rpm.html). By comparing to TPS-RPM, we isolate and evaluate the contribution of the FEM component of our method, since both use a similar soft-correspondence scheme. The two methods also handle outliers and missing data differently. In TPS-RPM, outliers and source points which have no corresponding target are assigned to a separate cluster placed at the centre of mass. In GMM-FEM, outliers in the observation are handled with a uniform distribution, and missing data is handled naturally by the probabilistic approach: correspondences are computed one-way, so there is no requirement that every GMM centroid has a corresponding observation. The output of the algorithm is a set of surface displacements. To recover a volumetric deformation field, we use the TPS kernel to propagate the deformation field inside the prostate, as described in [132].

CPD

This is the closest work to the proposed GMM-FEM.
The major difference is the choice of regularizer, which results in a different maximization rule. CPD has three free parameters: w is an estimate of noise and outliers, β controls smoothness, and λ controls surface-to-surface fitting. By comparing to CPD, we isolate and evaluate the FEM component of our method, since both use an identical soft-correspondence scheme, as well as an identical approach to handling outliers and missing data. We use the open-source implementation (https://sites.google.com/site/myronenko/research/cpd). Just as with TPS-RPM, the output of the registration is a set of surface displacements. To recover the volumetric deformation field, we used the TPS kernel to propagate surface deformations inside the prostate [132].

ICP-FEM

The FE-based registration approach by Ferrant et al. [42] uses image-driven forces to deform the surface of the source object to that of the target. The system evolves by discretizing in time and using the semi-implicit update scheme:

  (I + τK) uᵗ = uᵗ⁻¹ − τFᵗ,   (2.10)

where τ is the time step, Fᵗ is the current estimate of the surface forces, uᵗ⁻¹ are the previous nodal displacements and uᵗ is the next estimate of the nodal displacements. Image forces are determined by a local search, driving the surface nodes of the FE model to the nearest feature in the image. In our surface-based registration, instead of using image forces, we use ICP to estimate the nearest features in the segmentation. By comparing to ICP-FEM, we isolate the GMM component of our method, since both methods use an identical FEM regularizer.

2.3 Experiments and Results

In this section, we evaluate the proposed non-rigid registration method on MR-TRUS image pairs acquired from patients who underwent a prostate intervention. In Section 2.3.1, we discuss the data acquisition, the segmentation protocol and the initialization of the registration.
In Section 2.3.2, we validate our registration method using intrinsic fiducials found in the interior of the prostate on full and partial surfaces, and compare to TPS-RPM, CPD, and ICP-FEM. In Sections 2.3.3 and 2.3.4, we investigate the sensitivity of our approach to variations in the free parameters (β, ν), and to the resolution of the source and target surfaces.

The prostatectomy and biopsy data were segmented at two different institutes, using 3D Slicer [38] and Stradwin [147], respectively. We then used TetGen [131] to automatically create a tetrahedral volumetric mesh of the prostate based on the MR segmentations. The models used in this study are composed of approximately 2000 surface nodes and approximately 7500 elements. Throughout our experiments, we used a stopping condition of σ² ≤ 10⁻⁴ mm², a Young's modulus of E = 5 kPa, which is in the range of values reported in [72] for the prostate, and a Poisson's ratio of ν = 0.49.

Figure 2.2: Prostatectomy data collection protocol. Prior to the prostatectomy, a volumetric US is acquired using a side-firing probe (a). Following the prostatectomy, strand-shaped fiducials are attached to the prostate to mark anatomical coordinates of the prostate in the MR. These strands are highlighted with a green circle on the prostate (b) and are visible in the ex vivo MRI (c). Both MR and TRUS images are segmented and brought to an initial alignment at the beginning of the registration (d). (a) In vivo TRUS; (b) ex vivo prostate; (c) ex vivo MRI; (d) initial alignment.

2.3.1 Data

Prostatectomy

We acquired MR and TRUS volumes of ten patients scheduled for a prostatectomy. This data consists of in vitro MRI and in vivo TRUS. The advantage of the prostatectomy data is that we have access to a strong ground truth for the in vitro MRI segmentation. This also helps in guiding the in vivo TRUS segmentation. The major steps in the data acquisition and alignment protocol
Figure 2.3: (a) An axial slice of the T2-weighted MRI (segmented MRI in blue); (b) an axial slice of the 3D-TRUS volume (segmented TRUS in red).

are outlined in Figure 2.2. The TRUS volumes were collected with an Ultrasonix Touch machine (Ultrasonix, BC, Canada) using a BPL-95/55 side-firing transducer mounted on a motorized cradle. Parasagittal 2D-TRUS images were acquired at 5° intervals, beginning with a sagittal view that marks the anterior-superior plane. These slices are used to reconstruct a 3D-TRUS volume with axial and lateral spacings of 0.12 mm, expressed in patient-centered right-anterior-superior (RAS) coordinates. Following the prostatectomy, the prostate was fixed in a 10% buffered formalin solution for preservation, and MR-visible strand-shaped fiducial markers were applied to the specimen using the protocol by Gibson et al. [47]. These strand fiducials are used to mark RAS coordinates on the specimen, so that the registration can be initialized with a consistent orientation. Specimens were scanned using a Discovery MR750 scanner (GE Healthcare, Waukesha, USA) at 3 T, with an endorectal coil (Prostate eCoil, Medrad, Warrendale, USA), to acquire T1-weighted MR images with a 0.3 mm slice thickness.

Prostate Biopsy

The second set of data was acquired from 19 patients, each scheduled for a prostate biopsy. We acquired T2-weighted MR images from these patients using a 3 Tesla GE Excite HD MRI system (Milwaukee, WI, USA) with a spacing of 0.27 × 0.27 × 2.2 mm. TRUS images were acquired using a 3D-TRUS mechanical biopsy system [7] with a Philips HDI-5000 US machine and a C9-5 transducer, using an axial rotation of the biopsy probe. For these images, the rotational sweep began with an axial view of the prostate midgland, thereby defining the right-anterior plane. An example slice of both MR and TRUS is provided in Figure 2.3. These images were then reconstructed into a 3D volume with a spacing of 0.19 × 0.19 × 0.19 mm.
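As an illustration of the rotational-sweep geometry used in these reconstructions, a point at in-plane coordinates (r, z) of a slice acquired at sweep angle θ about the probe axis maps into the 3D volume as below. This is a simplified sketch under an idealized geometry (rotation axis aligned with z, slice plane containing the axis); the actual scan conversion depends on the probe calibration and is not specified in this form in the thesis.

```python
import numpy as np

def sweep_point_to_3d(r_mm, z_mm, theta_deg):
    """Map in-plane slice coordinates (r = lateral distance from the
    rotation axis, z = position along the axis) at sweep angle theta
    to 3D Cartesian coordinates. Idealized geometry, for illustration."""
    t = np.deg2rad(theta_deg)
    return np.array([r_mm * np.cos(t), r_mm * np.sin(t), z_mm])
```

Sampling this mapping for every pixel of every slice, followed by gridding onto a regular lattice, yields the reconstructed volume.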
Similar to the prostatectomy data, both sets of images were acquired in a patient-centered coordinate system.

Full and Partial Segmentation

Both the prostatectomy and biopsy MR and TRUS volumes were manually and fully segmented under the supervision of an expert clinician. For the MR prostatectomy data, the surgical margins and strand fiducials were not included in the segmentation. We refer to the complete segmentations as full surfaces. In regions where the prostate boundary was not fully visible in TRUS (e.g., near the base and apex), segmentations were completed based on prior knowledge of the full prostate shape, and by exploiting symmetries. These regions were labeled as "uncertain", since they are not based on observed data. We then created partial surfaces by removing any uncertain portions from the segmentation. What remained represented the midgland, consisting of approximately 70% of the original surface. This is where segmentations are most reliable and consistent between clinicians. No points were removed from the segmented MR, since these segmentations are assumed to be reliable.

2.3.2 Registration to Full and Partial Surfaces

To initialize the registration, we first oriented the MR and TRUS surfaces based on the consistent RAS coordinates. We then applied a centre-of-mass alignment, followed by a global registration to compensate for large differences in position and orientation. The global registration did not exhibit sensitivity to initialization in a range of [−10.0, +10.0] mm/degrees over 100 trials. For the prostatectomy data, we opted for an affine registration to compensate for the bulk volume loss due to the ex vivo nature of the data. The residual deformation is then assumed to be mainly biomechanically driven. For the prostate biopsy data, we used a rigid transform for the initial alignment.

Prostatectomy

We applied GMM-FEM, as well as the methods for comparison (TPS-RPM, CPD, and ICP-FEM), to all prostatectomy MR-TRUS pairs.
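The initialization described in Section 2.3.2 above (a centre-of-mass alignment followed by a global rigid fit) can be sketched as follows. For the rigid step we show a least-squares Kabsch fit on corresponding points, which is one common choice; the thesis does not prescribe a specific global-registration solver, so this is illustrative only.

```python
import numpy as np

def centre_of_mass_align(source, target):
    """Translate the source point cloud so its centroid matches the
    target centroid; returns the moved points and the translation."""
    t = target.mean(axis=0) - source.mean(axis=0)
    return source + t, t

def rigid_align(source, target):
    """Least-squares rigid (Kabsch) alignment between corresponding
    point sets; a stand-in for the global registration step."""
    sc, tc = source.mean(axis=0), target.mean(axis=0)
    H = (source - sc).T @ (target - tc)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                           # reflection-safe rotation
    return (R @ (source - sc).T).T + tc
```

With exact correspondences this recovers the target pose to machine precision; in practice the correspondences come from the RAS-consistent orientation of the two surfaces.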
We then repeated the registration using the partial TRUS surfaces to see the impact of ignoring uncertain regions in the segmentation. An example registration result for full TRUS surfaces is shown in Figure 2.4. For the three comparison methods, we tuned the parameters such that the best TRE was achieved. For TPS-RPM, the initial temperature was set to T0 = 10 mm² and the stopping condition was T ≤ 0.2 mm². For CPD, we used β = 3.5 and λ = 0.1. For ICP-FEM, we set a time step of τ = 0.1, a Poisson's ratio of ν = 0.49, and a Young's modulus of E = 0.1 kPa. Note that this Young's modulus is significantly lower than the value of E = 5 kPa reported in the literature. The discrepancy relates to a scaling of the surface-to-surface forces, which is the role β plays in GMM-FEM (Section 2.2.1).

Figure 2.4: An example of registration to full data. Registration results on prostatectomy data following surface-based registration between the MR (blue) and TRUS (red) for: (a) affine; (b) TPS-RPM with T0 = 10 and λ1/λ2 = 0.01; (c) CPD with β = 3.5 and λ = 0.1; (d) ICP-FEM with τ = 0.1, E = 0.1 kPa and ν = 0.49; (e) GMM-FEM with E = 5.0 kPa, ν = 0.49 and β = 0.05. The estimate for missing data, w, was fixed at zero for all three non-rigid methods.

Figure 2.5: Typical results for a prostatectomy dataset. The axial slice of TRUS (a) and the corresponding GMM-FEM (b) registration results. The magnitude of the deformation field (mm) of the non-rigid registration and the corresponding TRUS slice (dashed line) are shown in (c). The checkerboard of (a) and (b) is shown in (d) for comparison.

To visualize the result of our GMM-FEM registration in the interior of the prostate, we resampled the MR image into the space of the TRUS. The registration result in an axial slice is shown in Figure 2.5.
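The resampling step above, i.e. warping the MR volume into TRUS space through the recovered deformation field, can be sketched with SciPy. Here the displacement field is assumed to already be given densely on the voxel grid in voxel units, whereas in the thesis it is first interpolated from the sparse FEM solution; that interpolation is omitted.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_volume(moving, disp):
    """Pull-back warp: out(p) = moving(p + disp(p)), where disp has
    shape (3, *moving.shape) in voxel units; trilinear interpolation
    with edge clamping."""
    grid = np.indices(moving.shape).astype(float)
    return map_coordinates(moving, grid + disp, order=1, mode='nearest')
```

A zero field returns the volume unchanged, and an integer shift field reproduces a plain slice shift, which makes the convention easy to verify.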
As seen in Figure 2.5d, the boundary of the prostate matches that of the TRUS. The magnitude of the deformation field along the slice plane is shown in Figure 2.5c. Note that the majority of the correction is made near the boundary of the prostate, which is where the surface-based forces are applied.

Prostate Biopsy

To further validate our registration results, we applied the algorithms to the biopsy data described in Section 2.3.1. A typical registration result for a full-to-partial surface registration is shown in Figure 2.6. The registered images and deformation field on an axial slice are shown in Figure 2.7. Again, we find most of the deformation near the surface.

Figure 2.6: An example of registration to missing data. Prostate biopsy results following surface-based registration between the MR (blue) and TRUS (red) for: (a) rigid; (b) TPS-RPM with T0 = 50 and λ1/λ2 = 1e−5; (c) CPD with β = 3.5 and λ = 0.1; (d) ICP-FEM with τ = 0.1, E = 0.1 kPa and ν = 0.49; (e) GMM-FEM with w = 0.1, E = 5.0 kPa, ν = 0.49 and β = 0.05.

Quantitative Validation

To quantify the registration results, we used the Dice similarity coefficient and the TRE. Using the Dice similarity metric, we ensure that the surface-based method has converged. Furthermore, a volumetric measure such as Dice allows us to investigate the volume-preservation property of the registration algorithm, given the Poisson's ratio of ν ≈ 0.5. This property may not be easily captured using other surface-based distance measures, such as the Hausdorff distance. The TRE is used to validate the registration approach for targeting.

The Dice coefficient between two volumes, A and B, is defined as Dice(A, B) = 2|A ∩ B| / (|A| + |B|). The Dice coefficient for the datasets in this study is

Figure 2.7: Typical results for a prostate biopsy dataset.
The axial slice of TRUS (a) and the corresponding GMM-FEM (b) registration results. The magnitude of the deformation field (mm) of the non-rigid registration and the corresponding TRUS slice (dashed line) are shown in (c). The checkerboard of (a) and (b) is shown in (d) for comparison.

shown in Table 2.2. Note that the Dice values are quite high (≈ 0.95), which is to be expected, since we are directly optimizing for the best surface fit. The one-sample Kolmogorov-Smirnov test did not reject the null hypothesis that the Dice values belonged to a normal distribution at the 95% significance level (0.21 ≤ p ≤ 0.42). Using paired t-tests, we did not observe statistically significant differences between the Dice values from the different surface-based registration techniques; however, the FEM-based methods (ICP-FEM and GMM-FEM) did exhibit a slightly smaller volume change (2.5 ± 0.52 vs. 3.1 ± 0.86) compared to the kernel-based methods (TPS-RPM and CPD) as a result of registration. This is expected, since the FE mesh in our approach is assumed to be nearly incompressible (ν = 0.49).

Note that we could have used the Dice coefficient, or any other region-based metric, to drive the registration. However, as stated in Section 2.1, this would require a suitable connected geometry on the target surface. This might be difficult to achieve for our current problem, for which the target surface may exhibit missing surface points. An advantage of the proposed correspondence-based metric is that it avoids this issue entirely.

To quantify the TRE, we selected a set of intrinsic fiducials on the MR and TRUS images. For our ten prostatectomy cases, we marked up to five calcification pairs (a total of 30) in both modalities per patient. These landmarks were validated by a radiologist. While these numbers might not be adequate to validate the TRE across the prostate, we were limited by the landmarks that could be accurately and reliably identified by our clinical expert. For the biopsy data, we also selected up to five fiducials per patient, consisting of
Figure 2.8: Examples of fiducial pairs in MR (left column) and TRUS (right column) for prostatectomy (top row) and biopsy (bottom row) data.

cysts and benign prostatic hyperplasia. This yielded a total of 93 landmarks across all subjects, distributed approximately 64.5% in the midgland, 21.5% in the base and 14% in the apex. An example of corresponding fiducial pairs in MR and TRUS is shown in Figure 2.8. The L2 distance between corresponding fiducials was used to quantify the TRE. We did not attempt to measure the fiducial localization error (FLE). The biopsy data used in this chapter was collected as part of a separate study [71, 141], where an FLE of 0.21 mm was reported for TRUS, and 0.18 mm for MR. Given that we use the same dataset, but the landmarks were selected by a different clinician, we expect the FLE to be in the same range.

The mean and standard deviation of the TRE, and the p-value comparisons, are shown in Table 2.3. For both the prostatectomy and biopsy data, the proposed GMM-FEM registration approach was found to consistently outperform TPS-RPM, CPD and ICP-FEM. Furthermore, the two FEM-based methods (ICP-FEM, GMM-FEM) led to lower TREs compared to the kernel-based ones (TPS-RPM, CPD), suggesting that there is an advantage to incorporating physical priors in the form of a finite element model.

Compared to MIND [140], the proposed GMM-FEM registration approach produces a higher TRE for full surfaces (2.72 ± 1.15 vs. 1.93 ± 0.73). Given that we do not have access to the landmarks used to validate the MIND-based approach, we are unable to draw a direct comparison. However, the lower TRE reported in their study suggests that an intensity-based approach is more suitable if similar intensity patterns are visible in both images.

Table 2.2: The Dice coefficient for prostatectomy and biopsy data following
affine/rigid and non-rigid registration. For both full and partial registration, the full TRUS surface was used in the calculation of Dice.

  Prostatectomy    Full            Partial
  Affine           89.10 ± 4.45    88.32 ± 4.76
  TPS-RPM          97.31 ± 1.23    96.12 ± 1.76
  CPD              97.68 ± 1.48    96.17 ± 1.55
  ICP-FEM          97.15 ± 0.88    96.22 ± 1.10
  GMM-FEM          97.16 ± 0.92    96.46 ± 1.26

  Biopsy           Full            Partial
  Rigid            88.56 ± 5.23    87.01 ± 6.12
  TPS-RPM          97.42 ± 1.78    96.28 ± 1.86
  CPD              97.73 ± 1.58    96.78 ± 1.97
  ICP-FEM          97.86 ± 0.98    96.23 ± 1.22
  GMM-FEM          97.72 ± 1.15    96.91 ± 1.33

We found that TPS-RPM often fails to produce realistic deformation fields when dealing with partial surfaces. In Figure 2.6, we see that the two ends of the prostate are flattened after registration, where data is missing. This leads to a large increase in TRE, since the deformation field is then interpolated inside the volume. In TPS-RPM, points in the source which cannot be matched to the target are assigned to an outlier cluster at the centre of mass. We believe this is the cause of the unrealistic inward pull. In [21], the authors acknowledge that their handling of outliers is not optimal. In ICP, points without a match can be filtered out based on proximity. The probabilistic approach in CPD and GMM-FEM eliminates the need for special handling of missing data.

To investigate the statistical significance of the results compared to the initial global registration (rigid/affine), we first determined whether the TRE was normally distributed. Using a one-sample Kolmogorov-Smirnov test, the TREs were found not to be normally distributed at the 95% significance level. As a result, we performed a set of Wilcoxon signed-rank tests, with the null hypothesis that the TREs of the initialization versus the subsequent non-rigid registration share a common median at the 95% significance level.

Table 2.3: TRE (mm) for prostatectomy and biopsy data following affine/rigid and non-rigid registration.
The star (*) denotes statistical significance at the 95% confidence interval compared to affine/rigid registration.

  Prostatectomy    Full            Partial
  Affine           3.17 ± 1.38     4.15 ± 1.23
  TPS-RPM          4.19 ± 1.76*    5.26 ± 1.49*
  CPD              4.02 ± 1.12*    4.76 ± 1.07
  ICP-FEM          3.00 ± 1.39*    3.89 ± 1.24
  GMM-FEM          2.65 ± 1.29*    2.89 ± 1.44*

  Biopsy           Full            Partial
  Rigid            4.01 ± 1.45     4.65 ± 1.31
  TPS-RPM          3.76 ± 1.01     6.50 ± 2.12*
  CPD              3.81 ± 0.91     5.05 ± 1.11
  ICP-FEM          2.95 ± 0.94*    3.72 ± 1.23*
  GMM-FEM          2.72 ± 1.15*    2.61 ± 1.10*

In Table 2.3, results for which this hypothesis was rejected (and hence are statistically significant) are indicated with a star (*), each resulting in p < 10⁻⁴. Note that in some cases, TPS-RPM and CPD led to an increase in TRE compared to the initialization. An increase in TRE for TPS kernels has been previously reported in the literature [25].

To investigate the statistical significance of the proposed algorithm over TPS-RPM, CPD and ICP-FEM, we performed additional Wilcoxon signed-rank tests (Table 2.4). The results show that GMM-FEM leads to statistically significant improvements in TRE over CPD, TPS-RPM, and ICP-FEM when applied to partial surface observations (p ≤ 10⁻⁶), as well as over CPD and TPS-RPM when applied to full surfaces (p ≤ 10⁻⁶). When comparing to ICP-FEM for fully segmented TRUS surfaces, however, although the TRE is reduced, the signed-rank test fails to reject the null hypothesis. Hence, the improvement is not statistically significant if full segmentations of both MR and TRUS are available.

Finally, to investigate the statistical significance of registration using full versus partial segmentations, we performed four additional sets of Wilcoxon signed-rank tests. The results are shown in Table 2.5. For TPS-RPM, CPD

Table 2.4: P-values from Wilcoxon signed-rank tests between different registration methods.

  Test                     Prostatectomy Full   Partial   Biopsy Full   Partial
  GMM-FEM vs. TPS-RPM      10⁻⁹                 10⁻¹²     10⁻⁸          10⁻⁹
  GMM-FEM vs. CPD          10⁻⁹                 10⁻⁹      10⁻⁹          10⁻⁹
  GMM-FEM vs.
ICP-FEM      0.45                 10⁻⁶      0.33          10⁻⁹

Table 2.5: P-values from Wilcoxon signed-rank tests between full and partial data for each registration method.

  Method     Prostatectomy   Biopsy
  TPS-RPM    10⁻⁹            10⁻⁸
  CPD        10⁻⁷            10⁻⁷
  ICP-FEM    10⁻⁶            10⁻³
  GMM-FEM    0.28            0.42

and ICP-FEM, the increase in the TRE is significant, suggesting that these methods are not suitable when the TRUS is only partially segmented. However, for GMM-FEM, the test fails to reject the hypothesis. This implies that, for our method, a full segmentation of the prostate is not necessary: a partial-surface registration (with up to 30% missing points) leads to a comparable TRE. This can help speed up the segmentation process, which must take place during the clinical procedure.

2.3.3 Sensitivity to Biomechanical Parameters

The final deformation field in GMM-FEM is affected by the values of the two FEM material parameters (Young's modulus and Poisson's ratio), and by the regularization weight. Since it can be shown that E can be factored out of the stiffness matrix, there are only two free parameters controlling the deformation: 1) the product of the regularization weight and the elasticity, βE; and 2) the Poisson's ratio, ν. To investigate the sensitivity of our registration method to these parameters, we perturbed their values for one of our prostatectomy patients and examined the resulting deformation fields. We refer to the L2 distance between the fields produced by the perturbed versus the optimal parameters as robustness, shown in Figure 2.9.

Figure 2.9: Robustness of GMM-FEM registration for different regularization weights (a) and Poisson's ratios (b). (c) Left to right: spatial distribution of robustness when the regularization weight is changed to 0.05, 0.5 and 1.0, respectively. (d) Left to right: spatial distribution of robustness when the Poisson's ratio is changed to 0.40, 0.45 and 0.48, respectively. Biomechanical parameters are perturbed around the parameters which provided the best surface overlap (E = 5.0 kPa, ν = 0.49 and β = 0.1). For better visualization, distances larger than 5 mm are shown with the same color.

As seen in Figure 2.9a, the result of GMM-FEM registration can be somewhat sensitive to the regularization weight, β. Varying its value by orders of magnitude leads to three distinct behaviors. For large values (e.g. β ≥ 1.0), the FEM acts essentially rigid, allowing no deformation. For small weights (e.g. β ≤ 0.01), the FEM provides little regularization, allowing the surface to move freely. For moderate values (e.g. β ≈ 0.1), the level of deformation varies, balancing resistance to internal strain against the surface-to-surface fitting. This parameter, β, should be tuned for the application, taking into consideration the level of deformation expected given the context.

Figure 2.9b shows the sensitivity of the internal deformation field to the Poisson's ratio. The registration is much less sensitive to perturbations in this parameter than to the regularization weight. A value of ν = 0.49 leads to the nearly incompressible behavior of soft tissues while avoiding the singularity at 0.5.

2.3.4 Sensitivity to the Number of Surface Points

The number of points on the segmented surface may affect both the robustness of the correspondence scheme and the fidelity of the finite element model. To establish a ground-truth deformation field, we performed the registration with 10,000 surface nodes on both the MR and TRUS surfaces. Next, we systematically decimated both surfaces down to 1,200 points and computed the resulting internal deformation fields following registration. We found that GMM-FEM is not sensitive to the number of surface points down to a minimum resolution: differences in the TRE and internal deformations stay below 1.0 mm when the surface resolution drops from 10,000 to 1,800 points (Figure 2.10). This seems to be an appropriate lower bound on surface resolution.
If the surfaces are further decimated to 1,200 points, we begin to see artifacts. At this level, we find that the surface no longer well-approximates the original shape.

The number of surface nodes on the source surface, and the resolution of the FEM, are intimately related. Theoretically, a higher-resolution FEM would result in a more accurate representation of the deformation field up to a point of convergence. However, errors due to lack of convergence given the current FEM resolution are expected to be much lower than errors caused by inaccuracies in segmentation and in the estimated soft correspondences. Thus, the resolution of the FEM should be chosen such that it is able to produce an adequate deformation field for the particular application. We consider the current resolution of our prostate surfaces to be a lower limit for this. For computational efficiency, the number of surface points (and hence FEM nodes) should be chosen to be the minimum such that the nature of the surface is sufficiently captured and the deformation field is sufficiently smooth.

Figure 2.10: Robustness of GMM-FEM registration for different surface resolutions. (a) TRE across all prostatectomy patients for different surface resolutions. (b) Left to right: spatial distribution of robustness for one sample prostatectomy patient when the number of surface points is reduced to 1200, 1800 and 8600 points, respectively. Biomechanical parameters are tuned for best TRE (E = 5.0 kPa, ν = 0.49 and β = 0.1).

Table 2.6: Time (seconds) required for each component of GMM-FEM registration, reported across all patients.

    FE-meshing    Stiffness Matrix    Average Iteration    Total Time
    0.64 ± 0.22   1.03 ± 0.27         0.15 ± 0.01          9.34 ± 3.63

2.3.5 Registration Time and Computational Complexity

We implemented GMM-FEM in Matlab (MathWorks, MA) with C++ mex code to interface with TetGen [131]. Timings were recorded on a PC with a 3.4 GHz Intel Core i7 processor and 8.00 GB of RAM.
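The effect of decimation on how well a point set approximates the original surface can be sketched numerically. The ellipsoidal "surface" below is a hypothetical stand-in for a segmented prostate surface, and the error metric (maximum distance from the full cloud to its subsampled version) is an illustrative choice, not the thesis's robustness measure:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Hypothetical stand-in for a prostate surface: points on an ellipsoid (mm).
n_full = 10_000
u = rng.normal(size=(n_full, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)
surface = u * np.array([25.0, 20.0, 22.0])   # roughly prostate-sized semi-axes

def approximation_error(full, n_keep):
    """Max distance from the full surface to a random subsample of it (mm)."""
    decimated = full[rng.choice(len(full), n_keep, replace=False)]
    d, _ = cKDTree(decimated).query(full)    # nearest decimated point per point
    return d.max()

# The approximation error grows as the surface is decimated more aggressively.
for n in (8600, 1800, 1200):
    print(n, "points ->", round(approximation_error(surface, n), 2), "mm")
```

Tracking such a surface-approximation error is one simple way to decide, for a new anatomy, where the "minimum resolution" discussed above lies.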
Average times for each step, across all patients, are summarized in Table 2.6. The total times encapsulate all steps, including mesh generation and stiffness matrix computation. In all cases, the algorithm converged within 100 iterations.

The proposed registration algorithm converges on average within ten seconds on a regular desktop PC. This efficiency is achieved due to the linearity assumption of the FEM, and the sparse nature of the equations involved. Since the stiffness matrix does not change between iterations in a linear material model, it only needs to be computed once given the original volumetric mesh. For the size of models considered in this work, this step takes on average 1.03 seconds. The next most computationally expensive step is the matrix solve in Equation (2.8) for updating the FEM displacements. Since the system is sparse, it can be solved quite efficiently in Matlab. Such sparse solves can typically be computed with complexity between O(M^1.5) and O(M^2), where M is the number of FEM nodes.

2.4 Discussion and Conclusions

In this chapter, we presented a novel non-rigid surface-based registration approach that handles missing data up to 30% and accounts for volumetric deformation effects known to be present in soft tissues. We applied the method to MR-TRUS fusion for prostate interventions, where a full segmentation from MR was fit to a potentially partial segmentation from TRUS.

In Section 2.3.2, we validated the algorithm on data from both prostatectomies and prostate biopsies. We compared the TRE against three other non-rigid registration methods: TPS-RPM, CPD, and ICP-FEM. The proposed GMM-FEM was found to outperform both TPS-RPM and CPD in all cases. This improvement is shown to be statistically significant. Since the only effective difference between these registration methods is in the regularization scheme (FEM-based vs. Gaussian/TPS-based), we conclude that the FEM component is responsible.
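The factor-once, solve-per-iteration pattern described above can be sketched with SciPy's sparse LU factorization. The matrix below is a generic sparse stand-in, not the actual system of Equation (2.8):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

# Generic sparse symmetric positive-definite stand-in for the FEM system (M nodes).
M = 2000
K = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(M, M), format="csc")

lu = splu(K)                  # factor once: K is fixed for a linear material model
rng = np.random.default_rng(0)
for _ in range(5):            # per iteration, only the right-hand side changes
    b = rng.standard_normal(M)
    u = lu.solve(b)           # cheap forward/backward substitution each time
assert np.allclose(K @ u, b)
```

Reusing the factorization is what makes the per-iteration cost small relative to the one-time stiffness-matrix computation reported in Table 2.6.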
Compared to ICP-FEM, our method did not lead to statistically significant improvements when a full TRUS segmentation was available. However, if only a partial segmentation is available, then the improvement was found to be significant. Since the only effective difference between the two methods is in the correspondence scheme (soft vs. binary assignments), we conclude that the GMM component of the proposed framework is responsible.

While there are some techniques that focus on partial-surface registration, most do not consider volumetric properties (e.g., Poisson effects, incompressibility) during the registration, only surface bending. A 3D deformation field would then need to be interpolated internally based on the fitted surface, as a post-processing step. Physically, it is the internal volume that resists deformation, affecting the shape of the surface. Our method better reflects this. By comparing to CPD, which requires post-processing to recover a 3D field, we show that keeping the FE regularizer in-loop leads to improved results.

For a tumor to be considered clinically significant, its radius should be no less than 5 mm. The TRE, as a root-mean-square error, provides an estimate of the standard deviation for biopsy targets. A standard deviation given by a TRE of 2.5 mm yields a confidence interval in which 95.4% of targets fall within the clinically significant 5 mm radius [71]. The current results (2.61 mm) of our GMM-FEM registration bring us closer to these bounds.

In Section 2.3.3, we investigated the robustness of the internal deformation field to perturbations in biomechanical parameters. GMM-FEM seems to be quite robust to perturbations in Poisson's ratio; however, it can be sensitive to changes in the biomechanical regularization weight (β). This parameter controls the trade-off between model flexibility and the ability to perform surface-to-surface fitting.
It must be tuned to allow an appropriate level of deformation, given the context of the involved anatomy.

We also examined robustness to surface resolution (Section 2.3.4), since this may affect both surface correspondences and FEM resolution. Fortunately, we found that GMM-FEM does not seem sensitive to the number of surface points. This holds true as long as the prostate surfaces adequately represent the anatomy. Based on our findings, we consider 2000 surface nodes to be sufficient for our registration pipeline. For surfaces with higher curvature or a more complex shape, however, more points will be required.

In this chapter, we used an isotropic GMM to model the source surface. This implies that for a given source point, its observation is equally likely to appear in all directions. This assumption is not true in general. For example, points near the TRUS probe tip are more likely to move in the direction in which the probe is pushing. A possible research direction is to extend GMM-FEM registration to use anisotropic GMMs, using a similar approach to Horaud et al. [59].

In our GMM-FEM registration, we used a tetrahedral mesh to create an FE model of the prostate. We note that tetrahedral meshes are prone to mesh-locking behavior; however, we did not encounter this issue in our dataset. Should this issue arise in other datasets, one workaround is to relax the Poisson's ratio to a lower value, given that the deformation field is less sensitive to this parameter. Another solution is to use a different element type, such as a hexahedron, which is less susceptible to mesh locking.

The EM algorithm is known to be susceptible to local minima for point cloud registration [67]. These local minima usually arise in the presence of large global motion, symmetries, or transformation models with high degrees of freedom [67]. In our application, we typically have a priori knowledge of the general location and orientation of the prostate.
This allows for a reasonable initialization that will prevent any large mis-assignments. Due to the smooth and unique curvatures of the prostate shape, we have not encountered such convergence to local minima given the full or 70% partial surface observations used in this study. If more data is missing (e.g., 50%+), and if the observation does not contain unique features, then local minima will be unavoidable. In such a case, we would recommend combining the method with other optimization heuristics, such as a multi-resolution approach, or having multiple starting points [67]. If too much of the surface segmentation is deemed unreliable, then a surface-based registration approach may not be appropriate.

One shortcoming of our algorithm is that it requires the MR and TRUS to be segmented prior to the registration. While the MR can be segmented ahead of time, the 3D TRUS needs to be segmented within minutes due to clinical requirements. Since the presented method is designed to be robust to missing data, we suggest segmenting only regions in which the boundary of the anatomy can be clearly distinguished, such as the mid-gland for prostate interventions. This can help speed up the segmentation process. Where the anatomical boundary is clearly visible, we recommend using fast semi-automated segmentation methods (e.g., [114]).

In this chapter, we used a linear, homogeneous material to create the finite element model of the prostate. We did not take into account that calcifications, cysts, and cancerous regions typically have different stiffness than the surrounding normal tissue. These spatially-varying properties can easily be incorporated into finite-element-based techniques, affecting computation of the stiffness matrix. However, this would require having an expert classify regions of tissue and assign appropriate relative stiffness parameters.
The definition of spatially-varying properties could potentially be automated by combining with elastography, should it become part of the prostate biopsy protocol. Note that with other, non-FEM-based interpolators, homogeneity is also implicitly assumed.

The presented method, GMM-FEM, is shown to be both efficient and robust. It was designed to handle cases where visibility can limit the accuracy of boundary segmentation in some regions of an organ, leading to the need to handle missing data. However, the method also performs well when full segmentations are available. It was shown to outperform current state-of-the-art surface-registration techniques: TPS-RPM, CPD, and ICP-FEM. We believe this makes it a strong candidate for use in image-guided interventions.

Chapter 3: Open-source Image Registration for MRI-TRUS Fusion

3.1 Introduction

Prostate cancer is a leading cause of cancer-related deaths in males in the USA and Canada [133]. Accurate and early diagnosis of aggressive PCa is critical for adequate patient management. TRUS and MRI are complementary imaging modalities for visualizing the anatomy of the prostate and characterizing the tissue for cancer presence. While MRI is the ideal imaging tool for PCa staging and characterization [55], TRUS is the most widely used modality due to its real-time nature, low cost and ubiquity. It is also the primary modality used for interventional applications such as biopsy and brachytherapy. In this chapter we present and compare two practical software tools that can be used for non-rigid registration of prostate TRUS and MRI data to enable joint use of these modalities.

The concept of MRI-TRUS fusion targeted prostate biopsy was first introduced in 2002 by Kaplan et al. [70]. MRI is typically acquired weeks prior to the biopsy, with the patient in a different position (supine vs. lateral decubitus) and often with an endorectal coil, leading to substantial differences in prostate shape between the MR and TRUS volumes.
This leads to the need for a non-trivial technique to consolidate the data. Image registration can be used to bring these two modalities into alignment.

Over the last decade, MRI-TRUS fusion biopsy has evolved and several solutions have been implemented in commercial products [85]. Strong evidence exists that targeted prostate biopsy, enabled in particular by such fusion systems, improves the accuracy of PCa sampling [85]. In a recent study, Puech et al. conclude that software-based image registration does not currently offer any advantages over cognitive registration done by visual re-identification of the biopsy targets between the two modalities [113]. In contrast, a study by Delongchamps et al. confirmed the utility of software registration but produced no evidence that deformable registration leads to any tangible improvements over rigid registration [30]. Most commercial MRI-TRUS fusion products implement linear registration only [85]. Further studies are needed to evaluate the overall clinical value of software registration as well as specific registration methods.

Comparison studies of image registration algorithms for the purposes of targeted prostate biopsy are challenging. Commercial tools are typically constrained to manufacturer-specific registration algorithms, which are often not described in sufficient detail, and do not allow exporting of the registration results. Numerous registration algorithms have been proposed in the literature for MRI-TRUS fusion [60, 95, 141], but very few academic publications are accompanied by a software implementation (the study by Moradi et al. [95] is the only study known to us that uses a publicly available registration tool) that could be easily used in a comparison study, or considered for translation into a clinical research setting.
Therefore, we believe open-source solutions that could readily be applied to MRI-TRUS fusion studies would greatly benefit the community.

MRI-TRUS registration approaches for prostate images can be categorized into intensity-based and segmentation-based methods. Efficient and accurate 3D non-rigid MRI-TRUS registration is inherently challenging because of the inter-modality nature of the problem and the low signal-to-noise ratio of TRUS. To the best of our knowledge, the only fully intensity-based approach for MRI-TRUS fusion is the method by Sun et al. [141]. All other methods rely on TRUS segmentation [60, 95, 100]. Like all intensity-based approaches, the method proposed by Sun et al. [141] requires homologous anatomical features to appear in both images. The challenge with MRI-TRUS fusion is that, since the imaging physics are substantially different between the two modalities, parts of the anatomy may be visible in one image but not the other.

MRI can be segmented in advance of the procedure without sacrificing procedure time. TRUS images are typically segmented during the brachytherapy workflow. In the biopsy workflow, manual segmentation of the prostate gland is considered acceptable in the commercial fusion tools. Therefore, we can bypass difficulties associated with multi-modal intensity-based registration using a method that relies on the availability of the prostate gland segmentation. However, especially for prostate interventions, even experts are prone to over- and under-segmentation of the anatomy, as discussed in [138]. This is also evident in the results presented in this chapter. The discrepancy can be attributed to the poor visibility of the prostate boundary near the base and apex in TRUS.
Therefore, a method that is robust to this potential variability, or that can handle missing data in regions where the prostate boundary is not clear, would be highly valuable, as mid-gland segmentation can be done robustly [138].

3.1.1 Contributions

The goal of this chapter is to provide an independent validation of the GMM-FEM registration method (see Chapter 2) against a signed distance map approach described in Section 3.2.2. This method represents the deformation field interior to the prostate using B-splines. Both methods are validated and compared independently of the author, using data collected for 11 PCa patients who underwent standard MRI and TRUS imaging as part of their clinical care at Brigham and Women's Hospital (BWH). While this chapter could be added to Section 2.3 as another dataset, I feel that since Chapters 2 and 3 have been published in different journals, it might not be suitable to combine them into a single chapter.

The group at BWH has made the signed distance approach available as an open-source tool to facilitate development and evaluation of registration methodologies, and to support clinical research in image-guided prostate interventions.

3.2 Methods

The registration approaches we propose consider a clinical setup consisting of two stages:

1. Pre-processing (planning) stage: the MRI exam of the patient is analyzed to identify the planned biopsy targets. The prostate gland can be contoured in MRI, and post-processing of the segmentation can be applied to recover a smooth surface.

2. Intra-procedural stage: a volumetric sweep of the prostate gland with TRUS is obtained, followed by reconstruction of a volumetric image. The prostate is segmented on the volumetric image and is used to generate a smooth surface of the gland. The MRI and TRUS surfaces are then set as input to either of the registration methods described below to compute displacements that can be used for target position computation or fused MRI-TRUS display.
In this section we describe image acquisition and the various processing steps in detail, and discuss our approach to the retrospective evaluation of the registration techniques.

3.2.1 Image Acquisition and Pre-processing

The imaging data used in this evaluation was collected as part of a HIPAA-compliant prospective study that was approved by the institutional review board of the Brigham and Women's Hospital. The author was not involved in the acquisition and processing of this dataset, which is described in this section. The clinical indication for both MRI and TRUS imaging was histologically confirmed prostate cancer, with low-dose-rate radiation brachytherapy as a preferred treatment option. TRUS image acquisition was performed during brachytherapy prostate volume studies, with the goal of confirming the suitability of the patient for the procedure (the volume of the prostate gland is within the clinically acceptable range, and there is no interference of the pubic arch with the brachytherapy needle insertion plan). Per standard clinical protocol, no anesthesia was administered to the patient during either MRI or TRUS imaging.

Multiparametric MRI data was collected using the standard imaging protocols established at our institution [55]. All MR imaging exams were performed on a GE Signa HDx 3.0T system (GE Healthcare, Waukesha, WI) with the patient in a supine position, using a combination of 8-channel abdominal array and endorectal coils (Medrad, Pittsburgh, PA). The imaging study included anatomical T2-weighted imaging (T2WI) (FRFSE sequence, TR/TE = 3500 ms/102 ms over a 16 cm² FOV, reconstructed pixel size 0.3×0.3×3 mm), which was the series used for the registration experiments presented in this work. The total time of the multiparametric MRI exam was about 45 minutes.

TRUS imaging was done in a separate session, with the patient in a lithotomy position.
Per standard clinical setup, the TRUS probe (BK 8848) was attached to a motorized mover (Nucletron EndoCavity Rotational Mover (ECRM)) and mounted on a rigid stand with the enclosure for the TRUS probe (Nucletron OncoSelect stepper). Imaging was performed using the sagittal array of the probe rotated by the ECRM. The Camera Link and OEM research interfaces of the BK ProFocus US scanner (BK Medical) were used to collect radiofrequency (RF) TRUS data concurrently with the clinical image acquisition. A position tracking device equipped with an accelerometer, magnetometer and gyroscope (Phidget Spatial 3/3/3) was attached to the handle of the probe to track the sagittal array orientation during the motorized sweep. Synchronous collection of the RF and tracking data was performed using the Public Library for UltraSound research (PLUS) [82] on a workstation equipped with a Camera Link interface (Dalsa X64 CL Express). The total time of the TRUS image collection was less than 5 minutes.

The following pre-processing steps were applied to prepare the data before applying the registration procedure. PLUS was used for converting the RF TRUS data into B-mode images and for 3D reconstruction of the TRUS volumes from the tracked data using the gyroscope sensor tracking information. Volumetric TRUS images were reconstructed at 0.2 mm isotropic voxel size. TRUS and axial T2WI MRI volumes were brought into initial alignment by rigidly registering three fiducial points (left-most, right-most and anterior points identified on the mid-gland axial slice of the prostate) placed manually in the reconstructed volumes using 3D Slicer [39]. The prostate gland was contoured manually in both TRUS and T2WI volumes using the 3D Slicer Editor module. To simplify the segmentation procedure, TRUS volumes were resampled to the resolution of the T2WI dataset (3 mm slice thickness).
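The three-fiducial rigid initialization described above can be sketched with a closed-form least-squares (Kabsch) fit; 3D Slicer's landmark registration performs a closed-form fit of this kind. The fiducial coordinates and the simulated ground-truth transform below are hypothetical:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst (Kabsch)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)              # cross-covariance of fiducials
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

# Hypothetical fiducials (mm): left-most, right-most, anterior mid-gland points.
trus = np.array([[-24.0, 3.0, 0.0], [25.0, 2.0, 0.0], [0.0, 22.0, 1.0]])
# Simulate the MRI-frame fiducials with a known rotation and translation.
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
mri = trus @ R_true.T + np.array([5.0, -3.0, 12.0])

R, t = rigid_align(trus, mri)
assert np.allclose(trus @ R.T + t, mri, atol=1e-9)   # exact recovery, noise-free
```

Three non-collinear fiducials are the minimum needed to determine a unique rigid transform in 3D, which is why this landmark set suffices for initialization.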
The manually segmented masks were resampled back to the 0.2 mm isotropic spacing and smoothed by applying a recursive Gaussian image filter with σ = 3. The resulting masks were then used as input for the two registration tools we describe next. The group at BWH has made three of the datasets used in the evaluation publicly available (see http://www.spl.harvard.edu/publications/item/view/2718).

3.2.2 Registration of Signed Distance Maps with B-spline Regularization

The BRAINSFit [68] registration module of 3D Slicer was used to perform non-rigid registration between MR and TRUS pairs; this module was earlier adapted to prostate MRI intensity-based hierarchical registration [37] at BWH. Over the last few years, this module has been used to support clinical trials of MRI-guided in-bore transperineal prostate biopsy at BWH [110]. To apply this registration approach to MRI-TRUS registration, the BWH group implemented additional pre-processing of the segmentations, and modified the registration parameters as follows. First, the isotropic segmentation masks were cropped using a fixed-size (≈10 mm) margin around the bounding box of the segmentation to reduce the computation time of the subsequent steps. The Maurer signed distance transformation [91], as implemented in the Insight Toolkit (ITK), was applied to the smooth segmentations of the prostate gland in both MRI and TRUS. The BWH group chose the Maurer implementation of the distance transformation due to its improved (linear time) performance compared to other implementations available. The resulting distance maps were registered using the standard BRAINSFit module of 3D Slicer (v4.3.1), with affine and B-spline (isotropic grid of six control points) registration stages applied in sequence. The SPL group used the mean squared difference similarity metric with a fixed number of 10,000 samples.
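A signed distance map of the kind produced by the Maurer filter can be approximated with SciPy's Euclidean distance transform; this is a stand-in for the ITK implementation, and the spherical mask below is synthetic:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance_map(mask, spacing):
    """Signed Euclidean distance map: negative inside the mask, positive outside."""
    inside = distance_transform_edt(mask, sampling=spacing)    # depth inside object
    outside = distance_transform_edt(~mask, sampling=spacing)  # distance to object
    return outside - inside

# Synthetic binary mask: a sphere of radius 10 mm on a 1 mm isotropic grid.
spacing = (1.0, 1.0, 1.0)
z, y, x = np.mgrid[-16:16, -16:16, -16:16]
mask = x**2 + y**2 + z**2 <= 10**2

sdm = signed_distance_map(mask, spacing)
assert sdm[16, 16, 16] < 0   # grid center lies well inside the "gland"
assert sdm[0, 0, 0] > 0      # grid corner lies outside it
```

Registering two such maps with a mean-squared-difference metric turns the multi-modal segmentation-alignment problem into a mono-modal intensity registration, which is the core idea of the BWH approach.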
All of the processing was done either in 3D Slicer or using standard classes of ITK. This approach was developed by the team at BWH, and will be referred to as such in the text.

The registration tool implementing the approach above is available as a module within the ProstateIGT extension of 3D Slicer (source code and license available at https://github.com/SlicerProstate).

3.2.3 GMM-FEM Registration

The second method in this chapter is the GMM-FEM registration method that was previously introduced in Chapter 2. For brevity, I refrain from duplicating the description of the method, which has already been discussed in Section 2.2.1. For the purpose of evaluating the capabilities of the GMM-FEM method in registering partial data, a partial surface dataset was created for each case by cropping the full surface 10 mm from the end points, using planes perpendicular to the main axis of the prostate gland. The choice of data to be discarded was motivated by the practical difficulties in accurate segmentation of the prostate at the apex and base [138].

3.2.4 Evaluation Setup

The two registration tools described above were applied to the MRI and TRUS datasets collected for prostate cancer patients at BWH. The author of this thesis was not involved in the evaluation setup. Identical parameters were used for each of the algorithms across the datasets used in the evaluation. Quantitative assessment was done based on the observed computation time and TRE. Computation time was measured for each of the processing steps. The accuracy of registration was evaluated using the TRE measured between corresponding landmarks identified by an interventional radiologist specializing in abdominal image-guided interventions, with over 10 years of experience in both MRI- and ultrasound-guided procedures. The landmarks were localized independently from the process of gland segmentation. The landmarks used in the evaluation included anatomical landmarks that
could be consistently identified in each patient (entry of the urethra at the base (coded as UB) and apex (coded as UA) of the prostate gland, and the verumontanum (coded as VM)) as well as patient-specific landmarks (calcifications or cysts). The landmarks were marked using a setup where both the MRI and the volume-reconstructed TRUS images were shown to the operator side by side in 3D Slicer, to facilitate consistent identification.

Normality testing was performed using the Shapiro-Wilk test; statistical comparisons were done using a paired t-test. Statistical analysis and plotting were performed using R version 3.0.1 (http://www.r-project.org/). Registration experiments were performed on a MacBook Pro laptop (early-2011 model, 2.3 GHz Intel Core i7, 8 GB RAM, SSD, OS X 10.9.5). C++ code was compiled using the XCode 6.1 clang-600.0.54 compiler in Release mode. Matlab version 2013b was used for the GMM-FEM registration tool.

3.3 Results

Evaluation was conducted using imaging data collected for 11 patients. Volumes of the segmented prostate gland for the cases used in the evaluation are shown in Table 3.1. The volume of the gland segmented in TRUS was typically smaller than the one in MRI; the difference exceeded 20% in 4 out of 11 cases.

Computation time was as follows. BWH registration pre-processing took on average 35 sec (range 31-51 sec), while registration (including resampling) took 40 sec (range 32-56 sec). Pre-processing for the GMM-FEM method was comparable and on average took 40 sec (range 23-64 sec). Average registration time for the GMM-FEM method was 93 sec (range 33-248 sec) when using the full surface data, and 60 sec (range 19-137 sec) when partial data was used. No statistically significant correlation was observed between the volume of the prostate gland segmentation and the registration time. A representative example of a registration result is shown in Fig. 3.1. Visualization of the displacement fields obtained with both methods for the same case is in Fig.
3.2.

Figure 3.1: Example of the registration result for case 10 using the BWH method (axial, sagittal and coronal views). The green outline corresponds to the smoothed surface of the segmented prostate gland in the US image (both rows). The top row shows views of the TRUS volume; the bottom row corresponds to the registered MRI volume for the same case. Annotations show examples of the landmarks used in the evaluation: urethra entry at base (red arrow) and apex (yellow arrow).

Table 3.1: Volumes of the prostate gland segmentation in MRI (VMR) and US (VUS). Large discrepancies were observed in a number of cases, which is attributed to the difficulties of accurately localizing the prostate apex and base in US. Percent difference is calculated as (VMR − VUS)/VMR × 100.

    Case ID   VMR, mL   VUS, mL   Percent difference, %
    9         33.1      30.7       7.1
    10        27.1      26.9       0.7
    12        27.6      23.2      15.9
    14        49.5      38.2      22.9
    16        18.8      14.0      25.4
    17        16.9      12.3      27.3
    18        28.2      28.9      -2.3
    19        24.4      22.3       8.8
    20        55.9      44.2      20.8
    21        17.8      16.4       8.0
    22        32.7      28.4      13.3

Figure 3.2: Visualization of the deformation field for case 10 using both the BWH (top row) and GMM-FEM (bottom row) methods (axial, sagittal and coronal views). The green outline corresponds to the MR surface before registration, and the purple outline is the intersection of the TRUS prostate surface with the image plane. Note that for the GMM-FEM method the deformation is restricted to the inside of the gland segmentation, while the BWH method produces a continuous smooth deformation field that extends beyond the prostate segmentation.

A total of 48 landmarks across all cases were identified for the purposes of TRE assessment. In the majority of the cases (6 out of 11), the landmarks corresponding to the UA and/or UB anatomical locations were outside the gland segmentation (see also Fig. 3.4, which shows landmarks located outside the gland segmentation). Only landmarks that were inside the gland in both the MRI and TRUS segmented volumes (a total of 37) were considered in
the quantitative evaluation, to ensure that the same set of landmarks is used in evaluating both methods. Among those landmarks, the mean initial TRE was 7.8 mm (range 1.7-15.3 mm). A detailed summary of the TRE statistics is shown in Table 3.2.

Table 3.2: Summary statistics (mm) of the initial TRE (Init.), the TRE following affine registration using the BWH (BWH-aff) and UBC (CPD-aff) methods, and following deformable registration using the BWH (BWH-bspline) and GMM-FEM methods with full (GMM-FEM-full) and partial (GMM-FEM-part) surface information. A significant reduction in TRE was observed as a result of affine registration; the deformable registration component did not produce improvements.

              Init.        BWH-aff     BWH-bspline
    Mean±SD   7.8±4        3.7±1.8     3.8±1.8
    Range     [1.7-15.4]   [0.5-7.3]   [1.1-7.8]

              CPD-aff      GMM-FEM-full   GMM-FEM-part
    Mean±SD   3.5±1.7      3.5±1.7        3.6±1.5
    Range     [0.3-7.1]    [0.5-7.3]      [0.7-6.6]

There was not sufficient evidence to reject the hypothesis of normality of the observed errors based on the Shapiro-Wilk test (p>0.05). Both GMM-FEM and BWH led to significantly smaller TREs as a result of the affine registration step (p<0.0001), leading to a mean residual TRE of about 3.5 mm (range 0.1-7.3 mm); the CPD-affine step refers to the "affine" version of [98]. The deformable component of the registration did not result in a statistically significant improvement of the mean TRE. Comparison of the registration results obtained using the BWH and full-surface GMM-FEM methods does show a statistically significant difference between them (p<0.05); however, the difference between the means was only 0.3 mm. A detailed summary of the GMM-FEM and BWH registration results for each landmark is shown in Fig. 3.4. No significant difference was observed between the TREs corresponding to the registration results obtained with the GMM-FEM method when comparing full and partial surface registration results. At the same time, we observed that visually the results can be noticeably different, as illustrated in Fig. 3.3.
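The statistical workflow reported above (a Shapiro-Wilk normality check licensing a paired t-test on per-landmark errors; the study itself used R) can be sketched as follows. The per-landmark TRE values below are simulated placeholders, not the study data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated per-landmark TREs (mm) for two methods on the same 37 landmarks.
tre_bwh = rng.normal(3.8, 1.8, 37).clip(min=0.1)
tre_gmm = rng.normal(3.5, 1.7, 37).clip(min=0.1)

# Shapiro-Wilk on the paired differences: p > 0.05 means normality is not
# rejected, which justifies using a paired t-test on the same differences.
w, p_norm = stats.shapiro(tre_bwh - tre_gmm)
t, p_paired = stats.ttest_rel(tre_bwh, tre_gmm)
print(f"normality p = {p_norm:.3f}, paired t-test p = {p_paired:.3f}")
```

The pairing matters: each landmark is measured under both methods, so the t-test is applied to within-landmark differences rather than to two independent samples.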
Figure 3.3: Example of the surface registration result using the GMM-FEM method. The surface of the prostate gland in TRUS used for registration is shown as a wireframe, and the registered surface is colored by the displacement magnitude. The registration result that used full surface information is in the left panel, whereas the partial surface registration result is on the right. The differences are most apparent at the apex (yellow arrow) and the base (red arrow) of the gland.

3.4 Discussion

In this chapter two software tools were presented that I believe are practical for MRI-TRUS fusion in clinical research concerned with prostate interventional applications. The results of this chapter are an independent validation of the GMM-FEM registration approach presented in Chapter 2. The implemented approaches both rely on the availability of the prostate gland segmentation, but are quite different in methodology and capabilities. The BWH approach has been implemented based on the easily accessible "off-the-shelf" components of 3D Slicer and ITK. The GMM-FEM approach has the benefit of utilizing a biomechanical model, which has the potential to lead to a more realistic displacement field and can handle partial surface information. However, its implementation required significantly more custom code development.

To the best of our knowledge, only two of the currently available commercial tools, Urostation (Koelis) and Artemis (Eigen), support elastic registration [85]. While both of these operate on the segmented prostate gland, neither uses a distance map representation or a biomechanical model for registration, nor is capable of handling partial surface data. Numerous MRI-TRUS approaches have been presented in the academic literature, but most are not accompanied by reliable implementations for testing. We are aware of only one publication that has an open-source implementation [95].

Figure 3.4: Summary of the TREs for the datasets used in the evaluation (cases 9-22). Each point corresponds to a single landmark ("UA" is urethra at apex, "UB" urethra at base, "VM" verumontanum, and "Other" corresponds to case-specific landmarks identified for calcifications or cysts), with the BWH method TRE plotted on the vertical axis and the GMM-FEM method TRE on the horizontal axis (using full surface data in the top panel, and partial data in the bottom panel). Red points correspond to the landmarks that were marked outside the gland segmentation. Note that the GMM-FEM TREs for the landmarks located outside the gland include only the affine registration component, since the deformation can only be estimated inside the tetrahedral mesh. For this reason, only those landmarks that were located inside the gland in both MRI and TRUS were considered in the quantitative evaluation.

A major innovation of our work is in streamlining the translation of the MRI-TRUS fusion capability into clinical research workflows. Possibly the closest work to ours in terms of developing an open-source translational system for prostate interventions is by Shah et al. [129]. Our work is complementary in that while Shah et al. investigate system integration, we focus solely on software registration tools.

Once the image data is collected and the prostate gland is segmented, all the processing steps for both methods can be completed without user interaction in under 5 minutes. The mean error observed by the BWH group is on the order of 3 mm, which is the slice thickness of the MRI data.
We also note that the BWH group did not attempt to quantify the error in localization of the anatomical landmarks, as they did not have the resources to conduct a multi-reader study. Such a study would require clinical experts who are familiar with both the MRI and TRUS appearance of the prostate. This expertise is rare at both the UBC and BWH institutes, since clinical reads of prostate MRI are done by the radiology department, while most of the TRUS-guided prostate procedures are done by either the radiation oncology or urology departments. Overall, we (UBC and BWH) believe our tools are suitable for prospective evaluation in the context of clinical research prostate biopsy applications.

The BWH group did not observe significant differences between the two approaches in terms of TRE that are of practical value. The evaluation was complicated by possible inconsistencies in the segmentation of the prostate gland, and by difficulties in the placement of some of the anatomical landmarks, which resulted in UA/UB points being located outside the prostate gland. Accurate and consistent segmentation of the prostate is challenging in TRUS, especially at the apex and base of the gland [97, 138]. Note that differences between the prostate volumes estimated from 3D TRUS and MR have been recognized earlier in a number of studies. The average TRUS/MR volume ratio that the BWH group observed was 0.87, which is similar to Smith et al. [138], who reported an average ratio of 0.9. While the BWH group cannot determine the sources of variability with absolute certainty, several factors could have contributed to the difference. First, TRUS images have poor contrast at the apex and base, potentially leading to under-segmentation of these areas. Second, the actual physical volume of the gland could be affected by compression of the prostate, to a different degree, by both the endorectal MR coil and the ultrasound probe.
Heijmink et al. observed an average reduction of 17% in prostate volume due to the use of an endorectal MR coil [56].

The BWH group adopted 3D Slicer for implementing the BWH approach presented here. 3D Slicer includes a variety of registration tools and integrates ITK, thus enabling reuse and sharing of the existing technology. The extensions framework of 3D Slicer allows new functionality to be contributed without the need to change the core of the application, so various MRI-TRUS specific registration algorithms can be contributed by interested groups. The PLUS toolkit [82] and OpenIGTLink [146] are tightly integrated with 3D Slicer, and thus collection of tracking and intra-procedural imaging data can be implemented for a variety of devices. This is supportive of our longer-term goal of providing an open-source solution in 3D Slicer for MRI-TRUS guided prostate interventions. The BWH group makes the registration tools available under a BSD-style open-source license, permitting unrestricted academic and commercial use.

The work presented in this chapter has several limitations. First, the signed distance map approach used in the BWH method is known to be susceptible to noise in the segmentation [143]. This can be overcome by using a combined distance-map and feature-based approach, such as the method by Tang et al. [143]. Image acquisition was done during prostate brachytherapy volume studies. More complex approaches based on electromagnetic or optical tracking would be required for freehand TRUS volumetric reconstruction. The use of data supplied by prostate volume studies could have potentially introduced a selection bias towards smaller prostate volumes. Further evaluation on a larger biopsy cohort is warranted.
The BWH group did not assess the consistency of landmark identification and prostate gland segmentation, and did not evaluate the sensitivity of the registration tools to variability in the segmentation or to the extent of missing data.

3.5 Conclusions

We (UBC and BWH) proposed open-source tools that can be used as a component of a system for MRI-TRUS fusion guided prostate interventions. The registration tools implement novel registration approaches and produce acceptable registration results, aiming to reduce the barriers in the development and deployment of interventional research solutions for prostate image-guided interventions, and to facilitate comparison with similar tools.

Chapter 4

Statistical Biomechanical Surface Registration

4.1 Introduction

Image guided interventions often require the registration of pre-operative images to intra-operative data. Pre-operative imaging, usually either CT or MRI, provides high-resolution information about the patient anatomy that is invaluable for assessing the location and extent of disease. Intra-operative imaging, on the other hand, must be fast and portable to be practical for guidance. The most popular choice is ultrasound, which is a low-cost, real-time and radiation-free modality, widely used for abdominal and urological interventions. Registration of pre-operative and intra-operative images is inherently challenging: the anatomy undergoes motion and deformation due to changes in body position and posture, and in response to external pressures such as those from an ultrasound probe. As a result, registration is needed to compensate. If the structure is highly rigid, the transform may involve only rotation and translation. For non-rigid deformations, a larger parameter space is required to account for changes in shape.

Intensity-based registration of multimodal images is an active field of research (e.g., [112, 139]). Techniques typically involve maximizing a similarity metric.
When comparing separate modalities, it is important to consider the varying image properties that result from the different imaging physics being exploited. One method of particular note attempts to account for this by defining a feature set, based on the denoising literature, called the modality independent neighborhood descriptor (MIND) [58]. This technique has been successfully applied to multi-modal prostate registration [141]. However, just as with all intensity-based approaches, MIND still requires a sufficient number of anatomical features to appear in both images [58]. This may be challenging when parts of the anatomy are only visible in one image. For example, in TRUS, the boundary of the prostate is not easily distinguished at its base, making it difficult to see where the prostate ends and the seminal vesicles begin.

If both pre-operative and intra-operative images are segmented during the clinical workflow, we can use these segmentations for a surface-based registration technique, avoiding the issue of finding intensity-based correlations. Instead, we rely on clinical expertise. The trade-off is that we then rely heavily on segmentation accuracy. Smith et al. [138] found that for prostate interventions, there was high variability in segmentations, even when they were completed by clinical experts. The primary causes of inconsistency were tissue ambiguities and lack of boundary visibility, forcing clinicians to rely on prior knowledge of prostate shape and symmetry. To address this, we have developed a technique that allows for ambiguous or missing data. We propose that the surfaces be segmented only where the anatomical boundary is clear. This leads to the need to register two or more partial surfaces.
The main contribution of this chapter is a technique to register two partial surfaces in an efficient and robust way.

4.1.1 Related Work

Surface-based registration of a source surface, Y, and a target surface, X, is often presented as the minimization of an objective function of the form

E(X, T(Y, Θ)) + R(Θ).  (4.1)

The first term, E(·), is a distance metric between the source and target surfaces. The source surface is transformed by a mapping, T(·), parameterized by a collection of inputs, Θ, that bring it closer to the target. The second term, R(·), represents regularization that seeks to constrain properties of the transform parameters. The goal is to find the optimal set of inputs, Θ. Regularization terms are often used to guarantee convergence. For non-rigid registration, they can also dictate the nature of the transformation by restricting the solution space.

The distance metric in surface-based techniques is typically formulated as a summation of distances between corresponding points. Thus, the metric and the correspondence scheme are intimately related. The iterative closest point (ICP) algorithm [9, 165], one of the most popular approaches, assigns correspondences based on proximity. However, ICP requires accurate initialization, and is sensitive to noise, outliers, and missing data. To address this, soft correspondences have been proposed based on the Gaussian mixture model (GMM) [21, 59, 67, 98, 153]. Rather than treating correspondences as binary assignments, the source is interpreted as a probability density function consisting of a mixture of Gaussians. Registration is then cast as a likelihood maximization problem: find the transform that maximizes the likelihood that the target points are drawn from the transformed source distribution. These methods are robust to missing points on the target, but they assume a complete source surface for defining the GMM.
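The contrast between ICP's hard assignments and GMM soft correspondences can be illustrated with a small NumPy sketch (the function name is ours, and the outlier component used later in this chapter is omitted here):

```python
import numpy as np

def soft_correspondences(source, target, sigma2):
    """GMM-style posteriors: each target point is softly assigned to
    every source centroid, instead of ICP's single nearest neighbour.
    Simplified sketch with no outlier component."""
    # Pairwise squared distances, shape (M, N).
    d2 = ((source[:, None, :] - target[None, :, :]) ** 2).sum(axis=-1)
    weights = np.exp(-0.5 * d2 / sigma2)
    return weights / weights.sum(axis=0, keepdims=True)  # columns sum to 1

rng = np.random.default_rng(1)
src = rng.normal(size=(5, 3))    # source surface points (GMM centroids)
tgt = rng.normal(size=(8, 3))    # (possibly partial) target surface
P = soft_correspondences(src, tgt, sigma2=1.0)
```

As `sigma2` shrinks, each column of `P` concentrates on the nearest centroid and the scheme degenerates to an ICP-like hard assignment.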
When both surfaces are only partially segmented, more informed priors are required to tackle the under-determined nature of the problem.

A standard approach to gathering prior knowledge of surface geometry is through statistical modeling of a population. A statistical shape model (SSM) [57] describes shape statistics based on a set of training data. It consists of a mean shape plus a combination of variational modes. Surface-based SSMs have been widely used in the literature (e.g., see [20, 28, 32, 117]). Originally, correspondences between points in the training data were assigned manually [26]. This is a tedious, time-consuming process, and subject to user variability. Automatic pair-wise construction techniques have since been proposed, where one shape is arbitrarily chosen as the mean, and is registered to all other shapes in the training set. Correspondences are then estimated based on proximity. Unfortunately, this approach is inherently biased towards the selected mean shape, which may reduce the generality of the resulting SSM. To remove this bias, group-wise registration techniques have been developed [3, 117], where both the mean shape and its pair-wise transformations to the training data are considered unknown. They are concurrently estimated by registering the entire group together at once.

Statistical shape models have been extended to multiple modalities. Chowdhury et al. [19] investigate the concurrent segmentation of the prostate in MR and CT using a linked SSM. This linked SSM has two separate mean shapes: one for MRI and one for CT. Fitting the SSM to a new MRI predicts the corresponding prostate boundary in CT due to the linkage. The implicit assumption is that changes in shape between modalities are predictable. This may hold in CT and MRI, as long as imaging conditions are consistent for all subjects.
Unfortunately, the assumption breaks down for TRUS, where the probe location, orientation, and pressure will inevitably vary across subjects. When dealing with a collection of observations of a shape, each with missing sections of data, it may not be clear what the full target shape should be; it is like piecing together a puzzle without having a complete reference image. A reference for the full 3D structure would naturally simplify the process. A surface-based SSM for the anatomy of interest could be used to find an intermediate shape to help piece multiple partial surfaces together.

By applying an approach similar to the group-wise registration framework of Rasoulian et al. [117], we find the single surface geometry generated by the SSM that most closely matches two incomplete observations. Once this intermediate shape is registered to both partial surfaces, we use it to construct a map between the two. In statistical shape modeling, the registration transform between a shape instance and the target is usually considered to be rigid (e.g., see [2, 8]). Its purpose is to compensate for differing coordinate systems. However, more complex transforms can be used. For instance, if the SSM does not capture a certain behavior, such as physical deformation, then a non-rigid transformation function might be desired (e.g., see [118]). In the literature, the combination of SSMs and non-rigid transformations is also used to generate synthetic data [13, 51].

Non-rigid transformation models can be separated into one of three groups: 1) geometric-inspired; 2) interpolation-inspired; and 3) knowledge-based models.

Geometric-inspired models provide a minimal set of conditions that are common when mapping together similar objects. Such conditions may include inverse consistency [22], topology preservation [61, 95, 161] and isometry [166].
To the best of our knowledge, the only geometric-inspired approach for prostate interventions is the method by Moradi et al. [95], which enforces a preservation of topology in establishing correspondences, and leverages finite element methods to model non-rigid deformations. However, they do not account for uncertainties in segmentation. As a result, they found that registration accuracy degraded in the apex area, where the prostate contour has poor visibility.

In interpolation-inspired models, there is an assumption of “smoothness” of the displacement field. Deformation is controlled by a set of control points or modes, and the full field is interpolated from these. Examples of important families of interpolation strategies include radial basis functions [21], free-form deformations [153] and Gaussian basis functions [98]. These methods have been used for statistical shape analysis [1, 117, 164] and prostate interventions [87, 88]. However, in most methods, only surface displacements are regularized during the registration process. Internal, volumetric deformations are then computed post-registration. This may result in unnatural changes in volume and shape, since the interior deformation field is not directly considered during the course of the registration.

Knowledge-based transformation models are typically used in scenarios involving a well-defined procedure or task. Knowledge regarding the deformation may be introduced in two ways: statistical analysis and biomechanical modeling. Similar to SSMs, statistical deformation models (SDMs)
capture statistical information about deformation fields across a population. For a new deformation instance, the registration strives to find a linear combination of SDM basis transforms that minimizes an objective function [60, 94]. The drawback of SDMs is that they require an additional training step, and their capture range is limited to previously observed deformations [60].

The main motivation behind biomechanical modeling is the premise that physically consistent models allow a more accurate and compact description of complex tissue behaviors. Finite element models (FEMs) can represent the deformable nature of soft tissue. FEM-based registration is typically driven using a combination of surface forces and boundary conditions [16, 42, 95, 102, 103, 125, 144]. A variation of ICP is generally used to compute the surface forces that drive the registration [42, 95, 125]. This method is susceptible to the drawbacks of local-search techniques. Rather than using ICP, we apply a probabilistic framework to estimate surface-to-surface correspondences, and forces are implicitly applied through an expectation-maximization framework.

4.1.2 Contributions

In this work, we aim to improve the accuracy of image-guided interventions (IGIs) by developing a non-rigid surface-based registration method, referred to as SSM-FEM, that considers both the geometric statistics and the biomechanical nature of the deformation between pre-operative and intra-operative acquisitions. The method is designed with the consideration that boundaries of the desired anatomy may not be visible in regions of one or both images. To compensate for difficulties in segmentation, we propose to segment each image only where the boundary can be reliably and precisely traced. This results in two partial segmentations, which are then concurrently fused with the help of an SSM through a group-wise approach (similar to [118]). Since the anatomy
Since the anatomymay exist in different biomechanical states in the two images, a finite elementmodel is used to define a deformation between the SSM and each observation.This adds flexibility to the SSM. We combine both the geometric (SSM) andbiomechanical (FEM) priors into a single framework using a probabilisticapproach.We validate our algorithm in the context of MR-TRUS fusion on 19patients who underwent prostate biopsies. By modifying it to remove ei-ther the geometric or biomechanical priors and contrasting the results, wedemonstrate the need for both the SSM and FEM when dealing with theregistration of deformable bodies with missing data.644.2. MethodFigure 4.1: Overview of the MR-TRUS fusion workflow. The pre-operative(MR) image is acquired before the procedure and is segmented to create asurface representation of the anatomy (e.g. the prostate). At the beginningof the procedure, an intra-operative (3D-TRUS) image is acquired and partsof the anatomy (mid-gland) are segmented. SSM-FEM maps both surfacesand their interior to a common intermediate SSM instance. Subsequently,the MRI is registered to the TRUS through this intermediate shape.4.2 Method4.2.1 Clinical WorkflowAn outline of the proposed MR-TRUS fusion is shown in Figure 4.1. TheMR image can be acquired any time prior to the biopsy. It is subsequentlycontoured in axial slices by tracing the boundary of the prostate only whereit can be reliably segmented, as determined by a clinician. We refer to thepartial MR segmentation as the MR observation.At the beginning of the biopsy procedure, the radiologist acquires asweep of the prostate gland using tracked 2D-TRUS. We used an axial ro-tation along the probe with a mechanical system [7] to acquire this sweep,however, other trackers/sweep protocols [158] should perform just as well.These tracked slices are then used to generate a 3D-TRUS volume [41].The prostate is subsequently segmented where its boundary can be reliablytraced. 
We refer to the partial TRUS segmentation as the US observation. We then perform the proposed SSM-FEM registration to map the MR image into the space of TRUS. Throughout the procedure, mono-modality 2D/3D TRUS registration is used to compensate for intra-procedure motions and deformations [136].

4.2.2 Partial Surface to Partial Surface Registration

We treat the registration problem as two coupled SSM to partial-surface registrations, which are solved concurrently using an expectation-maximization approach. The SSM is used to construct two probability density functions that define the boundary of the structure, one for solving the registration problem in each modality. This probability density function is transformed using two separate spatial transformations that map to each modality. For these transformations, we wish to find:

a. the single set of SSM modes that best fit both partial observations;
b. two non-rigid transformations (one for each modality) that map the SSM to each observation.

Note that the SSM modes are shared between observations, while the non-rigid transformations are not. The reason is that we assume the SSM captures inter-subject variability among patients. This implies that a single instance of the SSM represents the prostate “at rest”. The two non-rigid transformations are biomechanical deformations that map the SSM instance to the observations in each modality.

To construct the SSM, we use a group-wise GMM approach. This has been shown to provide a generally applicable, anatomically specific, and compact representation of surfaces [117]. Details of the training are provided in Section 4.3.1. The result is a linear SSM in which shapes can be generated by adding a combination of modes of variation to a mean shape. The linear weights, hereafter referred to as shape parameters, along with the modes of variation, define a single instance of the SSM. Parameters that control the rigid transformation of an instance are referred to as pose parameters.
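The linear SSM just described generates an instance as the mean shape plus a weighted combination of the modes of variation. A minimal NumPy sketch, with illustrative dimensions and random stand-ins for the trained mean and modes:

```python
import numpy as np

M, L = 100, 50                      # surface vertices, modes (illustrative)
rng = np.random.default_rng(2)

Z = rng.normal(size=(M, 3))         # mean shape, M x 3
Psi = rng.normal(size=(3 * M, L))   # modes of variation, 3M x L
b = 0.1 * rng.normal(size=L)        # shape parameters (linear weights)

# A single SSM instance: mean plus a linear combination of the modes.
Y = Z + (Psi @ b).reshape(M, 3)
```

Setting b = 0 recovers the mean shape; the pose parameters (rigid transform) are applied separately.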
During registration, instances of the SSM are used to create a finite element model of the prostate, which is used to compute and limit internal deformations. Note that to form an FEM given a surface, we need to generate a volumetric mesh, which often includes additional nodes interior to the volume. This can be done automatically with most meshing tools [131].

From an SSM instance, we create a single Gaussian mixture model to represent the anatomical boundary of interest. Vertices define the Gaussian centres, each with a common variance. We then construct a pair of composite transforms that deform the SSM surface to fit each of the observations. These involve a rigid component to account for a change in coordinates, and an FEM-based interpolation to govern prostate deformation. The key idea is that the surfaces in US and MR represent deformed observations of a single common shape. Hence, the solution to the SSM shape parameters depends on both. Since the two images are acquired at different times and under different contexts, any deformations are considered independent. We therefore compute two separate deformation fields, one for each modality. The shape parameters and deformation fields are combined in a single framework, where the goal is to minimize the net deformation and surface-to-surface errors.

4.2.3 Probabilistic SSM-FEM

SSM-FEM is based on an expectation-maximization framework. It is constrained by Tikhonov regularization of the shape parameters (to control the intermediate shape instance) and of the volumetric strain energy of a finite element model (to limit deformation). The algorithm involves fitting the SSM to all observations simultaneously, constructing a simple FEM of the prostate based on the SSM instance, then allowing the model to deform to fit each observation independently. There are no explicit boundary conditions applied to the FEM; instead, the boundary is allowed to move freely.
Registration is driven by implicit surface-to-surface forces that arise by minimizing a global objective function.

Following Equation (4.1), we can define a separate distance measure between the SSM and each independent observation. These consist of a modality-dependent surface error, E_md, and a regularization term, R_md, which both involve modality-specific transform parameters, Θ_md. By summing errors across modalities, we form a unified objective function:

E = Σ_{md ∈ {MR,US}} E_md(σ_md², Θ_md) + R_md(Θ_md).  (4.2)

The surface error, E_md, also depends on a Gaussian variance, σ_md², which parameterizes the Gaussian mixture models. This can be interpreted as controlling the capture radius of each point on the SSM when computing correspondences.

For each modality, a transformed version of the SSM is used to define the GMM representing the complete anatomical boundary. We wish to find the set of transform parameters, Θ_md, that maximizes the likelihood that the observed partial surface is drawn from that model. This motivates us to define the modality-dependent surface error using the negative log-likelihood
function:

E_md(σ_md², Θ_md) = −Σ_{n=1}^{N_md} log Σ_{m=1}^{M} P_md(z_m) P_md(x_n^md | T(z_m, Θ_md)),  (4.3)

where P_md(·) denotes the GMM probability density function, z_m is the location of the m-th vertex in the SSM mean, and T(z_m, Θ_md) represents the ideal transformed location. This transform involves:

• SSM modes for determining the intermediate shape
• FEM deformation to provide flexibility
• Rigid transform parameters for gross alignment

Table 4.1: Mathematical notations

Variables: scalars, vectors and matrices
  md ∈ {US, MR}      Modality
  N_md               Number of points per observation
  X_md               N_md × 3 matrix of points per observation
  M, L               Number of SSM nodes / modes of variation
  Z (M × 3)          SSM mean
  Y (M × 3)          SSM instance
  Ψ (3M × L)         SSM modes of variation
  b (L × 1)          SSM shape parameters
  R_md, t_md, s_md   Rigid transform parameters
  J                  Number of FEM nodes
  U_md (J × 3)       FEM nodal displacements
  V_md (M × 3)       Surface deformations
  Φ (M × J)          FEM interpolation matrix
  K (3J × 3J)        Stiffness matrix
  E                  Young's modulus
  ν                  Poisson's ratio
  P_md (M × N_md)    GMM posterior probabilities
  σ_md²              Variance of GMM Gaussian components
  w_md ∈ [0, 1]      Estimate of GMM outliers
  I (3 × 3)          Identity matrix
  Ĩᵀ (3 × 3M)        [I ··· I], of size 3 × 3M
  1                  Column vector of all ones

Operators
  diag(v)            Diagonal matrix of any vector v
  v⃗ (3N × 1)         Rasterization of any N × 3 matrix V
  P̃ = kron(P, I)     Kronecker product between two matrices

Following the notation defined in Table 4.1, we can write the net transform as

T(z_m, Θ_md) = s_md R_md [ (z_m + Ψ_m b) + v_m^md ] + t_md.  (4.4)

The term (z_m + Ψ_m b) represents the m-th instance vertex, given shape parameters b and the corresponding modes of variation in the columns of Ψ (with Ψ_m consisting of only the three rows of Ψ pertaining to vertex m). The shape instance is then deformed by adding the FEM displacement field, v_m^md. Finally, the rigid transform is applied, with rotation, translation, and scale given by {R_md, t_md, s_md}. It is important that the residual deformation between the shape instance and observation, v_m^md, is computed in the SSM coordinate space.
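The net transform of Eq. (4.4), with the FEM displacement field interpolated from nodal displacements, can be sketched as follows (the dimensions and simple block-form interpolation matrix are illustrative):

```python
import numpy as np

def net_transform(Z, Psi, b, Phi, U, s, R, t):
    """T(z_m) = s * R * (z_m + Psi_m b + Phi_m U) + t, for all vertices."""
    M = Z.shape[0]
    instance = Z + (Psi @ b).reshape(M, 3)  # SSM instance (shape prior)
    deformed = instance + Phi @ U           # add FEM surface displacements
    return s * (deformed @ R.T) + t         # rigid pose

M, L, J = 100, 50, 120
rng = np.random.default_rng(3)
Z = rng.normal(size=(M, 3))
Psi = rng.normal(size=(3 * M, L))
Phi = np.zeros((M, J))
Phi[:, :M] = np.eye(M)                      # surface nodes listed first

# Identity parameters leave the mean shape unchanged.
T = net_transform(Z, Psi, np.zeros(L), Phi, np.zeros((J, 3)),
                  s=1.0, R=np.eye(3), t=np.zeros(3))
```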
This reduces rotational artifacts when using a linear material model for the FEM. Since the SSM is only a surface model while the FEM is volumetric, we need an interpolation matrix, Φ, to relate surface displacements to FEM node displacements:

v_m^md = Φ_m U_md,  (4.5)

where U_md is the J × 3 matrix of all FEM node displacements, and Φ_m the m-th row of Φ. In our implementation, all points on the surface correspond to FE nodes, and internal nodes are appended to the end of the node list. As a result, the interpolation matrix has the form

Φ = [ I_{M×M}  0 ]
    [ 0        0 ].

In the general case, Φ is a sparse matrix which can be computed directly from the FE mesh via its shape functions (e.g., see [11]). With this substitution, we can expand Equation (4.4) as

T(z_m, Θ_md) = s_md R_md (z_m + Ψ_m b + Φ_m U_md) + t_md.  (4.6)

This transform completes the definition of the negative log-likelihood objective functions in Equation (4.3). For each modality, we have Θ_md = {R_md, t_md, s_md, U_md, b}, noting that the shape parameters b are shared. The set of unknowns {σ_md², Θ_md} is solved for using an EM algorithm. An initial GMM variance is estimated directly from the data as in Myronenko and Song [98]:

σ_md² = (1 / (3 M N_md)) Σ_{n=1}^{N_md} Σ_{m=1}^{M} ||x_n^md − z_m||².  (4.7)

In the expectation step, we compute how likely it is that an observation corresponds to a GMM centroid by calculating the posterior probability

P_md(z_m | x_n^md) = exp(−||x_n^md − T(z_m, Θ_md)||² / (2σ_md²)) / [ Σ_{j=1}^{M} exp(−||x_n^md − T(z_j, Θ_md)||² / (2σ_md²)) + c ],  (4.8)

where

c = (2π σ_md²)^{3/2} (w_md / (1 − w_md)) (M / N_md)  (4.9)

is the contribution of an additional uniform distribution to account for noise and outliers. The scalar weight w ∈ [0, 1) controls the probability that a given point is classified as an outlier, and is set to zero if the observations do not exhibit any noise or outliers. A value of one would imply that no correspondence between observed points and GMM centroids can be made.
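The E-step posterior of Eqs. (4.8)-(4.9) can be written compactly in NumPy; `TZ` below stands for the transformed SSM vertices T(z_m, Θ_md):

```python
import numpy as np

def posteriors(X, TZ, sigma2, w):
    """Posterior P(z_m | x_n) of Eq. (4.8), with the uniform outlier
    component of Eq. (4.9) weighted by w in [0, 1)."""
    M, N = TZ.shape[0], X.shape[0]
    d2 = ((TZ[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)  # (M, N)
    num = np.exp(-0.5 * d2 / sigma2)
    c = (2.0 * np.pi * sigma2) ** 1.5 * (w / (1.0 - w)) * M / N
    return num / (num.sum(axis=0, keepdims=True) + c)

rng = np.random.default_rng(4)
TZ = rng.normal(size=(6, 3))    # transformed SSM vertices
X = rng.normal(size=(10, 3))    # observed (partial) surface points
P = posteriors(X, TZ, sigma2=1.0, w=0.1)
```

With w = 0 the outlier constant vanishes and each column of `P` sums to one; with w > 0 some posterior mass is absorbed by the uniform component.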
Ignoring terms independent of σ_md² and Θ_md, we can rewrite the negative log-likelihood function of Equation (4.3) as

E_md(σ_md², Θ_md) = (3 N_P^md / 2) log(σ_md²) + (1 / (2σ_md²)) Σ_{m,n=1}^{M,N_md} P_md(z_m | x_n^md) ||x_n^md − T(z_m, Θ_md)||²,  (4.10)

where N_P^md = Σ_{m,n=1}^{M,N_md} P_md(z_m | x_n^md). To constrain the SSM shape parameters and the FE nodal displacements, we add two Tikhonov regularizers:

R_md(Θ_md) = (µ/4) bᵀ Λ b + (β/2) u⃗_mdᵀ K u⃗_md.  (4.11)

This adds two free parameters: µ and β. The first, µ, limits the shape parameters, with Λ the diagonal matrix of inverted SSM eigenvalues. This term is scaled by 1/4 since it is split equally between the two observations. For registering to n surfaces, the fraction would be 1/(2n). The parameter β controls the trade-off between tightness of the fit and biomechanical regularization. This term represents the total linearized strain energy of the FEM [11]. It involves the stiffness matrix, K, and a rasterized vector of FEM nodal displacements, u⃗. The stiffness matrix is a large sparse matrix that can be systematically constructed based on the FEM mesh and the properties of the material. For simplicity, we assume a linear stress-strain relationship. More complex material models can be applied, but the linearized strain energy will then include an additional forcing term dependent on the current deformation state. In the current implementation, whenever the SSM is updated, we remesh the FEM to ensure a high mesh quality.

To perform the registration, we need to minimize the total objective function in Equation (4.2) using the distance metric and regularization terms defined in Equations (4.10) and (4.11). Unfortunately, due to the coupling between the rigid transforms, the FEM deformation, and the shape parameters, finding the optimal solution to this objective function directly is non-trivial. Instead, we propose a simple alternating minimization strategy.
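One building block of this alternating scheme, the pose update, reduces to a weighted orthogonal Procrustes problem. A minimal sketch (Umeyama-style closed form, with uniform weights standing in for the GMM posteriors):

```python
import numpy as np

def weighted_procrustes(src, tgt, wts):
    """Similarity transform (R, t, s) minimising
    sum_i w_i * || tgt_i - (s * R @ src_i + t) ||^2."""
    w = wts / wts.sum()
    mu_s = (w[:, None] * src).sum(axis=0)
    mu_t = (w[:, None] * tgt).sum(axis=0)
    A = (tgt - mu_t).T @ (w[:, None] * (src - mu_s))   # weighted covariance
    U, S, Vt = np.linalg.svd(A)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # no reflections
    R = U @ D @ Vt
    s = (S * np.diag(D)).sum() / (w * ((src - mu_s) ** 2).sum(axis=1)).sum()
    t = mu_t - s * R @ mu_s
    return R, t, s

# Recover a known similarity transform from noiselessly paired points.
rng = np.random.default_rng(5)
src = rng.normal(size=(20, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1                      # make Q a proper rotation
tgt = 1.3 * src @ Q.T + np.array([1.0, 2.0, 3.0])
R, t, s = weighted_procrustes(src, tgt, np.ones(20))
```

In the actual M-step the weights come from the posteriors of Eq. (4.8) rather than being uniform.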
First, we assume that the FEM deformation and SSM shape parameters are fixed, and solve for the pose parameters of each observation: {R_md, t_md, s_md}. Since the transformation is independent between observations, this results in two rigid registration problems between pairs of surfaces. These are trivially solved by minimizing Equation (4.10) for each modality, which can be interpreted as an orthogonal Procrustes problem. This aligns the current best estimate of each deformed model to its corresponding observation.

The next step is to update the intermediate SSM shape instance. This requires finding the optimal shape parameters, b, to fit both observations. The solution given a single observation is outlined in [117]. By extension (see Appendix A), the optimal shape parameters given two observations are the solution to the following set of linear equations:

Γ_SSM b = Υ_SSM,  (4.12)

with

Γ_SSM = µΛ + Σ_md (s_md² / σ_md²) Ψᵀ diag(P̃_md 1) Ψ,  (4.13)

Υ_SSM = Σ_md (s_md / σ_md²) [ Ψᵀ R̃_mdᵀ P̃_md x⃗_md − Ψᵀ diag(P̃_md 1) (s_md (z⃗ + v⃗_md) + Ĩ R_mdᵀ t_md) ].  (4.14)

This is a dense system that can be solved with numeric complexity O(L³), where L is the number of SSM modes. Fortunately, we use a small number of modes (i.e., 50), so b can be computed quite quickly. The resulting shape instance can be thought of as a common “starting shape”, from which we will determine further biomechanical deformation for each observation.

Finally, we compute the FEM nodal displacements, keeping the pose and shape parameters fixed. A single finite element model is automatically generated from the SSM shape instance. For this, we use TetGen [131], and assume a linear material model. Given the FEM, displacements can be solved independently for each input modality. Minimizing Equation (4.2) with respect to the rasterized vector u⃗_md yields the following system:

Γ_FEM u⃗_md = Υ_FEM,  (4.15)

where

Γ_FEM = β σ_md² K + s_md² Φ̃ᵀ diag(P̃_md 1) Φ̃,  (4.16)

Υ_FEM = s_md Φ̃ᵀ R̃_mdᵀ P̃_md x⃗_md − Φ̃ᵀ diag(P̃_md 1) (s_md² y⃗ + s_md Ĩ R_mdᵀ t_md).

Full details of the derivation are provided in Appendix B.
This is a sparse linear system that can be solved efficiently, with complexity between \(O(J^{1.5})\) and \(O(J^{2})\), where \(J\) is the number of FE nodes. The nodal displacements, \(\vec{u}_{md}\), can be easily computed using any sparse linear solver. At the end of this step, we have updated all unknown parameters required by the transformation function (4.6).

With the transformation from the SSM to each observation known, we recalculate the estimated variances:

\[
\sigma_{md}^2 = \frac{1}{3N_P^{md}} \sum_{m,n=1}^{M,N_{md}} P_{md}(\mathbf{z}_m \mid \mathbf{x}_n^{md}) \left\lVert \mathbf{x}_n^{md} - \mathcal{T}(\mathbf{z}_m,\Theta_{md}) \right\rVert^2. \tag{4.17}
\]

With this new variance estimate, we update the GMM probability distributions (Equation (4.8)), and repeat the process.

The registration algorithm iterates between the expectation step (updating the GMM distributions) and the maximization step (updating the transform parameters) until the variances drop below a certain threshold. The complete algorithm for the proposed SSM-FEM registration is summarized in Algorithm 1.

Algorithm 1: SSM-FEM registration
  Require: E, ν, µ, β, Z, Ψ, X_md and w_md, where md ∈ {MR, US}
  Initialize: b, s_md, R_md, t_md, u_md and σ²_md
  while not converged do
    E-step:
      for md ∈ {MR, US} do
        Update P_md using Equation (4.8)
      end
    M-step:
      for md ∈ {MR, US} do
        Rigid registration between X_md and Y + Φ U_md
        Update pose: s_md, R_md and t_md
      end
      Shape registration using Equation (4.12)
      Update b, which updates Y
      for md ∈ {MR, US} do
        Biomechanical registration using Equation (4.15)
        Update U_md
      end
      for md ∈ {MR, US} do
        Update σ²_md using Equation (4.17)
      end
  end

Once the SSM-FEM registration has converged, we can propagate the FEM using the transform parameters \(\{\Theta_{md}\}\) into the spaces of the TRUS and MR images. This enables us to express voxels in either modality using the natural coordinates of the FEM (similar to barycentric coordinates). Thus, for any voxel inside the prostate in one modality, it is possible to find its corresponding voxel in the other.

For material properties, we apply a homogeneous elastic material with a constant Young's modulus to all elements.
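In practice, the system matrix of Equation (4.15) is assembled from the sparse stiffness and interpolation matrices and handed to an off-the-shelf sparse solver. The sketch below uses SciPy's direct solver; the names, shapes, and the direct-solver choice are illustrative, not taken from the thesis implementation:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_fem_displacements(K, Phi, p, rhs, beta, sigma2, s):
    """Solve (beta*sigma2*K + s^2 * Phi^T diag(p) Phi) u = rhs, mirroring the
    structure of Equation (4.15).

    K   : sparse stiffness matrix (3J x 3J)
    Phi : sparse interpolation matrix from FE nodes to surface vertices
    p   : per-vertex soft-correspondence weights
    rhs : assembled right-hand side (Upsilon_FEM)
    """
    A = beta * sigma2 * K + (s ** 2) * (Phi.T @ sp.diags(p) @ Phi)
    return spla.spsolve(A.tocsc(), rhs)  # any sparse solver would do here
```

Because the system matrix is symmetric positive definite for β > 0, an iterative solver such as conjugate gradients would be an equally valid (and for large J, faster) choice.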
The models used in this study are composed of approximately 7500 elements. Throughout our experiments, we used a stopping condition of \(\sigma^2 \le 10^{-4}\) mm², a Young's modulus of E = 5 kPa, which is in the range of values reported in [72] for the prostate, and a Poisson's ratio of ν = 0.49 to maintain near incompressibility.

In total, there are five free parameters in the proposed method: µ, β, w, and the two FEM-specific material parameters E and ν. The first two parameters are tunable. The first, µ, controls the impact of the SSM modes. If set too high, the SSM will be restricted to the mean shape. If too low, the SSM may produce unlikely instances. For our model, a value of µ = 400 allowed reasonable variations from the mean. The second parameter, β, controls the balance between the implicit surface-to-surface forces and the internal resistance provided by the FEM. It should be tuned to allow for "reasonable" flexibility of the model. We found β = 10 to be sufficient for this prostate application. This parameter can be viewed as scaling the external forces acting on the prostate to fall within a reasonable range. The fraction of outliers, w, is a property of the quality of the data. For high-quality data, we set w = 0. If over-fitting to extraneous points is observed, then w should be increased, but should remain low. For linear materials, the Young's modulus, E, can be factored out of the stiffness matrix in Equation (4.11), which allows it to be combined with β. Thus, the actual value of the Young's modulus has no impact on the registration as long as β is scaled appropriately. However, we still recommend keeping a physically reasonable value for consistency. Finally, the Poisson's ratio ν controls the Poisson effect in the volume: how much the material expands in the perpendicular directions when compressed along one dimension. For incompressible materials, ν ≈ 0.5, although this exact value results in a singularity.
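The claim above that E can be folded into β is easy to verify numerically: for a linear material the stiffness matrix scales linearly with the Young's modulus, K = E·K₀, so only the product β·E enters the system matrix. A toy demonstration with illustrative matrices (not a real FEM assembly):

```python
import numpy as np

def displacement(beta, E, K0, M, rhs, sigma2=1.0):
    """Solve (beta * sigma2 * E * K0 + M) u = rhs.

    K0 stands in for the stiffness matrix assembled with unit Young's modulus,
    M for the data term; both are illustrative, dense stand-ins.
    """
    return np.linalg.solve(beta * sigma2 * E * K0 + M, rhs)
```

Since only β·E matters, e.g. (β = 10, E = 5) and (β = 50, E = 1) yield identical displacements, which is why β alone needs tuning once E is fixed.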
Most soft tissues, including the prostate, are considered nearly incompressible, so we set ν = 0.49 [72].

4.2.4 Registration Methods Used for Comparison

The proposed SSM-FEM registration method consists of two priors: 1) geometric (SSM); and 2) biomechanical (FEM). To highlight the importance of each component, we remove each piece of a priori information in turn and compare the results.

Keeping only the geometric prior, we implicitly assume that the SSM, TRUS and MR surfaces are in the same biomechanical state. The concurrent registration of a SSM to both TRUS and MR surfaces without the biomechanical component is equivalent to a group-wise rigid SSM registration. Henceforth, we refer to this approach as SSM registration.

If a full surface of the prostate is available, e.g. through a full segmentation of the MRI, then there is no need for the geometric prior. In this case, we can treat the fully segmented MR surface as the source shape instance rather than relying on the SSM. We solve only for the coordinate transform and biomechanical deformations between the source and target surfaces using Equation (4.15). We refer to this approach as GMM-FEM registration (Chapter 2), which is only presented here for continuity. The displacements in this case can be determined by solving the linear system:

\[
\Gamma\, \vec{u} = \Upsilon, \tag{4.18}
\]

where \(P\) and \(\sigma^2\) describe a GMM that corresponds to the fully segmented source surface, and

\[
\Gamma = \beta\sigma^2 K + s^2\, \tilde{\Phi}^T \operatorname{diag}(\tilde{P}\mathbf{1})\, \tilde{\Phi}, \tag{4.19}
\]

\[
\Upsilon = s\, \tilde{\Phi}^T \tilde{R}^T \tilde{P}\, \vec{x}_{\mathrm{target}} - \tilde{\Phi}^T \operatorname{diag}(\tilde{P}\mathbf{1}) \left( s^2\, \vec{y}_{\mathrm{source}} + s \tilde{I} R^T t \right).
\]

GMM-FEM registration is performed exactly as SSM-FEM registration, but with the fully segmented MR surface taking the role of the statistical shape model. In the E-step, using Equation (4.8), we compute the posterior probabilities between the source and target surfaces.
The M-step is substituted by Equation (4.18) for updating the FEM nodal displacements.

4.3 Experiments and Results

We evaluate the proposed registration method on MR-TRUS image pairs acquired from patients who underwent a prostate intervention. The data acquisition protocol was approved by our respective institutional ethics boards, and all patients provided written consent to be included in the study. The rest of this section is divided into five subsections. In Section 4.3.1, we discuss the training population used for SSM construction, as well as the biopsy data acquisition, segmentation protocol, and initialization. We then validate the accuracy of our registration in Section 4.3.2, by measuring the target registration error between pairs of intrinsic fiducials in the prostate. To highlight the importance of the SSM, we compare the outcome of the combined SSM-FEM registration with similar registrations that exclude either the SSM or the FEM component. Finally, we investigate the sensitivity of the method to errors in segmentation (Section 4.3.3), to missing data (Section 4.3.4), and to variations in the two tunable parameters controlling the regularization (Section 4.3.5).

4.3.1 Data

SSM Construction

To construct a statistical model that represents the inter-subject variation of prostate shapes, we require a large set of training data. A suitable source is a large population of prostate images acquired in a single modality. The choice of modality is important, since the technique may affect the prostate shape (e.g., forces from end-firing probes or endorectal coils).

Figure 4.2: Graphical representation of the prostate shapes described by the SSM after varying the weights corresponding to the first three principal modes of variation by three standard deviations (3√λ). The mean shape is shown as a green model in the middle column. The amount of variation in the left and right columns is color coded for each mode.
Ideally, the prostates should all be in the same mechanical state, so that differences in appearance are mostly due to inter-subject variation. Finally, for surface-based SSMs, the prostate needs to be accurately and reliably segmented.

Brachytherapy volumes are routinely acquired and segmented in large clinics and interventional centers. We used a dataset of images acquired from 290 brachytherapy patients in the construction of our SSM. Each TRUS volume consists of 7 to 14 parallel, equally spaced (5 mm apart) axial B-mode images of the prostate, captured using a side-firing transrectal probe. The in-plane resolution of these images was 0.16 × 0.16 mm. For each B-mode image, the prostate gland was delineated using Variseed (Varian Medical Systems, Palo Alto, CA, USA) and a contouring plug-in. This plug-in is based on the semi-automatic prostate segmentation method presented by Mahdavi et al. [86]. Contours were manually corrected by an expert clinician, and subsequently used as the SSM training population.

Figure 4.3: Two brachytherapy prostates from the SSM training set. Slices from each patient are colored black and blue, respectively. Even though each patient is sampled coarsely (5.0 mm out-of-plane resolution), the set of both prostates is a finer sampling of the mean shape than each prostate alone.

For the construction of the SSM, we used the group-wise GMM method of Rasoulian et al. [117]. This resulted in a SSM with a mean shape consisting of 1000 vertices and 1996 faces. The mean and primary modes of variation are depicted in Figure 4.2. As seen in the figure, the first two modes control the scale, whereas the third mode controls the curvature of the prostate. The choice of the number of modes is governed by the compactness of the SSM. Compactness simply measures the cumulative variance of the model as modes are added in descending order of their eigenvalues [57]. A popular rule is to keep the first L modes that span 95% of the total variation.
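This rule is easy to apply once the SSM eigenvalues are known. A small sketch (illustrative; assumes the eigenvalues are already sorted in descending order):

```python
import numpy as np

def modes_for_compactness(eigenvalues, fraction=0.95):
    """Number of leading modes needed to span `fraction` of the total variance.

    `eigenvalues` are SSM eigenvalues sorted in descending order; this mirrors
    the compactness criterion used to choose the number of retained modes.
    """
    lam = np.asarray(eigenvalues, dtype=float)
    cum = np.cumsum(lam) / lam.sum()          # cumulative fraction of variance
    return int(np.searchsorted(cum, fraction) + 1)
```

For rapidly decaying spectra, only a handful of modes are needed; a flat spectrum forces nearly all modes to be kept.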
For our SSM, this results in a model with 50 principal modes. After this point, each additional mode contributes negligibly to the cumulative variation (< 0.1%).

Although the slice thickness is relatively large, 5.0 mm, we are able to create a higher-resolution SSM along the base-apex axis due to the varying slice positions among subjects. Since the slice locations vary randomly between patients, the training data represent a fine statistical sampling of the prostate anatomy. This is demonstrated graphically in Figure 4.3. Two segmented brachytherapy prostates, from different patients in the SSM training set, are represented by blue and black slices. Even though each individual prostate is coarsely sampled, under the assumption that the prostate is smooth, the combination allows for a finer out-of-plane resolution than the individual ensembles.

Note that this method of SSM construction implicitly assumes that the mean shape is a GMM and that the training examples are spatially transformed observations of this GMM. However, during SSM construction, the GMM is decoupled when the mean shape is updated, i.e., the mean shape is simply the average of the back-projected training examples under a spatial smoothness constraint. Since the focus of this thesis is not the construction of SSMs, we will not tackle this issue. However, it would be interesting to see how the results vary if the GMM and the mean shape are updated simultaneously.

Prostate Biopsy

Validation data were acquired from 19 patients scheduled for a prostate biopsy. In Figure 4.4, a typical example of an MR and TRUS image pair is shown. The T2-weighted MR images were acquired using a 3 Tesla GE Excite HD MRI system (Milwaukee, WI, USA) with a spacing of 0.27 × 0.27 × 2.2 mm. Slices from the base, mid-gland and apex are shown in Figures 4.4a, 4.4b and 4.4c, respectively.
The TRUS images were acquired using a 3D-TRUS mechanical biopsy system [7] with a Philips HDI-5000 US machine and a C9-5 transducer, using an axial rotation of the biopsy probe. The TRUS images were reconstructed into a 3D volume with a spacing of 0.19 × 0.19 × 0.19 mm. Figures 4.4d, 4.4e and 4.4f show slices from the base, mid-gland and apex, respectively. In each modality, areas around the mid-gland (e.g., Figures 4.4b and 4.4e) were segmented where the prostate boundary could be reliably and accurately traced, as determined by an expert clinician. We refer to these contours as a partial segmentation. Additionally, we segmented regions of the prostate where the boundary was not clearly visible, based on symmetries and prior knowledge of the general prostate shape. White arrows in Figure 4.4 indicate examples of these uncertain regions. These added contours are referred to as uncertain data. The combination of both uncertain and partial data makes up the full segmentations. Example segmentations of MR and TRUS volumes are shown in Figures 4.4g and 4.4h, respectively. For the data used in this study, the partial segmentations of MR represent approximately 80% of the full surface, and the partial segmentations of TRUS approximately 70%. All segmentations were performed manually using Stradwin (Cambridge University, UK).

Figure 4.4: Axial slices from MR (top row: (a) base, (b) mid-gland, (c) apex) and TRUS (middle row: (d) base, (e) mid-gland, (f) apex) volumes. Typically, the prostate boundary can be accurately and reliably segmented in the mid-gland, i.e., (b) and (e). White arrows highlight regions where the true prostate boundary is ambiguous. (g) and (h) depict the resulting segmentations of MR and TRUS, respectively. Partial segmentations are color coded in green; uncertain contours are in red.

Note that even though the prostate boundary is typically more visible in MR than in TRUS, it is still prone to segmentation error [138].
For example, since axial MR slices are typically very thick, it may be ambiguous where the prostate ends and the bladder begins, as the prostate may terminate between slices. Another source of MR segmentation error, specifically at the base, is the potential inclusion of the seminal vesicles, as this varies between experts [138].

Figure 4.5: An example of SSM-FEM registration with w = 0.0, µ = 400, β = 10.0, E = 5.0 kPa and ν = 0.49 to full data (TRUS on the left, MR on the right). Following center-of-mass initialization (a), the SSM mean is evolved to the target surfaces (b). The target surfaces are shown in red, the current SSM instance in green, and the result of SSM-FEM registration in blue.

4.3.2 Registration of Full and Partial Surfaces

We used the biopsy data to validate our registration pipeline using both the pairs of full surfaces and the pairs of partial surfaces. To initialize the translation, we use a center-of-mass alignment between the SSM mean and the target surfaces, as seen in Figures 4.5a and 4.6a. For the initial rotation, we simply matched the right-anterior-superior coordinates of the SSM mean and the target surfaces. Within a range of [−10.0, +10.0] mm/degrees, we did not observe sensitivity to initialization for SSM, GMM-FEM or SSM-FEM registration in 100 trials.

Figure 4.6: An example of SSM-FEM registration with w = 0.2, µ = 400, β = 10.0, E = 5.0 kPa and ν = 0.49 to partial data (TRUS on the left, MR on the right). Following center-of-mass initialization (a), the SSM mean is evolved to the target surfaces (b). The target surfaces are shown in red, the current SSM instance in green, and the result of SSM-FEM registration in blue.

Following initialization, we apply the SSM-FEM algorithm until the registration converges. A typical result for full MR-TRUS surfaces is shown in Figure 4.5b.
The Tikhonov regularization weights were tuned to allow sufficient flexibility while still resulting in plausible prostate shapes. This resulted in a choice of µ = 400 and β = 10. Since the quality of the data was quite high and the surfaces were sufficiently smooth, we set the estimate of outliers, w_md, to zero. For a fair comparison, we used the same parameters for the SSM and GMM-FEM registration methods as well.

The same experiment was applied to the partial MR and TRUS surfaces. A typical SSM-FEM registration result for a partial MR-TRUS surface pair is shown in Figure 4.6. We used the same Tikhonov regularization weights as for the full data. However, since partial surfaces exhibit missing points, we set the estimate of noise and outliers to w_md = 0.2. As demonstrated by Myronenko and Song [98], this helps avoid falling into local minima due to poor initialization. The same parameters were used for the SSM and GMM-FEM registration methods.

Figure 4.7: Examples of fiducial pairs in MR (left column) and TRUS (middle column) images. The composite image following SSM-FEM registration is shown in the right column. The segmented prostate boundary for MR and TRUS is shown in blue and red, respectively.

Quantitative Validation

To quantify the registration results, we asked an expert radiologist to select five intrinsic fiducials per patient, consisting of cysts and benign prostatic hyperplasia. While we did not follow a strict protocol regarding the precise anatomical location of these landmarks, we did ask our clinical collaborator to only draw samples interior to the prostate. This resulted in a total of 93 landmarks from the cohort of 19 patients. The spatial distribution of these landmarks was: 64.5% in the mid-gland, 21.5% in the base and 14% in the apex. An example of corresponding fiducial pairs in MR and TRUS, as well as the fused MR-TRUS image following SSM-FEM registration, is shown in Figure 4.7.
The L2 distance between these fiducial pairs was used to quantify the TRE. The fiducial localization error (FLE) is approximately 0.21 mm for TRUS and 0.18 mm for MR, as reported in a previous study [71, 141]. This suggests that the FLE is not likely to dominate.

Table 4.2: TRE for the registration algorithms. The p-values reflect Wilcoxon signed-rank tests compared to initialization. Partial data represent ≈70% of the TRUS and ≈80% of the MR surfaces, respectively.

                     TRE (mm)                       p-value
  Method       Full            Partial          Full      Partial
  Initial      4.86 ± 1.43     4.53 ± 1.35      NA        NA
  SSM          3.86 ± 1.26     3.95 ± 1.26      10^-2     0.30
  GMM-FEM      2.72 ± 1.15     4.89 ± 1.46      10^-3     0.27
  SSM-FEM      2.35 ± 0.81     2.81 ± 0.66      10^-4     10^-4

The mean and standard deviation of the TREs, as well as the corresponding p-values when compared to initialization, are shown in Table 4.2. To investigate statistical significance, we first check whether or not the TRE distributions are normally distributed. Using a one-sample Kolmogorov-Smirnov test, the TREs were found not to be normally distributed at the 95% significance level (p ≤ 10^-4). As a result, we performed a set of Wilcoxon signed-rank tests, which are robust to deviations from normality. The null hypothesis is that the TREs of the initialization and of the subsequent registration method share a common median at the 95% significance level.

For full surfaces, all three methods (SSM, GMM-FEM and SSM-FEM) significantly decrease the TRE compared to the initial rigid registration. The mean TRE improvement for the SSM, GMM-FEM and SSM-FEM registration methods is 1.00 mm, 2.14 mm and 2.51 mm, respectively. The mean TRE for SSM-FEM registration is 1.51 mm lower than for SSM registration alone. This suggests that the transformation between the MR and TRUS surfaces is not rigid: substantial deformations exist. For partial surfaces, SSM led to a decrease in mean TRE by 0.58 mm, and GMM-FEM to an increase by 0.36 mm. However, these differences are deemed not significant at the 95% level. The 1.72 mm decrease in TRE of the proposed SSM-FEM, however,
The 1.72 mm decrease in TRE of the proposed SSM-FEM, however,is significant at the 95% level (p < 10−4).Table 4.3 shows the TRE following SSM-FEM registration for fiducials844.3. Experiments and ResultsTable 4.3: TRE for SSM-FEM registration at base, mid-gland and apex forfull and partial data. Partial data represents ≈70% of the TRUS and ≈80%of the MR surfaces, respectively.Base Mid-gland ApexFull 2.87± 0.75 1.95± 0.61 2.53± 0.91Partial 3.12± 0.65 2.21± 0.45 3.05± 0.76Table 4.4: Wilcoxon signed-rank tests between methods. Partial data repre-sents ≈70% of the TRUS and ≈80% of the MR surfaces, respectively.Test p-valueFull PartialGMM-FEM vs. SSM 10−4 10−3SSM-FEM vs. SSM 10−6 10−5SSM-FEM vs. GMM-FEM 0.45 10−4at the base, mid-gland and apex, separately. Errors in the mid-gland areconsistently lower compared to those in the base and apex for both full andpartial data. The reason for this is likely that prostate segmentations here aremore reliable compared to the base and apex, leading to fewer assumptionswhen reconstructing the deformation field.To compare the methods, we performed additional Wilcoxon signed-ranktests on the paired TREs, the p-values of which are reported in Table 4.4.From these tests, we see that GMM-FEM significantly outperforms SSMalone when the full segmentations are available (p < 10−4), but SSM is thebetter of the two when only partial segmentations are available (p < 10−4).Upon inspection, we observed that the increased error in GMM-FEM ismainly due to a mis-assignment of the base to apex regions (i.e. flipping).The SSM component is able to mitigate this, providing enough additionalinformation to prevent flips caused by a lack of corresponding surface fea-tures. Comparing SSM to SSM-FEM, we see that the flexibility added by thefinite element model leads to significant reductions in TRE for both full andpartial surfaces (p < 10−5). 
Finally, comparing SSM-FEM to GMM-FEM, we find that the difference in TRE is not significant at the 95% level when full surfaces are available. However, when applied to partial segmentations, SSM-FEM does significantly outperform GMM-FEM (p < 10^-4).

Lastly, we investigate the impact on the TREs of using full versus partial segmentations. For GMM-FEM, there is a significant increase in TRE of 2.1 mm (p < 10^-3) when applied to partial TRUS data. The soft correspondences alone are not sufficient for handling uncertainties in the segmentations. For SSM, the difference in TRE between full and partial surfaces is minor (0.39 mm, p = 0.25). Similarly, for SSM-FEM, there is no statistically relevant difference in TRE (0.46 mm, p = 0.33). This suggests that for SSM and SSM-FEM, a full segmentation of the prostate is not necessary: registration based on partial surfaces produces similar TREs.

Computation times for the SSM, GMM-FEM and SSM-FEM methods are provided in Table 4.5. Timings are reported on a regular desktop PC with a 3.4 GHz Intel Core i7 CPU and 8 GB of RAM. Due to the increased computational complexity of both the FEM and SSM components, it is not surprising that the SSM-FEM algorithm is the slowest. However, the mean total computation time is still under one minute when applied to both full and partial surfaces, making it practical for clinical use.

Table 4.5: Total registration time

                  Time (s)
  Method     Full             Partial
  SSM        5.72 ± 0.53      5.51 ± 0.42
  GMM-FEM    9.54 ± 3.26      8.69 ± 2.21
  SSM-FEM    50.61 ± 6.82     29.34 ± 2.55

Figure 4.8: The effect of Gaussian noise on TRE (mm) values with registration parameters E = 5.0 kPa, µ = 400, β = 10.0, and ν = 0.49. (a) Box-plot distribution of TREs (mm) for different magnitudes of Gaussian noise. (b) Box-plot distribution of TREs (mm) for different Gaussian kernel widths; a value of zero refers to a segmentation without any added noise.
4.3.3 Sensitivity to Errors in Segmentation

Segmentations of MR and TRUS images will inevitably contain intra-observer and inter-observer variability. Here, we explore the sensitivity of SSM-FEM to such errors. We synthetically perturbed both the TRUS and MR full segmentations using a Gaussian kernel to simulate additive Gaussian noise. This kernel has two parameters: 1) the width (σ), which controls the locality of the segmentation errors; and 2) the magnitude (α). Our protocol for creating noisy surfaces is as follows:

1. Select a random seed face.
2. Calculate the normal (\(\vec{n}\)) and center (\(c\)) of the seed face.
3. For every point (\(x\)) on the segmented prostate, add the following Gaussian noise: \(\alpha \vec{n} \exp\!\left(-\lVert x - c \rVert^2 / 2\sigma^2\right)\).

For every combination of α ∈ [−10.0, 10.0] mm and σ² ∈ [5.0, 50.0] mm², we generated 1000 noisy MR and TRUS surfaces. These surfaces were used to investigate the sensitivity of the TRE in the presence of Gaussian noise. Figures 4.8a and 4.8b illustrate the sensitivity of the TRE to the magnitude and width of the Gaussian noise.

Figure 4.8a shows the sensitivity of the TRE to the magnitude of the Gaussian noise with a constant kernel width (σ² = 25.0 mm²). The general trend in this figure is that the TRE increases with the magnitude of the additive Gaussian noise. The mean TRE stays below 3.0 mm for Gaussian noise magnitudes between [−5.0, 5.0] mm. This value provides a rough confidence interval for the required segmentation accuracy.

Figure 4.8b shows the sensitivity of the TRE to the kernel width given a constant noise magnitude (α = 7.5 mm). The general trend, again, is that the TRE increases with the width of the Gaussian kernel. At small kernel widths (σ² ≤ 10.0 mm²), the TRE seems to be unaffected by the added noise, even though its magnitude is large (α = 7.5 mm). We believe the main contributor to this behavior is the soft-correspondence approach, since probabilistic solutions to point cloud registration are known to be robust to outliers [21, 98].
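The three-step perturbation protocol above can be sketched directly. The function below is illustrative (plain NumPy, with the surface given as a vertex array plus a face-index array); the negative exponent makes the perturbation a localized bump that decays away from the seed face:

```python
import numpy as np

def perturb_surface(points, faces, alpha, sigma2, rng):
    """Add a localized Gaussian bump to a triangulated surface.

    points : (N, 3) vertex coordinates; faces : (F, 3) vertex indices;
    alpha  : bump magnitude (mm); sigma2 : kernel width (mm^2).
    Steps: pick a random seed face, compute its normal and centre, then push
    every vertex along the normal with a Gaussian falloff from the centre.
    """
    seed = faces[rng.integers(len(faces))]       # 1) random seed face
    tri = points[seed]
    c = tri.mean(axis=0)                         # 2) face centre ...
    n = np.cross(tri[1] - tri[0], tri[2] - tri[0])
    n /= np.linalg.norm(n)                       #    ... and unit normal
    d2 = np.sum((points - c) ** 2, axis=1)       # 3) Gaussian falloff
    return points + alpha * np.exp(-d2 / (2.0 * sigma2))[:, None] * n
```

Sampling many (α, σ²) combinations and re-running the registration on the perturbed surfaces reproduces the style of sensitivity sweep shown in Figure 4.8.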
4.3.4 Sensitivity to Missing Surface Points

Until now, the partial surfaces were defined based on a visibility criterion: if the prostate boundary was unclear in certain regions of the images, those sections of the segmentations were removed. What remained represented approximately 80% of the original MR surfaces, and 70% of the TRUS surfaces. Here, we investigate the robustness of our method to missing data by systematically removing points (i.e., rows of X_md) from both the base and the apex (from 2.5% to 20% from each). Subsequently, we applied our algorithm to the cropped observations. To quantify performance, we compared the internal deformation computed using the partial surfaces to that computed using the full surfaces, measuring the L2 distance between the displacement fields. Intuitively, this evaluates the method's ability to recover the same deformations in the presence of missing data.

Figure 4.9: Distribution of robustness as different percentages of observations are removed, with E = 5.0 kPa, µ = 400, β = 10.0, and ν = 0.49. (a) Box-plot distribution of robustness as observations are removed. (b) Left to right: sagittal slices of the spatial distribution of robustness for SSM-FEM registration when 10%, 20% and 40% of the observations are removed. For better visualization, distances larger than 6 mm are shown in the same color.

Results of this experiment are shown in Figure 4.9 for a typical biopsy patient. The majority of the deformation stays below 2.0 mm when 10% of the observations are removed. The crosses in Figure 4.9a correspond to points that fall outside 1.5× the interquartile range. These are mostly concentrated along the boundary of the prostate, seen as green-red in Figure 4.9b. The robustness of SSM-FEM registration progressively declines as
more points from the base and apex are removed.

As seen in Figure 4.9b, the deformation field in the centre (interior) of the prostate is more robust to missing data than that near the surface. Removing surface data most strongly affects the results in its immediate vicinity. The effect, however, rapidly drops off as we move away from the surface. This is a consequence of the FE model: forces are propagated from the boundaries inward, and begin to balance each other.

Figure 4.10: Robustness of SSM-FEM registration for different regularization weights. (a) Box-plot distribution of robustness as the SSM scale factor µ is perturbed. (b) Box-plot distribution of robustness as the FEM scale factor β is perturbed. (c) Left to right: spatial distribution of robustness for SSM-FEM registration when µ is set to 200, 350 and 550, respectively. (d) Left to right: spatial distribution of robustness for SSM-FEM registration when β is set to 2.5, 7.5 and 20.0, respectively. Weights are perturbed around the parameters which provided the best surface overlap (E = 5.0 kPa, ν = 0.49, µ = 400 and β = 10.0). For better visualization contrast, distances larger than 6 mm are shown with the same color.

4.3.5 Sensitivity to Regularization Parameters

There are two tunable parameters that control the regularization: µ and β. To examine the sensitivity to their values, we perturbed µ ∈ [200, 700] and β ∈ [0.1, 1000]. Results are shown in Figure 4.10 for a typical fully segmented prostate.

The sensitivity of the deformation field to µ, which controls the SSM shape parameters, is shown in Figure 4.10a. Overall, the registration is not very sensitive to these perturbations. For small values, µ ≤ 200.0, the registration becomes unstable; the SSM is not sufficiently constrained, so it produces unrealistic shapes. For large Tikhonov weights, µ ≥ 550.0, the SSM is restricted to instances very close to the mean.
For values between the two extremes, the search space is restricted to reasonable SSM instances. As seen in Figure 4.10c, changes in µ mostly affect the boundary of the prostate; interior regions are less sensitive to these perturbations.

SSM-FEM is more sensitive to the biomechanical regularization weight, β (Figure 4.10b). When tuning this parameter, the registration exhibits three distinct behaviors. For large values, β ≥ 50.0, the FEM is essentially rigid, allowing no further deformation. For very small weights (e.g., β ≤ 2.5), the FEM does not resist any deformation: the surface is allowed to move freely, independently of the interior nodes. For values between the two extremes, the parameter allows a trade-off between surface fitting and deformation. A spatial distribution of robustness for the three different behaviors is shown in Figure 4.10d. This parameter should be tuned for the application. We found a value of β = 10 to be a good starting point for allowing reasonable deformations. If this leads to a model that is overly flexible, the value should be increased.

4.4 Discussion and Conclusions

In this chapter, we presented a novel non-rigid surface-based registration approach that is robust to missing data. The method uses a statistical shape model as an intermediary to help co-register two partial surfaces. The SSM introduces geometric prior knowledge, allowing for varying shapes across a population. To account for physical deformations, we introduce a finite-element-based regularizer. This adds flexibility to the SSM, accounting for differences in the biomechanical states of the observed partial surfaces.

The proposed registration algorithm converges within a minute on a regular desktop PC. We showed the internal deformation found through our method to be robust to within 2.0 mm when 10% of the surfaces are removed. A great advantage of the algorithm is that it estimates point correspondences, nodal displacements, shape, and pose parameters in a single minimization framework.
This is one of the contributing factors to its efficiency. We also obtain a full volumetric deformation field as a by-product of the regularization. This is in contrast to many existing surface-based approaches, where deformations need to be estimated from the surface in a post-processing step.

In Section 4.3.2 we investigated the performance of our method using intrinsic fiducial pairs. We compared it to two alternatives: group-wise SSM, where the influence of the biomechanical (FEM) prior is removed; and GMM-FEM, where the influence of the geometric (SSM) prior is removed. SSM-FEM yields a statistically significant improvement compared to SSM registration alone (by 1.51 mm for full surfaces, 1.14 mm for partial). This indicates that the FEM component plays an important role, accounting for differences in the deformation states of the prostate in the two observations. When comparing to GMM-FEM, the difference in TRE was only found to be statistically significant when partial surfaces are used (2.0 mm, p < 10^-4). In this case, we saw that GMM-FEM sometimes resulted in mis-assignments of the base and apex. The SSM adds sufficient information to prevent this.

We did not compare SSM-FEM to other methods that combine a SSM and FEM, such as that by Hu et al. [60]. While both use an EM algorithm to maximize a similar functional, the roles of the SSM and FEM are different. The approach by Hu et al. requires full segmentations of the MR, including not only the prostate, but also the surrounding anatomy. Thus, it is assumed that all tissue boundaries are clearly visible. Based on the segmentations, a personalized FEM is created and used to train a subject-specific SSM, which is then used to register directly to a new image. The training process must be repeated for every new subject, since it is a personalized model. In SSM-FEM, the SSM represents variations in prostate shapes across subjects.
The FEM then adds flexibility to account for additional deviations and deformations.

In Section 4.3.3 we investigated the sensitivity of SSM-FEM to errors in the segmentation of the MR and TRUS surfaces. For ambiguities within ±5.0 mm, we found the TRE to be insensitive to segmentation variability. It has been previously reported that expert inter-subject variation of prostate contours falls in the range of [0.7, 2.5] mm [138]. Since the TRE of our registration method appears insensitive to variations in contouring within this range, we infer that errors caused by inter-subject variations will be negligible.

The most time-consuming portion of the registration method is the finite-element mesh creation, which is repeated every time a new SSM shape instance is generated. Efficiency can be improved by re-meshing only when the SSM instance differs significantly from the previous iteration. Another shortcoming of our algorithm is that it requires both the MR and TRUS to be segmented prior to the registration. While the MR can typically be segmented ahead of time, the 3D-TRUS must be segmented within minutes due to clinical requirements. Since our algorithm is designed to handle partial surfaces and is robust to missing data, we suggest segmenting only those regions in which the boundary of the anatomy can be clearly distinguished (such as the mid-gland). This can help expedite the segmentation process. For the regions that have clear boundaries, a semi-automatic segmentation method can also be useful [114].

While the improvement in the TRE seems small (on the order of a millimeter), it should be compared to the acceptable error bounds of a targeted biopsy system. A clinically significant tumor has a radius of at least 5 mm [71]. A TRE of 2.5 mm would lead to a confidence interval in which 95% of targets fall within that radius.
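As a rough sanity check of this relationship, the sketch below runs a short Monte Carlo simulation. It assumes, as a simplification of ours (not a model stated in this thesis), that the targeting error is an isotropic, zero-mean 3D Gaussian whose root-mean-square magnitude equals the TRE; the function name is hypothetical.

```python
import math
import random

def fraction_within(rms_tre_mm, radius_mm, n=100_000, seed=0):
    """Monte Carlo: draw isotropic 3D Gaussian targeting errors whose RMS
    magnitude equals the given TRE, and count how often the sampled core
    lands within radius_mm of the intended target."""
    rng = random.Random(seed)
    sigma = rms_tre_mm / math.sqrt(3.0)  # per-axis std so that E[|e|^2] = rms^2
    hits = 0
    for _ in range(n):
        e2 = sum(rng.gauss(0.0, sigma) ** 2 for _ in range(3))
        if math.sqrt(e2) <= radius_mm:
            hits += 1
    return hits / n

# TRE of 2.5 mm vs. the 5 mm radius of the smallest clinically significant tumor
frac = fraction_within(2.5, 5.0)
```

Under this simplified error model, the simulated fraction comes out comfortably above 95%, consistent with the bound quoted above.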
Our values of 2.35 ± 0.81 mm and 2.81 ± 0.66 mm for full and partial surfaces, respectively, bring us close to these bounds.

In this chapter, the method was limited to surfaces segmented from two modalities. However, it can be trivially extended to concurrently register more than two surfaces. Such a situation may arise when the patient is scheduled for a re-biopsy, and it is desirable to simultaneously register the MR, the previous TRUS, and the current TRUS. This would enable the radiologist to avoid areas that were found to be cancer-free in the previous biopsy, and to re-sample suspicious areas. The extension is accomplished by simply modifying the objective function in Equation (4.2) to sum over the additional observations.

The implicit assumption when using the SSM as a “starting shape” for computing biomechanical deformations is that it represents the prostate “at rest”. This assumption is not true in general, since there will always be some biomechanical forces present in the training population (e.g. pressure from the bladder, gravity, probes/endorectal coils). Ideally, the training images would be acquired under conditions where these forces are minimized. The training population in our study is a compromise between size, segmentation accuracy, and initial biomechanical forces. To account for errors caused by existing deformations in the training data, an initial strain field can be estimated. This can be accomplished by measuring or estimating any external forces (e.g. from a force sensor on a transrectal probe), and back-solving for the initial strain [108].

We currently assume a homogeneous linear material model for the FEM. The homogeneity assumption is similar to the uniform smoothness of the interpolation field in other surface-to-surface registration techniques. An advantage of the FEM framework is that including spatially varying properties, such as a localized stiff region caused by a tumor, is possible as long as the appropriate data is available.
Such data may be defined manually, or obtained via elastography and registered to the model. The only step of the registration algorithm that needs to be modified is the computation of the stiffness matrix, K. It is also possible to incorporate non-linear material models, such as the Mooney-Rivlin material [122]. This requires modifying the strain energy term used in the regularization, linearizing about the current deformation state. This introduces a synthetic force to be added to the objective function.

While the application in this chapter is MR-TRUS fusion for prostate biopsies, the proposed method is not limited to this context. It is applicable as long as the following conditions are met: 1) a statistical shape model of inter-subject variability is available; and 2) each observation surface represents a snapshot of the deformed anatomy. A free-hand ultrasound probe may be used if it is tracked to allow for 3D image reconstruction prior to segmentation. However, the second condition may fail if motion of the probe induces significantly different deformations between slices. In that case, the reconstructed image would not represent a single consistent deformation state.

Our probabilistic algorithm is based on very simple concepts: soft correspondences via Gaussian mixture models, geometric prior knowledge provided by a statistical shape model, and biomechanical regularization based on a linearized finite element model. The soft correspondences make the method robust to noise and missing data points; the SSM provides geometric information when boundaries of anatomical regions are not clear in images; and the FEM adds flexibility, allowing the shape to deform to account for soft-tissue motion between acquisitions. The method is general and robust, able to co-register any set of full or partial segmented surfaces.

Chapter 5

2D-3D Registration for Freehand Prostate Biopsies

5.1 Introduction

Prostate biopsy is the gold standard for prostate cancer diagnosis.
This is typically performed freehand using 2D TRUS guidance. Unfortunately, the current systematic biopsy approach is prone to false negatives [84], and patients are frequently asked to repeat the procedure. To improve the cancer yield, 3D biopsy systems have been developed [6, 135, 159]. In these systems, biopsy locations are planned and recorded with respect to a 3D-TRUS reference volume acquired just prior to the procedure. However, the prostate moves and deforms during the biopsy process, in a way that cannot be compensated for using passive tracking alone [6, 135, 159]. Without motion and deformation compensation through slice-to-volume registration, the live ultrasound image increasingly deviates from the reference volume, making it infeasible for the radiologist to accurately sample the planned targets. As a result, slice-to-volume registration is required to maintain alignment of the 2D imaging plane with respect to the pre-procedure 3D-TRUS. The goal of this chapter is to provide the registration tools to perform such freehand 3D-guided prostate biopsies.

5.1.1 Related Work

2D-3D registration in the literature can refer to the alignment of 3D images to a single tomographic or projective slice. In the first case, each pixel of the 2D slice image is assumed to have a corresponding voxel in the 3D image. In this case, 2D-3D registration is an extreme case of 3D-3D registration, where one of the images spans a single slice. Examples of tomographic 2D-3D registration include the alignment of interventional MRI slices with pre-procedure 3D MRI [40], or of 2D-US with CT [23, 63, 155].

In the second case, for projective slices, the one-to-one correspondence between 3D and 2D data is no longer valid and therefore a different approach is required. Methods for projective 2D-3D registration incorporate either a projection [10, 105] or a reconstruction [43] operator.
Using a projection operator, the 3D data is transformed into the 2D domain, and the projections are compared with the 2D data until the best match is obtained. With a reconstruction operator, the 2D projections are back-projected into the 3D domain until the reconstructed data is aligned with the 3D image.

Both tomographic and projective 2D-3D methods formulate the registration problem as the minimization of a metric with respect to a set of transformation parameters. This transformation dictates the spatial mapping between the 2D slice and the 3D volume. Common metrics for 2D-3D registration problems include the sum of squared differences (SSD) [18], normalized cross correlation (NCC) [135] and mutual information (MI) [167].

2D-3D registration for prostate biopsy falls into the tomographic category. As a result, we assume that the 2D slice is a 3D image with a single frame and treat 2D-3D registration similarly to 3D-3D registration (see Section 5.2.2). However, robust 2D-3D TRUS registration in a freehand environment is still a challenging task. First, an accurate volume representing the prostate “at rest” must be generated. For freehand biopsies, this volume is typically acquired using a tracked axial sweep from base to apex [74, 159]. Unfortunately, due to patient discomfort and inconsistent probe pressure, the resulting volumes suffer from deformation artifacts. To reduce these, systems have been developed that use mechanical stabilization [135] or 3D probes [6], ensuring consistent probe pressure. The second challenge relates to the nature of the 2D-3D registration problem. The limited 2D spatial information, the low signal-to-noise ratio of TRUS, and varying probe-induced pressures all make motion and deformation compensation difficult. When the probe is mechanically stabilized, De Silva et al. [134, 135] show that rigid 2D-3D registration is sufficient for 3D guidance.
They approximate the prostate as rigid, and its motion is learned based on probe positions and orientations [134]. Systems that use 3D probes [6] avoid the 2D-3D registration issue, since the additional out-of-plane information can be used to increase accuracy and robustness [29, 74]. However, neither mechanical systems nor 3D probes are widely used in clinics. A solution for freehand 3D guidance using 2D-TRUS probes, as part of the current standard of care, is highly desirable.

In this chapter, we provide a solution for freehand TRUS-guided prostate biopsies using a combined rigid and non-rigid 2D-3D registration. The closest works to ours, involving slice-to-volume registration on freehand TRUS data, are by Xu et al. [159] and Khallaghi et al. [74]. The shortcoming of Xu et al. [159] is that their method does not compensate for non-rigid prostate deformations, and target registration error is only evaluated in phantom studies. Khallaghi et al. [74] compensate for deformations using a B-spline approach. However, their validation is on axial TRUS slices acquired with a motion similar to the pre-procedure sweep, leading to a simpler problem. Their method has not been applied to TRUS slices obtained from arbitrary orientations.

In the absence of strong prior knowledge of prostate motion, even rigid 2D-3D registration often does not converge. To improve robustness, Baumann et al. [6] constrain the registration using a manually delineated bounding ellipsoid approximating the prostate shape. This provides a boundary condition for rectal probe kinematics. We follow a similar approach. However, in our case the rigid registration is constrained using the trajectory of the tracked probe’s tip during the freehand sweep. This ensures that the rectal wall in both the slice and the volume are approximately aligned. We then compensate for residual motion and deformation using a non-rigid 2D-3D registration based on a finite element model.
This incorporates physical prior knowledge, since deformations during a prostate biopsy are mostly probe-induced, and therefore biomechanical in nature. FEM-based methods have previously been applied to 3D-3D registration [60, 89, 154], but these use the FEM as part of a forward transformation model: from the source to the target image. When the target is a 2D slice, this requires scattered-data interpolation from the warped volume onto the slice plane, which can be computationally expensive. To alleviate this issue, we propose a backward transformation model that maps the regular 2D grid on the slice directly into the 3D volume. To the best of our knowledge, this is the first report of a FEM-based 2D-3D registration for freehand prostate biopsies.

5.2 Methods

5.2.1 Data Acquisition

The TRUS images in this study were acquired using a magnetically tracked EC9-5/10 probe with custom data collection software running on a SonixTouch machine (Ultrasonix Inc., Canada). Prior to the procedure, the probe was calibrated at a depth of 6.0 cm with an N-wire phantom using fCal [82] (calibration accuracy of 0.45 ± 0.2 mm). At the start of the procedure, we asked an interventional radiologist to perform a freehand axial sweep of the prostate gland, from base to apex. We refer to the path of the tracked probe’s tip along the sweep as the trajectory.

Figure 5.1: Clinical workflow. Pre-procedure: the radiologist takes a freehand sweep of the prostate gland from base to apex (frame stack shown in gray). This sweep is used to construct a probe tip trajectory and a 3D volume (sagittal view). This volume is segmented to create a FEM consisting of two regions: the prostate, and the surrounding soft tissue. Intra-procedure: the 2D-TRUS is registered using a rigid transform constrained by the trajectory. Subsequently, a FEM-based non-rigid registration is used to compensate for residual deformations.

This trajectory is shown graphically with red/green/blue axes in Figure 5.1.
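A trajectory of this kind can be represented as an arc-length-parameterized polyline through the recorded probe-tip positions. The sketch below is a minimal illustration under that assumption; the function name and data layout are ours, not the system's implementation.

```python
import math

def trajectory_point(tips, s):
    """Map s in [0, 1] to a 3D point along the recorded probe-tip polyline,
    parameterized by arc length (s = 0 at the start of the sweep, s = 1 at
    the end)."""
    segments = list(zip(tips, tips[1:]))
    lengths = [math.dist(a, b) for a, b in segments]
    target = max(0.0, min(1.0, s)) * sum(lengths)
    run = 0.0
    for (a, b), length in zip(segments, lengths):
        if target <= run + length and length > 0.0:
            t = (target - run) / length  # fraction along this segment
            return tuple(ai + t * (bi - ai) for ai, bi in zip(a, b))
        run += length
    return tuple(tips[-1])
```

For example, on the polyline (0,0,0) → (1,0,0) → (1,1,0), `s = 0.5` lands at the corner and `s = 0.75` halfway along the second segment.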
Each sweep was obtained at 20 frames/second (≈ 500 frames total), and used to reconstruct a 3D volume of the prostate and surrounding tissue. To account for small fluctuations in probe pressure during the pre-procedure sweep, a moving average with a window length of 65 was applied to the tracking transforms. Since the prostate is known to be stiffer than the surrounding tissue [72], we manually segmented the prostate in the 3D-TRUS and created a simplified FEM consisting of two regions: the prostate, and the surrounding tissue (see Section 5.2.2). If desired, the segmentation can be automated [160], or skipped entirely and a single homogeneous FEM used [89]. Note that commercial 3D-TRUS systems, such as UroNav (Invivo Co., USA) and Artemis (Eigen Inc., USA), already incorporate intra-operative segmentation of the prostate on TRUS images. If elastography becomes part of the prostate biopsy protocol, spatially varying material properties can also easily be incorporated into the FEM.

Throughout the procedure, tracked TRUS images were obtained continuously at 20 frames/second. At each biopsy location, we asked the radiologist to press a foot pedal to tag the TRUS image associated with the core. The 2D-TRUS at each biopsy core was then used for off-line validation of our framework. The two major components are the rigid registration and the FEM-based deformable registration, which are discussed in Section 5.2.2. We refer to the pre-procedure 3D-TRUS and the intra-procedure 2D-TRUS as the source and target images, respectively. The full workflow is presented in Figure 5.1.

5.2.2 2D-3D Registration

We treat the 2D images in this chapter as 3D images with a single slice.
This facilitates the linear algebra in our implementation, since the calibration and magnetic tracking transforms are recorded as homogeneous transforms of the form

[ R3×3  t3×1 ]
[ 01×3    1  ],

where R3×3 is the rotation and scale component of the transformation and t3×1 represents the translation.

The 2D-3D registration is framed as the minimization of an objective functional between the planar 2D target image, F(x) : R3 → R, and the 3D source, M(x) : R3 → R, where x ∈ Ω refers to grid points on the target slice. We denote the number of points in Ω by N and concatenate them into a single 3N × 1 vector, ~x. Since the problem is mono-modal, we used the SSD as the intensity metric. Other intensity-based metrics, such as NCC, mutual information, and MIND, could also be used. However, it has been previously shown that the Demons registration method works well for ultrasound registration of elastic organs [73]. Since Demons is derived from SSD [109], we chose this metric for the image-based component of our framework. For rigid registration, the transform is constrained using the probe trajectory. For FEM-based registration, the objective functional is regularized using the total strain energy of the FEM.

Trajectory-based Rigid Registration: We formulate the rigid registration as the minimization of the objective functional:

Qr(s, θ1, θ2, θ3) = (1/2N) ‖F(~x) − M(Tr(s, θ1, θ2, θ3, ~x))‖², (5.1)

where Tr(s, θ1, θ2, θ3, x) is a rigid transform. The translation component is restricted to the probe tip trajectory, parameterized by s ∈ [0, 1]. This trajectory approximates the rectal wall in the 3D volume, on which the probe tip in the live 2D imaging plane should also fall. We did not constrain this parameter in our implementation; however, a quadratic penalty such as s² can be applied if rigid registration becomes trapped in local minima in future datasets. Rotation around the probe tip is controlled by the three Euler angles (θ1, θ2, θ3). Euler angles are susceptible to the gimbal lock issue; however, we only expect the rotational correction to be small, since the position and orientation of the probe are already tracked.

To initialize the location parameter s, we project the tracked probe tip onto the trajectory. The rotation is initialized using the orientation from magnetic tracking. We then used the “pattern search” optimizer in Matlab (MathWorks, USA) to find the optimal rigid parameters (s, θ1, θ2, θ3).

FEM-based Registration: Central to the deformable registration is a FEM, constructed from the 3D image volume using an 8 × 8 × 8 grid of hexahedral elements. For simplicity, we use a linear material, which depends on a Young’s modulus, E, and a Poisson’s ratio, ν. Since the prostate is stiffer than the surrounding tissue, we increase the Young’s modulus within this region. Similar to [89], we do not assume any boundary conditions; the FEM is free to move based on image-driven forces. We compute the stiffness matrix of the FEM, K3J×3J, where J is the number of FEM nodes in the hexahedral grid. We formulate the objective functional as a regularized SSD metric:

QFEM(~u) = (1/2N) ‖F(~x) − M(~x − Φ~u)‖² + (α/2) ‖~u − ~u(p)‖² + (β/2) ~uᵀK~u, (5.2)

where ~u3J×1 = (u11, . . . , u13, . . . , uJ3)ᵀ is the vector of FEM node displacements. A damping term, scaled by a coefficient α, is added for stability; it limits deviation from the displacement values of the previous iteration, ~u(p). Deformation is controlled by the strain energy, scaled by the regularization weight β. The interpolation matrix, Φ3N×3J, is used to represent the spatial coordinates ~x in terms of the FEM node locations. It is constructed by detecting which deformed element contains each point, x, and computing interpolation coefficients based on the element’s shape functions (similar to computing barycentric coordinates).
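The interpolation just described can be illustrated with the standard trilinear shape functions of an 8-node hexahedral element. The sketch below is a self-contained stand-in (function names are ours, not the thesis implementation): each interior point blends the eight nodal quantities, exactly as one row of Φ does.

```python
def hex_shape_functions(xi, eta, mu):
    """Trilinear shape functions of an 8-node hexahedral element in
    isoparametric coordinates (xi, eta, mu), each in [-1, 1]."""
    corners = [(-1, -1, -1), (1, -1, -1), (1, 1, -1), (-1, 1, -1),
               (-1, -1, 1), (1, -1, 1), (1, 1, 1), (-1, 1, 1)]
    return [(1 + cx * xi) * (1 + cy * eta) * (1 + cz * mu) / 8.0
            for cx, cy, cz in corners]

def interpolate(nodal_values, xi, eta, mu):
    """One row of Phi applied to nodal values: blend the 8 nodal quantities
    at an interior point of the element."""
    weights = hex_shape_functions(xi, eta, mu)
    return sum(w * v for w, v in zip(weights, nodal_values))
```

The shape functions form a partition of unity (they sum to 1 everywhere in the element), each equals 1 at its own corner, and at the element center every node contributes equally.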
Differentiating Equation (5.2) with respect to the lth coordinate of node k, ukl, yields

∂Q/∂ukl = (1/N) Σ_{n=1}^{N} [Fn(~x) − Mn(~y)] Σ_{m=1}^{3} ∂Mn(~y)/∂ynm (Φnk + Σ_{j=1}^{J} ∂Φnj/∂ukl ujm) + α(ukl − u(p)kl) + β Σ_{j=1}^{J} Σ_{m=1}^{3} K(3k+l)(3j+m) ujm, (5.3)

where ~y = ~x − Φ~u are the mapped coordinates in the moving image volume, n loops over all pixels in the fixed image plane, m loops over the three dimensions, and j loops over all FEM nodes. The derivatives of the shape functions with respect to the nodal displacements, ∂Φnj/∂ukl, can be derived from the FEM shape functions. We use standard linear shape functions, which are typically defined in terms of isoparametric (or ‘natural’) coordinates (ξ, η, µ). In this case, one can derive the following expression:

∂Φnj/∂(uk0, uk1, uk2) = −Φnj (J⁻¹) ∂Φ/∂(ξ, η, µ), (5.4)

where J is the Jacobian matrix of the element containing point x. Details of the derivation are given in Appendix C. By differentiating Equation (5.2) with respect to all coordinates (k, l), we arrive at the sparse linear system:

(Γ + αI + βK) ~u = Υ + α~u(p), (5.5)

where Υ and Γ are defined by collecting the appropriate terms from Equation (5.3). For the implementation, it is useful to note that if a point ~xn falls in an element, then Φnk = ∂Φnk/∂uil = 0 for all nodes i not belonging to that element. Still, the computation of Υ and Γ is the most time-consuming portion of our registration. For each point x, it requires finding the element containing that point, and determining its interpolation functions and their derivatives within the element. We use a bounding volume hierarchy to accelerate the element detection. To further improve efficiency, Υ and Γ can be computed in parallel per pixel.

For linear materials, it can be shown that the stiffness matrix scales linearly with the Young’s modulus.
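This scaling is easy to see on a toy element. The sketch below uses a single 1D two-node bar element (our simplification for illustration, not the thesis’s hexahedral FEM) to show why β and the prostate elasticity can be collapsed into one free parameter.

```python
import numpy as np

def bar_stiffness(E, A=1.0, L=1.0):
    """Stiffness of a single 1D two-node bar element:
    K = (E*A/L) * [[1, -1], [-1, 1]]."""
    return (E * A / L) * np.array([[1.0, -1.0], [-1.0, 1.0]])

# Doubling the Young's modulus doubles every entry of K ...
K5, K10 = bar_stiffness(5.0), bar_stiffness(10.0)

# ... so the regularization term beta * K(E_p) depends only on the single
# product beta * E_p:
lhs = 0.25 * bar_stiffness(5.0)          # beta = 0.25, E_p = 5 kPa
rhs = (0.25 * 5.0) * bar_stiffness(1.0)  # same product, unit modulus
```

The same argument applies to any assembled linear-elastic stiffness matrix, since every element contribution is proportional to its modulus.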
As a result, the Young’s modulus can be factored out and combined with β to create a single free parameter. This means that the proposed registration method has four free parameters: the damping coefficient (α), the relative soft-tissue-to-prostate elasticity (Et/Ep), the scaled prostate elasticity (βEp) and the Poisson’s ratio (ν). Throughout the experiments, we used the following values: α = 1.0, βEp = 0.25 × 5.0 kPa [72], Et/Ep = 0.2 [72], ν = 0.49 [89]. The damping parameter, α, prevents large gradients from inducing too strong a force. It should be set large enough to maintain stability, but not so large that it reduces the convergence rate. The scale parameter, β, controls the relative influence of the image-driven forces and the restoring energy of the FEM. We tuned this parameter to allow realistic deformations. The same parameter values were used for all subjects.

5.3 Experiments and Results

The proposed 2D-3D registration was evaluated on 10 patients. The systematic sextant biopsy protocol in our hospital requires the acquisition of 8–12 distributed cores, with extra cores in suspicious regions. For the patients in this study, this yielded a total of 10 pre-procedure TRUS volumes and 115 2D TRUS slices at biopsy cores. To quantify the registration error, we computed the Euclidean distance between intrinsic fiducials, consisting of micro-calcifications and cysts (Figure 5.2). A total of 65 fiducials were identified by the author.

Figure 5.2: Example of a calcification (a) and a cyst (c) on the target slice. The corresponding fiducial on the source volume is shown in (b) and (d), respectively.

Figure 5.3 shows registration results for three patients. In the top row, rigid registration performs well; however, there is a slight improvement near the prostate boundary following FEM-based registration. The middle row shows an example where FEM-based registration corrects the boundary and brings additional structures into the registration plane.
In the bottom row, FEM-based registration is required to correctly identify the calcification.

As seen in Table 5.1, rigid and FEM-based registration reduce the mean TRE from the initial 6.31 mm down to 4.63 mm and 3.15 mm, respectively. This suggests that substantial biomechanical deformations exist and can be compensated for using our FEM-based approach. To investigate the statistical significance of the TRE reduction, we first checked whether the TREs were normally distributed. Using the one-sample Kolmogorov-Smirnov test, we found that the distribution of the TRE is not normal at the 5% significance level for the initial, rigid and FEM-based methods (p < 10−4). Therefore, we performed a Wilcoxon signed-rank test. The p-values from this experiment are also shown in Table 5.1. The test rejected the hypothesis that the TREs for initial vs. rigid and rigid vs. FEM-based methods belong to distributions with equal medians. Therefore, the improvements in mean TRE following rigid and FEM-based registration are statistically significant.

Figure 5.3: Typical registration results for three patients (top, middle and bottom rows). The first column shows the live 2D-TRUS (target) images. The next three columns show the initial alignment using the trajectory, the rigid registration, and the FEM-based registration results, respectively. White arrows indicate locations where the registration framework shows improvements.

Table 5.1: Registration results and p-values.

  TRE, mean ± s.d. (mm)                       p-value
  Initial       Rigid         FEM-based      Initial vs. Rigid    Rigid vs. FEM-based
  6.31 ± 1.86   4.63 ± 1.05   3.15 ± 0.82    p < 10−4             p < 10−5

5.4 Discussion and Conclusions

We presented a novel registration framework for motion and deformation compensation during freehand TRUS-guided prostate biopsy. The improvement in the TRE using the FEM-based registration should be compared to the acceptable error bounds of a 3D prostate biopsy system.
A TRE of 2.5 mm yields a confidence interval in which 95% of registered targets come within the smallest clinically significant tumor [71]. The result of our FEM-based registration (3.15 mm) brings us closer to this acceptable error bound. In our study, no instructions were given to the radiologist to control the probe pressure. If the biopsy protocol is slightly modified to maintain a low probe pressure, it should be possible to decrease the error closer to a clinically acceptable range [71].

The implicit assumption when using the pre-procedure volume is that it represents the prostate “at rest”. This assumption is not true in general, since there will always be some biomechanical forces present during the sweep (e.g. pressure from the probe, the bladder, and gravity). Ideally, the sweep would be acquired under conditions where these forces are minimized. To account for errors caused by existing deformations in the sweep, a possible solution is to use a super-resolution approach similar to [45] that is capable of compensating for deformations between slices in the freehand sweep.

Our next steps aim to increase the accuracy of the proposed framework. Given that the landmarks used to calculate the TRE in this study were selected by the author, a more extensive validation using landmarks selected by expert clinicians is warranted. Furthermore, since the quality of the pre-procedure volume directly affects the registration results [74], we wish to tackle the challenges associated with deformation artifacts due to breathing and inconsistent probe pressure, including estimating a “rest shape” of the prostate using shape statistics, and validating the reconstructed volume against magnetic resonance images.
Another direction is to perform multi-slice registration, which has been shown to improve the TRE [29, 74], and to compare our non-rigid registration approach to other methods [74].

The current run-times of the rigid and FEM-based components of our framework are on the order of ≈ 30 seconds and ≈ 10 minutes, respectively. Decreasing the registration time can be accomplished by calculating the terms in Equation (5.3) in parallel per pixel on a graphics processing unit, by adopting a multi-resolution approach, by using a boundary element method [64], through a partial differential equation formulation [102], and by performing the registration continuously during the procedure [29].

There are four plausible scenarios for the execution time once we have a GPU implementation. If the registration can be done in less than one second, then our method can be readily used for live guidance. If the execution time is between 1–10 seconds, similar to [6], then continuous FEM-based registration is not feasible. However, we might still be able to use a Kalman filter to estimate the deformation state of the prostate [89], such that we only require FEM-based registration every 10 seconds. If the FEM-based run-time is between 10–30 seconds, then we would most likely have to rely on rigid registration for guidance throughout the procedure and use FEM-based registration only at biopsy targets. If the execution time is above 30 seconds, our FEM-based registration is not suitable for prostate biopsies.

Given that Baumann et al. [6] achieve a non-rigid registration time of 7 seconds for 3D-3D registration, and that their volumes are larger than our single slice, we believe an execution time of 1–10 seconds is feasible with a more efficient implementation of our method. This would allow us to integrate the registration framework into the biopsy procedure, so that it can be validated on a larger cohort of patients.

Chapter 6

Conclusions and Future Work

The work in this thesis is intended to decrease the high number of false negatives in freehand TRUS-guided prostate biopsies. There are two major challenges that hinder accurate targeting of prostate tumors: 1) lack of visibility in TRUS; and 2) intra-procedure prostate motion and deformations. In order to overcome these challenges, we have developed the methods presented throughout this thesis. Since PCa is more visible in MRI, Chapters 2, 3 and 4 address the issue of tumor visibility in TRUS by providing the registration tools for MR-TRUS fusion. The salient feature of these surface-based methods is their ability to ignore regions where the prostate boundary cannot be reliably traced. This is important, since the accuracy of surface-based MR-TRUS fusion methods is known to decrease in the presence of segmentation errors [95]. In Chapter 5, we tackled the challenges associated with tracking the prostate during the biopsy session. To this end, we extended our FEM-based surface registration framework to intensity-based registration. This provided us with a framework to compensate for prostate motion and deformations due to variable probe pressure in a freehand biopsy session.

6.1 Contributions

The contributions of this thesis are summarized as follows:

• A novel technique, GMM-FEM, was developed for the registration of two surfaces. The method requires the source surface to be complete; however, the target surface need not be fully segmented. The registration is cast as a probability density estimation problem in the presence of a biomechanical regularizer. The points on the source surface are assumed to be the centroids of an isotropic GMM, and the points on the target surface are the observations.
The registration is driven by surface-based forces, which maximize the likelihood of the GMM generating the observations. Additionally, the points on the source surface are forced to move in a physically realistic manner by adding a total strain energy term. We validated this method against three other state-of-the-art surface-based methods on MR/TRUS surfaces from prostate biopsy patients.

• Further validation of the GMM-FEM method on MR/TRUS surfaces from brachytherapy patients, and its open-source implementation. While the improvement in the TRE on this dataset was small compared to affine registration (0.3 mm), the TRE of our method does not degrade in the presence of missing data. Note that the experiments in this chapter were performed independently at BWH, and the UBC group did not have direct access to the brachytherapy surfaces used in this study.

• A novel technique for the registration of two partial surfaces, i.e. SSM-FEM registration. The technique utilizes a geometric prior (the SSM) as a reference for the complete shape, thereby casting the partial-surface-to-partial-surface registration problem as two full-surface-to-partial-surface problems. Given that the two partial surfaces and the SSM are in different deformation states, the method uses a FEM to compensate for the deformations present in both partial surfaces.

• A novel framework for 2D-3D intensity-based registration, developed to compensate for prostate motion and deformations in a freehand biopsy procedure. The framework has two components: 1) trajectory-based rigid registration; and 2) FEM-based non-rigid registration. The first component compensates for the bulk motion of the prostate during a biopsy session using the trajectory of the pre-procedure sweep. This constraint is necessary to avoid local minima, given the limited spatial information in a 2D slice and the low signal-to-noise ratio of TRUS.
The second component accounts for off-trajectory motion and probe-induced biomechanical deformations.

6.2 Future Work

Novel methods have been presented in this thesis for MR-TRUS fusion and intra-procedure motion and deformation compensation. A number of interesting areas of research are suggested as follows:

• In Chapter 2, I validated the GMM-FEM registration approach against three other surface-based registration techniques. These techniques were chosen to highlight specific components of the proposed method. However, there is a multitude of other methods in the literature for MR-TRUS fusion that were not considered in our validation [60, 102, 140, 144, 154]. As a result, a more extensive validation of the GMM-FEM method against other MR-TRUS fusion methods is warranted.

• GMM-FEM and SSM-FEM registration have only been validated under the assumption of homogeneous elasticity, with a single Young's modulus used for different patients. However, it is known that the prostate has inhomogeneous elasticity and that the Young's modulus varies across subjects. Theoretically, spatially varying elasticity can be incorporated into the registration methods by modifying the computation of the stiffness matrix. Should elastography be available, a natural extension of our surface-based methods is to incorporate inhomogeneous elasticity as part of the MR-TRUS fusion workflow. It would be especially interesting to compare the GMM-FEM and SSM-FEM methods with FEM-based methods in the literature that are validated using knowledge of inhomogeneous material properties [102, 144].

• The EM algorithm used in the GMM-FEM and SSM-FEM registration methods is known to be susceptible to local minima. In our MR-TRUS fusion application, we did not observe this issue for the selected parameters over a wide range of datasets.
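To fix ideas, the following toy sketch illustrates the core formulation: translation-only point-set alignment cast as maximum-likelihood estimation of an isotropic GMM via EM, plus a simple multi-start loop. The function names are ours, the FEM strain-energy regularizer of GMM-FEM/SSM-FEM is omitted, and this is purely illustrative, not the thesis implementation:

```python
import numpy as np

def gmm_em_register(source, target, n_iters=50):
    """Toy EM alignment of two point sets: source points are centroids of
    an isotropic GMM, target points are the observations. Translation-only,
    with no biomechanical regularizer."""
    dim = source.shape[1]
    t = np.zeros(dim)
    sigma2 = None
    for _ in range(n_iters):
        moved = source + t
        d2 = ((target[:, None, :] - moved[None, :, :]) ** 2).sum(-1)
        if sigma2 is None:            # CPD-style initialization of the variance
            sigma2 = d2.mean() / dim
        # E-step: posterior probability that target point n came from centroid m
        p = np.exp(-d2 / (2.0 * sigma2))
        p /= p.sum(axis=1, keepdims=True) + 1e-12
        # M-step: translation maximizing the expected log-likelihood
        npts = p.sum()
        t = (p[..., None] * (target[:, None, :] - source[None, :, :])).sum((0, 1)) / npts
        d2 = ((target[:, None, :] - (source + t)[None, :, :]) ** 2).sum(-1)
        sigma2 = max((p * d2).sum() / (npts * dim), 1e-8)
    return t

def multistart_register(source, target, n_starts=5, seed=0):
    """Multi-start heuristic: rerun EM from random initial offsets and keep
    the solution with the smallest mean nearest-neighbour cost."""
    rng = np.random.default_rng(seed)
    best_t, best_cost = None, np.inf
    for k in range(n_starts):
        t0 = np.zeros(source.shape[1]) if k == 0 else rng.normal(size=source.shape[1])
        t = t0 + gmm_em_register(source + t0, target)
        d2 = (((source + t)[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        cost = d2.min(axis=1).mean()
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t
```

The `multistart_register` helper simply restarts EM from perturbed initializations and keeps the best-scoring result, which is the simplest guard against EM converging to a poor local optimum.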
However, if the problem of local minima arises in other datasets, the method needs to be extended with other optimization heuristics, such as a multi-resolution approach or multiple starting points [67].

• Note that the SSM construction method [119] used in Chapter 4 implicitly assumes that the mean shape is a GMM and that the training examples are spatially transformed observations of this GMM. However, during SSM construction, the GMM is decoupled when the mean shape is updated, i.e. the mean shape is simply the average of the back-projected training examples under a spatial smoothness constraint. Since the focus of this thesis is not the construction of SSMs, we did not tackle this issue. However, it would be interesting to see how the results vary if the GMM and the mean shape are updated simultaneously; this is a promising direction for future research.

• The 2D-3D registration framework for prostate motion and deformation compensation requires the generation of a pre-procedure 3D-TRUS volume. Due to inconsistent probe pressure, this 3D-TRUS volume contains deformation artifacts. Since our 2D-3D registration is intensity-based, these artifacts push the optimizer away from the true solution, which has an adverse effect on the final TRE. Estimating a “rest” shape of the prostate from the deformed 3D-TRUS may improve the TRE in 2D-3D registration. A possible solution to this problem is to use a super-resolution approach similar to [45] that is capable of compensating for deformations between slices in the freehand sweep.

• A limitation of the work presented in this thesis is the lack of independent TRE measurements. This issue is partially addressed for GMM-FEM registration through the independent validation at BWH. However, especially for the 2D-3D registration framework presented in Chapter 5, TRE measurements were only performed using landmarks selected by the author.
As a result, a more extensive validation by clinical experts needs to be performed.

• The proposed MR-TRUS fusion (GMM-FEM and SSM-FEM) and 2D-3D registration methods need to be integrated into the clinical workflow. Given that GMM-FEM registration is a special case of SSM-FEM registration, and is simpler to implement, we recommend this method as a starting point. While the run-time of GMM-FEM registration is well within clinical requirements, the run-time of our 2D-3D registration needs to be improved by two orders of magnitude. This speed-up should be possible using the approaches discussed in Section 5.4.

Bibliography

[1] Anusha Achuthan, Mandava Rajeswari, and Win Mar @ Salmah Jalaluddin. Hippocampus localization guided by coherent point drift registration using assembled point set. In Hybrid Artificial Intelligent Systems, volume 8073 of Lecture Notes in Computer Science, pages 92–102. 2013.
[2] N. Baka, B. L. Kaptein, M. de Bruijne, T. van Walsum, J. E. Giphart, W. J. Niessen, and B. P. F. Lelieveldt. 2D-3D shape reconstruction of the distal femur from stereo X-ray imaging using statistical shape models. Medical Image Analysis, 15(6):840–850, 2011.
[3] SK Balci, P Golland, M Shenton, and WM Wells. Free-form B-spline deformation model for groupwise registration. In Medical Image Computing and Computer-Assisted Interventions, volume 10, pages 23–30, 2007.
[4] Jelle O. Barentsz, Jonathan Richenberg, Richard Clements, Peter Choyke, Sadhna Verma, Geert Villeirs, Olivier Rouviere, Vibeke Logager, and Jurgen J. Fütterer. ESUR prostate MR guidelines 2012. European Radiology, 22(4):746–757, 2012.
[5] Michael J Barry. Prostate-specific-antigen testing for early diagnosis of prostate cancer. New England Journal of Medicine, 344(18):1373–1377, 2001.
[6] Michael Baumann, Pierre Mozer, Vincent Daanen, and Jocelyne Troccaz. Prostate biopsy tracking with deformation estimation.
Medical Image Analysis, 16(3):562–576, 2012.
[7] Jeffrey Bax, Derek Cool, Lori Gardi, Kerry Knight, David Smith, Jacques Montreuil, Shi Sherebrin, Cesare Romagnoli, and Aaron Fenster. Mechanically assisted 3D ultrasound guided prostate biopsy system. Medical Physics, 35(12):5397–5410, 2008.
[8] Floris F. Berendsen, Uulke A. van der Heide, Thomas R. Langerak, Alexis N.T.J. Kotte, and Josien P.W. Pluim. Free-form image registration regularized by a statistical shape model: application to organ segmentation in cervical MR. Computer Vision and Image Understanding, 117(9):1119–1127, 2013.
[9] P.J. Besl and Neil D. McKay. A method for registration of 3D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(2):239–256, 1992.
[10] Wolfgang Birkfellner, Michael Figl, Joachim Kettenbach, Johann Hummel, Peter Homolka, Rüdiger Schernthaner, Thomas Nau, and Helmar Bergmann. Rigid 2D/3D slice-to-volume registration and its application on fluoroscopic CT images. Medical Physics, 34(1), 2007.
[11] J. Bonet and R. D. Wood. Nonlinear continuum mechanics for finite element analysis. Cambridge University Press, 2000.
[12] Fred L. Bookstein. Principal warps: thin-plate splines and the decomposition of deformations. IEEE Transactions on Pattern Analysis and Machine Intelligence, 11(6):567–585, June 1989.
[13] Brian G Booth and Ghassan Hamarneh. DTI-DeformIt: Generating ground-truth validation data for diffusion tensor image analysis tasks. In Biomedical Imaging (ISBI), 2014 IEEE 11th International Symposium on, pages 730–733. IEEE, 2014.
[14] Canadian Cancer Society's Advisory Committee on Cancer Statistics. Canadian cancer statistics 2015. Toronto, ON: Canadian Cancer Society.
[15] CL Carrol, FG Sommer, JE McNeal, and TA Stamey. The abnormal prostate: MR imaging at 1.5 T with histopathologic correlation. Radiology, 163(2):521–525, 1987.
[16] D. M. Cash, M. I. Miga, T. K. Sinha, R. L. Galloway, and W. C. Chapman.
Compensating for intraoperative soft-tissue deformations using incomplete surface data and finite elements. IEEE Transactions on Medical Imaging, 24(11):1479–1491, 2005.
[17] P. Cerveri, A. Manzotti, A. Vanzulli, and G. Baroni. Local shape similarity and mean-shift curvature for deformable surface mapping of anatomical structures. IEEE Transactions on Biomedical Engineering, 61(1):16–24, 2014.
[18] HM Chan, Albert Chung, Simon CH Yu, and William M Wells III. 2D-3D vascular registration between digital subtraction angiographic (DSA) and magnetic resonance angiographic (MRA) images. In Biomedical Imaging: Nano to Macro, 2004. IEEE International Symposium on, pages 708–711. IEEE, 2004.
[19] Najeeb Chowdhury, Robert Toth, Jonathan Chappelow, Sung Kim, Sabin Motwani, Salman Punekar, Haibo Lin, Stefan Both, Neha Vapiwala, Stephen Hahn, and Anant Madabhushi. Concurrent segmentation of the prostate on MRI and CT via linked statistical shape models for radiotherapy planning. Medical Physics, 39(4):2214–2228, 2012.
[20] H. Chui, A. Rangarajan, Jie Zhang, and C.M. Leonard. Unsupervised learning of an atlas from unlabeled point-sets. IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(2):160–172, February 2004.
[21] Haili Chui and Anand Rangarajan. A new point matching algorithm for non-rigid registration. Computer Vision and Image Understanding, 89(2-3):114–141, 2003.
[22] Benoît Combès and Sylvain Prima. An efficient EM-ICP algorithm for symmetric consistent non-linear registration of point sets. In Medical Image Computing and Computer-Assisted Interventions, volume 6362, pages 594–601. 2010.
[23] Roch M Comeau, Abbas F Sadikot, Aaron Fenster, and Terry M Peters. Intraoperative ultrasound for guidance and tissue shift correction in image-guided neurosurgery.
Medical Physics, 27(4):787–800, 2000.
[24] Michael S Cookson, Bruce J Roth, Philipp Dahm, Christine Engstrom, Stephen J Freedland, Maha Hussain, Daniel W Lin, William T Lowrance, Mohammad Hassan Murad, William K Oh, et al. Castration-resistant prostate cancer: AUA guideline. The Journal of Urology, 190(2):429–438, 2013.
[25] Derek W Cool, Jeff Bax, Cesare Romagnoli, Aaron D Ward, Lori Gardi, Vaishali Karnik, Jonathan Izawa, Joseph Chin, and Aaron Fenster. Fusion of MRI to 3D TRUS for mechanically-assisted targeted prostate biopsy: system design and initial clinical experience. In Prostate Cancer Imaging. Image Analysis and Image-Guided Interventions, pages 121–133. Springer, 2011.
[26] T. F. Cootes, C. J. Taylor, D. H. Cooper, and J. Graham. Active shape models-their training and application. Computer Vision and Image Understanding, 61(1):38–59, 1995.
[27] AV D'Amico, CM Tempany, R Cormack, N Hata, M Jinzaki, K Tuncali, M Weinstein, and JP Richie. Transperineal magnetic resonance image guided prostate biopsy. The Journal of Urology, 164(2):385–387, 2000.
[28] R.H. Davies, C.J. Twining, T.F. Cootes, J.C. Waterton, and C.J. Taylor. A minimum description length approach to statistical shape modeling. IEEE Transactions on Medical Imaging, 21(5):525–537, May 2002.
[29] Tharindu De Silva, Derek W. Cool, Cesare Romagnoli, Aaron Fenster, and Aaron D. Ward. Evaluating the utility of intraprocedural 3D TRUS image information in guiding registration for displacement compensation during prostate biopsy. Medical Physics, 41(8):1–11, 2014.
[30] NB Delongchamps, Michaël Peyromaure, Alexandre Schull, Frédéric Beuvon, Naïm Bouazza, Thierry Flam, Marc Zerbib, Naira Muradyan, Paul Legman, and François Cornud. Prebiopsy magnetic resonance imaging and prostate cancer detection: comparison of random and targeted biopsies. Journal of Urology, 189(2):493–499, February 2013.
[31] SP DiMaio, S Pieper, K Chinzei, N Hata, SJ Haker, DF Kacher, G Fichtinger, CM Tempany, and R Kikinis.
Robot-assisted needle placement in open MRI: system architecture, integration and validation. Computer Aided Surgery, 12(1):15–24, 2007.
[32] Stanley Durrleman, Xavier Pennec, Alain Trouvé, and Nicholas Ayache. Statistical models of sets of curves and surfaces based on currents. Medical Image Analysis, 13(5):793–808, 2009.
[33] Klaus Eichler, Susanne Hempel, Jennifer Wilby, Lindsey Myers, Lucas M Bachmann, and Jos Kleijnen. Diagnostic value of systematic biopsy methods in the investigation of prostate cancer: a systematic review. The Journal of Urology, 175(5):1605–1612, 2006.
[34] WJ Ellis, MP Chetner, SD Preston, and MK Brawer. Diagnosis of prostatic carcinoma: the yield of serum prostate specific antigen, digital rectal examination and transrectal ultrasonography. The Journal of Urology, 152(5 Pt 1):1520–1525, 1994.
[35] K. Engelhard, H.P. Hollenbach, B. Kiefer, A. Winkel, K. Goeb, and D. Engehausen. Prostate biopsy in the supine position in a standard 1.5-T scanner under real time MR-imaging control using a MR-compatible endorectal biopsy device. European Radiology, 16(6):1237–1243, 2006.
[36] Jonathan I Epstein, Patrick C Walsh, Marné Carmichael, and Charles B Brendler. Pathologic and clinical findings to predict tumor extent of nonpalpable (stage T1c) prostate cancer. JAMA, 271(5):368–374, 1994.
[37] A Fedorov, K Tuncali, FM Fennessy, J Tokuda, N Hata, WM Wells, R Kikinis, and CM Tempany. Image registration for targeted MRI-guided transperineal prostate biopsy. Journal of Magnetic Resonance Imaging, 36(4):987–992, May 2012.
[38] Andriy Fedorov, Reinhard Beichel, Jayashree Kalpathy-Cramer, Julien Finet, Jean-Christophe Fillion-Robin, Sonia Pujol, Christian Bauer, Dominique Jennings, Fiona Fennessy, Milan Sonka, John Buatti, Stephen Aylward, James V. Miller, Steve Pieper, and Ron Kikinis. 3D Slicer as an image computing platform for the quantitative imaging network.
Magnetic Resonance Imaging, 30(9):1323–1341, 2012.
[39] Andriy Fedorov, Reinhard Beichel, Jayashree Kalpathy-Cramer, Julien Finet, Jean-Christophe C Fillion-Robin, Sonia Pujol, Christian Bauer, Dominique Jennings, Fiona Fennessy, Milan Sonka, John Buatti, Stephen Aylward, James V. Miller, Steve Pieper, and Ron Kikinis. 3D Slicer as an image computing platform for the quantitative imaging network. Magnetic Resonance Imaging, 30(9):1323–1341, July 2012.
[40] Baowei Fei, J.L. Duerk, D.T. Boll, J.S. Lewin, and D.L. Wilson. Slice-to-volume registration and its potential application to interventional MRI-guided radio-frequency thermal ablation of prostate cancer. IEEE Transactions on Medical Imaging, 22(4):515–525, April 2003.
[41] Aaron Fenster, Donal B Downey, and H Neale Cardinal. Three-dimensional ultrasound imaging. Physics in Medicine and Biology, 46(5):R67, 2001.
[42] M. Ferrant, A. Nabavi, B. Macq, F. A. Jolesz, R. Kikinis, and S. K. Warfield. Registration of 3D intraoperative MR images of the brain using a finite-element biomechanical model. IEEE Transactions on Medical Imaging, 20(12):1384–1397, 2001.
[43] M. Fleute and S. Lavallée. Nonrigid 3-D/2-D registration of images using statistical models. In Chris Taylor and Alain Colchester, editors, Medical Image Computing and Computer-Assisted Intervention – MICCAI '99, volume 1679 of Lecture Notes in Computer Science, pages 138–147. Springer Berlin / Heidelberg, 1999.
[44] Jurgen J Futterer, Stijn WTPJ Heijmink, Tom WJ Scheenen, Jeroen Veltman, Henkjan J Huisman, Pieter Vos, Christina A Hulsbergen-Van de Kaa, J Alfred Witjes, Paul FM Krabbe, Arend Heerschap, et al. Prostate cancer localization with dynamic contrast-enhanced MR imaging and proton MR spectroscopic imaging. Radiology, 241(2):449–458, 2006.
[45] A. Gholipour, J.A. Estroff, and S.K. Warfield. Robust super-resolution volume reconstruction from slice acquisitions: Application to fetal brain MRI.
IEEE Transactions on Medical Imaging, 29(10):1739–1758, October 2010.
[46] Peter Gibbs, Martin D Pickles, and Lindsay W Turnbull. Diffusion imaging of the prostate at 3.0 Tesla. Investigative Radiology, 41(2):185–188, 2006.
[47] Eli Gibson, Cathie Crukley, Mena Gaed, José A. Gómez, Madeleine Moussa, Joseph L. Chin, Glenn S. Bauman, Aaron Fenster, and Aaron D. Ward. Registration of prostate histology images to ex vivo MR images via strand-shaped fiducials. Journal of Magnetic Resonance Imaging, 36(6):1402–1412, 2012.
[48] Steven Gold, Anand Rangarajan, Chien-Ping Lu, Suguna Pappu, and Eric Mjolsness. New algorithms for 2D and 3D point matching: pose estimation and correspondence. Pattern Recognition, 31(8):1019–1031, 1998.
[49] Séverine Habert, Parmeshwar Khurd, and Christophe Chefd'hotel. Registration of multiple temporally related point sets using a novel variant of the coherent point drift algorithm: application to coronary tree matching. In SPIE Medical Imaging, pages 86690M–86690M. International Society for Optics and Photonics, 2013.
[50] Masoom A Haider, Theodorus H van der Kwast, Jeff Tanguay, Andrew J Evans, Ali-Tahir Hashmi, Gina Lockwood, and John Trachtenberg. Combined T2-weighted and diffusion-weighted MRI for localization of prostate cancer. American Journal of Roentgenology, 189(2):323–328, 2007.
[51] Ghassan Hamarneh, Preet Jassi, and Lisa Tang. Simulation of ground-truth validation data via physically- and statistically-based warps. In Dimitris Metaxas, Leon Axel, Gabor Fichtinger, and Gábor Székely, editors, Medical Image Computing and Computer-Assisted Intervention – MICCAI 2008, volume 5241 of Lecture Notes in Computer Science, pages 459–467. Springer Berlin Heidelberg, 2008.
[52] Thomas Hambrock, Caroline Hoeks, Christina Hulsbergen van de Kaa, Tom Scheenen, Jurgen Fütterer, Stefan Bouwense, Inge van Oort, Fritz Schröder, Henkjan Huisman, and Jelle Barentsz.
Prospective assessment of prostate cancer aggressiveness using 3-T diffusion-weighted magnetic resonance imaging-guided biopsies versus a systematic 10-core transrectal ultrasound prostate biopsy cohort. European Urology, 61(1):177–184, 2012.
[53] P Hammerer and H Huland. Systematic sextant biopsies in 651 patients referred for prostate evaluation. The Journal of Urology, 151(1):99–102, 1994.
[54] Nobuhiko Hata, Masahiro Jinzaki, Daniel Kacher, Robert Cormak, David Gering, Arya Nabavi, Stuart G Silverman, Anthony V D'Amico, Ron Kikinis, Ferenc A Jolesz, et al. MR imaging-guided prostate biopsy with surgical navigation software: Device validation and feasibility. Radiology, 220(1):263–268, 2001.
[55] VJ Hegde, RV Mulkern, LP Panych, FM Fennessy, A Fedorov, SE Maier, and CM Tempany. Multiparametric MRI of prostate cancer: An update on state-of-the-art techniques and their performance in detecting and localizing prostate cancer. Journal of Magnetic Resonance Imaging, 37(5):1035–1054, 2013.
[56] SWTPJ Heijmink, TWJ Scheenen, ENJT van Lin, AG Visser, LALM Kiemeney, JA Witjes, and JO Barentsz. Changes in prostate shape and volume and their implications for radiotherapy after introduction of endorectal balloon as determined by MRI at 3T. International Journal of Radiation Oncology Biology Physics, 73(5):1446–1453, April 2009.
[57] Tobias Heimann and Hans-Peter Meinzer. Statistical shape models for 3D medical image segmentation: A review. Medical Image Analysis, 13(4):543–563, 2009.
[58] Mattias P. Heinrich, Mark Jenkinson, Manav Bhushan, Tahreema Matin, Fergus V. Gleeson, Sir Michael Brady, and Julia A. Schnabel. MIND: Modality independent neighbourhood descriptor for multi-modal deformable registration. Medical Image Analysis, 16(7):1423–1435, 2012.
[59] R. Horaud, F. Forbes, M. Yguel, G. Dewaele, and Jian Zhang. Rigid and articulated point registration with expectation conditional maximization.
IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(3):587–602, 2011.
[60] Yipeng Hu, Hashim Uddin Ahmed, Zeike Taylor, Clare Allen, Mark Emberton, David Hawkes, and Dean Barratt. MR to ultrasound registration for image-guided prostate interventions. Medical Image Analysis, 16(3):687–703, 2012.
[61] Heng Huang, Li Shen, Rong Zhang, F. Makedon, A. Saykin, and J. Pearlman. A novel surface registration algorithm with biomedical modeling applications. IEEE Transactions on Information Technology in Biomedicine, 11(4):474–482, 2007.
[62] Qi-Xing Huang, Bart Adams, Martin Wicke, and Leonidas J. Guibas. Non-rigid registration under isometric deformations. Computer Graphics Forum, 27(5):1449–1457, 2008.
[63] Johann Hummel, Michael Figl, Michael Bax, Helmar Bergmann, and Wolfgang Birkfellner. 2D/3D registration of endoscopic ultrasound to CT volume data. Physics in Medicine and Biology, 53(16):4303, 2008.
[64] Doug L James and Dinesh K Pai. ArtDefo: accurate real time deformable objects. In Proceedings of the 26th annual conference on Computer graphics and interactive techniques, pages 65–72. ACM Press/Addison-Wesley Publishing Co., 1999.
[65] Nicholas James. Primer on Prostate Cancer. Springer, 2014.
[66] Ahmedin Jemal, Rebecca Siegel, Elizabeth Ward, Taylor Murray, Jiaquan Xu, and Michael J. Thun. Cancer statistics, 2007. CA: A Cancer Journal for Clinicians, 57(1):43–66, 2007.
[67] Bing Jian and B.C. Vemuri. Robust point set registration using Gaussian mixture models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(8):1633–1645, 2011.
[68] H J Johnson, G Harris, and K Williams. BRAINSFit: Mutual information registrations of whole-brain 3D images, using the Insight Toolkit. Insight Journal, (July-December), 2007.
[69] J Stephen Jones. Prostate biopsy: indications, techniques, and complications. Springer Science & Business Media, 2009.
[70] I Kaplan, NE Oldenburg, P Meskell, M Blake, P Church, and EJ Holupka.
Real time MRI-ultrasound image guided stereotactic prostate biopsy. Magnetic Resonance Imaging, 20(3):295–299, April 2002.
[71] V. V. Karnik, A. Fenster, J. Bax, D. W. Cool, L. Gardi, I. Gyacskov, C. Romagnoli, and A. D. Ward. Assessment of image registration accuracy in three-dimensional transrectal ultrasound guided prostate biopsy. Medical Physics, 37(2):802–813, 2010.
[72] J Kemper, R Sinkus, J Lorenzen, C Nolte-Ernsting, A Stork, and G Adam. MR elastography of the prostate: initial in-vivo application. In RöFo-Fortschritte auf dem Gebiet der Röntgenstrahlen und der bildgebenden Verfahren, volume 176, pages 1094–1099, 2004.
[73] Siavash Khallaghi, Corina G. M. Leung, Pezhman Foroughi, Chris Nguan, Keyvan Hastrudi-Zaad, and Purang Abolmaesumi. Experimental validation of an intra-subject elastic registration algorithm for dynamic 3D ultrasound images. Medical Physics, accepted for publication, 2012.
[74] Siavash Khallaghi, Saman Nouranian, Samira Sojoudi, Hussam A. Ashab, Lindsay Machan, Silvia Chang, Peter Black, Martin Gleave, Larry Goldenberg, and Purang Abolmaesumi. Motion and deformation compensation for freehand prostate biopsies. Proc. SPIE, 9036:903620–6, 2014.
[75] Chan Kyo Kim, Byung Kwan Park, and Bohyun Kim. Localization of prostate cancer using 3T MRI: comparison of T2-weighted and dynamic contrast-enhanced imaging. Journal of Computer Assisted Tomography, 30(1):7–11, 2006.
[76] Roger S Kirby and Manish I Patel. Fast Facts: Prostate Cancer. Health Press, 2012.
[77] Piotr Kozlowski, Silvia D. Chang, Edward C. Jones, Kenneth W. Berean, Henry Chen, and S. Larry Goldenberg. Combined diffusion-weighted and dynamic contrast-enhanced MRI for prostate cancer diagnosis – correlation with biopsy and histopathology. Journal of Magnetic Resonance Imaging, 24(1):108–113, 2006.
[78] Piotr Kozlowski, Silvia D. Chang, Ran Meng, Burkhard Mädler, Robert Bell, Edward C. Jones, and S. Larry Goldenberg.
Combined prostate diffusion tensor imaging and dynamic contrast enhanced MRI at 3T – quantitative correlation with biopsy. Magnetic Resonance Imaging, 28(5):621–628, 2010.
[79] Axel Krieger, Robert C Susil, Cynthia Ménard, Jonathan A Coleman, Gabor Fichtinger, Ergin Atalar, and Louis L Whitcomb. Design of a novel MRI compatible manipulator for image guided prostate interventions. IEEE Transactions on Biomedical Engineering, 52(2):306–313, 2005.
[80] Virendra Kumar, N. R. Jagannathan, Rajeev Kumar, S. Thulkar, S. Dutta Gupta, S. N. Dwivedi, A. K. Hemal, and N. P. Gupta. Apparent diffusion coefficient of the prostate in men prior to biopsy: determination of a cut-off value to predict malignancy of the peripheral zone. NMR in Biomedicine, 20(5):505–511, 2007.
[81] Hanif M Ladak, Fei Mao, Yunqiu Wang, Dónal B Downey, David A Steinman, and Aaron Fenster. Prostate boundary segmentation from 2D ultrasound images. Medical Physics, 27(8):1777–1788, 2000.
[82] A Lasso, T Heffter, A Rankin, C Pinter, T Ungi, and G Fichtinger. PLUS: open-source toolkit for ultrasound-guided intervention systems. IEEE Transactions on Biomedical Engineering, 61(10):2527–2537, October 2014.
[83] Fred Lee, ST Torp-Pedersen, DB Siders, PJ Littrup, and RD McLeary. Transrectal ultrasound in the diagnosis and staging of prostatic carcinoma. Radiology, 170(3):609–615, 1989.
[84] Katia Ramos Moreira Leite, Luiz HA Camara-Lopes, Marcos F Dall'Oglio, Jose Cury, Alberto A Antunes, Adriana Sañudo, and Miguel Srougi. Upgrading the Gleason score in extended prostate biopsy: implications for treatment choice. International Journal of Radiation Oncology Biology Physics, 73(2):353–356, 2009.
[85] JK Logan, S Rais-Bahrami, B Turkbey, A Gomella, H Amalou, PL Choyke, BJ Wood, and PA Pinto. Current status of magnetic resonance imaging (MRI) and ultrasonography fusion software platforms for guidance of prostate biopsies. British Journal of Urology International, 114(5):641–652, November 2014.
[86] S.
Sara Mahdavi, Nick Chng, Ingrid Spadinger, William J. Morris, and Septimiu E. Salcudean. Semi-automatic segmentation for prostate interventions. Medical Image Analysis, 15(2):226–237, 2011.
[87] N. Makni, I Toumi, P. Puech, M. Issa, O. Colot, S. Mordon, and N. Betrouni. A non rigid registration and deformation algorithm for ultrasound & MR images to guide prostate cancer therapies. In Engineering in Medicine and Biology Society (EMBC), 2010 Annual International Conference of the IEEE, pages 3711–3714, August 2010.
[88] Nasr Makni, Philippe Puech, Pierre Colin, Abdelrahmene Azzouzi, Serge Mordon, and Nacim Betrouni. Elastic image registration for guiding focal laser ablation of prostate cancer: Preliminary results. Computer Methods and Programs in Biomedicine, 108(1):213–223, 2012.
[89] Bahram Marami, Shahin Sirouspour, Suha Ghoul, Jeremy Cepek, Sean R.H. Davidson, David W. Capson, John Trachtenberg, and Aaron Fenster. Elastic registration of prostate MR images based on estimation of deformation states. Medical Image Analysis, 21(1):87–103, 2015.
[90] S. Martin, M. Baumann, V. Daanen, and J. Troccaz. MR prior based automatic segmentation of the prostate in TRUS images for MR/TRUS data fusion. In Biomedical Imaging: From Nano to Macro, 2010 IEEE International Symposium on, pages 640–643, April 2010.
[91] C R Maurer and V Raghavan. A linear time algorithm for computing exact Euclidean distance transforms of binary images in arbitrary dimensions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(2):265–270, February 2003.
[92] Graham McNeill and Sethu Vijayakumar. A probabilistic approach to robust shape matching. In Image Processing, 2006 IEEE International Conference on, pages 937–940. IEEE, 2006.
[93] Huadong Miao, Hiroshi Fukatsu, and Takeo Ishigaki. Prostate cancer detection with 3-T MRI: Comparison of diffusion-weighted and T2-weighted imaging.
European Journal of Radiology, 61(2):297–302, 2007.
[94] Ashraf Mohamed, Christos Davatzikos, and Russell Taylor. A combined statistical and biomechanical model for estimation of intra-operative prostate deformation. In Medical Image Computing and Computer-Assisted Intervention, volume 2489, pages 452–460. 2002.
[95] Mehdi Moradi, Firdaus Janoos, Andriy Fedorov, Petter Risholm, Tina Kapur, Luciant D. Wolfsberger, Paul L. Nguyen, Clare M. Tempany, and William M. Wells III. Two solutions for registration of ultrasound to MRI for image-guided prostate interventions. In Engineering in Medicine and Biology Society (EMBC), 2012 Annual International Conference of the IEEE, pages 1129–1132, 2012.
[96] Nicolas Mottet, Joaquim Bellmunt, Michel Bolla, Steven Joniau, Malcolm Mason, Vsevolod Matveev, Hans-Peter Schmid, Theo Van der Kwast, Thomas Wiegel, Filiberto Zattoni, et al. EAU guidelines on prostate cancer. Part II: Treatment of advanced, relapsing, and castration-resistant prostate cancer. Actas Urológicas Españolas (English Edition), 35(10):565–579, 2011.
[97] YR Murciano-Goroff, Luciant D Wolfsberger, Arti Parekh, Fiona M Fennessy, Kemal Tuncali, Peter F Orio, 3rd, Thomas R Niedermayr, W Warren Suh, Phillip M Devlin, Clare Mary C Tempany, Emily H Neubauer Sugar, Desmond A O'Farrell, Graeme Steele, Michael O'Leary, Ivan Buzurovic, Antonio L Damato, Robert A Cormack, Andriy Y Fedorov, and Paul L Nguyen. Variability in MRI vs. ultrasound measures of prostate volume and its impact on treatment recommendations for favorable-risk prostate cancer patients: a case series. Radiation Oncology, 9:200, 9 September 2014.
[98] A. Myronenko and Xubo Song. Point set registration: Coherent point drift. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(12):2262–2275, 2010.
[99] Jun Nakashima, Akihiro Tanimoto, Yutaka Imai, Makio Mukai, Yutaka Horiguchi, Ken Nakagawa, Mototsugu Oya, Takashi Ohigashi, Ken Marumo, and Masaru Murai.
Endorectal MRI for prediction of tumor site, tumor size, and local extension of prostate cancer. Urology, 64(1):101–105, 2004.
[100] S Natarajan, Leonard S Marks, Daniel J A Margolis, Jiaoti Huang, Maria Luz Macairan, Patricia Lieu, and Aaron Fenster. Clinical application of a 3D ultrasound-guided prostate biopsy system. Urologic Oncology: Seminars and Original Investigations, 29(3):334–342, 2011.
[101] G. Nir, R.S. Sahebjavaher, P. Kozlowski, S.D. Chang, E.C. Jones, S.L. Goldenberg, and S.E. Salcudean. Registration of whole-mount histology and volumetric imaging of the prostate using particle filtering. IEEE Transactions on Medical Imaging, 33(8):1601–1613, August 2014.
[102] G. Nir, R.S. Sahebjavaher, P. Kozlowski, S.D. Chang, R. Sinkus, S.L. Goldenberg, and S.E. Salcudean. Model-based registration of ex vivo and in vivo MRI of the prostate using elastography. IEEE Transactions on Medical Imaging, 32(7):1349–1361, July 2013.
[103] Karsten Østergaard Noe and Thomas Sangild Sørensen. Solid mesh registration for radiotherapy treatment planning. In Biomedical Simulation, volume 5958, pages 59–70. 2010.
[104] M Norberg, L Egevad, L Holmberg, P Sparen, BJ Norlen, and C Busch. The sextant protocol for ultrasound-guided core biopsies of the prostate underestimates the presence of cancer. Urology, 50(4):562–566, 1997.
[105] Y. Otake, M. Armand, Robert S. Armiger, M.D. Kutzer, E. Basafa, P. Kazanzides, and R.H. Taylor. Intraoperative image-based multiview 2D/3D registration for image-guided orthopaedic surgery: Incorporation of fiducial-based C-arm tracking and GPU-acceleration. IEEE Transactions on Medical Imaging, 31(4):948–962, April 2012.
[106] Evren Özarslan and Thomas H Mareci. Generalized diffusion tensor imaging and analytical relationships between diffusion tensor imaging and high angular resolution diffusion imaging. Magnetic Resonance in Medicine, 50(5):955–965, 2003.
[107] Uday Patel and David Rickards.
Handbook of transrectal ultrasound and biopsy of the prostate. Taylor & Francis, 2002.
[108] J-PV Pelteret and BD Reddy. Computational model of soft tissues in the human upper airway. International Journal for Numerical Methods in Biomedical Engineering, 28(1):111–132, 2012.
[109] Xavier Pennec, Pascal Cachier, and Nicholas Ayache. Understanding the Demon's algorithm: 3D non-rigid registration by gradient descent. In Medical Image Computing and Computer-Assisted Intervention – MICCAI '99, pages 597–605. Springer, 1999.
[110] T Penzkofer, K Tuncali, A Fedorov, S-E Song, J Tokuda, FM Fennessy, MG Vangel, AS Kibel, RV Mulkern, WM Wells, Nobuhiko Hata, and Clare M C Tempany. Transperineal in-bore 3-T MR imaging-guided prostate biopsy: A prospective clinical observational study. Radiology, 0(0):140221, 2014.
[111] Guillaume Ploussard, Jonathan I Epstein, Rodolfo Montironi, Peter R Carroll, Manfred Wirth, Marc-Oliver Grimm, Anders S Bjartell, Francesco Montorsi, Stephen J Freedland, Andreas Erbersdobler, et al. The contemporary concept of significant versus insignificant prostate cancer. European Urology, 60(2):291–303, 2011.
[112] J. P. W. Pluim, J. B. A. Maintz, and M. A. Viergever. Mutual-information-based registration of medical images: a survey. IEEE Transactions on Medical Imaging, 22(8):986–1004, 2003.
[113] P Puech, Olivier Rouvière, Raphaele Renard-Penna, Arnauld Villers, Patrick Devos, Marc Colombel, Marc-Olivier Bitker, Xavier Leroy, Florence Mège-Lechevallier, Eva Comperat, Adil Ouzzane, and Laurent Lemaitre. Prostate cancer diagnosis: Multiparametric MR-targeted biopsy with cognitive and transrectal US-MR fusion guidance versus systematic biopsy – prospective multicenter study. Radiology, 268(2):461–469, 2013.
[114] Wu Qiu, Jing Yuan, Eranga Ukwatta, David Tessier, and Aaron Fenster. Rotational-slice-based prostate segmentation using level set with shape constraint for 3D end-firing TRUS guided biopsy.
In MedicalImage Computing and Computer-Assisted Interventions, volume 7510,pages 537–544. 2012.[115] J Ramon and LJ Denis. Prostate cancer: Recent results in cancerresearch.[116] Anand Rangarajan, Haili Chui, Eric Mjolsness, Suguna Pappu, LilaDavachi, Patricia Goldman-Rakic, and James Duncan. A robust point-matching algorithm for autoradiograph alignment. Medical ImageAnalysis, 1(4):379–398, 1997.123Bibliography[117] A. Rasoulian, R. Rohling, and P. Abolmaesumi. Group-wise registra-tion of point sets for statistical shape models. IEEE Transactions onMedical Imaging, 31(11):2025–2034, 2012.[118] A Rasoulian, R. Rohling, and P. Abolmaesumi. Lumbar spine seg-mentation using a statistical multi-vertebrae anatomical shape+posemodel. IEEE Transactions on Medical Imaging, 32(10):1890–1900, Oct2013.[119] Abtin Rasoulian, Purang Abolmaesumi, and Parvin Mousavi. Feature-based multibody rigid registration of CT and ultrasound images oflumbar spine. Medical Physics, 39(6):3154–3166, 2012.[120] Stefan A Reinsberg, Geoffrey S Payne, Sophie F Riches, Sue Ashley,Jonathan M Brewster, Veronica A Morgan, and Nandita M Desouza.Combined use of diffusion-weighted MRI and 1H MR spectroscopy toincrease accuracy in prostate cancer detection. American Journal ofRoentgenology, 188(1):91–98, 2007.[121] Matthew D. Rifkin, Elias A. Zerhouni, Constantine A. Gatsonis,Leslie E. Quint, David M. Paushter, Jonathan I. Epstein, Ulrike Ham-per, Patrick C. Walsh, and Barbara J. McNeil. Comparison of magneticresonance imaging and ultrasonography in staging early prostate can-cer. results of a multi-institutional cooperative trial. The New EnglandJournal of Medicine, 323(10):621–626, 1990.[122] RS Rivlin. Large elastic deformations of isotropic materials. IV. fur-ther developments of the general theory. Philosophical Transactionsof the Royal Society of London. Series A, Mathematical and PhysicalSciences, 241(835):379–397, 1948.[123] Claus G. 
Roehrborn, John Mcconnell, Jaime Bonilla, Sidney Rosen-blatt, Perry B. Hudson, Gholem H. Malek, Paul F. Schellham-mer, Reginald Bruskewitz, Alvin M. Matsumoto, Lloyd H. Harrison,Harold A. Fuselier, Patrick Walsh, Johnny Roy, Gerald Andriole, Mar-tin Resnick, and Joanne Waldstreicher. Serum prostate specific antigenis a strong predictor of future prostate growth in men with benign pro-static hyperplasia. The Journal of Urology, 163(1):13 – 20, 2000.[124] K. Rohr, H.S. Stiehl, R. Sprengel, W. Beil, T.M. Buzug, J. Weese, andM.H. Kuhn. Point-based elastic registration of medical image datausing approximating thin-plate splines. In Visualization in BiomedicalComputing, volume 1131, pages 297–306. 1996.124Bibliography[125] D. C. Rucker, Yifei Wu, L. W. Clements, J. E. Ondrake, T. S. Pheiffer,A. L. Simpson, W. R. Jarnagin, and M. I. Miga. A mechanics-basednonrigid registration method for liver surgery using sparse intraoper-ative data. IEEE Transactions on Medical Imaging, 33(1):147–158,2014.[126] Y. Sahillioglu and Y. Yemez. Minimum-distortion isometric shape cor-respondence using EM algorithm. IEEE Transactions on Pattern Anal-ysis and Machine Intelligence, 34(11):2203–2215, 2012.[127] JL Sauvain, P Palascak, D Bourscheid, C Chabi, A Atassi, JM Bremon,and R Palascak. Value of power doppler and 3d vascular sonographyas a method for diagnosis and staging of prostate cancer. EuropeanUrology, 44(1):21–31, 2003.[128] Emil Scosyrev and Edward M Messing. Reply to prostate-specific anti-gen screening for prostate cancer and the risk of overt metastatic dis-ease at presentation. Cancer, 119(5):1113–1114, 2013.[129] A Shah, O Zettinig, T Maurer, C Precup, C Schulte zu Berge, B Frisch,and N Navab. An open source multimodal image-guided prostatebiopsy framework. 
In 3rd Workshop on Clinical Image-based Proce-dures: Translational Research in Medical Imaging (CLIP), 17th Medi-cal Image Computing and Computer-Assisted Interventions, 2014.[130] Kevin C Shandera, Gregory P Thibault, and George E Deshon. Vari-ability in patient preparation for prostate biopsy among american urol-ogists. Urology, 52(4):644–646, 1998.[131] Hang Si. A quality tetrahedral mesh generator and three-dimensionaldelaunay triangulator. Weierstrass Institute for Applied Analysis andStochastic, Berlin, Germany, 2006.[132] Robin Sibson and G Stone. Computation of thin-plate splines. SIAMJournal on Scientific and Statistical Computing, 12(6):1304–1313,1991.[133] R Siegel, D Naishadham, and A Jemal. Cancer statistics, 2013. CACancer J. Clin., 63(1):11–30, January 2013.[134] Tharindu Silva, DerekW. Cool, Jing Yuan, Cesare Romognoli, AaronFenster, and AaronD. Ward. Improving 2D-3D registration optimiza-tion using learned prostate motion data. In Medical Image Comput-125Bibliographying and Computer-Assisted Interventions, volume 8150, pages 124–131.Springer Berlin Heidelberg, 2013.[135] Tharindu De Silva, Aaron Fenster, Jeffrey Bax, Lori Gardi, Cesare Ro-magnoli, Jagath Samarabandu, and Aaron D. Ward. 2D-3D rigid reg-istration to compensate for prostate motion during 3D TRUS-guidedbiopsy. volume 83160, pages 83160O–83160B. SPIE, 2012.[136] Tharindu De Silva, Aaron Fenster, Derek W. Cool, Lori Gardi, Ce-sare Romagnoli, Jagath Samarabandu, and Aaron D. Ward. 2D-3Drigid registration to compensate for prostate motion during 3D TRUS-guided biopsy. Medical Physics, 40(2):022904, 2013.[137] Eric A Singer, Dragan J Golijanin, Robert S Davis, and Vikram Dogra.What’s new in urologic ultrasound? Urologic Clinics of North America,33(3):279–286, 2006.[138] Wendy L. Smith, Craig Lewis, Glenn Bauman, George Rodrigues,David D’Souza, Robert Ash, Derek Ho, Varagur Venkatesan, DónalDowney, and Aaron Fenster. 
Prostate volume contouring: A 3D anal-ysis of segmentation using 3DTRUS, CT, and MR. International Jour-nal of Radiation Oncology Biology Physics, 67(4):1238–1247, 2007.[139] A Sotiras, C. Davatzikos, and N. Paragios. Deformable medical im-age registration: A survey. IEEE Transactions on Medical Imaging,32(7):1153–1190, 2013.[140] Yue Sun, Jing Yuan, Wu Qiu, M. Rajchl, C. Romagnoli, and A. Fen-ster. Three-dimensional nonrigid MR-TRUS registration using dualoptimization. Medical Imaging, IEEE Transactions on, 34(5):1085–1095, May 2015.[141] Yue Sun, Jing Yuan, Martin Rajchl, Wu Qiu, Cesare Romagnoli, andAaron Fenster. Efficient convex optimization approach to 3D non-rigidMR-TRUS registration. In Medical Image Computing and Computer-Assisted Interventions, volume 8149, pages 195–202. 2013.[142] Robert C Susil, Cynthia Ménard, Axel Krieger, Jonathan A Coleman,Kevin Camphausen, Peter Choyke, Gabor Fichtinger, Louis L Whit-comb, C Norman Coleman, and Ergin Atalar. Transrectal prostatebiopsy and fiducial marker placement in a standard 1.5 t magneticresonance imaging scanner. The Journal of Urology, 175(1):113–120,2006.126Bibliography[143] L. Tang and G. Hamarneh. Smrfi: Shape matching via registration ofvector-valued feature images. In Computer Vision and Pattern Recog-nition, 2008. CVPR 2008. IEEE Conference on, pages 1–8, June 2008.[144] Farheen Taquee, Orcun Goksel, S. Sara Mahdavi, Mira Keyes,W. James Morris, Ingrid Spadinger, and Septimiu Salcudean. De-formable prostate registration from MR and TRUS images using sur-face error driven fem models. Proc. SPIE, 8316:831612, 2012.[145] Clare M Tempany, Xiao Zhou, Elias A Zerhouni, Matthew D Rifkin,Leslie E Quint, Catherine W Piccoli, James H Ellis, and Barbara JMcNeil. 
Staging of prostate cancer: results of radiology diagnosticoncology group project comparison of three MR imaging techniques.Radiology, 192(1):47–54, 1994.[146] J Tokuda, Gregory S Fischer, Xenophon Papademetris, Ziv Yaniv,Luis Ibanez, Patrick Cheng, Haiying Liu, Jack Blevins, Jumpei Arata,Alexandra J Golby, Tina Kapur, Steve Pieper, Everette C Bur-dette, Gabor Fichtinger, Clare M Tempany, and Nobuhiko Hata.OpenIGTLink: an open network protocol for image-guided therapyenvironment. The International Journal of Medical Robotics and Com-puter Assisted Surgery, 5(4):423–434, December 2009.[147] G. M. Treece, R. W. Prager, A. H. Gee, and L. Berman. Surfaceinterpolation from sparse cross sections using region correspondence.IEEE Transactions on Medical Imaging, 19(11):1106–1114, Nov 2000.[148] David S Tuch. Q-ball imaging. Magnetic Resonance in Medicine,52(6):1358–1372, 2004.[149] Oliver Van Kaick, Hao Zhang, Ghassan Hamarneh, and Daniel Cohen-Or. A survey on shape correspondence. In Computer Graphics Forum,volume 30, pages 1681–1707. Wiley Online Library, 2011.[150] Geert M Villeirs, Willem Oosterlinck, Els Vanherreweghe, and Gert ODe Meerleer. A qualitative approach to combined magnetic resonanceimaging and spectroscopy in the diagnosis of prostate cancer. Europeanjournal of Radiology, 73(2):352–356, 2010.[151] Arnauld Villers, Philippe Puech, Damien Mouton, Xavier Leroy,Charles Ballereau, and Laurent Lemaitre. Dynamic contrast enhanced,pelvic phased array magnetic resonance imaging of localized prostate127Bibliographycancer for predicting tumor volume: correlation with radical prostate-ctomy findings. The Journal of Urology, 176(6):2432–2437, 2006.[152] Patrick C Walsh and Herbert Lepor. The role of radical prostatectomyin the management of prostatic cancer. Cancer, 60(S3):526–537, 1987.[153] H. Wang and Baowei Fei. A robust b-splines-based point match methodfor non-rigid surface registration. In Bioinformatics and BiomedicalEngineering, 2008. ICBBE 2008. 
The 2nd International Conferenceon, pages 2353–2356, May 2008.[154] Yi Wang, Dong Ni, Jing Qin, Muqing Lin, Xiaoyan Xie, Ming Xu,and PhengAnn Heng. Towards personalized biomechanical model andMIND-weighted point matching for robust deformable MR-TRUS reg-istration. In Computer-Assisted and Robotic Endoscopy, pages 121–130. Springer International Publishing, 2014.[155] W. Wein, S. Brunke, A. Khamene, M.R. Callstrom, and N. Navab. Au-tomatic CT-ultrasound registration for diagnostic imaging and image-guided intervention. Medical Image Analysis, 12:577–585, 2008.[156] William M Wells III. Statistical approaches to feature-based objectrecognition. International Journal of Computer Vision, 21(1-2):63–98,1997.[157] William M. Wells III, Paul Viola, Hideki Atsumi, Shin Nakajima, andRon Kikinis. Multi-modal volume registration by maximization of mu-tual information. Medical Image Analysis, 1(1):35–51, 1996.[158] Sheng Xu, Jochen Kruecker, Peter Guion, Neil Glossop, Ziv Neeman,Peter Choyke, Anurag K. Singh, and Bradford J. Wood. Closed-loopcontrol in fused MR-TRUS image-guided prostate biopsy. In MedicalImage Computing and Computer-Assisted Interventions, pages 128–135, 2007.[159] Sheng Xu, Jochen Kruecker, Baris Turkbey, Neil Glossop, Anurag KSingh, Peter Choyke, Peter Pinto, and Bradford J Wood. Real-timeMRI-TRUS fusion for guidance of targeted prostate biopsies. Interna-tional Society for Computer Aided Surgery, 13(5):255–264, 2008.[160] Pingkun Yan, Sheng Xu, Baris Turkbey, and J. Kruecker. Adaptivelylearning local shape statistics for prostate segmentation in ultrasound.IEEE Transactions on Biomedical Engineering, 58(3):633–641, 2011.128[161] B.T.T. Yeo, M.R. Sabuncu, T. Vercauteren, N. Ayache, B. Fischl, andP. Golland. Spherical demons: Fast diffeomorphic landmark-free sur-face registration. 
IEEE Transactions on Medical Imaging, 29(3):650–668, 2010.[162] Oliver Zettinig, Amit Shah, Christoph Hennersperger, Matthias Eiber,Christine Kroll, Hubert Kübler, Tobias Maurer, Fausto Milletarì, JuliaRackerseder, Christian Schulte zu Berge, Enno Storz, Benjamin Frisch,and Nassir Navab. Multimodal image-guided prostate fusion biopsybased on automatic deformable registration. International Journal ofComputer Assisted Radiology and Surgery, pages 1–11, 2015.[163] H. Zhang, A. Sheffer, D. Cohen-Or, Q. Zhou, O. Van Kaick, andA. Tagliasacchi. Deformation-driven shape correspondence. ComputerGraphics Forum, 27(5):1431–1439, 2008.[164] Shaoting Zhang, Yiqiang Zhan, Xinyi Cui, Mingchen Gao, JunzhouHuang, and Dimitris Metaxas. 3D anatomical shape atlas constructionusing mesh quality preserved deformable models. Computer Vision andImage Understanding, 117(9):1061–1071, 2013.[165] Zhengyou Zhang. Iterative point matching for registration of free-form curves and surfaces. International Journal of Computer Vision,13(2):119–152, 1994.[166] Q. Zheng, A. Sharf, A. Tagliasacchi, B. Chen, H. Zhang, A. Sheffer, andD. Cohen-Or. Consensus skeleton for non-rigid space-time registration.Computer Graphics Forum, 29(2):635–644, 2010.[167] Lilla Zöllei, E Grimson, Alexander Norbash, and W Wells. 2d-3d rigidregistration of x-ray fluoroscopy and ct images using mutual informa-tion and sparsely sampled histogram estimators. In Computer Visionand Pattern Recognition, 2001. CVPR 2001. Proceedings of the 2001IEEE Computer Society Conference on, volume 2, pages II–696. IEEE,2001.129Appendix ADerivation of ShapeParametersTo minimize Equation (4.2) w.r.t. shape parameters, conditioned by obser-vations in both modalities, we first ignore terms independent of b:E(b) =∑md12σ2mdM,Nmd∑m,n=1Pmd∥∥∥xmdn − sR(zmdm + Ψmb)− t∥∥∥2+µ2bTΛb, (A.1)where md ∈ {MR,US}, Pmd = P (zm|xmdn ) and zmdm = zm + vmdm . 
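As a sanity check, the objective in (A.1) can be evaluated directly for a single modality (dropping the sum over md). The sketch below is a toy illustration, not the thesis implementation: sizes, variable names, and the random data are all hypothetical, and the similarity transform is left at the identity.

```python
import numpy as np

# Toy-sized evaluation of the single-modality energy in (A.1).
rng = np.random.default_rng(0)
M, N, L = 5, 8, 3                     # SSM points, observed points, shape modes
x = rng.normal(size=(N, 3))           # observed surface points x_n
z = rng.normal(size=(M, 3))           # deformed mean points z_m^{md}
Psi = rng.normal(size=(M, 3, L))      # per-point shape basis blocks Psi_m
Lam = np.diag(rng.uniform(1, 2, L))   # mode covariance Lambda
P = rng.uniform(size=(M, N))          # correspondence probabilities P(z_m | x_n)
s, R, t = 1.0, np.eye(3), np.zeros(3) # similarity transform (identity here)
sigma2, mu = 0.5, 0.1

def energy(b):
    y = z + Psi @ b                                       # SSM instance z_m + Psi_m b
    r = x[None, :, :] - (s * (y @ R.T) + t)[:, None, :]   # residuals x_n - sR(.) - t
    data = (P * (r ** 2).sum(axis=-1)).sum() / (2 * sigma2)
    return data + 0.5 * mu * b @ Lam @ b                  # data term + shape prior

print(energy(np.zeros(L)))
```

With b = 0 and the identity transform, the energy reduces to the correspondence-weighted point-distance term, which makes the expression easy to unit-test.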
We can expand Equation (A.1) and re-write it in rasterized form:

\[
\begin{aligned}
E(b) = \sum_{md} \frac{1}{2\sigma_{md}^{2}} \Big[ & N_{p}^{md}\, t_{md}^{\mathrm{T}} t_{md} + \vec{x}_{md}^{\mathrm{T}} \operatorname{diag}(\tilde{P}_{md}^{\mathrm{T}} \mathbf{1})\, \vec{x}_{md} - 2\, t_{md}^{\mathrm{T}} \hat{I}^{\mathrm{T}} \operatorname{diag}(\tilde{P}_{md}^{\mathrm{T}} \mathbf{1})\, \vec{x}_{md} \\
& + s_{md}^{2}\, (\vec{z}_{md} + \Psi b)^{\mathrm{T}} \operatorname{diag}(\tilde{P}_{md} \mathbf{1})\, (\vec{z}_{md} + \Psi b) \\
& + 2 s_{md}\, t_{md}^{\mathrm{T}} R_{md} \tilde{I}^{\mathrm{T}} \operatorname{diag}(\tilde{P}_{md} \mathbf{1})\, (\vec{z}_{md} + \Psi b) \\
& - 2 s_{md}\, \vec{x}_{md}^{\mathrm{T}} \tilde{P}_{md}^{\mathrm{T}} \tilde{R}_{md}\, (\vec{z}_{md} + \Psi b) \Big] + \frac{\mu}{2}\, b^{\mathrm{T}} \Lambda b,
\end{aligned} \tag{A.2}
\]

where \(\hat{I}^{\mathrm{T}} = \left[\, I_{3\times 3} \cdots I_{3\times 3} \,\right]_{3 \times 3N}\). The matrix \(\tilde{P}_{md} = \operatorname{kron}(P_{md}, I_{3\times 3})\) is the Kronecker product of the correspondence probability matrix with an identity matrix, so it can be applied to the concatenation of all (x, y, z) coordinates of the input points. Differentiation w.r.t. b results in:

\[
\begin{aligned}
\frac{\partial E(b)}{\partial b} = \mu \Lambda b + \sum_{md} \frac{1}{2\sigma_{md}^{2}} \Big[ & 2 s_{md}^{2}\, \Psi^{\mathrm{T}} \operatorname{diag}(\tilde{P}_{md} \mathbf{1})\, \Psi b + 2 s_{md}^{2}\, \Psi^{\mathrm{T}} \operatorname{diag}(\tilde{P}_{md} \mathbf{1})\, \vec{z}_{md} \\
& + 2 s_{md}\, \Psi^{\mathrm{T}} \operatorname{diag}(\tilde{P}_{md} \mathbf{1})\, \tilde{I} R_{md}^{\mathrm{T}} t_{md} - 2 s_{md}\, \Psi^{\mathrm{T}} \tilde{R}_{md}^{\mathrm{T}} \tilde{P}_{md}\, \vec{x}_{md} \Big].
\end{aligned} \tag{A.3}
\]

Setting Equation (A.3) to zero yields the set of linear equations described in Equation (4.12).

Appendix B
Derivation of FE Nodal Displacements

Minimization of Equation (4.2) w.r.t. the nodal displacements in each modality is equivalent to registering the SSM instance to each observation independently. To avoid clutter, we drop the modality subscript, md, throughout this section.

Limiting ourselves to a single modality and ignoring terms independent of U, Equation (4.2) can be written as:

\[
E(U) = \frac{1}{2\sigma^{2}} \sum_{m,n=1}^{M,N} P(z_m \mid x_n) \left\| x_n - sR\!\left(y_m + \Phi_m U\right) - t \right\|^{2} + \frac{\beta}{2}\, \vec{u}^{\mathrm{T}} K \vec{u}, \tag{B.1}
\]

where \(\vec{u} = (u_{11}, \ldots, u_{13}, \ldots, u_{J3})^{\mathrm{T}}\) is the rasterized representation of U. Written in matrix form, this is equivalent to

\[
\begin{aligned}
E(U) = \frac{1}{2\sigma^{2}} \Big[ & N_{P}\, t^{\mathrm{T}} t + \operatorname{tr}\!\left(X^{\mathrm{T}} \operatorname{diag}(P^{\mathrm{T}} \mathbf{1})\, X\right) - 2\, t^{\mathrm{T}} X^{\mathrm{T}} P^{\mathrm{T}} \mathbf{1} \\
& + s^{2} \operatorname{tr}\!\left((Y + \Phi U)^{\mathrm{T}} \operatorname{diag}(P \mathbf{1})\, (Y + \Phi U)\right) \\
& + 2 s\, t^{\mathrm{T}} R\, (Y + \Phi U)^{\mathrm{T}} P \mathbf{1} - 2 s \operatorname{tr}\!\left(X^{\mathrm{T}} P^{\mathrm{T}} (Y + \Phi U) R^{\mathrm{T}}\right) \Big] + \frac{\beta}{2}\, \vec{u}^{\mathrm{T}} K \vec{u},
\end{aligned} \tag{B.2}
\]

where \(\operatorname{tr}(\cdot)\) denotes the trace of a square matrix. To simplify the derivation, we can rewrite Equation (B.2) in rasterized form:

\[
\begin{aligned}
E(\vec{u}) = \frac{1}{2\sigma^{2}} \Big[ & N_{P}\, t^{\mathrm{T}} t + \vec{x}^{\mathrm{T}} \operatorname{diag}(\tilde{P}^{\mathrm{T}} \mathbf{1})\, \vec{x} - 2\, t^{\mathrm{T}} \hat{I}^{\mathrm{T}} \operatorname{diag}(\tilde{P}^{\mathrm{T}} \mathbf{1})\, \vec{x} \\
& + s^{2}\, (\vec{y} + \tilde{\Phi} \vec{u})^{\mathrm{T}} \operatorname{diag}(\tilde{P} \mathbf{1})\, (\vec{y} + \tilde{\Phi} \vec{u}) \\
& + 2 s\, t^{\mathrm{T}} R \tilde{I}^{\mathrm{T}} \operatorname{diag}(\tilde{P} \mathbf{1})\, (\vec{y} + \tilde{\Phi} \vec{u}) - 2 s\, \vec{x}^{\mathrm{T}} \tilde{P}^{\mathrm{T}} \tilde{R}\, (\vec{y} + \tilde{\Phi} \vec{u}) \Big] + \frac{\beta}{2}\, \vec{u}^{\mathrm{T}} K \vec{u},
\end{aligned} \tag{B.3}
\]

where \(\hat{I}^{\mathrm{T}} = \left[\, I_{3\times 3} \cdots I_{3\times 3} \,\right]_{3 \times 3N}\).
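The rasterized operators are straightforward to build numerically. The following sketch (toy sizes, illustrative names) constructs \(\tilde{P} = \operatorname{kron}(P, I_{3\times3})\) and the stacked identity \(\hat{I}\) with numpy, and checks that applying them to the concatenated coordinates reproduces the expected block-wise weighted sums:

```python
import numpy as np

# Illustrative construction of the rasterized operators; sizes are toy values.
rng = np.random.default_rng(1)
M, N = 4, 6
P = rng.uniform(size=(M, N))           # correspondence probability matrix
X = rng.normal(size=(N, 3))            # input points, one per row

P_tilde = np.kron(P, np.eye(3))        # P~ = kron(P, I3), shape (3M, 3N)
I_hat = np.tile(np.eye(3), (N, 1))     # I^ with I^T = [I3 ... I3], shape (3N, 3)
x_vec = X.reshape(-1)                  # ~x: concatenated (x, y, z) coordinates

# P~ acting on ~x reproduces the row-weighted point sums P X, block by block
assert np.allclose(P_tilde @ x_vec, (P @ X).reshape(-1))

# diag(P~T 1) weights each coordinate by its column sum; I^T then collapses
# the weighted stack to a single 3-vector: sum_n (sum_m P_mn) x_n
w = P_tilde.T @ np.ones(3 * M)
assert np.allclose(I_hat.T @ (w * x_vec), X.T @ P.sum(axis=0))
print("rasterized operators behave as expected")
```

This mirrors how the scalar correspondence weights act identically on all three coordinates of each point once expanded by the Kronecker product.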
Excluding terms independent of \(\vec{u}\), this reduces to

\[
E(\vec{u}) = \frac{s^{2}}{2\sigma^{2}}\, \vec{u}^{\mathrm{T}} \tilde{\Phi}^{\mathrm{T}} \operatorname{diag}(\tilde{P} \mathbf{1})\, \tilde{\Phi} \vec{u} + \frac{\beta}{2}\, \vec{u}^{\mathrm{T}} K \vec{u}
+ \frac{1}{\sigma^{2}} \left[ \left(s^{2} \vec{y}^{\mathrm{T}} + s\, t^{\mathrm{T}} R \tilde{I}^{\mathrm{T}}\right) \operatorname{diag}(\tilde{P} \mathbf{1}) - s\, \vec{x}^{\mathrm{T}} \tilde{P}^{\mathrm{T}} \tilde{R} \right] \tilde{\Phi} \vec{u}. \tag{B.4}
\]

Differentiating with respect to \(\vec{u}\) results in:

\[
\frac{\partial E(\vec{u})}{\partial \vec{u}} = \left[ \frac{s^{2}}{\sigma^{2}}\, \tilde{\Phi}^{\mathrm{T}} \operatorname{diag}(\tilde{P} \mathbf{1})\, \tilde{\Phi} + \beta K \right] \vec{u}
+ \frac{1}{\sigma^{2}} \left[ \tilde{\Phi}^{\mathrm{T}} \operatorname{diag}(\tilde{P} \mathbf{1}) \left(s^{2} \vec{y} + s \tilde{I} R^{\mathrm{T}} t\right) - s\, \tilde{\Phi}^{\mathrm{T}} \tilde{R}^{\mathrm{T}} \tilde{P} \vec{x} \right]. \tag{B.5}
\]

Setting \(\partial E(\vec{u}) / \partial \vec{u} = 0\) results in the system of equations described in Equation (4.15).

Appendix C
Derivative of the Interpolation Matrix

Throughout this appendix, we use the following notation:

- Let \(\vec{y}\) be a spatial coordinate independent of the FEM.
- Let \(\vec{x}\) be an Eulerian coordinate of a point in the FEM.
- Let \(\vec{X}\) be the Lagrangian spatial coordinate of \(\vec{x}\) within the FEM. \(\vec{X}\) moves with the FEM, so its representation remains unchanged.
- Let \(\vec{v}_i\) be the Eulerian position of the i-th FEM node. \(v_{ij}\) denotes the j-th coordinate of \(\vec{v}_i\).
- Let \(\vec{u}_i\) be the Eulerian displacement of the i-th FEM node. \(u_{ij}\) denotes the j-th coordinate of \(\vec{u}_i\).

For a particular point within the FEM (e.g., the centre of an element), its Lagrangian coordinate remains fixed as the FEM moves, but its Eulerian coordinate changes. If the FEM has not moved, the Eulerian and Lagrangian coordinates coincide (i.e., \(\vec{X} = \vec{x}(0)\)). This leads to the expressions:

\[
\vec{x}(t) = \sum_i \phi_i(\vec{x}(0))\, \vec{v}_i(t) = \sum_i \phi_i(\vec{X})\, \vec{v}_i(t), \tag{C.1}
\]
\[
\vec{x}(t) = \vec{x}(0) + \sum_i \phi_i(\vec{X})\, \vec{u}_i(t), \tag{C.2}
\]

which say nothing more than that locations inside the material are interpolated based on the positions (C.1) or displacements (C.2) of the FEM nodes. The interpolation functions \(\phi_i\) are called shape functions in the FEM literature, and are defined on the material (Lagrangian) coordinates \(\vec{X}\). To actually express the shape functions, a set of "natural coordinates" \((\xi, \eta, \mu)\) within elements is often introduced. The material coordinates are then interpolated within the element as

\[
\vec{X}(\xi, \eta, \mu) = \sum_i \phi_i(\xi, \eta, \mu)\, \vec{X}_i = \sum_i \phi_i(\xi, \eta, \mu)\, \vec{v}_i(0), \tag{C.3}
\]

where \(\vec{X}_i\) is the Lagrangian coordinate of the i-th FEM node. This holds everywhere within the material, particularly at time t = 0.
From Equation (C.1), this means that:

\[
\vec{x}(0) = \sum_i \phi_i(\vec{X})\, \vec{v}_i(0)
\;\Longrightarrow\; \vec{X} = \sum_i \phi_i(\vec{X})\, \vec{v}_i(0)
\;\Longrightarrow\; \phi_i(\vec{X}) = \phi_i(\xi, \eta, \mu).
\]

This is just a complicated way of defining a separate set of "natural coordinates" within elements and expressing the shape functions in terms of these coordinates. For tetrahedral elements, the natural coordinates are barycentric coordinates. For hexahedral elements, they are derived by warping a 2 × 2 × 2 cube to the hexahedron.

Normally in the FEM framework, we are interested only in how the original material moves in space. The shape functions remain fixed for a given point in Lagrangian (i.e., material) coordinates. In Chapter 5, we ask a different question. We have a point in spatial (Eulerian) coordinates from the slice, \(\vec{y}\), which happens to fall within the FEM volume. At a given point in time, \(\vec{y}\) corresponds to a particular point in the material, \(\vec{y} = \vec{x}_1(t)\), which in turn corresponds to a fixed material (Lagrangian) coordinate in the FEM, \(\vec{X}_1\). If the FEM moves, the same spatial coordinate will correspond to a different point in the material, \(\vec{y} = \vec{x}_2(t)\). The new Lagrangian coordinate, \(\vec{X}_2\), will have different interpolation coefficients. How do the shape functions change at a given spatial coordinate as the FEM nodes move?

At a given time t, we know all FEM node positions \(\{\vec{v}_i\}\). We also know we are interpolating within the elements based on natural coordinates. From Equation (C.1), and given \(\vec{y} = \vec{x}(t)\), we have

\[
\vec{y} = \sum_i \phi_i(\xi, \eta, \mu)\, \vec{v}_i(t). \tag{C.4}
\]

Based on knowledge of the node positions and the functional form of \(\phi_i\) in terms of the natural coordinates, we can invert this expression to determine the natural coordinates \((\xi, \eta, \mu)\). For tetrahedral elements, this is a linear inversion problem. For other element types, it typically requires a few numerical iterations to solve.
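For a linear tetrahedron, inverting (C.4) amounts to a single 4 × 4 linear solve for the barycentric coordinates, using the partition-of-unity constraint as the fourth equation. A minimal numpy sketch, with the unit tetrahedron chosen purely for illustration:

```python
import numpy as np

# Barycentric (natural) coordinates for a linear tetrahedron: inverting (C.4).
V = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])        # rows: node positions v_i(t)

def natural_coords(y, V):
    """Solve y = sum_i phi_i v_i together with sum_i phi_i = 1 (linear for tets)."""
    A = np.vstack([V.T, np.ones(4)])   # stack coordinate rows and the unity constraint
    return np.linalg.solve(A, np.append(y, 1.0))

phi = natural_coords(np.array([0.25, 0.25, 0.25]), V)
assert np.allclose(phi @ V, [0.25, 0.25, 0.25])   # interpolation reproduces y
assert np.isclose(phi.sum(), 1.0)
print(phi)
```

For non-tetrahedral elements the same residual \(\vec{y} - \sum_i \phi_i \vec{v}_i\) would instead be driven to zero with a few Newton iterations.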
To compute the desired derivative terms, we start with Equation (C.4) and differentiate with respect to the l-th coordinate of \(\vec{u}_k\):

\[
\begin{aligned}
\frac{\partial y_j}{\partial u_{kl}} &= \sum_i \frac{\partial}{\partial u_{kl}} \left[ \phi_i(\xi,\eta,\mu)\, v_{ij}(t) \right], \qquad j \in \{1,2,3\},\; k \in \{1,\ldots,N\},\; l \in \{1,2,3\} \\
&= \sum_i \phi_i(\xi,\eta,\mu)\, \frac{\partial v_{ij}(t)}{\partial u_{kl}} + \frac{\partial \phi_i(\xi,\eta,\mu)}{\partial u_{kl}}\, v_{ij}(t) \\
&= \sum_i \phi_i(\xi,\eta,\mu)\, \frac{\partial \left[v_{ij}(0) + u_{ij}\right]}{\partial u_{kl}} + \frac{\partial (\xi,\eta,\mu)}{\partial u_{kl}}\, \frac{\partial \phi_i(\xi,\eta,\mu)}{\partial (\xi,\eta,\mu)}\, v_{ij}(t) \\
&= \sum_i \phi_i(\xi,\eta,\mu)\, \delta_{ki}\delta_{lj} + \frac{\partial (\xi,\eta,\mu)}{\partial u_{kl}}\, \frac{\partial \phi_i(\xi,\eta,\mu)}{\partial (\xi,\eta,\mu)}\, v_{ij}(t) \\
&= \delta_{lj}\, \phi_k(\xi,\eta,\mu) + \sum_i \frac{\partial (\xi,\eta,\mu)}{\partial u_{kl}}\, \frac{\partial \phi_i(\xi,\eta,\mu)}{\partial (\xi,\eta,\mu)}\, v_{ij}(t),
\end{aligned} \tag{C.5}
\]

where \(\delta_{lj}\) denotes the Kronecker delta. The location \(\vec{y}\) is fixed in space, so \(\partial y_j / \partial u_{kl} = 0\). We can express the above system of equations in matrix form for a fixed \(k \in \{1, \ldots, N\}\):

\[
0 = \begin{bmatrix} \phi_k & 0 & 0 \\ 0 & \phi_k & 0 \\ 0 & 0 & \phi_k \end{bmatrix}
+ \underbrace{\begin{bmatrix}
\frac{\partial \xi}{\partial u_{k1}} & \frac{\partial \eta}{\partial u_{k1}} & \frac{\partial \mu}{\partial u_{k1}} \\
\frac{\partial \xi}{\partial u_{k2}} & \frac{\partial \eta}{\partial u_{k2}} & \frac{\partial \mu}{\partial u_{k2}} \\
\frac{\partial \xi}{\partial u_{k3}} & \frac{\partial \eta}{\partial u_{k3}} & \frac{\partial \mu}{\partial u_{k3}}
\end{bmatrix}}_{\text{unknowns } \partial(\xi,\eta,\mu)/\partial \vec{u}_k}
\begin{bmatrix}
\frac{\partial \phi_1}{\partial \xi} & \cdots & \frac{\partial \phi_N}{\partial \xi} \\
\frac{\partial \phi_1}{\partial \eta} & \cdots & \frac{\partial \phi_N}{\partial \eta} \\
\frac{\partial \phi_1}{\partial \mu} & \cdots & \frac{\partial \phi_N}{\partial \mu}
\end{bmatrix}
\begin{bmatrix}
v_{11} & v_{12} & v_{13} \\
\vdots & \vdots & \vdots \\
v_{N1} & v_{N2} & v_{N3}
\end{bmatrix}.
\]

For every k, this is a linear system of nine equations with nine unknowns, which has a unique solution as long as the element is not degenerate (inverted). In that case, we have:

\[
\frac{\partial (\xi,\eta,\mu)}{\partial \vec{u}_k} =
- \begin{bmatrix} \phi_k & 0 & 0 \\ 0 & \phi_k & 0 \\ 0 & 0 & \phi_k \end{bmatrix}
\left(
\begin{bmatrix}
\frac{\partial \phi_1}{\partial \xi} & \cdots & \frac{\partial \phi_N}{\partial \xi} \\
\frac{\partial \phi_1}{\partial \eta} & \cdots & \frac{\partial \phi_N}{\partial \eta} \\
\frac{\partial \phi_1}{\partial \mu} & \cdots & \frac{\partial \phi_N}{\partial \mu}
\end{bmatrix}
\begin{bmatrix}
v_{11} & v_{12} & v_{13} \\
\vdots & \vdots & \vdots \\
v_{N1} & v_{N2} & v_{N3}
\end{bmatrix}
\right)^{-1}. \tag{C.6}
\]

To recover \(\partial \phi_i / \partial \vec{u}_k\), we simply apply the chain rule, which results in Equation (5.4):

\[
\frac{\partial \phi_i(\xi,\eta,\mu)}{\partial \vec{u}_k}
= \frac{\partial (\xi,\eta,\mu)}{\partial \vec{u}_k}\, \frac{\partial \phi_i}{\partial (\xi,\eta,\mu)}
= - \begin{bmatrix} \phi_k & 0 & 0 \\ 0 & \phi_k & 0 \\ 0 & 0 & \phi_k \end{bmatrix}
\left(
\begin{bmatrix}
\frac{\partial \phi_1}{\partial \xi} & \cdots & \frac{\partial \phi_N}{\partial \xi} \\
\frac{\partial \phi_1}{\partial \eta} & \cdots & \frac{\partial \phi_N}{\partial \eta} \\
\frac{\partial \phi_1}{\partial \mu} & \cdots & \frac{\partial \phi_N}{\partial \mu}
\end{bmatrix}
\begin{bmatrix}
v_{11} & v_{12} & v_{13} \\
\vdots & \vdots & \vdots \\
v_{N1} & v_{N2} & v_{N3}
\end{bmatrix}
\right)^{-1}
\begin{bmatrix}
\frac{\partial \phi_i}{\partial \xi} \\
\frac{\partial \phi_i}{\partial \eta} \\
\frac{\partial \phi_i}{\partial \mu}
\end{bmatrix}. \tag{C.7}
\]
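The closed form in (C.6)–(C.7) can be checked numerically against finite differences. The sketch below (numpy; geometry and the fixed point are illustrative) uses a linear tetrahedron with natural coordinates \((\xi, \eta, \mu) = (\phi_2, \phi_3, \phi_4)\), perturbs one node coordinate at a time, and re-solves for the shape functions at the same spatial point:

```python
import numpy as np

# Finite-difference check of (C.7) for a linear tetrahedron.
V0 = np.array([[0.0, 0.0, 0.0],
               [1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])            # node positions v_i
J = np.array([[-1.0, 1.0, 0.0, 0.0],        # row a, column i: d(phi_i)/d(xi, eta, mu)
              [-1.0, 0.0, 1.0, 0.0],        # for (xi, eta, mu) = (phi_2, phi_3, phi_4),
              [-1.0, 0.0, 0.0, 1.0]])       # so phi_1 = 1 - xi - eta - mu
y = np.array([0.2, 0.3, 0.1])               # fixed spatial point

def shape_functions(V):
    """phi at the fixed y, from the linear inversion of (C.4)."""
    A = np.vstack([V.T, np.ones(4)])
    return np.linalg.solve(A, np.append(y, 1.0))

phi = shape_functions(V0)
JV_inv = np.linalg.inv(J @ V0)              # inverse factor from (C.6)

def dphi_du(i, k):
    """Analytic d(phi_i)/d(u_k) from (C.7): -phi_k (J V)^{-1} d(phi_i)/d(xi,eta,mu)."""
    return -phi[k] * (JV_inv @ J[:, i])

eps = 1e-6
for i in range(4):
    for k in range(4):
        fd = np.empty(3)
        for l in range(3):
            Vp = V0.copy()
            Vp[k, l] += eps                 # move node k along axis l
            fd[l] = (shape_functions(Vp)[i] - phi[i]) / eps
        assert np.allclose(dphi_du(i, k), fd, atol=1e-4)
print("analytic derivative (C.7) matches finite differences")
```

The minus sign reflects that moving a node in one direction shifts the element under the fixed point \(\vec{y}\), changing its natural coordinates in the opposite sense.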
Item Metadata
Title | Image-based guidance for prostate interventions |
Creator | Khallaghi, Siavash |
Publisher | University of British Columbia |
Date Issued | 2015 |
Genre | Thesis/Dissertation |
Type | Text |
Language | eng |
Date Available | 2015-10-24 |
Provider | Vancouver : University of British Columbia Library |
Rights | Attribution-NonCommercial-NoDerivs 2.5 Canada |
DOI | 10.14288/1.0166782 |
URI | http://hdl.handle.net/2429/55055 |
Degree | Doctor of Philosophy - PhD |
Program | Electrical and Computer Engineering |
Affiliation | Faculty of Applied Science, Department of Electrical and Computer Engineering |
Degree Grantor | University of British Columbia |
Graduation Date | 2015-11 |
Campus | UBCV |
Scholarly Level | Graduate |
Rights URI | http://creativecommons.org/licenses/by-nc-nd/2.5/ca/ |
Aggregated Source Repository | DSpace |