UBC Theses and Dissertations
Image-based enhancement and tracking of an epidural needle in ultrasound using time-series analysis Beigi, Parmida 2017


Image-based Enhancement and Tracking of an Epidural Needle in Ultrasound Using Time-series Analysis

by

Parmida Beigi

M.Eng., The University of British Columbia, 2013
M.A.Sc., Simon Fraser University, 2011
B.Sc., Sharif University of Technology, 2009

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY in The Faculty of Graduate and Postdoctoral Studies (Electrical and Computer Engineering)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

August 2017

© Parmida Beigi 2017

Abstract

Accurate placement of the needle at the target site is crucial to the safety, efficiency and success of any ultrasound (US) guided procedure, e.g. epidurals. Real-time single-operator US guidance of epidurals is currently impractical with conventional 2D US and a standard needle. A novel 3D US imaging technology (3DUS+Epiguide), comprising thick-slice rendering and a custom needle guide (Epiguide), has been developed by our group to provide such capability. This system aims to facilitate single-operator, real-time midline epidural needle insertion by visualizing the needle's progression toward the epidural space. The visibility of the needle in US-guided procedures is, however, limited by dispersion of the needle's echoes away from the transducer, making needle visibility an ongoing challenge in US-guided procedures such as epidurals.

The aim of this thesis is to provide a software-based, clinically suitable solution to enhance needle visibility in US. In particular, we are interested in difficult cases, where the needle is invisible (i.e. has minimal visibility) in a US image. To this end, we have developed frameworks to extract signatures of the needle using time-series analysis. We demonstrate that nearly invisible changes in the motion dynamics of the needle can be revealed through spatio-temporal processing of standard US image sequences. The extracted needle features are used to detect, track and localize a hand-held needle in a machine-learning-based framework.
Clinical, animal and phantom studies are designed to evaluate the capability of 3DUS+Epiguide to identify the needle puncture site for a midline insertion in the lumbar spine, and the capability of the proposed detection frameworks to localize a needle within clinical acceptance. Methods are evaluated on data from studies conducted at BC Women's Hospital and the Jack Bell Research Facility, and are compared to the gold standard (GS). Results demonstrate that, compared to a state-of-the-art needle detection method using a finely tuned Hough Transform, which achieves a 14% success rate, our proposed method is 100% successful in detecting the trajectory in close proximity to the GS (success: angular error < 10°). The proposed method is especially suitable for barely visible needles, for which appearance-based enhancement methods fail.

Lay Summary

Epidural anaesthesia is a pain management technique used during labor and delivery, and is also an effective alternative to general anesthesia during cesarean section. Anesthetic is injected into the epidural space using an epidural needle in order to provide pain relief for the lower body. Accurate placement of the needle in the target space is crucial for the success of the procedure. We have developed a guidance framework using ultrasound imaging to facilitate this procedure by visualizing the anatomy and the needle progression toward the epidural space. The main limitation, however, is poor needle visibility in standard ultrasound B-mode images. Our novelty is to analyze the motion dynamics of the tissue surrounding the needle to localize it using a sequence of ultrasound B-mode images.
We demonstrate that nearly invisible changes in the motion dynamics of the needle during insertion can be revealed using time-series analysis of standard B-mode image sequences.

Preface

This thesis is based on several manuscripts resulting from work done by the author of this thesis in collaboration with multiple researchers. The author was responsible for the design, development, implementation and evaluation of the methods, as well as the production of the manuscripts. All co-authors contributed to editing the manuscripts and providing feedback. Publications have been modified to make the thesis coherent. Ethical approval for the clinical human studies and animal studies conducted for this research was provided by the UBC/Women's Health Centre Research Ethics Board (certificate #H10-01974) and UBC Animal Care (certificates #A11-0223 and #A14-0171).

Chapter 2 describes a version of the work published in the following publications:

• P. Beigi, P. Malenfant, A. Rasoulian, R. Rohling, A. Dube, and V. Gunka, "Three-dimensional ultrasound-guided real-time midline epidural needle placement with Epiguide: A prospective feasibility study," Ultrasound in Medicine & Biology, 43(1), 375-379, 2016.

• P. Malenfant, P. Beigi, V. Gunka, A. Rasoulian, R. Rohling and A. Dube, "Accuracy of 3D ultrasound for identification of epidural needle skin insertion point in parturients," Society for Obstetric Anesthesia and Perinatology (SOAP), 308, 2014.

The author, as the engineering lead of the study, contributed by performing the analysis and statistical investigations and by evaluating the results. Dr. Malenfant performed the patient scanning. Dr. Gunka, the medical team leader, provided advice on the procedure and on clinical evaluation approaches. Dr. Rasoulian initially programmed the thick-slice rendering step. Prof. Rohling invented Epiguide. All authors were involved in the development of the clinical study protocol.
All authors contributed to improving the manuscript with their valuable comments and feedback.

Chapter 3 describes a version of the work published in the following publications:

• J. Stone, P. Beigi, R. Rohling, V. A. Lessoway, A. Dube, V. Gunka, "Novel 3D ultrasound system for midline single-operator epidurals: A feasibility study on a porcine model," International Journal of Obstetric Anesthesia, 31, 51-56, 2017.

• J. Stone, P. Beigi, R. Rohling, V. A. Lessoway, A. Dube, V. Gunka, "Novel 3D-ultrasound-guided midline lumbar epidural placement, utilizing Epiguide needle guide in porcine model: a comparison of standard versus Pajunk epidural needles," Society for Obstetric Anesthesia & Perinatology (SOAP), 278, 2015.

The study described in this chapter primarily investigates the challenges of needle visibility using the implemented rendering system described in Chapter 1. The author, as the engineering lead of the study, contributed by performing the analysis, organizing and running the studies, helping with the statistical investigations and evaluating the results. Dr. Stone performed the patient scanning and the analysis for the manuscript. Dr. Gunka provided advice on the clinical evaluation approaches. All authors were involved in the development of the clinical study protocol. All authors contributed to improving the manuscript with their valuable comments and feedback.

Chapter 4 describes a version of the work published in the following publications:

• P. Beigi, R. Rohling, T. Salcudean, V. A. Lessoway, G. C. Ng, "Needle trajectory and tip localization in 3D ultrasound using a moving stylus," Ultrasound in Medicine & Biology, 41(7), 2057-2070, 2015.

• P. Beigi and R. Rohling, "Needle localization using a moving stylus/catheter in ultrasound-guided regional anesthesia: a feasibility study," In Proc. of SPIE Medical Imaging, the International Society for Optics and Photonics, vol. 9036, pp.
90362-90366, 2014.

The author contributed by designing, developing and implementing the method, performing the experiments and evaluating the results. Prof. Salcudean and Prof. Rohling contributed to the improvement of the methodology. Ms. Lessoway, the sonographer, helped with the gold standard identification for method validation. Dr. Ng, our industrial collaborator from Philips, provided suggestions on the methodology. All authors contributed to improving the manuscript with their valuable comments and feedback.

Chapter 5 describes a version of the work published in the following publications:

• P. Beigi, R. Rohling, T. Salcudean, G. C. Ng, "Spectral Analysis of the Tremor Motion for Needle Detection in Curvilinear Ultrasound via Spatio-temporal Linear Sampling," International Journal of Computer Assisted Radiology and Surgery, 11(6), 1183-1192, 2016.

• P. Beigi, R. Rohling, T. Salcudean, V. A. Lessoway, G. C. Ng, "Needle Detection in Ultrasound using the Spectral Properties of the Displacement Field," In Proc. of SPIE Medical Imaging, the International Society for Optics and Photonics, vol. 9415, pp. 94150U-6, 2015.

The author contributed by designing, developing and implementing the method, performing the experiments and evaluating the results. Prof. Salcudean and Prof. Rohling contributed to the improvement of the methodology. Ms. Lessoway, the sonographer, helped with the gold standard identification for the method validation. Dr. Ng provided suggestions on the methodology. All authors contributed to improving the manuscript with their valuable comments and feedback.

Chapter 6 describes a version of the work published in the following publications:

• P. Beigi, R. Rohling, T. Salcudean, G. C. Ng, "Detection of an invisible needle in ultrasound using a probabilistic SVM and time-domain features," Ultrasonics, 78, 18-22, 2016.

• P. Beigi, R. Rohling, T. Salcudean, G. C.
Ng, "Automatic detection of a hand-held needle in ultrasound via phase-based analysis of the tremor motion," In Proc. of SPIE Medical Imaging, the International Society for Optics and Photonics, vol. 9786, pp. 97860I-1, 2016.

The author contributed by designing, developing and implementing the method, performing the experiments and evaluating the results. Prof. Salcudean and Prof. Rohling contributed to the improvement of the methodology. Dr. Ng provided suggestions on the methodology. All authors contributed to improving the manuscript with their valuable comments and feedback.

Chapter 7 describes a version of the work published in the following publication:

• P. Beigi, R. Rohling, T. Salcudean and G. C. Ng, "CASPER: Computer-aided segmentation of imperceptible motion – A learning-based tracking of an invisible needle in ultrasound," International Journal of Computer Assisted Radiology and Surgery, pp. 1-10, 2017.

The author contributed by designing, developing and implementing the method, performing the experiments and evaluating the results. Prof. Salcudean and Prof. Rohling contributed to the improvement of the methodology. Dr. Ng provided suggestions on the methodology. All authors contributed to improving the manuscript with their valuable comments and feedback.

Table of Contents

Abstract
Lay Summary
Preface
Table of Contents
List of Tables
List of Figures
Glossary
Acknowledgements

1 Introduction
1.1 Needle insertion in spinal interventions: epidural anesthesia
1.2 Complications of the manual palpation
1.3 Background
1.3.1 Ultrasound for epidurals
1.3.2 Needle visibility enhancement
1.3.3 SURE: the proposed solution for epidural guidance
1.4 Proposed solution
1.4.1 Objectives
1.4.2 Contributions
1.5 Materials
1.6 Thesis outline

2 Single-Operator Midline Epidurals Using Epiguide: Puncture Site Selection
2.1 Introduction
2.2 Materials and methods
2.2.1 Study population and design
2.2.2 3DUS+Epiguide development
2.3 Experimental results
2.4 Discussion and conclusion

3 Needle Visibility in Ultrasound Using Epiguide: Challenges
3.1 Introduction
3.2 Materials and methods
3.3 Statistical analysis
3.4 Experimental results
3.5 Discussion and conclusion

4 Motion Analysis of a Standard Needle: Moving Stylus
4.1 Introduction
4.2 Materials and methods
4.2.1 Method overview
4.2.2 Stylus motion
4.2.3 Absolute volume difference
4.2.4 Maximum intensity projection
4.2.5 Histogram analysis
4.2.6 Reverberation suppression
4.2.7 Discontinuity connection
4.2.8 Convex hull calculation and linear fit
4.2.9 Experimental setup
4.3 Experimental results
4.4 Discussion and conclusion

5 Optical Flow Analysis of a Hand-held Needle: Fixed Imaging
5.1 Introduction
5.2 Materials and methods
5.2.1 Method overview
5.2.2 Approximate puncture site identification
5.2.3 ROI identification
5.2.4 Optical flow calculation
5.2.5 Regularized least squares
5.2.6 Spectral coherence
5.2.7 Trajectory detection
5.2.8 Experimental analysis and setup
5.3 Experimental results
5.3.1 Parameter sensitivity
5.4 Discussion and conclusion

6 Multi-scale Phase-based Analysis of a Hand-held Needle: Fixed Imaging
6.1 Introduction
6.2 Materials and methods
6.2.1 Phase-based motion estimation
6.2.2 Learning-based motion estimation using ARMA features of the phase
6.2.3 Experimental setup
6.3 Experimental results
6.4 Discussion and conclusion

7 Needle Tracking Using Phase-based Optical-flow Analysis and Machine Learning: Free-hand Imaging
7.1 Introduction
7.2 Materials and methods
7.2.1 Method overview
7.2.2 Motion descriptors
7.2.3 Feature extraction
7.2.4 Pixel-based classification
7.2.5 Online evaluation
7.2.6 Experimental analysis and setup
7.3 Experimental results
7.4 Discussion and conclusion

8 Conclusion and Future Work
8.1 Contributions
8.2 Future work

Bibliography

Appendix: Clinical and Animal Study Protocols
1 Epiguide: clinical studies at BC Women's Hospital
1.1 Equipment preparation
1.2 L2-3 and L3-4 identification
1.3 3D US scanning
1.4 Palpation
2 Needle tracking: animal study at Jack Bell Animal Research Facility
2.1 Setup preparation
2.2 Data acquisition

List of Tables

1.1 Epidural anatomy measurements during pregnancy.
3.1 Success at LOR categorized by mean visibility score. Visibility score key: 0 = cannot see, 1 = poor, 2 = satisfactory, 3 = excellent.
3.2 Mean needle visibility score categorized by needle type (echogenic and standard needle). Visibility score key: 0 = cannot see, 1 = poor, 2 = satisfactory, 3 = excellent. % = percentage.
4.1 Localization error for three different tissue samples, for all insertion angles, needle lengths and depth settings. ∆θ, σ∆θ and RMS(∆θ) (degrees) are the mean, SD and RMS of the angular deviation, and ∆P, σ∆P and RMS(∆P) (mm) are the mean, SD and RMS of the needle tip deviation, respectively.
4.2 Localization error for low/high depth settings (50-70 mm / 70-90 mm), averaged over three tissue types.
4.3 Localization error for shallow/steep insertion angles, averaged over three tissue types.
5.1 Parameter selection for experimental analysis.
5.2 Needle trajectory detection accuracy on porcine experiments in vivo (n = 20).
5.3 Sensitivity analysis for parameter selection on n = 5 randomly selected data sets.
5.4 Needle trajectory detection results averaged over the same porcine data in vivo (n = 20), varying the overall imaging gain (% is the percentage of the intensity saturated to 0 (–) and 255 (+)).
6.1 Localization accuracy for medium-steep needle insertions in an agar phantom, averaged over 20 data sets.
6.2 Needle localization accuracy comparison of the proposed method and the existing state-of-the-art approaches on the in vivo porcine test set.
7.1 Comparison of the needle trajectory localization results on porcine femoris muscle in vivo.
7.2 Comparison of the needle tip localization results on porcine femoris muscle in vivo.
7.3 P-values of the paired two-sided Mann-Whitney-Wilcoxon test on the results of offline-CASPER against CASPER, PB and the OF-based method.

List of Figures

1.1 Needle insertion for epidural anesthesia: the needle passes through the skin, the supraspinous and intraspinous ligaments and the dense LF before entering the epidural space (enhancement of the original drawing by Vicky Earle).
1.2 Needle insertion during the epidural anesthesia procedure: (a) skin infiltration with local anesthetic, (b) insertion of the epidural needle, (c) LOR technique to advance the needle to the epidural space, and (d) epidural catheter advancing through the needle after occurrence of LOR. Image reprinted with permission from Medscape Drugs & Diseases.
1.3 Interspace localization for the lumbar spine using palpation. The L2-3 interspace for needle insertion is localized by palpation of the superior aspect of the iliac crest and the L5 vertebra (enhancement of the original drawing by Vicky Earle).
1.4 Midline and paramedian needle insertions into the epidural space: (a) transverse, and (b) paramedian view of the vertebrae and epidural needle insertion (enhancement of the original drawing by Vicky Earle).
1.5 Catheter insertion with the Tuohy epidural needle in the epidural space.
The curved distal end of the needle facilitates passage of the catheter upwards and mitigates tissue damage while the needle is advanced. Original image reprinted with permission from J. F. Butterworth, D. C. Mackey and J. D. Wasnick, Morgan & Mikhail's Clinical Anesthesiology, 5th edition, © McGraw-Hill Education.
1.6 (a) Sagittal and (b) transverse imaging planes of the spine.
1.7 Illustration of our 3D thick-slice system used to guide epidurals in a porcine tissue model: (a) Epiguide placed on a motorized m4DC7-3/40 microconvex 4D transducer, (b) posterior view of the transducer, with an inset showing a simulated thick-slice rendering of 3DUS data (enhancement of the original drawing by Vicky Earle), and (c) thick-slice rendering of the acquired volume from a porcine study with an inserted echogenic needle.
1.8 Depiction of 3D thick-slice rendering (transverse view) used to guide a midline epidural needle insertion in a spine phantom: (a) coronal and (b) sagittal views of the transducer and the inserted needle in the phantom, and (c) thick-slice rendering showing the needle as well as the spinal anatomy.
1.9 Graphical outline of the thesis.
2.1 Epiguide model sketch: (a) side and (b) top views of the m4DC7-3/40 US transducer equipped with Epiguide. Red arrows point to the channel for needle placement.
2.2 3DUS with Epiguide. (a) Transverse view of the spine and the US field of view. Black parallel lines show the reslice plane and the red line shows the needle trajectory. (b) Operator holding the transducer paramedian for a midline needle insertion. (c) The transducer is shown as a blue box with an inset showing a simulated thick-slice rendering of 3DUS data (between black lines about the midline) depicting both the target epidural space and the needle trajectory.
2.3 Description of the puncture site labeling and measurements: (a) determine 3D1 and 3D2 with thick-slice US imaging, (b) copy points 3D1 and 3D2 on the transparency, (c) identify P1 and P2 by palpation and (d) copy points P1 and P2 on the same transparency.
2.4 (a) Thick-slice rendering showing the LF in the midline. The LF is also aligned with the anticipated needle trajectory, shown as horizontal lines at each centimeter of depth. (b) Conceptual illustration of the skin markings and measurements based on vertebral levels L2-3 and L3-4.
3.1 Epiguide mounted on the US transducer with an epidural needle in situ: (a) probe oriented in the sagittal plane and (b) probe oriented in the transverse plane.
3.2 Paramedian sagittal oblique 3D thick-slice rendered view of porcine lumbar vertebrae with an echogenic needle in situ. The posterior complex consists of the LF, epidural space and posterior dura. The anterior complex comprises the anterior dura, posterior longitudinal ligament and vertebral body. (a) Needle halfway in, (b) needle tip just before the posterior complex.
3.3 Screenshot of a video showing the continuous pressure technique for LOR, with Epiguide and needle held in one hand, and the syringe held in the other hand.
3.4 Bland-Altman plot of manual depth of needle insertion vs. US-measured depth, with 95% limits of agreement.
4.1 MB of the B-mode volume (top row) and absolute difference volume (bottom row) for: (a) bovine muscle, (b) porcine muscle and (c) bovine liver. The difference operation removes most of the anatomical echoes, leaving mainly the needle.
4.2 Illustration of the stylus motion.
4.3 Diagram of the projection-based 3D needle localization approach.
4.4 Detailed description of the projection-based 3D needle localization approach.
4.5 Absolute difference illustration at 4 stylus locations. Consecutive B-mode frames are on the left and the corresponding absolute frame differences are on the right. The reflection from the approximate stylus tip is indicated with an arrow.
4.6 Example of a short-axis (SA) needle insertion, versus long-axis (LA).
4.7 MIP of orthogonal projections.
4.8 Sketch of the typical histogram of the MIP of the difference volume.
4.9 Description of the discontinuity connection step. The red cross shows the needle tip detected before the correction step and the black dot shows the final tip detected by the correction step.
4.10 An example of a projection difference image in the presence of random stylus reflections and reverberation artifacts.
4.11 Experimental setup: needle inserted at the short axis in liver tissue ex vivo.
4.12 Gold standard: 20G echogenic needle within the 17G combined spinal & epidural cannula.
4.13 Diagram illustrating the performance metrics when the needle is inserted through the (a) SA and (b) LA of the transducer.
4.14 Tissue motion simulator setup.
4.15 MIP results on the stacks of projections along the lateral and elevational directions of the output, for volumes taken from (a) bovine liver and (b) bovine muscle. The arrow indicates the correct needle tip, which is also detected by the algorithm.
4.16 MIP results on the stacks of projections along the (a) lateral and (b) elevational directions of the output, for three pairs of volumes in the two-operator scenario.
4.17 MIP results on the stacks of projections along the (a) lateral and (b) elevational directions of the output, for three pairs of volumes with the use of a needle guide.
4.18 Localization results for the (a) two-operator scenario and (b) needle guide. The solid line represents the result of the algorithm and the dotted line is the GS.
5.1 Block diagram of the proposed approach. (I) The insertion site is estimated automatically on the reference frame. (II) The displacement map is computed within the ROI in a block-wise manner using an OF approach. (III) The trajectory channel is estimated according to the block segmentation results of the spectral coherency between the displacement map of each MB and that of the target MB. (IV) The needle trajectory is detected from the spectral coherency between spatio-temporal linear sample paths and the trace of the estimated puncture site within the trajectory channel.
5.2 Block-based analysis of OF.
5.3 Illustration of the significantly dropping spectral coherence (within the tremor frequency range of 2-5 Hz) as the block of interest moves away from the needle: (a) A frame of the B-mode US image, with the sample blocks and the detected needle as overlays. (b) Spectral coherence plots for corresponding blocks at different distances from the needle trajectory: ∆di shows the lateral distance of the block of interest i from the nearest block containing the needle shaft, in terms of the number of blocks (each block is 2 mm × 2 mm in size).
5.4 Illustration of block-based trajectory estimation: (a) Schematic drawing of the imaging setup and a B-mode frame containing the needle.
(b) Flowchart showing the estimation of the trajectory channel. (c) Schematic drawing showing the needle candidates overlaid on the segmented MBs based on the spectral coherence results, in gray. (d) Selected MBs for the trajectory channel and (e) endpoints of the channel boundaries. (f) Obtained line segments forming the trajectory channel and (g) extended channel boundaries spanning the entire imaging depth.
5.5 Sample result on porcine tissue in vivo: (a) Spatio-temporal sample paths overlaid on top of a frame of the B-mode US image. (b) Sampling instants for each sample path as well as the reference path, with respect to position and frame.
5.6 Displacement field of a block at a random frame with respect to a reference frame (top row) and the corresponding coherency map of the lateral displacement (bottom row), for: (a) the abdominal aorta and (b) a hand-held needle. The reference window in (a) contains only tissue, whereas in (b) it contains the needle.
5.7 A sequence of frames displaying the results of the presented method in detecting a hand-held needle in an agar phantom.
6.1 Block diagram of our proposed approach. A reference frame is selected from the input sequence of the B-mode data and all frames are sent as the input to the algorithm (a). Three complex Gabor wavelet pairs at three orientations (70°, 50° and 30°) form the steerable pyramid for spatial decomposition of the sequence (b) into local magnitude and phase measurements (c). Phase differences of all frames from the reference frame are then computed (d) and temporally filtered using an FIR bandpass filter (e). Amplitude weighting is then performed on the filtered phase differences to adjust for cases of weak magnitude responses (f). Results from all scales and orientations are combined (g) and thresholded to generate the binary mask for the HT (h). The HT derives an estimate of the trajectory and discards some of the outliers (i).
Polynomial fitting is finally used to remove any remaining outliers and improve the trajectory detection (j). The detected needle is then added to the input sequence as an overlay (k).
6.2 Illustration of the proposed feature selection pipeline. Phase differences of the local phase measurements, obtained from pyramid decomposition of the input frames (a) using oriented Gabor wavelets, are computed (b). Phase variations over time are computed for the phase values at each spatial location over subsequent frames (c). Features are extracted at various scales and orientations (e) from the sequence of the phase variations (d).
6.3 Schematic diagram of the needle detection and localization. Based on the obtained feature vectors, an SVM classifier is trained on the selected features in the training data set and is validated on the test data set, resulting in the predicted labels l and their probability estimates p. A weighted HT based on the probability estimate is performed on the classified pixels to localize the needle (f).
6.4 Details of the selected features from the statistics of the ARMA model. Characteristics of an ARMA model, obtained from the (P)ACF, are selected as features.
6.5 Needle detection result for an in vivo sequence of a hand-held needle in porcine femoris muscle.
6.6 (a) A B-mode frame of the captured sequence, (b) the probability map of the masked pixels in the frame, and average probability scores of regions surrounding the needle tip (box) and the pulsating vessels (circles), and (c) the detection result and the GS overlaid on the B-mode frame, as dashed and solid lines, respectively.
99
7.1 Block diagram of the proposed approach: Complex steerable pyramids are used for spatial decomposition and motion descriptors are computed at each spatial scale. Spatio-temporal and spectral features are extracted within the STCs around each pixel and forwarded to an incremental SVM for classification. Classification results are spatially analyzed for continuity of the positive needle pixels, and false-positive non-needle pixels are fed into the adaptive training for online learning. The detection result is shown as an overlay on the current frame, and the procedure is repeated in the subsequent frames. . . . 103
7.2 Multi-scale spatial decomposition with oriented filters. (a) Half-octave bandwidth filters in the insertion angle range. (b) Multi-scale PB decomposition of each frame (i) according to the reference frame (r). . . . 105
7.3 (a) Captured frames containing the needle; the arrow is perpendicular to the invisible needle shaft. (b) Several STCs constructed for the captured frames. (c) Frame selection for spatio-temporal analysis at the current frame, and the STCs for each candidate pixel (two candidate pixels in this example). (d) OF is computed for each pair of the previous consecutive frames. . . . 106
7.4 Differential flow scheme shown for a 3 × 3 grid of cells as an example. The center cell is the spatial cross section of the image with the corresponding STC of size 5 × 5 × Lt as an example. Arrows represent the diagonal direction of the relative flow. . . . 108
7.5 Summary of the spatial distribution analysis and online update. I is the input image, fr is the frame number, pi is pixel i and fs1/fs2 are the feature vectors at both scales. Itot is the classifier's output binary image that gets updated at each iteration. pfp is the false-positive pixel set, i.e., a non-needle pixel misclassified as needle.
An outlier is defined based on the histogram analysis described in Section 7.2.5. . . . 113
7.6 Summary of the steps in the classification pipeline. The initial sequence of frames is shown on the left side, where a white arrow points to the (invisible) needle trajectory. OF is computed and the superposition of their magnitudes is shown for the selected STCs. Finally, the classification result of the incremental training is shown as an overlay on each frame. . . . 116
7.7 Localization results shown for 3 subsequent frames. (a) B-mode US image, with two arrows pointing to the needle tip and the shaft locations. (b) and (c) Zoomed-in views of the classifier's output and the final output of the algorithm, respectively. (d) GS (green line), the HT-estimated trajectory from the classifier's output (dashed line) and the algorithm's output (white solid line) overlaid on two corresponding sample frames. . . . 117
7.8 Localization results shown for 3 frames. CASPER output (white solid line), the OF-based method (red dashed line) and GS (green line) are overlaid on the B-mode image. . . . 119

Glossary

ACF     Autocorrelation Function
AW      Amplitude-weighted
ARMA    Auto-regressive Moving Average
CI      Confidence Interval
CASPER  Computer-Aided Segmentation of imPERceptible motion
FDoG    First Derivative of Gaussian
FIR     Finite Impulse Response
GS      Gold Standard
HT      Hough Transform
LA      Long Axis
LF      Ligamentum Flavum
LOR     Loss-of-Resistance
MB      Macro-block
MIP     Maximum Intensity Projection
ML      Machine Learning
OF      Optical Flow
PACF    Partial Autocorrelation Function
PB      Phase-based
PSD     Power Spectral Density
RANSAC  Random Sample Consensus
RBF     Radial Basis Function
RMS     Root Mean Square
ROI     Region of Interest
SA      Short Axis
SD      Standard Deviation
STC     Spatio-temporal Cell
SURE    Smart Ultrasound Rendering for Epidurals
SVM     Support Vector Machine
US      Ultrasound
2D      Two-dimensional
3D      Three-dimensional
2DUS    2D Ultrasound
3DUS    3D Ultrasound

Acknowledgements

First and foremost, I would like to express my sincere gratitude to the kindest advisor, Prof.
Robert Rohling. I am grateful for his continuous support, motivation and immense knowledge during this chapter of my life. Thank you for being an amazing supervisor, a wonderful teacher and a supportive friend. Thank you for your trust, for believing in me and my abilities, and for giving me freedom to explore the field of my interest.

I am very grateful to my supervisory committee, Prof. Tim Salcudean and Dr. Vit Gunka. Prof. Salcudean has been a phenomenal supervisor, who provided me with insightful comments, wonderful suggestions and encouragements. He has taught me how to think critically, and has motivated me to enhance my research from various perspectives. Dr. Gunka has been a fabulous mentor and an amazing clinical collaborator, whose insightful advice has truly helped me during this endeavour. Thank you for giving me every opportunity to learn about the practical aspects of clinical trials.

My sincere thanks go to Prof. Purang Abolmaesumi, for his constant advice, encouragement and guidance that helped me find the right path.

I would like to thank Drs. Gunka, Kamani and Massey, for giving me real-life clinical exposure and helping me with the studies at BC Women's Hospital. I am also thankful to Vickie Lessoway, our wonderful sonographer.

I would also like to thank Dr. Ng, our industry collaborator at Philips Ultrasound, for his invaluable suggestions and support throughout this research.

My sincere thanks also go to all RCL members at UBC, and all other friends who shared the joy with me, especially Manyou Ma, for her support, friendship and enthusiasm during my time at RCL.

Last but not least, a special gratitude and love go to my amazing family for their unconditional love and encouragement throughout my life. I would like to thank my wonderful parents for their continuous support and guidance, and for giving me an invaluable early insight into the medical field since childhood.
This journey would not have been possible without them, and I dedicate this thesis to them.

Chapter 1

Introduction

1.1 Needle insertion in spinal interventions: epidural anesthesia

Epidural anesthesia is a type of regional anesthesia commonly used for pain management during labor and delivery, and is an effective alternative to general anesthesia during cesarean section [1]. This technique involves the insertion of a needle into a space between the vertebrae called the epidural space (Fig. 1.1). Anesthetic is then injected into this space to provide pain relief for the lower body. A catheter is often threaded through the needle to deliver continuous anesthetic over a longer period of time. The needle is withdrawn once the catheter tip is within the spinal canal.

The epidural space is located anterior to the ligamentum flavum (LF) and posterior to the dura mater, a layer of tissue protecting the spinal cord. The vertebrae are joined together by connective tissue, which provides stability during rest and motion. The LF connects the laminae of adjacent vertebrae and is one of the important ligaments in the path of epidural needle insertion. The distance between the LF and the dura mater ranges from 4-6 mm in the lumbar spine [2]. Patients are placed in a lateral or sitting position depending on their medical status. In the sitting position, the patient arches forward and the needle is inserted approximately perpendicular to the skin (with up to 20° of angular variation).

Epidural needle insertion is normally a "blind" procedure, as the anesthesiologist has no prior knowledge of the patient's spinal anatomy to guide the needle. Selection of the puncture site and the needle orientation is therefore solely based on manual palpation, which is particularly challenging for obese patients. In particular, the anesthesiologist palpates the surface landmarks of the vertebrae to identify an appropriate intervertebral space and to select a puncture site along the midline of the spine.
The insertion depth is verified based on the loss-of-resistance (LOR) technique. Using LOR, pressure is applied on a saline-filled (or air-filled) syringe attached to the epidural needle, and the anesthesiologist feels the resistance to injection as the needle advances. The needle enters the skin and passes through the subcutaneous tissue, the stiff supraspinous and intraspinous ligaments and the dense LF. LOR is felt once the needle leaves the LF and enters the epidural space, which tells the anesthesiologist to stop the insertion (Fig. 1.1).

Figure 1.1: Needle insertion for epidural anesthesia: the needle passes through the skin, the supraspinous and intraspinous ligaments and the dense LF before entering the epidural space (enhancement of the original drawing by Vicky Earle).

Fig. 1.2 shows the steps of an epidural anesthesia procedure. Local anesthetic is injected prior to the insertion to numb the skin surrounding the area selected by palpation (Fig. 1.2a). The epidural needle is inserted between the selected vertebrae (Fig. 1.2b), and is advanced up to the epidural space using the LOR technique (Fig. 1.2c). This is followed by threading a catheter within the needle for delivering continuous anesthesia (Fig. 1.2d).

Figure 1.2: Needle insertion during the epidural anesthesia procedure: (a) skin infiltration with local anesthetic, (b) insertion of the epidural needle, (c) the LOR technique to advance the needle to the epidural space, and (d) the epidural catheter advancing through the needle, after occurrence of LOR. Image reprinted with permission from Medscape Drugs & Diseases.

Lumbar epidurals are commonly introduced at the L2-3 and L3-4 interspinous space, which is usually localized by palpation of the superior aspect of the iliac crest and the L5 vertebra (Fig. 1.3). In lumbar epidurals, the needle may be inserted via a midline or a paramedian approach (Fig. 1.4). The paramedian approach, which could be used for patients with calcified interspinous ligaments, requires a 17G needle passing through the erector spinae muscles and causes pain and discomfort for the patient. Imaging is, however, facilitated in this approach through a window containing the needle trajectory, obtained by a midline transverse ultrasound.

Figure 1.3: Interspace localization for the lumbar spine using palpation. The L2-3 interspace for needle insertion is localized by palpation of the superior aspect of the iliac crest and the L5 vertebra (enhancement of the original drawing by Vicky Earle).

Currently, the paramedian approach is NOT the standard of care in Canada (the midline approach is the standard of care, since the epidural space is at its largest in that approach). The midline approach is more commonly used since the anesthesiologist is not required to have a three-dimensional visualization of the lumbar anatomy, and the identification of the LF, which is the landmark for finding the epidural space, is easier in this approach. The needle also traverses the midline intraspinous ligament, which causes less discomfort to the patient.

Epidural needles are typically Tuohy needles in 16 to 18 gauge with centimeter markings. The Tuohy needle has a slight curve at the end to facilitate the passage of the catheter up the spinal canal and to mitigate tissue damage while the needle advances (Fig. 1.5). Although fluoroscopy could be incorporated into the guidance of lumbar epidural injection, due to the radiation involved, the blind approach is preferred for the majority of parturient patients.
Figure 1.4: Midline and paramedian needle insertions into the epidural space: (a) transverse, and (b) paramedian views of the vertebrae and epidural needle insertion (enhancement of the original drawing by Vicky Earle).

1.2 Complications of the manual palpation

Since the LF, epidural space and dura mater are only a few millimeters apart, the needle insertion is required to be accurate for a successful injection. In obstetric patients, the average perpendicular distance from the skin surface to the LF is reported as 49.5 ± 8.1 mm. The average diameter of the lumbar spine interspace is 10.7 ± 1.7 mm. The average width, height (accessible through the intervertebral space) and depth (between the LF and the dura mater) of the epidural space are reported as 6.8 ± 1.9 mm, 9.1 ± 1.4 mm and 4.9 ± 0.5 mm, respectively [3-6]. Table 1.1 summarizes these measurements. The clinically-acceptable accuracy for needle tip placement is computed as half of the lowest dimension (2.5 mm), with a corresponding angular accuracy of around 3°.

Table 1.1: Epidural anatomy measurements during pregnancy.

Measured values         Mean       Standard deviation
Skin-LF                 49.5 mm    8.1 mm
Intervertebral space    10.7 mm    1.7 mm
Epidural width          6.8 mm     1.9 mm
Epidural height         9.1 mm     1.4 mm
Epidural depth          4.9 mm     0.5 mm

The most common complications (0.5-2.5% [7, 8]) happen due to overshoot and accidental puncture through the dura mater covering the spinal cord. This is followed by leakage of cerebrospinal fluid, which causes side effects such as postdural puncture headache in 88% of cases [7]. Headaches are usually self-limiting; however, they may develop into disabling cranial nerve palsies (e.g. diplopia), and even cranial subdural haematoma in some cases [9, 10]. Less significant complications such as blood aspiration and persistent paresthesias are also reported, at 3.3% and 0.13%, respectively [8]. In extreme cases, complications can lead to respiratory failure or death [7, 11]. In clinical training stages, the incidence rates of failed anesthesia and complications are even higher, reflecting the steep learning curve associated with this procedure [12]. A trainee's success rate is reported to be on average 60% after 20 procedures, which increases to 80% after 90 attempts [13].

Although the occurrence of accidental puncture through the dura mater is reported to be below 0.8% [14, 15], it is much higher in university training programs (3-5%). The steep learning curve and the high failure rate for epidurals have the following consequences: (1) complications lead to patient discomfort, anxiety and lengthy hospital stays, (2) a small number of patients will suffer permanent neurologic deficits, and (3) the risks hinder the adoption of percutaneous neuraxial insertions in every clinic. Complications of the blind procedure for epidural injection, ranging from headache to respiratory failure and death, happen in 20% of cases [11]. Based on data from the Canadian Institute for Health Information, 45% of deliveries on average use epidurals. Given the 374,000 annual births in Canada, a simple calculation predicts 170,000 epidurals being administered annually, which may lead to 5,000 complications per year. There is a clear need for a system to guide epidurals, especially in difficult cases, since most of these complications could be avoided if the physician could visualize the needle trajectory and the target. A guidance method is needed to select the intervertebral level more accurately and to enhance needle visualization as the needle advances to the epidural space.
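The placement tolerance quoted above follows from the anatomy in Table 1.1. As a quick back-of-the-envelope check (our own sketch; the assumption that the angular tolerance is the tip tolerance subtended over the mean skin-to-LF distance is ours, not stated explicitly in the text), the numbers can be reproduced in a few lines:

```python
import math

# Mean epidural anatomy values from Table 1.1 (mm)
epidural_depth = 4.9   # smallest dimension: LF to dura mater
skin_to_lf = 49.5      # mean needle path length from skin to the LF

# Clinically-acceptable tip accuracy: half of the smallest dimension
tip_tolerance = epidural_depth / 2  # ~2.5 mm

# Corresponding angular tolerance, assuming the tip error is the lateral
# offset accumulated over the skin-to-LF distance (our assumption)
angle_tolerance = math.degrees(math.atan(tip_tolerance / skin_to_lf))
```

Half of the smallest epidural dimension (4.9 mm) gives roughly 2.5 mm, and the angle subtending that offset over a 49.5 mm path comes out just under 3°, consistent with the tolerances quoted in the text.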
Figure 1.5: Catheter insertion with the Tuohy epidural needle in the epidural space. The curved distal end of the needle facilitates passage of the catheter upwards and mitigates tissue damage while the needle is advanced. Original image reprinted with permission from: J. F. Butterworth, D. C. Mackey and J. D. Wasnick: Morgan & Mikhail's Clinical Anesthesiology, 5th edition, © McGraw-Hill Education.

1.3 Background

The manual palpation and LOR techniques have remained relatively unchanged since the 1930s. The accuracy of the LOR technique is still controversial due to possible false LOR prior to reaching the epidural space [16], which may lead to accidental intravascular or subarachnoid needle placements. Such issues become more significant in difficult cases, such as obese patients or patients with spine diseases. The optimum intervertebral level is also smaller during pregnancy, and the soft tissue between the spinous processes is narrower [4].

New technologies have been proposed to increase needle placement accuracy and reduce the complication rate. For example, pressure infusion pumps and auditory/visual conversion of the pressure at the needle tip have been introduced to replace LOR [17]. Bioimpedance has also been used as a measure of the tissue's opposition to variable current at the needle tip [18]. None of these techniques has received widespread clinical acceptance to replace the traditional LOR.

Figure 1.6: (a) Sagittal, and (b) transverse imaging planes of the spine.

To replace LOR, we need a guidance system that satisfies the following requirements: (Req. 1) help with better puncture site selection, (Req. 2) enhance the needle visibility, and (Req.
3) visually confirm the needle progressing toward and then entering the epidural space.

1.3.1 Ultrasound for epidurals

Ultrasound (US) is a real-time, portable and non-ionizing imaging modality [19-23]. Unlike X-ray fluoroscopy, US poses no risk to the patient, making it the only imaging modality feasible for obstetric anesthesiology. Although US was originally proposed to guide epidurals as early as 1980 [24], most of the influential work came two decades later, when more advanced US machines became available. Initially, two-dimensional (2D) US (2DUS) was used to depict the spinal anatomy [25, 26]. Paramedian plane imaging is preferred because of its high-quality images [27], although transverse plane imaging has also shown success [28]. Fig. 1.6 compares the paramedian and transverse plane imaging for the spine. In these studies, US was used to measure the epidural space depth according to the anatomical landmarks before needle insertion. Although such "pre-puncture" US has shown promise in reducing the number of puncture attempts and accelerating learning [29], it cannot be used for real-time guidance or for supplementing LOR.

Currently, there are two limitations in the US guidance system. First, real-time guidance of a midline epidural needle insertion is impractical with conventional 2DUS. This is because a transducer sitting in a midline position would obscure the puncture site, while a transducer at the side cannot view both the needle and the target epidural space at the same time. Second, the US scanning, insertion and syringe push (LOR) must be performed simultaneously during an epidural. This procedure could only be performed with two operators [26], which does not fit the busy clinical workflow in Canadian hospitals, where single-operator procedures are more desirable.

An LOR-based real-time guidance system for a single operator, using a needle guide, has been proposed by our group using the paramedian approach [30].
However, as mentioned in Section 1.1, the paramedian approach causes patient discomfort, and the needle trajectory is longer in order to avoid the footprint of the transducer. Our lab has recently proposed a practical US guidance solution for single-operator midline insertion, using a standard needle and a needle guide (more in Section 1.3.3). Although this could help with Req. 1 (puncture site selection) and Req. 3 (during LOR), the visibility of the epidural needle in US still needs to be improved [31, 32].

1.3.2 Needle visibility enhancement

Needle insertion is widely used in many medical diagnostic and therapeutic procedures, including biopsies, treatment injections, radioactive seed implantations and anesthesia. In these applications, accurate needle placement is crucial for the success of the procedure, to avoid damage to neighbouring tissue and to minimize complications. For this purpose, US guidance is commonly used because it is safe, real-time and low cost [19-23]. A reliable guidance system can help the operator to accurately navigate and place the needle, hence increasing the success rate of these procedures (e.g. epidurals) and minimizing complications.

However, despite the wide range and long history of image-guided needle insertions, an unresolved issue is poor needle visibility [33]. Needle visualization is especially challenging for deep insertions and for insertions at steep angles relative to the US beam. Needle identification in US, including in 2D and three-dimensional (3D) US images, can be challenging due to several factors. Firstly, specular reflection off the needle degrades its sonographic visibility and reduces its brightness in the image. Secondly, even if the needle is visible to some degree, for example when it is nearly perpendicular to the US beam in the imaging plane, speckle noise, shadowing and reverberation artifacts are often still present.
Due to these artifacts and signal fallout, most state-of-the-art methods often fail to detect the needle.

A number of attempts have been made to enhance sonographic needle visibility. These can be divided into three main categories: (1) signal- and image-processing-based techniques to augment the needle, (2) modifications to the needle and insertion, and (3) changes to US image formation. A few representative examples are described here.

Signal and image post-processing

Since most needles are made of steel and are thin, they ideally show up as a fine bright line in the US B-mode image. Therefore, signal and image processing methods such as line detection [34-37] and projection-based [38, 39] approaches have been used to augment and localize the needle in the US image. Most of these approaches rely on the visibility of the needle in the unprocessed US image as a long line-like structure. Okazawa et al. [37] proposed a segmentation method to identify straight and curved needles, which uses Sobel edge detection, an enhanced Hough transform (HT) and a coordinate transformation. Validation was performed on a tissue-mimicking phantom, which showed a mean error of 0.5 mm in needle localization. Ding and Fenster [38] used image projections to segment the needle in 3DUS (3D ultrasound) images. The approach was based on the assumption that a needle can be identified by a high-intensity cluster in the volume. They validated the method on a turkey breast phantom, which showed an accuracy of 0.69 mm in needle localization. Ayvali and Desai [34] developed a needle tracking and localization method based on optical flow and a circular HT, taking advantage of the beveled cannula tip for the out-of-plane case. Draper et al. [40] presented a needle detection method based on principal component analysis on clusters formed by the thresholded variance image. A line representing the needle is obtained under the assumption that the needle is the dominant structure with the longest major axis.
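Several of the cited approaches rest on the Hough transform for line fitting. The following is a generic, minimal sketch of HT line detection on a synthetic binary mask (our illustration, not the cited authors' implementations): each foreground pixel votes for every line x·cosθ + y·sinθ = ρ passing through it, and the accumulator peak gives the dominant line.

```python
import numpy as np

def hough_lines(binary, n_theta=180):
    """Vote in (rho, theta) space for every foreground pixel.

    rho = x*cos(theta) + y*sin(theta); rho is offset by the image
    diagonal so accumulator indices stay non-negative.
    """
    h, w = binary.shape
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(h, w)))
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    ys, xs = np.nonzero(binary)
    for j, t in enumerate(thetas):
        rhos = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc[:, j], rhos, 1)  # accumulate votes per rho bin
    return acc, thetas, diag

# Synthetic "needle": a 45-degree line of pixels in a binary mask,
# standing in for a thresholded/edge-detected B-mode image.
mask = np.zeros((64, 64), dtype=bool)
idx = np.arange(10, 50)
mask[idx, idx] = True  # the line y = x

acc, thetas, diag = hough_lines(mask)
rho_i, theta_i = np.unravel_index(acc.argmax(), acc.shape)
angle_deg = np.degrees(thetas[theta_i])  # peak at theta = 135 deg for y = x
rho = rho_i - diag                       # and rho = 0
```

In the cited work the binary mask would come from edge detection or intensity thresholding of the B-mode image, and the accumulator peak would be refined (e.g. by the coordinate transformation of Okazawa et al.) rather than read off directly.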
Their method is also dependent on the choice of the variance kernel size and the threshold, which must be selected empirically. Their method was validated on a tissue-mimicking phantom and showed an accuracy of 4° in trajectory angle, 1 mm in trajectory intercept and 1 mm in tip identification (for insertion depths greater than 15 mm) for most cases. Hacihaliloglu et al. presented a method that first extracts the image local phase using Gabor wavelets, and then uses Random Sample Consensus (RANSAC) to locate a needle based on the visible insertion site and the needle tip [41]. Zhao et al. proposed a method that uses line filtering to enhance contrast, followed by RANSAC and Kalman filtering [36]. Wu et al. used a phase grouping approach based on gradient orientation and magnitude computations, followed by a 3D randomized HT to segment line coordinates in the volume [39]; their detection accuracy was reported to be 84% in vivo.

Although these methods show success, they all rely on the needle being at least partially visible in the image with line-like features. These methods therefore assume that there is a high-intensity line of pixels in the images of the needle. This group of methods is only suitable for cases where at least a part of the needle (including the tip) is already visible in the image.

Modifications to the needle and insertion

Modifying the needle and insertion usually involves additional tools and changes to the apparatus and clinical workflow. Sensors and actuators have been widely adopted to help with needle detection [42-45]. For example, Perrella et al. [42] designed a miniature receive-only sensor at the tip of the needle that displays a flashing needle tip. The disadvantage of this method is safety considerations, since the needle carries electric signals in wires inside the cannula.
Adebar and Okamura [45] used a similar approach, where Power Doppler imaging was used to detect the vibration caused by an actuator attached to the curved needle. They reported an average error of 1.09 mm in needle shaft localization, validated ex vivo. Harmat et al. presented a technique that augments a vibrating needle using Power Doppler by moving the stylus inside the cannula [46]. They used Power Doppler imaging to detect the tissue oscillations caused by stylus vibration inside a stationary cannula. Their method's performance is mostly independent of insertion angle, depth and stylus curvature, but strongly dependent on tissue stiffness. Mechanical needle guides [47] as well as robot-assisted needles [48-50] have also been used. Boctor et al. designed a calibrated robotic system to manipulate the needle insertion and an imaging system for reconstruction [50]. Tip-steerable needles that predict the needle trajectory based on steering commands have also started to be used for better insertion accuracy [49, 50]. Using echogenic coatings or dimples on specialized needles to improve sonographic needle visibility is another solution. The Tuohy Sono needle (Pajunk, Geisingen, Germany), for instance, has machined "Cornerstone" reflectors covering the distal portion of the cannula to improve the tip's ultrasonic visibility. Although such echogenic needles have demonstrated enhanced tip and shaft visibility [51-54], they are typically more expensive and therefore less likely to be adopted widely. Most needles currently in clinical use are still made from smooth stainless steel, and come with steel or plastic styluses (solid removable cores inside the needles).

Each of these methods helped to improve needle detection; however, they all introduced additional cost, complexity, changes to clinical practice, and, in the case of needle guides, restrictions on the freedom of needle trajectory paths relative to the transducer, making them unsuitable for some applications. In addition, the current clinical demand is inclined towards the combination of 2DUS imaging and standard needles, which makes customized apparatus harder to adopt clinically.

Changes to ultrasound image formation

Another way to enhance needle visibility is by changing the imaging tools and modifying the US transmit and receive beamforming sequences. For example, Zhao et al. used 3D imaging to further enhance localization, as the third dimension may give additional information about the needle location by providing the 3D view [55]. The needle visibility challenges associated with poor needle echogenicity due to weak needle echoes, however, still remain. Cheung and Rohling [56] developed a beam steering technique for standard needles that adaptively steers the US beam at an angle perpendicular to the needle, which enhances needle shaft visibility because of the strong reflections created. The brightened needle in the steered image is then fused with the original image to form an enhanced image with a better-visualized needle. Variations and extensions of this idea have been implemented on commercial US machines, including those of BK and Philips. Hatt et al. introduced a machine learning-based approach to segment the needle in beam-steered B-mode images, by first classifying the pixels as needle or background, and then using a Radon transform to localize the needle [57]. However, their method requires the needle orientation to be known a priori.

In the beam-steered image, the needle may not be the most prominent linear structure, and sometimes other anatomy could be mistaken for the needle. In addition, since the strong side lobes generated at large steering angles may create shadows on the image and degrade image quality, steering to larger angles remains a challenge. Therefore it is still up to the operator to choose the correct tip and trajectory. Zhuang et al.
[58] used spatial compounding on tensor-based filtered images to enhance needle visibility for larger insertion angles. They claimed to have solved the false enhanced boundaries issue and eliminated shadows caused by side lobes at steep steering angles; however, no quantitative validation was reported. Limitations on the maximum steering angle in these methods, in turn, cause limitations on steep insertions and the use of curvilinear transducers.

1.3.3 SURE: the proposed solution for epidural guidance

The unique aspect of the technique proposed by our group is the design of a system using 3DUS with a clip-on needle guide (called "Epiguide") to guide epidurals. This is based on midline thick-slice rendered images constructed from the acquired volumes. The system is named SURE: Smart Ultrasound Rendering for Epidurals. The key is the lateral placement of the transducer with the thick-slice plane aligned with the midline of the spine. The custom-designed platform used for our system includes a motorized m4DC7-3/40 microconvex 4D transducer and Epiguide placed on the transducer.

A "re-slice" plane is extracted from the acquired volume which contains anatomical landmarks (e.g. laminae), the target epidural space and the needle path. This system therefore allows real-time guidance of the needle up to the epidural space without the transducer obscuring the puncture site. Fig. 1.7 clarifies the positioning of the transducer equipped with Epiguide, illustrates a paramedian sagittal placement of the transducer, and shows the rendered output of an echogenic needle inserted at the L3-4 level of a porcine model. Epiguide is placed on a motorized m4DC7-3/40 microconvex 4D transducer. The posterior view of the transducer is shown with an inset depicting a simulated thick-slice rendering of 3DUS data, to demonstrate the location of both the target epidural space and the needle trajectory. This system can also be used in the transverse direction by having the needle channel at the top. Fig.
1.8 illustrates the transverse positioning of the transducer as well as the rendered output using a spine phantom in a water bath. Clinical studies on the transverse positioning of Epiguide are ongoing. Detailed discussions on the applications of Epiguide and clinical investigations of SURE can be found in Chapter 2 and Chapter 3.

Figure 1.7: Illustration of our 3D thick-slice system used to guide epidurals in a porcine tissue model: (a) Epiguide placed on a motorized m4DC7-3/40 microconvex 4D transducer, (b) posterior view of the transducer, with an inset showing a simulated thick-slice rendering of 3DUS data (enhancement of the original drawing by Vicky Earle), and (c) thick-slice rendering of the acquired volume from a porcine study with an inserted echogenic needle.

Figure 1.8: Depiction of 3D thick-slice rendering (transverse view) used to guide a midline epidural needle insertion in a spine phantom: (a) coronal and (b) sagittal views of the transducer and the inserted needle in the phantom, and (c) thick-slice rendering showing the needle as well as the spinal anatomy.

1.4 Proposed solution

As mentioned before, accurate placement of the needle is crucial for the success of US-guided procedures. Most signal processing-based approaches rely on a high-intensity line of pixels rendered by the needle in the US image. In reality, however, the needle might not show up as high-intensity pixels at all, and there is still the challenge of identifying the needle when it is invisible or barely visible. Other possible solutions involve modifications to the apparatus (e.g.
needle), and are hence harder to adopt into current clinical practice. In summary, needle visualization is still a challenging research topic, especially in the following situations: (1) using a standard needle without any additional hardware, (2) for needles inserted at steep angles, (3) using curvilinear transducers, for which beam steering is challenging, and (4) avoiding false-positive identification of the needle when there are strong line-like features in the tissue.

This thesis attempts to provide a clinically-suitable solution to enhance needle visibility in ultrasound-guided interventions. We are especially interested in challenging cases where the needle is invisible (has barely detectable features) in the still US image. We hypothesize that there is sufficient information in the motion dynamics of the needle to successfully detect it, even when it is invisible in still B-mode images. The final learning-based needle tracking solution aims to detect, track and localize a hand-held needle in ultrasound image sequences. To this end, we have evaluated the proposed needle detection methods using various motion analysis approaches on the time-series data, and compared them to the gold standard identified by the sonographer in vivo and ex vivo.

1.4.1 Objectives

The ultimate goal of this thesis is to provide a clinically-acceptable technology that maximizes the accuracy of needle localization and has the potential to improve the needle placement procedure in epidurals. This thesis therefore has the following objectives: (1) provide a clinically-suitable software-based approach to detect an invisible needle in the US image, and (2) provide a clinically-acceptable system to facilitate single-operator midline epidural needle insertions, and investigate the clinical feasibility of the system, the real-life challenges of needle visibility and their effect on the success of epidurals.

1.4.2 Contributions

In this thesis we look into the problem of ultrasonic needle visibility in general (not limited to epidurals) and aim to detect a hand-held needle that is invisible in the still US image. We also developed several techniques that are fundamental for a reliable US-guided single-operator midline epidural procedure. To this end, we have proposed needle visibility enhancement methodologies that can be incorporated into our developed epidural guidance system. Several frameworks are designed and evaluated to detect a hand-held needle based on spatio-temporal and spectral signatures obtained from motion analysis. We demonstrate that nearly invisible changes in the motion dynamics of the needle can be revealed through spatio-temporal processing of standard B-mode video sequences. The proposed methodology is a fully software-based approach, designed with clinical utility in mind, to track a clinically-used standard needle. We have also investigated the feasibility of the proposed epidural guidance system on parturients with prospective observational studies. In the course of achieving the objectives, the following contributions have been made:

1. Proposed a motion-based real-time needle detection framework to localize an epidural needle in US based on stylus motion.

2. Proposed the novel idea of detecting an invisible hand-held needle based on the analysis of the dynamics of hand tremor motion.

3. Developed a novel framework that distinguishes a hand-held needle from the tissue using optical flow analysis and spectral coherence.

4. Developed a framework that extracts spatio-temporal and spectral features of a hand-held needle in US using spatial decomposition and temporal filtering.

5. Extended the proposed needle detection framework to incorporate probabilistic learning-based methods, and improved the detection accuracy using a confidence map.

6. Proposed a learning-based framework to track a hand-held needle while it is being inserted into tissue with intrinsic motion, imaged by a hand-held transducer.

Note that contributions 3-5 represent increasingly sophisticated and effective solutions, whose findings feed into the ultimate needle tracking solution of contribution 6.

This thesis aims to provide ultrasonic needle visibility enhancement solutions in general, and especially for our target application, epidurals. In order to gain a deeper understanding of the real issues and to investigate possible solutions to tackle them, we started with clinical investigations of SURE and of the needle visibility challenges in epidurals. It is acknowledged that the first two chapters, discussing Epiguide and its validation on parturients and porcine models, focus mostly on the clinical needs and statistical validation of the technology. Whereas the main focus of this thesis is on the last four chapters addressing the needle visibility issue, the initial clinical investigations of SURE and Epiguide were quite useful for learning several aspects of the epidural procedure and understanding the challenges of needle visibility and the needs of the clinicians.

1.5 Materials

Materials utilized in this dissertation were collected from studies conducted at UBC, BC Women's Hospital and the Jack Bell Animal Research Facility, following approval from the University of British Columbia/Children's and Women's Health Centre Research Ethics Board (certificates #A11-0223, #A14-0171 and #H10-01974). The detailed protocol used to collect data for each phase is discussed in the materials section of each chapter individually.

1.6 Thesis outline

The outline of the thesis is illustrated in Fig. 1.9.
The rest of the thesis is subdivided into seven chapters as outlined below:

Chapter 2: Single-Operator Midline Epidurals Using Epiguide: Puncture Site Selection

In this chapter, we introduce a novel real-time 3DUS image processing technique and an innovative needle guide (Epiguide), which facilitate real-time single-operator midline insertions, and validate the system clinically. Our main innovation is that we shift the US transducer away from obscuring the puncture site and use a re-slice of a 3DUS volume in order to provide a midline view of the lumbar anatomy. As our first clinical study using human subjects, it aims to evaluate the system's performance in identifying the needle's puncture site against a standard palpation-based method. Our proposed method produces puncture site identification accuracy that is within the range of intra-observer palpation variability.

Figure 1.9: Graphical outline of the thesis: Chapter 1 (real-time imaging; needle visibility), Chapter 2 (Epiguide: puncture site), Chapter 3 (Epiguide: puncture site/depth; visibility challenges), Chapter 4 (stylet motion), Chapter 5 (tremor motion: optical flow), Chapter 6 (tremor motion: phase-based analysis, machine learning), and Chapter 7 (tremor motion: optical flow and phase-based analysis, machine learning, tracking, freehand imaging).

Chapter 3: Needle Visibility in Ultrasound Using Epiguide: Challenges

To understand the challenges in needle visibility, we investigate the visibility of a standard needle in US images and compare it with an echogenic needle through a porcine study ex vivo. We further (1) evaluate the feasibility of using Epiguide to successfully achieve LOR, and (2) investigate the success of the procedure using a standard or an echogenic needle. A statistically significant improvement is observed with the use of an echogenic needle, with 94.2% of echogenic needle insertions rated as excellent/satisfactory, compared with just 29.4% for the standard needle.
The analysis also confirms the challenge of visualizing a standard needle; an echogenic needle increases the likelihood of successful LOR, with 93.3% of needle insertions rated as excellent achieving successful LOR, compared with just 50% of insertions rated as invisible. LOR up to the epidural space was successful in 76% of cases overall, most of which belonged to the echogenic needle group.

Chapter 4: Motion Analysis of a Standard Needle: Moving Stylus

In search of a clinically-suitable solution to enhance needle visibility in US images, and through consultation with our collaborators at BC Women's Hospital, we proposed our first method, which localizes a needle based on the analysis of motion dynamics. A real-time motion analysis of stylus oscillation within a stationary cannula is proposed. Through stylus oscillation, including its full insertion into the cannula to the tip, image processing techniques can localize the needle trajectory and the tip in the 3DUS volume. Minute intensity variation helps with the localization of a needle that is otherwise invisible to the eye. The method is evaluated on three different tissue types ex vivo, with its accuracy lying within clinical acceptance. Results also indicate that the method's performance is independent of the inherent visibility of the needle in a still image and of the echogenicity of the tissue. The results of this technique motivated us to continue investigating micro-motion analysis, which produces minute ultrasonic intensity variations, in order to help with the detection of invisible needles.

Chapter 5: Optical Flow Analysis of a Hand-held Needle: Fixed Imaging

In Chapter 4, analysis of motion dynamics proved to reveal needle signatures that were otherwise invisible in the image. This motivated us to propose a single-operator clinically-acceptable framework that incorporates motion components of the needle to help with the detection, without the need for an explicit motion.
In particular, we aim to detect regions moving coherently under hand tremor motion in order to detect an invisible hand-held needle. Unlike Chapter 4, no explicit/separate stylus motion is required. The subtle displacements arising from the hand tremor motion applied to the needle have a periodic pattern which is usually imperceptible to the naked eye in the B-mode image. A coarse-to-fine estimation process is proposed, in which the motion pattern, computed using optical flow analysis, is processed along spatio-temporal linear paths at various angles originating from the estimated puncture site, within the trajectory channel. Spectral coherence is used to compute the trajectory with the maximum correlation in the motion pattern. Results are evaluated on porcine femoris muscle in vivo with at least one pulsating vessel in the field of view. The results demonstrate that natural tremor motion creates minute coherent motion along the needle, which can be used to localize the needle trajectory within acceptable accuracy.

Chapter 6: Multi-scale Phase-based Analysis of a Hand-held Needle: Fixed Imaging

In search of a more robust, accurate and computationally faster technique to locate regions moving with tremor motion, we investigate phase-based analysis to estimate displacements due to hand tremor. Minute displacements arising from the tremor motion are detectable using multi-scale spatial decomposition followed by temporal filtering. The phase-based approach is also extended to a probabilistic learning-based approach using novel hand-crafted features. We hypothesize that formulating needle localization as a classification problem makes the needle detection more robust to imprecise assumptions and incorrect detection. In addition to the classification, we also obtain a probability map of the segmented pixels, showing the likelihood of needle pixels.
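To make the probability-map idea concrete, the following is a minimal sketch of pixel-wise classification with a probabilistic (Platt-scaled) SVM, using scikit-learn's SVC with probability=True. The random feature vectors, dimensions and kernel settings here are illustrative placeholders, not the hand-crafted spatio-temporal/spectral features used in this thesis:

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic stand-ins for per-pixel feature vectors: needle-like pixels
# drawn from one distribution, tissue pixels from another.
rng = np.random.default_rng(0)
X_needle = rng.normal(1.0, 0.3, size=(50, 4))
X_tissue = rng.normal(0.0, 0.3, size=(50, 4))
X = np.vstack([X_needle, X_tissue])
y = np.array([1] * 50 + [0] * 50)  # 1 = needle, 0 = tissue

# probability=True enables Platt scaling, so each pixel gets a
# likelihood of belonging to the needle class, not just a hard label.
clf = SVC(kernel="rbf", probability=True).fit(X, y)

# Column 1 corresponds to class 1 (needle); reshaped over the image
# grid, these values form the probability map described above.
probs = clf.predict_proba(X)[:, 1]
```

The per-pixel probabilities, rather than hard labels, are what allow the subsequent localization step to weight pixels by their likelihood of belonging to the needle.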
The two-step probability-weighted localization of the segmented needle in a learning framework is the key innovation, which results in improved localization and adaptability to specific clinical applications. Results are evaluated on porcine femoris muscle in vivo with at least one pulsating vessel in the field of view. Phase-based analysis and the probabilistic SVM result in clinically-acceptable accuracy and enhance needle detection against the surrounding vessels.

Chapter 7: Needle Tracking Using Phase-based Optical-flow Analysis and Machine Learning: Free-hand Imaging

The optical flow and phase-based approaches presented previously are efficiently combined in an online learning-based framework for real-time tracking and a potentially more accurate localization. The novel micro-motion-based approach presented in this chapter aims to track a needle in US images captured by a hand-held transducer. We have evaluated novel spatio-temporal and spectral features in a self-supervised tracking framework to improve the detection accuracy in subsequent frames using incremental training. Using spatio-temporal features and differential flow analysis, we incorporate the neighboring pixels and mitigate the effects of the subtle tremor motion of a hand-held transducer. We have compared the tracking method against our previously proposed approaches as well as the intensity-based state of the art. Compared to appearance-based detection approaches, the proposed method is especially suitable for needles whose ultrasonic characteristics are imperceptible to the naked eye in the static image. Results are obtained from tests on porcine femoris muscle in vivo, while the needle is being inserted in the tissue; a pulsating vessel was present when possible. In comparison against an intensity-based approach, the method achieves a 100% success rate versus only 14%.
The proposed framework can detect and track a needle in US with clinically-acceptable accuracy, regardless of the inherent visibility of the needle.

Chapter 8: Conclusion and Future Work

This chapter includes a short summary of the thesis, followed by discussions and suggestions for future work.

Chapter 2: Single-Operator Midline Epidurals Using Epiguide: Puncture Site Selection

2.1 Introduction

US has previously been introduced as an adjunct in neuraxial procedures to identify the midline, identify the intervertebral levels, choose the puncture site and measure the distance to the LF [60-62]. Most research has been on pre-puncture US, but several studies attempted real-time US during needle insertion. Grau et al. [26] described real-time 2DUS and Belavy et al. [31] described 3DUS, but both required two operators, which is impractical for routine practice. Niazi et al. [63] performed single-operator real-time US-guided spinal anesthesia using the SonixGPS needle tracking system, but it required a complex procedure with an electronic sensor inserted in the needle that precludes using LOR to determine the depth. Tran et al. [30] described real-time 2DUS with a single operator using LOR, but required a paramedian needle insertion instead of the preferred midline approach. Karmakar et al. [32] and Menac et al. [64] also investigated real-time 2DUS, but likewise required a paramedian approach. A paramedian approach passes through the erector spinae muscle, increasing patient discomfort compared to the standard midline approach.

We have developed SURE: a novel, real-time 3DUS image processing technique and an innovative needle guide (Epiguide) to facilitate real-time, single-operator, midline insertions. The overall concept is to shift the US transducer away from obscuring the puncture site and use a re-slice of a 3DUS volume in order to provide real-time imaging.

This chapter is adapted from [59]: P. Beigi, P. Malenfant, A. Rasoulian, R. N. Rohling, A. Dube and V. Gunka (2017). Three-dimensional US-guided real-time midline epidural needle placement with Epiguide: A prospective feasibility study. US in Medicine & Biology, vol. 43, no. 1, pp. 375-379.

As the first step in evaluation on human subjects, the aim of the present study is to investigate the ability of the system to identify the needle's puncture site in comparison with standard palpation. The first hypothesis was that 3DUS+Epiguide can identify the puncture site within a 5 mm radius of the site identified by palpation, comparable to current practice of neuraxial insertions (5 mm is the approximate intra-observer palpation variation). The second hypothesis was that the differences between the puncture sites identified by 3DUS+Epiguide and palpation are not correlated with patient characteristics.

2.2 Materials and methods

2.2.1 Study population and design

Following approval from the University of British Columbia/Children's and Women's Health Centre Research Ethics Board (Certificate #H10-01974), we conducted a prospective observational study in parturients delivering at BC Women's Hospital, Vancouver, Canada between December 2013 and January 2014 (n=20). The study was registered on the Clinical Trials Registry (NCT01523249, Principal Investigator: Vit Gunka, January 27, 2012). After signed consent, patients were recruited prior to scheduled Cesarean delivery. Inclusion criteria included ASA (American Society of Anesthesiology) physical status I or II, age ≥ 19 years, term pregnancy (≥ 37 weeks gestational age) and the ability to read English. Exclusion criteria included BMI (Body Mass Index) ≥ 40, scoliosis, previous lower back surgery, active labour, and allergy to epidural tape, surgical paper tape, or felt pen. All health information remained confidential and personal identifiers were not attached to the collected data.
Patient characteristics were (mean ± standard deviation): 36.4 ± 5.0 years of age, 74.4 ± 8.3 kg weight, 165.0 ± 6.1 cm height, 28.0 ± 3.3 kg/m² BMI and 39.0 ± 1.0 weeks gestational age.

2.2.2 3DUS+Epiguide development

3DUS was implemented on a commercial US machine (Sonix Touch, Ultrasonix Medical Corp., Richmond, Canada) with a motorized US transducer (m4DC7-3/40). The machine and the transducer are Health Canada approved. 3D thick-slice rendering was computed with custom software along the mid-sagittal plane that includes the needle trajectory.

The prototype of the Epiguide was designed in collaboration with Starfish Medical Corporation (Victoria, Canada) under ISO 13485 and fabricated from stainless steel for sterilization in a steam autoclave. The needle guide comprises two separate pieces that clamp precisely together onto the transducer in a stable and unique locked position. Epiguide has a channel in which the needle is pressed and held with a finger (Fig. 2.1).

Figure 2.1: Epiguide model sketch: (a) side and (b) top views of the m4DC7-3/40 US transducer equipped with Epiguide. Red arrows point to the channel for needle placement.

Thick-slice Volume Rendering

The concept of the 3D thick-slice rendering is illustrated in Fig. 2.2 and Fig. 2.4. To match the current standard of care for midline insertion, the transducer is designed to be placed paramedian while the needle is inserted midline using the custom-built needle guide, allowing the LOR technique. Maximum Intensity Projection (MIP) was performed on interpolated slices of the B-mode US volume within ±2 mm of the mid-slice, to superimpose the inserted needle on anatomical structures such as the laminae and spinous processes.

Fig. 2.2a shows the 3D volume imaged by the transducer as a light blue outline and the 4 mm thick slice of the volume as a pair of vertical black lines. In the depicted pose of the transducer, the thick slice is aligned with the midline of the spine.
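The thick-slice MIP described above reduces to taking a per-pixel maximum over the elevational axis of the volume. A minimal numpy sketch, in which the function name, array layout and the conversion of the ±2 mm extent into a slice count are illustrative assumptions (the actual pipeline also interpolates the slices before projecting):

```python
import numpy as np

def thick_slice_mip(volume, mid_index, half_thickness):
    """Render a thick slice by maximum-intensity projection.

    volume:         3D array of B-mode intensities, shape (slices, rows, cols),
                    with axis 0 as the elevational (slice) direction.
    mid_index:      index of the mid-sagittal slice.
    half_thickness: number of slices to include on each side of the
                    mid-slice (the +/- 2 mm extent expressed in slices).
    """
    lo = max(mid_index - half_thickness, 0)
    hi = min(mid_index + half_thickness + 1, volume.shape[0])
    # Keep the brightest voxel along the elevational axis, so that a
    # needle lying slightly off the mid-plane is still superimposed
    # on the midline anatomy.
    return volume[lo:hi].max(axis=0)
```

Taking the maximum (rather than, say, the mean) across the ±2 mm of slices is what lets a bright needle echo just off the mid-plane appear in the rendered midline view.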
The needle trajectory passes through this thick slice and is shown as a red line in the image. The LF produces a strong US echo above the epidural space and is depicted as a short horizontal line (at the tip of the red line) below the wave-shaped echoes from the laminae of neighbouring vertebrae (Fig. 2.2b). The intersection of the needle trajectory (red) with the LF indicates proper alignment to the anesthesiologist.

Data Collection

Procedural steps were performed by an anesthesiologist (Paul Malenfant). Patients were seated, with neck and hips flexed, and feet supported. The anesthesiologist first determined the midline with 2DUS scanning, then palpated L2-3 and L3-4 and marked the levels on the skin with a felt-tipped marker. A blunt-tipped needle was placed into the Epiguide to mimic the handling before an actual insertion. The 3DUS transducer equipped with Epiguide was placed in the paramedian plane at L2-3 and L3-4. The needle was first aligned visually to be perpendicular to the skin, then US guidance was used for fine positioning of the transducer. The wave-shaped pattern was observed as the US reflected off the laminae, and the transducer was moved cranially-caudally until the intended interspace was visualized in the centre of the image. The position and angle of the transducer were adjusted until the best view of the LF was obtained.

Based on where the needle touched the skin, a green felt-tipped pen was used to mark the 3DUS+Epiguide-selected puncture site on the skin corresponding to the L2-3 and L3-4 interspaces, labeled as 3D1 and 3D2, respectively. A transparency was attached to the patient's back and its edges were marked on the skin. 3D1 and 3D2 were then transferred to the transparency, the marks on the skin were erased and the transparency was rolled up. Next, the puncture sites at L2-3 and L3-4 were identified using palpation, marked on the skin using a red felt-tipped marker and labeled as P1 and P2, respectively.
The transparency was rolled down with a visual check that the corners re-aligned with the edge marks. The points were transferred to the transparency, and each transparency was tagged with a non-identifying patient number. The marks were transferred to a transparency film to record the puncture sites and blind the anesthesiologist to previous measurements. These steps are summarized graphically in Fig. 2.3. A complete description of the clinical study protocol is also included in Appendix 1.

The difference between the puncture sites from 3DUS+Epiguide and palpation was determined in two ways. The first calculation was the Euclidean distance between 3D1 and P1 and the distance between 3D2 and P2. The second calculation was the distance from 3D1 to the estimated midline, and likewise from 3D2 to the estimated midline. The midline was estimated by the line that passes through P1 and P2, called the "Pline" (see Fig. 2.4).

Figure 2.2: 3DUS with Epiguide. (a) Transverse view of the spine and the US field of view. Black parallel lines show the re-slice plane and the red line shows the needle trajectory. (b) Operator holding the transducer paramedian for a midline needle insertion. (c) The transducer is shown as a blue box with an inset showing a simulated thick-slice rendering of 3DUS data (between black lines about the midline) depicting both the target epidural space and the needle trajectory.

Figure 2.3: Description of the puncture site labeling and measurements: (a) determine 3D1 and 3D2 with thick-slice US imaging, (b) copy points 3D1 and 3D2 onto the transparency, (c) identify P1 and P2 by palpation and (d) copy points P1 and P2 onto the same transparency.

Data processing and statistical analysis

Puncture site measurements were obtained at two levels on each subject (n=20) with both methods, for a total of 40 paired sites.
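The two distance measures defined above, the site-to-site Euclidean distance and the perpendicular distance to the Pline, are standard planar geometry. A minimal sketch, with hypothetical function names and 2D skin coordinates in millimetres:

```python
import math

def euclidean(p, q):
    """Straight-line distance between two skin markings (mm)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def distance_to_line(p, a, b):
    """Perpendicular distance from point p to the line through a and b
    (e.g. the "Pline" through the palpated sites P1 and P2)."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    # |cross((b - a), (a - p))| / |b - a| gives the point-line distance.
    return abs((bx - ax) * (ay - py) - (ax - px) * (by - ay)) / math.hypot(
        bx - ax, by - ay
    )
```

For example, euclidean((0, 0), (3, 4)) returns 5.0, and the Pline distance is independent of where along the P1-P2 line the measurement is taken.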
Another anesthesiologist (Vit Gunka) repeated the puncture site identification (n=10) on a single subject to independently assess intra-observer variability. The standard deviation (SD) and maximum extents of the set of puncture sites were measured. Statistical analyses were carried out in R version 3.1.1 (R Core Team).

Figure 2.4: (a) Thick-slice rendering showing the LF in the midline. The LF is also aligned with the anticipated needle trajectory, shown as horizontal lines at each centimeter of depth. (b) Conceptual illustration of the skin markings and measurements based on vertebral levels L2-3 and L3-4.

2.3 Experimental results

The mean and SD of the distances (mean ± SD) for 3D1-P1 was 3.1 ± 1.7 mm, for 3D2-P2 was 2.8 ± 1.3 mm, for 3D1-Pline was 1.8 ± 1.2 mm and for 3D2-Pline was 2.2 ± 1.2 mm. The mean and SD of the distances were not significantly different between the L2-3 and L3-4 interspaces. In 95% of measurements (38 out of 40), the difference between the needle puncture site identified by 3DUS+Epiguide and that identified by palpation was less than 5 mm. In 100% of measurements, the perpendicular distance between the needle puncture sites identified by 3DUS+Epiguide and the midline identified by palpation was less than 5 mm. A Pearson correlation analysis was performed between 3DUS+Epiguide and palpation. For the correlation of the errors with patient characteristics, the correlation coefficients and p-values for all the tests were calculated, with the hypothesis of correlation rejected at p = 0.05. The distances between the puncture sites identified by 3DUS+Epiguide and palpation were not correlated with the patient characteristics of age, weight, height, BMI, or gestational age.

This section is also adapted from [65]: P. Malenfant, P. Beigi, V. Gunka, A. Rasoulian, R. N. Rohling and A. Dube (2014). Accuracy of 3D Ultrasound for identification of epidural needle skin insertion point in parturients: A prospective observational study. Society for Obstetric Anesthesia and Perinatology, pp. 308.

2.4 Discussion and conclusion

The distances between the puncture sites from 3DUS+Epiguide and palpation were comparable to the intra-observer variability measured for the performing anesthesiologist (SD/max): 1.9/5.2 mm laterally and 1.6/4.7 mm caudo-cranially for L2-3, and 1.8/5.4 mm laterally and 2.2/7.8 mm caudo-cranially for L3-4. In particular, all of the measured distances fall within the lateral range of intra-observer variability of manual palpation on the same patient, suggesting that the mid-sagittal plane was correctly identified by 3DUS+Epiguide.

The long-term aim of the proposed system is to extend the abilities and potential benefits of current US systems. A high success rate of pre-puncture 2DUS in determining the puncture site for labour epidurals was reported in both the general obstetric population [66] and in obese parturients [67]. The limitation of current pre-puncture US is the lack of guidance during needle insertion and the possibility of an erroneous trajectory once the US is removed. Full guidance would provide information on the puncture site, needle angle, and depth before and during the insertion.

Belavy et al. [31] assessed the feasibility of standard real-time 3DUS for epidural catheter placement. The authors concluded that real-time 3DUS had the potential to improve operator orientation on the vertebral column, but this came at the price of decreased resolution, frame rate, and needle visibility. 3DUS+Epiguide has similar limitations of resolution and frame rate due to the current motorized technology of the 3DUS transducer, but slow needle advancement is typical and matrix-array technology is also available (e.g.
xMATRIX array transducer, Philips Ultrasound, Bothell, WA) to improve the frame rate if needed.

Successful use of single-operator real-time US-guided lumbar epidural insertion (without Epiguide) in parturients, but using a paramedian approach, was previously studied by our group [30]. 3DUS+Epiguide, on the other hand, determines a midline needle puncture site using a single-operator real-time technique. Epiguide was developed to allow a single operator to perform a midline epidural needle insertion guided in real time by 3DUS. The ergonomic design of Epiguide allows for ease of assembly, mounting and holding, a zero-force removal of the needle from the guide, and the ability to simultaneously perform LOR for endpoint determination. Epiguide also has a physical zero-mark that corresponds to a digital zero-mark at the top of the US image, which will be used in a future study on guiding the depth of insertion.

In conclusion, we found that 3DUS+Epiguide was feasible for depicting the vertebral levels and selecting the needle puncture site in parturients, in a single-operator real-time midline setup. Further studies in the following chapter will investigate 3DUS+Epiguide's ability to measure the depth of the target LF and to visualize the needle advancing toward the target for a successful LOR.

Chapter 3: Needle Visibility in Ultrasound Using Epiguide: Challenges

3.1 Introduction

As mentioned in previous chapters, lumbar epidurals are frequently performed in pregnant women for analgesia and anesthesia during labor and delivery. Traditional insertion methods utilize a landmark technique for lumbar level selection and midline identification. Once partially inserted, the needle is subsequently guided by 'feel' until a loss of resistance (LOR) to fluid or air signifies entry into the epidural space. This technique may be challenging in obese parturients and in those with anatomical variations, where surface landmarks may not reflect the underlying anatomy.
Pre-puncture US to identify the intervertebral space has previously been shown to reduce the number of attempts and improve labor analgesia [26, 69]. Indeed, in the UK, the National Institute for Health and Care Excellence issued guidelines in 2008 concluding that US guidance can improve both patient comfort during the procedure and the success rate for entering the epidural space on the first attempt [70].

Recently there has been increased interest in real-time US-guided neuraxial placement [30-32, 64, 71, 72]. Real-time US, as compared to pre-puncture US, has the potential benefit of providing continuous visual guidance of the trajectory of the needle toward the target. However, it demands relatively complex techniques which remain in the experimental stages. As mentioned in the previous chapter, real-time US-guided epidural placement has been described using either a paramedian approach [32, 64, 71, 72] or with two operators [26, 31]. The proposed system, SURE, presented in Chapter 2, is instead based on a single operator holding the probe, performing the needle insertion in the midline, and using an LOR technique to identify the epidural space. The proposed 3DUS technique allows a paramedian scan and utilizes a thick-slice rendering technique to obtain an image that includes a midline view, including the advancing needle.

This chapter is adapted from [68]: J. Stone, P. Beigi, R. Rohling, V. A. Lessoway, A. Dube, and V. Gunka (2017). Novel 3D Ultrasound system for midline single-operator epidurals: A feasibility study on a porcine model. International Journal of Obstetric Anesthesia, vol. 17, pp. 51-56.
When combined with an innovative needle guide (Epiguide) that provides a fixed point of needle attachment, this allows for a rendered, in-plane, real-time, single-operator, midline US-guided epidural needle insertion.

As mentioned in Section 1.3.2, despite several attempts, needle visibility has been a continuing challenge in many needle insertion procedures. In this chapter, we compare the visibility of a standard needle with that of a special type of needle with enhanced visibility, called an echogenic needle. The Tuohy Sono needle, for instance, has machined "Cornerstone" reflectors covering the distal portion of the cannula to improve its ultrasonic visibility [51-54].

The objectives of the study in this chapter are to confirm the feasibility of single-operator real-time midline in-plane US-guided epidurals in a porcine model ex vivo, to determine whether the use of echogenic needles has an effect on needle visibility, and consequently, to determine whether this affects the rate of successful insertion into the epidural space. We hypothesize that this technique offers a practical solution to single-operator US-guided epidurals, and that echogenic needles can further improve it by enhancing needle visibility.

3.2 Materials and methods

The study was conducted at the Robotics and Control Laboratory at the University of British Columbia. Freshly slaughtered porcine tissue was obtained through a certified butcher. The UBC guidelines and notification of the Animal Care and Biosafety Committee were observed. A convenience sample of six intact porcine spines ex vivo were mounted on a stand and secured to create a lumbar curvature.

3DUS was implemented on a commercial US machine (Sonix Touch, Ultrasonix Medical Corp., Richmond, Canada) with a motorized US transducer (m4DC7-3/40). The transducer, with a frequency range of 3-7 MHz and a field of view of 79°, is placed in the paramedian plane, while the needle is inserted midline.
Depth and focus were adjusted by the anesthesiologist during imaging. MIP was performed on the captured slices of the B-mode US volume within ±2 mm (slice thickness) of the mid-slice, to superimpose the inserted needle on anatomical structures such as the laminae and spinous processes. The machine and the transducer are Health Canada approved. A sterile transducer cover (CIV-Flex, USA) and coupling gel (Aquasonic 100, Parker Lab, USA) were applied to the US transducer. The Epiguide, our innovative needle guide, was mounted on the transducer (Fig. 3.1), and calibrations were taken to equate the US distance, from the 'zero point' on the Epiguide to the end of a needle of known length, with the actual distance. The prototype of the Epiguide was designed in collaboration with Starfish Medical Corporation (Victoria, Canada) under ISO 13485 and 3D printed in an acrylic compound. The Epiguide has a channel in which the needle is pressed and held with two or three fingers.

Figure 3.1: Epiguide mounted on US transducer with epidural needle in situ: (a) probe oriented in the sagittal plane and (b) probe oriented in the transverse plane.

US scanning was performed by an anesthesiologist (Jeannine Stone), starting at the caudal-most level in the paramedian plane. In order to identify each intervertebral space, scanning was initially in 2D mode with the beam angled medially, aiming to detect the distinctive wave-like pattern of the laminae. Further scanning then sought the acoustic window between the laminae, so that the LF, or the posterior complex, which comprises the LF, epidural space and posterior dura [73], was visible (Fig. 3.2(a)).

Once this typical pattern was seen, depth settings were optimized and the probe was switched to 3D function. This changed the mode of the US such that the paramedian images obtained were reconstructed to allow visualization of the midline and the needle trajectory.
The selected needle was then mounted on the Epiguide, which was used to guide the trajectory of the needle. The needle was inserted into the skin and advanced toward the epidural space in the midline plane, watched in-plane, in real time, on the 3DUS screen. If the needle was visibly heading off course, it could be re-angled slightly to again aim for the LF. Once the needle was near the LF, the stylus was removed and an LOR syringe filled with water was attached to the epidural needle. A continuous-pressure technique was utilized on advancing the epidural needle, as shown in Fig. 3.3. Needle visibility was assessed in a series of these images by the anesthesiologist on a 4-point scale [26, 74] (Fig. 3.2(b)): 0 = cannot see, 1 = poor, 2 = satisfactory, 3 = excellent. Successful entry into the epidural space was judged by achieving LOR to fluid.

Figure 3.2: Paramedian sagittal oblique 3D thick-slice rendered view of porcine lumbar vertebrae with an echogenic needle in situ. The posterior complex consists of the LF, epidural space and posterior dura. The anterior complex comprises the anterior dura, posterior longitudinal ligament and vertebral body. (a) Needle halfway in; (b) needle tip just before the posterior complex.

Needle insertion depth, from the needle tip to a "zero mark" on the Epiguide, was recorded. The calibrated 3DUS depth, from the 'zero' point marked on the Epiguide to the LF, was also measured. The needle was then removed, and the whole process was repeated at ascending lumbar levels. The needle order was randomized using a computer-generated program, with the result of eventually having equal numbers of echogenic needle insertions (Pajunk, Germany; 17G, 150 mm Tuohy) and standard needle insertions (Arrow, USA; 17G, 125 mm Tuohy).

Each intervertebral level was accessed only once.
Once all lumbar levels had been utilized, the next porcine spine was used, starting at the caudal-most level. The screenshot images were saved, and needle visibility was later rated by an independent assessor (Vickie Lessoway), an experienced ultrasonographer who was blinded to the needle type. Success at first needle pass was recorded in a binary manner, based on whether or not LOR to fluid was achieved. Needle visibility scores from the anesthesiologist and the independent assessor were assessed for degree of agreement. The actual manual needle insertion depths we had recorded were then compared with the 3DUS depths.

Figure 3.3: Screenshot of video showing the continuous-pressure technique for LOR, with the Epiguide and needle held in one hand and the syringe held in the other hand.

3.3 Statistical analysis

All analyses were carried out in R version 3.3.2 (R Core Team). Success rates of this novel technique were calculated as a simple percentage of successful entries into the epidural space, as a proportion of all insertions. Concordance correlation coefficients were calculated between the manual and US-measured depths of needle insertion, with 95% CI. A Bland-Altman analysis was plotted to determine the mean difference between these two measurements and the 95% limits of agreement.

To determine whether the use of echogenic needles had an effect on needle visibility, a weighted kappa score with 95% confidence interval (CI) was first calculated for the level of agreement between the ratings of needle visibility from the anesthesiologist and the independent assessor. Mean needle visibility scores between the two assessors were then calculated. Fisher's exact test was used to determine whether mean needle visibility differed significantly between the needle types. This test was also used to assess whether the success of insertion was related to needle visibility.
A Cochran-Armitage test for trend was then performed on these data.

3.4 Experimental results

A total of 34 epidural needle first-pass insertions were evaluated in the six porcine spines at 34 lumbar intervertebral spaces. 76% of insertions (26/34) were deemed a success, as judged by achieving LOR. The unsuccessful needle insertions did not result in entry into the epidural space, as indicated by no LOR to fluid. These occurred predominantly (5/8 occasions) during insertions where needle visibility was not optimal ('0', cannot see, or '1', poor) (Table 3.1).

We collected depth data for 14/34 needle insertions. Of these, the mean actual insertion depth of the needle, from the zero mark to the epidural space, was greater than the mean depth estimated using the calibrated 3DUS: 10.4 cm (SD 0.5 cm) vs 9.6 cm (SD 0.6 cm). The concordance correlation coefficient between the manual and US measurements of needle depth was 0.45 (95% CI = 0.20 to 0.65). Bland-Altman analysis suggested a mean difference between the methods of 0.85 cm (SD = 0.53 cm). The 95% limits of agreement range from -0.21 cm to 1.91 cm (Fig. 3.4).

The mean ± SD thickness of the LF measured on screenshot images was 5.4 ± 1.2 mm. A Cohen's weighted kappa score of 0.83 (95% CI 0.75−0.91) was calculated between the anesthesiologist's and the independent assessor's ratings of needle visibility, indicating near-perfect agreement [76]. Based on this, mean visibility scores between the assessors were calculated.

In 94.2% of insertions with the echogenic needles, visibility was rated as excellent/satisfactory, compared with 29.4% of standard needle insertions (Table 3.2, Fig 4). This reached statistical significance (Fisher's exact test p ≤ 0.0001). 93.3% of needle insertions (14/15) with needle visibility rated as 'excellent' achieved successful LOR (Table 3.1). However, the association between better needle visibility and successful entry into the epidural space did not reach statistical significance (Fisher's exact test p = 0.11).
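For illustration, the trend between visibility score and LOR success can be checked against the Table 3.1 counts with a small pure-Python sketch of the (unweighted, equally spaced scores) Cochran-Armitage statistic; this is a reconstruction for reference, not the original R analysis.

```python
import math

# Success-at-LOR counts by mean visibility score 0..3 (Table 3.1).
successes = [3, 5, 4, 14]
totals    = [6, 7, 6, 15]
scores    = [0, 1, 2, 3]

def cochran_armitage_z(successes, totals, scores):
    """Cochran-Armitage test for trend in binomial proportions."""
    n_total = sum(totals)
    p_bar = sum(successes) / n_total
    num = sum(s * (r - n * p_bar) for s, r, n in zip(scores, successes, totals))
    var = p_bar * (1 - p_bar) * (
        sum(s * s * n for s, n in zip(scores, totals))
        - sum(s * n for s, n in zip(scores, totals)) ** 2 / n_total
    )
    return num / math.sqrt(var)

z = cochran_armitage_z(successes, totals, scores)
p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value from the normal tail
print(round(abs(z), 1), round(p, 2))  # |Z| ≈ 2.1, p ≈ 0.03
```

The sign of Z depends only on the ordering convention of the scores; the magnitude matches the |Z| = 2.1, p = 0.03 reported for the trend test.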
A Cochran-Armitage test for trend did, however, reveal a significant trend for increased success at higher needle visibility (Z = -2.1, p = 0.03).

This section is also adapted from [75]: J. Stone, P. Beigi, R. Rohling, V. A. Lessoway, A. Dube, and V. Gunka (2015). Novel 3D-ultrasound-guided midline lumbar epidural placement, utilizing Epiguide needle guide in porcine model: a comparison of standard versus Pajunk epidural needles. Society for Obstetric Anesthesia and Perinatology, pp. 278.

Table 3.1: Success at LOR categorized by mean visibility score. Visibility score key: 0 = cannot see, 1 = poor, 2 = satisfactory, 3 = excellent.

LOR success        0          1          2          3
Successful LOR     3 (50.0%)  5 (71.4%)  4 (66.7%)  14 (93.3%)
Unsuccessful LOR   3 (50.0%)  3 (28.6%)  2 (33.3%)  1 (6.7%)

Figure 3.4: Bland-Altman plot of manual depth of needle insertion vs US-measured depth, with 95% limits of agreement.

Table 3.2: Mean needle visibility score categorized by needle type (echogenic and standard). Visibility score key: 0 = cannot see, 1 = poor, 2 = satisfactory, 3 = excellent.

Mean visibility score   Echogenic n (%)   Standard n (%)
3                       14 (82.4%)        1 (5.9%)
2                       2 (11.8%)         4 (23.5%)
1                       1 (5.9%)          6 (35.3%)
0                       0 (0.0%)          6 (35.3%)
Totals                  17                17

3.5 Discussion and conclusion

The key finding of the feasibility study presented in this chapter was that our 3DUS and Epiguide technique for single-operator, real-time, midline epidural placement was successful in 76% of cases. The unsuccessful needle insertions occurred predominantly (5/8 occasions) during insertions where needle visibility was not optimal ('0', cannot see, or '1', poor) (Table 3.1). Therefore, just as in non-US-guided epidurals in clinical practice, we occasionally hit bone, or simply did not make contact with the LF.
In the remaining three insertions that were unsuccessful despite satisfactory ('2') or excellent ('3') needle visibility, it is possible that we mis-identified the target structure, the posterior complex, and were therefore heading for the wrong target.

Although many groups are experimenting in this area, aiming to perform real-time US-guided epidurals, we believe we are the first group both to utilize the midline plane and to require just a single operator. It was the use of our Epiguide fixed needle guide, which holds the needle at a fixed angle to the probe but allows it to move forwards and backwards, that enabled us to perform the needle insertion with just a single operator, leaving a free hand to assess for LOR. Previous groups, including Grau et al. [26], have attempted real-time US-guided neuraxial blockade with the use of two operators. This has the disadvantages of requiring two skilled operators, with the obvious financial implications, and of requiring two pairs of hands in a relatively small operating field. In the paramedian plane, with a single operator, previous success has been achieved by Karmaker et al. [32]. However, because their technique required two hands (one for the US probe and one for the needle), they used an auto-detect syringe to detect LOR. This adds further complexity to the technique and has the potential of producing a false LOR. This group also found that they obtained relatively poor views of the needle, due to the steep angle of insertion in the paramedian plane. In contrast, Tran et al. [30], like us, used a fixed-needle technique for their successfully performed single-operator, real-time guided epidurals, again using the paramedian plane. By utilizing 3DUS with thick-slice rendering, we were able to alter this method to allow the US scan to be performed in the paramedian plane, but with visualization of the needle inserted in the midline.

Our secondary objective was to determine whether the use of Pajunk echogenic needles had an effect on needle visibility.
As might be expected, we found that needle visibility was statistically significantly improved by the use of echogenic needles, with 94.2% of insertions with echogenic needles rated as excellent/satisfactory, compared with just 29.4% for standard needles (Table 3.2). This, in turn, seemed to increase the likelihood of successful LOR, with 93.3% of needle insertions rated as excellent achieving successful LOR, compared with just 50% of insertions rated '0', or cannot see. This trend was confirmed by a Cochran-Armitage test.

A possible explanation for this trend is that when the needle was clearly seen to be heading in the wrong trajectory, it could be re-angled slightly to target the LF. Menace et al. [64] have previously investigated the use of echogenic Tuohy needles from Pajunk in a cadaver study, again using a paramedian approach and real-time views. Like us, they found that needle tip visibility was improved by the use of echogenic needles; however, their study was limited by the fact that it was non-blinded. Although the rating of needle visibility is inherently subjective, we attempted to reduce any potential bias by having a second, independent assessor, who was blinded to the needle type being used, also rate the visibility, and we found near-perfect agreement (weighted kappa 0.83).

The depth calculated from the manual needle insertions was consistently greater than the 3DUS image depth (mean difference between the methods by Bland-Altman analysis of 0.85 cm, SD = 0.53 cm). This is consistent with previous work, albeit to a greater extent; Tran et al. found a mean error of 0.28 cm [77]. The error in our figures can be accounted for by a number of phenomena. Firstly, not all screenshots showed optimal views of the LF (as they were captured for optimal needle visibility), so identifying the leading edge of the LF for the 3DUS measurement was not always without difficulty.
Secondly, the US depth was measured to the leading edge of the LF, i.e. the posterior edge, or the edge most proximal to the needle. The actual thickness of the (human) LF is 5.0-6.0 mm [77], and in our study we indeed measured a width of 5.4 ± 1.2 mm. The manual needle depth, however, was calculated once LOR had been achieved, by definition once the needle was through the LF and into the epidural space.

The concordance correlation coefficient between the manual needle depth and the 3DUS-measured depth was also low: 0.45 (95% CI 0.20 to 0.65). However, we only managed to collect depth data for 14/34 needle insertions, as we did not collect these data on the first day of the study. This resulted in a very small sample size, reflected by the wide CI, and is clearly a limitation of this part of the study. The study is further limited by the fact that it was conducted ex vivo in a laboratory, and not on live human patients. However, we selected a porcine model because porcine lumbar spinal anatomy is similar to human anatomy [77]. A convenience sample size was used, as this was a feasibility study for which power calculations could not be made.

We did not record instances of inadvertent dural puncture. This is because, in the prepared porcine model, as well as the tissues being relatively stiff, we could not be confident that there was CSF in the dural space. We could not, therefore, reliably say whether or not dural puncture had occurred. As entry into the epidural space was identified using the clinically familiar technique of LOR to fluid, we do not foresee increased episodes of dural puncture being a risk of this technique when used in human patients. The next step in this process is to determine the feasibility of this technique on human parturients while the needle is inserted.
However, the safety of US gel on human nervous tissue has yet to be established, so a potential modification, postulated by some groups, is the use of sterile water as a coupling medium [32].

In summary, this chapter demonstrates the feasibility of a novel technique for single-operator, real-time, US-guided midline lumbar epidurals. We have also demonstrated that US-calculated depths of the LF tend to underestimate actual depths. Although this is a safer position than overestimating, we recommend that 3DUS, together with the Epiguide, be used to aid both the selection of the puncture site and the needle trajectory, but that LOR be used as the endpoint of insertion into the epidural space. We have further shown that the use of echogenic epidural needles in this setting improves needle visibility, and that improved needle visibility tends to lead to increased success at LOR, which supports our hypothesis.

The use of 3DUS in real-time epidurals has previously been investigated by Belavy et al. [31]. They postulated that there would be potential to image the needle insertion outside the primary imaging plane. However, the only feasible technique they found required a paramedian US scan with in-plane paramedian needle insertion. They concluded that real-time 3DUS has the potential to improve operator orientation on the vertebral column, but that this comes at the price of decreased resolution, frame rate, and needle visibility. They found that the in-plane 3D images were inferior to the 2D images. As each new generation of technology provides motorized transducers with faster imaging speed and smaller size and weight, the latest motorized transducers are approaching 2D transducers: the old 4DC7−3/40 convex 4D transducer (45 mm × 75 mm, 250 g) is significantly larger and heavier than the new m4DC7−3/40 microconvex 4D transducer (35 mm × 60 mm, 195 g). Although our 3DUS has similar limitations in frame rate, faster transducers already exist (e.g.
xMATRIX array transducer), and imaging speed could be improved if needed.

Based on the studies presented in the past two chapters, we have observed the practical challenges involved in US imaging for epidurals, as well as workflow issues. In this chapter, we have demonstrated the significance of needle visibility to the success of the epidural procedure. As mentioned, hardware-modified needles such as echogenic needles are too costly for routine clinical use, and clinicians therefore prefer solutions that work with standard needles and the current apparatus. Having gained an understanding of the clinical problem and the issues to tackle, we provide, in the remaining chapters, solutions to enhance needle appearance in ultrasound images and thereby help with guidance. Our main aim is to propose simple, software-only solutions that are applicable to both 2D and 3D imaging and clinically acceptable. Our focus is on enhancing needle visibility, especially in difficult cases where the needle has no discernible features in the static US image. Although our special interest is epidurals, the proposed needle visibility enhancement frameworks are quite generic and could be used in various applications, e.g. in-plane biopsies.

Chapter 4

Motion Analysis of a Standard Needle: Moving Stylus

4.1 Introduction

As introduced in Chapter 1, there are many state-of-the-art needle detection methods for US imaging. In most of these methods, one of the assumptions for needle detection is that the needle appears as a high-intensity line of pixels [34–40], which is not true in many cases. These studies also often have tunable parameters that need to be adjusted for different US settings such as frequency, depth and angle [40]. Some of them focus only on the needle trajectory, leaving the difficult tip identification task to the operator [36], and some focus only on the needle tip [43].
This chapter investigates motion analysis to localize a needle that is otherwise invisible in the static B-mode image.

This chapter is adapted from [78]: P. Beigi, R. Rohling, T. Salcudean, V. A. Lessoway and G. C. Ng (2015). Needle trajectory and tip localization in real-time 3-D ultrasound using a moving stylus. Ultrasound in Medicine & Biology, 41(7), 2057-2070.

Needle segmentation can be a challenging procedure, due to several factors. First, specular reflection off the needle degrades sonographic needle visibility and reduces the needle brightness. Second, steeper needle insertion angles (approaching perpendicular to the imaging plane) also cause a degradation in needle visibility. Last, even when the needle is visible to some degree, speckle noise and reverberation artifacts are still present. Due to speckle, signal fallout and other strong edges, standard edge and line detection methods often fail to determine the needle location.

The previous chapters involved clinical investigation of 3DUS+Epiguide using the motorized m4DC7−3/40 microconvex array transducer. However, as the mechanical motion of the motorized transducer limits the frame rate, the temporal resolution is not sufficient for accurate analysis of the motion dynamics needed to detect a needle, and a faster frame rate is required. Matrix array transducers have a higher frame rate, allowing for accurate analysis of the needle motion dynamics. In this chapter, we introduce a technique to localize the needle in the 3D image using a standard needle and a 3D matrix array transducer, with minimal prior information. Stylus motion is the principle of our technique, which is inspired by the well-known but relatively un-researched ad-hoc technique of jiggling the needle and looking for a change in the US display. Operators often jiggle the needle because human vision is sensitive to motion, and the tissue moves with the needle.
However, this effect disappears when the motion stops, and jiggling may result in small cuts to the tissue. Such needle motion is therefore not recommended for many applications, such as spinal anesthesia. In the proposed approach, as the stylus is withdrawn, the acoustic impedance of the cannula changes, resulting in a small change in the B-mode images. There is motion to detect, but the cannula remains fixed; this approach is therefore inherently safer. We thus propose to use stylus motion as a safe way of producing ultrasonic intensity changes in order to enhance needle visibility [79].

This chapter describes a real-time needle trajectory and tip localization method designed with consideration toward clinical utility on a standard US machine. The method is implemented for 3DUS, which could also overcome some of the challenges involved in hand-eye coordination in 2D imaging [80–83]. Matrix array transducers are capable of capturing 3D difference data in real time, so we have used the Philips X6-1 transducer. To avoid the constraints of using a needle guide, the method is designed for free-hand needle insertion with arbitrary insertion angle and depth. A motion detection method is designed to look for the intensity variation resulting from the moving stylus inside the cannula. The needle localization algorithm then localizes the needle in the 3D volume using two orthogonal image projections. Over a set of continuous movements, including a fully inserted stylus, we are able to define the needle trajectory and the needle tip location.

The simple idea underlying the proposed method is to oscillate the stylus while looking at differences between subsequent images, so as to reveal small changes in the image due to the moving stylus. For the sake of illustration, MIP of 3D B-mode volumes is used to visualize the needle in three different tissue types, as shown in the top row of Figure 4.1. These volumes were captured once the stylus was fully inserted in the cannula.
The bottom row depicts the MIP of absolute differences from pairs of volumes as the stylus undergoes an oscillation. Comparing the two sets of images, one can appreciate the significant effect of stylus motion on needle visualization, and the benefit of working with image differentials.

Figure 4.1: MIP of B-mode volume (top row) and absolute difference volume (bottom row) for: (a) bovine muscle, (b) porcine muscle and (c) bovine liver. The difference operation removes most of the anatomical echoes, leaving mainly the needle.

4.2 Materials and methods

Static detection of a needle is usually a challenging task, due to existing artifacts and anatomical structures with high echogenicity that can appear as straight lines, such as fat, vessels and bone surfaces. In addition, the needle does not always appear as a line-like structure in the image; static approaches therefore usually fail in such difficult cases. We thus propose a 3D localization technique based on the analysis of the motion dynamics induced by an oscillatory stylus moving through the cannula, illustrated in Figure 4.2. The significance of frame differences in identifying intensity changes due to stylus motion is illustrated in Figure 4.5: while it is difficult to tell from the B-mode data that motion exists, it is obvious in the frame differences.

Figure 4.2: Illustration of stylus motion.

4.2.1 Method overview

The diagram of our 3D needle localization method using a moving stylus is depicted in Figures 4.3 and 4.4. To reduce the computation time, the user first indicates the side of the probe on which the needle will be inserted, and whether the insertion is at the long or short axis of the 3D transducer, so that a smaller search space is needed. Although this requires some user input, it is similar to current US interfaces using a needle guide or beam steering to highlight the needle.
(1) With each motion of the stylus, two subsequent volumes are acquired. (2) Next, the absolute difference volume, with highlighted intensity changes due to stylus motion, is obtained. (3) MIP is used to reduce the 3D search to two 2D searches. (4a) On the first projected image, candidates undergo a histogram analysis to remove outliers, a HT to find a rough estimate of the needle plane, and a detailed analysis of the pixels surrounding the rough estimate, followed by a convex hull calculation and a linear fit. (4b) On the second projected image, candidates go through a histogram analysis, reverberation suppression, and a HT, followed by a detailed analysis of the pixels surrounding the obtained coordinates and a linear fit. (5) Combining the results of (4a) and (4b), the 3D needle trajectory and tip location are computed.

After each pair of volumes is processed in "real time" (with the given acquisition capabilities), the 2D plane containing the needle is displayed, showing the needle. The trajectory and the tip are highlighted to help with the detection of poorly visible needles. The needle plane and the localization results are updated with the streaming of each new pair of volumes.

Figure 4.3: Diagram of the projection-based 3D needle localization approach.

Figure 4.4: Detailed description of the projection-based 3D needle localization approach.

4.2.2 Stylus motion

The needle, with its stylus fully inserted, is initially inserted into the tissue by the operator. The operator then starts pulling the stylus back to a distance (3 cm) from the tip, and pushing it back in. Repeated oscillation of the stylus (≈ 2-3 oscillations) is preferred. With the current research platform of data streaming out of the Philips iU22 system (with the X6-1 xMATRIX transducer) that we used, a volume rate of up to ≈ 2-3 vol/s could be achieved. For the task of visual localization, we consider at least 5 mm of stylus motion between each two volumes. A simple calculation shows that the stylus oscillation rate should be above 1/3 Hz for the intensity changes due to the motion to be detected. This oscillation rate is quite practical and is a comfortable rate at which the operator can move the stylus.

4.2.3 Absolute volume difference

A simple way to detect motion is to capture the intensity differences between consecutive US volumes. Measuring the differences has several advantages compared to the analysis of the individual volumes. We found that a fine movement of the stylus surrounded by a high-intensity cannula is not easily visualized in a single B-mode volume, but it can be detected reliably using the difference of the volumes. In addition, since speckle noise is dependent on the distribution of small reflectors along the US path, the speckle pattern should remain fixed [84]. The difference image therefore has a reduced level of speckle.

The US transducer captures a sequence of 3D volumes scan-converted to a Cartesian grid of voxels. The voxel intensity at time t is defined by V^t(x,y,z), mapping the 3D voxel space to the gray-scale space. At time t, when the new volume is captured, the absolute difference of the current volume and the previous volume is computed as ∆V^t(x,y,z) = |V^t(x,y,z) − V^{t−1}(x,y,z)|.
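The per-voxel absolute difference maps directly onto array operations; below is a minimal illustrative sketch (assuming 8-bit B-mode volumes stored as NumPy arrays, with a cast to a signed type to avoid uint8 wrap-around). The function name and toy data are our own, not part of the original implementation.

```python
import numpy as np

def abs_diff_volume(v_curr, v_prev):
    """Absolute difference |V^t - V^(t-1)| of two 8-bit B-mode volumes.

    Casting to int16 before subtracting avoids uint8 wrap-around.
    """
    d = v_curr.astype(np.int16) - v_prev.astype(np.int16)
    return np.abs(d).astype(np.uint8)

# Toy example: a single bright "stylus" voxel moves between volumes.
v0 = np.zeros((4, 4, 4), dtype=np.uint8)
v1 = np.zeros((4, 4, 4), dtype=np.uint8)
v0[1, 1, 1] = 200   # stylus echo at time t-1
v1[2, 1, 1] = 200   # stylus echo at time t
dv = abs_diff_volume(v1, v0)
print(dv[1, 1, 1], dv[2, 1, 1])  # 200 200: motion highlighted at old and new positions
```

Static anatomy cancels in the subtraction, which is why the difference volume is dominated by the moving stylus.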
The difference volume is then passed to the next step to obtain the 3D coordinates of the needle.

4.2.4 Maximum intensity projection

To minimize the computational complexity, and also to account for speckle noise and artifacts, the 3D localization problem is efficiently simplified to two 2D localizations. For each volume difference, the projection images are constructed by orthographic MIP. MIP is a rendering technique that, instead of integrating every sample value based on opacity, selects only the voxels that have the maximum intensity along the direction of projection. MIP is widely used in medical image analysis for visualizing the location and topology of thin, bright anatomical structures such as lung nodules and blood vessels, the aim being to highlight these structures relative to other anatomy. Because of its computational simplicity, MIP is already implemented in real time on most 3DUS machines.

MIP maximizes the contribution of high-luminance voxels containing needle echoes and minimizes the contribution of opaque voxels without needle echoes. Although any arbitrary orthogonal projections could be used, for the sake of simplicity and to reduce computational complexity, we choose the two coordinate axes of the matrix array transducer as the projection directions of the orthographic projections (x and y).

Figure 4.5: Absolute difference illustration at 4 stylus locations. Consecutive B-mode frames are on the left and the corresponding absolute frame differences are on the right. The reflection from the approximate stylus tip is indicated with an arrow.

Figure 4.6: Example of a short-axis (SA) needle insertion, versus long-axis (LA).
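Each orthographic MIP is a single max-reduction over one array axis; the following sketch (our own illustration, assuming the difference volume is indexed as (x, y, z)) shows the two projections used here.

```python
import numpy as np

def mip(volume, axis):
    """Orthographic maximum intensity projection along one coordinate axis."""
    return volume.max(axis=axis)

# Toy difference volume with a bright diagonal "needle" track in the plane x = 4.
dv = np.zeros((8, 8, 8), dtype=np.uint8)
for i in range(8):
    dv[4, i, i] = 150

im_x = mip(dv, axis=0)         # projection along x: reveals the needle line, finds its plane
im_y = mip(dv, axis=1)         # projection along y: locates the needle within that plane
print(im_x.shape, im_y.shape)  # (8, 8) (8, 8)
```

In `im_x` the needle appears as a bright diagonal, while `im_y` shows in which x-plane the echoes lie, together giving all three coordinates.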
The MIP along the x axis, applied to the 3D difference volume ∆V, is obtained as follows:

MIP(∆V^t)(y, z) = max_x ∆V^t(x, y, z),  (x, y, z) ∈ Z³  (4.1)

The 3D volume is projected along the x-axis (lateral direction) and the y-axis (elevational direction) to obtain two 2D projection images. Without loss of generality, we assume that the needle is inserted from the short axis (Figure 4.6). The first projection direction therefore results in im^t_x, which is used to obtain the 2D needle coordinates and identify the needle plane, while im^t_y is used to identify the needle on that plane, determining the third coordinate for 3D localization. If the needle were inserted in the adjacent quadrant, x and y would be switched.

Stack of projections

After repeated movements (≈ 4) of the stylus, including its full insertion into the cannula to the tip, a set of candidates is obtained to determine the needle tip and trajectory. At each oscillation, the MIP images are added to a stack of projected images, forming two sets of images (stack_x and stack_y). Another MIP is performed on stack_x and stack_y to form two projection images (mip_x and mip_y) highlighting the changes corresponding to the various stylus locations. Figure 4.7 illustrates this step.

Figure 4.7: MIP of orthogonal projections.

4.2.5 Histogram analysis

In addition to high-intensity changes due to the stylus motion, the difference volume can contain non-zero voxels due to other factors, such as tissue motion from cardiac, respiratory or external forces. Other artifacts, such as reverberation, can be present as well. The difference image therefore contains some values that we consider false positive points, and our aim is first to minimize the number of false positives. We observe that changes due to anatomical motion are usually of low intensity; thresholding is therefore a suitable step to separate pixels with and without the needle. A histogram
Materials and methods0 255Pixel gray valueNumber of pixelsTBSmall tissue motionBackgroundTailFigure 4.8: Sketch of the typical histogram of the MIP of difference volume.analysis of the pixel intensity distribution in mipx and mipy is performed toobtain a threshold value to create binary images. According to the histogrammorphology of the MIP of a difference volume (Figure 4.8), the followingobservations are obtained. There is an initial bin for the low intensity val-ues of the histogram, corresponding to the black boundaries of the B-modeUS data. There is a wide peak at the lower half of the histogram mainlycorresponding to small tissue movements. The tail of the histogram consistsof the needle, high intensity reverberation artifacts and large local/globaltissue motions.A suitable threshold retains the needle pixels and omits non-needle pix-els. In order to automate the process for various sets of data, we choosethe threshold value based on the cumulative sum of the number of pixels(n) grouped in 256 bins (0-255). First, the initial bin at the beginning ofthe histogram, containing a large pixel population (due to the black imagebackground), is discarded from the data. The normalized cumulative sumfor bin b is then calculated as follows.Cb =∑bi=1 ni∑255i=1 ni(4.2)where i is the bin index, ni is the number of pixels with intensity i.An acceptable threshold value is obtained where the energy of the sig-534.2. Materials and methodsnal (simply the cumulative sum of the samples) reaches a portion of themaximum energy, called the binarizing threshold (TB).It = (∆Imini |Ci > TB, ∀i ∈ I) (4.3)where Ci is the normalized cumulative sum up to intensity i and ∆Imini is theintensity of the first pixel satisfying the equation on the projected differenceimage. It is therefore the intensity value corresponding to the first pixelwhose cumulative sum is greater than the binarizing threshold value. Aftera systemic analysis on a few test sets, TB = 99% is selected. 
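A minimal sketch of this automatic thresholding, assuming 8-bit projection images (illustrative, not the thesis code):

```python
import numpy as np

def binarizing_threshold(mip_image, t_b=0.99):
    """Intensity I_t at which the normalized cumulative histogram first
    exceeds the binarizing threshold T_B (Eqs. 4.2-4.3); the initial
    background bin is discarded before normalizing."""
    counts, _ = np.histogram(mip_image, bins=256, range=(0, 256))
    counts = counts[1:]                     # drop the black-background bin
    c = np.cumsum(counts) / counts.sum()    # normalized cumulative sum C_b
    return 1 + int(np.argmax(c > t_b))      # first intensity with C_i > T_B

# Binarize a projection image with the selected threshold.
rng = np.random.default_rng(0)
mip = rng.integers(0, 60, size=(64, 64))    # low-intensity tissue motion
mip[10:12, 20:40] = 250                     # bright needle-like streak
i_t = binarizing_threshold(mip)
bw = mip >= i_t                             # binary candidate image
```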
The threshold is then used to binarize the projection image, B_t = \{p \in P : \Delta I_t(p) \ge I_t\}, where p is a pixel in the projection space P and B_t is the binary image at time t containing the set of candidates with intensity larger than I_t, the intensity threshold at time t.

At the end of this stage, two binary images are obtained representing the needle in two orthogonal projections (mip_x → bw_x and mip_y → bw_y). bw_x is passed to the needle plane identification step to identify the first 2D needle coordinates as well as the plane that the needle falls in. bw_y is sent through the reverberation suppression and needle tip localization steps to locate the needle in the plane.

4.2.6 Reverberation suppression

Reverberation occurs when at least two highly reflective interfaces are encountered along the path of the US beam, such as the boundaries of a needle or its metal-tissue interfaces. In our experiment, as the stylus moves up and down inside the cannula, the reverberation artifact changes along the length of the needle at each time instant and is not fully removed by thresholding. Instead, knowing the side of the transducer on which the needle was inserted, the presented method looks for the first edges along the beam path and picks the first copy of the needle while omitting the remaining copies. Suppression is applied to bw_y without loss of generality; for needles inserted in the adjacent quadrant, suppression is applied to bw_x.

4.2.7 Discontinuity connection

A HT is applied on the bw_x and bw_y images to identify linear structures among the candidates and to obtain an approximation of the needle trajectory. The angular range of the HT is set to 20°–80°, which is the practical extent of insertion angles in US-guided interventions.

Figure 4.9: Description of the Discontinuity Connection step. The cross in red shows the needle tip detected before the correction step, and the black dot shows the final tip detected by the correction step.

The HT, however, gives only an approximate estimate of linear structures among the candidates and reduces the search region. After the line coordinates are identified by the HT, to obtain a more accurate measure of the needle direction, voxel intensities are measured within a T_X × T_Z channel surrounding the HT coordinates. Based on preliminary tests, a 2 mm wide and 20 mm tall channel around the HT coordinates was selected: 2 mm is chosen based on the needle width, and 20 mm is based on the largest observed gap between the echoes belonging to the shaft in the difference image. Note that the algorithm is not sensitive to these parameters, as shown later in the results. The aim of this step is to find these detached structures and consider them as needle features; see Figure 4.9.

4.2.8 Convex hull calculation and linear fit

The next step is first to omit outliers far from the calculated axes of bw_x and bw_y, using a thresholding distance of 2 × T_X. This distance is chosen according to the needle width, the aim being to maintain a reasonable range around the estimated needle axis while removing distant outliers.

Figure 4.10: An example of a projection difference image appearance in the presence of random stylus reflections and reverberation artifacts (labeled features: reverberation artifacts, convex hull, stylus reflections, boundary points A and B).

Depending on the stylus orientation within the cannula, there might be some reflections of the stylus tip on top of the needle shaft, and reverberation artifacts could also occur underneath the shaft. If strong, these reflections could deviate the calculated needle direction. We thus use the convex hull (the smallest convex set of candidates) of the remaining set of pixels to discard any non-needle image features. The convex hull is implemented in many software packages; we have used the implemented function in MATLAB®.
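As a rough Python counterpart of this hull step (an illustrative sketch with synthetic candidate pixels, not the thesis code):

```python
import numpy as np
from scipy.spatial import ConvexHull

def fit_needle_axis(candidates):
    """Reduce binary-image candidate pixels to their convex hull, then fit
    a straight line x = a*z + b through the hull vertices (cf. Sec. 4.2.8)."""
    hull = ConvexHull(candidates)
    hull_pts = candidates[hull.vertices]    # smallest convex set of candidates
    a, b = np.polyfit(hull_pts[:, 0], hull_pts[:, 1], deg=1)
    return a, b

# Candidate pixels scattered around the line x = 0.5*z + 3.
z = np.linspace(0.0, 40.0, 30)
pts = np.column_stack([z, 0.5 * z + 3.0])
pts += np.random.default_rng(1).normal(scale=0.5, size=pts.shape)
slope, intercept = fit_needle_axis(pts)
```

In the thesis, outliers beyond 2 × T_X from the hull axis (A, B) are additionally removed before the final fit; that filtering is omitted here for brevity.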
For M points P_1, \dots, P_M, the convex hull is given by the following expression:

CH \equiv \left\{ \sum_{i=1}^{M} \lambda_i P_i : \forall \lambda_i \ge 0, \ \sum_{i=1}^{M} \lambda_i = 1 \right\}   (4.4)

A schematic of the difference projection image in this case is shown in Figure 4.10. Points A and B represent the outer boundary points of the convex hull. Outliers far from the line axis (A, B) are omitted (beyond 2 × T_X), and a straight line is then fit to the candidate set to determine the final computed needle axis. The 2D coordinates of the needle tip and shaft obtained from bw_x and bw_y determine the 3D needle coordinates and the tip in the 3D volume. As seen in the results section, the needle trajectory is measured with respect to the x axis (θ′) and the needle tip is measured according to the obtained 3D coordinates (P′(x, y, z)).

4.2.9 Experimental setup

Image acquisition

3DUS images for this study were obtained using an iU22 US imaging system with an X6–1 (1–6 MHz) xMATRIX array transducer. Philips provided a research interface to access the volumetric data. To validate the method, we tested it on various tissue types. Freshly excised bovine muscle, bovine liver and porcine muscle were purchased from a local butcher on the same day as the study. Bovine muscle was used as a highly echogenic medium, porcine muscle had medium echogenicity, and bovine liver provided a fairly low echogenicity. A mounting device was used to hold the transducer in place, and the needle was held by hand. The acoustic and imaging parameters ("abdomen general") were kept constant, and only the imaging depth setting was varied, from 50 mm up to 90 mm. A total of 60 sets of volumes were obtained, 20 sets for each tissue type, each set containing volumes over a range of stylus tip locations within the cannula. A standard 17 gauge Tuohy spinal/epidural needle (B. Braun Medical Inc., Bethlehem, PA, USA) was inserted at various insertion depths (50–90 mm) and various insertion angles.
Because needles are less visible when inserted at steeper angles, we included the following insertion angles in our test data set: 60% of volumes had the needle inserted at a steep angle (55°–80°) and 40% at a moderate insertion angle (30°–55°), relative to the skin. Steep angles are often used in epidural anesthesia, while other procedures such as sciatic blocks, other anesthesia, or biopsies also fall in this range. Moderate angles best fit femoral nerve block injections. The experimental setup, depicting the transducer coordinate axes, is shown in Figure 4.11.

Gold standard

For ease of gold standard (GS) identification, without loss of generality, the needle was inserted such that it lay in a native lateral/elevational slice of the scan-converted volume. A sonographer was asked to independently review the slices offline and select the slice containing the brightest image of the needle. She then identified the needle tip and shaft on the needle plane. Manual identification of the needle was further verified by our proposed GS verification approach. We used a combined spinal & epidural anesthesia 17G Tuohy needle. Such a needle has a cannula that offers needle-through-needle insertion. The appearance of this needle is exactly the same as the regular 17G Tuohy needle; however, the distal orifice allows us to perform our GS validation technique.

Figure 4.11: Experimental setup; needle inserted at the short axis in the liver tissue ex vivo.

For the validation we use a 20G ChibaSono echogenic needle (Pajunk, Geisingen, Germany). After the data is captured, the standard stylus is removed from the needle and the 20G echogenic needle (the same diameter as the stylus of the 17G needle) is inserted inside the cannula. The 20G echogenic needle is inserted 2 cm past the Tuohy needle tip and, being echogenic, it allows clear needle tip/shaft identification in the volume, Figure 4.12.
We chose this GS over other possible GSs (CT registration to 3DUS, optical tracking and transparent phantoms) because it offers an indicator of both tip and trajectory in the same US coordinates as the needle measurements made with the proposed algorithm.

Performance metrics

The localization accuracy was evaluated in terms of shaft and tip detection error. Shaft accuracy was obtained from the angular deviation between the needle direction calculated by the algorithm and the true needle direction measured by the sonographer on the needle plane. Tip accuracy was obtained by calculating the deviation in the 3D needle tip detection. Figure 4.13 illustrates the needle plane and the performance metrics based on the insertion direction with respect to the transducer in long axis (LA) and short axis (SA).

Figure 4.12: Gold standard: 20G echogenic needle within the 17G combined spinal & epidural cannula (inserted 2 cm past the tip).

Figure 4.13: Diagram illustrating the performance metrics when the needle is inserted through (a) SA and (b) LA of the transducer.

• The true needle direction is identified as the angle that the needle shaft makes with the x axis (θ). The angular deviation between the true manually segmented needle (θ) and the automatically obtained trajectory (θ′) in the needle plane (∆θ = θ′ − θ) represents the accuracy in the needle direction.

• The true tip location is determined from the endpoints of the segmented lines on the two projection images (P(x, y, z)). Tip deviation (∆P) between the true needle tip (P) and the tip obtained using the algorithm (P′) is calculated as the 3D Euclidean distance between the two points.

Validation parameters were computed for 20 trials for each tissue type (bovine and porcine muscle and bovine liver) at various insertion lengths (50–90 mm), various machine depth settings (50–90 mm) and different insertion angles (moderate and steep), evenly distributed.
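These two metrics reduce to simple vector arithmetic; a hypothetical numeric sketch (coordinates in mm, angles in degrees; the values are illustrative, not from the experiments):

```python
import numpy as np

def needle_errors(p_true, p_est, theta_true, theta_est):
    """Angular deviation of the trajectory in the needle plane and the
    3D Euclidean tip deviation between true and estimated tips."""
    delta_theta = theta_est - theta_true
    delta_p = float(np.linalg.norm(np.asarray(p_est) - np.asarray(p_true)))
    return delta_theta, delta_p

dt, dp = needle_errors((10.0, 5.0, 40.0), (10.6, 5.0, 40.8), 62.0, 63.1)

# Per-tissue summaries as reported in the tables: mean, SD and RMS.
errs = np.array([1.0, 1.1, 1.2])
mean, sd, rms = errs.mean(), errs.std(ddof=1), np.sqrt((errs ** 2).mean())
```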
For each tissue type, the mean, SD and root-mean-square (RMS) of the validation parameters were calculated.

Parameter selection and sensitivity analysis

The effect of the parameters involved in the method was studied in the following manner. To analyze the method's sensitivity to the parameters, each parameter was varied over a range of values while keeping all other parameters constant. For each parameter value, the method's performance was tested on 6 randomly selected data sets (2 from each tissue type). Results are presented as the range of parameter values that yields at most an acceptable mean error. The 95% CI of needle placement accuracy is determined according to clinical acceptance. The key parameters are T_B (binarizing threshold) and T_X and T_Z (channel size for needle tip and shaft processing).

Tissue motion analysis

To analyze the performance when tissue motion is present, a compression tool was designed to simulate small intrinsic body motion by gradually compressing the tissue, as shown in Figure 4.14. The clamped needle guide is used to hold the needle and the probe in place, and the compression is introduced by advancing the lead screw of the clamp. To test the performance, an offline version of the software was used to allow start-stop data capture, saving the data for each compression level.

Figure 4.14: Tissue motion simulator setup.

The tissue was compressed gradually up to 2 mm, and each compression cycle was divided into 16 levels of compression. At each compression level, the stylus was oscillated and 8 volumes were captured at different locations of the stylus, ranging from fully inserted to fully removed.

Volumes were grouped based on the compression levels, where the volumes within a group were taken from consecutive compression levels. For each of the stylus locations, a volume was randomly chosen from the corresponding group.
The proposed approach was then tested on the groups of volumes taken from bovine muscle and liver tissue specimens, ranging from the volume with no compression to a volume with maximum compression.

Bovine muscle tissue consists of striated linear structures belonging to muscle layers. If the deformation is large, these closely-spaced structures may result in misidentification of the tip and shaft. Bovine liver, on the other hand, is more homogeneous, with some hypo-echoic inclusions, which seem to affect the detection technique less: the inclusions are not closely spaced compared to the stylus motion increments, and the method ignores them as non-needle points.

For the experimental tests, the transducer was held with a mounting device, since a rigid setup was needed for GS verification of the needle position. Note that a setup with a single operator and a transducer holder could be adjusted according to the application. Alternatively, one may use a needle guide on the transducer to hold the needle, so a mounting device would not be needed. A two-operator scenario is also an option. We investigated these two cases, with the operator's hand kept as steady as possible by bracing against an object (e.g. the patient's body).

4.3 Experimental results

Needle localization accuracy results of the proposed method are summarized by the mean, SD and root-mean-square (RMS) of the errors, as listed in Table 4.1. Needle shaft accuracy is presented by ∆θ, σ∆θ and RMS(∆θ) in degrees, and needle tip accuracy by ∆P, σ∆P and RMS(∆P) in mm. The needle detection error is below the clinically accepted error (Section 1.2). Given that there is less interest in long-axis needle insertion, due to the poorer elevational resolution compared to the lateral resolution and the smaller field of view in the elevational-axial plane, only short-axis insertions were tested.

Table 4.2 shows the accuracy results for low/high depth settings (50–70 mm / 70–90 mm).
Needle shaft accuracy is again represented by ∆θ, σ∆θ and RMS(∆θ) in degrees, and needle tip accuracy by ∆P, σ∆P and RMS(∆P) in mm. Depth was adjusted according to the insertion length; therefore, low/high depth settings correspond to low/high insertion lengths, respectively. Table 4.3 lists the results for shallow/steep insertion angles, using the same error measures.

Sensitivity analysis shows that for 97% < T_B < 99%, 1 mm < T_X < 10 mm and 10 mm < T_Z < 30 mm, the algorithm accuracy lies within the acceptable error range.

In Figure 4.15, examples of the projection views of the volumes with maximum allowed tissue displacement are shown for bovine liver and muscle tissue. Experimental analysis shows that up to 2 mm of compression results in mean errors of (0.8°, 1 mm) and (0.5°, 0.5 mm) in the trajectory and tip localization on bovine muscle and bovine liver, respectively. For much larger motion of tissue with strong linear US features, the method needs adjustments to avoid incorrect identification of the moving stylus, which is the aim of future extensions of the method.

Table 4.1: Localization error for three different tissue samples, for all insertion angles, needle lengths and depth settings. ∆θ, σ∆θ and RMS(∆θ) (degrees) are the mean, SD and RMS of the angular deviation, and ∆P, σ∆P and RMS(∆P) (mm) are the mean, SD and RMS of the needle tip deviation, respectively.

Tissue          | ∆θ   | σ∆θ  | RMS(∆θ) | ∆P     | σ∆P    | RMS(∆P)
Bovine Muscle   | 1.0° | 1.1° | 1.2°    | 1.1 mm | 0.6 mm | 1.2 mm
Porcine Muscle  | 0.9° | 0.8° | 1.2°    | 1.1 mm | 0.4 mm | 1.1 mm
Bovine Liver    | 1.4° | 0.7° | 1.5°    | 0.8 mm | 1.1 mm | 1.2 mm

Table 4.2: Localization error for low/high depth settings (50–70 mm / 70–90 mm), averaged over the three tissue types.

Depth setting | ∆θ   | σ∆θ  | RMS(∆θ) | ∆P     | σ∆P    | RMS(∆P)
Low Depth     | 1.1° | 1.0° | 1.5°    | 0.9 mm | 0.4 mm | 1.0 mm
High Depth    | 1.1° | 0.9° | 1.4°    | 0.9 mm | 0.3 mm | 1.2 mm

Table 4.3: Localization error for shallow/steep insertion angles, averaged over the three tissue types.

Insertion angle | ∆θ   | σ∆θ  | RMS(∆θ) | ∆P     | σ∆P    | RMS(∆P)
Shallow Angle   | 1.0° | 1.0° | 1.4°    | 1.1 mm | 0.5 mm | 1.1 mm
Steep Angle     | 1.3° | 1.1° | 1.6°    | 1.0 mm | 0.5 mm | 1.1 mm

Figure 4.15: MIP results on the stacks of projections along the lateral and elevational directions of the output for volumes taken from (a) bovine liver and (b) bovine muscle. The arrow indicates the correct needle tip, which is also detected by the algorithm.

Figure 4.16 and Figure 4.17 show the MIP results for the outputs of three pairs of volumes captured in the two-operator and needle-guide scenarios, respectively. As shown, needle localization from the multiple oscillations of the stylus achieves acceptable accuracy, despite the inherent artifacts. Figure 4.18 depicts the needle localization results and the GS for each case.

Figure 4.16: MIP results on the stacks of projections along (a) lateral and (b) elevational directions of the output, for three pairs of volumes in the two-operator scenario.

Figure 4.17: MIP results on the stacks of projections along (a) lateral and (b) elevational directions of the output, for three pairs of volumes with the use of a needle guide.

Figure 4.18: Localization results for (a) the two-operator scenario and (b) the needle guide. The solid line represents the result of the algorithm and the dotted line is the GS.

4.4 Discussion and conclusion

In this chapter, we introduced a 3D needle localization technique using a moving stylus. A motion detection method is designed to look for intensity variations resulting from the moving stylus inside the cannula. The needle localization algorithm then localizes the needle in the 3D volume using two image projections. Over a set of continuous movements, including a fully inserted stylus, we are able to determine the needle trajectory and the tip location with average accuracy ranges of (0.9°–1.4°) and (0.8–1.1 mm), respectively. The intra-observer variability of the gold standard's manual annotations was assessed by repeating measurements 10 times on a sample dataset, and was found to be 0.32° ± 0.28° in the insertion angle and 0.46 ± 0.30 mm in the tip.

The proposed approach works best with matrix array transducers due to their high 3D frame rate. The method could, however, be modified for motorized transducers using a start-stop type of motion. The technique appears to introduce negligible risk to the patient because the outer cannula remains stationary. According to the results of this feasibility study, the method is able to identify an invisible needle in a highly echogenic medium (bovine muscle).

The framework is computationally efficient, running on a personal computer (4 GHz processor and 16 GB RAM) with un-optimized MATLAB® code in 1.51 ± 0.07 seconds on average for three stylus displacements in three pairs of volumes. Although this speed matches our data acquisition speed (≈ 2–3 volumes per second), we could further improve it by employing C programming and parallel processing.
By analyzing each step, we found that the two projections and their corresponding processing consume about 75% of the computational time. These steps are amenable to parallel computing and multicore processing.

The stylus motion could be automated, but this has the drawback of needing custom apparatus, electronics and mechanical equipment, as well as a power supply. As stated, one of our main goals has been to develop software-based, low-cost tools to guide epidurals in obstetric anesthesia, so issues of electrical safety and workflow suggest that standard hardware is preferred.

For future extensions of the method, a registration step could be used to align parts of the projection images based on the anatomical deformation, with the absolute difference then applied on the registered images. This, however, would add computational cost and complexity. The aim of the proposed method is to provide a less complex real-time technique mainly for applications that involve low intrinsic body motion, such as spinal anesthesia.

There may be cases where the proposed method would not work. If there are strong reflectors near the needle, or the needle is too deep in the tissue, the difference in needle echoes from a moving stylus may be too weak for detection. In these scenarios, the stylus motion may not be detectable. The limitations of the proposed approach are therefore as follows. In a single-operator scenario it is difficult to oscillate the stylus while holding the needle and the transducer at the same time; this is especially difficult for shallow angles.
The proposed solution is therefore designed for (1) deep insertions at any insertion angle, where the needle is already anchored in the tissue (no needle guide is needed), and (2) needle guides, which limit flexibility in the insertion direction but are applicable to both shallow and deep insertions.

The motivating findings in this chapter demonstrate the significance of analyzing motion dynamics to reveal information useful for needle detection that was otherwise hidden in the static image. Due to the limitations of the stylus motion approach, however, our motivation for the following chapters is to propose a more clinically-suitable method based on the motion signatures of the needle on the tissue. The remaining chapters focus on investigating motion features of subtle movements of the needle from tremor-like hand motion as a more efficient solution for US-guided interventions.

Chapter 5

Optical Flow Analysis of a Hand-held Needle: Fixed Imaging

5.1 Introduction

In Chapter 4, analysis of the motion dynamics was demonstrated to reveal clues about the needle location that were otherwise invisible in the static image. This motivated us to continue investigating micro-motion analysis for needle detection. Manual stylus motion requires both hands to be dedicated to the needle, which may not be satisfactory for some applications such as epidurals, because it may require an assistant to lend a hand. Our proposal is to design a single-operator, clinically-acceptable framework that incorporates motion signatures of the needle without explicit motion creation.
In this chapter, we aim to detect an invisible hand-held needle (without separate motion of the stylus) by searching for regions that move coherently under hand tremor motion.

Due to the easier interpretation of 2D data, and the fact that current clinical demand is inclined towards 2DUS imaging, 2DUS is utilized for the rest of this thesis, although the methods could be extended to 3DUS, especially by adopting matrix array transducers. We aim to solve the challenges arising from applications requiring steep insertion angles, such as epidurals, where the angular range of directional beams from the curvilinear transducer further reduces the needle echo, without using additional hardware or specialized needles.

Our proposed approach is based on measuring and analyzing the spectral properties of small displacements arising from the natural tremor motion induced on an inserted hand-held needle. Normal (physiologic) hand tremor is high-frequency (6–12 Hz) and low-magnitude (< 0.15 mm), as measured by an accelerometer [86, 87]. This type of tremor is present in every normal person and is barely visible to the naked eye. We propose to use the hand tremor induced on the needle, which results in vibratory motions, as a way of creating unique needle signatures in the tissue.

Block matching (BM) and optical flow (OF) are two commonly used methods for motion estimation in ultrasound [88]. We compared both methods and found empirically that OF (based on consistent flow in pixels in close proximity) obtains more accurate results than BM (based on normalized cross-correlation); this is also discussed in detail for radio-frequency data in [89].

(This chapter is adapted from [85]: P. Beigi, R. Rohling, T. Salcudean and G. C. Ng (2016). Spectral analysis of the tremor motion for needle detection in curvilinear US via spatiotemporal linear sampling. International Journal of Computer Assisted Radiology and Surgery, 11(6), 1183-1192.)
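For intuition, the tremor band appears as a spectral peak in a displacement time series. The following synthetic sketch (frame rate and signal parameters are assumed, not measured data) recovers a 9 Hz tremor component with Welch's method:

```python
import numpy as np
from scipy.signal import welch

fs = 60.0                                   # assumed B-mode frame rate (Hz)
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
# 0.1 mm tremor at 9 Hz on top of slow drift and measurement noise.
disp = (0.1 * np.sin(2 * np.pi * 9.0 * t)
        + 0.3 * np.sin(2 * np.pi * 0.5 * t)
        + 0.02 * rng.standard_normal(t.size))

f, pxx = welch(disp, fs=fs, nperseg=128)
band = (f >= 6.0) & (f <= 12.0)             # physiologic tremor band
peak_hz = float(f[band][np.argmax(pxx[band])])
```

Comparing two such displacement series, as done later in this chapter to group coherently moving regions, can be sketched along the same lines with scipy.signal.coherence.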
We have therefore used OF analysis, performed in a block-based framework, to estimate the motion dynamics and detect the subtle motion arising from hand tremor. The idea is to use the time-series data of the displacement map, obtained using OF analysis, which forms the periodic tremor signal A_t \sin(\omega_t t). Regions with a similar motion pattern, i.e., those moving coherently, are then determined using our spectral correlation analysis, forming the cloud of pixels moving under tremor motion. Finally, the proposed spatio-temporal linear-sampling technique localizes the needle from the obtained regions with coherent motion.

5.2 Materials and methods

5.2.1 Method overview

An overview of our method is shown in Fig. 5.1. The entire pipeline of the proposed approach can be summarized in four main procedures: (I) automatic detection of the insertion site, (II) displacement map computation for a region of interest (ROI) surrounding the insertion site, and (III) coarse estimation of the needle trajectory as a trajectory channel, followed by (IV) fine trajectory localization within the detected channel. The building blocks of each procedure are listed in more detail in Fig. 5.1: (1) The reference (first) frame is taken from the stream of B-mode images. (2) An estimate of the insertion site is automatically identified on the reference frame, using dyadic wavelets of row-based first derivative of Gaussian (FDoG) filter responses. (3) An ROI is then defined automatically around the estimated insertion site. (4) The ROI is divided into smaller regions called macro-blocks (MBs), and (5) the block containing the estimated insertion site is labeled as the target MB. (6) The displacement map is computed for all MBs in the current frame versus those in the reference frame by (7) computing the spatial and temporal gradients.
(8) A least-squares fit is used to estimate the flow parameters from the overdetermined system of gradients in a regularized form. (9) Spectral coherency is computed for the average of displacements in each MB versus that of the target MB. (10) Blocks with high spectral coherency in the tremor frequency range are segmented, and (11) the trajectory channel is estimated based on the selected MBs. (12) Spatio-temporal linear sampling using oriented sample paths is performed within the trajectory channel, and spectral coherency is computed for the axial displacements of each spatio-temporal sample path versus that of the estimated puncture site. (13) The trajectory is finally estimated from the sample paths with maximum coherence within the tremor range.

5.2.2 Approximate puncture site identification

Due to the convex shape of the transducer and the steep insertion angle, the part of the shaft closest to the transducer face is the most visible part, which usually appears as a tiny edge. The puncture site could be provided manually by the operator or assumed for a given procedure; however, we detect it automatically so that the solution is fully automatic. To estimate the puncture site in the US image, a search space is first identified covering the possible locations of the insertion site. Without loss of generality, we assume that the needle was inserted from the right side of the image. The FDoG filter is applied on the rows of the image. Row-based processing detects oblique edges and filters out most of the unwanted edges, because tissue boundaries perpendicular to the beam direction are those creating the highest echoes. The speckle noise present in the US image can result in a noisy edge at the insertion site. Large-scale filters are robust to speckle noise, but they may filter out details. Since the insertion site might be only a tiny edge, the chosen filter should (1) not filter out fine details and (2) not be susceptible to speckle.
The idea of scale products was exploited in [90], where it was shown that scale multiplication can detect major edges better while suppressing noise. The product of the filtered images is computed for FDoG dyadic filters at scales S_1 and S_2 = 2S_1 to improve the detection and localization of the needle edge in the presence of speckle noise. The dyadic dilation of a function \phi(x) at scale S_i is given by:

\phi_{S_i}(x) = \frac{1}{S_i}\,\phi\!\left(\frac{x}{S_i}\right)   (5.1)

The row-based FDoG dyadic wavelet transform of the reference frame at scale S_i is defined by I(x, y, t_0) \ast \phi_{S_i}(y), where (x, y) is the 2D location, I(x, y, t_0) is the B-mode intensity at the reference frame, and \phi(y) = -y\,e^{-y^2/2} is the row-based FDoG.

Figure 5.1: Block diagram of the proposed approach. (I) The insertion site is estimated automatically on the reference frame. (II) The displacement map is computed within the ROI in a block-wise manner using an OF approach. (III) The trajectory channel is estimated according to the block segmentation results of the spectral coherency between the displacement map of each MB and that of the target MB. (IV) The needle trajectory is detected from the spectral coherency between spatio-temporal linear sample paths and the trace of the estimated puncture site within the trajectory channel.

An oblique mask covering the possible locations of insertion is next applied on the product of the filtered responses.
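The scale-product edge response can be sketched as follows; the kernel support and scales are assumed for illustration (this is not the thesis code):

```python
import numpy as np

def fdog_kernel(scale, half_width=4):
    """Row-based FDoG, phi(y) = -y * exp(-y^2 / 2), dyadically dilated:
    phi_S(y) = (1/S) * phi(y/S)."""
    y = np.arange(-half_width * scale, half_width * scale + 1, dtype=float)
    return (1.0 / scale) * (-(y / scale) * np.exp(-((y / scale) ** 2) / 2))

def scale_product(image, s1=2):
    """Product of row-wise FDoG responses at scales S1 and S2 = 2*S1;
    multiplying scales keeps major edges while suppressing speckle-like noise."""
    k1, k2 = fdog_kernel(s1), fdog_kernel(2 * s1)
    r1 = np.apply_along_axis(np.convolve, 1, image, k1, mode="same")
    r2 = np.apply_along_axis(np.convolve, 1, image, k2, mode="same")
    return r1 * r2

# Toy frame with a step edge in one row; the product peaks near the edge.
img = np.zeros((16, 64))
img[8, 30:] = 100.0
prod = scale_product(img)
col = int(np.argmax(prod[8]))   # column of the strongest row-wise edge response
```

Applying the oblique insertion-site mask described in the text and taking the arg-max of the masked product would then give the approximate puncture site.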
This mask is simply a binary, angled rectangle at a fixed position with respect to the fan shape of the B-mode image. The pixel with the maximum wavelet transform and minimum y (closer to the insertion site) in the masked wavelet transform is chosen as the approximate insertion site.

5.2.3 ROI identification

While explicit motion analysis is computationally costly, estimating the tremor motion should provide the needed information with minimal computational complexity. To investigate the dynamics of the tremor motion induced on the tissue surrounding the needle, we first compute the displacement field using a gradient-based approach on the time-varying intensity image [91, 92]. To derive an approximation to the displacement field, we compute the OF of each frame with respect to the reference frame in a block-based approach. Detecting subtle movements due to tremor is difficult using direct motion estimation approaches such as block matching or phase correlation performed globally on the image. With OF, on the other hand, it is possible to track only the local changes and thus produce the local motion vector more accurately [92].

A rectangular ROI with a size proportional to the imaging depth is automatically defined, with its top right corner at the estimated insertion site. The motion estimation approach tracks the position of pixels in several frames with respect to the reference frame and creates a displacement map for each MB.

5.2.4 Optical flow calculation

A critical assumption in the general OF calculation is the brightness constancy constraint: the intensity of a point remains the same between one frame (t) and the next (t + dt), despite the small change in the point's location. This can be summarized in (5.2), using a first-order Taylor series approximation:

I(x + dx, y + dy, t + dt) \approx I(x, y, t) + \frac{\partial I}{\partial x}dx + \frac{\partial I}{\partial y}dy + \frac{\partial I}{\partial t}dt   (5.2)

where I : \mathbb{R}^3 \to \mathbb{R} is the B-mode intensity and (dx, dy) is the displacement vector between two frames that are dt apart.
Following (5.2), the brightness constancy constraint can be written as:

I_t + u I_x + v I_y = 0    (5.3)

where u = dx/dt and v = dy/dt are the lateral and axial components of the OF, and (I_x, I_y) and I_t are the spatial and temporal gradients with respect to position (x, y) and time t, respectively.

The spatial gradients I_x and I_y are approximated in the lateral and axial directions of the B-mode image using the FDoG masks along the rows and columns of the input frames. The temporal gradient is estimated simply as the difference of the Gaussian-smoothed images at two frames:

I_i(x, y, t) = I(x, y, t) ⊗ g_i   (i = x, y),
I_t(x, y, t) = I(x, y, t) ⊗ g − I(x, y, t − 1) ⊗ g,    (5.4)

where ⊗, g_i and g denote 2D convolution, the FDoG with respect to i (x or y), and a 2D Gaussian smoothing function (with SD σ_G), respectively.

To estimate the flow parameters (u, v) in (5.3), a constraint needs to be added to the under-determined system of equations. For this purpose, we use the approach proposed by Lucas and Kanade [92] in a regularized form. The Lucas–Kanade approach resolves the inherent ambiguity of the OF equation by adding a constant-flow constraint on the neighboring pixels within a defined patch Ω (Fig. 5.2). To satisfy the brightness constancy constraint, a normalization step is then applied on each patch. The local flow estimation approach works well due to the physical proximity of pixels within a patch, and also performs consistently and robustly compared to other OF estimation methods [93]. Applying the constant-flow constraint to every pixel of the patch yields the stacked system:

u I_x(x_c − m, y_c − n, t) + v I_y(x_c − m, y_c − n, t) = −I_t(x_c − m, y_c − n, t)
   ⋮
u I_x(x_c + m, y_c + n, t) + v I_y(x_c + m, y_c + n, t) = −I_t(x_c + m, y_c + n, t)    (5.5)

where (x_c, y_c) is the coordinate of the center pixel of a (2m + 1) × (2n + 1) patch within the block of interest.
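Solving the stacked system (5.5) for one patch reduces, in the regularized least-squares form introduced in Section 5.2.5, to a 2 × 2 linear solve. A sketch in Python, where the Gaussian patch weight and the default constants are illustrative assumptions rather than the thesis settings:

```python
import numpy as np

def lk_patch_flow(Ix, Iy, It, sigma_w=2.0, tik_c=100.0):
    # Regularized Lucas-Kanade estimate (u, v) for one patch:
    # (S + Tik_c * I) nu = -t, with Gaussian weights favouring the patch centre.
    h, w_ = Ix.shape
    yy, xx = np.mgrid[0:h, 0:w_]
    cy, cx = (h - 1) / 2.0, (w_ - 1) / 2.0
    w = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2.0 * sigma_w ** 2))
    S = np.array([[np.sum(w * Ix * Ix) + tik_c, np.sum(w * Ix * Iy)],
                  [np.sum(w * Ix * Iy), np.sum(w * Iy * Iy) + tik_c]])
    t = np.array([np.sum(w * Ix * It), np.sum(w * Iy * It)])
    u, v = np.linalg.solve(S, -t)  # S is invertible thanks to the Tikhonov term
    return u, v
```

With a strong gradient and a small regularizer, a pure 0.5-pixel lateral shift is recovered almost exactly; when Tik_c is large relative to the gradient energy the estimate shrinks toward zero, which is the stabilizing behaviour the regularization is meant to provide.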
For our analysis we use a square patch with m = n = (1/2)(Block_size/flow_res − 1), where flow_res is the number of flow components in each dimension of the block.

To solve (5.5), the least squares principle can be used, but it resulted in numerical instability in preliminary results, so a regularized least squares approach is used instead.

Figure 5.2: Block-based analysis of OF, showing the region of interest, a macro block and a patch.

5.2.5 Regularized least squares

The least squares approach looks for the best fit to the data by minimizing an error term. Adding a regularization term to the traditional LK approach increases the stability of the algorithm in cases where constant or symmetrical intensities are present. For this purpose, Tikhonov regularization [94] is used, which is a common method of regularization for ill-posed problems (other regularizers may also be suitable). The Tikhonov regularization term Tik_c |(u, v)|² (with constant Tik_c) is therefore added for stability as follows:

E(u, v) = Σ_{∀i∈Ω} w(x_i, y_i) [(u, v)^T · ∇I(x_i, y_i, t) + I_t(x_i, y_i, t)]² + Tik_c |(u, v)|²    (5.6)

where ∇I(x, y, t) = (I_x, I_y) and w is a Gaussian weighting function giving more weight to the center of the region.

The least squares estimate minimizes the total error term in (5.6). The minimizer of E(u, v) is obtained as follows:

∇E(u, v)^T = Σ_{∀i∈Ω} w(x_i, y_i) { (u I_x² + v I_x I_y, v I_y² + u I_x I_y)^T + (I_x I_t, I_y I_t)^T } + Tik_c (u, v)^T = 0    (5.7)

where ∇E = (∂E/∂u, ∂E/∂v).

Equation (5.7) can be written in matrix form S ν = −t, with S, t and ν = (u, v) denoting the spatial gradient terms, the temporal gradient terms and the velocity vector:

( Σ w I_x² + Tik_c     Σ w I_x I_y      )        ( Σ w I_x I_t )
( Σ w I_x I_y          Σ w I_y² + Tik_c )  ν = − ( Σ w I_y I_t )    (5.8)

Tik_c is added along the diagonal of S in (5.8), ensuring that it is well-conditioned and invertible. The lateral and axial components of the flow ν are finally derived for each patch using vectorization of the analytically computed S⁻¹.

5.2.6 Spectral coherence

As shown in Fig.
5.2, the ROI covering the initial section of the needle shaft is divided into several square MBs, with the target MB at the estimated puncture site. For each MB, the trace of the displacement map over time is obtained from the average of the flow parameters of all patches within the MB over several frames. The key concept is that a region containing the needle has greater spectral correlation with the reference block at the tremor frequency than a region without the needle. To obtain the coherence between two non-stationary signals A and B in general, time-frequency coherence [95] is defined as follows:

C_AB(t, f) = |S_AB(t, f)|² / (S_A(t, f) S_B(t, f))    (5.9)

where S_A, S_B and S_AB are the Wigner–Ville spectra [95] of A and B and their cross spectrum, respectively. B-mode data in a sequence of frames contain variations due to tremor, speckle and intrinsic body motion. This time series can be approximated as a stationary signal in time, since its statistical parameters are constant over time. The time-frequency coherence therefore simplifies to spectral coherence, and the Wigner–Ville spectrum and the cross spectrum in (5.9) are replaced by the power spectral density (PSD: the Fourier transform of the auto-correlation) of d_A and d_B (the average displacement at blocks A and B) and their cross PSD, respectively.

Based on empirical findings, the displacement along one dimension contains sufficient spectral information and is computationally more efficient than dealing with 2D displacement. Spectral coherence is therefore computed for the axial displacement between the reference block and every other block in the ROI. The axial direction is chosen (1) due to the poor lateral resolution in B-mode, which also affects the lateral flow resolution, and (2) due to the steep inclination
of the insertion, where the axial component provides a better estimate of the motion. A simple segmentation step labels blocks based on the maximum value of the spectral coherence in 2–5 Hz, with a threshold value of 85%, although any threshold in the range of 75% to 90% would likely also work. Fig. 5.3 shows an example of the coherency analysis on a sample set, illustrating the relative behavior of blocks at different distances from the blocks containing the needle. Coherence in the tremor frequency range drops significantly for blocks of interest further away from the needle.

Figure 5.3: Illustration of the spectral coherence (within the tremor frequency range of 2–5 Hz) dropping significantly as the block of interest moves away from the needle: (a) A frame of the B-mode US image, with the sample blocks, pulsating vessels and the detected needle trajectory as overlays. (b) Spectral coherence plots for corresponding blocks at different distances from the needle trajectory: Δd_i is the lateral distance of the block of interest i from the nearest block containing the needle shaft, in terms of the number of blocks (each block is 2 mm × 2 mm in size).

5.2.7 Trajectory detection

Coarse Estimation: Approximate Channel

From each block segmented based on the spectral coherence, four points are selected as the endpoint coordinates of the trajectory channel boundaries (U1, U2, L1, L2 in Fig. 5.4(d)) to increase the angular range of the field of view. The trajectory channel is identified by the upper and lower boundary
lines, labelled as U-line and L-line, respectively. The start coordinates of the U-line and L-line are selected at the top left and bottom right corners of the reference block (violet in Fig. 5.4(d)). The end coordinate of the U-line is chosen as the top left corner of the labelled block with the minimum row number among the labelled blocks with the minimum column number (red in Fig. 5.4(d)). The end coordinate of the L-line is chosen as the bottom right corner of the labelled block with the maximum column number among all labelled blocks with the maximum row number (blue in Fig. 5.4(d)). The U-line and L-line are extended over the span of the entire image depth, and thereby define the boundaries of a channel surrounding the needle trajectory. Fig. 5.4 illustrates these steps graphically and uses a flowchart to show the importance of a channel selection that spans the entire needle depth.

Figure 5.4: Illustration of the block-based trajectory estimation: (a) Schematic drawing of the imaging setup and a B-mode frame containing the needle. (b) Flowchart showing the estimation of the trajectory channel. (c) Schematic drawing showing the needle candidates overlaid on the MBs segmented from the spectral coherence results (gray). (d) Selected MBs for the trajectory channel and endpoints of the channel boundaries. (e) Obtained line segments forming the trajectory channel and (f) extended channel boundaries spanning the entire imaging depth.
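The block labelling of Section 5.2.6 can be sketched with a Welch-type coherence estimate; `scipy.signal.coherence` stands in for the PSD/cross-PSD ratio, and the frame rate, band and threshold defaults mirror the values quoted in the text, while the segment length is an illustrative choice (the traces are per-MB axial displacements):

```python
import numpy as np
from scipy.signal import coherence

def label_blocks(ref_trace, block_traces, fs=30.0, band=(2.0, 5.0), thresh=0.85):
    # Label each MB by whether its axial-displacement coherence with the
    # reference (target) MB peaks above `thresh` inside the tremor band.
    labels = []
    for trace in block_traces:
        f, cxy = coherence(ref_trace, trace, fs=fs, nperseg=min(64, len(trace)))
        in_band = (f >= band[0]) & (f <= band[1])
        labels.append(bool(cxy[in_band].max() > thresh))
    return labels
```

A block whose displacement shares the tremor component with the target MB scores near unity in the 2–5 Hz band, while an unrelated tissue block does not, reproducing the behavior shown in Fig. 5.3.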
Fine Estimation: Needle Trajectory

After limiting the region of analysis to an appropriate channel, we now introduce the key approach for localizing the needle trajectory based on coherent motion along the cannula. Motion patterns along oriented spatio-temporal linear paths are used to detect the needle trajectory, based on a search for the trajectory direction through the estimated channel. Since the tremor motion is induced on the cannula and is therefore conveyed mainly along the insertion direction in the B-mode image, our hypothesis is that the motion pattern along the trajectory direction results in the highest coherence compared to other directions.

This stage is a search through all possible needle trajectories that pass through the estimated insertion site and stay within the estimated trajectory channel. In detail:

• The trace of the displacement at the estimated puncture site is obtained over time along win_size frames. This is referred to as the reference path from now on.

• Spatio-temporal linear sample paths originating from the estimated puncture site are chosen at sample orientations spanning the trajectory channel within the ROI (dashed lines in Fig. 5.5).

• The displacement pattern for the pixels in each path is extracted from the flow field already calculated in the coarse estimation step.

  – An array of displacements is formed for each path by picking the displacement value of pixel position i (starting at the puncture site) between frame i + 1 and the reference frame (i = 1, 2, ..., win_size − 1).

  – i is incremented until the last element is taken from the end of the line segment (i = l_p).

  – If the length of the sample path (l_p) is shorter than win_size, copies of the pixel positions for i ≤ l_p are also added for the (win_size − l_p) remaining samples. The array of displacements is then filled with the displacement of the corresponding pixel position for frame i + 1 with respect to the reference frame (i = l_p + 1, l_p + 2, ...
, win_size − 1).

• The spectral coherency is derived for the sample paths with respect to the reference path.

• The needle trajectory is identified as the mean of the sample paths with the maximum coherence within the tremor frequency range.

Fig. 5.5 shows sample results. For an example frame, the spatio-temporal linear paths originating from the estimated puncture site are shown overlaid on the B-mode image. The samples on each path are also plotted with respect to pixel-coordinate cross-sections along the path segments and consecutive frames.

Figure 5.5: Sample result of porcine tissue in vivo: (a) Spatio-temporal sample paths overlaid on top of a frame of the B-mode US image. (b) Sampling instants for each sample path as well as the reference path, with respect to position and frame.

5.2.8 Experimental analysis and setup

US images for this study were obtained using an iU22 US imaging system with a C5–1 (1–5 MHz) curved array transducer (Philips Ultrasound, Bothell, WA, USA). For the experiments, the transducer was fixed and the inserted needle was held by hand. The acoustic and imaging parameters ("abdomen general") were kept constant, and the imaging depth was within 50 mm to 60 mm. A total of 20 image sequences were obtained, each sequence containing about 6×SR frames to be adequate for temporal analysis. Given a practical sampling rate to avoid aliasing (on average 6 times the maximum frequency) and a maximum detectable tremor frequency (5 Hz), the minimum acceptable frame rate is 30 Hz, referred to as the sampling rate (SR). This is not a limitation considering the B-mode frame rate for practical depth settings of the C5–1 transducer.
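The sampling-rate bookkeeping above is simple arithmetic; a sketch treating the oversampling factor and frequency resolution as inputs (a 1/6 Hz resolution reproduces the 6×SR window length used in the experiments):

```python
def min_frame_rate(f_tremor_max=5.0, oversample=6.0):
    # Minimum acceptable B-mode frame rate: comfortably above Nyquist
    # for the highest tremor frequency of interest.
    return oversample * f_tremor_max

def window_size(sr, freq_res):
    # Number of frames for a given spectral resolution: win_size = SR / FR.
    return int(round(sr / freq_res))
```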
Table 5.1: Parameter selection for experimental analysis.

  win_size   ang_res   Block size   flow_res   σ_G   Tik_c
  6×SR       1°        2 mm         5          2     100

Several frames of 2D B-mode data were captured for several insertions in the biceps femoris muscle of an anesthetized pig. The porcine trial was conducted at an animal lab at the Jack Bell Animal Research Facility, Vancouver (UBC animal care # A11-0223). In all of the captured sequences, the focus was on the presence of a pulsating vessel in the field of view. A standard 17 gauge Tuohy epidural needle (Arrow International, Reading, PA, USA) with a metal stylus was inserted at various steep insertion angles (50°–80°) and depths (40 mm–60 mm). The chosen range of insertion angles was steep, so that only the initial part of the needle shaft at the insertion site was visible in the B-mode image.

The method was implemented on a personal computer (4 GHz processor and 16 GB RAM) with un-optimized MATLAB® code. To evaluate the performance, the needle trajectory calculated by the proposed method is compared against a GS. For validation, all insertions were made such that the needle tip (despite the invisible shaft) as well as the insertion site were visible. This was achieved by rotating the Tuohy needle about its long axis to increase the tip echo while maintaining the invisibility of the shaft. In order to define a segmented GS for the analysis, all insertions were made in-plane. Instead of segmenting the needle on one frame, we introduce here a more accurate way of calculating the GS: we take into account the influence of the tremor itself on the GS, by analyzing how the tremor potentially changes the angle of insertion among frames. To test this, the needle was manually segmented by a sonographer in the two frames with maximum displacements (in opposite directions) with respect to the reference frame, obtained from the block-based OF analysis.
The GS is thus defined as the mean of the identified insertion angles at the frames with maximum displacements for each data set.

Table 5.1 shows the parameter selection for the experimental analysis. Parameters were selected according to the geometry of the setup and preliminary tests. The window size is selected according to the sampling rate (US frame rate) and the frequency resolution (FR): win_size = SR/FR. Empirically, we found that a window size of 6×SR gives reasonable resolution in frequency as well as sufficient length for PSD estimation. An angular resolution (ang_res) of 1° is chosen, which provides acceptable resolution for detecting the needle trajectory in the fine estimation step, as well as reasonable computational efficiency. A suitable block size (blk_sze) was obtained empirically and is listed in mm to automatically form the ROI. flow_res, the number of flow components in each dimension of the block, was set to 5, resulting in 25 flow vectors within each block. σ_G, the SD of the 2D Gaussian smoothing function, and Tik_c, the Tikhonov regularization constant, were set empirically.

5.3 Experimental results

Fig. 5.6 shows an example of the displacement fields as well as the corresponding coherency functions for a block of interest and the reference block at the top corner of the image. Note the high coherency for displacement arising from hand tremor motion on the needle, as well as the distinguishable high-frequency response of the intrinsic body motion versus that of the needle. Fig. 5.7 shows a few frames of the B-mode image corresponding to a hand-held needle in an agar phantom. The trajectory detected by the proposed technique is overlaid on top of each image. A key observation is the ability to detect the needle at the mid-shaft location when there is essentially no echo present in the B-mode image.

Quantitative results are reported in terms of the mean, SD (σ) and root-mean-square (RMS) measurements.
Table 5.2 describes the accuracy of the needle trajectory by reporting the mean, SD and RMS of the trajectory deviation (Δθ) of the presented method from the GS, averaged over 20 sequences of images with the needle inserted in the biceps femoris muscle of in vivo porcine tissue. The needle detection error is below the clinically accepted error (Section 1.2).

Analysis of the maximum angular change due to tremor, obtained by manual needle segmentation on the frames with maximum absolute displacement with respect to the reference frame, shows mean, SD and RMS measurements of 0.43°, 0.23° and 0.48°, averaged over the 20 data sets.

5.3.1 Parameter Sensitivity

To evaluate parameter sensitivity, the method was re-tested on 5 randomly selected data sets over a range of each of the six parameters, keeping the other parameters constant. Sensitivity was given as the range of parameter values resulting in a maximum acceptable mean error. The mean, SD and RMS errors were calculated as specified in Table 5.3. While needle width, frequency resolution and image size are taken into account, a parameter is considered insensitive if the errors deviate little from the values in Table 5.2. The acceptable error was defined as 4° for the sake of illustrating sensitivity [40].

This section is also adapted from [96]: P. Beigi, T. Salcudean, R. Rohling and G. C. Ng (2015). Needle detection in US using the spectral properties of the displacement field: a feasibility study. Proceedings of SPIE Medical Imaging, vol. 9415, pp. 94150U-6.

Table 5.2: Needle trajectory detection accuracy on porcine experiments in vivo (n=20).

  Δθ      σ_Δθ    RMS(Δθ)
  2.83°   1.64°   3.23°

Table 5.3: Sensitivity analysis for parameter selection on n = 5 randomly selected data sets (errors in degrees).

  Params     win_size   ang_res     blk_sze    flow_res   σ_G         Tik_c
  [Values]   [4,8]×SR   [0.5,1.5]   [1,3]      [4,6]      [1.5,2.5]   [80,120]
  Δθ         2.88       2.90        3.11       3.93       3.32        2.93
  σ_Δθ       1.69       1.80        2.35       2.00       1.90        1.48
  RMS(Δθ)    3.30       3.35        3.83       4.36       3.78        3.25
We have also analyzed the detection results when the overall imaging gain is varied, while keeping the other parameters constant. The results are summarized as mean, SD and RMS errors in Table 5.4.

Table 5.4: Needle trajectory detection results averaged over the same porcine data in vivo (n=20), varying the overall imaging gain. (% is the percentage of the intensity saturated to 0 (–) and 255 (+).)

  Gain variation (%)   Δθ      σ_Δθ    RMS(Δθ)
  −15                  4.16°   2.48°   4.53°
  −10                  3.13°   1.87°   3.39°
  −5                   2.92°   1.83°   3.10°
  +5                   2.86°   1.74°   3.28°
  +10                  2.78°   1.71°   3.57°
  +15                  3.04°   1.58°   3.41°

Figure 5.6: Displacement field of a block at a random frame with respect to a reference frame (top row) and the corresponding coherency map of the lateral displacement (bottom row), for: (a) the abdominal aorta and (b) a hand-held needle. The reference window in (a) contains only tissue, whereas in (b) it contains the needle.

Figure 5.7: A sequence of frames displaying the results of the presented method in detecting a hand-held needle in an agar phantom.

5.4 Discussion and conclusion

In this chapter, a new needle detection technique for US-guided interventions, based on spatio-temporal analysis of the tremor motion, is proposed. Our method detects the needle from minute periodic displacements on a linear path. To obtain the target MB, the puncture site is automatically estimated, but it may also be provided manually by the operator or assumed for a given procedure, without much loss of functionality. The trajectory is estimated as a linear path in the image having maximum spectral correlation with the time trace of the displacement due to tremor.

For comparison, there are few other methods that can localize an unmodified needle that is nearly invisible by eye in the image.
For the sake of illustration, since the tip was purposefully made visible for validation, and a short superficial portion of the shaft is barely visible, the HT was applied to these features. A HT tuned to 50°–80° was applied to the binary representation of each frame after manual thresholding to maximize the needle response. The angular deviation error (Δθ) was 14.75°. Using the visible tip, the mean distance (Δp) from the detected trajectory was 46.52 mm. Compared to the proposed approach (Δθ = 2.83° and Δp = 1.88 mm), the large HT errors show the challenge of detecting nearly invisible needles based on intensity features alone. The intra-observer variability of the gold standard's manual annotations was assessed by repeating the measurements 10 times on a sample dataset, and was found to be 0.68° ± 0.47° in the insertion angle and 0.63 ± 0.29 mm in the tip position.

The computational efficiency of the framework was tested on a personal computer (4 GHz processor and 16 GB RAM) with un-optimized MATLAB® code. The overall computational time depends mainly on the window size and the number of blocks. In general, the OF computation consumes about 70% of the total computational time, where the flow field computation of each MB in one frame with respect to the reference frame takes 0.0021 seconds. To give an example: for a sampling rate of 33, win_size = 198, blk_sze = 2 mm and flow_res = 5, the total processing time is about 0.97 seconds for each MB and 9.43 seconds overall with un-parallelized block processing. The flow computation for all frames and MBs is amenable to parallel computing and multicore processing to improve speed.

We specifically chose to investigate curvilinear transducers because several commercial US systems already have beam-steering solutions for improving needle visibility with linear array transducers.
The proposed approach, however, should still be applicable to both types of transducers; the only difference for a linear transducer is to revise the step for estimating the puncture site relative to the transducer. Since the OF computation is based on brightness constancy, the imaging settings need to stay constant during the localization step.

For the experimental tests, the transducer was fixed with a mounting device, since a rigid setup was needed for GS verification of the needle position. In general, as with any dynamic motion detection technique, the steadiness of the medium results in a more accurate detection. In practice, the transducer can be kept steady by bracing the hand holding it over the few seconds of the needle localization process. Even if a tiny tremor is present, it would mostly be in the lateral direction and would minimally affect the overall OF computation, and especially the spectral coherency of the axial flow components.

Alternative motion detection tests were also performed with power Doppler imaging to try to detect the tremor motion. According to our findings, Doppler is not sensitive enough to detect the tremor motion. This is mainly because the small amplitude and frequency of the tremor motion, with displacement dis = A_t sin(ω_t t), produce a small velocity, ∂dis/∂t = A_t ω_t cos(ω_t t), which is not adequate to create a detectable Doppler shift. The minimum detectable velocity is also inversely proportional to the lower frequency of the transducer, making this even more challenging for curvilinear transducers. In addition, the frequency shift is further attenuated by the cosine factor for steep insertions, since variations parallel to the ultrasound beam produce the maximum Doppler shift. Although Doppler did detect portions of the shaft occasionally, it was not found reliable enough for needle localization, especially when intrinsic tissue motion is present.
Since the tremor is measured from displacements in the B-mode data, the limited spatio-temporal sampling in B-mode image formation could cause our measurements to capture only the larger tremors, and to see power only in the 2–5 Hz range.

In summary, the natural periodic tremor motion of a hand-held needle can be used to detect the needle using spectral analysis of the displacement field, within clinically acceptable accuracy. There are still a few limitations of our system despite its functionality. Direct analysis of the tremor motion, without incorporating prior knowledge, can result in localization inaccuracy, because the surrounding tissue also moves minutely with the needle at the tremor frequency. In addition, the block-based analysis required in the presented method increases the computational complexity and limits real-time implementation. The puncture site also has to be estimated initially for the spectral correlation, which could introduce inaccuracy if errors are made in its estimation.

Chapter 6

Multi-scale Phase-based Analysis of a Hand-held Needle: Fixed Imaging

6.1 Introduction

As shown in Chapter 5, the natural periodic tremor motion of a hand-held needle can be used to detect the needle using spectral analysis of the displacement field. Direct analysis of the tremor motion without incorporating prior knowledge, however, can result in localization inaccuracy, because the surrounding tissue also moves minutely with the needle at the tremor frequency. The required puncture site estimation and the block-based processing of the method presented in the previous chapter are two of its limiting factors.

Motion calculation from intensity variations is challenging in the presence of artifacts and complex boundaries, and is also inefficient for real-time clinical applications. In this chapter, we aim to detect the needle by estimating tremor from phase variations instead of the magnitude.
Phase contains more information for an accurate image reconstruction and is also more robust to artifacts than amplitude. Our method was originally inspired by motion magnification work developed for natural videos [97, 98], whose pyramid formation for spatial decomposition and temporal filters have been adapted to our case. This original approach is extensively modified: (1) to detect the needle from the subtle motion arising from tremor rather than magnifying motion, and (2) to extract features to be used in a machine learning-based framework. After analyzing the phase variations of the spatially decomposed image over time, the needle can be detected by finding a specific pattern in the temporally filtered data. We also integrate this approach into a new learning-based framework to segment a hand-held needle in US B-mode images. Our method relaxes some of the requirements involved in the direct motion computation, and can be trained for specific clinical applications.

The machine learning (ML) approach is, in general, superior to direct approaches in parameter tuning (after the model is trained), is more robust to variations of the imaging parameters, and also has the potential to outperform direct methods. We hypothesize that formulating needle localization as a classification problem makes the needle detection more robust to imprecise assumptions and incorrect detections. In an epidural application, for instance, an ML approach can learn the hand tremor pattern of specific operators and potentially improve the performance accordingly. The hypothesis is that ML can improve the detection accuracy by distinguishing the needle from the surrounding tissue in close contact with it. This is possible with the proposed time-domain statistical features, along with probability-based classification and probability-weighted localization.
The second key aspect is that prior knowledge may be automatically incorporated in the framework by training on application-specific data (e.g., different tremor patterns), to improve the detection accuracy in that application. We tested our algorithm on an intramuscular insertion experiment, treated as one such clinical application. The experimental analysis is performed in vivo with a pulsating vessel in the field of view, for a more realistic evaluation of the presented method. Our focus is on cases where the needle is essentially (for much of its length) invisible to the eye, so an approach based on motion analysis is warranted. The method is developed for standard needles inserted at mid-to-deep depths (40 mm–60 mm) with mid-to-steep insertion angles (50°–80°), which generally have poorer visibility.

6.2 Materials and methods

6.2.1 Phase-based motion estimation

This section is adapted from [99]: P. Beigi, T. Salcudean, R. Rohling and G. C. Ng (2016). Automatic detection of a hand-held needle in ultrasound via phased-based analysis of the tremor motion. Proceedings of SPIE Medical Imaging, vol. 9786, pp. 97860I-1.

Recall the shift property of the Fourier transform, stating that a displacement in time/space induces a phase shift proportional to the displacement and frequency. This relationship between motion and phase shift can be used as an efficient way of estimating motion more robustly and accurately from the phase instead of the magnitude. The estimated motion is then processed to segment a hand-held needle with minute tremor displacement from other sources of displacement, such as intrinsic body motion. Our proposed method is summarized in Fig. 6.1 and is described in detail in the following. Note that subtle movements due to tremor may be hard to detect from
the gray-scale images, but can be detected with the following framework. Starting from the Fourier theorem, the intensity profile of a frame of a B-mode sequence at time t and spatial position (x, y) can be written as follows:

I(x, y, t) = f(x + Δx(x, t), y + Δy(y, t)) ≜ Σ_ν A_ν e^{i2πν(x + y + Δ_xy(x, y, t))}    (6.1)

where each band contains a complex sinusoid decomposition at spatial frequency Ω = 2πν, Δ_xy(x, y, t) equals Δx(x, t) + Δy(y, t), ν is the spatial frequency, and the A_ν are the Fourier coefficients of I(x, y, t). At each spatial frequency band, which corresponds to a single scale, the displacement term Δ_xy(x, y, t) appears in the phase, which is used to evaluate the motion.

Displacement arising from tremor is in fact a function of both space and time; to estimate such variation from phase changes in the spectral domain, we use a pyramid of complex spatial filter banks to decompose the signal into different spatial frequencies. The phases of the signal filtered with the complex-valued wavelets are then used for estimating motion. Given the nature of the tremor and the angle of insertion, the tremor motion occurs mainly within a certain range of directions. To account for this information, we use the complex steerable pyramid, a special multi-scale pyramid that provides image decomposition at multiple orientations. In such a pyramid decomposition, filters of arbitrary orientation are synthesized as a linear combination of "basis filters" [100]. The basis function of the transform is a pair of steerable even and odd Gabor wavelets, which are sinusoidal waves windowed by a Gaussian envelope.
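One band of such a decomposition can be sketched as a quadrature pair of even (cosine) and odd (sine) Gabor kernels; the local phase is then the angle of the complex (even + i·odd) response. The spatial frequency, envelope width and support below are illustrative choices, not the thesis scales:

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_pair(theta_deg, freq=0.15, sigma=4.0, half=10):
    # Even/odd Gabor wavelets at one orientation: sinusoids under a Gaussian
    # envelope, forming one quadrature band of the steerable decomposition.
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    th = np.deg2rad(theta_deg)
    u = x * np.cos(th) + y * np.sin(th)      # coordinate along the filter axis
    env = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    return env * np.cos(2 * np.pi * freq * u), env * np.sin(2 * np.pi * freq * u)

def local_phase(frame, theta_deg):
    # Local phase and magnitude of the sub-band at this orientation.
    ge, go = gabor_pair(theta_deg)
    resp = fftconvolve(frame, ge, mode='same') + 1j * fftconvolve(frame, go, mode='same')
    return np.angle(resp), np.abs(resp)
```

Shifting the input along the filter axis rotates the local phase by 2πν times the shift (the Fourier shift property the section starts from), which is exactly what the temporal filtering step exploits.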
Although, given enough basis filters, steering at any orientation is possible, fewer orientations are preferred in practice to reduce computational complexity.

Our filter bank is designed such that, with far fewer steerable filter pairs (three, at 70°, 50° and 30°), we mainly detect the variations along the specified range of insertion angles and ignore variations in other directions. The data are decomposed into local amplitudes and phases at each scale and orientation. The local phase is then sent to the temporal filtering step for further analysis.

Figure 6.1: Block diagram of our proposed approach. A reference frame is selected from the input sequence of the B-mode data and all frames are sent as the input to the algorithm (a). Three complex Gabor wavelet pairs at three orientations (70°, 50° and 30°) form the steerable pyramid for spatial decomposition of the sequence (b) into local magnitude and phase measurements (c). Phase differences of all frames from the reference frame are then computed (d) and temporally filtered using an FIR bandpass filter (e). Amplitude weighting is then performed on the filtered phase differences for adjustment in cases of weak magnitude responses (f). Results from all scales and orientations are combined (g) and thresholded via histogram analysis to generate the binary mask for the HT (h). The HT derives an estimate of the trajectory and discards some of the outliers (i).
Polynomial fitting is finally used to removeany remaining outliers and improve the trajectory detection (j). The detected needle is then added to the inputsequence as an overlay (k).896.2. Materials and methodsTemporal filtering and localizationSince each sub-band represents a complex sinusoid, displacement informa-tion can be found in the phase. Motion vectors ∆xy(x, y, t) can be obtainedfrom the temporal trace of local phases differences. To detect motion ata specific tremor temporal frequency, at each scale, the phase differenceΩ(∆xy(x, y, t)) is temporally filtered using a finite impulse response (FIR)bandpass filter centered at the estimated frequency of the tremor νtremor.The FIR filter is designed by windowing the ideal bandpass filter using aHanning window wn = 0.5(1− cos(2pinN−1)), where n is the weight numberand N + 1 is the order of the filter.The next step aims to segment the tremor motion from the filteredphase differences at multiple spatial frequencies. Similar to the idea ofSusceptibility-Weighted Imaging to enhance contrast in magnetic resonanceimaging, we multiply the filtered phase differences by the magnitude re-sponse of the corresponding spatial sub-band at each scale to obtain Awphases. 
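The window-method FIR band-pass just described can be sketched as follows. This is a hedged illustration: the frame rate, tremor frequency, bandwidth and filter order are assumed values, and a Hann window stands in for the Hanning window of the text.

```python
import numpy as np
from scipy.signal import firwin, filtfilt

def tremor_bandpass(phase_diff, fs, f_tremor, half_bw=2.0, numtaps=63):
    """Band-pass each pixel's phase-difference trace around the estimated
    tremor frequency with a Hann-windowed FIR filter (window method).
    phase_diff has shape (T, H, W); time runs along axis 0."""
    band = [max(f_tremor - half_bw, 0.1), f_tremor + half_bw]
    taps = firwin(numtaps, band, pass_zero=False, window="hann", fs=fs)
    # filtfilt gives zero-phase filtering, so the tremor peaks are not delayed
    return filtfilt(taps, [1.0], phase_diff, axis=0)

fs, f_tremor = 30.0, 8.0             # frame rate and tremor frequency: assumed values
t = np.arange(300) / fs
trace = np.sin(2 * np.pi * f_tremor * t) + np.sin(2 * np.pi * 1.0 * t)  # tremor + slow drift
cube = trace[:, None, None] * np.ones((1, 4, 4))                        # toy (T, H, W) stack
filtered = tremor_bandpass(cube, fs, f_tremor)
```

After filtering, the slow 1 Hz component is strongly attenuated while the 8 Hz tremor component passes, which is exactly the isolation the segmentation step relies on.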
This will adjust the shape and contrast of the structures by attenuating the phase differences in regions with weak magnitude responses. Results obtained from the various scales are then combined and thresholded to generate the binary map input for the HT.

In order to automate the process, we choose a dynamic thresholding scheme based on the cumulative sum of the number of pixels (n) grouped in 256 bins (0-255) in the combined response (as discussed in Section 4.2.5). The normalized cumulative sum for bin b is calculated as:

C_b = ( Σ_{i=1}^{b} n_i ) / ( Σ_{i=1}^{255} n_i )    (6.2)

where i is the bin index and n_i is the number of pixels with intensity i. An acceptable threshold value is obtained where the energy of the signal (simply the cumulative sum of the samples) reaches a portion of the maximum energy, called the binarizing threshold (T_B):

I_t = min{ i ∈ I | C_i > T_B }    (6.3)

where C_i is the normalized cumulative sum up to intensity i. I_t is therefore the intensity value corresponding to the first bin whose cumulative sum is greater than the binarizing threshold value. After a systematic analysis on a few test sets, T_B = 99% was selected empirically. Using the HT, an estimate of the trajectory is identified as the longest cluster representing a line. Polynomial fitting is then used to remove the remaining outliers and improve the trajectory detection. The detected needle is finally added back to the input sequence as a set of segmented pixels and is displayed as an overlay.

Note that direct analysis of the tremor motion without incorporating prior knowledge can result in localization inaccuracy, because the surrounding tissue also moves minutely with the needle at the tremor frequency. With the direct approach, all pixels (needle axis, needle boundary or surrounding tissue) are treated the same way, so any of them can be identified as the needle if they move at the tremor frequency.
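The dynamic thresholding of Eqs. (6.2)-(6.3) can be sketched in a few lines. This is one reasonable reading of the scheme (keeping pixels strictly brighter than the selected bin), not the exact implementation.

```python
import numpy as np

def dynamic_threshold(response, t_b=0.99):
    """Binarize a combined response map (values 0-255): keep pixels brighter
    than the first intensity bin whose normalized cumulative count exceeds t_b."""
    img = response.astype(np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)
    cum = np.cumsum(hist) / hist.sum()           # C_b of Eq. (6.2)
    thr = int(np.argmax(cum > t_b))              # first i with C_i > T_B, Eq. (6.3)
    return img > thr, thr

resp = np.zeros((100, 100))
resp[40, 10:60] = 200                            # 50 bright "needle" pixels
mask, thr = dynamic_threshold(resp)
```

With T_B = 0.99, only the brightest roughly one percent of the combined response survives, which is what feeds the HT.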
A machine-learning approach could be used to divide these pixels into different regions, which will be introduced in the next section.

6.2.2 Learning-based motion estimation using ARMA features of the phase

The motivation of this section is to create an ML framework to detect a hand-held needle using micro-motion analysis of US images. We hypothesize that formulating needle localization as a classification problem would make the needle detection more robust to imprecise assumptions and incorrect detection. This framework is presented and tested in a preliminary study to demonstrate that such micro-motion features can be learned and used for needle detection over a range of needle angles and depths. Our key contribution is the use of an SVM classifier with posterior probabilities on our proposed time-domain features, to localize a hand-held needle based on the likelihood of classification.

The SVM was chosen for this application because hand-crafted features can be integrated into the model, and the feature space can be easily changed using kernels. In addition, it can be formulated as a convex optimization, where the existence of a unique solution makes the computation more tractable. Probabilistic outputs can also be obtained from the SVM outputs by Platt scaling, which fits a logistic regression on the classifier output. Posterior probabilities are used as an estimate of the likelihood of needle pixels, which benefits our localization accuracy on the probability map.

Fig. 6.2 and Fig. 6.3 illustrate the schematic diagram of the proposed framework for feature selection and detection of the needle. (This section is adapted from [101]: P. Beigi, R. Rohling, T. Salcudean and G. C. Ng (2017). Detection of an invisible needle in ultrasound using a probabilistic SVM and time-domain features. Journal of Ultrasonics, vol. 78, pp. 18-22.)

Figure 6.2: Illustration of the proposed feature selection pipeline. Phase differences of the local phase measurements obtained from pyramid decomposition of the input frames (a) using oriented Gabor wavelets are computed (b). Phase variations over time are computed for phase values at each spatial location over subsequent frames (c). Features are extracted at various scales and orientations (e) from the sequence of the phase variations (d).

A high-level description of the method is summarized as follows: (1) Compute the local phase of each pixel in the US image over time at every spatial scale and orientation of the steerable pyramid. (2) Extract temporal features from the one-dimensional (partial) auto-correlation function of the phase vectors, to eventually segment the regions with tremor temporal frequency. (3) Train a support vector machine (SVM) classifier on the features obtained from the training images, along with the manually annotated training data-set of the segmented needle. (4) Apply a coarse initial mask, obtained from the superposition of amplitude-weighted (AW) phase responses of the test image sequence, to increase computational efficiency by selecting a ROI. (5) In addition to pixel classification of the test images, obtain probability estimates of the classification via an internal pairwise coupling step performed within the classifier. (6) Apply a HT on the probability map of the unseen test data to localize the needle.
Figure 6.3: Schematic diagram of the needle detection and localization. Based on the obtained feature vectors, an SVM classifier is trained on the selected features in the training data-set and is validated on the test data-set, resulting in the predicted labels l and their probability estimates p. A weighted HT based on the probability estimates is performed on the classified pixels to localize the needle (f).

Feature selection

In the previous section, we developed the framework to estimate motion from phase variations, without explicit motion computation. In this chapter we specifically address steep insertion angles; therefore, a half-octave complex pyramid with an oriented Gabor filter bank at (50°, 65° and 80°) was used to decompose the signal into different spatial frequencies. As mentioned in Section 6.2.1, using the complex-valued wavelets and the Fourier theorem, the intensity profile of a B-mode image at spatial location (x, y) and time t can be decomposed into sinusoids at each Ω = 2πν as follows:

I(x, y, t) = f(x + Δx(x, t), y + Δy(y, t)) ≜ Σ_ν A_ν e^{i2πν(x + y + Δ_xy(x, y, t))}

The trace of Δ_xy at each scale and orientation over t is therefore approximated as a stationary process. A stationary process can generally be described by an Auto-Regressive Moving Average (ARMA) model [102]. We propose to use the statistical parameters of the ARMA model as features to characterize tissue motion and segment the needle.

Time-domain statistical signatures

Time-domain statistical descriptors are used to describe the stochastic trace of phase variations over t. The ARMA model, usually denoted ARMA(p_AR, q_MA), defines a stationary process by AR and MA polynomial terms of orders p_AR and q_MA, respectively.
Time-domain statistics such as auto-correlations are useful tools to characterize the temporal variations, as they represent the unique parameters identifying the model [102]. The tremor motion of a hand-held needle results in minute periodic needle oscillation, and B-mode time series of regions containing the needle convey this oscillatory pattern. The continuous periodic pattern of heartbeat or tremor can be modeled as an auto-regressive process, specifying that the output of the time series depends linearly on its previous values.

This process can be identified by the shape of its Autocorrelation Function (ACF) and its order p_AR, the cutoff lag of the Partial Autocorrelation Function (PACF). The ACF is used to determine the linear dependence of a signal with itself at two different times. The PACF is the partial correlation of the signal with its own lagged values, i.e., the PACF at lag l is the autocorrelation between y_t and y_{t+l} after removing the linear dependence of y_t on y_{t+1}, y_{t+2}, ..., y_{t+l−1}. We therefore propose to extract features from the (P)ACF, to distinguish regions with different types of tissue motion (periodic/aperiodic) and with different periods (tremor/heartbeat). Features are designed manually based on expert knowledge to select distinguishing characteristics. The proposed features are thus chosen such that they represent cyclic characteristics of the motion (e.g. peak distance, peak height and rolling variance), as listed in Fig. 6.4. The order of the partial auto-correlation function is also chosen as a representative of the signal type, which is obtained from the lagged PACF (PACF values at various delay shifts).
Six features that are obtained at each spatial location from all orientations are averaged at each scale and concatenated over both scales to form a 12-D feature vector for the corresponding pixel.

Pixel-based classification

An SVM¹ is used to build the model using the proposed features on a training data-set manually annotated by a sonographer, which is done by solving the following optimization problem:

min ( C Σ_{i=1}^{n} ζ_i + ½‖ω‖² ),  subject to  y_i(ω·φ(x_i) + b) ≥ 1 − ζ_i    (6.4)

where x ∈ R^m is the m-dimensional feature vector for the n pixels in the training dataset, y ∈ R^n is the label vector, and b/‖ω‖ determines the offset of the hyperplane ω·φ(x_i) + b from the origin along ω, the normal vector to the hyperplane. φ is the function mapping the data into a higher dimension, and the slack variables ζ_i and C allow for the misclassification of noisy data, where C is the penalty on the error term controlling overfitting.

¹ Library for SVMs: http://www.csie.ntu.edu.tw/~cjlin/libsvm/

Figure 6.4: Details of the selected features from the statistics of the ARMA model. Characteristics of an ARMA model, obtained from the (P)ACF, are selected as features: rolling variance of the ACF, σ_k (k: sliding window width); number of ACF peaks; mean of ACF peak distances; mean of the peak-minimum height of the ACF (p_m) over F_R frames; max of p_m over F_R frames; order (cut-off lag of the PACF); last significant lag (lag ≥ 2F_R).
The radial basis function (RBF) is used as the kernel due to its ease of initialization (requiring only one parameter) and its classification accuracy for non-linear patterns:

RBF(x_p, x_q) = e^{−γ‖x_p − x_q‖²}    (6.5)

To find ω in equation (6.4), there are only two parameters to set, C and γ: the regularization term and the inverse of the RBF variance, respectively. To map the SVM outputs to needle likelihood estimates, we used Platt scaling, which is calibrated by fitting a logistic regression (sigmoid function) on the SVM outputs as follows:

P(Y = 1 | X) = 1 / (1 + e^{αf(x)+β})    (6.6)

in which P(Y = 1 | X) is the estimated likelihood of needle pixels, f(x) is the decision function, and α and β are estimated using a maximum-likelihood method on the training data and their corresponding scores. The trained SVM is then used to classify the needle and background pixels in the unseen test images.

Pre-processing

The optimal values of the hyper-parameters (C and γ) were determined using a grid-search algorithm based on leave-one-out cross-validation on the cross-validation dataset. We used exponentially growing sequences C = 2^{−10}, ..., 2^{10} and γ = 2^{−10}, ..., 2^{2} (both with 2^2 increments) in a coarse-to-fine search framework to identify the best values in each run. Also, due to the imbalanced class sizes (needle versus background), and our objective of detecting as many needle pixels as possible, we used the same number of background pixels as needle pixels in the training data-set. Background pixels are therefore randomly selected, equal in number to the needle pixels, for each round of classifier training. This process is repeated 10 times and the classification output is reported as the average of the outputs.
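The RBF-kernel SVM with Platt-scaled probabilities and an exponentially growing hyper-parameter grid can be sketched with scikit-learn. This is an assumption-laden toy: the features are synthetic stand-ins for the 12-D (P)ACF vectors, the grid range is reduced for speed, and 3-fold CV replaces the leave-one-out scheme of the text.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for the 12-D feature vectors of background (0) and
# needle (1) pixels; the real features come from the (P)ACF pipeline.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (200, 12)), rng.normal(2.0, 1.0, (200, 12))])
y = np.repeat([0, 1], 200)

# Exponentially growing C and gamma grids (2^2 increments), RBF kernel,
# and probability=True to enable Platt scaling internally.
grid = {"C": 2.0 ** np.arange(-4, 5, 2), "gamma": 2.0 ** np.arange(-8, 1, 2)}
search = GridSearchCV(SVC(kernel="rbf", probability=True), grid, cv=3)
search.fit(X, y)
needle_prob = search.predict_proba(X)[:, 1]   # posterior estimate P(needle | x)
```

The posterior column `needle_prob` is what the subsequent weighted HT step consumes as the probability map.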
To evaluate the effectiveness of the approach in locating the needle, the classification accuracy is measured on each unseen test set.

Needle localization

Following the segmentation step, the entire needle is localized using the probability map p of the data instead of the classified data itself. The posterior probability estimate is used as a weighting coefficient determining the contribution of each pixel to the needle formation. Pixels with higher likelihood estimates therefore have a greater effect on the HT estimate, and the effect of false-positive pixels from the classification step is mitigated. The location r and orientation θ of the line obtained using the weighted HT are assumed to represent the line passing through the needle, obtained through the voting procedure

HT(r, θ) = Σ_{(x,y) ∈ L_{r,θ}} p(x, y)

where L_{r,θ} represents the line formed by r and θ.

6.2.3 Experimental setup

US images for this study were obtained using an iU22 US imaging system with a C5-1 (1-5 MHz) transducer. The acoustic and imaging parameters such as gain and focus were kept constant and the imaging depth was within 50 mm to 60 mm. Data for evaluation were captured from an agar phantom as well as porcine femoris muscle in vivo. Images of 20 different insertions in an agar phantom were initially used to test the direct approach. 20 sets of image sequences were then obtained in vivo, with each set containing about three times the number of frames per second (3×FR) to be adequate for temporal analysis. The data were captured for various insertions in the biceps femoris muscle of an anesthetized pig. The porcine trial was conducted at an animal lab at the Jack Bell Animal Research Facility, Vancouver (UBC animal care # A11-0223). In the captured sequences, there was a pulsating vessel purposefully placed in the field of view.
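The probability-weighted Hough vote described in the localization step above can be sketched directly: each pixel votes with its posterior probability p(x, y) rather than a unit vote. This is a minimal illustration with an assumed probability threshold for the candidate pixels, not the thesis implementation.

```python
import numpy as np

def weighted_hough(prob_map, n_theta=180, min_prob=0.5):
    """Line Hough transform with probability-weighted votes:
    HT(r, theta) = sum of p(x, y) over pixels on the line (r, theta)."""
    h, w = prob_map.shape
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(h, w)))
    acc = np.zeros((2 * diag + 1, n_theta))
    ys, xs = np.nonzero(prob_map > min_prob)
    for x, y, p in zip(xs, ys, prob_map[ys, xs]):
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[r, np.arange(n_theta)] += p            # vote weighted by posterior
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return r_idx - diag, float(np.degrees(thetas[t_idx]))

pmap = np.zeros((50, 50))
pmap[5:45, 20] = 0.9                 # probable needle pixels along the line x = 20
pmap[10, 40] = 0.8                   # an isolated false positive
r_best, theta_best = weighted_hough(pmap)
```

Because the isolated false positive contributes a single low-weight vote spread over many (r, θ) bins, the accumulated line x = 20 still dominates, which is the intended mitigation of classification false positives.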
A standard 17 gauge Tuohy needle (Arrow International, Reading, PA, USA) with a stylus was inserted at various mid-to-steep insertion angles (50° - 80°) and mid-to-deep depths (40 mm - 60 mm).

To evaluate the performance, we measured (1) the classification accuracy of needle and background pixels and (2) the localization accuracy (Δθ, Δp), by comparing the detected final pose of the needle against a GS. For the sake of validation using a GS, all insertions were made in-plane, with the needle tip and the initial section of the shaft made barely visible. This was possible by rotating the Tuohy needle about its LA until the tip created an echo, while maintaining the invisibility of the shaft. To select the GS for localization, a sonographer was asked to draw a line from the insertion site to the tip. To label pixels as needle or background for training, pixels that clearly exhibit the tremor motion in their phase trace over time should be selected. To annotate the image for classification, the sonographer was asked to draw a channel surrounding the needle, and the automatic framework then labels the needle pixels. Pixels classified as needle lying within the channel are considered true positives. Quantitative analysis is reported in terms of the mean and SD (σ) of the measurements.

6.3 Experimental results

Fig. 6.5 shows a sample frame, where the input video contains a sequence of the 17G epidural needle held at a fixed depth in the porcine femoris muscle, with the detected trajectory shown as an overlay. Although the needle shaft is not clear to the naked eye, it can be detected by our proposed algorithm from minute displacements of the tissue surrounding the needle due to the tremor motion.

Quantitative evaluation of the proposed approach was performed on an agar phantom and porcine femoris muscle in vivo, averaged over various insertion angles and insertion depths.
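The reported statistics (mean, SD and RMS of the angular deviation Δθ and tip deviation ΔP) amount to a few lines of array arithmetic; a minimal sketch with toy numbers, not the study data:

```python
import numpy as np

def localization_errors(theta_det, theta_gs, tip_det, tip_gs):
    """Mean, SD and RMS of angular deviation (degrees) and tip deviation (mm)
    against the gold standard, as reported in the results tables."""
    d_theta = np.abs(np.asarray(theta_det) - np.asarray(theta_gs))
    d_p = np.linalg.norm(np.asarray(tip_det) - np.asarray(tip_gs), axis=1)
    summarize = lambda e: (e.mean(), e.std(), np.sqrt(np.mean(e ** 2)))
    return summarize(d_theta), summarize(d_p)

# Toy check with two insertions (angles in degrees, tip positions in mm):
ang_stats, tip_stats = localization_errors(
    [61.0, 63.0], [60.0, 60.0],
    [[3.0, 4.0], [10.0, 10.0]], [[0.0, 0.0], [10.0, 10.0]])
```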
Table 6.1 provides error analysis of the trajectory and the tip localization for 20 insertions in the agar phantom. Δθ, σ_Δθ and RMS(Δθ) (degrees) are the mean, SD and RMS of the angular deviation, and ΔP, σ_ΔP and RMS(ΔP) (mm) are the mean, SD and RMS error of the needle tip deviation, respectively.

For the learning-based study, the data is randomly grouped into training (50%), cross-validation (25%) and test (25%) sets, over 10 permutations.

Figure 6.5: Needle detection result for an in vivo sequence of a hand-held needle in porcine femoris muscle, with the detected needle trajectory, gold standard and pulsating vessels marked.

Table 6.1: Localization accuracy for medium-steep needle insertions in an agar phantom, averaged over 20 data sets.

  Δθ      σ_Δθ    RMS(Δθ)   ΔP        σ_ΔP      RMS(ΔP)
  0.93°   0.87°   1.26°     1.53 mm   1.02 mm   1.82 mm

The classification accuracy is averaged over the randomly selected background pixels and the involved permutations. Fig. 6.6 shows the detection result of an inserted needle in the presence of pulsating vessels, as well as the probability map. As Fig. 6.6(a) shows, a greater number of high-probability needle pixels lie at the needle position rather than at the vessels, where p_n and p_v1/p_v2 are the averages of the non-zero probability components in a block around the needle tip and around the pulsating vessels, respectively.

The classification accuracy is 81% ± 4% (mean ± SD). Table 6.2 shows the comparison of the localization accuracy of the proposed ML approach against (1) the OF-based motion detection technique, presented in Chapter 5, and (2) the direct PB approach, where motion is estimated from the phase variations, presented in Section 6.2.1.

Table 6.2: Needle localization accuracy comparison of the proposed method and the existing state-of-the-art approaches for the in vivo porcine test set.

        Δθ ± σ_Δθ          Δp ± σ_Δp
  ML    2.12° ± 1.79°      1.69 ± 1.54 mm
  OF    2.83° ± 1.64°      1.88 ± 1.36 mm
  PB    2.37° ± 1.81°      1.92 ± 1.17 mm
Results evaluated as angular deviation against the GS (Δθ) and tip offset from the detected trajectory (Δp) show improved accuracy of the trajectory and the tip. The proposed ML approach performs at least as well as the direct approaches, while it could potentially outperform them by learning specific needle and tissue signatures; it is also more robust to variations in the imaging parameters.

Figure 6.6: (a) A B-mode frame of the captured sequence, (b) the probability map of the masked pixels in the frame, with average probability scores of the regions surrounding the needle tip (box: p_n = 81%) and the pulsating vessels (circles: p_v1 = 38%, p_v2 = 44%), and (c) the detection result and the GS overlaid on the B-mode frame, as dashed and solid lines, respectively.

6.4 Discussion and conclusion

In this chapter, we proposed novel needle detection techniques for US-guided interventions based on PB analysis of the tremor motion. Our method locates the needle by detecting minute periodic displacements of the needle arising from tremor, within clinically acceptable accuracy. Complex-valued steerable filter banks are used to decompose the data into several spatial sub-bands. Each sub-band is then filtered separately to isolate regions with tremor motion using temporal bandpass filters. Results obtained from different spatial scales are then combined and thresholded to generate the binary mask input for the HT, which detects a rough estimate of the needle trajectory. Polynomial fitting is applied next to remove the remaining outliers and segment the needle as a cluster of pixels. The detected needle is finally added to the image as an overlay. The results from this initial feasibility study are promising, and encourage future investigations of the method in vivo.

There have been many recent advances in ML algorithms and tools, which allow for easier and more powerful implementations of specialized methodologies using ML.
Thus, in order to allow for potential future improvements, we also proposed a new learning-based detection framework to detect an invisible hand-held needle based on time-domain features. The main aim was to provide a working solution for unmodified needles inserted at mid-to-deep depths with mid-to-steep insertion angles. Our method is able to distinguish the needle from the tissue in contact with the needle and from the pulsating vessels using the probability map of the classification. We also acknowledge that the method is currently only tested on needles with minimal bending; for larger gauge (thinner) needles, the localization step could be generalized to piece-wise analysis, with the HT applied to each small segment. The intra-observer variability of the gold standard's manual annotations was assessed by repeating measurements 10 times on a sample dataset, and was found to be 0.68° ± 0.47° in the insertion angle and 0.63 ± 0.29 mm in the tip position.

Learning from the representations of the data and incorporating more prior knowledge may increase the overall decision accuracy. Application-specific methodologies could also be employed via ML, by training on particular data and tailoring for a specific clinical application. PB analysis is shown to be successful in extracting useful needle features for localization of a hand-held needle while it is fixed in the tissue. Real-time tracking and localization of the needle while it is being inserted, however, remains a challenge. Although the learning-based approach can, to some extent, use the probability map to address surrounding tissue being misclassified as the needle, the pixel-based analysis of motion cannot yet perfectly distinguish the needle (needle axis) from the surrounding tissue, as both move with the same frequency.
Despite the assumption made in this chapter of a truly fixed transducer, the US transducer may also be held by hand during the insertion. Direct flow analysis is unable to distinguish whether the tremor comes from the needle or the transducer, which may result in further localization errors. In the following chapter, we will propose a new learning-based framework for needle detection and tracking, using features obtained within spatio-temporal cuboids and differential flow analysis, combining our initial findings from the previously proposed OF and PB tremor detection techniques.

Chapter 7

Needle Tracking Using Phase-based Optical-flow Analysis and Machine Learning: Free-hand Imaging

7.1 Introduction

Previous chapters showed that the analysis of the tremor motion dynamics is successful in extracting useful needle features for localization of a hand-held needle. Real-time tracking of the needle while it is being inserted is still required. In previous chapters, we assumed that the transducer is fixed, which might be an over-simplification of an actual clinical scenario. In this chapter, we propose methodologies to mitigate the effect of the transducer's tremor on the image and to allow the transducer to be held by hand.

In Chapter 6, a multi-scale spatial decomposition followed by a temporal filtering step was used to estimate the motion pattern and detect pixels moving at the tremor frequency. In Chapter 5, a block-based approach was used to first select the regions with a motion pattern similar to that of the block at the estimated puncture site. The time trace of the displacement was then computed along spatio-temporal linear paths arising from the puncture site, and the needle was identified as the path with maximum spectral correlation with the motion pattern of the puncture site. That method works best with curvilinear transducers, because the initial portion of the shaft is usually detectable near the puncture site where the beam is perpendicular.
Upon further testing in vivo, despite some improvements, the individual pixel-based analysis in these methods was found to be sensitive to noise, and also resulted in localization errors due to the surrounding tissue that moved with the needle. On one hand, accurate motion computation using optical-flow analysis on intensity images is more sensitive to noise; on the other hand, motion estimation from the phase is more robust but not yet accurate enough for localization. In this chapter, we aim to combine phase-based motion decomposition and optical flow to obtain a more accurate motion computation that is also more robust to image variations. (This chapter is adapted from [103]: P. Beigi, R. Rohling, T. Salcudean and G. C. Ng (2017). CASPER: Computer-aided segmentation of imperceptible motion - A learning-based tracking of an invisible needle in ultrasound. International Journal of Computer Assisted Radiology and Surgery, pp. 1-10.)

The proposed system in this chapter is called CASPER: Computer-Aided Segmentation of imPERceptible motion, a learning-based framework that tracks a needle by detecting variations of imperceptible features over time. In previous chapters, tremor motion for needle detection was detected based on absolute motion analysis, which works well when the transducer is fixed (not hand-held). We propose a tracking framework using differential OF and spatio-temporal micro-motion features to incorporate neighboring pixels and mitigate the effects of the subtle tremor motion of a hand-held transducer. PB analysis of the motion in complex-valued pyramids is used to extract spatial features, as it is more robust to subtle movements. In addition, spatial decomposition at different orientations using a tuned complex steerable pyramid isolates motions mainly at the angles of interest, i.e., the expected range of insertion angles.
Our main novelty is incorporating relative flow and the characteristics of the nearby regions in the detection framework, in addition to flow estimation from the local phase variations. We also introduce a self-supervised tracking approach capable of improving the performance in subsequent frames using spatial analysis and a dynamic training update. Our contributions are: (1) incorporating the surrounding spatio-temporal neighborhood in the analysis of the pixel data, (2) incorporating the direction of the flow field in addition to its magnitude, (3) tracking of the needle during insertion instead of detecting it at a fixed spatial position, and (4) extension to free-hand imaging, where both the needle and the transducer are allowed to be hand-held. Qualitative and quantitative analysis is performed in vivo on porcine subjects.

7.2 Materials and methods

7.2.1 Method overview

An overview of our tracking method is shown in Fig. 7.1. It consists of three main steps: motion description using PB analysis and OF, spatio-temporal and spectral feature extraction from the micro-motion in cuboids centered at each pixel, and needle tracking using an incremental SVM.

Figure 7.1: Block diagram of the proposed approach: Complex steerable pyramids are used for spatial decomposition and motion descriptors are computed at each spatial scale. Spatio-temporal and spectral features are extracted within the STCs around each pixel and forwarded to an incremental SVM for classification. Classification results are spatially analyzed for continuity of the positive needle pixels, and false-positive non-needle pixels are fed into the adaptive training for online learning. The detection result is shown as an overlay on the current frame, and the procedure is repeated in the subsequent frames.

Steerable pyramids with oriented Gabor filter banks are designed over the range of insertion angles to isolate motion mainly at the orientation of the needle. The differential flow map of the AW phase of consecutive frames is computed from the OF analysis. Spatio-temporal and spectral features are obtained for several cuboids, called spatio-temporal cells (STCs), constructed around each pixel. The feature vector is sent to an incremental SVM for classification. Classified pixels at the current frame are compared to the morphological estimate obtained from spatial analysis of the labels and their positions. Mislabeled data is added as new training examples and the model is updated, which enhances the prediction for the subsequent frames. The tracking procedure continues for every captured frame, while the model as well as the classification results are updated iteratively. The details of the method are described in the following sections.

7.2.2 Motion descriptors

Needle insertion involves micro-motion of the imperceptible needle features and of the tissue surrounding the needle. During the insertion, although the movement of the needle may not be perceptible in the B-mode data, it can be extracted from analysis of low-level motion features. Because pixel-based motion descriptors are more sensitive to noise, we introduce an efficient feature representation based on low-level motion descriptors of volumes of pixels.

Multi-scale spatial decomposition

In Chapter 6, we showed that, compared to magnitude-based methods, PB analysis of motion is more robust to subtle intensity changes, and hence can be used to extract micro-motions induced by needle insertion.
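This phase-based extraction rests on the Fourier shift property: translating a signal by dx multiplies spectral band k by e^{−i2πk·dx/N}, so motion appears as a phase change proportional to displacement and frequency. A quick NumPy check of this relation (a 1-D toy profile, with an arbitrary band index):

```python
import numpy as np

N, dx, k = 256, 3, 5                              # grid size, shift (samples), band index
x = np.arange(N)
bump = np.exp(-(x - 100.0) ** 2 / 50.0)           # a smooth image profile
moved = np.exp(-(x - 100.0 - dx) ** 2 / 50.0)     # the same profile shifted by dx

phase_change = np.angle(np.fft.fft(moved)[k]) - np.angle(np.fft.fft(bump)[k])
predicted = -2 * np.pi * k * dx / N               # phase shift predicted by the theorem
```

The measured per-band phase change matches the prediction, which is why tracking local phase over time recovers sub-pixel displacements without explicit motion computation.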
According to the shift property of the Fourier transform, a displacement in time/space induces a phase shift proportional to the displacement and the frequency. Based on the Fourier series, to extract motion, the displaced intensity profile I of a frame in a B-mode sequence at time t and spatial position (x, y) can be decomposed into its spatial sub-bands sb_ω(x, y, t) = A_ω e^{iω(x + y + Δ_xy(t))} as follows:

I(x + Δx(t), y + Δy(t)) ≜ Σ_k A_k e^{i2πk(x + y + Δ_xy(t))}    (7.1)

where each band contains a complex sinusoid at spatial frequency ω = 2πk.

Figure 7.2: Multi-scale spatial decomposition with oriented filters. (a) Half-octave bandwidth filters in the insertion angle range. (b) Multi-scale PB decomposition of each frame (i) according to the reference frame (r).

We therefore use the phase of each sub-band, denoted by P_ω(x, y, t) = ω(x + y + Δ_xy(x, y, t)), to compute the motion map. As shown in Fig. 7.2, a complex steerable pyramid of two scales and three orientations (50°, 65° and 80°) is used to decompose the signal into its spatial frequency bands. To reduce noise in the phase data, phase responses are attenuated at stationary regions where intensity variation is low. This is achieved by computing the AW phase using the reference frame intensity, as shown in Fig. 7.2b.

Phase-based optical flow maps

To characterize moving regions, we begin by computing the OF of the superposition of AW phase responses P_ω at multiple orientations and scales, for consecutive frames:

I_t + uI_x + vI_y = 0    (7.2)

Figure 7.3: (a) Captured frames containing the needle; the arrow is perpendicular to the invisible needle shaft. (b) Several STCs constructed for the captured frames.
(c) Frame selection for spatio-temporal analysis at the current frame, and the STCs for each candidate pixel (two candidate pixels in this example). (d) OF is computed for each pair of previous consecutive frames.

where u = dx/dt and v = dy/dt are the lateral and axial components of the OF, and (I_x, I_y) and I_t are the spatial and temporal gradients of the phase images with respect to position (x, y) and time t, respectively. Spatial gradients are approximated in the lateral and axial directions of the phase image using the FDoG masks along the rows and columns of the phases of the filtered frames, I_{x/y}(x, y, t) = I(x, y, t) \otimes g_{x/y}. The temporal gradient is simply estimated as the difference of the Gaussian-smoothed images at consecutive frames, I_t(x, y, t) = I(x, y, t) \otimes g - I(x, y, t-1) \otimes g.

As discussed in Chapter 5, to estimate the flow parameters (u, v), we use the regularized Lucas-Kanade approach, based on the constant-flow constraint on the neighboring pixels within a defined window w. Regularized least squares is then used to find the best estimate by minimizing the error term:

    E(u, v) = \sum_{\forall i \in w} w(x_i, y_i)\left[(u, v)^T \cdot \nabla I(x_i, y_i, t) + I_t(x_i, y_i, t)\right]^2 + Tik_c\, |(u, v)|^2    (7.3)

where \nabla I(x, y, t) = (I_x, I_y), w is the Gaussian weighting function and Tik_c is the constant in the Tikhonov regularization term.

Given a sequence of L_t frames up to the current frame {f_1, ..., f_r, f_p, f_c}, the collection of the OF from all consecutive frames is defined as a time-dependent flow field F:

    F = { <u(I_1), v(I_1)>, ..., <u(I_r), v(I_r)>, <u(I_p), v(I_p)>, <u(I_c), v(I_c)> }    (7.4)

where I_r, I_p and I_c are the reference, previous and current images, respectively. At each frame, OF is computed for each pair of previous consecutive frames. Fig. 7.3 describes the frame selection for PB analysis and OF computation. Features are obtained for several STCs constructed around each pixel.
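The minimization of Eq. (7.3) has a closed-form solution: the Tikhonov term simply adds tik_c to the diagonal of the usual 2x2 Lucas-Kanade normal matrix. A small pure-Python sketch (function name and flat per-window gradient lists are our illustrative assumptions):

```python
def lucas_kanade_regularized(Ix, Iy, It, w=None, tik_c=1e-9):
    """Minimize Eq. (7.3) in closed form for one window.

    Ix, Iy, It: per-pixel spatial/temporal gradients of the phase image inside
    the window; w: Gaussian weights (uniform if None); tik_c: Tikhonov constant.
    Solves (A^T W A + tik_c * I2) (u, v)^T = -A^T W It."""
    if w is None:
        w = [1.0] * len(Ix)
    a11 = sum(wi * ix * ix for wi, ix in zip(w, Ix)) + tik_c
    a22 = sum(wi * iy * iy for wi, iy in zip(w, Iy)) + tik_c
    a12 = sum(wi * ix * iy for wi, ix, iy in zip(w, Ix, Iy))
    b1 = -sum(wi * ix * it for wi, ix, it in zip(w, Ix, It))
    b2 = -sum(wi * iy * it for wi, iy, it in zip(w, Iy, It))
    det = a11 * a22 - a12 * a12
    # 2x2 solve by Cramer's rule; tik_c keeps det > 0 even for degenerate windows
    return (a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det
```

With gradients synthesized from a known flow (u0, v0) via I_t = -(u0*I_x + v0*I_y), the solver recovers (u0, v0) up to the small regularization bias.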
STCs are of size L_n x L_n x L_t, in which L_n is approximately the same size as the needle width and L_t is the temporal window size for temporal analysis.

Differential flow maps

To use motion analysis for hand-held needle detection in US data captured by a hand-held transducer (free-hand), features that are both good at characterizing needle motion and resistant to transducer and intrinsic body motion are required. The main issue with previous motion-based needle detection approaches is that absolute motion detection works best when the transducer is fixed in place, and direct analysis of the motion cannot well distinguish the needle from surrounding tissue moving with the needle.

Figure 7.4: Differential flow scheme shown for a 3 x 3 grid of cells as an example. The center cell is the spatial cross section of the image with the corresponding STC, of size 5 x 5 x L_t as an example. Arrows represent the diagonal direction of the relative flow.

We introduce differential flow features to better represent a hand-held needle despite the pulsating vessels in an US image captured by a hand-held transducer. While spatial gradients of the flow capture sharp changes, the differential flow is essentially the large-scale spatial difference at various locations. In more detail, the tremor motion of a hand-held transducer is globally distributed over the image with the additive motion vector (u_probe, v_probe). Depending on the image intensity, a scaled version of the transducer's motion vector is added to the OF field of each pixel. Therefore, assuming that the intensity is relatively uniform around the needle, which matches well with the assumption of an invisible needle in the tissue, the differential flow computation cancels out most of the transducer's tremor effect.

Differential flow is computed for a 3 x 3 block of cells, centered at each pixel position. As shown in Fig.
7.4, in each of the surrounding cells, the relative flow is computed for each pixel as the flow difference with respect to the corresponding pixel in the center cell. The spatial size of the cell is defined to approximately match the needle width; therefore, the flow differences relative to the neighboring tissue mainly detect the motion due to insertion.

7.2.3 Feature extraction

To represent the micro-motion due to needle insertion, we introduce spatio-temporal and spectral features, which are extracted from the OF and differential flow maps of the flow field F in the constructed STCs. Features are obtained for each scale separately, and the classification results on both scale outputs are superimposed to get the final classification result. Care is taken in selecting the STC temporal dimension to ensure that the window size includes a complete cycle for temporal analysis. The computed flow fields of the last L_t frames are stored in a buffer, where the oldest component is replaced by the flow field of the previous frame with the parsing of each new frame.

Spatio-temporal feature selection

In order to describe the motion patterns, we first extract the spatio-temporal features from the sequence of phase images. At every pixel within the STC of the candidate pixel, the spatio-temporal motion descriptors of OF (u, v) and differential flow (\Delta u, \Delta v) are computed. The spatio-temporal features introduced to detect the needle are designed to address the effects of tremor motion induced on the hand-held transducer, to detect the needle against neighboring tissue more accurately, and to distinguish needle motion from intrinsic body motion more robustly.

The OF parameters are represented in polar coordinates, with A_OF = sqrt(u^2 + v^2) and \theta_OF = arctan(u/v). A set of 11 features is used for every scale of the sequence. Our first pair of features, f_{1,2} = (A_OF, \theta_OF), is the magnitude and phase of the superposition of OF along the temporal length L_t of an STC, defined for each pixel.
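The tremor-cancellation argument behind the differential flow can be checked directly: any flow component shared by the whole grid of cells subtracts out, while motion confined to the center cell survives. A minimal sketch, with our own illustrative data layout (a dict of per-cell flow-vector lists, corresponding pixels sharing the same index):

```python
def differential_flow(cells, center_key):
    """Relative flow of each outer cell w.r.t. the center cell.

    cells: dict mapping a cell position to its list of (u, v) flow vectors,
    one per pixel. Returns, per outer cell, the per-pixel differences
    (du, dv) against the corresponding pixel of the center cell; a motion
    component common to the whole grid (e.g. transducer tremor) cancels."""
    ref = cells[center_key]
    return {key: [(u - ur, v - vr) for (u, v), (ur, vr) in zip(flows, ref)]
            for key, flows in cells.items() if key != center_key}
```

If every cell carries only the tremor vector, all differences are zero; extra motion added only to the center cell shows up in every outer cell with a sign flip, since differences are taken against the center.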
For all pixels within an STC, we also compute the average of the OF field along L_t. The magnitudes of the average flow vectors for all pixels are concatenated, normalized and histogrammed to show the distribution of the average OF magnitudes for pixels in the STC. The statistical moments, median and skewness, of the magnitude of the average OF are then computed to form f_{3,4}.

The next set of features is obtained from the differential flow that is computed for the grid of cells (block) around each STC. The pixel-wise average of the differential flow field is computed along L_t for all outer cells within the block. The average differential flow for each pixel is described by A_{\Delta OF_i} = sqrt(\Delta u_i^2 + \Delta v_i^2) and \theta_{\Delta OF_i} = arctan(\Delta u_i / \Delta v_i), for all i in the block. The A_{\Delta OF_i} values are concatenated for all corresponding pixels within all of the outer cells into A_tot (Fig. 7.4). The auto-correlation function is then computed for the 1D vector A_tot, obtained from the concatenated amplitude vectors, to find motion patterns of pixels within the block. The lag-t auto-correlation function is defined as:

    r_t(A_tot) = \frac{\sum_{s=1}^{T-t} (A_tot(s) - \bar{A}_tot)(A_tot(s+t) - \bar{A}_tot)}{\sum_{s=1}^{T} (A_tot(s) - \bar{A}_tot)^2}    (7.5)

The auto-correlation function is computed for every lag, and the median of the auto-correlation values over the lags is used as another feature, f_5. The average of the differential angle \theta_{\Delta OF_i} is also computed, to (1) extract the direction of movement of the neighboring cells and (2) distinguish the needle from pulsating vessels by their different angular patterns. \bar{\theta}_{\Delta OF_i} is used as our sixth feature, f_6.

Spectral feature selection

The trace of the OF (F) from consecutive frames is computed for all pixels within the STC. The key concept is that pixels moving together have greater spectral correlation at the corresponding frequency of motion.
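The lag-t auto-correlation of Eq. (7.5) and the median-over-lags feature f_5 translate directly into a few lines of code. A pure-Python sketch (function names are ours):

```python
import statistics

def autocorr(a, t):
    """Lag-t sample auto-correlation r_t of Eq. (7.5)."""
    T = len(a)
    mean = sum(a) / T
    den = sum((x - mean) ** 2 for x in a)
    num = sum((a[s] - mean) * (a[s + t] - mean) for s in range(T - t))
    return num / den

def median_autocorr(a):
    """Median of r_t over all positive lags (the f5-style feature)."""
    return statistics.median(autocorr(a, t) for t in range(1, len(a)))
```

Note that r_0 is always 1, and a strongly periodic amplitude sequence (such as one driven by pulsation) produces large positive r_t at lags matching its period.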
Therefore, considering that the STC width is approximately the same as the needle width, if the candidate pixel (the center of the STC) is on the needle, the pixels within the corresponding STC have a similar spectral coherency pattern, especially around the tremor frequency.

The flow field along a sequence of frames contains variations resulting from insertion and intrinsic body motion, and due to their constant statistical parameters over time, the sequence is considered a stationary process. Spectral coherence is obtained between the flow field of the center pixel, F_0 = F(p_0), of the STC and the flow fields of all the neighboring pixels, F_i = F(p_i), within the STC as follows:

    C_{p_0 p_i}(f) = \frac{|PSD(F_0 F_i)|^2}{PSD(F_0)\, PSD(F_i)}, \quad \forall p_i \in STC - \{p_0\}    (7.6)

where PSD(p_0), PSD(p_i) and PSD(p_0 p_i) are the power spectral densities (PSD) (Fourier transform of the auto-correlation) of the flow fields of p_0 and p_i, and their cross PSD, respectively. Spectral coherence is computed for the magnitude of the flow field between the center pixel and all other pixels in the STC. The frequency component is quantized, and the weighted votes for coherence components are computed from the spectral coherence and locally histogrammed to produce the feature vectors. The accumulated value at each bin is then normalized based on the maximum value at the corresponding bin. We are only interested in the low-frequency components, as they contain the insertion and intrinsic heart rate frequency components needed for the analysis. The first five components of the normalized histogram are then used as the final five features, f_{7-11}.

7.2.4 Pixel-based classification

An SVM was chosen as the classifier due to its easy integration of the hand-crafted features and its use of kernels to modify the feature space. In addition, it is formulated as a convex optimization, so that a tractable computation is obtained using the unique solution.
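The magnitude-squared coherence of Eq. (7.6) can be estimated by averaging periodograms over segments, Welch-style; without this averaging, a single-segment coherence estimate is identically 1 and therefore uninformative. A pure-Python sketch of this estimator (function names, non-overlapping rectangular segments, and the naive DFT are our illustrative simplifications):

```python
import cmath

def _dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def coherence(x, y, seg_len):
    """Magnitude-squared coherence per Eq. (7.6), with the PSDs estimated by
    averaging periodograms over non-overlapping segments of length seg_len."""
    nseg = len(x) // seg_len
    pxx = [0.0] * seg_len
    pyy = [0.0] * seg_len
    pxy = [0j] * seg_len
    for s in range(nseg):
        X = _dft(x[s * seg_len:(s + 1) * seg_len])
        Y = _dft(y[s * seg_len:(s + 1) * seg_len])
        for k in range(seg_len):
            pxx[k] += abs(X[k]) ** 2
            pyy[k] += abs(Y[k]) ** 2
            pxy[k] += X[k] * Y[k].conjugate()
    return [abs(pxy[k]) ** 2 / (pxx[k] * pyy[k]) if pxx[k] * pyy[k] > 0 else 0.0
            for k in range(seg_len)]
```

Two flow traces that are scaled copies of each other (pixels "moving together") yield coherence 1 at every bin with power, which is exactly the cue used around the tremor frequency.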
The SVM is used at two steps in our framework: it is first used to train the initial model offline using our training set. The initial trained model is then used in our online tracking framework to classify the unseen data and to update the initial trained model. Needle and background pixels are represented as +1 and -1 observations in the trained binary classifier. Incremental learning is considered as an online method to update the model by adding one example at a time to an existing solution. The key is to update the weights to keep the Kuhn-Tucker conditions satisfied on the enlarged dataset, in addition to the previous examples. In particular, given an observation f in R^m and a mapping function \Phi, an SVM discriminant function is given by:

    \langle \omega, \Phi(f) \rangle + b    (7.7)

where \langle \rangle is the inner product operator and (\omega, b) are the linear separator parameters. The weight vector \omega can be written as a linear combination of the training examples, \omega = \sum_{i=1}^{n} \alpha_i l_i \Phi(f_i), and equation (7.7) is written as:

    \sum_{i=1}^{n} \alpha_i l_i K(f_i, f) + b    (7.8)

The optimal discriminant parameters are chosen in order to maximize the margin \Gamma = 1/||\omega||, the distance between the hyperplane and the closest training vectors f_i. To simplify the analysis, the minimization is usually formulated in dual quadratic form using the weightings \alpha and offset b (Lagrange multipliers) as follows:

    \min_{\alpha_i, b}: W = \sum_i \alpha_i \left( \frac{1}{2} \sum_j l_i l_j K(f_i, f_j) \alpha_j + b\, l_i - 1 \right), \quad \text{subject to } 0 \le \alpha_i \le C    (7.9)

where f is the m-dimensional feature vector for n pixels, l in R^n is the label vector and b determines the offset. The RBF is used as the kernel K, due to its ease of initialization (requiring only one parameter) and its classification accuracy for non-linear patterns: K(f_i, f_j) = e^{-\gamma \|f_i - f_j\|^2}.
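The discriminant of Eq. (7.8) with the RBF kernel is simple to evaluate once the support vectors, dual weights and labels are known. A pure-Python sketch (function names and the toy support set are ours; a trained SVM would supply the alphas and b):

```python
import math

def rbf_kernel(fi, fj, gamma):
    """K(f_i, f_j) = exp(-gamma * ||f_i - f_j||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(fi, fj)))

def svm_decision(f, support, alphas, labels, b, gamma):
    """Discriminant of Eq. (7.8): sum_i alpha_i * l_i * K(f_i, f) + b."""
    return sum(a * l * rbf_kernel(s, f, gamma)
               for a, l, s in zip(alphas, labels, support)) + b
```

The predicted class is the sign of the returned value, +1 for needle and -1 for background.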
The saddle point of equation (7.9) is given by the Kuhn-Tucker conditions, optimizing the dual parameters (\alpha, b):

    W_{\alpha_i} := \frac{\partial W}{\partial \alpha_i} = \sum_j l_i l_j K(f_i, f_j)\, \alpha_j + b\, l_i - 1,
    \quad \text{with} \quad
    \begin{cases}
      \alpha_i = 0 & \Rightarrow W_{\alpha_i} \ge 0 \\
      0 < \alpha_i < C & \Rightarrow W_{\alpha_i} = 0 \\
      \alpha_i = C & \Rightarrow W_{\alpha_i} \le 0
    \end{cases}    (7.10)

During online training, the margin vector coefficients change value at each incremental step to keep the Kuhn-Tucker conditions satisfied for all examples in the updated training dataset. When a new false-positive sample f_c is added with initial weight \alpha_c = 0, if it is not optimal, i.e., if f_c is supposed to be a support vector, all weights are eventually updated to keep the Kuhn-Tucker conditions satisfied. A more detailed description can be found elsewhere [104]. The optimum values for the classifier's parameters C and \gamma, the regularization term and the inverse of the RBF variance, are obtained using a coarse-fine search over the cross-validation dataset.

7.2.5 Online evaluation

We analyze the spatial distribution of the classification results at each iteration. Our self-supervising step involves a voting procedure to automatically determine the cluster of points belonging to the needle. Since the imaged tissue does not change drastically in consecutive frames, the estimate is improved for each new frame with the addition of the new classification result. The aim of the spatial distribution analysis and online update is also to account for false positives such as reverberation artifacts, copies of the needle with a similar motion pattern to the needle. These are less likely to be detected with longer training and are specific to a sequence, which can be exploited in a self-supervisory framework to enhance the localization within the sequence.

Spatial distribution analysis and online update

The spatial distribution analysis and online update are performed in an iterative approach (Fig. 7.5).
Considering the continuity of the needle, we use a parametric representation of a line, \rho = x cos(\varphi) + y sin(\varphi), where \rho is the orthogonal distance from the origin to the line and \varphi is the angle between the line trajectory and the x-axis. \varphi is computed for all lines formed by pixels classified as +1, to identify the lines' angles. A needle cluster is selected from the histogram analysis of the populated needle pixels contributing to lines with angle \varphi within the insertion range (50°-80°). Outliers are further removed by histogram analysis of the \rho values, keeping the lines with the more populated \rho. Nearby +1 pixels within L_n distance of the needle cluster and L_n distance from the selected lines are considered true positives. All other +1-classified pixels farther from the cluster are considered false positives and are added to

Figure 7.5: Summary of the spatial distribution analysis and online update. I is the input image, fr is the frame number, p_i is pixel i, and f_s1/f_s2 are the feature vectors at the two scales. I_tot is the classifier's output binary image, updated at each iteration. p_fp is the false-positive pixel set, i.e., non-needle pixels misclassified as needle. Outliers are defined based on the histogram analysis described in Section 7.2.5.

  1.  Initialize I_tot = 0_size(I), fr = 0
  2.  repeat
  3.      {C_-1, C_+1} <- SVMtest(fr, f_s1, f_s2)
  4.      I_tot(C_+1) <- +1
  5.      l <- possible lines in I_tot
  6.      l_o <- outliers(l, \rho, \varphi)
  7.      l <- l - l_o   (remove the outliers)
  8.      for all p_i in I_tot > 0, for all l_j in l: d_i <- distance(p_i, l_j)
  9.      l <- {l_j with d_i < L_n}   (update the line set)
  10.     p_fp <- p_i in C_+1 and p_i not in l
  11.     I_tot(p_i in l) <- +1   (update the classification output)
  12.     (P, \theta) <- lines(I_tot)
  13.     SVMtrain(f_s1(p_fp), f_s2(p_fp), y(p_fp) <- -1)   (re-label false positives)
  14.     fr <- fr + 1
  15. until fr > fr_max
the dynamic training set for online training. If the false positives do not show up again during insertion in the next few frames, they are unlikely to belong to the needle and are marked as background. During the online training, the feature vectors of the obtained false positives, as well as their target label (l = -1), are sent to the SVM model, to be used in future iterations. The same procedure is repeated for all frames, and the cluster is adaptively updated with the new results.

7.2.6 Experimental analysis and setup

US images were obtained using an iU22 US machine and a hand-held C5-1 (1-5 MHz) curvilinear transducer. The acoustic and imaging parameters were kept constant, suitable for general abdominal imaging. The imaging depth was varied within 50 mm to 70 mm, and the insertion angle was within 50°-80°. A standard 17 gauge Tuohy epidural needle (Arrow International, Reading, PA, USA) was used for insertion. The porcine trial was conducted at the Jack Bell Animal Research Facility, Vancouver (UBC animal care #A14-0171). 60 sequences of US images were captured for independent insertions in the biceps femoris muscle of an anesthetized pig. For a realistic scenario, a pulsating vessel was present in the field of view when possible. A complete description of the animal study protocol is included in Appendix 2.

Based on the pixel resolution, the outer diameter of the 17 gauge needle (1.47 mm), and the different spatial scales, the histogram of the features was obtained for the cross-validation dataset while varying L_n in the range of 0.2 mm to 1.2 mm. We empirically obtained the value L_n ~ 0.9 mm for the analysis. The GS was defined by the expert as a line passing through the needle axis, and was used to select observations for the training set. A channel with the same width as the needle was computed around the segmented trajectory for performance evaluation of the test set. For the HT analysis, 2 mm was chosen as the minimum connectivity of regions to form a line.
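The line voting used in the spatial distribution analysis (\rho = x cos(\varphi) + y sin(\varphi) with histogramming over \varphi and \rho) can be sketched as a small Hough-style accumulator. The function name, candidate-angle list and bin width below are our illustrative assumptions:

```python
import math
from collections import Counter

def dominant_line(points, angles_deg, rho_bin=2.0):
    """Accumulate (rho, phi) votes, rho = x*cos(phi) + y*sin(phi), over a set
    of +1-classified pixel coordinates, and return the most supported bin as
    ((rho_index, phi), vote_count). Angles are quantized to the given list."""
    votes = Counter()
    for phi in angles_deg:
        rad = math.radians(phi)
        for x, y in points:
            rho = x * math.cos(rad) + y * math.sin(rad)
            votes[(round(rho / rho_bin), phi)] += 1
    return votes.most_common(1)[0]
```

Collinear pixels fall into a single (rho, phi) bin and dominate the vote, while scattered false positives spread their votes thinly, which is what makes the subsequent outlier removal by histogram analysis effective.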
Also, since each classification result contains the data from the past L_t frames, and considering the maximum insertion rate of 2 cm per second, we use a maximum connectivity of 20 mm to connect line segments with the same angle. Care is taken to ensure that the window size for spectral analysis includes at least one complete cycle of the heartbeat. We empirically found that a window size (L_t) equal to the number of frames in one second gives reasonable resolution in frequency and sufficient length for spectral coherency estimation. The histogram analysis for outlier removal was performed by obtaining the histograms of \varphi and \rho of the lines and keeping the lines contributing to the three most populated bins.

To evaluate the performance, the needle trajectory calculated by the proposed method is compared against the GS, manually annotated by a sonographer with 30 years of experience. The method's performance was evaluated in terms of shaft and tip detection errors. Shaft accuracy was obtained as the angular deviation \Delta\theta between the needle direction calculated by the algorithm and the true needle direction of the GS. Tip accuracy was obtained by calculating the Euclidean distance \Delta p between the needle tip at the closest channel boundary at the tip depth and the detected trajectory.

The needle is detected at each frame in the incremental training framework, and the classification results are enhanced for the subsequent frames based on the spatial analysis. For the sake of validation, all insertions were made in plane, such that the needle tip was as visible as possible. In many cases, however, the GS could not be obtained from the static image, and the expert had to analyze the sequence to label the needle based on moving regions. To annotate data for the offline training set, needle pixels with a significant motion pattern matching the GS are annotated as +1.
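The two evaluation metrics, \Delta\theta and \Delta p, reduce to a folded angle difference and a point-to-line distance. A simplified pure-Python sketch (our own helper names; the thesis's tip error additionally references the channel boundary at the tip depth, which is omitted here):

```python
import math

def angular_error_deg(theta_alg_deg, theta_gs_deg):
    """Shaft error: absolute angular deviation between two line directions,
    folded into [0, 90] since a line direction is defined modulo 180 degrees."""
    d = abs(theta_alg_deg - theta_gs_deg) % 180.0
    return min(d, 180.0 - d)

def tip_error(tip, line_pt, line_angle_deg):
    """Simplified tip error: perpendicular Euclidean distance from the
    annotated tip to the detected trajectory, given by a point on the
    trajectory and its angle."""
    rad = math.radians(line_angle_deg)
    dx, dy = tip[0] - line_pt[0], tip[1] - line_pt[1]
    # component of the offset normal to the line direction (cos, sin)
    return abs(-math.sin(rad) * dx + math.cos(rad) * dy)
```

For instance, a detected shaft at 62° against a 60° gold standard gives \Delta\theta = 2°, and a tip lying exactly on the detected trajectory gives \Delta p = 0.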
This ensures that the neighboring pixels within the STC of a pixel are also part of the needle, which aims to increase the classification accuracy. The expert also recorded their confidence level in the GS selection. For the initial offline training data, 30% of the sequences were randomly selected from images with high confidence (92% of the entire data). Over 10 permutations, the remaining 70% of the sequences were grouped into cross-validation and test sets. 10% of the data were used in cross-validation tests to find the optimum values for parameters, including the SVM hyper-parameters C and \gamma. The remaining 60% of image sequences were used in the online evaluation. Note that several pixels were obtained from each training image, and each image contributed over 160 training samples to form the initial offline training dataset of 3,000 observations. To have balanced classes, background pixels were randomly selected equal in number to the needle pixels for each training dataset, and this was repeated 10 times. The method was implemented on a personal computer (4 GHz processor and 16 GB RAM) with un-optimized MATLAB code.

7.3 Experimental results

Fig. 7.6 shows a summary of the classification pipeline. The flow map is computed for frames within an STC with respect to the reference frame, and the flow magnitudes are superimposed to create a coarse motion mask. The feature vector is computed for candidate pixels and sent to the SVM classifier to determine the needle pixels. The classification results are then sent to the spatial distribution analysis and online training update to determine the needle location. As shown in Fig. 7.7, the spatial distribution analysis and online learning remove the outliers and improve the classification results at each frame. The needle cluster is enhanced with respect to the final result of the previous frame, and therefore both the classifier's output and the final result are improved in the new frame.
The localization accuracy is further enhanced in subsequent frames when the needle cluster

[Figure 7.6 panels: input captured frames; the STCs; the optical flow maps u(x, y) and v(x, y); motion detection; the superposition of the magnitude of flow for the candidates' STCs; histogram features; and the classification mask marking the needle.]

Figure 7.6: Summary of the
steps in the classification pipeline. The initial sequence of frames is shown on the left side, where a white arrow points to the (invisible) needle trajectory. OF is computed, and the superposition of the flow magnitudes is shown for the selected STCs. Finally, the classification result of the incremental training is shown as an overlay on each frame.

grows further and estimates the true trajectory more accurately.

Table 7.1 and Table 7.2 describe the trajectory and tip accuracy of the detected needle using the mean, SD and root-mean-square of the error. Results are evaluated for the angular deviation against the GS, \Delta\theta, and the tip offset from the detected trajectory, \Delta p. Results are averaged over the 36 sequences of images used during each online evaluation, as well as over the permutations involved.

We compared the proposed method against our previous methods from Chapter 5 and Chapter 6, and against an appearance-based detection method. As shown in Table 7.1, 16% of the data required manual identification of the initial portion of the shaft for the OF-based method, as the automatic method (presented in Chapter 5) failed to select it correctly. For a fair comparison of fully-automatic methods, these are ignored and an 84% success rate is reported for the OF-based method. Our proposed method outperforms the previous approach with statistically significant improvements in both angular deviation (P < 0.0005) and tip deviation (P < 0.002), obtained using the paired two-sided Mann-Whitney-Wilcoxon test. The method is also compared against an appearance-based approach relying on visible needle features, by applying a tuned HT at the insertion angle. The HT-based method was tested on all images, and the error is reported only for the successful cases. Note the following criteria for a successful detection: (1) the detected trajectory l_d should be in close proximity to the GS needle pixels P_GS = {p | GS(p) = +1} of the annotated channel, i.e., their intersection should not be empty, there exists {l_d \cap P_GS},
Figure 7.7: Localization results shown for 3 subsequent frames. (a) B-mode US image, with two arrows pointing to the needle tip and shaft locations. (b) and (c) Zoomed-in views of the classifier's output and the final output of the algorithm, respectively. (d) GS (green line), the HT-estimated trajectory from the classifier's output (dashed line) and the algorithm's output (white solid line), overlaid on two corresponding sample frames.

and (2) the detected trajectory should have an angular deviation \Delta\theta <= 10° with respect to the annotated GS. Although the average shaft angle and tip errors were 0.93° and 1.42 mm, respectively, the method succeeded in only 5 cases, where the needle trajectory was fully visible in the static image. This shows the challenge of needle localization based on intensity features alone.

Offline-CASPER denotes the classifier's output (formulated by the HT) just before the spatial distribution analysis and online learning update. As summarized in Table 7.3, the comparison of offline-CASPER against the OF-based approach shows the importance of the feature selection, i.e., the spatio-temporal neighborhood and the direction of the flow. The comparison against CASPER shows the significance of the spatial analysis and online learning update in the overall performance. CASPER versus offline-CASPER shows a highly significant improvement (P < 0.001) in both the angle and the tip, confirming the role of the online update. Offline-CASPER versus the OF-based and PB methods shows
significant improvements (P < 0.05) in both the angle and the tip accuracy.

Table 7.1: Comparison of the needle localization trajectory results on porcine femoris muscle in vivo.

    Method          | Success | \Delta\theta | \sigma_\Delta\theta | RMS(\Delta\theta)
    CASPER          | 100%    | 1.28°        | 1.09°               | 1.68°
    Offline-CASPER  | 100%    | 1.81°        | 1.34°               | 2.25°
    PB              | 100%    | 2.14°        | 1.11°               | 2.41°
    OF-based        | 84%     | 2.36°        | 1.57°               | 2.84°
    HT-based        | 14%     | 0.93°        | 0.33°               | 0.99°

Table 7.2: Comparison of the needle localization tip results on porcine femoris muscle in vivo.

    Method          | \Delta p | \sigma_\Delta p | RMS(\Delta p)
    CASPER          | 0.82 mm  | 1.21 mm         | 1.47 mm
    Offline-CASPER  | 1.53 mm  | 1.43 mm         | 2.09 mm
    PB              | 1.71 mm  | 1.32 mm         | 2.16 mm
    OF-based        | 1.78 mm  | 1.29 mm         | 2.20 mm
    HT-based        | 1.42 mm  | 0.67 mm         | 1.57 mm

Table 7.3: P-values of the paired two-sided Mann-Whitney-Wilcoxon test on the results of offline-CASPER against CASPER, PB and the OF-based method.

          | CASPER   | PB     | OF-based
    Angle | 1.90E-03 | 0.0135 | 0.0093
    Tip   | 2.52E-03 | 0.0238 | 0.0171

Fig. 7.8 demonstrates the performance of CASPER versus the OF-based approach for 3 different frames. Frames were purposefully selected from the sequence where only tremor was present, i.e., a window of time when the needle was stationary and held by hand. Windows of previous frames were concatenated to produce a longer time span. The localization result of the OF-based method is relatively close to that of CASPER initially, but it deviates further as the insertion progresses. This is mostly due to the fact that CASPER analysis is performed on each new frame relative to the previous two frames; therefore, changes during insertion are detected more reliably than with the OF-based method.

Figure 7.8: Localization results shown for 3 frames. CASPER output (white solid line), the OF-based method (red dashed line) and GS (green line) are overlaid on the B-mode image.

7.4 Discussion and Conclusion

In this chapter, we proposed a novel tracking and localization approach for US-guided interventions based on spatio-temporal features and incremental training.
Differential optical flow and spatio-temporal features incorporating neighboring pixels were used to mitigate background noise caused by the subtle tremor motion of the hand holding the transducer. Micro-motion descriptors were computed from the AW phase of the spatially decomposed data using Gabor filters. The self-supervised tracking framework improves the performance in subsequent frames and updates the adaptive training dataset.

Note that any method based on motion detection generally requires relatively small frame-to-frame variations for its best performance. In this work, several steps were taken to mitigate the potential effects of other sources of motion on the detection. The tuned oriented filters in the spatial decomposition, the specific frequency channels in the spectral feature computation, the differential flow analysis, and the spatial distribution analysis in the final step all aim to mitigate the effects of other motion sources, such as intrinsic body motion.

In addition to its accuracy benefits, online learning is especially useful for tracking, where new samples (detected false positives) are added to the training after each frame is gathered. Investigation of decremental learning could be interesting, by iteratively adjusting the false positives added to the model.

Our focus in this study was specifically on curvilinear transducers, as they are more challenging and less researched compared to linear array transducers. Unlike the method presented in Chapter 5, the method proposed in this chapter relaxes the strict requirement of a portion of the needle being visible near the insertion site, which is particular to curvilinear transducers. Therefore, it also has the potential to be applied to other transducer geometries.

The method was implemented on a personal computer (4 GHz processor and 16 GB RAM) with un-optimized MATLAB code.
The total computation time is 1.18 seconds per frame on average: the multi-scale spatial decomposition and OF computation take 0.53 s, the feature selection step takes 0.17 s for both scales, the SVM evaluation takes 0.003 s, and the spatial distribution analysis and online training take 0.01 s and 0.46 s, respectively. The intra-observer variability of the gold standard's manual annotations was assessed by repeating the measurements 10 times on a sample dataset, and was found to be 0.57° ± 0.34° in the insertion angle and 0.53 ± 0.26 mm in the tip.

Chapter 8

Conclusion and Future Work

Real-time guidance of a midline epidural needle insertion is impractical with conventional 2DUS. In addition, despite the wide range and long history of US-guided procedures, including epidurals, an unresolved issue is clear needle visibility. In this thesis, in an attempt to develop a real-time epidural guidance system, we introduced methodologies that enhance the ultrasonic visibility of essentially invisible needles. Our hypothesis is that there is sufficient information in the dynamic motion profile of the needle for its successful detection, even when it is invisible in the static US image. We demonstrate that nearly invisible changes in the motion dynamics of the needle can be revealed through spatio-temporal processing of standard B-mode video sequences. Various motion analysis methodologies were introduced to extract needle signatures based on the analysis of multiple consecutively collected frames. The extracted features were then used in the proposed direct and ML frameworks to facilitate the detection, localization and tracking of an invisible needle. The clinical studies presented in this thesis are also the start of a series of studies investigating the utility of 3DUS+Epiguide for single-operator real-time midline epidural guidance within the standard workflow of administering neuraxial anesthesia.
Data obtained from clinical and animal studies conducted at BC Women's Hospital and the Jack Bell Animal Research Facility, respectively, were used to evaluate 3DUS+Epiguide and our proposed needle enhancement techniques with respect to the clinical GS annotated by an expert.

With the ultimate goal of providing a clinically-acceptable technology to enhance needle visibility for epidural guidance, the proposed methodologies were discussed in six chapters. Credibility and understanding of the clinical problem were gained in the first two chapters by validating SURE clinically and by investigating the needle visibility challenges in epidurals. In Chapter 2, the clinical feasibility of the developed epidural guidance system (3DUS+Epiguide) was investigated. The system was tested for its accuracy in puncture site selection on parturients. Quantitative results were obtained within the range of intra-observer palpation variability. Chapter 3 evaluated the success of LOR using Epiguide and investigated the importance of the needle type in the success of the procedure. Experiments performed on porcine models ex vivo confirmed the challenge of visualizing a standard needle in routine LOR procedures. An echogenic needle was shown to increase the likelihood of successful LOR: 93.3% of needle insertions rated as excellent achieved successful LOR, compared with just 50% of insertions rated as invisible. Although such echogenic needles have demonstrated enhanced visibility, they are less likely to be adopted widely, as current clinical demand is inclined toward standard needles, making customized apparatus harder to adopt clinically. Investigating the significance of needle visibility in the success of the epidural procedure motivated us to research clinically-acceptable, software-based solutions to enhance needle visibility.
In Chapter 4, analysis of the stylus motion dynamics was investigated as our first fully software-based attempt to create a needle signature for detection in US. The method was evaluated on various tissue types ex vivo, and proved successful despite the essential ultrasonic invisibility of the needle.

The promising results of that chapter motivated us to continue investigating micro-motion analysis for needle detection. To satisfy the single-operator requirement in applications such as epidurals, however, in Chapter 5 we introduced the idea of using the minute motion caused by hand tremor to extract needle signatures, without the need for an explicit motion. Optical flow analysis in a coarse-fine estimation framework was used to detect motion patterns. Regions with coherent motion dynamics are identified using spectral analysis, and those along straight lines with tremor motion are labelled as belonging to the needle. The proposed approach was evaluated on a home-made phantom and tissue ex vivo, and in vivo on porcine subjects. The method was compared with the state of the art, and achieved a 2.83° angular error, against 14.75° for intensity-based approaches. In search of a more robust and computationally efficient motion estimation approach, PB analysis was investigated in Chapter 6 to estimate the spatial displacement. The phase image is known to be more robust to artifacts and to contain more information than the magnitude image. Temporal filtering of the phase data was then used in a probabilistic learning-based framework to detect a hand-held needle more reliably. Results were obtained ex vivo and from porcine subjects in vivo, and were compared with the previous method. In Chapter 7, we extended the proposed methodologies to a tracking framework by combining differential OF and PB analysis. The combined framework benefits from both the robust phase data analysis and the accurate OF motion estimation.
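The essence of this tremor-signature detection — finding pixels whose micro-displacement spectrum is concentrated in the hand-tremor band — can be illustrated with a minimal NumPy sketch. This is not the thesis implementation: the 2-5 Hz band, the 0.5 energy-ratio threshold, the function name and the synthetic displacement series are all illustrative assumptions; in practice the per-pixel displacement series would come from the optical-flow or phase-based estimates described above.

```python
import numpy as np

def tremor_band_ratio(disp, fps, band=(2.0, 5.0)):
    """Fraction of each pixel's displacement power that lies in the tremor band.

    disp: (n_pixels, n_frames) per-pixel displacement time series,
          e.g. obtained from frame-to-frame optical flow or phase analysis.
    """
    disp = disp - disp.mean(axis=1, keepdims=True)      # remove DC / bulk drift
    power = np.abs(np.fft.rfft(disp, axis=1)) ** 2      # one-sided power spectrum
    freqs = np.fft.rfftfreq(disp.shape[1], d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[:, in_band].sum(axis=1) / (power.sum(axis=1) + 1e-12)

# Toy demonstration: 3 s of imaging at 30 fps; "needle" pixels carry a
# 4 Hz tremor oscillation, "tissue" pixels only uncorrelated noise.
rng = np.random.default_rng(0)
fps, n_frames = 30, 90
t = np.arange(n_frames) / fps
needle = 0.4 * np.sin(2 * np.pi * 4.0 * t) + 0.05 * rng.standard_normal((20, n_frames))
tissue = 0.2 * rng.standard_normal((20, n_frames))
ratio = tremor_band_ratio(np.vstack([needle, tissue]), fps)
is_needle = ratio > 0.5                                 # illustrative threshold
```

In this toy example the pixels carrying the simulated 4 Hz oscillation separate cleanly from the noise-only tissue pixels; on real US data, the straight-line spatial distribution check described above would still be needed before labelling pixels as needle.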
In an online self-supervised learning-based framework, a hand-held needle was localized and tracked in a sequence of US frames captured by a hand-held transducer. The proposed approach was evaluated on porcine subjects in vivo, was compared against the previous methods, and showed superior results. Compared with the state of the art, the method achieved a 100% success rate (angular error < 10°), against 14% for the intensity-based approach. We conclude that the proposed framework can detect and track a needle in US despite the needle being invisible. Results also show that needle detection using the proposed method produces acceptable accuracy for our target clinical application, i.e., epidurals.

To conclude, we have demonstrated that nearly invisible changes in the motion dynamics of the needle can be revealed through time-series analysis of standard US video sequences. We have investigated the micro-motion dynamics to detect a needle in US from barely-detectable tissue motion, which is clinically less dangerous. Despite our interest in epidurals as our clinical target, the needle detection and tracking frameworks proposed in this thesis could be helpful in a wide range of applications, e.g. in-plane biopsies.

8.1 Contributions

In this thesis, we aimed to develop the techniques required for a reliable US-guided epidural procedure. We especially looked into the problem of needle visibility, and aimed to detect an invisible needle in the static US image using motion dynamics analysis. The proposed methodology was a fully software-based approach, designed with clinical utility in mind, to track a clinically-used standard needle using motion analysis. We also conducted prospective studies on the clinical feasibility of the epidural guidance system designed by our lab.
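The success criterion quoted above (angular error below 10°) can be made concrete with a short sketch; the function names and the example error values are ours for illustration, not the evaluation code used in the thesis.

```python
import numpy as np

def angular_error_deg(est_dir, gs_dir):
    """Unsigned angle (degrees) between an estimated and a gold-standard
    needle direction. Lines are undirected, so v and -v are equivalent
    and errors fold into [0, 90] degrees."""
    est_dir = np.asarray(est_dir, float) / np.linalg.norm(est_dir)
    gs_dir = np.asarray(gs_dir, float) / np.linalg.norm(gs_dir)
    cos = min(abs(float(est_dir @ gs_dir)), 1.0)   # clip rounding error
    return float(np.degrees(np.arccos(cos)))

def success_rate(errors_deg, threshold_deg=10.0):
    """Fraction of detections whose angular error is below the threshold."""
    errors_deg = np.asarray(errors_deg, float)
    return float((errors_deg < threshold_deg).mean())

errors = [2.0, 5.0, 12.0]     # hypothetical per-frame angular errors
rate = success_rate(errors)   # two of three fall under the 10 degree threshold
```

With these hypothetical per-frame errors of 2.0°, 5.0° and 12.0°, two of the three detections fall under the 10° threshold, giving a success rate of 2/3.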
In the course of achieving the objectives, the following contributions were made:

• A real-time novel framework was introduced to detect an invisible needle in 3DUS using the stylus motion. Minute intensity variations imperceptible to the naked eye were used to detect and localize a needle that is otherwise invisible. Results indicate that the method's performance is independent of the essential visibility of the needle and the echogenicity of the tissue.

• The novel idea of detecting an invisible hand-held needle from hand tremor motion was introduced.

• A novel framework was proposed to distinguish a hand-held needle from the surrounding tissue using optical flow analysis and spectral coherence. The method was compared against intensity-based approaches and evaluated ex vivo and in vivo on porcine models, and its accuracy was within clinical acceptance.

• A novel framework was developed to extract spatio-temporal and spectral features from a hand-held needle using spatial decomposition with a Gabor filter bank, followed by temporal filtering.

• The proposed needle detection framework was extended to incorporate probabilistic learning-based methods and improve the detection accuracy using a confidence map. PB analysis and a probabilistic SVM were used to improve the discrimination of the needle from the surrounding vessels.

• A novel learning-based extension was proposed combining the optical flow and PB techniques. Differential optical flow and STCs for feature extraction were developed to enhance the detection and mitigate the effect of the tremor of a hand-held transducer. A hand-held needle was detected and tracked despite the pulsating vessels in the tissue and the hand-held transducer.

8.2 Future work

Some relevant areas of research can be suggested as follows:

• Optical-flow analysis requires the motion to be small, so higher frame rates are required to reduce the artifacts.
At low frame rates and relatively large tissue motions, a coarse-to-fine extension of OF could be investigated.

• A probabilistic optical flow with sub-pixel accuracy could also be investigated, since knowing the uncertainty in the flow allows for a more robust weighted analysis by down-weighting the uncertain estimates.

• Future work also includes investigating the tremor patterns of various operators, using various ultrasound machines as well as different needle types.

• The tremor motion measured from displacements in the B-mode data mostly reveals the larger tremors in the 2-5 Hz range, due to the limited spatio-temporal sampling in B-mode image formation. Analysis of raw ultrasound radio-frequency data could result in more accurate motion estimation and better spatial and temporal resolution compared to B-mode data. The accuracy of the proposed technique could therefore be improved if implemented using the radio-frequency data, where speckle tracking (as in elastography) could also be investigated.

• Although the minute transducer tremor is mitigated using the differential flow analysis, larger motion is potentially better estimated using a 3D transducer. This could be measured by including the speed of variation of the needle coordinates (in 3D) for estimation using methods such as Kalman filtering; speckle tracking and 3D transducers could be tested for this purpose.

• Although HT is robust and performs in real time, other parametric fitting approaches could be investigated for potentially better performance. RANSAC may achieve a more accurate estimation, since it probabilistically omits the outliers to estimate the true needle axis from the classified pixels.
Care should be taken, however, to choose proper parameters for the cost function, as localization using RANSAC is vulnerable to the chosen threshold.

• The ultimate aim of this research is to integrate the proposed needle visibility enhancement technique, in conjunction with 3DUS+Epiguide, into the clinical workflow. The run-time of the feature extraction stage, however, is a bottleneck here. The computational complexity could be further reduced using parallelization, multi-core computing and implementation on a graphics processing unit.

• The SVM frameworks need accurately-annotated US images to train on, which may not be easy to obtain in some applications. An unsupervised (or semi-supervised) implementation of the framework (e.g. a transductive SVM) could be beneficial in these cases.

• The features selected for the ML framework could be extended and adjusted in order to better model the motion characteristics. More powerful ML tools, such as neural networks, could also be investigated.

• The methodologies described in this thesis could be used in conjunction with beam steering, to further enhance the accuracy of the needle localization.

• The proposed framework can also be used to learn a particular operator's tremor motion or tissue type by training with different training data; the key will be obtaining a wide range and number of images. The feasibility of applying the current framework to different clinical applications can also be investigated, which will further demonstrate the advantages of ML in specific applications.

• The current presentation of the methodologies is limited to straight needles. Although this is a fair assumption given the relatively thick gauge, and hence stiffness, of epidural needles, the method could be generalized.
Curved/bent needles could intuitively be detected using the proposed approach; however, the detection algorithm would need to be extended to a piece-wise linear segmentation.

• We started investigating motion analysis for needle detection based on the stylus motion in 3DUS. 2D imaging was then used for the tremor analysis, (1) due to the popularity and extensive clinical use of 2D imaging, and (2) since most basic 2D methods can be extended to 3D. Extension of the tremor analysis method to 3D imaging could be beneficial. The out-of-plane issue, which may occur in 2D imaging, is an example of a problem that could be avoided with 3DUS.

• For training purposes, new image processing tools could be used to automatically identify the placement of the transducer in the correct anatomical plane. A patient-specific anatomical atlas could be registered and overlaid on the US images to better interpret the anatomical features. These new tools could be integrated in future studies.

• 3DUS+Epiguide in its current form has a limited frame rate because of the delay due to the mechanical movement of the motorized transducer. The technology could be developed for matrix array transducers, as they are shown to be superior to motorized transducers in terms of imaging speed and resolution.
Appendix

Clinical and Animal Study Protocols

1 Epiguide: clinical studies at BC Women's Hospital

1.1 Equipment preparation

• Place epidural tape on the top edge of the transparency, plus a small piece of tape on the reverse side of the bottom edge.

• Squirt the entire contents of the sterile coupling gel packet inside the sterile sheath and pull it over the US transducer. Secure with two elastic bands.

• The sterile plastic needle guide will be mounted on the US transducer over the sterile sheath.

• Connect the 3DUS transducer and turn on the US machine. Verify that the 3D thick-slice software is operating correctly.

The subject will be seated upright on the edge of a leveled bed with her neck and back flexed forward, and her feet supported on a chair. Stabilize the subject's arms with a support. The anesthesiologist will face the subject's back.

1.2 L2-3 and L3-4 Identification

• Palpate L2-3 and L3-4 and mark both levels with a green permanent marker on the skin. Level L2-3 should be marked "P1" and level L3-4 should be marked "P2".
  The marks should be on the same side as the needle guide of the US probe, and on the lateral margin of the patient's back to avoid obscuring the midline.
• Draw two vertical lines on the patient's back to indicate the midline at approximately T12-L1 and L4-5.

1.3 3D US scanning

• On the Ultrasonix touch screen, press the "Presets", "Generic" and then "Epidural" buttons. This returns you to the main screen.
• While operating in 2D mode, perform a quick scan of the patient's spine and adjust the depth setting and gain (typically 60-65%) to obtain the best possible image quality.
• Press the "Epidural" button, which should then turn green. Hit the refresh button (dot with two arrows around it), which should start the motor in the probe and show the 3D thick slice on the screen.
• Place the blunt-tipped needle into the guide carefully.
• Place gel and the US transducer in the paramedian plane and align the needle guide with the mark for L3-4. First align the needle visually for coarse positioning, ensuring that it is perpendicular to the skin and in the vertical mid-sagittal plane. Then look at the image for fine positioning of the transducer. The characteristic wave ("saw-tooth") pattern will be observed as the US waves are reflected off the lamina, until the L3-4 interspace is visualized in the US image.
• The position and angle of the transducer will be adjusted (i.e. varying the distance from the transducer to the spine's midline and the transducer angle) until the best image of the L3-4 LF is obtained in the US image.
• Remove excess gel from the skin using a sterile sponge to clearly see the blunt-tipped needle indent the skin.
• Warn the patient of a subsequent "pinch" feeling on her skin.
• The blunt-tipped epidural needle will be advanced to touch the subject's skin. Mark the indent with an erasable red marker.
  (The mark on the skin will be referred to as "3D2".)
• The US transducer will then be moved up the paramedian plane to reach the L2-3 interspace.
• Remove excess gel from the skin using a sterile sponge to clearly see the blunt-tipped needle indent the skin.
• Warn the patient of a subsequent "pinch" feeling on her skin.
• The blunt-tipped epidural needle will be advanced to touch the subject's skin. Mark the indent with an erasable red marker. (The mark on the skin will be referred to as "3D1".)
• Hand the US transducer back to the assistant to place in the holder.
• Residual US coupling gel will be removed from the skin with paper wipes (keep the dots visible).
• A single transparency, with white epidural tape along the top edge, will be attached to the subject's back to cover both the "3D1" and "3D2" marks on the skin. Tape it close to the top mark, "3D1".
• Using a thick-tipped erasable marker, such as a whiteboard marker, mark the transparency's lower corners on the skin (marking both transparency and skin at the same time).
• Transfer the marks "3D1" and "3D2" to the transparency using a permanent red marker.
• Lift the transparency up and tape it to the patient's back.
• Erase the markings on the skin, using alcohol wipes if needed.
• Optional: if the markings cannot be fully erased, add several more dots to the area to purposefully confuse the operator as to which dot was used.

1.4 Palpation

• By palpation, the point on the skin where the epidural needle would normally be inserted at L2-3 will be identified and marked with an erasable blue marker. The skin mark will be referred to as "P1".
• Similarly, by palpation, the point on the skin where the epidural needle would normally be inserted at L3-4 will be identified and marked with an erasable blue marker.
  The skin mark will be referred to as "P2".
• Lower the transparency, aligning the corners to the skin marks.
• The locations of P1 and P2 will be copied onto the transparency with a permanent blue marker.
• Remove the transparency and write the date and non-identifying patient number onto it. Erase the marks on the skin, using alcohol wipes if needed.

2 Needle tracking: animal study at Jack Bell Animal Research Facility

2.1 Setup preparation

• Initialize the Philips iU22 US imaging system and a C5-1 (1-5 MHz) curved array transducer (or an X6-1 xMATRIX array transducer).
• Select the transducer from the list of transducers and select the "abdomen general" preset on the console.
• SonoCT, harmonics, and XRES (on the console) should be off during the data acquisition.

2.2 Data acquisition

• Place standard coupling gel and the US transducer on the biceps femoris muscle of the porcine model, and perform a quick scan to spot a pulsating vessel in the field of view.
• While scanning, insert a 17G Tuohy epidural needle (with fully inserted stylus), at an insertion angle in the range of 50°–80°, for about 1 cm into the tissue and within the field of view of the US image.
• Place the US transducer at a proper spot to image the detected vessel and the needle in-plane.
  – First align the needle visually for coarse positioning, then look at the image for fine positioning of the transducer. Adjust the transducer positioning until at least the initial portion of the needle shaft and the needle tip can be depicted in the US image (for manual GS annotation).
  – In cases of an invisible needle, the second operator may oscillate the stylus of the needle to create intensity variations at the needle location in the B-mode image. This will help with aligning the transducer.
• Hold the transducer fairly stationary by bracing the operator's arm against a support, e.g.
the operating table.
• The second operator adjusts the imaging parameters, such as depth (5–6 cm) and focus (around the needle tip).
• Upon the signal from the second operator at the start of data acquisition, the first operator starts inserting the needle further at a rate of about 5 mm per second.
• The second operator stops the acquisition just before the needle tip leaves the field of view.
• Review the data to ensure no gross transducer motion is visible, and repeat the acquisition if needed.
• Store the data to be analyzed later using the offline processing framework.
• Remove the needle and repeat the previous steps from the beginning for a new insertion.
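Two of the acquisition steps above lend themselves to simple offline checks: flagging gross transducer motion in a recorded sequence, and relating the ~5 mm/s insertion rate to the per-frame needle-tip advance. The sketch below is illustrative only and is not part of the thesis' processing framework; it assumes the B-mode sequence is available as a NumPy array of shape (frames, rows, columns), and the function names, the 25% lateral-margin heuristic, and the frame rate are all assumptions.

```python
import numpy as np

def gross_motion_score(frames, margin=0.25):
    """Mean absolute inter-frame intensity change over the lateral
    margins of a B-mode sequence (the needle is kept near the centre
    of the field of view, so a large score in the margins suggests
    gross transducer motion rather than needle motion)."""
    frames = np.asarray(frames, dtype=np.float64)
    m = max(1, int(margin * frames.shape[2]))
    # Keep only the left and right margin columns of every frame.
    edges = np.concatenate((frames[:, :, :m], frames[:, :, -m:]), axis=2)
    # Average absolute difference between consecutive frames.
    return float(np.mean(np.abs(np.diff(edges, axis=0))))

def tip_advance_per_frame(rate_mm_s=5.0, fps=30.0):
    """Expected needle-tip advance per frame at the protocol's
    ~5 mm/s insertion rate; the frame rate is an assumption."""
    return rate_mm_s / fps
```

At an assumed 30 frames/s, the ~5 mm/s rate corresponds to on the order of 0.2 mm of tip advance between consecutive frames, which illustrates why the frame-to-frame needle motion is nearly imperceptible and motivates the spatio-temporal analysis used in the thesis.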
