Image-Guided Robot-Assisted Diagnostic Ultrasound

by Purang Abolmaesumi

M.Sc., Sharif University of Technology, Tehran, Iran, 1997
B.Sc., Sharif University of Technology, Tehran, Iran, 1995

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF Doctor of Philosophy in THE FACULTY OF GRADUATE STUDIES (Department of Electrical and Computer Engineering)

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA
September 2002
© Purang Abolmaesumi, 2002

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of Electrical and Computer Engineering
The University of British Columbia
Vancouver, Canada

Abstract

Ultrasound technicians are often required to hold transducers in awkward positions for prolonged periods of time, while exerting large forces. As a result, they suffer from an unusually high incidence of musculoskeletal disorders. A novel robot-assisted tele-operation system has been proposed at the University of British Columbia to alleviate these problems.

This thesis is part of this larger tele-operation system for diagnostic ultrasound. It contributes several novel real-time feature tracking methods for ultrasound images, the development of the "ultrasound visual servoing" concept, the design and implementation of the image controller and the user interface, the system integration, and the development of several applications of the robot-assisted diagnostic ultrasound interface. It also makes a first attempt at comparing the robot-assisted approach with the conventional ultrasound examination by presenting the results of a human factors study.

Ultrasound image features selected by the operator are recognized and tracked in real time using a Correlation algorithm, a Sequential Similarity Detection algorithm, a Star algorithm, a Star-Kalman algorithm, and a Discrete Snakes algorithm. The Sequential Similarity Detection and Star-Kalman algorithms have excellent performance while tracking features with motions of up to 25 mm/s (200 pixels/s); however, the Star-Kalman algorithm requires 60% less computation time. The Correlation and Star algorithms exhibit poorer performance and have higher computational costs. The Snake algorithm is unable to track features with motions faster than 12 mm/s (100 pixels/s). All these methods have been compared for carotid artery tracking in ultrasound images over 10-second (300-frame) time periods.

Based on feature tracking, ultrasound image servoing in three axes has been incorporated into the ultrasound robot interface and can be enabled to automatically compensate for unwanted motions in the plane of the ultrasound beam. The accuracy of the system is illustrated through 3-D reconstructions of an ultrasound phantom and of the human carotid artery. Its functionality was tested by means of an Internet-based, robot-assisted tele-ultrasound examination carried out in Vancouver from Montreal.
Contents

Abstract
Table of Contents
List of Tables
List of Figures
Acknowledgments

1 Introduction
  1.1 Tele-operation System Design Overview
  1.2 Tele-operation System Components
    1.2.1 Ultrasound Robot
    1.2.2 Robot Controller
    1.2.3 Hand Controller
  1.3 Thesis Objectives
  1.4 Prior Work
    1.4.1 Real-Time Feature Tracking
    1.4.2 Speckle Reduction
    1.4.3 Robot-Assisted Tele-Ultrasound
    1.4.4 Shared Control
    1.4.5 Visual Servoing
  1.5 Thesis Contributions
  1.6 Thesis Outline

2 Real-time Feature Tracking in Ultrasound Images
  2.1 Overview
  2.2 Experimental Setup
  2.3 Template Matching Algorithms
    2.3.1 Phase Correlation
    2.3.2 Cross-Correlation
    2.3.3 Sequential Similarity Detection
  2.4 Segmentation Algorithms
    2.4.1 Star Algorithm
    2.4.2 Star-Kalman Algorithm
    2.4.3 The Discrete Snake Model
  2.5 Discussion

3 Ultrasound Image Servoing
  3.1 Overview
  3.2 Ultrasound Image Jacobian
  3.3 Ultrasound Image Jacobian Properties
    3.3.1 Tracking One Feature Point in the Image
    3.3.2 Tracking Two Feature Points in the Image
    3.3.3 Tracking More Than Two Features
  3.4 The Visual Controller
    3.4.1 Stability Analysis
      3.4.1.1 Tracking One Feature in the Ultrasound Image
      3.4.1.2 Tracking Two or More Features in the Ultrasound Image
    3.4.2 Experimental Results
  3.5 Adaptive Ultrasound Image Servoing
  3.6 Discussion

4 Ultrasound Robot User Interface
  4.1 Overview
  4.2 Joystick
    4.2.1 SpaceMouse
    4.2.2 The Spherical Wrist Joystick
    4.2.3 SpaceMouse Integration Details
  4.3 Graphical User Interface
    4.3.1 Ultrasound Image Display
    4.3.2 Start/Stop Button
    4.3.3 Force Control
    4.3.4 Probe Sensitivity Setting
    4.3.5 Axis Control
    4.3.6 3-D Graphical Probe
    4.3.7 Image Control
    4.3.8 Real-Time Contour Extraction
  4.4 Conclusions and Recommendations

5 Practical Applications
  5.1 Overview
  5.2 3-D Ultrasound Imaging
    5.2.1 Calibration
    5.2.2 Stradx
    5.2.3 Star-Kalman Based Reconstruction
  5.3 Tele-ultrasound
  5.4 Feature-Based Probe Position Control

6 Human Factors Study
  6.1 Overview
  6.2 Ultrasound Robot User Testing Questionnaire
  6.3 Feature Positioning Ability
  6.4 Surface Electromyography
    6.4.1 Basics of Myoelectric Signals
    6.4.2 EMG Feature Extraction
    6.4.3 Monitoring the Muscle Activity of the Sonographers

7 Conclusions and Future Work
  7.1 Contributions of this Thesis
    7.1.1 Real-Time Ultrasound Feature Extraction
    7.1.2 Ultrasound Visual Servoing
    7.1.3 Feature-Based 3-D Ultrasound Image Reconstruction
    7.1.4 Tele-ultrasound
    7.1.5 Ultrasound Robot User Interface
    7.1.6 Human Factors Study
  7.2 Suggestions for Future Work

Bibliography

Appendix

A Ultrasound Imaging
  A.1 Ultrasound Imaging Basics
  A.2 Ultrasound Analysis of Peripheral Artery Disease
  A.3 Ultrasound Imaging Display Modes
    A.3.1 A-Mode
    A.3.2 B-Mode
    A.3.3 M-Mode
    A.3.4 Doppler Modes
  A.4 Ultrasound Imaging Artifacts
    A.4.1 Reverberations
    A.4.2 Speckle
    A.4.3 Dropouts
    A.4.4 Probe Motion Artifacts
  A.5 3-D Ultrasound
    A.5.1 Acquisition Techniques
      A.5.1.1 Free-hand Acquisition
      A.5.1.2 Mechanical Localizers
      A.5.1.3 2-D Arrays

B Ultrasound Robot User Testing Form

List of Tables

2.1 A comparison of different feature tracking algorithms
5.1 Calibration results
6.1 Questionnaire results
6.2 The specifications of the EMG sensors
6.3 The activity of different muscles under conventional and robot-assisted examinations
6.4 Sonographer's short scan normalized muscle activity with joystick force threshold of 2.5 N
6.5 Student 1's short scan normalized muscle activity with joystick force threshold of 2.5 N
6.6 Student 2's short scan normalized muscle activity with joystick force threshold of 2.5 N
6.7 Student 3's short scan normalized muscle activity with joystick force threshold of 2.5 N
6.8 A comparison among the sonographer and the students when interacting with the robot-assisted system relative to the conventional ultrasound examination
6.9 Student 1's short scan normalized muscle activity with joystick force threshold of 1.5 N
6.10 Student 2's short scan normalized muscle activity with joystick force threshold of 1.5 N
6.11 A comparison between students 1 and 2 when interacting with the robot-assisted system relative to the conventional ultrasound examination
6.12 Student 1's short scan normalized muscle activity with joystick force thresholds of 1.5 N and 2.5 N
6.13 The effect of soft and hard support surfaces for the forearm on the average muscle activity of student 1
A.1 Approximate acoustic impedance of selected materials

List of Figures

1.1 Experimental setup for robot-assisted ultrasound
1.2 Block diagram of the ultrasound robot system
1.3 Data flow in the user interface
1.4 Position-based visual servo (PBVS) structure
1.5 Image-based visual servo (IBVS) structure
1.6 Dynamic position-based look-and-move structure
1.7 Dynamic image-based look-and-move structure
2.1 A CAD model of the ultrasound phantom and the ultrasound transducer
2.2 An ultrasound image of the phantom
2.3 Displacement of the probe along the x-axis (Δx) relative to the phantom and the corresponding feature displacement in the ultrasound image along the u-axis (Δu)
2.4 A schematic diagram of the correlation algorithm
2.5 The tracking performance of the correlation algorithm at different velocities
2.6 Tracking the carotid artery by the correlation algorithm
2.7 The tracking performance of the SSD algorithm for two different image velocities
2.8 Tracking the carotid artery by the SSD algorithm
2.9 Illustration of the Star algorithm
2.10 Block diagram of the Star tracking method
2.11 The tracking performance of the Star algorithm for two different image velocities
2.12 Tracking the carotid artery by the Star algorithm
2.13 A schematic diagram of the border extraction method
2.14 The tracking performance of the Star-Kalman algorithm for two different image velocities
2.15 Tracking the carotid artery by the Star-Kalman algorithm
2.16 The tracking performance of the snake algorithm at V = 60 pixels/s
2.17 Tracking the carotid artery by the snake algorithm
3.1 Definition of the frames for the ultrasound robot
3.2 Tracking one feature in the ultrasound image; the illustrated sphere is the locus of probe motions that lie in the null-space of J
3.3 Feature point motions that lie in the J^T null-space in the case of two feature points
3.4 Ultrasound image controller
3.5 Experimental results for image servoing in a single axis
3.6 Experimental results for image servoing in three axes
3.7 Experimental results for adaptive and conventional ultrasound image servo controllers
3.8 Experimental results for adaptive and conventional ultrasound image servo controllers when scaling down ultrasound images
4.1 System setup; the joysticks are positioned beside the ultrasound machine panel: 1) JR3 force-torque sensor, 2) potentiometers, 3) thumb lever, 4) spherical wrist joystick handle, 5) SpaceMouse (courtesy of Simon Bachmann)
4.2 Hand controller: Magellan/SpaceMouse; 1) movable handle, 2) base, 3) programmable push buttons, 4) translation and rotation axes
4.3 Hand controller: spherical wrist (courtesy of Simon Bachmann)
4.4 Hysteresis band that is used to apply the velocity command to the robot
4.5 The location of the thumb lever with respect to the joystick handle; the operator can easily adjust the lever to get different velocity gains during the operation
4.6 Graphical user interface
5.1 Partial 3-D image reconstruction of the ultrasound phantom by using the Stradx program
5.2 Partial 3-D image reconstruction of the ultrasound phantom by using the Star-Kalman contour extraction method
5.3 Carotid artery 3-D reconstruction: a) external view, b) internal view
5.4 Tele-ultrasound system setup
5.5 Data flow in the tele-ultrasound system
5.6 Feature-based probe position control
5.7 Feature-based probe position control in a long carotid artery scan
6.1 Feature centering experiment with and without the visual servoing feature
6.2 Muscle 1: Abductor Pollicis Brevis muscle
6.3 Muscle 2: Extensor Digitorum Communis muscle
6.4 Muscle 3: Flexor Digitorum Sublimis muscle
6.5 Muscle 4: Flexor Pollicis Longus muscle
6.6 Muscle 5: Anterior Deltoid muscle
6.7 Muscle 6: Middle Deltoid muscle
A.1 Schematic diagram showing three basic methods for obtaining the position and orientation of the ultrasound transducer for the free-hand acquisition technique: a) acoustic, b) articulated arm, and c) electromagnetic positioners
A.2 Schematic diagram showing the three basic types of motion used in 3-D ultrasound systems making use of mechanical scanning: a) linear, b) fan, and c) rotational
Acknowledgments

I would like to express my sincere and utmost gratitude to my supervisor, Professor Tim Salcudean, for providing me with an environment where I was free to discover. This work would not have been possible without his inspiring guidance, invaluable support and encouragement at every stage of this research. I am also grateful to Dr. Wen-Hong Zhu for providing me with a working platform and supporting me with his in-depth vision. Special thanks are directed to my colleagues, Shahin Sirouspour, Simon DiMaio, Daniela Constantinescu, Parvaneh Saeedi and Julian Guerrero, for their help, fruitful discussions and welcoming spirits. Thanks are due to my past and present colleagues, Henry Wong, Rudy Six, Simon Bachmann, Luca Filipozzi and Borna Noureddin, for being open-handed in helping me to solve difficult technical problems. I would also like to acknowledge the valuable assistance provided by Professors Ted Milner, David Lowe, Robert Rohling and Sidney Fels, and the ultrasound technicians at the Vancouver General Hospital, especially Ann Hope. My gratitude extends to the Robotics and Control Laboratory, the ECE Department technical staff and secretaries, and Professor Michael Davies. I would also like to thank the IRIS/PRECARN Networks of Centres of Excellence for supporting this project.

On a personal level, I am heavily indebted to my parents, Parvin and Alireza, and my sister, Paria, who provided the foundation of the long process towards earning this degree. No words can express my appreciation towards them. Finally, I wish to thank my friends, Shiva, Farhad, Khosro, Sharon, Keyvan, and Shahram for their support, friendship and fun attitude in the course of my studies at UBC.

PURANG ABOLMAESUMI
The University of British Columbia
September 2002

Chapter 1

Introduction

Musculoskeletal disorders are some of the largest factors affecting compensation costs and lost time within the health care industry, and stem from soft tissue discomfort caused or aggravated by workplace exposures. These are particularly pertinent for medical technicians who, when conducting medical ultrasound examinations, are often required to hold the transducers in awkward positions for prolonged periods of time, sometimes exerting large forces. A number of studies indicate that, as a consequence, they suffer from an unusually high incidence of musculoskeletal disorders [43,165,177].

In an early study, approximately 100 veteran sonographers, each with 5-20 years of activity in the profession, were questioned regarding their working conditions [43]. They reported a high incidence of problems, most commonly stress, vision problems, infections, electrical shock and major muscle strains. One such work-related injury, carpal tunnel syndrome, is the compression neuropathy of the median nerve at the wrist; it gives rise to symptoms of burning pain, numbness, and tingling in the thumb, index and long fingers, and the lateral half of the palm. In order to research this further, a questionnaire pertaining to possible causes of work-related injuries was developed and distributed to 225 cardiac sonographers [177]. This achieved a 47% response rate, and although 86% of respondents reported experiencing one or more physical symptoms, only 3% had been diagnosed with carpal tunnel syndrome.
Other outcomes from this survey were that posture correlated significantly with other work-related musculoskeletal injuries, and that high-pressure hand grip correlated significantly with carpal tunnel symptoms. No other strong relations with physical symptoms were found. More recently, a survey was distributed to all 232 registered British Columbian sonographers, achieving a 92% response rate [165]. This study showed that self-reported work-related musculoskeletal symptoms were high among BC medical diagnostic sonographers, with approximately two thirds having sought medical attention and received a diagnosis for their symptoms. The prevalence, frequency and severity of musculoskeletal disorders were highest for the shoulder, neck and upper back. These findings suggest not only that musculoskeletal symptoms are related to the work of scanning, but also that they are exacerbated according to the frequency and duration of exposure to particular tasks, including those related to sustained shoulder abduction, manipulating the transducer while applying prolonged pressure, sustained twisting of the neck/trunk, and performing repetitive twisting.

Motivated initially by the need to alleviate these problems and to present a more ergonomic interface to ultrasound technicians, a tele-operation approach to diagnostic ultrasound has been proposed at the University of British Columbia [152,155]. The next section provides an overview of the system.

1.1 Tele-operation System Design Overview

An inherently safe, light, backdrivable, counterbalanced robot has been designed and tested. Its primary use was envisaged to be carotid artery examinations to diagnose occlusive disease in the left and right common carotid arteries, a major cause of strokes [155]. Figure 1.1 shows the system setup. The system consists of a hand controller, a slave manipulator carrying the ultrasound probe, and a computer system that allows the operator to remotely position the ultrasound transducer relative to the patient's body. The motion of the robot arm is based on measured positions and forces; the system uses a shared control approach capable of achieving position and force control simultaneously.

Figure 1.1: Experimental setup for robot-assisted ultrasound. The 6-DOF parallelogram linkage robot moves the ultrasound probe on the patient's neck for the carotid artery examination.

In addition to its ergonomic benefits, the tele-operation approach has major advantages over existing robotic interfaces for ultrasound examination, in that it provides the ability to interactively position the ultrasound probe using a safe, light, and backdrivable robot in a tele-operation system, while also being assisted by a force controller. In [21,46,140], a Mitsubishi PA-10 industrial robot is used with a force controller to assist ultrasound technicians in moving the ultrasound probe against the patient's body. The system is able to generate 3-D models of blood vessels by maintaining a constant exerted force on the patient while translating the probe across the patient's skin. However, the system has several drawbacks [70]: since the probe can only be moved by the robot through a pre-specified trajectory, the flexibility of the examination is extremely limited, and the system does not provide the sonographer with some of the motions that are required for different scan modes (such as fan scans). In [71], a remote center of motion (RCM) robot is used as a high-dexterity wrist for manipulating ultrasound probes for diagnosis and ultrasound-guided biopsies.
The robot has a relatively limited workspace based around a mechanically constrained remote center of motion. However, this configuration has the advantage that ultrasound scans can be performed quickly for fast updates of a model volume without endangering the patient. None of these systems, however, reports a shared control structure in a tele-operation platform to guide an inherently safe robot. Other approaches, such as [15,131], focus primarily on providing an interface for 3-D ultrasound image reconstruction.

1.2 Tele-operation System Components

The main components of the system are as follows.

1.2.1 Ultrasound Robot

A lightweight robot with limited force capability was designed for teleoperated ultrasound examinations [152]. The robot moves fast enough to allow an ultrasound examination to take place at a pace close to that achieved by an unassisted sonographer. In addition, the robot joints are backdrivable, so that the arm can be pushed out of the way if necessary and controlled effectively in force mode. During the ultrasound examination, the ultrasound transducer is carried by the end-effector of the robot. The robot achieves a resolution of 0.1 mm in translation and 0.09° in rotation at the end-effector. The maximum force that can be generated at the end-effector is 10 N.

1.2.2 Robot Controller

The integration of a human operator with an autonomous force control module is absolutely necessary when designing a robot-assisted diagnostic ultrasound system. The operator should always have supervisory control over the system that is operating on the patient, and should be able to guide the robot in the right direction in case of any instability. Therefore, a shared control approach has been chosen that provides a smooth transition between position control, when the probe is moving freely in space, and force control, when the robot is pressing the ultrasound transducer against the patient's body [189].

The system uses the measured probe positions and forces, and/or the commanded position and force trajectories, simultaneously in a shared control approach to guide the robot arm. The objective of the controller is to have a linear combination of the velocity and the scaled force of the ultrasound probe track the hand controller command; there is no explicit switching between the contact and free-motion states. The controller runs on a VxWorks™ real-time operating system.

Safety issues have been addressed in the design and control of the ultrasound robot and are discussed in [155]. The velocity command sent to the robot is monitored and limited in software. In addition, the shared control feature of the robot allows the operator to guide the robot in the correct direction in case of any failure.

1.2.3 Hand Controller

A Logitech 6-axis SpaceMouse [50] was used to receive user commands to move the robot. Displacing the SpaceMouse handle in any direction generates a proportional velocity command in the same direction, which is sent to the robot; the range of the displacement is very limited. The programmable push buttons can be used to change the gain applied to the handle displacement, as well as to activate/deactivate individual translation and rotation axes.
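As an illustration of this displacement-to-velocity mapping, here is a minimal sketch in Python; the deadband, gain and limit values, and all function and variable names, are illustrative assumptions rather than the actual controller code.

```python
import numpy as np

DEADBAND = 0.05   # normalized handle displacement treated as zero (illustrative)
V_LIMIT = 0.05    # software limit per axis, m/s or rad/s (illustrative)

def velocity_command(displacement, gain, axis_enabled):
    """Map a normalized 6-axis handle displacement (3 translations,
    3 rotations) to a proportional, limited robot velocity command."""
    d = np.asarray(displacement, dtype=float)
    d = np.where(np.abs(d) > DEADBAND, d, 0.0)            # ignore handle noise near neutral
    v = gain * d * np.asarray(axis_enabled, dtype=float)  # per-axis enable/disable
    return np.clip(v, -V_LIMIT, V_LIMIT)                  # velocity limiting for safety

# Example: commanding x-translation only, at half displacement, all axes enabled.
v = velocity_command([0.5, 0, 0, 0, 0, 0], gain=0.1, axis_enabled=[1] * 6)
```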
1.3 Thesis Objectives

This thesis has been founded on top of the existing tele-operation platform for robot-assisted ultrasound diagnosis. Its primary goal has been to design, develop and integrate a novel user interface for the robot-assisted ultrasound examination system. To this end, this thesis is intended:

1. To introduce and develop the concept of automatically guiding the ultrasound probe as a function of its acquired images, an approach termed "ultrasound visual servoing", and to integrate it with the system controller, using the shared control approach. This could be a useful feature for diagnostic examinations when used in conjunction with human supervisory control, in order to reduce operator fatigue.

2. To evaluate existing methods in feature tracking and to develop novel real-time robust feature extraction methods for ultrasound images. The development of these feature extraction algorithms is necessary in order to implement the ultrasound visual servo controller.

3. To develop a user interface that combines position, force and visual servo control. The user interface has to enable the sonographer to control the motion of the ultrasound transducer in a similar fashion to the conventional ultrasound examination.

4. To develop and implement a tele-ultrasound system that uses a shared control approach to remotely assist the operator at the examination site.

5. To make a first attempt at evaluating the effectiveness of the system ergonomics in terms of a qualitative and quantitative human factors study that compares conventional and robot-assisted ultrasound examinations.

This thesis also explains the process that led to the development of each of these objectives. An experimental setup was designed; Figures 1.2 and 1.3 show its block diagram, inter-communication and data flow.

Figure 1.2: Block diagram of the ultrasound robot system.

Figure 1.3: Data flow in the user interface.

In this system, the operator is able to interact with a graphical user interface and a hand controller during an ultrasound examination. The resulting operator commands are coordinated with a visual servoing system in order to control the robot, and thus the ultrasound probe motion.
A dynamic programming method that is based on minimizing a cost function, including 8 Introduction Space Mouse Operator o I — - 1 n + Feature Space Control Law Image Processing System Robot Controllei Position/Force Sensors User Interface Figure 1.2: Block diagram of the ultrasound robot system. Graphics Panel Control Parameter Inputs Force at the Probe End Graphics Display Probe Orientation Robot Controller Position/Orientation Feature Commands Locations Magellan Mouse Ultrasound System Ultrasound Images Image Processing Feature Locations Ultrasound Images Ultrasound Image Display Figure 1.3: Data flow in the user interface. terms representing echo intensity, edge strength, and boundary continuity in space and time, is reported in [75,76]. The method is used to detect contours in echocardiograms. The performance of this method is compared with a maximum gradient algorithm and a matched filter algorithm in [74] to detect contours from human carotid artery longitudinal scan images. The dynamic programming algorithm resulted in high accuracy and low intra-operative variability in comparison to the other methods. However, it requires extensive 9 Introduction training for each image sequence in order to determine appropriate weights to be included in the cost function. Similarly, a simulated annealing technique is used to solve the contours cost function optimization problem in [59] to detect ventricular cavity boundary from a sequence of ultrasound images. The method employs an iterative radial gradient algorithm (the Star Algorithm) to determine the center of gravity of the cavity. The algorithm combines spatial and temporal information along with a physical model in its decision rule. Both of the simulated annealing and the dynamic programming methods fail when there are large gaps in the ultrasound image contours, or where there are several possible routes that the boundary detection algorithm may follow. Correlation algorithms have been widely used in image processing and motion tracking for a long time [98]. The performance of different block matching algorithms, namely normalized correlation, non-normalized correlation, and sum of absolute differences (SAD) methods, in tracking speckle patterns in ultrasound images, is compared in [60]. It is shown that the normalized correlation method and the SAD method do not show statistically significant difference in tracking speckle blocks in ultrasound images, however, the non-normalized correlation is very susceptible to local variations in the intensity of the image. Most of these block matching methods suffer from significant drift, as the error in estimated motion between the two consecutive frames will be accumulated over several frames [176]. In [63,135], an algorithm based on first order absolute moment center of mass operator is used. The approach, that is basically an iterative edge detection method, provides a robust contour extraction algorithm from ultrasound images. However, because the method only finds the visible edge points as the contour location in the image, it does not work well in 10 Introduction the presence of large dropouts in ultrasound images. Several other methods, such as texture segmentation [126], adaptive temporal filtering [53], neural networks [20,33,35], fuzzy logic [166], wavelet transform [158], radial basis functions [107], deformable meshes [185] and maximum likelihood methods [168], are also reported to extract features from ultrasound images. 
Active contour models have also been extensively used to extract features from ultrasound images [8,9,22,23,28,29,31,32,66,68,85,103,105,118,123,159,172,175,186]. For example, an active contour model that uses several temporally adjacent images during the extraction of the tongue surface contour for a sequence of image frames is presented in [8,9]. The user supplies an initial contour model for a single image frame in the whole sequence. Using optical flow and multiresolution methods, this initial contour is then used to find candidate contour points in the temporally adjacent images; subsequently, the snake mechanism is applied to estimate optimal contours for each image frame using these candidate points. Since the method performs the optimization on the whole image sequence, it is not suitable for real-time applications. Another approach [78] extracts the human left ventricle in echocardiographic images by constraining the deformations of a traditional snake so that only allowable (similar to training examples) segmentation results are obtained.

In [103], a semi-automatic technique for the segmentation of the prostate from ultrasound images is presented. The algorithm uses deformable models and gradient direction information to converge to the prostate boundary. Like many other deformable model algorithms, the method requires a careful initialization of the contour in order to converge. In [93], a contour extraction approach is presented that uses a snake algorithm with temporal Kalman filtering to extract periodic motions from a sequence of ultrasound images. The algorithm performs an optimization over all the images in the sequence to extract the motion parameters, which makes it unsuitable for real-time application; although the method captures periodic behavior well, its boundary detection is coarse. A different approach [64] presents a complex algorithm that includes several levels of low-pass filtering, model-based constrained contour evolution using snake algorithms, and optical flow estimation to perform contour extraction. Although the method shows an accurate estimation of contours for good-quality ultrasound images, it is susceptible to noise and echo dropouts, requires the operator to initially draw a contour close to the desired contour in the image frame, and is very sensitive to proper tuning of the parameters of the image processing algorithm.

Most of the above-mentioned snake approaches can be attracted by structures that do not belong to the actual feature. Also, an accurate estimation of the weights in the contour extraction cost function is needed.
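For reference, the energy (cost) function minimized by a traditional snake, in the standard formulation of Kass, Witkin and Terzopoulos, is

$$E(v) = \int_0^1 \left[ \frac{\alpha}{2}\,\lVert v'(s)\rVert^2 + \frac{\beta}{2}\,\lVert v''(s)\rVert^2 + E_{\mathrm{image}}\big(v(s)\big) \right] ds,$$

where $v(s)$ is the parameterized contour, the $\alpha$ and $\beta$ terms penalize stretching and bending respectively, and $E_{\mathrm{image}}$ (for example $-\lVert \nabla I(v(s))\rVert^2$) attracts the contour to strong image gradients. The weights $\alpha$ and $\beta$ and the form of $E_{\mathrm{image}}$ are exactly the parameters whose estimation the preceding paragraph identifies as difficult.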
Most of the developed methods are complex and therefore difficult to implement in real time. For example, [8] proposes a method to extract the tongue surface from a sequence of 250 images, which takes an hour on an SGI Octane machine with one 195 MHz R10000 processor. Helbing [85] proposes a parallel processing configuration to address this problem: an active contour model implemented on 9 transputers connected in parallel, for which 10 iterations of the cost function optimization take 3.67 s. In [186], a processing time of 3 s per image frame is reported on a Sun Sparc 10 workstation. Setarehdan [157] presents a fuzzy logic-based left ventricular epicardial and endocardial boundary detection method; on a Sun Sparc 20, each frame takes 30 s to process. The implementation of the algorithm in [64] with all its components requires one second per frame on a Sparc 10 [65].

The only real-time contour extraction systems are presented in [19,63,135]; these systems use special hardware to extract contours. The system in [19] uses two Intel i960CA processors at 33 MHz, together capable of a maximum performance of 132 MIPS, to detect the left ventricular endocardial contour. The algorithm performs well on still images and good-quality images; however, it has poor stability on low-quality sequences of ultrasound images with large dropouts. In [63,135], four parallel Texas Instruments TMS320C80 DSP microprocessors, as well as a master 32-bit general-purpose RISC processor, are used to extract the contours of the aorta in real time. However, this algorithm is basically a robust edge detector, and it does not compensate for dropouts in the image.

1.4.2 Speckle Reduction

Speckle in ultrasound imaging is a random interference pattern formed under coherent radiation of a medium containing many sub-resolution scatterers [13]. The nature of the speckle pattern depends mainly on the number of scatterers per resolution cell, or Scatterer Number Density (SND) [1]. In [72] it is shown that when many fine, randomly distributed scatterers exist within the resolution cell of the imaging system, the amplitude of the backscattered signal can be modeled as a Rayleigh-distributed random variable. In this case, the characteristic of the coherent speckle is independent of the scattering phenomenon and depends only on the physical dimensions of the imaging instrument, the wavelength of the signal and the distances involved [180]. Blood cells generate a speckle pattern of this kind [1]. At the other extreme, where there is only a small number of scatterers, the image is smooth and is referred to as specular reflection [180]; an example of this type is the lobules in liver parenchyma [1]. In between, the speckle contrast and the speckle spatial pattern depend on parameters of the scattering surface, and the probability density function of the backscattered signal becomes a Rician distribution [1,180]. A discussion of the statistics of speckle in ultrasound imaging can be found in [13].

When visualizing larger anatomical structures, speckle has a negative impact on diagnosis. For example, a reduction of lesion detectability by approximately a factor of eight due to the presence of speckle in ultrasound images has been reported [16]. For this reason, when designing echocardiographic equipment, much effort is spent on reducing the speckle patterns. Several speckle reduction methods are reported in the literature; these can be categorized into image filtering and compounding techniques [52,101,108,115,150].

Image filtering methods use a low-pass filter to smooth the ultrasound image, which reduces speckle; however, these methods often result in blurred image features. In [16], an adaptive low-pass filter is proposed in which local image statistics around each pixel are used to adjust the filtering parameters; in order to preserve the edges, the filtering is only applied to almost uniform gray-level areas of the image. In [101], it is shown that these adaptive approaches based on local image statistics are very sensitive to the choice of the statistical parameters used. Therefore, since appropriate statistical parameters vary from one data set to the next, these methods are not appropriate for automatic speckle reduction. Moreover, different filter sizes should be used for different parts of the image, and there are no systematic rules to adequately choose this value [115].
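As an illustration of the adaptive filtering idea attributed to [16] above, here is a minimal sketch that smooths only near-uniform neighborhoods and leaves high-variance (edge) regions untouched; the window size and variance threshold are illustrative assumptions, not values from [16].

```python
import numpy as np

def adaptive_speckle_filter(image, window=5, var_threshold=100.0):
    """Replace a pixel by its local mean only where the neighborhood is
    nearly uniform, so that edges are left unsmoothed."""
    half = window // 2
    out = image.astype(float).copy()
    for v in range(half, image.shape[0] - half):
        for u in range(half, image.shape[1] - half):
            patch = image[v - half : v + half + 1, u - half : u + half + 1]
            # Smooth only in almost uniform gray-level areas.
            if patch.var() < var_threshold:
                out[v, u] = patch.mean()
    return out
```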
The compounding methods average several ultrasound images to reduce the speckle [52,150]. These can be categorized into spatial, frequency, and temporal compounding methods. Spatial compounding is achieved by using multiple images of the target structure, taken from slightly different viewpoints [115,150]. Frequency compounding adds together detected data from different frequency bands [13]; this is based on the assumption that speckle produced at one imaging frequency is not correlated with that produced at other frequencies. In temporal compounding, several temporally adjacent ultrasound image frames are averaged in order to reduce speckle [37]; however, the ultrasound transducer should not move, so that the averaged frames are spatially aligned. Some approaches use motion compensation methods to perform temporal compounding [115].

Except for some of the compounding methods that are built into the hardware of commercial ultrasound machines, most of the reported speckle reduction methods are computationally expensive and therefore not suitable for a real-time application. For example, [171] reports that on a 233 MHz Pentium II, the median filter takes 670 s to process an image of 512x512 pixels; the same paper presents a computationally inexpensive method, but the algorithm still takes 60 s to process a single image frame. One of the fastest methods reported in the literature is [1], which takes around 70 ms on a 366 MHz Pentium II, close to real time. In our tele-operation system, however, we require a computation time of less than 33 ms (30 frames/s) in order to run the image servo controller as fast as possible, and this budget must cover both speckle filtering and feature extraction. Since most feature extraction methods are computationally demanding, instead of using a time-consuming speckle reduction technique, this thesis has targeted the development of real-time feature extraction techniques that are less sensitive to different speckle levels in ultrasound images.

1.4.3 Robot-Assisted Tele-Ultrasound

Several robot-assisted tele-ultrasound examination systems have been reported in the literature. A modified Armstrong EndoSista II robot is controlled over an ATM point-to-point connection to perform a minimally invasive laparoscopic procedure and a simple ultrasound-guided biopsy [45], as part of a European telepresence research project. In [73], a tele-scanning robot system is reported; the system was tested during cardiac and liver ultrasound scans between Bourges (France) and Katmandu (Nepal) using a satellite connection. However, the motion of the probe on the skin surface is limited to about 4 cm². In [100,120], a master-slave tele-operation system with force feedback is proposed: a 3-axis force sensor is installed on the ultrasound probe and the detected force information is fed back to the master manipulator; two ISDN lines are used to scan a patient's shoulder over a distance of 700 km. In [179], a robot-assisted tele-ultrasound system is presented in which the expert operator has a force feedback joystick to perceive the pressure that is indirectly exerted on the body of his or her patient.
Audio and video streams of the environment are transmitted between the patient's side and the operator's side. The robot mounts on the patient's bed and rests partly on his or her body; pneumatic actuators are used to move the robot. None of the reported systems uses a shared control approach to assist the operator in the tele-ultrasound examination.

1.4.4 Shared Control

It has long been recognized that multisensor-based control is an important problem in robotics [184]. As a robot is expected to accomplish more and more complex tasks, such as task planning in a manufacturing workcell or performing surgery in an operating room, the need to take advantage of multiple sensors in controlling a system becomes increasingly important. To achieve this end, researchers have proposed to build multisensor-based robots that can compensate for changes in the environment and uncertainties in the dynamic models without explicit human intervention or reprogramming.

In [129], an algorithm has been proposed to fuse vision and force. The method uses the two sensors at two different stages of the control process while performing a contact-transition task, as the manipulator moves in free space and then makes contact with a surface. In [27], a hybrid vision/position control is proposed: the system uses visual feedback to control the two degrees of freedom of a 3-DOF manipulator that are parallel to the image plane of a supervisory camera, while the remaining degree of freedom (perpendicular to the camera image plane) is controlled using position feedback provided by the robot joint encoders. In [134], the problem of integrating a human operator with autonomous robot visual tracking and servoing modules is addressed; a shared control approach is presented in which the human operator and the autonomous visual tracking modules command motions along orthogonal sets of degrees of freedom. An approach that combines visual servo control and force feedback is presented in [124] to control an eye-in-hand robot; the implemented control scheme involves a pure damping position-based impedance control and an external image-based visual controller, and the system has been tested successfully on peg-in-hole tasks. In [184], a sensor fusion scheme is developed for controlling an end-effector to follow an unknown trajectory on a surface: at the point of contact, a force-torque sensor mounted on the wrist of the robot provides local information about the unknown surface, as well as the control necessary to maintain contact with a desired force, while a vision system with only one camera is used to track the unknown trajectory on the surface. None of the reported shared control approaches has been used to guide an ultrasound transducer during an ultrasound examination.

1.4.5 Visual Servoing
In visual servo systems a basic task to be performed is the control of the pose (position and orientation) of a robot camera with respect to a target object or a set of target features [92]. Since the early work of Shirai and Inoue [161], considerable effort has been devoted to the visual control of robot manipulators. The reported applications are quite extensive, encompassing manufacturing, teleoperation, missel tracking cameras and fruit picking as well as robotic ping-pong, juggling, and balancing [7,10-12,25,26,30,38,44,48,49,51,51, 54-56,67,79-83,89,91,95,96,102,104,106,111-113,116;117,119,122,127,128,133,134,147-149,151,160,162-164,173,178,181,182,187,188]. In 1980, Sanderson and Weiss [40,156] introduced an important classification of visual servo structures. This classification is essentially based on the following two questions [92]: 18 Introduction 1. Is the control structure hierarchical, with the vision system providing set-points as input to the robot's joint-level controller, or does the visual controller directly compute the joint-level inputs? 2. Is the error signal obtained in task space coordinates, or directly in terms of image The resulting taxonomy, thus, has four major categories, which are described here. These fundamental structures are shown schematically in Figures 1.4 to 1.7. Position-Based Control; In position-based visual servoing, features are extracted from the image. These features are then used to estimate the pose of the target with respect to the camera. Using these values, an error between the current and the desired pose of the robot is denned in the task space. In this way, position-based control neatly separates the control issues, namely the computation of the feedback signal, from the estimation problems involved in computing position or pose from visual data. A very good discussion about position based methods can be found in [92]. features? Robot Desired feature location in the world frame +1 Cartesian Control Law J Feature location in the world frame I r Feature Extraction k \ 1— Pose Estimation k Camera Figure 1.4: Position-based visual servo (PBVS) structure. 19 Introduction Robot Desired feature location in the image + o-Feature location in the image Feature Space Control Law Feature Extraction P1 «<FJ1 \ Camera Figure 1.5: Image-based visual servo (IBVS) structure. Desired feature location in the world frame + — o 4 -Feature location in the world frame Robot Cartesian Control Law Robot Controller Position/Force Sensors — Pose Estimation Feature Extraction Camera Figure 1.6: Dynamic position-based look-and-move structure. Robot Desired feature location in the image + Feature location in the image Feature Space Control Law Robot Controller Position/Force Sensors Feature Extraction Figure 1.7: Dynamic image-based look-and-move structure. 20 Introduction The principal advantage of position-based control is that it is possible to describe tasks in terms of Cartesian pose as is common in robotics. Its primary disadvantage is that feedback is computed using estimated quantities that are a function of the system calibration parameters [92]. Most of the approaches are based on having an accurate model of the target object - a form of calibration. Hence, in some situations, where the target object (such as the carotid artery) can not be accurately modeled, position-based control can become extremely sensitive to calibration error. 
Feature-based approaches tend to be more appropriate for tasks where there is no prior model of the geometry of the task, for example in teleoperation applications [77]. Generally speaking, since feature-based methods rely on less prior information, they can be expected to perform more robustly on comparable tasks [92].

Image-Based Control: Image-based visual servo control uses the location of features on the image plane directly for feedback [40]. For a robot with an end-effector-mounted camera, the viewpoint, and hence the features, will be a function of the relative pose of the camera with respect to the target, ${}^{c}x_t$. In general this function is non-linear and cross-coupled, such that motion of one end-effector degree of freedom results in the complex motion of many features [40]. The location of the feature points, $f$, may be linearized about the operating point:

$$\delta f = J \,\delta\, {}^{c}x_t \qquad (1.1)$$

where

$$J = \frac{\partial f}{\partial\, {}^{c}x_t} \qquad (1.2)$$

is a Jacobian matrix relating the rate of change in pose to the rate of change in feature space. This Jacobian is referred to variously as the feature Jacobian, image Jacobian, feature sensitivity matrix, or interaction matrix [40]. Assuming for the moment that the Jacobian is square and non-singular, the simple proportional control law [40]

$$u = K J^{-1} (f_d - f) \qquad (1.3)$$

where $K$ is a diagonal gain matrix, will tend to move the features in the image towards the desired feature vector $f_d$. Such a closed-loop system is relatively robust in the presence of image distortions [41] and kinematic variations in the manipulator Jacobian [121]. This follows from the fact that, by construction, when the image error function is zero, the kinematic error must also be zero. Even if the hand-eye system is miscalibrated, if the feedback system is asymptotically stable, the image error will tend to zero, and hence so will the kinematic error. This is not the case with the position-based system. Thus, one of the chief advantages of image-based control over position-based control is that the positioning accuracy of the system is less sensitive to camera calibration errors [40]. There are also often computational advantages to image-based control [92].
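A minimal sketch of the control law (1.3) follows, generalized with a pseudo-inverse so that the Jacobian need not be square; the gain and the example Jacobian values are illustrative assumptions, not quantities from this thesis.

```python
import numpy as np

def ibvs_velocity(J, f, f_d, gain=0.5):
    """Image-based visual servo step: map the image-feature error to a
    commanded velocity, u = K J^+ (f_d - f)."""
    error = np.asarray(f_d, dtype=float) - np.asarray(f, dtype=float)
    # The pseudo-inverse handles non-square Jacobians (e.g. more feature
    # coordinates than controlled degrees of freedom).
    return gain * np.linalg.pinv(J) @ error

# Example: one feature point (u, v) driven with a 2x2 Jacobian.
J = np.array([[8.0, 0.0],
              [0.0, 8.0]])   # pixels per mm, an illustrative scale
u = ibvs_velocity(J, f=[120, 95], f_d=[128, 100])
```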
Dynamic Look-and-Move Systems: Dynamic look-and-move systems follow the same description as the direct visual servo architectures (the position-based and image-based control systems), with the following difference [92]. If the control architecture is hierarchical and uses the vision system to provide set-point inputs to the joint-level controller, thus making use of joint feedback to internally stabilize the robot, it is referred to as a dynamic look-and-move system. In contrast, direct visual servo eliminates the robot controller, entirely replacing it with a visual servo controller that directly computes joint inputs, thus using vision alone to stabilize the mechanism.

The image-based approach may reduce computational delay, eliminate the necessity for image interpretation and eliminate errors in sensor modeling and camera calibration. However, it presents a significant challenge to controller design, since the plant is non-linear and highly coupled. In addition, the position-based and image-based direct structures use no joint position information at all; in contrast, the dynamic look-and-move structures make use of joint feedback [40].

For several reasons, nearly all implemented systems adopt the dynamic look-and-move approach [30,54-56,61,81,83,89,90,94,112,139,147,170]. Firstly, the relatively low sampling rates available from vision make direct control of a robot end-effector with complex, non-linear dynamics an extremely challenging control problem; using internal feedback with a high sampling rate generally presents the visual controller with idealized axis dynamics [40]. Secondly, many robots already have an interface for accepting Cartesian velocity or incremental position commands. This simplifies the construction of the visual servo system, and also makes the methods more portable. Thirdly, look-and-move separates the kinematic singularities of the mechanism from the visual controller, allowing the robot to be considered as an ideal Cartesian motion device. Since many resolved-rate [183] controllers have specialized mechanisms for dealing with kinematic singularities [34], the system design is again greatly simplified.

In addition, the implementation of direct visual servo architectures currently faces some technical limitations. Weiss found that even for a 2-DOF revolute mechanism, a sample interval of less than 3 ms was needed to achieve satisfactory plant identification [40]. Paul [137] suggests that the sample rate should be at least 15 times the link structural frequency for manipulator control. Since the highest sampling frequency achievable with standard cameras and image processing hardware is 60 Hz, the direct image-based visual servo structure is not currently practical for low-level robot control. The dynamic look-and-move structure is more suitable for the control of manipulators, combining high-bandwidth joint-level control with a lower-rate visual position control loop [40].

For all the above-mentioned reasons, the ultrasound robot uses an image-based dynamic look-and-move visual servo controller. This means that the ultrasound robot uses a higher-frequency position/force-based joint controller to idealize the axis dynamics, while the lower-frequency visual servo controller is used in shared control with user commands to move the robot.
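The following is a minimal sketch of such a two-rate structure, assuming a 30 Hz outer vision loop that sends velocity set-points to a faster inner joint controller; the rates, the additive blending of user and visual commands, and all names are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

VISION_RATE_HZ = 30.0   # outer loop: feature extraction + visual control law
V_LIMIT = 0.02          # software velocity limit, m/s (illustrative)

def look_and_move_step(grab_frame, track_feature, J, f_d, user_cmd, gain=0.5):
    """One outer-loop iteration: measure the image features, compute the
    visual correction, blend it with the operator command, and return the
    velocity set-point that the high-rate inner joint controller tracks."""
    frame = grab_frame()
    f = track_feature(frame)                                  # current feature location
    u_visual = gain * np.linalg.pinv(J) @ (np.asarray(f_d, float) - np.asarray(f, float))
    u = np.asarray(user_cmd, float) + u_visual                # shared command (illustrative blend)
    return np.clip(u, -V_LIMIT, V_LIMIT)
```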
An adaptive visual servo controller is also presented. A novel user interface that combines velocity, force, and image-based control has been developed. This user interface was used as part of a robot-assisted tele-ultrasound demon-stration system. Participants at the IRIS/PRECARN 2000 conference in Montreal were able to manipulate the ultrasound probe and perform an ultrasound examination in Vancouver. The ergonomic effect of the system has been demonstrated by means of a pilot hu-man factors study. The study gathered qualitative information from user questionnaires and performed a quantitative comparison of the muscle activity exerted by sonographers when they conducted conventional ultrasound examinations versus examinations using the designed system. It has been shown that the robot-assisted diagnostic ultrasound system significantly reduces the muscle activity for the large shoulder muscles for an unexperi-enced operator, while it increases the muscle activity of an experienced sonographer. These 25 Introduction muscles are the major cause of pain in sonographers during conventional ultrasound exam-inations. Also, accurate placement of the carotid artery is possible by using the ultrasound visual servoing. It has been shown that the average muscle activity is reduced by using the visual servo controller. 1.6 Thesis Outline This thesis is concerned with the design of a teleoperated robot-assisted medical ultrasound system, mainly to improve the ergonomic interface for sonographers. First, a brief introduc-tion to visual servoing is presented and the selection of an appropriate visual servo control structure is justified. Then different feature tracking methods that have been developed with their application in an ultrasound image servo system are discussed. Practical appli-cations of the system in 3-D ultrasound imaging and tele-ultrasound are explained next. Finally, the results of a human factors study is presented. The thesis consists of seven chapters as follows. Chapter 2: Real-time Feature Tracking in Ultrasound Images. An overview of different feature tracking methods that are proposed for ultrasound image feature tracking is presented. Six feature tracking methods have been attempted and are presented in this chapter. The experimental results support the effectiveness of the methods. The relative performance of these methods is discussed qualitatively and quantitatively by using the ultrasound robot. Chapter 3: Ultrasound Image Servoing. This chapter starts with calculating the ultrasound image Jacobian and discusses different properties of the Jacobian matrix. 26 Introduction The visual controller is discussed next and the system performance is demonstrated by the experimental results. The chapter concludes with the discussion of a new adaptive ultrasound image servo controller to compensate the calibration errors. Chapter 4: Ultrasound Robot User Interface. This chapter presents different features of the designed user interface for the ultrasound robot. The user interface consists of a joystick and a graphical user interface. The design of a novel six axes joystick is presented. Different features of the graphical user interface, such as the ability to change the sensitivity of the robot to the user positioning commands, to activate/deactivate the robot, to enable/disable the force and image servo controllers, and to enable/disable different degrees of freedom of the robot, are discussed. Chapter 5: Practical Applications. 
The ultrasound robot has been used to capture and reconstruct 3-D structures in the ultrasound image. This chapter presents two different methods that are used for this reconstruction. In addition, an experimental tele-ultrasound system has been developed and tested. The details of its components are presented. The chapter concludes with an application of the system for feature-based probe position control.

Chapter 6: Human Factors Study. The results of the human factors study, along with the feedback from the ultrasound technicians, are presented in this chapter. The muscle activity exerted by a sonographer has been used to quantitatively compare conventional ultrasound examinations with examinations using the robotic system. In addition, the effectiveness of the image servo controller in helping an ultrasound technician to position anatomical regions in ultrasound images is discussed.

Chapter 7: Conclusions and Future Work. The contributions of this thesis, along with suggestions for future research, are summarized in this chapter. The main contributions are as follows:

1. Real-time ultrasound feature extraction: Several real-time feature extraction and tracking algorithms, namely a Correlation algorithm, a Sequential Similarity Detection algorithm, the Star algorithm, the Star-Kalman algorithm and a Discrete Snakes algorithm, are presented. These methods have been compared for tracking the carotid artery in ultrasound images.

2. Ultrasound visual servoing: The concept of ultrasound visual servoing has been developed to control three axes of the robot. This has been used to automatically compensate for unwanted motions in the plane of the ultrasound beam.

3. Feature-based 3-D ultrasound image reconstruction: A computationally inexpensive feature-based 3-D ultrasound imaging technique has been demonstrated. The algorithm has been validated by a 3-D reconstruction of an ultrasound phantom and a human carotid artery.

4. Tele-ultrasound: A tele-ultrasound system has been developed that allows the radiologist to view and manipulate the ultrasound transducer at the remote site by using a safe robot, while being assisted by the remote force and image servo controllers.

5. Ultrasound robot user interface: A novel spherical wrist joystick has been designed to control the motion of the robot. This joystick has been used as part of a user interface that combines velocity, force, and image-based control.

6. Human factors study: A novel EMG-based quantitative evaluation of the sonographers' muscle activity during ultrasound examinations has been developed. A user questionnaire has been designed to obtain qualitative feedback from sonographers.

Portions of this work have been presented in the IEEE Transactions on Robotics and Automation [4], and in the proceedings of the following conferences: IEEE International Conference on Robotics and Automation [3,154,189], IEEE/ASME International Conference on Advanced Intelligent Mechatronics [6], IEEE International Conference on Computer-Based Medical Systems [5], World Congress on Medical Physics and Biomedical Engineering [2], International Symposium of Robotics Research [155] and International Conference on Medical Image Computing and Computer Assisted Intervention [152].

Chapter 2

Real-time Feature Tracking in Ultrasound Images

2.1 Overview

Feature tracking and image segmentation of ultrasound images is a multidisciplinary challenge that combines topics from signal and image processing, and optimization theory.
Several segmentation and motion tracking methods for medical images, and especially ultrasound images, have been proposed in the literature. Their success in terms of clinical usefulness, however, has been moderate. This is to a large extent due to the presence of artifacts in ultrasound images, even in very echogenic patients (patients with "good" image quality). Some of the main reasons for this are the small difference in acoustic impedance between different tissue types, scattering, and a large amount of speckle. Reflections from tissue interfaces are specular, so image edge amplitudes are strongly dependent on the orientation of anatomical surfaces relative to the ultrasound beam direction. These artifacts can produce false image edges and abrupt reductions in edge strength (echo dropout) where real anatomical structures are present but do not lie perpendicular to the beam direction. Therefore, there is still a need for very robust segmentation techniques. Algorithms need to cope with low contrast images and a fair amount of speckle [22].

The image-guided robot-assisted diagnostic ultrasound project would not have been successful without the development of real-time robust feature tracking methods. Of particular interest to the problem of visual servoing and shared control is the ability to track images in real-time over a long period of time. The problem considered first, as a test-bed to compare different ultrasound feature tracking algorithms, has been the robust tracking of the human carotid artery over a real-time sequence of ultrasound images.

The robust contour extraction and feature tracking methods developed in this thesis have evolved from a process that involved template matching algorithms, such as the phase correlation, the Sequential Similarity Detection and the cross-correlation algorithms, and segmentation algorithms, such as the Snakes, the Star, and the Star-Kalman algorithms. Because of the circular shape of the carotid artery, these algorithms have been used only to extract translations from ultrasound images; rotations are not considered here.

The standard frame-to-frame cross-correlation motion tracking techniques that have long been used in the literature [60] suffer from a drifting problem, especially when the shape of the feature changes from one frame to the next. A cross-correlation algorithm is presented in Section 2.3.2 that uses information from multiple image frames to reduce the drifting effect. When the shape of the image feature does not change significantly, it is also possible to perform the tracking by searching for a constant template in the image. A Sequential Similarity Detection algorithm has been implemented in Section 2.3.3 to track an almost stationary feature in the image. The algorithm has very low computational cost and has shown very good tracking results. A phase correlation algorithm was also tested to track features in ultrasound images. It was expected that, because of the sharp maximum correlation peak that the algorithm provides, it would be easier to detect motions in the image. However, because of the high sensitivity of this method to noise in the image [114], the effort has not been successful. The algorithm is explained in Section 2.3.1.

In addition to the above-mentioned template matching techniques, several image segmentation methods have been developed and implemented. Section 2.4.1 explains the Star algorithm.
The algorithm was originally proposed in [59] to find the center of gravity of a cavity in an ultrasound image. However, our experiments show that the method has poor stability. The reason is the large dropouts in the contours of the carotid artery in ultrasound images. A temporal Kalman filter algorithm with a constant-velocity kinematic model is used to solve this problem. This algorithm combines the location history of the center of gravity of the cavity in the previous frames with the current estimate to reliably track the feature.

A novel fully-automatic segmentation and tracking technique has been developed and is explained in Section 2.4.2. This method, the Star-Kalman algorithm, is inspired by a radar target tracking algorithm [17], where the tracking of a single target in a randomly distributed cluttered environment is presented. The algorithm compensates for the echo dropouts in ultrasound images by using a spatial Kalman filter. The method shows excellent performance in solving our tracking problem.

The snake methods [97] suffer from several problems, such as convergence to local minima because of the low quality of ultrasound images, as well as the difficulty of calculating the weight factors that control their deformation behavior. To overcome these problems, we have combined the discrete snake method [31] with the Star algorithm to solve our tracking problem. Rather than performing snake deformation on the original image, the discrete snake model carries out energy optimization on selected edge points of the original image. The algorithm is discussed in Section 2.4.3.

A quantitative comparison of these methods has been performed by using the ultrasound robot and an ultrasound phantom. These methods have also been qualitatively compared in tracking the carotid artery over a sequence of ultrasound images captured in real-time while scanning a patient. The following sections explain the experimental setup, give a complete explanation of each feature tracking algorithm, and present the experimental results.

2.2 Experimental Setup

An ultrasound video was captured from a real carotid ultrasound examination at the Vancouver General Hospital, UBC branch. Originally, this video was used as the test-bed to implement different feature tracking algorithms. Later on, this video was replaced with the real-time video feed of carotid artery examinations from an ultrasound machine in our laboratory. An ultrasound phantom was designed in our laboratory and was used as a standard tool to compare the tracking methods. Figure 2.1 illustrates a CAD model of the phantom. Three plastic tubes are positioned in a solution [110] along three different axes in the phantom.
An ultrasound image of the phantom is shown in Figure 2.2, with the ultrasound transducer imaging plane being aligned with the xz-plane and pointing along the negative z-axis. Any probe motion of $\Delta x$ along the x-axis of the probe would cause a feature motion $\Delta u$ along the u-axis of the image. Therefore, the experimental results reported in this chapter show the velocity of the feature in the ultrasound image, instead of the corresponding ultrasound transducer velocity generated by moving the ultrasound robot end-effector. Ultrasound images are captured through the system's user interface. Experiments have been performed to quantitatively compare the effectiveness of the feature tracking methods. In these experiments, the center of one of the pipes was selected as a feature in the ultrasound image and the robot was used to move the ultrasound probe back and forth with constant velocity along the x-axis of the probe coordinate frame. Figures 2.2 and 2.3 show the concept. The tracking of the carotid artery is also demonstrated on 10 second sequences of carotid artery images. The images are acquired while the ultrasound transducer is positioned and moved on the neck of a patient by the sonographer. Several feature tracking techniques have been tested, modified and developed, and are presented here. These methods can be categorized into template matching algorithms and image segmentation algorithms.

Figure 2.1: A CAD model of the ultrasound phantom and the ultrasound transducer. The plane of the ultrasound beam emitted from the ultrasound transducer is also shown.

Figure 2.2: An ultrasound image of the phantom. The figure shows the location of the ultrasound probe with respect to the ultrasound image. Any probe motion of $\Delta x$ along the x-axis of the probe would cause a feature motion $\Delta u$ along the u-axis of the image.

Figure 2.3: Displacement of the probe along the x-axis ($\Delta x$) relative to the phantom and the corresponding feature displacement in the ultrasound image along the u-axis ($\Delta u$); one pixel of displacement in the ultrasound image along the u-axis corresponds to a motion of 0.123 mm of the robot end-effector along the x-axis.

2.3 Template Matching Algorithms

In the template matching algorithms, an image block is selected in one image frame and a similar block is sought in the following image frames based on different criteria. Three different methods are presented here.

2.3.1 Phase Correlation

The first approach that was evaluated for image tracking was the phase correlation method. Unlike pixel-level correlation methods (such as cross-correlation), the algorithm provides a sharp peak in the translation search space that accurately determines the motion in the image. Phase correlation techniques have been used in motion estimation before [141,145]. Although in these image motion estimation methods phase correlation is applied to the entire image, the phase correlation method has also been used in optical flow estimation by applying it locally to a certain part of the image, using a small window around the point of interest where the image flow velocity is being estimated [69,146].

The basic phase correlation method estimates the relative shift between two image blocks by means of a normalized cross-correlation function computed in the 2-D spatial Fourier domain. It is based on the principle that a relative shift in the spatial domain results in a linear phase term in the Fourier domain. Suppose that $F_1(\xi,\eta)$ and $F_2(\xi,\eta)$ are the Fourier transforms of the two images $A_1(x,y)$ and $A_2(x,y) = A_1(x+d_x, y+d_y)$, where $d_x$ and $d_y$ are translations in the x and y directions. If we calculate the Fourier transforms of these two images, we will have:

$$F_2(\xi,\eta) = e^{-j2\pi(\xi d_x + \eta d_y)}\, F_1(\xi,\eta) \qquad (2.1)$$

Therefore, translations only affect the phase of the Fourier transform of the signal. Equation (2.1) can be used to find the normalized cross power spectrum of the two images as follows:

$$\frac{F_1(\xi,\eta)\,\bar{F}_2(\xi,\eta)}{\left|F_1(\xi,\eta)\,\bar{F}_2(\xi,\eta)\right|} = e^{j2\pi(\xi d_x + \eta d_y)} \qquad (2.2)$$

where $\bar{F}_2$ is the complex conjugate of $F_2$. The inverse Fourier transform of equation (2.2) gives the normalized cross-correlation function, which has a maximum at the inter-frame translation.
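For concreteness, a minimal NumPy sketch of this computation is given below. This is an illustrative reimplementation, not the thesis code: windowing and sub-pixel peak refinement are omitted, and the sign of the recovered shift depends on which block is taken as the reference.

```python
import numpy as np

def phase_correlation_shift(block1, block2):
    """Estimate the integer (dy, dx) translation between two equally sized
    image blocks via the normalized cross power spectrum of (2.2)."""
    F1 = np.fft.fft2(block1)
    F2 = np.fft.fft2(block2)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12   # normalize; avoid divide-by-zero
    corr = np.real(np.fft.ifft2(cross_power))    # sharp peak at the translation
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices above N/2 correspond to negative shifts (FFT wrap-around).
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```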
It is feasible to implement the algorithm in real-time for small image templates, since the Fourier and inverse Fourier transforms each have a computation time of $2N^2 \log_2(N)$ and the computation time to calculate $F_1 \bar{F}_2$ is $O(N^2)$, when the size of the images is $N \times N$ and $N = 2^n$, where $n$ is an integer.

The experimental results, however, were not very successful. While the algorithm could detect motions accurately for synthetic images, it failed to extract motions from the carotid artery images. This problem can be related to several factors:

1. A report on the accuracy analysis of the phase correlation method [114] shows that both low-pass and high-pass noise in the image significantly degrade the performance of the algorithm. This result is significant, since it states that increased robustness in detecting the correlation maximum (by sharpening the correlation peaks) comes at the expense of increased sensitivity of the computed maximum position to noise.

2. It can be shown that the phase correlation surface contains one impulse for each translatory moving patch within the measurement window. The size of each peak indicates the area that the patch covers within the window [69]. In an ultrasound image with speckle and motions such as the carotid artery pulsation, the amplitudes of these peaks can be very close to each other, and it is very difficult to detect the dominant motion in the image.

These problems with the phase correlation algorithm pushed us towards the development of the more reliable and robust feature tracking algorithms that are explained in the following subsections.

2.3.2 Cross-Correlation

Cross-correlation algorithms have been widely used in image processing and motion tracking for a long time [98]. In [60], it is shown that the normalized correlation formula performs well in tracking speckle blocks in ultrasound images. However, cross-correlation algorithms have a drift problem when the image feature changes from one frame to the next. A modified normalized cross-correlation algorithm that is less sensitive to changes in the image feature has been developed and is presented here¹. The normalized cross-correlation is governed by the following formula:

$$C(A_1, A_2) = \frac{\sum_{x,y}\left[\left(A_1(x,y) - \bar{A}_1\right)\left(A_2(x,y) - \bar{A}_2\right)\right]}{\sqrt{\sum_{x,y}\left(A_1(x,y) - \bar{A}_1\right)^2 \sum_{x,y}\left(A_2(x,y) - \bar{A}_2\right)^2}} \qquad (2.3)$$

where $\bar{A}_1$ and $\bar{A}_2$ are the averages of the gray levels in the two images $A_1(x,y)$ and $A_2(x,y)$, respectively, and $x$ and $y$ range between 0 and the size of the images. In the proposed method, a sub-block of the image acquired at time $t_i$ is shifted in its neighborhood, looking for the best correlated match with a fixed sub-block of the same size in a prior frame. A schematic diagram of the algorithm is shown in Figure 2.4.

¹ Discussions with Professor David Lowe, Department of Computer Science, The University of British Columbia, 1998.
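The following sketch illustrates the resulting search (my own illustration, not the thesis code; the look-up-table acceleration mentioned later is omitted): equation (2.3) is evaluated for every candidate shift of the template inside the search window and the best-scoring offset is returned.

```python
import numpy as np

def ncc(block1, block2):
    """Normalized cross-correlation (2.3) between two equally sized blocks."""
    a = block1.astype(float) - block1.mean()
    b = block2.astype(float) - block2.mean()
    denom = np.sqrt(np.sum(a**2) * np.sum(b**2))
    return np.sum(a * b) / (denom + 1e-12)

def best_match(search_area, template):
    """Shift `template` over every position inside `search_area` and return
    the (row, col) offset that maximizes the normalized cross-correlation."""
    h, w = template.shape
    H, W = search_area.shape
    best, best_offset = -np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            score = ncc(search_area[r:r+h, c:c+w], template)
            if score > best:
                best, best_offset = score, (r, c)
    return best_offset
```

The choice of which prior frame (or frames) to correlate against is discussed next.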
If $k$ is fixed, the best correlation is sought relative to a fixed reference image. Applying the cross-correlation method in this way leads to little drift, but high sensitivity to image deformation. If $i - k$ is fixed, the best correlation is sought relative to an image acquired at a fixed time offset from the current frame. Applying the cross-correlation method in this way leads to little sensitivity to image deformation, but to significant drift, as the shift estimate is being integrated. A mixed approach is implemented that seeks the best correlation relative to multiple frames at times $t_{k-2}, t_{k-4}, \cdots, t_{k-2n}$, where $n$ is fixed. The feature point in this algorithm is defined as the central coordinates of the correlation window.

Figure 2.5 shows the performance of the correlation algorithm with $n = 3$ for an image sub-block of 64 × 64 when tracking a desired feature in the ultrasound phantom, and Figure 2.6 shows the real-time tracking result for carotid artery images, also for an image sub-block of 64 × 64. In these experiments, the size of the search area is 76 × 76, which means that the image sub-block was shifted over a 12 × 12 area. A look-up table has been used to accelerate the computation of equation (2.3). Although the algorithm performs well in reducing the drifting problem, it does not completely eliminate it. Different filters, such as the median filter, were applied to the image to reduce the speckle and, consequently, the drifting problem, with little success.

Figure 2.4: A schematic diagram of the correlation algorithm.

Figure 2.5: The tracking performance of the correlation algorithm at different velocities (V = 60 pixels/s and V = 200 pixels/s).

Figure 2.6: Tracking the carotid artery by the correlation algorithm; the cross in the image shows the position of the feature. The size of the image sub-block is shown at t = 0 s and the size of the image search area is shown at t = 2 s.

2.3.3 Sequential Similarity Detection

In [60], it is shown that, statistically, there is no significant difference in performance between the normalized cross-correlation method and the sum of absolute differences algorithm when tracking speckle patterns in ultrasound images. Therefore, a sequential similarity detection (SSD) [24] algorithm, which uses the sum of absolute differences between two image blocks, is implemented to track arbitrary features in ultrasound images. Its advantage is the low computational cost of the SSD algorithm in comparison to the cross-correlation algorithm. In this simple method of motion detection, a sub-block of the image acquired at time $t_i$ is shifted in its neighborhood, looking for the minimum absolute difference (pixel by pixel) with a fixed sub-block of the same size in a prior frame. Figure 2.7 shows the performance of the SSD algorithm for an image sub-block of 64 × 64 when tracking a desired feature in the ultrasound phantom, and Figure 2.8 shows the real-time tracking result for carotid artery images. A temporal Kalman filter could be used to smooth the trajectory of the extracted feature.

Figure 2.7: The tracking performance of the SSD algorithm for two different image velocities.
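A minimal sketch of the SSD search follows (again an illustration, not the thesis implementation). Compared with the correlation search above, only the similarity measure changes, which is what makes the method cheap; a true sequential implementation [24] additionally aborts the accumulation as soon as the running sum exceeds the best score found so far, which this sketch omits for clarity.

```python
import numpy as np

def ssd_match(search_area, template):
    """Find the (row, col) offset inside `search_area` that minimizes the
    sum of absolute differences with `template` (SSD block matching)."""
    h, w = template.shape
    H, W = search_area.shape
    tpl = template.astype(float)
    best, best_offset = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            sad = np.sum(np.abs(search_area[r:r+h, c:c+w].astype(float) - tpl))
            if sad < best:
                best, best_offset = sad, (r, c)
    return best_offset
```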
2.4 Segmentation Algorithms

The implemented block matching algorithms, namely the cross-correlation algorithm, the SSD algorithm, and the phase correlation algorithm, seek the block in an image frame that is most similar, based on different criteria, to an image block in one of the previous image frames. However, these methods only look for a specific pattern in the ultrasound image, and none of them actually locates the feature contours, such as those of the carotid artery. Locating the carotid artery contours has other advantages. Because the algorithm tracks the carotid artery based on its contour shape and location, it should be less sensitive to changes in the size and shape of the carotid artery. Also, these contours can be used to reconstruct the carotid artery and other anatomical features in 3-D. This will be discussed in Chapter 5. Three image segmentation and feature tracking methods based on the carotid artery contours are presented here. These methods are the Star algorithm, the Star-Kalman algorithm and the discrete snake algorithm.

Figure 2.8: Tracking the carotid artery by the SSD algorithm; the cross in the image shows the position of the feature. The size of the image sub-block is shown at t = 0 s and the size of the image search area is shown at t = 2 s.

2.4.1 Star Algorithm

The Star algorithm was originally presented in [59] to determine the center of gravity of cavities in ultrasound images. This method has been implemented to track the carotid artery in real-time. The major advantage of this method is its robustness to cumulative error. The algorithm uses an edge-detection filter to detect the carotid artery boundary along rays emanating from a point interior to the carotid artery.
The output (ilg™ 1, jlg nal) reached at convergence is the centroid coordinates of the carotid artery in each frame. This information is passed to the next frame to be used as an initial seed point. The algorithm assumes that the motion in the images is slow enough that the calculated seed point lays within the cavity from one frame to the next frame. Choosing N = 32 and cgthr — 2 pixels, and a search area with the radius of 75 pixels, the algorithm was implemented in real-time to solve our tracking problem. The Modified Star Algorithm Although the Star algorithm performs in real-time, it is relatively unstable. The reasons can be explained as follows: 1. Because of the echo dropouts, the side walls of the carotid artery are not very clear in ultrasound images. Therefore, the location of the side wall edge points that are used to calculate the center of gravity are not reliable. 2. The algorithm does not incorporate previous estimates of the carotid artery centroid. This can be improved by using a temporal Kalman filter. Figure 2.10 demonstrates the approach. A constant velocity kinematic model [17] is chosen for the motion of the carotid 45 Real-time Feature Tracking in Ultrasound Images Figure 2.9: Illustration of the Star algorithm; the central point is updated in each iteration as the center of gravity of the detected edge points. Star Algorithm Kalman Filter (Temporal) QcgJcgXO Figure 2.10: Block diagram of the Star tracking method, ( t i g ™ ) is the estimated position of the carotid artery using the Star algorithm. This information is used in a temporal Kalman filter to predict the location of the carotid artery centroid ( i c g , j c g ) -artery along each of the image axes: X(fc + 1) 1 T 0 1 X(k) + V (Jb) (2.6) 46 Real-time Feature Tracking in Ultrasound Images Z(k) 1 0 X(ife) +w( fc ) (2.7) where X(fc) = | x ( £ ) x(k)\ *s * n e system state (position and velocity of the carotid artery center), T is the sampling time of the system, V(fc) is the process noise vector with covariance C(k), Z(k) is the output of the Star algorithm, and ui(k) is its error with covariance D(k). It is assumed that the acceleration can be modeled by zero-mean, white, Gaussian noise, and that the Star algorithm output is the noisy version of the actual position of the carotid artery. The recursive Kalman filter algorithm [17] is implemented to estimate the location of the center point of the carotid artery in consecutive ultrasound image frames. Considering the smooth motion of the probe during an ultrasound examination, we chose o~v = 10 pixel/s2, where av is the variance of the acceleration of a feature in the ultrasound image. With a sampling time of T = .033 s for a frame rate of 30 frames/s, we calculate [17] C(k) = T 2 ~ 0.0 0.002 T3 2 rp2 0.002 0.12 for all k. Selecting a higher value for D(k) with respect to av will filter out instabilities in the output of the Star algorithm, however, it will also reduce the tracking performance in higher image motion velocities. Based on our experiments, it was found that D(k) — 20, for all k, provides an acceptable performance. Using these parameters, Figure 2.11 shows the performance of the Star algorithm for different velocities of the ultrasound probe when tracking a desired feature in the ultrasound phantom. The tracking of the carotid artery in ultrasound images is shown in Figure 2.12. During these experiments, the edge points are sought along 32 radii. The radius of the search area is 75 pixels. 
Figure 2.11: The tracking performance of the Star algorithm for two different image velocities (V = 60 pixels/s and V = 200 pixels/s).

2.4.2 Star-Kalman Algorithm

While the Star algorithm can track the carotid artery in ultrasound images, it does not provide the user with an accurate estimate of the carotid artery contours. This subsection describes the development of a novel fully-automatic segmentation and tracking system to solve our tracking problem. The feature extraction method is inspired by [17], where the tracking of a single target in a randomly distributed cluttered environment is presented in the temporal domain. In this method, several candidate edge points are extracted along each angularly equispaced radius that is projected from a point inside the cavity. A circular model is used for the shape of the cavity. This model estimates the location of the actual cavity contour from one radius to the next. This estimate is combined with a normal weight factor for the candidate edge points in a spatial Kalman filtering algorithm to calculate the locations of the boundary points. The weight function gives less weight to the candidate edge points that are farther away from the estimated boundary points. The algorithm is explained in detail here.

Figure 2.12: Tracking the carotid artery by the Star algorithm; the cross in the image shows the position of the feature. The circle at t = 0 s shows the size of the search area from the seed point along each radius.

In each frame, $N$ angularly equispaced radii are projected from $(i_{cg}, j_{cg})$, the center of gravity of the extracted contour points in the previous frame. The edge function used in the Star algorithm (Equation (2.4)) is applied to all the pixels along each individual radius. Then, the $M$ points along each radius that have the highest edge function values are chosen. Figure 2.13 shows a schematic diagram of the method. Assuming a circular shape for the carotid artery, the following model can be used to describe the system:

$$d(k+1) = d(k) + \xi(k) \qquad (2.8)$$

$$z(k) = d(k) + \phi(k) \qquad (2.9)$$

where the state $d(k)$ is the radius of the boundary point along radius $k$, $z(k)$ is a noisy version of $d(k)$ (one of the $r_i(k)$'s, where $r_i(k)$ is the distance of the $i$th candidate edge point along radius $k$ from the seed point, $i \in \{1, 2, \cdots, M\}$, at iteration $k$), and $\xi$ and $\phi$ are sequences of zero-mean, white, Gaussian process and measurement noise values with covariances $Q(k)$ and $R(k)$, respectively. The curvature of the extracted feature can be adjusted by tuning the values of $Q(k)$ and $R(k)$ with respect to each other.

Figure 2.13: A schematic diagram of the border extraction method, showing the carotid artery center, the actual boundary of the carotid artery, an estimated boundary point and the Gaussian weight function; the notation is described in the text.

Let the state of the filter $\hat{d}(k-1|k-1)$ at iteration $k-1$ be the radius of the estimated boundary point along radius $k-1$. The radius of the boundary can then be estimated as:
$$\hat{d}(k|k-1) = \hat{d}(k-1|k-1) \qquad (2.10)$$

and the estimate of the boundary radius is updated using the current candidate edge points along radius $k$ as follows:

$$\hat{d}(k|k) = \hat{d}(k|k-1) + W(k)\left(z(k) - \hat{d}(k|k-1)\right) \qquad (2.11)$$

where $W(k)$ is the Kalman filter gain. Since $z(k)$ is unknown, a combination of the different candidate boundary points along radius $k$, $y(k)$, is used as the measurement in the Kalman filter:

$$y(k) = \sum_{i=1}^{M} \beta_i(k)\, r_i(k) \qquad (2.12)$$

The $\beta_i$'s are weighting factors determined by the likelihood of each candidate edge point $i$ on radius $k$ being on the actual boundary. The $\beta_i$'s can be computed by assuming a normal, distance-dependent weight around the estimated boundary for each candidate edge point. The edge magnitudes are also incorporated in the calculation of the $\beta_i$'s. Thus, the following formulation can be used to compute the $\beta_i$'s:

$$\beta_i(k) = \frac{b_i(k)}{\sum_{j=1}^{M} b_j(k)} \qquad (2.13)$$

where

$$b_i(k) = F_{edge}(r_i(k), \theta_k)\, \exp\!\left( -\frac{\left(r_i(k) - \hat{d}(k|k-1)\right)^2}{2\,S(k)} \right) \qquad (2.14)$$

is the Gaussian weight factor for the correct measurements around the actual contour. The function $F_{edge}(r_i(k), \theta_k)$ is the magnitude of the edge at the point $(r_i(k), \theta_k)$ in polar coordinates, $S(k)$ is the boundary location prediction covariance, and $\hat{d}(k|k-1)$ is the predicted state of the system at iteration $k$. Note that the weight factor $\beta_i$ in (2.12) is calculated such that an edge point with a higher edge magnitude has a higher probability of being the correct measurement. The Gaussian weight reduces the effect of the candidate edge points that are farther away from the boundary estimated by the circular model in equation (2.8). The following recursive formulation summarizes the method:

1. Calculate the one-step prediction covariance:

$$P(k|k-1) = P(k-1|k-1) + Q(k-1) \qquad (2.15)$$

2. Predict the boundary location:

$$\hat{d}(k|k-1) = \hat{d}(k-1|k-1) \qquad (2.16)$$

3. Calculate the boundary location prediction covariance:

$$S(k) = P(k|k-1) + R(k) \qquad (2.17)$$

4. Calculate the filter gain:

$$W(k) = P(k|k-1)\, S^{-1}(k) \qquad (2.18)$$

5. Use equations (2.12) to (2.14) to estimate the boundary radius as:

$$\hat{d}(k|k) = \hat{d}(k|k-1) + W(k)\left(y(k) - \hat{d}(k|k-1)\right) \qquad (2.19)$$

6. Calculate the covariance of the state as follows:

$$P(k|k) = (1 - W(k))\, P(k|k-1) + W(k)^2 \left[ \sum_{i=1}^{M} \beta_i(k)\, r_i(k)^2 - y(k)^2 \right] \qquad (2.20)$$

Implementation results show excellent performance. Figure 2.14 shows the performance of the Star-Kalman algorithm for different velocities of the ultrasound probe when tracking a desired feature in the ultrasound phantom. Figure 2.15 shows the tracking result on the carotid artery images with $R(k) = 20$, $Q(k) = 3$ and $P(1|1) = 20$. $R(k)$, $Q(k)$ and $P(1|1)$ were chosen by trial and error to achieve an acceptable performance in tracking the carotid artery in ultrasound images. On average, the algorithm required $N + 3$ iterations to converge.

Figure 2.14: The tracking performance of the Star-Kalman algorithm for two different image velocities (V = 60 pixels/s and V = 200 pixels/s).

Figure 2.15: Tracking the carotid artery by the Star-Kalman algorithm; the extracted carotid artery contours are shown in the image.
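Viewed as a scalar Kalman filter sweeping once around the contour, the recursion (2.15)-(2.20) is compact. The sketch below is my own illustration under assumed inputs (`candidates[k]` holding the $(r_i, F_{edge})$ pairs for ray $k$), not the thesis implementation.

```python
import numpy as np

def star_kalman_contour(candidates, d0=50.0, P0=20.0, Q=3.0, R=20.0):
    """Estimate a boundary radius for each ray from M candidate edge points.

    candidates[k] is a list of (r_i, edge_magnitude) pairs for ray k.
    Returns the filtered radius d(k|k) for every ray.
    """
    d, P = d0, P0
    radii = []
    for cand in candidates:
        r = np.array([c[0] for c in cand], dtype=float)
        e = np.array([c[1] for c in cand], dtype=float)
        P_pred = P + Q                       # (2.15)
        d_pred = d                           # (2.16)
        S = P_pred + R                       # (2.17)
        W = P_pred / S                       # (2.18)
        # Edge-magnitude-weighted Gaussian likelihoods, (2.13)-(2.14).
        b = e * np.exp(-(r - d_pred) ** 2 / (2.0 * S))
        beta = b / (b.sum() + 1e-12)
        y = np.sum(beta * r)                 # (2.12)
        d = d_pred + W * (y - d_pred)        # (2.19)
        P = (1 - W) * P_pred + W**2 * (np.sum(beta * r**2) - y**2)  # (2.20)
        radii.append(d)
    return radii
```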
2.4.3 The Discrete Snake Model

The "snake" model was originally proposed to find a continuous contour that minimizes an energy function. Given an image $I$, a snake is a piecewise smooth spline curve, possibly closed, that moves through the image towards noteworthy image features. The energy of the snake is determined by features in the image (such as edges), by the relative smoothness of the snake, and by additional constraints subject to the user input. To describe snakes in more formal mathematical terms, let $\gamma : [0,1] \times [0,\infty) \rightarrow \mathbb{R}^2$ describe a curve moving in the plane. That is, at any time $t_0 \geq 0$, the points

$$\Gamma(t_0) = \left\{ (x,y) \mid (x,y) = \gamma(s,t_0) \text{ for some } s \in [0,1] \right\} \qquad (2.21)$$

lie on a curve that sweeps through the image in time. Associated with any such curve $\gamma(s,t)$ at any given time $t$ is a scalar value given by the energy functional

$$E[\gamma] = \int_{s=0}^{1} \left[ E_{internal}(\gamma_s(s,t), \gamma_{ss}(s,t)) + E_{image}(\gamma(s,t)) + E_{constraints}(\gamma(s,t)) \right] ds - \mu \iint_R dA \qquad (2.22)$$

where $\gamma_s$ and $\gamma_{ss}$ are the continuity (i.e., the absolute value of the first-order derivative with respect to $s$) and curvature (i.e., the absolute value of the second-order derivative with respect to $s$) terms of $\gamma(s,t)$, respectively. The internal energy is usually defined as:

$$E_{internal} = \frac{1}{2}\left( \alpha(s)\, \|\gamma_s(s,t)\|^2 + \beta(s)\, \|\gamma_{ss}(s,t)\|^2 \right) \qquad (2.23)$$

where $\alpha(s) (> 0)$ and $\beta(s) (> 0)$ control the stretching and bending properties of the curve at point $s$, respectively [31]. The image energy $E_{image}$ can be chosen in a number of ways, such as $E_{image} = \pm I(x,y)$, which causes the snake to move towards the darker or brighter contours of the image. $E_{constraints}$ is an additional term selected to attract the snake towards, or repel it from, given areas of the image, and is usually used to prevent the snake from being caught in local minima. The term $\iint_R dA$ is the balloon force, which is the area of the region $R$ enclosed by the snake, and $\mu (> 0)$ is a weighting factor. This term is used to inflate or deflate the snake. The basic idea of snakes for contour detection is to find a closed contour $u$ such that [14]

$$E_{snake}(u) = \min_{\gamma} E_{snake}(\gamma) \qquad (2.24)$$

Although snakes have been extensively used to extract features from ultrasound images (e.g., [9,22,68]), they can easily be trapped by noise, and the weighting factors that control their deformation behavior are difficult to compute.

To solve these problems, we follow the approach proposed in [31]. Rather than performing snake deformation on the original image, the discrete snake model carries out energy minimization on selected edge points of the original image. In [31], an early vision model is used to reduce speckle and detect edges in ultrasound images. The model moves Gabor filters of size $m \times m$ over an image of size $N \times N$ ($m < N$) and convolves them with the image to find the edge points. Then a snake method is implemented to find the contours of the cavity. That algorithm is used to extract contours from single ultrasound image frames. Instead of using the early vision model described in [31], which is a time-consuming process, a method similar to the Star algorithm is used here to find the candidate edge points. In each frame, $N$ angularly equispaced radii are projected from $(i_{cg}, j_{cg})$, the center of gravity of the extracted contour points in the previous frame. The edge function used in the Star algorithm is applied to all the pixels along each individual radius.
Then, the $M$ points along each radius that have the highest edge function values are chosen as the candidate edge points. The contour $\Gamma$ is defined as follows:

$$\Gamma(v_1, \cdots, v_N) = \left\{ v_i \mid v_i \in \{e_1^i, \cdots, e_M^i\},\ i = 1, \cdots, N \right\} \qquad (2.25)$$

where $v_i$ represents a contour point along radius $i$, and the $e_j^i$'s represent the $M$ candidate edge points along radius $i$, sorted based on their edge magnitudes. Considering three consecutive candidate points $e_K^i$, $K = k-1, k, k+1$, the following energy function is calculated for each of the three points [31]:

$$E_i^K = \alpha^K(i)\left|\Gamma_s^K(i)\right|^2 + \beta^K(i)\left|\Gamma_{ss}^K(i)\right|^2 - \gamma^K(i)\, I(e_K^i) - \mu^K(i) \iint_R dA \qquad (2.26)$$

where $\alpha^K(i) (> 0)$, $\beta^K(i)$ and $\mu^K(i)$ are the weighting factors that control the stretching property, the bending property and the centrifugal (balloon) force, respectively, and $I(e_K^i)$ represents the intensity of the image at point $e_K^i$. Note that the continuity and curvature forces on the $i$th radius are defined as:

$$\left|\Gamma_s^K(i)\right|^2 = \left|e_K^i - v_{i-1}\right|^2, \quad K = k-1, k, k+1 \qquad (2.27)$$

$$\left|\Gamma_{ss}^K(i)\right|^2 = \left|\Gamma_s^K(i) - \Gamma_s^K(i-1)\right|^2, \quad K = k-1, k, k+1 \qquad (2.28)$$

The objective here is to find a set of points $v_i$ (one $v_i$ along each radius $i$) such that:

$$E_{snake}(v_1, \cdots, v_N) = \min \sum_{i=1}^{N} E_i^j \qquad (2.29)$$

where $E_i^j$ is the energy calculated for only one of the $j \in \{1, \cdots, M\}$ candidate edge points along radius $i$. To keep the internal force in balance with the image force, the weighting factors are chosen such that [31]:

$$\alpha^K(i) = \frac{I(e^{i,K-1})}{\left|\Gamma_s^{K-1}(i)\right|^2} \qquad (2.30)$$

$$\beta^K(i) = \frac{I(e^{i,K-1})}{\left|\Gamma_{ss}^{K-1}(i)\right|^2} \qquad (2.31)$$

$$\mu^K(i) = \frac{I(e^{i,K-1})}{\iint_R dA} \qquad (2.32)$$

where $I(e^{i,K-1})$, $|\Gamma_s^{K-1}(i)|^2$, $|\Gamma_{ss}^{K-1}(i)|^2$ and $\iint_R dA$ denote the image force, continuity force, curvature force and balloon force of the last iteration for each point on the contour, respectively. For the first iteration, these factors are set to 1. In order to minimize the contour energy, $v_i$ is moved from the $k$th position to the $(k+1)$st position if $E_{snake}^{k+1} < E_{snake}^{k}$. This simple minimization method was chosen because of its inexpensive computational cost, but there is no guarantee of its convergence. A discussion of the convergence of this algorithm can be found in [31]. In our case, we found that the algorithm converges as long as the image has a reasonable quality.

The experiments show that the algorithm performs well only for slow motions in the ultrasound image. Figure 2.16 shows the performance of the snake algorithm when tracking a desired feature in the ultrasound phantom. The algorithm was unable to track features with velocities of more than 100 pixels/s. Figure 2.17 shows the tracking result on the carotid artery images. In these experiments, $N = 32$ and $M = 3$. The algorithm required around 5 iterations over all the radii to converge to an acceptable cavity contour.

Figure 2.16: The tracking performance of the snake algorithm at V = 60 pixels/s. The dotted line shows the actual displacement of the extracted feature along the u-axis in the image.

Figure 2.17: Tracking the carotid artery by the snake algorithm; the extracted carotid artery contours are shown in the image.
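As an illustration of the candidate selection, the sketch below evaluates a simplified version of the discrete snake energy per radius. This is my own sketch, not the thesis implementation: the balloon term is dropped and the adaptive weights of (2.30)-(2.32) are replaced by fixed assumed constants.

```python
import numpy as np

def discrete_snake_step(candidates, contour, intensities,
                        alpha=1.0, beta=1.0, gamma=1.0):
    """One pass of the discrete snake: along each radius, pick the candidate
    edge point minimizing continuity + curvature - image energy (cf. (2.26)).

    candidates[i]  : M candidate points (as 2-D numpy arrays) on radius i
    contour[i]     : current contour point on radius i (2-D numpy array)
    intensities[i] : image intensities at the M candidates on radius i
    """
    new_contour = list(contour)
    for i in range(len(new_contour)):
        prev_pt = new_contour[i - 1]                  # wraps on the closed contour
        prev_step = new_contour[i - 1] - new_contour[i - 2]
        best_e, best_pt = np.inf, new_contour[i]
        for pt, inten in zip(candidates[i], intensities[i]):
            step = pt - prev_pt
            cont = np.sum(step ** 2)                  # continuity force, (2.27)
            curv = np.sum((step - prev_step) ** 2)    # curvature force, (2.28)
            e = alpha * cont + beta * curv - gamma * inten
            if e < best_e:
                best_e, best_pt = e, pt
        new_contour[i] = best_pt
    return new_contour
```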
2.5 Discussion

Table 2.1 presents the error percentages of the feature tracking algorithms for two different velocities. This table is generated by averaging the error in feature tracking experiments based on ultrasound phantom studies. In these experiments, the exact location of the desired feature is determined in the very first frame, and the probe location is used to calculate the location of the desired feature in the consecutive frames (see Figure 2.2).

Method        S60/R    S200/R    kflops/frame
Correlation    8.7%    15.5%     100
SSD            6.3%    13.5%     50
Star          10.7%    18.9%     30-150
Star-Kalman    6.5%    13.6%     20
Snakes         8.0%    N/A       50-300

Table 2.1: A comparison of different feature tracking algorithms; $S_{60}$ and $S_{200}$ represent the standard deviations of the error at desired feature velocities of 60 pixels/s (7.38 mm/s) and 200 pixels/s (25 mm/s), respectively, and $R$ represents the peak-to-peak displacement of the actual feature along the u-axis in pixels.

The data show that both the SSD and the Star-Kalman algorithms can be used with a small error to track features in ultrasound images at different velocities. The snake algorithm is the least reliable one, as it loses the feature and diverges above V = 100 pixels/s. The large computational ranges for the Star and the snake algorithms reflect the time needed to converge under different image motion velocities. The segmentation algorithms locate the contours of the carotid artery and, therefore, their tracking performance is less sensitive to changes in the shape of the cavity. The qualitative results of carotid artery tracking using the different feature tracking methods also show that the segmentation algorithms have better performance than the template matching algorithms. The Star-Kalman algorithm performs the best, as it extracts the carotid artery contours accurately even at very high probe velocities.

Chapter 3

Ultrasound Image Servoing

3.1 Overview

One of the main features of the current system is its ability to visually track features in ultrasound images in real-time. This can help ultrasound technicians in guiding the motion of the ultrasound probe during the examination. The feature tracking algorithms provide the image controller with the required feature coordinates (e.g., the center of the carotid artery) to control the robot. This chapter explains the development of an original ultrasound image controller that compensates for unwanted motions in the plane of the ultrasound beam by controlling three axes of the robot. The notation in this chapter mostly follows that of the visual servoing tutorial paper [92]. All matrices and vectors are shown in bold.

3.2 Ultrasound Image Jacobian

The feasibility of ultrasound image servoing to control three axes of the robot can be determined by examining the ultrasound image Jacobian, which relates differential changes in the image features to differential changes in the configuration of the robot. Figure 3.1 illustrates the concept. Let $F_i$ be a feature point in the ultrasound image plane with coordinates $\mathbf{f}_i = [u_i\ v_i]^T$, coinciding with a point $P$ with coordinates $^e\mathbf{p} = [{}^e p_x\ {}^e p_y\ {}^e p_z]^T$ in the end-effector (probe-attached) frame.

Figure 3.1: Definition of the frames for the ultrasound robot.

The following relation holds:

$$^0\mathbf{p} = {}^0\mathbf{d} + {}^0\mathbf{C}_e\, {}^e\mathbf{p} \qquad (3.1)$$

where $^0\mathbf{p}$ are the coordinates of $P$ in the base frame, $^0\mathbf{d}$ are the end-effector coordinates in the base frame, and the columns of $^0\mathbf{C}_e$ are the end-effector frame vectors in base frame coordinates. Suppose that point $P$ does not move, i.e., $^0\dot{\mathbf{p}} = \mathbf{0}$.
Differentiating (3.1) with respect to time yields:

$$^e\dot{\mathbf{p}} = -{}^e\mathbf{C}_0\, {}^0\dot{\mathbf{d}} + ({}^e\mathbf{p}\times)\, {}^e\mathbf{C}_0\, {}^0\boldsymbol{\omega} \qquad (3.2)$$

where $^0\boldsymbol{\omega}$ is the angular velocity of the end-effector in base frame coordinates. Alternatively:

$$^e\dot{\mathbf{p}} = \begin{bmatrix} -\mathbf{I} & ({}^e\mathbf{p}\times) \end{bmatrix} {}^e\dot{\mathbf{x}} \qquad (3.3)$$

where

$$({}^e\mathbf{p}\times) = \begin{bmatrix} 0 & -{}^e p_z & {}^e p_y \\ {}^e p_z & 0 & -{}^e p_x \\ -{}^e p_y & {}^e p_x & 0 \end{bmatrix} \qquad (3.4)$$

Assuming an orthographic projection model [92] with scale $\alpha$ for the ultrasound image, and assuming that $P$ remains in the image plane, the coordinates of $F_i$ in the two-dimensional ultrasound image become

$$\begin{bmatrix} u_i - u_0 \\ v_i - v_0 \end{bmatrix} = \begin{bmatrix} \alpha\, {}^e p_x \\ \alpha\, {}^e p_z \end{bmatrix} \qquad (3.5)$$

where $(u_0\ v_0)$ are the coordinates of the center of the ultrasound image in the image frame. From (3.3), it follows that:

$$\begin{bmatrix} \dot{u}_i \\ \dot{v}_i \end{bmatrix} = \begin{bmatrix} -\alpha & 0 & 0 & 0 & -(v_i - v_0) & 0 \\ 0 & 0 & -\alpha & 0 & u_i - u_0 & 0 \end{bmatrix} {}^e\dot{\mathbf{x}} = \mathbf{J}_i\, {}^e\dot{\mathbf{x}} \qquad (3.6)$$

where $^e\dot{\mathbf{x}} = [{}^e\mathbf{v}^T\ {}^e\boldsymbol{\omega}^T]^T$ is the end-effector body velocity [125] in end-effector coordinates and $\mathbf{J}_i \in \mathbb{R}^{2\times 6}$ is the ultrasound image Jacobian matrix. If $m$ feature points are considered in the image, similar pairs of rows are added to (3.6):

$$\dot{\mathbf{f}} = \begin{bmatrix} \dot{\mathbf{f}}_1 \\ \vdots \\ \dot{\mathbf{f}}_m \end{bmatrix} = \begin{bmatrix} \mathbf{J}_1 \\ \vdots \\ \mathbf{J}_m \end{bmatrix} {}^e\dot{\mathbf{x}} = \mathbf{J}\, {}^e\dot{\mathbf{x}} \qquad (3.7)$$

where $\dot{\mathbf{f}} \in \mathbb{R}^{2m}$, $\mathbf{J} \in \mathbb{R}^{2m\times 6}$, and $^e\dot{\mathbf{x}} \in \mathbb{R}^6$. The column rank of the resulting Jacobian is at most three. Two or more feature points will generate a Jacobian of rank three. Thus, as expected, with non-trivial ultrasound images, it is possible to control the motion of the feature point in the ultrasound image by controlling the motion of the ultrasound transducer in three degrees of freedom in its image plane.

3.3 Ultrasound Image Jacobian Properties

In order to investigate the ultrasound image Jacobian properties, we will use the following definitions from linear algebra [174]:

1. The null-space of a $k \times l$ matrix $\mathbf{A}$ is the space of all $l$-dimensional vectors that are orthogonal to the rows of $\mathbf{A}$.

2. The range of $\mathbf{A}$ is the space of all $k$-dimensional vectors that are generated by the columns of $\mathbf{A}$.

Thus, $\mathbf{x} \in \mathrm{null}(\mathbf{A})$ iff $\mathbf{A}\mathbf{x} = \mathbf{0}$, and $\mathbf{b} \in \mathrm{range}(\mathbf{A})$ iff $\mathbf{A}\mathbf{x} = \mathbf{b}$ for some $\mathbf{x}$. In this section, $\{\mathbf{n}_{\mathbf{A},i},\ i = 1, \cdots, k\}$ is a set of basis vectors for the null-space of $\mathbf{A}$. From (3.7), it is clear that any $^e\dot{\mathbf{x}}$ that lies in the null-space of $\mathbf{J}$ does not generate any motion in the feature space. Also, since the null-space of $\mathbf{J}^T$ and the range space of $\mathbf{J}$ are orthogonal, any $\dot{\mathbf{f}}$ that lies in the null-space of $\mathbf{J}^T$ does not belong to the range space of $\mathbf{J}$. This section studies the properties of the null-spaces of $\mathbf{J}$ and $\mathbf{J}^T$ for different numbers of features.

3.3.1 Tracking one feature point in the image

In the case of one feature point in the image, $\mathbf{J} = \mathbf{J}_i$ as in (3.6). The range of $\mathbf{J}^T$ has dimension 2, and the null-space of $\mathbf{J}$ is spanned by the following orthonormal vectors:

$$\mathbf{n}_{\mathbf{J},1} = \frac{1}{\sqrt{1 + \frac{(u_i - u_0)^2 + (v_i - v_0)^2}{\alpha^2}}} \begin{bmatrix} -\frac{v_i - v_0}{\alpha} \\ 0 \\ \frac{u_i - u_0}{\alpha} \\ 0 \\ 1 \\ 0 \end{bmatrix}, \quad \mathbf{n}_{\mathbf{J},2} = \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \\ 0 \\ 0 \end{bmatrix}, \quad \mathbf{n}_{\mathbf{J},3} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ 1 \\ 0 \\ 0 \end{bmatrix}, \quad \mathbf{n}_{\mathbf{J},4} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \\ 0 \\ 1 \end{bmatrix} \qquad (3.8)$$

From Figure 3.1, it is clear that any motion of the probe along the x or z axes of the probe frame is equivalent to a motion of the feature point along the x or z axes in the probe frame, respectively, but in the negative direction, i.e., $^e d_x = -{}^e p_x$ and $^e d_z = -{}^e p_z$. Therefore, using (3.5) and $\mathbf{n}_{\mathbf{J},1}$, the locus of the probe positions should satisfy the following equation:

$$^e p_x^2 + {}^e p_z^2 = R^2 \qquad (3.9)$$

where $R$ is a constant. It is clear from $\mathbf{n}_{\mathbf{J},2}$ and the nature of ultrasound imaging that the feature point should always be in the x-z plane. Therefore, any motion of the probe on a sphere around the feature point does not affect the location of the feature in the image (see Figure 3.2). $\mathbf{n}_{\mathbf{J},3}$ and $\mathbf{n}_{\mathbf{J},4}$ show that any rotation of the probe about the x and z axes of the end-effector frame does not result in a change of the feature location in the image plane.
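These null-space claims are straightforward to verify numerically. The snippet below is illustrative only ($\alpha$, the image center and the feature coordinates are arbitrary assumed values); it builds $\mathbf{J}_i$ from (3.6) and checks its rank and the vector $\mathbf{n}_{\mathbf{J},1}$ of (3.8).

```python
import numpy as np

alpha, u0, v0 = 10000.0, 160.0, 120.0   # assumed scale and image center
ui, vi = 200.0, 180.0                   # assumed feature coordinates

# Image Jacobian of a single feature point, equation (3.6).
Ji = np.array([
    [-alpha, 0.0, 0.0,    0.0, -(vi - v0), 0.0],
    [0.0,    0.0, -alpha, 0.0,  ui - u0,   0.0],
])

print(np.linalg.matrix_rank(Ji))        # 2: one feature gives a rank-2 range

# n_J1: rotation about y combined with the compensating in-plane translation.
n1 = np.array([-(vi - v0) / alpha, 0, (ui - u0) / alpha, 0, 1, 0])
print(np.allclose(Ji @ n1, 0))          # True: this motion leaves the feature fixed
```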
Figure 3.2: Tracking one feature in the ultrasound image; the illustrated sphere is the locus of probe motions that lie in the null-space of $\mathbf{J}$.

3.3.2 Tracking two feature points in the image

For two feature points in the image, the image Jacobian matrix can be written as follows:

$$\mathbf{J} = \begin{bmatrix} -\alpha & 0 & 0 & 0 & -(v_1 - v_0) & 0 \\ 0 & 0 & -\alpha & 0 & u_1 - u_0 & 0 \\ -\alpha & 0 & 0 & 0 & -(v_2 - v_0) & 0 \\ 0 & 0 & -\alpha & 0 & u_2 - u_0 & 0 \end{bmatrix} \qquad (3.10)$$

In this case, it can be shown that the basis vectors for the null-space of $\mathbf{J}$ are $\mathbf{n}_{\mathbf{J},2}$, $\mathbf{n}_{\mathbf{J},3}$ and $\mathbf{n}_{\mathbf{J},4}$ in (3.8). The null-space of $\mathbf{J}^T$ is spanned by

$$\mathbf{n}_{\mathbf{J}^T,1} = \frac{1}{\sqrt{2\left[(u_1 - u_2)^2 + (v_1 - v_2)^2\right]}} \begin{bmatrix} -(u_1 - u_2) \\ -(v_1 - v_2) \\ u_1 - u_2 \\ v_1 - v_2 \end{bmatrix} \qquad (3.11)$$

Therefore, any

$$\dot{\mathbf{f}} = \begin{bmatrix} \dot{u}_1 \\ \dot{v}_1 \\ \dot{u}_2 \\ \dot{v}_2 \end{bmatrix} = \lambda\, \mathbf{n}_{\mathbf{J}^T,1} = \frac{\lambda}{\sqrt{2\left[(u_1 - u_2)^2 + (v_1 - v_2)^2\right]}} \begin{bmatrix} -(u_1 - u_2) \\ -(v_1 - v_2) \\ u_1 - u_2 \\ v_1 - v_2 \end{bmatrix} \qquad (3.12)$$

lies in the null-space of $\mathbf{J}^T$. From (3.7) and (3.12), if the two features move according to $\dot{u}_1 = -\dot{u}_2$ and $\dot{v}_1 = -\dot{v}_2$, i.e., towards each other along the line that passes through them in the image, the motion cannot be compensated by any motion of the end-effector. Figure 3.3 shows the feature point motions that lie in the null-space of $\mathbf{J}^T$.

3.3.3 Tracking more than two features

It can be shown that by tracking more than two features in the ultrasound image, the null-space of $\mathbf{J}$ remains the same as for two feature points ($\mathrm{rank}(\mathbf{J})$ is at most 3). However, the dimension of the null-space of $\mathbf{J}^T$ increases, which means that there are more feature point motions that cannot be compensated by any motion of the end-effector. Since the analysis in Section 3.3.2 holds for any two feature points in the image, any feature motion that changes the distance between any two feature points in the image lies in the null-space of $\mathbf{J}^T$.

Figure 3.3: Feature point motions that lie in the null-space of $\mathbf{J}^T$ in the case of two feature points.

3.4 The Visual Controller

Figure 3.4: Ultrasound image controller.

Consider first the simple proportional controller shown in Figure 3.4:

$$^e\dot{\mathbf{x}}_d = -\mathbf{J}^{\dagger}\left[ \mathbf{K}_d\,(\mathbf{f} - \mathbf{f}_r) - \dot{\mathbf{f}}_r \right] \qquad (3.13)$$

where $\mathbf{f}$ is defined in (3.7), $\mathbf{f}_r \in \mathbb{R}^{2m}$ is the desired location vector, $\mathbf{J}^{\dagger}$ is the pseudo-inverse of $\mathbf{J}$, $\mathbf{K}_d = k_d\, \mathbf{I}_{2m\times 2m}$ is the controller gain, and $^e\dot{\mathbf{x}}_d$ is the commanded robot velocity.

3.4.1 Stability Analysis

Two cases will be considered in the visual servoing stability analysis, depending on the number of features tracked in the image. In the following analysis, it is assumed that the dynamics of the robot control loop are negligible relative to the image control loop, i.e.,

$$^e\dot{\mathbf{x}} = {}^e\dot{\mathbf{x}}_d \qquad (3.14)$$

3.4.1.1 Tracking One Feature in the Ultrasound Image

Let $^e\dot{\mathbf{x}}$ be the minimum norm solution of (3.7) for $m = 1$, i.e., $^e\dot{\mathbf{x}} = \mathbf{J}^{\dagger}\dot{\mathbf{f}}$, with $\mathbf{J}^{\dagger}$ given by:

$$\mathbf{J}^{\dagger} = \mathbf{J}^T\,(\mathbf{J}\mathbf{J}^T)^{-1} \qquad (3.15)$$

If there is no delay in the system, then using (3.7), (3.13), (3.14) and (3.15), the following equation is derived:

$$\dot{\mathbf{f}} = \mathbf{J}\,{}^e\dot{\mathbf{x}} \cong \mathbf{J}\,{}^e\dot{\mathbf{x}}_d = -\mathbf{J}\mathbf{J}^{\dagger}\left[ \mathbf{K}_d\,(\mathbf{f} - \mathbf{f}_r) - \dot{\mathbf{f}}_r \right] \qquad (3.16)$$

or

$$(\dot{\mathbf{f}} - \dot{\mathbf{f}}_r) + k_d\,(\mathbf{f} - \mathbf{f}_r) = \mathbf{0} \qquad (3.17)$$

This guarantees that the tracking error converges to zero exponentially, provided that $k_d > 0$. When the dynamics of the robot are not negligible relative to the image controller loop, or when there is a delay in the system, the above analysis is no longer valid.
For example, assume that

$$^e\dot{\mathbf{x}}(t) = {}^e\dot{\mathbf{x}}_d(t - \tau_d) \qquad (3.18)$$

where $\tau_d$ is the delay in the robot controller. Using (3.7), (3.13), (3.15) and (3.18), the following equation is derived:

$$\dot{\mathbf{f}}(t) = \mathbf{J}\,{}^e\dot{\mathbf{x}}(t) = \mathbf{J}\,{}^e\dot{\mathbf{x}}_d(t - \tau_d) = -\mathbf{J}\mathbf{J}^{\dagger}\left[ k_d\,\mathbf{I}\left( \mathbf{f}(t - \tau_d) - \mathbf{f}_r(t - \tau_d) \right) - \dot{\mathbf{f}}_r(t - \tau_d) \right] \qquad (3.19)$$

or

$$\dot{\mathbf{f}}(t) + k_d\,\mathbf{f}(t - \tau_d) = \dot{\mathbf{f}}_r(t - \tau_d) + k_d\,\mathbf{f}_r(t - \tau_d) \qquad (3.20)$$

The characteristic equation of (3.20) is:

$$s + k_d\, e^{-\tau_d s} = 0 \qquad (3.21)$$

Using the Nyquist stability criterion, it can be shown that the system remains stable as long as $0 < k_d < \frac{\pi}{2\tau_d}$.

3.4.1.2 Tracking Two or More Features in the Ultrasound Image

In this case, we assume that any feature vector in the image plane can be written as follows:

$$\mathbf{f} = \mathbf{f}^c + \mathbf{f}^n \qquad (3.22)$$

where $\mathbf{f}^c$ and $\mathbf{f}^n$ lie in the range space of $\mathbf{J}$ and the null-space of $\mathbf{J}^T$, respectively. It can be shown that:

$$\mathbf{J}\mathbf{J}^{\dagger} = \mathbf{I} - \sum_{i=4}^{2m} \mathbf{n}_{\mathbf{J}^T,i}\, \mathbf{n}_{\mathbf{J}^T,i}^{T} \qquad (3.23)$$

where the $\mathbf{n}_{\mathbf{J}^T,i}$'s are orthogonal vectors that span the null-space of $\mathbf{J}^T$. Using (3.13), (3.15) and (3.23), we have:

$$(\dot{\mathbf{f}}^c - \dot{\mathbf{f}}_r^c) + \mathbf{K}_d\,(\mathbf{f}^c - \mathbf{f}_r^c) + \dot{\mathbf{f}}^n = \mathbf{0} \qquad (3.24)$$

Since $(\dot{\mathbf{f}}^c - \dot{\mathbf{f}}_r^c)$ and $\dot{\mathbf{f}}^n$ belong to two orthogonal spaces, we have

$$(\dot{\mathbf{f}}^c - \dot{\mathbf{f}}_r^c) + k_d\,\mathbf{I}\,(\mathbf{f}^c - \mathbf{f}_r^c) = \mathbf{0}, \qquad \dot{\mathbf{f}}^n = \mathbf{0} \qquad (3.25)$$

which guarantees that image feature servoing,

$$\mathbf{f}^c \rightarrow \mathbf{f}_r^c \qquad (3.26)$$

is achieved when $k_d > 0$. From (3.25), $\mathbf{f}^n$ is constant, which means that certain desired feature locations cannot be achieved by the image servo controller. It can be shown that the final location of the feature points depends on their initial and desired locations. From Section 3.3.2, the distance between every two feature points in the image is preserved while the features are moved in the ultrasound image. A similar approach to the previous section can be used to analyze the stability when there is a delay in the system.
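As a summary of the loop, the following sketch evaluates the control law (3.13) once per image frame. It is an illustration rather than the thesis controller; the Jacobian and the feature vectors are assumed to come from the calibration and tracking described above.

```python
import numpy as np

def image_servo_command(J, f, f_r, f_r_prev, dt, kd=1.0):
    """Proportional image servo law (3.13): map the feature error to a
    commanded end-effector velocity through the Jacobian pseudo-inverse."""
    f_r_dot = (f_r - f_r_prev) / dt    # backward Euler estimate of the reference rate
    J_pinv = np.linalg.pinv(J)         # equals J^T (J J^T)^(-1) when J has full row rank
    return -J_pinv @ (kd * (f - f_r) - f_r_dot)
```

Note that for $m \geq 2$ features, $\mathbf{J}\mathbf{J}^T$ is singular (rank 3), so the Moore-Penrose pseudo-inverse used here coincides with (3.15) when $\mathbf{J}$ has full row rank and remains well defined otherwise.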
3.4.2 Experimental Results

Ultrasound images are captured at 30 frames/s through the user interface and are processed at the same rate to control up to three axes of the robot while tracking one or two features (e.g., the centers of the pipes in Figure 2.2) in the ultrasound image. A backward Euler method has been used to calculate $\dot{\mathbf{f}}_r$ in (3.13). The robot and image controller loops run at 500 Hz and 30 Hz, respectively, with $\mathbf{K}_d = \mathbf{I}_{4\times 4}$ and $\alpha = 10000$ pixels/m. The force control is disabled in these experiments. Initially, the operator moves the robot and pushes the probe against the elastic surface of the phantom by using the joystick commands. During these experiments, the range of motion of the probe is small enough to maintain contact between the probe and the phantom, because of the phantom's compliance. With the coordinate system illustrated in Figure 3.1, the controlled axes are the translations along the x-axis and z-axis, and the rotation about the y-axis. Figure 3.5 shows the ultrasound image servoing performance for one of the axes. For this experiment, a feature (the center of one of the pipes in the phantom ultrasound image) was selected before enabling the visual servoing. While the operator moves the probe along the y-axis, the feature position is maintained in the center of the image automatically. The Star-Kalman algorithm is used to extract the feature from the ultrasound image. The performance of the system while tracking two features (the centers of two pipes in the phantom ultrasound image) simultaneously is shown in Figure 3.6. Three degrees of freedom of the robot are controlled by the image servo controller in this experiment. The two features are moved away from their reference points at t = 0.7 s by moving the robot along the x-axis, and are moved back by the image servoing action.

Figure 3.5: Experimental results for image servoing in a single axis. The position of one feature is maintained in the center of the image (bottom plot) while the robot is displaced along the y-axis (top plot).

Figure 3.6: Experimental results for image servoing in three axes; the positions of the two features along the u-axis are changed by 30 pixels.

We report ultrasound image servoing results using the phantom because the results are quantifiable and repeatable. The ultrasound image servoing has been tested in tracking the carotid artery by a sonographer and a number of volunteers and visitors in our laboratory, and was found to work very well. The results of our human factors study in Chapter 6 show that ultrasound visual servoing helps to reduce operator fatigue during the ultrasound examination. Also, the error in positioning the feature during the robot-assisted ultrasound examination is reduced significantly.

3.5 Adaptive Ultrasound Image Servoing

The performance of the ultrasound image servoing relates directly to the accuracy of the ultrasound image Jacobian. In the previous section, it was assumed that a model of the ultrasound image projection is known. This assumption may no longer be valid when the operator changes the image parameters, such as the magnification of the ultrasound image. Furthermore, the scale factor $\alpha$ in the ultrasound image Jacobian, which should be determined by a calibration procedure, may be different for different axes of the ultrasound image. In order to compensate for these factors, the approach in [138] is adopted to update the ultrasound image Jacobian. The following definitions are used here:

$$\Delta t_k = t_k - t_{k-1} \qquad (3.27)$$
$$\mathbf{f}_k = \mathbf{f}\,|_{t=t_k} \qquad (3.28)$$
$$\dot{\mathbf{f}}_k = \dot{\mathbf{f}}\,|_{t=t_k} \qquad (3.29)$$
$$\mathbf{f}_{r_k} = \mathbf{f}_r\,|_{t=t_k} \qquad (3.30)$$
$$\dot{\mathbf{f}}_{r_k} = \dot{\mathbf{f}}_r\,|_{t=t_k} \qquad (3.31)$$
$$^e\mathbf{x}_k = {}^e\mathbf{x}\,|_{t=t_k} \qquad (3.32)$$
$$\Delta^e\mathbf{x}_k = {}^e\mathbf{x}_k - {}^e\mathbf{x}_{k-1} \qquad (3.33)$$
$$\mathbf{f}_{r_{t_k}} = \frac{\partial \mathbf{f}_r}{\partial t}({}^e\mathbf{x}_k, t_k) \qquad (3.34)$$
$$\Delta\mathbf{J}_k = \hat{\mathbf{J}}_k - \hat{\mathbf{J}}_{k-1} \qquad (3.35)$$

assuming that $\mathbf{f}$ depends only on $^e\mathbf{x}$, and that $\hat{\mathbf{J}}_k$ is the $k$th approximation to the ultrasound image Jacobian. By using a similar approach to [138], it can be shown that the following dynamic Broyden update

$$\Delta\mathbf{J}_k = \frac{\left[ (\mathbf{f}_k - \mathbf{f}_{r_k}) - (\mathbf{f}_{k-1} - \mathbf{f}_{r_{k-1}}) - \hat{\mathbf{J}}_{k-1}\, \Delta^e\mathbf{x}_k + \mathbf{f}_{r_{t_k}}\, \Delta t_k \right] \Delta^e\mathbf{x}_k^T}{\Delta^e\mathbf{x}_k^T\, \Delta^e\mathbf{x}_k} \qquad (3.36)$$

minimizes the Frobenius norm

$$\|\Delta\mathbf{J}_k\|_F = \sqrt{\sum_{i,j} \left( (\Delta\mathbf{J}_k)_{ij} \right)^2} \qquad (3.37)$$

subject to the constraint equation:

$$(\hat{\mathbf{J}}_k + \Delta\mathbf{J}_k)\, {}^e\dot{\mathbf{x}}_k = \dot{\mathbf{f}}_k \qquad (3.38)$$

where $(\Delta\mathbf{J})_{ij}$ is an element of $\Delta\mathbf{J}$. The convergence of this algorithm is discussed in [139]. We use the following first-order approximations to simplify (3.36) in our implementation:

$$\dot{\mathbf{f}}_k \approx \frac{\mathbf{f}_k - \mathbf{f}_{k-1}}{\Delta t_k} \qquad (3.39)$$
$$\mathbf{f}_{r_{t_k}} \approx \dot{\mathbf{f}}_{r_k} \approx \frac{\mathbf{f}_{r_k} - \mathbf{f}_{r_{k-1}}}{\Delta t_k} \qquad (3.40)$$
$$^e\dot{\mathbf{x}}_k \approx \frac{\Delta^e\mathbf{x}_k}{\Delta t_k} \qquad (3.41)$$

Dividing the numerator and denominator of (3.36) by $\Delta t_k^2$, we obtain the following simplified formula:

$$\Delta\mathbf{J}_k = \frac{\left( \dot{\mathbf{f}}_k - \hat{\mathbf{J}}_{k-1}\, {}^e\dot{\mathbf{x}}_k \right) {}^e\dot{\mathbf{x}}_k^T}{{}^e\dot{\mathbf{x}}_k^T\, {}^e\dot{\mathbf{x}}_k} \qquad (3.42)$$

The result shows that, to first order, the change in $\hat{\mathbf{J}}_k$ is not related to the target motion in time. The updated image Jacobian is used in the image controller (3.13) at every time step. Experiments show a satisfactory performance of the adaptive ultrasound image servoing method.
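The simplified update (3.42) is a rank-one correction to the Jacobian estimate. The sketch below is my own illustration, not the thesis code; the small-motion guard is an added assumption to avoid dividing by a near-zero denominator when the probe is stationary.

```python
import numpy as np

def broyden_update(J, f_dot, x_dot, eps=1e-9):
    """Rank-one dynamic Broyden update of the image Jacobian, (3.42):
    J <- J + (f_dot - J x_dot) x_dot^T / (x_dot^T x_dot)."""
    denom = float(x_dot @ x_dot)
    if denom < eps:                  # skip the update when the probe barely moved
        return J
    residual = f_dot - J @ x_dot     # feature motion unexplained by the current J
    return J + np.outer(residual, x_dot) / denom
```

The returned estimate then simply replaces $\mathbf{J}$ in (3.13) at the next time step, as stated above.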
In all of the experiments, the reference target moves along a line with equation $v = -(u - u_{i0}) + v_{i0}$ in the image, where $(u_{i0}\ v_{i0})^T$ are the initial coordinates of the feature. The sequential similarity detection algorithm is used to track the feature in the ultrasound image. Figure 3.7 compares the performance of the adaptive and conventional ultrasound image servo controllers. The initial condition for the adaptive ultrasound image Jacobian estimator, $J_0$, is chosen equal to $J$. Convergence of $J_k$ is achieved after 5 s. The standard deviations of the errors are $\sigma_x = 3.88$ and $\sigma_y = 4.35$ for the adaptive method and $\sigma_x = 7.52$ and $\sigma_y = 6.28$ for the conventional method. However, the performance of the conventional method can be improved by increasing the controller gain. Figure 3.7 also demonstrates the performance of the conventional method with $k_d = 20$. The standard deviations of the errors are $\sigma_x = 3.93$ and $\sigma_y = 3.74$, which shows a performance similar to that of the adaptive controller.

Figure 3.7: Experimental results for adaptive and conventional ultrasound image servo controllers. $k_d = 5$ for the adaptive controller, and $k_d = 5$ and $k_d = 20$ for the conventional controller. In each figure, the tracking results for the x-axis (top) and z-axis (bottom) are displayed.

Figure 3.8: Experimental results for adaptive and conventional ultrasound image servo controllers with $k_d = 5$ and $k_d = 20$, respectively. Ultrasound images are scaled down to 65% of their original size. In each figure, the tracking results for the x-axis (top) and z-axis (bottom) are displayed.

Although increasing the controller gain reduces the effect of calibration errors in calculating the ultrasound image Jacobian, it also increases the possibility of instability in the system. Furthermore, if the projection model in Section 3.2 is not accurate, increasing the gain does not enhance the performance of the image controller significantly. In order to see the effect of the accuracy of the image projection model on the performance of the ultrasound image servo controllers, ultrasound images were scaled down to 65% of their original size. Figure 3.8 compares the performance of the two controllers. In this experiment, the controller gains for the adaptive and the conventional image controllers are $k_d = 5$ and $k_d = 20$, respectively. The standard deviations of the errors are $\sigma_x = 4.65$ and $\sigma_y = 4.42$ for the adaptive method and $\sigma_x = 9.04$ and $\sigma_y = 6.17$ for the conventional method. This shows that the adaptive image controller achieves better performance with a lower gain when the image parameters are changed.

3.6 Discussion

The concept of ultrasound image servoing was developed in this chapter. It was shown that the rank of the ultrasound image Jacobian is at most three. An ultrasound image controller was proposed and its stability analysis was presented. The experimental results show that it is possible to control the motion of the ultrasound transducer in its image plane in three degrees of freedom to compensate for unwanted feature motions in the image. The ultrasound image Jacobian may not be accurate because of calibration errors or inaccuracies in the image projection model. An adaptive image controller was developed to improve the performance of the image controller.
When the image projection model is almost accurate, it was shown that increasing the gain in the conventional ultrasound image controller provides results similar to those of the adaptive image controller. However, the adaptive image controller enhances the performance significantly when the image parameters (such as the image size) change.

Chapter 4
Ultrasound Robot User Interface

4.1 Overview

One of the main goals of the current robot-assisted diagnostic ultrasound system is to provide an easy and intuitive user interface for sonographers to perform ultrasound examinations. The user interface should enable the operator to interact with the robot and to move the ultrasound transducer in a fashion similar to the conventional ultrasound examination. An ideal user interface should display the ultrasound images to the operators, while providing them with control options for the robotic system, such as the force level that the operator wants to apply to the patient. In addition, the interface should be intuitive enough that the sonographer does not feel any difference in controlling the probe orientation and position while performing the ultrasound examination. Therefore, the joystick that is used to control the probe motions should be very similar to the probe itself, with the same motion ranges.

The current user interface consists of a joystick and a graphical user interface (GUI). A Logitech 6-axis SpaceMouse [50] was initially considered as the joystick for the system. However, because all the axes are coupled in this joystick, it was found to be difficult for operators to use. A novel joystick was designed in our laboratory (courtesy of Simon Bachmann) that eliminates this problem. The integration issues of this joystick with our robotic interface are presented here. The graphical user interface enables the operator to activate different features of the system, such as the image servo control, while showing the ultrasound image and the force information in real-time. The user interface uses several semaphores and a multi-process program to run all the functions at the same time with proper timing.

4.2 Joystick

Two different joysticks are used to apply commands to the robot. These joysticks are positioned beside the ultrasound machine as shown in Figure 4.1.

Figure 4.1: System setup; the joysticks are positioned beside the ultrasound machine panel: 1) JR3 force-torque sensor, 2) potentiometers, 3) thumb lever, 4) spherical wrist joystick handle, 5) SpaceMouse (courtesy of Simon Bachmann).

4.2.1 SpaceMouse

Initially, a Logitech 6-axis SpaceMouse was used to receive user commands to move the robot. Figure 4.2 shows a photograph of the mouse. By displacing the SpaceMouse movable handle in any direction, a proportional velocity command is generated in the same direction and is sent to the robot. The range of the displacement is very limited. The programmable push buttons can be used to change the command gain applied to the handle displacement, as well as to activate/deactivate different translation and rotation axes. The details of this design are discussed in Section 4.2.3. The SpaceMouse joystick was initially used in the design of the system.

Figure 4.2: Hand controller: Magellan/SpaceMouse; 1) movable handle, 2) base, 3) programmable push buttons, 4) translation and rotation axes.
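The proportional velocity mapping described above can be sketched, for one axis, as follows; the dead-zone radius and gain values are illustrative assumptions rather than the actual device settings.

```python
def spacemouse_velocity(displacement, gain=1.0, dead_zone=0.05):
    """Proportional velocity command from a SpaceMouse handle displacement.

    Displacements inside the dead zone give zero output; otherwise the
    velocity command is proportional to the displacement. The gain is the
    quantity scaled up or down by the programmable push buttons.
    """
    if abs(displacement) < dead_zone:
        return 0.0
    return gain * displacement
```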
The system was tested by several graduate students in our laboratory, as well as a sonographer from the Vancouver General Hospital. Because of the design of the joystick, it was difficult for the operators to perform decoupled motions along different axes. In addition, in the conventional ultrasound examination, a sonographer mentally combines the orientation of the ultrasound probe in her hand with the ultrasound images in order to map the direction of the probe motion to the image. Because the SpaceMouse joystick controls the robot in velocity mode, there is no mapping between the orientation of the ultrasound probe and the orientation of the joystick. Therefore, it was found to be difficult for the operators to intuitively use their experience from the conventional ultrasound examination and apply it to the robot-assisted one. A novel joystick was designed in our laboratory to overcome these problems.

4.2.2 The Spherical Wrist Joystick

An interface that is used by sonographers should enable the operators to move the ultrasound probe in a fashion similar to the conventional ultrasound examination. This means that a desirable joystick design should provide the operator with a one-to-one orientation mapping to the ultrasound probe. Since the range of probe orientations used by sonographers is very large, large orientation motions should be possible. In order to make the control of the ultrasound probe more intuitive for the operator, a novel joystick has been designed by Simon Bachmann, a research engineer in our laboratory. Figure 4.3 shows a photograph of this design. The wrist uses adjustable friction in each joint to passively compensate for the weight of the handle and keep its orientation. Potentiometers are installed in each joint to detect the rotation angles. A JR3 force-torque sensor is used to generate translational velocity commands to the robot along the axes along which the operator applies forces to the handle. Any handle force of more than 1.5 N in any direction sends a constant velocity command to the robot. This threshold is necessary, because any orientation change in the handle might unintentionally generate forces along each translational axis. A hysteresis band of 0.5 N is used to generate a smoother velocity command. Figure 4.4 shows the concept: when the operator's applied force along any axis exceeds 1.5 N, a velocity command is sent to the robot, and this continues until the force level drops below 1 N.

Figure 4.3: Hand controller: spherical wrist (courtesy of Simon Bachmann).

Figure 4.4: Hysteresis band that is used to apply the velocity command to the robot.

During an ultrasound examination, continuous changes in the probe velocities are used by the operators to position the probe. Although the ability to adjust the sensitivity of the probe motions to the operator commands has already been built into the graphical user interface, it was found to be difficult for the operators to continuously switch between the joystick and the computer mouse to perform this adjustment. A thumb lever is installed in the handle of the joystick to overcome this problem. This lever is used by the operator to adjust the velocity command gain that is applied to the robot during the ultrasound examination. Figure 4.5 shows the location of the lever with respect to the joystick handle.

Figure 4.5: The location of the thumb lever with respect to the joystick handle; the operator can easily adjust the lever to get different velocity gains during the operation.
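The hysteresis logic of Figure 4.4 can be sketched as follows for a single translational axis; the constant speed value is an illustrative assumption, and `moving` is the on/off state carried between control steps.

```python
def hysteresis_velocity(force, moving, v_const=0.01,
                        on_thresh=1.5, off_thresh=1.0):
    """Velocity command with the 1.5 N / 1.0 N hysteresis band of Figure 4.4.

    The command switches on when the handle force magnitude exceeds 1.5 N
    and switches off only when it drops below 1.0 N, which removes
    chattering near the threshold. Returns the command and the new state.
    """
    magnitude = abs(force)
    if not moving and magnitude > on_thresh:
        moving = True
    elif moving and magnitude < off_thresh:
        moving = False
    velocity = 0.0
    if moving:
        velocity = v_const if force >= 0.0 else -v_const
    return velocity, moving
```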
In every time step, the rotation matrix $^{h}R_e$, which is the rotation matrix of the probe frame with respect to the handle frame, is calculated as follows:

$^{h}R_e = {}^{h}R_o\,{}^{o}R_e, \quad (4.1)$

where frame O is the base frame. $^{h}R_o$ is calculated from the angles received from the potentiometers at each joint of the joystick as follows:

$^{h}R_o = \left(R_z(\theta_3)\,R_y(\theta_2)\,R_x(\theta_1)\right)^T, \quad (4.2)$

where $R_x$, $R_y$ and $R_z$ are the rotation matrices around the x, y and z axes of frame H with respect to the base frame, respectively, and $\theta_1$, $\theta_2$ and $\theta_3$ are the potentiometer angles of the joystick for the x, y and z axes, respectively. The following control signal is applied to the robot to compensate for any error between the two frames H and E:

$u = \theta\,K, \quad (4.3)$

where $r_{ij}$ is the $(i,j)$ element of $^{h}R_e$,

$\theta = \arccos\left(\frac{r_{11} + r_{22} + r_{33} - 1}{2}\right) \quad (4.4)$

and

$K = \frac{1}{\sqrt{(r_{32}-r_{23})^2 + (r_{13}-r_{31})^2 + (r_{21}-r_{12})^2}}\begin{bmatrix} r_{32}-r_{23} \\ r_{13}-r_{31} \\ r_{21}-r_{12} \end{bmatrix} \quad (4.5)$

is the unit vector along the equivalent axis of finite rotation between the two frames H and E [42]. This means that by rotating frame H about the vector K by an angle $\theta$ according to the right-hand rule, we get to frame E.

The design of the joystick enables the operator to make large orientation motions with the handle. However, considering the size of the robot, it is not desirable to apply the same rate of change in the orientation of the handle to the robot. Therefore, changes in $\theta_i$, i = 1, 2, 3, are limited by using the following procedure:

1. At every time step k, calculate

$\frac{\Delta\theta_{i,k}}{T} = \frac{\theta_{i,k} - \theta_{i,k-1}}{T}, \quad (4.6)$

where $\theta_{i,k}$ is the value of $\theta_i$ at time step k, and T is the control loop sampling period.

2. Defining $\Delta\theta_{\mathrm{limit}}$ as the desired limit on the change in $\theta_{i,k}$: if $\left|\frac{\Delta\theta_{i,k}}{T}\right| > \frac{\Delta\theta_{\mathrm{limit}}}{T}$, set $\frac{\Delta\theta_{i,k}}{T} = \mathrm{sign}(\Delta\theta_{i,k})\,\frac{\Delta\theta_{\mathrm{limit}}}{T}$.

3. Update $\theta_{i,k}$ as

$\theta_{i,k} = \theta_{i,k-1} + \left(\frac{\Delta\theta_{i,k}}{T}\right) T. \quad (4.7)$

4.2.3 SpaceMouse Integration Details

This section explains the details of the application of the SpaceMouse in our system. As mentioned before, the main problem with the design of the SpaceMouse is that its axes are coupled. This means that it is very hard to activate only one axis without activating the other ones. The coupling effect could be reduced by increasing the zero radius (the radius within which any displacement gives zero output) of the SpaceMouse along different axes. This could be done by sending commands to the mouse through the serial port. However, this also means that in order to generate any desired motion along each axis, the operator has to apply more force to generate the same velocity command. In our designed system, the following method is used to reduce the coupling effect. In order to decouple the rotation commands of the SpaceMouse from its displacement commands, the displacement outputs of the SpaceMouse are modified as follows:

$C_i = \frac{X_i}{1 + \alpha\,r}, \quad i = 1, 2, 3, \quad (4.8)$

$r = \left\|\begin{pmatrix} X_4 & X_5 & X_6 \end{pmatrix}\right\|, \quad (4.9)$

where $\alpha > 0$ is the attenuation gain, the $X_i$'s are the displacement commands from the SpaceMouse, the $C_i$'s are the modified displacement commands, and $X_4$, $X_5$ and $X_6$ are the rotation commands from the SpaceMouse. Each rotation command was normalized with respect to the other rotation commands in order to decouple the rotations from each other:

$C_4 = \frac{X_4}{1 + \beta\,r_A}, \quad (4.10)$

$C_5 = \frac{X_5}{1 + \beta\,r_B}, \quad (4.11)$

$C_6 = \frac{X_6}{1 + \beta\,r_C}, \quad (4.12)$

where $r_A$, $r_B$ and $r_C$ measure the magnitudes of the rotation commands other than $X_4$, $X_5$ and $X_6$, respectively, $C_4$, $C_5$ and $C_6$ are the modified rotation commands, and $\beta > 0$ is the attenuation gain.
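A sketch of the decoupling in (4.8)-(4.12) follows; the exact measures $r$, $r_A$, $r_B$ and $r_C$ are assumed here to be Euclidean norms of the competing commands, which is one plausible choice, and the gains are illustrative.

```python
import numpy as np

def decouple_spacemouse(X, alpha=0.5, beta=0.5):
    """Decouple raw SpaceMouse commands in the spirit of (4.8)-(4.12).

    X = [X1..X6]: three displacement and three rotation commands.
    Translations are attenuated by the overall rotation activity; each
    rotation is attenuated by the magnitude of the other two rotations.
    alpha and beta are the attenuation gains.
    """
    X = np.asarray(X, dtype=float)
    C = np.empty(6)
    r = np.linalg.norm(X[3:6])                  # assumed rotation-activity measure
    C[0:3] = X[0:3] / (1.0 + alpha * r)         # eq. (4.8)
    for i in range(3, 6):
        other = np.linalg.norm(np.delete(X[3:6], i - 3))
        C[i] = X[i] / (1.0 + beta * other)      # eqs. (4.10)-(4.12)
    return C
```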
All the translations are chosen to be in the robot base frame. The yaw of the probe is controlled by $X_6$. $X_4$ and $X_5$ rotate the robot in the joint space rather than the probe frame. This was found to be less confusing and easier to handle for the operator, as there is no need for a transformation between the probe frame and the operator frame for different orientations of the probe during the ultrasound examination. The experimental results in our laboratory with different operators have validated this approach. For the ease of the operator, the keypad of the SpaceMouse is used as a set of shortcut keys to control all major functionalities of the graphical user interface.

The results were very satisfactory. While operators had serious problems controlling the probe motions by using the velocity commands originally generated by the SpaceMouse, the modified velocity commands enabled the operators to easily move the probe along any desired axis, while reducing the command gain along the non-desired ones.

4.3 Graphical User Interface

A graphical user interface (GUI) was written using the GTK+ library under the Linux operating system. Figure 4.6 shows a screen shot of the GUI. The features of the GUI are explained here.

Figure 4.6: Graphical user interface.

4.3.1 Ultrasound Image Display

Ultrasound images are captured at a rate of 30 frames/s by using a Matrix Vision Mv-delta frame grabber. These images are stored in memory for image processing purposes and at the same time are displayed to the operator at a 640 x 480 pixel resolution.

4.3.2 Start/Stop Button

The operator can enable and disable the robot by using this button. The button color is red when the robot is enabled and green otherwise.

4.3.3 Force Control

The operator can select the target level of the force applied to the patient by adjusting the force setting vertical bar. By enabling the Force button, the robot pushes the probe against the patient's body with the force level shown in the Force Display box.

4.3.4 Probe Sensitivity Setting

When using the SpaceMouse as the input command device for the robot, the sensitivity of the probe motion to the joystick commands can be adjusted by using the Robot Sensitivity horizontal bar. The sensitivity of the probe motions to the translational commands from the spherical wrist is adjusted by using the thumb lever on the joystick and is displayed to the operator through the GUI in real-time. This is performed by automatically adjusting and displaying the location of the sensitivity bar with respect to the location of the thumb lever.

4.3.5 Axis Control

All six axes of the robot can be enabled/disabled by activating/deactivating the Axis Control buttons. The rotation control buttons can only be deactivated when the SpaceMouse joystick is used.

4.3.6 3-D Graphical Probe

For the ease of the operator, a 3-D rendered model of the ultrasound probe, which displays the orientation of the transducer in real-time, is incorporated within the GUI. An accurate model of the ultrasound probe was generated in SolidWorks (courtesy of Simon DiMaio) and was converted to OpenGL code for this purpose. The OpenGL graphics library routines were combined with the GTK+ libraries in order to display the graphics in the user interface.
A horizontal and a vertical surface with chessboard textures are incorporated in the environment in order to compensate for the lack of depth perception when viewing a three-dimensional environment on a flat monitor screen. The graphical representation includes shadows of the ultrasound probe that are vertically and horizontally projected onto the floor and the wall, respectively. The shadows help generate a sense of depth in the environment. In the current implementation, only the probe orientation is displayed in the graphical display. This is due to the lack of accurate information about the relative position of the patient's body and the ultrasound probe with respect to each other. Future work may include the use of a camera to model and display the patient.

4.3.7 Image Control

By centering an image region (e.g., the carotid artery) in the ultrasound image, the robot can help the operator to guide the probe during an ultrasound examination. This can be enabled by selecting the desired feature in the ultrasound image with the computer mouse pointer. A white cross in the image shows the location of the feature in the image in real-time. When this button is enabled, a red square appears at the bottom-right corner of the image.

4.3.8 Real-Time Contour Extraction

The Contour button enables the operator to observe the automatically identified image region contours in the ultrasound image in real-time. In the current implementation, the Star-Kalman contour extraction method is used to extract the carotid artery contours.

4.4 Conclusions and Recommendations

This chapter discussed the different features of the robot-assisted ultrasound system user interface. Two different joysticks, namely the SpaceMouse and a spherical wrist joystick, were used to control the probe motions. It was found that the operators had serious problems in controlling the robot motion because of the axis coupling in the SpaceMouse. A method was presented that reduced this coupling effect. Although this method improved the operators' performance significantly, the robot was still controlled in velocity mode along all the axes, so it was impossible for the operators to determine the orientation of the probe without constantly looking at the patient. This was not an intuitive design for the sonographers, since during conventional ultrasound examinations they mostly look at the ultrasound image and maintain a mental mapping between the probe in their hand and the image.

The novel joystick that was designed in our laboratory solved the orientation control problem. Since the operator is able to hold the joystick handle in a way similar to how she holds the ultrasound transducer in conventional ultrasound examinations, it was easy for the operators to map the orientation of the handle to the images. However, the design of the joystick was such that any orientation command to the handle could potentially generate unwanted translational motions along different axes. We tried to solve this problem by adding a force threshold level for the translational axes. However, this also meant that operators had to generate some unnecessary force to apply translational commands to the robot. Future developments of this spherical wrist joystick could include finding means to solve this problem.

The design of the graphical user interface was also discussed in this chapter.
The GUI shows the ultrasound images to the operators in real-time, while enabling them to interact with the ultrasound robot through different control buttons. It was found that some of these features, such as the 3-D probe model, were not necessarily used by the operators to control the ultrasound probe motion. However, both force control and image control were useful features and enhanced the operator's performance significantly. The results of a human factors study that discusses the usefulness of each of these features are presented in Chapter 6.

Chapter 5
Practical Applications

5.1 Overview

In addition to its ergonomic benefits, the image-guided robot-assisted diagnostic ultrasound examination system has other potential applications in teleradiology and image-guided interventions. The feature extraction algorithms could also be used to extract anatomical maps in real-time, which would help ultrasound technicians to guide the probe during an ultrasound examination. This chapter presents three applications of the ultrasound robot, namely 3-D ultrasound imaging, tele-ultrasound and feature-based probe position control.

5.2 3-D Ultrasound Imaging

Since the location of the ultrasound transducer can be determined via the forward kinematics of the slave manipulator, three-dimensional ultrasound images can be reconstructed from a series of two-dimensional image slices. While this can be achieved with any position/orientation tracker of the ultrasound probe, the feature detection algorithms presented in Chapter 3 allow real-time geometric model building with very little storage or computational overhead. For example, a vascular model could be built while the transducer is scanning arteries or veins, to provide the ultrasound technician with a 3-D model for reference and guidance. The ultrasound phantom was used in this experiment to demonstrate the accuracy of the 3-D ultrasound imaging. This section discusses the two different 3-D reconstruction methods that have been implemented. Similar results have been obtained from carotid artery scans.

5.2.1 Calibration

Accurate calibration is essential for a consistent reconstruction that preserves true anatomical shape [142]. This involves determining the position and orientation of the ultrasound images with respect to the robot end-effector. The results of the calibration take the form of six constant offsets, three for position and three for orientation. These offsets must be added to the robot measurements to calculate the position of the ultrasound images during reconstruction.

The three-wire technique [142] is used to calibrate the system. A coordinate system C is placed at the center of the top pipe in the phantom (Figure 2.1). Let $f = (u\ v)^T$ be a pixel in the image coinciding with a point P that lies on the center of the top pipe in the ultrasound phantom, with coordinates $^{c}p = (0\ y\ 0)^T$ in frame C. Also, let $^{c}T_o$, $^{o}T_e$ and $^{e}T_i$ denote the transformations from the robot base frame to the ultrasound phantom frame, from the robot end-effector frame to the robot base frame, and from the image frame to the robot end-effector frame, respectively. The following relation holds:

$\begin{bmatrix} 0 \\ y \\ 0 \\ 1 \end{bmatrix} = {}^{c}T_o\; {}^{o}T_e\; {}^{e}T_i \begin{bmatrix} s_x u \\ s_y v \\ 0 \\ 1 \end{bmatrix}, \quad (5.1)$

where $s_x$ and $s_y$ are the scale factors for the u and v axes of the image, respectively, with units of mm/pixel. In (5.1), each pixel in the image coordinate system is first transformed to the end-effector coordinate system, then to the robot base frame, and finally to the phantom coordinate system.
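The transformation chain in (5.1) can be sketched directly; `T_co`, `T_oe` and `T_ei` below stand for $^{c}T_o$, $^{o}T_e$ and $^{e}T_i$ as 4x4 homogeneous matrices, and the function name is illustrative.

```python
import numpy as np

def pixel_to_phantom(T_co, T_oe, T_ei, u, v, sx, sy):
    """Map an image pixel to phantom-frame coordinates via the chain in (5.1).

    sx and sy are the image scale factors in mm/pixel. The pixel is lifted
    into the image plane, then transformed image -> end-effector -> base ->
    phantom. Returns the 3-D point in frame C.
    """
    p_image = np.array([sx * u, sy * v, 0.0, 1.0])
    p_c = T_co @ T_oe @ T_ei @ p_image
    return p_c[:3]
```

The calibration then stacks the first and third components of this prediction, which should both be zero for points on the pipe center, over all m images and minimizes the residual over the free parameters.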
As mentioned above, a transformation between two coordinate systems has six degrees of freedom: three rotations $(\alpha, \beta, \gamma)$ and three translations $(x, y, z)$. An x-y-z fixed-angles scheme has been used to parameterize a rotation using three angles [42,142]. Therefore, any of the transformation matrices can be written as follows:

$T = \begin{bmatrix} \cos\alpha\cos\beta & \cos\alpha\sin\beta\sin\gamma - \sin\alpha\cos\gamma & \cos\alpha\sin\beta\cos\gamma + \sin\alpha\sin\gamma & x \\ \sin\alpha\cos\beta & \sin\alpha\sin\beta\sin\gamma + \cos\alpha\cos\gamma & \sin\alpha\sin\beta\cos\gamma - \cos\alpha\sin\gamma & y \\ -\sin\beta & \cos\beta\sin\gamma & \cos\beta\cos\gamma & z \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (5.2)$

If we assume that $^{o}T_e$ can be measured by the robot, there are 14 parameters that have to be calculated in (5.1) as the calibration parameters ($s_x$, $s_y$, and six translation and rotation parameters for each of $^{c}T_o$ and $^{e}T_i$). We call this the calibration vector $\phi$.

The two zero components on the left-hand side of (5.1) give two equations in the unknowns $\phi$. Therefore, if m ultrasound images are used, the over-determined set of 2m equations is solved using the line-search method in the Matlab Optimization Toolbox. From the geometry of the problem, it is obvious that the rotation and translation parameters along the y-axis in frame C cannot be calculated by using this method. Therefore, two arbitrary numbers were used in place of these two parameters in the final calculation. For the translation, we chose 300 mm, since the distance from the robot base to the phantom along the y-axis was around that value; for the rotation, we chose 0 degrees. This reduces the size of the calibration vector $\phi$ to 12. A total of 100 ultrasound images were used to calculate the parameters. The Star-Kalman algorithm was used to locate the center of the top pipe in each ultrasound image. Table 5.1 shows the identified calibration parameters.

Scale factors:
  s_x (mm/pixel)    -0.1063
  s_y (mm/pixel)     0.1336

$^{e}T_i$:
  x (mm)            40.6129
  y (mm)             6.3316
  z (mm)            21.4395
  alpha (deg)       15.5180
  beta (deg)        -3.8487
  gamma (deg)      -84.2883

$^{c}T_o^{-1}$:
  x (mm)          -643.1671
  y (mm)          -300
  z (mm)          -314.2242
  alpha (deg)        0.9156
  beta (deg)         0
  gamma (deg)        1.3231

Table 5.1: Calibration results.

The accuracy of the calibration procedure depends on the quality of the ultrasound images, the accuracy of the Star-Kalman algorithm in locating the center of the pipe, and the resolution of the ultrasound robot in locating the position and orientation of the probe. The RMS error, which is the root mean square residual of the set of equations (5.1) at the solution $\phi$, is 0.2 mm.

5.2.2 Stradx

Stradx [62] is a tool for the acquisition and visualization of 3-D ultrasound images using a conventional 2-D ultrasound machine and an Ascension Bird position sensor. The contours that specify an anatomic region of interest are drawn in a series of two-dimensional image slices by the operator. These contours are mapped to a 3-D space by using the position information provided by the sensor. In this application, the measured robot/probe position was substituted in place of the sensor data as input to the Stradx program. Since the resolution of the robot is higher than that of the Bird sensor, and since it is independent of metal in the surrounding environment, the system provides a powerful tool for accurate 3-D reconstruction of ultrasound images.
In addition, instead of manual segmentation of the ultrasound images in Stradx by using the mouse pointer, a program was written to generate a segmentation data file for Stradx, using the Star-Kalman algorithm in each image to provide the software with the required segmentation data. Figure 5.1 shows the 3-D reconstruction of the ultrasound phantom using this approach.

Figure 5.1: Partial 3-D image reconstruction of the ultrasound phantom by using the Stradx program.

It should be noted that no filtering is used in this 3-D reconstruction; the non-smooth surface of the extracted model comes from the error in the location of the tube contours in the ultrasound images.

5.2.3 Star-Kalman Based Reconstruction

By using the Star-Kalman algorithm to extract each pipe's contour in the ultrasound image, and the forward kinematics of the robot to map each contour to world coordinates, a 3-D model of each pipe is reconstructed. Figure 5.2 shows a partially reconstructed image.

Figure 5.2: Partial 3-D image reconstruction of the ultrasound phantom by using the Star-Kalman contour extraction method.

In this 3-D reconstruction, low-pass filtering is used to reduce the effect of errors in contour extraction. The reconstructed model has an average absolute error of less than 0.7 mm. Figure 5.3 shows a reconstructed 3-D image of the carotid artery obtained by using the Star-Kalman reconstruction method. The robot was used to move the ultrasound probe along the neck of a volunteer subject. The Star-Kalman algorithm was used to extract the contours of the carotid artery. Filtering is performed to reconstruct a smooth surface of the carotid artery, while reducing the effect of pulsation in this 3-D anatomical model. The figure clearly shows the shape and the bifurcation of the artery. In contrast to the Stradx reconstruction approach, where the 3-D reconstruction is performed off-line from either the manual or the Star-Kalman-based segmentation data files, the Star-Kalman-based reconstruction extracts each pipe's contour automatically and can also be performed in real-time. The data storage requirements are much lower than those of the Stradx approach, as only the contour coordinates, and not the entire ultrasound images, are stored for 3-D reconstruction.

Figure 5.3: Carotid artery 3-D reconstruction: a) external view, b) internal view.

5.3 Tele-ultrasound

An application of the system as a tele-ultrasound device through the Internet is presented here. Figure 5.4 shows the architecture of the experimental setup. A client-server application was written for this purpose under the Linux/Debian operating system. The server is responsible for relaying data between the user interface, the image processing system, the robot controller and the video camera. The server uses a JPEG compression algorithm to transfer ultrasound images to the remote operation site over a limited bandwidth. This could be improved by using an MPEG video compression method, which is not included in the current implementation of the system. At the same time, different operator commands are sent from the remote operation site to the server through this application. Figure 5.5 shows the data flow in this application.

Figure 5.4: Tele-ultrasound system setup.

Figure 5.5: Data flow in the tele-ultrasound system.

Two cameras send live video images of the examination site, from two different viewing angles, to the operator at the remote operation site through the client-server application.
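The frame relay at the core of this client-server design can be sketched as follows; the port number and the length-prefixed framing are illustrative assumptions, and `grab_frame` stands for whatever routine returns the next JPEG-compressed ultrasound image.

```python
import socket
import struct

def serve_frames(grab_frame, host="0.0.0.0", port=5005):
    """Relay JPEG-compressed ultrasound frames to one remote client over TCP.

    Each frame is sent with a 4-byte big-endian length header so the client
    can split the byte stream back into individual images; operator commands
    would travel back on a separate channel (not shown).
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while True:
                jpeg = grab_frame()                      # one compressed frame
                conn.sendall(struct.pack("!I", len(jpeg)))
                conn.sendall(jpeg)
```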
The camera images provide visual feedback for the operator to perform the ultrasound examination.

The system was demonstrated for the first time during the IRIS/PRECARN 2000 conference in Montreal. While the robot was located at the Robotics and Control Laboratory of the University of British Columbia (UBC) in Vancouver, conference delegates could interact with the user interface and the live visual feedback to successfully perform the ultrasound examination on an ultrasound phantom remotely.

While scalable to communication networks of almost any transmission speed, the demonstration was performed using a small portion of a shared 1.5 Mbit/s T-1 line at the conference center. With this transmission speed, a frame rate of at least 10 frames/s was achieved for the ultrasound images, and frame rates of 3 and 10 frames/s were achieved for the live video feedback from the two cameras. The ultrasound images had a resolution of 256x256 pixels with 8-bit gray scale. A visual feedback delay of approximately 500 ms was observed during the demonstration, because of the Internet transmission delay.

Although, as mentioned in Chapter 1, several robot-assisted tele-ultrasound systems have been reported in the literature, none of them uses a shared control approach in the design of the tele-operation system. In contrast, our system uses a safe robot in a tele-operation platform that enables the operator to manipulate the ultrasound probe while being assisted by remote force and image controls.

The implementation and integration of this tele-ultrasound system was a challenging process that involved client-server TCP/IP programming, user-interface design, image compression and visual servoing, as well as the design of a protocol that could deal with data-packet loss and Internet delays. This tele-ultrasound system later became a base for the development and integration of our image-guided robot-assisted interface.

5.4 Feature-Based Probe Position Control

One of the useful features of the current robotic system is its ability to follow a trajectory in the workspace. This feature is used to develop a feature-based probe positioning system. Figure 5.6 shows the concept. During the early stages of the ultrasound examination, the operator selects a feature point in the ultrasound image at different probe locations along the x-axis on the patient's body. The system calculates the location of these feature points in the workspace by using the transformation matrices that are generated from the calibration procedure, and during the ultrasound examination, keeps the location of the probe along the y-axis on a trajectory that passes through these points in the workspace.

Figure 5.6: Feature-based probe position control; $f_1$ and $f_2$ are the locations of the feature in two different ultrasound image frames. The robot moves along the line which connects the two feature points in the workspace.
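A sketch of this trajectory constraint follows, assuming a piecewise-linear interpolation between the recorded feature points; the function name and data layout are illustrative.

```python
import numpy as np

def probe_y_on_trajectory(points, x):
    """Target probe y-position for the current x-position along the neck.

    `points` holds the (x_i, y_i) workspace locations of the features
    selected during the initial scans; the target y is read off the
    piecewise-linear trajectory through those points.
    """
    pts = np.asarray(sorted(points), dtype=float)   # sort by x
    return float(np.interp(x, pts[:, 0], pts[:, 1]))
```

During the scan, the robot servos the probe's y-axis toward this target while the operator drives the remaining axes.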
Assuming that the patient's movement is negligible, this feature enables the operator to follow the direction of a vessel, e.g., the carotid artery, during the long and short scans, and reduces the scanning time.

Figure 5.7: Feature-based probe position control; this tool is mainly useful during the long scan of the carotid artery, as the operator has to control the location of the ultrasound transducer within a very limited range (courtesy of Julian Guerrero).

Figure 5.7 shows an application of this approach. During the long scan of the carotid artery, the operator must control the location of the carotid artery within a very limited range in order to keep the artery in the image. By using the feature-based probe position control, the operator selects several points along the carotid artery using the short-scan images, and the robot automatically repositions the ultrasound probe during the scan. This significantly helps the operator to fine-tune the image at every time step.

Chapter 6
Human Factors Study

6.1 Overview

This chapter presents the results of a pilot human factors study performed to compare the presented robot-assisted ultrasound examination system with the conventional ultrasound examination. Three different approaches are used to achieve this goal; they are explained here.

6.2 Ultrasound Robot User Testing Questionnaire

A questionnaire (Appendix B) was designed [18,136] to evaluate the performance of the sonographers and also the importance of different features of the system (discussions with Professor Sidney Fels, Department of Electrical and Computer Engineering, The University of British Columbia, 2001). The baseline for the evaluation is the sonographers' performance during conventional ultrasound examinations. In this questionnaire, the scan speed is defined as the speed with which the operator moves the ultrasound transducer on the patient's neck. In addition, the accuracy of the scan is defined as the achievable desired accuracy in the position and orientation of the ultrasound image by simple positioning commands to the ultrasound transducer. This questionnaire was filled in three times by a sonographer (Ann Hope) from the radiology department of the Vancouver General Hospital, UBC branch.

The Magellan/SpaceMouse was used as the interface to the robot for the first trial. It was found that, because of the design of this device, the sonographer had difficulty performing decoupled motions along different axes. In addition, during the conventional ultrasound examination, a sonographer has a mental mapping between the orientation of the probe in her hand and the ultrasound images. This enables the sonographer to relate the direction of the probe motion to the direction and orientation of the image. Because the SpaceMouse joystick controls the robot in velocity mode, there is no mapping between the orientation of the ultrasound probe and the orientation of the joystick. Therefore, the sonographer found it difficult to intuitively use her experience from the conventional ultrasound examination and apply it to the robot-assisted one. Therefore, during the second and third trials, the spherical wrist joystick was designed and used to overcome these problems. The joystick enables the sonographer to orient the ultrasound probe in the same way as in conventional ultrasound examinations.
Furthermore, it was found that the operator's control ability over the probe motions increased significantly with the spherical wrist joystick. Table 6.1 shows a comparison of the results. Although more experiments are required to reach a steady state in the learning curve of the operator, based on these three trials there are, on average, 16 percent and 45 percent improvements at the second and third trials in comparison to the first trial, respectively.

Question  Description                         First trial  Second trial  Third trial
1         The system is easy to use           1            3             2
2         Safety level                        5            5             5
3         Comfort level                       1            2             5
4         The robot has smooth motions        3            3             4
5-a       Is the Start button useful?         4            4             5
5-b       Is the 3-D probe useful?            1            1             N/A
5-c       Is image control useful?            4            4             4
5-d       Is force control useful?            4            5             4
5-e       Is the force display useful?        1            2             4
5-f       Are the axis buttons useful?        1            1             N/A
5-g       Is contour extraction useful?       1            1             N/A
5-h       Is feature-based control useful?    N/A          3             4
6-a       Using the previous scans            2            N/A           N/A
6-b       Showing the 3-D anatomy             5            4             4
7-a       Similarity                          N/A          4             2
7-b       Comparable scan speed               1            2             1
7-c       Similar accuracy                    N/A          1             1
7-d       Visual servoing is useful           4            4             4
7-e       Relaxation level                    1            2             4
          Average score (out of 5)            2.44         2.83          3.53

Table 6.1: Questionnaire results. The scores are out of 5, with 5 being the best and 1 being the worst. All these comparisons are between the robot-assisted and the conventional ultrasound examinations.

Therefore, it is clear that the spherical wrist joystick has increased the operator's comfort level in controlling the motion of the ultrasound probe. Some features of the system, such as the image and force controls, appear to be very useful during the examination. However, it seems that the operator does not use the 3-D probe model, the force display or the axis activation buttons in the GUI during the scan. This might be because, during conventional ultrasound examinations, the sonographer does not use such features. These features could be removed from future implementations of the user interface or, alternatively, more training sessions could be provided for the sonographers to learn the new system. At the end of each of our experiments, the sonographer was completely at ease with no shoulder pain. It should be noted that these results only represent the general impression of a single sonographer towards our robot-assisted ultrasound examination system. More experiments with more sonographers are required to perform a statistically valid study.

6.3 Feature Positioning Ability

In this experiment, an operator was asked to keep the position of the carotid artery in the center of the ultrasound image while performing the short scan on the patient's neck. The experiment was performed 10 times, with and without enabling the ultrasound visual servo controller in the system. Figure 6.1 shows the result for one of the experiments. On average, both methods show a mean error of around one pixel; however, the standard deviations of the error are 7 and 19 pixels for the visually assisted and unassisted scans, respectively. This shows that visual servoing can significantly increase the accuracy of ultrasound examinations by centering the desired features in the image.

Figure 6.1: Feature centering experiment with and without the visual servoing feature.

6.4 Surface Electromyography

Surface electromyography [109,169] was chosen to perform a quantitative analysis of the muscle activity exerted by sonographers when they conduct conventional ultrasound examinations relative to using the designed system.
Surface electromyography provides easy access to the physiological processes that appear to us as the forces that a muscle generates, or as muscle movements. This section briefly explains the basis of EMG signal processing and its relation to the force generated by the muscle [109].

6.4.1 Basics of Myoelectric Signals

A skeletal muscle comprises a number of motor units. Each motor unit consists of muscle fibers innervated by the terminal branches of a single alpha-motoneuron whose cell body is located in the anterior horn of the spinal cord. The motor unit is the smallest part of a muscle that the central nervous system can control individually. The central nervous system controls muscle force by adjusting the number of recruited motoneurons and the firing frequency of each motoneuron [99].

The sum of the contributions of the fibers belonging to a particular motor unit is the motor-unit action potential (MUAP), and the sequence of these motor-unit action potentials is the MUAP train [99]. The myoelectric signal detected at the surface of the skin or inside the muscle is the summation of the contributions of individual MUAP trains. Because motor-unit discharges are irregular and MUAPs have different shapes, we can think of the myoelectric signal as a band-limited stochastic process with a Gaussian amplitude distribution. Its frequency spectrum ranges from DC to approximately 500 Hz [99,109].

The specifications of the EMG sensors used to capture the EMG data are shown in Table 6.2. These sensors were designed by Professor Ted Milner at the Department of Kinesiology at Simon Fraser University.

Specification                                 Units  Min    Nom    Max
Maximum gain (CCW to increase gain)           V/V    1800   2000   2200
Minimum gain (CW to decrease gain)            V/V    22     24     26
Upper cutoff frequency                        Hz     450    500    550
Lower cutoff frequency                        Hz     25     30     35
Maximum output before clipping (peak-peak)    V      5.0    5.5    6.0
Noise at maximum gain (peak-peak)             mV     n/a    10     20
Operating temperature                         C      0      25     50
Storage temperature                           C      -25    25     75
Operating current                             mA     5      10     20
Supply voltage (bipolar)                      V      4.9    5.0    5.1
Supply voltage noise (peak-peak)              mV     n/a    1      5
Impedance from subject to supply ground       Ohm    0      n/a    100
Mechanical shock (6 axis)                     g      n/a    n/a    30
Input impedance                               Ohm    -10%   10^10  +10%

Table 6.2: The specifications of the EMG sensors.

During our experiments, we used a band-pass filter with cut-off frequencies of 20 and 345 Hz (discussions with Professor Ted Milner, Department of Kinesiology, Simon Fraser University, 2002). Based on our experiments, this frequency range contains the maximum information of the EMG signal. The EMG sensors have a bandwidth of 30-500 Hz that covers the frequency spectrum of the EMG signal. A sampling frequency of 2 kHz was used to sample the EMG signal of each muscle.
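This preprocessing step can be sketched as a zero-phase band-pass filter; the Butterworth design and its order are illustrative choices rather than the actual acquisition chain.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_emg(raw, fs=2000.0, low=20.0, high=345.0, order=4):
    """Band-pass filter a raw EMG record.

    Matches the 20-345 Hz pass band and 2 kHz sampling rate used in these
    experiments; filtfilt applies the filter forward and backward so the
    output has no phase distortion.
    """
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="bandpass")
    return filtfilt(b, a, np.asarray(raw, dtype=float))
```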
6.4.2 EMG Feature Extraction

Several kinds of features are used in the literature to represent myoelectric signal patterns. The main features are explained here [109]:

1. Average Rectified Value (ARV): an estimate of the average absolute value of the EMG signal, given by

$ARV = \frac{1}{N}\sum_{k=1}^{N} |x_k|, \quad (6.1)$

where $x_k$ is the k-th sample of a raw data record of N samples.

2. Root Mean Squared (RMS) value: the RMS value of the EMG signal is defined as

$RMS = \sqrt{\frac{1}{N}\sum_{k=1}^{N}\left(x_k - \bar{x}\right)^2}, \quad (6.2)$

where $\bar{x}$ is the average of the EMG signal over the sequence of N samples.

It has been shown that the amplitude variables of the EMG signal, such as ARV and RMS, qualitatively relate to the amount of torque (or force) measured about a joint [109]. However, it is not possible to find an accurate quantitative relationship between the two, because the EMG signal is the result of many physiological, anatomical, and technical factors which are hard to quantify.

Other features of the EMG signal, such as the mean and median frequencies, relate to muscle fatigue. However, since the duration of each of our experiments was short (around 10 seconds), these features are not useful for measuring the muscle fatigue of the sonographer: the spectral modification of the EMG signal is caused by the accumulation of lactic acid in the muscle, which is insignificant over the duration of our experiments.
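Both amplitude features reduce to one-line computations over a record of N samples, as sketched below for (6.1) and (6.2).

```python
import numpy as np

def arv(x):
    """Average rectified value, eq. (6.1): the mean absolute amplitude."""
    x = np.asarray(x, dtype=float)
    return float(np.mean(np.abs(x)))

def rms(x):
    """RMS value about the signal mean, eq. (6.2)."""
    x = np.asarray(x, dtype=float)
    return float(np.sqrt(np.mean((x - x.mean()) ** 2)))
```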
Figures 6.2-6.7 show the location of the muscles in the human body [47] (with permission from Charles C. Thomas, Publisher, LTD.). The cross mark in each figure shows the location of the attached probe to the operator's skin. Four operators, one sonographer and three students, were asked to move the probe on the patient's neck both for long and short scans and the E M G data was captured from their muscles during these examinations. Student 1 had hours of experience with the robotic 116 Human Factors Study Scan Conventional Robot-Assisted APB 0.0187 0.0068 EDC 0.0260 0.0182 FCR 0.0029 0.0028 FCU 0.0026 0.0092 FDS 0.0065 0.0069 FPL 0.0199 0.0274 BB 0.0056 0.0021 B 0.0102 0.0055 LT 0.0098 0.0061 LHT 0.0219 0.0139 MHT 0.0173 0.0132 AD 0.0709 0.0192 MD 0.0799 0.0147 PD 0.0319 0.0048 Table 6.3: The activity of different muscles under conventional and robot-assisted exami-nations. The activity of each muscle is normalized with its maximum activity level. system, however, students 2 and 3 had almost no previous experience with the interface. Each of the experiments were performed five times and the results were averaged. During each experiment, the operators were asked to only translate the ultrasound probe for 10 sec-onds along a patient's neck, while keeping the carotid artery in the center of the ultrasound image. Since a simple probe motion was chosen for these experiments and the duration of each experiment was the same, we assume that similar probe motions were performed by all the operators both in complexity and in speed. The activity of each muscle was normalized with respect to its maximum activity. The following scan modes were performed: 1. Joystick Scan (J): In this experiment, the operator was asked to move the probe along the patient's neck by using the robot and the spherical wrist joystick, while trying to keep the carotid artery in the center of the image. 117 Human Factors Study Figure 6.2: Muscle 1; Abductor Pollicis Brevis muscle (from [47] with permission from Charles C. Thomas, Publisher, LTD.) Figure 6.3: Muscle 2; Extensor Digitorum Communis muscle (from [47] with permission from Charles C. Thomas, Publisher, LTD.) 2. Visual Servoing Scan (V): In this experiment, the operator was asked to move the probe along the patient's neck by using the robot and the joystick, while being assisted by the image servo controller. 3. Hand Scan (H): In this experiment, the operator used the conventional ultrasound 118 Human Factors Study Figure 6.4: Muscle 3; Flexor Digitorum Sublimis muscle (from [47] with permission from Charles C. Thomas, Publisher, LTD.) Figure 6.5: Muscle 4; Flexor Pollicis Longus muscle (from [47] with permission from Charles C. Thomas, Publisher, LTD.) examination method to scan the patient. The R M S values of muscle activity are used here to perform comparisons between different operators. Similar comparisons can be made by using the average rectified values of the signals. Tables 6.4-6.7 show the results for different operators and Table 6.8 shows the com-parison of these results. The joystick force threshold was set to 2.5 N in these experiments, since otherwise the sonographer was not able to effectively control the probe orientation 119 Human Factors Study Figure 6.6: Muscle 5; Anterior Deltoid muscle (from [47] with permission from Charles C. Thomas, Publisher, LTD.) along different axes. 
It is interesting to note that although the average muscle activity increases for all the operators, the average shoulder muscle activity is significantly reduced for the student operators. In addition, visual servoing reduces the muscle activity for all the operators by around 20% on average.

Muscle                    1       2       3       4       5       6       Mean
Joystick (2.5 N)          0.1889  0.1208  0.0358  0.0493  0.0090  0.0694  0.0789
Visual Servoing (2.5 N)   0.1520  0.1038  0.0300  0.0482  0.0097  0.0606  0.0674
Hand                      0.0268  0.0584  0.0192  0.0222  0.0236  0.0269  0.0295
J/H                       704%    206%    186%    222%    38%     258%    267%
V/J                       80%     86%     83%     97%     107%    87%     85%

Table 6.4: Sonographer's short scan normalized muscle activity with joystick force threshold of 2.5 N.

Muscle                    1       2       3       4       5       6       Mean
Joystick (2.5 N)          0.0436  0.2250  0.0223  0.0404  0.0235  0.0451  0.0606
Visual Servoing (2.5 N)   0.0430  0.1859  0.0200  0.0238  0.0196  0.0249  0.0529
Hand                      0.0110  0.1207  0.0072  0.0179  0.0853  0.1212  0.0667
J/H                       396%    186%    309%    226%    27%     37%     90%
V/J                       98%     82%     89%     59%     83%     55%     87%

Table 6.5: Student 1's short scan normalized muscle activity with joystick force threshold of 2.5 N.

Muscle                    1       2       3       4       5       6       Mean
Joystick (2.5 N)          0.3135  0.1260  0.2325  0.0313  0.0602  0.0204  0.1307
Visual Servoing (2.5 N)   0.1043  0.1191  0.1001  0.0284  0.0139  0.0142  0.0633
Hand                      0.0892  0.0819  0.0942  0.0134  0.0405  0.1118  0.0718
J/H                       351%    153%    246%    233%    148%    18%     182%
V/J                       33%     94%     43%     90%     23%     69%     88%

Table 6.6: Student 2's short scan normalized muscle activity with joystick force threshold of 2.5 N.

Muscle                    1       2       3       4       5       6       Mean
Joystick (2.5 N)          0.0777  0.2653  0.0098  0.0251  0.0571  0.0026  0.0730
Visual Servoing (2.5 N)   0.0885  0.2535  0.0101  0.0269  0.0149  0.0031  0.0662
Hand                      0.0654  0.1215  0.0064  0.0186  0.0692  0.0068  0.0480
J/H                       118%    218%    153%    134%    82%     38%     152%
V/J                       113%    95%     103%    107%    26%     119%    90%

Table 6.7: Student 3's short scan normalized muscle activity with joystick force threshold of 2.5 N.

                          Sonographer  Student 1  Student 2  Student 3
J/H                       269%         197%       192%       124%
J/H (shoulder muscles)    148%         32%        83%        60%
J/H (forearm muscles)     204%         240%       210%       168%
V/J                       90%          78%        59%        94%

Table 6.8: A comparison among the sonographer and the students when interacting with the robot-assisted system relative to the conventional ultrasound examination. The joystick force threshold is 2.5 N; J/H is the percentage change in the average muscle activity for the operator when using the joystick relative to the hand scan; V/J is the percentage change in the average muscle activity for the operator when enabling the visual servoing.

Although the results are promising, we suspect that the high joystick force threshold that we had to set for the sonographer significantly affected our results. In order to verify this, we performed similar experiments with two of the students with a joystick force threshold of 1.5 N. Tables 6.9 and 6.10 show the results. A comparison of these results is shown in Table 6.11. It is clear that a reduction in the joystick force threshold level significantly reduces the muscle activity level. In addition, the average muscle activity is reduced for both the shoulder and forearm muscles.
Therefore, although the average muscle activity might increase by using the robot-assisted interface, we suspect that if the operators learn to use the joystick more efficiently, under lower joystick force thresholds, a significant reduction in activity level should be observed for the shoulder muscles, which are the main source of pain for sonographers. In addition, similar results are observed for the total average muscle activity in both cases. More experiments are required to verify this hypothesis.

Muscle                    1       2       3       4       5       6       Mean
Joystick (1.5 N)          0.0347  0.0401  0.0729  0.0156  0.0108  0.0086  0.0305
Visual Servoing (1.5 N)   0.0296  0.0382  0.0507  0.0124  0.0069  0.0061  0.0240
Hand                      0.0089  0.0284  0.0459  0.0148  0.0408  0.0435  0.0304
J/H                       389%    141%    158%    105%    26%     19%     100%
V/J                       85%     95%     69%     79%     63%     70%     78%

Table 6.9: Student 1's short scan normalized muscle activity with joystick force threshold of 1.5 N.

Muscle                    1       2       3       4       5       6       Mean
Joystick (1.5 N)          0.1568  0.0875  0.1843  0.0237  0.0278  0.0158  0.0827
Visual Servoing (1.5 N)   0.1498  0.0833  0.0900  0.0215  0.0270  0.0139  0.0643
Hand                      0.0892  0.0819  0.0942  0.0134  0.0405  0.1118  0.0718
J/H                       175%    106%    195%    176%    68%     14%     115%
V/J                       95%     95%     48%     90%     97%     88%     77%

Table 6.10: Student 2's short scan normalized muscle activity with joystick force threshold of 1.5 N.

                          Student 1  Student 2
J/H                       140%       122%
J/H (shoulder muscles)    22%        41%
J/H (forearm muscles)     134%       159%
V/J                       77%        85%

Table 6.11: A comparison between students 1 and 2 when interacting with the robot-assisted system relative to the conventional ultrasound examination. The joystick force threshold is 1.5 N; J/H is the percentage change in the average muscle activity for the operator when using the joystick relative to the hand scan; V/J is the percentage change in the average muscle activity for the operator when enabling the visual servoing.

Since during conventional ultrasound examinations the sonographer rests her forearm against the patient's chest, in contrast to our experimental setup where she rests her forearm against a hard flat surface, we suspect that this might also have affected our results for the forearm muscles. To validate this hypothesis, two sets of experiments were performed with Student 1 resting his forearm on a soft surface, with two different joystick force threshold levels; the results are reported in Table 6.12. Table 6.13 compares these results with the previous examinations that were performed on the hard surface. The results show that the soft surface support for the forearm reduces the average muscle activity. In addition, the effect is more significant with the joystick force threshold of 2.5 N than with 1.5 N.

Muscle                    1       2       3       4       5       6       Mean
Joystick (1.5 N)          0.0383  0.0498  0.0088  0.0762  0.0142  0.0151  0.0337
Visual Servoing (1.5 N)   0.0205  0.0552  0.0063  0.0821  0.0079  0.0113  0.0306
Joystick (2.5 N)          0.0176  0.0677  0.0100  0.0965  0.0166  0.0147  0.0372
Visual Servoing (2.5 N)   0.0238  0.0507  0.0075  0.0943  0.0156  0.0129  0.0341
Hand                      0.0201  0.0448  0.0068  0.0650  0.0295  0.0662  0.0387
J/H (1.5 N)               190%    111%    129%    117%    48%     22%     87%
V/J (1.5 N)               53%     110%    71%     107%    55%     74%     90%
J/H (2.5 N)               87%     151%    147%    148%    56%     22%     96%
V/J (2.5 N)               135%    74%     75%     97%     93%     87%     91%

Table 6.12: Student 1's short scan normalized muscle activity with joystick force thresholds of 1.5 N and 2.5 N; a soft surface has been used as the support for the forearm.
Surface                            Hard   Soft
J/H (1.5 N)                        140%   103%
J/H (1.5 N, shoulder muscles)       22%    35%
J/H (1.5 N, forearm muscles)       134%   119%
V/J (1.5 N)                         77%    79%
J/H (2.5 N)                        197%   102%
J/H (2.5 N, shoulder muscles)       32%    39%
J/H (2.5 N, forearm muscles)       240%   148%
V/J (2.5 N)                         78%    94%

Table 6.13: The effect of soft and hard support surfaces for the forearm on the average muscle activity of Student 1. The joystick force threshold is set to 1.5 N and 2.5 N in two different experiments; J/H gives the percentage change in the average muscle activity for the operator when using the joystick relative to the hand scan; V/J gives the percentage change in the average muscle activity for the operator when enabling the visual servoing.

It should be noted that after performing three hours of experiments, the sonographer mentioned that she did not feel any pain in her arm and shoulders. However, at the hospital she usually has shoulder pain after the same duration of ultrasound scanning (although she added that the amount of pain may vary from day to day). This is promising, as it demonstrates that although the muscle activity might increase with the robot-assisted system, the sonographer's posture could be more ergonomic than in the conventional ultrasound examination. The robotic interface also controls the force along the axis of the probe that is perpendicular to the patient's skin; therefore, the muscles involved in pushing the probe along this axis should show significantly less activity when using the tele-operation system. More experimental results are required to perform a statistically valid study.

In conclusion, this preliminary study shows that there seems to be a significant difference between a sonographer who has years of experience in performing conventional ultrasound examinations and the students, who are more familiar with robotic interfaces and do not have any clinical experience. It has also been shown that the visual servoing reduces the muscle activity both for the sonographer and for an inexperienced operator. The robotic system significantly reduces the shoulder muscle activity of the inexperienced operators, in contrast to an increase in the shoulder muscle activity for the sonographer.

We have only considered a simple translational examination of the carotid artery in this analysis. An extension to this human factors study could include the design of an interface with a single ergonomic posture that applies to other types of examinations. This study also does not consider other muscles that might be involved in probe rotation; future studies should therefore consider the muscles involved in other probe motions and different postures of the sonographers. In addition, the center of rotation of the joystick is currently around 1.5 cm below the tip of the handle. Since during conventional ultrasound examinations the sonographer is used to rotating the ultrasound probe about the tip of the probe, this could also have affected both our qualitative and quantitative results. Future work could include the design of a new joystick in which the tip of the joystick handle coincides with the center of rotation of the device.
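To see why this offset matters, note that a rigid rotation R about a center c moves a handle-tip point p to c + R(p - c), so the tip translates by (R - I)(p - c) even when the operator intends a pure rotation. A minimal 2-D sketch of this effect (illustrative only; the actual joystick kinematics are 3-D and the names below are hypothetical):

```python
import numpy as np

def rotate_about(p, c, angle_rad):
    """Rigid 2-D rotation of point p about center c."""
    cs, sn = np.cos(angle_rad), np.sin(angle_rad)
    R = np.array([[cs, -sn], [sn, cs]])
    return c + R @ (p - c)

tip = np.array([0.0, 0.015])                        # tip 1.5 cm above the rotation center
moved = rotate_about(tip, np.zeros(2), np.deg2rad(20.0))
print(moved - tip)   # unintended tip translation, roughly 5 mm for a 20 degree twist
```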
Chapter 7

Conclusions and Future Work

This chapter summarizes the main contributions of my thesis and also proposes areas for future research.

7.1 Contributions of this Thesis

This thesis has presented contributions towards the successful development and integration of a novel image-guided robot-assisted system for medical ultrasound diagnosis. Specifically, the concept of ultrasound visual servoing has been introduced and developed in this thesis. This required the development of several robust real-time feature tracking techniques for ultrasound images. Some of these techniques have been used to reconstruct simple anatomical features, such as the carotid artery, in 3-D, with the aim of providing anatomical landmark information for sonographers. Furthermore, a user interface has been developed that combines position, force, and image control. This user interface has been used as part of a tele-ultrasound system between Montreal and Vancouver. For the first time, the proposed tele-ultrasound system enables the operator to manipulate the ultrasound transducer while being assisted by force and image controllers. The results of a pilot human factors study demonstrate that the robot-assisted system significantly reduces the muscle activity, mainly for shoulder muscles, and consequently it has the potential to help reduce the high incidence of work-related injuries suffered by sonographers. The main contributions are as follows:

7.1.1 Real-Time Ultrasound Feature Extraction

Several feature extraction and tracking algorithms, namely a Sequential Similarity Detection algorithm, a Star algorithm, and a Discrete Snakes algorithm, have been modified and applied in real-time to track features in ultrasound images. These methods are compared with a developed Correlation algorithm and the Star-Kalman algorithm to track the carotid artery in ultrasound images. The Sequential Similarity Detection method and the Star-Kalman algorithm have been demonstrated to have excellent performance while tracking features with motions of up to 200 pixels/s; however, the Star-Kalman algorithm requires less computation time. The Correlation and Star algorithms exhibit poorer performance, with higher computational cost. The Snake algorithm was unable to track features with motions faster than 100 pixels/s.
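As an illustration of the block-matching idea behind the first of these methods, the sketch below relocates a feature template in each new frame by minimizing a sum-of-absolute-differences score over a small search window. It is a minimal sketch under assumed inputs, not the thesis implementation; a full sequential similarity detection algorithm would additionally abandon each candidate location as soon as its running error exceeds the best score found so far.

```python
import numpy as np

def ssd_track(frame, template, prev_xy, search=16):
    """Find the (x, y) location of `template` in `frame` near prev_xy by
    minimizing the sum of absolute differences over a search window."""
    th, tw = template.shape
    px, py = prev_xy
    t = template.astype(float)
    best_err, best_xy = np.inf, prev_xy
    for y in range(max(0, py - search), min(frame.shape[0] - th, py + search) + 1):
        for x in range(max(0, px - search), min(frame.shape[1] - tw, px + search) + 1):
            err = np.abs(frame[y:y+th, x:x+tw].astype(float) - t).sum()
            if err < best_err:
                best_err, best_xy = err, (x, y)
    return best_xy
```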
7.1.2 Ultrasound Visual Servoing

The concept of ultrasound visual servoing was developed to control three axes of the robot. The methodology of ultrasound image-guided transducer motion was used to automatically compensate for unwanted motions in the plane of the ultrasound beam by means of closed-loop robot control.

7.1.3 Feature-Based 3-D Ultrasound Image Reconstruction

A computationally inexpensive feature-based 3-D ultrasound imaging technique was demonstrated. The method uses the Star-Kalman algorithm to extract the contours of the carotid artery in real-time. The algorithm was validated by a 3-D reconstruction of an ultrasound phantom and a human carotid artery.

7.1.4 Tele-ultrasound

Although a number of robot-assisted tele-ultrasound examination systems have already been proposed in the literature, none of the reported systems use a shared control approach to assist the operator in the tele-ultrasound examination. In contrast, our tele-ultrasound system allows the radiologist to view and manipulate the ultrasound transducer at the remote site by using a safe robot, while being assisted by the remote force and image servo controllers.

7.1.5 Ultrasound Robot User-Interface

A novel user interface that combines velocity, force, and image-based control has been developed. Preliminary human factors experiments performed with this system motivated the design of a novel spherical wrist joystick. The joystick, designed by Simon Bachmann at the Robotics and Control Laboratory of the University of British Columbia, mimics for the operator the feeling of controlling a real ultrasound probe. The joystick has 6 decoupled axes and provides absolute orientation control of the ultrasound probe.

7.1.6 Human Factors Study

A novel EMG-based evaluation of sonographers' muscular activity during ultrasound examinations has been developed. This method is used to compare the robot-assisted diagnostic ultrasound interface with the conventional method. The pilot study shows that large shoulder muscles use significantly less energy during the robot-assisted ultrasound examination relative to the conventional method for an inexperienced operator. These muscles are the major cause of pain in sonographers. However, it seems that the shoulder muscle activity increases for an experienced sonographer. This might stem from the fact that an experienced operator is less receptive to a novel system than someone who has performed only a few ultrasound examinations. In addition, it has been shown that the visual servoing reduces the muscle activity of the operator during a robot-assisted ultrasound examination. A questionnaire given to a sonographer demonstrated that the operator was much more comfortable using the novel spherical wrist joystick to control the ultrasound robot than using the SpaceMouse.

7.2 Suggestions for Future Work

There are many opportunities for this research to be continued. Some examples are outlined below:

1. The feature extraction algorithms could also be used to extract anatomical maps in real-time, which would help ultrasound technicians guide the probe during an ultrasound examination. This would also enable them to match the current scan with previous scans of a patient.

2. Because the ultrasound transducer is positioned by a robot rather than by the operator, precise, repeatable, three-dimensional scanning, as well as more accurate Doppler ultrasound, will be made possible by this system.

3. The passive SpaceMouse could be replaced by a PowerMouse haptic interface [153] in order to realize bilateral tele-operation with force feedback.

4. A more comprehensive human factors study should be performed by testing the robot at a hospital and using operators with different levels of experience.

5. A robot-assisted diagnostic ultrasound examination system has other potential applications in image-guided intervention and multi-modal image registration (e.g., ultrasound, MRI, CT and X-ray). The robotic system could also be used in ultrasound elastography.

Bibliography

[1] K.Z. Abd-Elmoniem, Y.M. Kadah, and A.M. Yousef. Real-time adaptive ultrasound speckle reduction and coherence enhancement. Proceedings of IEEE International Conference on Image Processing, 1:172-175, 2000.
[2] P. Abolmaesumi, S.E. Salcudean, and W.H. Zhu. Visual servoing for robot-assisted diagnostic ultrasound. World Congress on Medical Physics and Biomedical Engineering, 4:2532-2535, Chicago, July 2000.
[3] P. Abolmaesumi, S.E. Salcudean, W.H. Zhu, S.P. DiMaio, and M.R. Sirouspour. A user interface for robot-assisted diagnostic ultrasound. Proceedings of IEEE International Conference on Robotics and Automation, 2:1549-1554, 2001.
[4] P. Abolmaesumi, S.E. Salcudean, W.H. Zhu, M.R. Sirouspour, and S.P. DiMaio. Image-guided control of a robot for medical ultrasound. IEEE Transactions on Robotics and Automation, February 2002.
[5] P. Abolmaesumi, M.R. Sirouspour, and S.E. Salcudean. Real-time extraction of carotid artery contours from ultrasound images. IEEE International Conference on Computer-Based Medical Systems, pages 181-186, Texas, June 2000.
[6] P. Abolmaesumi, M.R. Sirouspour, S.E. Salcudean, and W.H. Zhu. Adaptive image servo controller for robot-assisted diagnostic ultrasound. Proceedings of IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2:1199-1204, 2001.
[7] R. Ahluwalia and L. Fogwell. A modular approach to visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, pages 943-950, 1986.
[8] Y.S. Akgul, C. Kambhamettu, and M. Stone. Extraction and tracking of the tongue surface from ultrasound image sequences. Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pages 298-303, 1998.
[9] Y.S. Akgul, C. Kambhamettu, and M. Stone. A task-specific contour tracker for ultrasound. Proceedings of IEEE Workshop on Mathematical Methods in Biomedical Image Analysis, pages 135-142, 2000.
[10] P.K. Allen, A. Timcenko, B. Yoshimi, and P. Michelman. Real-time visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, pages 1850-1856, 1992.
[11] P.K. Allen, B. Yoshimi, and A. Timcenko. Real-time visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, pages 851-856, 1991.
[12] N. Andersen, O. Raven, and A. Sorensen. Real-time vision based control of servomechanical systems. Proceedings of the 2nd International Symposium on Experimental Robotics, France, 1991.
[13] M.E. Anderson and G.E. Trahey. A seminar on k-space applied to medical ultrasound. BME 265 course notes, Department of Biomedical Engineering, Duke University, 2000.
[14] D. Aruliah. A survey of level set methods applied to active contour models. CPSC 525 Final Project Report, UBC, 1998.
[15] K. Baba, K. Satch, S. Satamoto, T. Okai, and I. Shiego. Development of an ultrasonic system for three-dimensional reconstruction of the foetus. Journal of Perinatal Medicine, 17:19-24, 1989.
[16] J.C. Bamber and C. Daft. Adaptive filtering for reduction of speckle in ultrasonic pulse-echo images. Ultrasonics, 24:41-44, 1986.
[17] Y. Bar-Shalom and T.E. Fortmann. Tracking and Data Association. Academic Press Inc., 1988.
[18] A.E. Bennett and K. Ritchie. Questionnaires in Medicine: A Guide to Their Design and Use. Oxford University Press, 1975.
[19] J.G. Bosch, G. van Burken, S.S. Schukking, R. Wolff, A.J. van de Goor, and J.H.C. Reiber. Real-time frame-to-frame automatic contour detection on echocardiograms. IEEE Computers in Cardiology, pages 29-32, 1994.
[20] N.M. Botros. A PC-based tissue classification system using artificial neural networks. IEEE Transactions on Instrumentation and Measurement, 41(5):633-638, 1992.
[21] S. Boudet, J. Gariepy, and S. Mansour. An integrated robotics and medical control device to quantify atheromatous plaques: Experiments on the arteries of a patient. IEEE International Conference on Intelligent Robotics and Systems, 3:1533-1538, 1997.
[22] E. Brandt. Segmentation techniques for echocardiographic image sequences. M.Sc. Thesis, Linkopings University, Sweden, 1998.
[23] E. Brandt, L. Wigstrom, and B. Wranne. Segmentation of echocardiographic image sequences using spatio-temporal information. Proceedings of International Conference on Medical Image Computing and Computer Assisted Intervention, pages 410-419, Cambridge, UK, 1999.
[24] L.G. Brown. A survey of image registration techniques. ACM Computing Surveys, 24(4):325-376, December 1992.
[25] R. Bukowski, L. Haynes, Z. Geng, N. Coleman, A. Santucci, K. Lam, A. Paz, R. May, and M. De Vito. Robot hand-eye coordination rapid prototyping environment. Proceedings of IEEE International Symposium on Intelligent Robotics, pages 15-28, 1991.
[26] G. Buttazzo, B. Allotta, and F. Fanizza. Mousebuster: a robot system for catching fast moving objects by vision. Proceedings of IEEE International Conference on Robotics and Automation, pages 932-937, 1993.
[27] A. Castano and S. Hutchinson. Hybrid vision/position servo control of a robotic manipulator. Proceedings of IEEE International Conference on Robotics and Automation, pages 1264-1269, 1992.
[28] V. Chalana, D.R. Haynor, and Y. Kim. Left-ventricular boundary detection from short-axis echocardiograms: the use of active contour models. SPIE Conference on Image Processing, 2167:786-798, 1994.
[29] V. Chalana, D.T. Linker, D.R. Haynor, and Y. Kim. A multiple active contour model for cardiac boundary detection on echocardiographic sequences. IEEE Transactions on Medical Imaging, 15(3):290-298, June 1996.
[30] F. Chaumette, P. Rives, and B. Espiau. Positioning of a robot with respect to an object, tracking it and estimating its velocity by visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, pages 2248-2253, 1991.
[31] C.M. Chen, H.H.S. Lu, and Y.C. Lin. A new ultrasound image segmentation algorithm based on an early vision model and discrete snake model. SPIE Proceedings on Medical Imaging, 3338:959-970, 1998.
[32] D.C. Cheng, A.S. Trucksass, K.S. Cheng, M. Sandrock, Q. Pu, and H. Burkhardt. Automatic detection of the intimal and the adventitia layers of common carotid artery wall in ultrasound b-mode images using snakes. Proceedings of IEEE International Conference on Image Analysis and Processing, pages 452-457, 1999.
[33] K.W. Cheung, T. Lee, and R.T. Chin. Boundary detection by artificial neural network. Proceedings of International Joint Conference on Neural Networks, 2:1189-1194, 1993.
[34] S. Chieaverni, L. Sciavicco, and B. Siciliano. Control of robotic systems through singularities. Proceedings of International Workshop on Nonlinear and Adaptive Control, 1991.
[35] G.I. Chiou and J.N. Hwang. Image sequence classification using a neural network based active contour model and a hidden markov model. Proceedings of IEEE International Conference on Image Processing, 3:926-930, 1994.
[36] M.M. Choy and J.S. Jin. Morphological image analysis of left-ventricular endocardial borders in 2d echocardiograms. SPIE Proceedings on Medical Imaging, volume 2710, pages 852-863, 1996.
[37] C.H. Chu, E.J. Delp, and A.J. Buda. Detecting left ventricular endocardial and epicardial boundaries by digital two-dimensional echocardiography. IEEE Transactions on Medical Imaging, 7(2):81-90, 1988.
[38] W.F. Clocksin, S.E. Bromley, P.G. Davey, A.R. Vidler, and C.G. Morgan. An implementation of model-based visual feedback for robot arc welding of thin sheet steel. International Journal of Robotics Research, 4(1):13-26, 1985.
[39] F. Conticelli, B. Allotta, and V. Colla. Global asymptotic stabilization of visually-servoed manipulators. Proceedings of IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pages 926-931, 1999.
[40] P.I. Corke. Visual Control of Robots: High Performance Visual Servoing. John Wiley & Sons Inc., 1996.
[41] P.Y. Coulon and M. Nougaret. Use of a TV camera system in closed-loop position control mechanism. In Robot Vision, A. Pugh (Ed.), I.F.S. Publications Ltd., London, 1983.
[42] J.J. Craig. Introduction to Robotics: Mechanics and Control. Addison Wesley, 1986.
[43] M. Craig. Sonography: An occupational health hazard. Journal of Diagnostic Medical Sonography, 1:121-124, May/June 1985.
[44] A. Cretual and F. Chaumette. Image-based visual servoing by integration of dynamic measurements. Proceedings of IEEE International Conference on Robotics and Automation, pages 1994-2001, 1998.
[45] D. de Cunha, P. Gravez, C. Leroy, E. Maillard, J. Jouan, P. Varley, M. Jones, M. Halliwell, D. Hawkes, P.N.T. Wells, and L. Angelini. The MIDSTEP system for ultrasound guided remote telesurgery. IEEE Engineering in Medicine and Biology, 20(3):1266-1269, 1998.
[46] E. Degoulange, S. Boudet, J. Gariepy, F. Pierrot, L. Urbain, J.L. Megnien, E. Dombre, and P. Caron. Hippocrate: an intrinsically safe robot for medical applications. Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 959-964, Victoria, B.C., Canada, October 1998.
[47] E.F. Delagi and A. Perotto. Anatomic Guide for the Electromyographer. Charles C. Thomas Publisher Ltd., Springfield, IL, 1980.
[48] E. Dickmanns and V. Graefe. Applications of dynamic monocular machine vision. Machine Vision and Applications, pages 241-261, 1988.
[49] E. Dickmanns and F.R. Schell. Autonomous landing of airplanes by dynamic machine vision. Proceedings of IEEE Workshop on Applications of Computer Vision, pages 172-179, 1992.
[50] J. Dietrich, G. Plank, and H. Kraus. Optoelectronic system housed in plastic sphere. European Patent No. 0 240 023; US Patent No. 4,785,180; JP Patent No. 1 763 620.
[51] K.A. Dzialo and R.J. Schalkoff. Control implications in tracking moving objects using time-varying perspective-projective imagery. IEEE Transactions on Industrial Electronics, 33(3):247-253, 1986.
[52] A.N. Evans and M.S. Nixon. Model filtering to reduce ultrasound speckle for feature extraction. IEE Proceedings on Vision, Image and Signal Processing, 142(2):87-94, 1995.
[53] A.N. Evans and M.S. Nixon. Biased motion-adaptive temporal filtering for speckle reduction in echocardiography. IEEE Transactions on Medical Imaging, 15(1):39-50, February 1996.
[54] J.T. Feddema, C.S.G. Lee, and O.R. Mitchell. Weighted selection of image features for resolved rate visual feedback control. IEEE Transactions on Robotics and Automation, 7(1):31-47, 1991.
[55] J.T. Feddema and O.R. Mitchell. Vision-guided servoing with feature-based trajectory generation. IEEE Transactions on Robotics and Automation, 5:691-700, 1989.
[56] J.T. Feddema and R.W. Simon. CAD-driven microassembly and visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, pages 1212-1219, 1998.
[57] A. Fenster and D.B. Downey. 3-D ultrasound imaging: A review. IEEE Engineering in Medicine and Biology Magazine, 15(6):41-51, 1996.
[58] A. Fenster, S. Tong, S. Sherebrin, D.B. Downy, and R.N. Rankin. Three-dimensional ultrasound imaging. Proceedings of SPIE in Medical Image Acquisition and Processing, J.K. Udupa and A. Fenster (Eds.), 2432:176-184, 1995.
[59] N. Friedland and D. Adam. Automatic ventricular cavity boundary detection from sequential ultrasound images using simulated annealing. IEEE Transactions on Medical Imaging, 8(4):344-353, December 1989.
[60] B.H. Friemel, L.N. Bohs, and G.E. Trahey. Relative performance of two-dimensional speckle-tracking techniques: Normalized correlation, non-normalized correlation and sum-absolute-difference. Proceedings of the IEEE Ultrasonics Symposium, pages 1481-1484, 1995.
[61] J.A. Gangloff, M. de Mathelin, and G. Abba. 6 DOF high speed dynamic visual servoing using GPC controllers. Proceedings of IEEE International Conference on Robotics and Automation, pages 2008-2013, 1998.
[62] A.H. Gee and R.W. Prager. Sequential 3D diagnostic ultrasound using the Stradx system. Medical Image Computing and Computer Assisted Intervention, Cambridge, UK, September 1999.
[63] V. Gemignani, M. Demi, M. Paterni, and A. Benassi. Real-time implementation of a new contour tracking procedure in a multi-processor DSP system. Proceedings of Circuits, Systems, Communications and Computers Conference, pages 3521-3526, 2000.
[64] A. Giachetti. On line analysis of echocardiographic image sequences. Medical Image Analysis, 2(3):1-25, 1998.
[65] A. Giachetti, G. Gigli, and V. Torro. Computer assisted analysis of echocardiographic image sequences. Proceedings of Computer Vision, Virtual Reality and Robotics in Medicine, pages 267-271, 1995.
[66] S.H. Gibson, H.E. Villanueva, and J. Falconer. Detecting the wall-motion of the fetal heart within ultrasound images. Proceedings of IEEE International Conference on Engineering in Medicine and Biology, 2:883-884, 1996.
[67] A. Gilbert, M. Giles, G. Flachs, R. Rogers, and H. Yee. A real-time video tracking system. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2(1):47-56, 1980.
[68] J.D. Gill, H. Ladak, D.A. Steinman, and A. Fenster. Accuracy of a semi-automatic technique for segmentation of the carotid arteries from 3D ultrasound images. Proceedings of BMES/EMBS Conference on Serving Humanity, Advancing Technology, 2:1146, 1999.
[69] B. Girod. Motion compensation prediction with fractional-pel accuracy. IEEE Transactions on Communications, 41(4):604-612, 1993.
[70] R. Goldberg. A modular robotic system for ultrasound image acquisition. M.Sc. Thesis, Johns Hopkins University, Baltimore, MD, 2001.
[71] R. Goldberg, D. Mazilu, R.H. Taylor, and D. Stoianovici. A modular robotic system for ultrasound image acquisition. Medical Image Computing and Computer-Assisted Intervention, pages 1430-1432, 2001.
[72] J.W. Goodman. Statistical Optics. Wiley-Interscience, New York, 1985.
[73] A. Gourdon, P. Poignet, G. Poisson, P. Vieyres, and P. Marche. A new robotic mechanism for medical application. Proceedings of IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pages 33-38, 1999.
[74] T. Gustavsson, R. Abu-Gharbieh, G. Hamarneh, and Q. Liang. Implementation and comparison of four different boundary detection algorithms for quantitative ultrasonic measurements of the human carotid artery. IEEE Proceedings of Computers in Cardiology, pages 69-72, 1997.
[75] T. Gustavsson, Q. Liang, I. Wendelhag, and J. Wikstrand. A dynamic programming procedure for automated ultrasonic measurement of the carotid artery. IEEE Proceedings of Computers in Cardiology, pages 297-300, 1994.
[76] T. Gustavsson, S. Molander, R. Pascher, Q. Liang, H. Broman, and K. Caidahl. A model-based procedure for fully automated boundary detection and 3D reconstruction from 2D echocardiograms. IEEE Proceedings of Computers in Cardiology, pages 209-212, 1994.
[77] G.D. Hager, W.C. Chang, and A.S. Morse. Robot feedback control based on stereo vision: Towards calibration-free hand-eye coordination. Proceedings of IEEE International Conference on Robotics and Automation, pages 2850-2856, 1994.
[78] G. Hamarneh and T. Gustavsson. Combining snakes and active shape models for segmenting the human left ventricle in echocardiographic images. Computers in Cardiology, 27:115-118, 2000.
[79] S.H. Han, W.H. Seo, S.Y. Lee, S.H. Lee, H.W. Lee, and H. Toshiro. A study on real-time implementation of visual feedback control of robot manipulator. Proceedings of IEEE International Conference on Systems, Man and Cybernetics, 2:824-829, 1999.
[80] R.C. Harrel, D.C. Slaughter, and P.D. Adsit. A fruit-tracking system for robotic harvesting. Machine Vision and Applications, pages 69-80, 1989.
[81] H. Hashimoto, T. Kubota, W.C. Lo, and F. Harashima. A control scheme of visual servo control of robotic manipulators using artificial neural network. Proceedings of IEEE International Conference on Robotics and Applications, pages 3-6, 1989.
[82] K. Hashimoto, T. Kimoto, T. Ebine, and H. Kimura. Manipulator control with image-based visual servo. Proceedings of IEEE International Conference on Robotics and Automation, pages 2267-2272, 1991.
[83] K. Hashimoto and T. Noritsugu. Performance and sensitivity in visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, pages 2321-2326, 1998.
[84] W.R. Hedrick. Ultrasound Physics and Instrumentation. 3rd Ed., Mosby Year Book, 1992.
[85] M. Helbing and R. Orglmeister. Anisotropic filtering for detecting left ventricular borders in echocardiographic images. Proceedings of IEEE International Conference on Computers in Cardiology, pages 197-200, 1993.
[86] W.R. Hendee and R. Ritenour. Medical Imaging Physics. Mosby Year Book, 1992.
[87] J. Hill and W.T. Park. Real time control of a robot with a mobile camera. Proceedings of the 9th International Symposium on Intelligent Robotics, pages 233-246, 1979.
[88] T.C. Hodges, P.R. Detmer, D.H. Burns, K.W. Beach, and D.E. Strandness Jr. Ultrasound three-dimensional reconstruction: in vitro and in vivo volume and area measurement. Ultrasound in Medicine and Biology, 20:719-729, 1994.
[89] R. Horaud, F. Dornaika, and B. Espiau. Visually guided object grasping. IEEE Transactions on Robotics and Automation, 14(4):525-532, 1998.
[90] K. Hosoda and M. Asada. Versatile visual servoing without knowledge of true jacobian. Proceedings of IEEE/RSJ/GI International Conference on Intelligent Robots and Systems, pages 186-193, 1994.
[91] N. Houshangi. Control of a robotic manipulator to grasp a moving target using vision. Proceedings of IEEE International Conference on Robotics and Automation, pages 604-609, 1990.
[92] S. Hutchinson, G. Hager, and P.I. Corke. A tutorial on visual servo control. IEEE Transactions on Robotics and Automation, 12(5):651-670, October 1996.
[93] G. Jacob, J.A. Nobel, and A. Blake. Robust contour tracking in echocardiographic sequences. Proceedings of the Sixth International Conference on Computer Vision, pages 408-413, 1998.
[94] W. Jang and Z. Bien. Feature-based visual servoing of an eye-in-hand robot with improved tracking performance. Proceedings of IEEE International Conference on Robotics and Automation, pages 2254-2260, 1991.
[95] M. Kabuka, J. Desoto, and J. Miranda. Robot vision tracking system. IEEE Transactions on Industrial Electronics, 35(1):40-51, 1988.
[96] M. Kabuka, E. McVey, and P. Shironoshita. An adaptive approach to video tracking. IEEE Transactions on Robotics and Automation, 4(2):228-236, 1988.
[97] M. Kass, A. Witkin, and D. Terzopoulos. Snakes: Active contour models. International Journal of Computer Vision, 1:321-331, 1987.
[98] D. Kim, T.M. Kinter, and J.F. Greenleaf. Correlation search method with third-order statistics for computing velocities from ultrasound images. IEEE Ultrasonics Symposium, pages 869-872, 1989.
[99] M. Knaflitz and G. Balestra. Computer analysis of the myoelectric signal. IEEE Micro, 11(5):12-15, October 1991.
[100] N. Koizumi, S. Warisawa, M. Mitsushi, and H. Hashizume. Impedance controller for a remote ultrasound diagnostic system. Proceedings of the IEEE International Conference on Robotics and Automation, pages 651-656, 2002.
[101] J.I. Koo and S.B. Park. Speckle reduction with edge preservation in medical ultrasonic images using a homogeneous region growing mean filter (HRGMF). Ultrasonic Imaging, 13:211-237, 1991.
[102] M. Kuperstein. Generalized neural model for adaptive sensory-motor control of single postures. Proceedings of IEEE International Conference on Robotics and Automation, pages 140-143, 1988.
[103] H.M. Ladak, D.B. Downey, D.A. Steinman, and A. Fenster. Semi-automatic technique for segmentation of the prostate from 2D ultrasound images. Proceedings of IEEE BMES/EMBS Conference on Serving Humanity, Advancing Technology, 2:1144, 1999.
[104] M. Leahy, V. Milholen, and R. Shipman. Robotic aircraft refueling: a concept demonstration. Proceedings of National Aerospace and Electronics Conference, pages 1145-1150, 1990.
[105] F. Lefebvre, G. Berger, and P. Laugier. Automatic detection of the boundary of the calcaneus from ultrasound parametric images using an active contour model; clinical assessment. IEEE Transactions on Medical Imaging, 17(1):45-52, 1998.
[106] Z. Lin, V. Zeman, and R.V. Patel. On-line robot trajectory planning for catching a moving object. Proceedings of IEEE International Conference on Robotics and Automation, pages 1726-1731, 1989.
[107] Y.J. Liu, W.S. Ng, M.Y. Teo, and H.C. Lim. Computerized prostate boundary estimation of ultrasound images using radial bas-relief method. Medical and Biological Engineering and Computing, 35(5):445-454, 1997.
[108] T. Loupas, W.N. McDicken, and P.L. Allan. An adaptive weighted median filter for speckle suppression in medical ultrasound images. IEEE Transactions on Circuits and Systems, 36(1):129-135, 1989.
[109] C.J. De Luca. The use of surface electromyography in biomechanics. Journal of Applied Biomechanics, 13:135-163, 1997.
[110] J.J. Mai, C. Kargel, S. Mhanna, and M.F. Insana. Ultrasonic strain imaging in media with pulsatile flow. Proceedings of SPIE Medical Imaging Conference: Ultrasonic Imaging and Signal Processing, M.F. Insana and K.K. Shung (Eds.), 4325:139-149, February 2001.
[111] A.G. Makhlin. Stability and sensitivity of servo vision systems. Proceedings of the 5th International Conference on Robot Vision and Sensory Controls, pages 79-89, 1985.
[112] E. Malis, F. Chaumette, and S. Boudet. Positioning a coarse-calibrated camera with respect to an unknown object by 2d 1/2 visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, pages 1352-1359, 1998.
[113] E. Malis, F. Chaumette, and S. Boudet. Positioning a coarse-calibrated camera with respect to an unknown object by 2d 1/2 visual servoing. Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 691-697, 1998.
[114] R. Manduchi and G.A. Mian. Accuracy analysis for correlation-based image registration algorithms. Proceedings of IEEE International Symposium on Circuits and Systems, pages 834-837, 1993.
[115] L. Markosian. A motion-compensated filter for ultrasound image sequences. Technical Report CS-96-14, Department of Computer Science, Brown University, 1996.
[116] P. Martinet and J. Gallice. Position based visual servoing using a non-linear approach. Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 531-536, 1998.
[117] A. Maruyama and M. Fujita. Robust visual servo control for planar manipulators with eye-in-hand configurations. Proceedings of the 36th International Conference on Decision and Control, pages 2551-2552, 1997.
[118] T. McInerney and D. Terzopoulos. Deformable models in medical image analysis: a survey. Medical Image Analysis, 1(2):91-108, 1996.
[119] W. Miller. Sensor-based control of robotic manipulators using a general learning algorithm. IEEE Transactions on Robotics and Automation, 3(5):157-165, 1987.
[120] M. Mitsuishi, S. Warisawa, T. Tsuda, T. Higuchi, N. Koizumi, H. Hashizume, and K. Fujiwara. Remote ultrasound diagnostic system. Proceedings of the IEEE International Conference on Robotics and Automation, pages 1567-1574.
[121] F. Miyazaki and S. Arimoto. Sensory feedback for robot manipulators. Journal of Robotic Systems, 2(1):53-71, 1985.
[122] J. Mochizuki, M. Takahashi, and S. Hata. Unpositioned workpieces handling robot with visual and force sensors. IEEE Transactions on Industrial Electronics, 34(1):1-4, 1987.
[123] J. Montagnat, H. Delingette, and G. Malandain. Cylindrical echocardiographic image segmentation based on 3D deformable models. Proceedings of International Conference on Medical Image Computing and Computer Assisted Intervention, 1679:168-175, Cambridge, UK, 1999.
[124] G. Morel, E. Malis, and S. Boudet. Impedance based combination of visual and force control. Proceedings of IEEE International Conference on Robotics and Automation, pages 1743-1748, 1998.
[125] R. Murray, Z. Li, and S.S. Sastry. A Mathematical Introduction to Robotic Manipulation. CRC Press, 1994.
[126] R. Muzzolini, Y.H. Yang, and R. Pierson. Multiresolution texture segmentation with application to diagnostic ultrasound images. IEEE Transactions on Medical Imaging, 12(1):108-123, 1993.
[127] Y. Nakabo and M. Ishikawa. Visual impedance using 1 ms visual feedback system. Proceedings of IEEE International Conference on Robotics and Automation, pages 2333-2338, 1998.
[128] S. Negahdaripour and J. Fox. Undersea optical stationkeeping: Improved methods. Journal of Robotic Systems, 8(3):319-338, 1991.
[129] B.J. Nelson and P.K. Khosla. Force and vision resolvability for assimilating disparate sensory feedback. IEEE Transactions on Robotics and Automation, 12, 1996.
[130] T.R. Nelson, D.B. Downey, D.H. Pretorius, and A. Fenster. Three-Dimensional Ultrasound. Lippincott Williams and Wilkins, 2000.
[131] R. Ohbuchi, D. Chen, and H. Fuchs. Incremental volume reconstruction and rendering for 3-D ultrasound imaging. Proceedings of SPIE, Visualization in Biomedical Computing, R.A. Robb (Ed.), 1808:312-323, 1992.
[132] R. Ohbuchi and H. Fuchs. Incremental 3-D ultrasound imaging from a 2-D scanner. Visualization in Biomedical Computing, pages 360-367, 1990.
[133] N. Papanikolopoulos, P. Khosla, and T. Kanade. Vision and control techniques for robotic visual tracking. Proceedings of IEEE International Conference on Robotics and Automation, pages 857-864, 1991.
[134] N.P. Papanikolopoulos and P.K. Khosla. Shared and traded telerobotic visual control. Proceedings of IEEE International Conference on Robotics and Automation, pages 878-885, 1992.
[135] V.G.M. Paterni, M. Demi, A. Benassi, and V. Gemignani. A real-time contour tracking system to investigate the cross-sectional area changes of the aorta. Computers in Cardiology, 27:599-602, 2000.
[136] M.L. Patten. Questionnaire Research: A Practical Guide. Pyrczak Publishing, 2001.
[137] R.P. Paul. Robot Manipulators: Mathematics, Programming and Control. MIT Press, Cambridge, Massachusetts, 1981.
[138] J.A. Piepmeier, G.V. McMurray, and H. Lipkin. A dynamic jacobian estimation method for uncalibrated visual servoing. Proceedings of IEEE International Conference on Advanced Intelligent Mechatronics, pages 944-949, 1999.
[139] J.A. Piepmeier, G.V. McMurray, and H. Lipkin. A dynamic quasi-newton method for uncalibrated visual servoing. Proceedings of IEEE International Conference on Robotics and Automation, 2:1595-1600, 1999.
[140] F. Pierrot, E. Dombre, and E. Degoulange. Hippocrate: A safe robot arm for medical applications with force feedback. Medical Image Analysis, 3:285-300, 1999.
[141] F. Pla and M. Bober. Estimating translation/deformation motion through phase correlation. International Conference on Image Analysis and Processing, 1997.
[142] R.W. Prager, R.N. Rohling, A.H. Gee, and L. Berman. Automatic calibration for 3-D free-hand ultrasound. Technical Report CUED/F-INFENG/TR 303, Cambridge University Department of Engineering, September 1997.
[143] J.U. Quistgaard. Signal acquisition and processing in medical diagnostic ultrasound. IEEE Signal Processing Magazine, 14(1):67-74, 1997.
[144] F.H. Raab, E.B. Blood, T.O. Steiner, and H.R. Jones. Magnetic position and orientation tracking system. IEEE Transactions on Aerospace and Electronic Systems, 15:709-717, 1979.
[145] V.A. Ramirez, R.M. Cid, and M. Briot. A fourier transform based method for estimation of 2-D translation and rotation: An application to outdoor mobile robotics. Technical Report, Laboratory for Analysis and Architecture of Systems, Toulouse, France, 1998.
[146] B.S. Reddy and B.N. Chatterji. An FFT-based technique for translation, rotation, and scale-invariant image registration. IEEE Transactions on Image Processing, 5(8):1266-1271, 1996.
[147] F. Reyes and R. Kelly. Experimental evaluation of fixed-camera direct visual controllers on a direct-drive robot. Proceedings of IEEE International Conference on Robotics and Automation, pages 2327-2332, 1998.
[148] P. Rives and J.J. Borrelly. Visual servoing techniques applied to an underwater vehicle. Proceedings of IEEE International Conference on Robotics and Automation, pages 1851-1856, 1997.
[149] A. Rizzi and D. Koditschek. Preliminary experiments in spatial robot juggling. Proceedings of the 2nd International Symposium on Experimental Robotics, France, 1991.
[150] R. Rohling, A. Gee, and L. Berman. 3-D spatial compounding of ultrasound images. Medical Image Analysis, 1(3):177-193, Oxford University Press, Oxford, UK, 1997.
[151] T. Sakaguchi, M. Fujita, H. Watanabe, and F. Miyazaki. Motion planning and control for a robot performer. Proceedings of IEEE International Conference on Robotics and Automation, pages 925-931, 1993.
[152] S.E. Salcudean, G. Bell, S. Bachmann, W.H. Zhu, P. Abolmaesumi, and P.D. Lawrence. Robot-assisted diagnostic ultrasound - design and feasibility experiments. Medical Image Computing and Computer Assisted Intervention, C. Taylor and A. Colchester (Eds.), Springer, pages 1062-1071, September 1999.
[153] S.E. Salcudean and N.R. Parker. 6-DOF desk-top voice-coil joystick. 6th Symposium on Haptic Interfaces for Virtual Environment and Teleoperation Systems (ASME Winter Annual Meeting), 61:131-138, Dallas, Texas, November 16-21, 1997.
[154] S.E. Salcudean, W.H. Zhu, P. Abolmaesumi, S. Bachmann, G. Bell, P.D. Lawrence, S.P. DiMaio, and M.R. Sirouspour. Robot-assisted diagnostic ultrasound. Proceedings of IEEE International Conference on Robotics and Automation, Video Presentation, 2001.
[155] S.E. Salcudean, W.H. Zhu, P. Abolmaesumi, S. Bachmann, and P.D. Lawrence. A robot system for medical ultrasound. The 9th International Symposium of Robotics Research (ISRR), J.M. Hollerbach and D.E. Koditschek (Eds.), pages 152-159, Snowbird, Utah, October 9-12, 1999.
[156] A.C. Sanderson and L.E. Weiss. Image based visual servo control using relational graph error signal. Proceedings of IEEE International Conference on Cybernetics and Society, pages 1074-1077, 1980.
[157] S.K. Setarehdan and J.J. Soraghan. Automatic cardiac LV boundary detection and tracking using hybrid fuzzy temporal and fuzzy multiscale edge detection. IEEE Transactions on Biomedical Engineering, 46(11):1364-1378, 1999.
[158] S.K. Setarehdan, J.J. Soraghan, and I.A. Hunter. Fully automatic left ventricular myocardial boundary detection in echocardiographic images: A comparison of two modern methods. IEE Colloquium on Artificial Intelligence Methods for Biomedical Data Processing, 5:1-5, 1996.
[159] S.K. Setarehdan, J.J. Soraghan, and I.A. Hunter. Fully automatic left ventricular myocardial boundary detection in echocardiographic images: a comparison of two modern methods. IEE Colloquium on Artificial Intelligence Methods for Biomedical Data Processing, 5:1-6, 1996.
[160] R. Sharma and S. Hutchinson. Motion perceptibility and its application to active vision-based servo control. IEEE Transactions on Robotics and Automation, 13(4):607-617, 1997.
[161] Y. Shirai and H. Inoue. Guiding a robot by visual feedback in assembling tasks. Pattern Recognition, 5:99-108, 1973.
[162] S. Skaar, W. Brockman, and R. Hanson. Camera-space manipulation. International Journal of Robotics Research, 6(4):20-32, 1987.
[163] G. Skofteland and G. Hirzinger. Computing position and orientation of a freeflying polyhedron from 3D data. Proceedings of IEEE International Conference on Robotics and Automation, pages 150-155, 1991.
[164] C.E. Smith, S.A. Brandt, and N.P. Papanikolopoulos. Eye-in-hand robotic tasks in uncalibrated environments. IEEE Transactions on Robotics and Automation, 13(6):903-914, 1997.
[165] British Columbia Ultrasonographer's Society, Healthcare Benefit Trust, and Health Sciences Association. Sonographer's work, health & disability survey. 1996.
[166] B. Solaiman, C. Roux, R.M. Rangayyan, F. Pipelier, and A. Hillion. Fuzzy edge evaluation in ultrasound endosonographic images. Proceedings of Canadian Conference on Electrical and Computer Engineering, 1:335-338, 1996.
[167] M. Sonka and J.M. Fitzpatrick, editors. Handbook of Medical Imaging. SPIE Press, 2000.
[168] M.G. Strintzis and I. Kokkindis. Maximum likelihood motion estimation in ultrasound image sequences. IEEE Signal Processing Letters, 4(6):156-157, 1997.
[169] F.B. Stulen and C.J. De Luca. Frequency parameters of the myoelectric signal as a measure of muscle conduction velocity. IEEE Transactions on Biomedical Engineering, 28(7):135-163, July 1981.
[170] H. Sutanto, R. Sharma, and V. Varma. Image based autodocking without calibration. Proceedings of IEEE International Conference on Robotics and Automation, 2:974-979, 1997.
[171] A. Suvichakorn and C. Chinrungrueng. Speckle noise reduction based on least squares approximation. Proceedings of IEEE Asia-Pacific Conference on Circuits and Systems, pages 430-433, 2000.
[172] M.C. Taine, A. Herment, B. Diebold, and P. Peronneau. Segmentation of cardiac and vascular ultrasound images with extension to border kinetics. Proceedings of IEEE Ultrasonics Symposium, 3:1773-1776, 1994.
[173] F. Tendick, J. Voichick, G. Tharp, and L. Stark. A supervisory telerobotic control system using model-based vision feedback. Proceedings of IEEE International Conference on Robotics and Automation, pages 2280-2285, 1991.
[174] C. Tomasi. Lecture notes of CS205. Department of Computer Science, Stanford University, Fall 2000.
[175] A.H. Torp, B. Olstad, K.P. Schipper, S. Frigstad, and K. Oygarden. Simultaneous tracking of endocard and epicard in ultrasound contrast imaging. Proceedings of IEEE Ultrasound Symposium, 3:1585-1588, 1994.
[176] S. Tsuruoka, M. Umehara, F. Kimura, T. Wakabayashi, Y. Miyake, and K. Sekioka. Regional wall motion tracking system for high-frame rate ultrasound echocardiography. Proceedings of the IEEE Advanced Motion Control Conference, pages 389-394, 1996.
[177] H.E. Vanderpool, E.A. Friis, B.S. Smith, and K.L. Harms. Prevalence of carpal tunnel syndrome and other work-related musculoskeletal problems in cardiac sonographers. Journal of Occupational Medicine, 35:604-610, June 1993.
[178] S. Venkatesan and C. Archibald. Real-time tracking in five degrees of freedom using two wrist-mounted laser range finders. Proceedings of IEEE International Conference on Robotics and Automation, pages 2004-2010, 1990.
[179] A. Vilchis-Gonzales, J. Troccaz, P. Cinquin, F. Courreges, G. Poisson, and B. Tondu. Robotic tele-ultrasound system (TER): Slave robot control. Proceedings of TA2001: 1st IFAC Conference on Telematic Applications in Automation and Robotics, 2001.
[180] R.F. Wagner, S.W. Smith, J.M. Sandrik, and H. Lopez. Statistics of speckle in ultrasound b-scans. IEEE Transactions on Sonics and Ultrasonics, 30(3):156-163, 1983.
[181] J. Wang and W.J. Wilson. Three-D relative position and orientation estimation using kalman filter for robot control. Proceedings of IEEE International Conference on Robotics and Automation, pages 2338-2645, 1992.
[182] D.B. Westmore and W.J. Wilson. Direct dynamic control of a robot using an end-point mounted camera and kalman filter position estimation. Proceedings of IEEE International Conference on Robotics and Automation, pages 2376-2384, 1991.
[183] D.E. Whitney. The mathematics of coordinated control of prosthetic arms and manipulators. Journal of Dynamic Systems and Measurement Control, pages 303-309, 1972.
[184] D. Xiao, K. Ghosh, N. Xi, and T.J. Tarn. Sensor-based hybrid position/force control of a robot manipulator in an uncalibrated environment. IEEE Transactions on Control Systems Technology, 8(4), 2000.
[185] F. Yeung, S.F. Levinson, D. Fu, and K.J. Parker. Feature-adaptive motion tracking of ultrasound image sequences using a deformable mesh. IEEE Transactions on Medical Imaging, 17(6):945-956, December 1998.
[186] A. Yezzi, S. Kichenassamy, A. Kumar, P. Olver, and A. Tannenbaum. A geometric snake model for segmentation of medical imagery. IEEE Transactions on Medical Imaging, 16(2):199-209, 1997.
[187] E. Zergeroglu, D.M. Dawson, M.S. de Queiroz, and S. Nagarkatti. Robust visual-servo control of robot manipulators in the presence of uncertainty. Proceedings of the 38th International Conference on Decision and Control, pages 4137-4142, 1999.
[188] D.B. Zhang, L.V. Gool, and Oosterlinck. Stochastic predictive control of robot tracking systems with dynamic visual feedback. Proceedings of IEEE International Conference on Robotics and Automation, pages 610-615, 1990.
[189] W.H. Zhu, S.E. Salcudean, S. Bachmann, and P. Abolmaesumi. Motion/force/image control of a diagnostic ultrasound robot. Proceedings of IEEE International Conference on Robotics and Automation, 2:1580-1585, San Francisco, CA, 2000.

Appendix A

Ultrasound Imaging

A.1 Ultrasound Imaging Basics

In ultrasound imaging, an acoustic wave is launched into the body using a handheld transducer. The wave interacts with tissue and blood, and some of the transmitted energy returns to the transducer to be detected by the instrument. If we know the velocity of propagation in the tissue being interrogated, we can determine the distance from the transducer at which the interaction occurred. The characteristics of the return signal (amplitude, phase, etc.) provide information on the nature of the interaction, and hence they give some indication of the type of medium in which it occurred.

In most diagnostic applications of ultrasound, use is made of ultrasound waves reflected from interfaces between different tissues in the patient [86]. The fraction of the impinging energy reflected from an interface depends on the difference in acoustic impedance of the media on opposite sides of the interface. The acoustic impedance Z of a medium is the product of the density ρ of the medium and the velocity c of sound waves in the medium:

    Z = ρc    (A.1)

Acoustic impedances of several materials are listed in Table A.1. For an ultrasound wave incident perpendicularly upon an interface, the fraction α_R of the incident energy that is reflected (i.e., the reflection coefficient α_R) is

    α_R = ((Z₂ − Z₁) / (Z₂ + Z₁))²

where Z₁ and Z₂ are the acoustic impedances of the two media. With a large impedance mismatch at an interface, much of the energy of an ultrasound wave is reflected, and only a small amount is transmitted across the interface. For example, ultrasound beams are reflected strongly at air-tissue and air-water interfaces because the impedance of air is much less than that of tissue or water. At a muscle-liver interface, slightly more than 1% of the incident energy is reflected, and about 99% of the energy is transmitted across the interface. Even though the reflected energy is small, it is often sufficient for visualization of the liver border.

    Material    Impedance (kg/m²/s)
    Air         430
    Water       1.52 × 10⁶
    Brain       1.56 × 10⁶
    Fat         1.40 × 10⁶
    Bone        6.00 × 10⁶
    Skin        1.63 × 10⁶
    Blood       1.56 × 10⁶
    Muscle      1.68 × 10⁶

    Table A.1: Approximate acoustic impedances of selected materials.
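As a small numeric check of the reflection coefficient α_R above, using the impedances of Table A.1 (a minimal sketch; the function name is illustrative):

```python
def reflection_coefficient(z1, z2):
    """Fraction of normally incident energy reflected at an interface
    between media with acoustic impedances z1 and z2."""
    return ((z2 - z1) / (z2 + z1)) ** 2

Z = {"air": 430.0, "water": 1.52e6, "fat": 1.40e6,
     "muscle": 1.68e6, "bone": 6.00e6}

print(reflection_coefficient(Z["air"], Z["muscle"]))  # ~0.999: air-tissue reflects nearly everything
print(reflection_coefficient(Z["fat"], Z["muscle"]))  # ~0.008: most energy crosses a fat-muscle interface
```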
Because of the high value of the coefficient of ultrasound reflection at an air-tissue interface, water paths and various creams and gels are used during ultrasound examinations to remove air pockets (i.e., to obtain good acoustic coupling) between the ultrasound transducer and the patient's skin. With adequate acoustic coupling, the ultrasound waves will enter the patient with little reflection at the skin surface. Similarly, strong reflections of ultrasound occur at the boundary between the chest wall and the lungs and at the millions of air-tissue interfaces within the lungs. Because of the large impedance mismatch at these interfaces, efforts to use ultrasound as a diagnostic tool for the lungs have been unrewarding.

A.2 Ultrasound Analysis of Peripheral Artery Disease

Ultrasound imaging is widely used to depict carotid, brachial, femoral, as well as other peripheral arteries [167]. There are several major advantages to using ultrasound in comparison to other techniques such as MRI or CT scanning. Most importantly, ultrasound imaging is non-invasive and allows real-time imaging of the arterial lumen and wall, which is not currently possible with any other imaging modality [167]. While x-ray contrast angiography offers real-time imaging, the image information is limited to the vessel lumen and no information about the vessel wall is available. Insufficient spatial image resolution, the need for injection of blood pool contrast agents, and low volumetric imaging speed limit the utility of MRI to lumen visualization of large vessels. The same limitation applies to CT in general, although electron beam as well as spiral CT has proven useful for visualization and quantitation of arterial calcification [167]. B-mode ultrasound provides a noninvasive approach to visualize arteries and can be applied repeatedly to the same subject to monitor the development of atherosclerosis. Ultrasound methods have other advantages, allowing direct evaluation of the arterial wall. Therefore, both lumen diameter and wall thickness can be measured, which is important to assess the severity of the disease and to evaluate its progression [167].

A.3 Ultrasound Imaging Display Modes

Ultrasound images are normally displayed in four different modes, which are explained here.

A.3.1 A-Mode

A-mode (amplitude-mode) ultrasound scanning is based on the echo-ranging principle, similar to sonar. A pulsed ultrasound wave is directed into the studied subject and the echoes generated at various tissue interfaces are detected. Only the structures that lie along the direction of the ultrasound wave propagation are interrogated; the beam path is called a scan line [167]. The display is an oscilloscope with the x axis as the time after the source pulse and the y axis as the amplitude of the reflected sound pulse as it reaches the transducer. When the distances between tissue interfaces are less than half of the ultrasound pulse length, the displayed spikes overlap; different echoes appear as one spike, which is consequently interpreted as a single interface [167].

A.3.2 B-Mode

The most commonly used scanning mode is the brightness scan. B-mode scanning converts the A-mode spike into a brightness-modulated dot, which is varied in response to the amplitude of the echo-induced signal [84]. The position of the dot is derived from the time-of-flight information.
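Concretely, the time of flight gives the dot's depth through the echo-ranging relation d = ct/2, since the pulse travels to the interface and back. A minimal sketch (1540 m/s is the conventional soft-tissue sound speed assumed here, not a value given in this appendix):

```python
def echo_depth_m(time_of_flight_s, c=1540.0):
    """Depth of the reflecting interface: the pulse travels there and
    back, so the one-way distance is c * t / 2."""
    return c * time_of_flight_s / 2.0

print(echo_depth_m(65e-6))  # a 65 microsecond round trip -> ~0.05 m (5 cm) deep
```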
Two-dimensional static B-mode imaging requires the collection of dots from individual lines of sight, one at a time, and the storing of the scan data until all lines of sight that form the image are collected. Real-time B-mode ultrasound imaging allows continuous monitoring of the area of interest: the displayed image rapidly updates with new scan data as the ultrasound beam is swept repeatedly throughout the field of view. The frame rate is usually 15 or 30 frames per second, but higher frame rates are possible.

A.3.3 M-Mode

Another way of viewing echo information is to interrogate tissue in only one direction and to display the resulting one-dimensional data over time as a scrolling display [143]. Such a display is referred to as an "M-mode" or "motion-mode" display. The advantage of M-mode imaging is that it has much finer temporal resolution (on the order of 1000 lines per second) than B-mode imaging, which, for example, can be important when analyzing cardiac valve motion.

A.3.4 Doppler Modes

In addition to simple "brightness" displays of echo information, ultrasound systems can detect motion using either continuous-wave (CW) or pulsed-wave (PW) Doppler techniques [167]. The resulting data are usually processed and displayed as either spectral Doppler (Fourier transform information) or color flow images. Power-mode Doppler techniques are related to color flow imaging, but in this case an estimate is made of the power of the Doppler signal, as opposed to velocity and direction. This is an easier estimate to make and is generally more sensitive to low flow states and less angle dependent than color flow. These advantages allow more complete visualization of the fine vascular architecture.
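The textbook Doppler relation underlying these modes converts the measured frequency shift f_d into an axial velocity estimate, v = c f_d / (2 f_0 cos θ), where f_0 is the transmit frequency and θ the beam-to-flow angle. A minimal sketch under an assumed soft-tissue sound speed:

```python
import math

def doppler_velocity(f_shift_hz, f0_hz, angle_deg, c=1540.0):
    """Flow speed from the Doppler shift: v = c * fd / (2 * f0 * cos(theta)),
    with theta the beam-to-flow angle (assumed soft-tissue c of 1540 m/s)."""
    return c * f_shift_hz / (2.0 * f0_hz * math.cos(math.radians(angle_deg)))

# A 1.3 kHz shift at 5 MHz with a 60 degree beam-to-flow angle -> ~0.4 m/s
print(doppler_velocity(1.3e3, 5e6, 60.0))
```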
A.4 Ultrasound Imaging Artifacts

An ultrasound image is the portrayal of anatomy probed by ultrasound beams. An imaging artifact is any structure in the image that does not correlate with actual tissue. There are four major ultrasound imaging artifacts [167].

A.4.1 Reverberations

Reverberations occur from multiple reflections in the investigated subject or multiple reflections between the investigated object and the transducer. This is visualized in the ultrasound image as a series of bright bands of decreasing intensity that are equidistant from each other.

A.4.2 Speckle

Speckle patterns in ultrasound images are generated by constructive and destructive interference of backscattered echoes. When visualizing anatomical structures, the speckle patterns create an undesirable effect in the image. For this reason, much effort is spent on reducing speckle patterns when designing echocardiographic equipment.

A.4.3 Dropouts

Dropouts are common in ultrasound images. They are produced by the individual or combined effects of multiple causes, such as signal attenuation, refraction, transducer defects, and the presence of anatomical structures. Since no information can be derived from the dropout areas, nearby tissue interfaces may impede identification of the correct borders [167].

A.4.4 Probe Motion Artifacts

Probe motion artifact is an important issue if imaging of the same vessel is performed over an extended period of time. During any long image acquisition, both the ultrasound transducer and the subject must be kept stationary to ensure a reproducible imaging plane. However, the sonographer may not hold the transducer still due to hand fatigue, the subject may move during the scanning, or the vessel itself may move inside the imaged tissue. When such motion occurs, probe motion artifacts will have observable effects in the acquired images, especially if diameter measurement is involved.

A.5 3-D Ultrasound

One disadvantage of 2-D ultrasound imaging relates to the subjectivity of the conventional exam, which results from the dependence on the experience and knowledge of the diagnostician to manipulate the ultrasound transducer, mentally transform the 2-D images into 3-D tissue structure, and make the diagnosis or perform an interventional procedure [167]. This difficulty results primarily from using a spatially flexible 2-D imaging technique to view 3-D anatomy. In addition, it is difficult to localize the thin 2-D ultrasound image plane in the organ, and difficult to reproduce a particular image location at a later time, making the conventional 2-D exam a poor imaging modality for quantitative prospective or follow-up studies. Further, the patient's anatomy or orientation sometimes restricts the image angle, resulting in inaccessibility of the optimal image plane necessary for diagnosis.

The goal of 3-D ultrasound imaging is to overcome these limitations by providing an imaging technique that reduces the variability of the conventional technique and allows the diagnostician to view the anatomy in 3-D [57]. Unlike CT and MRI, ultrasound provides images at a high rate (10-60 images per second), and the orientation of the images is flexible because they are not necessarily acquired as a stack of planes. In addition to the unique problems imposed by ultrasound imaging physics (speckle, shadowing, distortions), the high rate of image acquisition and the flexibility of the conventional technique pose unique problems to overcome, as well as opportunities to be exploited, in extending ultrasound imaging from its 2-D presentation of images to 3-D and 4-D [57]. The generation of a 3-D ultrasound image from a series of 2-D image slices has been covered extensively in the literature, with methods of registering the location of the probe and reconstruction of three-dimensional objects being the two main areas of research.

A.5.1 Acquisition Techniques

A number of methods have been used for image acquisition in 3-D ultrasound systems: free-hand acquisition, mechanical localizers, and 3-D probes [57].

A.5.1.1 Free-hand Acquisition

In free-hand acquisition, the operator holds an assembly composed of the probe and positioning sensors, such as a magnetic field sensor, and manipulates it in the usual manner over the anatomy to be viewed. Three basic approaches have been developed to track the ultrasound probe in this method: acoustic, articulated arm, and electromagnetic positioners, as shown in Figure A.1.

Acoustic Positioner

The most common method for acquiring free-hand 3-D images is based on acoustic ranging (Figure A.1(a)) [57]. The angulation and position of the transducer are obtained by mounting three sound-emitting devices in fixed positions relative to each other on the transducer. An array of microphones is typically mounted above the patient. To obtain the information necessary to reconstruct the 3-D image, the operator moves the transducer freely over the patient while the sound-emitting devices are active, so the position and angulation of the transducer can be continuously monitored [57].
There are some drawbacks to using acoustic tracking techniques, however. For instance, the ultrasound technician must avoid blocking the sound path. As well, the noise generated by the probe can be disconcerting for some patients. In addition, corrections must be made for variation of the speed of sound in air due to changes in temperature and humidity [57].

Figure A.1: Schematic diagram showing three basic methods for obtaining the position and orientation of the ultrasound transducer for the free-hand acquisition technique: a) acoustic, b) articulated arm, and c) electromagnetic positioners.

Articulated Arm Positioner

The simplest approach is achieved by mounting the transducer on a mechanical arm system with multiple movable joints, which allows the operator to manipulate the transducer in a complex manner and select the desired view and orientation (Figure A.1(b)). One example of this approach can be found in [132], where potentiometers were used to measure the joint angles, which in turn were used to calculate the position of the transducer. The use of passive mechanical arms to determine the ultrasound transducer location does have some limitations. In particular, they tend to be cumbersome to use due to the added inertia.

Magnetic Field Sensor

Another approach makes use of a six degree-of-freedom magnetic field sensor to measure the transducer's position and orientation [57,58,88,144]. This device, shown schematically in Figure A.1(c), consists of a transmitter placed close to the patient and a receiver mounted on the probe. The transmitter produces a spatially varying magnetic field, and the receiver, containing three orthogonal coils, measures the field strength. By measuring the local magnetic field, the position and angulation of the receiver relative to the transmitter can be determined.

Although this approach is very flexible, accurate 3-D reconstruction requires that electromagnetic interference be minimized, that the transmitter be close to the receiver to allow field measurements with sufficient signal-to-noise ratio, and that ferrous or highly conductive metals be absent from the vicinity, since they can distort the magnetic field. Two companies currently produce magnetic positioning devices of sufficient quality for 3-D ultrasound imaging: the Fastrak by Polhemus and the Flock of Birds by Ascension Technologies. These devices have been used successfully in echocardiography, obstetrics, and vascular imaging [130].
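Whichever positioner is used, free-hand reconstruction ultimately composes two rigid transforms per frame: the tracked sensor pose reported by the positioner and a fixed probe calibration determined once, offline. The following sketch is illustrative only; the function name, the calibration transform, and the pixel scales are assumptions rather than a description of any particular system.

    import numpy as np

    def pixel_to_world(u, v, T_world_receiver, T_receiver_image, sx, sy):
        """Map a B-mode pixel (u, v) into 3-D world coordinates.

        T_world_receiver : 4x4 pose of the tracked sensor (e.g., the
                           magnetic receiver), one reading per frame.
        T_receiver_image : 4x4 fixed probe-calibration transform from
                           the image plane to the sensor.
        sx, sy           : pixel size (mm) along the image axes.

        The pixel is taken to lie in the z = 0 plane of the image frame.
        """
        p_image = np.array([u * sx, v * sy, 0.0, 1.0])
        return (T_world_receiver @ T_receiver_image @ p_image)[:3]

    # Example with an identity calibration and a sensor translated 50 mm in x:
    T_cal = np.eye(4)
    T_rx = np.eye(4)
    T_rx[0, 3] = 50.0
    print(pixel_to_world(100, 200, T_rx, T_cal, sx=0.1, sy=0.1))  # [60. 20. 0.]

Stacking the transformed pixels of many tracked slices produces the irregularly spaced 3-D point set from which the volume is then reconstructed.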
A.5.1.2 Mechanical Localizers

In this approach, a mechanical 3-D probe is used in which the third dimension is obtained by mechanical movement of the transducer in a precise, predefined manner [57]. As the transducer is moved, 2-D ultrasound images are acquired at predefined spatial intervals. In general, these assemblies make use of conventional mechanical or linear-array transducers mounted in an assembly that allows translation or rotation of the transducer by a motor. The reconstruction is efficient because the required geometrical parameters can be computed in advance. This approach to 3-D imaging has been implemented with three basic types of motion, as shown schematically in Figure A.2: linear, fan, and rotational scanning; a sketch of the corresponding slice geometries follows the three descriptions below.

Figure A.2: Schematic diagram showing the three basic types of motion used in 3-D ultrasound systems making use of mechanical scanning: a) linear, b) fan, and c) rotational.

Linear Scanning

In this approach, the conventional ultrasound transducer is mounted on an assembly connected to a lead screw that is driven by a motor. Since the acquired 2-D images are parallel to each other and separated by predefined intervals, the reconstruction can be very efficient (Figure A.2(a)).

Fan Scanning

In this scanning geometry, the transducer is rotated about an axis at the transducer face, as shown in Figure A.2(b). This results in an angular sweep, providing a fan of planes that are acquired with a predefined angular separation [57]. The advantage of this technique is that the mechanism can be made sufficiently small to allow easy hand-held manipulation.

Rotational Scanning

In this scanning geometry, the transducer is placed into an external assembly that rotates the probe about its central axis (Figure A.2(c)). In this way, the probe tip and its axis location remain fixed, and the acquired images sweep out a conical volume in a propeller-like fashion [57]. With this approach, if any motion of either the patient or the probe occurs during the scan, other than the desired rotation about the probe axis, the resulting image will contain artifacts in the center along the axis of rotation. In addition, the relative geometry of the imaging plane and the axis of rotation must be accurately known to avoid artifacts.
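Because the motion is predefined, the rigid transform of the k-th slice can be tabulated before acquisition begins, which is precisely why mechanical reconstruction is efficient. The sketch below is illustrative only; the axis conventions (x lateral, y depth along the probe axis, z elevation) and the step sizes are assumptions, not the geometry of any specific scanner.

    import numpy as np

    def _rot(axis, a):
        """3x3 rotation by angle a (radians) about a unit axis (Rodrigues' formula)."""
        x, y, z = axis
        K = np.array([[0., -z, y], [z, 0., -x], [-y, x, 0.]])
        return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

    def slice_pose(k, mode, step):
        """4x4 pose of the k-th slice, with the origin at the transducer face.

        mode : 'linear'     - translate k*step mm along the elevation axis
               'fan'        - tilt k*step rad about the lateral axis at the face
               'rotational' - spin k*step rad about the probe's central axis
        """
        T = np.eye(4)
        if mode == 'linear':
            T[2, 3] = k * step
        elif mode == 'fan':
            T[:3, :3] = _rot((1.0, 0.0, 0.0), k * step)
        elif mode == 'rotational':
            T[:3, :3] = _rot((0.0, 1.0, 0.0), k * step)
        else:
            raise ValueError(mode)
        return T

    # 100 fan slices, 0.6 degrees apart, computed once before scanning starts:
    poses = [slice_pose(k, 'fan', np.radians(0.6)) for k in range(100)]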
A.5.1.3 2-D Arrays

All the 3-D imaging techniques described earlier require that a planar beam of ultrasound be swept over the anatomy by the use of a 1-D transducer array, which is manipulated either mechanically or free-hand. A better approach would be to keep the transducer stationary and use electronic scanning to sweep the ultrasound beams over the volume. This can be accomplished with the use of 2-D transducer arrays that produce an ultrasound beam covering the entire volume. A number of 2-D array designs have been described, but the one developed at Duke University for real-time 3-D echocardiography is the most advanced [57,130]. In this approach, the 2-D array generates a pulse of ultrasound diverging away from the array in a pyramidal shape. The echoes are processed to generate 3-D information in real time. However, there are currently some technical problems in designing these transducers at a reasonable cost, including the physical construction, the wiring, and the speed of processing the ultrasound signal.

Appendix B

Ultrasound Robot User Testing Form

1. The system is easy to use.
(strongly agree) 5 4 3 2 1 (strongly disagree)

2. You feel safe when you are working with the robot.
(strongly agree) 5 4 3 2 1 (strongly disagree)

3. You feel comfortable moving the robot by using the joystick.
(strongly agree) 5 4 3 2 1 (strongly disagree)

4. The robot moves smoothly when you are using the joystick.
(strongly agree) 5 4 3 2 1 (strongly disagree)

5. Please rate the following:

(a) Start button color change
(extremely useful) 5 4 3 2 1 (not useful at all)

(b) 3-D probe model
(extremely useful) 5 4 3 2 1 (not useful at all)

(c) Image control
(extremely useful) 5 4 3 2 1 (not useful at all)

(d) Force control
(extremely useful) 5 4 3 2 1 (not useful at all)

(e) Force display
(extremely useful) 5 4 3 2 1 (not useful at all)

(f) Axis activation buttons
(extremely useful) 5 4 3 2 1 (not useful at all)

(g) Contour extraction
(extremely useful) 5 4 3 2 1 (not useful at all)

(h) Feature-based automatic probe placement
(extremely useful) 5 4 3 2 1 (not useful at all)

6. The following features could be added to the system:

(a) Remembering the previous movements of the probe and using them for future scans.
(extremely useful) 5 4 3 2 1 (not useful at all)

(b) Real-time 3-D reconstruction of the carotid artery during scanning.
(extremely useful) 5 4 3 2 1 (not useful at all)

7. Please rate the following:

(a) You can use the robotic system similarly to the conventional method.
(absolutely agree) 5 4 3 2 1 (absolutely disagree)

(b) By using the robot, you can scan the carotid artery at a reasonable speed.
(absolutely agree) 5 4 3 2 1 (absolutely disagree)

(c) By using the robot, you can scan the carotid artery more accurately than with the conventional method.
(absolutely agree) 5 4 3 2 1 (absolutely disagree)

(d) The image controller is helpful during scanning.
(absolutely agree) 5 4 3 2 1 (absolutely disagree)

(e) You feel more relaxed during the examination because the robot is holding the ultrasound probe for you.
(absolutely agree) 5 4 3 2 1 (absolutely disagree)
