Automatic Pathology of Prostate Cancer in Whole Mount Slides Incorporating Individual Gland Classification

by

Sabrina Rashid

BSc in Electrical and Electronic Engineering, Bangladesh University of Engineering and Technology, 2011

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF Master of Applied Science in THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES (Electrical and Computer Engineering)

The University of British Columbia (Vancouver)

March 2014

© Sabrina Rashid, 2014

Abstract

This thesis presents an automatic pathology (AutoPath) approach to detecting prostatic adenocarcinoma based on the morphological analysis of high-resolution whole mount histopathology images of the prostate. We propose a novel technique for labeling individual glands as benign or malignant that exploits only gland-specific features. Two new features, the Number of Nuclei Layers and the Epithelial Layer Density, are proposed here to label individual glands. To extract the features, individual gland and nuclei units are segmented automatically. The nuclei units are segmented by employing a marker-controlled watershed algorithm; the gland units are segmented by consolidating their lumina with the surrounding layers of epithelium and nuclei. The main advantage of this approach is that it can detect individual malignant gland units irrespective of the neighboring histology and/or the spatial extent of the cancer. Therefore, the proposed AutoPath technique can achieve a more sensitive annotation of cancer than the current clinical protocol, in which cancer is annotated at the regional macro level rather than at the gland level.

We have also combined the proposed gland-based approach with a regional approach to perform automatic cancer annotation of the whole mount images. The proposed algorithm performs cancer detection in two stages: i) pre-screening of the whole mount images at a low resolution (5×), and ii) finer annotation of the cancerous regions by labeling individual glands at a higher magnification (20×). In the first stage, probable cancerous regions are classified using a random forest classifier that exploits regional features of the tissue. In the second stage, gland-specific features are used to label individual gland units as benign or malignant. The strong agreement between the experimental results and the pathologist's annotation corroborates the effectiveness of the proposed technique. The algorithm has been tested on 70 images. In a 10-fold cross-validation experiment it achieved an average sensitivity of 88%, specificity of 94%, and accuracy of 93%, surpassing the accuracy of other methods reported to date.

Preface

Material from Chapters 2 and 3 was published in the conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) under the title "Separation of Benign and Malignant Glands in Prostatic Adenocarcinoma". The work was co-authored by Ladan Fazli, Alexander Boag, Robert Siemens, Purang Abolmaesumi, and Septimiu E. Salcudean [1].

The histopathology data used in this thesis was collected at Queen's University by Alexander Boag and Robert Siemens. The study was cleared by the Medical Research Ethics Board at Queen's University, research study number UROL-146-05.

[1] S. Rashid, L. Fazli, R. Siemens, A. Boag, P. Abolmaesumi, S. E. Salcudean, "Separation of benign and malignant glands in prostatic adenocarcinoma," Medical Image Computing and Computer Assisted Intervention (MICCAI),
Springer Berlin Heidelberg, 2013, 461-468.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
List of Abbreviations
Acknowledgments
Dedication
1 Introduction
  1.1 Prostate Anatomy and Pathology
  1.2 Literature Review
  1.3 Summary of the Proposed Technique
  1.4 Thesis Organization
2 Cancer Classification Using Regional Features
  2.1 Introduction
  2.2 Gland Segmentation
  2.3 Feature Extraction
    2.3.1 Nuclei Features
    2.3.2 Lumen Features
    2.3.3 Epithelial Features
  2.4 Classification of Malignant Regions
  2.5 Conclusion
3 Cancer Classification Using Glandular Features
  3.1 Introduction
  3.2 Segmentation of the Nuclei Unit
  3.3 Extraction of Gland-specific Features
    3.3.1 Number of Nuclei Layers (NNL)
    3.3.2 Ratio of Epithelial Layer Area to Lumen Area (REL)
  3.4 Consolidation of Glands
  3.5 Conclusion
4 Experimental Results
  4.1 Dataset
  4.2 Parameter Tuning for Random Forest Classifier
  4.3 Parameter Tuning for Individual Gland Classification
  4.4 Qualitative Performance Evaluation
  4.5 Discussion
5 Conclusions
  5.1 Summary of Contributions
  5.2 Future Work
References

List of Tables

Table 1.1 Literature review
Table 2.1 Features extracted from each image block
Table 4.1 AUC obtained by our algorithm for different parameter values of Rdf_leaf

List of Figures

Figure 1.1 A typical whole mount histopathology slide of prostate
Figure 1.2 Four pathology zones of prostate
Figure 1.3 A normal gland structure
Figure 1.4 Examples of the five grades of the Gleason grading system
Figure 1.5 Visual comparison between benign and cancerous prostate glands
Figure 1.6 Flow-chart of the proposed algorithm
Figure 2.1 Gland segmentation
Figure 2.2 Plot of out-of-bag classification error against the number of grown trees
Figure 2.3 Performance of the proposed algorithm on a test image
Figure 3.1 Marker-controlled watershed segmentation of the nuclei units
Figure 3.2 Illustration of the Y_ang calculation and the histograms of benign and malignant glands
Figure 3.3 Graphical illustration of boundary hull generation by the α-shape approach
Figure 3.4 Experimental result after incorporating the glandular level classification with the regional cancer classification
Figure 4.1 ROC analysis for the parameter tuning of the random forest classifier
Figure 4.2 ROC analysis of the individually labeled gland dataset
Figures 4.3-4.11 Performance of the proposed technique on sample test images (Cases 1-9)
Figure 4.12 Enlarged view of four different types of detected glands: atrophy, PIN, malignant glands, and seminal vesicle
Figures 4.13-4.39 Illustration of the individual gland labeling on sample whole mount images (Test images 1-27)

List of Abbreviations

WM   Whole Mount
PSA  Prostate-Specific Antigen
DRE  Digital Rectal Examination
PIN  Prostatic Intraepithelial Neoplasia
ROC  Receiver Operator Characteristics
AUC  Area Under the Curve
ROI  Region of Interest
LDA  Linear Discriminant Analysis

Acknowledgments

Firstly, I would like to express my sincere gratitude to my supervisors, Drs. Tim Salcudean and Purang Abolmaesumi, for their continuous support and advice. Their guidance and enthusiasm kept me motivated throughout the research project. I would like to thank Dr. Ladan Fazli for her time and support in providing the ground truth for the project and training me in the clinical aspects of prostate pathology. I would also like to thank Drs. Alexander Boag and Robert Siemens for providing the dataset of whole mount images of the prostate.

Dr. Mehdi Moradi was extremely helpful in providing feedback on the project's machine learning aspects. I would also like to thank my colleagues at the Robotics and Control Lab for creating a very friendly and helpful environment during my stay as a master's candidate.

Lastly, I would like to thank my amazing parents back in Bangladesh, my father Abdur Rashid Dewan and my mother Lutfa Sultana. It is difficult being so far away, but their strong belief in me has always kept me motivated to achieve my goal. I would also like to take the opportunity to thank my brother Toufiqul Islam and sister-in-law Rubaiya Rahman for their continuous advice and support.

Dedication

I would like to dedicate this thesis to my father Abdur Rashid Dewan and my mother Lutfa Sultana for always believing in me and giving me the freedom to pursue my dream.

Chapter 1: Introduction

Prostate cancer is currently the second most prevalent type of cancer in men and ranks third among cancer-related deaths of men worldwide [17]. Prostate cancer is usually suspected when a high level of prostate-specific antigen (PSA) is detected in blood tests. A digital rectal examination (DRE), in which the physician palpates the prostate through the rectum, is then performed to detect any abnormalities within the prostate. Anomalies in these tests lead clinicians to conduct a prostate biopsy, a diagnostic procedure that involves the removal and examination of small samples of tissue. Examination of the microscopic biopsy specimens by pathologists is an important step in confirming the diagnosis of malignancy and guiding the treatment [40].
In the case of advanced cancers, surgeons often perform a Radical Prostatectomy (RP), i.e., surgical removal of the entire prostate. The prostate specimen removed during the RP procedure is processed and analyzed by a pathologist in order to determine further treatment, such as radiation or hormone treatment, depending on the extent and location of the cancer found in the specimen. In some cases, the prostate specimen is sliced in parallel transversal slices that cross the entire organ at intervals of 4-8 mm, in order to facilitate the comparison of pre-operatively acquired prostate images with the tissue histopathology. The histopathology slices obtained from the cross sections of these ex vivo prostates are termed Whole Mount (WM) slides. Fig. 1.1 shows a typical WM slide of the prostate. The black marks present in the image are the coarse annotation made by the pathologist on the glass slide before digitization.

Figure 1.1: A typical whole mount histopathology slide of prostate.

The analysis of these WM slides has been an area of special interest in recent years. It helps to predict the long-term disease outcome of RP patients, i.e., the possibility of recurrence of cancer in adjacent organs. The whole mount pathology analysis after RP can also be used as ground truth to determine the ability of imaging, such as multi-parametric magnetic resonance imaging, to detect cancer.

In clinical practice, the analysis of the WM slides is performed manually by pathologists. Since the level of structural detail in these images is very high, the process of annotating and grading the entire image is very time consuming and is subject to the individual pathologist's expertise. Therefore, for future work in prostate cancer prognosis and image-based diagnosis, it is important that an automatic pathology approach be developed that is consistent and accurate in classifying cancer. This is the objective of the AutoPath research work described in this thesis.

Figure 1.2: Four pathology zones of prostate.

1.1 Prostate Anatomy and Pathology

The prostate is a gland in the male reproductive system. Its function is to store and secrete a slightly alkaline fluid that usually constitutes 20-30% of the volume of the semen. In healthy adult males its size is slightly larger than a walnut. The weight of a healthy prostate in adult males ranges from 7 to 16 g, with an average weight of about 11 g [21], [23]. The prostate sits above the base of the penis and below the urinary bladder, and backs onto the front wall of the rectum. The apex of the prostate points down to the perineum, as opposed to the base, which is wider and located next to the bladder. The prostatic urethra is the portion of the urethra that runs from the urinary bladder through the prostate and exits from the apex via the urinary sphincter, a group of muscles that prevents involuntary leakage of urine. The prostate is surrounded by a membrane called the prostatic capsule. In pathology, the regions of the prostate are classified as zones (Fig. 1.2). The prostate gland has four distinct glandular regions:

• Peripheral zone (PZ): This zone occupies approximately 70% of the volume of the gland. 70-80% of prostatic cancers originate in the peripheral zone [8].
• Central zone (CZ): This zone constitutes approximately 25% of the prostate gland. About 5% of prostate cancer cases originate in the central zone [8].
• Transition zone (TZ): The transition zone is the innermost part of the prostate gland and surrounds the urethra.
It makes up about 5% of the prostate volume. About 10% of prostate cancers occur in this zone. This zone also enlarges with age and can result in benign prostatic enlargement [8].
• Anterior fibro-muscular zone (or stroma): The anterior zone is located close to the abdomen (away from the rectum). This zone constitutes 5% of the gland volume and is composed mostly of muscular tissue [8].

The main histopathological structural unit in the prostate is called a gland. Fig. 1.3 shows the structure of a normal gland unit. It mainly comprises a lumen of irregular shape and a layer of epithelial cells and nuclei surrounding the lumen. The unit is supported by a surrounding fibro-muscular stroma. Each of these components corresponds to a different color when the slides are stained with a Hematoxylin and Eosin (H&E) solution. In response to the solution, the nuclei turn into dark blue objects, and the epithelial layer and stroma turn into different shades of purple to pink. The morphological and architectural features of a gland indicate whether it is benign or malignant. Fig. 1.5 illustrates the different appearances of cancerous and benign glands. Cancerous glands tend to have a single layer of nuclei with a higher ratio of epithelial layer area to lumen area.

Figure 1.3: A normal gland structure (1. Lumen, 2. Luminal secretory epithelial layer, 3. Nuclei layer, 4. Stroma).

By examining the glandular tissue features in the microscopic histopathology sections, the pathologist determines the histological grades. The most widespread technique for histological grading is the Gleason grading system [14]. This grading scheme was developed by a pathologist, Dr. Gleason, during the 1970s. In this grading system, prostate cancer is classified into 5 grades, ranging from 1 to 5, where 1 is the most benign and 5 the most malignant case. A classic Gleason grading diagram containing the five basic tissue patterns associated with the five cancer grades is shown in Fig. 1.4a. Gleason grading is based upon the distribution of nuclei and the morphology of gland structures in the image. Fig. 1.4b-f shows the evolution of glandular and nuclear regions across the different grades of prostate cancer in real pathological images. As can be observed from the figures, in the lower grades of cancer (Grades 1 and 2) the glands still maintain an irregular shape like normal glands, but they get smaller in size and the concentration of nuclei increases slightly. In the higher grades (3 and 4) the glands are smaller, and instead of having an irregular shape they tend to have a more regular circular or elliptical shape; the distance between the glands increases. In Grade 5 cancer, no distinct glands can be observed. Instead, there is a concentration of nuclei floating randomly in the stroma. The aggressiveness of prostate cancer is scored by combining the top two grades present in a particular region. For example, if a tissue region contains 50% Grade 3 cancer, 30% Grade 4 cancer, and 20% other grades, then the corresponding Gleason score of the tissue region will be 3+4=7.

Figure 1.4: Examples of the five grades of the Gleason grading system. a) Classic Gleason grading diagram drawn by Dr. Gleason [14].
(b-f) Evolution of Gleason grades from 1 to 5, respectively.

1.2 Literature Review

The analysis of pathological images has been an area of interest during the last few years. Normally referred to as Digital Pathology, this field aims to distinguish between normal and abnormal tissue. Research in this area can be broadly divided into two categories: work on biopsy specimens and work on whole mount pathology images.

Most of the research in this field has been carried out on biopsy specimen samples. A method to distinguish the moderately and poorly differentiated lesions of prostate tissue was presented by Stotzka et al. in [34]. The decision was based on a number of features obtained from the shape and texture of the glands. The nuclear roundness factor (NRF) analysis is proposed in [9] to predict the behavior of low grade samples; since this technique requires manual nuclear tracing, it is time consuming and tedious. Jafari-Khouzani et al. [19] proposed a method for grading pathological images of prostate biopsy samples using energy and entropy features calculated from the multi-wavelet coefficients of an image. These multi-wavelet features were fed to a k-nearest neighbor classifier, and a leave-one-out procedure [19] was applied to estimate the error rate. In other research, prostate cancer grading was carried out using fractal dimension analysis [16]. In [16], the authors proposed fractal-dimension-based texture features that were extracted through a differential box counting method. These features were combined with an entropy-based fractal dimension estimation method to form a fractal-dimension-based feature set for analyzing pathological images of prostate carcinoma. This research focuses only on the separation of the different grades within manually detected cancerous regions. Tabesh et al. [36] proposed a two-stage system for prostate cancer diagnosis and Gleason grading. Color, morphometric, and texture features are extracted from the tissue images; then, linear and quadratic Gaussian classifiers are used to classify images into cancer/non-cancer classes and further into low and high grade classes.

Since the grading of the cancer depends on gland morphology and nuclei distribution, proper segmentation of these structures has been an active area of research in the last couple of years. A good part of the research activity addressed the segmentation of nuclei, as they are clearly visible on histology. Bamford and Lovell [3] used an active contour scheme for segmenting nuclei in pap-stained cervical cell images. A fuzzy logic engine that uses both color- and shape-based constraints was proposed for the segmentation of prostate tissue [5]. But these studies focus on finding the nuclei units only. Segmentation of multiple structures on prostate histology has been carried out by Gao et al., who used a color-based histogram thresholding technique to enhance regions of cytoplasm and nuclei to aid in manual cancer diagnosis [13]. Recently, Naik et al. proposed an automatic gland segmentation algorithm [28]. A Bayesian classifier is used to detect candidate gland regions by utilizing low-level image features to find the lumen, epithelial cell cytoplasm, and epithelial nuclei of the tissue. Then, features calculated from the boundaries of the gland that characterize the morphology of the lumen and gland region are used to grade the cancerous tissue. The most recent articles on cancer classification in biopsy specimens are summarized in Table 1.1. As can be observed from the table, Naik et al. [28] report the best accuracy among the recent literature.

Compared to the cancer classification work on biopsy specimens, there have been fewer reports on automatically annotating whole mount images. Gorelick et al.
proposed an automatic cancer classification method for sub-images extracted from whole mount images [15] using superpixel [12], [37] partitioning and AdaBoost classification [33]. The authors did not report the performance of annotating complete whole mount images. Monaco et al. [26] proposed an algorithm for annotating cancerous regions in whole mount slides using gland features. The reported technique segmented gland lumina and classified glands as normal or cancerous by (i) using a gland size feature to assign initial gland labels and (ii) applying a probabilistic pairwise Markov model (PPMM) to update the gland labels. See Table 1.1 for a summary of the results reported for biopsy specimens and WM images.

Most of these previous works do not evaluate features that are determined for each individual gland unit. Since prostatic adenocarcinoma is a cancer pertaining to the gland unit and the pathological changes in malignant tissue occur at the gland level, a clinically more relevant approach would be to incorporate gland-specific features in the computational cancer detection process. In a recent article on gland classification, Nguyen et al. [31] achieved an accuracy of 0.79 in classifying benign and malignant glands by exploiting region-specific/contextual features such as the percentage of nuclei pixels, lumen shape similarity, lumen size similarity, and neighborhood. Since that technique relies on contextual information to classify individual glands, it will most likely fail in cases where benign glands appear in close proximity to malignant glands. In addition, it does not perform nuclei segmentation but uses the percentage of nuclei pixels as a classification feature. Therefore, multiple small nuclei (usual in benign glands) and a single large nucleus (usual in malignant glands) will produce similar feature values that are not truly representative of the gland condition. In comparison, our proposed features are strictly gland-specific and involve i) pixel labeling, ii) segmentation of each nucleus in the gland, and iii) finding the number of nuclei layers for each gland from angle-dependent histograms. The advantage of this technique is that it can detect a malignant/suspicious gland irrespective of the region properties. In cases where malignant glands are present in close proximity to benign glands, this approach might provide a more sensitive cancer annotation than approaches that use region-dependent image features [31].

Figure 1.5: Visual comparison between a) benign and b) cancerous prostate glands.

Table 1.1: Literature review

Authors                   | Dataset size                       | Classes                           | Accuracy
Doyle et al. 2006 [19]    | 22 (biopsy)                        | Cancer/non-cancer                 | 88%
Tabesh et al. 2007 [36]   | 268 (biopsy)                       | Low/high grade                    | 81%
Naik et al. 2008 [28]     | 44 (biopsy)                        | Benign, Grade 3, Grade 4, Grade 5 | 90%
Tai et al. 2010 [16]      | 1000 (biopsy)                      | Benign, Grade 3, Grade 4, Grade 5 | 86%
Monaco et al. 2012 [26]   | 40 (37 quarter sections, 13 WM)    | Benign/malignant                  | 90%
Gorelick et al. 2013 [15] | 991 sub-images from 50 WM sections | Cancer/non-cancer                 | 90%
Nguyen et al. 2012 [31]   | 48 images                          | Gland labeling                    | 79%

1.3 Summary of the Proposed Technique

The AutoPath technique proposed in this thesis performs automatic cancer classification on WM prostate images in two stages: i) screening of probable cancerous regions at low magnification (5×) and ii) finer annotation of the detected cancerous regions at high magnification (20×).
To extract the tissue features in the first stage, we automatically segment the individual gland units and their associated tissue components. The whole image is divided into small blocks, and the features extracted from each of these blocks are fed into a random forest classifier [7] to detect benign and malignant regions of the image. Further analysis of the detected regions is performed in the second stage at a higher magnification. At this step, the malignant regions are detected based on their gland-specific properties. We propose two new features for classifying glands: i) Number of Nuclei Layers (NNL) and ii) Ratio of Epithelial Layer area to Lumen area (REL). To extract the first feature, nuclei units are automatically segmented from the image using a marker-controlled watershed algorithm [25]. The introduction of these two gland-specific features allows us to detect malignant or suspicious glands without relying on the surrounding histology; therefore, a more specific and sensitive annotation of the images is possible. We have tested our technique on 70 images obtained from 30 patients. In a 10-fold cross validation we achieved an average sensitivity of 84%, specificity of 94%, and accuracy of 93%. A flow-chart of the proposed technique is illustrated in Fig. 1.6, and a detailed explanation of each step is provided in the methodology chapters.

Figure 1.6: Flow-chart of the proposed algorithm.

1.4 Thesis Organization

We start with Chapter 2, which explains the screening phase of our prostate cancer detection technique exploiting regional features. This chapter discusses the gland segmentation algorithm, regional feature extraction, and the application of the random forest classifier in classifying benign and malignant regions. Chapter 3 describes the glandular level classification of the proposed technique, providing the details of nuclei segmentation and glandular feature extraction. We present the performance evaluation of our algorithm in Chapter 4. Finally, conclusions, achievements, and future directions of our work are laid out in Chapter 5.

Chapter 2: Cancer Classification Using Regional Features

2.1 Introduction

In prostatic adenocarcinoma, a tumor is defined as comprising a group of malignant gland units. The arrangement and architecture of the glands in the tumor deviate from the healthy tissue type depending on the cancer grade: the higher the cancer grade, the more deviation of the tissue architecture is observed. At this stage we quantify these changes in tissue architecture and, based on that, screen the pathology slices for possible cancerous regions. For the screening phase, only the 5× magnification level is used. At this magnification, the image resolution is 2 µm per pixel. In the first step, the entire image is divided into small blocks of 0.5 mm × 0.5 mm. The choice of the block size corresponds to the size of the smallest annotated region present in our dataset. Each of these blocks is categorized as a probable cancerous or non-cancerous region by a random forest classifier [7]. The features exploited by the classifier are extracted from the segmented gland images. A sketch of this block-tiling step is given below.
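The following minimal sketch illustrates only the tiling just described: dividing a 5× whole mount scan (2 µm per pixel, so a 0.5 mm block is 250 × 250 pixels) into blocks and collecting one feature vector per block. The function extract_block_features is a hypothetical placeholder for the feature extraction of Section 2.3, and the handling of partial blocks at the image border is an assumption, not a detail taken from the thesis.

```python
import numpy as np

BLOCK_PX = 250  # 0.5 mm at 2 um/pixel (5x magnification)

def tile_into_blocks(image):
    """Yield (row, col, block) tiles of BLOCK_PX x BLOCK_PX from an RGB image.

    Border blocks smaller than BLOCK_PX are skipped here; how the thesis
    handles partial border blocks is not specified, so this is an assumption.
    """
    h, w = image.shape[:2]
    for r in range(0, h - BLOCK_PX + 1, BLOCK_PX):
        for c in range(0, w - BLOCK_PX + 1, BLOCK_PX):
            yield r, c, image[r:r + BLOCK_PX, c:c + BLOCK_PX]

def screen_image(image, extract_block_features, classifier):
    """First-stage screening: one feature vector and one label per block."""
    labels = {}
    for r, c, block in tile_into_blocks(image):
        features = extract_block_features(block)   # hypothetical, Section 2.3
        labels[(r, c)] = classifier.predict([features])[0]
    return labels
```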
Figure 2.1: Gland segmentation. a) A sample image block, b) labeled image where each histological component is represented by a different color, c) enlarged view of a small window in the labeled image, d) lumen objects (the red mark corresponds to the initial gland boundary), and e) segmented gland unit after consolidating the surrounding epithelial layer and nuclei objects with the gland lumen.

2.2 Gland Segmentation

The gland segmentation algorithm has been partially adopted from the work of Nguyen et al. [30]. In the first step, each image block is segmented into five categories: i) gland lumen, ii) cytoplasm, iii) nuclei, iv) stroma, and v) annotation mark. This segmentation uses the distinct color information of each category. Variations in illumination, caused by variations in staining or by changes in ambient lighting conditions at the time of digitization, may dramatically affect the image characteristics and thus potentially the performance of the algorithm. In the RGB color space the lighting information and the color information are blended together; this is why each sub-region is converted from the RGB color space to the Lab color space. The Lab space consists of a luminosity layer 'L', a chromaticity layer 'a' indicating where the color falls along the red-green axis, and a chromaticity layer 'b' indicating where the color falls along the blue-yellow axis. By converting to the Lab color space, the lighting information is confined to only one channel, L. Small training patches of each category are used to train the classifier, which labels each pixel of the image as one of the five categories listed above.

Each pixel in the test or training data is represented as D_{i,j}, where i = {1, 2, ..., n}, n is the number of data points, and j = {1, 2} indexes the two chromaticity layers of the Lab color space. The classification algorithm uses linear discriminant analysis (LDA) to label the test pixels. Given a training data set with a known class for each data point, the jth component of the mean vector for class k is simply the mean of variable j over the N_k data points in group k:

\bar{D}_{j,k} = \frac{1}{N_k} \sum_{n \in k} D_{n,j},   (2.1)

where n ∈ k indicates the set of data points in group k.

The covariance matrices of all classes are assumed to be equal and are estimated as a single pooled estimate S, with entries

S_{i,j} = \frac{1}{N-K} \sum_{n=1}^{N} (x_{n;i} - \bar{x}_{k(n);i})(x_{n;j} - \bar{x}_{k(n);j}),   (2.2)

where \bar{x}_{k(n);i} is the ith component of the mean vector of the class k(n) to which data point n belongs. The squared Mahalanobis distance from a data vector x to the mean of group k is then given by

z_k^2 = (x - \bar{x}_k)' S^{-1} (x - \bar{x}_k).   (2.3)

Because of the pooled covariance estimate, the determinants of all covariance estimates are equal, and Bayes' formula for the posterior probability of data vector x belonging to class k reduces to

P_k(x) = \frac{q_k \exp[-0.5 z_k^2]}{\sum_{l=1}^{K} q_l \exp[-0.5 z_l^2]},   (2.4)

where q_k is the prior probability of class k. The data vector x is then assigned to the class with the maximum posterior probability; i.e., for the ith pixel with data vector x, the pixel label is l_i = argmax_k P_k(x).

The training patches used in the classifier were manually selected from 5 different patient images. The number of training patches and the number of training pixels for each category are: lumen: 2 (1580 pixels), nuclei: 4 (440 pixels), epithelial layer: 4 (1564 pixels), stroma: 10 (3033 pixels), and annotation: 2 (1503 pixels). Among the histology components, the stroma units exhibit the highest variation in color information; therefore, a larger number of training patches is used for this category. By contrast, the lumen and nuclei are the most homogeneous tissue components in the image, and hence fewer training patches from these categories are needed. A minimal sketch of this LDA pixel labeling follows.
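The sketch below implements the pooled-covariance LDA classifier defined by Eqs. (2.1)-(2.4) in numpy. It assumes each pixel is represented only by its two Lab chromaticity values, as in the text; the class priors q_k are taken as the training class frequencies, which the thesis does not state explicitly and is therefore an assumption.

```python
import numpy as np

def fit_lda(X, y):
    """X: (N, 2) chromaticity vectors; y: (N,) integer class labels.
    Returns class labels, means (Eq. 2.1), pooled covariance S (Eq. 2.2),
    and priors q_k (assumed equal to training class frequencies)."""
    classes = np.unique(y)
    means = np.array([X[y == k].mean(axis=0) for k in classes])       # Eq. 2.1
    N, K = len(X), len(classes)
    S = np.zeros((X.shape[1], X.shape[1]))
    for idx, k in enumerate(classes):
        d = X[y == k] - means[idx]
        S += d.T @ d
    S /= (N - K)                                                      # Eq. 2.2
    priors = np.array([(y == k).mean() for k in classes])
    return classes, means, S, priors

def classify_pixels(X, classes, means, S, priors):
    """Label each pixel by maximum posterior probability (Eqs. 2.3-2.4)."""
    S_inv = np.linalg.inv(S)
    d = X[:, None, :] - means[None, :, :]             # (N, K, 2) differences
    z2 = np.einsum('nki,ij,nkj->nk', d, S_inv, d)     # Mahalanobis, Eq. 2.3
    post = priors * np.exp(-0.5 * z2)                 # unnormalized Eq. 2.4
    return classes[np.argmax(post, axis=1)]
```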
Fig. 2.1b-c shows the result of the color-based segmentation of an example image sub-region. For pixel labeling, we use linear discriminant analysis instead of the Voronoi-tessellation-based approach of [30]. In the Voronoi tessellation approach, the training points create a Voronoi tessellation [2] of the Lab space. Each training point is associated with one convex polygon, which includes all points closer to it than to any other training point. Each test point is assigned the class associated with the training point of the polygon to which it belongs. The main drawback of the Voronoi tessellation approach is that, when the number of test samples is large, the classification time for each test data point is very high compared to that of linear discriminant analysis [20]. Therefore, when the number of test samples is very large, the reported Voronoi-tessellation-based approach is very expensive to compute.

After the image has been labeled into the categories listed above, we first group together the lumen pixels using a connected-components algorithm [35] with the eight-connectivity property. Then, the flood-fill algorithm [22] is employed to label all the pixels in the connected neighborhood. By putting a constraint on the maximum possible size of a lumen, objects that are too big to be considered a gland are discarded. This constraint eliminates the background object of the histology section, which has almost the same color information as the lumen objects. Around each lumen object, a lumen boundary is extracted; this is considered the primary gland boundary (see Fig. 2.1d). As stated earlier in the introduction, a complete gland unit consists of the lumen and its surrounding layer of epithelial cells and nuclei. Therefore, to segment a complete gland unit, we have to consolidate the surrounding epithelial layer and nuclei with the lumen. To accomplish this, an iterative search of the δ×δ neighborhood centered at each boundary point of the lumen is carried out. Here, δ is set to 3 pixels, the minimum neighborhood window length, in order to eliminate the possibility of adding any extra non-gland pixels to the gland unit. The pixels labeled as epithelial layer or nuclei within this neighborhood are grouped together with the lumen object, and the corresponding gland boundary is updated. This procedure stops when the gland boundary reaches pixels labeled as stroma. Also, under the assumption that a true gland unit is always surrounded by an epithelial layer, lumen pixels that are not surrounded by cytoplasm and nuclei are discarded as false lumen objects. Fig. 2.1e illustrates the resulting segmented gland units. A sketch of this consolidation step follows.
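The sketch below mirrors the consolidation just described: starting from the labeled pixels, it finds lumen components and iteratively grows each gland boundary through epithelium/nuclei pixels until stroma is reached. It uses scipy building blocks rather than the thesis's own implementation, and the integer codes for the five pixel classes and the maximum-lumen-size value are assumptions made for illustration.

```python
import numpy as np
from scipy import ndimage as ndi

LUMEN, EPITH, NUCLEI, STROMA, ANNOT = 0, 1, 2, 3, 4  # assumed label codes

def segment_glands(label_img, max_lumen_px=50000):
    """Consolidate each lumen with its surrounding epithelium/nuclei.

    max_lumen_px is the size constraint that discards background-like
    objects; its value here is illustrative, not taken from the thesis.
    """
    eight = np.ones((3, 3), dtype=bool)
    lumina, n = ndi.label(label_img == LUMEN, structure=eight)  # 8-connectivity
    growable = (label_img == EPITH) | (label_img == NUCLEI)
    glands = []
    for i in range(1, n + 1):
        gland = lumina == i
        if gland.sum() > max_lumen_px:
            continue  # too big to be a gland lumen (background object)
        lumen_only = gland.copy()
        while True:
            # one pass of the 3x3 (delta = 3) boundary neighborhood search
            added = ndi.binary_dilation(gland, eight) & ~gland & growable
            if not added.any():
                break  # growth stops once the boundary meets stroma
            gland |= added
        if (gland & ~lumen_only).any():  # absorbed an epithelial/nuclei rim
            glands.append(gland)         # otherwise: false lumen, discarded
    return glands
```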
2.3 Feature Extraction

From each block of the image, an array of nine features related to the arrangement and morphology of the lumen, nuclei, and epithelial layer of the glands is extracted. The histological changes occurring in malignant regions are most pronounced in these three tissue components; the stroma rarely shows any alteration in malignant regions. Malignant regions in prostate histology are usually characterized by groups of closely packed glands that are similar in shape, whereas in benign regions the glands usually have highly irregular shapes. The malignant glands are usually circular in shape and possess a thicker epithelial layer. In high grades of cancer, the malignant regions also exhibit a high concentration of randomly floating nuclei. Based on these observations, we have synthesized a set of nine features to classify benign and malignant regions. The features are described in detail below and summarized in Table 2.1.

Table 2.1: Features extracted from each image block

Nuclei features
  En (Entropy of the nuclei pixel distribution): En = Σ P_n × log P_n, where P_n is the spatial histogram of nuclei pixels
  ND (Nuclei Density): ND = A_n / A_b, where A_n is the total area of nuclei units and A_b the total area of the block image (in pixels)
Lumen features
  MLRM, SLRM (Mean and Standard deviation of the Lumen Roundness Metric): LRM_i = (π × ED_i) / P_i, where P_i is the perimeter and ED_i the equivalent diameter of the ith lumen
  MNLC, ANLC (Maximum and Average Number of Lumens per Cluster): e.g., for a block containing lumen clusters of sizes 5, 6, and 3, MNLC = 6 and ANLC = (5+6+3)/3 = 4.67
Epithelial features
  AEDG, SEDG (Average and Standard deviation of the Epithelial layer Density per Gland unit): EDG_i = A_e(i) / A_gland(i), where A_e(i) is the epithelial layer area of the ith gland and A_gland(i) the area of the ith gland
  ED (Overall Epithelial layer Density): ED = A_e / A_b, where A_e is the total area of the epithelial layer (in pixels)

2.3.1 Nuclei Features

Nuclei Density (ND): This feature is calculated as the ratio of the nuclear area to the total area of the image block: ND = A_n / A_b, where A_n is the total area of the nuclei units (in pixels) and A_b is the total area of the block image (in pixels).

Nuclei Entropy (En): This feature, newly proposed here for cancer classification, captures the randomness of the nuclei appearance in the block image. We quantify the entropy of the nuclei pixels as En = Σ P_n × log P_n, where P_n is the spatial histogram of nuclei pixels.

2.3.2 Lumen Features

Lumen Roundness Metric (LRM): The LRM is a measure of the circularity of the lumen shapes. Two features, the Mean LRM (MLRM) and the Standard deviation of the LRM (SLRM), are extracted from each image block to incorporate shape information in the feature set. The LRM of the ith gland is calculated as LRM_i = (π × ED_i) / P_i, where P_i is the perimeter of the ith lumen and ED_i is its equivalent diameter.

Lumen Clusters: Clustering of the lumina is a common regional property of prostatic adenocarcinoma. Two new features, the Maximum Number of Lumens per Cluster (MNLC) and the Average Number of Lumens per Cluster (ANLC), are calculated in each image block to represent the clustering of lumina in the feature set. A sketch of the nuclei and lumen feature computation is given below.
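A short sketch of the nuclei and lumen features defined above, under stated assumptions: binary masks for nuclei and lumen pixels are already available from the pixel labeling, the spatial histogram P_n for the entropy is taken over a coarse grid of cells (the thesis does not specify the bin layout), and skimage's regionprops supplies the perimeter and equivalent diameter.

```python
import numpy as np
from skimage.measure import label, regionprops

def nuclei_features(nuclei_mask, grid=8):
    """ND and En for one block; a grid x grid spatial histogram is assumed."""
    nd = nuclei_mask.mean()                      # ND = A_n / A_b
    h, w = nuclei_mask.shape
    cells = nuclei_mask[: h - h % grid, : w - w % grid].astype(int)
    cells = cells.reshape(grid, h // grid, grid, w // grid).sum(axis=(1, 3))
    p = cells.flatten() / max(cells.sum(), 1)    # spatial histogram P_n
    p = p[p > 0]
    en = float(np.sum(p * np.log(p)))            # En = sum P_n log P_n
    return nd, en

def lumen_roundness(lumen_mask):
    """LRM_i = (pi * ED_i) / P_i for each lumen; returns MLRM and SLRM."""
    lrm = [np.pi * r.equivalent_diameter / r.perimeter
           for r in regionprops(label(lumen_mask)) if r.perimeter > 0]
    return (np.mean(lrm), np.std(lrm)) if lrm else (0.0, 0.0)
```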
2.3.3 Epithelial Features

Three new features associated with the epithelial layer structure are proposed here. Malignant regions usually possess a higher epithelial area than normal regions. The overall Epithelial layer Density (ED) is calculated as the ratio of the epithelial area to the total area of the image block, i.e., ED = A_e / A_b, where A_e is the total area of the epithelial layer (in pixels). In addition to the overall higher density of epithelial layers, the individual glands also possess a thicker epithelial layer. The Epithelial layer Density per Gland unit (EDG) for the ith gland in the image block is calculated as EDG_i = A_e(i) / A_gland(i), where A_e(i) is the area of the epithelial layer of the ith gland and A_gland(i) is the area of the ith gland. These features are tabulated in Table 2.1.

2.4 Classification of Malignant Regions

These features are utilized by a random forest classifier to separate benign and malignant blocks. The random forest algorithm was developed by Leo Breiman and Adele Cutler [7]. Random forests are a combination of tree predictors, where each tree depends on the values of a random vector sampled independently with the same distribution for all trees in the forest. Each tree in the random forest can be considered a 'weak' learner, and in the ensemble they come together to form a 'strong' learner. Single decision trees often have high variance or high bias; random forests attempt to mitigate these problems by averaging, finding a natural balance between the two extremes. Each tree of the random forest ensemble is trained by randomly selecting two thirds of the samples, each time with replacement. The remaining samples are used to test the tree, and the mean squared classification error over all trees constitutes the out-of-bag classification error. Fig. 2.2 plots the out-of-bag classification error against the number of trees: as the number of trees in the ensemble goes up, the classification error goes down. With our proposed features, the out-of-bag classification error drops to 0.04 for 100 trees.

Figure 2.2: Plot of out-of-bag classification error against the number of grown trees.

In the random forest classifier model, there are two parameters to be tuned: i) the minimum leaf size and ii) the threshold on the class probability of the classifier model. We determined the optimum values of these parameters by performing a Receiver Operating Characteristics (ROC) analysis on 50 of the 70 images in the dataset. The parameter selection process and the detailed cross-validation of the classifier are described in the results chapter. The remaining 20 images constitute the test set, and the performance of the proposed technique on this set is obtained by a leave-25%-out experiment with the tuned model parameters. After classification, the detected blocks are grouped together to form a continuous area. Fig. 2.3 illustrates the result of the first stage of cancer annotation on a sample image; more results are provided in the results chapter. A finer annotation of the detected regions is performed in the next step at a higher magnification. A sketch of this training stage is given below.

Figure 2.3: Performance of the proposed algorithm on a test image. a) Cyan annotation is performed by the pathologist and considered as the ground truth. b) Result of the screening phase of the proposed technique.
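The following minimal sketch uses scikit-learn's RandomForestClassifier, shown only as one plausible realization: the thesis does not state which implementation was used, and the out-of-bag error here comes from sklearn's oob_score_ rather than the exact per-tree mean squared error described above. X holds the nine regional features per block and y the benign/malignant block labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_screening_forest(X, y, n_trees=100, min_leaf=12, threshold=0.58):
    """Train the regional screening forest and return a block-level predictor.

    min_leaf = 12 and threshold = 0.58 are the tuned values reported in
    Chapter 4; n_trees = 100 matches the OOB curve of Fig. 2.2.
    """
    forest = RandomForestClassifier(
        n_estimators=n_trees,
        min_samples_leaf=min_leaf,
        bootstrap=True,          # each tree sees a bootstrap sample
        oob_score=True,          # error estimated on the held-out samples
    )
    forest.fit(X, y)
    print(f"out-of-bag error: {1.0 - forest.oob_score_:.3f}")

    def predict_malignant(features):
        # class-probability thresholding instead of the default 0.5 vote;
        # assumes label 1 = malignant
        p = forest.predict_proba(np.atleast_2d(features))[:, 1]
        return p >= threshold

    return forest, predict_malignant
```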
Changes in all the tissue components, such as lu-men, epithelium, and nuclei, are accounted here to generate a unique set of featuresthat is used for classification using a random forest classifier.23Chapter 3Cancer Classification UsingGlandular Features3.1 IntroductionGlands are the basic building blocks of the prostate. Prostatic adenocarcinoma, themost common type of prostate cancer i.e., prostatic adenocarcinoma is originatefrom the epithelium layer of the glands. Subsequently, the evolution of malignancyis most evident at the glandular level. Here we focus on the quantification of thechanges that occur in the individual gland units from high resolution images. In thisstep, the detected regions from the previous step are further magnified to performgland-level analysis. In our experiments, we use whole-mount histology scans atmagnification of 20× with a resolution of 0.5 µm per pixel. The gland units inthe cancer-probable regions are classified based on their individual gland specificproperties. The features exploited here are, 1) Number of Nuclei Layers and 2)Ratio of Epithelial Layer Area to Lumen Area. To quantify the number of nucleilayer associated with each gland, the nuclei units are segmented automatically.Then based on these two features, the benign and malignant glands are separatedand the final annotation consolidating the malignant glands is obtained.243.2 Segmentation of the Nuclei UnitThe nucleus is the smallest visible histological component present in a pathologyimage. Several techniques for segmenting nuclei [1], [3], [4], [6], [18], [24], [38],[39] have already been proposed exploiting very high magnification images (40×or higher). However, automatic segmentation of nuclei from images with lowermagnification is yet to be investigated. Here we employ a modified watershedalgorithm for automatic nuclei segmentation exploiting the foreground and back-ground object markers.For the nuclei segmentation, only the ‘R’ channel of the image has been usedsince it produces maximum histogram separation between nuclei and non-nucleiobjects. The nuclei objects appear as dark objects in the tissue image. The ‘R’channel is inverted to make the nuclei units foreground objects for segmentation(see Fig. 3.1 a). The preprocessing steps before applying the marker controlledwatershed algorithm are the background subtraction and thresholding. The back-ground of the image is estimated by performing a morphological opening of theimage with a disk shaped structural element of radius 10. The radius is chosen suchthat the element cannot fit inside an individual nuclei unit. Therefore, nuclei unitsare not affected by the morphological filtering to estimate the background. Thethreshold for separating the nuclei units is computed by minimizing the intra-classvariance of the image [32]. The resultant preprocessed image after backgroundsubtraction and thresholding is shown in Fig. 3.1b.The segmentation function used in this watershed algorithm is the gradient im-age. But before applying the watershed algorithm, foreground and backgroundmarkers need to be computed to reduce over segmentation. To compute the fore-groundmarkers, morphological opening and closing by reconstruction is performedon the preprocessed image (Fig. 3.1b) to create a flat maxima inside each of theforeground objects which are used as the foreground markers. The watershed ridgeline calculated from the Euclid distance transform of the thresholded binary imageis taken as the background marker (see Fig. 3.1c). 
3.3 Extraction of Gland-specific Features

3.3.1 Number of Nuclei Layers (NNL)

To determine the number of nuclei layers pertaining to each gland, the segmented nuclei objects are first paired with the gland unit that minimizes the distance between the centroid of the nucleus and the gland lumen boundary. For each combined gland-nuclei object, an ellipse is fit around it. The angular location of each nucleus is evaluated by calculating the angle of the line connecting the gland centroid and the corresponding nucleus centroid (see Fig. 3.2a). The feature NNL is then evaluated from the histogram Y_ang of the angular locations of the nuclei. Customized bin spacing is utilized to account for glands of different sizes: the bin spacing of the histogram is evaluated as 360°/P_g, where P_g is the perimeter of the ellipse surrounding the gland. NNL is then evaluated by counting the total number of instances where multiple nuclei fall in the same angular bin of the histogram, normalized by P_g. Mathematically, NNL = \frac{1}{P_g} |\{n \mid Y_{ang}(n) > 1\}|. Fig. 3.2c-e illustrates the different nature of the histogram Y_ang for benign and malignant glands. As can be observed from the figure, the benign histogram provides more instances of multiple nuclei at the same angular location.

Figure 3.2: a) Graphical illustration of the Y_ang calculation. b) Sample histopathology scene with a single benign gland (marked with a blue ellipse, R_EL = 0.71) and two malignant gland units (marked with red ellipses, R_EL = 1.48 and 1.91). c), d), and e) illustrate the different appearances of the histograms (Y_ang) of benign and malignant glands.

3.3.2 Ratio of Epithelial Layer Area to Lumen Area (REL)

This feature is evaluated by simply taking the ratio of the epithelial layer area to the lumen area of the gland. In malignant glands, the fast multiplication of cells leads the epithelial layer to invade further into the gland lumen; as a result, the ratio becomes larger for malignant gland units.

After the feature extraction, we choose optimum thresholds τ_NNL and τ_REL on the features for the classification of benign and malignant glands. We classify a gland G_i as benign when the parameters fulfill the following criterion: Label_{G_i} = \{Benign \mid NNL(G_i) > \tau_{NNL},\ REL(G_i) < \tau_{REL}\}. These threshold parameters are tuned by performing a ROC analysis, discussed in Chapter 4. A sketch of the gland-level feature computation follows.
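A compact sketch of the two gland features, assuming each gland is given by its centroid, the perimeter P_g of its fitted ellipse, the centroids of its paired nuclei, and its epithelial/lumen pixel areas; all of these come from the earlier segmentation steps. The bin layout follows the 360°/P_g spacing described above, and the units of P_g (and hence the scale of NNL relative to the thresholds quoted from Chapter 4) are left unspecified in the thesis, so they are an assumption here.

```python
import numpy as np

def nnl_feature(gland_centroid, ellipse_perimeter, nuclei_centroids):
    """Number of Nuclei Layers: multiply-occupied angular bins, / P_g."""
    cy, cx = gland_centroid
    angles = [np.degrees(np.arctan2(ny - cy, nx - cx))
              for ny, nx in nuclei_centroids]          # in (-180, 180]
    n_bins = max(int(round(ellipse_perimeter)), 1)     # bin width = 360 / P_g
    y_ang, _ = np.histogram(angles, bins=n_bins, range=(-180.0, 180.0))
    return np.count_nonzero(y_ang > 1) / ellipse_perimeter

def rel_feature(epithelial_area, lumen_area):
    """Ratio of Epithelial Layer area to Lumen area."""
    return epithelial_area / lumen_area

def is_benign(nnl, rel, tau_nnl=2.36, tau_rel=0.92):
    """Labeling rule of Section 3.3.2 with the thresholds tuned in Chapter 4."""
    return nnl > tau_nnl and rel < tau_rel
```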
These threshold parameters are tuned by performing a ROC analysis, which is discussed in the next chapter.

3.4 Consolidation of Glands

The final stage of the algorithm consolidates the malignant glands into continuous regions. The glands are separated into distinct groups based on their parent region from the previous step. One approach to encapsulating the detected glands is to generate a convex hull of the gland centroids. Unfortunately, since the true spatial extent of prostate cancer rarely forms a convex region, such an algorithm would not represent the true extent of the detected regions. As a solution to this problem, we have utilized the α-shape approach [11] for generating a continuous boundary from a point cloud. In this approach, the boundary of a continuous area is formed by the point pairs that can be touched by an empty disc of radius α. A graphical illustration of the hull generation is presented in Fig. 3.3. To implement this technique, a 2-D Delaunay [10] triangulation of the points is first obtained. Each edge/triangle of the Delaunay triangulation is associated with a characteristic radius: the radius of the smallest empty circle containing the edge or triangle. For a specific radius α, the α-complex of the given set of points is the complex formed by the set of edges and triangles whose radii are at most α. The union of the edges and triangles in the α-complex forms the α-shape. Here we have chosen the radius α = 0.12 mm, on the order of the typical size of a malignant gland unit. Fig. 3.4 illustrates the final annotation obtained by the proposed technique on the sample image demonstrated in the previous chapter.

Figure 3.3: Graphical illustration of boundary hull generation by the α-shape approach.

Figure 3.4: Experimental result of the proposed technique after incorporating the glandular-level classification with the regional cancer classification.
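A compact way to realize this construction is to filter the Delaunay triangles by their circumradius. The sketch below is an illustrative reading of [10], [11], not the thesis implementation; it assumes the centroids are given in millimetres so that the default α matches the 0.12 mm radius above, and it returns the boundary edges of the α-shape.

```python
import numpy as np
from scipy.spatial import Delaunay

def alpha_shape_edges(points, alpha=0.12):
    """Boundary edges of the alpha-shape of a 2-D point cloud."""
    pts = np.asarray(points, dtype=float)
    boundary = set()
    for simplex in Delaunay(pts).simplices:
        a, b, c = pts[simplex]
        # Circumradius R = |ab| |bc| |ca| / (4 * triangle area).
        la = np.linalg.norm(b - c)
        lb = np.linalg.norm(a - c)
        lc = np.linalg.norm(a - b)
        area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                         - (b[1] - a[1]) * (c[0] - a[0]))
        if area == 0 or la * lb * lc / (4.0 * area) > alpha:
            continue  # triangle is not part of the alpha-complex
        # Toggle each edge: edges shared by two kept triangles cancel
        # out, so only the boundary of the alpha-shape survives.
        for i, j in ((0, 1), (1, 2), (0, 2)):
            edge = tuple(sorted((simplex[i], simplex[j])))
            boundary ^= {edge}
    return boundary  # pairs of point indices forming the hull

# Usage: edges = alpha_shape_edges(malignant_gland_centroids_mm)
```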
3.5 Conclusion

This chapter elaborated on the details of individual gland labeling, which include nuclei segmentation, gland-specific feature extraction, and a consolidation technique for the detected malignant glands. Apart from cancer annotation, these gland-specific features also have the potential to diagnose other prostate anomalies, such as atrophy and benign prostatic hyperplasia. One limitation of the individual gland-based approach concerns the less frequent Grade 5 cancers. At Grade 5, the individual glands rupture and are no longer visible as single gland units, so individual gland labeling will generate false negative regions. Proposing a solution to this problem is a possible future improvement of the technique.

Chapter 4
Experimental Results

4.1 Dataset

The proposed algorithm has been evaluated on 70 different histopathology images obtained from 30 radical prostatectomy patients. These whole mount histopathology images were digitized at 20× magnification (0.5 µm per pixel) with an Aperio scanner. Each image was annotated by two pathologists. First, the images were marked by a pathologist on the glass slide before digitization. Then, a second pathologist performed a detailed annotation on the digitized images. The annotations from the second pathologist are used here as the gold standard to evaluate the performance of the proposed algorithm.

4.2 Parameter Tuning for Random Forest Classifier

Among the 70 images, 50 have been used to develop the random forest classifier model. The parameters of the random forest classifier, the minimum leaf size of the tree and the classification threshold, have been tuned by performing a 10-fold cross validation on the 50 images for each parameter setting. For each leaf parameter in the set Rdf_leaf = {1, 12, 23, ..., 100}, a ROC curve is generated by varying the threshold Rdf_th = {0.20, 0.23, 0.26, ..., 0.80}. Fig. 4.1 illustrates the ROC curve obtained for the optimum Rdf_leaf value. Table 4.1 provides the Area Under the Curve (AUC) for each of the ROC curves.

Figure 4.1: ROC analysis for the parameter tuning of the random forest classifier. The ROC curve for Rdf_leaf = 12 is shown (AUC = 0.93). At the optimum operating point, the sensitivity is Sn = 0.94 and the specificity is Sp = 0.83.

At the optimum operating point, the parameters are Rdf_leaf = 12 and Rdf_th = 0.58. At this optimum point, the sensitivity is 0.94 and the specificity is 0.83. With these parameters, the remaining 20 images have been tested in a leave-25%-out cross validation experiment. In 50 independent repetitions of the experiment, the algorithm achieved a sensitivity, specificity, and accuracy of 0.88, 0.92, and 0.92, respectively. The probable malignant regions detected after the random forest classification are then further classified using their gland-specific properties.

Table 4.1: AUC obtained by our algorithm for different parameter values of Rdf_leaf.

Rdf_leaf:  1     12    23    34    45    56    67    78    89    100
AUC:       0.90  0.93  0.94  0.92  0.90  0.92  0.88  0.87  0.85  0.80

4.3 Parameter Tuning for Individual Gland Classification

The performance of the algorithm at the gland level is influenced by the choice of the thresholds on REL and NNL. We tune these parameters by performing a similar ROC analysis on a set of individually labeled benign and malignant glands. A total of 4230 labeled glands have been used in this tuning process. The ROC curve of the classifier is generated by varying the NNL threshold over {0, 0.08, 0.16, ..., 4}. To determine the effect of varying the REL threshold on classifier performance, the following procedure has been used: for each choice of τ_REL in the set {0, 0.25, ..., 2.5}, an individual ROC curve is generated by varying the NNL threshold. We choose the thresholds τ_NNL and τ_REL corresponding to the optimum operating point of the ROC curve. In this experiment, we found the thresholds to be τ_NNL = 2.36 and τ_REL = 0.92. At the optimum operating point, the sensitivity and specificity are 0.79 and 0.85, respectively. In each test image, the glands present in the pre-screened malignant regions are classified using these two parameters. With the gland-level analysis, the sensitivity, specificity, and accuracy of the proposed technique reach 0.88, 0.94, and 0.93, respectively.

4.4 Qualitative Performance Evaluation

For qualitative performance evaluation, we illustrate four test WM images from four different patients (see Fig. 4.3-4.6). The first column shows the WM images with the pathologist's annotation (green), the middle column the intermediate classification result at the 5× resolution (blue), and the third column the final annotation after the gland-level analysis (green). A strong agreement between the final annotation and the pathologist's annotation corroborates the effectiveness of the proposed algorithm.

Figure 4.2: ROC analysis of the individually labeled gland dataset. The ROC curve for REL = 0.93 is shown (AUC = 0.87). At the optimum operating point, the sensitivity is Sn = 0.79 and the specificity is Sp = 0.85.
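The two-dimensional threshold sweep behind Fig. 4.2 can be sketched as follows. The grids mirror those quoted above; the use of Youden's J to select the operating point is an assumption, since the thesis only names the "optimum operating point".

```python
import numpy as np

def tune_gland_thresholds(nnl, rel, is_malignant):
    """Sketch of the ROC-based tuning of tau_NNL and tau_REL (Sec. 4.3)."""
    nnl, rel = np.asarray(nnl), np.asarray(rel)
    y = np.asarray(is_malignant, dtype=bool)
    best = (-1.0, None, None)
    for t_rel in np.arange(0.0, 2.51, 0.25):      # REL grid from Sec. 4.3
        for t_nnl in np.arange(0.0, 4.01, 0.08):  # NNL grid from Sec. 4.3
            # Decision rule: benign iff NNL > tau_NNL and REL < tau_REL.
            pred_malig = ~((nnl > t_nnl) & (rel < t_rel))
            sens = np.mean(pred_malig[y])     # true positive rate
            spec = np.mean(~pred_malig[~y])   # true negative rate
            # Youden's J (sens + spec - 1) picks the operating point.
            if sens + spec - 1 > best[0]:
                best = (sens + spec - 1, t_nnl, t_rel)
    return best  # (J, tau_NNL, tau_REL)
```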
4.5 Discussion

The individual gland classification approach is the major novelty of this work. When this technique is applied independently, without being combined with the regional approach, it can detect other tissue anomalies in the prostate. Most prostate diseases are closely related to the individual gland units and directly affect gland morphology [21]. Therefore, beyond cancer annotation, this individual gland labeling can potentially generate a map of abnormality in prostate tissue. Since these abnormalities are not reported in the current clinical protocol of WM analysis, a gold standard against which to compare the performance of the proposed technique could not be collected. When we perform the individual gland labeling of the whole mount images, a distribution of the abnormal glands is observed. Apart from detecting malignant glands, the approach detects glands from Prostatic Intraepithelial Neoplasia (PIN), seminal vesicles, and prostatic atrophy. Fig. 4.12 illustrates an example of applying the individual gland classification to WM images. The enlarged views show the detected malignant glands, atrophic glands, and precancerous PIN. Further improvement of this technique to detect only malignant units could be achieved by identifying the basal cell layer that is present only in the benign units. To identify the basal layer, a different chemical staining of the pathology images with Glutathione S-Transferase pi (GST-pi) is required [27]. More test images with the individual gland segmentation are illustrated in Figs. 4.13-4.39.

Figure 4.3: Performance of the proposed technique on a sample test image (Case: 1). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.4: Performance of the proposed technique on a sample test image (Case: 2). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

The combined classification accuracy obtained by our proposed technique is the highest among the previously reported techniques for automatic cancer annotation of whole mount slides [26], [29]. Incorporating the individual gland-based approach with the region-based approach increases the specificity by 2% and the accuracy by 1%.

Figure 4.5: Performance of the proposed technique on a sample test image (Case: 3). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.6: Performance of the proposed technique on a sample test image (Case: 4).
a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.7: Performance of the proposed technique on a sample test image (Case: 5). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.8: Performance of the proposed technique on a sample test image (Case: 6). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.9: Performance of the proposed technique on a sample test image (Case: 7). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.10: Performance of the proposed technique on a sample test image (Case: 8). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.11: Performance of the proposed technique on a sample test image (Case: 9). a) The test image with both pathologists' annotations overlaid. The cyan annotation is from the second pathologist and considered the ground truth. b) The blue mark is the intermediate annotation after random forest classification at 5× resolution. c) The green mark is the final cancer annotation obtained by the proposed technique. The blue dots represent the detected malignant gland units.

Figure 4.12: Illustration of the individual gland labeling on whole mount images.
Enlarged views of four different types of detected glands: 1) atrophy, 2) PIN, 3) malignant glands, and 4) seminal vesicle.

Figure 4.13: Illustration of the individual gland labeling on a sample whole mount image (Test image: 1)
Figure 4.14: Illustration of the individual gland labeling on a sample whole mount image (Test image: 2)
Figure 4.15: Illustration of the individual gland labeling on a sample whole mount image (Test image: 3)
Figure 4.16: Illustration of the individual gland labeling on a sample whole mount image (Test image: 4)
Figure 4.17: Illustration of the individual gland labeling on a sample whole mount image (Test image: 5)
Figure 4.18: Illustration of the individual gland labeling on a sample whole mount image (Test image: 6)
Figure 4.19: Illustration of the individual gland labeling on a sample whole mount image (Test image: 7)
Figure 4.20: Illustration of the individual gland labeling on a sample whole mount image (Test image: 8)
Figure 4.21: Illustration of the individual gland labeling on a sample whole mount image (Test image: 9)
Figure 4.22: Illustration of the individual gland labeling on a sample whole mount image (Test image: 10)
Figure 4.23: Illustration of the individual gland labeling on a sample whole mount image (Test image: 11)
Figure 4.24: Illustration of the individual gland labeling on a sample whole mount image (Test image: 12)
Figure 4.25: Illustration of the individual gland labeling on a sample whole mount image (Test image: 13)
Figure 4.26: Illustration of the individual gland labeling on a sample whole mount image (Test image: 14)
Figure 4.27: Illustration of the individual gland labeling on a sample whole mount image (Test image: 15)
Figure 4.28: Illustration of the individual gland labeling on a sample whole mount image (Test image: 16)
Figure 4.29: Illustration of the individual gland labeling on a sample whole mount image (Test image: 17)
Figure 4.30: Illustration of the individual gland labeling on a sample whole mount image (Test image: 18)
Figure 4.31: Illustration of the individual gland labeling on a sample whole mount image (Test image: 19)
Figure 4.32: Illustration of the individual gland labeling on a sample whole mount image (Test image: 20)
Figure 4.33: Illustration of the individual gland labeling on a sample whole mount image (Test image: 21)
Figure 4.34: Illustration of the individual gland labeling on a sample whole mount image (Test image: 22)
Figure 4.35: Illustration of the individual gland labeling on a sample whole mount image (Test image: 23)
Figure 4.36: Illustration of the individual gland labeling on a sample whole mount image (Test image: 24)
Figure 4.37: Illustration of the individual gland labeling on a sample whole mount image (Test image: 25)
Figure 4.38: Illustration of the individual gland labeling on a sample whole mount image (Test image: 26)
Figure 4.39: Illustration of the individual gland labeling on a sample whole mount image (Test image: 27)

Chapter 5
Conclusions

5.1 Summary of Contributions

In this thesis, we have proposed a technique for automatic cancer annotation exploiting both regional and gland-specific properties. Combining these two aspects of histology, we have been able to achieve the best reported performance for automatic prostate cancer annotation. The major contributions of this work are:

1. Gland segmentation: We have implemented an algorithm to automatically segment the gland units.
We have employed linear discriminant analysis for labeling the tissue components associated with the gland units, such as lumen, nuclei, epithelium, and stroma. Linear discriminant analysis provides a faster classification than a similar pixel-labeling technique that uses Voronoi tessellation based classification [30]; a sketch of this labeling step follows this list. The segmentation of the gland units facilitates the extraction of the associated features that are used to classify benign and malignant regions.

2. Nuclei segmentation: We have also proposed a technique for nuclei segmentation using a marker-controlled watershed algorithm. To the best of our knowledge, this is the first technique to segment nuclei from images at 20× magnification; the existing techniques for nuclei segmentation operate on 40× or higher magnification [1], [3], [4], [6], [18], [24], [38], [39].

3. Quantification of the Number of Nuclei Layers: We observed that the number of nuclei layers is one of the most important features for the classification of individual gland units. We have proposed an innovative technique for quantifying the number of nuclear layers: the layers are quantified by calculating the angular histogram of the nuclei surrounding each gland, and the normalized number of histogram bins containing multiple nuclei represents the Number of Nuclei Layers associated with the gland.

4. Individual gland classification: We have presented the first technique for labeling individual glands in the prostate using gland-specific features only. We have proposed two novel features for labeling individual glands: the number of nuclei layers and the ratio of epithelial layer area to lumen area. The application of this individual gland-based technique will lead to a more sensitive cancer annotation. In addition to cancer annotation, these gland-specific properties might also be useful in identifying other prostate anomalies such as atrophy, benign prostatic hyperplasia, and prostatic intraepithelial neoplasia.

5. Regional feature set: We have also proposed a set of regional tissue features to detect cancerous regions in WM images. We observed that in malignant tissue regions the morphology and architecture of the histology components, i.e., i) lumen, ii) epithelial layer, and iii) nuclei, are altered from their normal condition. Hence, we have proposed a set of nine features associated with these tissue components to capture these morphological and architectural changes; six of the nine are newly proposed features. The high classification accuracy of the random forest classifier corroborates the effectiveness of the proposed feature set.

6. Cancer annotation using regional and glandular features: We have presented a two-stage algorithm for the automatic annotation of prostate cancer from WM images. The multi-resolution technique presented here is a new approach to WM cancer annotation. Incorporating the gland-specific features with the regional features achieved the highest accuracy among the existing techniques for automated cancer annotation from WM images.
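A minimal sketch of the LDA pixel-labeling step in contribution 1, assuming a small hand-labeled set of RGB pixel samples for the four tissue components; the class codes and function names are illustrative, not the thesis code.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical class codes for the four tissue components.
LUMEN, EPITHELIUM, NUCLEI, STROMA = 0, 1, 2, 3

def fit_tissue_lda(train_pixels, train_labels):
    """Fit an LDA pixel classifier from hand-labeled RGB samples."""
    lda = LinearDiscriminantAnalysis()
    lda.fit(np.asarray(train_pixels, dtype=float), train_labels)
    return lda

def label_tissue(lda, rgb_image):
    """Assign a tissue-component label to every pixel of an H&E image."""
    h, w, _ = rgb_image.shape
    flat = rgb_image.reshape(-1, 3).astype(float)
    # Prediction reduces to one linear projection per class, which is
    # why LDA labeling is fast compared to tessellation-based schemes.
    return lda.predict(flat).reshape(h, w)
```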
Since the proposed features captures the architec-ture and morphology of the tissue, those features can be useful in developinga computational grading scheme.2. The proposed individual gland classification technique often include glandfrom other prostate anomalies such as atrophy and prostatic intraepithelialneoplasia. Further improvement of the technique can be achieved to excludenon-malignant units. One possible improvement can be to identify the basallayer associated with each benign glands. Since malignant gland units do notposses a basal layer this criteria can be very effective to detect the malignantglands. This basal layer identification can be a post processing step after theproposed individual gland classification algorithm. The detected glands bythe proposed technique can be further classified to detect malignant glandsbased on the presence of the basal layer.3. Another exciting future research on this topic can be to investigate the cor-relation of the tissue features with the long term disease progression. Recur-rence of cancer in prostatectomy patients often does not correlate with theirgleason score. Therefore a better indicator is required to predict the patientoutcome. Study of the histology features of prostate might be very useful inthis area.4. The proposed cancer annotation from whole mount images can also be usedto find correlation of cancer in other imaging modalities, such as multipara-metric ultrasound and multiparametric MRI.75Bibliography[1] Y. Al-Kofahi, W. Lassoued, W. Lee, and B. Roysam. Improved automaticdetection and segmentation of cell nuclei in histopathology images. IEEETransactions on Biomedical Engineering, 57(4):841–852, 2010. → pages25, 73[2] F. Aurenhammer. Voronoi diagramsa survey of a fundamental geometricdata structure. ACM Computing Surveys (CSUR), 23(3):345–405, 1991. →pages 17[3] P. Bamford and B. Lovell. Unsupervised cell nucleus segmentation withactive contours. Signal Processing, 71:203–213, 1998. → pages 7, 25, 73[4] P. H. Bartels, T. Gahm, and D. Thompson. Automated microscopy indiagnostic histopathology: From image processing to automated reasoning.International Journal of Imaging Systems and Technology, 8(2):214–223,1997. → pages 25, 73[5] G. Begelrnan, E. Gur, E. Rivlin, M. Rudzsky, and Z. Zalevsky. Cell nucleisegmentation using fuzzy logic engine. In International Conference onImage Processing (ICIP), volume 5, pages 2937 – 2940, Oct. 2004. → pages7[6] L. Boucheron, Z. Bi, N. Harvey, B. Manjunath, and D. Rimm. Utility ofmultispectral imaging for nuclear classification of routine clinicalhistopathology imagery. BMC Cell Biology, 8(Suppl 1):S8, 2007. → pages25, 73[7] L. Breiman. Random forests. Machine Learning, 45(1):5–32, 2001. →pages 11, 14, 21[8] P. cancer treatment guide. Prostate cancer information from the foundationof the prostate gland. http://www.prostate-cancer.com, 2013. → pages 3, 476[9] M. D. Clark, F. B. Askin, and C. R. Bagnell. Nuclear roundness factor: aquantitative approach to grading in prostate carcinoma, reliability of needlebiopsy tissue, and the effect of tumor stage fore usefulness. The Prostate,10:199–206, 1987. → pages 7[10] B. Delaunay. Sur la sphere vide. Izv. Akad. Nauk SSSR, OtdelenieMatematicheskii i Estestvennyka Nauk, 7(793-800):1–2, 1934. → pages 29[11] H. Edelsbrunner and E. P. Mu¨cke. Three-dimensional alpha shapes. ACMTransactions on Graphics (TOG), 13(1):43–72, 1994. → pages 28[12] B. Fulkerson, A. Vedaldi, and S. Soatto. Class segmentation and objectlocalization with superpixel neighborhoods. 
[13] M. Gao, P. Bridman, and S. Kumar. Computer aided prostate cancer diagnosis using image enhancement and JPEG2000. In Proc. SPIE, volume 5203, pages 323–334, 2003.

[14] D. F. Gleason and M. Tannenbaum. The Veterans Administration Cooperative Urologic Research Group: histologic grading and clinical staging of prostatic carcinoma. Urologic Pathology: The Prostate, pages 171–198, 1977.

[15] L. Gorelick, O. Veksler, M. Gaed, J. Gomez, M. Moussa, G. Bauman, A. Fenster, and A. Ward. Prostate histopathology: Learning tissue component histograms for cancer detection and classification. IEEE Transactions on Medical Imaging, 2013.

[16] P. W. Huang and C. H. Lee. Automatic classification for pathological prostate images based on fractal analysis. IEEE Transactions on Medical Imaging, 28(7):1037–1050, 2009.

[17] A. Jemal, F. Bray, M. M. Center, J. Ferlay, E. Ward, and D. Forman. Global cancer statistics. CA: A Cancer Journal for Clinicians, 61(2):69–90, 2011.

[18] T. Jiang and F. Yang. An evolutionary tabu search for cell image segmentation. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 32(5):675–678, 2002.

[19] J. K. Khouzani and S. H. Zadeh. Multiwavelet grading of prostate pathological images. In Proc. SPIE, volume 4628, pages 1130–1138, 2002.

[20] W. J. Krzanowski and W. Krzanowski. Principles of Multivariate Analysis. Clarendon, 2000.

[21] V. Kumar, A. K. Abbas, N. Fausto, and J. C. Aster. Robbins & Cotran Pathologic Basis of Disease. Elsevier Health Sciences, 2009.

[22] E. Lee, Y. Pan, and P. Chu. An algorithm for region filling using two-dimensional grammars. International Journal of Intelligent Systems, 2(3):255–263, 1987.

[23] K.-H. Leissner and L.-E. Tisell. The weight of the human prostate. Scandinavian Journal of Urology and Nephrology, 13(2):137–142, 1979.

[24] G. Li, T. Liu, J. Nie, L. Guo, J. Chen, J. Zhu, W. Xia, A. Mara, S. Holley, and S. Wong. Segmentation of touching cell nuclei using gradient flow tracking. Journal of Microscopy, 231(1):47–58, 2008.

[25] F. Meyer. Topographic distance and watershed lines. Signal Processing, 38(1):113–125, 1994.

[26] J. P. Monaco, M. Feldman, J. Tomaszewski, and A. Madabhushi. Detection of prostate cancer from whole-mount histology images using Markov random fields. In Proc. of 2nd Workshop on Microscopic Image Analysis with Applications in Biology, 2008.

[27] C. A. Moskaluk, P. H. Duray, K. H. Cowan, M. Linehan, and M. J. Merino. Immunohistochemical expression of pi-class glutathione S-transferase is down-regulated in adenocarcinoma of the prostate. Cancer, 79(8):1595–1599, 1997.

[28] S. Naik, S. Doyle, M. Feldman, J. Tomaszewski, and A. Madabhushi. Gland segmentation and computerized Gleason grading of prostate histology by integrating low-, high-level and domain specific information. In Proc. of 2nd Workshop on Microscopic Image Analysis with Applications in Biology, 2007.

[29] K. Nguyen, A. K. Jain, and B. Sabata. Prostate cancer detection: Fusion of cytological and textural features. Journal of Pathology Informatics, 2, 2011.

[30] K. Nguyen, B. Sabata, and A. K. Jain. Prostate cancer grading: Gland segmentation and structural features. Pattern Recognition Letters, 33:951–961, 2011.

[31] K. Nguyen, A. Sarkar, and A. K. Jain.
Structure and context in prostatic gland segmentation and classification. In Medical Image Computing and Computer-Assisted Intervention (MICCAI 2012), pages 115–123. Springer, 2012.

[32] N. Otsu. A threshold selection method from gray-level histograms. Automatica, 11(285-296):23–27, 1975.

[33] R. E. Schapire. A brief introduction to boosting. In IJCAI, volume 99, pages 1401–1406, 1999.

[34] R. Stotzka, R. Manner, P. H. Bartels, and D. Thompson. A hybrid neural and statistical classifier system for histopathologic grading of prostate lesions. Analytical and Quantitative Cytology and Histology, 17(3):204–218, 1995.

[35] K. Suzuki, I. Horiba, and N. Sugie. Linear-time connected-component labeling based on sequential local operations. Computer Vision and Image Understanding, 89(1):1–23, 2003.

[36] A. Tabesh, M. Teverovskiy, H. Y. Pang, V. P. Kumar, D. Verbel, A. Kotsianti, and O. Saidi. Multifeature prostate cancer diagnosis and Gleason grading of histological images. IEEE Transactions on Medical Imaging, 26(4):518–523, 2007.

[37] O. Veksler, Y. Boykov, and P. Mehrani. Superpixels and supervoxels in an energy optimization framework. In Computer Vision–ECCV 2010, pages 211–224. Springer, 2010.

[38] S. Wienert, D. Heim, K. Saeger, A. Stenzinger, M. Beil, P. Hufnagl, M. Dietel, C. Denkert, and F. Klauschen. Detection and segmentation of cell nuclei in virtual microscopy images: a minimum-model approach. Scientific Reports, 2, 2012.

[39] H.-S. Wu, J. Barba, and J. Gil. A parametric fitting algorithm for segmentation of cell images. IEEE Transactions on Biomedical Engineering, 45(3):400–407, 1998.

[40] Y. Zhu, S. Williams, and R. Zwiggelaar. Computer technology in detection and staging of prostate carcinoma: A review. Medical Image Analysis, 10:178–199, 2006.
