Original Contribution

ANALYSIS OF ELASTOGRAPHIC AND B-MODE FEATURES AT SONOELASTOGRAPHY FOR BREAST TUMOR CLASSIFICATION

WOO KYUNG MOON,* CHIUN-SHENG HUANG,† WEI-CHIH SHEN,‡ ETSUO TAKADA,§ RUEY-FENG CHANG,‖ JULIWATI JOE,¶ MICHIKO NAKAJIMA,# and MASAYUKI KOBAYASHI#

*Department of Radiology, College of Medicine, Seoul National University, Seoul, Korea; †Department of Surgery, National Taiwan University Hospital, Taipei, Taiwan; ‡Department of Computer Science and Information Engineering, Asia University, Taichung County, Taiwan; §Center of Medical Ultrasonics, Dokkyo Medical University, Mibu, Japan; ‖Department of Computer Science and Information Engineering, Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taipei, Taiwan; ¶Department of Computer Science and Information Engineering, National Chung Cheng University, Chiayi, Taiwan; and #Comprehensive Regional Service, Saitama Medical University, Saitama, Japan

(Received 12 January 2009; revised 10 June 2009; in final form 20 June 2009)

Abstract—The purpose of this study was to evaluate the accuracy of neural network analysis of elastographic features at sonoelastography for the classification of biopsy-proved benign and malignant breast tumors. Sonoelastography of 181 solid breast masses (113 benign and 68 malignant tumors) was performed for 181 patients (mean age, 47 years; range, 24–75 years). After manual segmentation of the tumors, five elastographic features (strain difference, strain ratio, mean, median and mode) and six B-mode features (orientation, undulation, angularity, average gradient, gradient variance and intensity variance) were computed. A neural network was used to classify tumors by the use of these features. The Student's t test and receiver operating characteristic (ROC) curve analysis were used for statistical analysis.
Area under the ROC curve (Az) values of three elastographic features, mean (0.87), median (0.86) and mode (0.83), were significantly higher than the Az values for the six B-mode features (0.54–0.69) (p < 0.01). Accuracy, sensitivity, specificity and Az of the neural network for the classification of solid breast tumors were 86.2% (156/181), 83.8% (57/68), 87.6% (99/113) and 0.84 for the elastographic features, respectively; 82.3% (149/181), 70.6% (48/68), 89.4% (101/113) and 0.78 for the B-mode features, respectively; and 90.6% (164/181), 95.6% (65/68), 87.6% (99/113) and 0.92 for the combination of the elastographic and B-mode features, respectively. We conclude that sonoelastographic imaging and neural network analysis of features have the potential to increase the accuracy of the use of ultrasound for the classification of benign and malignant breast tumors. (E-mail: rfchang@csie.ntu.edu.tw) © 2009 World Federation for Ultrasound in Medicine & Biology.

Key Words: Breast tumor, Elastography, B-mode ultrasound, BI-RADS, Neural network.

INTRODUCTION

Elastography is a noninvasive imaging method developed to evaluate the stiffness of soft tissues (Hall et al. 2003; Ophir et al. 1991). With the use of sonoelastography, the difference in hardness between normal and diseased tissue of the breast can be estimated by measuring the tissue strain induced by probe compression. Several clinical studies have reported that sonoelastography has the potential to differentiate between benign and malignant breast masses (Booi et al. 2008; Garra et al. 1997; Regner et al. 2006). The measured transverse diameters of benign tumors on elastographic images were almost always the same as or smaller than the diameters of the tumors as determined on B-mode images, whereas the diameters of malignant tumors on elastographic images were invariably larger than those on B-mode images (Garra et al. 1997; Hall et al. 2003).
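As a rough illustration only, the five elastographic features named in the abstract can be sketched as statistics of the strain values inside a segmented tumor, compared against the surrounding tissue. The definitions below (surrounding tissue taken as everything outside the mask, and a histogram-based mode) are plausible assumptions for illustration, not the authors' published implementation:

```python
import numpy as np

def elastographic_features(strain, tumor_mask, bins=32):
    """Sketch of the five elastographic features (assumed definitions).

    strain     : 2-D array of per-pixel strain values (hypothetical input)
    tumor_mask : boolean array, True inside the manually segmented tumor
    """
    tumor = strain[tumor_mask]
    background = strain[~tumor_mask]

    # Histogram-based mode, since strain is continuous-valued
    counts, edges = np.histogram(tumor, bins=bins)
    peak = np.argmax(counts)
    mode = 0.5 * (edges[peak] + edges[peak + 1])

    return {
        "mean": float(tumor.mean()),
        "median": float(np.median(tumor)),
        "mode": float(mode),
        # Stiff (malignant-like) tissue strains less than soft tissue,
        # so a hard tumor yields a large difference and ratio.
        "strain_difference": float(background.mean() - tumor.mean()),
        "strain_ratio": float(background.mean() / tumor.mean()),
    }
```

For example, a stiff inclusion with strain 0.2 inside soft tissue with strain 1.0 gives a strain ratio of 5 and a strain difference of 0.8, consistent with the intuition that harder lesions deform less under probe compression.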
This finding results from the fact that benign tumors generally have smooth borders and are loosely bound to the surrounding tissues, whereas malignant tumors are usually characterized by firm desmoplastic reactions with the surrounding tissue. In current commercial sonoelastography units, including the equipment used in this study, the strain data are converted into color-scale images that are superimposed on B-mode images to easily recognize the relationship between the strain and the lesion on the

Address correspondence to: Professor Ruey-Feng Chang, Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan 106, R.O.C. E-mail: rfchang@csie.ntu.edu.tw

Ultrasound in Med. & Biol., Vol. 35, No. 11, pp. 1794–1802, 2009
Copyright © 2009 World Federation for Ultrasound in Medicine & Biology
Printed in the USA. All rights reserved
0301-5629/09/$–see front matter
doi:10.1016/j.ultrasmedbio.2009.06.1094
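The Az values compared in the abstract are areas under ROC curves. As an aside for readers unfamiliar with the statistic, Az equals the probability that a randomly chosen malignant case receives a higher score than a randomly chosen benign case (the Mann-Whitney interpretation). A minimal sketch, with hypothetical scores rather than the study's data:

```python
def az(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic.

    scores : classifier outputs (higher = more suspicious for malignancy)
    labels : 1 for malignant, 0 for benign (hypothetical example data)
    """
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    # Count pairs where a malignant case outscores a benign one;
    # ties contribute half a win.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For instance, `az([0.9, 0.4, 0.6, 0.2], [1, 1, 0, 0])` yields 0.75: three of the four malignant/benign pairs are ranked correctly. An Az of 0.92, as reported for the combined feature set, means 92% of such pairs would be ordered correctly.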