Increasing segmentation accuracy in ultrasound imaging using filtering and snakes

Kaveh Houshmand, Hamid R. Tizhoosh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Ultrasound images have a low level of contrast and are corrupted by speckle noise. Due to these effects, segmentation of ultrasound images is very challenging. Because of their adaptive characteristics, active contours, or snakes, are a commonly used method for segmenting this type of image. However, even with this adaptive method, which is designed for such environments, other challenges arise. With the abundance of noise in ultrasound images, snakes cannot converge to the object's outline in some cases; as a result, the detected boundary is not accurate enough. Therefore, some pre-processing methods are usually necessary. In this paper, contrast adjustment techniques and a fusion of different filters are implemented to help the snake algorithm converge. As a result, the boundaries of the object of interest, in this case prostate cancer, are identified. The accuracy is then measured and compared against ground-truth images prepared by experts.
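The pre-processing stage the abstract describes (speckle suppression followed by contrast adjustment, ahead of the snake step) can be sketched as below. This is a minimal illustration, not the paper's method: the 3x3 median filter, the min-max contrast stretch, and the multiplicative-Gaussian speckle model are all assumptions chosen for simplicity, and the snake itself is omitted.

```python
# Hypothetical sketch of pre-processing before snake segmentation:
# speckle suppression (3x3 median filter) then min-max contrast stretching.
# Pure Python, standard library only; images are lists of lists of floats.
import random
import statistics

def median_filter_3x3(img):
    """Apply a 3x3 median filter; border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            window = [img[r + dr][c + dc]
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            out[r][c] = statistics.median(window)
    return out

def stretch_contrast(img, lo=0.0, hi=1.0):
    """Linearly rescale intensities to the range [lo, hi]."""
    flat = [v for row in img for v in row]
    mn, mx = min(flat), max(flat)
    span = (mx - mn) or 1.0  # guard against a constant image
    return [[lo + (v - mn) * (hi - lo) / span for v in row] for row in img]

# Demo: a flat 32x32 region corrupted by multiplicative (speckle-like) noise.
random.seed(0)
noisy = [[0.5 * (1.0 + random.gauss(0.0, 0.3)) for _ in range(32)]
         for _ in range(32)]
filt = median_filter_3x3(noisy)
clean = stretch_contrast(filt)
```

After this step a snake would be initialized near the object and allowed to converge on `clean`; the median filter suppresses the speckle outliers that would otherwise create spurious edge forces, while the stretch restores the contrast the filter and the imaging modality flatten.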

Original language: English (US)
Title of host publication: IEEE Canadian Conference on Electrical and Computer Engineering, Proceedings, CCECE 2008
Pages: 1333-1336
Number of pages: 4
DOIs
State: Published - 2008
Event: IEEE Canadian Conference on Electrical and Computer Engineering, CCECE 2008 - Niagara Falls, ON, Canada
Duration: May 4, 2008 to May 7, 2008

Publication series

Name: Canadian Conference on Electrical and Computer Engineering
ISSN (Print): 0840-7789

Conference

Conference: IEEE Canadian Conference on Electrical and Computer Engineering, CCECE 2008
Country/Territory: Canada
City: Niagara Falls, ON
Period: 5/4/08 to 5/7/08

ASJC Scopus subject areas

  • Hardware and Architecture
  • Electrical and Electronic Engineering
