Radon-Gabor barcodes for medical image retrieval

Mina Nouredanesh, H. R. Tizhoosh, Ershad Banijamali, James Tung

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


In recent years, with the explosion of digital images on the Web, content-based retrieval has emerged as a significant research area. Shapes, textures, edges and segments may play a key role in describing the content of an image. Radon and Gabor transforms are both powerful techniques that have been widely studied to extract shape- and texture-based information. The combined Radon-Gabor features may be more robust against scale/rotation variations, presence of noise, and illumination changes. The objective of this paper is to harness the potential of both Gabor and Radon transforms in order to introduce expressive binary features, called barcodes, for image annotation/tagging tasks. We propose two different techniques: Gabor-of-Radon-Image Barcodes (GRIBCs) and Guided-Radon-of-Gabor Barcodes (GRGBCs). For validation, we employ the IRMA x-ray dataset with 193 classes, containing 12,677 training images and 1,733 test images. Total error scores as low as 322 and 330 were achieved for GRGBCs and GRIBCs, respectively. This corresponds to ≈ 81% retrieval accuracy for the first hit.
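The general Radon-of-Gabor idea described in the abstract can be sketched in a few lines: filter the image with a Gabor kernel, take Radon projections of the response at several angles, and binarize each projection against its own median to obtain a binary barcode. The sketch below is a minimal illustration of that pipeline, not the authors' exact GRGBC/GRIBC algorithms; all parameter values (kernel size, wavelength, projection angles) are assumed for demonstration.

```python
import numpy as np
from scipy.ndimage import rotate, convolve

def gabor_kernel(ksize=15, sigma=3.0, theta=0.0, lambd=8.0, gamma=0.5):
    """Real part of a 2-D Gabor filter (illustrative parameter defaults)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr) ** 2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / lambd)

def radon_projections(img, angles):
    """Simple Radon transform: rotate the image and sum along columns."""
    return [rotate(img, a, reshape=False, order=1).sum(axis=0) for a in angles]

def radon_gabor_barcode(img, angles=(0, 45, 90, 135)):
    """Binarize each Radon projection of the Gabor response at its median."""
    response = convolve(img.astype(float), gabor_kernel())
    bits = [(p >= np.median(p)).astype(np.uint8)
            for p in radon_projections(response, angles)]
    return np.concatenate(bits)  # one binary barcode per image

# Usage: a 32x32 image with four projection angles yields a 128-bit barcode.
barcode = radon_gabor_barcode(np.random.default_rng(0).random((32, 32)))
```

Barcodes of this kind can then be compared with Hamming distance for fast retrieval, which is what makes binary descriptors attractive for large medical archives.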

Original language: English (US)
Title of host publication: 2016 23rd International Conference on Pattern Recognition, ICPR 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781509048472
State: Published - Jan 1 2016
Event: 23rd International Conference on Pattern Recognition, ICPR 2016 - Cancun, Mexico
Duration: Dec 4 2016 – Dec 8 2016

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651


Conference: 23rd International Conference on Pattern Recognition, ICPR 2016

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition

