Beyond neighbourhood-preserving transformations for quantization-based unsupervised hashing

Sobhan Hemati, H. R. Tizhoosh

Research output: Contribution to journal › Article › peer-review


An effective unsupervised hashing algorithm leads to compact binary codes that preserve the neighborhood structure of the data as much as possible. One of the most established schemes for unsupervised hashing is to reduce the dimensionality of the data and then find a rigid (neighborhood-preserving) transformation that reduces the quantization error. Although employing rigid transformations is effective, it may not reduce the quantization loss to its lowest possible value. Moreover, reducing dimensionality and quantization loss in two separate steps appears sub-optimal. Motivated by these shortcomings, we propose to employ both rigid and non-rigid transformations to reduce quantization error and dimensionality simultaneously. We relax the orthogonality constraint on the projection in a PCA formulation and regularize it with a quantization term. We show that both the non-rigid projection matrix and the rotation matrix contribute to minimizing the quantization loss, but in different ways. A scalable nested coordinate descent approach is proposed to optimize this mixed-integer optimization problem. We evaluate the proposed method on five public benchmark datasets comprising almost half a million images. Comparative results indicate that the proposed method mostly outperforms state-of-the-art linear methods and competes with end-to-end deep solutions.
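To make the general scheme concrete, the following is a minimal illustrative sketch of quantization-based hashing with a *non-rigid* (unconstrained) projection, alternating between a binary-code step and a least-squares projection step. It is not the paper's actual algorithm (which uses a nested coordinate descent over both a projection and a rotation, with a PCA-style regularizer); the function name and all parameters are assumptions for illustration only.

```python
import numpy as np

def relaxed_projection_hash(X, n_bits=16, n_iters=25, seed=0):
    """Illustrative sketch (not the paper's method): alternate between
    quantizing projections to binary codes and refitting an unconstrained
    (non-rigid) projection W that minimizes ||B - X W||_F^2."""
    Xc = X - X.mean(axis=0)                   # center so sign() thresholding is meaningful
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((Xc.shape[1], n_bits))
    for _ in range(n_iters):
        # Quantization step: binary codes closest to the current projections.
        B = np.sign(Xc @ W)
        B[B == 0] = 1.0
        # Projection step: least-squares W, with no orthogonality constraint,
        # i.e. a non-rigid transformation reducing the quantization error.
        W = np.linalg.lstsq(Xc, B, rcond=None)[0]
    B = np.sign(Xc @ W)
    B[B == 0] = 1.0
    return B.astype(np.int8), W
```

Dropping the orthogonality constraint is what distinguishes this from rigid (rotation-only) schemes such as ITQ: the projection itself can stretch and shear the space to bring points closer to the binary vertices.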

Original language: English (US)
Pages (from-to): 44-50
Number of pages: 7
Journal: Pattern Recognition Letters
State: Published - Jan 2022


Keywords

  • Binary Representation
  • Image Search
  • Quantization
  • Unsupervised Hashing

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence


