Sparse Composite Quantization
Abstract
Quantization techniques have shown competitive performance in approximate nearest neighbor search. The state-of-the-art algorithm, composite quantization, takes advantage of compositionality, i.e., the vector approximation accuracy, as opposed to product quantization and Cartesian k-means. However, we have observed that the runtime cost of computing the distance table in composite quantization, which is used as a lookup table for fast distance computation, becomes nonnegligible in real applications, e.g., reordering the candidates retrieved from the inverted index when handling very large scale databases. To address this problem, we develop a novel approach, called sparse composite quantization, which constructs sparse dictionaries. The benefit is that the distance evaluation between the query and a dictionary element (a sparse vector) is accelerated using efficient sparse vector operations, and thus the cost of distance table computation is substantially reduced. Experimental results on large scale ANN retrieval tasks (1M SIFTs and 1B SIFTs) and applications to object retrieval show that the proposed approach yields competitive performance: superior search accuracy to product quantization and Cartesian k-means with almost the same computing cost, and much faster ANN search than composite quantization with the same level of accuracy.
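To make the claimed speedup concrete, the following is a minimal sketch (not the paper's implementation) of computing a query-to-dictionary distance table when dictionary elements are sparse. All sizes, the random data, and the (indices, values) storage format are illustrative assumptions; the point is only that each table entry costs O(nnz) with a sparse element instead of O(d) with a dense one.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the paper):
# M dictionaries, K elements per dictionary, vectors of dimension d.
M, K, d = 4, 8, 16
nnz = 3  # nonzeros kept per sparse dictionary element
rng = np.random.default_rng(0)

# Each sparse element is stored as an (indices, values) pair so the
# inner product with a query touches only nnz coordinates.
sparse_dicts = []
for _ in range(M):
    elems = []
    for _ in range(K):
        idx = rng.choice(d, size=nnz, replace=False)
        val = rng.standard_normal(nnz)
        elems.append((idx, val))
    sparse_dicts.append(elems)

def distance_table(query, dicts):
    """Inner-product table between the query and every dictionary element.

    With sparse elements, each entry is an O(nnz) sparse dot product
    rather than an O(d) dense one -- the source of the reduced
    distance-table cost described in the abstract.
    """
    table = np.empty((len(dicts), K))
    for m, elems in enumerate(dicts):
        for k, (idx, val) in enumerate(elems):
            table[m, k] = query[idx] @ val  # sparse dot product
    return table

q = rng.standard_normal(d)
tab = distance_table(q, sparse_dicts)
print(tab.shape)  # (4, 8): one entry per dictionary element
```

At search time, the approximate distance to a database vector is then assembled by summing M table lookups (one per dictionary), so the per-query table construction is the only step this sparsity accelerates.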
Publication Date
10-14-2015
Publication Title
Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Volume
07-12-June-2015
Number of Pages
4548-4556
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1109/CVPR.2015.7299085
Copyright Status
Unknown
Scopus ID
84959198229 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84959198229
STARS Citation
Zhang, Ting; Qi, Guo Jun; Tang, Jinhui; and Wang, Jingdong, "Sparse Composite Quantization" (2015). Scopus Export 2015-2019. 1987.
https://stars.library.ucf.edu/scopus2015/1987