Abstract
Parametric Local Multimodal Hashing for Cross-View Similarity Search
Deming Zhai, Hong Chang, Yi Zhen, Xianming Liu, Xilin Chen, Wen Gao
Recent years have witnessed the growing popularity of hashing for efficient large-scale similarity search. It has been shown that hashing quality can be boosted by hash function learning (HFL). In this paper, we study HFL in the context of multimodal data for cross-view similarity search. We present a novel multimodal HFL method, called Parametric Local Multimodal Hashing (PLMH), which learns a set of hash functions that locally adapt to the data structure of each modality. To balance locality against computational efficiency, the hashing projection matrix of each instance is parameterized, with a guaranteed approximation error bound, as a linear combination of basis hashing projections of a small set of anchor points. A locally optimal conjugate gradient algorithm is designed to learn the hash functions for each bit, and the overall hash codes are learned in a sequential manner to progressively minimize the bias. Experimental evaluations on cross-media retrieval tasks demonstrate that PLMH performs competitively against state-of-the-art methods.
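As a rough illustration of the parametric-local idea only, the NumPy sketch below builds an instance-specific projection matrix W(x) as an anchor-weighted linear combination of basis projections W_k and takes the sign of the resulting linear map as the binary code. The Gaussian anchor weighting, the random basis matrices, and the names (anchor_weights, local_projection, hash_code, sigma) are illustrative assumptions; the paper's learned projections, weighting scheme, and per-bit sequential optimization are not reproduced here.

    import numpy as np

    def anchor_weights(x, anchors, sigma=1.0):
        # Soft assignment of instance x to anchor points via a Gaussian
        # kernel (an assumed weighting; the paper's scheme may differ).
        d2 = np.sum((anchors - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return w / w.sum()

    def local_projection(x, anchors, basis_projections, sigma=1.0):
        # Instance-specific projection W(x) = sum_k alpha_k(x) * W_k,
        # i.e., a linear combination of per-anchor basis projections.
        alpha = anchor_weights(x, anchors, sigma)
        return np.tensordot(alpha, basis_projections, axes=1)  # (bits, dim)

    def hash_code(x, anchors, basis_projections, sigma=1.0):
        # Binary code: sign of the locally adapted linear projection.
        W = local_projection(x, anchors, basis_projections, sigma)
        return (W @ x >= 0).astype(np.uint8)

    # Toy usage: 2 anchors, 8-bit codes, 5-dimensional inputs; in PLMH the
    # basis projections would be learned per modality, not random.
    rng = np.random.default_rng(0)
    anchors = rng.normal(size=(2, 5))
    basis = rng.normal(size=(2, 8, 5))  # one basis projection per anchor
    x = rng.normal(size=5)
    print(hash_code(x, anchors, basis))

Because W(x) varies smoothly with the anchor weights, nearby instances receive similar projections while only a small set of basis matrices must be stored, which is the locality/efficiency trade-off the abstract describes.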