Online unsupervised cross-view discrete hashing for large-scale retrieval
Author(s)
Wu, Wei
Yuan, Yun-Hao
Pan, Shirui
Shen, Xiaobo
Abstract
Cross-view hashing has shown great potential for large-scale retrieval due to its superiority in terms of computation and storage. In real-world applications, data arrives in a streaming manner, e.g., new images and tags are uploaded to social media by users every day. Existing cross-view hashing methods have to retrain the model on new multi-view data, which is time-consuming and impractical in real-world applications. To fill this gap, this paper proposes a new online cross-view hashing method, dubbed online unsupervised cross-view discrete hashing (OUCDH), that considers both similarity preservation and quantization loss. OUCDH generates hash codes as a latent embedding shared by multiple views via matrix factorization, and preserves the similarity between newly arriving data and old data with the help of an anchor graph. An efficient iterative algorithm is developed for online optimization. OUCDH further updates the hash codes of old data to match those of newly arriving data in each iteration. Extensive experiments on three benchmarks demonstrate that the proposed OUCDH yields superior performance to existing state-of-the-art online cross-view hashing methods.
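To illustrate the general idea described in the abstract, the following is a minimal, hypothetical NumPy sketch of matrix-factorization-based cross-view hashing: two views share one latent embedding, learned by alternating ridge-regression updates, which is then binarized into hash codes. This is an assumed illustration of the technique class, not the authors' OUCDH algorithm (it omits the online updates, anchor graph, and quantization-loss term); all dimensions and the regularization weight are made up for the example.

```python
import numpy as np

# Hypothetical sketch: learn a latent embedding V shared by two views via
# matrix factorization (X_i ≈ V U_i^T), then binarize V into hash codes.
# NOT the authors' OUCDH method; parameters below are illustrative only.
rng = np.random.default_rng(0)
n, d1, d2, k = 200, 64, 32, 16          # samples, view dims, code length

X1 = rng.standard_normal((n, d1))       # view 1 features (e.g. image)
X2 = rng.standard_normal((n, d2))       # view 2 features (e.g. text)

U1 = rng.standard_normal((d1, k))       # view-specific basis matrices
U2 = rng.standard_normal((d2, k))
V = rng.standard_normal((n, k))         # shared latent embedding

lam = 1e-2                              # ridge regularizer (assumed value)
for _ in range(20):
    # Fix V, solve each view's basis by ridge regression
    U1 = np.linalg.solve(V.T @ V + lam * np.eye(k), V.T @ X1).T
    U2 = np.linalg.solve(V.T @ V + lam * np.eye(k), V.T @ X2).T
    # Fix the bases, solve the shared embedding from both views jointly
    A = U1.T @ U1 + U2.T @ U2 + lam * np.eye(k)
    V = np.linalg.solve(A, (X1 @ U1 + X2 @ U2).T).T

B = np.sign(V)                          # binary hash codes in {-1, +1}
```

Because both views map to the same V, a query from either view can be hashed and compared against B with Hamming distance, which is what makes this family of methods efficient for cross-view retrieval.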
Journal Title
Applied Intelligence
Volume
52
Issue
13
Subject
Artificial intelligence
Image processing
Science & Technology
Technology
Computer Science, Artificial Intelligence
Computer Science
Discrete hashing
Citation
Li, X; Wu, W; Yuan, Y-H; Pan, S; Shen, X, Online unsupervised cross-view discrete hashing for large-scale retrieval, Applied Intelligence, 2022, 52 (13), pp. 14905-14917