Results 161 to 170 of about 1,457
Some of the following articles may not be open access.

Fast Scalable Supervised Hashing

The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 2018
Despite significant progress in supervised hashing, there are three common limitations of existing methods. First, most pioneering methods discretely learn hash codes bit by bit, making the learning procedure rather time-consuming. Second, to reduce the large complexity of the n by n pairwise similarity matrix, most methods apply sampling strategies ...
Xin Luo   +5 more
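The n-by-n pairwise similarity matrix mentioned in the snippet above can be sketched in a few lines. This is an illustrative toy example, not code from the paper: it assumes one-hot label vectors stacked into a matrix `L`, and builds the semantic similarity matrix `S` (entry +1 if two samples share a label, -1 otherwise) whose quadratic size is what motivates the sampling strategies the abstract refers to.

```python
import numpy as np

# Hypothetical toy setup: 4 samples, 3 classes, one-hot label matrix L.
L = np.array([
    [1, 0, 0],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
])

# Pairwise semantic similarity: S[i, j] = 1 if samples i and j share a
# label, -1 otherwise. S is n-by-n, so memory and computation grow
# quadratically with the number of training samples n.
S = np.where(L @ L.T > 0, 1, -1)

print(S)  # 4x4 symmetric matrix; samples 0 and 1 are similar
```

For n in the millions this matrix is infeasible to store, which is why methods typically sample rows or columns of it instead.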

Reconstruction-based supervised hashing

2017 IEEE International Conference on Multimedia and Expo (ICME), 2017
In this paper, we propose a reconstruction-based supervised hashing (RSH) method to learn compact binary codes with holistic structure preservation for large scale image search. Unlike most existing hashing methods which consider pair-wise similarity, our method exploits the structural information of samples by employing a reconstruction-based ...
Xin Yuan   +4 more

Hadamard Coding for Supervised Discrete Hashing

IEEE Transactions on Image Processing, 2018
In this paper, we propose a learning-based supervised discrete hashing method. Binary hashing is widely used for large-scale image retrieval as well as video and document searches because the compact binary code representation is essential for data storage and reasonable for query searches using bit-operations. The recently proposed supervised discrete ...
Gou Koutaki   +2 more
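The bit-operation query search mentioned in the abstract above usually means computing Hamming distance between packed binary codes. A minimal sketch (not from the paper) with two hypothetical 8-bit hash codes:

```python
# Compact binary codes packed into integers let Hamming distance be
# computed with a single XOR followed by a popcount, which is what
# makes bit-operation query search fast.
a = 0b10110010  # 8-bit hash code of a query
b = 0b10011010  # 8-bit hash code of a database item

hamming = bin(a ^ b).count("1")  # number of differing bits
print(hamming)  # -> 2
```

On modern CPUs the popcount is a single instruction, so scanning millions of codes per query is practical.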

Supervised Multi-View Distributed Hashing

2020 IEEE International Conference on Image Processing (ICIP), 2020
Multi-view hashing efficiently integrates multi-view data for learning compact hash codes, and achieves impressive large-scale retrieval performance. In real-world applications, multi-view data are often stored or collected in different locations, where hash code learning is more challenging yet less studied.
Yunpeng Tang   +5 more

Supervised Locality Preserving Hashing

2018
Hashing methods are becoming increasingly popular because they can achieve fast retrieval of large-scale data by representing the images with binary codes. However, traditional hashing methods tend to obtain the binary codes by relaxing the discrete problem, which greatly increases the information loss.
Xiao Zhou, Zhihui Lai, Yudong Chen

Semi-supervised constraints preserving hashing

Neurocomputing, 2015
With the ever-increasing amount of multimedia data on the web, hashing-based approximate nearest neighbor search methods have attracted significant attention due to their remarkable efficiency gains and storage reductions. Traditional unsupervised hashing methods are designed to preserve distance-metric similarity, which may lead to a semantic gap among ...
Di Wang, Xinbo Gao, Xiumei Wang

Supervised Hashing with Soft Constraints

Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management, 2014
Due to the ability to preserve semantic similarity in Hamming space, supervised hashing has been extensively studied recently. Most existing approaches encourage two dissimilar samples to have maximum Hamming distance. This may lead to an unexpected consequence that two unnecessarily similar samples would have the same code if they are both dissimilar ...
Cong Leng   +4 more
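The "unexpected consequence" in the abstract above can be made concrete. This is an illustrative sketch, not the paper's formulation: with K-bit codes, the only code at the maximum Hamming distance K from a code c is its bitwise complement, so any two samples that are each pushed to maximal distance from c collapse onto the same code even if they are unrelated to each other.

```python
# Sketch of the failure mode: forcing two dissimilar-to-c samples to
# the maximum Hamming distance K from c leaves them only one choice of
# code -- the bitwise complement of c -- so they end up identical.
K = 8
c = 0b10110010           # K-bit code of an anchor sample
mask = (1 << K) - 1      # K ones, for taking the complement

x = c ^ mask  # sample 1, pushed to distance K from c
y = c ^ mask  # sample 2, likewise pushed to distance K from c

dist_x = bin(c ^ x).count("1")
print(dist_x, x == y)  # -> 8 True
```

This is why the paper argues for soft constraints instead of forcing maximal Hamming distance on dissimilar pairs.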

Weakly-supervised Cross-modal Hashing

IEEE Transactions on Big Data, 2019
Cross-modal hashing can efficiently retrieve data across different modalities and has been successfully applied in various domains. Although many supervised cross-modal hashing methods have been proposed, they generally focus on only two modalities and assume that the labels of the training data are sufficient and complete.
Xuanwu Liu   +5 more

Supervised Consistent and Specific Hashing

2019 IEEE International Conference on Multimedia and Expo (ICME), 2019
Most existing methods seek the common semantics using different projections for different modalities, which isolates the intrinsic relationships among modalities. Besides, to avoid large quantization error, some of them adopt discrete cyclic coordinate descent schemes, which are usually time-consuming.
Haitao Wang   +3 more

Large-Margin Supervised Hashing

2017
Learning to hash embeds objects (e.g. images/documents) into a binary space with the semantic similarities preserved from the original space, which benefits challenging large-scale tasks such as image retrieval. By leveraging semantic labels, supervised hashing methods usually achieve better performance than unsupervised ones in real-world ...
Xiaopeng Zhang   +3 more
