K-Nearest-Neighbors-Hashing

Matlab implementation of "K-Nearest Neighbors Hashing".

Dependencies:

  • MATLAB >= R2016b (GPU-accelerated knnsearch)
  • GPU memory >= 10 GB
  • MATLAB <= R2016a (CPU-only knnsearch)
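A minimal sketch of how the GPU/CPU split plays out in practice (variable names are illustrative, not from demo.m; assumes the Statistics and Machine Learning Toolbox, plus the Parallel Computing Toolbox for the GPU path):

X = single(randn(60000, 512));   % database features (rows = samples)
Q = single(randn(1000, 512));    % query features
K = 20;

if gpuDeviceCount > 0
    % R2016b+: knnsearch accepts gpuArray inputs, so the exhaustive
    % search runs on the GPU (and needs enough GPU memory).
    idx = knnsearch(gpuArray(X), gpuArray(Q), 'K', K);
    idx = gather(idx);           % bring the indices back to host memory
else
    % R2016a and earlier: CPU-only knnsearch.
    idx = knnsearch(X, Q, 'K', K);
end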

How to use?

  1. Download the datasets and unzip them to ./datasets.
  2. Edit the SETTING part in demo.m:
%% SETTING
num_bits = 16;
dataset = 'cifar10'; % cifar10, mnist, labelme, places
feature_type = 'vggfc7'; % gist, vggfc7, uint (MNIST only), alexnet (Places205 only)

gpuDevice_ID = 1; % set to -1 for CPU-only mode / if you don't want to use the GPU

ITER = 10; % run ITER times
K = 20; % 200 for Places205
%%

For Places205, we recommend downloading the precomputed KNN index matrix from GoogleDrive or Baidu Pan (extraction code: 582l) and moving it to ./model, since generating it on CPU can be time-consuming; a sketch of computing the index yourself follows these steps.

  3. Run demo.m.
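If you do want to generate the Places205 KNN index yourself, something like the following should work; the file and variable names here are assumptions for illustration, not the repo's actual layout:

% Illustrative sketch: precompute the K-nearest-neighbor index for
% Places205 on CPU and cache it so demo.m can skip the slow search.
% File and variable names below are assumptions, not the repo's layout.
load('./datasets/places205_alexnet.mat', 'traindata');   % hypothetical file
K = 200;                                 % README recommends K = 200 for Places205
knn_idx = knnsearch(traindata, traindata, 'K', K + 1);   % exhaustive CPU search
knn_idx = knn_idx(:, 2:end);             % drop the first column (each point itself)
save('./model/knn_index.mat', 'knn_idx', '-v7.3');       % hypothetical file name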

Demo results:

mAP (%)   Cifar10-Gist  MNIST-Gist  LabelMe-Gist  Cifar10-VGG  LabelMe-VGG  MNIST-gray  Places205-alexnet
16 bit    17.3          53.1        15.6          29.1         20.1         47.3         7.7
24 bit    18.1          59.6        16.0          30.3         22.0         52.7        10.1
32 bit    18.8          61.1        16.4          30.8         23.8         53.3        12.2
64 bit    19.5          65.6        17.5          32.6         26.2         56.0        15.9
128 bit   20.0          68.3        18.1          33.7         28.2         58.3        17.9

For more detailed results and experimental settings, please refer to our paper.

Dataset Links:

OneDrive     gist    vggfc7  uint    alexnet
Cifar10
MNIST
Labelme
Places205

Baidu Pan    gist    vggfc7  uint    alexnet
Cifar10      (irou)  (0hr5)
MNIST        (6d29)          (masv)
Labelme      (47ny)  (b7na)
Places205                            (opw9)

Baidu Pan entries in parentheses are the extraction codes.

uint refers to the 784-D (28×28) gray-scale MNIST feature vector, which is stored as uint8.
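A minimal sketch of loading these features (the file and field names are assumptions):

% Illustrative: load the uint8 MNIST features and cast to double before
% any distance computation. File and field names are assumptions.
S = load('./datasets/mnist_uint.mat');   % hypothetical file name
X = double(S.traindata);                 % 784-D rows, values in [0, 255]
X = X / 255;                             % optional: rescale to [0, 1]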

Brief Intro:

Minimizing the mean squared error (MSE) may not lead to the optimal binary representation. We propose conditional entropy as the criterion to refine the previous methods.
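As a toy illustration of the idea (not the paper's actual objective; see the paper for the real formulation), one can measure how uncertain the code bits are within each k-nearest-neighbor set; low entropy means neighbors tend to share codes:

% Toy sketch only -- NOT the paper's exact objective. It illustrates
% measuring binary-code uncertainty conditioned on local KNN structure.
% B: N x num_bits logical codes; knn_idx: N x K neighbor indices.
function H = knn_conditional_entropy(B, knn_idx)
    N = size(knn_idx, 1);
    H = 0;
    for i = 1:N
        nb = B(knn_idx(i, :), :);        % codes of the K neighbors
        p = mean(nb, 1);                 % per-bit probability of a 1
        p = min(max(p, eps), 1 - eps);   % clamp to avoid log2(0)
        hb = -(p .* log2(p) + (1 - p) .* log2(1 - p));  % per-bit entropy
        H = H + sum(hb);
    end
    H = H / N;                           % average bits of uncertainty per point
end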

Q&A

  1. By adjusting hyper-parameters such as KNNHparam.K and KNNHparam.times, I get better or worse results.

It is possible that our setting is suboptimal, as discussed in our paper (see Figure 3). We did not fine-tune these settings to outperform previous well-known methods.

  2. Why does KNNH perform poorly on precision (%) @ r=2?

Precision@r differs from the well-known R-precision, which describes one point on the precision-recall curve (we use p@r=2 because it is popular in recent deep hashing work). In our experiments, although the precision@r=2 values of KNNH are not very good, its recall@r=2 is higher than that of the baselines. A fair comparison contrasts the precision of two points on different PR curves at the same recall; otherwise, the comparison is misleading. A generic sketch of how these metrics are computed follows.
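The sketch below uses the standard definitions of precision@r and recall@r at Hamming radius r = 2; it is a generic illustration, not the repo's evaluation code, and the variable names are assumptions:

% Generic sketch of precision@r / recall@r at Hamming radius r = 2.
% Bq: Nq x num_bits logical query codes; Bd: Nd x num_bits database codes;
% rel: Nq x Nd logical ground-truth relevance matrix (all assumed names).
r = 2;
hamm = pdist2(double(Bq), double(Bd), 'hamming') * size(Bq, 2);  % differing-bit counts
retrieved = hamm <= r;                   % everything within Hamming radius r
tp = sum(retrieved & rel, 2);            % true positives per query
prec = tp ./ max(sum(retrieved, 2), 1);  % precision@r (0 when nothing is retrieved)
rec  = tp ./ max(sum(rel, 2), 1);        % recall@r
fprintf('precision@r=2: %.4f, recall@r=2: %.4f\n', mean(prec), mean(rec));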

Acknowledgment:

We would like to thank the authors of Gemb and MiHash for sharing their code! This project builds on previous well-known methods such as ITQ, BA, KMH, SH, PCAH and SPH.

References:

@InProceedings{KNNH,
  author    = {Xiangyu He and Peisong Wang and Jian Cheng},
  title     = {K-Nearest Neighbors Hashing},
  booktitle = {2019 {IEEE} Conference on Computer Vision and Pattern Recognition},
  month     = {July},
  year      = {2019}
}
