Deep Hashing

Kevin (Ke-Yun) Lin

Homepage: https://sites.google.com/site/kevinlin311tw/
github: https://github.com/kevinlin311tw

Learning Compact Binary Descriptors with Unsupervised Deep Neural Networks

Introduction

We propose a new unsupervised deep learning approach to learn compact binary descriptors.
We enforce three criteria on the binary codes learned at the top layer of our network:

  1. minimal loss quantization;
  2. evenly distributed codes;
  3. rotation-invariant bits.

We then learn the parameters of the network with back-propagation.
Experimental results on three different visual analysis tasks, including image matching, image retrieval, and object recognition, demonstrate the effectiveness of the proposed approach.

Comment: This is unsupervised learning. Starting from a VGG16 model, the network is first fine-tuned with the quantization-loss and even-distribution objectives, and then further fine-tuned to learn rotation invariance from rotated patches.
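
To make the three criteria concrete, here is a minimal NumPy sketch of how such objectives could be computed on a batch of relaxed (sigmoid) descriptor activations. The arrays `h` and `h_rot` are hypothetical stand-ins, and this is only an illustration of the idea, not the authors' Caffe implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sigmoid outputs of the top layer for 16 patches (32 bits each)
# and for the same patches after rotation; values lie in (0, 1).
h = rng.random((16, 32))
h_rot = np.clip(h + 0.05 * rng.standard_normal((16, 32)), 0.0, 1.0)

b = (h > 0.5).astype(float)                # binarized codes

# 1) minimal quantization loss: activations should sit close to {0, 1}
quant_loss = np.mean((b - h) ** 2)

# 2) evenly distributed codes: each bit should fire roughly 50% of the time
even_loss = np.mean((h.mean(axis=0) - 0.5) ** 2)

# 3) rotation-invariant bits: a patch and its rotated version should map
#    to nearby activations
rot_loss = np.mean((h - h_rot) ** 2)

total = quant_loss + even_loss + rot_loss  # objective weights omitted for brevity
```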

The details can be found in the following CVPR 2016 paper

Code

github: https://github.com/kevinlin311tw/cvpr16-deepbit

  1. MATLAB (tested with 2015a on 64-bit Ubuntu)
  2. Caffe’s prerequisites

Learning Compact Binary Descriptors with Unsupervised Deep Neural Networks
Kevin Lin, Jiwen Lu, Chu-Song Chen and Jie Zhou
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016

Deep Learning of Binary Hash Codes for Fast Image Retrieval

Deep Learning of Binary Hash Codes for Fast Image Retrieval
K. Lin, H.-F. Yang, J.-H. Hsiao, C.-S. Chen
CVPR Workshop (CVPRW) on Deep Learning in Computer Vision, DeepVision 2015, June 2015.

Rapid Clothing Retrieval via Deep Learning of Binary Codes and Hierarchical Search
K. Lin, H.-F. Yang, K.-H. Liu, J.-H. Hsiao, C.-S. Chen
ACM International Conference on Multimedia Retrieval, ICMR 2015, June 2015.

Introduction

We present a simple yet effective deep learning framework to create hash-like binary codes for fast image retrieval. We add a latent-attribute layer to the deep CNN to simultaneously learn domain-specific image representations and a set of hash-like functions. Our method does not rely on pairwise similarities of data and is highly scalable to the dataset size.

The details can be found in the CVPRW 2015 paper and the slides.

Comment: a latent (hidden) layer H is added between F7 and F8.
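
As a rough illustration of that comment, the sketch below wires a hypothetical latent layer H between fc7-style features and the classifier. The layer sizes and random weights are made up; the actual method fine-tunes a pre-trained CNN in Caffe.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
fc7 = rng.standard_normal((1, 4096))          # fc7-like features (batch of 1)
W_h = 0.01 * rng.standard_normal((4096, 48))  # latent layer H: 48 hash bits
W_cls = 0.01 * rng.standard_normal((48, 10))  # classifier (F8) now reads from H

h = sigmoid(fc7 @ W_h)                 # hash-like activations in (0, 1)
logits = h @ W_cls                     # trained with a softmax classification loss
code = (h >= 0.5).astype(np.uint8)     # binary code used for retrieval
```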

Code

github: https://github.com/kevinlin311tw/caffe-cvprw15

  1. MATLAB (tested with 2012b on 64-bit Linux)
  2. Caffe’s prerequisites

Yan Pan

homepage: http://ss.sysu.edu.cn/~py/

Supervised Hashing for Image Retrieval via Image Representation Learning

Supervised Hashing for Image Retrieval via Image Representation Learning
Rongkai Xia (third-year master student), Yan Pan*, Hanjiang Lai, Cong Liu, and Shuicheng Yan
In Proceedings of AAAI Conference on Artificial Intelligence (AAAI), 2014

paper: http://ss.sysu.edu.cn/~py/papers/AAAI-CNNH.pdf
PPT: http://ss.sysu.edu.cn/~py/CNNH-slides.pdf

Method:

  • Step 1: Decompose the similarity matrix (each entry indicates whether the corresponding pair of samples is similar) to obtain target binary codes for the samples (see the sketch after this list).
  • Step 2: Train a CNN to fit the obtained binary codes;
    • treated as a multi-label prediction problem: cross-entropy loss
    • classification loss: softmax
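
A minimal sketch of the Step 1 idea on a toy similarity matrix, using a spectral-sign relaxation rather than the coordinate-descent decomposition used in the paper: factor S into codes H such that (1/q) H Hᵀ roughly reconstructs S.

```python
import numpy as np

rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=20)
# Toy similarity matrix: +1 if two samples share a label, -1 otherwise.
S = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)

q = 8  # code length in bits
# Relaxation: take the top-q eigenvectors of S and binarize their signs,
# so that (1/q) * H @ H.T approximates S.
vals, vecs = np.linalg.eigh(S)
top = vecs[:, -q:] * np.sqrt(np.maximum(vals[-q:], 0.0))
H = np.where(top >= 0, 1, -1)          # target codes handed to the CNN in Step 2

err = np.abs(H @ H.T / q - S).mean()
print(f"mean reconstruction error: {err:.3f}")
```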

Conclusion: Although CNNH achieves significant gains over traditional methods based on hand-crafted features in the experiments, it is still not an end-to-end method: the learned image representations cannot feed back to update the binary codes, so it does not fully exploit the power of deep learning.

Simultaneous Feature Learning and Hash Coding with Deep Neural Networks

Simultaneous Feature Learning and Hash Coding with Deep Neural Networks
Hanjiang Lai, Yan Pan*, Ye Liu, and Shuicheng Yan
In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2015

paper: http://arxiv.org/abs/1504.03410

Method:

  1. The network is trained on triplets of images.
  2. Loss function: in Hamming space, the distance between similar samples should be smaller than the distance between dissimilar samples (see the sketch after this list).
  3. Network structure:
    • To reduce redundancy between different bits of the binary code, the authors replace the fully connected layer with partially connected slices; each slice is responsible for learning one bit, and there are no connections between slices.
    • To avoid the discrete-value constraint in binary code learning, the authors, like most hashing methods, use a sigmoid activation to relax the discrete constraint to a range constraint ({0,1} → (0,1)); to keep the learned feature space close to the Hamming space, a piecewise quantization (threshold) function is introduced.
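
A small NumPy sketch of a triplet ranking loss of this kind and of a piecewise threshold, on made-up relaxed codes; the exact loss form and threshold parameters in the paper may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
# Relaxed (0, 1) codes for a query, a similar image, and a dissimilar image.
h_q, h_pos, h_neg = sigmoid(rng.standard_normal((3, 48)))

# Triplet ranking loss: the query should be closer to the similar image
# than to the dissimilar one, by at least a margin.
margin = 1.0
d_pos = np.sum((h_q - h_pos) ** 2)
d_neg = np.sum((h_q - h_neg) ** 2)
triplet_loss = max(0.0, margin + d_pos - d_neg)

# Piecewise threshold: snap activations near 0 or 1 exactly onto {0, 1},
# shrinking the gap between the learned space and the Hamming space.
def piecewise_threshold(s, eps=0.1):
    return np.where(s < 0.5 - eps, 0.0, np.where(s > 0.5 + eps, 1.0, s))

code_q = piecewise_threshold(h_q)
```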

Liang Lin

homepage: http://ss.sysu.edu.cn/~ll/

Bit-Scalable Deep Hashing

Ruimao Zhang, Liang Lin, Rui Zhang, Wangmeng Zuo, Lei Zhang. Bit-Scalable Deep Hashing with Regularized Similarity Learning for Image Retrieval and Person Re-identification. TIP 2015.

Project: http://vision.sysu.edu.cn/projects/deephashing/
Code: https://github.com/ruixuejianfei/BitScalableDeepHash/

Introduction

  1. Triplet-based loss.
  2. The similarity between image pairs is used as a regularization term.
  3. Network learning: a weighted Hamming distance, with the bit weights learned by the network (see the sketch after this list).
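
A toy sketch of the weighted Hamming distance mentioned in item 3; the codes and weights are random stand-ins for what the network would produce.

```python
import numpy as np

rng = np.random.default_rng(0)
b1 = rng.integers(0, 2, 64)    # binary code of image 1
b2 = rng.integers(0, 2, 64)    # binary code of image 2
w = rng.random(64)             # learned, non-negative per-bit weights

hamming = int(np.sum(b1 != b2))            # plain Hamming distance
weighted = float(np.sum(w * (b1 != b2)))   # weighted Hamming distance

# With per-bit weights, low-weight bits can be dropped to shorten the code,
# which is what makes the hashing "bit-scalable".
```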

Wu-Jun Li

homepage: http://cs.nju.edu.cn/lwj/
Learning to Hash: http://cs.nju.edu.cn/lwj/L2H.html

Feature Learning based Deep Supervised Hashing with Pairwise Labels

Feature Learning based Deep Supervised Hashing with Pairwise Labels.
Wu-Jun Li, Sheng Wang*, Wang-Cheng Kang*.
To Appear in Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI), 2016.

Paper: http://cs.nju.edu.cn/lwj/paper/IJCAI16_DPSH.pdf
Code: VGG convnetmat

Introduction:

Simultaneous feature learning and hash-code learning for applications with pairwise labels.
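
As an illustration of learning from pairwise labels, here is a hedged NumPy sketch of a pairwise likelihood loss with a quantization regularizer, in the spirit of DPSH; the symbols (U, S, eta) and their values are toy placeholders, and the exact formulation is given in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.standard_normal((8, 32))    # real-valued network outputs u_i (8 images, 32 bits)
labels = rng.integers(0, 2, 8)
S = (labels[:, None] == labels[None, :]).astype(float)   # pairwise labels s_ij

B = np.sign(U)                      # binary codes b_i in {-1, +1}
theta = 0.5 * (U @ U.T)             # theta_ij = 0.5 * u_i . u_j

# Negative log-likelihood of the pairwise labels, plus a regularizer that
# pulls each real-valued output toward its binary code.
eta = 0.1
nll = -np.sum(S * theta - np.log1p(np.exp(theta)))
quant = np.sum((B - U) ** 2)
loss = nll + eta * quant
```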