DeepCD

Code Author: Tsun-Yi Yang

Last update: 2017/08/17 (Both training and testing code are uploaded.)

Platform: Ubuntu 14.04, Torch7

Paper

[ICCV17] DeepCD: Learning Deep Complementary Descriptors for Patch Representations

Authors: Tsun-Yi Yang, Jo-Han Hsu, Yen-Yu Lin, and Yung-Yu Chuang

PDF:

Code abstract

This is the source code of DeepCD. Training is done on the Brown dataset.

Two distinct descriptors are learned from the same network.

Product late fusion in the distance domain is performed before the final ranking.
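As a rough illustration of product late fusion in the distance domain (the helper name and distance choices below are assumptions for illustration, not code from this repository): the leading descriptor is compared with an L2 distance, the binary complementary descriptor with a Hamming distance, and the two distances are multiplied, so a candidate patch must agree under both descriptors to rank highly.

```lua
-- Hypothetical sketch: multiply the two per-pair distances before ranking.
function fusedDistance(leadA, leadB, compA, compB)
  local dLead = torch.dist(leadA, leadB)      -- L2 distance, leading descriptor
  local dComp = torch.ne(compA, compB):sum()  -- Hamming distance, binary complementary descriptor
  return dLead * dComp
end
```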

The DeepCD project is heavily inspired by PN-Net: https://github.com/vbalnt/pnnet

This repository: (author: Tsun-Yi Yang)

Related repositories: (author: Jo-Han Hsu)

Model

<img src="https://github.com/shamangary/DeepCD/blob/master/models_word.png" height="400"/>

Training with Data-Dependent Modulation (DDM) layer

The backward gradient is scaled by a factor η (1e-3 to 1e-4). This step not only slows down the learning of the fully connected layer inside the DDM layer, but also lets us approximately ignore the effect of the DDM layer on the forward propagation of the complementary stream and treat it as an identity operation. The update equation is essentially the backward equation derived from multiplying the previous layer's output by a parameter w.

<img src="https://github.com/shamangary/DeepCD/blob/master/DDM.png" height="300"/><img src="https://github.com/shamangary/DeepCD/blob/master/DeepCD_triplet.png" height="300"/>

```lua
-- DDM layer: the identity branch (a_DDM) passes the descriptors through
-- unchanged, while b_DDM predicts a per-pair weight from the whole batch.
a_DDM = nn.Identity()
output_layer_DDM = nn.Linear(pT.batch_size*2, pT.batch_size)
output_layer_DDM.weight:fill(0)  -- zero weights and unit bias, so the initial
output_layer_DDM.bias:fill(1)    -- modulation sigmoid(1) is constant across the batch
b_DDM = nn.Sequential()
  :add(nn.Reshape(pT.batch_size*2, false))
  :add(output_layer_DDM)
  :add(nn.Sigmoid())
DDM_ct1 = nn.ConcatTable():add(a_DDM:clone()):add(b_DDM:clone())
DDM_layer = nn.Sequential():add(DDM_ct1):add(nn.DataDependentModule(pT.DDM_LR))
```
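The `nn.DataDependentModule` used above ships with this repository. A simplified sketch of the behavior described in this section (identity forward pass, η-scaled gradient to the weight branch) could look like the module below; the class name and exact tensor shapes are assumptions for illustration, not the repository's actual implementation.

```lua
-- Hypothetical sketch of a data-dependent modulation module.
local DDMSketch, parent = torch.class('nn.DDMSketch', 'nn.Module')

function DDMSketch:__init(eta)
  parent.__init(self)
  self.eta = eta or 1e-3   -- gradient scale for the weight branch (1e-3 to 1e-4)
  self.gradInput = {}
end

function DDMSketch:updateOutput(input)
  -- input[1]: complementary descriptors (batch_size x dim)
  -- input[2]: data-dependent weights w (batch_size)
  self.output = input[1]   -- forward is treated as an identity operation
  return self.output
end

function DDMSketch:updateGradInput(input, gradOutput)
  local feat, w = input[1], input[2]
  -- descriptor stream: gradient modulated by w, i.e. the backward equation
  -- from multiplying the previous layer's output by a parameter w
  self.gradInput[1] = torch.cmul(gradOutput,
      w:view(-1, 1):expandAs(gradOutput))
  -- weight branch: d(feat .* w)/dw summed over the feature dimension,
  -- scaled down by eta to slow the fully connected layer's learning
  self.gradInput[2] = torch.cmul(gradOutput, feat):sum(2):view(-1):mul(self.eta)
  return self.gradInput
end
```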

Testing stage

Brown dataset results

<img src="https://github.com/shamangary/DeepCD/blob/master/DeepCD_brown.png" height="400"/>