Abstract
Quantization has been an effective technique in approximate nearest neighbour (ANN) search due to its high accuracy and fast search speed. To meet the requirements of different applications, there is always a trade-off between retrieval accuracy and speed, reflected in variable code lengths. However,
to encode a dataset into different code lengths, existing methods need to train several models, each of which can only produce codes of a specific length. This incurs a considerable training time cost and largely reduces the flexibility of deploying quantization methods in real applications.
To address this issue, we propose a Deep Recurrent Quantization (DRQ) architecture that generates sequential binary codes. To this end, once the model is trained, a sequence of binary codes can be generated, and the code length can be easily controlled by adjusting the number of recurrent iterations. A shared codebook and a scalar factor are designed as the learnable weights of the deep recurrent quantization block, and the whole framework can be trained in an end-to-end manner. To the best of our knowledge, this is the first quantization method that can be trained once and then generate sequential binary codes. Experimental results on the
benchmark datasets show that our model achieves comparable or even better performance than the state-of-the-art methods for image retrieval, while requiring significantly fewer parameters and less training time. Our code is published online:
https://github.com/cfm-uestc/DRQ