UBC Theses and Dissertations


Learning efficient binary representation for images with unsupervised deep neural networks
Liu, Fangrui


Coding deficiency, the inability of a code to carry sufficient information, is one of the barriers to high-performance representation learning. Unsupervised binary representations have broader applications than other representations but suffer from the same problem. This work addresses coding deficiency from two perspectives: bias on individual binary neurons and correlation between pairs of neurons. A normalization layer and a mutual information loss are introduced to encourage lower code bias and fewer code conflicts when learning unsupervised hashes for images. Keeping every learned bit informative requires a uniform distribution over binary neurons, which motivates the proposed normalized binary layer. Experiments suggest that the proposed normalization enhances code quality by lowering bias, especially at small code lengths. In addition, a mutual information loss on individual stochastic binary neurons is proposed to reduce the correlation between binary neurons: minimizing mutual information on the learned binary representation discourages code conflicts and spreads out the code distribution before the next epoch of optimization. Performance benchmarks on image retrieval with the unsupervised binary codes are conducted on four open datasets. Both proposed approaches help the model achieve state-of-the-art accuracy on the image retrieval task across all four datasets, validating their effectiveness in improving unsupervised hashing efficiency.
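The two quantities the abstract targets, per-bit bias and pairwise correlation of binary neurons, can both be measured directly on a learned code matrix. The sketch below is an illustration of these diagnostics, not the thesis's implementation: the function names and the NumPy formulation are assumptions, but the definitions follow the standard ones (deviation of each bit's activation rate from 0.5, and mutual information between pairs of binary variables).

```python
import numpy as np

def bit_bias(codes):
    """Per-bit bias: how far each bit's activation rate deviates from 0.5.

    codes: (n_samples, n_bits) array of 0/1 values.
    An unbiased (uniform) bit fires on half the samples; larger values
    mean the bit carries less information.
    """
    return np.abs(codes.mean(axis=0) - 0.5)

def pairwise_mutual_information(codes):
    """Mutual information (in bits) between every pair of binary neurons.

    Lower off-diagonal values mean less redundancy, i.e. fewer code
    conflicts across the learned bits.
    """
    n, b = codes.shape
    mi = np.zeros((b, b))
    for i in range(b):
        for j in range(b):
            # Empirical joint distribution over the four (x, y) outcomes.
            joint = np.zeros((2, 2))
            for x in (0, 1):
                for y in (0, 1):
                    joint[x, y] = np.mean((codes[:, i] == x) & (codes[:, j] == y))
            px = joint.sum(axis=1)
            py = joint.sum(axis=0)
            mi[i, j] = sum(
                joint[x, y] * np.log2(joint[x, y] / (px[x] * py[y]))
                for x in (0, 1) for y in (0, 1)
                if joint[x, y] > 0
            )
    return mi
```

For example, two identical bits share about 1 bit of mutual information (the entropy of a fair bit), while two independent uniform bits share almost none; a training objective that drives the off-diagonal entries of this matrix toward zero is one way to read the abstract's "minimizing mutual information on the learned binary representation".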



Attribution-NonCommercial-ShareAlike 4.0 International