Deep learning to ternary hash codes by continuation
Saved in:
Main Authors:
Format: article
Language: EN
Published: Wiley, 2021
Subjects:
Online Access: https://doaj.org/article/a5c841e2503147e9bb25a3892e2fa9f6
Summary: Recently, it has been observed that {0,±1}-ternary codes, which are simply generated from deep features by hard thresholding, tend to outperform {−1,1}-binary codes in image retrieval. To obtain better ternary codes, the authors propose, for the first time, to jointly learn the features with the codes by appending a smoothed function to the networks. During training, the function evolves into a non-smoothed ternary function via a continuation method and then generates the ternary codes. The method circumvents the difficulty of directly training discrete functions and reduces the quantization error of the ternary codes. Experiments show that the proposed joint learning indeed produces better ternary codes. For the first time, the authors propose to generate ternary hash codes by jointly learning the codes with deep features via a continuation method. Experiments show that the proposed method outperforms existing methods.
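The continuation idea described in the summary can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the specific smoothed surrogate (a sum of two shifted `tanh` functions), the threshold `delta`, and the `beta` annealing schedule are all assumptions chosen for clarity. As the sharpness parameter `beta` grows during training, the smooth function converges to hard ternary thresholding, so gradients flow early on while the final outputs become discrete {0, ±1} codes.

```python
import numpy as np

def smoothed_ternary(x, beta, delta=0.5):
    # Smooth surrogate for ternary thresholding (illustrative construction).
    # As beta -> infinity this approaches:
    #   -1 if x < -delta, 0 if |x| < delta, +1 if x > delta.
    return 0.5 * (np.tanh(beta * (x - delta)) + np.tanh(beta * (x + delta)))

def hard_ternary(x, delta=0.5):
    # Hard thresholding used at inference time to emit {0, +/-1} codes.
    return np.sign(x) * (np.abs(x) > delta)

features = np.array([-1.8, -0.2, 0.05, 0.9, 2.3])

# Continuation schedule: progressively sharpen the surrogate during
# training so it evolves toward the discrete ternary function.
for beta in [1.0, 5.0, 25.0, 100.0]:
    codes = smoothed_ternary(features, beta)

# At large beta the smooth surrogate matches the hard ternary codes.
assert np.allclose(smoothed_ternary(features, 1000.0),
                   hard_ternary(features), atol=1e-3)
```

The point of the schedule is that early, small values of `beta` give a differentiable function the network can backpropagate through, while the limit recovers the same hard-thresholded {0, ±1} codes the summary says are used for retrieval, which is why the quantization error shrinks as training proceeds.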