Deep learning to ternary hash codes by continuation
Main authors: (not listed)
Format: article
Language: EN
Published: Wiley, 2021
Subjects: (not listed)
Online access: https://doaj.org/article/a5c841e2503147e9bb25a3892e2fa9f6
Abstract: Recently, it has been observed that {0,±1}-ternary codes, which are simply generated from deep features by hard thresholding, tend to outperform {−1,1}-binary codes in image retrieval. To obtain better ternary codes, the authors propose, for the first time, to jointly learn the features with the codes by appending a smoothed function to the networks. During training, the function evolves into a non-smoothed ternary function via a continuation method, and then generates ternary codes. The method circumvents the difficulty of directly training discrete functions and reduces the quantization errors of the ternary codes. Experiments show that the proposed joint learning indeed produces better ternary codes and that the method outperforms existing ones.
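The hard thresholding and smoothed-surrogate idea described in the abstract can be sketched as follows. The dead-zone threshold `delta`, the tanh-based surrogate, and the schedule for the sharpness parameter `beta` are illustrative assumptions, not the paper's exact formulation:

```python
import math

def hard_ternary(features, delta=0.5):
    # Hard thresholding: map each feature to {-1, 0, +1}.
    # delta is an assumed dead-zone threshold; values with
    # |v| <= delta are quantized to 0.
    return [1.0 if v > delta else -1.0 if v < -delta else 0.0
            for v in features]

def smoothed_ternary(features, delta=0.5, beta=1.0):
    # Smooth surrogate built from two shifted tanh steps.
    # Differentiable for any finite beta, so it can be appended to a
    # network during training; as beta grows, it approaches the
    # discrete ternary function above.
    return [0.5 * (math.tanh(beta * (v - delta)) + math.tanh(beta * (v + delta)))
            for v in features]

feats = [-1.2, -0.3, 0.1, 0.9]
print(hard_ternary(feats))  # codes in {-1, 0, +1}

# Continuation: gradually sharpen the surrogate so that, by the end of
# training, its outputs are effectively ternary codes.
for beta in (1.0, 5.0, 50.0):
    print(beta, [round(c, 2) for c in smoothed_ternary(feats, beta=beta)])
```

The point of the continuation schedule is that the discrete ternary function has zero gradient almost everywhere; training through the smooth surrogate and gradually increasing `beta` sidesteps this while shrinking the gap (and hence the quantization error) between the learned features and the final codes.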