Feature fusion-based collaborative learning for knowledge distillation
Deep neural networks have achieved great success in a variety of applications, such as self-driving cars and intelligent robotics. Meanwhile, knowledge distillation has received increasing attention as an effective model compression technique for training very efficient deep models. The performanc...
Main Authors: Yiting Li, Liyuan Sun, Jianping Gou, Lan Du, Weihua Ou
Format: Article
Language: English
Published: SAGE Publishing, 2021
Online Access: https://doaj.org/article/e64c67ea766f46eea337ddceeef31f02
Similar Items
- Equipment Quality Data Integration and Cleaning Based on Multiterminal Collaboration
  by: Cui-Bin Ji, et al.
  Published: (2021)
- Research on a Microexpression Recognition Technology Based on Multimodal Fusion
  by: Jie Kang, et al.
  Published: (2021)
- Grasp Detection under Occlusions Using SIFT Features
  by: Zhaojun Ye, et al.
  Published: (2021)
- Cooperative Cloud-Edge Feature Extraction Architecture for Mobile Image Retrieval
  by: Chao He, et al.
  Published: (2021)
- Interoperability of Multimedia Network Public Opinion Knowledge Base Group Based on Multisource Text Mining
  by: Yanru Zhu
  Published: (2021)