Feature fusion-based collaborative learning for knowledge distillation
Deep neural networks have achieved great success in a variety of applications, such as self-driving cars and intelligent robotics. Meanwhile, knowledge distillation has received increasing attention as an effective model compression technique for training very efficient deep models. The performanc...
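The abstract refers to knowledge distillation as a model compression technique. As background, a minimal sketch of the classic response-based distillation loss (temperature-softened KL divergence between teacher and student outputs, per Hinton et al.) might look like the following; this illustrates standard distillation only, not the feature fusion-based collaborative scheme proposed in this article:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on softened outputs, scaled by T^2
    so gradients keep a comparable magnitude as T varies."""
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is combined with the usual cross-entropy on ground-truth labels; the loss is zero when the student exactly matches the teacher's logits and positive otherwise.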
Main Authors: Yiting Li, Liyuan Sun, Jianping Gou, Lan Du, Weihua Ou
Format: article
Language: EN
Published: SAGE Publishing, 2021
Online Access: https://doaj.org/article/e64c67ea766f46eea337ddceeef31f02
Similar Items
- Equipment Quality Data Integration and Cleaning Based on Multiterminal Collaboration
  by: Cui-Bin Ji, et al.
  Published: (2021)
- Research on a Microexpression Recognition Technology Based on Multimodal Fusion
  by: Jie Kang, et al.
  Published: (2021)
- Grasp Detection under Occlusions Using SIFT Features
  by: Zhaojun Ye, et al.
  Published: (2021)
- Cooperative Cloud-Edge Feature Extraction Architecture for Mobile Image Retrieval
  by: Chao He, et al.
  Published: (2021)
- Interoperability of Multimedia Network Public Opinion Knowledge Base Group Based on Multisource Text Mining
  by: Yanru Zhu
  Published: (2021)