Compressing deep graph convolution network with multi-staged knowledge distillation.
Given a trained deep graph convolution network (GCN), how can we effectively compress it into a compact network without significant loss of accuracy? Compressing a trained deep GCN into a compact GCN is of great importance for deploying the model in environments such as mobile or embedded systems...
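For context, the sketch below illustrates what knowledge-distillation-based GCN compression looks like in its generic form: a compact student GCN is trained to match both the ground-truth labels and the temperature-softened logits of a pretrained deep teacher GCN. This is a minimal, single-stage Hinton-style baseline in plain PyTorch, not the paper's multi-staged method; the model sizes, hyperparameters (T, alpha), and toy graph are illustrative assumptions.

```python
# Generic single-stage knowledge distillation for GCN compression (sketch).
# Assumptions: dense normalized adjacency, random toy data, arbitrary T and
# alpha -- NOT the paper's multi-staged method, just the standard KD baseline.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCN(nn.Module):
    """Plain GCN: each layer computes H' = A_hat @ H @ W, with ReLU in between."""
    def __init__(self, in_dim, hid_dim, out_dim, num_layers):
        super().__init__()
        dims = [in_dim] + [hid_dim] * (num_layers - 1) + [out_dim]
        self.lins = nn.ModuleList(
            nn.Linear(d_in, d_out, bias=False) for d_in, d_out in zip(dims, dims[1:])
        )

    def forward(self, a_hat, x):
        h = x
        for lin in self.lins[:-1]:
            h = F.relu(a_hat @ lin(h))
        return a_hat @ self.lins[-1](h)  # class logits

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Cross-entropy on hard labels plus KL between temperature-softened logits."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy graph: 100 nodes, 16 features, 7 classes; symmetric adjacency with self-loops.
n, f, c = 100, 16, 7
x = torch.randn(n, f)
adj = (torch.rand(n, n) < 0.05).float()
adj = ((adj + adj.t()) > 0).float()
adj.fill_diagonal_(1.0)
d_inv_sqrt = adj.sum(dim=1).pow(-0.5)
a_hat = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]  # D^-1/2 (A+I) D^-1/2
labels = torch.randint(0, c, (n,))

teacher = GCN(f, 64, c, num_layers=8)  # stands in for a pretrained deep GCN
student = GCN(f, 16, c, num_layers=2)  # compact student to be trained
opt = torch.optim.Adam(student.parameters(), lr=0.01)
with torch.no_grad():
    t_logits = teacher(a_hat, x)       # teacher stays frozen during distillation
for _ in range(100):
    loss = kd_loss(student(a_hat, x), t_logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Note that this sketch distills in a single pass from final logits only; the paper's contribution, per its title, is to stage the distillation across multiple steps.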
Main Authors: Junghun Kim, Jinhong Jung, U Kang
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2021
Online Access: https://doaj.org/article/ca3c820cee7544318b24cd0850d32610
Similar Items
- Sentence Compression Using BERT and Graph Convolutional Networks
  by: Yo-Han Park, et al.
  Published: (2021)
- Deep Graph Convolutional Networks for Accurate Automatic Road Network Selection
  by: Jing Zheng, et al.
  Published: (2021)
- Abdominal multi-organ auto-segmentation using 3D-patch-based deep convolutional neural network
  by: Hojin Kim, et al.
  Published: (2020)
- MSGCN: Multi-Subgraph Based Heterogeneous Graph Convolution Network Embedding
  by: Junhui Chen, et al.
  Published: (2021)
- Joint Trajectory Prediction of Multi-Linkage Robot Based on Graph Convolutional Network
  by: Hu Wu, et al.
  Published: (2020)