Pruning Filters Base on Extending Filter Group Lasso

Deep Convolutional Neural Networks (CNNs) are widely used in image recognition, yet the growing demands of diverse AI applications call for more compact models. Since sparsity is generally accepted as an inherent characteristic of pruned models, L1-based approaches typically add a structured penalty on convolution filters or channels to the training objective in a straightforward way; the sparse models induced this way merely reflect a mechanical balance between training loss and penalty. In this paper, we construct the Extending Filter Group (EFG) through a thorough investigation of the underlying constraints between every two successive layers. The EFG penalty acts during training on the filters of the current layer and the corresponding channels of the following layer simultaneously, which we call synchronous reinforcement. It thus offers an alternative way to induce models with the desired sparsity, especially on complex datasets. Moreover, since our approach derives a shrunken model from the original one, we present the Noise Filter Recognition Mechanism (NFRM) to improve model accuracy. Our method achieves 72.67% accuracy on the CIFAR-100 dataset, versus 72.04% for the current baseline. Notably, our method reaches a pruning rate of 43.8%, higher than that achieved by other popular pruning methods.

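The abstract's core mechanism lends itself to a short illustration. Below is a minimal PyTorch sketch of a group-lasso penalty over Extending Filter Groups as the abstract describes them: the i-th group couples filter i of the current layer with input channel i of the following layer, so the penalty shrinks both toward zero together ("synchronous reinforcement"). The function name, the unit group weights, and the two-layer usage are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

def efg_group_lasso_penalty(conv_l: nn.Conv2d, conv_next: nn.Conv2d) -> torch.Tensor:
    # Hypothetical helper, reconstructed from the abstract: one group per
    # filter of conv_l, extended with the matching input channel of conv_next.
    n_filters = conv_l.weight.shape[0]              # out_channels of layer l
    assert conv_next.weight.shape[1] == n_filters   # in_channels of layer l+1
    penalty = conv_l.weight.new_zeros(())
    for i in range(n_filters):
        group = torch.cat([
            conv_l.weight[i].reshape(-1),           # filter i of layer l
            conv_next.weight[:, i].reshape(-1),     # channel i of every filter in layer l+1
        ])
        penalty = penalty + group.norm(p=2)         # group lasso: sum of per-group L2 norms
    return penalty

# Illustrative usage; lam (the penalty strength) is a hypothetical hyperparameter:
# loss = task_loss + lam * efg_group_lasso_penalty(conv1, conv2)
```

Groups whose L2 norm is driven toward zero mark a filter in the current layer and its matching channel in the following layer that can be pruned together without breaking the connectivity between the two layers.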

Bibliographic Details
Main Authors: Zhihong Xie, Ping Li, Fei Li, Changyi Guo
Format: article
Language: EN
Published: IEEE 2020
Subjects: CNNs, pruning, extending filter group, group lasso, noise filter recognition mechanism
Online Access: https://doaj.org/article/4aad1c6d7ed1463ab9a9d1d2e730ce54
id oai:doaj.org-article:4aad1c6d7ed1463ab9a9d1d2e730ce54
record_format dspace
spelling oai:doaj.org-article:4aad1c6d7ed1463ab9a9d1d2e730ce54 2021-11-19T00:04:50Z
title: Pruning Filters Base on Extending Filter Group Lasso
issn: 2169-3536
doi: 10.1109/ACCESS.2020.3042707
date: 2020-01-01T00:00:00Z
url: https://doaj.org/article/4aad1c6d7ed1463ab9a9d1d2e730ce54
url: https://ieeexplore.ieee.org/document/9281363/
url: https://doaj.org/toc/2169-3536
authors: Zhihong Xie; Ping Li; Fei Li; Changyi Guo
publisher: IEEE
format: article
subjects: CNNs; pruning; extending filter group; group lasso; noise filter recognition mechanism; Electrical engineering. Electronics. Nuclear engineering; TK1-9971
language: EN
citation: IEEE Access, Vol 8, Pp 217867-217876 (2020)
institution DOAJ
collection DOAJ
language EN
topic CNNs
pruning
extending filter group
group lasso
noise filter recognition mechanism
Electrical engineering. Electronics. Nuclear engineering
TK1-9971
spellingShingle CNNs
pruning
extending filter group
group lasso
noise filter recognition mechanism
Electrical engineering. Electronics. Nuclear engineering
TK1-9971
Zhihong Xie
Ping Li
Fei Li
Changyi Guo
Pruning Filters Base on Extending Filter Group Lasso
description Deep Convolutional Neural Networks (CNNs) are widely used in image recognition, yet the growing demands of diverse AI applications call for more compact models. Since sparsity is generally accepted as an inherent characteristic of pruned models, L1-based approaches typically add a structured penalty on convolution filters or channels to the training objective in a straightforward way; the sparse models induced this way merely reflect a mechanical balance between training loss and penalty. In this paper, we construct the Extending Filter Group (EFG) through a thorough investigation of the underlying constraints between every two successive layers. The EFG penalty acts during training on the filters of the current layer and the corresponding channels of the following layer simultaneously, which we call synchronous reinforcement. It thus offers an alternative way to induce models with the desired sparsity, especially on complex datasets. Moreover, since our approach derives a shrunken model from the original one, we present the Noise Filter Recognition Mechanism (NFRM) to improve model accuracy. Our method achieves 72.67% accuracy on the CIFAR-100 dataset, versus 72.04% for the current baseline. Notably, our method reaches a pruning rate of 43.8%, higher than that achieved by other popular pruning methods.
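For reference, the objective the description sketches can be written as a group lasso over EFGs. This is a reconstruction from the abstract's wording rather than the paper's exact notation; $\lambda$ and $n_l$ are assumed symbols.

\[
\min_{W}\; \mathcal{L}(W) \;+\; \lambda \sum_{l}\sum_{i=1}^{n_l}
\bigl\| \bigl(\, W_l[i,:,:,:],\; W_{l+1}[:,i,:,:] \,\bigr) \bigr\|_2
\]

where $n_l$ is the number of filters in layer $l$, $W_l[i,:,:,:]$ is filter $i$ of layer $l$, and $W_{l+1}[:,i,:,:]$ collects channel $i$ of every filter in layer $l+1$; each group ties the two together so they shrink synchronously.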
format article
author Zhihong Xie
Ping Li
Fei Li
Changyi Guo
author_facet Zhihong Xie
Ping Li
Fei Li
Changyi Guo
author_sort Zhihong Xie
title Pruning Filters Base on Extending Filter Group Lasso
title_short Pruning Filters Base on Extending Filter Group Lasso
title_full Pruning Filters Base on Extending Filter Group Lasso
title_fullStr Pruning Filters Base on Extending Filter Group Lasso
title_full_unstemmed Pruning Filters Base on Extending Filter Group Lasso
title_sort pruning filters base on extending filter group lasso
publisher IEEE
publishDate 2020
url https://doaj.org/article/4aad1c6d7ed1463ab9a9d1d2e730ce54
work_keys_str_mv AT zhihongxie pruningfiltersbaseonextendingfiltergrouplasso
AT pingli pruningfiltersbaseonextendingfiltergrouplasso
AT feili pruningfiltersbaseonextendingfiltergrouplasso
AT changyiguo pruningfiltersbaseonextendingfiltergrouplasso
_version_ 1718420666230767616