An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network
Multi-person pose estimation has been gaining considerable interest due to its use in several real-world applications, such as activity recognition, motion capture, and augmented reality. Although the accuracy and speed of multi-person pose estimation techniques have recently been improved, limitations remain in balancing these two aspects. In this paper, a novel knowledge-distilled lightweight top-down pose network (KDLPN) is proposed that balances computational complexity and accuracy. For the first time in multi-person pose estimation, a network is presented that reduces computational complexity by applying a "Pelee" structure and by shuffling pixels in the dense upsampling convolution layer to reduce the number of channels. Furthermore, to prevent performance degradation caused by the reduced computational complexity, knowledge distillation is applied, with a full pose estimation network serving as the teacher. The method's performance is evaluated on the MSCOCO dataset. Experimental results demonstrate that KDLPN reduces the number of parameters by 95% compared with state-of-the-art methods, with minimal performance degradation. Moreover, the method is compared with other pose estimation methods to substantiate the importance and effectiveness of reducing computational complexity.
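The abstract describes cutting the channel count by shuffling pixels in the dense upsampling convolution (DUC) layer. A minimal PyTorch sketch of how such a head can be wired is shown below; this is not the authors' code, and the module name `DUCHead`, the channel sizes, and the upscale factor are all hypothetical.

```python
# Minimal sketch of a dense-upsampling-convolution (DUC) head using pixel
# shuffle, assuming PyTorch. All names and sizes are hypothetical; this is
# an illustration, not the paper's implementation.
import torch
import torch.nn as nn

class DUCHead(nn.Module):
    """Predict upscale**2 sub-pixel maps per joint at low resolution, then
    pixel-shuffle them into one high-resolution heatmap per joint, avoiding
    a wide deconvolution layer."""
    def __init__(self, in_channels: int, num_joints: int, upscale: int = 2):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, num_joints * upscale ** 2,
                              kernel_size=3, padding=1)
        # (N, C*r^2, H, W) -> (N, C, H*r, W*r)
        self.shuffle = nn.PixelShuffle(upscale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.shuffle(self.conv(x))

# Usage: a 256-channel 64x48 feature map becomes 17 joint heatmaps at 128x96.
feats = torch.randn(1, 256, 64, 48)
heatmaps = DUCHead(256, num_joints=17)(feats)
print(heatmaps.shape)  # torch.Size([1, 17, 128, 96])
```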
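The abstract also states that knowledge distillation is applied with a pose estimation network as the teacher. The sketch below shows one common way heatmap-level distillation is formulated, blending a ground-truth term and a teacher term with a weight `alpha`; the MSE form and the weighting are illustrative assumptions, not the paper's stated loss.

```python
# Sketch of a heatmap-level knowledge-distillation loss, assuming PyTorch.
# The MSE terms and the alpha weighting are common choices in pose
# distillation; the paper's exact formulation may differ.
import torch
import torch.nn.functional as F

def distillation_loss(student_hm: torch.Tensor,
                      teacher_hm: torch.Tensor,
                      target_hm: torch.Tensor,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend ground-truth supervision with the teacher's predictions."""
    hard = F.mse_loss(student_hm, target_hm)            # ground-truth term
    soft = F.mse_loss(student_hm, teacher_hm.detach())  # no grad to teacher
    return alpha * hard + (1.0 - alpha) * soft

# Usage with dummy tensors: batch of 4, 17 joints, 64x48 heatmaps.
student = torch.randn(4, 17, 64, 48, requires_grad=True)
teacher = torch.randn(4, 17, 64, 48)
target = torch.randn(4, 17, 64, 48)
distillation_loss(student, teacher, target).backward()
```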
Main Authors: Changhyun Park, Hean Sung Lee, Woo Jin Kim, Han Byeol Bae, Jaeho Lee, Sangyoun Lee
Format: article
Language: EN
Published: MDPI AG, 2021
Subjects: pose estimation; convolutional neural network; lightweight; knowledge distillation; Chemical technology; TP1-1185
Online Access: https://doaj.org/article/5b7260bb507a4836b4ee492f48aad005
id |
oai:doaj.org-article:5b7260bb507a4836b4ee492f48aad005 |
record_format |
dspace |
spelling |
An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network. DOI: 10.3390/s21227640. ISSN: 1424-8220. Published: 2021-11-01. Full text: https://www.mdpi.com/1424-8220/21/22/7640. Journal TOC: https://doaj.org/toc/1424-8220. Authors: Changhyun Park; Hean Sung Lee; Woo Jin Kim; Han Byeol Bae; Jaeho Lee; Sangyoun Lee. Publisher: MDPI AG. Subjects: pose estimation; convolutional neural network; lightweight; knowledge distillation; Chemical technology; TP1-1185. Citation: Sensors, Vol 21, Iss 22, p 7640 (2021). |
institution |
DOAJ |
collection |
DOAJ |
language |
EN |
topic |
pose estimation; convolutional neural network; lightweight; knowledge distillation; Chemical technology; TP1-1185 |
format |
article |
author |
Changhyun Park; Hean Sung Lee; Woo Jin Kim; Han Byeol Bae; Jaeho Lee; Sangyoun Lee |
author_sort |
Changhyun Park |
title |
An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network |
publisher |
MDPI AG |
publishDate |
2021 |
url |
https://doaj.org/article/5b7260bb507a4836b4ee492f48aad005 |