Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition

Multi-task learning (MTL) is a computationally efficient method to solve multiple tasks with one multi-task model instead of multiple single-task models. MTL is expected to learn both diverse and shareable visual features from multiple datasets. However, MTL performance usually does not exceed that of single-task learning. Recent MTL methods tend to use heavy task-specific heads with large overheads to generate task-specific features. In this work, we (1) validate the efficacy of MTL in low-data conditions with early-exit architectures, and (2) propose a simple feature filtering module with minimal overhead to generate task-specific features. We assume that, in low-data conditions, the model cannot learn useful low-level features due to the limited amount of data. We empirically show that MTL can significantly improve performance on all tasks under low-data conditions. We further optimize the early-exit architecture by a sweep search for the optimal feature for each task. Furthermore, we propose a feature filtering module that selects features for each task. Using the optimized early-exit architecture with the feature filtering module, we improve performance by 15.937% on ImageNet and 4.847% on Places365 under the low-data condition where only 5% of the original datasets are available. Our method is empirically validated with various backbones and in various MTL settings.

Full description

Saved in:
Bibliographic Details
Main Authors: Sang-woo Lee, Ryong Lee, Min-seok Seo, Jong-chan Park, Hyeon-cheol Noh, Jin-gi Ju, Rae-young Jang, Gun-woo Lee, Myung-seok Choi, Dong-geol Choi
Format: article
Language: EN
Published: MDPI AG 2021
Subjects: deep learning, multi-task learning, convolutional neural network
Online Access: https://doaj.org/article/a66ea26763d343aab655f260efa48ecc
id oai:doaj.org-article:a66ea26763d343aab655f260efa48ecc
record_format dspace
spelling oai:doaj.org-article:a66ea26763d343aab655f260efa48ecc
indexed 2021-11-11T15:40:58Z
title Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
doi 10.3390/electronics10212691
issn 2079-9292
url https://doaj.org/article/a66ea26763d343aab655f260efa48ecc
published 2021-11-01T00:00:00Z
fulltext https://www.mdpi.com/2079-9292/10/21/2691
journal_toc https://doaj.org/toc/2079-9292
abstract Multi-task learning (MTL) is a computationally efficient method to solve multiple tasks with one multi-task model instead of multiple single-task models. MTL is expected to learn both diverse and shareable visual features from multiple datasets. However, MTL performance usually does not exceed that of single-task learning. Recent MTL methods tend to use heavy task-specific heads with large overheads to generate task-specific features. In this work, we (1) validate the efficacy of MTL in low-data conditions with early-exit architectures, and (2) propose a simple feature filtering module with minimal overhead to generate task-specific features. We assume that, in low-data conditions, the model cannot learn useful low-level features due to the limited amount of data. We empirically show that MTL can significantly improve performance on all tasks under low-data conditions. We further optimize the early-exit architecture by a sweep search for the optimal feature for each task. Furthermore, we propose a feature filtering module that selects features for each task. Using the optimized early-exit architecture with the feature filtering module, we improve performance by 15.937% on ImageNet and 4.847% on Places365 under the low-data condition where only 5% of the original datasets are available. Our method is empirically validated with various backbones and in various MTL settings.
authors Sang-woo Lee; Ryong Lee; Min-seok Seo; Jong-chan Park; Hyeon-cheol Noh; Jin-gi Ju; Rae-young Jang; Gun-woo Lee; Myung-seok Choi; Dong-geol Choi
publisher MDPI AG
format article
topics deep learning; multi-task learning; convolutional neural network; Electronics; TK7800-8360
language EN
source Electronics, Vol 10, Iss 2691, p 2691 (2021)
institution DOAJ
collection DOAJ
language EN
topic deep learning
multi-task learning
convolutional neural network
Electronics
TK7800-8360
spellingShingle deep learning
multi-task learning
convolutional neural network
Electronics
TK7800-8360
Sang-woo Lee
Ryong Lee
Min-seok Seo
Jong-chan Park
Hyeon-cheol Noh
Jin-gi Ju
Rae-young Jang
Gun-woo Lee
Myung-seok Choi
Dong-geol Choi
Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
description Multi-task learning (MTL) is a computationally efficient method to solve multiple tasks with one multi-task model instead of multiple single-task models. MTL is expected to learn both diverse and shareable visual features from multiple datasets. However, MTL performance usually does not exceed that of single-task learning. Recent MTL methods tend to use heavy task-specific heads with large overheads to generate task-specific features. In this work, we (1) validate the efficacy of MTL in low-data conditions with early-exit architectures, and (2) propose a simple feature filtering module with minimal overhead to generate task-specific features. We assume that, in low-data conditions, the model cannot learn useful low-level features due to the limited amount of data. We empirically show that MTL can significantly improve performance on all tasks under low-data conditions. We further optimize the early-exit architecture by a sweep search for the optimal feature for each task. Furthermore, we propose a feature filtering module that selects features for each task. Using the optimized early-exit architecture with the feature filtering module, we improve performance by 15.937% on ImageNet and 4.847% on Places365 under the low-data condition where only 5% of the original datasets are available. Our method is empirically validated with various backbones and in various MTL settings.
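The feature filtering module described in the abstract, which selects shared backbone features for each task with minimal overhead, can be sketched as a learned per-task channel-wise gate. The following NumPy illustration is a sketch under assumed names, shapes, and a sigmoid-gate parameterization; the paper's actual module may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TaskFeatureFilter:
    """Hypothetical sketch: one learnable channel-gate vector per task,
    applied multiplicatively to a shared backbone feature map."""

    def __init__(self, num_tasks, num_channels, rng=None):
        rng = rng or np.random.default_rng(0)
        # Gate logits initialized near zero, so each gate starts near 0.5
        # (in training these would be learned jointly with the heads).
        self.logits = rng.normal(0.0, 0.01, size=(num_tasks, num_channels))

    def __call__(self, features, task_id):
        # features: (batch, channels, height, width) shared feature map
        gate = sigmoid(self.logits[task_id])           # (channels,)
        # Broadcast the gate over batch and spatial dimensions.
        return features * gate[None, :, None, None]

# Dummy shared features from a backbone, filtered for task 0.
shared = np.ones((2, 8, 4, 4))
filt = TaskFeatureFilter(num_tasks=2, num_channels=8)
out = filt(shared, task_id=0)
print(out.shape)  # same shape as the input: (2, 8, 4, 4)
```

Because the gate only adds one vector of parameters per task, its overhead is negligible next to a full task-specific head, which matches the abstract's "minimal overhead" claim in spirit.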
format article
author Sang-woo Lee
Ryong Lee
Min-seok Seo
Jong-chan Park
Hyeon-cheol Noh
Jin-gi Ju
Rae-young Jang
Gun-woo Lee
Myung-seok Choi
Dong-geol Choi
author_facet Sang-woo Lee
Ryong Lee
Min-seok Seo
Jong-chan Park
Hyeon-cheol Noh
Jin-gi Ju
Rae-young Jang
Gun-woo Lee
Myung-seok Choi
Dong-geol Choi
author_sort Sang-woo Lee
title Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
title_short Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
title_full Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
title_fullStr Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
title_full_unstemmed Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
title_sort multi-task learning with task-specific feature filtering in low-data condition
publisher MDPI AG
publishDate 2021
url https://doaj.org/article/a66ea26763d343aab655f260efa48ecc
work_keys_str_mv AT sangwoolee multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT ryonglee multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT minseokseo multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT jongchanpark multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT hyeoncheolnoh multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT jingiju multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT raeyoungjang multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT gunwoolee multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT myungseokchoi multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
AT donggeolchoi multitasklearningwithtaskspecificfeaturefilteringinlowdatacondition
_version_ 1718434367943999488