MEML: A Deep Data Augmentation Method by Mean Extrapolation in Middle Layers
Data augmentation, which generates new data that are similar to but not the same as the original data by applying a series of transformations, is one of the mainstream methods for alleviating the problem of insufficient data. Instead of augmenting input data, this paper proposes a method for augmenting features in the middle layers of deep models, called MEML (Mean Extrapolation in Middle Layers). MEML takes the features output by any middle layer of a deep model, generates new features by applying mean extrapolation to a randomly selected subset of those features, replaces the selected features with their newly generated counterparts while keeping the labels unchanged, and then lets the modified output continue to propagate forward. Experiments on two classic deep neural network models and three image datasets show that MEML significantly improves classification accuracy and, in most experiments, outperforms state-of-the-art feature-space augmentation methods based on dropout and K-Nearest-Neighbors extrapolation. Interestingly, when coupled with input-space augmentation methods such as rotation and horizontal flip, MEML further improves the performance of the two deep models on the three datasets, implying that input-space augmentation methods and MEML can complement each other.
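The abstract describes the augmentation step only at a high level and does not give the exact extrapolation formula, sampling ratio, or extrapolation strength. The sketch below is a minimal, hypothetical PyTorch illustration of the general idea, assuming a common feature-space formulation in which each randomly selected sample's middle-layer feature is pushed away from the batch-mean feature, f' = f + lam * (f - mean); the module name `MeanExtrapolation`, the `ratio` and `lam` hyperparameters, and the placement in the example backbone are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn


class MeanExtrapolation(nn.Module):
    """Hypothetical sketch of mean extrapolation on middle-layer features.

    During training, a random subset of samples in the batch has its feature
    maps pushed away from the batch-mean feature:

        f_new = f + lam * (f - mean(features over the batch))

    The selected features are replaced in place; the labels are left unchanged
    by the normal training loop, matching the description in the abstract.
    """

    def __init__(self, ratio: float = 0.5, lam: float = 0.5):
        super().__init__()
        self.ratio = ratio  # fraction of the batch to augment (assumed hyperparameter)
        self.lam = lam      # extrapolation strength (assumed hyperparameter)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # Act only during training; at evaluation time features pass through untouched.
        if not self.training or self.ratio <= 0.0:
            return feats
        batch_size = feats.size(0)
        num_aug = max(1, int(batch_size * self.ratio))
        idx = torch.randperm(batch_size, device=feats.device)[:num_aug]
        mean_feat = feats.mean(dim=0, keepdim=True)  # batch-mean feature map
        new_feats = feats.clone()
        # Extrapolate the selected features away from the batch mean.
        new_feats[idx] = feats[idx] + self.lam * (feats[idx] - mean_feat)
        return new_feats  # modified output continues to propagate forward


# Example placement between two blocks of a small CNN (illustrative only).
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
    MeanExtrapolation(ratio=0.5, lam=0.5),  # augment this middle layer's features
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 10),
)
```

Because only the features are replaced and the labels are kept, such a module can be inserted between any two layers of an existing network without changing the training loop.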
Saved in:
Main Authors: Dongchen Liu, Lun Zhang, Xiansen Jiang, Caixia Su, Yufeng Fan, Yongfeng Cao
Format: article
Language: EN
Published: IEEE, 2021
Subjects: Data augmentation; extrapolation; feature space; Electrical engineering. Electronics. Nuclear engineering (TK1-9971)
Online Access: https://doaj.org/article/434a0ebb58f24674a9e21dac64037db1
id: oai:doaj.org-article:434a0ebb58f24674a9e21dac64037db1
record_format: dspace
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3125841
Full text: https://ieeexplore.ieee.org/document/9606654/
Journal TOC: https://doaj.org/toc/2169-3536
OAI datestamp: 2021-11-17T00:00:18Z
Publication date: 2021-01-01
Source: IEEE Access, Vol 9, pp. 151621-151630 (2021)
institution: DOAJ
collection: DOAJ
language: EN
topic: Data augmentation; extrapolation; feature space; Electrical engineering. Electronics. Nuclear engineering (TK1-9971)