A Pruning Optimized Fast Learn++NSE Algorithm
Because of its many typical applications, fast classification learning from accumulated big data in nonstationary environments is an important and urgent research problem. The Learn++.NSE algorithm is one of the key results in this field, and a pruning variant, Learn++.NSE-Error-based, was proposed to improve learning efficiency on accumulated big data. However, studies have found that Learn++.NSE-Error-based often prunes the newly generated base classifier in the next ensemble round, which reduces the accuracy of the ensemble classifier. The newest base classifiers are the most important in subsequent ensemble learning and should be retained. Therefore, a new pruning algorithm, NewLearn++.NSE-Error-based, is proposed that always reserves the two most recent base classifiers from pruning. Experimental results on a generated dataset and a real-world dataset show that NewLearn++.NSE-Error-based further improves the accuracy of the ensemble classifier while keeping the same time complexity as the Learn++.NSE algorithm. It is suitable for fast classification learning of long-term accumulated big data.
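The pruning strategy the abstract describes, error-based pruning that always reserves the two most recently added base classifiers, can be sketched roughly as follows. This is a minimal illustration of the idea, not the authors' implementation; the function name, the per-classifier error list, and the ensemble-size cap are all assumptions.

```python
# Hypothetical sketch of the NewLearn++.NSE-Error-based pruning step:
# the two newest base classifiers are protected, and among the rest
# the highest-error classifiers are discarded until the size cap holds.

def prune_ensemble(classifiers, errors, max_size):
    """Keep at most max_size classifiers.

    classifiers: list ordered oldest -> newest
    errors: per-classifier error estimates (same order)
    """
    if len(classifiers) <= max_size:
        return classifiers, errors

    n = len(classifiers)
    protected = list(range(n - 2, n))          # two newest, never pruned
    candidates = list(range(n - 2))            # older classifiers, prunable
    candidates.sort(key=lambda i: errors[i])   # lowest error first
    keep = sorted(candidates[:max_size - 2] + protected)
    return [classifiers[i] for i in keep], [errors[i] for i in keep]
```

For example, with five classifiers and a cap of three, the two newest survive regardless of their errors, and the lowest-error classifier among the older three fills the remaining slot.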
Saved in:
Main Authors: | Yong Chen, Yuquan Zhu, Haifeng Chen, Yan Shen, Zhao Xu |
---|---|
Format: | article |
Language: | EN |
Published: | IEEE, 2021 |
Subjects: | Ensemble learning; nonstationary environment; classification algorithm; big data mining |
Online Access: | https://doaj.org/article/1c215702f04b4da3bee54d80c7183377 |
id |
oai:doaj.org-article:1c215702f04b4da3bee54d80c7183377 |
record_format |
dspace |
spelling |
oai:doaj.org-article:1c215702f04b4da3bee54d80c7183377 2021-11-18T00:09:07Z
A Pruning Optimized Fast Learn++NSE Algorithm
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3118568
https://doaj.org/article/1c215702f04b4da3bee54d80c7183377
https://ieeexplore.ieee.org/document/9562526/
https://doaj.org/toc/2169-3536
Authors: Yong Chen; Yuquan Zhu; Haifeng Chen; Yan Shen; Zhao Xu
Publisher: IEEE
Subjects: Ensemble learning; nonstationary environment; classification algorithm; big data mining; Electrical engineering. Electronics. Nuclear engineering (TK1-9971)
Language: EN
Source: IEEE Access, Vol 9, Pp 150733-150743 (2021) |
institution |
DOAJ |
collection |
DOAJ |
language |
EN |
topic |
Ensemble learning; nonstationary environment; classification algorithm; big data mining; Electrical engineering. Electronics. Nuclear engineering; TK1-9971 |
description |
Because of its many typical applications, fast classification learning from accumulated big data in nonstationary environments is an important and urgent research problem. The Learn++.NSE algorithm is one of the key results in this field, and a pruning variant, Learn++.NSE-Error-based, was proposed to improve learning efficiency on accumulated big data. However, studies have found that Learn++.NSE-Error-based often prunes the newly generated base classifier in the next ensemble round, which reduces the accuracy of the ensemble classifier. The newest base classifiers are the most important in subsequent ensemble learning and should be retained. Therefore, a new pruning algorithm, NewLearn++.NSE-Error-based, is proposed that always reserves the two most recent base classifiers from pruning. Experimental results on a generated dataset and a real-world dataset show that NewLearn++.NSE-Error-based further improves the accuracy of the ensemble classifier while keeping the same time complexity as the Learn++.NSE algorithm. It is suitable for fast classification learning of long-term accumulated big data. |
format |
article |
author |
Yong Chen; Yuquan Zhu; Haifeng Chen; Yan Shen; Zhao Xu |
author_sort |
Yong Chen |
title |
A Pruning Optimized Fast Learn++NSE Algorithm |
title_sort |
pruning optimized fast learn++nse algorithm |
publisher |
IEEE |
publishDate |
2021 |
url |
https://doaj.org/article/1c215702f04b4da3bee54d80c7183377 |