A Pruning Optimized Fast Learn++NSE Algorithm
Main Authors: |
---|---
Format: | article
Language: | EN
Published: | IEEE, 2021
Subjects: |
Online Access: | https://doaj.org/article/1c215702f04b4da3bee54d80c7183377
Summary: | Because of its many typical applications, fast classification learning over continuously accumulated big data in nonstationary environments is an important and urgent research problem. The recently proposed Learn++.NSE algorithm is one of the important results in this field, and a pruned version, Learn++.NSE-Error-based, was introduced to improve learning efficiency on accumulated big data. However, studies have found that the Learn++.NSE-Error-based algorithm often prunes the newly generated base classifier at the next ensemble step, which reduces the accuracy of the ensemble classifier. Since the newly generated base classifier is very important for the next round of ensemble learning, it should be retained. Therefore, the two most recently generated base classifiers are exempted from pruning, and a new pruning algorithm named NewLearn++.NSE-Error-based is proposed. Experimental results on a generated dataset and a real-world dataset show that NewLearn++.NSE-Error-based further improves the accuracy of the ensemble classifier while keeping the same time complexity as the Learn++.NSE algorithm, making it suitable for fast classification learning of long-term accumulated big data. |
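The modification described in the summary reduces to a simple rule: prune the ensemble by error, but always retain the two most recently trained base classifiers. The sketch below illustrates that rule in Python; the scikit-learn classifier interface, the error measure (misclassification rate on the most recent batch), and the ensemble-size budget are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of error-based ensemble pruning that exempts the two
# most recently added base classifiers. Assumes scikit-learn style
# classifiers and a fixed ensemble-size budget; not the authors' code.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def prune_keep_latest(classifiers, X_recent, y_recent, max_size):
    """Keep the `max_size` members with the lowest error on the most
    recent batch, but never prune the two newest base classifiers."""
    if len(classifiers) <= max_size:
        return classifiers

    protected = classifiers[-2:]    # two newest members: always kept
    candidates = classifiers[:-2]   # older members: eligible for pruning

    # Error of each prunable member on the latest batch (assumed criterion).
    errors = [np.mean(clf.predict(X_recent) != y_recent) for clf in candidates]

    # Keep the best older members so the total ensemble size is max_size.
    n_keep = max_size - len(protected)
    keep_idx = np.argsort(errors)[:n_keep]
    survivors = [candidates[i] for i in sorted(keep_idx)]  # preserve age order
    return survivors + protected


# Toy usage on a drifting stream of batches.
rng = np.random.default_rng(0)
ensemble = []
for t in range(8):
    # Simulated batch whose decision boundary shifts over time (concept drift).
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.3 * t * X[:, 1] > 0).astype(int)

    clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    ensemble.append(clf)
    ensemble = prune_keep_latest(ensemble, X, y, max_size=5)
    print(f"batch {t}: ensemble size = {len(ensemble)}")
```

In this sketch the pruning step costs no more than evaluating each stored classifier on the newest batch, so protecting the two newest members changes only which classifiers survive, not the overall cost of maintaining the bounded ensemble.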