Comparison of Machine Learning Classifiers for Reducing Fitness Evaluations of Structural Optimization

Metaheuristic algorithms have been widely used to solve structural optimization problems. Despite their powerful search capabilities, these algorithms often require a large number of fitness evaluations. Constructing a machine learning classifier to identify which individuals should be evaluated with the original fitness function is an effective way to reduce the computational cost. However, a thorough comparison of machine learning classifiers integrated into the optimization process is still lacking. This paper evaluates the efficiency of different classifiers at eliminating unnecessary fitness evaluations. For this purpose, the weight optimization of a double-layer grid structure comprising 200 members is used as a numerical experiment. Six machine learning classifiers are assessed: Artificial Neural Network, Support Vector Machine, k-Nearest Neighbor, Decision Tree, Random Forest, and Adaptive Boosting (AdaBoost). The comparison is made in terms of the optimal weight of the structure, the rejection rate, and the computing time. Overall, the AdaBoost classifier achieves the best performance.
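
As a concrete illustration of the approach summarized above, the following minimal sketch shows one way a classifier can act as a gatekeeper inside differential evolution: an AdaBoost model is trained on previously evaluated individuals, and trial vectors it predicts as non-improving are rejected without an exact evaluation. This is not the authors' implementation; the sphere function stands in for the structural analysis of the 200-member grid, and the DE settings, the feature encoding (trial vector plus its difference from the parent), and the retraining schedule are illustrative assumptions.

```python
# Sketch of classifier-assisted differential evolution (illustrative, not the paper's code).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

def fitness(x):
    # Placeholder for the expensive exact evaluation (in the paper: structural analysis).
    return float(np.sum(x ** 2))

DIM, POP, GENS, F, CR = 10, 30, 50, 0.8, 0.9
pop = rng.uniform(-5.0, 5.0, size=(POP, DIM))
fit = np.array([fitness(x) for x in pop])

X_hist, y_hist = [], []   # archive of (features, improved-or-not) pairs for training
clf = None
evaluations = POP         # exact evaluations spent so far (initial population)

for gen in range(GENS):
    for i in range(POP):
        # DE/rand/1/bin: mutation and binomial crossover
        idx = rng.choice([j for j in range(POP) if j != i], size=3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True
        trial = np.where(cross, mutant, pop[i])

        # Classifier acts as a gatekeeper: skip trials predicted not to improve
        feature = np.concatenate([trial, trial - pop[i]]).reshape(1, -1)
        if clf is not None and clf.predict(feature)[0] == 0:
            continue  # rejected without spending an exact evaluation

        f_trial = fitness(trial)
        evaluations += 1
        improved = int(f_trial < fit[i])
        X_hist.append(feature.ravel())
        y_hist.append(improved)
        if improved:
            pop[i], fit[i] = trial, f_trial

    # (Re)train the classifier once both classes have been observed
    if len(set(y_hist)) == 2:
        clf = AdaBoostClassifier(n_estimators=50, random_state=0)
        clf.fit(np.array(X_hist), np.array(y_hist))

print(f"best fitness: {fit.min():.4f}, exact evaluations: {evaluations}")
```

In the paper's setting, fitness(x) would be the structural weight returned by a finite element analysis (with design constraints handled, for example, through penalties), and the rejection rate used in the comparison is presumably the share of trial designs the classifier filters out before evaluation.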

Bibliographic Details
Main Authors: Tran-Hieu Nguyen, Anh-Tuan Vu
Format: Article
Language: English
Published: Pouyan Press, 2021
Published in: Journal of Soft Computing in Civil Engineering, Vol 5, Iss 4, Pp 57-73 (2021)
ISSN: 2588-2872
DOI: 10.22115/scce.2021.306249.1367
Subjects: evolutionary algorithm; differential evolution; surrogate model; machine learning classifier; AdaBoost; Technology (T)
Online Access: https://doaj.org/article/354635fcf77e46fa8f23f1c2004a56e2
Full Text: http://www.jsoftcivil.com/article_140631_5558a8192fd5dda8180018f59b838453.pdf