Self-Tuning Lam Annealing: Learning Hyperparameters While Problem Solving

The runtime behavior of Simulated Annealing (SA), similar to other metaheuristics, is controlled by hyperparameters. For SA, hyperparameters affect how “temperature” varies over time, and “temperature” in turn affects SA’s decisions on whether or not to transition to neighboring states. It is typically necessary to tune the hyperparameters ahead of time. However, there are adaptive annealing schedules that use search feedback to evolve the “temperature” during the search. A classic and generally effective adaptive annealing schedule is the Modified Lam. Although effective, the Modified Lam can be sensitive to the scale of the cost function, and is sometimes slow to converge to its target behavior. In this paper, we present a novel variation of the Modified Lam that we call Self-Tuning Lam, which uses early search feedback to auto-adjust its self-adaptive behavior. Using a variety of discrete and continuous optimization problems, we demonstrate the ability of the Self-Tuning Lam to nearly instantaneously converge to its target behavior independent of the scale of the cost function, as well as its run length. Our implementation is integrated into Chips-n-Salsa, an open-source Java library for parallel and self-adaptive local search.
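
As background for the abstract above, here is a minimal, self-contained Java sketch of the classic Modified Lam feedback loop that the paper builds on, using constants standard in published descriptions of the schedule (exponential-moving-average tracking of the acceptance rate toward Lam's target trajectory, which holds at 0.44 through the middle of the run). The class name and initial values are illustrative assumptions; this sketches the baseline schedule, not the paper's Self-Tuning variant.

```java
import java.util.concurrent.ThreadLocalRandom;

/**
 * Sketch of the classic Modified Lam annealing schedule (the baseline
 * this paper improves on). Constants follow standard published
 * formulations; names and initial values here are illustrative.
 */
final class ModifiedLamSketch {
    private double temperature = 0.5; // assumed starting value; problem-dependent
    private double acceptRate = 0.5;  // exponential moving average of acceptances

    /** Lam's idealized target acceptance rate at evaluation i of maxEvals. */
    static double targetRate(int i, int maxEvals) {
        double frac = (double) i / maxEvals;
        if (frac < 0.15) {
            // Warm-up: target decays from 1.0 down toward 0.44.
            return 0.44 + 0.56 * Math.pow(560.0, -frac / 0.15);
        } else if (frac < 0.65) {
            // Long middle phase: hold the target at 0.44.
            return 0.44;
        } else {
            // Final phase: target decays from 0.44 toward 0.
            return 0.44 * Math.pow(440.0, -(frac - 0.65) / 0.35);
        }
    }

    /** Metropolis criterion: temperature governs moves to worse neighbors. */
    boolean accept(double currentCost, double neighborCost) {
        return neighborCost <= currentCost
            || ThreadLocalRandom.current().nextDouble()
                < Math.exp((currentCost - neighborCost) / temperature);
    }

    /** Feedback step, run once per iteration after the accept/reject decision. */
    void update(boolean accepted, int i, int maxEvals) {
        // Track the observed acceptance rate with an exponential moving average.
        acceptRate = 0.998 * acceptRate + (accepted ? 0.002 : 0.0);
        // Steer temperature so the observed rate follows the target rate.
        if (acceptRate > targetRate(i, maxEvals)) {
            temperature *= 0.999; // accepting too often: cool down
        } else {
            temperature /= 0.999; // accepting too rarely: heat up
        }
    }
}
```

Because the starting temperature and the fixed multiplicative step carry no knowledge of the cost function, the observed acceptance rate can take many iterations to reach the target trajectory when costs are on an unusual scale; this is consistent with the scale sensitivity and slow convergence the abstract describes, which the Self-Tuning Lam addresses with early search feedback.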

Bibliographic Details
Main Author: Vincent A. Cicirello
Format: article
Language: EN
Published: MDPI AG, 2021
Journal: Applied Sciences, Vol. 11, Iss. 21, Article 9828 (2021)
DOI: 10.3390/app11219828
ISSN: 2076-3417
Subjects: Self-Tuning; Simulated Annealing; Modified Lam; hyperparameters; Exponential Moving Average; adaptive search
LCC Classifications: Technology (T); Engineering (General). Civil engineering (General) (TA1-2040); Biology (General) (QH301-705.5); Physics (QC1-999); Chemistry (QD1-999)
Online Access: https://doaj.org/article/d036af5c9c094a0ca83439ec9a1dd42d
Article URL: https://www.mdpi.com/2076-3417/11/21/9828