An Adaptive Deep Learning Optimization Method Based on Radius of Curvature


Bibliographic Details
Main Authors: Jiahui Zhang, Xinhao Yang, Ke Zhang, Chenrui Wen
Format: Article
Language: English
Published: Hindawi Limited, 2021
Subjects:
Online Access: https://doaj.org/article/acab22f5e532433c807e6c5f9bb9d3fc
Description
Abstract: An adaptive clamping method (SGD-MS) based on the radius of curvature is designed to alleviate the local-optimum oscillation problem in deep neural networks; it combines the radius of curvature of the objective function with the gradient descent of the optimizer. The radius of curvature serves as a threshold for adaptively selecting between the momentum term and the future-gradient moving-average term. On this basis, we further propose an accelerated version (SGD-MA), which improves convergence speed by using the method of aggregated momentum. Experimental results on several datasets show that the proposed methods effectively alleviate the local-optimum oscillation problem and greatly improve convergence speed and accuracy. This paper thus provides a novel parameter-updating algorithm for deep neural networks.