A Scalable Bayesian Sampling Method Based on Stochastic Gradient Descent Isotropization

Stochastic gradient (SG)-based algorithms for Markov chain Monte Carlo sampling (SGMCMC) tackle large-scale Bayesian modeling problems by operating on mini-batches and injecting noise on SG steps. The sampling properties of these algorithms are determined by user choices, such as the covariance of the injected noise and the learning rate, and by problem-specific factors, such as assumptions on the loss landscape and the covariance of the SG noise. However, current SGMCMC algorithms applied to popular complex models such as Deep Nets cannot simultaneously satisfy the assumptions on loss landscapes and on the behavior of the covariance of the SG noise while operating with the practical requirement of non-vanishing learning rates. In this work we propose a novel practical method, which makes the SG noise isotropic, using a fixed learning rate that we determine analytically. Extensive experimental validations indicate that our proposal is competitive with the state of the art on SGMCMC.
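
For context, the abstract's point of departure is the stochastic-gradient MCMC family built on Langevin dynamics, where each parameter update combines a mini-batch gradient step with injected Gaussian noise. The sketch below shows the classic SGLD update (Welling & Teh, 2011) as a baseline only; it is not the isotropization method the article proposes, and the helper names and toy target are illustrative assumptions.

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    """One generic SGLD update: a stochastic-gradient step on the log
    posterior plus isotropic injected Gaussian noise scaled by the
    learning rate. This is the SGMCMC baseline the abstract refers to,
    NOT the paper's method, which instead makes the SG noise isotropic
    under a fixed, analytically determined learning rate."""
    noise = rng.normal(size=np.shape(theta))  # injected isotropic noise
    return (theta + 0.5 * step_size * grad_log_post(theta)
            + np.sqrt(step_size) * noise)

# Toy usage (illustrative): sample a standard 1-D Gaussian posterior,
# whose log-density gradient at theta is -theta.
rng = np.random.default_rng(0)
theta, samples = 0.0, []
for _ in range(20_000):
    theta = sgld_step(theta, lambda t: -t, 0.05, rng)
    samples.append(theta)
print(np.mean(samples), np.var(samples))  # approximately 0 and 1
```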

Bibliographic Details
Main Authors: Giulio Franzese, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi
Format: Article
Language: English
Published: MDPI AG, 2021
Published in: Entropy, Vol. 23, Iss. 11, p. 1426 (2021)
DOI: 10.3390/e23111426
ISSN: 1099-4300
Subjects: Bayesian sampling; stochastic gradients; Monte Carlo integration; Science (Q); Astrophysics (QB460-466); Physics (QC1-999)
Online Access: https://doaj.org/article/ce4b43b8d50f4f07bbae919563d8ebfd
Full Text: https://www.mdpi.com/1099-4300/23/11/1426