A Scalable Bayesian Sampling Method Based on Stochastic Gradient Descent Isotropization
Main Authors:
Format: article
Language: EN
Published: MDPI AG, 2021
Online Access: https://doaj.org/article/ce4b43b8d50f4f07bbae919563d8ebfd
Abstract: Stochastic gradient (SG)-based algorithms for Markov chain Monte Carlo sampling (SGMCMC) tackle large-scale Bayesian modeling problems by operating on mini-batches and injecting noise into SG steps. The sampling properties of these algorithms are determined by user choices, such as the covariance of the injected noise and the learning rate, and by problem-specific factors, such as assumptions on the loss landscape and the covariance of the SG noise. However, current SGMCMC algorithms applied to popular complex models such as deep neural networks cannot simultaneously satisfy the assumptions on loss landscapes and on the behavior of the covariance of the SG noise while operating with the practical requirement of non-vanishing learning rates. In this work we propose a novel practical method, which makes the SG noise isotropic using a fixed learning rate that we determine analytically. Extensive experimental validation indicates that our proposal is competitive with the state of the art on SGMCMC.
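To make the idea of noise isotropization concrete, the following is a minimal sketch of an SGLD-style update in which the stochastic gradient is rescaled so its noise is approximately isotropic before Gaussian noise is injected. This is an illustrative assumption, not the paper's algorithm: the function name, the diagonal variance estimate `noise_var_est`, and the fixed learning rate `lr` are hypothetical placeholders (the paper derives its learning rate analytically).

```python
# Minimal sketch of one SGMCMC step with SG-noise isotropization.
# Illustrative only: a diagonal preconditioner built from an assumed
# running estimate of the SG-noise variance, combined with a
# Langevin-style injected-noise update with a fixed learning rate.
import numpy as np

def isotropized_sgmcmc_step(theta, grad, noise_var_est, lr, rng):
    """One sampling step: whiten the stochastic gradient's noise
    (approximately) with a diagonal preconditioner, then inject
    isotropic Gaussian noise as in SG Langevin dynamics."""
    # Diagonal preconditioner that rescales each coordinate so the
    # SG noise has (roughly) unit variance in every direction.
    precond = 1.0 / np.sqrt(noise_var_est + 1e-8)
    # Injected isotropic noise with the standard SGLD scale sqrt(2 * lr).
    noise = rng.normal(size=theta.shape) * np.sqrt(2.0 * lr)
    return theta - lr * precond * grad + noise
```

In this sketch, `noise_var_est` would be maintained across iterations (e.g., as an exponential moving average of squared gradient deviations); the fixed, non-vanishing `lr` stands in for the analytically determined learning rate described in the abstract.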