Conditional Deep Gaussian Processes: Multi-Fidelity Kernel Learning
Deep Gaussian Processes (DGPs) were proposed as an expressive Bayesian model capable of mathematically grounded uncertainty estimation. The expressivity of DGPs results not only from their compositional character but also from the distribution propagation within the hierarchy. Recently, it was pointed out...
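The two ingredients the abstract names, composition and distribution propagation, can be illustrated with a minimal sketch of drawing a sample path from a two-layer DGP. This is an illustrative assumption, not code from the paper: the RBF kernel, lengthscales, and input grid are chosen only for demonstration.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0):
    """Squared-exponential (RBF) covariance between 1-D input arrays x, y."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp(x, lengthscale, rng, jitter=1e-8):
    """Draw one function sample f ~ GP(0, k) evaluated at inputs x."""
    K = rbf_kernel(x, x, lengthscale) + jitter * np.eye(len(x))
    return np.linalg.cholesky(K) @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 50)

# Layer 1: hidden function h(x). Layer 2: output f evaluated at h(x).
# Propagating a random sample of h (not just its mean) through the
# second layer is what carries uncertainty down the hierarchy.
h = sample_gp(x, lengthscale=1.0, rng=rng)
f = sample_gp(h, lengthscale=0.5, rng=rng)  # composition: a GP of a GP sample
```

Repeating the last two lines with fresh randomness yields different hidden functions `h` and hence different output paths `f`, which is the distribution propagation the abstract refers to.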
Saved in:
| Main Authors: | Chi-Ken Lu, Patrick Shafto |
|---|---|
| Format: | article |
| Language: | EN |
| Published: | MDPI AG, 2021 |
| Online Access: | https://doaj.org/article/7279de62091e4c17b2e769ba1c8ad513 |
Similar Items
- Conditional Deep Gaussian Processes: Empirical Bayes Hyperdata Learning
  by: Chi-Ken Lu, et al.
  Published: (2021)
- Sampling the Variational Posterior with Local Refinement
  by: Marton Havasi, et al.
  Published: (2021)
- Gaussian states of continuous-variable quantum systems provide universal and versatile reservoir computing
  by: Johannes Nokkala, et al.
  Published: (2021)
- Fast Approximations of the Jeffreys Divergence between Univariate Gaussian Mixtures via Mixture Conversions to Exponential-Polynomial Distributions
  by: Frank Nielsen
  Published: (2021)
- Bird Species Identification Using Spectrogram Based on Multi-Channel Fusion of DCNNs
  by: Feiyu Zhang, et al.
  Published: (2021)