Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback
Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task...
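The following is a minimal sketch, not the authors' code, of the idea summarised above: a generic feedforward network trained with a plain error-based rule (squared error on a point target, no probabilistic feedback) can implicitly learn a probabilistic computation. Here, hypothetically, two input populations encode the same stimulus with trial-varying reliability (Poisson-like gain), and the trained network's estimate ends up combining the cues in a reliability-weighted way. All names, architecture choices, and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 20                          # neurons per input population
pref = np.linspace(-10, 10, n_units)  # preferred stimuli of the tuning curves

def population_response(s, gain):
    """Poisson spike counts from Gaussian tuning curves with a given gain."""
    rates = gain * np.exp(-0.5 * ((s - pref) / 2.0) ** 2)
    return rng.poisson(rates).astype(float)

def make_batch(n):
    s = rng.uniform(-5, 5, n)        # stimulus on each trial
    g1 = rng.uniform(0.5, 3.0, n)    # reliability (gain) of cue 1
    g2 = rng.uniform(0.5, 3.0, n)    # reliability (gain) of cue 2
    x = np.stack([np.concatenate([population_response(si, a),
                                  population_response(si, b)])
                  for si, a, b in zip(s, g1, g2)])
    return x, s

# One hidden layer, trained with vanilla SGD on squared error
# (non-probabilistic feedback: the target is just the stimulus value).
n_in, n_hid = 2 * n_units, 100
W1 = rng.normal(0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, n_hid);         b2 = 0.0
lr = 1e-3

for step in range(5000):
    x, s = make_batch(64)
    h = np.tanh(x @ W1 + b1)          # hidden activity
    y = h @ W2 + b2                   # scalar estimate of the stimulus
    err = y - s                       # error signal (no likelihoods, no posteriors)
    # Backpropagate the squared-error gradient.
    dW2 = h.T @ err / len(s); db2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    dW1 = x.T @ dh / len(s);  db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

# After training, the estimate should track the stimulus across reliability
# conditions, i.e. the network has learned to weight the two cues by their gains.
x, s = make_batch(1000)
y = np.tanh(x @ W1 + b1) @ W2 + b2
print("RMSE of network estimate:", np.sqrt(np.mean((y - s) ** 2)))
```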
Saved in:
Main Authors: A. Emin Orhan, Wei Ji Ma
Format: article
Language: EN
Published: Nature Portfolio, 2017
Online Access: https://doaj.org/article/9eb0080d125246cb88d6ac22181477e7
Similar Items
- Neural substrates of cognitive biases during probabilistic inference
  by: Alireza Soltani, et al.
  Published: (2016)
- Gaussian synapses for probabilistic neural networks
  by: Amritanand Sebastian, et al.
  Published: (2019)
- Probabilistic phylogenetic inference with insertions and deletions.
  by: Elena Rivas, et al.
  Published: (2008)
- On the origins of suboptimality in human probabilistic inference.
  by: Luigi Acerbi, et al.
  Published: (2014)
- A self-consistent probabilistic formulation for inference of interactions
  by: Jorge Fernandez-de-Cossio, et al.
  Published: (2020)