Charles Bonnet syndrome: evidence for a generative model in the cortex?

Several theories propose that the cortex implements an internal model to explain, predict, and learn about sensory data, but the nature of this model is unclear. One condition that could be highly informative here is Charles Bonnet syndrome (CBS), where loss of vision leads to complex, vivid visual hallucinations of objects, people, and whole scenes. CBS could be taken as indication that there is a generative model in the brain, specifically one that can synthesise rich, consistent visual representations even in the absence of actual visual input. The processes that lead to CBS are poorly understood. Here, we argue that a model recently introduced in machine learning, the deep Boltzmann machine (DBM), could capture the relevant aspects of (hypothetical) generative processing in the cortex. The DBM carries both the semantics of a probabilistic generative model and of a neural network. The latter allows us to model a concrete neural mechanism that could underlie CBS, namely, homeostatic regulation of neuronal activity. We show that homeostatic plasticity could serve to make the learnt internal model robust against e.g. degradation of sensory input, but overcompensate in the case of CBS, leading to hallucinations. We demonstrate how a wide range of features of CBS can be explained in the model and suggest a potential role for the neuromodulator acetylcholine. This work constitutes the first concrete computational model of CBS and the first application of the DBM as a model in computational neuroscience. Our results lend further credence to the hypothesis of a generative model in the brain.
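The mechanism at the heart of the abstract, homeostatic plasticity compensating for lost sensory drive until the network produces activity on its own, can be illustrated with a minimal sketch. The snippet below is not the authors' DBM implementation; it is a toy single-layer network in which every value (the weights `W`, biases `b`, `target_rate`, and learning rate `eta`) is an assumption chosen for illustration. It shows bias homeostasis holding activity at a target rate under normal input and then restoring activity even after the input is removed entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy feedforward "visible -> hidden" layer; sizes, weights, and the target
# rate are illustrative assumptions, not values from the paper.
n_vis, n_hid = 20, 10
W = rng.uniform(0.0, 1.0, size=(n_vis, n_hid))   # excitatory input weights
b = np.zeros(n_hid)                               # hidden biases, adapted homeostatically
target_rate = 0.1                                 # desired mean activation per unit
eta = 0.05                                        # homeostatic learning rate

def step(v):
    """One homeostatic update: move each unit's bias toward the target firing rate."""
    global b
    h = sigmoid(v @ W + b)
    b += eta * (target_rate - h)
    return h

# Phase 1: normal sensory input; homeostasis holds activity at the target rate.
for _ in range(5000):
    step((rng.random(n_vis) < 0.3).astype(float))
print("with input:        ", step((rng.random(n_vis) < 0.3).astype(float)).mean())

# Phase 2: sensory loss (v = 0). Activity first collapses...
print("input just removed:", sigmoid(np.zeros(n_vis) @ W + b).mean())

# ...but homeostasis keeps raising the biases until the layer is active again
# with no input at all -- a crude analogue of the internally generated activity
# the paper associates with CBS.
for _ in range(5000):
    step(np.zeros(n_vis))
print("after adaptation:  ", sigmoid(np.zeros(n_vis) @ W + b).mean())
```

In the paper's terms, the adapted layer ends up producing activity despite the absence of any visual input; the actual model additionally relies on the DBM's learnt generative structure, so that such spontaneous activity corresponds to coherent visual representations rather than unstructured noise.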

Bibliographic Details
Main authors: David P Reichert, Peggy Seriès, Amos J Storkey
Format: article
Language: EN
Published: Public Library of Science (PLoS), 2013
Published in: PLoS Computational Biology, Vol 9, Iss 7, p e1003134 (2013)
DOI: 10.1371/journal.pcbi.1003134
ISSN: 1553-734X, 1553-7358
Subjects: Biology (General); QH301-705.5
Online access: https://doaj.org/article/5c0e065beaab41be8931464a1e7e0e1d
Full text (PDF): https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/23874177/pdf/?tool=EBI