Stimulus-dependent maximum entropy models of neural population codes.

Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model, a minimal extension of the canonical linear-nonlinear model of a single neuron to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population.
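The SDME model described in the abstract assigns each binary population codeword σ a conditional probability of the pairwise maximum entropy form P(σ|s) ∝ exp(Σ_i h_i(s)σ_i + Σ_{i<j} J_ij σ_i σ_j), with stimulus-dependent fields h_i(s) and stimulus-independent couplings J_ij. As a minimal illustrative sketch (not the authors' code; the field and coupling values below are hypothetical), the distribution can be enumerated exactly for a toy three-neuron population at a single stimulus frame:

```python
import itertools
import numpy as np

def sdme_probabilities(h, J):
    """Exact conditional distribution P(sigma | s) for a toy SDME model.

    h : (N,) stimulus-dependent fields h_i(s), evaluated at one stimulus frame.
    J : (N, N) symmetric pairwise couplings with zero diagonal.
    Enumerates all 2^N binary codewords, so this is feasible only for small N.
    """
    N = len(h)
    patterns = np.array(list(itertools.product([0, 1], repeat=N)))
    # log-weight of each codeword: sum_i h_i sigma_i + sum_{i<j} J_ij sigma_i sigma_j
    # (the 0.5 factor converts the full symmetric double sum to the i<j sum)
    log_w = patterns @ h + 0.5 * np.einsum('ki,ij,kj->k', patterns, J, patterns)
    log_w -= log_w.max()              # subtract max for numerical stability
    w = np.exp(log_w)
    return patterns, w / w.sum()      # normalize by the partition function Z(s)

# Hypothetical values for one stimulus frame: 3 neurons, one coupled pair
h = np.array([-1.0, -0.5, -2.0])      # stimulus drives (e.g. from an LN front end)
J = np.zeros((3, 3))
J[0, 1] = J[1, 0] = 1.2               # positive coupling between neurons 0 and 1
patterns, p = sdme_probabilities(h, J)
```

With J = 0 the model factorizes into independent linear-nonlinear neurons; the couplings are what let it capture the codeword statistics that uncoupled models miss.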


Bibliographic Details
Main Authors: Einat Granot-Atedgi, Gašper Tkačik, Ronen Segev, Elad Schneidman
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2013
Subjects: Biology (General); QH301-705.5
Online Access: https://doaj.org/article/049461a7e66448c7928f8cc59215ad42
DOI: 10.1371/journal.pcbi.1002922
ISSN: 1553-734X, 1553-7358
Published in: PLoS Computational Biology, Vol 9, Iss 3, p e1002922 (2013)
Full text: https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/23516339/pdf/?tool=EBI