Complexity and diversity in sparse code priors improve receptive field characterization of Macaque V1 neurons.
System identification techniques such as projection pursuit regression models (PPRs) and convolutional neural networks (CNNs) provide state-of-the-art performance in predicting visual cortical neurons' responses to arbitrary input stimuli. However, the constituent kernels recovered by these methods are often noisy and lack coherent structure, making it difficult to understand the component features underlying a neuron's receptive field. In this paper, we show that using a dictionary of diverse, complex-shaped kernels learned from natural scenes according to efficient coding theory as the front-end for PPRs and CNNs improves their neuronal response prediction as well as their data efficiency and convergence speed. Extensive experimental results also indicate that these sparse-code kernels provide important information about the component features of a neuron's receptive field. In addition, we find that models with the complex-shaped sparse-code front-end are significantly better than models with a standard orientation-selective Gabor filter front-end at modeling V1 neurons known to exhibit complex pattern selectivity. We show that the relative performance difference between these two front-ends yields a sensitive metric for detecting complex selectivity in V1 neurons.
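The abstract describes the core architecture only at a high level: a fixed front-end dictionary of kernels feeding a trainable readout. Below is a minimal, hypothetical sketch of that idea, not the authors' code: random Gabor patches stand in for the sparse-code dictionary (which the paper learns from natural scenes), the rectify-and-pool nonlinearity and least-squares readout are illustrative choices, and the synthetic "neuron" is invented purely for demonstration.

```python
# A minimal, hypothetical sketch of the paper's modeling idea, not the
# authors' code: a FIXED dictionary of front-end kernels plus a LEARNED
# readout fit to a neuron's responses. Random Gabor patches stand in for
# the sparse-code dictionary that the paper learns from natural scenes.
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)

def gabor(size, theta, freq, sigma, phase):
    """One Gabor patch, used here as a stand-in dictionary kernel."""
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax)
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr + phase)

# Front-end: a small, diverse bank of kernels. It stays fixed; only the
# readout below is trained, which is the sense in which the dictionary
# acts as a prior.
K = 16
dictionary = [gabor(15,
                    theta=rng.uniform(0, np.pi),
                    freq=rng.uniform(0.05, 0.25),
                    sigma=rng.uniform(2.0, 5.0),
                    phase=rng.uniform(0, 2 * np.pi))
              for _ in range(K)]

def front_end(stim):
    """Rectified response of each kernel, mean-pooled over space.
    The rectify-and-pool nonlinearity is an illustrative choice."""
    return np.array([np.maximum(convolve2d(stim, k, mode="valid"), 0).mean()
                     for k in dictionary])

# Synthetic data: random stimuli and a fake "neuron" driven by two hidden
# component kernels (purely for illustration; no real data involved).
N, S = 500, 32
stimuli = rng.standard_normal((N, S, S))
hidden = [dictionary[3], dictionary[7]]
responses = np.array([
    sum(np.maximum(convolve2d(s, h, mode="valid"), 0).mean() for h in hidden)
    for s in stimuli])

# Learned readout: ordinary least squares on the fixed front-end features.
X = np.array([front_end(s) for s in stimuli])
X = np.hstack([X, np.ones((N, 1))])              # bias column
w, *_ = np.linalg.lstsq(X, responses, rcond=None)
predicted = X @ w
print(f"prediction r = {np.corrcoef(predicted, responses)[0, 1]:.3f}")
```

Because only the readout is fit, the parameter count scales with the dictionary size rather than the pixel count, which is one intuition for why a fixed front-end can improve data efficiency, as the abstract reports.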
Saved in:
Main authors: | Ziniu Wu, Harold Rockwell, Yimeng Zhang, Shiming Tang, Tai Sing Lee |
---|---|
Format: | article |
Language: | EN |
Published: | Public Library of Science (PLoS), 2021-10-01 |
Published in: | PLoS Computational Biology, Vol 17, Iss 10, p e1009528 (2021) |
DOI: | https://doi.org/10.1371/journal.pcbi.1009528 |
ISSN: | 1553-734X, 1553-7358 |
Subjects: | Biology (General), QH301-705.5 |
Online access: | https://doaj.org/article/2069ff302cfb44a391b3381cd5161622 |
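The abstract's final claim suggests a simple diagnostic built from two fits of the same model class, one per front-end. A hedged sketch of such a metric follows; the paper specifies only a "relative performance difference," so the exact normalization here is an assumption.

```python
# Hypothetical complex-selectivity metric: compare prediction performance
# (e.g., Pearson r on held-out data) of a model with the sparse-code
# front-end against the same model with a Gabor front-end.
def complex_selectivity_index(r_sparse: float, r_gabor: float) -> float:
    """Relative performance difference between the two front-ends.
    The normalization by the larger score is an illustrative assumption."""
    return (r_sparse - r_gabor) / max(abs(r_sparse), abs(r_gabor), 1e-12)

# A neuron predicted at r = 0.72 with sparse-code kernels but only
# r = 0.40 with Gabors scores ~0.44, flagging complex selectivity.
print(complex_selectivity_index(0.72, 0.40))
```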
id |
oai:doaj.org-article:2069ff302cfb44a391b3381cd5161622 |
record_format |
dspace |
institution |
DOAJ |
collection |
DOAJ |
language |
EN |
topic |
Biology (General) QH301-705.5 |
description |
System identification techniques such as projection pursuit regression models (PPRs) and convolutional neural networks (CNNs) provide state-of-the-art performance in predicting visual cortical neurons' responses to arbitrary input stimuli. However, the constituent kernels recovered by these methods are often noisy and lack coherent structure, making it difficult to understand the component features underlying a neuron's receptive field. In this paper, we show that using a dictionary of diverse, complex-shaped kernels learned from natural scenes according to efficient coding theory as the front-end for PPRs and CNNs improves their neuronal response prediction as well as their data efficiency and convergence speed. Extensive experimental results also indicate that these sparse-code kernels provide important information about the component features of a neuron's receptive field. In addition, we find that models with the complex-shaped sparse-code front-end are significantly better than models with a standard orientation-selective Gabor filter front-end at modeling V1 neurons known to exhibit complex pattern selectivity. We show that the relative performance difference between these two front-ends yields a sensitive metric for detecting complex selectivity in V1 neurons. |
format |
article |
author |
Ziniu Wu; Harold Rockwell; Yimeng Zhang; Shiming Tang; Tai Sing Lee |
author_sort |
Ziniu Wu |
title |
Complexity and diversity in sparse code priors improve receptive field characterization of Macaque V1 neurons. |
title_sort |
complexity and diversity in sparse code priors improve receptive field characterization of macaque v1 neurons. |
publisher |
Public Library of Science (PLoS) |
publishDate |
2021 |
url |
https://doaj.org/article/2069ff302cfb44a391b3381cd5161622 |