The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules.

During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and the processes that determine which synapses and neurons are ultimately pruned, remains unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size and (3) pruning based on a locally-available measure of importance based on Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.

Saved in:
Bibliographic Details
Main Authors: Carolin Scholl, Michael E Rule, Matthias H Hennig
Format: article
Language: EN
Published: Public Library of Science (PLoS) 2021
Subjects: Biology (General)
Online Access: https://doaj.org/article/fd12fd0dc7964aed830c787a619b98ee
id oai:doaj.org-article:fd12fd0dc7964aed830c787a619b98ee
record_format dspace
spelling oai:doaj.org-article:fd12fd0dc7964aed830c787a619b98ee
issn 1553-734X, 1553-7358
doi 10.1371/journal.pcbi.1009458
url https://doi.org/10.1371/journal.pcbi.1009458
published_in PLoS Computational Biology, Vol 17, Iss 10, p e1009458 (2021)
publishDate 2021-10-01
institution DOAJ
collection DOAJ
language EN
topic Biology (General)
QH301-705.5
description During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and the processes that determine which synapses and neurons are ultimately pruned, remains unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size and (3) pruning based on a locally-available measure of importance based on Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.
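The abstract above describes a locally-available pruning signal: each connection's importance is estimated via Fisher information, which for pairwise couplings in a binary network reduces to statistics of presynaptic-postsynaptic activity correlations. A minimal NumPy sketch of this idea (not the authors' implementation; the toy network, its "driver" wiring, and the variance-of-coincidence importance proxy are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: 8 presynaptic and 4 postsynaptic binary units.
# Each postsynaptic unit tracks one presynaptic "driver" (with 10% noise),
# so only 4 of the 32 possible connections carry relevant information.
n_pre, n_post, n_samples = 8, 4, 5000
pre = rng.random((n_samples, n_pre)) < 0.5           # binary presynaptic activity
post = np.zeros((n_samples, n_post), dtype=bool)
for j in range(n_post):
    post[:, j] = pre[:, 2 * j] ^ (rng.random(n_samples) < 0.1)

# Locally-available importance proxy: the variance of the pre-post coincidence
# v_i * h_j across samples, a diagonal Fisher-information estimate (up to
# scaling) for a pairwise coupling in a binary model. Each synapse can compute
# this from activity it observes directly -- no global information needed.
coincidence = (pre[:, :, None] & post[:, None, :]).astype(float)  # (samples, pre, post)
importance = coincidence.var(axis=0)                              # (pre, post)

# Prune: retain only the top-k connections by estimated importance.
k = n_post  # in this toy example, one informative connection per postsynaptic unit
keep = importance >= np.sort(importance, axis=None)[-k]
```

With correlated driver pairs, the coincidence variance (~0.25) clearly exceeds that of independent pairs (~0.19), so the rule retains exactly the informative connections; a magnitude-only rule has no access to this activity statistic, which is the contrast the paper draws.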
format article
author Carolin Scholl
Michael E Rule
Matthias H Hennig
title The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules.
publisher Public Library of Science (PLoS)
publishDate 2021
url https://doaj.org/article/fd12fd0dc7964aed830c787a619b98ee