Unique scales preserve self-similar integrate-and-fire functionality of neuronal clusters


Bibliographic Details
Main Authors: Anar Amgalan, Patrick Taylor, Lilianne R. Mujica-Parodi, Hava T. Siegelmann
Format: Article
Language: EN
Published: Nature Portfolio, 2021
Subjects: R; Q
Online Access: https://doaj.org/article/4437a08b9cd74b978b311f5678a4bb4a
Description
Summary: Brains demonstrate varying spatial scales of nested hierarchical clustering. Identifying the brain’s neuronal cluster size to be represented as nodes in a network computation is critical to both neuroscience and artificial intelligence, as these define the cognitive blocks capable of building intelligent computation. Experiments support various forms and sizes of neural clustering, from handfuls of dendrites to thousands of neurons, and hint at their behavior. Here, we use computational simulations with a brain-derived fMRI network to show that not only do brain networks remain structurally self-similar across scales, but neuron-like signal integration functionality (“integrate and fire”) is also preserved at particular clustering scales. As such, we propose coarse-graining neuronal networks into ensemble-nodes, with multiple spikes making up each ensemble-spike and a time re-scaling factor defining each ensemble-time step. This fractal-like spatiotemporal property, observed in both structure and function, permits strategic choice in bridging across experimental scales for computational modeling, while also suggesting regulatory constraints on developmental and evolutionary “growth spurts” in brain size, as per punctuated equilibrium theories in evolutionary biology.
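The coarse-graining idea in the abstract can be sketched with a toy simulation. The sketch below is a minimal illustration, not the authors' actual model: the leaky integrate-and-fire dynamics, the parameter values, and the fractional-firing criterion for an ensemble-spike are all illustrative assumptions.

```python
import numpy as np

def lif_spikes(current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire unit; return a 0/1 spike train.
    All parameter values here are illustrative assumptions."""
    v = 0.0
    spikes = np.zeros(len(current), dtype=int)
    for t, i_t in enumerate(current):
        v += dt / tau * (-v + i_t)   # leaky integration of the input drive
        if v >= v_thresh:            # threshold crossing -> emit a spike
            spikes[t] = 1
            v = v_reset              # reset membrane potential after firing
    return spikes

def coarse_grain(spike_trains, k_time, spike_frac=0.5):
    """Collapse a cluster of spike trains into one ensemble-node train:
    bins of k_time steps become one ensemble-time step, and the ensemble
    emits an ensemble-spike when at least spike_frac of the possible member
    spikes fall in a bin (a hypothetical criterion, for illustration)."""
    pooled = spike_trains.sum(axis=0)            # total spikes across members
    n_bins = pooled.shape[0] // k_time
    binned = pooled[: n_bins * k_time].reshape(n_bins, k_time).sum(axis=1)
    max_per_bin = spike_trains.shape[0] * k_time
    return (binned >= spike_frac * max_per_bin).astype(int)

# Usage: five noisy-input neurons, 200 time steps, coarse-grained by 10x in time.
rng = np.random.default_rng(0)
currents = rng.uniform(0.0, 3.0, size=(5, 200))
trains = np.stack([lif_spikes(c) for c in currents])
ensemble = coarse_grain(trains, k_time=10, spike_frac=0.3)
```

Preserving integrate-and-fire behavior across scales would then correspond to the ensemble train itself behaving like a (slower) threshold-and-reset unit, with the time re-scaling factor `k_time` defining the ensemble-time step.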