Hidden Hypergraphs, Error-Correcting Codes, and Critical Learning in Hopfield Networks

In 1943, McCulloch and Pitts introduced a discrete recurrent neural network as a model for computation in brains. The work inspired breakthroughs such as the first computer design and the theory of finite automata. We focus on learning in Hopfield networks, a special case with symmetric weights and fixed-point attractor dynamics. Specifically, we explore minimum energy flow (MEF) as a scalable convex objective for determining network parameters. We catalog various properties of MEF, such as biological plausibility, and then compare it to classical approaches in the theory of learning. Trained Hopfield networks can perform unsupervised clustering and define novel error-correcting coding schemes. They also efficiently find hidden structures (cliques) in graph theory. We extend this known connection from graphs to hypergraphs and discover n-node networks with robust storage of 2^{Ω(n^(1−ε))} memories for any ε > 0. In the case of graphs, we also determine a critical ratio of training samples at which networks generalize completely.
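The abstract centers on fixed-point attractor dynamics and their use for error correction. As background only (this is the classical Hebbian outer-product rule, not the article's minimum-energy-flow objective), a minimal sketch of pattern storage and recall in a Hopfield network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Store p random binary (+/-1) patterns in an n-node Hopfield network
# via the classical Hebbian outer-product rule.
n, p = 64, 3
patterns = rng.choice([-1, 1], size=(p, n))
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)  # symmetric weights, no self-connections

def recall(x, sweeps=20):
    """Asynchronous sign updates; each flip can only lower the energy
    E(x) = -x^T W x / 2, so the dynamics settle at a fixed point."""
    x = x.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

# Corrupt a stored memory in 10 coordinates, then let the network
# clean it up -- i.e., act as a simple error-correcting decoder.
noisy = patterns[0].copy()
flips = rng.choice(n, size=10, replace=False)
noisy[flips] *= -1
recovered = recall(noisy)
print("bit errors after recall:", int((recovered != patterns[0]).sum()))
```

With only a few stored patterns the network typically converges back to the corrupted memory; Hebbian capacity, however, is linear in n, in contrast to the 2^{Ω(n^(1−ε))} storage the article obtains for hypergraph-structured networks.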

Bibliographic Details
Main Authors: Christopher Hillar, Tenzin Chan, Rachel Taubman, David Rolnick
Format: article
Language: EN
Published: MDPI AG, 2021
Published in: Entropy, Vol 23, Iss 11, 1494 (2021)
DOI: 10.3390/e23111494
ISSN: 1099-4300
Subjects: Hopfield networks; clustering; error-correcting codes; exponential memory; hidden graph; neuroscience; Science (Q); Astrophysics (QB460-466); Physics (QC1-999)
Online Access: https://doaj.org/article/530fe3130c4548da95e254bcc23a234c
Full Text: https://www.mdpi.com/1099-4300/23/11/1494