Hidden Hypergraphs, Error-Correcting Codes, and Critical Learning in Hopfield Networks
In 1943, McCulloch and Pitts introduced a discrete recurrent neural network as a model for computation in brains. The work inspired breakthroughs such as the first computer design and the theory of finite automata. We focus on learning in Hopfield networks, a special case with symmetric weights and...
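The abstract is truncated here, but as a rough illustration of the model class it names, the following is a minimal sketch of a Hopfield network with symmetric weights: Hebbian (outer-product) storage of ±1 patterns and asynchronous sign updates. The function names and the tiny example are illustrative assumptions, not taken from the article itself.

```python
import numpy as np

def train_hebbian(patterns):
    """Store ±1 patterns via the Hebbian outer-product rule.

    Produces a symmetric weight matrix with zero diagonal,
    the defining structure of a Hopfield network.
    """
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, state, steps=100, seed=0):
    """Asynchronous dynamics: repeatedly pick a random neuron and
    set it to the sign of its local field, which never increases
    the network's energy."""
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s
```

With a single stored pattern, recall from a one-bit-corrupted copy converges back to the stored pattern, since every neuron's local field points toward it.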
Main Authors: Christopher Hillar, Tenzin Chan, Rachel Taubman, David Rolnick
Format: Article
Language: English
Published: MDPI AG, 2021
Online Access: https://doaj.org/article/530fe3130c4548da95e254bcc23a234c
Similar Items
- Hypergraph reconstruction from network data, by Jean-Gabriel Young, et al. (2021)
- Phase transitions and stability of dynamical processes on hypergraphs, by Guilherme Ferraz de Arruda, et al. (2021)
- Node and edge nonlinear eigenvector centrality for hypergraphs, by Francesco Tudisco, et al. (2021)
- Detecting informative higher-order interactions in statistically validated hypergraphs, by Federico Musciotto, et al. (2021)
- Publisher Correction: Node and edge nonlinear eigenvector centrality for hypergraphs, by Francesco Tudisco, et al. (2021)