Flexible kernel memory.

This paper introduces a new model of associative memory, capable of storing both binary and continuous-valued inputs. Based on kernel theory, the memory model is, on the one hand, a generalization of Radial Basis Function networks and, on the other, analogous to a Hopfield network in feature space. Attractors can be added, deleted, and updated on-line, without harming existing memories, and the number of attractors is independent of the input dimension. Input vectors need not adhere to a fixed or bounded dimensionality; dimensionality can increase or decrease without relearning previous memories. A memory consolidation process enables the network to generalize concepts and form clusters of input data, outperforming many unsupervised clustering techniques; this process is demonstrated on handwritten digits from MNIST. Another process, reminiscent of memory reconsolidation, is introduced, in which existing memories are refreshed and tuned with new inputs; this process is demonstrated on a series of morphed faces.
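To make the idea of kernel-based attractor recall concrete, here is a minimal illustrative sketch, not the paper's actual model: stored patterns act as attractors, and a noisy query is pulled toward the nearest one by iterating a Gaussian (RBF) kernel-weighted average. All names (`rbf_kernel`, `recall`, `gamma`) are hypothetical, and the update rule is a simplified stand-in for the dynamics described in the article.

```python
import numpy as np

def rbf_kernel(x, X, gamma=2.0):
    # Gaussian (RBF) kernel between query x and each stored pattern (row of X)
    d2 = np.sum((X - x) ** 2, axis=1)
    return np.exp(-gamma * d2)

def recall(x, X, gamma=2.0, steps=50):
    # Iterated kernel-weighted average: the query is repeatedly pulled
    # toward nearby stored patterns, which behave as attractors.
    for _ in range(steps):
        k = rbf_kernel(x, X, gamma)
        x = k @ X / k.sum()
    return x

# Stored memories (attractors). Note their number is independent of the
# input dimension, and a new row can be appended without retraining the rest.
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

noisy = np.array([0.9, 0.1, -0.05])  # corrupted version of the first memory
restored = recall(noisy, X)
```

Running this pulls `noisy` close to the first stored pattern; the residual offset toward the other memories shrinks as `gamma` grows, which is the usual sharpness/capacity trade-off for RBF-style memories.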

Full description

Saved in:
Bibliographic Details
Main Authors: Dimitri Nowicki, Hava Siegelmann
Format: article
Language: EN
Published: Public Library of Science (PLoS) 2010
Subjects:
Medicine (R)
Science (Q)
Online Access: https://doaj.org/article/1df67ecceedb4f71842f5a9388e702b1
id oai:doaj.org-article:1df67ecceedb4f71842f5a9388e702b1
record_format dspace
spelling oai:doaj.org-article:1df67ecceedb4f71842f5a9388e702b1 2021-12-02T20:21:00Z
title: Flexible kernel memory.
issn: 1932-6203
doi: 10.1371/journal.pone.0010955
url: https://doaj.org/article/1df67ecceedb4f71842f5a9388e702b1
date: 2010-06-01
fulltext: https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/20552013/pdf/?tool=EBI
journal toc: https://doaj.org/toc/1932-6203
authors: Dimitri Nowicki; Hava Siegelmann
publisher: Public Library of Science (PLoS)
format: article
topics: Medicine (R); Science (Q)
language: EN
citation: PLoS ONE, Vol 5, Iss 6, p e10955 (2010)
institution DOAJ
collection DOAJ
language EN
topic Medicine
R
Science
Q
spellingShingle Medicine
R
Science
Q
Dimitri Nowicki
Hava Siegelmann
Flexible kernel memory.
description This paper introduces a new model of associative memory, capable of storing both binary and continuous-valued inputs. Based on kernel theory, the memory model is, on the one hand, a generalization of Radial Basis Function networks and, on the other, analogous to a Hopfield network in feature space. Attractors can be added, deleted, and updated on-line, without harming existing memories, and the number of attractors is independent of the input dimension. Input vectors need not adhere to a fixed or bounded dimensionality; dimensionality can increase or decrease without relearning previous memories. A memory consolidation process enables the network to generalize concepts and form clusters of input data, outperforming many unsupervised clustering techniques; this process is demonstrated on handwritten digits from MNIST. Another process, reminiscent of memory reconsolidation, is introduced, in which existing memories are refreshed and tuned with new inputs; this process is demonstrated on a series of morphed faces.
format article
author Dimitri Nowicki
Hava Siegelmann
author_facet Dimitri Nowicki
Hava Siegelmann
author_sort Dimitri Nowicki
title Flexible kernel memory.
title_short Flexible kernel memory.
title_full Flexible kernel memory.
title_fullStr Flexible kernel memory.
title_full_unstemmed Flexible kernel memory.
title_sort flexible kernel memory.
publisher Public Library of Science (PLoS)
publishDate 2010
url https://doaj.org/article/1df67ecceedb4f71842f5a9388e702b1
work_keys_str_mv AT dimitrinowicki flexiblekernelmemory
AT havasiegelmann flexiblekernelmemory
_version_ 1718374152433303552