Synaptic metaplasticity in binarized neural networks
Deep neural networks typically forget previously learned tasks rapidly while training on new ones. Laborieux et al. propose a method for training binarized neural networks, inspired by neuronal metaplasticity, that mitigates catastrophic forgetting and is relevant for neuromorphic applications.
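To illustrate the idea behind the summary, the following is a minimal sketch of a metaplasticity-style update for a binarized weight: each binary weight is backed by a real-valued hidden weight, and updates that would push the hidden weight toward zero (i.e. toward flipping its sign) are attenuated the more that weight has been consolidated. The attenuation function and the hyperparameter `m` below are illustrative assumptions, not necessarily the authors' exact formulation.

```python
import numpy as np

def metaplastic_update(w_hidden, grad, lr=0.1, m=1.0):
    """Update the real-valued hidden weights behind binary weights.

    Updates moving a hidden weight toward zero are scaled down by a
    factor that shrinks as the hidden weight grows in magnitude, so
    strongly consolidated weights resist sign flips (sketch only).
    """
    delta = -lr * grad
    # An update moves the hidden weight toward zero when its sign
    # opposes the hidden weight's sign.
    toward_zero = np.sign(delta) != np.sign(w_hidden)
    # Attenuation factor in (0, 1]: close to 1 for small |w_hidden|,
    # close to 0 for large |w_hidden| (assumed functional form).
    f_meta = 1.0 - np.tanh(m * np.abs(w_hidden)) ** 2
    delta = np.where(toward_zero, f_meta * delta, delta)
    return w_hidden + delta

def binarize(w_hidden):
    # The binary weight actually used in the forward pass.
    return np.where(w_hidden >= 0, 1.0, -1.0)
```

With this rule, a hidden weight at 2.0 barely moves under a gradient pushing it toward zero, while a hidden weight at 0.1 moves almost freely, which is the mechanism by which old tasks are protected while new ones can still be learned.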
Main authors: Axel Laborieux, Maxence Ernoult, Tifenn Hirtzlin, Damien Querlioz
Format: Article
Language: English
Published: Nature Portfolio, 2021
Online access: https://doaj.org/article/e2b90fdc25c546258715984597f47c48
Similar items
- Binarized Neural Network with Silicon Nanosheet Synaptic Transistors for Supervised Pattern Classification
  by: Sungho Kim, et al.
  Published: (2019)
- Impact of Synaptic Device Variations on Classification Accuracy in a Binarized Neural Network
  by: Sungho Kim, et al.
  Published: (2019)
- Neural-like computing with populations of superparamagnetic basis functions
  by: Alice Mizrahi, et al.
  Published: (2018)
- An adiabatic method to train binarized artificial neural networks
  by: Yuansheng Zhao, et al.
  Published: (2021)
- Differential chloride homeostasis in the spinal dorsal horn locally shapes synaptic metaplasticity and modality-specific sensitization
  by: Francesco Ferrini, et al.
  Published: (2020)