Event-based backpropagation can compute exact gradients for spiking neural networks
Abstract: Spiking neural networks combine analog computation with event-based communication using discrete spikes. While the impressive advances of deep learning are enabled by training non-spiking artificial neural networks using the backpropagation algorithm, applying this algorithm to spiking networks was previously hindered by the existence of discrete spike events and discontinuities. For the first time, this work derives the backpropagation algorithm for a continuous-time spiking neural network and a general loss function by applying the adjoint method together with the proper partial derivative jumps, allowing for backpropagation through discrete spike events without approximations. This algorithm, EventProp, backpropagates errors at spike times in order to compute the exact gradient in an event-based, temporally and spatially sparse fashion. We use gradients computed via EventProp to train networks on the Yin-Yang and MNIST datasets using either a spike time or voltage based loss function and report competitive performance. Our work supports the rigorous study of gradient-based learning algorithms in spiking neural networks and provides insights toward their implementation in novel brain-inspired hardware.
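The abstract's core claim is that the adjoint method, applied with the correct partial-derivative jumps at threshold crossings, yields exact gradients despite the discrete spikes. For orientation, here is a schematic LaTeX sketch of that structure for a single leaky integrate-and-fire membrane; all notation ($V_j$, $\lambda_j$, $\tau$, $w_{ji}$, $\vartheta$) is illustrative and chosen here, not quoted from the paper, and the jump term is given only up to its qualitative $1/\dot V$ form.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Schematic sketch of adjoint-method backpropagation through spike
% events (hypothetical notation, not the paper's exact equations).

% Forward LIF dynamics between spikes; a spike is emitted when the
% membrane potential V_j reaches the threshold \vartheta:
\begin{equation}
  \tau \,\dot V_j(t) = -V_j(t) + \sum_i w_{ji}\, s_i(t),
  \qquad V_j\big(t^k\big) = \vartheta .
\end{equation}

% Adjoint dynamics, integrated backward from t = T to t = 0; the
% loss density \ell may depend on V_j (voltage-based loss) or only
% on the spike times (spike-time loss):
\begin{equation}
  \tau \,\dot\lambda_j(t) = -\lambda_j(t) - \frac{\partial \ell}{\partial V_j}(t),
\end{equation}

% with a jump in \lambda_j at each spike time t^k whose magnitude
% involves the inverse slope of the membrane at the threshold
% crossing, schematically \Delta\lambda_j \propto 1/\dot V_j(t^k).

% The weight gradient then accumulates only at the (sparse) times
% when presynaptic neuron i spikes -- the event-based property:
\begin{equation}
  \frac{\partial \mathcal{L}}{\partial w_{ji}}
  \;\propto\; \sum_{k \,:\, i \text{ spikes at } t^k} \lambda_j\big(t^k\big).
\end{equation}
\end{document}
```

The temporal and spatial sparsity noted in the abstract is visible in the last equation: the backward pass touches a weight only at spike events, which is what makes an implementation on event-driven neuromorphic hardware attractive.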
Main Authors: Timo C. Wunderlich, Christian Pehle
Format: article
Language: EN
Published: Nature Portfolio, 2021
Subjects: Medicine (R); Science (Q)
Online Access: https://doaj.org/article/b122f4ad7eb4420e80716f2cff95fb3f
id: oai:doaj.org-article:b122f4ad7eb4420e80716f2cff95fb3f
record_format: dspace
doi: 10.1038/s41598-021-91786-z
issn: 2045-2322
published: 2021-06-01
source: Scientific Reports, Vol 11, Iss 1, Pp 1-17 (2021)
fulltext: https://doi.org/10.1038/s41598-021-91786-z
institution: DOAJ
collection: DOAJ
language: EN
topic: Medicine (R); Science (Q)
format: article
author: Timo C. Wunderlich; Christian Pehle
author_sort: Timo C. Wunderlich
title: Event-based backpropagation can compute exact gradients for spiking neural networks
publisher: Nature Portfolio
publishDate: 2021
url: https://doaj.org/article/b122f4ad7eb4420e80716f2cff95fb3f