STT-BSNN: An In-Memory Deep Binary Spiking Neural Network Based on STT-MRAM
This paper proposes an in-memory binary spiking neural network (BSNN) based on spin-transfer-torque magnetoresistive RAM (STT-MRAM). We propose residual BSNN learning using a surrogate gradient that shortens the time steps in the BSNN while maintaining sufficient accuracy. At the circuit level, presynaptic spikes are fed to memory units through differential bit lines (BLs), while binarized weights are stored in a subarray of nonvolatile STT-MRAM. When the common inputs are fed through BLs, vector-to-matrix multiplication can be performed in a single memory sensing phase, hence achieving massive parallelism with low power and low latency. We further introduce the concept of a dynamic threshold to reduce the implementation complexity of synapses and neuron circuitry. This adjustable threshold also permits a nonlinear batch normalization (BN) function to be incorporated into the integrate-and-fire (IF) neuron circuit. The circuitry greatly improves the overall performance and enables high regularity in circuit layouts. Our proposed netlist circuits are built on a 65-nm CMOS with a fitted magnetic tunnel junction (MTJ) model for performance evaluation. The hardware/software co-simulation results indicate that the proposed design can deliver a performance of 176.6 TOPS/W for an in-memory computing (IMC) subarray size of $1\times 288$. The classification accuracy reaches 97.92% (83.85%) on the MNIST (CIFAR-10) dataset. The impacts of the device non-idealities and process variations are also thoroughly covered in the analysis.
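To make the abstract's training idea concrete, here is a minimal PyTorch-style sketch of an integrate-and-fire (IF) neuron trained with a surrogate gradient, with batch normalization folded into a per-neuron dynamic threshold. Everything below (the fast-sigmoid surrogate, the soft reset, and the names `gamma`, `beta`, `mu`, `sigma`, `theta`) is an illustrative assumption, not the authors' exact formulation.

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_theta):
        ctx.save_for_backward(v_minus_theta)
        return (v_minus_theta >= 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        alpha = 2.0  # sharpness of the fast-sigmoid surrogate (illustrative choice)
        return grad_output / (1.0 + alpha * x.abs()) ** 2


def if_step(v, psum, gamma, beta, mu, sigma, theta=1.0):
    """One IF time step with BN folded into a dynamic threshold (a sketch, not the paper's netlist).

    v, psum: tensors holding the membrane potential and the accumulated weighted input spikes.
    """
    u = v + psum                                      # integrate the weighted input spikes
    dyn_theta = mu + sigma * (theta - beta) / gamma   # BN-folded ("dynamic") threshold, gamma > 0 assumed
    spike = SpikeFn.apply(u - dyn_theta)              # fire when the membrane clears the threshold
    v_next = u - spike * dyn_theta                    # soft reset by subtracting the threshold
    return spike, v_next
```

Firing on $\gamma(u-\mu)/\sigma + \beta \ge \theta$ is equivalent (for $\gamma > 0$) to $u \ge \mu + \sigma(\theta-\beta)/\gamma$, so the nonlinear BN step collapses into an adjusted threshold comparison at inference time, consistent with the dynamic-threshold idea described in the abstract.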
Saved in:
Main authors: | Van-Tinh Nguyen, Quang-Kien Trinh, Renyuan Zhang, Yasuhiko Nakashima |
---|---|
Format: | article |
Language: | EN |
Published: | IEEE, 2021 |
Subjects: | Binary spiking neural network; emerging memory technology; in-memory computing; neuromorphic computing; process variation; STT-MRAM; Electrical engineering. Electronics. Nuclear engineering (TK1-9971) |
Online access: | https://doaj.org/article/fdc82db0a0f44b1380ee2886de2abf87 |
id |
oai:doaj.org-article:fdc82db0a0f44b1380ee2886de2abf87 |
---|---|
record_format |
dspace |
spelling |
oai:doaj.org-article:fdc82db0a0f44b1380ee2886de2abf87 (record updated 2021-11-17); ISSN 2169-3536; DOI 10.1109/ACCESS.2021.3125685; published 2021-01-01; full text: https://ieeexplore.ieee.org/document/9605611/; DOAJ record: https://doaj.org/article/fdc82db0a0f44b1380ee2886de2abf87; IEEE Access, Vol 9, Pp 151373-151385 (2021) |
institution |
DOAJ |
collection |
DOAJ |
language |
EN |
topic |
Binary spiking neural network; emerging memory technology; in-memory computing; neuromorphic computing; process variation; STT-MRAM; Electrical engineering. Electronics. Nuclear engineering; TK1-9971 |
description |
This paper proposes an in-memory binary spiking neural network (BSNN) based on spin-transfer-torque magnetoresistive RAM (STT-MRAM). We propose residual BSNN learning using a surrogate gradient that shortens the time steps in the BSNN while maintaining sufficient accuracy. At the circuit level, presynaptic spikes are fed to memory units through differential bit lines (BLs), while binarized weights are stored in a subarray of nonvolatile STT-MRAM. When the common inputs are fed through BLs, vector-to-matrix multiplication can be performed in a single memory sensing phase, hence achieving massive parallelism with low power and low latency. We further introduce the concept of a dynamic threshold to reduce the implementation complexity of synapses and neuron circuitry. This adjustable threshold also permits a nonlinear batch normalization (BN) function to be incorporated into the integrate-and-fire (IF) neuron circuit. The circuitry greatly improves the overall performance and enables high regularity in circuit layouts. Our proposed netlist circuits are built on a 65-nm CMOS with a fitted magnetic tunnel junction (MTJ) model for performance evaluation. The hardware/software co-simulation results indicate that the proposed design can deliver a performance of 176.6 TOPS/W for an in-memory computing (IMC) subarray size of $1\times 288$. The classification accuracy reaches 97.92% (83.85%) on the MNIST (CIFAR-10) dataset. The impacts of the device non-idealities and process variations are also thoroughly covered in the analysis. |
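As a complement to the description above, the following NumPy snippet is a purely behavioral sketch of what one sensing phase of the described 1 × 288 IMC subarray computes: binary presynaptic spikes drive the bit lines, the stored binarized weights contribute ±1, and the accumulated sum is compared against a dynamic threshold. The 0/1 spike and ±1 weight encodings and the threshold value are assumptions for illustration; this is not a model of the MTJ cells or the sense amplifier.

```python
import numpy as np

rng = np.random.default_rng(0)
spikes = rng.integers(0, 2, size=288)          # presynaptic spikes on the bit lines, in {0, 1}
weights = rng.choice([-1, +1], size=288)       # binarized weights stored in the STT-MRAM subarray

# One "sensing phase": all 288 spike-weight products are accumulated at once.
psum = int(spikes @ weights)

dynamic_threshold = 4                          # illustrative BN-folded firing threshold
output_spike = int(psum >= dynamic_threshold)  # the IF neuron fires if the sum clears the threshold
print(psum, output_spike)
```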
format |
article |
author |
Van-Tinh Nguyen, Quang-Kien Trinh, Renyuan Zhang, Yasuhiko Nakashima |
author_sort |
Van-Tinh Nguyen |
title |
STT-BSNN: An In-Memory Deep Binary Spiking Neural Network Based on STT-MRAM |
publisher |
IEEE |
publishDate |
2021 |
url |
https://doaj.org/article/fdc82db0a0f44b1380ee2886de2abf87 |