Neural Architecture Search and Hardware Accelerator Co-Search: A Survey
Deep neural networks (DNNs) now dominate the most challenging applications of machine learning. As DNNs can have complex architectures with millions of trainable parameters (the so-called weights), their design and training are difficult even for highly qualified experts. In order to reduce...
Main author: Lukas Sekanina
Format: article
Language: English
Published: IEEE, 2021
Online access: https://doaj.org/article/c44f1794327f4d478272a0619e4c1a9d
Similar documents
- Multi-Branch Neural Architecture Search for Lightweight Image Super-Resolution
  by: Joon Young Ahn, et al.
  Published: (2021)
- Developing Language-Specific Models Using a Neural Architecture Search
  by: YongSuk Yoo, et al.
  Published: (2021)
- Power Efficient Design of High-Performance Convolutional Neural Networks Hardware Accelerator on FPGA: A Case Study With GoogLeNet
  by: Ahmed J. Abd El-Maksoud, et al.
  Published: (2021)
- Utilization of Unsigned Inputs for NAND Flash-Based Parallel and High-Density Synaptic Architecture in Binary Neural Networks
  by: Sung-Tae Lee, et al.
  Published: (2021)
- Sparse and dense matrix multiplication hardware for heterogeneous multi-precision neural networks
  by: Jose Nunez-Yanez, et al.
  Published: (2021)