Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science
Artificial neural networks are artificial-intelligence computing methods inspired by biological neural networks. Here, the authors propose a method to design neural networks as sparse scale-free networks, leading to a reduction in the computational time required for training and inference.
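The abstract describes training with adaptive sparse connectivity, where a fixed budget of connections is periodically pruned and regrown. The sketch below is a minimal, illustrative NumPy version of that prune-and-regrow idea for a single layer; it is not the authors' reference implementation, and the layer sizes, sparsity level, and regrowth fraction are assumptions chosen for the example.

```python
# Minimal illustrative sketch (not the authors' reference code) of adaptive
# sparse connectivity for one fully connected layer: keep a fixed number of
# connections, prune the weakest each epoch, and regrow the same number at
# random positions. Sizes and fractions below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 784, 1000      # assumed layer dimensions
density = 0.05               # assumed fraction of active connections
n_active = int(density * n_in * n_out)

# Sparse initialization: activate a random subset of the possible connections.
mask = np.zeros((n_in, n_out), dtype=bool)
idx = rng.choice(n_in * n_out, size=n_active, replace=False)
mask.flat[idx] = True
weights = np.where(mask, rng.normal(0, 0.1, size=mask.shape), 0.0)

def evolve_connectivity(weights, mask, zeta=0.3):
    """Prune the zeta fraction of weakest active weights, then regrow the
    same number of connections at random inactive positions."""
    active = np.flatnonzero(mask)
    n_prune = int(zeta * active.size)

    # Prune: remove the active connections with the smallest magnitude.
    order = np.argsort(np.abs(weights.flat[active]))
    pruned = active[order[:n_prune]]
    mask.flat[pruned] = False
    weights.flat[pruned] = 0.0

    # Regrow: add the same number of new connections at random free slots.
    free = np.flatnonzero(~mask)
    grown = rng.choice(free, size=n_prune, replace=False)
    mask.flat[grown] = True
    weights.flat[grown] = rng.normal(0, 0.1, size=n_prune)
    return weights, mask

# One connectivity-evolution step, as would run between training epochs.
weights, mask = evolve_connectivity(weights, mask)
print(mask.sum(), "active connections")  # total stays at n_active
```

Because only a small fraction of the weight matrix is ever active, both training and inference touch far fewer parameters than a dense layer of the same dimensions, which is the source of the computational savings the abstract refers to.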
Saved in:
Main Authors: Decebal Constantin Mocanu, Elena Mocanu, Peter Stone, Phuong H. Nguyen, Madeleine Gibescu, Antonio Liotta
Format: Article
Language: English
Published: Nature Portfolio, 2018
Online Access: https://doaj.org/article/50ee9604a82c41788aeb102570ad016f
Similar Items
- Decentralized dynamic understanding of hidden relations in complex networks
  by: Decebal Constantin Mocanu, et al.
  Published: (2018)
- Author Correction: Decentralized dynamic understanding of hidden relations in complex networks
  by: Decebal Constantin Mocanu, et al.
  Published: (2018)
- Dendritic normalisation improves learning in sparsely connected artificial neural networks.
  by: Alex D Bird, et al.
  Published: (2021)
- Brain-inspired replay for continual learning with artificial neural networks
  by: Gido M. van de Ven, et al.
  Published: (2020)
- Training Optimization for Artificial Neural Networks
  by: Primitivo Toribio Luna, et al.
  Published: (2010)