SPACE: Structured Compression and Sharing of Representational Space for Continual Learning
Humans learn incrementally from sequential experiences throughout their lives, which has proven hard to emulate in artificial neural networks. Incrementally learning tasks causes neural networks to overwrite relevant information learned about older tasks, resulting in 'Catastrophic Forgetting'.
Main Authors: Gobinda Saha, Isha Garg, Aayush Ankit, Kaushik Roy
Format: article
Language: EN
Published: IEEE, 2021
Online Access: https://doaj.org/article/46706ff8450743849542b1376c3fdb24
Similar Items

- T-DFNN: An Incremental Learning Algorithm for Intrusion Detection Systems
  by: Mahendra Data, et al.
  Published: (2021)
- A divided and prioritized experience replay approach for streaming regression
  by: Mikkel Leite Arnø, et al.
  Published: (2021)
- Combining Accuracy and Plasticity in Convolutional Neural Networks Based on Resistive Memory Arrays for Autonomous Learning
  by: Stefano Bianchi, et al.
  Published: (2021)
- A-iLearn: An adaptive incremental learning model for spoof fingerprint detection
  by: Shivang Agarwal, et al.
  Published: (2022)
- Adaptive Data Compression for Classification Problems
  by: Farhad Pourkamali-Anaraki, et al.
  Published: (2021)