Multitask learning over shared subspaces.

This paper uses constructs from machine learning to define pairs of learning tasks that either shared or did not share a common subspace. Human subjects then learnt these tasks using a feedback-based approach and we hypothesised that learning would be boosted for shared subspaces. Our findings broadly supported this hypothesis, with either better performance on the second task if it shared the same subspace as the first, or positive correlations over task performance for shared subspaces. These empirical findings were compared to the behaviour of a neural network model trained using sequential Bayesian learning, and human performance was found to be consistent with a minimal capacity variant of this model. Networks with an increased representational capacity, and networks without Bayesian learning, did not show these transfer effects. We propose that the concept of shared subspaces provides a useful framework for the experimental study of human multitask and transfer learning.

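The shared-subspace construction in the abstract can be made concrete with a short sketch. The Python below is an illustrative assumption, not the authors' code (names such as make_task_pair and the choice of binary labels and dimensions are hypothetical): two classification tasks whose labels depend on a one-dimensional projection of the stimulus features, reading out either the same direction (shared) or an orthogonal one (distinct).

```python
# Hypothetical sketch of "task pairs over shared vs. distinct subspaces".
import numpy as np

def make_task_pair(n_trials=200, n_features=4, shared=True, seed=0):
    rng = np.random.default_rng(seed)
    # Random orthonormal directions span the candidate 1-D subspaces.
    q, _ = np.linalg.qr(rng.standard_normal((n_features, n_features)))
    w1 = q[:, 0]                          # subspace read out by task 1
    w2 = q[:, 0] if shared else q[:, 1]   # same, or orthogonal, for task 2
    x1 = rng.standard_normal((n_trials, n_features))
    x2 = rng.standard_normal((n_trials, n_features))
    # Category label = sign of the stimulus projected onto the task subspace.
    y1 = (x1 @ w1 > 0).astype(int)
    y2 = (x2 @ w2 > 0).astype(int)
    return (x1, y1), (x2, y2)

task_a, task_b = make_task_pair(shared=True)
```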

Bibliographic Details
Main Authors: Nicholas Menghi, Kemal Kacar, Will Penny
Format: article
Language: EN
Published: Public Library of Science (PLoS) 2021
Subjects: Biology (General); QH301-705.5
Online Access: https://doaj.org/article/d106e5c4366f4f8fa9d3caf8aeb511cf
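The model comparison in the abstract rests on sequential Bayesian learning: the posterior over weights fitted on the first task becomes the prior when fitting the second. A minimal sketch of that idea, assuming a logistic-regression "network" with a diagonal Laplace posterior (the paper's actual model is a neural network; fit_map and all names here are hypothetical):

```python
# Hypothetical sketch of sequential Bayesian learning: posterior -> prior.
import numpy as np

def fit_map(x, y, prior_mean, prior_prec, lr=0.1, steps=500):
    """MAP logistic regression under a diagonal Gaussian prior."""
    w = prior_mean.astype(float).copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(x @ w)))            # predicted probabilities
        grad = x.T @ (y - p) - prior_prec * (w - prior_mean)
        w += lr * grad / len(y)
    # Diagonal Laplace approximation: posterior precision = prior precision
    # plus the diagonal of the observed Fisher information at the MAP weights.
    post_prec = prior_prec + ((p * (1 - p))[:, None] * x ** 2).sum(axis=0)
    return w, post_prec

rng = np.random.default_rng(0)
x1 = rng.standard_normal((200, 4))
x2 = rng.standard_normal((200, 4))
w_true = np.array([1.0, -1.0, 0.0, 0.0])              # both tasks share this direction
y1 = (x1 @ w_true > 0).astype(float)
y2 = (x2 @ w_true > 0).astype(float)

# Task 1 starts from a broad prior; task 2 starts from task 1's posterior.
w_a, prec_a = fit_map(x1, y1, prior_mean=np.zeros(4), prior_prec=np.ones(4))
w_b, prec_b = fit_map(x2, y2, prior_mean=w_a, prior_prec=prec_a)
```

Under this scheme, transfer falls out of the prior: when the second task shares the first task's subspace, the inherited posterior already points in the right direction, consistent with the boost the abstract reports.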
id oai:doaj.org-article:d106e5c4366f4f8fa9d3caf8aeb511cf
record_format dspace
spelling oai:doaj.org-article:d106e5c4366f4f8fa9d3caf8aeb511cf 2021-12-02T19:57:25Z
title Multitask learning over shared subspaces.
issn 1553-734X; 1553-7358
doi 10.1371/journal.pcbi.1009092
url https://doaj.org/article/d106e5c4366f4f8fa9d3caf8aeb511cf
fulltext https://doi.org/10.1371/journal.pcbi.1009092
toc https://doaj.org/toc/1553-734X; https://doaj.org/toc/1553-7358
publishDate 2021-07-01
author Nicholas Menghi; Kemal Kacar; Will Penny
publisher Public Library of Science (PLoS)
format article
topic Biology (General); QH301-705.5
language EN
source PLoS Computational Biology, Vol 17, Iss 7, p e1009092 (2021)
institution DOAJ
collection DOAJ
language EN
topic Biology (General); QH301-705.5
spellingShingle Biology (General); QH301-705.5; Nicholas Menghi; Kemal Kacar; Will Penny; Multitask learning over shared subspaces.
description This paper uses constructs from machine learning to define pairs of learning tasks that either shared or did not share a common subspace. Human subjects then learnt these tasks using a feedback-based approach and we hypothesised that learning would be boosted for shared subspaces. Our findings broadly supported this hypothesis, with either better performance on the second task if it shared the same subspace as the first, or positive correlations over task performance for shared subspaces. These empirical findings were compared to the behaviour of a neural network model trained using sequential Bayesian learning, and human performance was found to be consistent with a minimal capacity variant of this model. Networks with an increased representational capacity, and networks without Bayesian learning, did not show these transfer effects. We propose that the concept of shared subspaces provides a useful framework for the experimental study of human multitask and transfer learning.
format article
author Nicholas Menghi; Kemal Kacar; Will Penny
author_facet Nicholas Menghi; Kemal Kacar; Will Penny
author_sort Nicholas Menghi
title Multitask learning over shared subspaces.
title_short Multitask learning over shared subspaces.
title_full Multitask learning over shared subspaces.
title_fullStr Multitask learning over shared subspaces.
title_full_unstemmed Multitask learning over shared subspaces.
title_sort multitask learning over shared subspaces.
publisher Public Library of Science (PLoS)
publishDate 2021
url https://doaj.org/article/d106e5c4366f4f8fa9d3caf8aeb511cf
work_keys_str_mv AT nicholasmenghi multitasklearningoversharedsubspaces
AT kemalkacar multitasklearningoversharedsubspaces
AT willpenny multitasklearningoversharedsubspaces
_version_ 1718375870013374464