Crowdsourced Evaluation of Robot Programming Environments: Methodology and Application

Industrial robot programming tools increasingly rely on graphical interfaces, which aim at rendering the programming task more accessible to a wide variety of users. The usability of such tools is currently being evaluated in controlled environments, such as laboratories or companies, in which a group of participants is asked to carry out several tasks using the tool and then fill out a standardized questionnaire. In this context, this paper proposes and evaluates an alternative evaluation methodology, which leverages online crowdsourcing platforms to produce the same results as face-to-face evaluations. We applied the proposed framework in the evaluation of a web-based industrial robot programming tool called <i>Assembly</i>. Our results suggest that crowdsourcing facilitates a cost-effective, result-oriented, and reusable methodology for performing user studies anonymously and online.

Full description

Saved in:
Bibliographic Details
Main Authors: Daria Piacun, Tudor B. Ionescu, Sebastian Schlund
Format: article
Language: EN
Published: MDPI AG 2021
Subjects:
T
Online Access: https://doaj.org/article/440542a8f55d41d6844bf133fc6d40d2
id oai:doaj.org-article:440542a8f55d41d6844bf133fc6d40d2
record_format dspace
spelling oai:doaj.org-article:440542a8f55d41d6844bf133fc6d40d2 2021-11-25T16:40:18Z
Title: Crowdsourced Evaluation of Robot Programming Environments: Methodology and Application
DOI: 10.3390/app112210903
ISSN: 2076-3417
Published: 2021-11-01
URLs: https://doaj.org/article/440542a8f55d41d6844bf133fc6d40d2 ; https://www.mdpi.com/2076-3417/11/22/10903 ; https://doaj.org/toc/2076-3417
Description: Industrial robot programming tools increasingly rely on graphical interfaces, which aim at rendering the programming task more accessible to a wide variety of users. The usability of such tools is currently being evaluated in controlled environments, such as laboratories or companies, in which a group of participants is asked to carry out several tasks using the tool and then fill out a standardized questionnaire. In this context, this paper proposes and evaluates an alternative evaluation methodology, which leverages online crowdsourcing platforms to produce the same results as face-to-face evaluations. We applied the proposed framework in the evaluation of a web-based industrial robot programming tool called <i>Assembly</i>. Our results suggest that crowdsourcing facilitates a cost-effective, result-oriented, and reusable methodology for performing user studies anonymously and online.
Authors: Daria Piacun; Tudor B. Ionescu; Sebastian Schlund
Publisher: MDPI AG
Subjects: robot programming; user interface evaluation; crowdsourcing; Technology (T); Engineering (General). Civil engineering (General) (TA1-2040); Biology (General) (QH301-705.5); Physics (QC1-999); Chemistry (QD1-999)
Language: EN
Source: Applied Sciences, Vol 11, Iss 10903, p 10903 (2021)
institution DOAJ
collection DOAJ
language EN
topic robot programming
user interface evaluation
crowdsourcing
Technology
T
Engineering (General). Civil engineering (General)
TA1-2040
Biology (General)
QH301-705.5
Physics
QC1-999
Chemistry
QD1-999
description Industrial robot programming tools increasingly rely on graphical interfaces, which aim at rendering the programming task more accessible to a wide variety of users. The usability of such tools is currently being evaluated in controlled environments, such as laboratories or companies, in which a group of participants is asked to carry out several tasks using the tool and then fill out a standardized questionnaire. In this context, this paper proposes and evaluates an alternative evaluation methodology, which leverages online crowdsourcing platforms to produce the same results as face-to-face evaluations. We applied the proposed framework in the evaluation of a web-based industrial robot programming tool called <i>Assembly</i>. Our results suggest that crowdsourcing facilitates a cost-effective, result-oriented, and reusable methodology for performing user studies anonymously and online.
format article
author Daria Piacun
Tudor B. Ionescu
Sebastian Schlund
author_sort Daria Piacun
title Crowdsourced Evaluation of Robot Programming Environments: Methodology and Application
publisher MDPI AG
publishDate 2021
url https://doaj.org/article/440542a8f55d41d6844bf133fc6d40d2
_version_ 1718413072631070720