Crowdsourced Evaluation of Robot Programming Environments: Methodology and Application
Format: article
Language: English
Published: MDPI AG, 2021
Online access: https://doaj.org/article/440542a8f55d41d6844bf133fc6d40d2
Abstract: Industrial robot programming tools increasingly rely on graphical interfaces, which aim to make the programming task more accessible to a wide variety of users. The usability of such tools is typically evaluated in controlled environments, such as laboratories or companies, in which a group of participants is asked to carry out several tasks using the tool and then fill out a standardized questionnaire. In this context, this paper proposes and evaluates an alternative evaluation methodology, which leverages online crowdsourcing platforms to produce the same results as face-to-face evaluations. We applied the proposed framework in the evaluation of a web-based industrial robot programming tool called Assembly. Our results suggest that crowdsourcing enables a cost-effective, result-oriented, and reusable methodology for performing user studies anonymously and online.
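As a concrete illustration of the scoring behind such a standardized usability questionnaire, the sketch below computes a System Usability Scale (SUS) score from one participant's responses. The abstract does not name the questionnaire used in the study, so SUS is an assumption here, chosen because it is a common instrument in usability evaluations.

```python
# Minimal sketch of System Usability Scale (SUS) scoring.
# NOTE: the abstract only mentions "a standardized questionnaire";
# SUS is assumed here purely for illustration.

def sus_score(responses: list[int]) -> float:
    """Compute a SUS score from ten 1-5 Likert responses.

    Odd-numbered items (1-indexed) contribute (response - 1);
    even-numbered items contribute (5 - response).
    The summed contributions are scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses, each in 1..5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-indexed: even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5


# Example: one (hypothetical) participant's responses to the ten SUS items.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```

In a crowdsourced setting such as the one the paper proposes, scores like this would be collected anonymously online from each participant and aggregated, rather than gathered in a face-to-face lab session.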