Thumbs up, thumbs down: non-verbal human-robot interaction through real-time EMG classification via inductive and supervised transductive transfer learning

Abstract

In this study, we present a transfer learning method for gesture classification via an inductive and supervised transductive approach with an electromyographic dataset gathered via the Myo armband. A ternary gesture classification problem is presented by the states ‘thumbs up’, ‘thumbs down’, and ‘relax’, in order to communicate in the affirmative or negative to a machine in a non-verbal fashion. Of the nine statistical learning paradigms benchmarked over 10-fold cross-validation (with three methods of feature selection), an ensemble of Random Forest and Support Vector Machine through voting achieves the best score of 91.74% with a rule-based feature selection method. When new subjects are considered, this machine learning approach fails to generalise to new data, and thus the processes of Inductive and Supervised Transductive Transfer Learning are introduced with a short calibration exercise (15 s). In these generalisation experiments, 5 s of data per class proves strongest for classification (versus one through seven seconds), yet reaches an accuracy of only 55%; however, when a short 5 s per-class calibration task is introduced via the suggested transfer method, a Random Forest can then classify unseen data from the calibrated subject at an accuracy of around 97%, outperforming the 83% accuracy achieved by the proprietary Myo system. Finally, a preliminary application is presented through social interaction with a humanoid Pepper robot, where the use of our approach and a most-common-class metaclassifier achieves 100% accuracy for all trials of a ‘20 Questions’ game.
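Although the record itself contains no code, the two core ideas in the abstract (a Random Forest + Support Vector Machine voting ensemble benchmarked with 10-fold cross-validation, and per-subject calibration by appending a short 5 s per-class recording from the new subject to the source training data) can be illustrated with a minimal scikit-learn sketch. The random feature matrices, feature count, sample counts, and hyperparameters below are placeholder assumptions, not the authors' actual dataset or configuration.

# Minimal sketch of the approach described in the abstract, using scikit-learn.
# The feature matrices are random placeholders standing in for statistical features
# extracted from the Myo armband's EMG channels; shapes and hyperparameters are
# illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
N_FEATURES = 40                               # assumed size of the extracted EMG feature vector

# --- Benchmark idea: RF + SVM voting ensemble scored with 10-fold cross-validation ---
X_source = rng.normal(size=(900, N_FEATURES))   # pooled source-subject feature vectors
y_source = rng.integers(0, 3, size=900)         # 0 = relax, 1 = thumbs up, 2 = thumbs down

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(kernel="rbf", probability=True, random_state=0)),
    ],
    voting="soft",
)
cv_scores = cross_val_score(ensemble, X_source, y_source, cv=10)
print(f"10-fold CV accuracy on source subjects: {cv_scores.mean():.3f}")

# --- Transfer idea: augment the source data with a short calibration set
#     (roughly 5 s per class from the new subject), then train a Random Forest
#     that classifies that subject's unseen data ---
X_calib = rng.normal(size=(15, N_FEATURES))     # placeholder calibration feature vectors
y_calib = np.repeat([0, 1, 2], 5)               # balanced labels for the three gestures

X_train = np.vstack([X_source, X_calib])
y_train = np.concatenate([y_source, y_calib])

subject_model = RandomForestClassifier(n_estimators=100, random_state=0)
subject_model.fit(X_train, y_train)

X_unseen = rng.normal(size=(30, N_FEATURES))    # later, unseen data from the calibrated subject
predictions = subject_model.predict(X_unseen)

# Most-common-class metaclassifier over a window of per-frame predictions,
# mirroring the decision rule used in the '20 Questions' Pepper demonstration.
final_gesture = np.bincount(predictions).argmax()
print("Predicted gesture for the window:", final_gesture)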

Publication DOI: https://doi.org/10.1007/s12652-020-01852-z
Additional Information: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Uncontrolled Keywords: Electromyography, Gesture classification, Human-robot interaction, Inductive transfer learning, Machine learning, Myo armband, Pepper robot, Supervised transductive transfer learning, Transfer learning, Computer Science (all)
Publication ISSN: 1868-5145
Last Modified: 22 Apr 2024 07:26
Date Deposited: 01 Apr 2020 13:10
Full Text Link:
Related URLs: http://www.scop ... tnerID=8YFLogxK (Scopus URL)
https://link.sp ... 652-020-01852-z (Publisher URL)
PURE Output Type: Article
Published Date: 2020-12
Published Online Date: 2020-03-07
Accepted Date: 2020-02-27
Authors: Kobylarz, Jhonatan
Bird, Jordan J. (ORCID Profile 0000-0002-9858-1231)
Faria, Diego R. (ORCID Profile 0000-0002-2771-1713)
Ribeiro, Eduardo Parente
Ekárt, Anikó (ORCID Profile 0000-0001-6967-5397)

Download

Version: Published Version

License: Creative Commons Attribution
