Enhancing Hand Interactions and Accessibility in Virtual Reality Environments for Users With Motor Disabilities: A Practical Case Study on VR-Shopping

Abstract

Over the past decade, Virtual Reality (VR) has achieved significant advances in both the quality and affordability of its devices, particularly VR headsets that offer an enhanced immersive experience at a reduced cost. Improvements extend beyond graphical fidelity to interactivity within virtual environments, notably through tracking systems that allow direct manipulation with the hands, free of external devices. However, these technologies are not fully adapted for use by individuals with motor disabilities affecting their arms and hands. This study addresses the ethical and moral obligation to make VR accessible to all users by proposing specific adaptations to manual interaction mechanisms. Focusing on VR e-commerce as a use case, which is anticipated to revolutionize the shopping experience in the coming years, the study explores a scenario in which users must navigate and manipulate virtual products in order to examine them and make purchase decisions. The experimental phase was conducted in a real-world setting at the Hospital Nacional de Parapléjicos in Toledo (HNPT), involving patients with spinal cord injuries and motor limitations. The results demonstrate that the adapted interactions not only enable users to perform tasks that are impossible with conventional mechanisms but also reduce the time and effort required to complete them. Specifically, users' task completion times improved by up to 91.6%, and the number of completed tasks increased substantially in comparison with unadapted interactions. We developed an Effort Degree (ED) formula, based on several measures of hand movement, which showed that adapted interactions required around 40% less effort for individuals with mobility issues. These findings underscore the potential of adapted VR technologies to significantly enhance the inclusivity and utility of emerging e-commerce platforms, providing equitable access to new forms of digital engagement.
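The record does not reproduce the ED formula itself. As a purely illustrative sketch, and not the authors' actual formula, an effort proxy derived from sampled hand positions might combine the total 3D path length travelled by the hand with its mean speed; the function name, weighting, and inputs below are all assumptions for illustration:

```python
import math


def effort_degree(samples, dt):
    """Illustrative effort proxy from evenly sampled (x, y, z) hand positions.

    Combines total path length with mean speed using an arbitrary 1:1
    weighting. This is NOT the ED formula from the paper, only a sketch
    of the kind of hand-movement aggregation such a measure could use.
    """
    if len(samples) < 2:
        return 0.0
    # Sum the Euclidean distances between consecutive hand positions.
    path = sum(math.dist(p0, p1) for p0, p1 in zip(samples, samples[1:]))
    duration = dt * (len(samples) - 1)
    mean_speed = path / duration
    return path + mean_speed


# Example: a hand moving 0.1 m along x every 0.1 s for four steps
trace = [(0.1 * i, 0.0, 0.0) for i in range(5)]
print(effort_degree(trace, 0.1))  # path 0.4 m + mean speed 1.0 m/s ≈ 1.4
```

A real measure would likely weight and normalize such terms per task, so that a lower score corresponds to less physical effort, matching the roughly 40% reduction reported for the adapted interactions.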

Publication DOI: https://doi.org/10.1109/access.2025.3549527
Divisions: College of Engineering & Physical Sciences
Aston University (General)
Funding Information: This work was supported in part by the Research Project, entitled Mixed Reality-Based Platform on 5G Infrastructure for Remote Support and Diagnosis Among Nursing, General Practitioners, and Medical Specialists (MRP-5G) under Grant ID SBPLY/23/180225/0001
Additional Information: Copyright © 2025 The Authors. This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/
Uncontrolled Keywords: Virtual reality, adapted hand interaction, motor limitations, accessibility, inclusivity, e-commerce
Publication ISSN: 2169-3536
Data Access Statement: All data collected during the experiment are publicly available in the GitHub repository: https://github.com/AIR-Research-Group-UCLM/VR-Hands-Interactions-Adapted. The repository is organized into two folders: one contains twelve subfolders, one per user, holding all the collected data; the other contains the Python code used to extract the information shown in the results section. The file https://github.com/AIR-Research-Group-UCLM/VR-Hands-Interactions-Adapted/blob/master/video_link.txt contains a Google Drive link to a video showing the adapted interactions used in our case study.
Last Modified: 31 Mar 2025 07:27
Date Deposited: 28 Mar 2025 17:02
Related URLs: https://ieeexpl ... ument/10918638/ (Publisher URL)
PURE Output Type: Article
Published Date: 2025-03-17
Published Online Date: 2025-03-10
Accepted Date: 2025-03-04
Authors: Grande, Rubén
Albusac, Javier
Herrera, Vanesa
Monekosso, Dorothy (ORCID Profile 0000-0001-7322-5911)
Reyes-Guzman, Ana De Los
Vallejo, David
Castro-Schez, J. J.


Version: Published Version

License: Creative Commons Attribution



