Multimodal 'Eyes-Free' interaction techniques for mobile devices

Brewster, Stephen, Lumsden, Joanna, Bell, Marek, Hall, Malcolm and Tasker, Stuart (2003). Multimodal 'Eyes-Free' interaction techniques for mobile devices. In: CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, NY (US): ACM.


Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment - making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time, perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
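The radial pie menu described above maps head orientation to menu items. As a purely illustrative sketch (not the paper's implementation - the function name, 4-item layout, and yaw convention are assumptions), the core selection step might map a head-yaw angle to a sector index like this:

```python
def sector_for_heading(yaw_degrees, n_items=4):
    """Map a head-yaw angle (degrees, 0 = straight ahead,
    positive = clockwise) to a radial pie-menu sector index.

    Illustrative sketch only; the paper's menu additionally uses
    3D audio cues at each sector and a nod/dwell to confirm.
    """
    sector_width = 360.0 / n_items
    # Offset by half a sector so sector 0 is centred on straight ahead.
    normalized = (yaw_degrees + sector_width / 2.0) % 360.0
    return int(normalized // sector_width)
```

With four items, looking straight ahead selects sector 0, turning the head 90 degrees right selects sector 1, and turning 90 degrees left selects sector 3.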

Divisions: Engineering & Applied Sciences > Computer science
Engineering & Applied Sciences > Computer science research group
Event Title: Conference on Human Factors in Computing Systems
Event Type: Other
Event Dates: 2003-04-05 - 2003-04-10
Full Text Link: http://dl.acm.o ... n.cfm?id=642694
Published Date: 2003-05-04
Authors: Brewster, Stephen
Lumsden, Joanna (ORCID: 0000-0002-8637-7647)
Bell, Marek
Hall, Malcolm
Tasker, Stuart


