Eyes-free interaction with aural user interfaces
Abstract
People engaged in parallel tasks, such as walking while browsing the web, cannot efficiently access web content and safely monitor their surroundings at the same time. To address this problem, we investigate techniques for designing novel aural interfaces, which remodel existing web information architectures as linear, aural flows to be listened to. An aural flow is a design-driven, concatenated sequence of pages that can be listened to with minimal interaction. Aural flows are exemplified in ANFORA News, a semi-aural mobile site optimized for aurally browsing large collections of news stories on the go. An exploratory study involving frequent news readers (n=20) investigated the usability of and navigation experience with ANFORA News in a mobile setting. Initial evidence suggests that aural flows are a promising paradigm for supporting eyes-free mobile navigation, but users still require assistance and additional learning to fully master the aural mechanics of the flows. To enable a more natural interaction with aural flows, we are currently exploring linkless navigation, which lets users control the flow through a small set of dialogic commands issued by voice. Overall, our approach will open new avenues for designing appropriate aural user interfaces for content-intensive web systems. This material is based on work supported by the National Science Foundation under Grant #1018054. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.