Eyes-free interaction with aural user interfaces

dc.contributor.advisor: Bolchini, Davide
dc.contributor.author: Rohani Ghahari, Romisa
dc.date.accessioned: 2016-01-12T16:13:02Z
dc.date.available: 2016-01-12T16:13:02Z
dc.date.issued: 2015-04-11
dc.degree.date: 2015
dc.degree.discipline: School of Informatics
dc.degree.grantor: Indiana University
dc.degree.level: Ph.D.
dc.description: Indiana University-Purdue University Indianapolis (IUPUI)
dc.description.abstract: Existing web applications force users to focus their visual attention on mobile devices while browsing content and services on the go (e.g., while walking or driving). To support mobile, eyes-free web browsing and minimize interaction with devices, designers can leverage the auditory channel. Whereas acoustic interfaces have proven effective at reducing visual attention, designing aural information architectures for the web remains a perplexing challenge because of the web's non-linear structure. To address this problem, we introduce and evaluate techniques to remodel existing information architectures as "playlists" of web content, called aural flows. The use of aural flows in mobile web browsing is exemplified by ANFORA News, a semi-aural mobile site designed to facilitate browsing large collections of news stories. An exploratory study involving frequent news readers (n=20) investigated the usability and navigation experience with ANFORA News in a mobile setting. The initial evidence suggests that aural flows are a promising paradigm for supporting eyes-free mobile navigation while on the go. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe. To reduce visual interaction with the screen, we also explore the use of simulated voice commands to control aural flows. In a study, 20 participants browsed aural flows either through a visual interface alone or through a visual interface augmented by voice commands. The results suggest that using voice commands halves the time spent looking at the device but yields walking speeds, system usability ratings, and cognitive effort ratings similar to using buttons. To test the potential of aural flows in a more distracting context, a study (n=60) was conducted in a driving simulation lab. Each participant drove through three driving scenarios of increasing complexity: low, moderate, and high.
Within each driving complexity, the participants experienced one of three aural application conditions: no device, voice-controlled aural flows (ANFORADrive), or an alternative solution on the market (Umano). The results suggest that voice-controlled aural flows do not affect distraction, overall safety, cognitive effort, driving performance, or driving behavior when compared to the no-device condition.
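The abstract describes remodeling a non-linear web information architecture as a linear "playlist" of content that can be steered by voice commands. A minimal sketch of that idea follows; the class and command names (`AuralFlow`, `next`, `previous`, `pause`) are illustrative assumptions, not the thesis's actual implementation.

```python
# Hypothetical sketch: an aural flow modeled as a linear playlist of web
# content items, driven by simple voice-style commands instead of buttons.
# All names here are illustrative, not drawn from the ANFORA systems.

class AuralFlow:
    def __init__(self, items):
        self.items = list(items)  # ordered content, e.g. news story summaries
        self.index = 0            # position of the item currently being read aloud
        self.playing = True

    def current(self):
        return self.items[self.index]

    def command(self, spoken):
        # Dispatch a recognized voice command to a playlist action.
        if spoken == "next" and self.index < len(self.items) - 1:
            self.index += 1
        elif spoken == "previous" and self.index > 0:
            self.index -= 1
        elif spoken == "pause":
            self.playing = False
        elif spoken == "play":
            self.playing = True
        return self.current()

flow = AuralFlow(["Story A", "Story B", "Story C"])
print(flow.command("next"))      # advances to "Story B"
print(flow.command("previous"))  # returns to "Story A"
```

The point of the linear structure is that a listener can navigate eyes-free with a tiny command vocabulary, rather than visually scanning a non-linear site hierarchy.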
dc.identifier.uri: https://hdl.handle.net/1805/8036
dc.language.iso: en_US
dc.subject: Aural flow
dc.subject: Aural web browsing
dc.subject: Eyes-free interaction
dc.subject: Information architecture
dc.subject: Mobile web browsing while driving
dc.subject: Multimodal user interfaces
dc.subject.lcsh: Human-computer interaction
dc.subject.lcsh: Mobile computing
dc.subject.lcsh: User interfaces (Computer systems)
dc.subject.lcsh: Assistive computer technology
dc.subject.lcsh: Automatic speech recognition
dc.subject.lcsh: Cell phones and traffic accidents
dc.title: Eyes-free interaction with aural user interfaces
Files
Original bundle
Name: RohaniGhahari_iupui_0104D_10053.pdf
Size: 11.44 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.88 KB
Format: Item-specific license agreed upon to submission