- Browse by Author
Browsing by Author "George-Palilonis, Jennifer"
Now showing 1 - 3 of 3
Item
ANFORA (AURAL NAVIGATION FLOWS ON RICH ARCHITECTURES)
(Office of the Vice Chancellor for Research, 2012-04-13) Ghahari, Romisa R.; George-Palilonis, Jennifer; Bolchini, Davide
Existing web applications make users focus their visual attention on the mobile device while browsing content and services on the go. To support eyes-free, mobile experiences, designers can minimize interaction with the device by leveraging the auditory channel. Although acoustic interfaces have been shown to be effective in reducing visual attention, designing the aural information architectures typical of the web remains a perplexing challenge. To address this problem, we introduce Aural Navigation Flows on Rich Architectures (ANFORA), a novel design framework that transforms existing information architectures into linear, aural flows. We demonstrate our approach in ANFORAnews, a semi-aural mobile site designed for browsing large collections of news stories. A study with frequent news readers (N=20) investigated the usability and navigation experience with ANFORAnews in a mobile setting. Aural flows are enjoyable, easy to use and appropriate for eyes-free, mobile contexts. Future work will optimize the mechanisms to customize content and control the aural navigation.

Item
Eyes-free interaction with aural user interfaces
(Office of the Vice Chancellor for Research, 2013-04-05) Ghahari, Romisa Rohani; George-Palilonis, Jennifer; Kaser, Lindsay; Bolchini, Davide
People engaged in parallel tasks, such as walking and browsing the web, cannot efficiently access web content and safely monitor their surroundings at the same time. To combat this, we investigate techniques for designing novel aural interfaces, which remodel existing web information architectures as linear, aural flows to be listened to. An aural flow is a design-driven, concatenated sequence of pages that can be listened to with minimal interaction required.
Aural flows are exemplified in ANFORA News, a semi-aural mobile site optimized for aurally browsing large collections of news stories on the go. An exploratory study involving frequent news readers (n=20) investigated the usability and navigation experience with ANFORA News in a mobile setting. Initial evidence suggests that aural flows are a promising paradigm for supporting eyes-free mobile navigation on the go, but users still require assistance and additional learning to fully master the aural mechanics of the flows. To enable a more natural interaction with aural flows, we are currently exploring linkless navigation, which lets users control the flow through a small set of dialogic commands issued by voice. Overall, our approach will open new avenues for designing appropriate aural user interfaces for content-intensive web systems. This research material is based on work supported by the National Science Foundation under Grant #1018054. Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect those of the NSF.

Item
Semi-aural Interfaces: Investigating Voice-controlled Aural Flows
(Oxford, 2016-11) Ghahari, Romisa Rohani; George-Palilonis, Jennifer; Gahangir, Hossain; Kaser, Lindsay; Bolchini, Davide; Department of Human-Centered Computing, School of Informatics and Computing
To support mobile, eyes-free web browsing, users can listen to ‘playlists’ of web content: aural flows. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe (e.g. while walking). This research extends interaction with aural flows through simulated voice commands as a way to reduce visual interaction. This paper presents the findings of a study with 20 participants who browsed aural flows either through a visual interface only or by augmenting it with voice commands.
Results suggest that using voice commands reduced the time spent looking at the device by half, yet yielded system usability and cognitive effort ratings similar to those of the buttons-only interface. Overall, the low cognitive effort engendered by aural flows, regardless of the interaction modality, allowed participants to engage more in non-instructed activities (e.g. looking at the surrounding environment) than in instructed ones (e.g. focusing on the user interface).