ScholarWorks Indianapolis

Browsing by Author "Kaser, Lindsay"

Now showing 1 - 2 of 2
    Eyes-free interaction with aural user interfaces
    (Office of the Vice Chancellor for Research, 2013-04-05) Ghahari, Romisa Rohani; George-Palilonis, Jennifer; Kaser, Lindsay; Bolchini, Davide
People engaged in multiple tasks at once, such as walking while browsing the web, cannot efficiently access web content and safely monitor their surroundings at the same time. To address this, we investigate techniques for designing novel aural interfaces, which remodel existing web information architectures as linear, aural flows to be listened to. An aural flow is a design-driven, concatenated sequence of pages that can be listened to with minimal interaction required. Aural flows are exemplified in ANFORA News, a semi-aural mobile site optimized for aurally browsing large collections of news stories on the go. An exploratory study involving frequent news readers (n=20) investigated the usability and navigation experience with ANFORA News in a mobile setting. Initial evidence suggests that aural flows are a promising paradigm for supporting eyes-free mobile navigation on the go, but users still require assistance and additional learning to fully master the aural mechanics of the flows. To enable a more natural interaction with aural flows, we are currently exploring linkless navigation, which lets users control the flow via a small set of dialogic commands issued by voice. Overall, our approach will open new avenues for designing appropriate aural user interfaces for content-intensive web systems. This research material is based on work supported by the National Science Foundation under Grant #1018054. Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect those of the NSF.
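The abstract's core idea can be illustrated with a minimal sketch: an aural flow as a linear sequence of pages navigated by a small set of dialogic commands. The class name, command vocabulary ("next", "back", "repeat"), and page contents below are hypothetical, not taken from the paper or the ANFORA News system.

```python
# Hypothetical sketch of an aural flow: a linear, concatenated
# sequence of pages to be read aloud, controlled by a small set
# of dialogic voice commands ("linkless navigation").

class AuralFlow:
    """A linear playlist of page summaries intended for text-to-speech."""

    def __init__(self, pages):
        self.pages = list(pages)
        self.index = 0

    def current(self):
        # The page currently being read aloud.
        return self.pages[self.index]

    def handle(self, command):
        # Map a recognized voice command to a move along the flow.
        if command == "next" and self.index < len(self.pages) - 1:
            self.index += 1
        elif command == "back" and self.index > 0:
            self.index -= 1
        # "repeat" (or any unrecognized command) re-reads the current page.
        return self.current()


flow = AuralFlow(["Story A", "Story B", "Story C"])
print(flow.handle("next"))    # Story B
print(flow.handle("repeat"))  # Story B
print(flow.handle("back"))    # Story A
```

In a real system the returned page text would be passed to a speech synthesizer and the commands would come from a speech recognizer, so the user never needs to look at or touch the screen.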
    Semi-aural Interfaces: Investigating Voice-controlled Aural Flows
    (Oxford, 2016-11) Ghahari, Romisa Rohani; George-Palilonis, Jennifer; Gahangir, Hossain; Kaser, Lindsay; Bolchini, Davide; Department of Human-Centered Computing, School of Informatics and Computing
To support mobile, eyes-free web browsing, users can listen to 'playlists' of web content: aural flows. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe (e.g. while walking). This research extends interaction with aural flows through simulated voice commands as a way to reduce visual interaction. This paper presents the findings of a study with 20 participants who browsed aural flows either through a visual interface only or by augmenting it with voice commands. Results suggest that using voice commands reduced the time spent looking at the device by half but yielded similar system usability and cognitive effort ratings as using buttons. Overall, the low cognitive effort engendered by aural flows, regardless of the interaction modality, allowed participants to perform more non-instructed activities (e.g. looking at the surrounding environment) than instructed activities (e.g. focusing on the user interface).