Browsing by Subject "web navigation"
Now showing 1 - 4 of 4
Item: Navigating the Aural Web (Office of the Vice Chancellor for Research, 2011-04-08)
Bolchini, Davide

The current paradigm of web navigation poses great obstacles to users in two eyes-free scenarios: mobile computing and information access for the visually impaired. The common thread of these scenarios is the inability to efficiently navigate complex information architectures, due to the mechanical and cognitive limitations that emerge while listening to, rather than looking at, information and navigation prompts. New paradigms for aural navigation design are still unexplored, yet they are crucial to address increasingly important requirements. Inspired by the effective practice of human-to-human aural dialogues, we present work-in-progress research, funded by a 3-year NSF grant, that introduces innovative design strategies for aural navigation in the complex information architectures typical of the web. Specifically, in this exhibit we introduce and demonstrate design patterns supporting aural back navigation in large collections, aimed at improving the efficiency and usability of aural navigation. Current evaluation thrusts of the new navigation techniques involve blind users accessing the web through screen readers and sighted users using a mobile application prototype.

Item: Navigating the Aural Web: Augmenting User Experience for Visually Impaired and Mobile Users (Office of the Vice Chancellor for Research, 2013-04-05)
Bolchini, Davide; Yang, Tao; Gadde, Prathik; Ghahari, Romisa Rohani

The current web navigation paradigm structures interaction around vision and thus hampers users in two eyes-free scenarios: mobile computing and information access for the visually impaired. Users in both scenarios are unable to navigate complex information architectures efficiently because of the strictly linear perceptual bandwidth of the aural channel.
To combat this problem, we are conducting a long-term research program aimed at establishing novel design strategies that augment aural navigation as users browse the complex information architectures typical of the web. A pervasive problem in designing for web accessibility (especially for screen-reader users) is providing efficient access to large content collections, which typically manifest as long lists indexing the underlying pages. Cognitively managing the interaction with long lists is cumbersome in the aural paradigm because users must listen attentively to each list item, decide which link to follow, and then select it. For every non-relevant page selected, screen-reader users must go back to the list to select another page. Our most recent studies compared the performance of index-based web navigation with guided-tour navigation (navigation without lists) for screen-reader users. Guided-tour navigation allows users to move directly back and forth across the content pages of a collection, bypassing lists. An experiment (N=10), conducted at the Indiana School for the Blind and Visually Impaired (ISBVI), examined these web navigation strategies during fact-finding tasks. Guided-tour navigation significantly reduced time on task, number of pages visited, number of keystrokes, and perceived cognitive effort while enhancing the navigational experience. By augmenting existing navigational methods for screen-reader users, our research offers web designers strategies to improve web accessibility without costly site redesign.
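The contrast between the two strategies can be sketched in a few lines. This is a hypothetical illustration, not the study's software (page names and function names are assumptions): it counts the pages a screen-reader user must visit during a fact-finding task under index-based navigation, where every non-relevant page forces a return to the list, versus guided-tour navigation, where "next" links move directly between content pages.

```python
# Hypothetical sketch: compare page visits under index-based navigation
# (return to the list after every non-relevant page) and guided-tour
# navigation (move page-to-page, bypassing the list entirely).

def index_based_visits(pages, target):
    """Visit the index, try a page, and return to the index after
    every non-relevant page until the target is found."""
    visits = ["index"]
    for page in pages:
        visits.append(page)
        if page == target:
            return visits
        visits.append("index")  # back to the list to pick another link
    return visits

def guided_tour_visits(pages, target):
    """Follow 'next' links directly between content pages."""
    visits = []
    for page in pages:
        visits.append(page)
        if page == target:
            return visits
    return visits

pages = ["p1", "p2", "p3", "p4"]
print(len(index_based_visits(pages, "p3")))  # 6: index, p1, index, p2, index, p3
print(len(guided_tour_visits(pages, "p3")))  # 3: p1, p2, p3
```

The gap widens with the position of the target in the list, which is consistent with the reported reductions in pages visited and keystrokes.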
This research material is based upon work supported by the National Science Foundation under Grant #1018054.

Item: Navigating the Aural Web: Listening-based Back Navigation in Large Architectures (Office of the Vice Chancellor for Research, 2012-04-13)
Bolchini, Davide

The current paradigm of web navigation structures interaction around the visual channel and thus poses obstacles to users in two eyes-free scenarios: mobile computing and information access for the visually impaired. The common thread of these scenarios is the inability to efficiently navigate complex information architectures due to the limited perceptual bandwidth of the aural channel. To address this problem, we are conducting a long-term research program aimed at establishing novel design strategies for aural navigation in the complex information architectures typical of the web. As a first line of results, we introduce topic-based and list-based back: two navigation strategies to enhance aural browsing. Both are demonstrated in Green-Savers Mobile (GSM), an aural mobile site. A study (N=29) compared both solutions to traditional back mechanisms. Our findings indicate that topic- and list-based back enable faster access to previous pages, improve the navigation experience, and reduce perceived cognitive load. To expand this line of work, we have also completed an evaluation of topic- and list-based back with blind and visually impaired screen-reader users (N=10). The preliminary findings of the study, conducted in close collaboration with the Indiana School for the Blind in Indianapolis, are promising: topic- and list-based back decrease the number of web pages visited in aural browsing and increase self-rated navigation experience relative to traditional back mechanisms. The proposed designs apply to a wide range of content-intensive, ubiquitous web systems.
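The back-navigation strategies described above can be illustrated with a small history-stack sketch. This is a hypothetical model, not the GSM implementation (page naming and the landmark test are assumptions): traditional back retraces history one page at a time, while list-based back jumps straight to the most recent list page, skipping the content pages visited in between; topic-based back would apply the same idea using topic pages as landmarks.

```python
# Hypothetical sketch of listening-based back navigation: a browsing
# history modeled as a stack of page identifiers.

def traditional_back(history):
    """Pop the current page and land on the immediately previous one."""
    history.pop()
    return history[-1] if history else None

def list_based_back(history, is_list_page):
    """Pop the current page, then skip content pages until reaching
    the most recent list page in the history."""
    history.pop()
    while history and not is_list_page(history[-1]):
        history.pop()
    return history[-1] if history else None

# A browsing session: home -> a list page -> two content pages.
session = ["home", "list:green-tips", "tip-1", "tip-2"]
print(traditional_back(list(session)))                                  # tip-1
print(list_based_back(list(session), lambda p: p.startswith("list:")))  # list:green-tips
```

One list-based back press replaces the repeated presses a screen-reader user would otherwise need, which matches the reported reduction in pages visited.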
This research is based upon work supported by the NSF under the 3-year grant IIS-1018054, "Navigating the Aural Web," and two Research Experiences for Undergraduates (REU) supplement grants.

Item: The Participatory Design of an Adaptive Interface to Support Users with Changing Pointing Ability (ACM, 2017-10)
Martin-Hammond, Aqueasha; Hamidi, Foad; Bhalerao, Tejas; Ali, Abdullah; Hornback, Catherine; Means, Casey; Hurst, Abdullah
Human-Centered Computing, School of Informatics and Computing

Individuals who experience temporary, intermittent, or gradual changes in pointing ability may encounter frustrating experiences when using computer input devices. Personalized pointing systems that automatically assess changes in performance and provide individualized information and assistance may benefit these users. However, there has been little inquiry into this population's expectations for interacting with these types of systems. We describe a participatory design process in which we used a technology probe to assess the information needs and expectations of 27 individuals who experience occasional changes in pointing ability, through interactions with and discussion of a high-fidelity personalized pointing prototype. Participants preferred notification and adaptation interactions that gave them control over, and explanations of, system actions, rather than abstract notifications and automatic adaptations. We describe how we applied these findings in the design of the PINATA system.