Situated Touch Audio Annotator And Reader (STAAR) For Individuals With Blindness Or Severe Visual Impairment
Project Team
- Dr. Francis Quek, PI: TEILab Director, Professor of Visualization; by courtesy, Professor of Computer Science and Engineering and Professor of Psychology
- Niloofar Zarei, PhD Student, Computer Science and Engineering
Previous Participants
- Dr. Yasmine Elglaly, Graduated PhD Student
- Dr. Tonya Smith-Jackson, Professor, Industrial and Systems Engineering
- Gurjot Dhillon, PhD Student
- Manpreet Hora, Masters Student
Project Overview
This research aims to enable Individuals with Blindness or Severe Visual Impairment (IBSVI) to engage their broad spatial cognition and memory resources in reading and annotating textual material. We argue that from the invention of print media forward, information itself has been formulated and optimized for consumption by embodied beings with a dominant visual capability. This visuo-spatial bias is not well understood or studied in the context of information access by and delivery to IBSVI. Most technological information aids funnel information to IBSVI as sequential aural streams, obviating the use of broader spatial cognitive resources. We propose a multimodal approach that enables IBSVI to fuse spatial layout and informational content through touch location on a slate-type device and audio rendering via text-to-speech, respectively. We will develop a Spatial Touch Audio Annotator and Reader (STAAR) testbed to investigate this nexus of interaction. STAAR enables self-paced reading using a tactile overlay pattern over an iPad surface. The pattern will be designed to provide tactile landmarks that help the IBSVI navigate the ‘page’. STAAR renders the touched text chunk audibly. We will also investigate the use of touch gestures to enable contextualized highlighting and note-taking. We will study how IBSVI may employ spatial strategies and exploration to re-find and re-access information, both in the act of reading and for recall after some time interval.
Overlay
To help IBSVI move horizontally and keep track of their location on the iPad screen, we augmented the iPad screen with a plastic overlay that has tangible landmarks, as illustrated in Figure 1. The overlay is composed of a vertical ruler at the left margin, a set of horizontal and vertical lines in the ‘reading area’, and haptic control buttons at the bottom. Although the tactile pattern is static, it gives the user the opportunity to map out the space on the iPad surface. The goal is not for the tactile patterns to directly show the structure of the underlying pages (they obviously cannot). Rather, the tactile patterns provide a kind of landmark grid that IBSVI may appropriate to maintain spatial grounding for the document being read. As the IBSVI moves her fingers over the tactilely adorned surface, the words (and sounds) associated with the touch locations are sounded, so that the IBSVI is able to create a spatial index of what is read. Hence, the tactile patterns help to ground the user’s mental model of how information is located and organized in the document being read. This mental model serves as a dynamic and persistent reference that enables the IBSVI user to interact more efficiently and effectively with the textual material.
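The core interaction described above can be thought of as mapping each touch point on the reading area to the text chunk laid out at that location, then sounding that chunk out. The following is a minimal sketch of that idea, assuming a simple grid of fixed-height lines and fixed-width word cells; all names (`PageLayout`, `chunk_at`, `on_touch`) are illustrative and not taken from the actual STAAR implementation, which chunks and renders text in its own way.

```python
# Hypothetical sketch of touch-to-speech mapping on a grid layout.
# Assumption: the reading area is divided into rows of fixed height and
# word cells of fixed width; the real system's layout logic differs.

from dataclasses import dataclass

@dataclass
class PageLayout:
    """Maps the reading area to text chunks laid out line by line."""
    lines: list          # each line is a list of word strings
    line_height: float   # height of one tactile row, in points
    col_width: float     # nominal width allotted to one word cell

    def chunk_at(self, x: float, y: float):
        """Return the word under a touch point, or None if off-text."""
        row = int(y // self.line_height)
        col = int(x // self.col_width)
        if 0 <= row < len(self.lines) and 0 <= col < len(self.lines[row]):
            return self.lines[row][col]
        return None

def on_touch(layout: PageLayout, x: float, y: float, speak) -> str:
    """Touch handler: sound out the chunk under the finger, if any."""
    chunk = layout.chunk_at(x, y)
    if chunk is not None:
        speak(chunk)  # e.g. hand off to a text-to-speech engine
    return chunk
```

For example, with `layout = PageLayout(lines=[["The", "quick"], ["brown", "fox"]], line_height=40.0, col_width=120.0)`, a touch at `(130, 10)` falls in row 0, cell 1 and would sound out "quick". Because the grid is static while page content changes, the tactile landmarks only anchor the user's spatial index; the mapping itself is recomputed per page.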

Publications
- Yasmine N. Elglaly, Francis Quek, Tonya Smith-Jackson, and Gurjot Dhillon. "It Is Not a Talking Book; It Is More Like Really Reading a Book!" ASSETS, 2012.
- Yasmine N. Elglaly, Francis Quek, Tonya Smith-Jackson, and Gurjot Dhillon. "Audible Rendering of Text Documents Controlled by Multi-Touch Interaction." ACM International Conference on Multimodal Interaction (ICMI), 2012.
- Yasmine N. Elglaly, Francis Quek, Tonya Smith-Jackson, and Gurjot Dhillon. "Spatial Tactile Audio Reader System for People with Blindness or Severe Visual Impairment." Grace Hopper Conference, 2012.
- Gurjot S. Dhillon, Yasmine N. El-Glaly, William H. Holbach, Tonya L. Smith-Jackson, and Francis Quek. "Use of Participatory Design to Enhance Accessibility of Slate-Type Devices." Proceedings of the Human Factors and Ergonomics Society 56th Annual Meeting, 2012, Boston, Massachusetts, USA.
- Yasmine Elglaly, Francis Quek, and Tonya Smith-Jackson. "Haptic Reading System for the Blind." 1st Interdisciplinary Research Symposium at Virginia Tech, 2011, Blacksburg, Virginia, USA.
Acknowledgements
This research is supported by National Science Foundation grants "STAAR: Spatial Touch Audio Annotator and Reader for Individuals with Blindness or Severe Visual Impairment" (IIS-1117854, 1 August 2011 – 21 July 2014) and "I-EN: Device and Display Ecologies" (IIS-1059398, 1 August 2010 – 31 July 2014). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.