Francis Quek, Texas A&M professor of [visualization](http://viz.arch.tamu.edu), and Yasmine N. El-Glaly, assistant professor of computer science at Port Said University in Egypt, have developed two key refinements that improve the experience of blind or visually impaired people who use iPads as touch-based reading devices.
A research paper detailing the refinements, which improve how accurately the software responds to a user’s touch, earned an outstanding paper award at the Nov. 12-16, 2014 [International Conference on Multimodal Interaction](http://icmi.acm.org/2014/) at Bogazici University in Istanbul, Turkey.
In their paper, “[Digital Reading Support for The Blind by Multimodal Interaction](http://dl.acm.org/citation.cfm?id=2663266),” Quek and El-Glaly describe how blind or visually impaired readers drag their fingertips along virtual lines of text on the tablet’s screen or an overlay to hear the tablet “speak” the text of a book or article.
“Existing applications force the user to be slow,” said Quek, director of the [Texas A&M Embodied Interaction Laboratory](http://one.arch.tamu.edu/news/2014/9/30/lab-assistive-technology/) (TEIL). “If the user runs her finger too quickly along the virtual lines of text or changes the software’s access mode to read bigger chunks of words, she can easily lose her place or wander between virtual lines without realizing it.”
Even when existing systems are adjusted to render words faster, he said, interaction problems remain because there is often a poor match between the speed of the user’s finger on the tablet and the speed of the words spoken by the device.
To address these issues, Quek and El-Glaly developed software for the iPad that predicts the direction of a user’s finger on a tablet overlay, audibly rendering words in sequence and alerting the reader if she strays from the reading line. Their work was supported by a $302,000 grant from the National Science Foundation.
The new software also renders words in sync with the speed of the user’s finger across the tablet screen, rather than at a default rate set by the application.
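The paper itself carries the details of their design; as a rough illustration only, the sketch below (in Swift, not drawn from the authors’ code) shows how these two behaviors might be wired together on iOS: speech rate follows finger speed, and a haptic pulse fires when the finger drifts off the current line. The class name, line height, and velocity mapping are all assumptions made for the example.

```swift
import UIKit
import AVFoundation

// Illustrative sketch of finger-paced reading with off-line drift alerts.
// Not the authors' implementation; names and thresholds are hypothetical.
final class LineReadingView: UIView {
    private let synthesizer = AVSpeechSynthesizer()
    private var currentLineY: CGFloat?   // y-center of the line being read
    private let lineHeight: CGFloat = 44 // assumed virtual line height

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // Snap to the center of the virtual line under the initial touch.
        let y = touch.location(in: self).y
        currentLineY = (y / lineHeight).rounded(.down) * lineHeight + lineHeight / 2
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let lineY = currentLineY else { return }
        let point = touch.location(in: self)
        let previous = touch.previousLocation(in: self)

        // Drift check: a haptic pulse warns the reader she has left the line.
        if abs(point.y - lineY) > lineHeight / 2 {
            UIImpactFeedbackGenerator(style: .heavy).impactOccurred()
            return
        }

        // Pace speech to the finger's horizontal speed, not a fixed default.
        let dx = abs(point.x - previous.x)
        if dx > 2, let word = word(at: point) {
            let utterance = AVSpeechUtterance(string: word)
            let t = Float(min(dx / 40, 1)) // normalize displacement to 0...1
            utterance.rate = AVSpeechUtteranceMinimumSpeechRate +
                t * (AVSpeechUtteranceMaximumSpeechRate - AVSpeechUtteranceMinimumSpeechRate)
            _ = synthesizer.stopSpeaking(at: .immediate)
            synthesizer.speak(utterance)
        }
    }

    // Placeholder: a real reader would hit-test the point against laid-out text.
    private func word(at point: CGPoint) -> String? { nil }
}
```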
In the future, Quek foresees adding note-taking and highlighting capabilities to the software. He will continue to develop the overlay and application in the TEIL with Sharon Lyn Chu, TEIL associate director, and Akash Sahoo, a computer science graduate student.
The Istanbul conference where Quek and El-Glaly presented their paper is a global forum for multidisciplinary research on human-human and human-computer interaction, interfaces and system development.
“This is a very selective conference with an 18% acceptance rate for oral presentations,” said Quek.
The conference focused on the component technologies, theoretical and empirical foundations, and combined multimodal processing techniques that define the field of multimodal interaction analysis, interface design, and system development.