EyeR: Detection Support for Visually Impaired Users

Viet Ba Hirvola, Yin-Chiung Shen, Ilyena Hirskyj-Douglas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


A lack of adequate support for navigation and object detection can limit the independence of visually impaired (VI) people in their daily routines. Common aids include white canes and guide dogs. White canes are useful for object detection but require physically touching objects with the cane, which may be undesirable. Guide dogs enable navigation without touching objects in the vicinity, but cannot help with object detection. To address this gap, we employ a user-centric research approach, aiming to find a solution that improves the independence of VI people. We began by gathering requirements through online questionnaires. Building on these, we developed a glove prototype, called EyeR, that alerts its user when an obstacle is detected at the pointed position. Lastly, we evaluated EyeR with VI users and found that in use the prototype provides real-time feedback and is helpful for navigation. We also report our participants' recommendations for future VI prototypes, such as extending the device to recognise objects.
Original language: English
Title of host publication: CHI EA '19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems
Number of pages: 6
ISBN (Electronic): 978-1-4503-5971-9
Publication status: Published - 3 May 2019
MoE publication type: A4 Article in a conference publication
Event: ACM SIGCHI Annual Conference on Human Factors in Computing Systems - Glasgow, United Kingdom
Duration: 4 May 2019 – 9 May 2019


Conference: ACM SIGCHI Annual Conference on Human Factors in Computing Systems
Abbreviated title: ACM CHI
Country: United Kingdom

