Abstract
A lack of adequate support for navigation and object detection can limit the independence of visually impaired (VI) people in their daily routines. Common solutions include white canes and guide dogs. White canes are useful for object detection, but require physically touching objects with the cane, which may be undesired. Guide dogs allow navigation without touching nearby objects, but cannot help with object detection. Addressing this gap with a user-centric research approach, we aim to find a solution that improves the independence of VI people. We began by gathering requirements through online questionnaires. Building on these, we developed a prototype glove, called EyeR, that alerts its user when an obstacle is detected at the pointed position. Finally, we evaluated EyeR with VI users and found that the prototype provides real-time feedback and is helpful in navigation. We also report our participants' recommendations for future VI prototypes; in particular, they would like the device to recognise objects.
Original language | English |
---|---|
Title of host publication | CHI EA '19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems |
Publisher | ACM |
Number of pages | 6 |
ISBN (Electronic) | 978-1-4503-5971-9 |
DOIs | |
Publication status | Published - 3 May 2019 |
MoE publication type | A4 Conference publication |
Event | ACM SIGCHI Annual Conference on Human Factors in Computing Systems, Glasgow, United Kingdom. Duration: 4 May 2019 → 9 May 2019. https://chi2019.acm.org/ |
Conference
Conference | ACM SIGCHI Annual Conference on Human Factors in Computing Systems |
---|---|
Abbreviated title | ACM CHI |
Country/Territory | United Kingdom |
City | Glasgow |
Period | 04/05/2019 → 09/05/2019 |
Internet address | https://chi2019.acm.org/ |