Assisting Visually Impaired People to Acquire Targets on a Large Wall-Mounted Display
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Kibum | - |
dc.contributor.author | Ren, Xiangshi | - |
dc.date.accessioned | 2021-06-22T22:43:44Z | - |
dc.date.available | 2021-06-22T22:43:44Z | - |
dc.date.created | 2021-01-21 | - |
dc.date.issued | 2014-09 | - |
dc.identifier.issn | 1000-9000 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/21967 | - |
dc.description.abstract | Large displays have become ubiquitous in our everyday lives, but these displays are designed for sighted people. This paper addresses the need for visually impaired people to access targets on large wall-mounted displays. We developed an assistive interface that exploits mid-air gesture input and haptic feedback, and examined its potential for pointing and steering tasks in human-computer interaction (HCI). In two experiments, blind and blindfolded users performed target acquisition tasks using mid-air gestures and two kinds of feedback (haptic and audio). Our results show that participants performed faster in Fitts' law pointing tasks with the haptic feedback interface than with the audio feedback interface. Furthermore, a regression analysis between movement time (MT) and the index of difficulty (ID) demonstrates that the Fitts' law model and the steering law model are both effective for evaluating assistive interfaces for the blind. Our work and findings serve as an initial step toward helping visually impaired people easily access information on large public displays using haptic interfaces. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | SCIENCE PRESS | - |
dc.title | Assisting Visually Impaired People to Acquire Targets on a Large Wall-Mounted Display | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Kim, Kibum | - |
dc.identifier.doi | 10.1007/s11390-014-1471-4 | - |
dc.identifier.scopusid | 2-s2.0-84919881310 | - |
dc.identifier.wosid | 000342412700009 | - |
dc.identifier.bibliographicCitation | JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, v.29, no.5, pp.825 - 836 | - |
dc.relation.isPartOf | JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY | - |
dc.citation.title | JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY | - |
dc.citation.volume | 29 | - |
dc.citation.number | 5 | - |
dc.citation.startPage | 825 | - |
dc.citation.endPage | 836 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Hardware & Architecture | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Software Engineering | - |
dc.subject.keywordAuthor | haptic I/O | - |
dc.subject.keywordAuthor | auditory (non-speech) feedback | - |
dc.subject.keywordAuthor | interaction style | - |
dc.subject.keywordAuthor | human computer interaction | - |
dc.subject.keywordAuthor | visually impaired people | - |
dc.identifier.url | https://link.springer.com/article/10.1007/s11390-014-1471-4 | - |
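The abstract describes a regression of movement time (MT) against the index of difficulty (ID) under Fitts' law. A minimal sketch of that analysis is shown below, using the common Shannon formulation ID = log2(D/W + 1) and an ordinary least-squares fit of MT = a + b·ID. The (distance, width, time) triples here are hypothetical values for illustration only, not data from the paper.

```python
import math

# Hypothetical (distance D, target width W, movement time MT) trials.
# These numbers are illustrative; the paper's actual data is not reproduced here.
trials = [(160, 40, 0.9), (320, 40, 1.2), (640, 40, 1.5), (640, 20, 1.8)]

def index_of_difficulty(d, w):
    """Shannon formulation of the Fitts' law index of difficulty, in bits."""
    return math.log2(d / w + 1)

ids = [index_of_difficulty(d, w) for d, w, _ in trials]
mts = [mt for _, _, mt in trials]

# Ordinary least-squares fit of MT = a + b * ID.
n = len(ids)
mean_id = sum(ids) / n
mean_mt = sum(mts) / n
b = sum((x - mean_id) * (y - mean_mt) for x, y in zip(ids, mts)) / \
    sum((x - mean_id) ** 2 for x in ids)
a = mean_mt - b * mean_id

print(f"MT = {a:.3f} + {b:.3f} * ID")
```

A positive slope b (seconds per bit) indicates that harder targets take proportionally longer to acquire; comparing the fitted slopes across feedback conditions (haptic vs. audio) is one way such interfaces are evaluated.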