Detailed Information

Cited 1 time in Web of Science; cited 4 times in Scopus

Development of an electrooculogram-based human-computer interface using involuntary eye movement by spatially rotating sound for communication of locked-in patients

Full metadata record
dc.contributor.author: Kim, Do Yeon
dc.contributor.author: Han, Chang-Hee
dc.contributor.author: Im, Chang-Hwan
dc.date.accessioned: 2021-08-02T13:29:11Z
dc.date.available: 2021-08-02T13:29:11Z
dc.date.created: 2021-05-12
dc.date.issued: 2018-06
dc.identifier.issn: 2045-2322
dc.identifier.uri: https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/16914
dc.description.abstract: Individuals who have lost normal pathways for communication need augmentative and alternative communication (AAC) devices. In this study, we propose a new electrooculogram (EOG)-based human-computer interface (HCI) paradigm for AAC that does not require a user's voluntary eye movement for binary yes/no communication by patients in locked-in state (LIS). The proposed HCI uses a horizontal EOG elicited by the involuntary auditory oculogyric reflex in response to a rotating sound source. In the proposed HCI paradigm, a user was asked to selectively attend to one of two sound sources rotating in directions opposite to each other, based on the user's intention. The user's intentions could then be recognised by quantifying EOGs. To validate its performance, a series of experiments was conducted with ten healthy subjects and two patients with amyotrophic lateral sclerosis (ALS). The online experimental results exhibited high classification accuracies of 94% in both healthy subjects and ALS patients in cases where decisions were made every six seconds. The ALS patients also participated in a practical yes/no communication experiment with 26 or 30 questions with known answers. The accuracy of the experiments with questionnaires was 94%, demonstrating that our paradigm could constitute an auxiliary AAC system for some LIS patients.
dc.language: English
dc.language.iso: en
dc.publisher: NATURE PUBLISHING GROUP
dc.title: Development of an electrooculogram-based human-computer interface using involuntary eye movement by spatially rotating sound for communication of locked-in patients
dc.type: Article
dc.contributor.affiliatedAuthor: Im, Chang-Hwan
dc.identifier.doi: 10.1038/s41598-018-27865-5
dc.identifier.scopusid: 2-s2.0-85048962030
dc.identifier.wosid: 000436046500019
dc.identifier.bibliographicCitation: SCIENTIFIC REPORTS, v.8, pp.1 - 10
dc.relation.isPartOf: SCIENTIFIC REPORTS
dc.citation.title: SCIENTIFIC REPORTS
dc.citation.volume: 8
dc.citation.startPage: 1
dc.citation.endPage: 10
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Science & Technology - Other Topics
dc.relation.journalWebOfScienceCategory: Multidisciplinary Sciences
dc.subject.keywordPlus: AMYOTROPHIC-LATERAL-SCLEROSIS
dc.subject.keywordPlus: ALTERNATIVE COMMUNICATION
dc.subject.keywordPlus: ATTENTION
dc.subject.keywordPlus: STATE
dc.identifier.url: https://www.nature.com/articles/s41598-018-27865-5
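
Illustrative note on the method described in the abstract: the record states only that the user's attended rotation direction was recognised by "quantifying EOGs", with one binary decision every six seconds; the exact feature and classifier are not given here. The Python sketch below is a hypothetical reading of that idea, not the authors' implementation. It assumes a 250 Hz sampling rate, a 0.5 Hz sound-rotation frequency, and that the horizontal EOG traced while following the two counter-rotating sources is approximately opposite in phase, so the sign of a single correlation separates "yes" from "no".

import numpy as np

# Hypothetical parameters for illustration; the record above does not specify them.
FS = 250            # assumed EOG sampling rate (Hz)
WINDOW_SEC = 6      # the abstract states decisions were made every six seconds
ROTATION_HZ = 0.5   # assumed rotation frequency of the sound sources

def classify_intention(heog_window, fs=FS, rotation_hz=ROTATION_HZ):
    """Classify one six-second horizontal-EOG window as 'yes' or 'no'.

    Illustrative logic only: eyes following the two counter-rotating sources
    trace approximately opposite-phase horizontal trajectories, so the sign
    of the correlation with one reference sinusoid separates the two choices.
    """
    x = np.asarray(heog_window, dtype=float)
    x = x - x.mean()                                  # remove DC offset
    t = np.arange(x.size) / fs
    ref = np.sin(2 * np.pi * rotation_hz * t)         # assumed reference trajectory
    corr = np.dot(x, ref) / (np.linalg.norm(x) * np.linalg.norm(ref) + 1e-12)
    # Assumed convention: attending the "yes" source yields positive correlation.
    return "yes" if corr > 0 else "no"

# Synthetic usage example standing in for a recorded EOG window.
rng = np.random.default_rng(0)
t = np.arange(0, WINDOW_SEC, 1 / FS)
window = 0.8 * np.sin(2 * np.pi * ROTATION_HZ * t) + 0.3 * rng.standard_normal(t.size)
print(classify_intention(window))   # prints "yes" for this in-phase example

A real system of this kind would also need band-pass filtering, artifact handling, and per-user calibration of the reference phase; those steps are omitted from this sketch for brevity.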
Appears in Collections: ETC > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Im, Chang Hwan
College of Engineering (Major in Biomedical Engineering, Seoul Campus)
