Detailed Information


Towards Natural and Intuitive Human-Robot Collaboration based on Goal-Oriented Human Gaze Intention Recognition

Full metadata record
DC Field | Value | Language
dc.contributor.author | Lim, Taeyhang | -
dc.contributor.author | Lee, Joosun | -
dc.contributor.author | Kim, Wansoo | -
dc.date.accessioned | 2024-04-29T05:30:29Z | -
dc.date.available | 2024-04-29T05:30:29Z | -
dc.date.issued | 2024-04 | -
dc.identifier.issn | 0000-0000 | -
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/118886 | -
dc.description.abstract | The objective of this paper is to introduce a new method for predicting human gaze intention using a head-mounted display, with the aim of enabling natural and intuitive collaboration between humans and robots. Human eye gaze is strongly linked to cognitive processes and can facilitate communication between humans and robots. However, accurately identifying the goal-directed object from human intention remains challenging. This study focuses on developing a method to differentiate between goal and non-goal gaze by creating an area of interest (AOI) on each object and using it to identify goal-directed gaze. The Microsoft HoloLens 2 was used to simulate the robot with real-time gaze data in augmented reality (AR). The methods with and without the AOI were compared in a pick-and-place robot manipulation task driven by human gaze prediction. The AOI method resulted in a maximum improvement of 19% in F1 score compared to the baseline method. The results provide strong evidence of intuitiveness and usefulness: the use of a pre-defined AOI improves gaze intention prediction performance and has the potential to be applied in various fields where human-robot collaboration can enhance efficiency and productivity. | -
dc.format.extent | 6 | -
dc.language | English | -
dc.language.iso | ENG | -
dc.publisher | IEEE COMPUTER SOC | -
dc.title | Towards Natural and Intuitive Human-Robot Collaboration based on Goal-Oriented Human Gaze Intention Recognition | -
dc.type | Article | -
dc.publisher.location | United States | -
dc.identifier.doi | 10.1109/IRC59093.2023.00027 | -
dc.identifier.scopusid | 2-s2.0-85190069551 | -
dc.identifier.wosid | 001195993100021 | -
dc.identifier.bibliographicCitation | 2023 Seventh IEEE International Conference on Robotic Computing (IRC), pp 115 - 120 | -
dc.citation.title | 2023 Seventh IEEE International Conference on Robotic Computing (IRC) | -
dc.citation.startPage | 115 | -
dc.citation.endPage | 120 | -
dc.type.docType | Proceedings Paper | -
dc.description.isOpenAccess | N | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
dc.relation.journalResearchArea | Computer Science | -
dc.relation.journalResearchArea | Robotics | -
dc.relation.journalWebOfScienceCategory | Computer Science, Theory & Methods | -
dc.relation.journalWebOfScienceCategory | Robotics | -
dc.subject.keywordAuthor | Human-Robot Interaction | -
dc.subject.keywordAuthor | Intention Recognition | -
dc.subject.keywordAuthor | Augmented Reality | -
dc.subject.keywordAuthor | Service Robotics | -
dc.identifier.url | https://ieeexplore.ieee.org/document/10473548 | -
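
The abstract above describes differentiating goal from non-goal gaze by attaching a pre-defined area of interest (AOI) to each object and checking gaze against it. The paper's own implementation (HoloLens 2, AR) is not reproduced here; the following is a minimal, hypothetical Python sketch of that general idea, assuming spherical AOIs, gaze points already expressed in the same 3D frame as the objects, and an illustrative dwell-count threshold. All names, shapes, and parameters are assumptions for illustration, not the authors' method.

```python
import math
from dataclasses import dataclass

@dataclass
class AOI:
    """Hypothetical area of interest: a sphere centred on an object."""
    name: str
    center: tuple   # (x, y, z) in metres, same frame as the gaze data
    radius: float   # metres

def classify_gaze(gaze_point, aois):
    """Label one gaze sample: return the AOI it falls inside, or None (non-goal gaze)."""
    for aoi in aois:
        if math.dist(gaze_point, aoi.center) <= aoi.radius:
            return aoi.name
    return None

def infer_goal(gaze_samples, aois, min_hits=30):
    """Naive goal inference: the AOI with the most gaze hits wins,
    provided it clears a dwell threshold (counted in samples)."""
    hits = {}
    for point in gaze_samples:
        label = classify_gaze(point, aois)
        if label is not None:
            hits[label] = hits.get(label, 0) + 1
    if not hits:
        return None
    best = max(hits, key=hits.get)
    return best if hits[best] >= min_hits else None

# Illustrative usage: two objects on a table, gaze dwelling near the first one.
aois = [AOI("cup", (0.30, 0.00, 0.80), 0.10),
        AOI("box", (-0.25, 0.00, 0.85), 0.10)]
samples = [(0.31, 0.01, 0.79)] * 40 + [(0.00, 0.50, 1.20)] * 10
print(infer_goal(samples, aois))  # -> "cup"
```

The point of the sketch is the comparison the abstract draws: with pre-defined AOIs, each gaze sample reduces to a simple containment test per object, whereas a baseline without AOIs must infer the goal object from raw gaze geometry alone.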
Appears in Collections
COLLEGE OF ENGINEERING SCIENCES > DEPARTMENT OF ROBOT ENGINEERING > 1. Journal Articles



Related Researcher

KIM, WANSOO
ERICA College of Engineering Sciences (DEPARTMENT OF ROBOT ENGINEERING)
