Real-time grasp planning based on motion field graph for human-robot cooperation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hwang, Jaepyung | - |
dc.contributor.author | Yang, Myungsik | - |
dc.contributor.author | Suh, Il Hong | - |
dc.contributor.author | Kwon, Taesoo | - |
dc.date.accessioned | 2022-07-14T23:55:54Z | - |
dc.date.available | 2022-07-14T23:55:54Z | - |
dc.date.created | 2021-05-13 | - |
dc.date.issued | 2016-12 | - |
dc.identifier.issn | 2153-0858 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/153404 | - |
dc.description.abstract | We present a real-time framework for planning natural and smooth grasping motions in an online manner based on interaction with a human. The proposed framework can change its grasp strategy agilely according to that interaction. Given human demonstrations, we construct a motion field graph whose nodes contain reference finger poses and their time derivatives, and whose edges indicate the similarity between pairs of nodes. Based on this graph, a new grasping motion can be planned that adapts to changes in the environment and the interaction. The motion field guarantees smooth motions by integrating the velocities obtained from the demonstrations. To validate the framework, we build a demo system in which a human can hand a cup over to a tele-operated robot or a virtual humanoid avatar, each controlled by another person at a remote location. The virtual avatar can also grasp and manipulate a cola can. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | - |
dc.title | Real-time grasp planning based on motion field graph for human-robot cooperation | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Kwon, Taesoo | - |
dc.identifier.doi | 10.1109/IROS.2016.7759175 | - |
dc.identifier.scopusid | 2-s2.0-85006367890 | - |
dc.identifier.bibliographicCitation | IEEE International Conference on Intelligent Robots and Systems, v.2016-November, pp.1025 - 1032 | - |
dc.relation.isPartOf | IEEE International Conference on Intelligent Robots and Systems | - |
dc.citation.title | IEEE International Conference on Intelligent Robots and Systems | - |
dc.citation.volume | 2016-November | - |
dc.citation.startPage | 1025 | - |
dc.citation.endPage | 1032 | - |
dc.type.rims | ART | - |
dc.type.docType | Conference Paper | - |
dc.description.journalClass | 1 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordPlus | Intelligent robots | - |
dc.subject.keywordPlus | Robots | - |
dc.subject.keywordPlus | Virtual reality | - |
dc.subject.keywordPlus | Grasp planning | - |
dc.subject.keywordPlus | Human demonstrations | - |
dc.subject.keywordPlus | Human-robot cooperation | - |
dc.subject.keywordPlus | Remote location | - |
dc.subject.keywordPlus | Teleoperated robots | - |
dc.subject.keywordPlus | Time derivative | - |
dc.subject.keywordPlus | Virtual avatar | - |
dc.subject.keywordPlus | Virtual humanoids | - |
dc.subject.keywordPlus | Robot programming | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/7759175 | - |
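The abstract describes a motion field graph whose nodes hold reference finger poses with their time derivatives, and which produces smooth motion by integrating velocities drawn from demonstrations. The following is a minimal illustrative sketch of that idea, not the paper's actual method: the class name, nearest-neighbor velocity blending, and Euler integration step are all assumptions made for illustration.

```python
import numpy as np

class MotionFieldGraph:
    """Toy motion field: nodes store reference poses and their
    time derivatives; a query pose is advanced by integrating a
    velocity blended from nearby nodes (illustrative only)."""

    def __init__(self, poses, velocities):
        self.poses = np.asarray(poses, dtype=float)  # (N, D) reference poses
        self.vels = np.asarray(velocities, dtype=float)  # (N, D) time derivatives

    def step(self, pose, dt=0.01, k=3):
        # Blend the velocities of the k nearest reference poses,
        # weighted inversely by distance, then Euler-integrate.
        d = np.linalg.norm(self.poses - pose, axis=1)
        idx = np.argsort(d)[:k]
        w = 1.0 / (d[idx] + 1e-8)
        w /= w.sum()
        v = (w[:, None] * self.vels[idx]).sum(axis=0)
        return pose + dt * v

# 1-D toy example: demonstrations that move toward pose 1.0 and stop there.
graph = MotionFieldGraph(poses=[[0.0], [0.5], [1.0]],
                         velocities=[[1.0], [1.0], [0.0]])
p = np.array([0.2])
for _ in range(100):
    p = graph.step(p, dt=0.05)
# p has been driven smoothly toward the demonstrated target near 1.0
```

Because the integrated velocity comes from the demonstrations themselves, the query pose follows a trajectory consistent with the demonstrated motion rather than jumping directly to a target pose, which is the smoothness property the abstract highlights.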