Detailed Information


Soft modularized robotic arm for safe human-robot interaction based on visual and proprioceptive feedback

Full metadata record
DC Field: Value
dc.contributor.author: Ku, Subyeong
dc.contributor.author: Song, Byung-Hyun
dc.contributor.author: Park, Taejun
dc.contributor.author: Lee, Younghoon
dc.contributor.author: Park, Yong-Lae
dc.date.accessioned: 2024-08-10T04:30:20Z
dc.date.available: 2024-08-10T04:30:20Z
dc.date.issued: 2024-07
dc.identifier.issn: 0278-3649
dc.identifier.issn: 1741-3176
dc.identifier.uri: https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/92167
dc.description.abstract: This study proposes a modularized soft robotic arm with integrated sensing of human touches for physical human-robot interactions. The proposed robotic arm is constructed by connecting multiple soft manipulator modules, each of which consists of three bellow-type soft actuators, pneumatic valves, and an on-board sensing and control circuit. By employing a stereolithography three-dimensional (3D) printing technique, the bellow actuator incorporates embedded organogel channels in the thin wall of its body that are used for detecting human touches. The organogel thus serves as a soft interface for recognizing the intentions of human operators, enabling the robot to interact with them while generating the desired motions of the manipulator. In addition to the touch sensors, each manipulator module has compact, soft string sensors for detecting the displacements of the bellow actuators. When combined with an inertial measurement unit (IMU), the manipulator module can estimate its own pose or orientation internally. We also propose a localization method that estimates the location of the manipulator module and acquires the 3D information of the target point in an uncontrolled environment. The proposed method uses only a single depth camera combined with a deep learning model and is thus much simpler than conventional motion capture systems, which usually require multiple cameras in a controlled environment. Using the feedback information from the internal sensors and the camera, we implemented closed-loop control algorithms to carry out reaching and grasping tasks. The manipulator module shows structural robustness and performance reliability over 5,000 cycles of repeated actuation. Using the proposed localization method and the string sensor data, it achieves a steady-state error of 0.8 mm with a standard deviation of 0.3 mm. We demonstrate an application example of human-robot interaction that uses human touches as triggers to pick up and manipulate target objects. The proposed soft robotic arm can be easily installed in a variety of human workspaces, since it interacts safely with humans and eliminates the need for strictly controlled environments for visual perception. We believe that the proposed system has the potential to integrate soft robots into our daily lives.
dc.format.extent: 23
dc.language: English
dc.language.iso: ENG
dc.publisher: SAGE PUBLICATIONS LTD
dc.title: Soft modularized robotic arm for safe human-robot interaction based on visual and proprioceptive feedback
dc.type: Article
dc.identifier.wosid: 001145686200001
dc.identifier.doi: 10.1177/02783649241227249
dc.identifier.bibliographicCitation: INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, v.43, no.8, pp. 1128-1150
dc.description.isOpenAccess: N
dc.identifier.scopusid: 2-s2.0-85182866186
dc.citation.endPage: 1150
dc.citation.startPage: 1128
dc.citation.title: INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH
dc.citation.volume: 43
dc.citation.number: 8
dc.type.docType: Article
dc.publisher.location: United Kingdom
dc.subject.keywordAuthor: Soft robotics
dc.subject.keywordAuthor: soft actuators
dc.subject.keywordAuthor: soft sensors
dc.subject.keywordAuthor: modularized soft robotic arm
dc.subject.keywordAuthor: computer vision
dc.subject.keywordAuthor: deep learning
dc.subject.keywordPlus: CONTINUUM ROBOTS
dc.subject.keywordPlus: DESIGN
dc.subject.keywordPlus: SENSOR
dc.relation.journalResearchArea: Robotics
dc.relation.journalWebOfScienceCategory: Robotics
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
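
The abstract above mentions closed-loop reaching control that combines proprioceptive feedback (string sensors and an IMU) with a target position obtained from a single depth camera and a deep learning model. The sketch below illustrates that general idea only; it is not the authors' implementation, and the function names, the toy kinematic model, and the gain values are all assumptions introduced here for illustration.

```python
# Illustrative sketch (not from the paper): proportional closed-loop reaching
# using proprioceptive feedback (string sensors + IMU) and a visually detected
# target. All names, the kinematic model, and the gains are assumptions.
import numpy as np

KP = 0.6           # proportional gain on the Cartesian error (assumed)
DT = 0.02          # control period in seconds (assumed, 50 Hz loop)
TOLERANCE = 0.001  # stop when the tip is within 1 mm of the target

def estimate_tip_position(string_lengths, imu_orientation):
    """Placeholder forward kinematics: map the three bellow displacements and
    the module orientation (3x3 rotation matrix from the IMU) to a Cartesian
    tip position. The real mapping depends on the module geometry."""
    return imu_orientation @ np.asarray(string_lengths)

def control_step(target_xyz, string_lengths, imu_orientation):
    """One proportional control step toward the camera-detected target.
    Returns a small Cartesian correction, or None when the target is reached.
    A real controller would map this correction through an actuator model to
    pneumatic valve pressures for the bellow actuators."""
    tip = estimate_tip_position(string_lengths, imu_orientation)
    error = np.asarray(target_xyz) - tip
    if np.linalg.norm(error) < TOLERANCE:
        return None
    return KP * error * DT

# Example call with dummy values:
# delta = control_step(target_xyz=[0.30, 0.10, 0.25],
#                      string_lengths=[0.12, 0.10, 0.11],
#                      imu_orientation=np.eye(3))
```
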
Files in This Item: There are no files associated with this item.
Appears in Collections: ETC > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher: Lee, Younghoon
Engineering (Division of Mechanical, Smart, and Industrial Engineering, Mechanical Engineering Major)
