Snapbot: Enabling Dynamic Human Robot Interactions for Real-Time Computational Photography
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Choi, Chanyeok | - |
dc.contributor.author | Kim, Jeonghan | - |
dc.contributor.author | Nam, Yunjae | - |
dc.contributor.author | Lee, Youngmoon | - |
dc.date.accessioned | 2024-03-29T07:00:51Z | - |
dc.date.available | 2024-03-29T07:00:51Z | - |
dc.date.issued | 2024-03 | - |
dc.identifier.issn | 2167-2148 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/118278 | - |
dc.description.abstract | Photography remains an expert domain requiring the right focus, exposure, composition, and even post-processing. Yet robotic automation can enable precise camera manipulation, focus and exposure adjustment, camera composition, and post-processing by leveraging state-of-the-art computational photography. Existing proposals for robotic photography focus on adjusting camera angles for static portraits or on developing image evaluation metrics, and thus fall short of capturing dynamic human-robot interactions. This paper describes the design and implementation of Snapbot, a human-robot interaction system designed specifically for computational photography. Snapbot dynamically detects the face and pose to set exposure and focus, and interactively controls a robot arm for camera composition before performing image scoring and enhancement. As perception, control, and computational photography form an end-to-end pipeline, Snapbot promises a future in which image focus, exposure, composition, and generation can be jointly optimized as a unified process. We have implemented and deployed Snapbot on a UR3, demonstrating a mean image quality score 1.51× that of the Aesthetic Visual Analysis dataset. We also perform an ablation study to analyze the impact of each stage of Snapbot both visually and quantitatively. © 2024 Copyright held by the owner/author(s) | - |
dc.format.extent | 5 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE Computer Society | - |
dc.title | Snapbot: Enabling Dynamic Human Robot Interactions for Real-Time Computational Photography | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.1145/3610978.3640712 | - |
dc.identifier.scopusid | 2-s2.0-85188066801 | - |
dc.identifier.bibliographicCitation | HRI '24: Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, pp 327 - 331 | - |
dc.citation.title | HRI '24: Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction | - |
dc.citation.startPage | 327 | - |
dc.citation.endPage | 331 | - |
dc.type.docType | Conference paper | - |
dc.description.isOpenAccess | Y | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordAuthor | Computational Photography | - |
dc.subject.keywordAuthor | Human-Robot interaction | - |
dc.subject.keywordAuthor | Robotics | - |
dc.identifier.url | https://dl.acm.org/doi/10.1145/3610978.3640712 | - |
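The abstract describes an end-to-end loop in which perception (face and pose detection) drives exposure and focus, robot-arm motion adjusts composition, and a scoring stage evaluates the result. The following is a minimal, purely illustrative sketch of that control loop; the stage implementations, function names, and the toy scoring heuristic are all assumptions for illustration, not the authors' actual pipeline or code.

```python
from dataclasses import dataclass

# Hypothetical sketch of the Snapbot-style pipeline from the abstract:
# perception -> exposure/focus adjustment -> composition control -> scoring.
# Each stage is a simplified stand-in operating on a simulated frame.

@dataclass
class Frame:
    brightness: float   # mean image brightness, normalized to 0..1
    face_offset: float  # face distance from the desired composition point, 0..1

def adjust_exposure(frame: Frame, target: float = 0.5) -> Frame:
    # Nudge exposure toward a target mean brightness (proportional step).
    return Frame(
        brightness=frame.brightness + 0.5 * (target - frame.brightness),
        face_offset=frame.face_offset,
    )

def recompose(frame: Frame, gain: float = 0.5) -> Frame:
    # Simulate moving the camera so the detected face approaches
    # the desired composition point.
    return Frame(
        brightness=frame.brightness,
        face_offset=frame.face_offset * (1.0 - gain),
    )

def score(frame: Frame) -> float:
    # Toy aesthetic score: penalize exposure error and composition error.
    return 1.0 - abs(frame.brightness - 0.5) - frame.face_offset

def snap(frame: Frame, steps: int = 5) -> tuple[Frame, float]:
    # Iterate the perception/control loop, then score the final frame.
    for _ in range(steps):
        frame = recompose(adjust_exposure(frame))
    return frame, score(frame)

final, s = snap(Frame(brightness=0.2, face_offset=0.8))
```

In this sketch the loop converges because each stage contracts its own error term; the real system instead closes the loop through camera feedback and learned image-quality models.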