Detailed Information

Cited 0 times in Web of Science; cited 0 times in Scopus

Automatic Generation of Multimodal 4D Effects for Immersive Video Watching Experiences

Full metadata record
DC Field / Value
dc.contributor.author: Nam, Seoyong
dc.contributor.author: Chung, Minho
dc.contributor.author: Kim, Haerim
dc.contributor.author: Kim, Eunchae
dc.contributor.author: Kim, Taehyeon
dc.contributor.author: Yoo, Yongjae
dc.date.accessioned: 2024-12-05T06:00:24Z
dc.date.available: 2024-12-05T06:00:24Z
dc.date.issued: 2024-12
dc.identifier.uri: https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/121169
dc.description.abstract: The recent trend of watching content through over-the-top (OTT) services is pushing the 4D movie industry to seek ways to transform itself. To address this issue, this paper proposes an AI-driven algorithm that automatically generates 4D effects for a low-cost comfort chair. The system extracts multiple features using psychoacoustic analysis, saliency detection, optical flow, and LLM-based thermal effect synthesis, and automatically maps them to sensory displays such as vibration, heat, wind, and poking. To evaluate the system, a user study with 21 participants across seven film genres was conducted. The results showed that 1) the 4D effects generally improved immersion, concentration, and expressiveness, and 2) the multisensory effects were particularly useful in action and fantasy movie scenes. The proposed system could be used directly with current video-on-demand services.
dc.format.extent: 4
dc.language: English
dc.language.iso: ENG
dc.publisher: ACM
dc.title: Automatic Generation of Multimodal 4D Effects for Immersive Video Watching Experiences
dc.type: Article
dc.publisher.location: United States
dc.identifier.doi: 10.1145/3681758.3698021
dc.identifier.scopusid: 2-s2.0-85214815067
dc.identifier.wosid: 001443093400025
dc.identifier.bibliographicCitation: SA '24: SIGGRAPH Asia 2024 Technical Communications, pp 1 - 4
dc.citation.title: SA '24: SIGGRAPH Asia 2024 Technical Communications
dc.citation.startPage: 1
dc.citation.endPage: 4
dc.type.docType: Proceedings Paper
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: other
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Imaging Science & Photographic Technology
dc.relation.journalWebOfScienceCategory: Computer Science, Interdisciplinary Applications
dc.relation.journalWebOfScienceCategory: Computer Science, Software Engineering
dc.relation.journalWebOfScienceCategory: Imaging Science & Photographic Technology
dc.subject.keywordAuthor: 4D films
dc.subject.keywordAuthor: haptics
dc.subject.keywordAuthor: multimodal
dc.subject.keywordAuthor: AI-driven haptic effects generation
dc.identifier.url: https://dl.acm.org/doi/10.1145/3681758.3698021
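
This record does not include source code. As a rough illustration of the kind of feature-to-effect mapping the abstract describes, the sketch below derives per-frame wind and vibration intensities from dense optical flow. It is a minimal sketch under stated assumptions, not the authors' algorithm: the Farneback flow estimator, the gain constants, the 20 px/frame reference scale, and the video path are all illustrative choices, and the paper's other channels (psychoacoustic analysis, saliency detection, LLM-based thermal synthesis) are not covered here.

# Minimal sketch, NOT the authors' implementation: map dense optical flow,
# one of the visual features named in the abstract, to wind and vibration
# intensities in [0, 1]. Gains and the reference motion scale are assumptions.
import cv2
import numpy as np

def motion_to_effects(video_path, wind_gain=1.0, vib_gain=1.5, ref_motion=20.0):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    effects = []  # one (wind, vibration) pair per frame transition
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Farneback dense optical flow between consecutive frames
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag = np.linalg.norm(flow, axis=2)        # per-pixel motion magnitude
        mean_mag = float(mag.mean())              # global motion -> wind
        peak_mag = float(np.percentile(mag, 95))  # strong local motion -> vibration
        wind = min(1.0, wind_gain * mean_mag / ref_motion)
        vib = min(1.0, vib_gain * peak_mag / ref_motion)
        effects.append((wind, vib))
        prev_gray = gray
    cap.release()
    return effects

# Hypothetical usage: effects = motion_to_effects("movie_clip.mp4")
# The returned list could then be resampled to an actuator update rate and sent
# to the chair's wind fans and vibration motors.

In a real system such intensities would be fused with the audio- and LLM-derived channels and synchronized with playback, but that integration is specific to the paper's pipeline and is not reproduced here.
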
Appears in Collections: COLLEGE OF COMPUTING > DEPARTMENT OF ARTIFICIAL INTELLIGENCE > 1. Journal Articles


Related Researcher

Yoo, Yongjae
ERICA College of Computing (DEPARTMENT OF ARTIFICIAL INTELLIGENCE)
