Integrating character networks for extracting narratives from multimodal data
- Authors
- Lee, O.-J.; Jung, Jason J.
- Issue Date
- Sep-2019
- Publisher
- Elsevier Ltd
- Keywords
- Character network; Computational narrative; Story analytics; Story synchronization
- Citation
- Information Processing and Management, v.56, no.5, pp. 1894-1923
- Pages
- 30
- Journal Title
- Information Processing and Management
- Volume
- 56
- Number
- 5
- Start Page
- 1894
- End Page
- 1923
- URI
- https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/18488
- DOI
- 10.1016/j.ipm.2019.02.005
- ISSN
- 0306-4573
- 1873-5371
- Abstract
- This study aims to integrate the diverse data within narrative multimedia (i.e., artworks that contain stories and are distributed through multimedia) into a unified character network (i.e., a social network of the characters that appear in the story). By combining multiple data sources (e.g., text, video, and audio), we attempt to improve the accuracy and semantic richness of existing character networks, which confine themselves to a single data source. To merge the various data, we propose story synchronization for (i) improving the accuracy of the data extracted from narrative multimedia and (ii) integrating those data into the unified character network. Story synchronization consists of three main steps: synchronizing (i) scenes, (ii) characters, and (iii) character networks. First, we synchronize the dialogues in the text and audio to identify the speakers and the timing of dialogues. This enables us to segment scenes using the time periods in which dialogues (in the text and audio) and characters (in the video) do not co-occur. Through this scene segmentation, we can discretize the story of the narrative work. By comparing the occurrences of dialogues and characters in each scene, we synchronize the identities of the characters across the text and video (e.g., characters' names and faces). We can thereby more accurately estimate the participants and timing of a conversation between characters (i.e., a set of connected dialogues). Based on these conversations, the existing character networks are refined and integrated into the unified character network. Finally, we verified the efficacy of the proposed methods on real-world movies, which are among the most accessible and popular forms of narrative multimedia. © 2019 Elsevier Ltd
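The abstract describes refining character networks from scene-level conversations. As a minimal illustrative sketch only (the scene data, character names, and weighting scheme below are hypothetical assumptions, not taken from the paper), a weighted character network could be assembled by counting how often pairs of characters share a scene's conversations:

```python
from collections import Counter
from itertools import combinations

# Hypothetical scene-level data: each scene lists the characters who
# participate in its conversations (names are illustrative only).
scenes = [
    {"Alice", "Bob"},
    {"Alice", "Bob", "Carol"},
    {"Bob", "Carol"},
    {"Alice", "Carol"},
]

def build_character_network(scenes):
    """Return a weighted edge map: (char_a, char_b) -> co-occurrence count."""
    edges = Counter()
    for participants in scenes:
        # Each pair of characters conversing in the same scene
        # strengthens the tie between them by one.
        for a, b in combinations(sorted(participants), 2):
            edges[(a, b)] += 1
    return edges

network = build_character_network(scenes)
# e.g. Alice and Bob share two scenes, so their edge weight is 2
```

In the paper's setting, the scene sets would come from the proposed scene segmentation and character-identity synchronization rather than being given directly.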
- Appears in Collections
- College of Software > School of Computer Science and Engineering > 1. Journal Articles