A Korean emotion-factor dataset for extracting emotion and factors in Korean conversations
- Authors
- Yoo, SoYeop; Lee, HaYoung; Song, Jein; Jeong, OkRan
- Issue Date
- Oct-2023
- Publisher
- NATURE PORTFOLIO
- Citation
- SCIENTIFIC REPORTS, v.13, no.1
- Journal Title
- SCIENTIFIC REPORTS
- Volume
- 13
- Number
- 1
- URI
- https://scholarworks.bwise.kr/gachon/handle/2020.sw.gachon/89653
- DOI
- 10.1038/s41598-023-45386-8
- ISSN
- 2045-2322
- Abstract
- Humans express their emotions in various ways, such as through facial expressions and voice. In particular, emotions are expressed directly or implied indirectly in the text of utterances. Conversational artificial intelligence research seeks to identify the emotions contained in human utterances and to generate emotion-aware utterances. Although recognizing the factors behind expressed emotions is essential for generating emotion-based utterances, most existing datasets provide only emotion-class labels for texts and utterances. Moreover, Korean datasets offer a narrow range of emotion classes and are biased toward negative emotions. In this paper, we propose KEmoFact, a Korean emotion-factor dataset for extracting emotions and their factors from Korean conversations. We define two tasks on KEmoFact, EFE (Emotion Factor Extraction) and EFPE (Emotion-Factor Pair Extraction), and propose baseline models for both. By providing the KEmoFact dataset and baseline models for the two tasks, we contribute to the study of conversational artificial intelligence in Korean, a low-resource language.
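To illustrate the two tasks the abstract defines, the sketch below shows a minimal, hypothetical example of what an emotion-factor annotation and the EFE/EFPE outputs might look like. The field names, span format, and label set are assumptions for illustration only, not the actual KEmoFact schema.

```python
# Hypothetical example record (field names and span encoding are assumptions,
# not the published KEmoFact schema).
utterance = {
    "text": "시험에 떨어져서 너무 속상해요",  # "I failed the exam, so I'm very upset"
    "emotion": "sadness",                    # emotion label for the utterance
    "factor_span": (0, 8),                   # character span of the factor text
}

def extract_factor(record):
    """EFE: return the factor text for an utterance (gold-span lookup here;
    a real model would predict the span)."""
    start, end = record["factor_span"]
    return record["text"][start:end]

def extract_pair(record):
    """EFPE: return the (emotion, factor) pair jointly."""
    return (record["emotion"], extract_factor(record))

print(extract_pair(utterance))  # -> ('sadness', '시험에 떨어져서')
```

EFE recovers only the factor span ("failing the exam"), while EFPE additionally links it to the emotion it caused, which is the harder joint task.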
- Appears in Collections
- ETC > 1. Journal Articles