Utilizing context-relevant keywords extracted from a large collection of user-generated documents for music discovery
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hyung, Ziwon | - |
dc.contributor.author | Park, Joon-Sang | - |
dc.contributor.author | Lee, Kyogu | - |
dc.date.available | 2020-07-10T04:55:09Z | - |
dc.date.created | 2020-07-06 | - |
dc.date.issued | 2017-09 | - |
dc.identifier.issn | 0306-4573 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/hongik/handle/2020.sw.hongik/5336 | - |
dc.description.abstract | The contextual background of a user is one of the important criteria when deciding what music to listen to. In this paper, we propose a novel method to embed the user context for music search and retrieval. The proposed system extracts keywords from a large collection of documents written by users. Each of these documents contains a personal story about the writer's situation and/or mood, followed by a song request. We consider that there is a strong correlation between the story and the song. Therefore, by extracting keywords from these documents, it is possible to develop a list of terms that can generally be used to describe the user context when requesting a song, which may then be employed to represent a music item in a richer manner. Once each song is represented using the proposed context-relevant music descriptors, we perform Latent Dirichlet Allocation to retrieve similar music based on context similarity. By conducting a series of experiments, we identified a correlation between the proposed music descriptors and conventional approaches, such as acoustic features or lyrics. The identified correlation can be used to auto-tag songs with no document association. We also qualitatively evaluated our system by comparing the performance of our proposed music descriptors with other conventional features for music retrieval. The results showed that the performance of the proposed music descriptors was competitive with conventional features, thereby suggesting their potential use for describing music in semantic music search/retrieval. (C) 2017 Elsevier Ltd. All rights reserved. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | ELSEVIER SCI LTD | - |
dc.subject | RECOMMENDATION | - |
dc.subject | EMOTION | - |
dc.title | Utilizing context-relevant keywords extracted from a large collection of user-generated documents for music discovery | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Park, Joon-Sang | - |
dc.identifier.doi | 10.1016/j.ipm.2017.04.006 | - |
dc.identifier.scopusid | 2-s2.0-85019679107 | - |
dc.identifier.wosid | 000407402200011 | - |
dc.identifier.bibliographicCitation | INFORMATION PROCESSING & MANAGEMENT, v.53, no.5, pp.1185 - 1200 | - |
dc.relation.isPartOf | INFORMATION PROCESSING & MANAGEMENT | - |
dc.citation.title | INFORMATION PROCESSING & MANAGEMENT | - |
dc.citation.volume | 53 | - |
dc.citation.number | 5 | - |
dc.citation.startPage | 1185 | - |
dc.citation.endPage | 1200 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | ssci | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Information Science & Library Science | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Information Science & Library Science | - |
dc.subject.keywordPlus | RECOMMENDATION | - |
dc.subject.keywordPlus | EMOTION | - |
dc.subject.keywordAuthor | Context-relevant keywords | - |
dc.subject.keywordAuthor | Song-document association | - |
dc.subject.keywordAuthor | Keyword extraction | - |
dc.subject.keywordAuthor | Music descriptors | - |
dc.subject.keywordAuthor | Music retrieval | - |
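The abstract describes a pipeline in which songs are represented by context-relevant keywords drawn from user-written stories, Latent Dirichlet Allocation infers latent context topics, and retrieval ranks songs by topic similarity. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the song-to-keyword documents are invented stand-ins, and scikit-learn's LDA is used in place of whatever model configuration the paper employs.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical song -> context-keyword documents (stand-ins for keywords
# extracted from user-generated stories accompanying song requests).
song_docs = {
    "song_a": "rainy night lonely breakup heartbreak tears",
    "song_b": "breakup sad lonely rain missing heartbreak",
    "song_c": "workout gym energy running morning sunshine",
    "song_d": "party friends dancing weekend night energy",
}

titles = list(song_docs)
counts = CountVectorizer().fit_transform(song_docs.values())

# Fit LDA and map every song to a distribution over latent context topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topics = lda.fit_transform(counts)  # shape: (n_songs, n_topics), rows sum to 1

def most_similar(query: str) -> str:
    """Return the other song whose topic distribution is closest (cosine)."""
    q = topics[titles.index(query)]
    sims = topics @ q / (np.linalg.norm(topics, axis=1) * np.linalg.norm(q))
    sims[titles.index(query)] = -1.0  # exclude the query itself
    return titles[int(np.argmax(sims))]

print(most_similar("song_a"))
```

In the paper's setting, the keyword vocabulary would come from the large collection of song-request documents, and songs without any associated documents could be auto-tagged via the correlation with acoustic or lyric features reported in the abstract.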