Fine-tuning BERT Models for Keyphrase Extraction in Scientific Articles
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 임연수 | - |
dc.contributor.author | 서덕진 | - |
dc.contributor.author | 정유철 | - |
dc.date.available | 2020-10-21T03:40:22Z | - |
dc.date.created | 2020-10-05 | - |
dc.date.issued | 2020-01 | - |
dc.identifier.issn | 2234-1072 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/kumoh/handle/2020.sw.kumoh/18134 | - |
dc.description.abstract | Despite extensive research, performance enhancement of keyphrase (KP) extraction remains a challenging problem in modern informatics. Recently, deep learning-based supervised approaches have exhibited state-of-the-art accuracies with respect to this problem, and several of the previously proposed methods utilize Bidirectional Encoder Representations from Transformers (BERT)-based language models. However, few studies have investigated the effective application of BERT-based fine-tuning techniques to the problem of KP extraction. In this paper, we consider the aforementioned problem in the context of scientific articles by investigating the fine-tuning characteristics of two distinct BERT models: BERT (the base model released by Google) and SciBERT (a BERT model pretrained on scientific text). Three different datasets (WWW, KDD, and Inspec) drawn from the computer science domain are used to compare the results obtained by fine-tuning BERT and SciBERT in terms of KP extraction. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | 한국정보기술학회 (Korean Institute of Information Technology) | - |
dc.title | Fine-tuning BERT Models for Keyphrase Extraction in Scientific Articles | - |
dc.title.alternative | Fine-tuning BERT Models for Keyphrase Extraction in Scientific Articles | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | 정유철 | - |
dc.identifier.bibliographicCitation | 한국정보기술학회 영문논문지, v.10, no.1, pp.45 - 56 | - |
dc.relation.isPartOf | 한국정보기술학회 영문논문지 | - |
dc.citation.title | 한국정보기술학회 영문논문지 | - |
dc.citation.volume | 10 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 45 | - |
dc.citation.endPage | 56 | - |
dc.type.rims | ART | - |
dc.identifier.kciid | ART002612030 | - |
dc.description.journalClass | 2 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | kci | - |
dc.description.journalRegisteredClass | other | - |
dc.subject.keywordAuthor | keyphrase extraction | - |
dc.subject.keywordAuthor | BERT | - |
dc.subject.keywordAuthor | fine-tuning | - |
dc.subject.keywordAuthor | embedding | - |
dc.subject.keywordAuthor | scientific articles | - |
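The record carries only metadata, but the abstract describes fine-tuning BERT-style encoders for keyphrase extraction. A common way to set this up, which the paper's exact pipeline may or may not follow, is to cast KP extraction as BIO token labeling before fine-tuning the encoder for token classification. The sketch below illustrates only that data-preparation step; the function name `bio_tags` and the `B-KP`/`I-KP`/`O` tag set are illustrative assumptions, not the authors' code.

```python
# Minimal sketch: converting gold keyphrases into BIO tags, the usual
# labeling scheme when fine-tuning a BERT encoder for token classification.
# Tokens starting a keyphrase get "B-KP", continuations "I-KP", the rest "O".

def bio_tags(tokens, keyphrases):
    """Assign B-KP/I-KP/O tags to `tokens` for each (possibly multi-word) keyphrase."""
    tags = ["O"] * len(tokens)
    lowered = [t.lower() for t in tokens]
    for kp in keyphrases:
        kp_tokens = kp.lower().split()
        n = len(kp_tokens)
        # Scan for every occurrence of the keyphrase's token sequence.
        for i in range(len(tokens) - n + 1):
            if lowered[i:i + n] == kp_tokens:
                tags[i] = "B-KP"
                for j in range(i + 1, i + n):
                    tags[j] = "I-KP"
    return tags

tokens = "we fine-tune BERT models for keyphrase extraction".split()
print(bio_tags(tokens, ["keyphrase extraction", "BERT"]))
# → ['O', 'O', 'B-KP', 'O', 'O', 'B-KP', 'I-KP']
```

The resulting tag sequence would then be aligned with the encoder's subword tokenization and used as labels for a token-classification head during fine-tuning.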