Detailed Information

A pre-trained BERT for Korean medical natural language processing

Full metadata record
dc.contributor.author: Kim, Yoojoong
dc.contributor.author: Kim, Jong-Ho
dc.contributor.author: Lee, Jeong Moon
dc.contributor.author: Jang, Moon Joung
dc.contributor.author: Yum, Yun Jin
dc.contributor.author: Kim, Seongtae
dc.contributor.author: Shin, Unsub
dc.contributor.author: Kim, Young-Min
dc.contributor.author: Joo, Hyung Joon
dc.contributor.author: Song, Sanghoun
dc.date.accessioned: 2022-09-19T12:14:25Z
dc.date.available: 2022-09-19T12:14:25Z
dc.date.created: 2022-09-08
dc.date.issued: 2022-08
dc.identifier.issn: 2045-2322
dc.identifier.uri: https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/171527
dc.description.abstract: With advances in deep learning and natural language processing (NLP), the analysis of medical texts is becoming increasingly important. Nonetheless, despite its importance, no research on Korean medical-specific language models had been conducted. Korean medical text is particularly difficult to analyze because of the agglutinative characteristics of the language and the complex terminology of the medical domain. To address this problem, we collected a Korean medical corpus and used it to train language models. In this paper, we present a Korean medical language model based on deep learning NLP. The model was trained for the medical context using BERT's pre-training framework, starting from a state-of-the-art Korean language model. The pre-trained model showed accuracy increases of 0.147 and 0.148 for masked language modeling with next sentence prediction. In the intrinsic evaluation, next sentence prediction accuracy improved by 0.258, a remarkable enhancement. In addition, the extrinsic evaluation on Korean medical semantic textual similarity data showed a 0.046 increase in the Pearson correlation, and the evaluation on Korean medical named entity recognition showed a 0.053 increase in the F1-score.
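
The abstract describes continued pre-training of an existing Korean BERT on a medical corpus using BERT's two original objectives, masked language modeling (MLM) and next sentence prediction (NSP). As a rough illustration only, not the authors' published code, the sketch below shows how that kind of domain-adaptive pre-training can be set up with the Hugging Face transformers library; the base checkpoint name, corpus path, and hyperparameters are all assumptions, and TextDatasetForNextSentencePrediction is deprecated in recent transformers releases but still usable for a small demonstration.

```python
# Illustrative sketch only: domain-adaptive BERT pre-training (MLM + NSP)
# in the spirit of the paper. The checkpoint name, corpus path, and
# hyperparameters below are assumptions, not the authors' actual setup.
from transformers import (
    BertForPreTraining,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    TextDatasetForNextSentencePrediction,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "snunlp/KR-BERT-char16424"  # assumed Korean BERT base checkpoint
CORPUS = "korean_medical_corpus.txt"     # placeholder: one sentence per line,
                                         # blank lines between documents

tokenizer = BertTokenizerFast.from_pretrained(BASE_MODEL)
model = BertForPreTraining.from_pretrained(BASE_MODEL)  # has MLM and NSP heads

# Builds sentence pairs with next_sentence_label (50% random negatives).
dataset = TextDatasetForNextSentencePrediction(
    tokenizer=tokenizer,
    file_path=CORPUS,
    block_size=128,
    nsp_probability=0.5,
)

# Randomly masks 15% of input tokens to create the MLM labels.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="korean-medical-bert",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
```

Downstream evaluations such as the Korean medical semantic textual similarity and named entity recognition tasks mentioned in the abstract would then fine-tune the resulting checkpoint with task-specific heads.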
dc.language: English
dc.language.iso: en
dc.publisher: NATURE PORTFOLIO
dc.title: A pre-trained BERT for Korean medical natural language processing
dc.type: Article
dc.contributor.affiliatedAuthor: Kim, Young-Min
dc.identifier.doi: 10.1038/s41598-022-17806-8
dc.identifier.scopusid: 2-s2.0-85135987936
dc.identifier.wosid: 000841397200059
dc.identifier.bibliographicCitation: SCIENTIFIC REPORTS, v.12, no.1, pp.1 - 10
dc.relation.isPartOf: SCIENTIFIC REPORTS
dc.citation.title: SCIENTIFIC REPORTS
dc.citation.volume: 12
dc.citation.number: 1
dc.citation.startPage: 1
dc.citation.endPage: 10
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Science & Technology - Other Topics
dc.relation.journalWebOfScienceCategory: Multidisciplinary Sciences
dc.subject.keywordPlus: article
dc.subject.keywordPlus: deep learning
dc.subject.keywordPlus: human
dc.subject.keywordPlus: human experiment
dc.subject.keywordPlus: natural language processing
dc.subject.keywordPlus: prediction
dc.subject.keywordPlus: language
dc.subject.keywordPlus: semantics
dc.subject.keywordPlus: South Korea
dc.identifier.url: https://www.nature.com/articles/s41598-022-17806-8
Appears in Collections
Graduate School of Technology & Innovation Management (Seoul) > Department of Technology Management (Seoul) > 1. Journal Articles

Related Researcher

Kim, Young-Min
Graduate School of Technology & Innovation Management (Department of Technology Management)
