Knowledge distillation for BERT unsupervised domain adaptation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ryu, Minho | - |
dc.contributor.author | Lee, Geonseok | - |
dc.contributor.author | Lee, Kichun | - |
dc.date.accessioned | 2023-09-26T07:51:45Z | - |
dc.date.available | 2023-09-26T07:51:45Z | - |
dc.date.created | 2022-09-08 | - |
dc.date.issued | 2022-11 | - |
dc.identifier.issn | 0219-1377 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/191145 | - |
dc.description.abstract | A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Because the model is trained on a large corpus covering diverse topics, it is relatively robust to domain shift, where the data distributions at training time (source data) and test time (target data) differ while remaining related. Despite its substantial improvements over previous models, BERT still suffers performance degradation under domain shift. To mitigate such problems, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach on the task of cross-domain sentiment classification over 30 domain pairs, advancing the state-of-the-art performance for unsupervised domain adaptation in text sentiment classification. (A minimal sketch of the AAD objective follows the metadata table.) | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | SPRINGER LONDON LTD | - |
dc.title | Knowledge distillation for BERT unsupervised domain adaptation | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Lee, Kichun | - |
dc.identifier.doi | 10.1007/s10115-022-01736-y | - |
dc.identifier.scopusid | 2-s2.0-85136537920 | - |
dc.identifier.wosid | 000842758700003 | - |
dc.identifier.bibliographicCitation | KNOWLEDGE AND INFORMATION SYSTEMS, v.64, no.11, pp.3113 - 3128 | - |
dc.relation.isPartOf | KNOWLEDGE AND INFORMATION SYSTEMS | - |
dc.citation.title | KNOWLEDGE AND INFORMATION SYSTEMS | - |
dc.citation.volume | 64 | - |
dc.citation.number | 11 | - |
dc.citation.startPage | 3113 | - |
dc.citation.endPage | 3128 | - |
dc.type.rims | ART | - |
dc.type.docType | Article; Early Access | - |
dc.description.journalClass | 1 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.subject.keywordPlus | Data distribution | - |
dc.subject.keywordPlus | Domain adaptation | - |
dc.subject.keywordPlus | Knowledge distillation | - |
dc.subject.keywordPlus | Language model | - |
dc.subject.keywordPlus | Language processing | - |
dc.subject.keywordPlus | Large corpora | - |
dc.subject.keywordPlus | Natural languages | - |
dc.subject.keywordPlus | Performance | - |
dc.subject.keywordPlus | Robust performance | - |
dc.subject.keywordPlus | Sentiment classification | - |
dc.subject.keywordAuthor | Language model | - |
dc.subject.keywordAuthor | Knowledge distillation | - |
dc.subject.keywordAuthor | Domain adaptation | - |
dc.identifier.url | https://link.springer.com/article/10.1007/s10115-022-01736-y | - |
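The abstract above describes AAD as ADDA-style adversarial feature alignment combined with knowledge distillation. The sketch below illustrates that combination under stated assumptions: it is not the authors' released code. The stand-in encoders (the paper adapts BERT encoders), the discriminator architecture, the distillation temperature `T`, the loss weight `LAM_KD`, and the learning rates are all illustrative choices.

```python
# Hedged sketch of the AAD objective described in the abstract: ADDA-style
# adversarial feature alignment plus knowledge distillation from the frozen
# source model. Stand-in encoders replace BERT so the snippet runs anywhere;
# hidden size, temperature T, loss weight, and learning rates are assumptions.
import torch
import torch.nn.functional as F
from torch import nn

HIDDEN, N_CLASSES = 768, 2  # BERT-base hidden size; binary sentiment

class Encoder(nn.Module):
    """Stand-in for a BERT encoder returning a [CLS]-style feature vector."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(HIDDEN, HIDDEN)

    def forward(self, x):
        return torch.tanh(self.proj(x))

src_encoder, tgt_encoder = Encoder(), Encoder()
tgt_encoder.load_state_dict(src_encoder.state_dict())   # ADDA: init target = source
classifier = nn.Linear(HIDDEN, N_CLASSES)                # fit on labeled source data
discriminator = nn.Sequential(                           # source-vs-target domain critic
    nn.Linear(HIDDEN, 256), nn.ReLU(), nn.Linear(256, 1))
for p in list(src_encoder.parameters()) + list(classifier.parameters()):
    p.requires_grad_(False)                              # frozen during adaptation

opt_disc = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
opt_tgt = torch.optim.Adam(tgt_encoder.parameters(), lr=1e-5)
T, LAM_KD = 2.0, 1.0   # distillation temperature and weight (assumed values)

def aad_step(src_x, tgt_x):
    """One adaptation step on a source batch and a target batch."""
    # 1) Train the discriminator: source features -> 1, target features -> 0.
    with torch.no_grad():
        f_src, f_tgt = src_encoder(src_x), tgt_encoder(tgt_x)
    d_logits = discriminator(torch.cat([f_src, f_tgt]))
    d_labels = torch.cat([torch.ones(len(src_x), 1), torch.zeros(len(tgt_x), 1)])
    d_loss = F.binary_cross_entropy_with_logits(d_logits, d_labels)
    opt_disc.zero_grad(); d_loss.backward(); opt_disc.step()

    # 2) Train the target encoder to fool the discriminator (inverted labels)...
    f_tgt = tgt_encoder(tgt_x)
    adv_loss = F.binary_cross_entropy_with_logits(
        discriminator(f_tgt), torch.ones(len(tgt_x), 1))
    # ...while distilling the frozen source model's soft predictions on the
    # same target batch, so source-task knowledge is not forgotten.
    with torch.no_grad():
        teacher = F.softmax(classifier(src_encoder(tgt_x)) / T, dim=-1)
    student = F.log_softmax(classifier(f_tgt) / T, dim=-1)
    kd_loss = F.kl_div(student, teacher, reduction="batchmean") * T * T
    loss = adv_loss + LAM_KD * kd_loss
    opt_tgt.zero_grad(); loss.backward(); opt_tgt.step()
    return d_loss.item(), loss.item()

# Toy usage: one step on random stand-in feature batches of size 8.
print(aad_step(torch.randn(8, HIDDEN), torch.randn(8, HIDDEN)))
```

The point the abstract emphasizes is visible in step 2: the adversarial term pulls target features toward the source feature distribution, while the distillation term anchors the adapting encoder to the frozen source model's soft predictions so task knowledge is retained during adaptation.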