Detailed Information

Knowledge distillation for BERT unsupervised domain adaptation

Full metadata record
dc.contributor.author: Ryu, Minho
dc.contributor.author: Lee, Geonseok
dc.contributor.author: Lee, Kichun
dc.date.accessioned: 2023-09-26T07:51:45Z
dc.date.available: 2023-09-26T07:51:45Z
dc.date.created: 2022-09-08
dc.date.issued: 2022-11
dc.identifier.issn: 0219-1377
dc.identifier.uri: https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/191145
dc.description.abstract: A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Because the model is trained on a large corpus covering diverse topics, it shows robust performance on domain shift problems, in which the data distributions at training time (source data) and test time (target data) differ while sharing similarities. Despite its large gains over previous models, however, it still suffers performance degradation under domain shift. To mitigate this problem, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach on cross-domain sentiment classification across 30 domain pairs, advancing the state of the art for unsupervised domain adaptation in text sentiment classification. (An illustrative sketch of the combined objective appears after this record.)
dc.language: English
dc.language.iso: en
dc.publisher: SPRINGER LONDON LTD
dc.title: Knowledge distillation for BERT unsupervised domain adaptation
dc.type: Article
dc.contributor.affiliatedAuthor: Lee, Kichun
dc.identifier.doi: 10.1007/s10115-022-01736-y
dc.identifier.scopusid: 2-s2.0-85136537920
dc.identifier.wosid: 000842758700003
dc.identifier.bibliographicCitation: KNOWLEDGE AND INFORMATION SYSTEMS, v.64, no.11, pp.3113-3128
dc.relation.isPartOf: KNOWLEDGE AND INFORMATION SYSTEMS
dc.citation.title: KNOWLEDGE AND INFORMATION SYSTEMS
dc.citation.volume: 64
dc.citation.number: 11
dc.citation.startPage: 3113
dc.citation.endPage: 3128
dc.type.rims: ART
dc.type.docType: Article; Early Access
dc.description.journalClass: 1
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Computer Science, Information Systems
dc.subject.keywordPlus: Data distribution
dc.subject.keywordPlus: Domain adaptation
dc.subject.keywordPlus: Knowledge distillation
dc.subject.keywordPlus: Language model
dc.subject.keywordPlus: Language processing
dc.subject.keywordPlus: Large corpora
dc.subject.keywordPlus: Natural languages
dc.subject.keywordPlus: Performance
dc.subject.keywordPlus: Robust performance
dc.subject.keywordPlus: Sentiment classification
dc.subject.keywordAuthor: Language model
dc.subject.keywordAuthor: Knowledge distillation
dc.subject.keywordAuthor: Domain adaptation
dc.identifier.url: https://link.springer.com/article/10.1007/s10115-022-01736-y
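
The abstract describes AAD as the ADDA framework combined with knowledge distillation. Below is a minimal PyTorch sketch of how such a combined objective can be set up; it is not the authors' released code. The encoder stubs (standing in for BERT), the distillation temperature, the unweighted sum of the two losses, and all hyperparameters are assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sketch of adversarial adaptation with distillation (AAD).
# The source encoder and classifier are assumed to be already fine-tuned
# on labeled source data and kept frozen; only the target encoder and the
# domain discriminator are updated, as in ADDA. Small linear stubs stand
# in for BERT encoders; inputs stand in for pooled [CLS] features.

HIDDEN = 768  # BERT-base hidden size

source_encoder = nn.Sequential(nn.Linear(HIDDEN, HIDDEN), nn.Tanh())
target_encoder = nn.Sequential(nn.Linear(HIDDEN, HIDDEN), nn.Tanh())
target_encoder.load_state_dict(source_encoder.state_dict())  # ADDA init
classifier = nn.Linear(HIDDEN, 2)  # sentiment head, frozen with the source encoder
discriminator = nn.Sequential(     # predicts source (0) vs. target (1) features
    nn.Linear(HIDDEN, 256), nn.ReLU(), nn.Linear(256, 2))

for p in source_encoder.parameters():
    p.requires_grad = False
for p in classifier.parameters():
    p.requires_grad = False

opt_g = torch.optim.Adam(target_encoder.parameters(), lr=1e-5)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-5)
T = 2.0  # distillation temperature (assumed value)

def train_step(src_feats, tgt_feats):
    """One AAD update on a batch of pooled source/target features."""
    # Discriminator step: learn to tell source from target encodings.
    with torch.no_grad():
        f_src = source_encoder(src_feats)
        f_tgt = target_encoder(tgt_feats)
    d_logits = discriminator(torch.cat([f_src, f_tgt]))
    d_labels = torch.cat([torch.zeros(len(f_src)),
                          torch.ones(len(f_tgt))]).long()
    d_loss = F.cross_entropy(d_logits, d_labels)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: train the target encoder to fool the discriminator
    # (inverted labels, as in ADDA) ...
    f_tgt = target_encoder(tgt_feats)
    fool = F.cross_entropy(discriminator(f_tgt),
                           torch.zeros(len(f_tgt)).long())

    # ... plus knowledge distillation: keep the target model's predictions
    # close to the frozen source model's soft labels on the same batch.
    with torch.no_grad():
        teacher = classifier(source_encoder(tgt_feats)) / T
    student = classifier(target_encoder(tgt_feats)) / T
    kd = F.kl_div(F.log_softmax(student, dim=-1),
                  F.softmax(teacher, dim=-1),
                  reduction="batchmean") * T * T

    g_loss = fool + kd  # equal weighting assumed for illustration
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

# Example usage with random features standing in for BERT [CLS] outputs:
src = torch.randn(8, HIDDEN)
tgt = torch.randn(8, HIDDEN)
print(train_step(src, tgt))
```

The distillation term acts as a regularizer: without it, adversarial adaptation alone can align feature distributions while drifting away from the source classifier's decision boundary, so soft labels from the frozen source model help preserve task knowledge during adaptation.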
Appears in Collections
College of Engineering (Seoul) > Department of Industrial Engineering (Seoul) > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Kichun
COLLEGE OF ENGINEERING (DEPARTMENT OF INDUSTRIAL ENGINEERING)
