Detailed Information


Knowledge distillation for BERT unsupervised domain adaptation

Authors
Ryu, Minho; Lee, Geonseok; Lee, Kichun
Issue Date
Nov-2022
Publisher
SPRINGER LONDON LTD
Keywords
Language model; Knowledge distillation; Domain adaptation
Citation
KNOWLEDGE AND INFORMATION SYSTEMS, v.64, no.11, pp.3113 - 3128
Indexed
SCIE
SCOPUS
Journal Title
KNOWLEDGE AND INFORMATION SYSTEMS
Volume
64
Number
11
Start Page
3113
End Page
3128
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/191145
DOI
10.1007/s10115-022-01736-y
ISSN
0219-1377
Abstract
A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Because the model is trained on a large corpus covering diverse topics, it shows robust performance on domain-shift problems, in which the data distributions at training (source data) and testing (target data) differ while sharing similarities. Despite its large improvements over previous models, it still suffers performance degradation under domain shift. To mitigate such problems, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach on the task of cross-domain sentiment classification over 30 domain pairs, advancing the state-of-the-art performance for unsupervised domain adaptation in text sentiment classification.
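The AAD recipe the abstract describes, ADDA-style adversarial alignment of a target encoder combined with knowledge distillation from the frozen source model, can be illustrated with a short PyTorch sketch. The sketch below uses small MLP stand-ins for the BERT encoder, dummy mini-batches, a hypothetical temperature of 2.0, and a single update step per objective; these details are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of adversarial adaptation with distillation (AAD), assuming an
# ADDA-style setup: the source encoder and classifier are frozen, the target encoder
# is trained so a domain discriminator cannot separate its features from source
# features, and a distillation loss keeps the target model's predictions close to the
# source model's temperature-softened outputs. Sizes and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
hidden, n_classes, temperature = 768, 2, 2.0

# Stand-ins for a pre-trained BERT encoder and task classifier (hypothetical).
src_encoder = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
tgt_encoder = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
tgt_encoder.load_state_dict(src_encoder.state_dict())  # ADDA: init target from source
classifier = nn.Linear(hidden, n_classes)
discriminator = nn.Sequential(nn.Linear(hidden, 256), nn.ReLU(), nn.Linear(256, 2))

for p in src_encoder.parameters():
    p.requires_grad_(False)
for p in classifier.parameters():
    p.requires_grad_(False)

opt_tgt = torch.optim.Adam(tgt_encoder.parameters(), lr=1e-5)
opt_disc = torch.optim.Adam(discriminator.parameters(), lr=1e-5)

# Dummy batches standing in for encoded source- and target-domain sentences.
src_x, tgt_x = torch.randn(8, hidden), torch.randn(8, hidden)

# 1) Train the discriminator to tell source features (label 0) from target features (label 1).
feat_src, feat_tgt = src_encoder(src_x), tgt_encoder(tgt_x).detach()
d_logits = discriminator(torch.cat([feat_src, feat_tgt]))
d_labels = torch.cat([torch.zeros(8, dtype=torch.long), torch.ones(8, dtype=torch.long)])
loss_disc = F.cross_entropy(d_logits, d_labels)
opt_disc.zero_grad(); loss_disc.backward(); opt_disc.step()

# 2) Train the target encoder to fool the discriminator (inverted labels, ADDA-style) ...
feat_tgt = tgt_encoder(tgt_x)
loss_adv = F.cross_entropy(discriminator(feat_tgt), torch.zeros(8, dtype=torch.long))

# ... plus a distillation loss matching the target model's predictions on target data
# to the frozen source model's soft labels at the chosen temperature.
with torch.no_grad():
    teacher_logits = classifier(src_encoder(tgt_x))
student_logits = classifier(feat_tgt)
loss_kd = F.kl_div(
    F.log_softmax(student_logits / temperature, dim=-1),
    F.softmax(teacher_logits / temperature, dim=-1),
    reduction="batchmean",
) * temperature ** 2

loss_tgt = loss_adv + loss_kd
opt_tgt.zero_grad(); loss_tgt.backward(); opt_tgt.step()
```

In practice the two steps alternate over mini-batches, with the MLP stand-ins replaced by the pre-trained BERT encoder and task classifier fine-tuned on the source domain.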
Files in This Item
Go to Link
Appears in
Collections
College of Engineering (Seoul) > Department of Industrial Engineering (Seoul) > 1. Journal Articles



Related Researcher

Lee, Kichun
COLLEGE OF ENGINEERING (DEPARTMENT OF INDUSTRIAL ENGINEERING)
