Detailed Information


Domain Generalization by Mutual-Information Regularization with Pre-trained Models

Authors
Cha, Junbum; Lee, Kyungjae; Park, Sungrae; Chun, Sanghyuk
Issue Date
2022
Publisher
SPRINGER INTERNATIONAL PUBLISHING AG
Citation
COMPUTER VISION, ECCV 2022, PT XXIII, v.13683, pp 440 - 457
Pages
18
Journal Title
COMPUTER VISION, ECCV 2022, PT XXIII
Volume
13683
Start Page
440
End Page
457
URI
https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/69559
DOI
10.1007/978-3-031-20050-2_26
ISSN
0302-9743
1611-3349
Abstract
Domain generalization (DG) aims to learn a model that generalizes to an unseen target domain using only limited source domains. Previous attempts at DG fail to learn domain-invariant representations from the source domains alone due to the significant domain shifts between training and test domains. Instead, we re-formulate the DG objective using mutual information with the oracle model, a model generalized to any possible domain. We derive a tractable variational lower bound by approximating the oracle model with a pre-trained model, called Mutual Information Regularization with Oracle (MIRO). Our extensive experiments show that MIRO significantly improves out-of-distribution performance. Furthermore, our scaling experiments show that the larger the scale of the pre-trained model, the greater the performance improvement of MIRO. Code is available at https://github.com/kakaobrain/miro.
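The abstract's variational lower bound reduces, in practice, to a feature-matching penalty between the model being trained and the frozen pre-trained "oracle proxy." The following is a minimal stdlib-Python sketch of such a penalty, assuming a diagonal Gaussian variational distribution with a learnable per-dimension log-variance; the function name `miro_penalty` and the flattened list-of-lists feature format are illustrative assumptions, not the authors' implementation (the real MIRO applies this per intermediate layer with learnable mean/variance encoders — see the linked repository).

```python
import math

def miro_penalty(z, z0, log_var):
    """Simplified MIRO-style mutual-information regularizer (sketch).

    z       : features from the model being trained, list of rows (batch x dim)
    z0      : features from the frozen pre-trained model, same shape
    log_var : learnable per-dimension log-variance of the diagonal Gaussian
              variational distribution, length dim

    Returns the penalty added to the task loss, i.e. the negated variational
    lower bound up to constants:
        0.5 * mean[ log(sigma^2) + (z0 - z)^2 / sigma^2 ]
    """
    total, count = 0.0, 0
    for row, row0 in zip(z, z0):
        for x, x0, lv in zip(row, row0, log_var):
            total += 0.5 * (lv + (x0 - x) ** 2 / math.exp(lv))
            count += 1
    return total / count

# Toy usage: identical features with unit variance (log_var = 0) incur
# zero penalty; any feature drift away from the pre-trained model is penalized.
z  = [[1.0] * 8 for _ in range(4)]
z0 = [[1.0] * 8 for _ in range(4)]
print(miro_penalty(z, z0, [0.0] * 8))  # → 0.0
```

Minimizing this term keeps the fine-tuned features close to the pre-trained ones while the learnable variance decides, per dimension, how strictly to enforce that closeness — which matches the abstract's claim that larger (better) pre-trained models yield larger gains.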
Appears in Collections
College of Software > Department of Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Kyungjae
College of Software (Department of Artificial Intelligence)
