Detailed Information


Meta-Learning with Adaptive Hyperparameters

Authors
Baik, Sungyong; Choi, Myungsub; Choi, Janghoon; Kim, Heewon; Lee, Kyoung Mu
Issue Date
Dec-2020
Publisher
Curran Associates
Citation
Conference on Neural Information Processing Systems, pp. 20755-20765
Indexed
OTHER
Journal Title
Conference on Neural Information Processing Systems
Start Page
20755
End Page
20765
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/191686
Abstract
Despite the popularity of MAML, several recent works question its effectiveness when test tasks differ from training tasks, and thus propose various task-conditioned methodologies to improve the initialization. Instead of searching for a better task-aware initialization, we focus on a complementary factor in the MAML framework: inner-loop optimization (or fast adaptation). Consequently, we propose a new weight update rule that greatly enhances the fast-adaptation process. Specifically, we introduce a small meta-network that adaptively generates per-step hyperparameters: learning-rate and weight-decay coefficients. The experimental results validate that Adaptive Learning of hyperparameters for Fast Adaptation (ALFA) is an equally important ingredient that has often been neglected in recent few-shot learning approaches. Surprisingly, fast adaptation from a random initialization with ALFA can already outperform MAML.
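To make the described update rule concrete, below is a minimal PyTorch sketch of an ALFA-style inner-loop step, assuming the update theta' = beta * theta - alpha * grad, where per-step, per-layer learning rates (alpha) and weight-decay terms (beta) come from a small meta-network conditioned on the current gradients and weights. The class name, conditioning features, and network sizing are illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn


class HyperparameterGenerator(nn.Module):
    """Tiny meta-network mapping layer-wise training state to per-step
    hyperparameters: one learning rate (alpha) and one weight-decay
    coefficient (beta) per layer. Sketch only; sizing is an assumption."""

    def __init__(self, num_layers: int, hidden_dim: int = 32):
        super().__init__()
        # Input: mean gradient and mean weight of each layer (2 * num_layers).
        # Output: one alpha and one beta per layer (2 * num_layers).
        self.net = nn.Sequential(
            nn.Linear(2 * num_layers, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 2 * num_layers),
        )

    def forward(self, grads, weights):
        # Summarize the current optimization state layer-wise.
        state = torch.cat([
            torch.stack([g.mean() for g in grads]),
            torch.stack([w.mean() for w in weights]),
        ])
        out = self.net(state)
        n = len(grads)
        return out[:n], out[n:]  # alpha per layer, beta per layer


def inner_loop_step(weights, grads, generator):
    """One fast-adaptation step: theta' = beta * theta - alpha * grad."""
    alpha, beta = generator(grads, weights)
    return [b * w - a * g for w, g, a, b in zip(weights, grads, alpha, beta)]


# Usage sketch: adapt a copied parameter list on support-set data.
# params = [p.clone() for p in model.parameters()]
# grads = torch.autograd.grad(support_loss, params, create_graph=True)
# params = inner_loop_step(params, grads, generator)
```

Because alpha and beta are regenerated from the gradients and weights at every step, the adaptation dynamics change per step and per task, which is what distinguishes this scheme from a fixed inner-loop learning rate as in plain MAML.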
Appears in Collections
ETC > 1. Journal Articles



Related Researcher

Baik, Sungyong
COLLEGE OF ENGINEERING (DEPARTMENT OF INTELLIGENCE COMPUTING)
