Detailed Information

Incremental Learning with Adaptive Model Search and a Nominal Loss Model (Open Access)

Authors
Ahn, C.; Kim, Eunwoo; Oh, S.
Issue Date
Feb-2022
Publisher
Institute of Electrical and Electronics Engineers Inc.
Keywords
Adaptation models; Artificial neural network; Computational modeling; computer vision; Convolution; incremental learning; Learning systems; model selection; object recognition; Search methods; Task analysis; Training
Citation
IEEE Access, v.10, pp. 16052-16062
Pages
11
Journal Title
IEEE Access
Volume
10
Start Page
16052
End Page
16062
URI
https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/61689
DOI
10.1109/ACCESS.2022.3149598
ISSN
2169-3536
Abstract
In incremental learning, different tasks are learned sequentially without access to previously trained datasets. Catastrophic forgetting is a significant bottleneck to incremental learning, as the network performs poorly on previous tasks once it is trained on a new task. This paper provides an adaptive model search method that uses a different part of the backbone network's parameters depending on the input image to mitigate catastrophic forgetting. Our model search method can prevent forgetting by minimizing the update of parameters critical to the previous tasks while learning a new task. This model search involves a trainable model search network that selects the model structure for an input image so as to minimize the loss functions of all tasks. We also propose a method for approximating the loss function of previous tasks using only the network parameters. The parameters critical to the previous tasks can be identified according to their influence on the approximated loss function. The proposed approximation method can, in theory, reach a parameter set with a lower loss value than the parameter set learned on the previous task. The proposed framework is the first model search method that considers the performance of both current and previous tasks in the incremental learning problem. Empirical studies show that the proposed method outperforms its competitors on both old and new tasks while requiring less computation.
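
The abstract outlines two components that lend themselves to a short illustration: a trainable search network that picks, per input image, which parts of the backbone to use, and a surrogate for the previous tasks' loss built only from stored network parameters, so that parameters important to old tasks stay stable. The sketch below is a minimal, hypothetical Python/PyTorch illustration of these ideas and is not the authors' implementation: the class names (SearchNet, GatedBackbone), the soft sigmoid gates over convolutional blocks, and the importance-weighted quadratic penalty (a generic EWC-style stand-in for the paper's nominal loss model) are all assumptions made for the example.

# Hypothetical sketch (not the authors' code): per-input model search via a small
# gating network over backbone blocks, plus a quadratic surrogate for the loss of
# previous tasks built only from stored parameters and importance weights.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SearchNet(nn.Module):
    """Tiny network mapping an input image to soft block-selection weights."""
    def __init__(self, num_blocks: int):
        super().__init__()
        # Assumes 3-channel (RGB) input; pools the image to a 3-dim descriptor.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(3, num_blocks)
        )

    def forward(self, x):
        # Soft gates in [0, 1]; one value per backbone block, per image.
        return torch.sigmoid(self.head(x))


class GatedBackbone(nn.Module):
    """Backbone whose blocks are scaled by per-input gates from SearchNet."""
    def __init__(self, num_blocks: int = 4, width: int = 16, num_classes: int = 10):
        super().__init__()
        self.stem = nn.Conv2d(3, width, 3, padding=1)
        self.blocks = nn.ModuleList(
            nn.Conv2d(width, width, 3, padding=1) for _ in range(num_blocks)
        )
        self.search = SearchNet(num_blocks)
        self.classifier = nn.Linear(width, num_classes)

    def forward(self, x):
        gates = self.search(x)                  # (batch, num_blocks)
        h = F.relu(self.stem(x))
        for i, block in enumerate(self.blocks):
            g = gates[:, i].view(-1, 1, 1, 1)   # broadcast gate over the feature map
            h = h + g * F.relu(block(h))        # gate decides how much each block is used
        h = F.adaptive_avg_pool2d(h, 1).flatten(1)
        return self.classifier(h), gates


def previous_task_penalty(model, old_params, importance):
    """Quadratic surrogate of the previous tasks' loss: penalize moving parameters
    that were important for old tasks, using only stored parameters and weights."""
    penalty = 0.0
    for name, p in model.named_parameters():
        if name in old_params:
            penalty = penalty + (importance[name] * (p - old_params[name]) ** 2).sum()
    return penalty


# Usage sketch: train on a new task while keeping old-task-critical parameters stable.
model = GatedBackbone()
old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
importance = {n: torch.ones_like(p) for n, p in model.named_parameters()}  # placeholder weights
opt = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
logits, gates = model(x)
loss = F.cross_entropy(logits, y) + 0.1 * previous_task_penalty(model, old_params, importance)
opt.zero_grad()
loss.backward()
opt.step()

In this toy setup the gates scale whole blocks rather than selecting discrete sub-structures, and the importance weights are uniform placeholders; the paper instead identifies critical parameters from their influence on its approximated loss function.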
Appears in Collections
College of Software > School of Computer Science and Engineering > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Kim, Eun Woo, College of Software (School of Computer Science and Engineering)
