Detailed Information

Cited 0 times in Web of Science · Cited 0 times in Scopus

Growing a Brain with Sparsity-Inducing Generation for Continual Learning

Full metadata record
DC Field: Value
dc.contributor.author: Jin, Hyundong
dc.contributor.author: Kim, Gyeong-Hyeon
dc.contributor.author: Ahn, Chanho
dc.contributor.author: Kim, Eunwoo
dc.date.accessioned: 2024-03-20T00:30:24Z
dc.date.available: 2024-03-20T00:30:24Z
dc.date.issued: 2023
dc.identifier.issn: 1550-5499
dc.identifier.uri: https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/72940
dc.description.abstract: Deep neural networks suffer from catastrophic forgetting in continual learning, where they tend to lose information about previously learned tasks when optimizing a new incoming task. Recent strategies isolate the important parameters for previous tasks to retain old knowledge while learning the new task. However, using the fixed old knowledge might act as an obstacle to capturing novel representations. To overcome this limitation, we propose a framework that evolves the previously allocated parameters by absorbing the knowledge of the new task. The approach performs under two different networks. The base network learns knowledge of sequential tasks, and the sparsity-inducing hyper-network generates parameters for each time step for evolving old knowledge. The generated parameters transform old parameters of the base network to reflect the new knowledge. We design the hypernetwork to generate sparse parameters conditional to the task-specific information and the structural information of the base network. We evaluate the proposed approach on class-incremental and task-incremental learning scenarios for image classification and video action recognition tasks. Experimental results show that the proposed method consistently outperforms a large variety of continual learning approaches for those scenarios by evolving old knowledge. © 2023 IEEE.
dc.format.extent: 10
dc.language: English
dc.language.iso: ENG
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.title: Growing a Brain with Sparsity-Inducing Generation for Continual Learning
dc.type: Article
dc.identifier.doi: 10.1109/ICCV51070.2023.01738
dc.identifier.bibliographicCitation: Proceedings of the IEEE International Conference on Computer Vision, v.2023 IEEE, pp. 18915 - 18924
dc.description.isOpenAccess: N
dc.identifier.wosid: 001169500503050
dc.identifier.scopusid: 2-s2.0-85185866915
dc.citation.endPage: 18924
dc.citation.startPage: 18915
dc.citation.title: Proceedings of the IEEE International Conference on Computer Vision
dc.citation.volume: 2023 IEEE
dc.type.docType: Proceedings Paper
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Imaging Science & Photographic Technology
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Computer Science, Theory & Methods
dc.relation.journalWebOfScienceCategory: Imaging Science & Photographic Technology
dc.description.journalRegisteredClass: scopus
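
The abstract above describes a two-network design: a base network that learns sequential tasks, and a sparsity-inducing hypernetwork that generates task-conditioned sparse parameters used to transform ("evolve") the base network's old parameters. The sketch below is a minimal, hypothetical illustration of that idea; it is not the authors' implementation, and all names (HyperGenerator, evolve_layer), the soft-threshold sparsification, and the task-embedding size are illustrative assumptions rather than details taken from the paper.

```python
# Conceptual sketch (not the authors' code): a hypernetwork generates sparse,
# task-conditioned parameters that transform the frozen old parameters of one
# base-network layer. All names and the sparsification mechanism are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HyperGenerator(nn.Module):
    """Generates a sparse update for one base-network layer,
    conditioned on a task embedding (task-specific information)."""

    def __init__(self, task_dim: int, layer_numel: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(task_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, layer_numel),
        )

    def forward(self, task_emb: torch.Tensor) -> torch.Tensor:
        delta = self.net(task_emb)
        # Soft-thresholding is one simple way to induce sparsity in the
        # generated parameters; the paper's exact mechanism may differ.
        return torch.sign(delta) * F.relu(delta.abs() - 0.1)


def evolve_layer(old_weight: torch.Tensor, generated: torch.Tensor) -> torch.Tensor:
    """Transform the frozen old parameters with the generated sparse update."""
    return old_weight + generated.view_as(old_weight)


# Usage: evolve a frozen 128x64 linear layer of the base network for a new task.
task_emb = torch.randn(1, 16)                      # a learnable task embedding in practice
old_w = torch.randn(128, 64, requires_grad=False)  # parameters kept from earlier tasks
gen = HyperGenerator(task_dim=16, layer_numel=128 * 64)
new_w = evolve_layer(old_w, gen(task_emb).squeeze(0))

# An L1 penalty on the generated parameters is another common way to encourage sparsity.
sparsity_loss = gen(task_emb).abs().mean()
```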
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Software > School of Computer Science and Engineering > 1. Journal Articles



Related Researcher

Kim, Eun Woo
College of Software (School of Software)
