Detailed Information

Cited 0 times in Web of Science · Cited 0 times in Scopus

Clustering-Guided Incremental Learning of Tasks

Full metadata record
dc.contributor.author: Kim, Y.
dc.contributor.author: Kim, E.
dc.date.accessioned: 2021-06-18T07:14:21Z
dc.date.available: 2021-06-18T07:14:21Z
dc.date.issued: 2021-01
dc.identifier.issn: 1976-7684
dc.identifier.uri: https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/44128
dc.description.abstract: Incremental deep learning aims to learn a sequence of tasks while avoiding forgetting their knowledge. One naïve approach using a deep architecture is to increase the capacity of the architecture as the number of tasks increases. However, this leads to heavy memory consumption and makes the approach impractical. If we instead keep the capacity fixed, we encounter another challenging problem, catastrophic forgetting, which causes a notable degradation of performance on previously learned tasks. To overcome these problems, we propose a clustering-guided incremental learning approach that can mitigate catastrophic forgetting without increasing the capacity of the architecture. The proposed approach adopts a parameter-splitting strategy that assigns a subset of the parameters in the architecture to each task to prevent forgetting. It uses clustering to discover relationships between tasks by storing a few samples per task. When learning a new task, we utilize the knowledge of the relevant tasks together with the current task to improve performance. This maximizes the efficiency realizable in a single fixed architecture. Experimental results on a number of fine-grained datasets show that our method outperforms existing competitors. © 2021 IEEE.
dc.format.extent: 5
dc.language: English
dc.language.iso: ENG
dc.publisher: IEEE Computer Society
dc.title: Clustering-Guided Incremental Learning of Tasks
dc.type: Article
dc.identifier.doi: 10.1109/ICOIN50884.2021.9334003
dc.identifier.bibliographicCitation: International Conference on Information Networking, v.2021-January, pp. 417-421
dc.description.isOpenAccess: N
dc.identifier.wosid: 000657974100073
dc.identifier.scopusid: 2-s2.0-85100763213
dc.citation.endPage: 421
dc.citation.startPage: 417
dc.citation.title: International Conference on Information Networking
dc.citation.volume: 2021-January
dc.type.docType: Conference Paper
dc.subject.keywordAuthor: catastrophic forgetting
dc.subject.keywordAuthor: clustering
dc.subject.keywordAuthor: deep neural networks
dc.subject.keywordAuthor: Incremental learning
dc.subject.keywordPlus: Catastrophic forgetting
dc.subject.keywordPlus: Clustering approach
dc.subject.keywordPlus: Deep architectures
dc.subject.keywordPlus: Fine grained
dc.subject.keywordPlus: Improve performance
dc.subject.keywordPlus: Incremental learning
dc.subject.keywordPlus: Memory consumption
dc.subject.keywordPlus: Splitting strategies
dc.subject.keywordPlus: Deep learning
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Telecommunications
dc.relation.journalWebOfScienceCategory: Computer Science, Information Systems
dc.relation.journalWebOfScienceCategory: Computer Science, Theory & Methods
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory: Telecommunications
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
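The abstract describes two mechanisms: a parameter-splitting strategy that reserves a disjoint subset of an architecture's parameters for each task, and a clustering step over a few stored samples per task to identify which previous tasks are related to a new one. A minimal sketch of that idea follows; this is not the authors' implementation, and all class, method, and parameter names (including the `share` heuristic) are hypothetical:

```python
import random
import math


class ClusterGuidedSplitter:
    """Illustrative sketch of clustering-guided parameter splitting.

    Each task receives a disjoint set of parameter indices (its "mask"),
    and a small rehearsal memory of feature vectors per task is used to
    rank previous tasks by similarity to a new task.
    """

    def __init__(self, n_params, exemplars_per_task=5, seed=0):
        self.k = exemplars_per_task
        self.rng = random.Random(seed)
        self.masks = {}      # task_id -> set of parameter indices
        self.exemplars = {}  # task_id -> list of stored feature vectors
        self.free = set(range(n_params))  # parameters not yet assigned

    def add_task(self, task_id, features, share=0.3):
        """Reserve a share of the remaining free parameters for this task
        and store a few of its samples for later clustering."""
        n_take = max(1, int(share * len(self.free)))
        take = set(self.rng.sample(sorted(self.free), n_take))
        self.free -= take            # masks stay pairwise disjoint
        self.masks[task_id] = take
        self.exemplars[task_id] = features[: self.k]

    @staticmethod
    def _centroid(features):
        dim = len(features[0])
        return [sum(f[i] for f in features) / len(features) for i in range(dim)]

    def related_tasks(self, features, top=1):
        """Rank stored tasks by Euclidean distance between exemplar
        centroids and the new task's centroid."""
        c_new = self._centroid(features)
        dists = {
            t: math.dist(self._centroid(x), c_new)
            for t, x in self.exemplars.items()
        }
        return sorted(dists, key=dists.get)[:top]
```

For example, after adding a task whose features sit near the origin and another near (5, 5), a query near (5, 5) would rank the second task as most related, so its parameter subset could be reused alongside the new task's.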
Files in This Item
There are no files associated with this item.
Appears in Collections: College of Software > School of Computer Science and Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Kim, Eun Woo
College of Software (School of Software)
