Detailed Information


Two-stage architectural fine-tuning for neural architecture search in efficient transfer learning

Full metadata record
dc.contributor.author: Park, Soohyun
dc.contributor.author: Son, Seok Bin
dc.contributor.author: Lee, Youn Kyu
dc.contributor.author: Jung, Soyi
dc.contributor.author: Kim, Joongheon
dc.date.accessioned: 2024-01-29T05:00:28Z
dc.date.available: 2024-01-29T05:00:28Z
dc.date.issued: 2023-12
dc.identifier.issn: 0013-5194
dc.identifier.issn: 1350-911X
dc.identifier.uri: https://scholarworks.bwise.kr/hongik/handle/2020.sw.hongik/32603
dc.description.abstract: In many deep neural network (DNN) applications, the difficulty of gathering high-quality data in industrial fields hinders the practical use of DNNs. Thus, the concept of transfer learning (TL) has emerged, which leverages the pretrained knowledge of a DNN built on large-scale datasets. Toward this TL objective, this paper suggests two-stage architectural fine-tuning, inspired by neural architecture search (NAS), which reduces cost and time while exploring an efficient DNN model. The first stage, mutation, reduces search costs by exploiting a priori architectural information. The second stage, early stopping, further reduces NAS costs by terminating the search process in the middle of computation. Data-intensive experimental results verify that the proposed method outperforms benchmarks. (A minimal illustrative sketch of the two-stage procedure appears after the metadata record below.)
dc.language: English
dc.language.iso: ENG
dc.publisher: WILEY
dc.title: Two-stage architectural fine-tuning for neural architecture search in efficient transfer learning
dc.type: Article
dc.publisher.location: United States
dc.identifier.doi: 10.1049/ell2.13066
dc.identifier.scopusid: 2-s2.0-85180133284
dc.identifier.wosid: 001128010300001
dc.identifier.bibliographicCitation: ELECTRONICS LETTERS, v.59, no.24
dc.citation.title: ELECTRONICS LETTERS
dc.citation.volume: 59
dc.citation.number: 24
dc.type.docType: Article
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Engineering
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.subject.keywordAuthor: image processing
dc.subject.keywordAuthor: neural nets
dc.subject.keywordAuthor: neural net architecture
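
The abstract describes the two stages only at a high level. What follows is a minimal, hypothetical Python sketch of that idea, not the authors' implementation: the names (search, mutate, evaluate, the toy scoring function, and the patience threshold) are assumptions made purely for illustration. Stage 1 restricts the search to local mutations of a pretrained architecture, so a priori architectural knowledge prunes the search space; stage 2 stops the search once the best score stops improving, rather than exhausting the full budget.

# Hypothetical sketch (not the authors' code) of two-stage
# architectural fine-tuning for NAS-style transfer learning.
import random

random.seed(0)

# Toy "architecture": per-block width multipliers of a pretrained backbone.
PRETRAINED = [1.0, 1.0, 1.0, 1.0]

def mutate(arch, step=0.25):
    """Stage 1: propose a local edit to one block (a priori prior:
    good fine-tuned models stay close to the pretrained design)."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = max(0.25, child[i] + random.choice([-step, step]))
    return child

def evaluate(arch):
    """Stand-in for validation accuracy after brief fine-tuning.
    Here: a synthetic score that rewards a mildly reshaped backbone."""
    target = [0.75, 1.0, 1.25, 0.75]
    return -sum((a - t) ** 2 for a, t in zip(arch, target))

def search(budget=200, patience=20):
    """Greedy search over mutations with early stopping (stage 2)."""
    best_arch, best_score = PRETRAINED, evaluate(PRETRAINED)
    stale = 0
    for step in range(budget):
        cand = mutate(best_arch)
        score = evaluate(cand)
        if score > best_score:
            best_arch, best_score, stale = cand, score, 0
        else:
            stale += 1
        # Stage 2: early stopping terminates the search mid-budget.
        if stale >= patience:
            print(f"early stop at step {step + 1}/{budget}")
            break
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = search()
    print("best architecture:", arch, "score:", round(score, 4))

A greedy hill-climb is used here only for brevity; the paper's actual search strategy and cost model may differ.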
Appears in Collections: College of Engineering > Computer Engineering > Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Youn Kyu
Engineering (Department of Computer Engineering)
