A Surrogate-Assisted Level-Based Learning Swarm Optimizer for Convolutional Neural Network Architecture Search

Full metadata record
DC Field: Value
dc.contributor.author: Zhu, Jin-Hao
dc.contributor.author: Wei, Feng-Feng
dc.contributor.author: Lin, Qiuzhen
dc.contributor.author: Hu, Xiao-Min
dc.contributor.author: Jeon, Sang-Woon
dc.contributor.author: Chen, Wei-Neng
dc.date.accessioned: 2025-07-25T05:00:16Z
dc.date.available: 2025-07-25T05:00:16Z
dc.date.issued: 2025-06
dc.identifier.issn: 1865-0929
dc.identifier.issn: 1865-0937
dc.identifier.uri: https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/126171
dc.description.abstract: Constructing a high-performance neural network demands substantial proficiency in the continuous design and fine-tuning of neural network architectures, which has driven the advancement of neural network architecture search (NAS) algorithms. However, the considerable evaluation time remains a severe challenge in NAS. In this work, we propose a novel algorithm based on the level-based learning swarm optimizer (LLSO), named CLCNN, for automatically searching deep convolutional neural network (CNN) architectures suited to image classification tasks. To reduce the evaluation time, a classifier is built to predict architecture performance, replacing most of the tedious network training. Considering the characteristics of the classifier, LLSO is adopted to search for superior architectures, in which particles are partitioned into levels according to fitness. Particles in lower levels learn from those in higher levels to search for the global optimum. As a result, LLSO not only converges quickly but can also evolve the population using only the classifier's predictions, without knowing the exact objective values. Experimental results demonstrate that CLCNN rapidly discovers satisfactory CNN architectures whose quality is competitive with state-of-the-art approaches. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025. (An illustrative sketch of this surrogate-driven, level-based update appears after the metadata record below.)
dc.format.extent: 13
dc.language: English
dc.language.iso: ENG
dc.publisher: Springer Science and Business Media Deutschland GmbH
dc.title: A Surrogate-Assisted Level-Based Learning Swarm Optimizer for Convolutional Neural Network Architecture Search
dc.type: Article
dc.publisher.location: Germany
dc.identifier.doi: 10.1007/978-981-96-6948-6_17
dc.identifier.scopusid: 2-s2.0-105009906878
dc.identifier.bibliographicCitation: Communications in Computer and Information Science, v.2282 CCIS, pp. 238-250
dc.citation.title: Communications in Computer and Information Science
dc.citation.volume: 2282 CCIS
dc.citation.startPage: 238
dc.citation.endPage: 250
dc.type.docType: Conference paper
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scopus
dc.subject.keywordAuthor: Convolutional neural networks
dc.subject.keywordAuthor: Evolutionary Algorithm
dc.subject.keywordAuthor: Network architecture search
dc.subject.keywordAuthor: Particle swarm optimization
dc.subject.keywordAuthor: Surrogate-assisted EA
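
The abstract describes a level-based learning update that can be driven purely by a surrogate classifier's predictions rather than exact objective values. The sketch below is a minimal illustration of that idea under assumptions, not the paper's CLCNN implementation: the real-valued architecture encoding, the toy surrogate_rank function, the swarm size, the number of levels, and the PHI coefficient are hypothetical placeholders chosen only so the example runs end to end. It follows the generic LLSO scheme of ranking particles, splitting them into levels, and letting each lower-level particle learn from two exemplars drawn from higher levels.

```python
# Minimal, illustrative sketch of a surrogate-guided level-based learning
# update (generic LLSO-style scheme; NOT the paper's exact CLCNN code).
# The architecture encoding, surrogate, and coefficients are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

DIM = 16      # hypothetical length of a real-valued architecture encoding
POP = 40      # swarm size
LEVELS = 4    # number of levels the swarm is split into
PHI = 0.4     # damping factor on the second exemplar (typical LLSO choice)


def surrogate_rank(positions):
    """Stand-in for the performance classifier: one score per particle.

    In CLCNN this role is played by a trained classifier predicting
    architecture quality; here a toy function keeps the sketch runnable.
    """
    return -np.sum((positions - 0.5) ** 2, axis=1)  # higher is better


def level_based_step(x, v):
    """One LLSO-style iteration driven only by surrogate predictions."""
    order = np.argsort(surrogate_rank(x))[::-1]      # best particles first
    levels = np.array_split(order, LEVELS)           # level 0 holds the best
    new_x, new_v = x.copy(), v.copy()
    for li in range(1, LEVELS):                      # the top level is kept
        for i in levels[li]:
            # draw two exemplars from (strictly) higher levels
            la, lb = sorted(rng.choice(li, size=2, replace=True))
            xa = x[rng.choice(levels[la])]
            xb = x[rng.choice(levels[lb])]
            r1, r2, r3 = rng.random(3)
            new_v[i] = r1 * v[i] + r2 * (xa - x[i]) + PHI * r3 * (xb - x[i])
            new_x[i] = np.clip(x[i] + new_v[i], 0.0, 1.0)
    return new_x, new_v


x = rng.random((POP, DIM))
v = np.zeros((POP, DIM))
for _ in range(100):
    x, v = level_based_step(x, v)
print("best predicted score:", surrogate_rank(x).max())
```

The point mirrored from the abstract is that only surrogate_rank, standing in for the performance classifier, drives both the level assignment and the exemplar selection, so the population can evolve without any exact objective evaluations.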
Files in This Item
There are no files associated with this item.
Appears in Collections
COLLEGE OF ENGINEERING SCIENCES > SCHOOL OF ELECTRICAL ENGINEERING > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Jeon, Sang Woon
ERICA College of Engineering Sciences (SCHOOL OF ELECTRICAL ENGINEERING)
