A Surrogate-Assisted Level-Based Learning Swarm Optimizer for Convolutional Neural Network Architecture Search
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhu, Jin-Hao | - |
dc.contributor.author | Wei, Feng-Feng | - |
dc.contributor.author | Lin, Qiuzhen | - |
dc.contributor.author | Hu, Xiao-Min | - |
dc.contributor.author | Jeon, Sang-Woon | - |
dc.contributor.author | Chen, Wei-Neng | - |
dc.date.accessioned | 2025-07-25T05:00:16Z | - |
dc.date.available | 2025-07-25T05:00:16Z | - |
dc.date.issued | 2025-06 | - |
dc.identifier.issn | 1865-0929 | - |
dc.identifier.issn | 1865-0937 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/126171 | - |
dc.description.abstract | Constructing a high-performance neural network demands substantial expertise in the continuous design and fine-tuning of neural network architectures, which has propelled the advancement of neural architecture search (NAS) algorithms. However, the considerable evaluation time remains a severe challenge in NAS. In this work, we propose a novel algorithm based on the level-based learning swarm optimizer (LLSO), named CLCNN, for the automatic search of effective deep convolutional neural network (CNN) architectures for image classification tasks. To reduce the evaluation time, a classifier is built to predict architecture performance, replacing the majority of tedious network training. Considering the characteristics of the classifier, LLSO is adopted to search for superior architectures: particles are partitioned into different levels according to fitness, and particles in lower levels learn from those in higher levels to search for the global optimum. As a result, LLSO not only converges quickly but can also evolve the population using only the classifier's predictions, without knowing the accurate objective values (an illustrative sketch of this level-based update is given after the metadata table). Experimental findings demonstrate that CLCNN rapidly discovers satisfactory CNN architectures whose quality is competitive with state-of-the-art approaches. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025. | - |
dc.format.extent | 13 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Springer Science and Business Media Deutschland GmbH | - |
dc.title | A Surrogate-Assisted Level-Based Learning Swarm Optimizer for Convolutional Neural Network Architecture Search | - |
dc.type | Article | - |
dc.publisher.location | Germany | - |
dc.identifier.doi | 10.1007/978-981-96-6948-6_17 | - |
dc.identifier.scopusid | 2-s2.0-105009906878 | - |
dc.identifier.bibliographicCitation | Communications in Computer and Information Science, v.2282 CCIS, pp 238 - 250 | - |
dc.citation.title | Communications in Computer and Information Science | - |
dc.citation.volume | 2282 CCIS | - |
dc.citation.startPage | 238 | - |
dc.citation.endPage | 250 | - |
dc.type.docType | Conference paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordAuthor | Convolutional neural networks | - |
dc.subject.keywordAuthor | Evolutionary Algorithm | - |
dc.subject.keywordAuthor | Network architecture search | - |
dc.subject.keywordAuthor | Particle swarm optimization | - |
dc.subject.keywordAuthor | Surrogate-assisted EA | - |
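
The abstract's level-based learning idea can be summarized in a short sketch. The Python snippet below is a minimal, illustrative sketch assuming the canonical LLSO velocity rule and a real-vector encoding of candidate architectures; the helper `rank_by_surrogate` (standing in for the paper's classifier-based performance predictor), the parameter values, and the level-selection simplification are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def llso_step(positions, velocities, rank_by_surrogate,
              n_levels=4, phi=0.4, rng=None):
    """One LLSO generation: rank the swarm, split it into levels,
    and let lower-level particles learn from higher-level exemplars."""
    rng = rng if rng is not None else np.random.default_rng()
    n, d = positions.shape

    # 1) Rank particles from best to worst. In a surrogate-assisted setting
    #    such as CLCNN, this ranking would come from the classifier, so
    #    exact objective values (validation accuracies) are not required.
    order = rank_by_surrogate(positions)            # indices, best -> worst
    pos, vel = positions[order].copy(), velocities[order].copy()

    # 2) Assign the sorted particles to levels of (roughly) equal size.
    level_size = max(n // n_levels, 1)
    level_of = np.minimum(np.arange(n) // level_size, n_levels - 1)

    # 3) Update every particle except those in the top level (level 0).
    for i in range(level_size, n):
        li = level_of[i]
        # Draw two exemplars from higher levels; the one from the better
        # level acts as the primary attractor (simplification: the two
        # levels may coincide here, whereas the original LLSO forces them
        # to differ).
        l1, l2 = sorted(rng.integers(0, li, size=2))
        k1 = rng.integers(l1 * level_size, (l1 + 1) * level_size)
        k2 = rng.integers(l2 * level_size, (l2 + 1) * level_size)
        r1, r2, r3 = rng.random(d), rng.random(d), rng.random(d)
        vel[i] = (r1 * vel[i]
                  + r2 * (pos[k1] - pos[i])
                  + phi * r3 * (pos[k2] - pos[i]))
        pos[i] = pos[i] + vel[i]

    return pos, vel
```

Because the update only needs a ranking of particles, a classifier that predicts the relative quality of architectures is enough to form the levels, which is what allows most full network trainings to be skipped.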