Detailed Information


A Dimension Group-Based Comprehensive Elite Learning Swarm Optimizer for Large-Scale Optimization

Full metadata record
dc.contributor.author: Yang, Qiang
dc.contributor.author: Zhang, Kai-Xuan
dc.contributor.author: Gao, Xu-Dong
dc.contributor.author: Xu, Dong-Dong
dc.contributor.author: Lu, Zhen-Yu
dc.contributor.author: Jeon, Sang-Woon
dc.contributor.author: Zhang, Jun
dc.date.accessioned: 2022-07-18T01:16:44Z
dc.date.available: 2022-07-18T01:16:44Z
dc.date.issued: 2022-04
dc.identifier.issn: 2227-7390
dc.identifier.uri: https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/107893
dc.description.abstract: High-dimensional optimization problems are increasingly common in the era of big data and the Internet of Things (IoT), and they seriously challenge the performance of existing optimizers. To solve such problems effectively, this paper devises a dimension group-based comprehensive elite learning swarm optimizer (DGCELSO), which integrates the valuable evolutionary information of different elite particles in the swarm to guide the updating of inferior ones. Specifically, the swarm is first separated into two exclusive sets: the elite set (ES), containing the top-ranked individuals, and the non-elite set (NES), consisting of the remaining individuals. Then, the dimensions of each particle in the NES are randomly divided into several equally sized groups. Subsequently, each dimension group of each non-elite particle is guided by two different elites randomly selected from the ES. In this way, each non-elite particle is comprehensively guided by multiple elite particles, so high diversity can be maintained while fast convergence is still likely. To alleviate the sensitivity of DGCELSO to its associated parameters, we further devise dynamic adjustment strategies that change the parameter settings during evolution. With the above mechanisms, DGCELSO is expected to explore and exploit the solution space properly and find optimal solutions. Extensive experiments on two commonly used large-scale benchmark problem sets demonstrate that DGCELSO achieves highly competitive or even much better performance than several state-of-the-art large-scale optimizers.
dc.format.extent: 32
dc.language: English
dc.language.iso: ENG
dc.publisher: MDPI AG
dc.title: A Dimension Group-Based Comprehensive Elite Learning Swarm Optimizer for Large-Scale Optimization
dc.type: Article
dc.publisher.location: Switzerland
dc.identifier.doi: 10.3390/math10071072
dc.identifier.scopusid: 2-s2.0-85127800431
dc.identifier.wosid: 000782051400001
dc.identifier.bibliographicCitation: Mathematics, v.10, no.7, pp. 1-32
dc.citation.title: Mathematics
dc.citation.volume: 10
dc.citation.number: 7
dc.citation.startPage: 1
dc.citation.endPage: 32
dc.type.docType: Article
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Mathematics
dc.relation.journalWebOfScienceCategory: Mathematics
dc.subject.keywordPlus: PARTICLE SWARM
dc.subject.keywordPlus: COOPERATIVE COEVOLUTION
dc.subject.keywordPlus: GENETIC ALGORITHM
dc.subject.keywordPlus: DECOMPOSITION
dc.subject.keywordPlus: FASTER
dc.subject.keywordAuthor: large-scale optimization
dc.subject.keywordAuthor: particle swarm optimization
dc.subject.keywordAuthor: dimension group-based comprehensive elite learning
dc.subject.keywordAuthor: high-dimensional problems
dc.subject.keywordAuthor: elite learning
dc.identifier.url: https://www.mdpi.com/2227-7390/10/7/1072
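The learning scheme described in the abstract — split the swarm into an elite set (ES) and non-elite set (NES), partition each non-elite particle's dimensions into equally sized groups, and let two distinct random elites guide each group — can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the function name, parameter names (`elite_ratio`, `n_groups`), and the exact learning rule are assumptions made for the sketch.

```python
import numpy as np

def dgcelso_step(swarm, fitness, elite_ratio=0.2, n_groups=4, rng=None):
    """One illustrative update of dimension group-based comprehensive
    elite learning (sketch; the paper's exact update may differ)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = swarm.shape
    order = np.argsort(fitness)            # minimization: best first
    n_elite = max(2, int(elite_ratio * n))
    elite_idx = order[:n_elite]            # elite set ES
    non_elite_idx = order[n_elite:]        # non-elite set NES

    new_swarm = swarm.copy()
    # Randomly partition the d dimensions into equally sized groups.
    groups = np.array_split(rng.permutation(d), n_groups)
    for i in non_elite_idx:
        for g in groups:
            # Each dimension group is guided by two distinct random elites.
            e1, e2 = rng.choice(elite_idx, size=2, replace=False)
            r1, r2 = rng.random(2)
            # Move the group toward both elite exemplars.
            new_swarm[i, g] += r1 * (swarm[e1, g] - swarm[i, g]) \
                             + r2 * (swarm[e2, g] - swarm[i, g])
    return new_swarm
```

Because only NES members are rewritten, elite particles pass through unchanged, and each non-elite particle ends up guided by up to `2 * n_groups` different elites — the "comprehensive" guidance that the abstract credits with balancing diversity and convergence.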
Appears in Collections: COLLEGE OF ENGINEERING SCIENCES > SCHOOL OF ELECTRICAL ENGINEERING > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Jeon, Sang Woon
ERICA College of Engineering (SCHOOL OF ELECTRICAL ENGINEERING)