Detailed Information


Auction-guided model diffusion for communication-efficient federated learning on non-IID data

Full metadata record
dc.contributor.author: Ahn, Seyoung
dc.contributor.author: Kim, Soohyeong
dc.contributor.author: Kwon, Yongseok
dc.contributor.author: Youn, Jiseung
dc.contributor.author: Park, Joohan
dc.contributor.author: Cho, Sunghyun
dc.date.accessioned: 2025-10-02T06:00:13Z
dc.date.available: 2025-10-02T06:00:13Z
dc.date.issued: 2026-01
dc.identifier.issn: 0893-6080
dc.identifier.uri: https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/126611
dc.description.abstract: In 6G mobile communication systems, various AI-based network functions and applications have been standardized. Federated learning (FL) is adopted as the core learning architecture for 6G systems to avoid privacy leakage from mobile user data. However, in FL, users with non-independent and identically distributed (non-IID) datasets can deteriorate the performance of the global model because the convergence direction of the gradient for each dataset is different, thereby inducing a weight divergence problem. To address this problem, we propose a novel diffusion strategy for machine learning (ML) models (FedDif) to maximize the performance of the global model with non-IID data. FedDif enables the local model to learn different distributions before parameter aggregation by passing the local models to users via device-to-device communication. Furthermore, we theoretically demonstrate that FedDif can circumvent the weight-divergence problem. Based on this theory, we propose a communication-efficient diffusion strategy for ML models that can determine the trade-off between learning performance and communication cost using auction theory. The experimental results show that FedDif improves the top-1 test accuracy by up to 20.07 %p and reduces communication costs by up to 45.27 % compared to FedAvg. © 2025 Elsevier B.V. All rights reserved.
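The diffusion idea described in the abstract (local models visiting several users' data distributions before aggregation) can be illustrated with a toy scalar model. This is a hypothetical sketch, not the authors' algorithm: the data ranges, learning rate, and fixed rotation schedule are illustrative assumptions, and the paper's auction-based scheduling is not modeled here.

```python
# Hypothetical minimal sketch of FedDif-style model diffusion (illustrative only).
# Each user holds a non-IID dataset; local models are rotated user-to-user
# (simulated D2D passing) so each model trains on several distributions
# before FedAvg-style aggregation.
import random

random.seed(0)

def make_user_data(bias, n=200):
    # Non-IID split: each user only sees x-values from a different range.
    xs = [random.uniform(bias, bias + 1.0) for _ in range(n)]
    return [(x, 2.0 * x) for x in xs]  # ground truth: y = 2x

def sgd_pass(w, data, lr=0.01):
    # One SGD pass minimizing (w*x - y)^2 for a scalar model w.
    for x, y in data:
        w -= lr * 2.0 * (w * x - y) * x
    return w

users = [make_user_data(b) for b in (0.0, 1.0, 2.0)]  # three non-IID users
models = [0.0] * len(users)

for _ in range(5):  # communication rounds
    # Diffusion: shift each local model to the next user's dataset so it
    # sees every distribution before parameters are aggregated.
    for shift in range(len(users)):
        models = [sgd_pass(models[i], users[(i + shift) % len(users)])
                  for i in range(len(users))]
    global_w = sum(models) / len(models)  # FedAvg-style averaging
    models = [global_w] * len(users)

print(round(global_w, 2))
```

With noiseless data the averaged model converges toward the true coefficient 2.0; without the diffusion loop, each local model would be fit to only one skewed x-range per round, which is the weight-divergence effect the paper targets.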
dc.language: English
dc.language.iso: ENG
dc.publisher: Elsevier Ltd
dc.title: Auction-guided model diffusion for communication-efficient federated learning on non-IID data
dc.type: Article
dc.identifier.doi: 10.1016/j.neunet.2025.108066
dc.identifier.scopusid: 2-s2.0-105015143755
dc.identifier.bibliographicCitation: Neural Networks, v.193
dc.citation.title: Neural Networks
dc.citation.volume: 193
dc.type.docType: Article
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.subject.keywordAuthor: Cooperative Learning
dc.subject.keywordAuthor: Federated Learning
dc.subject.keywordAuthor: Mobile Communications
dc.subject.keywordAuthor: Non-iid Data
dc.subject.keywordAuthor: Cooperative Communication
dc.subject.keywordAuthor: Data Communication Systems
dc.subject.keywordAuthor: Data Privacy
dc.subject.keywordAuthor: Diffusion
dc.subject.keywordAuthor: E-learning
dc.subject.keywordAuthor: Learning Systems
dc.subject.keywordAuthor: Mobile Telecommunication Systems
dc.subject.keywordAuthor: Diffusion Strategies
dc.subject.keywordAuthor: Distributed Data
dc.subject.keywordAuthor: Divergence Problems
dc.subject.keywordAuthor: Global Models
dc.subject.keywordAuthor: Iid Data
dc.subject.keywordAuthor: Machine Learning Models
dc.subject.keywordAuthor: Performance
dc.subject.keywordAuthor: Economic And Social Effects
Files in This Item
There are no files associated with this item.
Appears in Collections
COLLEGE OF COMPUTING > ERICA Department of Computer Science > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Cho, Sunghyun
ERICA College of Software Convergence (ERICA Department of Computer Science)
