FedCLS: Class-Aware Federated Learning in a Heterogeneous Environment
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Bhatti, Dost Muhammad Saqib | - |
dc.contributor.author | Nam, Haewoon | - |
dc.date.accessioned | 2023-07-05T05:39:20Z | - |
dc.date.available | 2023-07-05T05:39:20Z | - |
dc.date.issued | 2023-06 | - |
dc.identifier.issn | 1932-4537 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/113123 | - |
dc.description.abstract | Federated learning is an approach to training a global model on the server by utilizing the personal data of end users while preserving data privacy. The users, referred to as clients, perform local training using their respective datasets. Once trained, the clients forward their local models to the server, where the models are aggregated to update the global model. In practice, the clients' datasets contain different classes of labels regardless of the number of samples. In other words, the data is non-independent and identically distributed (non-iid) among clients in terms of label classes, which creates heterogeneity among them. Hence, the local model weights updated by clients vary broadly due to the heterogeneity of their local datasets. Thus, the process of aggregating the clients' diverse local models has a significant impact on the performance of global training. When the server aggregates the local models by computing a weighted average based solely on the number of samples available at the clients, the aggregation may misguide the global training process. To address this issue, this paper proposes a novel reweighting method called FedCLS that operates based on the volume and variance of the clients' local datasets. By taking the heterogeneity of data into account during aggregation, the proposed method aims to reach the global minimum. The simulation results show that the proposed method achieves a 28% performance improvement over conventional federated learning methods. | - |
dc.format.extent | 12 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | - |
dc.title | FedCLS: Class-Aware Federated Learning in a Heterogeneous Environment | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.1109/TNSM.2023.3278023 | - |
dc.identifier.scopusid | 2-s2.0-85160257819 | - |
dc.identifier.wosid | 001022694300047 | - |
dc.identifier.bibliographicCitation | IEEE Transactions on Network and Service Management, v.20, no.2, pp 1 - 12 | - |
dc.citation.title | IEEE Transactions on Network and Service Management | - |
dc.citation.volume | 20 | - |
dc.citation.number | 2 | - |
dc.citation.startPage | 1 | - |
dc.citation.endPage | 12 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.subject.keywordPlus | NETWORKS | - |
dc.subject.keywordPlus | COMMUNICATION | - |
dc.subject.keywordPlus | FRAMEWORK | - |
dc.subject.keywordAuthor | deep neural networks | - |
dc.subject.keywordAuthor | distributed learning | - |
dc.subject.keywordAuthor | Federated learning | - |
dc.subject.keywordAuthor | Heterogeneous network | - |
dc.subject.keywordAuthor | unbiased aggregation | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/10130085 | - |
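The abstract contrasts the conventional sample-count-weighted aggregation with a class-aware reweighting. The following is a minimal sketch of that contrast; the standard weighted average follows the usual FedAvg rule, while `class_aware_aggregate` uses an illustrative weighting (sample count scaled by the number of distinct label classes per client), since the exact FedCLS formula is not given in the abstract.

```python
import numpy as np

def fedavg_aggregate(local_weights, sample_counts):
    """Conventional aggregation: weighted average of client models,
    with coefficients proportional only to each client's sample count."""
    coeffs = np.asarray(sample_counts, dtype=float)
    coeffs /= coeffs.sum()
    return sum(c * w for c, w in zip(coeffs, local_weights))

def class_aware_aggregate(local_weights, sample_counts, class_counts):
    """Illustrative class-aware reweighting (an assumption, NOT the exact
    FedCLS rule): scale each client's sample count by the number of
    distinct label classes it holds, so clients with more diverse
    (less skewed) data contribute more to the global model."""
    coeffs = (np.asarray(sample_counts, dtype=float)
              * np.asarray(class_counts, dtype=float))
    coeffs /= coeffs.sum()
    return sum(c * w for c, w in zip(coeffs, local_weights))

# Toy example: two clients with equal sample counts but different
# class diversity (client 2 holds 3 label classes, client 1 only 1).
w1 = np.array([1.0, 1.0])
w2 = np.array([3.0, 3.0])
plain = fedavg_aggregate([w1, w2], [10, 10])          # simple mean
aware = class_aware_aggregate([w1, w2], [10, 10], [1, 3])  # skewed to w2
```

With equal sample counts the conventional rule reduces to a simple mean, while the class-aware variant shifts weight toward the client with more label classes, which is the behavior the abstract motivates for non-iid data.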