A Communication Efficient Approach of Global Training in Federated Learning
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Bhatti, Dost Muhammad Saqib | - |
dc.contributor.author | Haris, Muhammad | - |
dc.contributor.author | Nam, Haewoon | - |
dc.date.accessioned | 2023-08-01T06:32:35Z | - |
dc.date.available | 2023-08-01T06:32:35Z | - |
dc.date.issued | 2022-10 | - |
dc.identifier.issn | 2162-1233 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/113629 | - |
dc.description.abstract | Federated learning is a privacy-preserving method of training a model on a server by utilizing the end users' private data without accessing it. The central server shares the global model with all end users, called clients of the network. The clients are required to train the shared global model using their local datasets. The updated locally trained models are forwarded back to the server to further update the global model. This process of training the global model is carried out for several rounds. The procedure of updating the local model and transmitting it back to the server raises the communication cost. Since several clients are involved in training the global model, the aggregated communication cost of the network is escalated. This article proposes a communication-efficient aggregation method for federated learning, which considers the volume and variety of local clients' data before aggregation. The proposed approach is compared with the conventional methods, and it achieves the highest accuracy and the minimum loss with respect to aggregated communication cost. © 2022 IEEE. | - |
dc.format.extent | 6 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE Computer Society | - |
dc.title | A Communication Efficient Approach of Global Training in Federated Learning | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.1109/ICTC55196.2022.9952661 | - |
dc.identifier.scopusid | 2-s2.0-85143251047 | - |
dc.identifier.bibliographicCitation | International Conference on ICT Convergence, v.2022-October, pp 1441 - 1446 | - |
dc.citation.title | International Conference on ICT Convergence | - |
dc.citation.volume | 2022-October | - |
dc.citation.startPage | 1441 | - |
dc.citation.endPage | 1446 | - |
dc.type.docType | Conference Paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordAuthor | deep learning | - |
dc.subject.keywordAuthor | distributed learning | - |
dc.subject.keywordAuthor | Federated learning | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/9952661 | - |
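The abstract describes a federated learning round in which clients' locally trained models are aggregated on the server with weights based on the volume and variety of each client's data. The paper's exact weighting formula is not given in this record, so the following is only an illustrative sketch: it weights each client by its fraction of the total samples (volume) and its fraction of the total distinct-label count (variety), then renormalizes. The function name `aggregate` and the specific weight combination are assumptions, not the authors' method.

```python
def aggregate(client_models, client_data_sizes, client_labels):
    """Sketch of volume-and-variety weighted model aggregation.

    client_models     -- list of parameter vectors (lists of floats)
    client_data_sizes -- number of local samples per client (volume)
    client_labels     -- iterable of labels held by each client (variety)
    """
    # Volume weight: each client's share of the total samples.
    total = sum(client_data_sizes)
    vol = [n / total for n in client_data_sizes]
    # Variety weight: each client's share of the total distinct-label count.
    var_counts = [len(set(labels)) for labels in client_labels]
    var_total = sum(var_counts)
    var = [v / var_total for v in var_counts]
    # Combine the two weights and renormalize so they sum to one.
    w = [a * b for a, b in zip(vol, var)]
    s = sum(w)
    w = [x / s for x in w]
    # Global model: weighted coordinate-wise average of client parameters.
    dim = len(client_models[0])
    return [sum(w[c] * client_models[c][i] for c in range(len(w)))
            for i in range(dim)]
```

With two clients holding equal volumes and the same label variety, the sketch reduces to a plain parameter average, matching the intuition that weighting only matters when clients' data differ.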