Detailed Information

Communication-Efficient Randomized Algorithm for Multi-Kernel Online Federated Learning

Authors
Hong, Songnam; Chae, Jeongmin
Issue Date
Dec-2022
Publisher
IEEE COMPUTER SOC
Keywords
Collaborative work; Data models; Downlink; Federated learning; Kernel; kernel-based learning; online learning; Predictive models; reproducing kernel Hilbert space (RKHS); Servers; Uplink
Citation
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, v.44, no.12, pp.9872 - 9886
Indexed
SCIE
SCOPUS
Journal Title
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Volume
44
Number
12
Start Page
9872
End Page
9886
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/172807
DOI
10.1109/TPAMI.2021.3129809
ISSN
0162-8828
Abstract
Online federated learning (OFL) is a promising framework for learning a sequence of global functions from distributed sequential data at local devices. Within this framework, we first introduce a single-kernel OFL method (termed S-KOFL) that combines random-feature (RF) approximation, online gradient descent (OGD), and federated averaging (FedAvg). As in the centralized counterpart, an extension to multiple kernels is necessary. Following the extension principle of the centralized method, we construct a vanilla multi-kernel algorithm (termed vM-KOFL) and prove its asymptotic optimality. It is not practical, however, as its communication overhead grows linearly with the size of the kernel dictionary; moreover, this problem cannot be addressed by the existing communication-efficient techniques (e.g., quantization and sparsification) of conventional federated learning. Our major contribution is a novel randomized algorithm (named eM-KOFL) that exhibits performance similar to vM-KOFL while maintaining a low communication cost. We theoretically prove that eM-KOFL achieves an optimal sublinear regret bound. Mimicking the key concept of eM-KOFL in an efficient way, we propose a more practical pM-KOFL having the same communication overhead as S-KOFL. Via numerical tests with real datasets, we demonstrate that pM-KOFL yields almost the same performance as vM-KOFL (or eM-KOFL) on various online learning tasks.
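The abstract's S-KOFL building blocks — random-feature (RF) approximation of a kernel, online gradient descent at each client, and federated averaging at the server — can be illustrated with a minimal sketch. This is not the paper's algorithm; it is an illustrative toy in which the kernel choice (Gaussian), the squared loss, and all names (`rf_map`, `local_ogd_step`, `fedavg`) and parameter values are assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 200, 3       # number of random features, input dimension (assumed)
gamma = 1.0         # Gaussian-kernel bandwidth parameter (assumed)

# Random Fourier features for k(x, y) = exp(-gamma * ||x - y||^2):
# spectral samples w ~ N(0, 2*gamma*I) and phases b ~ Uniform[0, 2*pi).
W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rf_map(x):
    """Random-feature map z(x), so that z(x) @ z(y) approximates k(x, y)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

def local_ogd_step(theta, x, y, lr=0.1):
    """One online-gradient-descent step on the squared loss at a client."""
    z = rf_map(x)
    grad = 2.0 * (z @ theta - y) * z
    return theta - lr * grad

def fedavg(thetas):
    """Server-side federated averaging of the clients' updated parameters."""
    return np.mean(thetas, axis=0)

# One toy round: K clients each observe a fresh (x, y) pair, run a local
# OGD step from the current global model, and the server averages the results.
K = 4
theta_global = np.zeros(D)
clients = [(rng.normal(size=d), rng.normal()) for _ in range(K)]
theta_global = fedavg([local_ogd_step(theta_global, x, y) for x, y in clients])
```

In the multi-kernel setting the abstract describes, each kernel in the dictionary would carry its own feature map and parameter vector, which is exactly why naive per-round communication grows with the dictionary size.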
Appears in
Collections
Seoul College of Engineering > Seoul School of Electronic Engineering > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Hong, Songnam
COLLEGE OF ENGINEERING (SCHOOL OF ELECTRONIC ENGINEERING)
