Detailed Information


Quantized Distributed Online Kernel Learning

Authors
Park, Jonghwan; Hong, Songnam
Issue Date
Dec-2021
Publisher
IEEE Computer Society
Keywords
distributed learning; kernel-based learning; online learning
Citation
International Conference on ICT Convergence, v.2021, no.October, pp.357 - 361
Indexed
SCOPUS
Journal Title
International Conference on ICT Convergence
Volume
2021
Number
October
Start Page
357
End Page
361
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/140083
DOI
10.1109/ICTC52510.2021.9620759
ISSN
2162-1233
Abstract
In this paper, we propose a communication-efficient kernel-based learning method based on random-feature approximation and quantization. The proposed algorithm is named quantized distributed online kernel learning (QDOKL). We theoretically prove that QDOKL over N time slots achieves an optimal sublinear regret of $\mathrm{O}(\sqrt{N})$, provided that the quantization level scales with $\sqrt{N}$. Our analysis implies that every node in the network can learn a common function whose gap from the best function in hindsight diminishes. We verify our theoretical results via numerical tests on online regression tasks with real datasets, and demonstrate that QDOKL achieves almost the same accuracy as its unquantized counterpart while incurring lower communication overhead.
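The abstract combines two ingredients: a random-feature approximation of the kernel and quantization of what each node would transmit. The sketch below is a minimal, single-node illustration of those two ingredients only, not the authors' QDOKL algorithm; the RBF kernel, step size, quantization range, and the sqrt(N)-scaled level count are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): online regression with random
# Fourier features and uniform quantization of the parameter vector a node
# would broadcast. All hyperparameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, D = 5, 100                              # input dim, number of random features
omega = rng.normal(0.0, 1.0, (D, d))       # spectral samples for an RBF kernel
b = rng.uniform(0.0, 2 * np.pi, D)

def z(x):
    """Random Fourier feature map approximating an RBF kernel."""
    return np.sqrt(2.0 / D) * np.cos(omega @ x + b)

def quantize(theta, levels, r=1.0):
    """Uniform quantizer with `levels` levels on [-r, r] (illustrative)."""
    step = 2.0 * r / (levels - 1)
    return np.clip(np.round(theta / step) * step, -r, r)

# Online regression on a synthetic stream: y = sin(sum(x)) + noise.
N = 2000
levels = int(np.sqrt(N))                   # quantization level scaling ~ sqrt(N)
eta = 0.1                                  # base step size (assumption)
theta = np.zeros(D)
loss_full, loss_quant = 0.0, 0.0
for t in range(1, N + 1):
    x = rng.normal(size=d)
    y = np.sin(x.sum()) + 0.1 * rng.normal()
    feat = z(x)
    loss_full += 0.5 * (theta @ feat - y) ** 2
    loss_quant += 0.5 * (quantize(theta, levels) @ feat - y) ** 2
    theta -= (eta / np.sqrt(t)) * (theta @ feat - y) * feat   # online gradient step
print(f"avg loss (unquantized params): {loss_full / N:.4f}")
print(f"avg loss (quantized params):   {loss_quant / N:.4f}")
```

In this toy setting the two average losses are close, which mirrors the abstract's claim that quantized parameters can retain nearly the accuracy of the unquantized counterpart while needing fewer bits per transmission.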
Appears in Collections
College of Engineering (Seoul) > School of Electronic Engineering (Seoul) > 1. Journal Articles



Related Researcher
Hong, Songnam
College of Engineering (School of Electronic Engineering)
