Detailed Information


Block-cyclic stochastic coordinate descent for deep neural networks (open access)

Authors
Nakamura, K.; Soatto, S.; Hong, B.-W.
Issue Date
Jul-2021
Publisher
Elsevier Ltd
Keywords
Coordinate descent; Deep neural network; Energy optimization; Stochastic gradient descent
Citation
Neural Networks, v.139, pp. 348-357
Pages
10
Journal Title
Neural Networks
Volume
139
Start Page
348
End Page
357
URI
https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/47751
DOI
10.1016/j.neunet.2021.04.001
ISSN
0893-6080 (print)
1879-2782 (online)
Abstract
We present a stochastic first-order optimization algorithm, named block-cyclic stochastic coordinate descent (BCSC), that adds a cyclic constraint to stochastic block-coordinate descent in the selection of both data and parameters. It uses different subsets of the data to update different subsets of the parameters, thus limiting the detrimental effect of outliers in the training set. Empirical tests on image classification benchmark datasets show that BCSC outperforms state-of-the-art optimization methods in generalization, leading to higher accuracy within the same number of update iterations. The improvements are consistent across different architectures and datasets, and can be combined with other training techniques and regularizations. © 2021 Elsevier Ltd
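
The abstract describes the scheme only at a high level, so the following is a minimal NumPy sketch of the block-cyclic idea, assuming a toy least-squares problem in place of a deep network: parameters are split into disjoint blocks, the training data into disjoint shards, and the block-to-shard pairing is rotated cyclically so that each parameter block is updated from a different data subset on each pass. The block count, rotation rule, and toy problem are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: y = X @ w_true + noise (illustrative stand-in
# for a deep network; the paper applies BCSC to neural-network training).
n, d = 1024, 32
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

B = 4                                                # number of blocks (assumed)
param_blocks = np.array_split(np.arange(d), B)       # disjoint parameter blocks
data_shards = np.array_split(rng.permutation(n), B)  # disjoint data subsets

w = np.zeros(d)
lr, epochs, batch = 0.01, 50, 64

for epoch in range(epochs):
    shift = epoch % B                                # rotate block/shard pairing each pass
    for b in range(B):
        idx_p = param_blocks[b]                      # parameters updated this step
        shard = data_shards[(b + shift) % B]         # data subset paired with this block
        mb = rng.choice(shard, size=min(batch, len(shard)), replace=False)
        resid = X[mb] @ w - y[mb]                    # residual on the minibatch
        grad_b = X[mb][:, idx_p].T @ resid / len(mb) # stochastic gradient, this block only
        w[idx_p] -= lr * grad_b                      # block-coordinate SGD update

print("parameter error:", np.linalg.norm(w - w_true))
```

Because each parameter block is paired with a different data shard within any given cycle, a corrupted sample can influence only one block per pass, which is the outlier-limiting effect the abstract refers to.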
Appears in Collections
College of Software > Department of Artificial Intelligence > 1. Journal Articles


Related Researcher

Hong, Byung-Woo
College of Software (Department of Artificial Intelligence)
