Block-cyclic stochastic coordinate descent for deep neural networks
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Nakamura, K. | - |
dc.contributor.author | Soatto, S. | - |
dc.contributor.author | Hong, B.-W. | - |
dc.date.accessioned | 2021-07-22T06:53:52Z | - |
dc.date.available | 2021-07-22T06:53:52Z | - |
dc.date.issued | 2021-07 | - |
dc.identifier.issn | 0893-6080 | - |
dc.identifier.issn | 1879-2782 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/47751 | - |
dc.description.abstract | We present a stochastic first-order optimization algorithm, named block-cyclic stochastic coordinate descent (BCSC), that adds a cyclic constraint to stochastic block-coordinate descent in the selection of both data and parameters. It uses different subsets of the data to update different subsets of the parameters, thus limiting the detrimental effect of outliers in the training set. Empirical tests on image classification benchmark datasets show that BCSC outperforms state-of-the-art optimization methods in generalization, leading to higher accuracy within the same number of update iterations. The improvements are consistent across different architectures and datasets, and can be combined with other training techniques and regularizations. © 2021 Elsevier Ltd | - |
dc.format.extent | 10 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Elsevier Ltd | - |
dc.title | Block-cyclic stochastic coordinate descent for deep neural networks | - |
dc.type | Article | - |
dc.identifier.doi | 10.1016/j.neunet.2021.04.001 | - |
dc.identifier.bibliographicCitation | Neural Networks, v.139, pp 348 - 357 | - |
dc.description.isOpenAccess | Y | - |
dc.identifier.wosid | 000652682000015 | - |
dc.identifier.scopusid | 2-s2.0-85104361217 | - |
dc.citation.endPage | 357 | - |
dc.citation.startPage | 348 | - |
dc.citation.title | Neural Networks | - |
dc.citation.volume | 139 | - |
dc.type.docType | Article | - |
dc.publisher.location | United Kingdom | - |
dc.subject.keywordAuthor | Coordinate descent | - |
dc.subject.keywordAuthor | Deep neural network | - |
dc.subject.keywordAuthor | Energy optimization | - |
dc.subject.keywordAuthor | Stochastic gradient descent | - |
dc.subject.keywordPlus | Classification (of information) | - |
dc.subject.keywordPlus | Gradient methods | - |
dc.subject.keywordPlus | Optimization | - |
dc.subject.keywordPlus | Stochastic systems | - |
dc.subject.keywordPlus | Block coordinate descents | - |
dc.subject.keywordPlus | Coordinate descent | - |
dc.subject.keywordPlus | Energy optimization | - |
dc.subject.keywordPlus | First order | - |
dc.subject.keywordPlus | Neural-networks | - |
dc.subject.keywordPlus | Optimization algorithms | - |
dc.subject.keywordPlus | Ordering optimizations | - |
dc.subject.keywordPlus | Stochastic gradient descent | - |
dc.subject.keywordPlus | Stochastics | - |
dc.subject.keywordPlus | Training sets | - |
dc.subject.keywordPlus | Deep neural networks | - |
dc.subject.keywordPlus | algorithm | - |
dc.subject.keywordPlus | article | - |
dc.subject.keywordPlus | deep neural network | - |
dc.subject.keywordPlus | diagnostic test accuracy study | - |
dc.subject.keywordPlus | stochastic model | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Neurosciences & Neurology | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
dc.relation.journalWebOfScienceCategory | Neurosciences | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
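The abstract describes the core idea of BCSC: partition both the data and the parameters into blocks, let each data block update a different parameter block, and rotate the pairing cyclically. The sketch below illustrates that pairing pattern on a toy least-squares problem in NumPy; the specific pairing rule, block sizes, and learning rate here are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

# Hypothetical sketch of the block-cyclic idea from the abstract: each data
# subset updates a different parameter subset, and the data-to-parameter
# pairing rotates cyclically across epochs. All hyperparameters below are
# illustrative assumptions, not taken from the paper.
rng = np.random.default_rng(0)
n, d, B = 120, 12, 4                      # samples, parameters, blocks
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)
data_blocks = np.array_split(rng.permutation(n), B)   # disjoint data subsets
param_blocks = np.array_split(np.arange(d), B)        # disjoint parameter subsets
lr = 0.05

for epoch in range(60):
    for i in range(B):
        j = (i + epoch) % B               # cyclic data-to-parameter pairing
        rows = data_blocks[i]
        cols = param_blocks[j]
        # gradient of 0.5 * ||X w - y||^2 restricted to the chosen
        # data rows and parameter coordinates
        resid = X[rows] @ w - y[rows]
        g = X[rows][:, cols].T @ resid / len(rows)
        w[cols] -= lr * g

err = float(np.linalg.norm(w - w_true))
print(err)                                # parameter error after training
```

Because each parameter block is touched by every data block over a full cycle, no single (possibly outlier-contaminated) subset dominates any block's updates, which is the robustness argument the abstract makes.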