
Block-cyclic stochastic coordinate descent for deep neural networks

Full metadata record
dc.contributor.author: Nakamura, K.
dc.contributor.author: Soatto, S.
dc.contributor.author: Hong, B.-W.
dc.date.accessioned: 2021-07-22T06:53:52Z
dc.date.available: 2021-07-22T06:53:52Z
dc.date.issued: 2021-07
dc.identifier.issn: 0893-6080
dc.identifier.issn: 1879-2782
dc.identifier.uri: https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/47751
dc.description.abstract: We present a stochastic first-order optimization algorithm, named block-cyclic stochastic coordinate descent (BCSC), that adds a cyclic constraint to stochastic block-coordinate descent in the selection of both data and parameters. It uses different subsets of the data to update different subsets of the parameters, thus limiting the detrimental effect of outliers in the training set. Empirical tests on image classification benchmark datasets show that BCSC outperforms state-of-the-art optimization methods in generalization, leading to higher accuracy within the same number of update iterations. The improvements are consistent across different architectures and datasets, and can be combined with other training techniques and regularizations. © 2021 Elsevier Ltd (An illustrative sketch of this block-cyclic update pattern follows the metadata record.)
dc.format.extent: 10
dc.language: English
dc.language.iso: ENG
dc.publisher: Elsevier Ltd
dc.title: Block-cyclic stochastic coordinate descent for deep neural networks
dc.type: Article
dc.identifier.doi: 10.1016/j.neunet.2021.04.001
dc.identifier.bibliographicCitation: Neural Networks, v.139, pp. 348-357
dc.description.isOpenAccess: Y
dc.identifier.wosid: 000652682000015
dc.identifier.scopusid: 2-s2.0-85104361217
dc.citation.endPage: 357
dc.citation.startPage: 348
dc.citation.title: Neural Networks
dc.citation.volume: 139
dc.type.docType: Article
dc.publisher.location: United Kingdom
dc.subject.keywordAuthor: Coordinate descent
dc.subject.keywordAuthor: Deep neural network
dc.subject.keywordAuthor: Energy optimization
dc.subject.keywordAuthor: Stochastic gradient descent
dc.subject.keywordPlus: Classification (of information)
dc.subject.keywordPlus: Gradient methods
dc.subject.keywordPlus: Optimization
dc.subject.keywordPlus: Stochastic systems
dc.subject.keywordPlus: Block coordinate descents
dc.subject.keywordPlus: Coordinate descent
dc.subject.keywordPlus: Energy optimization
dc.subject.keywordPlus: First order
dc.subject.keywordPlus: Neural-networks
dc.subject.keywordPlus: Optimization algorithms
dc.subject.keywordPlus: Ordering optimizations
dc.subject.keywordPlus: Stochastic gradient descent
dc.subject.keywordPlus: Stochastics
dc.subject.keywordPlus: Training sets
dc.subject.keywordPlus: Deep neural networks
dc.subject.keywordPlus: algorithm
dc.subject.keywordPlus: article
dc.subject.keywordPlus: deep neural network
dc.subject.keywordPlus: diagnostic test accuracy study
dc.subject.keywordPlus: stochastic model
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Neurosciences & Neurology
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Neurosciences
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
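
The abstract above describes BCSC as pairing different subsets of the data with different subsets of the parameters under a cyclic constraint. The following is a minimal sketch of that update pattern on a toy least-squares problem in NumPy; the block counts, step size, loss, and all variable names are illustrative assumptions, not the authors' implementation or the paper's experimental setup.

```python
# Minimal sketch of a block-cyclic update pattern, assuming a toy
# least-squares model. Not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_params, n_blocks = 512, 32, 4
X = rng.normal(size=(n_samples, n_params))
w_true = rng.normal(size=n_params)
y = X @ w_true + 0.1 * rng.normal(size=n_samples)

w = np.zeros(n_params)
param_blocks = np.array_split(np.arange(n_params), n_blocks)
lr = 0.05

for epoch in range(20):
    # Randomly partition the data into as many blocks as there are
    # parameter blocks.
    order = rng.permutation(n_samples)
    data_blocks = np.array_split(order, n_blocks)
    # Cyclic constraint: at inner step t, parameter block b is updated
    # with data block (b + t) mod n_blocks, so each parameter block
    # sees every data block exactly once per cycle.
    for t in range(n_blocks):
        for b, p_idx in enumerate(param_blocks):
            d_idx = data_blocks[(b + t) % n_blocks]
            Xd, yd = X[d_idx], y[d_idx]
            residual = Xd @ w - yd
            # Gradient of the mean squared residual w.r.t. this
            # parameter block only.
            grad_block = Xd[:, p_idx].T @ residual / len(d_idx)
            w[p_idx] -= lr * grad_block

print("parameter error:", np.linalg.norm(w - w_true))
```

Over one cycle of n_blocks inner steps, each parameter block is paired with every data block exactly once, which is the cyclic constraint the abstract refers to; applying the same scheme to a deep network would replace the closed-form least-squares gradient with per-block gradients obtained by backpropagation.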
Appears in Collections: College of Software > Department of Artificial Intelligence > 1. Journal Articles



Related Researcher

Hong, Byung-Woo, College of Software (Department of Artificial Intelligence)
