Detailed Information

Cited 0 times in Web of Science; cited 2 times in Scopus

Convergence-aware neural network training

Authors
Oh, Hyungjun; Yu, Yongseung; Ryu, Giha; Ahn, Gunjoo; Jeong, Yuri; Park, Yongjun; Seo, Jiwon
Issue Date
Jul-2020
Publisher
Institute of Electrical and Electronics Engineers Inc.
Citation
Proceedings - Design Automation Conference, v.2020-July, pp.1 - 6
Indexed
SCOPUS
Journal Title
Proceedings - Design Automation Conference
Volume
2020-July
Start Page
1
End Page
6
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/145403
DOI
10.1109/DAC18072.2020.9218518
ISSN
0738-100X
Abstract
Training a deep neural network (DNN) is expensive, requiring a large amount of computation time. While the training overhead is high, not all computation in DNN training is equal. Some parameters converge faster, and thus their gradient computation may contribute little to the parameter update; near stationary points, a subset of parameters may change very little. In this paper, we exploit parameter convergence to optimize gradient computation in DNN training. We design a lightweight monitoring technique to track parameter convergence, and we prune the gradient computation stochastically for groups of semantically related parameters, exploiting their convergence correlations. These techniques are efficiently implemented in existing GPU kernels. In our evaluation, the optimization techniques substantially and robustly improve the training throughput for four DNN models on three public datasets.
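
The abstract describes two ideas: a lightweight monitor of parameter convergence and stochastic pruning of gradient computation for groups of related parameters. The following is a minimal PyTorch-style sketch of that idea, not the authors' implementation: it assumes an exponential moving average of per-tensor parameter change as the convergence signal and skips whole parameter tensors, whereas the paper groups semantically related parameters and implements the pruning inside GPU kernels. Names such as ConvergenceMonitor and skip_prob, and the threshold value, are hypothetical.

import torch
import torch.nn as nn

class ConvergenceMonitor:
    """Tracks an EMA of how much each parameter tensor still changes per step."""
    def __init__(self, model, beta=0.9):
        self.beta = beta
        self.prev = {n: p.detach().clone() for n, p in model.named_parameters()}
        self.ema_change = {n: None for n, _ in model.named_parameters()}

    def update(self, model):
        for n, p in model.named_parameters():
            delta = (p.detach() - self.prev[n]).abs().mean().item()
            old = self.ema_change[n]
            self.ema_change[n] = delta if old is None else self.beta * old + (1 - self.beta) * delta
            self.prev[n].copy_(p.detach())

    def skip_prob(self, name, threshold=1e-4):
        # Skip with higher probability as the group's average change approaches zero.
        change = self.ema_change[name]
        if change is None:
            return 0.0
        return max(0.0, 1.0 - change / threshold)

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()
monitor = ConvergenceMonitor(model)

for step in range(100):
    x, y = torch.randn(128, 32), torch.randint(0, 10, (128,))

    # Stochastically freeze near-converged parameter tensors so backward()
    # does not compute their gradients this step.
    frozen = []
    for name, p in model.named_parameters():
        if torch.rand(()).item() < monitor.skip_prob(name):
            p.requires_grad_(False)
            frozen.append(p)

    opt.zero_grad()
    loss = loss_fn(model(x), y)
    if loss.requires_grad:           # guard: all groups frozen this step
        loss.backward()
        opt.step()                   # params with no grad are left untouched

    for p in frozen:                 # unfreeze so skipped groups can resume updating
        p.requires_grad_(True)
    monitor.update(model)

Skipped tensors are unfrozen after every step, so a group whose parameters start changing again is picked back up by the optimizer; the EMA window and skip threshold would control how aggressively gradient work is pruned.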
Appears in Collections
College of Engineering (Seoul) > School of Computer Software (Seoul) > 1. Journal Articles


Related Researcher

Park, Yongjun
College of Engineering (Seoul), School of Computer Software (Seoul)
