Out-Of-Order BackProp: An Effective Scheduling Technique for Deep Learning

Full metadata record
DC Field: Value
dc.contributor.author: Oh, Hyungjun
dc.contributor.author: Lee, Junyeol
dc.contributor.author: Kim, Hyeongju
dc.contributor.author: Seo, Jiwon
dc.date.accessioned: 2022-07-06T06:24:11Z
dc.date.available: 2022-07-06T06:24:11Z
dc.date.created: 2022-05-04
dc.date.issued: 2022-04
dc.identifier.uri: https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/138969
dc.description.abstract: Neural network training requires a large amount of computation, and thus GPUs are often used for acceleration. Although they improve performance, GPUs are underutilized during training. This paper proposes out-of-order (ooo) backprop, an effective scheduling technique for neural network training. By exploiting the dependencies of gradient computations, ooo backprop enables their execution to be reordered to make the most of the GPU resources. We show that GPU utilization in both single- and multi-GPU training can be improved by applying ooo backprop and prioritizing critical operations. We propose three scheduling algorithms based on ooo backprop. For single-GPU training, we schedule with multi-stream ooo computation to mask the kernel launch overhead. In data-parallel training, we reorder the gradient computations to maximize the overlap of computation and parameter communication; in pipeline-parallel training, we prioritize critical gradient computations to reduce pipeline stalls. We evaluate our optimizations with twelve neural networks and five public datasets. Compared to the respective state-of-the-art training systems, our algorithms improve the training throughput by 1.03-1.58× for single-GPU training, by 1.10-1.27× for data-parallel training, and by 1.41-1.99× for pipeline-parallel training. (A toy sketch of this dependency-based reordering follows the metadata record below.)
dc.language: English
dc.language.iso: en
dc.publisher: Association for Computing Machinery, Inc
dc.title: Out-Of-Order BackProp: An Effective Scheduling Technique for Deep Learning
dc.type: Article
dc.contributor.affiliatedAuthor: Seo, Jiwon
dc.identifier.doi: 10.1145/3492321.3519563
dc.identifier.scopusid: 2-s2.0-85128082743
dc.identifier.wosid: 000926506800027
dc.identifier.bibliographicCitation: EuroSys 2022 - Proceedings of the 17th European Conference on Computer Systems, pp. 435-452
dc.relation.isPartOf: EuroSys 2022 - Proceedings of the 17th European Conference on Computer Systems
dc.citation.title: EuroSys 2022 - Proceedings of the 17th European Conference on Computer Systems
dc.citation.startPage: 435
dc.citation.endPage: 452
dc.type.rims: ART
dc.type.docType: Proceedings Paper
dc.description.journalClass: 1
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalWebOfScienceCategory: Computer Science, Hardware & Architecture
dc.relation.journalWebOfScienceCategory: Computer Science, Information Systems
dc.relation.journalWebOfScienceCategory: Computer Science, Software Engineering
dc.relation.journalWebOfScienceCategory: Computer Science, Theory & Methods
dc.subject.keywordPlus: Convolutional neural networks
dc.subject.keywordPlus: Deep learning
dc.subject.keywordPlus: Pipelines
dc.subject.keywordPlus: Program processors
dc.subject.keywordPlus: Scheduling algorithms
dc.subject.keywordPlus: Scheduling
dc.subject.keywordPlus: Critical operations
dc.subject.keywordPlus: Data parallel
dc.subject.keywordPlus: Deep learning system
dc.subject.keywordPlus: Gradients computation
dc.subject.keywordPlus: Large amounts
dc.subject.keywordPlus: Neural networks trainings
dc.subject.keywordPlus: Out of order
dc.subject.keywordPlus: Parallel training
dc.subject.keywordPlus: Performance
dc.subject.keywordPlus: Scheduling techniques
dc.subject.keywordAuthor: Deep learning systems
dc.identifier.url: https://dl.acm.org/doi/10.1145/3492321.3519563
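
The abstract's key observation is that backpropagation's weight-gradient computations are off the critical path and can therefore be reordered. The sketch below is not the paper's system; it is a minimal NumPy-only illustration (the layer widths, the deferred-dW schedule, and the on_ready callback standing in for asynchronous communication are all assumptions made here) of how splitting each layer's backward pass into an input-gradient kernel and a weight-gradient kernel allows the weight gradients to be computed out of order.

```python
# Toy sketch of the dependency structure that out-of-order backprop exploits.
# NOT the paper's implementation: a NumPy illustration with made-up layer sizes.
import numpy as np

rng = np.random.default_rng(0)
sizes = [64, 128, 128, 10]                      # assumed toy layer widths
Ws = [rng.standard_normal((m, n)) * 0.01 for m, n in zip(sizes, sizes[1:])]

def forward(x):
    """Run the linear+ReLU layers, keeping each layer's input for backward."""
    acts = [x]
    for W in Ws:
        x = np.maximum(x @ W, 0.0)
        acts.append(x)
    return acts

def input_grad(layer, acts, dy):
    """Critical-path kernel: gradient w.r.t. the layer's input."""
    mask = (acts[layer + 1] > 0).astype(dy.dtype)
    return (dy * mask) @ Ws[layer].T

def weight_grad(layer, acts, dy):
    """Off-critical-path kernel: gradient w.r.t. the layer's weights."""
    mask = (acts[layer + 1] > 0).astype(dy.dtype)
    return acts[layer].T @ (dy * mask)

def backprop_in_order(acts, dy):
    """Standard schedule: each layer computes dW immediately, then dX."""
    dWs = [None] * len(Ws)
    for layer in reversed(range(len(Ws))):
        dWs[layer] = weight_grad(layer, acts, dy)   # blocks the critical path
        dy = input_grad(layer, acts, dy)
    return dWs

def backprop_out_of_order(acts, dy, on_ready=None):
    """ooo schedule: walk the critical path (input gradients) first and defer
    each dW; on_ready stands in for, e.g., launching communication early."""
    saved_dy = {}
    for layer in reversed(range(len(Ws))):
        saved_dy[layer] = dy                        # dW postponed
        dy = input_grad(layer, acts, dy)
    dWs = [None] * len(Ws)
    for layer in sorted(saved_dy, reverse=True):    # fill in weight gradients later
        dWs[layer] = weight_grad(layer, acts, saved_dy[layer])
        if on_ready:
            on_ready(layer, dWs[layer])
    return dWs

acts = forward(rng.standard_normal((32, sizes[0])))
dy = np.ones((32, sizes[-1]))                       # stand-in loss gradient
ref = backprop_in_order(acts, dy)
ooo = backprop_out_of_order(acts, dy, on_ready=lambda l, g: None)
assert all(np.allclose(a, b) for a, b in zip(ref, ooo))  # same gradients, new order
```

The closing assertion only checks that reordering leaves the gradients unchanged; in the systems the abstract describes, the deferred weight-gradient kernels are what get scheduled onto otherwise idle GPU resources or overlapped with parameter communication and pipeline stalls.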
Appears in Collections
College of Engineering (Seoul) > School of Computer Software (Seoul) > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


Seo, Jiwon
COLLEGE OF ENGINEERING (SCHOOL OF COMPUTER SCIENCE)