Detailed Information

Learning-Rate Annealing Methods for Deep Neural Networks (Open Access)

Authors
Nakamura, Kensuke; Derbel, Bilel; Won, Kyoung-Jae; Hong, Byung-Woo
Issue Date
Aug-2021
Publisher
MDPI
Keywords
learning rate annealing; stochastic gradient descent; image classification
Citation
ELECTRONICS, v.10, no.16
Journal Title
ELECTRONICS
Volume
10
Number
16
URI
https://scholarworks.bwise.kr/cau/handle/2019.sw.cau/49038
DOI
10.3390/electronics10162029
ISSN
2079-9292
Abstract
Deep neural networks (DNNs) have achieved great success over the last decades. DNNs are typically optimized with stochastic gradient descent (SGD) using learning-rate annealing, which outperforms adaptive methods on many tasks. However, there is no commonly accepted choice of annealing schedule for SGD. This paper presents an empirical analysis of learning-rate annealing based on experiments with the major datasets for image classification, one of the key applications of DNNs. Our experiments combine recent deep neural network models with a variety of learning-rate annealing methods. We also propose an annealing schedule that combines a sigmoid function with warmup, which is shown to surpass both the adaptive methods and the other existing schedules in accuracy in most cases.
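
For illustration, below is a minimal sketch in Python of a sigmoid-with-warmup learning-rate schedule of the kind the abstract describes. The function name, parameterization, and default values are assumptions chosen for this sketch; the paper's exact formulation may differ.

```python
import math

def sigmoid_warmup_lr(step, total_steps, base_lr=0.1, final_lr=1e-4,
                      warmup_steps=500, steepness=10.0):
    """Hypothetical sigmoid-with-warmup schedule (illustrative only;
    the paper's exact parameterization may differ)."""
    if step < warmup_steps:
        # Linear warmup: ramp the rate from near 0 up to base_lr.
        return base_lr * (step + 1) / warmup_steps
    # Normalized progress through the post-warmup phase, in [0, 1].
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    # Reversed sigmoid: ~1 at the start of decay, ~0 at the end,
    # with the sharpest drop around the midpoint of training.
    decay = 1.0 / (1.0 + math.exp(steepness * (progress - 0.5)))
    return final_lr + (base_lr - final_lr) * decay

# Example: learning rate at a few points over 10,000 steps.
for s in (0, 250, 500, 5000, 10000):
    print(s, round(sigmoid_warmup_lr(s, total_steps=10000), 5))
```

In this sketch, the rate ramps up linearly during warmup and then follows a reversed sigmoid from the base rate down to the final rate; the `steepness` parameter controls how abruptly the decay transitions around the midpoint of training.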
Appears in Collections
College of Software > Department of Artificial Intelligence > 1. Journal Articles


Related Researcher

Hong, Byung-Woo
College of Software (Department of Artificial Intelligence)
