Detailed Information


Implicit Jacobian Regularization Weighted with Impurity of Probability Output

Authors
Lee, Sungyoon; Park, Jinseong; Lee, Jaewook
Issue Date
Jul-2023
Publisher
ML Research Press
Citation
Proceedings of Machine Learning Research, v.202, pp.19094 - 19140
Indexed
SCOPUS
Journal Title
Proceedings of Machine Learning Research
Volume
202
Start Page
19094
End Page
19140
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/192170
Abstract
The success of deep learning is greatly attributed to stochastic gradient descent (SGD), yet it remains unclear how SGD finds well-generalized models. We demonstrate that SGD has an implicit regularization effect on the logit-weight Jacobian norm of neural networks. This regularization effect is weighted with the impurity of the probability output, and thus it is active only in a certain phase of training. Moreover, based on these findings, we propose a novel optimization method that explicitly regularizes the Jacobian norm, which achieves performance comparable to other state-of-the-art sharpness-aware optimization methods.
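The abstract describes a penalty on the logit-weight Jacobian norm that is scaled by the impurity of the probability output. The following is a minimal illustrative sketch of that idea, not the authors' actual algorithm: it assumes a linear classifier (where the Jacobian norm has a closed form) and uses Gini impurity as the impurity measure, both of which are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the logits.
    e = np.exp(z - z.max())
    return e / e.sum()

def gini_impurity(p):
    # Gini impurity of the probability output: 1 - sum_i p_i^2.
    # Zero for a one-hot (confident) output, maximal for a uniform output.
    return 1.0 - np.sum(p ** 2)

def regularized_loss(W, x, y, lam=0.1):
    """Cross-entropy plus an impurity-weighted logit-weight Jacobian penalty.

    For a linear model z = W @ x, the Jacobian dz/dW has squared
    Frobenius norm K * ||x||^2 (K = number of classes), so the
    penalty can be evaluated in closed form here.
    """
    z = W @ x
    p = softmax(z)
    ce = -np.log(p[y])                  # cross-entropy for true class y
    K = W.shape[0]
    jac_sq = K * np.dot(x, x)           # ||dz/dW||_F^2 for a linear model
    return ce + lam * gini_impurity(p) * jac_sq
```

Because the impurity weight vanishes as the model becomes confident, the penalty is active mainly while the probability output is still close to uniform, mirroring the phase-dependent effect described in the abstract.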
Appears in
Collections
Seoul College of Engineering > Seoul School of Computer Software > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Sungyoon
COLLEGE OF ENGINEERING (SCHOOL OF COMPUTER SCIENCE)
