
Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks

Authors
Boo, Yoonho; Shin, Sungho; Choi, Jung wook; Sung, Wonyong
Issue Date
Feb-2021
Citation
Proceedings of the AAAI Conference on Artificial Intelligence, v.35, no.8
Indexed
OTHER
Journal Title
Proceedings of the AAAI Conference on Artificial Intelligence
Volume
35
Number
8
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/190557
DOI
10.1609/aaai.v35i8.16839
ISSN
2159-5399
Abstract
The quantization of deep neural networks (QDNNs) has been actively studied for deployment on edge devices. Recent studies employ knowledge distillation (KD) to improve the performance of quantized networks. In this study, we propose stochastic precision ensemble training for QDNNs (SPEQ). SPEQ is a knowledge distillation training scheme in which the teacher is formed by sharing the model parameters of the student network. We obtain the teacher's soft labels by stochastically changing the activation bit precision at each layer of the forward-pass computation. The student model is trained with these soft labels to reduce activation quantization noise. A cosine similarity loss is employed for KD training instead of the KL divergence. Because the teacher model changes continuously through random bit-precision assignment, the scheme exploits the effect of stochastic ensemble KD. SPEQ outperforms existing quantization training methods on various tasks, such as image classification, question answering, and transfer learning, without the need for cumbersome teacher networks.
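
The abstract describes the SPEQ training loop at a high level; the following is a minimal PyTorch-style sketch of that idea, not the authors' implementation. It assumes a model whose quantized layers expose an `act_bits` attribute read by their activation quantizers; that attribute name, the candidate bit-widths, and the loss weighting `alpha` are illustrative assumptions.

```python
# Hedged sketch of a SPEQ-style training step: the teacher shares the student's
# parameters, but its per-layer activation precision is drawn at random, while
# the student runs at a fixed low precision and is distilled with a cosine loss.
import random
import torch
import torch.nn.functional as F

BIT_CANDIDATES = [2, 4, 8]   # assumed candidate activation bit-widths for the teacher pass
STUDENT_BITS = 2             # assumed fixed low precision for the student pass

def set_activation_bits(model, bits_fn):
    """Assign an activation bit-width to every quantized layer (hypothetical `act_bits` attribute)."""
    for layer in model.modules():
        if hasattr(layer, "act_bits"):
            layer.act_bits = bits_fn()   # bits_fn() is called per layer, so each layer draws its own value

def speq_step(model, x, y, optimizer, alpha=0.5):
    # Teacher pass: same parameters, random per-layer activation precision
    # (the "stochastic precision ensemble" teacher).
    set_activation_bits(model, lambda: random.choice(BIT_CANDIDATES))
    with torch.no_grad():
        teacher_logits = model(x)

    # Student pass: fixed low activation precision.
    set_activation_bits(model, lambda: STUDENT_BITS)
    student_logits = model(x)

    # Cosine-similarity distillation loss (used in place of KL divergence),
    # combined with the ordinary cross-entropy task loss.
    kd_loss = 1.0 - F.cosine_similarity(student_logits, teacher_logits, dim=1).mean()
    task_loss = F.cross_entropy(student_logits, y)
    loss = task_loss + alpha * kd_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the teacher's soft labels come from a freshly re-randomized precision assignment at every step, repeated calls to `speq_step` approximate distilling from an ensemble of differently quantized teachers without maintaining a separate teacher network.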
Appears in Collections
College of Engineering (Seoul) > School of Electronic Engineering (Seoul) > 1. Journal Articles



Related Researcher

Choi, Jung wook
COLLEGE OF ENGINEERING (SCHOOL OF ELECTRONIC ENGINEERING)
