An Information-Theoretic Justification for Model Pruning
- Authors
- Isik, B.; Weissman, T.; No, A.
- Issue Date
- 2022
- Publisher
- ML Research Press
- Citation
- Proceedings of Machine Learning Research, vol. 151, pp. 3821-3846
- Journal Title
- Proceedings of Machine Learning Research
- Volume
- 151
- Start Page
- 3821
- End Page
- 3846
- URI
- https://scholarworks.bwise.kr/hongik/handle/2020.sw.hongik/31494
- ISSN
- 2640-3498
- Abstract
- We study the neural network (NN) compression problem, viewing the tension between the compression ratio and NN performance through the lens of rate-distortion theory. We choose a distortion metric that reflects the effect of NN compression on the model output and derive the tradeoff between rate (compression) and distortion. In addition to characterizing theoretical limits of NN compression, this formulation shows that pruning, implicitly or explicitly, must be a part of a good compression algorithm. This observation bridges a gap between parts of the literature pertaining to NN and data compression, respectively, providing insight into the empirical success of model pruning. Finally, we propose a novel pruning strategy derived from our information-theoretic formulation and show that it outperforms the relevant baselines on the CIFAR-10 and ImageNet datasets.
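The rate-distortion framing in the abstract follows the standard information-theoretic template. As a point of reference (textbook notation, not quoted from the paper), the rate-distortion function for a parameter source W with compressed reconstruction Ŵ under a distortion measure d is:

```latex
% Textbook rate-distortion function (assumed notation, not reproduced from
% the paper): W is the source of network parameters, \hat{W} its compressed
% reconstruction, and d an output-aware distortion measure.
R(D) = \min_{p(\hat{w}\mid w)\,:\,\mathbb{E}[d(W,\hat{W})]\le D} I(W;\hat{W})
```

To make the rate-distortion view concrete, the sketch below prunes a single dense layer by magnitude and measures an output-level distortion of the kind the abstract describes (one that reflects the effect of compression on the model output). This is a hypothetical illustration, not the authors' proposed pruning strategy; the function names, layer shapes, and relative-error distortion are all assumptions.

```python
# Minimal sketch (assumed, not the paper's algorithm): magnitude pruning of
# one dense layer, with distortion measured on the layer output rather than
# on the weights themselves.
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude entries until `sparsity` fraction is zero."""
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def output_distortion(x: np.ndarray, w: np.ndarray, w_hat: np.ndarray) -> float:
    """Relative change in the layer output caused by compression:
    ||xW - xW_hat||^2 / ||xW||^2."""
    ref = x @ w
    return float(np.linalg.norm(ref - x @ w_hat) ** 2 / np.linalg.norm(ref) ** 2)

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 64))   # dense layer weights (hypothetical shapes)
x = rng.normal(size=(32, 256))   # a batch of inputs
for s in (0.5, 0.8, 0.95):       # sweep the rate (sparsity) axis
    d = output_distortion(x, w, magnitude_prune(w, s))
    print(f"sparsity={s:.2f}  distortion={d:.4f}")
```

Sweeping the sparsity level traces an empirical rate-distortion curve for this one layer: higher sparsity (lower rate) yields higher output distortion, which is the tradeoff the abstract characterizes.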
- Appears in Collections
- College of Engineering > School of Electronic & Electrical Engineering > 1. Journal Articles