Presentation 2022-12-15
A DNN compression method based on output error of activation functions
Koji Kamma, Toshikazu Wada
Abstract(in English) Deep Neural Networks (DNNs) are dominant in the field of machine learning. However, because DNN models have large computational complexity, implementing them on resource-limited equipment is challenging. Therefore, techniques for compressing DNN models without degrading their accuracy are desired. Pruning is one such technique; it removes redundant neurons (or channels). In this paper, we present Pruning with Output Error Minimization (POEM), a method that performs not only pruning but also reconstruction to compensate for the error caused by pruning. The strength of POEM lies in its reconstruction, which minimizes the output error of the activation function, whereas previous methods minimize the error before the activation function. We conducted experiments with well-known DNN models (VGG-16, ResNet-18, MobileNet) and image recognition datasets (ImageNet, CUB-200-2011). The results show that POEM significantly outperformed the previous methods in maintaining the accuracy of the compressed models.
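
To make the abstract's distinction concrete, below is a minimal sketch contrasting the two reconstruction targets on a toy fully connected layer with a ReLU activation: a closed-form least-squares fit of the pre-activation error (the approach attributed to the previous methods) versus plain gradient descent on the post-activation output error (the idea behind POEM). This is an illustration only, not the paper's algorithm; the layer sizes, the norm-based channel selection, and the gradient-descent solver are all assumptions made for this example.

    import numpy as np

    def relu(z):
        return np.maximum(z, 0.0)

    rng = np.random.default_rng(0)

    # Toy setting: a fully connected layer with inputs X and weights W.
    n, d_in, d_out = 512, 64, 32
    X = rng.normal(size=(n, d_in))
    W = rng.normal(size=(d_in, d_out))

    # Prune input neurons: keep the k rows of W with the largest norms
    # (a simple selection criterion assumed for this example).
    k = 48
    keep = np.argsort(-np.linalg.norm(W, axis=1))[:k]
    X_kept = X[:, keep]
    target = relu(X @ W)  # output of the original (unpruned) layer

    # Previous methods: minimize the PRE-activation error
    # ||X W - X_kept W_pre||^2, which has a closed-form least-squares solution.
    W_pre, *_ = np.linalg.lstsq(X_kept, X @ W, rcond=None)

    # Idea behind POEM: minimize the POST-activation (output) error
    # ||relu(X W) - relu(X_kept W_post)||^2. Sketched here with plain
    # gradient descent, not the paper's actual solver.
    W_post = W_pre.copy()
    lr = 0.5
    for _ in range(300):
        pre = X_kept @ W_post
        err = relu(pre) - target
        grad = X_kept.T @ (err * (pre > 0)) / n  # chain rule through ReLU
        W_post -= lr * grad

    def output_error(Wc):
        return np.mean((relu(X_kept @ Wc) - target) ** 2)

    print("output error, pre-activation fit :", output_error(W_pre))
    print("output error, post-activation fit:", output_error(W_post))

Intuitively, once the ReLU is inside the objective, mismatches on units that the activation clips to zero stop contributing to the loss, so the remaining capacity of the pruned layer is spent where it affects the layer's actual output; this is the motivation for minimizing the output error of the activation function.
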
Keyword(in English) pruning / reconstruction / activation function

Conference Information
Committee PRMU
Conference Date 2022/12/15 (2 days)
Place (in English) Toyama International Conference Center
Chair Seiichi Uchida(Kyushu Univ.)
Vice Chair Takuya Funatomi(NAIST) / Mitsuru Anpai(Denso IT Lab.)
Secretary Takuya Funatomi(CyberAgent) / Mitsuru Anpai(Univ. of Tokyo)
Assistant Nakamasa Inoue(Tokyo Inst. of Tech.) / Yasutomo Kawanishi(Riken)

Paper Information
Registration To Technical Committee on Pattern Recognition and Media Understanding
Language JPN
Title (in English) A DNN compression method based on output error of activation functions
Keyword(1) pruning
Keyword(2) reconstruction
Keyword(3) activation function
1st Author's Name Koji Kamma
1st Author's Affiliation Wakayama University(Wakayama Univ.)
2nd Author's Name Toshikazu Wada
2nd Author's Affiliation Wakayama University(Wakayama Univ.)
Date 2022-12-15
Paper # PRMU2022-38
Volume (vol) vol.122
Number (no) no.314 (PRMU)
Page pp.34-39 (PRMU)
#Pages 6
Date of Issue 2022-12-08 (PRMU)