Abstract / Keywords |
Title |
2021-12-17 15:15
Task-independent redundancy reduction method using regularization for efficient neural network training ○Charvi Vitthal・Florian Beye・Koichi Nihei・Hayato Itsumi(NEC) PRMU2021-58 |
Abstract |
(Japanese) |
(not yet registered) |
(English) |
Neural networks (NNs) have been widely used for various applications in recent years. However, it is difficult for an NN to learn the optimal amount of information due to under-fitting and over-fitting. One reason is the presence of repeated information or inoperative components, in other words, redundancies. Hence, mitigating redundancies is essential for improving accuracy. Current methods do not capture all the ways to reduce redundancies without changing the network architecture. This paper proposes a neural network training method that reduces these redundancies. We propose novel metrics to quantify redundancies, along with ways to compute them. We evaluate our method on different tasks: 2D object detection, 3D object detection, and image classification. Experimental results show up to a 4% increase in accuracy on the 2D object detection task. |
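The abstract does not define the redundancy metrics or the regularizer themselves. As a generic illustration of the underlying idea only (not the authors' method), the following sketch penalizes correlation between feature channels, one common way to regularize away repeated information in activations; the function name and all details are hypothetical.

```python
import numpy as np

def redundancy_penalty(features):
    """Generic redundancy regularizer (illustrative, not the report's metric).

    `features` is an (N, D) batch of activations. Channels that carry
    repeated information correlate strongly, so penalizing the off-diagonal
    entries of the channel correlation matrix pushes channels to encode
    distinct information.
    """
    f = features - features.mean(axis=0, keepdims=True)      # center channels
    f = f / (np.linalg.norm(f, axis=0, keepdims=True) + 1e-8)  # unit-norm columns
    corr = f.T @ f                                           # (D, D) channel correlations
    off_diag = corr - np.diag(np.diag(corr))                 # zero out the diagonal
    return float(np.sum(off_diag ** 2))

# Duplicated channels are maximally redundant and draw a large penalty;
# independent random channels draw a penalty near zero.
x = np.random.default_rng(0).normal(size=(64, 8))
print(redundancy_penalty(np.hstack([x, x])) > redundancy_penalty(x))  # True
```

In a training loop such a term would be added, scaled by a weight, to the task loss, so that gradient descent trades task accuracy against channel decorrelation.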
Keywords |
(Japanese) |
(not yet registered) |
(English) |
Redundancy / Regularization / Information / Neural Network |
Bibliographic information |
IEICE Tech. Rep., vol. 121, no. 304, PRMU2021-58, pp. 188-194, Dec. 2021. |
Report number |
PRMU2021-58 |
Date of issue |
2021-12-09 (PRMU) |
ISSN |
Online edition: ISSN 2432-6380 |
Copyright |
Copyright of articles published in IEICE Technical Reports belongs to the Institute of Electronics, Information and Communication Engineers (IEICE). (Permission numbers: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034) |