Presentation 2022-06-09
A Study on Accelerating Stochastic Weight Difference Propagation with Momentum Term
Shahrzad Mahboubi, Hiroshi Ninomiya
Abstract(in Japanese) (See Japanese page)
Abstract(in English) With the rapid development of the IoT, there is an increasing need to process data on microcomputers equipped with neural networks (NNs). NNs implemented on microcomputers are typically trained with the stochastic gradient descent (SGD) method, in which the derivative of the error at the output layer must be backpropagated for each training sample to update the weights; this drives up hardware and computational costs. To solve this problem, we proposed Stochastic Weight Difference Propagation (SWDP), which updates the weights using the product of the weights and their differences. In this research, a modified algorithm, Momentum Stochastic Weight Difference Propagation (MoSWDP), is proposed, which introduces a momentum term to accelerate SWDP. The proposed method is expected to suppress the increase in the number of iterations, one of the disadvantages of SWDP, and to enable faster training while maintaining training accuracy.
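As a rough illustration (not taken from the paper): a minimal Python sketch of how a classical momentum term can be attached to a weight update whose search direction comes from SWDP. The function name moswdp_step, the parameter names, and the placeholder swdp_direction (standing in for SWDP's gradient surrogate built from the weights and their differences) are all hypothetical; the paper's actual update rule may differ.

    import numpy as np

    def moswdp_step(w, w_prev, swdp_direction, lr=0.01, alpha=0.9):
        # Hypothetical MoSWDP update with a classical momentum term:
        #   w_next = w - lr * swdp_direction + alpha * (w - w_prev)
        # swdp_direction is a placeholder for SWDP's update direction.
        w_next = w - lr * swdp_direction + alpha * (w - w_prev)
        return w_next, w  # return the new weights and the previous ones

    # Toy usage: one update step on a small weight vector.
    w_prev = np.zeros(3)
    w = np.array([0.5, -0.2, 0.1])
    direction = np.array([0.05, -0.02, 0.01])  # stand-in for the SWDP term
    w, w_prev = moswdp_step(w, w_prev, direction)

The momentum term alpha * (w - w_prev) reuses the previous weight change, which is what is expected to cut the iteration count relative to plain SWDP.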
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Neural network / gradient-based training algorithm / stochastic gradient descent method / weight difference propagation / momentum term
Paper # NLP2022-9,CCS2022-9
Date of Issue 2022-06-02 (NLP, CCS)

Conference Information
Committee CCS / NLP
Conference Date 2022/6/9 (2 days)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair Megumi Akai(Hokkaido Univ.) / Akio Tsuneda(Kumamoto Univ.)
Vice Chair Masaki Aida(TMU) / Hidehiro Nakano(Tokyo City Univ.) / Hiroyuki Torikai(Hosei Univ.)
Secretary Masaki Aida(TDK) / Hidehiro Nakano(Shibaura Inst. of Tech.) / Hiroyuki Torikai(Sojo Univ.)
Assistant Tomoyuki Sasaki(Shonan Inst. of Tech.) / Hiroyasu Ando(Tsukuba Univ.) / Miki Kobayashi(Rissho Univ.) / Hiroyuki Yasuda(The Univ. of Tokyo) / Yuichi Yokoi(Nagasaki Univ.) / Yoshikazu Yamanaka(Utsunomiya Univ.)

Paper Information
Registration To Technical Committee on Complex Communication Sciences / Technical Committee on Nonlinear Problems
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) A Study on Accelerating Stochastic Weight Difference Propagation with Momentum Term
Sub Title (in English)
Keyword(1) Neural network
Keyword(2) gradient-based training algorithm
Keyword(3) stochastic gradient descent method
Keyword(4) weight difference propagation
Keyword(5) momentum term
1st Author's Name Shahrzad Mahboubi
1st Author's Affiliation Shonan Institute of Technology(Shonan Inst. of Tech.)
2nd Author's Name Hiroshi Ninomiya
2nd Author's Affiliation Shonan Institute of Technology(Shonan Inst. of Tech.)
Date 2022-06-09
Paper # NLP2022-9,CCS2022-9
Volume (vol) vol.122
Number (no) NLP-65,CCS-66
Page pp.40-45 (NLP), pp.40-45 (CCS)
#Pages 6
Date of Issue 2022-06-02 (NLP, CCS)