Presentation 2021-12-01
A Multilayer Perceptron Training Accelerator using Systolic Array
Takeshi Senoo, Akira Jinguji, Ryosuke Kuramochi, Hiroki Nakahara
Abstract(in Japanese) (See Japanese page)
Abstract(in English) Neural networks are used in a wide range of applications, and the demand for fast training on large amounts of data is growing. For example, a network intrusion detection (NID) system must be trained in a short time so that it can detect attacks from a large volume of traffic logs. To address this problem, we propose a training accelerator implemented as a systolic array on a Xilinx Alveo U50 FPGA card. We found that accuracy remains almost the same as with conventional training even when the forward and backward passes are executed simultaneously by delaying the weight update. Compared with an Intel Core i9 CPU and an NVIDIA RTX 3090 GPU, our accelerator was three times faster than the CPU and 2.5 times faster than the GPU. Its processing speed per unit of power consumption was 11.5 times better than that of the CPU and 21.4 times better than that of the GPU. From these results, we conclude that implementing a training accelerator on an FPGA as a systolic array achieves both high speed and high energy efficiency.
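The key idea described in the abstract is that the forward and backward passes can run simultaneously if the weight update is applied one step late. The following is a minimal sketch of such a delayed-update scheme on a tiny two-layer MLP; the NumPy implementation, toy dataset, layer sizes, and one-step delay are illustrative assumptions and do not reproduce the authors' FPGA systolic-array design.

# Minimal sketch (not the authors' implementation): delayed weight update
# on a two-layer MLP, so the update of step t-1 can overlap with the
# forward/backward computation of step t.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (stand-in for NID traffic features).
X = rng.normal(size=(256, 16))
y = (X[:, 0] * X[:, 1] > 0).astype(np.float64).reshape(-1, 1)

# Two-layer MLP: 16 -> 32 -> 1
W1 = rng.normal(scale=0.1, size=(16, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 1));  b2 = np.zeros(1)

lr = 0.1
pending = None  # gradients from the previous step, applied one step late

for step in range(200):
    # Forward pass with the current (not yet updated) weights.
    h = np.maximum(X @ W1 + b1, 0.0)          # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

    # Backward pass (binary cross-entropy gradients).
    dz2 = (p - y) / len(X)
    gW2 = h.T @ dz2;  gb2 = dz2.sum(0)
    dh  = dz2 @ W2.T * (h > 0)
    gW1 = X.T @ dh;   gb1 = dh.sum(0)

    # Delayed update: apply the gradients computed on the previous step.
    if pending is not None:
        pW1, pb1, pW2, pb2 = pending
        W1 -= lr * pW1; b1 -= lr * pb1
        W2 -= lr * pW2; b2 -= lr * pb2
    pending = (gW1, gb1, gW2, gb2)

acc = ((p > 0.5) == y).mean()
print(f"final training accuracy: {acc:.3f}")

In software the delay buys nothing, but in a systolic array it removes the dependency that would otherwise force the forward pass to wait for the weight update, which is what allows the pipelined training reported in the paper.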
Keyword(in Japanese) (See Japanese page)
Keyword(in English) neural network / multilayer perceptron / training accelerator / machine learning / intrusion detection system
Paper # VLD2021-23,ICD2021-33,DC2021-29,RECONF2021-31
Date of Issue 2021-11-24 (VLD, ICD, DC, RECONF)

Conference Information
Committee VLD / DC / RECONF / ICD / IPSJ-SLDM
Conference Date 2021/12/1(2days)
Place (in Japanese) (See Japanese page)
Place (in English) Online
Topics (in Japanese) (See Japanese page)
Topics (in English) Design Gaia 2021 -New Field of VLSI Design-
Chair Kazutoshi Kobayashi(Kyoto Inst. of Tech.) / Hiroshi Takahashi(Ehime Univ.) / Kentaro Sano(RIKEN) / Masafumi Takahashi(Kioxia) / Yuichi Nakamura(NEC)
Vice Chair Minako Ikeda(NTT) / Tatsuhiro Tsuchiya(Osaka Univ.) / Yoshiki Yamaguchi(Tsukuba Univ.) / Tomonori Izumi(Ritsumeikan Univ.) / Makoto Ikeda(Univ. of Tokyo)
Secretary Minako Ikeda(Osaka Univ.) / Tatsuhiro Tsuchiya(NEC) / Yoshiki Yamaguchi(Nihon Univ.) / Tomonori Izumi(Chiba Univ.) / Makoto Ikeda(NEC) / (Tokyo Inst. of Tech.)
Assistant / / Yukitaka Takemura(INTEL) / Yasunori Osana(Ryukyu Univ.) / Kosuke Miyaji(Shinshu Univ.) / Yoshiaki Yoshihara(Kioxia) / Takeshi Kuboki(Kyushu Univ.)

Paper Information
Registration To Technical Committee on VLSI Design Technologies / Technical Committee on Dependable Computing / Technical Committee on Reconfigurable Systems / Technical Committee on Integrated Circuits and Devices / Special Interest Group on System and LSI Design Methodology
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) A Multilayer Perceptron Training Accelerator using Systolic Array
Sub Title (in English)
Keyword(1) neural network
Keyword(2) multilayer perceptron
Keyword(3) training accelerator
Keyword(4) machine learning
Keyword(5) intrusion detection system
1st Author's Name Takeshi Senoo
1st Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
2nd Author's Name Akira Jinguji
2nd Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
3rd Author's Name Ryosuke Kuramochi
3rd Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
4th Author's Name Hiroki Nakahara
4th Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
Date 2021-12-01
Paper # VLD2021-23,ICD2021-33,DC2021-29,RECONF2021-31
Volume (vol) vol.121
Number (no) VLD-277,ICD-278,DC-279,RECONF-280
Page pp.37-42(VLD), pp.37-42(ICD), pp.37-42(DC), pp.37-42(RECONF)
#Pages 6
Date of Issue 2021-11-24 (VLD, ICD, DC, RECONF)