Presentation 2022-01-24
Addition of DPU Training Function by Tail Layer Training
Yuki Takashima, Akira Jinguji, Hiroki Nakahara,
Abstract(in Japanese) (See Japanese page)
Abstract(in English) The demand for deep learning has been increasing, and many hardware implementations have been proposed. The Deep Learning Processor Unit (DPU) provided by Xilinx can perform inference at high speed, but it cannot perform training. We propose tail layer training, which makes the tail layer of a Convolutional Neural Network (CNN) independent: all layers except the tail layer are computed on the DPU, and the tail layer is computed on a CPU. Since the number of output neurons must equal the number of classes in image classification, this approach is effective for retraining when classes are added. Through tail layer training, we found that the relationship between the existing classes and the classes to be added is important; therefore, the method is not suitable for adding a large number of classes. However, on a dataset such as CIFAR-10, it limits the accuracy loss to about 3 points between training the entire model on all 10 classes and training only the tail layer on 2 added classes after training the entire model on 8 classes.
Keyword(in Japanese) (See Japanese page)
Keyword(in English) CNN / Image Classification / DPU / Tail Layer Training
Paper # VLD2021-59,CPSY2021-28,RECONF2021-67
Date of Issue 2022-01-17 (VLD, CPSY, RECONF)
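
The following PyTorch sketch illustrates the tail layer training described in the abstract, as far as it can be inferred from that summary: the CNN body is frozen and would run on the DPU as a fixed feature extractor, while only the tail (output) layer is trained on the CPU so that new classes can be added. The module names, layer sizes, and the copy-over of old class weights are illustrative assumptions, not the authors' implementation.

# Minimal sketch of "tail layer training" (assumed from the abstract):
# body = DPU-side inference-only part, tail = CPU-side trainable output layer.
import torch
import torch.nn as nn

class Body(nn.Module):
    """CNN body; in the paper this part is executed by the DPU (inference only)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
    def forward(self, x):
        return self.features(x).flatten(1)

body = Body().eval()                 # frozen: stands in for the DPU-side model
for p in body.parameters():
    p.requires_grad_(False)

# Tail layer first trained on 8 CIFAR-10 classes (hypothetical sizes).
tail = nn.Linear(32, 8)

# Add 2 classes: build a new 10-way tail and keep the rows of the old classes.
new_tail = nn.Linear(32, 10)
with torch.no_grad():
    new_tail.weight[:8].copy_(tail.weight)
    new_tail.bias[:8].copy_(tail.bias)
tail = new_tail

# Only the tail's parameters are updated; the body never changes.
optimizer = torch.optim.SGD(tail.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

images = torch.randn(4, 3, 32, 32)   # dummy batch
labels = torch.randint(0, 10, (4,))
with torch.no_grad():                # DPU side: pure inference
    feats = body(images)
loss = criterion(tail(feats), labels)  # CPU side: train the tail only
optimizer.zero_grad()
loss.backward()
optimizer.step()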

Conference Information
Committee RECONF / VLD / CPSY / IPSJ-ARC / IPSJ-SLDM
Conference Date 2022/1/24 (2 days)
Place (in Japanese) (See Japanese page)
Place (in English) Online
Topics (in Japanese) (See Japanese page)
Topics (in English) FPGA Applications, etc.
Chair Kentaro Sano(RIKEN) / Kazutoshi Kobayashi(Kyoto Inst. of Tech.) / Michihiro Koibuchi(NII) / Hiroshi Inoue(Kyushu Univ.) / Yuichi Nakamura(NEC)
Vice Chair Yoshiki Yamaguchi(Tsukuba Univ.) / Tomonori Izumi(Ritsumeikan Univ.) / Minako Ikeda(NTT) / Kota Nakajima(Fujitsu Lab.) / Tomoaki Tsumura(Nagoya Inst. of Tech.)
Secretary Yoshiki Yamaguchi(NEC) / Tomonori Izumi(Tokyo Inst. of Tech.) / Minako Ikeda(Osaka Univ.) / Kota Nakajima(NEC) / Tomoaki Tsumura(JAIST) / (Hitachi) / (Univ. of Tokyo)
Assistant Yukitaka Takemura(INTEL) / Yasunori Osana(Ryukyu Univ.) / / Ryohei Kobayashi(Tsukuba Univ.) / Takaaki Miyajima(Meiji Univ.)

Paper Information
Registration To Technical Committee on Reconfigurable Systems / Technical Committee on VLSI Design Technologies / Technical Committee on Computer Systems / Special Interest Group on System Architecture / Special Interest Group on System and LSI Design Methodology
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Addition of DPU Training Function by Tail Layer Training
Sub Title (in English)
Keyword(1) CNN
Keyword(2) Image Classification
Keyword(3) DPU
Keyword(4) Tail Layer Training
1st Author's Name Yuki Takashima
1st Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
2nd Author's Name Akira Jinguji
2nd Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
3rd Author's Name Hiroki Nakahara
3rd Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
Date 2022-01-24
Paper # VLD2021-59,CPSY2021-28,RECONF2021-67
Volume (vol) vol.121
Number (no) no.342(VLD), no.343(CPSY), no.344(RECONF)
Page pp.55-60(VLD), pp.55-60(CPSY), pp.55-60(RECONF)
#Pages 6
Date of Issue 2022-01-17 (VLD, CPSY, RECONF)