Presentation 2023-01-19
[Short Paper] A Study of Model Training Method based on Federated Learning for Distributed Inference with Split Computing
Yutaro Horikawa, Takayuki Nishio
Abstract(in Japanese) (See Japanese page)
Abstract(in English) SC (Split computing) is a distributed inference method for load balancing and latency reduction that splits a neural network and deploys the parts to a mobile device with low computing power and a server with high computing power. The models executed on the mobile device and the server are called the Head Network and the Tail Network, respectively. The Head Network must be kept small, in accordance with the computing and communication performance of the mobile device, without degrading its accuracy. In this paper, we propose a data-efficient Head Network training method for SC. The proposed method can train a Head Network with low computational load and low communication traffic, using a small amount of non-IID, unlabeled data from mobile devices and a pre-trained model with a large number of parameters. It achieves data-efficient training by integrating a knowledge-distillation-based Head Network training method with federated learning. The experimental evaluation shows that the proposed method can train Head Network models with low computational load and low communication cost while suppressing accuracy degradation relative to the pre-trained model.
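To illustrate the split computing setting the abstract describes, the following is a minimal sketch of a network split into a Head Network (run on the mobile device) and a Tail Network (run on the server). The toy two-layer model, the split point, and all variable names are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network; where to split is a design choice (assumed here).
W1 = rng.standard_normal((4, 8))   # head weights (on the mobile device)
W2 = rng.standard_normal((8, 3))   # tail weights (on the server)

def head(x):
    """Head Network: small early layers executed on the mobile device."""
    return np.maximum(x @ W1, 0.0)  # ReLU activation

def tail(h):
    """Tail Network: remaining layers executed on the server."""
    return h @ W2

x = rng.standard_normal((1, 4))    # input sensed on the device
h = head(x)                        # intermediate features sent to the server
y = tail(h)                        # server completes the inference
print(y.shape)                     # (1, 3)
```

Only the intermediate features `h` cross the network, which is why keeping the Head Network small reduces both the device's compute load and the communication cost.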
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Federated learning / Distributed inference / Split computing / Knowledge distillation / non-IID data / Personalization
Paper # SeMI2022-76
Date of Issue 2023-01-12 (SeMI)

Conference Information
Committee SeMI
Conference Date 2023/1/19(2days)
Place (in Japanese) (See Japanese page)
Place (in English) Naruto Grand Hotel
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair Koji Yamamoto(Kyoto Univ.)
Vice Chair Kazuya Monden(Hitachi) / Yasunori Owada(NICT) / Shunsuke Saruwatari(Osaka Univ.)
Secretary Kazuya Monden(NTT DOCOMO) / Yasunori Owada(Tokyo Univ. of Agri. and Tech.) / Shunsuke Saruwatari(Osaka Univ.)
Assistant Yuki Matsuda(NAIST) / Akihito Taya(Aoyama Gakuin Univ.) / Takeshi Hirai(Osaka Univ.)

Paper Information
Registration To Technical Committee on Sensor Network and Mobile Intelligence
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) [Short Paper] A Study of Model Training Method based on Federated Learning for Distributed Inference with Split Computing
Sub Title (in English)
Keyword(1) Federated learning
Keyword(2) Distributed inference
Keyword(3) Split computing
Keyword(4) Knowledge distillation
Keyword(5) non-IID data
Keyword(6) Personalization
1st Author's Name Yutaro Horikawa
1st Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
2nd Author's Name Takayuki Nishio
2nd Author's Affiliation Tokyo Institute of Technology(Tokyo Tech)
Date 2023-01-19
Paper # SeMI2022-76
Volume (vol) vol.122
Number (no) SeMI-341
Page pp.25-27 (SeMI)
#Pages 3
Date of Issue 2023-01-12 (SeMI)