Presentation 2024-02-29
Model Shifting Method in Federated Learning Using Distillation
Hiromichi Yajima, Shota Ono, Takumi Miyoshi, Taku Yamazaki
Abstract(in English) Due to the drastic increase in data used for machine learning, distributed approaches such as federated learning have been attracting attention as a way to avoid concentrating the load on a server. Since machine learning requires a huge amount of computation, however, it is difficult to run the federated learning process on low-performance client devices. To solve this problem, federated learning with a machine learning model downsized by distillation has been proposed. This method reduces the processing cost and learning time on the clients but degrades the final accuracy of the machine learning model. This paper proposes a method that improves the accuracy of the machine learning model in a short time by using the downsized model in the early stage of learning and then switching to the general federated learning process. The experimental results show that the proposed method rapidly improves the accuracy of the machine learning model while counteracting the degradation of the final accuracy.
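To make the model-shifting idea concrete, the following is a minimal sketch in Python/PyTorch, assuming a FedAvg-style training loop: clients train a downsized model for the first rounds, and the server then hands its knowledge to the full model via distillation before continuing with general federated learning. Every identifier here (SmallNet, FullNet, SHIFT_ROUND, the synthetic data, and the distillation handover itself) is an illustrative assumption, not the paper's implementation.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    """Downsized model used in the early rounds (assumed architecture)."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(784, 10)
    def forward(self, x):
        return self.fc(x)

class FullNet(nn.Module):
    """Full model used after the shift (assumed architecture)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                                 nn.Linear(256, 10))
    def forward(self, x):
        return self.net(x)

def local_update(model, loader, epochs=1, lr=0.05):
    # One client's local training step in FedAvg.
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()
    return model.state_dict()

def fed_avg(states):
    # Uniform parameter averaging across client updates.
    avg = copy.deepcopy(states[0])
    for k in avg:
        avg[k] = torch.stack([s[k].float() for s in states]).mean(dim=0)
    return avg

def distill(teacher, student, loader, T=2.0, epochs=1, lr=0.05):
    # Hypothetical handover at the shift: soften the small model's
    # outputs and train the full model to match them (standard KD loss).
    opt = torch.optim.SGD(student.parameters(), lr=lr)
    teacher.eval()
    for _ in range(epochs):
        for x, _ in loader:
            with torch.no_grad():
                soft = F.softmax(teacher(x) / T, dim=1)
            loss = F.kl_div(F.log_softmax(student(x) / T, dim=1),
                            soft, reduction="batchmean") * T * T
            opt.zero_grad()
            loss.backward()
            opt.step()

def fake_loader(n=64):
    # Synthetic stand-in for client/server data.
    return [(torch.randn(n, 784), torch.randint(0, 10, (n,)))]

client_loaders = [fake_loader() for _ in range(3)]
server_loader = fake_loader()
TOTAL_ROUNDS, SHIFT_ROUND = 20, 10  # illustrative schedule, not from the paper

global_model = SmallNet()
for rnd in range(TOTAL_ROUNDS):
    if rnd == SHIFT_ROUND:
        # Shift: replace the downsized global model with the full one.
        full = FullNet()
        distill(global_model, full, server_loader)
        global_model = full
    states = [local_update(global_model, d) for d in client_loaders]
    global_model.load_state_dict(fed_avg(states))

The early-round accuracy gain comes from the small model's lower per-round training cost; the abstract does not specify how knowledge is carried across the shift, so the distillation handover above is only one plausible reading.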
Keyword(in English) Machine learning / Federated learning / Distillation / Machine learning model
Paper # NS2023-186
Date of Issue 2024-02-22 (NS)

Conference Information
Committee NS / IN
Conference Date 2024/2/29 (2 days)
Place (in English) Okinawa Convention Center
Topics (in English) General
Chair Tetsuya Oishi(NTT) / Kunio Hato(NTT)
Vice Chair Takumi Miyoshi(Shibaura Inst. of Tech.) / Tsutomu Murase(Nagoya Univ.)
Secretary Takumi Miyoshi(NTT) / Tsutomu Murase(Kogakuin Univ.)
Assistant Hiroshi Yamamoto(NTT)

Paper Information
Registration To Technical Committee on Network Systems / Technical Committee on Information Networks
Language JPN
Title (in English) Model Shifting Method in Federated Learning Using Distillation
Sub Title (in English)
Keyword(1) Machine learning
Keyword(2) Federated learning
Keyword(3) Distillation
Keyword(4) Machine learning model
1st Author's Name Hiromichi Yajima
1st Author's Affiliation Shibaura Institute of Technology(SIT)
2nd Author's Name Shota Ono
2nd Author's Affiliation The University of Tokyo(The Univ. of Tokyo)
3rd Author's Name Takumi Miyoshi
3rd Author's Affiliation Shibaura Institute of Technology(SIT)
4th Author's Name Taku Yamazaki
4th Author's Affiliation Shibaura Institute of Technology(SIT)
Date 2024-02-29
Paper # NS2023-186
Volume (vol) vol.123
Number (no) NS-397
Page pp.86-89 (NS)
#Pages 4
Date of Issue 2024-02-22 (NS)