Presentation 2021-03-15
Proposal of Novel Distributed Learning Algorithms for Multi-Neural Networks
Kazuaki Harada, Tsuyoshi Migita, Norikazu Takahashi,
Abstract(in Japanese) (See Japanese page)
Abstract(in English) A method has recently been proposed for multiple neural networks (NNs) with the same structure to learn, in a distributed manner and without aggregating the data, multiple sets of training data collected at different institutions or places. In this method, each NN iteratively updates its parameter values in a direction obtained by combining the direction of steepest descent of the loss function for its own training data with the weighted average of the parameter values of its neighboring NNs. Under some conditions on the structure of the graph representing the communication between NNs and on the weights used to compute the average, it has been proved that the parameter values of all NNs converge to the same stationary point of the loss function for all training data. However, when the training data sets differ from NN to NN and the graph is not dense, the learning accuracy decreases. In this report, we propose a novel distributed learning method that prevents this degradation of learning accuracy even in such cases, and we evaluate its effectiveness experimentally.
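The update rule summarized in the abstract (steepest descent on each NN's own loss combined with a weighted average of neighbors' parameters) is the standard decentralized gradient descent scheme. A minimal sketch with scalar quadratic losses follows; the ring topology, mixing weights, and step-size schedule are illustrative assumptions, not the authors' actual method or experimental setup:

```python
import numpy as np

def decentralized_gd(targets, W, steps=2000):
    """Each agent i holds a scalar parameter x[i] and a local loss
    f_i(x) = 0.5 * (x - targets[i])**2, whose gradient is x - targets[i].
    One iteration: mix with neighbors via W, then take a local gradient step."""
    x = np.zeros(len(targets))
    for k in range(steps):
        alpha = 1.0 / (k + 10)            # diminishing step size
        x = W @ x - alpha * (x - targets)  # consensus step + local descent
    return x

# Four agents on a ring; W is doubly stochastic (self-weight 0.5,
# two neighbors at 0.25 each), so its rows and columns sum to 1.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
targets = np.array([1.0, 2.0, 3.0, 4.0])

x = decentralized_gd(targets, W)
# All agents approach the minimizer of the total loss, the mean of targets (2.5),
# and agree with one another (consensus).
```

With heterogeneous local data (here, different `targets` per agent) and a sparse graph, a fixed step size would leave each agent biased toward its own minimizer; the diminishing step size is what drives all agents to the common stationary point, which illustrates the accuracy issue the report addresses.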
Keyword(in Japanese) (See Japanese page)
Keyword(in English) distributed learning / neural network / multi-agent network
Paper # NLP2020-58
Date of Issue 2021-03-08 (NLP)

Conference Information
Committee NLP / MSS
Conference Date 2021/3/15(2days)
Place (in Japanese) (See Japanese page)
Place (in English) Online
Topics (in Japanese) (See Japanese page)
Topics (in English) MSS, NLP, Work In Progress (MSS only), and etc.
Chair Kiyohisa Natsume(Kyushu Inst. of Tech.) / Shigemasa Takai(Osaka Univ.)
Vice Chair Takuji Kosaka(Chukyo Univ.) / Atsuo Ozaki(Osaka Inst. of Tech.)
Secretary Takuji Kosaka(Kyushu Inst. of Tech.) / Atsuo Ozaki(Kagawa Univ.)
Assistant Toshikazu Samura(Yamaguchi Univ.) / Hideyuki Kato(Oita Univ.) / Naoki Hayashi(Osaka Univ.)

Paper Information
Registration To Technical Committee on Nonlinear Problems / Technical Committee on Mathematical Systems Science and its Applications
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Proposal of Novel Distributed Learning Algorithms for Multi-Neural Networks
Sub Title (in English)
Keyword(1) distributed learning
Keyword(2) neural network
Keyword(3) multi-agent network
1st Author's Name Kazuaki Harada
1st Author's Affiliation Okayama University(Okayama Univ.)
2nd Author's Name Tsuyoshi Migita
2nd Author's Affiliation Okayama University(Okayama Univ.)
3rd Author's Name Norikazu Takahashi
3rd Author's Affiliation Okayama University(Okayama Univ.)
Date 2021-03-15
Paper # NLP2020-58
Volume (vol) vol.120
Number (no) NLP-430
Page pp.17-22 (NLP)
#Pages 6
Date of Issue 2021-03-08 (NLP)