Presentation | 2022-10-14 | Robust Semi-Supervised Learning for Noisy Labels Using Early-learning Regularization and Weighted Loss | Ryota Higashimoto, Soh Yoshida, Mitsuji Muneyasu |
Abstract(in Japanese) | (See Japanese page) |
Abstract(in English) | Training deep neural networks (DNNs) on datasets that contain incorrect labels (label noise) is an important challenge. In the presence of label noise, DNNs fit correctly labeled samples in the early phase of training and only adapt to the noisily labeled samples in the later phase. Recently, methods based on semi-supervised learning have shown promise in exploiting this property. A representative conventional method models the per-sample loss distribution with a Gaussian mixture model and trains the DNN by treating samples judged to have noisy labels as unlabeled. However, the accuracy of this division into labeled and unlabeled samples depends on the performance of the DNN obtained during warm-up. In addition, the conventional method cannot account for the number of labeled samples, which varies with the percentage of label noise in the training data. In this paper, we introduce early-learning regularization (ELR), which suppresses the DNN's adaptation to label noise in the early phase of training. Furthermore, we propose a training method with a weighted loss that dynamically rescales the loss according to the fraction of labeled samples. To verify the effectiveness of the proposed method, we conducted experiments on the CIFAR-100 image classification dataset with synthetically injected label noise. The results show that the proposed method outperforms existing methods on CIFAR-100 with 20-90% label noise. |
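The ELR term mentioned in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration of the general early-learning regularization idea (a per-sample moving-average target and a consistency penalty), not the paper's own implementation; the class name `ELRLoss` and the hyperparameter values are hypothetical.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-subtraction for stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class ELRLoss:
    """Sketch of early-learning regularization (illustrative, not the paper's code).

    Keeps an exponential moving average t_i of each sample's softmax
    prediction and adds the penalty log(1 - <p_i, t_i>), which encourages
    the model to stay consistent with its early, pre-memorization
    predictions instead of fitting noisy labels late in training.
    """

    def __init__(self, num_samples, num_classes, beta=0.7, lam=3.0):
        # beta: EMA momentum for the targets; lam: regularization weight.
        # Both values here are illustrative defaults, not from the paper.
        self.targets = np.zeros((num_samples, num_classes))
        self.beta = beta
        self.lam = lam

    def __call__(self, logits, labels, indices):
        p = softmax(logits)
        # EMA update of the per-sample targets for this mini-batch.
        self.targets[indices] = (self.beta * self.targets[indices]
                                 + (1.0 - self.beta) * p)
        # Standard cross-entropy on the (possibly noisy) given labels.
        ce = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
        # ELR penalty: large when predictions drift away from the EMA targets.
        elr = np.log(1.0 - (self.targets[indices] * p).sum(axis=1) + 1e-12).mean()
        return ce + self.lam * elr
```

In practice the EMA targets would be treated as constants (no gradient flows through them), and the batch would carry the dataset indices of its samples so each target row is updated in place.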
Keyword(in Japanese) | (See Japanese page) |
Keyword(in English) | Learning with noisy labels / Semi-supervised learning / Early-learning regularization / weighted loss |
Paper # | SIS2022-16 |
Date of Issue | 2022-10-06 (SIS) |
Conference Information | |
Committee | SIS / ITE-BCT |
Conference Date | 2022/10/13 (2 days) |
Place (in Japanese) | (See Japanese page) |
Place (in English) | Hachinohe Institute of Technology |
Topics (in Japanese) | (See Japanese page) |
Topics (in English) | |
Chair | Tomoaki Kimura(Kanagawa Inst. of Tech.) / 斎藤 恭一(NHK) |
Vice Chair | Naoto Sasaoka(Tottori Univ.) / Hakaru Tamukoh(Kyushu Inst. of Tech.) / 村田 英一(Yamaguchi Univ.) / 斉藤 一(TV Tokyo) |
Secretary | Naoto Sasaoka(NTT) / Hakaru Tamukoh(Kansai Univ.) / 村田 英一(Chiba Univ.) / 斉藤 一 |
Assistant | Yoshiaki Makabe(Kanagawa Inst. of Tech.) / Yosuke Sugiura(Saitama Univ.) / 神原 浩平(NHK) / 鈴村 高幸(TV Asahi) / 松﨑 敬文(NHK) / 宮野 真由子(Toshiba Infrastructure Systems) / 大内 幹博(Panasonic) / 榎 芳栄(TBS Television) / 水本 哲弥(JSPS) |
Paper Information | |
Registration To | Technical Committee on Smart Info-Media Systems / Technical Group on Broadcasting Technology |
Language | JPN |
Title (in Japanese) | (See Japanese page) |
Sub Title (in Japanese) | (See Japanese page) |
Title (in English) | Robust Semi-Supervised Learning for Noisy Labels Using Early-learning Regularization and Weighted Loss |
Sub Title (in English) | |
Keyword(1) | Learning with noisy labels |
Keyword(2) | Semi-supervised learning |
Keyword(3) | Early-learning regularization |
Keyword(4) | weighted loss |
1st Author's Name | Ryota Higashimoto |
1st Author's Affiliation | Kansai University(Kansai Univ.) |
2nd Author's Name | Soh Yoshida |
2nd Author's Affiliation | Kansai University(Kansai Univ.) |
3rd Author's Name | Mitsuji Muneyasu |
3rd Author's Affiliation | Kansai University(Kansai Univ.) |
Date | 2022-10-14 |
Paper # | SIS2022-16 |
Volume (vol) | vol.122 |
Number (no) | SIS-209 |
Page | pp.27-32(SIS) |
#Pages | 6 |
Date of Issue | 2022-10-06 (SIS) |