Paper Abstract and Keywords |
Presentation |
2020-12-18 14:55
Regularization Using Knowledge Distillation in Learning Small Datasets |
Ryota Higashi, Toshikazu Wada (Wakayama Univ.) |
PRMU2020-61 |
Abstract |
(in English) |
Knowledge distillation is a method used mainly for compressing deep learning models, but it has recently attracted attention for its effectiveness when learning from small amounts of data as well. In this report, using image classification as an example, we focus on two observations: distillation suppresses the drop in classification accuracy when the training data are reduced, and the resulting accuracy varies with a distillation parameter called "temperature". We first prepare a teacher model trained on the full training set and then distill it into a student model. In this setting, we found that raising the temperature improves the student model's accuracy, especially when the number of training samples is small, and that this effect is unrelated to the calibration of the teacher model. |
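For readers unfamiliar with the "temperature" parameter mentioned in the abstract, the following is a minimal sketch of the standard temperature-scaled (Hinton-style) distillation loss that work of this kind builds on. The function name and the example values of T and alpha are illustrative assumptions, not settings reported in this paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Temperature-scaled (Hinton-style) knowledge distillation loss.

    The teacher's softened probabilities are matched via KL divergence,
    and the ordinary hard-label cross-entropy is mixed in with weight
    (1 - alpha). The KL term is scaled by T^2 so its gradient magnitude
    stays comparable across different temperatures.
    """
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, targets)
    return alpha * kd + (1.0 - alpha) * ce

# Illustrative call with random logits for a 10-class problem.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets, T=8.0)
```

Raising T flattens the teacher's output distribution, which is the knob the report varies when studying training with reduced data.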
Keyword |
(in English) |
Deep Learning / Knowledge Distillation / Image Classification / Few-Shot Learning / Calibration |
Reference Info. |
IEICE Tech. Rep., vol. 120, no. 300, PRMU2020-61, pp. 133-138, Dec. 2020. |
Paper # |
PRMU2020-61 |
Date of Issue |
2020-12-10 (PRMU) |
ISSN |
Online edition: ISSN 2432-6380 |
Copyright and reproduction |
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034) |
Conference Information |
Committee |
PRMU |
Conference Date |
2020-12-17 - 2020-12-18 |
Place (in English) |
Online |
Topics (in English) |
Transfer learning and few shot learning |
Paper Information |
Registration To |
PRMU |
Conference Code |
2020-12-PRMU |
Language |
Japanese |
Title (in English) |
Regularization Using Knowledge Distillation in Learning Small Datasets |
Keyword(1) |
Deep Learning |
Keyword(2) |
Knowledge Distillation |
Keyword(3) |
Image Classification |
Keyword(4) |
Few-Shot Learning |
Keyword(5) |
Calibration |
1st Author's Name |
Ryota Higashi |
1st Author's Affiliation |
Wakayama University (Wakayama Univ.) |
2nd Author's Name |
Toshikazu Wada |
2nd Author's Affiliation |
Wakayama University (Wakayama Univ.) |
Speaker |
Author-1 |
Date Time |
2020-12-18 14:55:00 |
Presentation Time |
15 minutes |
Registration for |
PRMU |
Paper # |
PRMU2020-61 |
Volume (vol) |
vol.120 |
Number (no) |
no.300 |
Page |
pp.133-138 |
#Pages |
6 |
Date of Issue |
2020-12-10 (PRMU) |
|