IEICE Technical Committee Submission System
Conference Paper Information

Paper Abstract and Keywords
Presentation 2020-08-28 10:55
Theoretical Analysis on Convergence Acceleration of Deep-Unfolded Gradient Descent
Satoshi Takabe, Tadashi Wadayama (NITech) SIP2020-35
Abstract (in Japanese) (See Japanese page) 
(in English) Deep unfolding is a promising deep learning technique whose network architecture is based on an existing iterative algorithm. By unfolding the recursive structure of an iterative algorithm and embedding trainable parameters, deep unfolding learns those parameters, which improves convergence performance such as convergence speed. The goal of this paper is to give a plausible interpretation of this convergence acceleration based on theoretical analyses of deep-unfolded gradient descent (DUGD). As a result, we introduce Chebyshev steps based on Chebyshev polynomials, which closely reproduce the step sizes learned by DUGD and improve the convergence rate of gradient descent (GD).
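The idea behind Chebyshev steps can be illustrated with a minimal sketch (not the paper's implementation): for a quadratic objective whose Hessian has eigenvalues in [lmin, lmax], the step-size schedule takes the reciprocals of the roots of the degree-T Chebyshev polynomial mapped onto that interval. All names below are illustrative; the comparison against the best constant step size 2/(lmin+lmax) is an assumption made for demonstration.

```python
import numpy as np

def chebyshev_steps(lmin, lmax, T):
    """Step sizes: reciprocals of the degree-T Chebyshev polynomial's
    roots mapped onto the eigenvalue interval [lmin, lmax]."""
    k = np.arange(T)
    roots = (lmax + lmin) / 2 + (lmax - lmin) / 2 * np.cos((2 * k + 1) * np.pi / (2 * T))
    return 1.0 / roots

def gd(A, b, steps, x0):
    """Gradient descent on f(x) = 0.5 x^T A x - b^T x with a step-size schedule."""
    x = x0.copy()
    for gamma in steps:
        x = x - gamma * (A @ x - b)
    return x

# Toy quadratic with spectrum in [1, 10] (illustrative setup).
rng = np.random.default_rng(0)
n, T = 20, 16
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 10.0, n)) @ Q.T
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)
x0 = np.zeros(n)

err_cheb = np.linalg.norm(gd(A, b, chebyshev_steps(1.0, 10.0, T), x0) - x_star)
err_const = np.linalg.norm(gd(A, b, np.full(T, 2.0 / (1.0 + 10.0)), x0) - x_star)
print(err_cheb < err_const)  # the Chebyshev schedule reaches a smaller error in T steps
```

After T iterations the Chebyshev schedule attains the minimax polynomial error bound over the spectrum, which is strictly better than any constant step size; this is the sense in which such schedules accelerate GD.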
Keyword (in Japanese) (See Japanese page) 
(in English) deep learning / deep unfolding / gradient descent / convergence rate
Reference Info. IEICE Tech. Rep., vol. 120, no. 142, SIP2020-35, pp. 25-30, Aug. 2020.
Paper # SIP2020-35 
Date of Issue 2020-08-20 (SIP) 
ISSN Online edition: ISSN 2432-6380
Copyright and reproduction
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)
Download PDF SIP2020-35

Conference Information
Committee SIP  
Conference Date 2020-08-27 - 2020-08-28 
Place (in Japanese) (See Japanese page) 
Place (in English) Online 
Topics (in Japanese) (See Japanese page) 
Topics (in English)  
Paper Information
Registration To SIP 
Conference Code 2020-08-SIP 
Language Japanese 
Title (in Japanese) (See Japanese page) 
Sub Title (in Japanese) (See Japanese page) 
Title (in English) Theoretical Analysis on Convergence Acceleration of Deep-Unfolded Gradient Descent 
Sub Title (in English)  
Keyword(1) deep learning  
Keyword(2) deep unfolding  
Keyword(3) gradient descent  
Keyword(4) convergence rate  
1st Author's Name Satoshi Takabe  
1st Author's Affiliation Nagoya Institute of Technology (NITech)
2nd Author's Name Tadashi Wadayama  
2nd Author's Affiliation Nagoya Institute of Technology (NITech)
Speaker Author-1 
Date Time 2020-08-28 10:55:00 
Presentation Time 25 minutes 
Registration for SIP 
Paper # SIP2020-35 
Volume (vol) vol.120 
Number (no) no.142 
Page pp.25-30 
#Pages 6
Date of Issue 2020-08-20 (SIP) 




The Institute of Electronics, Information and Communication Engineers (IEICE), Japan