Presentation 2014-03-14
Experimental Study on Effect of Pre-training in Deep Learning through Visualization of Unit Outputs
Tsubasa OCHIAI, Hideyuki WATANABE, Shigeru KATAGIRI, Miho OHSAKI, Shigeki MATSUDA, Chiori HORI
Abstract(in English) To clarify the capability of the recent powerful classifier concept of Deep Neural Networks (DNNs), we experimentally investigate the effects of the pre-training used to initialize a DNN. A deep neural network is first pre-trained using Restricted Boltzmann Machines (RBMs); it is then run as an embodiment of a Deep Belief Network, which basically possesses an associative memory function, and of a Deep Autoencoder, which is expected to realize a feature representation of an input pattern over the inner layers of the network. Analyses are conducted through the visualization of network unit outputs. Based on the experiments, we reveal that the RBM-based pre-training successfully makes networks memorize some information about the training patterns and also represent pattern features inside the networks.
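The report itself contains no code; the following is a minimal Python/NumPy sketch of the two ingredients named in the abstract: RBM-based pre-training of a single layer with one-step contrastive divergence (CD-1), followed by visualization of the resulting hidden-unit outputs. The toy binary patterns, layer sizes, and hyperparameters are illustrative assumptions, not the authors' experimental setup.

# Minimal sketch (not the authors' code): Bernoulli RBM trained with CD-1,
# then a visualization of hidden-unit outputs for a few input patterns.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible-unit biases
        self.b_h = np.zeros(n_hidden)    # hidden-unit biases
        self.lr = lr

    def hidden_probs(self, v):
        # P(h = 1 | v): the "unit outputs" that are later visualized.
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        # P(v = 1 | h): reconstruction of the input pattern.
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0):
        # Positive phase: hidden probabilities given the data.
        h0 = self.hidden_probs(v0)
        # Negative phase: one Gibbs step (sample hidden, reconstruct, re-infer).
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        # CD-1 gradient approximation and parameter update.
        batch = v0.shape[0]
        self.W   += self.lr * (v0.T @ h0 - v1.T @ h1) / batch
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

# Toy binary training patterns (stand-ins for the report's data).
X = (rng.random((256, 64)) < 0.3).astype(float)

rbm = RBM(n_visible=64, n_hidden=32)
for epoch in range(20):
    for i in range(0, len(X), 16):
        rbm.cd1_update(X[i:i + 16])

# Visualization of unit outputs: hidden activations for the first 8 patterns.
H = rbm.hidden_probs(X[:8])
plt.imshow(H, aspect="auto", cmap="gray")
plt.xlabel("hidden unit")
plt.ylabel("input pattern")
plt.title("Hidden-unit outputs after RBM pre-training")
plt.savefig("unit_outputs.png")

In a full deep network, the trained hidden activations would serve as the visible data for pre-training the next RBM; this layer-wise stacking is what yields the Deep Belief Network and Deep Autoencoder configurations examined in the abstract.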
Keyword(in English) Deep Learning / Pre-training / Visualization of unit outputs / Deep Neural Networks
Paper # PRMU2013-210
Date of Issue

Conference Information
Committee PRMU
Conference Date 2014/3/6 (1 day)
Place (in English)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Pattern Recognition and Media Understanding (PRMU)
Language JPN
Title (in English) Experimental Study on Effect of Pre-training in Deep Learning through Visualization of Unit Outputs
Sub Title (in English)
Keyword(1) Deep Learning
Keyword(2) Pre-training
Keyword(3) Visualization of unit outputs
Keyword(4) Deep Neural Networks
1st Author's Name Tsubasa OCHIAI
1st Author's Affiliation Doshisha University
2nd Author's Name Hideyuki WATANABE
2nd Author's Affiliation NICT / Doshisha University
3rd Author's Name Shigeru KATAGIRI
3rd Author's Affiliation Doshisha University
4th Author's Name Miho OHSAKI
4th Author's Affiliation Doshisha University
5th Author's Name Shigeki MATSUDA
5th Author's Affiliation NICT
6th Author's Name Chiori HORI
6th Author's Affiliation NICT
Date 2014-03-14
Paper # PRMU2013-210
Volume (vol) vol.113
Number (no) 493
Page pp.-
#Pages 6
Date of Issue