Presentation 1993/11/24
Speedup of back propagation algorithm employing threshold logic transform networks
Masahiko Tateishi, Shin'ichi Tamura, Shigeyuki Akita
Abstract(in Japanese) (See Japanese page)
Abstract(in English) A back propagation network model employing the Threshold Logic Transform (TLT) for faster training is proposed. TLT, a simple mathematical transformation, is inserted between the input layer and the hidden layer to speed up the extraction of complex input features. On three classification tasks, adding TLT reduced the number of iteration steps to 1/5 - 1/20 of that of the conventional method. The rate of convergence also improved substantially: 99.3% with TLT, versus 33.3% for the conventional method.
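The abstract does not give the exact form of the transform, but a common reading of a threshold logic transform is to expand each scalar input into a vector of threshold-unit outputs before it reaches the hidden layer. The sketch below illustrates this assumed formulation; the step nonlinearity and the evenly spaced threshold values are illustrative choices, not the paper's specification.

    # A minimal sketch of a threshold logic transform, assuming each input
    # feature is compared against a fixed set of thresholds and the binary
    # results are fed to the hidden layer in place of the raw input.
    import numpy as np

    def threshold_logic_transform(x, thresholds):
        """Expand inputs into threshold-unit outputs.

        x          : (n_samples, n_features) input array
        thresholds : (n_thresholds,) threshold values
        returns    : (n_samples, n_features * n_thresholds) array
        """
        # 1.0 where a feature exceeds a threshold, 0.0 otherwise.
        expanded = (x[:, :, None] >= thresholds[None, None, :]).astype(float)
        return expanded.reshape(x.shape[0], -1)

    # Example: four evenly spaced thresholds over inputs normalized to [0, 1].
    x = np.array([[0.1, 0.7],
                  [0.5, 0.3]])
    thresholds = np.array([0.2, 0.4, 0.6, 0.8])
    features = threshold_logic_transform(x, thresholds)
    print(features.shape)  # (2, 8) -- these expanded features enter the hidden layer

Because the transform is fixed and applied only once per pattern, it adds little cost per iteration while presenting the hidden layer with features that are already partially separated, which is consistent with the reported reduction in training iterations.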
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Neural networks / Threshold logic transform / Back propagation / Input data transformation
Paper # NC93-49
Date of Issue

Conference Information
Committee NC
Conference Date 1993/11/24 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Speedup of back propagation algorithm employing threshold logic transform networks
Sub Title (in English)
Keyword(1) Neural networks
Keyword(2) Threshold logic transform
Keyword(3) Back propagation
Keyword(4) Input data transformation
1st Author's Name Masahiko Tateishi
1st Author's Affiliation Research Laboratories, Nippondenso
2nd Author's Name Shin'ichi Tamura
2nd Author's Affiliation Research Laboratories, Nippondenso
3rd Author's Name Shigeyuki Akita
3rd Author's Affiliation Research Laboratories, Nippondenso
Date 1993/11/24
Paper # NC93-49
Volume (vol) vol.93
Number (no) 341
Page pp.-
#Pages 5
Date of Issue