Presentation 1998/3/19
Error Correcting Codes by Very Sparse Networks
Yoshiyuki Kabashima, David Saad,
Abstract(in Japanese) (See Japanese page)
Abstract(in English) In this work, we investigate the performance of a new family of error correcting codes using statistical mechanics. In our codes, the codeword comprises products of K bits randomly selected from the original message, and decoding is carried out using a very sparse tensor with C fixed connections per index. We show that the performance saturates Shannon's bound for the channel capacity in the limit K, C→∞ with the code rate R = K/C held finite. We also assess i) simulated annealing, ii) the dynamics of associative memory, and iii) the TAP equations as practical decoding algorithms.
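The encoding step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the parameters N, M, and K below are hypothetical, and for simplicity the K bits per codeword component are drawn uniformly at random rather than enforcing the fixed number C of connections per index used in the paper's sparse-tensor construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(message, M, K):
    """Each codeword bit is the product of K message bits (+/-1 spins)
    chosen at random, as described in the abstract.
    NOTE: a simplified sketch -- the paper's construction additionally
    fixes the number of connections C per message index."""
    N = len(message)
    # For each of the M codeword bits, pick K distinct message positions.
    indices = np.array([rng.choice(N, size=K, replace=False) for _ in range(M)])
    # Codeword bit = product of the selected +/-1 message bits.
    codeword = np.prod(message[indices], axis=1)
    return codeword, indices

# Hypothetical example: N = 8 message bits, M = 16 codeword bits, K = 3.
message = rng.choice([-1, 1], size=8)
codeword, idx = encode(message, M=16, K=3)
```

Because the message bits are ±1 spins, every codeword component is again ±1, and a decoder that knows the index sets can treat each received component as a K-spin interaction term, which is what makes the statistical-mechanics (dilute spin glass) analysis applicable.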
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Information Theory / Statistical Mechanics / Dilute Spin Glass Models / TAP equations
Paper #
Date of Issue

Conference Information
Committee NC
Conference Date 1998/3/19 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Error Correcting Codes by Very Sparse Networks
Sub Title (in English)
Keyword(1) Information Theory
Keyword(2) Statistical Mechanics
Keyword(3) Dilute Spin Glass Models
Keyword(4) TAP equations
1st Author's Name Yoshiyuki Kabashima
1st Author's Affiliation Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology
2nd Author's Name David Saad
2nd Author's Affiliation Neural Computing Research Group, Aston University, UK
Date 1998/3/19
Paper #
Volume (vol) vol.97
Number (no) 623
Page pp.-
#Pages 8
Date of Issue