Best Paper Award

Concatenated Permutation Codes under Chebyshev Distance [IEICE Trans. Fundamentals, Vol. E106-A, No. 3, March 2023]

Motohiro KAWASUMI
Kenta KASAI

Permutation codes are attracting attention as a means of improving the performance of flash memory and of putting DNA storage to practical use.
Permutation codes are error-correcting codes whose codewords form a subset of all permutations of a given length.
This paper addresses the encoding and decoding of permutation codes that adopt the Chebyshev distance as the code distance.
Most previous studies in coding theory assume bounded-distance decoding based on the minimum distance and accordingly focus on constructing codes with a large minimum distance.
In contrast, this paper assumes soft-decision decoding, under which the minimum distance of a code is not important, and introduces to permutation codes the ideas of sparse graph codes, a code class that has achieved great success under soft-decision decoding.
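To make the metric concrete, the short Python sketch below computes the Chebyshev distance between permutations and the minimum Chebyshev distance of a small code; the permutations used here are made-up examples, not taken from the paper.

```python
# Minimal sketch: the Chebyshev (l-infinity) distance between permutations,
# d(x, y) = max_i |x(i) - y(i)|. The permutations below are made-up examples.
from itertools import combinations

def chebyshev_distance(x, y):
    return max(abs(a - b) for a, b in zip(x, y))

def minimum_distance(code):
    """Minimum Chebyshev distance over all codeword pairs."""
    return min(chebyshev_distance(x, y) for x, y in combinations(code, 2))

# A toy permutation code: three permutations of {0, ..., 4}.
code = [(0, 1, 2, 3, 4), (2, 0, 1, 4, 3), (4, 3, 2, 1, 0)]
print(chebyshev_distance(code[0], code[1]))  # -> 2
print(minimum_distance(code))                # -> 2
```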

First, in analyzing the minimum Chebyshev distance of concatenated permutation codes, which combine permutation codes with binary codes, the paper introduces a pseudo-distance for the outer codes and uses it to derive the Chebyshev distance of the concatenated permutation codes.
An original result is an upper bound on the minimum Chebyshev distance, obtained by deriving the distance distribution of concatenated permutation codes whose outer codes are random binary linear codes.
These are significant results for permutation codes under soft-decision decoding.
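To illustrate the role of the distance distribution, here is a brute-force Python sketch of a toy concatenated code: a small binary outer code is mapped to permutations by a hypothetical inner map in which each outer bit selects either the identity or a shift-by-two pattern on a length-4 block. This inner map is illustrative only and is not the construction analyzed in the paper.

```python
# Brute-force sketch: distance distribution of a toy concatenated permutation
# code. The inner map is a hypothetical example, not the paper's construction.
from itertools import combinations, product

def inner_map(bits):
    """Hypothetical inner map: bit 0 keeps a length-4 block as the identity,
    bit 1 applies a cyclic shift by 2, displacing every symbol by 2."""
    perm = []
    for i, b in enumerate(bits):
        base = 4 * i
        if b:
            perm.extend([base + 2, base + 3, base + 0, base + 1])
        else:
            perm.extend([base + 0, base + 1, base + 2, base + 3])
    return tuple(perm)

def chebyshev_distance(x, y):
    return max(abs(a - b) for a, b in zip(x, y))

# Toy outer code: the [4, 2] binary linear code spanned by two generators.
g1, g2 = (1, 0, 1, 1), (0, 1, 0, 1)
outer = sorted({tuple((u * a + v * b) % 2 for a, b in zip(g1, g2))
                for u, v in product((0, 1), repeat=2)})

code = [inner_map(c) for c in outer]
dists = sorted(chebyshev_distance(x, y) for x, y in combinations(code, 2))
print(dists)       # -> [2, 2, 2, 2, 2, 2]: the distance distribution
print(min(dists))  # -> 2: the minimum Chebyshev distance
```

In this toy example every pair of distinct codewords sits at Chebyshev distance exactly 2 regardless of the outer code's Hamming distance, which hints at why the paper analyzes the full distance distribution and an upper bound on the minimum Chebyshev distance rather than relying on minimum-distance arguments alone.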

The paper also presents the results of constructing concatenated permutation codes whose outer codes are low-density parity-check (LDPC) codes, a class of sparse graph codes.
A sum-product decoder for concatenated permutation codes with LDPC outer codes is formulated, and its decoding performance is shown to be superior to hard-decision decoding of non-concatenated permutation codes.
Notably, the decoder is evaluated not on storage-specific channels but on the additive white Gaussian noise (AWGN) channel, which is widely studied in coding theory, as well as on high-order symmetric channels and binary erasure channels; this is a further significant contribution.
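For readers unfamiliar with the soft-decision machinery mentioned above, the following is a minimal sum-product (belief-propagation) decoder for a tiny binary LDPC code over a BPSK-modulated AWGN channel. It illustrates standard sum-product decoding only; the parity-check matrix is a toy example, and the paper's actual decoder operates on concatenated permutation codes, not on this setup.

```python
# Minimal sketch: sum-product decoding of a toy binary LDPC code over a
# BPSK-modulated AWGN channel. H below is a made-up parity-check matrix;
# the paper's decoder targets concatenated permutation codes instead.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])  # rank 3, so the code has rate 1/2

def sum_product_decode(llr_ch, H, max_iter=50):
    m, n = H.shape
    msg_vc = np.tile(llr_ch, (m, 1)) * H           # variable-to-check messages
    for _ in range(max_iter):
        # Check-node update via the tanh rule, leave-one-out by division.
        t = np.tanh(np.clip(msg_vc, -30, 30) / 2)
        t[H == 0] = 1.0
        t = np.where(np.abs(t) < 1e-12, 1e-12, t)  # avoid division by zero
        ext = np.prod(t, axis=1, keepdims=True) / t
        msg_cv = 2 * np.arctanh(np.clip(ext, -0.999999, 0.999999)) * H
        # Variable-node update and tentative hard decision.
        total = llr_ch + msg_cv.sum(axis=0)
        hard = (total < 0).astype(int)
        if not ((H @ hard) % 2).any():             # all parity checks satisfied
            return hard, True
        msg_vc = (total - msg_cv) * H
    return hard, False

rng = np.random.default_rng(0)
x = np.zeros(6, dtype=int)                         # all-zero codeword; BPSK: 0 -> +1
ebn0_db, rate = 3.0, 0.5
sigma = np.sqrt(1 / (2 * rate * 10 ** (ebn0_db / 10)))
y = (1 - 2 * x) + sigma * rng.standard_normal(6)   # AWGN channel output
llr = 2 * y / sigma**2                             # channel LLRs for BPSK over AWGN
print(sum_product_decode(llr, H))                  # -> (decoded bits, success flag)
```

The decoder works directly on channel log-likelihood ratios rather than on hard symbol decisions, which is the soft-decision property that the paper's concatenated construction is designed to exploit.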