Achievement Award

Pioneering Research on the Constructions of Optimal Codes in Information Theory

Hirosuke Yamamoto

  Information theory, originating from Shannon's seminal paper in 1948, provides the basic theory for communication, data compression, and data security. Prof. Yamamoto has been an outstanding researcher in information theory in Japan. He has developed excellent theories and constructed optimal codes for various problems in information theory. Prof. Yamamoto has also been a pioneer, especially in the research area of information-theoretic security.
  One of the outstanding contributions of Prof. Yamamoto can be found in his papers on secret sharing schemes [1-4]. Secret sharing schemes were proposed independently by Shamir and Blakley in 1979. In the secret sharing scheme called a (k, n)-threshold scheme, n shares are generated from a secret. While the secret can be reconstructed from any k or more shares, no information on the secret is revealed from fewer than k shares. It is well known that the size of any one of the n shares cannot be less than the size of the secret. Prof. Yamamoto proposed a scheme called a (k, L, n)-threshold scheme, in which information on the secret is gradually revealed as the number of shares exceeds k - L. He proved that the size of one share is equal to 1/L times the size of the secret in the (k, L, n)-threshold scheme [1]. We should notice that the (k, L, n)-threshold scheme was first proposed in May 1984, before the proposal of the well-known ramp schemes [5] at CRYPTO '84. The two security criteria, weak secrecy and strong secrecy, were also given in [1]. These criteria often appear in the context of information-theoretic security. In particular, a network coding scheme that meets the strong secrecy criterion against wiretappers was developed in [4]. This paper won the IEICE Best Paper Award in 2009.
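The 1/L share-size ratio can be illustrated with a small ramp-style (k, L, n) scheme over a prime field. This is a generic polynomial-based sketch for illustration, not necessarily the exact construction of [1]; the prime P and the placement of the anchor points are assumptions of this example.

```python
import random

P = 2**61 - 1  # prime modulus; all arithmetic is over GF(P) (assumed choice)

def lagrange_eval(points, x):
    """Evaluate the unique polynomial of degree < len(points) passing
    through `points` at abscissa x, over GF(P)."""
    total = 0
    for j, (xj, yj) in enumerate(points):
        num, den = 1, 1
        for m, (xm, _) in enumerate(points):
            if m != j:
                num = num * ((x - xm) % P) % P
                den = den * ((xj - xm) % P) % P
        # den^(-1) mod P by Fermat's little theorem
        total = (total + yj * num * pow(den, P - 2, P)) % P
    return total

def make_shares(secret, k, L, n):
    """Split a secret of L field elements into n shares, each a single
    field element, i.e. 1/L of the secret size."""
    assert len(secret) == L and L <= k <= n
    # Pin a degree-(k-1) polynomial at L secret points and k-L random
    # points, all placed outside the share abscissas 1..n.
    anchors = [(P - 1 - i, s) for i, s in enumerate(secret)]
    anchors += [(n + 1 + i, random.randrange(P)) for i in range(k - L)]
    return [(x, lagrange_eval(anchors, x)) for x in range(1, n + 1)]

def reconstruct(shares, L):
    """Recover the L secret elements from any k of the n shares."""
    return [lagrange_eval(shares, P - 1 - i) for i in range(L)]

secret = [7, 42, 2021]
shares = make_shares(secret, k=4, L=3, n=6)
assert reconstruct(shares[:4], L=3) == secret   # any 4 shares suffice
assert reconstruct(shares[2:6], L=3) == secret
```

Here each share is one field element while the secret consists of L = 3 elements, so the share size is exactly 1/L of the secret size, matching the bound proved in [1].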
  Prof. Yamamoto discussed coding theorems for communication systems that require security against wiretappers (e.g., [6], [7]). In particular, the achievable rate regions for the communication systems in Fig. 1 are clarified in [6], where a legitimate sender encodes outputs from a memoryless source into a codeword and sends the codeword to a legitimate receiver over a noisy channel in the presence of a wiretapper. The legitimate receiver, who shares a key with the legitimate sender, can reproduce the source outputs within an acceptable distortion D, while the wiretapper can never reproduce the source outputs within the distortion D. Since this paper unveils a new connection between the problems of wiretap channels and lossy data compression, it is often cited by various authors.

Fig. 1 Shannon's cipher system with fidelity criterion
  Prof. Yamamoto has also been a pioneer in source coding. Recently, he proposed a lossless data compression code called an AIFV (almost-instantaneous fixed-to-variable length) code, in which multiple code trees are used in encoding and decoding [8]. It is well known that the Huffman code minimizes the average codeword length if a source is stationary and memoryless. The average codeword length of the AIFV code, however, can be strictly less than the average codeword length of the Huffman code. In a binary Huffman code, every source symbol is assigned to one of the leaves of a complete binary tree. On the other hand, in a binary AIFV code, source symbols can also be assigned to certain internal nodes of a binary tree (Fig. 2), which enables us to construct a code with a smaller average codeword length. The result on the AIFV codes is quite surprising because it is widely believed that no code can attain a smaller average codeword length than the Huffman code; there is no contradiction, however, because the Huffman code is optimal only within the class of instantaneous fixed-to-variable length codes, while the AIFV code permits a small bounded decoding delay. Prof. Yamamoto's outstanding contributions in source coding can also be found in [9] and [10], where he treated lossy source coding using an LDPC code and lossless source coding using the Burrows-Wheeler transform, respectively.
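The Huffman baseline that the AIFV code improves upon can be computed with a short script. This is the standard textbook Huffman construction built on `heapq`, not the AIFV algorithm of [8]; the example source probabilities are an assumption chosen for illustration.

```python
import heapq

def huffman_lengths(probs):
    """Return the codeword lengths of a binary Huffman code for `probs`."""
    if len(probs) == 1:
        return [1]
    # Heap items: (probability, tiebreak id, list of (symbol, depth)).
    heap = [(p, i, [(i, 0)]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tiebreak = len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; their symbols sink one level.
        p1, _, g1 = heapq.heappop(heap)
        p2, _, g2 = heapq.heappop(heap)
        merged = [(s, d + 1) for s, d in g1 + g2]
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
        tiebreak += 1
    lengths = [0] * len(probs)
    for s, d in heap[0][2]:
        lengths[s] = d
    return lengths

probs = [0.9, 0.05, 0.05]          # a skewed memoryless source (assumed example)
lengths = huffman_lengths(probs)   # -> [1, 2, 2]
avg = sum(p * l for p, l in zip(probs, lengths))
print(avg)                         # 1.1 bits per source symbol
```

This average codeword length is optimal among instantaneous fixed-to-variable length codes that encode one symbol per codeword; the AIFV construction of [8] operates outside this class and can therefore attain a smaller average length for some sources.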
  Prof. Yamamoto richly deserves the IEICE Achievement Award for these outstanding achievements.

References

Fig. 2 Huffman code and AIFV code