The Best Paper Award
Four Limits in Probability and Their Roles in Source Coding
Hiroki KOGA
(Published in the English-language Transactions A, November 2011 issue)

 One of the main interests in information theory is the characterization of the optimal performance achievable by coding. In particular, information-spectrum methods, which originate from the seminal 1993 paper by Han and Verdú, provide a framework for treating vast classes of sources and channels with memory. In the fixed-length coding of a general source, the spectral sup- and inf-entropy rates play crucial roles: they correspond to the right and left endpoints, respectively, of the entropy spectrum, i.e., the probability distribution of the self-information per source symbol.
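As a small illustration of the notion above (not taken from the paper), the following sketch estimates the entropy spectrum of an i.i.d. Bernoulli(p) source: it samples the per-symbol self-information (1/n) log2(1/P(X^n)) over many blocks. For this memoryless source the spectrum concentrates around the Shannon entropy H(p), so the spectral sup- and inf-entropy rates coincide; the interesting cases treated by information-spectrum methods are general sources where they do not.

```python
import math
import random

def self_information_rate(x, p):
    """Per-symbol self-information of a binary block x under Bernoulli(p)."""
    n = len(x)
    k = sum(x)  # number of ones in the block
    return -(k * math.log2(p) + (n - k) * math.log2(1.0 - p)) / n

random.seed(0)
p, n, trials = 0.3, 1000, 1000

# Empirical entropy spectrum: distribution of the self-information rate.
samples = [
    self_information_rate([1 if random.random() < p else 0 for _ in range(n)], p)
    for _ in range(trials)
]

# Shannon entropy H(p); the spectrum concentrates here as n grows.
h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
print(f"H(p)            = {h:.4f}")
print(f"empirical mean  = {sum(samples) / trials:.4f}")
print(f"spread [min,max] = [{min(samples):.4f}, {max(samples):.4f}]")
```

For a source with memory or nonstationarity, the same sampling procedure would reveal a spectrum whose left and right endpoints differ, which is exactly the regime where the quantities discussed in this paper matter.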
 The most important contribution of this paper is unveiling the existence of two more basic quantities related to the entropy spectrum and clarifying their operational meanings in the fixed-length coding of a general source. The author first defines the two quantities and establishes inequalities among their values. These two quantities can be interpreted as alternative right and left endpoints of the entropy spectrum, respectively.
 This increase in the number of basic quantities leads to a new theory within information-spectrum methods. It is well known that the class of sources for which the spectral sup-entropy rate coincides with the spectral inf-entropy rate has various important properties. Using the two newly introduced quantities, we can define new important classes of sources and investigate their properties. In addition, the author clarifies the relationships between the four quantities and the smooth Rényi entropy that often appears in the quantum information theory literature. It is shown that the four quantities coincide with four kinds of limits of the smooth Rényi entropy. Furthermore, the author succeeds in obtaining a new lower bound on the width of the entropy spectrum.
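To make the smooth Rényi entropy mentioned above concrete, the following sketch (an illustration of the standard definition of the ε-smooth Rényi entropy of order zero, not of the paper's limit characterization) computes H_0^ε(P) = log2 of the size of the smallest set of outcomes whose total probability is at least 1 − ε. Smoothing discards an ε-tail, which can sharply reduce the entropy when that tail contains many low-probability outcomes; the example distribution below is chosen to show this effect.

```python
import math

def smooth_renyi0(probs, eps):
    """H_0^eps(P): log2 of |smallest set with probability mass >= 1 - eps|."""
    size, mass = 0, 0.0
    for q in sorted(probs, reverse=True):  # greedily take the heaviest outcomes
        size += 1
        mass += q
        if mass >= 1.0 - eps:
            return math.log2(size)
    return math.log2(size)  # eps so small that the whole support is needed

# One heavy outcome plus 1023 light outcomes sharing mass 0.01.
probs = [0.99] + [0.01 / 1023] * 1023
print(smooth_renyi0(probs, 0.0))   # whole support needed: log2(1024) = 10.0
print(smooth_renyi0(probs, 0.02))  # the 0.02-tail is discarded: log2(1) = 0.0
```

The jump from 10 bits to 0 bits under a tiny amount of smoothing illustrates why limits of the smooth Rényi entropy, rather than its value at a fixed ε, are the natural objects to relate to the endpoints of the entropy spectrum.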
 The arguments developed in this paper can be applied to other problems in information theory, as the author recently demonstrated in a different paper applying the same approach to distributed coding of correlated sources. It may also be possible to apply these arguments to certain problems in channel coding. In conclusion, this paper opens the door to a nonconventional theory in information theory.
