Four Limits in Probability and Their Roles in Source Coding
Abstract
In the information-spectrum methods proposed by Han and Verdú, quantities defined by using the limit superior (or inferior) in probability play crucial roles in many problems in information theory. In this paper, we introduce two nonconventional quantities defined in probabilistic ways. After clarifying basic properties of these quantities, we show that they have operational meaning in the ε-coding problem of a general source in both the ordinary and optimistic senses. The two quantities can be used not only for obtaining variations of the strong converse theorem but also for establishing upper and lower bounds on the width of the entropy spectrum. We also show that the two quantities can be expressed in terms of the smooth Rényi entropy of order zero.
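For context, the limit superior and limit inferior in probability underlying these quantities are standard in the information-spectrum literature (they are not restated in the abstract itself); for a sequence of real-valued random variables $\{Z_n\}_{n=1}^{\infty}$ they are defined as:

```latex
% Limit superior in probability (p-limsup):
\operatorname*{p\text{-}limsup}_{n\to\infty} Z_n
  \;=\; \inf\bigl\{\beta \,:\, \lim_{n\to\infty} \Pr\{Z_n > \beta\} = 0\bigr\}

% Limit inferior in probability (p-liminf):
\operatorname*{p\text{-}liminf}_{n\to\infty} Z_n
  \;=\; \sup\bigl\{\alpha \,:\, \lim_{n\to\infty} \Pr\{Z_n < \alpha\} = 0\bigr\}
```

Applied to the normalized self-information $Z_n = \frac{1}{n}\log\frac{1}{P_{X^n}(X^n)}$ of a general source, these two limits give the sup- and inf-entropy rates, and their gap is the width of the entropy spectrum that the abstract refers to.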
- A paper of the IEICE (Institute of Electronics, Information and Communication Engineers)
- 2011-11-01
Authors
- Koga Hiroki (Graduate School of Systems and Information Engineering, University of Tsukuba)
- KOGA Hiroki (Graduate School of Engineering, The University of Tokyo)
Related Papers
- On the Asymptotic Behaviors of the Recurrence Time with Fidelity Criterion for Discrete Memoryless Sources and Memoryless Gaussian Sources
- A Digital Fingerprinting Code Based on a Projective Plane and Its Identifiability of All Malicious Users
- The Optimal (t, n)-Threshold Visual Secret Sharing Scheme with Perfect Reconstruction of Black Pixels
- New Results on Optimistic Source Coding(Information Theory)(Information Theory and Its Applications)
- Redundancy-Optimal FF Codes for a General Source and Its Relationships to the Rate-Optimal FF Codes