Properties of the "Entropy" H_λ(ξ) and the "Average Conditional Entropy" H_λ(ξ|η)
Abstract
In [2], [3] and [6], the information of a random variable ξ with respect to another random variable η is defined as [numerical formula], which generalizes Shannon's definition of information. This quantity plays a fundamental role in information theory, and the quantity I(ξ, ξ), called "the entropy of ξ", is often used. When ξ is a continuous random variable, however, I(ξ, ξ) becomes infinite, so it is sometimes inconvenient to use I(ξ, ξ) as "the entropy of ξ". In [7], the author introduced a new quantity L(P_1, P_2, μ) and, using it, defined the entropy H(P_ξ;λ)=H_λ(ξ) of P_ξ with respect to λ. This definition of H_λ(ξ) is more general than the definition of "entropy" in [3] and covers both Shannon's entropy and Wiener's. The aims of this paper are (i) to define the "average conditional entropy" H_λ(ξ|η) (Section 3), (ii) to investigate various properties of H_λ(ξ) and H_λ(ξ|η) (Sections 4 and 5), and (iii) to consider the relations among H_λ(ξ), H_λ(ξ|η), I(ξ, η), etc., and to derive formulas analogous to the usual ones.
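The claim that I(ξ, ξ) becomes infinite for a continuous ξ can be illustrated numerically: the Shannon entropy of a quantized continuous variable grows without bound as the partition is refined. The sketch below is not from the paper; it simply computes the entropy of a Uniform(0, 1) variable quantized into n equal cells, which equals log n and diverges as n → ∞.

```python
import math

def quantized_entropy(n_bins: int) -> float:
    # Shannon entropy (in nats) of a Uniform(0, 1) random variable
    # quantized into n_bins equal cells; each cell has probability 1/n_bins.
    p = 1.0 / n_bins
    return -sum(p * math.log(p) for _ in range(n_bins))

# Refining the partition makes the entropy grow like log(n_bins),
# so the limiting value, the would-be I(ξ, ξ), is infinite.
for n in (2, 16, 128, 1024):
    print(n, quantized_entropy(n))
```

This divergence is exactly the inconvenience the abstract mentions, and it motivates defining an entropy H_λ(ξ) relative to a reference measure λ instead.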
- A paper from Yokohama National University