Equations of States in Statistical Learning for an Unrealizable and Regular Case
Abstract
Many learning machines with hierarchical structure or hidden variables are now used in information science, artificial intelligence, and bioinformatics. However, such machines are not regular but singular statistical models, so their generalization performance remains unknown. To overcome this problem, in previous papers we proved new equations of states in statistical learning, by which the Bayes generalization loss can be estimated from the Bayes training loss and the functional variance, under the condition that the true distribution is a singularity contained in the learning machine. In this paper, we prove that the same equations hold even if the true distribution is not contained in the parametric model. We also prove that, in the regular case, the proposed equations are asymptotically equivalent to the Takeuchi information criterion. Therefore, the proposed equations are applicable without any condition on the unknown true distribution.
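For reference, the relation described in the abstract can be written out explicitly. The following is a sketch at inverse temperature one, using the standard definitions from Watanabe's theory (the WAIC form): here p(x|w) denotes the learning machine, E_w[ · ] the posterior average, and X_1, ..., X_n the training samples. The paper itself states and proves the result at general inverse temperature, so this should be read as an illustrative special case rather than the exact statement proved there.

```latex
% Bayes training loss: posterior predictive density evaluated on the training data
T_n \;=\; -\frac{1}{n} \sum_{i=1}^{n} \log \mathbb{E}_w\!\left[\, p(X_i \mid w) \,\right]

% Functional variance: posterior variance of the log-likelihood, summed over samples
V_n \;=\; \sum_{i=1}^{n} \Bigl\{ \mathbb{E}_w\!\left[ (\log p(X_i \mid w))^2 \right]
        \;-\; \bigl( \mathbb{E}_w\!\left[ \log p(X_i \mid w) \right] \bigr)^2 \Bigr\}

% Equation of states: the Bayes generalization loss G_n satisfies, on average,
\mathbb{E}\!\left[ G_n \right] \;=\; \mathbb{E}\!\left[\, T_n + \frac{V_n}{n} \,\right] \;+\; o\!\left(\frac{1}{n}\right)
```

Both T_n and V_n are computable from posterior samples alone, which is why the relation gives a practical estimate of the generalization loss without knowing the true distribution.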
- A paper of the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2010-03-01
Authors
-
WATANABE Sumio
Tokyo Institute of Technology
Related papers
- Generalization Performance of Subspace Bayes Approach in Linear Neural Networks(Algorithm Theory)
- Equations of States in Statistical Learning for an Unrealizable and Regular Case
- Algebraic geometrical methods for hierarchical learning machines