On d-Asymptotics for High-Dimensional Discriminant Analysis with Different Variance-Covariance Matrices
Abstract
In this paper we consider the two-class classification problem with high-dimensional data. It is important to identify a class of distributions for which no classifier can be expected to perform well. When the two population variance-covariance matrices are different, we give a reasonable sufficient condition on the distributions under which the misclassification rate converges to its worst value, for any classifier, as the dimension of the data tends to infinity. Our results give guidelines for deciding whether an experiment is worth performing in fields such as bioinformatics.
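The effect described in the abstract can be illustrated with a small simulation. This is a hypothetical sketch, not the paper's construction: all function names, distributions, and parameter choices below are assumptions made for illustration. Two Gaussian classes with different per-class variances keep a fixed total mean separation while the dimension d grows; with a fixed training-sample size, the average test error of a simple nearest-centroid classifier drifts toward 0.5, the worst value for two balanced classes.

```python
import numpy as np

rng = np.random.default_rng(0)

def misclassification_rate(d, n_train=20, n_test=500):
    """Average test error of a nearest-centroid classifier on two
    Gaussian classes with different variances (illustrative sketch).

    The per-coordinate mean shift shrinks as 1/sqrt(d), so the total
    separation ||mu1 - mu0|| = 1 stays fixed while d grows.
    """
    delta = 1.0 / np.sqrt(d)
    mu0, mu1 = np.zeros(d), np.full(d, delta)

    # Training data: class 0 has unit variance, class 1 has variance 4,
    # mimicking the different variance-covariance matrices in the paper.
    X0 = rng.normal(mu0, 1.0, (n_train, d))
    X1 = rng.normal(mu1, 2.0, (n_train, d))
    c0, c1 = X0.mean(axis=0), X1.mean(axis=0)

    # Independent test data from the same two populations.
    T0 = rng.normal(mu0, 1.0, (n_test, d))
    T1 = rng.normal(mu1, 2.0, (n_test, d))

    # A point is misclassified when it lies closer to the other centroid.
    err0 = np.mean(np.linalg.norm(T0 - c1, axis=1)
                   < np.linalg.norm(T0 - c0, axis=1))
    err1 = np.mean(np.linalg.norm(T1 - c0, axis=1)
                   < np.linalg.norm(T1 - c1, axis=1))
    return 0.5 * (err0 + err1)

for d in (2, 50, 1000):
    print(d, round(misclassification_rate(d), 3))
```

As d grows with n_train fixed, the estimation error in the centroids swamps the fixed signal, and the average error approaches the coin-flip rate of 0.5, matching the worst-value convergence discussed in the abstract.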
Authors
-
SUZUKI Joe
Department of Mathematics, Graduate School of Science, Osaka University
-
AYANO Takanori
Department of Mathematics, Graduate School of Science, Osaka University
Related Articles
- Performance of Data Compression in Terms of Hausdorff Dimension