A Unified Framework of Density Ratio Estimation under Bregman Divergence
Abstract
- Estimation of the ratio of probability densities has attracted a great deal of attention since it can be used for addressing various statistical tasks such as non-stationarity adaptation, two-sample testing, outlier detection, mutual information estimation, dimensionality reduction, independent component analysis, causal inference, conditional density estimation, and probabilistic classification. A naive approach to density ratio approximation is to first estimate the numerator and denominator densities separately and then take their ratio. However, this two-step approach does not perform well in practice, and methods that directly estimate the density ratio without going through density estimation have been explored, including methods based on moment matching, probabilistic classification, density matching, and density-ratio fitting. The contributions of this paper are threefold. First, we give a comprehensive review of existing density ratio estimation methods and discuss their pros and cons. Second, we propose a new framework of density ratio estimation in which a density-ratio model is fitted to the true density ratio under the Bregman divergence. This framework includes all of the above existing approaches as special cases and is substantially more general, thus providing a unified view of various density ratio estimation methods (a sketch of the matching objective is given below). Finally, we develop a robust density ratio estimation method under the power divergence, which is a novel instance of our framework.
- 2010-10-28
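The following is a minimal sketch of the density-ratio matching objective described in the abstract; the notation is assumed here rather than taken from the paper itself. Let $p_{\mathrm{nu}}(x)$ and $p_{\mathrm{de}}(x)$ denote the numerator and denominator densities, $r^*(x) = p_{\mathrm{nu}}(x)/p_{\mathrm{de}}(x)$ the true density ratio, and $r(x)$ a density-ratio model. For a differentiable convex function $f$, the model is fitted by minimizing the Bregman divergence from $r^*$ to $r$, averaged over the denominator density:

$$
\mathrm{BR}_f(r^* \,\|\, r) = \int p_{\mathrm{de}}(x) \left[ f(r^*(x)) - f(r(x)) - f'(r(x))\,\bigl(r^*(x) - r(x)\bigr) \right] \mathrm{d}x .
$$

Dropping the term that does not depend on $r$ and using $r^*(x)\, p_{\mathrm{de}}(x) = p_{\mathrm{nu}}(x)$ leaves an objective expressed purely through expectations, which can be approximated by sample averages over denominator and numerator samples:

$$
J_f(r) = \int p_{\mathrm{de}}(x) \left[ f'(r(x))\, r(x) - f(r(x)) \right] \mathrm{d}x - \int p_{\mathrm{nu}}(x)\, f'(r(x))\, \mathrm{d}x .
$$

Under this sketch, the choice $f(t) = (t-1)^2/2$ recovers least-squares density-ratio fitting, while $f(t) = t \log t - t$ corresponds to density matching under the Kullback-Leibler divergence.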
Authors
- Sugiyama Masashi (Tokyo Inst. of Technol.)
- Kanamori Takafumi (Nagoya University)
- Suzuki Taiji (University of Tokyo)
Related Papers
- Statistical active learning for efficient value function approximation in reinforcement learning (Neurocomputing)
- Lighting Condition Adaptation for Perceived Age Estimation
- Computationally Efficient Multi-task Learning with Least-squares Probabilistic Classifiers
- A Unified Framework of Density Ratio Estimation under Bregman Divergence
- Adaptive importance sampling with automatic model selection in value function approximation (Neurocomputing)
- Improving Model-based Reinforcement Learning with Multitask Learning
- Least-Squares Conditional Density Estimation
- Direct Importance Estimation with a Mixture of Probabilistic Principal Component Analyzers
- Statistical Analysis of Kernel Density-Ratio Estimation (Analysis of Learning Problems, Text and Web Mining, General)
- A Semi-Supervised Approach to Perceived Age Prediction from Face Images
- Conditional Density Estimation Based on Density Ratio Estimation
- A density ratio approach to two-sample test (Pattern Recognition and Media Understanding)
- A density ratio approach to two-sample test (Information-Based Induction Sciences and Machine Learning)
- Theoretical Analysis of Density Ratio Estimation
- Independent component analysis by direct density-ratio estimation (Neurocomputing)
- Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability (Artificial Intelligence and Cognitive Science)
- FOREWORD
- Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting
- Direct Importance Estimation with Gaussian Mixture Models
- Improving the Accuracy of Least-Squares Probabilistic Classifiers
- Artist agent A[2]: stroke painterly rendering based on reinforcement learning (Pattern Recognition and Media Understanding)
- Artist agent A[2]: stroke painterly rendering based on reinforcement learning (Information-Based Induction Sciences and Machine Learning)
- Least-Squares Independence Test
- Density Difference Estimation (Information-Based Induction Sciences and Machine Learning)
- Density-ratio matching under the Bregman divergence : a unified framework of density-ratio estimation
- Multiscale Bagging and Its Applications
- Relative Density-Ratio Estimation for Robust Distribution Comparison (Information-Based Induction Sciences and Machine Learning)
- Density Difference Estimation
- Winning the Kaggle Algorithmic Trading Challenge with the Composition of Many Models and Feature Engineering
- Artist Agent: A Reinforcement Learning Approach to Automatic Stroke Generation in Oriental Ink Painting
- Early stopping Heuristics in Pool-Based Incremental Active Learning for Least-Squares Probabilistic Classifier
- Computationally Efficient Multi-Label Classification by Least-Squares Probabilistic Classifiers
- Multi-Task Approach to Reinforcement Learning for Factored-State Markov Decision Problems
- Constrained Least-Squares Density-Difference Estimation
- A Density-ratio Framework for Statistical Data Processing
- Model-Based Policy Gradients with Parameter-Based Exploration by Least-Squares Conditional Density Estimation
- On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion