Constrained Least-Squares Density-Difference Estimation
Abstract
We address the problem of estimating the difference between two probability densities. A naive approach is a two-step procedure of first estimating the two densities separately and then computing their difference. However, such a two-step procedure does not necessarily work well because the first step is performed without regard to the second step, and thus a small error incurred in the first step can cause a large error in the second step. Recently, a single-shot method called the least-squares density-difference (LSDD) estimator has been proposed. LSDD directly estimates the density difference without separately estimating the two densities, and it was demonstrated to outperform the two-step approach. In this paper, we propose a variation of LSDD called the constrained least-squares density-difference (CLSDD) estimator, and theoretically prove that CLSDD improves the accuracy of density-difference estimation for correctly specified parametric models. The usefulness of the proposed method is also demonstrated experimentally on semi-supervised class-balance estimation under class-balance change.
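For readers unfamiliar with the single-shot idea the abstract refers to, the sketch below illustrates plain LSDD (not CLSDD, and not the authors' implementation): a linear-in-parameters Gaussian-kernel model is fitted directly to the density difference f(x) = p(x) - q(x) by minimizing the L2 distance, which yields a closed-form ridge-regularized solution. The function name lsdd_fit, the fixed bandwidth and regularization values, and the choice of kernel centers are illustrative assumptions; the constraint that distinguishes CLSDD from LSDD is omitted.

```python
# Minimal LSDD sketch (illustrative, not the paper's code): fit
# g(x) = sum_l theta_l * exp(-||x - c_l||^2 / (2 sigma^2)) to p(x) - q(x)
# by L2-distance minimization, giving theta = (H + lam I)^{-1} h.
import numpy as np


def lsdd_fit(X_p, X_q, sigma=1.0, lam=1e-3):
    """Estimate the density difference p(x) - q(x) from samples X_p ~ p, X_q ~ q.

    The kernel centers, bandwidth `sigma`, and ridge parameter `lam` are
    illustrative fixed choices; in practice they would be tuned (e.g. by
    cross-validation of the same squared-error criterion).
    """
    X_p, X_q = np.atleast_2d(X_p), np.atleast_2d(X_q)
    d = X_p.shape[1]
    centers = np.vstack([X_p, X_q])  # place one Gaussian kernel at every sample

    # H_{l,l'} = integral of psi_l(x) psi_{l'}(x) dx, which is available in
    # closed form for Gaussian kernels:
    # (pi sigma^2)^{d/2} exp(-||c_l - c_{l'}||^2 / (4 sigma^2)).
    sq_dists = np.sum((centers[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    H = (np.pi * sigma ** 2) ** (d / 2) * np.exp(-sq_dists / (4 * sigma ** 2))

    def design(X):
        # Kernel design matrix: psi_l(x) evaluated at each row of X.
        D = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
        return np.exp(-D / (2 * sigma ** 2))

    # h_l = E_p[psi_l(x)] - E_q[psi_l(x)], estimated by sample averages.
    h = design(X_p).mean(axis=0) - design(X_q).mean(axis=0)

    # Ridge-regularized closed-form solution of the empirical L2 criterion.
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)

    def g(X):
        """Evaluate the fitted density-difference model at the rows of X."""
        return design(np.atleast_2d(X)) @ theta

    return g


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_p = rng.normal(loc=0.0, scale=1.0, size=(200, 1))  # samples from p
    X_q = rng.normal(loc=0.5, scale=1.0, size=(200, 1))  # samples from q
    g = lsdd_fit(X_p, X_q, sigma=0.5, lam=1e-2)
    grid = np.linspace(-3, 3, 7)[:, None]
    print(np.round(g(grid), 3))  # estimated p(x) - q(x) on a grid
```

The one-shot character is visible in the solution: the two samples enter only through the single vector h, so no intermediate density estimates are ever formed, which is the property the abstract credits for LSDD outperforming the two-step approach.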
- A paper published by the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2013-02-25
Authors
- Sugiyama Masashi (Tokyo Inst. of Technol.)
- KANAMORI Takafumi (Nagoya University)
- DU PLESSIS (Tokyo Institute of Technology)
- Nguyen Tuan (Tokyo Institute of Technology)
Related papers
- Statistical active learning for efficient value function approximation in reinforcement learning (Neurocomputing)
- Lighting Condition Adaptation for Perceived Age Estimation
- Computationally Efficient Multi-task Learning with Least-squares Probabilistic Classifiers
- A Unified Framework of Density Ratio Estimation under Bregman Divergence
- Adaptive importance sampling with automatic model selection in value function approximation (Neurocomputing)
- Improving Model-based Reinforcement Learning with Multitask Learning
- Least-Squares Conditional Density Estimation
- Direct Importance Estimation with a Mixture of Probabilistic Principal Component Analyzers
- Statistical Analysis of Kernel Density-Ratio Estimation (Analysis of Learning Problems, Text and Web Mining, General)
- A Semi-Supervised Approach to Perceived Age Prediction from Face Images
- Conditional Density Estimation Based on Density Ratio Estimation
- A density ratio approach to two-sample test (Pattern Recognition and Media Understanding)
- A density ratio approach to two-sample test (Information-Based Induction Sciences and Machine Learning)
- Theoretical Analysis of Density Ratio Estimation
- Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability(Artificial Intelligence and Cognitive Science)
- FOREWORD
- Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting
- Direct Importance Estimation with Gaussian Mixture Models
- Improving the Accuracy of Least-Squares Probabilistic Classifiers
- Artist agent A[2]: stroke painterly rendering based on reinforcement learning (Pattern Recognition and Media Understanding)
- Artist agent A[2]: stroke painterly rendering based on reinforcement learning (Information-Based Induction Sciences and Machine Learning)
- Least-Squares Independence Test
- Multiscale Bagging and Its Applications
- Density Difference Estimation
- Winning the Kaggle Algorithmic Trading Challenge with the Composition of Many Models and Feature Engineering
- Artist Agent: A Reinforcement Learning Approach to Automatic Stroke Generation in Oriental Ink Painting
- Early stopping Heuristics in Pool-Based Incremental Active Learning for Least-Squares Probabilistic Classifier
- Computationally Efficient Multi-Label Classification by Least-Squares Probabilistic Classifiers
- Multi-Task Approach to Reinforcement Learning for Factored-State Markov Decision Problems
- Constrained Least-Squares Density-Difference Estimation
- A Density-ratio Framework for Statistical Data Processing
- Model-Based Policy Gradients with Parameter-Based Exploration by Least-Squares Conditional Density Estimation
- FOREWORD
- On Kernel Parameter Selection in Hilbert-Schmidt Independence Criterion