Recent Advances and Trends in Large-Scale Kernel Methods
Abstract
Kernel methods such as the support vector machine are among the most successful algorithms in modern machine learning. Their advantage is that linear algorithms can be extended to non-linear scenarios in a straightforward way through the kernel trick. However, naive use of kernel methods is computationally expensive, since the computational complexity typically scales cubically with the number of training samples. In this article, we review recent advances in kernel methods, with an emphasis on scalability to massive problems.
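The cubic cost mentioned in the abstract comes from solving a dense n-by-n linear system (or an eigendecomposition) over the kernel Gram matrix. As a rough illustration only, not code from the article, the sketch below contrasts naive kernel ridge regression with a Nystroem-style low-rank approximation, a standard scalability technique discussed in this context; all function names and parameters are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the article): naive kernel ridge regression
# costs roughly O(n^3) because of the dense n x n solve, while a Nystroem-style
# approximation with m << n landmarks costs roughly O(n m^2).
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam=1e-2, gamma=1.0):
    """Naive kernel ridge regression: solve (K + lam*I) alpha = y, an O(n^3) step."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def nystroem_fit(X, y, m=100, lam=1e-2, gamma=1.0, rng=None):
    """Nystroem / subset-of-regressors approximation using m landmark points."""
    rng = np.random.default_rng(0) if rng is None else rng
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    C = rbf_kernel(X, X[idx], gamma)   # n x m cross-kernel to the landmarks
    W = C[idx, :]                      # m x m kernel among the landmarks
    # Reduced m x m system: (C^T C + lam * W) beta = C^T y
    beta = np.linalg.solve(C.T @ C + lam * W, C.T @ y)
    return idx, beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((2000, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
    alpha = krr_fit(X, y)                  # exact but cubic in n
    idx, beta = nystroem_fit(X, y, m=200)  # approximate, much cheaper for large n
    print("exact prediction:   ", rbf_kernel(X[:5], X) @ alpha)
    print("Nystroem prediction:", rbf_kernel(X[:5], X[idx]) @ beta)
```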
Authors
- KASHIMA Hisashi (Tokyo Research Laboratory, IBM Research)
- IDÉ Tsuyoshi (Tokyo Research Laboratory, IBM Research)
- KATO Tsuyoshi (Center for Informational Biology, Ochanomizu University)
- SUGIYAMA Masashi (Department of Computer Science, Tokyo Institute of Technology)
Related papers
- Recent Advances and Trends in Large-Scale Kernel Methods
- Statistical active learning for efficient value function approximation in reinforcement learning (Neurocomputing)
- Risk-Sensitive Learning via Minimization of Empirical Conditional Value-at-Risk(Artificial Intelligence and Cognitive Science)
- Improving the Accuracy of Least-Squares Probabilistic Classifiers
- Least Absolute Policy Iteration — A Robust Approach to Value Function Approximation
- A New Meta-Criterion for Regularized Subspace Information Criterion
- Approximating the Best Linear Unbiased Estimator of Non-Gaussian Signals with Gaussian Noise
- A Spectrum Tree Kernel
- Multi-task learning with least-squares probabilistic classifiers (Pattern Recognition and Media Understanding)
- Multi-task learning with least-squares probabilistic classifiers (Information-Based Induction Sciences and Machine Learning)
- Adaptive importance sampling with automatic model selection in value function approximation (Neurocomputing)
- Analytic Optimization of Adaptive Ridge Parameters Based on Regularized Subspace Information Criterion(Neural Networks and Bioengineering)
- Adaptive Ridge Learning in Kernel Eigenspace and Its Model Selection
- Syntheses of New Artificial Zinc Finger Proteins Containing Trisbipyridine-ruthenium Amino Acid at The N-or C-terminus as Fluorescent Probes
- Analytic Optimization of Shrinkage Parameters Based on Regularized Subspace Information Criterion(Neural Networks and Bioengineering)
- Constructing Kernel Functions for Binary Regression(Pattern Recognition)
- Clustering Unclustered Data : Unsupervised Binary Labeling of Two Datasets Having Different Class Balances