A Fast Stochastic Gradient Algorithm: Maximal Use of Sparsification Benefits under Computational Constraints
Abstract
In this paper, we propose a novel stochastic gradient algorithm for efficient adaptive filtering. The basic idea is to sparsify the initial error vector and to maximize the benefits of this sparsification under computational constraints. To this end, we formulate the algorithm-design task as a constrained optimization problem and derive its (non-trivial) closed-form solution. The computational constraints exploit the fact that the energy of the sparsified error vector concentrates in its first few components. Numerical examples demonstrate that the proposed algorithm converges as fast as the computationally expensive method based on optimization without the computational constraints.
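The paper's actual closed-form update is derived in the full text; as a rough illustration of the idea sketched in the abstract, the following minimal Python sketch sparsifies a block error vector and updates the filter using only its highest-energy components. All names and parameters here (sparsified_sgd_step, the choice of K, the normalized step size mu) are hypothetical assumptions for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparsified_sgd_step(w, X, d, K, mu=0.5, eps=1e-8):
    """One block update (illustrative, not the paper's algorithm):
    keep only the K largest-magnitude error components, so the
    gradient cost scales with K instead of the block size B."""
    e = d - X @ w                      # error vector over the block (length B)
    idx = np.argsort(np.abs(e))[-K:]   # components where the error energy concentrates
    Xs, es = X[idx], e[idx]            # reduced regressors and errors
    g = Xs.T @ es                      # gradient restricted to the kept components
    return w + mu * g / (np.sum(Xs * Xs) + eps)  # normalized step

# Toy system-identification run (hypothetical setup, not from the paper).
N, B, K = 16, 32, 4                    # filter length, block size, kept components
w_true = rng.standard_normal(N)        # unknown system to identify
w = np.zeros(N)
for _ in range(500):
    X = rng.standard_normal((B, N))    # block of input regressors
    d = X @ w_true + 0.01 * rng.standard_normal(B)
    w = sparsified_sgd_step(w, X, d, K)
print("relative misalignment:", np.linalg.norm(w - w_true) / np.linalg.norm(w_true))
```

Under these assumptions, the per-update gradient cost drops from O(BN) to roughly O(KN) multiplications, which mirrors the abstract's point that the sparsified error's energy concentration makes a constrained, cheaper update nearly as effective as the unconstrained one.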
- Published by the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2010-02-01
Authors
-
Yukawa Masahiro
Laboratory for Mathematical Neuroscience, BSI, RIKEN
-
Utschick Wolfgang
Associate Institute for Signal Processing, Technische Universität München
Related Papers
- Multi-Domain Adaptive Learning Based on Feasibility Splitting and Adaptive Projected Subgradient Method