Automatic Annotation of Tennis Action for Content-Based Retrieval by Collaborating Audio and Visual Information
Abstract
This paper proposes a method for automatically annotating tennis actions using audio and video information collaboratively. The proposed method extracts ball-hitting times, called "impact times," from the audio track, then evaluates the positional relation between the player and the ball at each impact time to identify the player's basic actions, such as forehand swing, overhead swing, etc. Simulation results show that the detection rate of impact times influences the recognition rate of the player's basic actions. They also show that using audio information avoids certain event-recognition failures that cannot be averted with video information alone, demonstrating the performance and validity of the approach.
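The abstract describes classifying a basic action from the player-ball positional relation at a detected impact time. A minimal sketch of that idea is shown below; the coordinate convention, thresholds, handedness handling, and labels are illustrative assumptions, not the authors' actual implementation.

```python
# Illustrative sketch (assumptions, not the paper's implementation):
# classify a basic tennis action from where the ball is relative to the
# player at a detected impact time.

def classify_action(player_x, player_y, ball_x, ball_y,
                    handedness="right", overhead_margin=0.5):
    """Return a basic-action label from the relative ball position.

    Coordinates are in an arbitrary image frame; `overhead_margin` is a
    hypothetical threshold on how far above the player the ball must be.
    """
    dx = ball_x - player_x   # horizontal offset of ball from player
    dy = ball_y - player_y   # vertical offset (positive = above player)
    if dy > overhead_margin:              # ball well above the player
        return "overhead swing"
    ball_on_right = dx > 0
    forehand = ball_on_right if handedness == "right" else not ball_on_right
    return "forehand swing" if forehand else "backhand swing"
```

In practice the impact times driving this classification would come from an audio event detector (e.g. locating the sharp ball-hit sound), which is the collaboration between modalities the abstract highlights.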
- A paper of the Institute of Electronics, Information and Communication Engineers (IEICE)
- 2003-04-01