Depth-based Gait Feature Representation
Abstract
This paper proposes a novel gait feature representation that effectively describes the characteristics of a walking person as observed by a range sensor. Most existing methods for gait feature extraction take a sequence of the person's silhouettes as input, so they inevitably suffer from the difficulty of silhouette extraction in real scenes and from changes in view direction, which prevents them from being applied in practice. The proposed method, on the other hand, does not require such accurate segmentation and is not affected by view changes, since the captured range data contains three-dimensional information. In addition, our method can explicitly separate dynamic features from static ones such as body shape, which has not been realized before. Experimental results on gait authentication show its effectiveness.
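The separation of static and dynamic components can be pictured with a minimal sketch, which is an illustration under simplifying assumptions and not the paper's actual algorithm: given a sequence of roughly aligned depth frames of the walker, the time-averaged depth approximates the static body shape, and the per-frame residual captures the dynamic gait motion. The function and data shapes below are hypothetical.

```python
import numpy as np

def split_gait_features(depth_frames):
    """Illustrative split of a depth-frame gait sequence into a static
    component (time-averaged depth, approximating body shape) and a
    dynamic component (per-frame deviation from that average).

    `depth_frames` is assumed to be a (T, H, W) array of range values
    for a walking person, already cropped and roughly aligned; this is
    a simplified stand-in for the paper's representation, not its
    actual method.
    """
    frames = np.asarray(depth_frames, dtype=np.float64)
    static = frames.mean(axis=0)               # static feature: average body shape
    dynamic = frames - static[None, :, :]      # dynamic feature: motion residuals
    return static, dynamic

if __name__ == "__main__":
    # Synthetic data standing in for range-sensor output.
    rng = np.random.default_rng(0)
    seq = rng.random((30, 64, 48))             # 30 frames of 64x48 depth values
    static, dynamic = split_gait_features(seq)
    print(static.shape, dynamic.shape)         # (64, 48) (30, 64, 48)
```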
Authors
- Yagi Yasushi (The Institute of Scientific and Industrial Research, Osaka University)
- Nakajima Hozuma (The Institute of Scientific and Industrial Research, Osaka University)
- Mitsugami Ikuhisa (The Institute of Scientific and Industrial Research, Osaka University)
Related Papers
- Video Capsule Endoscopy Analysis For Diagnostic Assistance
- Robust and Real-time Estimation of Camera Rotation with Translation-invariant Features
- Full-dimensional Sampling and Analysis of BSSRDF
- Gait Verification System for Criminal Investigation