1A1-E25 Vision based Gesture User Interaction with Aibo
Abstract
Interactive mobile robots are an active area of research. This paper presents a framework for designing a real-time, vision-based hand-body dynamic gesture recognition system for such robots. The framework works under real-world lighting conditions with complex backgrounds and can handle intermittent motion of the camera. We present a novel way in which the Motion History Image (MHI) and the Motion Energy Image (MEI) are built. We propose a robust combination of the motion and color cues and call the resulting image the Motion Color Image (MCI). The input signal is captured with a monocular color camera; vision is the only feedback sensor used. It is assumed that the gesturer wears clothes that differ slightly from the background. Gestures are first learned offline and then matched in real time to the temporal data generated online. We have tested the system on a gesture database of 11 hand-body gestures and recorded recognition accuracy of up to 90%. We have implemented the system for Sony's Aibo robot dog using Sony's Remote Framework (RFW) SDK.
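The abstract builds on the standard MHI/MEI motion templates (Bobick & Davis) and combines them with a color cue. Below is a minimal Python/OpenCV sketch of that general idea, not the paper's method: the decay constant, difference threshold, HSV color range, and the final AND-style gating of motion and color are all illustrative assumptions; the actual MCI construction is the paper's contribution and is not reproduced here.

```python
import cv2
import numpy as np

# Assumed parameters for illustration only.
TAU = 0.5          # MHI decay duration in seconds
DIFF_THRESH = 30   # frame-difference threshold

def update_mhi(mhi, prev_gray, gray, timestamp):
    """Standard MHI update: decay old motion, stamp freshly moving pixels."""
    motion_mask = cv2.absdiff(gray, prev_gray) > DIFF_THRESH
    mhi[mhi < timestamp - TAU] = 0      # forget motion older than TAU
    mhi[motion_mask] = timestamp        # record current motion
    return mhi

def color_mask(frame_bgr, lower, upper):
    """Hypothetical color cue: pixels whose HSV values fall in [lower, upper]."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    return cv2.inRange(hsv, lower, upper) > 0

cap = cv2.VideoCapture(0)               # monocular color camera
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
mhi = np.zeros(prev_gray.shape, dtype=np.float32)

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    t = cv2.getTickCount() / cv2.getTickFrequency()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mhi = update_mhi(mhi, prev_gray, gray, t)
    mei = (mhi > 0).astype(np.uint8)    # MEI: union of recent motion
    # Illustrative motion-color gating (not the paper's MCI):
    cloth = color_mask(frame, (0, 40, 60), (25, 255, 255))
    mci_like = mei & cloth.astype(np.uint8)
    prev_gray = gray
```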
Authors
-
DESAI Uday
Department of Electrical Engineering, Indian Institute of Technology
-
SINGH Randeep
Kanwal Rekhi School of Information Technology
-
SETH Bhartendu
Department of Mechanical Engineering, Indian Institute of Technology