Development of Motion Control Using Kinect Sensor as Kansei Communication Interface
Abstract
Because Japanese farmers are aging, the need for cooperative agricultural robots that can perform tasks together with human beings is becoming particularly acute. Such robots need the ability to integrate their activities seamlessly with those of human beings and should possess functions that can be controlled intuitively, even by new agricultural workers and elderly people. These functions are generally considered under the heading of Kansei communication. Robots that incorporate such abilities include the Kansei Agri-robot and the Chinou robot, which can extract tacit knowledge and retain it for future use, and which have been the subject of intense study and development work. In this paper, we report on the fabrication and evaluation of an intuitive control component based on the Kinect sensor that utilizes human motions, one of the core technologies for issuing instructions to the robot. The Kinect sensor, which can trace and replicate motion information based on human skeleton movements, is a gaming device for the Xbox 360 released by Microsoft Corporation in 2010. It consists of a three-dimensional depth sensor comprising an infrared (IR) emitter and an IR depth sensor, an RGB camera, and a multi-array microphone. The motion control technique targeted in this study is finger pointing, which is used to instruct the robot where to move and how to position itself correctly within a working area or other location. As the development environment, we used Microsoft Windows 7 as the operating system, OpenNI as the library, and NITE as the middleware; Visual Studio 2010 and the C++ language were used for software development. The following results were obtained. First, we found that the skeleton motion information of a farmer could be extracted at various angles using the Kinect sensor. Next, an algorithm for calculating finger-pointing points from the joint coordinate information for the shoulders, hands, and feet of the farmer was formulated. Based on the results of our verification experiments, we found that the accuracy of the algorithm was high relative to the assumed robot size and working area, and that control of a robot by finger pointing was possible. Estimation errors were found to vary depending on the sensing angle of the robot in relation to the farmer, and sensing errors from behind the farmer were greater than those from other angles. It was also found that the Kinect sensor could be used in field conditions during early morning and late afternoon hours when light intensities had decreased, as well as under artificial lighting conditions.
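To make the finger-pointing calculation concrete, the following is a minimal C++ sketch of one plausible formulation: the pointing target is estimated by extending the ray from the shoulder joint through the hand joint until it meets the ground plane, whose height is taken from the foot joints. The joint coordinates are assumed to come from OpenNI/NITE skeleton tracking; the struct, function name, and numeric values are hypothetical and do not reproduce the exact algorithm reported in the paper.

```cpp
// Hypothetical illustration only: estimate the ground position a farmer is
// pointing at from Kinect skeleton joints (millimetres, sensor coordinates
// with y pointing up and z pointing away from the sensor). Joint positions
// would come from OpenNI/NITE skeleton tracking in the actual system.
#include <cstdio>

struct Point3D {
    double x, y, z;  // joint position in millimetres
};

// Extend the shoulder-to-hand vector until it crosses the ground plane
// y = groundY (estimated here from the foot joints). Returns false if the
// arm points level or upward, so the ray never reaches the ground.
bool EstimatePointingTarget(const Point3D& shoulder,
                            const Point3D& hand,
                            double groundY,
                            Point3D* target)
{
    const double dx = hand.x - shoulder.x;
    const double dy = hand.y - shoulder.y;
    const double dz = hand.z - shoulder.z;

    if (dy >= 0.0) {
        return false;  // not pointing downward toward the ground
    }

    // Parameter t at which shoulder + t * (hand - shoulder) meets y = groundY.
    const double t = (groundY - shoulder.y) / dy;

    target->x = shoulder.x + t * dx;
    target->y = groundY;
    target->z = shoulder.z + t * dz;
    return true;
}

int main()
{
    // Example joint coordinates (made-up values for illustration).
    const Point3D shoulder = {   0.0,  300.0, 2500.0 };
    const Point3D hand     = { 250.0,    0.0, 2700.0 };
    const double  groundY  = -1100.0;  // ground height from the foot joints

    Point3D target;
    if (EstimatePointingTarget(shoulder, hand, groundY, &target)) {
        std::printf("Pointed-at position: (%.0f, %.0f, %.0f) mm\n",
                    target.x, target.y, target.z);
    }
    return 0;
}
```

In practice the shoulder and hand coordinates returned by the skeleton tracker fluctuate from frame to frame, so some temporal smoothing of the estimated point would likely be needed before issuing movement commands to the robot.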
Authors
- Sasaki Yutaka (Faculty of Engineering, Ibaraki University)
- Shibusawa Sakae (Department of Environmental and Agricultural Engineering, Tokyo University of Agriculture and Technology)
- Negishi Hanako (Graduate School of Agriculture, Tokyo University of Agriculture)