Development of a Guide-Dog Robot: Leading and Recognizing a Visually-Handicapped Person using a LRF
Abstract
A conceptual Guide-Dog Robot prototype that leads and recognizes a visually-handicapped person is developed and discussed in this paper. Key design features of the robot include a movable platform, a human-machine interface, and the capability to avoid obstacles. A novel algorithm that enables the robot to recognize its follower's locomotion as well as to detect the center of a corridor is proposed and implemented in the robot's human-machine interface. It is demonstrated that, using the proposed leading and detecting algorithm together with a rapid-scanning laser range finder (LRF) sensor, the robot can successfully and effectively lead a walking person along a corridor without running into obstacles such as trash boxes or adjacent pedestrians. The position and trajectory of the robot while leading a person through a common corridor environment are measured by an independent LRF observer. The measured data suggest that the proposed algorithms enable the robot to detect the center of the corridor and the position of its follower correctly.
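The abstract only outlines the leading and detecting algorithm. The sketch below illustrates, under assumptions of its own, how a single planar LRF scan could be used to estimate the corridor center from the left and right wall ranges and to confirm that the follower is still within following distance. The function names, scan format, angle windows, and distance thresholds are hypothetical and are not taken from the paper.

```python
import math

# Minimal sketch of the two detection ideas described in the abstract:
# estimating the corridor center from left/right wall ranges and checking
# that the follower is still within range. Scan format, angle windows,
# and thresholds are illustrative assumptions, not the authors' design.

def corridor_center_offset(scan, side_window_deg=20.0):
    """Estimate lateral offset from the corridor centerline.

    `scan` is a list of (angle_rad, range_m) pairs with 0 rad pointing
    forward and positive angles to the robot's left. Ranges near +90 deg
    and -90 deg are averaged as the distances to the left and right walls.
    A positive return value means the robot sits left of center.
    """
    half = math.radians(side_window_deg) / 2.0
    left = [r for a, r in scan if abs(a - math.pi / 2) < half and r > 0.05]
    right = [r for a, r in scan if abs(a + math.pi / 2) < half and r > 0.05]
    if not left or not right:
        return None  # a wall is not visible, e.g. at an open doorway
    return (sum(right) / len(right) - sum(left) / len(left)) / 2.0


def follower_in_range(scan, cone_half_angle_deg=45.0, max_dist_m=2.0, min_points=5):
    """Crude presence check for the guided person: count LRF returns in a
    cone toward the follower (assumed here to be behind the robot, i.e.
    near +/-180 deg) that lie within a normal following distance."""
    limit = math.pi - math.radians(cone_half_angle_deg)
    hits = [r for a, r in scan if abs(a) > limit and 0.1 < r < max_dist_m]
    return len(hits) >= min_points


if __name__ == "__main__":
    # Synthetic scan: a 2 m wide corridor with the robot 0.3 m left of center
    # (left wall at 0.7 m, right wall at 1.3 m) and a person 1.0 m behind.
    scan = [(math.radians(90), 0.7), (math.radians(85), 0.7),
            (math.radians(-90), 1.3), (math.radians(-85), 1.3)]
    scan += [(math.radians(180 - i), 1.0) for i in range(10)]
    print(corridor_center_offset(scan))   # ~ +0.3 -> steer right
    print(follower_in_range(scan))        # True
```

In a sketch like this, the sign of the offset would drive the steering command toward the centerline, while a failed follower check would make the robot slow down or stop until the person is detected again.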
Authors
- Saegusa Shozo (Collaborative Research Center, Hiroshima University)
- Uratani Yoshitaka (Collaborative Research Center, Hiroshima University)
- Makino Toshiaki (Tokuyama National College of Technology)
- Tanaka Eiichirou (Shibaura Institute of Technology)
- Yasuda Yuya (Collaborative Research Center, Hiroshima University)
- Chang Jen-Yuan (Massey University)
Related Papers
- INT-15 Development of Guide-Dog Robot (second report): Leading and recognizing a visually handicapped person using LRF (Intelligent Machines IV, Technical Program of Oral Presentations)
- Development of a Damage Diagnosis for a Gear using a Laser Beam: (A Proposal for a Method of Creating Benchmark Data to be used in Diagnosis)