Introduction  

Robots have long played a significant role in production, most notably in factories. Expectations are now high for intelligent robot systems that work cooperatively with human beings in daily life, medical treatment, and welfare. For people to operate robots in everyday settings, a smooth interface between humans and robots is essential: ideally, anyone should be able to operate a robot by giving it gesture and voice instructions, just as people communicate with each other. This kind of interaction has been the subject of extensive research in recent years. An intelligent manipulator system using tracking vision has been developed [4], a control algorithm for a service robot performing a hand-over task has been proposed [5], human actions have been exploited for human-robot interaction [6], an intelligent house for physically impaired people based on hand-pointing gestures has been built [7], and a voice controller that operates a robot from voice instructions using fuzzy control has been presented [8]. This project presents a robot system driven by gesture and voice instructions. The goals of our project are as follows:

  • Build a robust system for controlling the motion of a robot from human speech and gesture commands. The robot we are using is a Turtlebot (Roomba base), and a Microsoft Kinect provides both the speech and the gesture input.
  • Integrate audio and gesture commands intelligently.
  • Integrate the system with ROS for controlling the motion of the robot (see the sketch after this list).
  • Develop a probabilistic inference model, i.e. a grammar, for interpreting commands.
  • Reduce noise in the audio module and make it more robust.
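To make the ROS integration goal concrete, the following is a minimal sketch (not the project's actual code) of how recognized commands could be turned into Turtlebot velocity messages. The command vocabulary, the velocity values, and the recognized_command topic name are illustrative assumptions; cmd_vel is the conventional Turtlebot velocity topic, though the exact name can vary with the Turtlebot and ROS version.

    #!/usr/bin/env python
    # Sketch: map recognized speech/gesture commands to Turtlebot velocity
    # commands published on the conventional cmd_vel topic.
    import rospy
    from geometry_msgs.msg import Twist
    from std_msgs.msg import String

    # Hypothetical mapping from command words to (linear m/s, angular rad/s).
    COMMANDS = {
        "forward": (0.2, 0.0),
        "back":    (-0.2, 0.0),
        "left":    (0.0, 0.5),
        "right":   (0.0, -0.5),
        "stop":    (0.0, 0.0),
    }

    class CommandToVelocity(object):
        def __init__(self):
            # Publisher for the robot base; Turtlebot drives from Twist messages.
            self.pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
            # Assumed topic on which the speech/gesture modules publish the
            # recognized command word as a plain string.
            rospy.Subscriber("recognized_command", String, self.on_command)

        def on_command(self, msg):
            # Unknown commands fall back to a stop for safety.
            linear, angular = COMMANDS.get(msg.data, (0.0, 0.0))
            twist = Twist()
            twist.linear.x = linear
            twist.angular.z = angular
            self.pub.publish(twist)

    if __name__ == "__main__":
        rospy.init_node("command_to_velocity")
        CommandToVelocity()
        rospy.spin()

Keeping the recognition modules and the motion control in separate ROS nodes that talk over a topic lets the audio and gesture pipelines be developed and tested independently of the robot base.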