Deducing Direction from Gesture
How the robot deduces a pointing direction from 3D skeleton data in order to execute commands such as "Go there".
Complementary integration of speech and gesture is also tackled here; for background on complementary integration, refer to the Module integration page. An example of this integration is the "Go there" command: the user says "go there" while pointing in a certain direction, and the robot should move in the direction parallel to the user's arm. As explained in the Gesture section, we generally use 2D skeleton information for gesture recognition; this is where the available 3D skeleton information is used instead. We compute the vector determined by the shoulder and hand joints, obtaining a 3D vector in the Kinect frame, which for this purpose we can treat as the robot frame. We then project this vector onto the XZ plane, which is approximately parallel to the ground plane, and compute the angle between the positive Z axis and the projected vector. This is the angle through which the robot has to rotate: for a negative angle the robot rotates to the left, and for a positive angle to the right. Since both the angle and the robot's angular velocity are known, we rotate the robot for the required amount of time. Another variation is to use the elbow and hand joints instead, but the hand-shoulder vector has shown better results.
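The projection and angle computation above can be sketched as follows. This is a minimal illustration, not the page's actual implementation; the function names, the tuple representation of joints, and the axis convention (Z forward, X to the robot's right, Y up, as is typical for a Kinect frame) are assumptions.

```python
import math

def rotation_angle(shoulder, hand):
    """Signed angle (radians) the robot must rotate toward a pointing gesture.

    shoulder, hand: (x, y, z) joint positions in the Kinect/robot frame.
    Assumed convention: Z points forward, X to the robot's right, Y up.
    """
    # 3D shoulder-to-hand vector; projecting onto the XZ plane
    # simply means dropping the Y component.
    vx = hand[0] - shoulder[0]
    vz = hand[2] - shoulder[2]
    # Angle between the projected vector and the positive Z axis.
    # Negative = rotate left, positive = rotate right (matches the text).
    return math.atan2(vx, vz)

def rotation_time(angle, angular_velocity):
    """Duration to rotate through `angle` at a known angular velocity (rad/s)."""
    return abs(angle) / angular_velocity
```

For example, a hand at (1, 0, 1) relative to the shoulder points forward-right, giving an angle of pi/4, so at 0.5 rad/s the robot would rotate right for about 1.57 s.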