GSoC’20 RoboComp project: Hand Gesture Recognition Component
June 2, 2020
I am Kanav Gupta, a final-year undergraduate pursuing a B.Tech in Computer Science and Engineering at the International Institute of Information Technology, Hyderabad, India. My interests lie in Computer Vision, Machine Learning, and Software Development, and I enjoy exploring research articles that address practical problems in robotics and vision. This year, I will be working on the Hand Gesture Recognition Component.
About the Project
One key capability of a social robot is efficient communication with humans, whether through voice or gestures. Currently, EBO is equipped with components that enable it to perform social tasks such as face detection, emotion recognition, and people identification. This project aims to build upon the existing functionality by integrating a Hand Gesture Recognition Component. Hand gestures are central to human-robot interaction, and this component will serve as a communication channel between robots and humans. Furthermore, it will enable the robot to understand American Sign Language (ASL), providing an effective means of communication with deaf and hard-of-hearing users.
The Hand Gesture Recognition Component developed in this project will achieve the following objectives:
- Track hands in the input video stream.
- Estimate the hand pose, i.e., the hand's key points.
- Use the estimated hand pose to recognize the intended hand gesture.
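To illustrate the last step of this pipeline, here is a minimal Python sketch of rule-based gesture classification from estimated key points. It assumes the 21-landmark hand layout popularized by MediaPipe Hands (wrist plus four joints per finger); the "count extended fingers" heuristic and the gesture labels are illustrative assumptions, not the component's actual implementation.

```python
# Illustrative sketch: classify a coarse gesture from 21 hand key points.
# Landmark indexing follows the MediaPipe Hands convention (an assumption);
# the heuristic itself is a simplified stand-in for a learned classifier.

# (tip, pip) landmark index pairs for the four non-thumb fingers.
FINGERS = {
    "index":  (8, 6),
    "middle": (12, 10),
    "ring":   (16, 14),
    "pinky":  (20, 18),
}

def count_extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) tuples in image coordinates
    (y increases downward), assuming an upright hand with the wrist
    at the bottom of the frame."""
    count = 0
    for tip, pip in FINGERS.values():
        # A finger counts as extended if its tip sits above its
        # middle (PIP) joint in the image.
        if landmarks[tip][1] < landmarks[pip][1]:
            count += 1
    return count

def classify_gesture(landmarks):
    """Map the extended-finger count to a coarse gesture label."""
    n = count_extended_fingers(landmarks)
    return {0: "fist", 4: "open_palm"}.get(n, f"{n}_fingers")
```

A real component would replace this heuristic with a classifier trained on key-point features (necessary for distinguishing the many hand shapes of ASL), but the interface, taking estimated key points in and returning a gesture label, stays the same.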