Live Gesture Recognition is an application that captures a hand in live video and interprets its gesture as a number (the count of raised fingers). A gesture here can be any physical movement of the fingers, large or small. The project is built with OpenCV and Python and is based on the concepts of object segmentation: it segments a single foreground object, in this case the hand, from a live video sequence.
The two essential parts of this project are:
- Segmenting the hand region from the video sequence
- Counting the number of fingers in the segmented region
One efficient way to separate the foreground from the background is the concept of running averages. The system looks at a fixed number of frames of the background scene and keeps a weighted running average of them; this average becomes the background model.
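As a rough sketch of this calibration step (assuming OpenCV's `cv2.accumulateWeighted`; the function and variable names below are illustrative, not the project's actual code):

```python
# A minimal sketch of background calibration with a running average.
import cv2

def calibrate_background(frames, alpha=0.5):
    """Accumulate a weighted running average over the calibration frames."""
    background = None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (7, 7), 0)
        if background is None:
            background = gray.astype("float")
            continue
        # Blend each new frame into the model:
        # background = alpha * gray + (1 - alpha) * background
        cv2.accumulateWeighted(gray, background, alpha)
    return background
```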
Once the application starts, it takes about 30 seconds to calibrate itself against the background. When the hand is then brought inside the frame, the absolute difference between the background model and the current frame yields a single foreground object: the hand. This method as a whole is known as background subtraction.
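A sketch of the subtraction step itself, reusing the `background` model from the previous sketch (names again illustrative):

```python
# Compare the calibrated background model against the current frame.
import cv2

def segment_hand(background, frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Pixels that differ strongly from the background belong to the hand.
    diff = cv2.absdiff(background.astype("uint8"), gray)
    return diff
```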
Obtaining the foreground object alone does not suffice; the difference image must be thresholded so that the hand region becomes white and every other region black. Contours are then of great help for detecting the fingers: the contours of the thresholded difference image are computed, and the contour with the largest area is assumed to be the hand.
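A minimal sketch of the thresholding and contour extraction, assuming OpenCV 4.x (where `cv2.findContours` returns two values); the threshold value of 25 is an assumed tuning constant, not taken from the source:

```python
# Threshold the difference image and pick the largest contour as the hand.
import cv2

def extract_hand_contour(diff, thresh_val=25):
    _, thresholded = cv2.threshold(diff, thresh_val, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(thresholded.copy(),
                                   cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return thresholded, None  # no hand in the frame
    # The contour with the largest area is assumed to be the hand.
    hand = max(contours, key=cv2.contourArea)
    return thresholded, hand
```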
The process of counting the fingers has five intermediate steps (a condensed sketch follows the list):
- Finding the convex hull of the segmented hand region.
- Computing the most extreme points in the convex hull.
- Finding the center of the palm using the extreme points.
- Constructing a circle around the hand with the center of the palm as origin.
- Performing a bitwise AND between the thresholded hand image and the circle (also known as the ROI, Region of Interest).
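Putting the five steps together, a condensed sketch might look like the following (assuming OpenCV 4.x and NumPy; the 80% radius ratio and the wrist filter are assumed heuristics, not necessarily the project's exact values):

```python
import cv2
import numpy as np

def count_fingers(thresholded, hand_contour):
    # 1. Convex hull of the segmented hand region.
    hull = cv2.convexHull(hand_contour)

    # 2. Most extreme points of the hull (top, bottom, left, right).
    extreme_top    = tuple(hull[hull[:, :, 1].argmin()][0])
    extreme_bottom = tuple(hull[hull[:, :, 1].argmax()][0])
    extreme_left   = tuple(hull[hull[:, :, 0].argmin()][0])
    extreme_right  = tuple(hull[hull[:, :, 0].argmax()][0])

    # 3. Center of the palm, approximated from the extreme points.
    cx = (extreme_left[0] + extreme_right[0]) // 2
    cy = (extreme_top[1] + extreme_bottom[1]) // 2

    # 4. A circle around the palm; 80% of the maximum distance to an
    #    extreme point is an assumed ratio that may need tuning.
    distances = [np.hypot(cx - x, cy - y)
                 for (x, y) in (extreme_left, extreme_right,
                                extreme_top, extreme_bottom)]
    radius = int(0.8 * max(distances))

    # 5. Bitwise AND between the thresholded hand and the circular ROI;
    #    what remains are the finger segments crossing the circle.
    circular_roi = np.zeros(thresholded.shape[:2], dtype="uint8")
    cv2.circle(circular_roi, (cx, cy), radius, 255, 1)
    circular_roi = cv2.bitwise_and(thresholded, thresholded,
                                   mask=circular_roi)

    contours, _ = cv2.findContours(circular_roi.copy(),
                                   cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    # Each remaining contour above the wrist region counts as a finger
    # (the wrist filter below is a rough heuristic).
    count = 0
    for c in contours:
        (_, y, _, h) = cv2.boundingRect(c)
        if (cy + (cy * 0.25)) > (y + h):
            count += 1
    return count
```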
Tech stack:
- OpenCV
- Python3
- Flask
To set up the project locally:
- Install Python3 and pip
- Install virtualenv and add it to your terminal path.
- Clone the repository:
$ git clone https://github.com/abhishekbvs/Gesture-Detection.git
$ cd Gesture-Detection
- Create a Python 3 virtualenv and activate the environment:
$ virtualenv -p python3 .
$ source bin/activate
- Install the dependencies from requirements.txt:
$ pip install -r requirements.txt
- Run the Flask application:
$ python app.py
This project can be extended to understand gestures and execute commands based on them.