Intelligent Navigesture is an ML-based project that gives the user a touchless experience through gesture-based interaction with the interface, offering the following:
- Touchless mouse: Helps the user navigate the interface
- Volume control: Contactless control of volume through hand signs
- Ping pong game: A fun experience where the user can play the classic ping pong game with both hands, without needing a companion.
• Github Repository : https://github.com/Dhruv-Sapra/Intelligent-Navigesture
• Drive Link : https://drive.google.com/drive/folders/1eRAi9lJdZqaI9nLOB5vco8tGI2iMh2LC?usp=share_link
Computer Vision :
• OpenCV
• Python programming language
Deep Learning :
• Neural networks and model training
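For orientation, a minimal hand-tracking loop built from this stack might look like the sketch below. This is not code from the repository; the window name and confidence thresholds are illustrative, and the hand landmark model comes from MediaPipe (listed under installation).

```python
# Minimal hand-tracking loop with OpenCV + MediaPipe (illustrative only).
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Intelligent Navigesture - hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
cap.release()
cv2.destroyAllWindows()
```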
- Gesture-based mouse control is working, with left click, right click, scroll, and drag-and-drop functionality (see the sketches after this list)
- Volume control via gestures is also working
- A web module has been added that can open a website using gestures
- The ping pong game can be played in real time by two people using gestures
- The current code runs on the CPU only; we are exploring ways to use GPU power for better results
- Building and training a model that can learn user-provided gestures and control the operating system based on those gestures
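The sketches below illustrate how the mouse and volume features can be wired up; they are simplified examples, not the repository's actual code. Landmark indices (8 = index fingertip, 4 = thumb tip) follow the MediaPipe hand model, and the pinch thresholds and interpolation bounds are placeholder values.

```python
# Illustrative mapping from a detected hand to mouse actions with PyAutoGUI.
import math
import pyautogui

screen_w, screen_h = pyautogui.size()

def control_mouse(hand_landmarks):
    index_tip = hand_landmarks.landmark[8]   # index fingertip
    thumb_tip = hand_landmarks.landmark[4]   # thumb tip

    # Landmarks are normalised to [0, 1]; scale them to screen pixels.
    pyautogui.moveTo(index_tip.x * screen_w, index_tip.y * screen_h)

    # Treat a thumb-index pinch as a left click (threshold is a placeholder).
    pinch = math.hypot(index_tip.x - thumb_tip.x, index_tip.y - thumb_tip.y)
    if pinch < 0.05:
        pyautogui.click()
```

For the volume feature, pycaw exposes the Windows master volume endpoint; a normalised thumb-index distance can be interpolated onto its dB range:

```python
# Illustrative gesture-driven volume control with pycaw (Windows only).
from ctypes import POINTER, cast

import numpy as np
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

speakers = AudioUtilities.GetSpeakers()
interface = speakers.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))
min_db, max_db, _ = volume.GetVolumeRange()

def set_volume_from_pinch(pinch_distance):
    # Map a normalised pinch distance (bounds are placeholders) onto the dB range.
    level = np.interp(pinch_distance, [0.02, 0.25], [min_db, max_db])
    volume.SetMasterVolumeLevel(level, None)
```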
Hand tracking can be used as an alternative input method for controlling a mouse on a computer. Here are some potential usage scenarios for hand tracking in this context:
- Virtual and Augmented Reality: Hand tracking will likely play a larger role in the development of virtual and augmented reality technologies, providing users with more intuitive and natural ways to interact with virtual objects and environments.
- Accessibility: Hand tracking can provide a more accessible option for individuals with disabilities that make it difficult to use a traditional mouse.
- Gaming: In gaming, hand tracking can provide a more immersive experience by allowing players to control the game using hand gestures.
- Health care: Can be used in environments where physically touching a machine to operate it is not viable, for example during surgery.
- Presentations: In presentations, hand tracking can be used to interact with slides, providing a more dynamic and engaging experience for the audience.
As of today (02/02/2023), Mediapipe and Pyautogui are not supported on Python version 3.11.1.
Install a lower Python version on your system to use these libraries:
- Python 3.8.10 :- https://www.python.org/downloads/release/python-3810/
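If you prefer a runtime check instead of remembering the constraint, a small guard like the following (an optional addition, not part of the project) fails fast on an unsupported interpreter:

```python
import sys

# MediaPipe and PyAutoGUI wheels were not available for Python 3.11 when this
# was written (Feb 2023); exit early with a hint instead of failing on import.
if sys.version_info >= (3, 11):
    raise SystemExit("Please use Python 3.8-3.10 (e.g. 3.8.10) to run this project.")
```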
Install the following libraries by running these commands in your terminal:
- Mediapipe :- pip install mediapipe
- Pyautogui :- pip install pyautogui
- Pycaw :- pip install pycaw
- Numpy :- pip install numpy
- Opencv :- pip install opencv-python
Clone this repository and run the code using any IDE.
Sahil Dhillon: [email protected]
Ruchika Shirsath: [email protected]
Dhruv Sapra: [email protected]
Soham Sangole: [email protected]
Devansh Joshi: [email protected]
Ketaki Deshmukh: [email protected]