We implement a model to distinguish between a large set of hand gestures.


The goal of this project is to categorize human hand gestures using a machine learning approach. The motivation is to offer an alternative method of human-computer interaction to people with disabilities and the elderly, as well as to average consumers. With this new method, we hope to promote inclusivity and expand usability in consumer electronics.

Traditional methods of hand gesture recognition and categorization rely on additional devices such as specially designed gloves or depth cameras, which are inconvenient, unintuitive, and impose extra costs on the user. Thus, RGB-based computer vision is the most practical solution. Given the complexity and reliability requirements of the task, we propose deep-learning-based computer vision (CV) techniques, as they offer more flexibility, higher accuracy, and more room for feature expansion than traditional CV techniques.
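To make the approach concrete, below is a minimal sketch of a convolutional gesture classifier operating on RGB frames, written in PyTorch. It is illustrative only: the class count, input resolution, and architecture are placeholder assumptions and do not reflect the actual model defined in src/train.py.

```python
# Minimal sketch of an RGB hand-gesture classifier, for illustration only.
# The class count, input size, and architecture are assumptions, not the
# actual model trained by src/train.py.
import torch
import torch.nn as nn

NUM_GESTURES = 10           # placeholder: the real number of gesture classes may differ
INPUT_SIZE = (3, 128, 128)  # placeholder: RGB frames resized to 128x128

class GestureNet(nn.Module):
    def __init__(self, num_classes: int = NUM_GESTURES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)       # (N, 128, 1, 1)
        x = torch.flatten(x, 1)    # (N, 128)
        return self.classifier(x)  # unnormalized class scores

if __name__ == "__main__":
    model = GestureNet()
    frames = torch.randn(4, *INPUT_SIZE)  # dummy batch of 4 RGB frames
    logits = model(frames)
    print(logits.shape)                   # torch.Size([4, 10])
```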

Apologies for how rough this is at the moment. To run the project, go through the files in the following order; instructions on how to run each script are included in the file itself (an illustrative driver that runs them in sequence is sketched after the list).

  1. src/preprocessing/get_dataset.py
  2. src/preprocessing/demo.py
  3. src/train.py
  4. src/inference.py
  5. src/cluster.py
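A small driver like the one below could run the steps in sequence. It is only a sketch and assumes each script can be launched from the repository root without required command-line arguments, which may not match the actual interfaces documented in the files.

```python
# Illustrative driver that runs the pipeline steps in the order listed above.
# Assumption: each script runs without extra command-line arguments; check the
# header of each file for its actual usage before relying on this.
import subprocess
import sys

PIPELINE = [
    "src/preprocessing/get_dataset.py",
    "src/preprocessing/demo.py",
    "src/train.py",
    "src/inference.py",
    "src/cluster.py",
]

for script in PIPELINE:
    print(f"--- running {script} ---")
    subprocess.run([sys.executable, script], check=True)  # stop on the first failure
```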
