Geshna B / Sign Language Classification Public

Sign Language Classification

This Edge Impulse project classifies Indian Sign Language gestures using image data and machine learning for accurate gesture recognition.


About this project

This project classifies Indian Sign Language (ISL) gestures, including both digits and alphabets, using image data and machine learning for accurate gesture recognition. The main goal is to enable real-time gesture classification, supporting applications such as assistive communication technologies for the hearing impaired.

Key Features:
- Gesture Classification: Classifies hand gestures into digits and alphabets.
- Feature Extraction: Uses image data and machine learning for feature extraction and classification.
- High Accuracy: Achieved high accuracy in classifying gestures from the dataset.
- Real-time Deployment: Trained model can be deployed for real-time gesture recognition on embedded devices.
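The project's exact image block settings are not documented here; as a rough illustration of the feature-extraction step, the sketch below resizes an RGB frame and scales pixel values to [0, 1], the typical form in which image features are fed to a classifier. The 96x96 input size and nearest-neighbour resizing are assumptions, not settings taken from this project.

```python
import numpy as np

def preprocess(image: np.ndarray, size: int = 96) -> np.ndarray:
    """Resize an RGB image to size x size (nearest neighbour) and
    scale pixel values to [0, 1]. Illustrative only; the project's
    actual input resolution is an assumption."""
    h, w, _ = image.shape
    rows = np.arange(size) * h // size   # source row for each output row
    cols = np.arange(size) * w // size   # source column for each output column
    resized = image[rows][:, cols]
    return resized.astype(np.float32) / 255.0

# Example: a dummy 120x160 RGB frame
frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
features = preprocess(frame)
print(features.shape)  # (96, 96, 3)
```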

Dataset: The dataset includes images of hand gestures representing the digits 1-9 and the alphabets A-Z in Indian Sign Language, for 35 gesture categories of roughly 1,200 images each. The images were processed, augmented, and labeled to ensure efficient training.
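The specific augmentations applied to the dataset are not listed; the sketch below shows two common, label-preserving image augmentations (random horizontal flip and brightness jitter) as a minimal example of what such a pipeline might do. The flip probability and jitter range are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> np.ndarray:
    """Apply a random horizontal flip and brightness jitter to one
    training image. Illustrative sketch; the project's actual
    augmentation settings are not documented."""
    out = image.astype(np.float32)
    if rng.random() < 0.5:            # random horizontal flip
        out = out[:, ::-1]
    out *= rng.uniform(0.8, 1.2)      # brightness jitter
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: augment a flat gray 64x64 frame
sample = np.full((64, 64, 3), 128, dtype=np.uint8)
augmented = augment(sample)
print(augmented.shape)  # (64, 64, 3)
```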

Tools and Libraries Used:
- Edge Impulse Studio: Used for designing the impulse, training the model, and deploying it for gesture recognition.
- Image Data: Hand gestures representing digits and alphabets in Indian Sign Language.
- Edge Impulse's Built-in Tools: Utilized for feature extraction, model training, and evaluation within the platform.

The trained model can be deployed on Edge Impulse-supported devices for real-time gesture recognition, enabling applications in assistive communication for the hearing impaired.
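On-device, the classifier emits one probability per gesture class; an application then maps the highest-scoring class to its label, usually with a confidence threshold to suppress uncertain frames. The sketch below shows that post-processing step. The label order (digits 1-9 followed by A-Z, 35 classes) and the 0.6 threshold are assumptions, not values taken from this project.

```python
import numpy as np

# Assumed label order: digits 1-9 then A-Z (35 classes in total).
LABELS = [str(d) for d in range(1, 10)] + [chr(c) for c in range(ord("A"), ord("Z") + 1)]

def decode(probabilities: np.ndarray, threshold: float = 0.6):
    """Map a model output vector to a gesture label, returning None
    when no class clears the confidence threshold."""
    idx = int(np.argmax(probabilities))
    if probabilities[idx] < threshold:
        return None
    return LABELS[idx]

# Example: a fake output vector that peaks on class "A" (index 9)
scores = np.zeros(35)
scores[9] = 0.9
print(decode(scores))  # A
```

In a real deployment the probability vector would come from the Edge Impulse runtime on the target device; here it is faked for illustration.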



Dataset summary
- Data collected: 41,977 items
- Labels: 1, 2, 3, 4, 5 and 30 others

Project info
- Project ID: 577176
- License: Apache 2.0
- Views: 27,995
- Clones: 1