indian_hand_sign_language Computer Vision Project
Introduction to the Indian Sign Language (ISL) Model
This project focuses on building an accurate, real-time detection model for Indian Sign Language (ISL) gestures. We use the YOLO-NAS object-detection model with the coco/14 checkpoint, and we use Roboflow for data preprocessing, including image annotation and class balancing. Training, validation, and testing run in a Google Colab notebook, followed by a live-detection implementation in PyCharm. The model aims to ease communication for the deaf and hard-of-hearing community by translating ISL gestures into text and speech.
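The live-detection step described above can be sketched in Python using the super-gradients library (which provides YOLO-NAS) and OpenCV. This is a minimal, hypothetical example: the checkpoint path, model size (`yolo_nas_s`), and gesture class list are illustrative assumptions, not this project's actual artifacts.

```python
# Sketch of live ISL gesture detection, assuming super-gradients
# (pip install super-gradients) and opencv-python. The checkpoint path
# and class names below are illustrative placeholders.
CLASSES = ["hello", "thank_you", "yes", "no"]  # assumed subset of ISL gestures

def ids_to_text(class_ids):
    """Translate predicted class indices into gesture words."""
    return " ".join(CLASSES[int(i)] for i in class_ids)

def run_live_detection(checkpoint_path="ckpt_best.pth", conf=0.5):
    # Imported here so ids_to_text stays usable without these packages.
    import cv2
    from super_gradients.training import models

    model = models.get("yolo_nas_s",
                       num_classes=len(CLASSES),
                       checkpoint_path=checkpoint_path)
    cap = cv2.VideoCapture(0)  # default webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = model.predict(frame, conf=conf)
        print(ids_to_text(result.prediction.labels))
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
```

Calling `run_live_detection()` opens the webcam and prints the recognized gesture words for each frame; the printed text could then be passed to a text-to-speech engine.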
Use This Trained Model
Try it in your browser, or deploy via our Hosted Inference API and other deployment methods.
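As a sketch of the Hosted Inference API route, the snippet below uses the `roboflow` Python package. The API key, version number, confidence thresholds, and image filename are placeholders you would replace with your own; the workspace and project IDs are taken from this project's URL.

```python
# Hypothetical Hosted Inference API call via the roboflow package
# (pip install roboflow). API key, version number, and image path are
# placeholders, not values confirmed by this project.
def top_labels(predictions, min_conf=0.4):
    """Keep the class names of predictions at or above min_conf (0-1 scale)."""
    return [p["class"] for p in predictions if p["confidence"] >= min_conf]

def detect_remote(image_path="hand.jpg", api_key="YOUR_API_KEY"):
    from roboflow import Roboflow

    rf = Roboflow(api_key=api_key)
    project = rf.workspace("indian-sign-language-ebxwp").project(
        "indian_hand_sign_language")
    model = project.version(1).model  # assumed version number
    result = model.predict(image_path, confidence=40, overlap=30).json()
    return top_labels(result["predictions"])
```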
Build Computer Vision Applications Faster with Supervision
Visualize and process your model results with our reusable computer vision tools.
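For instance, YOLO-NAS predictions can be converted to supervision's `Detections` format and drawn onto a frame. A hedged sketch, assuming the `supervision` package and an existing `frame` (a BGR NumPy array) plus a YOLO-NAS prediction named `result`:

```python
# Sketch of visualizing YOLO-NAS output with supervision
# (pip install supervision). `frame` and `result` are assumed to exist.
def format_label(class_name, confidence):
    """Render one label string, e.g. 'hello 0.87'."""
    return f"{class_name} {confidence:.2f}"

def annotate(frame, result):
    import supervision as sv

    detections = sv.Detections.from_yolo_nas(result)
    labels = [format_label(result.class_names[cid], conf)
              for cid, conf in zip(detections.class_id,
                                   detections.confidence)]
    annotated = sv.BoxAnnotator().annotate(scene=frame.copy(),
                                           detections=detections)
    return sv.LabelAnnotator().annotate(scene=annotated,
                                        detections=detections,
                                        labels=labels)
```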
Cite This Project
If you use this dataset in a research paper, please cite it using the following BibTeX:
@misc{indian_hand_sign_language_dataset,
  title = {indian_hand_sign_language Dataset},
  type = {Open Source Dataset},
  author = {Indian Sign Language},
  howpublished = {\url{https://universe.roboflow.com/indian-sign-language-ebxwp/indian_hand_sign_language}},
  url = {https://universe.roboflow.com/indian-sign-language-ebxwp/indian_hand_sign_language},
  journal = {Roboflow Universe},
  publisher = {Roboflow},
  year = {2024},
  month = {jul},
  note = {visited on 2025-01-07},
}