Signify
Learn sign language through real-time gesture recognition and gamified lessons.
Team Members
1008034 Arishya Jindal
1008244 Bersamin Paigie Rae Carmona
1008233 Brian Lim Wei Lun
1008497 Dhruv Kadam
1008470 Maheshwari Nair
1008487 Shreya Radhakrishnan
1008483 Vaishavi Venkatesh
Description
This project is a smartphone app that helps users practice and learn sign language using real-time hand gesture recognition through the phone's camera. A trained YOLOv5 model detects and evaluates each signed letter of the alphabet, providing immediate feedback, visual cues, and gamified progress tracking through a heart and level system.
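As a rough illustration of how a detection could be graded against the letter the lesson asked for, here is a minimal sketch. The confidence threshold, class name, and method names are assumptions for illustration, not the app's actual API:

```java
// Illustrative sketch: grading one model detection against the prompted letter.
// MIN_CONFIDENCE and all names here are hypothetical, not from the real app.
public class SignEvaluator {
    static final float MIN_CONFIDENCE = 0.6f; // assumed acceptance threshold

    /** True when the detected letter matches the prompt confidently enough. */
    public static boolean isCorrect(String prompted, String detected, float confidence) {
        return confidence >= MIN_CONFIDENCE && prompted.equalsIgnoreCase(detected);
    }

    /** Feedback string shown to the user after each attempt. */
    public static String feedback(String prompted, String detected, float confidence) {
        if (confidence < MIN_CONFIDENCE) {
            return "Hand not recognised clearly - try again";
        }
        return isCorrect(prompted, detected, confidence)
                ? "Correct!"
                : "That looked like '" + detected.toUpperCase()
                  + "' - try signing '" + prompted.toUpperCase() + "' again";
    }
}
```

In a real pipeline this check would run on each camera frame's top detection before updating the lesson state.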
The app encourages consistent practice by prompting users to retry their mistakes. When a user loses all of their hearts, they are redirected to a course overview page where they can review or restart previous content.
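The heart-and-redirect flow above could be modelled with state like the following sketch. The starting heart count and all identifiers are assumptions, not the app's real implementation:

```java
// Minimal sketch of the heart-and-level session state described above.
// MAX_HEARTS and the class/method names are illustrative assumptions.
public class LessonSession {
    static final int MAX_HEARTS = 5; // assumed starting value
    private int hearts = MAX_HEARTS;

    /** Record one attempt; a wrong sign costs one heart. */
    public void recordAttempt(boolean correct) {
        if (!correct && hearts > 0) {
            hearts--;
        }
    }

    /** Out of hearts -> redirect to the course overview page. */
    public boolean shouldRedirectToOverview() {
        return hearts == 0;
    }

    /** Restarting previous content refills the hearts. */
    public void restart() {
        hearts = MAX_HEARTS;
    }

    public int getHearts() {
        return hearts;
    }
}
```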
Currently, the app offers one lesson: signing all the letters of the alphabet. Future improvements will include lessons for gestures used in basic conversations.
Technical Features:
- CameraX API for real-time camera access
- TensorFlow Lite for model inference
- Google Firebase Firestore for real-time updates and structured storage
- Data collections include:
  - User profiles (preferences and progress)
  - Sign database (alphabet images and meanings)
  - Lessons (content and progress)
  - Vocabulary bank (words and images)
  - Learning tips
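The Firestore collections listed above could map onto plain data classes like the following sketch. All field names are illustrative; the real schema may differ:

```java
import java.util.List;

// Plain data classes mirroring the Firestore collections listed above.
// Field names are assumptions, not the app's actual schema.
// Public fields and a no-arg constructor follow the usual Firestore POJO shape.
public class UserProfile {
    public String displayName;
    public int hearts;                     // remaining hearts
    public int level;                      // gamified progress
    public List<String> completedLessons;  // lesson document IDs

    public UserProfile() {}
}

class SignEntry {            // one document in the sign database
    public String letter;    // e.g. "A"
    public String imageUrl;  // reference image of the hand shape
    public String meaning;

    SignEntry() {}
}
```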
The app follows a modular, object-oriented design for scalability and maintainability. With an intuitive interface and an interactive feedback system, it offers an accessible, beginner-friendly way to learn sign language.
Poster