Sign language is a form of language that uses manual communication to convey meaning. It combines simultaneous hand gestures; movement and orientation of the fingers, arms, or body; and facial expressions to express a signer's ideas.
A real-time sign language translator helps bridge communication between the deaf community and the general public. We designed and built a CNN (Convolutional Neural Network) model for hand image and pattern recognition. The model recognizes 27 symbols of American Sign Language: the letters A-Z plus a blank symbol.
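At inference time the CNN produces one score per symbol, which must be turned into a character. The sketch below shows this decoding step in plain NumPy; the label ordering and the confidence threshold are illustrative assumptions, not the trained model's actual configuration.

```python
import numpy as np

# Assumed label order: 'A'-'Z' followed by a blank symbol (27 classes).
# The actual training label order may differ.
LABELS = [chr(ord('A') + i) for i in range(26)] + ['blank']

def softmax(logits):
    """Convert raw CNN outputs (logits) into class probabilities."""
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def decode(logits, threshold=0.5):
    """Map the most probable class to its character, or return None
    when the model is not confident enough (hypothetical threshold)."""
    probs = softmax(np.asarray(logits, dtype=float))
    idx = int(probs.argmax())
    return LABELS[idx] if probs[idx] >= threshold else None
```

For example, a logit vector that strongly favors the first class decodes to `'A'`, while a flat vector falls below the threshold and yields `None`.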
The web application is built with the Streamlit framework. It captures real-time video through the device's webcam, processes the input stream frame by frame, and recognizes hand gestures using the CNN model. Recognized characters are then shown on the user's screen in real time.
- Python
- OpenCV
- Streamlit
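In the app, OpenCV supplies the webcam frames and Streamlit renders the output; the per-frame preprocessing that sits between them can be sketched in plain NumPy. The region-of-interest coordinates and the 64x64 target size below are illustrative assumptions, not the application's actual parameters.

```python
import numpy as np

def preprocess_frame(frame, roi=(0, 0, 200, 200), size=64):
    """Prepare one BGR webcam frame for the CNN: crop a hypothetical
    region of interest, convert to grayscale, downsample to a square
    input, and scale pixel values to [0, 1]."""
    y, x, h, w = roi
    patch = frame[y:y + h, x:x + w].astype(float)
    gray = patch.mean(axis=2)                    # naive grayscale average
    step = gray.shape[0] // size
    small = gray[::step, ::step][:size, :size]   # nearest-neighbour downsample
    return small / 255.0
```

In the real pipeline this function would be called once per frame read from `cv2.VideoCapture`, and its output fed to the model before the predicted character is displayed in the Streamlit UI.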