Real-time glove and android application for visual and audible Arabic sign language translation
Abstract
Researchers can develop new systems to capture, analyze, recognize, memorize, and interpret hand gestures using machine learning and sensors. Acoustic communication conveys human opinions, feelings, messages, and information, but deaf and mute individuals communicate using sign language, which is not understood by everyone; they therefore face great difficulty conveying their messages to others. To facilitate communication between deaf/mute individuals and hearing people, we propose a real-time prototype built around a customized glove equipped with five flex sensors and one accelerometer. These sensors detect the bending of the fingers and the movements of the hand. In addition, we developed an Android mobile application that recognizes the captured Arabic Sign Language (ArSL) gestures and translates them into displayed text and audible speech. The developed prototype is accurate, low cost, and fast in response.
Department
Electrical and Computer Engineering
DOI
10.1016/j.procs.2019.12.128
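The pipeline the abstract describes, reading five flex-sensor values plus accelerometer axes, matching them against known ArSL gestures, and emitting a label for display and speech, can be sketched as a simple nearest-template classifier. This is a minimal illustration only: the template values, gesture labels, and matching method below are assumptions, since the paper's actual recognition algorithm is not given in the abstract.

```python
# Hypothetical sketch (not the authors' code): classify one glove reading
# against stored gesture templates by nearest-neighbor matching.
# Sensor ordering, template values, and labels are illustrative assumptions.
import math

# Each reading: five flex-sensor values (finger bend, 0 = straight, 1 = fully
# bent) followed by three accelerometer axes (hand orientation, in g).
GESTURE_TEMPLATES = {
    "alif": [0.1, 0.9, 0.9, 0.9, 0.9, 0.0, 0.0, 1.0],  # thumb extended, fingers bent
    "ba":   [0.9, 0.1, 0.1, 0.1, 0.1, 0.0, 0.0, 1.0],  # fingers extended, thumb bent
    "ya":   [0.1, 0.9, 0.9, 0.9, 0.1, 0.0, 1.0, 0.0],  # thumb and pinky extended
}

def classify(reading):
    """Return the label of the stored template closest to the reading."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_TEMPLATES, key=lambda g: distance(GESTURE_TEMPLATES[g], reading))

# Example: a noisy reading near the "ba" template.
print(classify([0.85, 0.15, 0.05, 0.12, 0.08, 0.02, -0.01, 0.97]))  # → ba
```

On the phone side, the recognized label would then be rendered as on-screen text and passed to a speech-synthesis engine to produce the audible output the abstract mentions.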