
Real-time glove and android application for visual and audible Arabic sign language translation

Alharbi, Saja
Khezendar, Raghdah
Alshami, Hedaih
Abstract
Machine learning and sensors enable researchers to develop new systems that capture, analyze, recognize, store, and interpret hand gestures. Acoustic communication is a way to convey human opinions, feelings, messages, and information. Deaf and mute individuals communicate using sign language, which is not understood by everyone, and they consequently face great difficulty in conveying their messages to others. To facilitate communication between deaf/mute individuals and hearing people, we propose a real-time prototype based on a customized glove equipped with five flex sensors and one accelerometer. These sensors detect the bending of the fingers and the movement of the hand. In addition, we developed an Android mobile application that recognizes the captured Arabic Sign Language (ArSL) gestures and translates them into displayed text and audible speech. The developed prototype is accurate, low-cost, and fast in response.
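The abstract does not specify how gestures are matched to ArSL signs, so the Kotlin sketch below only illustrates one plausible recognition step on the application side: comparing a glove reading (five flex values plus three accelerometer axes) against stored reference templates with a nearest-neighbour distance. The gesture labels, reference values, and threshold are illustrative assumptions, not the authors' method.

```kotlin
// Hypothetical sketch: nearest-neighbour matching of a glove reading
// against stored gesture templates. Values and labels are made up.
import kotlin.math.sqrt

data class GloveReading(
    val flex: List<Double>,   // five flex-sensor values (finger bending)
    val accel: List<Double>   // three-axis accelerometer values (hand movement)
)

// Illustrative reference templates for two ArSL gestures.
val templates = mapOf(
    "alif" to GloveReading(listOf(0.9, 0.1, 0.1, 0.1, 0.1), listOf(0.0, 0.0, 1.0)),
    "ba"   to GloveReading(listOf(0.1, 0.9, 0.9, 0.9, 0.9), listOf(0.0, 0.0, 1.0))
)

// Euclidean distance over the concatenated sensor vector.
fun distance(a: GloveReading, b: GloveReading): Double {
    val va = a.flex + a.accel
    val vb = b.flex + b.accel
    return sqrt(va.zip(vb).sumOf { (x, y) -> (x - y) * (x - y) })
}

// Return the closest template's label, or null if nothing is close enough.
fun recognize(reading: GloveReading, threshold: Double = 0.5): String? =
    templates.minByOrNull { distance(reading, it.value) }
        ?.takeIf { distance(reading, it.value) <= threshold }
        ?.key

fun main() {
    val sample = GloveReading(listOf(0.85, 0.15, 0.1, 0.12, 0.08), listOf(0.0, 0.05, 0.98))
    // In the described prototype, the recognized label would be displayed as
    // text and spoken aloud (e.g. via Android's TextToSpeech service).
    println(recognize(sample) ?: "unrecognized gesture")
}
```

A template-matching step like this is only one possible design; the paper itself may use a trained classifier, which would replace the distance computation with a model inference call.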