The project exists for one purpose: to help the deaf community communicate and interact easily with their surroundings. The aim is to recognize the hand signs that represent the 26 letters of the English alphabet as defined in ASL (American Sign Language) and display them on a smartphone screen.
The project was inspired by the idea of controlling a robotic arm with hand movements. Much of the working principle is the same, but implementing the remaining parts was a rather complex task. An accelerometer measures the tilt of the palm, and five bend sensors are mounted on a glove: four for the fingers and one for the thumb. These sensors measure how far each finger and the thumb are bent; from the set of bend values, the Arduino Nano microcontroller determines which symbol the hand is forming and transfers the result over Bluetooth to the Android app, which displays the symbol and speaks it aloud.
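The bend-value-to-symbol mapping described above can be sketched as a simple threshold classifier. The thresholds, sensor ordering, and letter patterns below are illustrative assumptions, not the project's actual calibration values:

```cpp
#include <array>
#include <cassert>

// Hypothetical bend thresholds in raw sensor units; real values would come
// from calibrating each flex sensor on the glove.
constexpr int BENT = 600;     // reading above this => digit is curled
constexpr int STRAIGHT = 300; // reading below this => digit is extended

// r[0] = thumb, r[1..4] = index..little finger.
// Returns the recognized letter, or '?' if no known pattern matches.
char classifySymbol(const std::array<int, 5>& r) {
    bool thumbBent = r[0] > BENT;
    bool fingersBent = r[1] > BENT && r[2] > BENT && r[3] > BENT && r[4] > BENT;
    bool fingersStraight = r[1] < STRAIGHT && r[2] < STRAIGHT &&
                           r[3] < STRAIGHT && r[4] < STRAIGHT;
    if (fingersBent && !thumbBent) return 'A';   // fist, thumb alongside
    if (fingersStraight && thumbBent) return 'B'; // flat hand, thumb folded in
    return '?';
}
```

On the glove, a loop would read the five analog pins, call a classifier like this, and write the resulting character to the Bluetooth serial link for the app to display and speak.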
Representing the first few symbols was easy and fun, but a few symbols were hard to distinguish, such as “U” and “V”, which differ only slightly from each other and produced almost identical sensor values. The earlier prototype consistently failed on these; the problem was solved with a metallic strip between the fingers, used to tell whether they are in contact or not.
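The contact-strip fix can be sketched as one extra boolean input to the classifier: when the bend values alone are ambiguous, the contact signal breaks the tie. The function and its inputs are hypothetical names for illustration:

```cpp
#include <cassert>

// “U” and “V” both have the index and middle finger extended; the reliable
// difference is whether the two fingers touch. A metallic contact strip
// between them closes a circuit when they are pressed together.
char resolveUorV(bool indexExtended, bool middleExtended, bool fingersTouching) {
    if (!(indexExtended && middleExtended)) return '?'; // not a U/V hand shape
    return fingersTouching ? 'U' : 'V';
}
```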
Accuracy was improved by continuously updating the data set for each symbol over time.

Working Video

Sample Picture
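One way to realize this continuous updating is to store a reference reading per letter and nudge it toward newly confirmed readings, so the data set tracks the wearer's hand over time. This is a minimal sketch with hypothetical template values, not the project's actual scheme:

```cpp
#include <array>
#include <cassert>
#include <map>

using Reading = std::array<int, 5>; // thumb + four fingers, raw sensor units

// Hypothetical reference reading per letter (illustrative values).
std::map<char, Reading> templates = {
    {'A', {200, 700, 720, 710, 690}},
    {'B', {650, 150, 140, 160, 155}},
};

// Squared Euclidean distance between two readings.
long dist2(const Reading& a, const Reading& b) {
    long d = 0;
    for (int i = 0; i < 5; ++i) {
        long diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

// Classify by nearest stored template.
char classify(const Reading& r) {
    char best = '?';
    long bestD = -1;
    for (const auto& [letter, tmpl] : templates) {
        long d = dist2(r, tmpl);
        if (bestD < 0 || d < bestD) { bestD = d; best = letter; }
    }
    return best;
}

// When the user confirms a prediction, average the stored template with the
// new reading so future classifications adapt to the wearer.
void updateTemplate(char letter, const Reading& r) {
    Reading& t = templates[letter];
    for (int i = 0; i < 5; ++i) t[i] = (t[i] + r[i]) / 2;
}
```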
I used Android Studio to create the app, which displays the generated symbol along with voice output.
This tool can be:
1) Integrated further with various services to help generate employment for deaf and mute people.
2) Combined with the controller to provide home automation at one's fingertips.
3) Paired with fitness sensors to monitor the health of the individual.