OBJECTIVE
Around 800 million people worldwide are affected by hearing loss. This number is expected to reach 1.1 billion by 2016 – about 16% of the world’s population. Two types of communication systems aid such people: unaided and aided. Unaided communication systems rely on the user’s body, such as sign language and body language. Aided communication systems involve tools and devices such as hearing aids, gesture recognition devices and cochlear implants. Studies have shown that only one in five people who require a hearing aid actually uses one, largely because such hearing aids are often physically cumbersome and difficult to transport.
People suffering from hearing loss, as well as their families, need to develop certain abilities to communicate with each other through sign language. Such audio-vocally impaired individuals can use gesture recognition systems that enable others to understand their gestures through audio or visual output. Several gesture recognition engines exist that accurately identify sign languages such as American Sign Language and British Sign Language. However, such systems are expensive, and since the majority of people with disabling hearing loss live in low- and middle-income countries, such high-cost systems are unaffordable to them.
We implement a low-cost American Sign Language Translator Glove with custom-made flex and contact sensors embedded onto it for the interpretation of static ASL gestures. The outputs of these sensors are processed by an ATmega328P microcontroller. Further, a gesture recognition algorithm was developed to improve the accuracy of the system.
COMPONENTS
a) Flex Sensors
b) ATmega328 microcontroller (Arduino Uno)
c) Contact Sensors
d) CD4051 Multiplexer
e) 10 kΩ Resistors
f) M-M, F-F and M-F Jumper wires
g) IC-7805
SYSTEM OVERVIEW
Each glove has nine custom-made flex sensors and five contact sensors. The flex sensors are connected through a voltage divider circuit powered by a constant 5 V DC supply. The sensor outputs are multiplexed using two 8x1 MUXes and fed to analog pins 2 and 3 of the ATmega328P microcontroller. The sensor readings are processed locally by the microcontroller and the recognized gesture is then displayed.
The voltage divider circuit uses equal-valued resistances to halve the 5 V from the IC-7805 regulator. The flex sensors change their resistance when bent, and the corresponding change in voltage is sent as input to the multiplexer. In the case of the contact sensors, whenever contact is made with the reference contact sensor placed on the thumb (held at 5 V), a path for current flow is created and the voltage rises to 5 V; otherwise it remains at 0 V.
The voltage values from the 13 sensors given as input to the multiplexers are sent sequentially to the analog pins of the Arduino by polling. The select lines of the MUXes are used to choose the voltage value from each sensor in turn, and they are toggled using digital pins 10, 11 and 12 of the Arduino. The primary purpose of using the MUXes is to obtain all the sensor values, since the Arduino does not have as many analog pins as required.
WORKING OF THE SYSTEM
With respect to the amount of bend of a flex sensor, only two states are considered: no bend and full bend. These are mapped to bits 0 and 1 by the gesture recognition engine. The sensors are calibrated so that the voltage produced beyond a certain level of bend is set as a threshold; if the voltage from a sensor exceeds the threshold, the corresponding bit is set HIGH, otherwise it is reset. For the contact sensors, the voltage is 0 V or 5 V in the no-contact and full-contact states, and these are mapped to bit 0 and bit 1 respectively.
Every letter with a static gesture in American Sign Language is represented by a stream of 13 bits and its equivalent decimal value, stored in a lookup table. Whenever a gesture is signed, the changes in voltage are mapped onto bits and the corresponding letter is identified by comparison with the lookup table.
To increase the accuracy of detection, the number of 1s in the measured bit stream is compared against lookup-table entries whose count of 1s lies within two of it in either direction; for example, if the bit stream contains seven 1s, it is compared with table entries containing five to nine 1s. An XOR operation is performed with each candidate bit stream and the closest match is taken. This ensures that the exact gesture is identified even if some of the bends made by the user are not recognized or signed properly.