Just Give Me a Sign

NVIDIA's AI-powered Signs platform teaches people sign language, and the data it collects helps developers build more accessible tools.

Signs is a tool that makes it easier to learn American Sign Language (📷: NVIDIA)

For the deaf and hard of hearing, sign languages are a way of connecting with a world that often feels like it was not designed with them in mind. By signing, an individual can communicate in a highly expressive and nuanced manner that is in many ways as effective as verbal communication. Or rather, it would be as effective as verbal communication if more people were fluent in it.

Since learning a new language — sign or otherwise — requires a lot of effort, few people learn to sign unless they are either deaf themselves or have a close friend or relative who communicates this way. As a result, only a small minority of people are competent signers. And when few people use a language, it matters little how effective or expressive it is; those who rely on it face constant barriers as they try to communicate with others in their daily lives.

NVIDIA recently teamed up with the American Society for Deaf Children and a creative agency called Hello Monday in an effort to address this problem. Together, they launched Signs, an AI-powered interactive platform that not only supports American Sign Language (ASL) learners, but also aids in the development of advanced AI applications designed for accessibility.

Signs is designed to help users of all skill levels learn ASL by leveraging cutting-edge AI technology. The platform includes a 3D avatar that demonstrates ASL signs, allowing users to expand their vocabulary in a visually intuitive manner. Additionally, an AI-driven tool analyzes webcam footage in real time, providing immediate feedback on a user’s signing accuracy.
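NVIDIA has not published the internals of this feedback tool, but the general approach is well established: extract hand landmarks from each webcam frame and measure how closely they match a stored reference pose for the target sign. The sketch below illustrates the idea using MediaPipe Hands for landmark extraction; the REFERENCE_SIGN array and the distance-based scoring are hypothetical placeholders, not Signs' actual method.

```python
# Minimal sketch of landmark-based signing feedback, assuming MediaPipe
# Hands for pose extraction. This is not NVIDIA's implementation; the
# reference landmarks (REFERENCE_SIGN) are a hypothetical placeholder.
import cv2
import mediapipe as mp
import numpy as np

# Hypothetical: 21 (x, y, z) hand landmarks precomputed for the target sign.
REFERENCE_SIGN = np.zeros((21, 3))

def normalize(landmarks: np.ndarray) -> np.ndarray:
    """Translate the wrist (landmark 0) to the origin and scale by hand
    size, so the comparison ignores where the hand sits in the frame."""
    centered = landmarks - landmarks[0]
    scale = np.linalg.norm(centered, axis=1).max() or 1.0
    return centered / scale

def score(landmarks: np.ndarray, reference: np.ndarray) -> float:
    """Mean per-landmark Euclidean distance; lower is a closer match."""
    diff = normalize(landmarks) - normalize(reference)
    return float(np.linalg.norm(diff, axis=1).mean())

cap = cv2.VideoCapture(0)
with mp.solutions.hands.Hands(max_num_hands=1,
                              min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames as BGR.
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            hand = result.multi_hand_landmarks[0]
            pts = np.array([[lm.x, lm.y, lm.z] for lm in hand.landmark])
            print(f"Distance from reference sign: {score(pts, REFERENCE_SIGN):.3f}")
cap.release()
```

A production system would compare whole landmark sequences over time rather than single frames, since most signs involve motion as well as hand shape.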

However, Signs is not just a learning tool — it is also a collaborative effort to build a large-scale ASL dataset. Volunteers can contribute by recording themselves signing specific words, which helps expand the platform's database. NVIDIA aims to grow the dataset to 400,000 video clips covering 1,000 words, making it a robust and valuable resource for AI-driven accessibility tools.

To maintain accuracy and quality, fluent ASL users and professional interpreters validate all submitted video clips. This validation lays the foundation for a highly reliable digital dictionary that can support both ASL education and future AI applications.

By making the dataset publicly available later this year, NVIDIA is hoping to usher in a future where AI-powered applications can facilitate communication between deaf and hearing individuals. The data could be integrated into AI agents, digital human applications, video conferencing tools, and other accessibility-focused technologies. Additionally, improvements to Signs itself could enable real-time ASL translation and more advanced AI-powered signing assistance that could be instrumental in breaking down communication barriers for deaf and hard-of-hearing individuals.
