February 21, 2025 - Technology media outlet WinBuzzer published a blog post yesterday, February 20, reporting that NVIDIA, in partnership with the American Society for Deaf Children (ASDC) and the digital agency Hello Monday, has launched an AI platform called Signs to help more people learn and use American Sign Language (ASL).
The challenge in training AI to understand sign language is that ASL conveys meaning through a combination of hand gestures, facial expressions, and spatial positioning. Many AI models struggle with this complexity because they rely heavily on hand tracking alone, missing the nuances of these non-manual signals.
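NVIDIA has not published the internals of its pipeline, but the point about non-manual signals can be made concrete with a minimal sketch. It uses the open-source MediaPipe Holistic model (an assumption for illustration, not NVIDIA's actual stack) to extract hand, face, and body landmarks from each video frame, so a downstream model sees facial expression and spatial cues rather than hand positions alone:

```python
import cv2
import mediapipe as mp
import numpy as np

mp_holistic = mp.solutions.holistic

def extract_features(frame, holistic):
    # MediaPipe expects RGB input; OpenCV captures BGR
    results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

    def to_array(landmarks, n_points):
        # Missing landmarks (e.g., hand out of frame) become zero vectors
        if landmarks is None:
            return np.zeros(n_points * 3)
        return np.array([[p.x, p.y, p.z] for p in landmarks.landmark]).flatten()

    # Hand tracking alone misses non-manual signals; include face and pose too
    return np.concatenate([
        to_array(results.left_hand_landmarks, 21),
        to_array(results.right_hand_landmarks, 21),
        to_array(results.face_landmarks, 468),  # facial expression cues
        to_array(results.pose_landmarks, 33),   # spatial positioning of the body
    ])

with mp_holistic.Holistic(static_image_mode=False) as holistic:
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        features = extract_features(frame, holistic)
        print(features.shape)  # (21 + 21 + 468 + 33) * 3 = 1629 values per frame
    cap.release()
```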
Unlike a static sign language dictionary, the Signs platform continuously learns and improves through user contributions: AI models are trained on user-submitted videos of real ASL signing, with the aim of improving the accuracy and scalability of AI sign language recognition and its grasp of natural signing styles.
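To show what training on user-submitted videos might look like in practice, here is a minimal, hypothetical PyTorch sketch: a sequence classifier over per-frame landmark features (sized to match the 1,000-word vocabulary mentioned below) trained with a standard cross-entropy loss. All names and dimensions are illustrative; the actual Signs models are not public.

```python
import torch
import torch.nn as nn

class SignClassifier(nn.Module):
    """Toy classifier over per-frame landmark sequences extracted from
    sign videos. A hypothetical stand-in, not NVIDIA's architecture."""

    def __init__(self, feature_dim=1629, hidden_dim=256, num_signs=1000):
        super().__init__()
        self.encoder = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_signs)  # one class per sign word

    def forward(self, x):
        # x: (batch, frames, feature_dim) landmark sequences from videos
        _, (h, _) = self.encoder(x)
        return self.head(h[-1])  # logits over the sign vocabulary

model = SignClassifier()
clips = torch.randn(8, 60, 1629)        # 8 clips of 60 frames each (dummy data)
labels = torch.randint(0, 1000, (8,))   # dummy sign labels
loss = nn.functional.cross_entropy(model(clips), labels)
loss.backward()  # new user submissions would extend this training set over time
```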
One of the platform's core features is real-time AI feedback. Users sign in front of their camera, the system analyzes their gestures and offers corrections as needed, and a 3D avatar demonstrates the correct ASL signs for easy comparison and learning.
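As one illustration of how such feedback could work (the platform's actual scoring method is not public), the sketch below compares a user's landmark sequence against a time-aligned reference recording frame by frame and flags large deviations. The sequences, threshold, and feedback wording are all hypothetical.

```python
import numpy as np

def feedback(user_seq, reference_seq, threshold=0.15):
    """Hypothetical feedback rule: compare a user's landmark sequence to a
    reference signing of the same word and flag frames that diverge.

    user_seq, reference_seq: (frames, feature_dim) arrays, time-aligned.
    threshold: an illustrative tolerance, not a value from the Signs platform.
    """
    # Per-frame distance between user and reference, normalized by dimension
    per_frame_error = (np.linalg.norm(user_seq - reference_seq, axis=1)
                       / np.sqrt(user_seq.shape[1]))
    bad_frames = np.where(per_frame_error > threshold)[0]
    if bad_frames.size == 0:
        return "Sign matched the reference."
    return (f"Check your form around frames {bad_frames[:5].tolist()}; "
            "compare with the 3D avatar.")

# Dummy sequences standing in for camera input and the avatar's reference
rng = np.random.default_rng(0)
reference = rng.normal(size=(60, 1629))
user = reference + rng.normal(scale=0.05, size=reference.shape)
print(feedback(user, reference))
```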
NVIDIA's goal is to train a model that understands ASL as it is used naturally, rather than relying on rigid, predefined motions. The dataset currently contains 400,000 video clips covering 1,000 signed words, and NVIDIA plans to open the platform to a wider audience to grow the dataset, making portions of it publicly available to support research on accessibility applications.