Indian students build an AI model to translate sign language

Indian students develop an AI model that translates sign language into English in real time

Artificial intelligence (AI) has been used to develop various translation models that improve communication between users and break down language barriers between regions. Companies such as Google and Facebook are using AI to build advanced translation models for their services. Now, a third-year engineering student in India has created an AI model that can detect American Sign Language (ASL) and translate it into English in real time.

Indian student develops AI-based ASL detector

Priyanjali Gupta, a student at the Vellore Institute of Technology (VIT), shared a video on her LinkedIn profile demonstrating how her AI-based ASL detector works. The AI model can detect sign language in real time and translate it into English, although it currently supports only a handful of words and phrases: Hello, Please, Thank you, I love you, Yes, and No.

Gupta created the model by leveraging the TensorFlow Object Detection API and using transfer learning with a pre-trained model called ssd_mobilenet. In other words, she was able to reuse existing code and pre-trained weights and adapt them to the ASL detector. It is also worth noting that the AI model does not literally translate ASL into English. Instead, it performs object detection: it identifies the object in the frame (in this case, the hand sign) and matches it against the signs it was trained on, outputting the most similar label.
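To give a rough idea of how such a detector is typically wired together, here is a minimal real-time inference loop sketched with the TensorFlow Object Detection API. It assumes a fine-tuned ssd_mobilenet checkpoint, a pipeline.config, and a label map for the six signs have already been produced; the file paths and checkpoint name below are placeholders for illustration, not Gupta's actual files.

```python
# Minimal real-time sign detection loop using the TensorFlow Object Detection API.
# All paths below are hypothetical placeholders; the real project's files will differ.
import cv2
import numpy as np
import tensorflow as tf
from object_detection.builders import model_builder
from object_detection.utils import config_util, label_map_util
from object_detection.utils import visualization_utils as viz_utils

PIPELINE_CONFIG = "models/ssd_mobilenet_asl/pipeline.config"  # placeholder path
CHECKPOINT = "models/ssd_mobilenet_asl/ckpt-6"                # placeholder path
LABEL_MAP = "annotations/label_map.pbtxt"                     # Hello, Please, Thank you, I love you, Yes, No

# Rebuild the detection model from its pipeline config and restore the fine-tuned weights.
configs = config_util.get_configs_from_pipeline_file(PIPELINE_CONFIG)
detection_model = model_builder.build(model_config=configs["model"], is_training=False)
ckpt = tf.compat.v2.train.Checkpoint(model=detection_model)
ckpt.restore(CHECKPOINT).expect_partial()

category_index = label_map_util.create_category_index_from_labelmap(LABEL_MAP)

@tf.function
def detect(image):
    # Preprocess the frame, run the detector, and post-process into boxes/classes/scores.
    image, shapes = detection_model.preprocess(image)
    prediction_dict = detection_model.predict(image, shapes)
    return detection_model.postprocess(prediction_dict, shapes)

cap = cv2.VideoCapture(0)  # webcam feed
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    input_tensor = tf.convert_to_tensor(np.expand_dims(frame, 0), dtype=tf.float32)
    detections = detect(input_tensor)

    boxes = detections["detection_boxes"][0].numpy()
    classes = detections["detection_classes"][0].numpy().astype(np.int64) + 1  # label map ids start at 1
    scores = detections["detection_scores"][0].numpy()

    # Draw the best-matching sign label onto the frame.
    viz_utils.visualize_boxes_and_labels_on_image_array(
        frame, boxes, classes, scores, category_index,
        use_normalized_coordinates=True, min_score_thresh=0.6)

    cv2.imshow("ASL detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The transfer-learning part happens before this loop: the ssd_mobilenet model ships with weights pre-trained on a large general-purpose dataset, and only a small amount of labelled sign images is needed to fine-tune it to the six supported phrases.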

In an interview with Interesting Engineering, Gupta said her biggest inspiration for creating the AI model was her mother, who persistently urged her to “do something” after she enrolled in the engineering course at VIT. “She made fun of me, but it made me think about what I could do with my knowledge and skill set. One sunny day, in a conversation with Alexa, the idea of inclusive technology came to me. That was the catalyst for a series of plans,” she told the publication.

Gupta also credited a 2020 YouTube video by data scientist Nicholas Renotte, which details how to build an AI-based ASL detector.

Gupta’s post on LinkedIn has received plenty of positive reactions and praise from the community, but one AI vision engineer commented that the transfer-learning method used in her model amounts to “training by other experts” and is “the easiest thing to do with AI.” Gupta acknowledged the point and wrote that building “a deep learning model just to detect signs is a very difficult problem, but not impossible.”

“Currently I’m an amateur student, but I’m learning. Sooner or later, I believe the open-source community, which is much more experienced and learned, can find solutions and create deep learning models specifically for sign language,” she added.

Check out Priyanjali’s GitHub page to learn more about the AI model and access the resources related to the project. Also, let us know your thoughts on Gupta’s ASL detector in the comments below.