Google Releases New Software To Read Sign Language
Google says it has made it possible for a smartphone to interpret and “read aloud” sign language.
The tech firm has not made an app of its own but has published algorithms which it hopes developers will use to make their own apps.
Google research engineers Valentin Bazarevsky and Fan Zhang said the intention of the freely published technology was to serve as "the basis for sign language understanding". It was built using MediaPipe, Google's open-source framework for processing video and other media.
Google acknowledges this is only a first step. Campaigners point out that an app producing audio from hand signals alone would miss facial expressions and the speed of signing, both of which can change the meaning of what is being signed.
Until now, when tracking hands on video, bending fingers and flicks of the wrist have often hidden other parts of the hand, confusing earlier versions of this kind of software. Google's system instead maps a graph of 21 points across the fingers, palm and back of the hand, making it easier to interpret a hand signal even when the hand and arm twist or two fingers touch.
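The idea of reasoning over a 21-point hand graph can be sketched in a few lines. The landmark indices below (0 for the wrist, 4 for the thumb tip, 8 for the index fingertip, and so on) follow MediaPipe's published hand model; the distance threshold and the `fingertips_touch` helper are illustrative assumptions, not part of Google's released code.

```python
import math

# Landmark indices from MediaPipe's 21-point hand model.
WRIST = 0
THUMB_TIP = 4
INDEX_FINGER_TIP = 8
MIDDLE_FINGER_TIP = 12
RING_FINGER_TIP = 16
PINKY_TIP = 20

def fingertips_touch(landmarks, a, b, threshold=0.05):
    """Return True if two landmarks lie within `threshold` of each other
    in normalised image coordinates -- e.g. thumb and index fingertips
    touching, a case that confused earlier hand trackers."""
    (xa, ya), (xb, yb) = landmarks[a], landmarks[b]
    return math.hypot(xa - xb, ya - yb) < threshold

# A toy set of 21 (x, y) points: thumb and index fingertips nearly overlap.
landmarks = [(0.5, 0.9)] * 21
landmarks[THUMB_TIP] = (0.40, 0.50)
landmarks[INDEX_FINGER_TIP] = (0.41, 0.51)

print(fingertips_touch(landmarks, THUMB_TIP, INDEX_FINGER_TIP))  # True
print(fingertips_touch(landmarks, THUMB_TIP, PINKY_TIP))         # False
```

In the real system the 21 points are predicted per video frame by a neural network; a downstream app would compare such geometric relations over time to recognise individual signs.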