Sign language translation apps are designed to be used quickly and on the spot, much as we use Google Translate. They aim to respond to the everyday needs and situations that deaf people face, which can often cause discomfort or become obstacles.
Imagine ordinary situations such as a bank or post office appointment, a medical visit, or simply a conversation among a group of friends. Thanks to these apps, many such interactions can be streamlined, giving users the autonomy to understand and to be understood.
Human languages are surprisingly complex and diverse. We express ourselves in countless ways, and every nuance matters for understanding the overall meaning of a conversation. Unlike spoken languages, sign languages are perceived visually. Their expressive, visual nature allows users to process several elements simultaneously.
It should also be noted that each sign language, like any other language, differs from country to country.
The technologies behind these apps are varied, ranging from speech recognition and text transcription to computer vision, the technology that enables software to recognize and interpret images.
Over the years, there have been many attempts to use technology to translate sign language, but sadly many have failed, as it is difficult to find a single formula that decodes a complex phenomenon such as human expressiveness.
Let’s look specifically at some projects that have managed to progress along this arduous path.
SignAll Chat
SignAll uses a technology that records visual inputs from the outside world and converts this information into data that can be processed by the computer.
Visual input can be collected through one or more cameras, which can be 2D, 3D, or any combination of the two; the accuracy of the translation is directly related to the quality of the visual input. The images represent signs that can have multiple meanings, so the visually recognized data is then processed against a database, which retrieves all the possible meanings of a sign.
In some cases, a single sign can appear in as many as 100 different meaning combinations, since it can play various syntactic roles within a sentence. By applying a variety of machine translation approaches, the system calculates the probability of each combination and ultimately identifies the three most likely.
These top three results are displayed on the screen, and the user selects the translation that best matches what they signed. The selected phrase is then displayed on the screen for the hearing user to read, or can be spoken aloud using speech synthesis technology.
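The ranking step described above can be sketched in a few lines. The following is a minimal illustration, not SignAll's actual implementation (their database and models are not public): it assumes a toy lexicon mapping each sign to its possible meanings, a toy set of prior probabilities, and an independence assumption that scores a sentence by multiplying the priors of its meanings, then returns the three highest-scoring combinations.

```python
from itertools import product

# Hypothetical sign-to-meanings lexicon (illustrative only).
LEXICON = {
    "BANK": ["bank (money)", "river bank", "to bank on"],
    "OPEN": ["open (verb)", "open (adjective)", "opening hours"],
}

# Toy prior probability for each meaning; a real system would use
# context-dependent machine translation models instead.
PRIORS = {
    "bank (money)": 0.6, "river bank": 0.1, "to bank on": 0.3,
    "open (verb)": 0.5, "open (adjective)": 0.3, "opening hours": 0.2,
}

def rank_candidates(signs, top_n=3):
    """Score every combination of meanings for a sequence of signs
    and return the top_n most probable combinations."""
    scored = []
    for combo in product(*(LEXICON[s] for s in signs)):
        # Naive independence assumption: multiply per-meaning priors.
        score = 1.0
        for meaning in combo:
            score *= PRIORS[meaning]
        scored.append((score, combo))
    scored.sort(reverse=True)  # highest probability first
    return [combo for _, combo in scored[:top_n]]

print(rank_candidates(["BANK", "OPEN"]))
# The most probable combination here is ("bank (money)", "open (verb)")
```

A real translator would also model word order and grammar, since a sign's syntactic role changes its probability; this sketch only captures the "enumerate, score, keep the top three" shape of the pipeline.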
SignAll is the result of collaboration with Gallaudet University (GU), the world’s leading university for the deaf and hard of hearing; at the moment it is only available for translation from ASL (American Sign Language) to English, and vice versa.
Jeenie
Jeenie is an app that connects you in real time with a sign language interpreter who can help the user navigate conversations. It is a platform that, via call or video call, puts the user in contact with a professional who can help them communicate, in any situation and at any time.
It is currently available in 140 countries, and covers the translation of 250 languages (including Italian). Available on any device, it also offers the ability to create video conferences with an online interpreter.
Live Transcribe
Live Transcribe is an app developed by Google for the Android operating system. As the name suggests, it is based on a speech recognition system that transcribes a conversation in real time. The deaf user can thus read on their phone screen what a person is saying to them.
It recognizes 70 languages and dialects, and is able to detect the change of language within a bilingual conversation.
This app was also developed in collaboration with Gallaudet University in Washington D.C.
The importance of translation apps for sign language
Language and the ability to communicate are fundamental rights. Being able to interact with others means participating, gaining independence, and accessing more opportunities. For this reason, accessibility can no longer be a secondary issue, and tools such as these represent important steps toward guaranteeing full autonomy and participation for all.