Real-time Subtitle System Developed

by Asociacion RUVID

Courtesy of Medical Xpress

https://medicalxpress.com/news/2019-01-real-time-subtitle.html

In Spain, more than 1 million people over the age of six have some type and degree of hearing disability; they represent 8 percent of the population, according to a study by the National Confederation of Deaf People (CNSE). Of these, over 97 percent communicate through oral language, according to data from the National Statistics Institute (INE). All of them can have trouble following lectures and other learning activities at university.

The Machine Learning and Language Processing (MLLP) research group at Valencia's Polytechnic University (UPV) has developed an automated, real-time subtitling system called Polisubs, which improves the university's services for people with disabilities by making events and conferences more accessible.

Using this system, users can follow an event by reading the subtitles in a mobile app (iOS and Android) or on a website on their phone. If the event is recorded, the subtitles are automatically added to the video.
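
The article does not say which subtitle format Polisubs uses when adding captions to a recorded video, so the short Python sketch below assumes SRT, a common choice, purely for illustration; the segment tuples and function names are hypothetical, not part of the real system.

```python
# Illustrative only: SRT is assumed as the subtitle format; the article does not specify one.

def to_srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as an SRT timestamp (HH:MM:SS,mmm)."""
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(segments: list[tuple[float, float, str]]) -> str:
    """Render (start_s, end_s, text) segments as an SRT document."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, start=1):
        blocks.append(f"{i}\n{to_srt_timestamp(start)} --> {to_srt_timestamp(end)}\n{text}\n")
    return "\n".join(blocks)

if __name__ == "__main__":
    print(to_srt([
        (0.0, 2.5, "Welcome to the lecture."),
        (2.5, 5.0, "Today we discuss accessibility."),
    ]))
```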

Polisubs captures the ambient sound through a microphone system and sends it to the UPV's central servers, where an artificial intelligence system processes it into a continuous flow of subtitle text. It reaches 97 percent accuracy compared with a human transcriber, although there are still comprehension issues for certain individuals. The system works simultaneously in Spanish, Valencian and English, and is being extended to other languages.
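
To make that pipeline concrete, here is a minimal Python sketch of the microphone-to-subtitles flow described above. Polisubs' actual protocol and interfaces are not described in the article, so every name here (capture_chunks, transcribe_chunk, SubtitleSegment) and the chunk size are illustrative assumptions, not the real implementation.

```python
# Hypothetical sketch of the described flow: microphone audio is streamed in chunks,
# a server-side recognizer turns each chunk into text, and subtitle segments are
# pushed out as a running stream. None of these names come from Polisubs itself.

from dataclasses import dataclass
from typing import Iterator


@dataclass
class SubtitleSegment:
    start_s: float  # segment start, seconds from the start of the event
    end_s: float    # segment end
    text: str       # recognized text for this segment


def capture_chunks(duration_s: float = 6.0, chunk_s: float = 2.0) -> Iterator[bytes]:
    """Stand-in for a microphone feed: yields fixed-size audio chunks."""
    t = 0.0
    while t < duration_s:
        yield b"\x00" * 32000  # placeholder PCM bytes (~2 s at 16 kHz, 16-bit mono)
        t += chunk_s


def transcribe_chunk(chunk: bytes, offset_s: float, chunk_s: float = 2.0) -> SubtitleSegment:
    """Stand-in for the server-side speech recognizer."""
    return SubtitleSegment(offset_s, offset_s + chunk_s, "[recognized text]")


def stream_subtitles() -> None:
    """Drive the pipeline: capture, recognize, and emit subtitle segments in order."""
    offset = 0.0
    for chunk in capture_chunks():
        segment = transcribe_chunk(chunk, offset)
        # In the real system this segment would be pushed to the app and web viewers.
        print(f"{segment.start_s:6.1f}-{segment.end_s:6.1f}  {segment.text}")
        offset = segment.end_s


if __name__ == "__main__":
    stream_subtitles()
```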

The system was presented last November at the IV International Congress on the University and Disability, organised by the ONCE Foundation for the Inclusion and Social Cooperation of People with Disabilities.

Throughout 2018, the UPV conducted implementation tests of Polisubs in the university's main conference halls, with the intention of continuing the deployment in 2019 and eventually making the technology available in all halls and classrooms.