This Thursday, Meta AI announced a groundbreaking update to its Universal Speech Translator (UST) project, an open-source, real-time speech-to-speech translation system for primarily oral languages.
The UST project has successfully translated Hokkien, a dialect widely spoken within the Chinese diaspora that lacks a standard written form. The UST system enables Hokkien speakers to converse with English speakers through real-time translation, and vice versa.
Meta’s AI researchers built the speech translation system using machine learning (ML) techniques spanning data gathering, model design, and evaluation.
Meta is releasing its ML Hokkien translation data and research papers as open resources, enabling AI developers to build UST projects that cover more languages.
Gathering Low-Resource Data for the Future of Translation
Because Hokkien is primarily an unwritten language, Meta faced significant challenges gathering ML data for a Hokkien translation platform. To compensate, the Menlo Park-based firm leveraged data from related high-resource languages, such as Mandarin, to help build its ML training data.
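Meta has not published this data pipeline as a simple recipe, but the underlying pivot idea can be sketched in a few lines of Python. The translation functions below are placeholder stand-ins, not Meta's actual models, and the example phrases are purely illustrative:

```python
# A minimal sketch of pivot translation through a related high-resource
# language (Mandarin), assuming placeholder translation functions.
def english_to_mandarin(text: str) -> str:
    # Placeholder for a real English->Mandarin machine translation model.
    lookup = {"Hello, how are you?": "你好，你好吗？"}
    return lookup.get(text, text)

def mandarin_to_hokkien(text: str) -> str:
    # Placeholder for a Mandarin->Hokkien step; in practice this leans on
    # the close relationship between the two languages.
    lookup = {"你好，你好吗？": "lí hó, lí hó bô?"}
    return lookup.get(text, text)

def pivot_translate(text: str) -> str:
    """Compose the two steps: English -> Mandarin -> Hokkien."""
    return mandarin_to_hokkien(english_to_mandarin(text))

print(pivot_translate("Hello, how are you?"))
```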
Additionally, Meta is using speech mining techniques to gather suitable translation data without needing written source text. In this process, Meta AI developers use a pre-trained speech encoder that aligns unwritten Hokkien speech with semantically similar English text, enabling an ML system to learn Hokkien translation from pre-existing language data.
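In broad strokes, that mining step pairs speech and text that land close together in a shared embedding space. The sketch below illustrates the idea only; the encoder functions are hypothetical stand-ins for pre-trained models, and random vectors substitute for real embeddings:

```python
# A minimal sketch of embedding-based speech mining, assuming hypothetical
# encoders that map speech and text into one shared vector space.
import numpy as np

def encode_hokkien_speech(clips):
    # Hypothetical pre-trained speech encoder; random vectors stand in
    # for real embeddings, so the output here is not meaningful.
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(clips), 512))

def encode_english_text(sentences):
    # Hypothetical text encoder mapping sentences into the same space.
    rng = np.random.default_rng(1)
    return rng.normal(size=(len(sentences), 512))

def mine_pairs(speech_clips, english_sentences, threshold=0.8):
    """Pair each Hokkien clip with its closest English sentence by cosine
    similarity, keeping only pairs above a similarity threshold."""
    s = encode_hokkien_speech(speech_clips)
    t = encode_english_text(english_sentences)
    s = s / np.linalg.norm(s, axis=1, keepdims=True)
    t = t / np.linalg.norm(t, axis=1, keepdims=True)
    sims = s @ t.T                      # cosine similarity matrix
    best = sims.argmax(axis=1)          # nearest English sentence per clip
    return [(clip, english_sentences[j])
            for i, (clip, j) in enumerate(zip(speech_clips, best))
            if sims[i, j] >= threshold]

# With random stand-in embeddings, a zero threshold just shows the pairing.
pairs = mine_pairs(["clip_001.wav", "clip_002.wav"],
                   ["How much does this cost?", "See you tomorrow."],
                   threshold=0.0)
print(pairs)
```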
Meta notes that its translation system is still a work in progress and can only translate one sentence at a time. However, the firm describes the Hokkien project as a first step towards real-time simultaneous translation between many more languages.
What does this mean for XR?
In its announcement, Meta also noted how its real-time translation research applies to Metaverse services. The firm hopes to encourage connection and mutual understanding through its UST systems, both virtually and in the real world.
Meta is building Horizon, a Metaverse platform accessible through its Meta Quest portfolio of virtual reality (VR) headsets, including the recently announced Quest Pro.
Should Meta integrate its real-time translation systems into a Metaverse platform like Horizon, users could chat with people worldwide while facing far lower language barriers.
Additionally, the Meta Quest Pro comes packed with eye, face, and body tracking features that allow for greater individual expression. Combined with UST integration, Horizon could contain powerful tools to connect individuals digitally.