
Facebook Funds NLP Research 


In an effort to improve machine translation of non-English languages, Facebook has announced research award winners who will also investigate ways to improve natural language processing and to deploy better-trained NLP models on edge devices.

Facebook (NASDAQ: FB) said it is awarding research grants covering neural machine translation for “low-resource languages” and computationally efficient NLP, as well as greater use of deep learning to make NLP models more robust.

The NLP initiative, launched in April, attracted 115 proposals from around the world. Facebook announced this week that it has awarded 11 NLP research grants spanning neural machine translation and the use of emerging technologies to translate different languages into English.

For language translation, current models lack sufficient training data, which leads to poor translations. Hence, Facebook awarded three grants to researchers at Johns Hopkins University, the Swiss École Polytechnique Fédérale de Lausanne (EPFL) and the Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI), a German AI research center.

The Swiss NLP project will combine text and graphs to come up with what researchers call “better cross-lingual embeddings.”
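
Cross-lingual embeddings place words from different languages in a shared vector space so that translations land near one another. As a rough, hypothetical illustration of the general idea (not the EPFL team's text-and-graph method), the Python sketch below aligns two synthetic monolingual embedding spaces with an orthogonal Procrustes mapping, a standard baseline technique; the data is made up for demonstration.

```python
# Illustrative only: synthetic data standing in for two monolingual embedding tables.
import numpy as np

rng = np.random.default_rng(0)
n_pairs, dim = 1000, 50

# X holds source-language vectors, Y the target-language vectors for a seed
# bilingual dictionary: row i of X is assumed to translate to row i of Y.
X = rng.standard_normal((n_pairs, dim))
true_map = np.linalg.qr(rng.standard_normal((dim, dim)))[0]  # hidden "ground truth" rotation
Y = X @ true_map + 0.01 * rng.standard_normal((n_pairs, dim))

# Orthogonal Procrustes: the rotation W minimizing ||X W - Y|| is U V^T,
# where U S V^T is the SVD of X^T Y.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# Map source vectors into the target space; nearest neighbors there would give translations.
aligned = X @ W
rel_err = np.linalg.norm(aligned - Y) / np.linalg.norm(Y)
print(f"relative alignment error: {rel_err:.4f}")
```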

A Johns Hopkins researcher will focus on using visualization techniques to bolster neural machine translation.

The German effort will seek to develop a “self-supervised” neural machine translation platform, Facebook said.

Other projects funded under the NLP initiative target computational advances, including model compression techniques and so-called “sparse” and modular models that are viewed as more efficient. Those efforts would also focus on machine translation models and on deploying NLP technology on devices such as Amazon’s Alexa and Apple’s Siri.
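
Model compression is what makes that kind of on-device deployment practical: smaller weights mean less memory and faster inference at a modest accuracy cost. As a hedged illustration of one common technique (not any grantee's actual work), the sketch below applies PyTorch's dynamic int8 quantization to a made-up text classifier.

```python
# Illustrative only: a toy text classifier compressed with dynamic quantization.
import torch
import torch.nn as nn

class TinyTextClassifier(nn.Module):
    """Stand-in for an NLP model: embedding + LSTM + linear classifier."""
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)          # (batch, seq, embed_dim)
        _, (hidden, _) = self.lstm(x)      # final hidden state
        return self.fc(hidden[-1])         # (batch, num_classes)

model = TinyTextClassifier()

# Convert LSTM and Linear weights to int8; activations are quantized on the fly.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.LSTM, nn.Linear}, dtype=torch.qint8
)

# The quantized model is a drop-in replacement with a smaller footprint.
tokens = torch.randint(0, 10000, (1, 32))
print(quantized(tokens).shape)  # torch.Size([1, 2])
```

Dynamic quantization converts only the weights of the selected layer types to 8-bit integers, which typically shrinks recurrent and linear layers by roughly a factor of four with minimal code changes.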

The third Facebook grant category, “Robust Deep Learning” for NLP, would among other things focus on new modeling approaches and learning methods aimed at improving neural network generalization capabilities. That approach promises to improve the way models handle linguistic subtleties ranging from colloquialisms to “lexical choice variation.”

Facebook said it would use the research results for applications ranging from improving customer service and targeted advertising to boosting content delivery via social media.

 

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
