Can Large Language Models Translate No-Resource Languages?

By: Ana Moirano

In a May 16, 2024 paper, Jared Coleman, Bhaskar Krishnamachari, and Khalil Iskarous from the University of Southern California, along with Ruben Rosales from California State University, introduced a new approach to machine translation (MT) that is “particularly useful” for no-resource languages, which lack publicly available bilingual or monolingual corpora.

This approach, named LLM-RBMT (LLM-Assisted Rule-Based Machine Translation), combines the strengths of large language models (LLMs) and rule-based machine translation (RBMT) techniques.
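The article does not detail an implementation, but the division of labor in such a hybrid can be sketched roughly: an LLM rewrites free-form input into a constrained structure, and a hand-written rule-based component translates only that structure. In the minimal Python sketch below, the simplify_with_llm stub (standing in for a real LLM call), the toy lexicon, and the target-language word forms are all illustrative assumptions, not the authors' system.

    # Illustrative sketch of an LLM-assisted RBMT pipeline (not the authors' code).
    # The LLM only rewrites input into a constrained structure; the rule-based
    # translator never sees free-form text.
    from dataclasses import dataclass

    @dataclass
    class SimpleClause:
        subject: str
        verb: str
        obj: str

    def simplify_with_llm(sentence: str) -> SimpleClause:
        # Stand-in for an LLM call that maps a free-form English sentence
        # to a subject-verb-object clause; a real system would prompt an LLM here.
        # Canned result so the sketch runs without any API access.
        return SimpleClause(subject="dog", verb="see", obj="water")

    # Toy bilingual lexicon with invented target-language forms.
    LEXICON = {"dog": "tuka", "see": "nami", "water": "pawa"}

    def rbmt_translate(clause: SimpleClause) -> str:
        # Rule-based step: dictionary lookup plus a fixed subject-object-verb
        # word-order rule for the hypothetical target language.
        return f"{LEXICON[clause.subject]} {LEXICON[clause.obj]} {LEXICON[clause.verb]}"

    clause = simplify_with_llm("The dog is looking at the water.")
    print(rbmt_translate(clause))  # -> tuka pawa nami

Because every rule and dictionary entry in the rule-based half is authored by hand, that component needs no corpus at all; the LLM only needs competence in the high-resource source language, which is what makes such a hybrid plausible for no-resource targets.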

The researchers highlighted the exceptional capabilities of LLMs in MT but noted their limitations in low-resource or no-resource language scenarios. “There have been many efforts in improving MT for low-resource languages, but no-resource languages have received much less attention,” they said.

Despite the perception of RBMT as a “relic of the past”, the researchers emphasized that research and development of RBMT systems tailored to under-resourced languages is ongoing.

Source: https://slator.com/

Full article: https://slator.com/can-large-language-models-translate-no-resource-languages/
