Dodging AI Biases in Machine Translation – Slator


Artificial Intelligence is changing how we work. As businesses around the world realize the promise of AI, they will also have to adapt to meet the unique and complex challenges this technology poses.

While conventional software is used as a tool by workers to increase productivity, AI works in the loop alongside humans. Although we are only experiencing the early days of this new kind of working relationship, in 2020 our AI tech is already starting to feel more like a work colleague than a software tool.

As we move forward, it is easy to imagine collaborative teams working with hybrid cognitive models, centered around both human and artificial intelligence.


This new co-worker comes with a lot of benefits and skills. For customized NMT solutions, we already see up to 70% of machine translation output accepted by professional linguists and up to 97% by end users. But there are also some new risks.

AI and Company Culture

We all want to work in a welcoming, diverse, and inclusive environment; we all know how essential such an environment is to a company’s success.

But how do we make sure that our AI co-workers avoid harmful biases? It is highly unlikely that we will be able to achieve our business goals if only part of the team aligns their values with the company culture. We have to ensure that our machine translation is working towards a more positive and productive global working environment rather than against it. Every part of your team has to meet the best standards, even your AI.


What if the text your international colleague reads via machine translation comes across as sexist?

What if your message, translated inside a corporate messenger, strikes the wrong tone of voice?

What we want to focus on here are the specific challenges that arise when a global company applies machine translation to facilitate international, multilingual communication, whether with its customer base or internally between employees. Both are vulnerable to disruption by biased translation.

Gender Bias

State-of-the-art research into gender bias in machine translation (see the WinoMT challenge) uses sentences that linguistically imply a feminine form of traditionally masculine words, typically professions. After translation, the researchers check whether the engine's bias was strong enough to override those linguistic cues.


We found that in practical day-to-day translation, another form of gender bias is actually more consequential: short phrases that lack the context to properly define gender, such as “Could you help me?” After machine translation, such phrases may come out either feminine or masculine, depending on the engine's bias.

In order to investigate this further, we tested 31 segments of customer support dialogue (for this example, English to French) on all popular stock NMT engines, and saw that 90–95% of the translations defaulted to masculine.
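To make this kind of measurement concrete, here is a hypothetical sketch of how such a tally could be computed: it classifies French MT output of gender-ambiguous English phrases by looking for feminine or masculine word forms from a small lexicon. The phrases, engine outputs, and lexicon below are invented for illustration and are not from our actual test set.

```python
# Hypothetical sketch: estimate default-gender bias by classifying French
# MT output of gender-ambiguous English phrases. The lexicon of masculine /
# feminine adjective and participle pairs is a tiny illustrative sample.

MASC_TO_FEM = {"satisfait": "satisfaite", "certain": "certaine", "prêt": "prête"}

def marked_gender(fr_text: str) -> str:
    """Return which gender, if any, the French sentence is marked for."""
    tokens = fr_text.lower().rstrip(".!? ").split()
    for masc, fem in MASC_TO_FEM.items():
        if fem in tokens:       # check feminine form first
            return "feminine"
        if masc in tokens:
            return "masculine"
    return "unmarked"

# Illustrative stock-engine outputs for phrases like "I am satisfied"
outputs = ["Je suis satisfait.", "Je suis satisfaite.", "Je suis prêt."]
counts = {}
for o in outputs:
    g = marked_gender(o)
    counts[g] = counts.get(g, 0) + 1
print(counts)  # → {'masculine': 2, 'feminine': 1}
```

A real evaluation would need morphological analysis rather than a word list, since gender agreement in French surfaces on many word classes.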

You can attempt to tackle gender bias in several ways.

One is to copy and paste your text into the Google Translate web app, which supports gender selection for certain languages. Sadly for our experiment, French is not among them. This option is also unavailable in the commercial Google Translate API, so one would have to copy and paste text into the unsecured free web application. Not ideal.


Another option is to instruct operators to build long phrases with enough gender content to ensure correct translation. However, you can hardly require employees and users to do this.

We implement the third option in the Intento MT Hub. Simple text edits often fail here because of inflection and general language complexity. However, it is possible to use the available metadata to inject context into the text to be translated and then perform automated post-editing. This way, you can request either a masculine or a feminine translation. It works well for most MT engines, but we are still working to get it closer to 100% accuracy.
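The context-injection idea can be sketched roughly as follows, with a stubbed MT engine standing in for a real API call. The framing sentence, the guillemet-based post-edit, and the lookup table are hypothetical simplifications; a production pipeline would be considerably more involved.

```python
# Sketch of gender control via context injection plus automated post-editing.
# mock_translate() is a stand-in for an NMT engine; a real engine would pick
# gender agreement based on the injected "She said:" / "He said:" frame.

def mock_translate(en_text: str) -> str:
    # Illustrative fixed outputs in place of a real MT API call.
    table = {
        'She said: "I am satisfied."': 'Elle a dit : « Je suis satisfaite. »',
        'He said: "I am satisfied."': 'Il a dit : « Je suis satisfait. »',
    }
    return table[en_text]

def translate_with_gender(en_text: str, gender: str) -> str:
    # 1) inject a gendered frame around the ambiguous source text
    frame = 'She said: "' if gender == "feminine" else 'He said: "'
    translated = mock_translate(frame + en_text + '"')
    # 2) automated post-edit: strip the injected frame from the output
    return translated.split("«", 1)[1].rsplit("»", 1)[0].strip()

print(translate_with_gender("I am satisfied.", "feminine"))
# → Je suis satisfaite.
```

The fragile part in practice is step 2: reliably removing the injected context from inflected output is exactly why simple search-and-replace is not enough.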

Tone of Voice Bias

If you are a native English speaker, you probably do not spend much time thinking of honorifics — most English speakers have not for the last 500-plus years. For others, the dual use of formal and informal language is the norm.

This creates a further problem for most MT engines, which are inconsistent when it comes to the formal/informal divide. This can produce disastrous results when translating dialogue.


To investigate the tone bias, we again tested segments from customer support dialogue, this time with 210 segments (English to German) on all popular stock NMT engines. Here, we observed a heavy bias defaulting to the formal tone in 60–70% of phrases.
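As a rough illustration of how such a tally could be made, the following sketch classifies German MT output as formal or informal by looking for address pronouns. The sample sentences and the heuristic, including skipping the ambiguous sentence-initial “Sie”, are illustrative only; real detection needs proper linguistic analysis.

```python
# Rough heuristic: tally formal vs. informal address in German MT output.
# Caveat: sentence-initial "Sie" is ambiguous with "sie" (she/they), so the
# first token is skipped for the formal check.

FORMAL = {"Sie", "Ihnen", "Ihr", "Ihre"}
INFORMAL = {"du", "dir", "dich", "dein", "deine"}

def register_of(de_text: str) -> str:
    tokens = de_text.replace("?", " ").replace(".", " ").split()
    if any(t in FORMAL for t in tokens[1:]):
        return "formal"
    if any(t.lower() in INFORMAL for t in tokens):
        return "informal"
    return "unknown"

# Illustrative stock-engine outputs, not the actual 210-segment test set
samples = ["Können Sie mir helfen?", "Kannst du mir helfen?", "Wie geht es Ihnen?"]
tally = {}
for s in samples:
    r = register_of(s)
    tally[r] = tally.get(r, 0) + 1
print(tally)  # → {'formal': 2, 'informal': 1}
```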

One option for solving tone bias is to go with DeepL, which has a native tone-of-voice control feature. It is quite accurate, with about 99.5% of translations coming out informal when requested. However, what if you need a custom model and terminology, or another MT system works better for your content?

As with gender bias, tone-of-voice features typically go hand in hand with inflections, rendering any search and replace techniques useless. We have added some MT-agnostic NLP in our tools, which enables tone-of-voice control and provides a wider choice of MT engines for such cases.


As businesses become more globalized, machine translation is becoming a ubiquitous part of company culture. Biased translation may hinder efforts to build a positive and productive work environment for your company. To succeed, your AI companion needs to actively fight bias, and we need to hold our AI to the same standards we would any human co-worker.

The Covid pandemic has accelerated emerging digital trends. We are focusing our efforts on resolving bias issues because, as more and more business processes move online, the importance of adequate, personalized, and inclusive machine translation has never been higher.