Anyone who wants to do a quick translation is probably happy to fall back on technological aids once in a while: Google Translate, Linguee and DeepL are widely known by now. But machine translation can hinder linguistic progress and the successful establishment of non-discriminatory language. Translation programmes draw on already existing texts – and these are far from free of discrimination. That’s why we’re excited that, with the help of the Prototype Fund, our macht.sprache. project will be able to develop an integration with existing translation websites to support gender-sensitive translation.
When translating “100 translators” into German, there are several options. Even if 99 of those translators are women and one is a man, I can choose between the following: “die Übersetzer” (generic masculine), “die Übersetzer und Übersetzerinnen” (double naming) or variants with asterisk, underscore or colon, e.g. “die Übersetzer*innen” – just to name a few. One thing is clear: when translating texts from English into German, gender is an issue that should definitely be considered consciously.
At first sight, the English language appears to be inclusive. It has only one article for all genders, “the”, and nouns and adjectives lack gender-specific endings. In German, on the other hand, there are three articles that automatically imply a gender attribution – “der”, “die”, “das” – and nouns often have feminine or masculine forms, as in the aforementioned example of the person who translates.
However, there are also many cases in English in which a potential bias is inherent in the language, even if it is not grammatically gendered – it is the same bias that exists in the dominant cultural understanding, which in turn shapes linguistic expression. Some people even automatically think of a man when a person is mentioned. In English, “man” and “person” are sometimes used synonymously. This standard, human = man, is only challenged when the human is further specified.
People and machines – how linguistic bias reproduces itself
But even in descriptions of a person, there are culturally developed prejudices with regard to gender. In English, numerous job titles are gender-neutral. Nevertheless, some are clearly interpreted as men’s jobs, e.g. “surgeon”, “scientist”, “actor” or “lawyer”, and they are translated as such: “Chirurg”, “Wissenschaftler”, “Schauspieler” or “Rechtsanwalt”. Other professions, on the other hand, directly evoke a connection to women: “nurse”, “prostitute”, “homemaker”. In the German translation, these tend to become explicitly female – “Krankenschwester”, “Prostituierte” and “Hausfrau” – even though gender-neutral or gender-inclusive variants exist.
Since a majority still seems to believe that certain professions can only be done by certain genders, machine translations, which many people like to use for multilingual social media posts, event announcements or blog posts, reproduce this bias. If most of society does not use gender-inclusive or gender-neutral language, and machines learn from the data that is available to them, they are a reflection of social norms. But users of translation programmes could – and should – double-check whether the translation reflects what they want to say and how they want to say it, especially with regard to gender. Users could pay particular attention to nouns that label people.
In her book Wordslut: A Feminist Guide to Taking Back the English Language, Amanda Montell explains that adjectives can also be implicitly gendered – at least culturally, not grammatically. Montell refers to data from sociolinguist Eliza Scruton, whose study revealed that terms like “nasty” or “bossy” appear most often before “mother” or “wife” (we don’t even need to talk about “bitchy” here). I just typed “bossy” into the online dictionary Linguee and, as an example of common usage, it suggested: “My mother has a dominant and bossy personality”.
In a patriarchal society, terms that label people living in that society are often equally patriarchal. Everybody who works with language – which includes translators – has the choice to make conscious decisions for or against certain terms. In most cases, several words with similar meanings exist. Therefore, choosing a term that has no (or not such strong) gendered connotations is possible. “Stubborn” is quite a suitable alternative for “bitchy”. But of course bitchy can also be used deliberately: In my view, it’s a particularly good choice precisely when it is used subversively as a feminist reappropriation.
Even speaking about animals reveals a lot about gender norms
When translating, thinking about gender is not only important with regard to human beings. Linguist Deborah Cameron explains that English speakers usually refer to animals as “he”, even when the animal is visibly female, like a lioness. To test this point, Cameron suggests going to a zoo and listening to the conversations between adults and children. There are authors who consciously distance themselves from the default of using male pronouns. For example, in her book World of Wonders: In Praise of Fireflies, Whale Sharks, and Other Astonishments, Aimee Nezhukumatathil offers anecdotes about various animals and repeatedly refers to them as “she”: “But at the last possible moment before I thought she would crash into me, the whale shark sank just low enough not to touch me at all.” Should this sentence be translated into German, the problem arises that the whale shark is grammatically masculine – “der Walhai”. The other option would be “die Walhaikuh”. The same question arises when it comes to the cactus wren, the cassowary or the cardinal. They are all grammatically masculine in German: “der Kaktuszaunkönig”, “der Helmkasuar”, “der Kardinal”.
These examples illustrate that translation needs to consider the structures and the bias that are inscribed in language. Thinking about gender plays a central role here, because gender norms permeate every aspect of human life and thought. Which gender is ascribed to an animal reveals a lot about socially prevailing ideas of femininity and masculinity – or why else is the lion “der Löwe” (masculine) in German, but the mouse “die Maus” (feminine)?
Gender-sensitive machine translation
Examples like the ones just mentioned appear in the texts and translations that are used to train machine translation models. This means that programs like Google Translate or DeepL are taught gender bias. Trained on such datasets, they automatically reproduce the bias – currently, they can’t help it. Users of these translation tools may then simply copy the translated texts as they are and spread the bias further. This is where macht.sprache. wants to intervene: with the help of the Prototype Fund, the macht.sprache. team is developing a text checker that will integrate with existing translation websites and support gender-sensitive translation. The text checker will recognize and highlight potentially sensitive terms and their translations. It will also display some information about the terms in question to allow users to choose their translation with sensitivity. The tool encourages users to use machine translation, to go through the result and to adapt it where needed. With the macht.sprache. integration, users of common translation programmes receive the necessary support. In this way, macht.sprache. fosters awareness and helps users’ texts become more inclusive.
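The core step of such a text checker – recognizing sensitive terms in a translated text and attaching notes and alternatives to them – can be sketched very roughly. This is a minimal illustration only, not the macht.sprache. team’s actual implementation: the term list, the notes and the suggested alternatives below are invented for the example.

```python
# Minimal sketch of a sensitive-term checker, assuming a hand-curated lexicon.
# The entries here are illustrative, not macht.sprache.'s real data.
import re

SENSITIVE_TERMS = {
    "Krankenschwester": {
        "note": "explicitly female; gender-neutral options exist",
        "alternatives": ["Pflegekraft", "Pflegefachperson"],
    },
    "Hausfrau": {
        "note": "explicitly female; gender-neutral options exist",
        "alternatives": ["hausarbeitende Person"],
    },
}

def check_translation(text):
    """Find sensitive terms in a translated text.

    Returns a list of findings (term, position, note, alternatives),
    sorted by where each term occurs, so a UI could highlight them in order.
    """
    findings = []
    for term, info in SENSITIVE_TERMS.items():
        # Whole-word matching so e.g. a longer compound is not partially matched.
        for match in re.finditer(r"\b" + re.escape(term) + r"\b", text):
            findings.append({
                "term": term,
                "position": match.start(),
                "note": info["note"],
                "alternatives": info["alternatives"],
            })
    return sorted(findings, key=lambda f: f["position"])

if __name__ == "__main__":
    translated = "Die Krankenschwester sprach mit der Hausfrau."
    for f in check_translation(translated):
        print(f"{f['term']} at {f['position']}: {f['note']} "
              f"(alternatives: {', '.join(f['alternatives'])})")
```

A real checker would of course need a much richer lexicon, handle inflected forms and compounds, and integrate with the translation website’s interface – but the principle of matching, highlighting and offering context stays the same.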