A new study has revealed that automated predictive text suggestions are often gender biased. Uswitch, a comparison and switching service, tested a series of adjectives on smartphones including the Samsung Galaxy S21 and iPhone 12, using the phrase "You're a/an *insert word*" to determine results.
Of the 236 adjectives tested, 72% suggested a gender-biased response overall. On iOS, almost two thirds of words generated a male-biased response.
Samsung Android's algorithm proved slightly more gender neutral, with two thirds of inserted phrases generating gender-neutral outcomes. That's four times more than iOS.
"Quick-witted", "empathetic" and "self-confident", for example, generated a gender-neutral word suggestion on Android, compared to a male, gender-exclusive word suggestion on iOS.
However, both devices are still predicting and putting forward gender-biased phraseology. On both predictive text algorithms, the phrase "You're an intelligent…" prompted "man" as a suggestion for the next word. Neither device suggested any gender-neutral words for adjectives describing intelligence, including "bright," suggesting both machines are perpetuating discriminatory gender stereotypes.
Adjectives associated with STEM skills, including "logical", "decisive" and "assertive", also generated a male-biased response, while words of high praise such as "brilliant" and "committed" were considered male qualities on both software systems. "Athletic" also generated a male-biased response on Android and iOS.
In results that indicate significant unconscious bias, the study revealed that "girl" or "girls" was suggested as often as "woman," and was the predicted word on both devices when adjectives describing weight and appearance, including "chubby" and "skinny," were used in the messaging app.
The adjectives "chunky", "hot" and "ugly" also generated "girl" or "girls" suggestions on iOS.
Of the results, Lu Li, Founder and CEO of Blooming Founders, said, "Language is one of the most powerful means through which gender biases are perpetrated and reproduced."
"In male-dominated industries like tech, women have a harder time being taken seriously compared to their male counterparts. And they are passed over for promotions more often because of words such as 'supportive' or 'nurturing' that are often associated with being female."
The bias is stark not only on our devices, but also in the entrepreneurial world where such innovations are led.
"Female founders only receive 1% of venture capital, which means that the vast majority of innovation is designed and led by men," Li continued.
"Gender-biased predictive text algorithms are another example of what's inherently wrong in the industry. If people with conscious and unconscious biases input biased data, it will result in biased outcomes, which the rest of society will inherit. Having gender-neutral word suggestions is critical to breaking this cycle and undoing the semantic relations and gender stereotypes that are still deeply rooted in our society."
You can read Uswitch's full report on "Predictive Sexism" here.