Google is afraid of assuming your gender with Gmail’s Smart Compose feature

Instead of building a better AI system or giving users a choice of suggestions, Google is removing gendered pronouns from Gmail’s Smart Compose suggestion tool over fears of backlash from easily offended users.

Earlier this year Google introduced an AI-assisted autocomplete feature in Gmail that is meant to make writing emails faster and easier. Apparently political correctness is getting in the way of improving the AI engine. Instead of analyzing profile data, names, photos, and other information passed through Gmail, Google is removing gendered pronouns so as not to offend users.

Google will no longer use words such as “her” or “him” in autocomplete suggestions. According to Gmail product manager Paul Lambert, the issue was raised by a Google employee in January of this year. When the employee wrote about meeting with an investor, the AI engine suggested “Do you want to meet him?” as a potential follow-up question, never allowing for the possibility that the investor could be a woman.


Gmail currently has around 1.5 billion users worldwide, and Smart Compose is now used on around 11 percent of emails sent through Gmail’s web interface. Lambert has said that certain mistakes are far more forgivable than others, calling a mistake of gender “a big, big thing.”

According to Lambert, a team of around 15 employees was tasked with removing any form of bias from Smart Compose, but it ultimately could not determine user gender with high enough accuracy. Removing the pronouns is said to affect less than one percent of suggestions.
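
To illustrate the general idea, here is a minimal sketch of the kind of filter the article describes, written in Python. It simply suppresses any suggestion containing a gendered pronoun rather than trying to guess the right one; it is not Google’s actual implementation, and the word list and function names are illustrative assumptions.

    # Hypothetical sketch: drop autocomplete suggestions that contain a
    # gendered pronoun instead of trying to infer the recipient's gender.
    GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

    def allow_suggestion(suggestion: str) -> bool:
        """Return False if the suggestion contains a gendered pronoun."""
        words = {w.strip(".,?!").lower() for w in suggestion.split()}
        return words.isdisjoint(GENDERED_PRONOUNS)

    suggestions = [
        "Do you want to meet him?",
        "Do you want to set up a meeting?",
    ]
    print([s for s in suggestions if allow_suggestion(s)])
    # -> ['Do you want to set up a meeting?']

Filtering at the suggestion level like this avoids the much harder problem of detecting gender, which is presumably why it touches only a small fraction of suggestions.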

Google is not alone in taking the easy way out and simply removing functionality from its AI; other tech giants have done the same. On mobile devices, Apple has already removed several suggestions that could imply a gender. Microsoft and Amazon have been offering feminine words in translations whenever there is doubt, and Microsoft has also removed gendered pronouns from LinkedIn’s Smart Replies feature.
