Google Apologizes for Offensive “Gorillas” Tag


While facial recognition software is a hallmark of how far social media has come in identifying people, even an industry leader such as Google has proven time and again to be far from perfect. Google was recently accused of racial insensitivity in how it labeled some of its users.

Last Sunday evening, Jacky Alcine was browsing his library in Google Photos when he noticed that he and a friend had been tagged in an album called “Gorillas.” The facial recognition software had applied the label to the pair, both of whom are African-American.

As a computer programmer with a working knowledge of how data is used in social media applications, he took to Twitter to air his grievances: “What kind of sample image data you collected that would result in this?”

Hours later, after the tweets had reached hundreds of people, they caught the attention of Google’s chief architect of social, Yonatan Zunger, who was shocked and replied, “G+ here. No, this is not how you determine someone’s target market. This is 100% Not Okay.” His team promptly scrutinized the data to find a solution, and Zunger followed up with Alcine the next morning for feedback and to make sure he was alright.

Google promptly apologized for the incident through a spokesperson:

“We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

Google has stopped using the label “gorillas” and is working to fix the glitch in its search results. Zunger explained that facial recognition software still struggles with skin tones, lighting, and obscured faces.

Meanwhile, Facebook has been refining its own recognition software, identifying users not only by facial features but also through advanced algorithms that measure other variables such as clothing, hairstyle, pose, and body shape.

Racial stereotyping is a widespread problem that Google would never consciously take part in. Even though the fault lies in the software, Google has accepted responsibility for the incident and is doing all it can to prevent or minimize similar outcomes.
