
Facebook Mistakenly Calls Black Men Primates; Google Photos Once Labeled Blacks “Gorillas” 


Facebook has said that it disabled an Artificial Intelligence (AI) topic recommendation feature after it mistakenly called black men primates.

Primates are members of the group of mammals which includes humans, monkeys, and apes.

Meanwhile, Google has issued similarly offensive labels in the past, when its Photos app tagged Black people as gorillas.

However, on Friday, Facebook apologized, particularly to Black men, for what it described as a “clearly unacceptable error.”

The Offensive Label against the Black Community:

Facebook tailors content to users based on their past browsing and viewing habits.

So, sometimes, it asks people if they would like to continue seeing posts under related categories.

Consequently, Facebook users who recently watched a British tabloid video featuring Black men got an auto-generated prompt asking if they would like to “keep seeing videos about Primates.”


The offensive prompt appeared on a video which The Daily Mail posted on June 27, 2020.

The title of the video clip is: “white man calls cops on black men at marina.”

Although it featured Black men in disputes with white police officers and civilians, it had no connection to primates.

Similarly, in 2015, Google’s algorithm reportedly tagged two Black people’s faces with the word “gorilla.”

The company later said in a statement to The Wall Street Journal that it was “genuinely sorry that this happened.”

People React:

Meanwhile, a former Facebook content design manager, Darci Groves, shared a screen capture of the offensive recommendation on Twitter.

Groves tweeted saying, “This ‘keep seeing’ prompt is unacceptable.”

She described the prompt as “horrifying and egregious.”

Similarly, in a 2019 Al Jazeera piece, journalist David A. Love said Facebook is willingly “enabling hate groups.”

He also accused the social media giant of willingly enabling “white nationalists and far-right extremists.”

Facebook, Google Apologize:

Reacting to this, Facebook apologized in a response to an AFP inquiry.

According to the platform, “We apologize to anyone who may have seen these offensive recommendations.”

“We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again.”

Facial recognition software has been blasted by civil rights advocates, who point out problems with accuracy, particularly when it comes to people who are not white.

This claim is supported by studies which have shown that facial recognition technology is biased against people of color.

The studies also found that facial recognition systems have more trouble identifying Black people in particular.

Such bias may help explain why Facebook’s AI described Black men as primates.

In 2020, Twitter also got some backlash for racially discriminating against Black people.

Twitter’s automated cropping tool, which selects which part of a picture to preview in tweets, was found to be racially biased against Black people.

However, the micro-blogging site said it investigated the claim, and in August it announced a $3,500 reward for people who help fix the issue.

Implications:

Incidents like this, if not well addressed, could lead to lawsuits and penalties against Facebook or other tech platforms.

Also, Black users of these platforms may begin to abandon them over racial discrimination.

Meanwhile, tech platforms must put an end to the frequent racial discrimination against people because of the colour of their skin.
