Facebook has begun testing prompts alerting users to extremist content on its platform.
The tech giant first sent out extremist-warning notification pop-ups to its users in the United States.
Through the alerts, Facebook asked users whether they were concerned that someone they know is becoming an extremist.
The notifications urged users to “get support” if they had been exposed to extremist posts recently.
Screenshots of the pop-ups, shared on Twitter, read as follows:
“Are you concerned that someone you know is becoming an extremist?”
Another one reads, “you may have been exposed to harmful extremist content recently.”
The prompts contain a link to a landing page offering help to users seeking assistance against extremist content.
The alerts work through a Redirect Initiative that sends users to a support page where they can get help.
On the support page, Facebook asks users to “take action now to protect yourself and others” from violent groups.
“Violent groups try to manipulate your anger and disappointment,” it said.
Dave Bondy of NOW media, who also received the alert, posted it and asked whether other users had received it too.
He said, “I am hearing from some of you Facebook has been giving you this message when you click on certain content or individuals. Have you seen this?”
The alert also reads thus:
“Are you concerned that someone you know is becoming an extremist?
“We care about preventing extremism on Facebook.
“Others in your situation have received confidential support.”
The prompt also explained that users can get help by hearing stories from others who escaped extremism.
It said, “Hear stories and get advice from people who escaped violent extremist groups.”
Concerns about the prompt:
Meanwhile, Virginia State Republican politician Nicholas Freitas also shared the alert in a tweet.
However, he raised concerns that the alert could be an avenue to deny people their freedom of expression.
He said, “I have a real concern that some leftist technocrats are creating an Orwellian environment where people are being arbitrarily silenced or banned for saying something the ‘thought police’ doesn’t like.”
Nevertheless, Facebook spokesperson Andy Stone said that, through the alerts, the company is further trying to combat violent extremism.
Facebook said that although it is still testing the alerts, it will also support victims of extremism on the platform.
Aims of the alert:
According to Stone, “This test is part of our larger work to assess ways to provide resources…”
He said the alerts will give “support to people on Facebook who may have engaged with or were exposed to extremist content.”
He also said the initiative will help users offer support to “someone who is at risk” of exposure to extremist content.
Need for the initiative:
Many people have accused Facebook and other social media platforms, such as Twitter, of aiding the spread of radical and extremist agendas.
They have also been accused of being environments for hate speech, cyber-bullying and other anti-social activities.
Nigeria, for instance, banned Twitter for aiding activities capable of undermining the corporate existence of the country.
The Minister of Information and Culture, Alhaji Lai Mohammed, announced the ban.
The Nigerian government said tech giants like Facebook and Twitter are subject to the Finance Act 2019 and the constitution.
Similarly, Facebook has been under pressure from US lawmakers and civil rights groups over extremist content on its platform.
It has also been accused of aiding the spread of fake news and misinformation.
Last year, Facebook was criticized for failing to take down the account of a militant group urging citizens to take up arms and enter the streets of Kenosha, Wisconsin.
US lawmakers and rights groups, however, urged the tech company to combat extremism on its platform.
However, Facebook has maintained that it runs regular checks on its platforms to suspend or remove accounts violating its rules.
Recently, the tech giant banned former US President Donald Trump for two years for allegedly promoting hate speech.
Trump had posted a video saying he loved his “violent” supporters who were protesting his election loss.
Facebook’s Vice President of Global Affairs, Nick Clegg, noted the “gravity of the circumstances that led to Mr. Trump’s suspension.”
He had said, “We believe his actions constituted a severe violation of our rules.”
He said Trump’s actions “merit the highest penalty available under the new enforcement protocols.”
With the extremist alerts, Facebook said it has further tightened its rules against violent groups and promoters of hate speech.