Facebook is starting to warn some users who may have seen "extremist content" on the social network, the company said on Thursday.
Screenshots posted on Twitter showed a notice asking "Are you worried that someone you know is becoming an extremist?" while another warned, "you may have recently been exposed to harmful extremist content." Both notices included links to get support.
The world's largest social network has long faced pressure from lawmakers and civil rights groups to combat extremism on its platforms, including domestic US movements involved in the January 6 attack on the Capitol.
Hey has anyone had this message pop up on their FB? My friend (who is not an ideologue but hosts lots of competing chatter) got this message twice. He's very disturbed. pic.twitter.com/LjCMjCvZtS — Kira (@RealKiraDavis) July 1, 2021
Facebook said it is running a small test on its main platform as a pilot in the United States. The test is intended to help the company develop a global approach to preventing radicalization through its platform.
"This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or been exposed to extremist content, or may know someone who is at risk," a Facebook spokesperson said.
"We work with NGOs and academic experts in this field and hope to have more to share in the future."
Facebook said the test identifies both users who may have been exposed to extremist content that violated its community rules and users who had previously posted such content themselves.
The company, which has tightened its rules against violent groups and hate speech in recent years, says it proactively removes some content and accounts that violate its rules before other users see them. Other flagged content remains hidden until the company has reviewed and cleared it.