Facebook is reportedly testing changes to its News Feed algorithm. The tests are initially limited to English-speaking users of the social network. The company is adding three submenus to the Feed menu to manage what appears in the News Feed: friends and family, groups and pages, and public figures.
Participants in the tests can do the same for the topics that appear, marking subjects they find interesting or prefer not to see. In a post on the company's blog, Facebook says the trials will affect "a small percentage of people" around the world before gradually expanding in the coming weeks.
Facebook will also expand a tool that lets advertisers keep their ads away from specific topics in users' News Feeds, allowing companies to avoid appearing next to "news and politics," "social issues," and "crime and tragedy."
Facebook's algorithms are known for promoting incendiary content and dangerous misinformation. As a result, Facebook — and its new parent company Meta — is under increasing regulatory pressure to clean up the platform and make its practices more transparent.
As Congress weighs solutions that could give users more control over what they see and strip away some of the opacity around algorithmic content, Facebook hopes there is still room for self-regulation.
Only one question remains to be answered: Do you trust Mark?