Facebook is reportedly testing changes to its News Feed algorithm. The tests are initially limited to English-speaking users of the social network. The company is adding three submenus to the Facebook menu for managing what appears in the News Feed: friends and family, groups and pages, and public figures.
Users in the test can keep the proportion of these posts in their News Feed at "normal" or adjust it to more or less, depending on their preferences.
Participants in the tests can do the same for the topics that appear, flagging subjects they are interested in or prefer not to see. In a post on the company's blog, Facebook says the trials will affect "a small percentage of people" around the world before gradually expanding in the coming weeks.
Facebook will also expand a tool that lets advertisers exclude their content from appearing alongside specific topics in users' News Feeds, allowing advertisers to avoid placement next to "news and politics," "social issues," and "crime and tragedy."
Facebook's algorithms are known for promoting incendiary content and dangerous misinformation. As a result, Facebook and its new parent company Meta are under increasing regulatory pressure to clean up the platform and make its practices more transparent.
As Congress looks at solutions that could give users more control over what they see and eliminate some of the opacity around algorithmic content, Facebook hopes there is still room for its own self-regulation.
Last month, former Facebook employee Frances Haugen testified before Congress about the ways Facebook's opaque algorithms can prove dangerous.
Only one question remains: Do you trust Mark?