Facebook AI Against Violence and Terrorism

Following intense criticism from both EU leaders and its own community, Facebook is preparing artificial intelligence tools to detect communication between terrorists and, more broadly, to curb propaganda on its platform.

The AI technology will be applied not only to identify Islamist extremist groups such as ISIS, but to any group that promotes or carries out acts of violence. A wider definition of terrorism could include violent gang activity, drug trafficking, or white nationalists who advocate violence.

Facebook is currently testing natural language processing that flags violent posts based on the wording used in content from previously suspended accounts.

"Today we are experimenting with text analysis that we have already removed to identify terrorist organizations such as ISIS or Al Qaeda. "We're trying to identify text messages using an algorithm that is still in its infancy," said Monika Bickert, director of global policy management and Brian Fishman, director of counterterrorism policy, in a Facebook blog post. .

The post outlined a number of initiatives the company has already taken to control extremist content, such as using image matching to identify terrorist-related photos and videos, removing fake accounts from Facebook, and seeking advice from a panel of counter-terrorism experts.
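The blog post does not spell out how the photo and video matching works. A common approach, sketched here only as an assumption, is to compare a fingerprint (hash) of each upload against a database of fingerprints of previously removed images; the hash set below is invented, and a plain SHA-256 digest only catches exact copies, whereas production systems typically use perceptual hashes that survive resizing and re-encoding.

```python
# Illustrative sketch of hash-based image matching; the hash set is invented.
# A plain SHA-256 digest only matches byte-identical re-uploads.
import hashlib

# Hypothetical database of fingerprints of previously removed images.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest used as the file's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def is_known_bad(upload: bytes) -> bool:
    """Check an uploaded file against the database of removed content."""
    return fingerprint(upload) in KNOWN_BAD_HASHES

# Example: an exact re-upload of already-removed content would match.
print(is_known_bad(b"test"))  # True, since sha256(b"test") is in the set above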

Earlier this month, following the terrorist attack in London that killed seven people, British Prime Minister Theresa May called on nations to reach international agreements to "regulate cyberspace, to prevent the spread of extremist and terrorist planning". Urging Facebook to take immediate action, May partly blamed the social network for the crime, saying its platform gives terrorists a "safe place" that helps them operate.

In March, a German minister said companies like Facebook and Google could face fines of up to $53 million if they do not curb hateful posts.

Shortly after May's statement, Facebook's director of policy, Simon Milner, said the company will make its social network a "hostile environment" for terrorists.

Last month, Mark Zuckerberg said the company plans to hire 3,000 additional people for its Community Operations team, on top of the 4,500 it already employs. The team reviews content that is potentially inappropriate or violates Facebook's policies.

With its latest blog post, Facebook is also asking for help from anyone who thinks they can contribute. If you have ideas on how to stop the spread of online terrorism, you can write to: hardquestions@fb.com.
