After intense criticism from leaders of European countries, and from its own community, Facebook is preparing artificial intelligence to combat terrorist messaging and generally curtail such propaganda on its platform.
The AI technology will be applied not only to detect Muslim extremist groups like ISIS, but to any group that promotes violent behavior or engages in acts of violence. A broader definition of terrorism could include violent gang activity, drug trafficking, or white nationalists who support violence.
Facebook is currently testing its language-processing and analytics capabilities by adding functions that identify abusive posts based on the words used by accounts that have already been suspended.
"Today we are experimenting with analyzing text that we have already removed for supporting terrorist organizations such as ISIS or Al Qaeda. We're trying to identify such text using an algorithm that is still in its infancy," said Monika Bickert, director of global policy management, and Brian Fishman, director of counterterrorism policy, in a Facebook blog post.
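The approach described, learning word-level signals from posts that were already removed and using them to flag similar new posts, can be illustrated with a toy sketch. This is only an assumption-laden illustration in Python; Facebook has not published its actual features, models, or code, and the function names and thresholds here are invented for the example.

```python
# Toy sketch of text-based signals learned from already-removed posts.
# This is NOT Facebook's algorithm; it is a minimal illustration of the
# idea described in the blog post: words that recur in removed content
# become weak "signals", and new posts are scored against them.
from collections import Counter
import re

def build_signal_words(removed_posts, min_count=2):
    """Count words across posts that were already removed; words that
    appear at least min_count times become signal words."""
    counts = Counter()
    for post in removed_posts:
        counts.update(re.findall(r"[a-z']+", post.lower()))
    return {word for word, count in counts.items() if count >= min_count}

def flag_score(post, signal_words):
    """Return the fraction of a new post's words that match known
    signal words (0.0 means no overlap, 1.0 means every word matches)."""
    words = re.findall(r"[a-z']+", post.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word in signal_words)
    return hits / len(words)
```

A real system would of course use far richer features than word overlap (the post mentions the algorithm is "still in its infancy"), but the sketch shows why a corpus of removed content is the key training input.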
The social network highlighted a number of initiatives it has already taken to control extremist content, such as using technology to recognize photos and videos linked to terrorism, deleting fake accounts from Facebook, and seeking advice from a panel of counterterrorism experts.
Earlier this month, after the terrorist attack in London that killed seven people, British Prime Minister Theresa May called on nations to create international agreements to "regulate cyberspace, to prevent the spread of extremism and terrorist planning," urging Facebook to take immediate action. May partly blamed the social network for the crime, because its platforms give terrorists a "safe space" that helps them function.
In March, a German minister said that companies like Facebook and Google could face fines of up to $53 million if they did not curb hate speech.
Shortly after the events in May, Facebook's policy manager, Simon Milner, said the company would make its social network a "hostile environment" for terrorists.
Last month, Mark Zuckerberg said his company plans to hire 3,000 extra people for Facebook's Community Operations team, in addition to the 4,500 it currently employs. The team evaluates content that is potentially inappropriate or violates Facebook's policy.
With its latest blog post, Facebook is asking for help from anyone who thinks they can contribute. So if you have ideas on how to stop the spread of online terrorism, you can write to: firstname.lastname@example.org.