Facebook AI Against Violence and Terrorism

After intense criticism from leaders of European countries, but also from its own community, Facebook is preparing artificial intelligence tools to combat terrorist messaging and generally curtail such propaganda on its platform.

AI technology will not only be applied to detect Islamist extremist groups such as ISIS, but to anyone who promotes violent behavior or engages in acts of violence. A broader definition of terrorism could include violent gang activity, drug trafficking, or white nationalists who support violence.

Facebook is currently testing language processing and analytics, adding functions that identify abusive posts based on the words used by accounts that have already been suspended.

"Today we are experimenting with text analysis that we have already removed to identify terrorist organizations such as ISIS or Al Qaeda. "We're trying to identify text messages using an algorithm that is still in its infancy," said Monika Bickert, director of global policy management and Brian Fishman, director of counterterrorism policy, in a Facebook blog post. .

The social network highlighted a number of initiatives it has already taken to control extremist content, such as using image-matching technology to recognize photos and videos related to terrorists, deleting fake accounts from Facebook, and seeking advice from a panel of counter-terrorism experts.
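Image matching of this kind generally works by hashing known extremist images and checking new uploads against that set. The sketch below uses a plain cryptographic hash purely for illustration; Facebook's actual matching technology is not public, and a production system would use a perceptual hash so that resized or re-encoded copies still match.

```python
# Illustrative sketch: match uploads against a set of known banned images.
# SHA-256 only catches byte-identical copies; a real system would use
# perceptual hashing so near-duplicates are also detected.
import hashlib
from pathlib import Path

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

# Hypothetical blocklist of hashes of previously removed images.
known_banned_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_banned(path: str) -> bool:
    """Check whether an uploaded image is an exact copy of a banned one."""
    return file_hash(path) in known_banned_hashes
```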

Earlier this month, after the terrorist attack in London that killed seven people, British Prime Minister Theresa May called on nations to create international agreements to "regulate cyberspace, to prevent the spread of extremist and terrorist planning," and called on Facebook to take immediate action. Theresa May partly blamed the social network for the crime, because its platforms give terrorists a "safe space" that helps them function.

In March, a German minister said that companies like Facebook could face fines of up to $53 million if they did not curb hate speech.

Shortly after these events, Facebook's policy manager, Simon Milner, said the company would make its social network a "hostile environment" for terrorists.

Last month, Mark Zuckerberg said his company plans to hire 3,000 extra people for the Facebook Community Operations team, on top of the 4,500 it currently employs. The team evaluates content that is potentially inappropriate or violates Facebook's policy.

With this latest blog post, Facebook is asking for help from anyone who thinks they can contribute. So if you have ideas on how to stop the spread of online terrorism, you can write to: hardquestions@fb.com.


Written by giorgos

George still wonders what he's doing here ...
