Australia and Facebook are working together on a new experimental measure against revenge pornography that could stop the release of images before it happens.
According to the Australian Broadcasting Corporation (ABC), Facebook is testing a pilot program that it hopes will eventually roll out globally. The social network will reportedly work closely with each country's cybersecurity agency on the new service.
How will it work?
Users can block an image simply by submitting it to the social network, so that any subsequent attempt by a third party to post it on Facebook, Messenger or Instagram will fail.
Users who initially agreed to share an image but do not want it distributed beyond the service will be able to flag it as a "non-consensual intimate image". Australian users, for example, can contact the e-Safety office directly and then send the image to themselves in a Messenger message.
This way, the company can collect the hash of the image, its digital fingerprint, and recognize it if anyone ever tries to upload it to any of the company's services, blocking the attempt.
According to a Facebook spokesperson, any image flagged as non-consensual will be recognized by Facebook's photo-matching technology, and anyone who tries to upload it will be told that doing so violates Facebook's policies.
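The matching flow described above can be sketched roughly as follows. This is a minimal illustration, not Facebook's implementation: the class and function names are invented, and SHA-256 is a stand-in, since Facebook's actual photo-matching technology relies on perceptual hashing whose details are not public.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Toy stand-in for a photo-matching fingerprint. A real system
    # would use a perceptual hash that survives resizing and
    # re-encoding; an exact cryptographic hash does not.
    return hashlib.sha256(image_bytes).hexdigest()

class UploadFilter:
    """Hypothetical upload gate keyed on flagged-image hashes."""

    def __init__(self):
        self._blocked = set()  # only hashes are stored, never the images

    def flag_non_consensual(self, image_bytes: bytes) -> None:
        # Called when a user reports an image they do not want shared.
        self._blocked.add(fingerprint(image_bytes))

    def allow_upload(self, image_bytes: bytes) -> bool:
        # Every upload is checked against the blocklist.
        return fingerprint(image_bytes) not in self._blocked

# Usage: once flagged, the same bytes are rejected on any later upload.
gate = UploadFilter()
photo = b"...image bytes..."
gate.flag_non_consensual(photo)
print(gate.allow_upload(photo))            # blocked
print(gate.allow_upload(b"other image"))   # unrelated uploads pass
```

Note that this toy version only catches byte-identical copies; that limitation is exactly why production systems use perceptual rather than cryptographic hashes.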
Needless to say, this option works only if you own the original photo, or at least have access to it, which is not always the case for victims of this kind of abuse.
Australian e-Safety Commissioner Julie Inman Grant has acknowledged that this solution is aimed primarily at those who want to act proactively.
Of course, this measure does not completely solve the problem, especially when photos are distributed outside of Facebook's services.
But it is a start…