Facebook wields new weapon in battle against revenge porn

The social network is now using photo matching to make sure images it removes aren't re-uploaded across Facebook, Messenger and Instagram.


Revenge porn is the scourge of the social internet -- emotionally distressing at best and a downright life-ruiner at worst.
Facebook is stepping up its game when it comes to protecting victims of the practice. The social network announced a new tool on Wednesday that will prevent people from resharing nonconsensual intimate images it has pulled down, not only on Facebook, but through Messenger and on Instagram as well.
The company will use photo-matching technology to keep tabs on pictures that have been reported and removed under its current rules, which already outlaw revenge porn, and stop people from resharing them. It will also warn them that the photo they're trying to share or upload violates its policies.
Revenge porn has been around for as long as the internet, but the increased popularity of social networks has boosted its potential impact. Linked networks of family, friends and colleagues, combined with the potential for things to go viral, hugely increases the opportunity for humiliation.
A study published last year by the Center for Innovative Public Health Research showed that 2 percent of the straight-identifying population has been affected by revenge porn, while for gay, lesbian and bisexual individuals this jumps to 17 percent. And according to the Cyber Civil Rights Initiative, 82 percent of victims report significant impairment to their social or work lives.

The devastating consequences of revenge porn for victims are exactly why everyone from lawmakers to tech companies is fighting the practice.
The photo-matching tech Facebook is using isn't new, but it hasn't been deployed in this way before. Tools like Microsoft's PhotoDNA are put to work across the internet every day, primarily to prevent the spread of illegal child abuse imagery.
PhotoDNA works by converting photos to grayscale and breaking them up into tiny squares, which it labels numerically, creating a signature that's stored in a database. When new images are uploaded, they're automatically cross-referenced with the database. If a match is found, the upload is stopped in its tracks.
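PhotoDNA itself is proprietary, but the general idea of turning each image into a compact numeric signature and checking new uploads against a database of known signatures can be sketched in a few lines. The Python snippet below is an illustration only, written against that public description: the grid size, the per-square averaging and the match threshold are hypothetical choices, not Microsoft's or Facebook's actual parameters, and the file names are placeholders.

```python
# Illustrative sketch of a perceptual-hash style photo match.
# PhotoDNA is proprietary; this only follows the public description
# (grayscale, fixed grid of squares, one number per square).
from PIL import Image
import numpy as np

GRID = 16             # hypothetical: split the image into a 16x16 grid of squares
MATCH_THRESHOLD = 10  # hypothetical distance below which two images "match"

def signature(path: str) -> np.ndarray:
    """Grayscale the image, split it into GRID x GRID squares, and record
    the average brightness of each square as the image's signature."""
    img = Image.open(path).convert("L").resize((GRID * 8, GRID * 8))
    pixels = np.asarray(img, dtype=np.float32)
    blocks = pixels.reshape(GRID, 8, GRID, 8).mean(axis=(1, 3))
    return blocks.flatten()

def is_match(sig_a: np.ndarray, sig_b: np.ndarray) -> bool:
    """Small average difference between signatures means the images are
    near-duplicates (e.g. a re-upload that was resized or recompressed)."""
    return float(np.abs(sig_a - sig_b).mean()) < MATCH_THRESHOLD

# Usage: a removed image's signature is kept in a database; each new
# upload's signature is computed and checked against the stored ones.
banned_signatures = [signature("reported_and_removed.jpg")]
upload_sig = signature("new_upload.jpg")
if any(is_match(upload_sig, banned) for banned in banned_signatures):
    print("Upload blocked: matches a previously removed image.")
```

Matching on coarse per-square averages rather than exact pixels is what lets a check like this survive re-uploads that have been cropped slightly, resized or recompressed, which is the point of using signatures instead of comparing raw files.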
"This new process will provide reassurance for many victims of image based sexual abuse, and dramatically reduce the amount of harmful content on the platform," said Laura Higgins, founder of the Revenge Porn Helpline UK, in a statement.
Facebook also has a special reporting mechanism in its Help Center for victims, as well as for people who are being extorted or are victims of domestic violence, with resources and links to partner organizations that may be able to offer further help and support.
"We hope that this will inspire other social media companies to take similar action and that together we can make the online environment hostile to abuse," said Higgins.


