Facebook is taking a proactive stance on its revenge porn problem.
After several high-profile incidents in which users posted nude photos without the consent of the photographed subjects, the social media company has taken steps to address the abusive behavior.
CEO Mark Zuckerberg posted in February that Facebook wanted to become a safer, more inclusive environment. That statement was followed by a new policy in April addressing revenge porn and how the platform would flag pictures that violated its terms of service.
Now the company is piloting a program under which users can upload pictures of themselves—photos they worry might be posted without their consent—and have Facebook’s artificial intelligence keep those images off its platform.
The Australian Broadcasting Corporation wrote:
Facebook is partnering with a small Australian Government agency to prevent sexual or intimate images being shared without the subject's consent.

e-Safety Commissioner Julie Inman Grant said victims of "image-based abuse" would be able to take action before photos were posted to Facebook, Instagram or Messenger.
"We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly," Ms. Inman Grant said.
One in five Australian women aged 18-45 and one in four Indigenous Australians are victims of that abuse, she said.
As news of the plan spread, some expressed doubts about whether Facebook could deliver on its good intentions.
Facebook’s goal here is laudable. But there are some clear problems. For one thing, if someone hacks into a victim’s Facebook account (say, if the person is already logged in or the password is stored in the browser), could the image be retrieved? Grant says Facebook won’t store the images, but few things online are ever truly permanently deleted.

More fundamentally, though, asking women who have been victims to upload naked photos of themselves is a rather tone-deaf approach, one that’s not particularly trauma-informed. When a naked photo of a person is circulated without her consent, it can be ruinous emotionally and professionally. Requesting that women relive that trauma and trust Facebook, of all companies, to hold that photo in safekeeping is a big ask.
Others highlighted a part of the process Facebook had downplayed in interviews and press: a Facebook employee would review nude photos sent to the company.
Potential victims must send nude pictures of themselves through the social network's official messenger so the images can be viewed, in full, unedited form, by an employee of the social network.
Facebook tried to alleviate fears that this initiative was corporate cover for the very behavior it was trying to prevent.
Ars Technica continued:
A Facebook spokeswoman said the employee would be a member of the company's community operations team who has been trained to review such photos. If the employee determines the image violates site policies, it will be digitally fingerprinted to prevent it from being published on Facebook and Facebook-owned Instagram. An article posted by the Australian Broadcasting Corporation said the service is still being tested with help from Australian government officials. To use it, potential victims will first complete an online form and then send the images to themselves over Facebook Messenger.
At first, there was also confusion about how these nude photos would be stored on Facebook’s servers for reference in blocking the banned images.
Facebook will keep hold of these images for a period of time to make sure that the company is correctly enforcing its policies. During that time, the images will be blurred and available only to a small number of people, according to the Facebook spokesperson. An individual Facebook employee, however, will have already examined the un-blurred versions at that point.
The platform’s chief security officer, Alex Stamos, tweeted in response:
There are algorithms that can be used to create a fingerprint of a photo/video that is resilient to simple transforms like resizing. I hate the term "hash" because it implies cryptographic properties that are orthogonal to how these fingerprints work.
— Alex Stamos (@alexstamos) November 8, 2017
Other news outlets cleared up the issue, revealing that the photos themselves would not be stored; instead, a link or hash value would be used to identify the image.
Australia’s e-Safety Commissioner, Julie Inman Grant, said in an interview:
“They're not storing the image, they're storing the link and using artificial intelligence and other photo-matching technologies. So, if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded.”
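The "hash value" Inman Grant describes is a perceptual fingerprint rather than a cryptographic hash, which is the distinction Stamos draws above: visually similar images (a resized or re-encoded copy) should produce similar fingerprints, so matching tolerates small changes. Facebook has not published its exact algorithm; below is a minimal, hypothetical sketch of one well-known approach, an "average hash," using a toy grayscale pixel grid in place of a real decoded image.

```python
# Sketch of an "average hash" perceptual fingerprint (illustrative only;
# not Facebook's actual algorithm). An image is reduced to a bit string:
# 1 where a pixel is brighter than the image's mean, 0 otherwise.
# Near-duplicate images yield fingerprints with a small Hamming distance.

def average_hash(pixels):
    """Return a bit string: '1' where a pixel exceeds mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(a, b):
    """Count differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

# Toy 2x2 grayscale "images" (0-255 brightness values).
original          = [[10, 200], [220, 30]]
reencoded_copy    = [[14, 205], [225, 33]]   # slightly brighter copy
unrelated_photo   = [[250, 20], [10, 240]]

h_orig = average_hash(original)
print(hamming_distance(h_orig, average_hash(reencoded_copy)))   # 0 -> match, block upload
print(hamming_distance(h_orig, average_hash(unrelated_photo)))  # 4 -> different photo, allow
```

A real system would first downscale the photo to a fixed grid (commonly 8×8) so the fingerprint is resilient to resizing, exactly the property Stamos notes a cryptographic hash lacks.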
Twitter users were quick to point out the oddity of Facebook’s request, highlighting the poor optics.
Everyone: Fake news and bot accounts are a big problem. Do something Facebook!
— ™ (@TefoMohapi) November 8, 2017
Facebook: Send nudes.
The only thing that can stop a bad guy with your nudes is a good guy with your nudes. https://t.co/sTUqFYrxM5
— Richard. (@RtodaizH) November 9, 2017
Facebook asks people to send them their nudes so they can prevent them from being posted as revenge porn. Nothing will go wrong with this plan. https://t.co/ftQbQji4YD
— Eva (@evacide) November 8, 2017
Others pointed out that the risks of the new program might not outweigh the benefits, especially given that the benefits might be relatively small.
Facebook already bans accounts engaging in revenge porn, and the site launched a scheme to combat it last April, under which users can flag nonconsensual explicit images to Facebook’s trained reps, who review and remove the pictures. Technology then prevents the images from being posted again. The new tool being tested provides preemptive assurance to people concerned about their intimate, private pictures.

Lawyer Iain Wilson points to the limitations of such technology. “In many instances, a victim of revenge porn will not have access to the digital image themselves,” he says. “This is because revenge porn is often about the threat of disclosure, or a false claim that disclosure has been made, as a device to control, blackmail or simply to cause distress and anxiety. Without the source file, images cannot be blocked.

“Secondly, whilst making the most widely used platforms such as Facebook safer, these only represent part of the world wide web, which extends to some five billion pages. The nature of revenge porn is such that the perpetrators would simply seek to post material on other websites (which could be linked to from Facebook). That said, if victims are attracted to the idea, in theory any number of websites could sign up.”
The consensus was that Facebook would have to be exemplary and aboveboard in communicating about this program.
Dazed continued:
Myles Jackman, the legal director of Open Rights Group, a digital campaigning organisation for online privacy and free speech, told Dazed about these security issues: “Whilst so-called revenge porn is both a moral and criminal breach of consent in the UK, this Australian pilot scheme (which may be Facebook’s attempt to achieve the sheen of corporate social responsibility) is riven with privacy and security risks for any user uploading nude pictures.

“If Facebook wants to go down this road, they need to maintain the highest levels of transparency on how this very sensitive data is stored and processed, and ensure these nude photos will not violate Facebook’s prudish terms of service.”
What do you think of Facebook’s new program, PR Daily readers? How can it earn the trust necessary for its plan to succeed?
from PR Daily News Feed http://ift.tt/2hmnXEu