
Send us your naked photos to help block revenge porn, Facebook invites users

Facebook is rolling out a measure to crack down on the non-consensual sharing of explicit images. The initiative comes as trust in the social network is particularly low.

Trust in the social network is low, so this offer might make people skeptical

Facebook is partnering with safety organizations in Canada, the U.S. and the U.K. to help combat revenge porn, the sharing of explicit images without consent. (Dado Ruvic/Reuters)

Facebook wants its users to send it their naked photos.

The social network is rolling out the measure to crack down on revenge porn, the non-consensual sharing of explicit images.

The catch is you need to share those photos with the company first.

Praised by some as a long overdue step to protect potential victims from this digital form of gender-based violence, the initiative comes as trust in the social network is particularly low, following a media firestorm about its covert behaviour and government hearings in the U.S. and Europe about the company's handling of its users' personal data.

According to Facebook's announcement, a hashing system, piloted in Australia, is being introduced in Canada, the United States and the United Kingdom to block explicit photos from being shared.

The premise is that anyone who fears someone else might publicly post a sensitive or explicit image of them can pre-emptively upload that photo as a means of preventing others from doing so.

Using a form obtained through a partnering organization — in Canada, Facebook has partnered with YWCA — users can request a secure, one-time link with which to upload the image.

The image is then reviewed and a unique hash is generated for it. Unlike encryption, which can be reversed with a key, hashing is one-way: it turns data into a string of characters from which the original content cannot be recovered. That digital fingerprint is then used to block all future uploads of the same image on Facebook, Messenger and Instagram.
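To make the mechanics concrete, here is a minimal sketch of hash-based blocking. It is an illustration only, not Facebook's actual system: the function names are hypothetical, and it uses an exact cryptographic hash (SHA-256), whereas production photo-matching systems typically use perceptual hashes that also catch resized or re-encoded copies.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # One-way digest: the original image cannot be recovered from this string.
    return hashlib.sha256(image_bytes).hexdigest()

# Only the hashes of reported images need to be stored;
# the images themselves can be deleted after review.
blocklist = set()

def report_image(image_bytes: bytes) -> None:
    # Called once, when a user pre-emptively reports a sensitive image.
    blocklist.add(image_hash(image_bytes))

def upload_allowed(image_bytes: bytes) -> bool:
    # Every later upload is hashed and checked against the blocklist.
    return image_hash(image_bytes) not in blocklist

report_image(b"sensitive-photo-bytes")
print(upload_allowed(b"sensitive-photo-bytes"))  # False: exact copy is blocked
print(upload_allowed(b"some-other-photo-bytes"))  # True: unrelated image passes
```

Note the limitation this sketch makes visible: an exact hash only matches byte-identical files, which is why real deployments rely on perceptual hashing so that cropped or screen-grabbed variants of a reported image can still be matched.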

Steph Guthrie, a gender justice consultant, warns, 'You can have every technical security measure under the sun in place to protect the data, but all it would take for a breach of users' intimate images would be one rogue misogynist employee.' (Makda Ghebreslassie/CBC)

But if uploading your intimate photos to a giant corporation makes you nervous, the fact that the process involves having those images screened by an employee of the company might also make you squeamish.

Dealing with people's nude photos and potential cases of revenge porn is an extremely sensitive issue, and Facebook can't rely on a purely automated system to handle it. As countless mishaps have shown, algorithms just aren't fail-safe. But then again, neither are humans.

Steph Guthrie, a gender justice consultant who has spoken about revenge porn before the House of Commons justice committee, warns, "You can have every technical security measure under the sun in place to protect the data, but all it would take for a breach of users' intimate images would be one rogue misogynist employee."

"Given that user trust is at an all-time low," she says, "I think it would really behoove Facebook to be more transparent about the kind of screening and training their employees receive prior to being entrusted with manually reviewing people's intimate image."


According to Facebook, the review will be carried out by one specifically trained member of its community operations safety team; however, the post doesn't describe what that training will consist of, or what the review will entail.

"The project responds to a real need," says Danielle Citron, a law professor at the University of Maryland, and one of the leading figures in the crusade to battle revenge porn. "Victims' groups were telling Facebook that there were individuals who received threats that images they shared in confidence would be posted online."

A 'huge increase in revenge porn'

Indeed, as Maya Roy, the CEO of YWCA Canada, explains, "We reached out to Facebook four years ago to start having these conversations because we were seeing a huge increase in revenge porn in young women and girls across the country. We needed to look at how to prevent it."

Roy says the organization has encountered a huge continuum of issues among diverse women, ranging from 14-year-old girls being threatened by angry ex-boyfriends to married women kept hostage in bad relationships.

"It's about abuse of power and the way men control women," says Roy. "It may have looked different in the past, but now they're using their phones and images against us."

Citron, who has worked closely with Facebook's global safety team, says, "The company is working hard to mitigate real harms in the most secure way it can."

In addition to the Cambridge Analytica scandal, the Facebook platform 'for many years, has had a well-documented history of removing things like, say, photos of breastfeeding parents, while refusing to remove pages that publish graphic jokes about raping women,' Guthrie says. (Kacper Pempel/Reuters)

But while advocates including Roy and Citron are confident Facebook has been carefully working through ensuring that images can be sent in securely and then hashed and destroyed, the system only works if the individuals who might use it trust that it is safer than doing nothing. And given Facebook's recent track record, that might be difficult.

In addition to the Cambridge Analytica scandal, which revealed that tens of millions of Facebook profiles had been harvested without their users' permission, the platform "for many years has had a well-documented history of removing things like, say, photos of breastfeeding parents, while refusing to remove pages that publish graphic jokes about raping women," says Guthrie.

She says, "It feels dicey to trust how their team members will handle users' most intimate data."

There is no question that the repercussions of revenge porn can be devastating. As Roy notes, it can result in job loss, ongoing harassment, potential immigration repercussions, and even suicide. She says this pilot is intended for emergencies when an individual knows their images have been compromised and there is clear risk.

But while people who fear being victimized may welcome support, comments in response to the company's announcement of the new feature show a more cautious response from the public.

Some commenters are skeptical about Facebook's commitment to victims, saying their response when women complain about images that violate the company's terms of service is far too slow. Others have responded that the system shouldn't require the images to be shared with Facebook.

Letting people control their own images

Another suggestion was to give control over what is allowed to be posted to the subjects of the images themselves: since Facebook has such robust facial recognition capabilities, commenters argued, it could simply ask users to approve any image posted with their likeness.

Done right, this initiative could inspire other platforms to follow suit, says Guthrie.

"I'd love to see other sites adopt this kind of technology, if they are doing so with great care and attention to the human factors of how these photos are handled internally," she says. The only way for this type of technology to be truly useful to victims is if it becomes the norm on other sites where revenge porn is common, she says.

One of the biggest challenges is that once a photo has been shared without consent, it is impossible for the victims to stop the spread of that image. It can easily be downloaded or screen-grabbed and shared, even if the original site removes it.

So, while the headlines will keep popping into our news feeds about whether we can trust Facebook to protect our data from unknown sources, the question now is, can we trust it to protect us from each other?

This Facebook initiative is clearly responding to a strong need, but it only works if people are willing to trust the corporate giant with their most private and sensitive data.

ABOUT THE AUTHOR

Ramona Pringle

Technology Columnist

Ramona Pringle is an associate professor in the Faculty of Communication and Design and director of the Creative Innovation Studio at Ryerson University. She is a CBC contributor who writes and reports on the relationship between people and technology.