Facebook last week announced the launch of a controversial pilot program designed to combat revenge porn.
In a voluntary process the social media giant insists is safe and secure, Australia-based users who fear nonconsensual use of their intimate personal images may share such photos with a trained Facebook employee. That employee, in turn, then creates a nonvisual hash of the image. The hash is meant to prevent future postings of the image across the Facebook, Messenger, and Instagram properties.
The pilot program is rolling out in Australia, with three other countries to follow. Facebook's chief security officer, Alex Stamos, tweeted a few days ago, "We are trying to balance [risk] against the serious, real-world harm that occurs every day [when people engage in revenge porn]."
Stamos also added a note of caution: "We are not asking random people to submit their nudes. This is a test to provide 'some' option to victims to take back control."
More explanation, please. Facebook's global head of safety, Antigone Davis, also gave an overview of the "Non Consensual Intimate Image Pilot" program in a statement Thursday. It explains the collaborative approach to combatting the use of NCII (nonconsensual intimate images). The team that developed it includes representatives from Facebook and government agencies — as well as an international working group of survivors and victim advocates.
Various people from Facebook explained the multi-step process. First, a user files a report with Australia's "eSafety" commissioner, Julie Inman Grant, via a form on the government official's online portal. (Grant formerly served as Microsoft's global safety director; she worked in similar public policy and safety roles at Twitter and Adobe.) The user then sends a copy of the photo to himself or herself using the Facebook Messenger app.
Next, the eSafety commissioner's office notifies Facebook that the message was sent; Facebook accesses the image and creates a hash of it. That hash — a unique mathematical representation of the image — is stored on Facebook's servers as prohibited content. If anyone attempts to upload a photo matching that hash, the upload is automatically blocked.
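Facebook has not published the details of its hashing scheme; production systems of this kind typically use a proprietary perceptual hash (in the spirit of Microsoft's PhotoDNA) rather than an exact-match checksum, so that a re-encoded or slightly altered copy still matches. The minimal "average hash" sketch below is purely illustrative — every name in it is hypothetical, and it stands in for whatever Facebook actually runs:

```python
# Illustrative sketch only — NOT Facebook's algorithm. A toy "average hash":
# each bit records whether a pixel is brighter than the image's mean, so a
# lightly altered copy produces a nearly identical hash.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grayscale grid (ints 0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Set the bit when the pixel is brighter than average.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits: a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

# A reported image and a slightly brightened re-upload of it:
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
tweaked = [[min(255, p + 3) for p in row] for row in original]

# The platform stores only the hash, not the photo; an upload whose hash
# lands within some threshold of a stored hash would be blocked.
distance = hamming_distance(average_hash(original), average_hash(tweaked))
assert distance <= 8  # near-duplicate despite the pixel changes
```

The key design point such a scheme relies on is that the stored value is one-way: the hash can flag future copies of the image, but the image cannot be reconstructed from it.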
So revenge porn is that significant a problem? A piece in CNN notes that revenge porn — the unauthorized sharing of someone else's intimate, nude or sexual photos — is an epidemic in Australia. "One in five Australians between the ages of 16 and 49 are affected, according to a recent study," noted CNN.
Last spring, a 30,000-member private Facebook group, Marines United, was found to be sharing explicit photos of female Marines without the women's permission. The group was banned but was re-formed under different names; the cache of photos was also uploaded and further disseminated via a Dropbox account. Forty-four Marines have been disciplined to date for their involvement, according to a piece in The Washington Post. The Corps also bolstered its social media-related misconduct policies.
Let's be wary, folks. Though stamping out revenge porn is important, sending one's own revealing photos to Facebook's staff — even if such staff are "specially trained" (whatever that means) — is very likely not the best way to accomplish that goal. Some people worry, for example, that although Facebook says it will not retain the photos, a single noncompliant employee could make that promise impossible to keep.
Further, as word of the pilot program spreads, there's the risk of phishing-type attempts. An incoming email might appear to be from Facebook — inviting users to submit photos pre-emptively. The goal, of course, would be to target naïve users — duping them into emailing intimate photos.
Either way, the pilot program has risks — even as it seeks ideas and opinions from users. Said Davis, Facebook's global head of safety, "We look forward to getting feedback from our community to learn the best ways to keep tackling these [revenge-porn issues]."
Here's an interesting side note (or maybe it's not a side note at all). Napster co-founder Sean Parker — who was a founding president of Facebook, of course — shared harsh criticisms on Thursday of his former employer. At an Axios event in Philadelphia, Parker expressed concern about the intentionally addictive nature of the social network. He described Facebook's system of likes, comments and shares as "a social feedback loop ... exploiting a vulnerability in human psychology."
Parker, who described himself as "something of a conscientious objector to social media," also added, "God only knows what it's doing to our children's brains."
Michele Blood is a freelance writer with a passion for children's literature. Based in Flemington, New Jersey, she leverages her background in psychology in her work for publishers, businesses and NPOs.