Blood, gore and nudity: Why graphic content shows up in your newsfeed
Child pornography, human trafficking, drug and gun sales are off limits
Take a quick scroll down your newsfeed today and you'll likely see news about the U.S. Democratic convention, the latest Donald Trump rally and Pokémon Go.
But sprinkled between those stories is more graphic content. Recent examples include photos of a bloodied dentist who was attacked by a shark and a video of a rapper shooting himself through the cheek for a music video.
Both posts were full of blood and severely mangled skin, and were gruesome to stumble upon. So how did they end up in your newsfeed?
Despite the blood and gore, neither post goes against Facebook's community standards or Twitter's media policy. The content is graphic and many may not like it, but the companies keep it online.
Both posts were actually being promoted this week by Facebook Newswire, a service that helps newsrooms find stories trending on social media, and were covered by many different media outlets.
The shark post carried a small graphic warning in its caption, though the bloody photos were visible regardless. The shooting video was preceded by a "graphic warning" bumper, meaning it didn't automatically start playing and didn't show up in the newsfeeds of minors.
There have been countless other examples of graphic content showing up in newsfeeds — the bloody aftermath of the Philando Castile shooting in Minnesota during a routine traffic stop was streamed on Facebook, and the Islamic State of Iraq and Syria (ISIS) has used Twitter to post gruesome content.
Facebook and Twitter say it is largely up to users to self-censor content like this before posting it, by marking it as sensitive material and giving context in the caption. Otherwise, they warn, there's a chance it will be flagged by another user and taken down.
Child porn, trafficking no-go
If offensive or graphic content is reported to Facebook or Twitter, the posts are reviewed by their moderation teams.
Much like newsrooms make editorial judgments, these moderators must decide if there's value to the flagged content or if it oversteps the company's rules and guidelines — which they are expected to know intimately.
Any content that includes child pornography, human trafficking, or selling drugs and guns is off limits.
Both companies rely on their users to report problem posts. Twitter-owned Periscope has taken this a step further by implementing "flash juries": randomly selected groups of viewers who judge, during a livestream, whether a flagged comment is acceptable.
Depending on the time of day, content is reviewed in different Facebook and Twitter offices around the world. Policies are the same worldwide, but because decisions come down to the judgment of individual moderators, cultural differences may factor in.
For example, a moderator reviewing content at Facebook's offices in India may make different decisions about the flagged content than a moderator working at Facebook's offices in Austin, Texas.
Facebook photo prompted CPS visit
Sometimes moderators make mistakes.
Arizona-based photographer Heather Whitten takes photos of her four kids, who are sometimes topless or not wearing bottoms. Some of those images have been pulled from Facebook and later reinstated several times, and her Instagram account has also been banned.
Her most controversial photo, posted on Facebook, showed her husband comforting her young son in their shower while the boy was violently ill with salmonella poisoning. The photo was taken in 2014, but Whitten only posted it to her page in May, where it went viral.
The photo didn't violate Facebook's nudity guidelines, but it kept getting reported and taken down by the site. Whitten said she didn't receive any notices from Facebook, and the company never reached out to explain what happened.
"It never crossed my mind that it would be controversial or anything because it's just so normal in our life, so it always kind of takes me aback a little bit," she told CBC News.
Three weeks after she posted the shower photo, Child Protective Services showed up at her door and said someone had anonymously reported her photos. CPS conducted an investigation and interviewed the family, but found nothing.
"That really shook us," she said. "We're very desensitized to violence ... but when there's nudity or intimacy, we really shut that down."
Whitten said she hasn't been sharing much online since and is still trying to figure out where her style of photography fits on social media.
"I definitely do still feel very passionate that these kinds of images should be allowed and there should be kind of like a normalization," she said.
"[Photographers] need social media. We love social media. We just kind of want it reciprocated a little bit."