'We are failing': Internet companies like Google have too much power over child sex abuse images, report says
'How do you know for sure that's a child?' Tech companies resist removing child sex abuse images, says centre
The Canadian Centre for Child Protection says the removal of child sex abuse material should not be left to the discretion of online companies because they are not doing enough.
In a report released Wednesday, titled How we are Failing Children: Changing the Paradigm, the centre says 400 companies have received notices to take down child sexual abuse images and videos. The fastest 10 per cent of the industry remove the material within a day or less, while the slowest 10 per cent take more than two weeks to act.
The centre regularly gets pushback from companies, including the major players in the industry, said associate executive director Signy Arnason.
Google — whose parent company, Alphabet, owns YouTube — has challenged requests to remove images by questioning the age of a child, said Arnason.
"So children from 11 to 13 years of age where … there are slight signs of sexual maturation — a little bit of pubic hair, a little breast development, some auxiliary hair — and then suddenly it's 'how do you know for sure that's a child?' And then we're struggling to get this material removed," said Arnason.
"We've let the internet be the Wild, Wild West. It's a completely lawless space," she said.
"We don't allow children to drink until a certain age. We don't allow them to drive.… Yet nude images of them all over the internet are allowed to sit up and stay online. We've completely lost our way and we need to step in and start doing more."
The report says the pace at which children develop varies considerably, so companies should not rely on signs of physical maturity alone to assess whether they are dealing with an image or video of abuse featuring a child.
A Google spokesperson said the company is working to ensure its platforms are never used to spread content that exploits children, but admitted there is more work to be done.
"We will continue to invest to help keep our platforms and victims safe from this type of abhorrent content," the spokesperson said in an emailed statement.
Industry not doing enough: prof
The industry isn't doing enough, though, to stop "a horrific crime that is fuelling more horrific crimes, that is then further victimizing the children," said Dr. Hany Farid, a professor at the University of California, Berkeley.
"And the industry is, like, 'This isn't our problem.'"
In 2008, Farid worked with Microsoft to develop a technology called PhotoDNA, which creates a digital signature of an image and compares it against others to find copies. Microsoft donated the program to the National Center for Missing & Exploited Children (NCMEC) as a tool for companies to find and remove child sex abuse material from their platforms.
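PhotoDNA itself is proprietary, but the general idea Farid describes, reducing an image to a compact digital signature and comparing signatures to find copies, can be sketched with a simple perceptual "average hash." The Python below is a minimal sketch only, not PhotoDNA: it assumes the Pillow imaging library, and the matching threshold and flag_for_review handler are hypothetical.

```python
# A minimal sketch of the idea behind perceptual hashing, the family of
# techniques PhotoDNA belongs to. PhotoDNA itself is proprietary and far
# more robust; this illustrative "average hash" only shows how an image
# becomes a compact signature that can be compared to known signatures.
# Assumes the Pillow imaging library (pip install Pillow).
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit signature: shrink it, convert to
    grayscale, then set a bit for each pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two signatures differ."""
    return bin(a ^ b).count("1")


# Hypothetical usage: compare an upload against signatures of known
# abuse material (names and threshold are invented for illustration).
# known_signatures: set[int] = load_known_signatures()
# sig = average_hash("upload.jpg")
# if any(hamming_distance(sig, k) <= 5 for k in known_signatures):
#     flag_for_review("upload.jpg")
```

Because near-duplicates of an image survive small edits such as resizing or re-encoding, matching uses a distance threshold rather than exact equality.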
Farid said it took several years, but tech companies finally came on board and started using the software.
"The fact is that the companies have come along begrudgingly and kicking and screaming. They don't want to be in the business of taking down this horrible content," said Farid.
"They don't want to be in the business of regulating their platforms for a number of reasons. It's a liability problem. It's a financial issue.
"And it opens up the door to them moderating other aspects of their platform, which are awful and terrible."
He said last year NCMEC received 18.5 million reports of child sex abuse material.
"That's 2,000 reports an hour, every hour, every day, every month for a year. That's how many reports they've gotten," said Farid, adding 80 per cent of child sexual abuse material is prepubescent kids under the age of 12, and as young as one to two months old.
"They are literally infants and toddlers being sexually abused. Their material is being shared online. It is fuelling the creation of more content. The children who are victims of this, when you talk to them, will tell you every time that content is viewed, they are being violated again," said Farid.
In January 2017, the Canadian Centre for Child Protection used PhotoDNA to develop a web crawler called Project Arachnid. It scans the open and dark web for known images of child sexual abuse and issues notices to the companies hosting them to remove the material.
Since then, more than 13.5 million child sex abuse images have been identified, and nearly five million takedown notices have been sent to providers around the globe.
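The article doesn't describe Project Arachnid's internals beyond this, but the crawl, hash and notify loop it outlines can be sketched as follows. This is a hypothetical illustration: it substitutes an exact SHA-256 match where Arachnid uses PhotoDNA perceptual signatures, and KNOWN_DIGESTS, crawl_page and send_takedown_notice are invented names, not the project's real interface.

```python
# A hypothetical sketch of a crawl-hash-notify loop in the spirit of
# Project Arachnid. The real system matches PhotoDNA perceptual
# signatures; for simplicity this sketch substitutes an exact SHA-256
# match, and every name here is invented for illustration.
import hashlib
import re
from urllib.parse import urljoin

import requests  # third-party: pip install requests

# Hypothetical database of SHA-256 digests of verified abuse images.
KNOWN_DIGESTS: set[str] = set()

IMG_SRC = re.compile(r'<img[^>]+src="([^"]+)"', re.IGNORECASE)


def crawl_page(page_url: str) -> list[str]:
    """Fetch a page, hash every image on it, and return the image URLs
    whose content matches a known digest (candidates for a notice)."""
    html = requests.get(page_url, timeout=10).text
    matches = []
    for src in IMG_SRC.findall(html):
        img_url = urljoin(page_url, src)
        data = requests.get(img_url, timeout=10).content
        if hashlib.sha256(data).hexdigest() in KNOWN_DIGESTS:
            matches.append(img_url)
    return matches


# for url in crawl_page("https://example.com/somepage"):
#     send_takedown_notice(url)  # hypothetical notifier
```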
What needs to change?
The centre proposed a framework that puts the interests and protection of children first, while clarifying the roles and responsibilities of governments and the industry. It includes the immediate removal of child sexual abuse imagery and a standardized response to the problem.
It's also calling for the removal of all images tied to the sexual abuse of a child, including photos taken before and after the abuse, in which the victim may be fully or partially clothed.
"This framework is grounded in the best interests of the child, and the rights of children to dignity, privacy, and protection from harm," said Lianna McDonald, the executive director of the Canadian Centre for Child Protection.
"The undeniable truth is the rights of a victimized child will be continually violated as long as images of them being sexually harmed and abused are available on the internet."
The centre says that, to date, the removal of child sexual abuse images has been mostly left to the discretion of industry — the businesses that intersect with user-generated content by way of the internet.
"We have a real challenge in terms of the way that the internet has been built for profit," said Dr. Michael Salter, associate professor of Criminology at the University of New South Wales, in Sydney, Australia.
Salter said social media provides users with a platform to upload and circulate a vast number of images, and historically that material hasn't been regulated or monitored in any way.
"The architecture of the internet is built to allow that material to to circulate. It is not built to monitor that, to prevent that or to take content down," said Salter.
"And so we really need to confront technology industries with the fact that they are making billions and billions of dollars using a business model that facilitates the traffic in child sexual abuse material," said Salter.
The centre's report says if a reasonable person would believe the image is of a child, and that the child was being harmed due to the public availability of the material, those images and videos should be taken down immediately, with no questions asked.
"By approaching the removal of child sexual abuse images and harmful/abusive images of children from a protection and rights framework, we are reaffirming the principle that every child is deserving of the rights to dignity, safety, privacy, freedom from harm, and security," the report says.
"We need to educate the masses and say 'this is really the state of the nation of what exists online," said the Canadian Centre for Child Protection's Arnason.
"And I think … a healthy, well person would be on our side and saying, 'this is not OK, and it needs to come down."