Working In The Shadows To Keep Web Free Of Child Porn

caption: Project Spade, a Canadian investigation targeting child pornography, spanned three years and resulted in the rescue of 386 children.
Courtesy of Toronto Police

Rich Brown knows how to crack a child pornography case. For decades, he was a cop for the New Jersey State Police. He worked on a task force dedicated to Internet crimes against children. And part of his job was to look through suspects' hard drives, image by image.

“There are certain things that are more difficult,” Brown said. “I do remember seeing the first molestation video. It was a husband that was molesting his two daughters.”

When he went home at night, it was difficult for him to forget what he had seen. “I have two boys, and I remember being ultra-protective of my boys during the time that I was involved with this type of work, and I think that's pretty common,” he said.

Brown recalled that work in the wake of Canada’s announcement that authorities had busted an enormous international child pornography operation. The operation, dubbed Project Spade, capped a three-year investigation into a website that trafficked in illicit videos of young boys. Three hundred and forty-eight people have been arrested in connection with the videos, 76 of them in the U.S.

Those arrested include six law enforcement officials, nine religious leaders, 40 school teachers, three foster parents, 32 volunteers who worked with children, and nine doctors and nurses, according to Toronto Police.

Investigations like this end with press conferences and high-profile trials, but, according to Brown, they begin far away from the public eye with one of the most difficult jobs in the world.

But police are no longer the only ones sifting through images of child pornography. The rise of Internet porn has created an industry of people working for tech companies like Google, Microsoft and Facebook. They spend eight hours a day, 40 hours a week, looking at pictures and videos and asking: Is this child pornography?

Sarah Roberts, a researcher at the University of Western Ontario, is one of the few scholars who study the people who do this work. She said many of them have to sign nondisclosure agreements.

“There's an aspect of trauma that can often go along with this work, and many workers would rather go home and tune out than talk about it,” Roberts said.

Microsoft and Google both declined to put NPR in touch with anyone who reviews these images.

Samantha Doerr, director of child protection at Microsoft, called it a “yucky job.”

“Microsoft has to invest in wellness programs for people that work on this,” Doerr said.

In March, Microsoft and eight other tech companies came out with guidelines for these wellness programs. They suggest employees take their minds off traumatic content by, for example, taking a 15-minute walk or engaging in a hobby. Many companies have a counseling hotline for employees.

A Google spokesman said that when reviewers spot child pornography, they create a digital signature: a string of ones and zeros unique to that image that acts as a digital fingerprint.

Tagged pictures are sent to a database at the National Center for Missing and Exploited Children, which coordinates with law enforcement. Earlier this year, Twitter also started using Microsoft's screening program, PhotoDNA, to check every photo posted to its site.
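That fingerprint-and-match step can be sketched in a few lines of code. The sketch below is only an illustration, not Microsoft's actual technology: real systems such as PhotoDNA compute a perceptual hash that still matches an image after resizing or re-encoding, while this simplified version uses a cryptographic SHA-256 digest, which matches only byte-identical files. The `known_fingerprints` set is a hypothetical stand-in for the kind of fingerprint database maintained at the National Center for Missing and Exploited Children.

```python
import hashlib

# Hypothetical stand-in for a database of fingerprints of known
# illegal images, of the kind NCMEC shares with tech companies.
known_fingerprints: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint for an image.

    Simplification: a SHA-256 digest only matches byte-identical
    files. Production systems like PhotoDNA use a perceptual hash
    that survives resizing and re-compression.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if an uploaded image matches a known fingerprint."""
    return fingerprint(image_bytes) in known_fingerprints
```

One appeal of this design is that services compare only the fingerprints; the database never has to store or redistribute the images themselves.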

Despite the software advances, the need for content moderators is still growing. And Roberts said the workers she has spoken with often feel stigmatized.

“Because this industry is so new and the need for this work is so new, I think the jury is out as to what the real implications are going to be for these people later on in their life,” she said.