The tip that led to the arrest last month of Eliot Cutler, a former two-time candidate for Maine governor, was one of more than a thousand that state investigators handled last year.

And that is just one step in the process of bringing people who sexually abuse and exploit children to justice, and perhaps finding safety and peace for the survivors of these crimes.

At each point along the way to charges and a conviction, there are places where a lack of attention or resources can allow horrible crimes to continue unnoticed.

Simply put, we are not doing everything in our power to stop child sex abuse.

BIG TECH COMING UP SHORT

As with so many things in our lives and politics today, it starts with the largest tech companies, whose platforms people use to store and share all sorts of illegal and shocking images, including those depicting child sex abuse and exploitation.


The companies are required by federal law to report illegal material they find on their products, whether those are social media sites or file storage drives. Last year, tech companies sent more than 29 million tips on such images to the National Center for Missing and Exploited Children, up from nearly 22 million in 2020 and 17 million in 2019.

Those numbers are big, but they are not by any means the whole story. A 2019 report by The New York Times found that tech companies are not taking full advantage of all the tools they have to identify, remove and report illegal images, sometimes going after copyright infringement with more vigor. Some of the companies don’t look for the images at all, while others don’t look everywhere within their domains.

Even Facebook, which accounted for over 90 percent of imagery flagged at the time – a sure sign the other companies aren’t trying hard enough – was not looking in all its available databases for illegal material.

The approaches employed by tech companies, the Times wrote, “are inconsistent, largely unilateral and pursued in secret, often leaving pedophiles and other criminals who traffic in the material with the upper hand.”

There have been improvements in some cases. But, clearly, tech companies are missing a lot of the child abuse that is being perpetrated with their help.

TRAUMATIZING WORK


Often, too, they shortchange the workers charged with the awful task of deciding which images must be deleted. Much of the process that identifies illegal photos and videos is automated, but ultimately, a human must look at them.

It is difficult and traumatizing work, which companies often outsource to contractors, making it easier to underpay them and to leave them without the support they need to do the work well and keep their mental health intact.

We shouldn’t allow these companies to shirk what should be their first responsibility: making sure their powerful, profitable platforms aren’t being used to abuse and exploit children.

And we should make sure, too, that law enforcement has the resources to investigate child sex crimes, which, as the Press Herald’s Matt Byrne showed in two stories last week, are as complex and time-consuming as they are varied.

Of the 29 million or so tips collected by the national clearinghouse last year, about 2 million were traced back to users in the U.S., and 1,236 were forwarded to the Maine State Police’s Computer Crimes Lab for further investigation. Through last week, Maine investigators had already received 600 tips this year, putting them on pace for about 2,300.

GIVE INVESTIGATORS WHAT THEY NEED


Despite an agreement that brings more investigators into the fold, they are having trouble keeping up, with a backlog of 95 cases waiting to be assigned.

Each of those cases has to be sorted through by hand by detectives and analysts. Not only are the investigators challenged by the sheer number of cases, but each case is itself grinding, subjecting them to image after image that they’ll later try, often in vain, to forget.

In recognition of their work, lawmakers last year voted to allow some civilian analysts in the computer crimes lab to receive better retirement benefits, and will vote soon on whether the rest can too.

Those analysts should have those benefits. They earn them, as do the rest of the investigators.

The people who shoulder for the rest of us the burden of investigating these horrible crimes should have whatever resources they need. They should be able to pursue these cases with speed and precision while maintaining their own well-being.

The survivors of these crimes deserve nothing less.
