Can Google Stamp Out Child Porn On The Internet For Good?

Google is, for better or worse, a repository of many things. Among those things are sites that can direct you to child pornography. And Google has officially decided to put an end to it, once and for all.

Google’s plan to drive child porn off the Internet is, unsurprisingly, a very Google way of doing it. Flagged images will be hashed, tagged, and taken down:

The new system will work by sharing data on images which have been identified as illegal and then flagged, or “hashed”, using software originally created in 2008. The lack of an industry standard means data on images earmarked in this way is difficult to share, and therefore hard to eradicate completely.

As we all know, removing an image from the Internet completely is like trying to scrape the last of the peanut butter out of the jar with a knife. But an industry standard means, first, that this stuff will be much, much harder to find, and second, that it will be much easier to track.
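To make the idea concrete, here’s a minimal sketch of what hash-list matching looks like, assuming a plain cryptographic hash and a hypothetical shared list (the names below are illustrative, not Google’s). Real deployments use perceptual hashes, Microsoft’s PhotoDNA being the best-known example, so that a resized or re-encoded copy of a flagged image still matches; this simplified version only catches byte-identical files.

```python
import hashlib

# Hypothetical shared database of fingerprints for known illegal images.
# In a real deployment this list would be maintained and distributed by
# organizations like NCMEC or the IWF; here it's just an in-memory set.
KNOWN_HASHES: set[str] = set()


def image_fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image.

    This sketch uses SHA-256, which only matches exact copies. Production
    systems use perceptual hashing so slightly altered copies still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def is_flagged(image_bytes: bytes) -> bool:
    """Check a crawled or uploaded image against the shared hash list."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES
```

The lookup itself is the easy part; the hard part is getting every company to fingerprint images the same way so their lists can actually be shared, which is exactly the standard Google is pushing for.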

There are, however, some technical problems here. For obvious reasons, many sites that traffic in horrific images of child abuse aren’t terribly eager to be indexed by Google, where law enforcement and angry mobs can find them, so Google can only scrub what it actually indexes; it can’t remove the images completely. Some are also concerned that this is more of a bandage than a real attempt to solve the problem, although realistically there’s little, if anything, Google can do beyond what it’s already doing. Finally, some worry that the same system could later be turned on images governments merely find “objectionable”, rather than on a genuine social ill.

All that said, it’s a little hard to argue that we shouldn’t be trying to stamp out child porn. Maybe we won’t succeed completely, but if nothing else, making the images harder to find is not a bad thing. Although, again, we’re just saying an angry mob with torches can be a useful deterrent, Google. Think about it.
