Google, Microsoft Block Child Pornography and Abuse Images from Search Functions
Bad news for pedophiles seems like good news for the rest of us: search engines Google and Microsoft are joining forces to make it significantly harder to find child pornography and abuse images online. Now, if you type "child pornography" into Google search, the results will be quite different. Instead of images, you'll see a Google statement about protecting kids from sexual abuse, warnings that child abuse imagery is illegal, and links to sources of help if you think you have a problem with child porn (hint: if you're Googling it for funsies, you might). Other search results will now mainly feature news clippings referring to child pornography.
The two competing internet giants have come together to work on the issue of child pornography, with Microsoft's Bing search engine enacting a similar policy as Google's search. Microsoft already has a system in place which can give a photo a unique "fingerprint" that allows it to be tracked as it travels across the web, and now Google's VideoID does the same thing with videos. Both companies are planning on giving the technology to the National Crime Agency, among other organizations, to help with tracing the proliferators of inappropriate content.
"Day-to-day we're fierce competitors, and we collaborate on this issue because it transcends that. It will be much harder to find that content on both Bing and Google. We are blocking content, removing content and helping people to find the right content or also sources of help should they need that," said Microsoft's general manager of marketing and operations.
The steps come mainly in response to British Prime Minister David Cameron's demand back in July that the search engines do more to stop people from getting access to sketchy web content — including ensuring that searches bring up no results if the terms are unambiguously illegal.
On Monday, Cameron praised the decision as "significant progress," saying it was a big turnaround from the companies' initial stance that changes "couldn't be done, shouldn't be done." He also said that Downing Street would be following up and monitoring the internet giants to make sure the changes come about "urgently." The restrictions will begin in the UK, and will then expand to include 158 other languages over the next six months.
But there are those who think the changes won't do much to help protect children from pedophiles — figures suggest that only one in 15 people caught viewing pedophiliac content on the internet is actually arrested.
"They don't go on to Google to search for images. They go on to the dark corners of the internet on peer-to-peer websites," the former head of the Child Exploitation and Online Protection Centre (Ceop) told BBC Breakfast. "A better solution would be to spend £1.5m on hiring 12 child protection experts and 12 coordinators in each of the police regions to hunt down online predators."
A police officer who specializes in investigating child abuse echoed the concern, telling PCPro "I simply do not see people using Google, etc to search for child abuse. It's too risky for them."
"We need more staff. We have a nine-month backlog — that's not fair to victims," he added.
But what about free speech? Some point out that there's a slippery slope from stopping pedophiles from accessing illegal content into straightforward censorship. As Olly Lennard of the Huffington Post points out:
As much as the idea of censorship and tracing evokes images of an Orwellian dystopia, maybe it's alright that we don't have unfiltered access to kiddie porn.
After all, these changes come only a few days after police caught a Canadian man accused of running one of the largest child pornography rings ever. The 45 terabytes of his child pornography included hundreds of thousands of images depicting “horrific acts of sexual abuse — some of the worst [officers] have ever seen,” according to a Toronto sex crimes investigator.