TruthMovement: an internet research guide for students and scholars


Tuesday, August 12, 2014

So Google scans email for dodgy images – should we be worried about scanning for sensitive documents? by Paul Bradshaw.




You could be forgiven for not having heard of John Henry Skillern. The 41-year-old is facing charges of possession and promotion of child pornography after Google detected images of child abuse on his Gmail account.

Because of his case we now know that Google “proactively scours hundreds of millions of email accounts” for certain images. The technology has raised some privacy concerns which have been largely brushed aside because, well, it’s child pornography. 

Sky’s technology correspondent Tom Cheshire, for example, doesn’t think it is an invasion of our privacy for “technical and moral reasons”. But should journalists be worried about the wider applications of the technology, and the precedent being set? 


Automated matching 

Part of Cheshire’s technical argument that the software does not represent an invasion of privacy is that it is almost entirely automated. As The Telegraph reported:


“It is understood that the software works by comparing images held in users’ accounts against a vast database of child abuse images which have been collated by child protection agencies around the world. 

“Each one of the images is given a unique fingerprint, known as a hash, which is then used to compare with those held in the database.” 

When a match is found, humans come into the process: “Trained specialists at organisations examine the image and decide whether to alert the police.” 
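To make the reported mechanism concrete, here is a rough sketch in Python of hash-based matching, assuming an ordinary SHA-256 digest stands in for the “fingerprint” and a small in-memory set stands in for the agencies’ database. The real systems are understood to use perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding, and the file name and hash value below are purely hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints supplied by child-protection agencies.
# Real systems use perceptual hashes rather than plain SHA-256 digests.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(path: Path) -> str:
    """Compute a fingerprint (here, a SHA-256 digest) of an attachment."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_attachment(path: Path) -> bool:
    """True if the attachment's fingerprint matches the database.

    Only hashes are compared; no human looks at the image at this stage.
    """
    return fingerprint(path) in KNOWN_HASHES

if __name__ == "__main__":
    attachment = Path("attachment.jpg")  # hypothetical attachment path
    if attachment.exists() and scan_attachment(attachment):
        print("Match found - escalate to trained human reviewers")
```

In a real deployment the match would be handed to the “trained specialists” mentioned above rather than printed to a console, but the core step is the same: a lookup of a fingerprint in a database someone else compiled.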

But it’s not too big a leap of the imagination to see the same technology being used to match documents held in users’ accounts against a database of documents the authorities don’t want made public (on the basis of ‘national security’). Or even images the police don’t want distributed.

And if that technology was employed, it is much less likely that its use would be made public in a court case in the same way as Skillern’s. 

This ‘feature creep’ has been seen before in both technologies and laws. The Regulation of Investigatory Powers Act (RIPA), for example, was intended to allow surveillance related to terrorism or serious crime, but authorities used it for purposes including “spying on garden centres for selling pot plants; snooping on staff for using work showers or monitoring shops for unlicensed parrots.” 

Who controls the database controls what gets flagged

In the description given above, Google is entirely reliant on whoever compiles the database, and on whoever they pass those images on to.

However noble the stated purpose, this is state surveillance, with the notable quirk that those conducting the surveillance are ‘blind’. 

As Cheshire reports: “No humans are looking at images, which would be illegal. Nor does Google store child abuse images itself, which would also be illegal.” 

So if a government whistleblower was trying to share documents, their employers could be notified without anyone else knowing.

If a journalist passed on sensitive documents to a colleague, a ‘red flag’ would be raised in a government office.

Where protestors shared images of police brutality, those images could be used to identify all of the recipients, including any reporters.


Google says it is not looking for other crimes at the moment, but it’s safe to say any extension of the technology, if introduced, would be operating without users knowing for some time. 

On that basis journalists should assume that documents and images cannot be safely shared using Gmail – our account or any source’s. 

Encryption, suggested by Cheshire, is not going to be a practical option for most sources. At the very least we should switch to a different email service ourselves and recommend that documents are shared using old-fashioned post.

In the meantime, we need to talk about the oversight for systems of mass warrantless surveillance and the implications that such systems have for freedom of speech. 

Google may be a commercial organisation, but in these situations it is acting as an agent of the state, and should be subject to the same checks and balances.

Hysteria over child sexuality

Frank Gillice
The hysteria over child sexuality has persisted for so long because those who profit from the hysteria are allowed to frame the debate. The Child Abuse Industry is a billion-dollar-a-year business with its own law enforcement arm, well-paid CEOs and staff who create Gothic melodramas, monster stories of child molesting, playing them out on TV news and in newsprint every day to keep people scared and keep the donations and government grants coming. The people are given these Gothic melodramas not just as titillation but with assurances that their tellers are our righteous protectors. They demonize child porn to connect you to their pedophile drama while pretending to shut it down.

The majority of child porn on the net, so I am told, is selfies: children taking pictures and videos of themselves. Aficionados and vice cops also concede that practically all the sexually explicit images of children circulating on the net are the same stack of yellowing pages found at the back of those X-rated shops, only digitized. These pictures tend to be twenty to fifty years old, made overseas, badly reproduced, and for the most part pretty chaste. That is why federal agents never show journalists the contraband.

What law enforcement and private nonprofits are actually using is hash databases. These databases do not contain any images; they comprise millions of hashes, which are merely fingerprints of images. There are several reasons why the database contains only hashes and no images (a rough sketch of the idea follows the list):
1. Size. Image files are large; it would be nearly impossible to create a database large enough to house the billions of images on the Internet. Hash values, on the other hand, are extremely small.
2. Speed. It is much faster to compare hashes than entire image files, so by referencing only the hash values law enforcement can quickly scan a computer and compare its images to the IFID.
3. Possession of child pornography is a federal crime, so storing the images themselves would be illegal.
4. We want to help people keep pornography off their computers, not become the single largest repository of pornography.
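As a minimal sketch of the points above, assuming a plain SHA-256 digest stands in for whatever fingerprint these agencies actually use: the database stores only short fixed-length strings, never image bytes, and checking a file is a fast set lookup rather than an image-by-image comparison.

```python
import hashlib

# Hypothetical hash database: 64-character hex digests only, never image bytes.
hash_db: set[str] = set()

def add_known_image(image_bytes: bytes) -> None:
    """Record only the fingerprint of a known image; the bytes are discarded."""
    hash_db.add(hashlib.sha256(image_bytes).hexdigest())

def is_flagged(image_bytes: bytes) -> bool:
    """A constant-time set lookup - far cheaper than comparing whole files."""
    return hashlib.sha256(image_bytes).hexdigest() in hash_db

# A multi-megabyte image reduces to a 64-character record, so millions of
# entries fit comfortably in memory.
add_known_image(b"\x00" * 5_000_000)        # stand-in for a ~5 MB image
print(len(next(iter(hash_db))))             # 64
print(is_flagged(b"\x00" * 5_000_000))      # True
print(is_flagged(b"some other image"))      # False
```

Note also that nothing about the original picture can be recovered from the stored digest, which is the point of the fingerprint analogy in the next paragraph.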

However, you cannot take the hash/fingerprint of an image and determine what the picture is, in the same way that you cannot determine what someone looks like merely by looking at their fingerprints.

What law enforcement, private independent corporations, the entertainment industry and other nefarious individuals are doing is sending out these collected hash values, without content, to P2P file sharers using popular file search parameters in an attempt to circumvent file-sharing copyright issues. So right now I suspect the FBI and ICE are setting up people like Luke Rudkowski, Stewart Rhodes, Dan Johnson, Madison Ruppert, and any other individuals in the liberty/truth movement, as well as P2P file sharers. I also suspect the majority of all child porn raids and busts have been fabricated or have been somewhat of a set-up or entrapment.

We need a congressional investigation into the actions of the Internet Crimes Against Children task force, including the National Center for Missing and Exploited Children (NCMEC), to find out what is really going on here. We need to find out their methods and techniques, and whether they are really targeting only people they have reasonable suspicion (for investigation) and probable cause (for search warrants) to suspect are pedophiles. During the fight to pass the infamous “$700 Billion Bailout” plan (a.k.a. HR 1424), Congress added around $150 billion in new spending or “pork” projects, as well as $300 million over the next five years to finance the current child porn witch hunt. Spending money this nation doesn’t have, only for it to be paid back by our children’s children, is child abuse.