
Google and Differential Privacy – RAPPOR

Posted by Alexander Hanff on November 8, 2014.

There has been a great deal of press coverage this week about Google’s announcement at CCS 2014 that they are working on a new project called RAPPOR (PDF), which uses randomized response, a survey technique dating back to the 1960s, to provide differential privacy guarantees.

This would be good news if it meant Google were becoming more ethical with regard to privacy, but before we go on let’s make one thing very clear: this technique is used only for very limited purposes, such as the example they give of whether or not a Chrome user has enabled “Do Not Track”. It is used purely for macro data analysis, giving a very high-level summary of a simple data set. Google will not be choosing differential privacy for all, or even a large proportion, of the data analysis they do, and here is why.
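To make that concrete before getting to the why, here is a minimal sketch of classic randomized response applied to a single boolean such as the Do Not Track flag. This is my own illustration with a made-up parameter name (p_truth), not Google’s implementation and not the full RAPPOR mechanism, which layers Bloom-filter encoding and a permanent randomization step on top of the same basic idea.

```python
import random

def randomized_report(true_value: bool, p_truth: float = 0.75) -> bool:
    """Classic randomized response for one boolean (e.g. "is DNT enabled?").

    With probability p_truth the client reports its real setting; otherwise
    it reports the outcome of a fair coin flip. No single report reveals the
    true value with certainty, which is what provides the plausible
    deniability behind the differential privacy guarantee.
    """
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5
```

Each individual report is deniable, yet across millions of reports the overall rate of the setting can still be estimated accurately, which is exactly the “macro” analysis described above. Now, back to why Google will not be applying this more broadly.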

Google are not your friend, they are not your buddy and you are not their customer. You are their product: you are what they sell, and you are the target of their advertisements. This is a very important fact to understand when considering any news about privacy enhancements from any of the Big Data corporations. They will talk about anonymisation, de-linking data and privacy as if they really care, because they know most people simply do not understand 99% of what is actually happening to their data and their behavioural profiles. The reality is that the value of the data they collect about you is directly proportional to how well they can identify you from it; in other words, the less identifiable you are from your data, the less that data is worth to the likes of Google.

Google are a giant advertising agency. They use their products and services to track your behaviour online (and increasingly offline), build a profile about you and then sell that profile to the highest bidder (and to governments) across the world. They will never be privacy friendly; they cannot be privacy friendly, because their entire raison d’être is to sell you, and to sell you they have to be able to identify you. In effect, Google are and always will be the anti-privacy (think anti-christ, but in relation to privacy), and no research paper will ever change that.

Is differential privacy good for high-level analysis of simple data sets? Yes. Are Google suddenly your best friend, working to develop new privacy-enhancing data analytics to protect you from their data-juicing machine? Absolutely not. They did not even invent the underlying technique: randomized response, which RAPPOR builds on, is a survey method over half a century old, and differential privacy itself was formalised by researchers outside Google.
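On the analysis side, recovering the aggregate statistic from those noisy reports is simple arithmetic, which is why the technique suits high-level questions over simple data sets. Here is a sketch under the same assumptions as the earlier example (the analyst knows p_truth); it is not the estimator from the RAPPOR paper, which also has to decode Bloom-filter bits.

```python
def estimate_true_rate(reports: list[bool], p_truth: float = 0.75) -> float:
    """Unbias the observed proportion of True reports.

    Each report is truthful with probability p_truth and a fair coin flip
    otherwise, so:
        observed = p_truth * true_rate + (1 - p_truth) * 0.5
    Solving for true_rate gives the estimator below. It is accurate in
    aggregate even though every individual report is noisy.
    """
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth
```

The estimate only becomes meaningful once a large number of reports has been collected, which is another reason this approach is limited to population-level summaries rather than anything about you individually.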

Don’t fall for the prestidigitation: you are still very much the target of Google’s global privacy conspiracy. They absolutely know who you are, where you are and what you looked at online five minutes ago and, what’s more, they are already selling that data thousands of times an hour.


Submitted in: Alexander Hanff, Expert Views, News_privacy