Over the years, I’ve often tried to persuade clients that the time will come when privacy will be part of the upfront consumer proposition rather than a back office hygiene factor.
Privacy should be an integral part of the customer proposition that sways the choice of product or service.
This means that organisations should plan for investments in more sophisticated security infrastructure (you can’t have privacy without security) and that these should be on a roadmap that exploits this transition. I think we may be getting closer to this transition time, because I notice that Apple appears to be taking quite a big step forward to improve the privacy of individuals in a networked, hyper-connected world by introducing “differential privacy” in its products.
Differential privacy provides a way to mathematically guarantee that statistics about a pool of data collected from many people can’t be used to reveal much about the contribution of any one individual. Apple has built it into the new version of its mobile operating system, iOS, for iPhones and iPads to be released this fall.
From “Apple’s New Privacy Technology May Pressure Competitors to Better Protect Our Data”
If you’re wondering what this means, and can’t understand the Wikipedia article (I couldn’t), let me give you an example from some software that I wrote many, many years ago. I’ll use the example of recreational drug use, although this isn’t what the project I worked on was about (well, not during daylight hours, anyway).
Suppose for some reason — e.g., public health planning — the government wants to know how many people smoke dope. Imagine that there’s an app on your phone that asks you if you smoke dope. So it asks you “Do you smoke dope?”, and the app sends your answer back to some survey database big data cloud thing. Now the big data cloud thing can tell other people (e.g., the government) that you smoke dope, but that means that the police will know, and if hackers get into the survey database big data cloud thing they could blackmail you (or sell you dope).
But there is another privacy-enhancing way to do this.
The app asks you if you smoke dope. You answer. Then the app tosses a coin. If the coin comes down heads, then the app tells the big data cloud thing “yes”. If the coin comes down tails, then the app tells the big data cloud thing whatever your real answer was.
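This scheme is known as randomized response, and the coin-toss step above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not Apple’s actual mechanism; the function name is mine.

```python
import random

def randomized_response(true_answer: bool) -> bool:
    """Report an answer with plausible deniability.

    A fair coin is tossed: heads means the app reports "yes"
    regardless of the truth; tails means it reports the real answer.
    """
    if random.random() < 0.5:  # heads: forced "yes"
        return True
    return true_answer         # tails: the real answer
```

Note that a genuine “yes” is always safe to send: even if the report is a yes, nobody can tell whether it came from the coin or from you.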
Let’s say 10 million people answer, and the big data cloud thing ends up with seven million yes answers and three million no answers. Because the coin toss is fair, you expect about five million of the reports to be a forced “yes” from the coin coming down heads. Those five million yes answers are there because of random chance, not because of real answers, so you can take them away.
Now you are left with the remaining five million real answers: the two million yes answers and three million no answers that are not down to random chance. You can therefore deduce that 40% of the population smokes dope.
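The subtraction above can be written as a tiny estimator, again as a sketch rather than anything Apple ships: subtract the expected number of coin-forced yeses (half the reports) and divide by the number of real answers (the other half).

```python
def estimate_yes_rate(reported: list[bool]) -> float:
    """Estimate the true "yes" rate from randomized responses.

    Half the reports are expected to be forced "yes" by the coin;
    the other half are real answers.
    """
    n = len(reported)
    yes_count = sum(reported)
    forced_yes = n / 2          # expected yeses from heads
    real_answers = n / 2        # expected truthful reports from tails
    return (yes_count - forced_yes) / real_answers
```

With the article’s numbers — seven million yeses out of ten million reports — this gives (7M − 5M) / 5M = 0.4, i.e. 40%.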
Now, if hackers or the police get into the database and discover a yes answer next to your phone number, they cannot tell whether it is a real yes or a yes because of the coin toss. And if you don’t want to reveal that you smoke dope, you can say that it’s because of the coin toss.
Thus, the statistics for the population are correct — you know that 40% of the population smokes dope — but you cannot tell whether any individual person smokes dope.
Now, the differential privacy used by Apple is more sophisticated than this simple example, but you get the point, and good on them for taking practical privacy-enhancing action, whether it is to advance the sum total of human privacy or to put pressure on Facebook and Google. Either way, making privacy part of the proposition that might sway the customer’s choice is a very good thing.