At the (sadly, virtual) Fintech South event this year, I was asked to chair a discussion on identity and privacy with three extremely well-qualified experts who had informed perspectives on the state of, and trends in, those important pillars of a digital society. These were Adam Gunther (SVP, Digital Identity for Equifax), Andrew Gowasack (Co-Founder and President at TrustStamp) and Megan Heinze (President, Financial Institutions, North America for IDEMIA). It was great to talk to a group of people who were not only well-informed on these topics but had some passion for them too.
I won’t go over everything that was discussed, but I do want to pick up on a comment that was made in passing when I was chatting to the panelists: someone said that a guiding principle should be “no scary systems”. Hear hear! But what is a scary system? It is, in my opinion, a system that privileges security over privacy. This is not how we should be designing the identity systems for the 21st century!
It is well understood that there is a fundamental asymmetry between privacy and security when it comes to providing products and services. You can have security without privacy, but it doesn’t work the other way round: you can’t have privacy without security. If you can’t keep your data secure, then it doesn’t matter what your privacy goals or policies are, because none of the data will be private for long. Therefore, we all (by which I mean the public, organisations, governments, law enforcement agencies and so on) want a secure infrastructure. Not all of the stakeholders, however, want a private infrastructure! There are a great many people who have very good reasons for wanting to have access to data. The police, to choose an obvious example, might want access to phone messages. It is a topic of some complexity, and well beyond the bounds of our panel, to discuss under what circumstances or by what mechanisms they might obtain those messages, but I think we can all agree that the messages should be secure to the extent compatible with a democratic society.
No scary systems! No digital identity infrastructure should involve any sort of trade-off between privacy and security: we (ie, the industry) should be perfectly capable of delivering both. This is why I found the discussion with leading organisations in the field so interesting: we all agreed, from our different perspectives, that this is a reasonable goal. And we know how to do it. Designing an identity infrastructure that is founded on credentials rather than identity (what some people refer to as a reputation economy) is not only feasible but highly desirable.
Suppose that the vision for national identity (based on the concepts of social graph, mobile authentication, pseudonyms and so on) focused on entitlements rather than on either the transport mechanism or biographical details? Then, as a user of the scheme, I might have an entitlement to (for example) access health care, enter a bar or read the Wall Street Journal online. I might have these entitlements on my phone (so that’s the overwhelming majority of the population taken care of) or stored somewhere safe (eg, in my bank) or out on a blockchain somewhere. Remember, these entitlements would attest to my ability to do something: they would prove that I am entitled to do something (see a doctor, drink in the pub, read about people who are richer than me), not who I am. They are about entitlement, not identity as a proxy for entitlement.
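To make the idea concrete, here is a minimal sketch of what a credential of this kind might look like in code. It is purely illustrative: the issuer key, the pseudonym format and the entitlement names are all my own invention, and a real scheme would use asymmetric digital signatures (or selective-disclosure cryptography) rather than the shared-secret HMAC used here for brevity. The point is simply that the verifier learns that a trusted issuer vouches for the entitlement, and nothing about who the holder is.

```python
import hashlib
import hmac
import json

# Hypothetical issuer key for illustration only. A real scheme would
# use an asymmetric signing key (e.g. Ed25519), never a shared secret.
ISSUER_KEY = b"demo-issuer-secret"


def issue_entitlement(pseudonym: str, entitlement: str) -> dict:
    """Issuer attests that the holder of `pseudonym` has `entitlement`.

    Note what is absent: no name, address or date of birth.
    """
    claim = {"sub": pseudonym, "ent": entitlement}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": tag}


def verify_entitlement(credential: dict, required: str) -> bool:
    """Verifier checks the issuer's attestation for the required
    entitlement -- it never learns who the subject actually is."""
    payload = json.dumps(credential["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, credential["sig"])
            and credential["claim"]["ent"] == required)


# The bar's door scanner checks "over-18" against a pseudonymous credential.
cred = issue_entitlement("pseudonym-7f3a", "over-18")
print(verify_entitlement(cred, "over-18"))   # True
print(verify_entitlement(cred, "read-wsj"))  # False
```

The design choice worth noticing is that the claim carries a pseudonym, not an identity: the pub learns I may drink, the newspaper learns I may read, and neither learns who I am.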
Megan knows how to secure the cryptographic keys needed to make this all work, Adam knows how to attach reputations to them and Andrew knows how to authenticate their use. So I was very happy to take part in a discussion that gave the audience (and me) a good dose of optimism about where we might go next with digital identity.