CBDCs are everywhere – and nowhere. Everyone is discussing them, but almost no one is actually deploying them. This is partly due to the early-stage thinking still going into working out what is actually required, but it is also due to the tricky business of working out how they would be implemented. Developing a retail payment solution is a lot harder than creating a Central Bank backed payment instrument.
For Safer Internet Day, I thought I’d bring a Mediterranean theme. As a classicist, I frequently switch between ancient and modern, applying time-tested principles to emerging technologies. Plato had it right on data protection: the price of not participating in public life is to be ruled by less able men.
Have you noticed that some of the best attended events at conferences recently are the investment panels, populated by canny investors talking about where they are currently placing their funds? And so it was with Consult Hyperion’s recent webinar The Role of Due Diligence in Investment Cycles, featuring Jonathan Luff, Co-Founder of CyLon, Europe’s leading investor in pre-seed and seed stage cyber and security technology startups; Howard Hall, Managing Director of Consult Hyperion North America; and Gary Munro, Technical Director of Consult Hyperion, with Dave Birch, our Global Ambassador, moderating the discussion.
In our Live 5 for 2021, we said that governance would be a major topic for digital identity this year. Nowhere has this been more true than in the UK, where the government has been diligently working with a wide set of stakeholders to develop its digital identity and attribute trust framework – the rules of the road for digital identity in the UK. The work continues, but with the publication of the second iteration of the framework I thought it would be helpful to focus on one particular aspect – how the framework might apply to decentralised identity, given that this is the direction of travel in the industry.
I was delighted to be asked to present a keynote at the FIDO Authenticate Summit and chose to focus on digital identity governance, which is something of a hot topic at the moment. Little did I know that the day before my session was recorded the European Commission would propose a monumental change to eIDAS, the European Union’s digital identity framework – one of the main examples I was planning to refer to. I hastily skimmed the proposed new regulation before the recording but have since had the time to take a more detailed look.
Today marks the 10th anniversary of Safer Internet Day in the UK. Each year industry, educators, regulators, health and social care workers and parents rally to raise awareness and put into action plans to tackle findings from significant research on the topic of trust and safety on the internet. This year, one of the research pieces talks of the challenge ‘An Internet Young People Can Trust’. As a mum of two school-age children, I am sat here wondering if the internet will ever be safe … for them or me.
If I think about life BC (before COVID), my eldest used social media for broadcast communications to her friends. She was guided on the appropriateness of certain apps, and our acid test on the content she was posting was always ‘would you go up to a stranger in the street and give him your name, age, location and a photo of you in a bikini?’ … her reaction was always ‘err, no’. My youngest had never been online apart from BBC Bitesize for homework assignments. We’re not online gamers, so have never had constant nagging to go online. Additionally, you have to remember that the internet (and mobile internet) has been significant in my work world since 1990, so I have a heightened understanding of the pitfalls and have seen many fall foul of their online reputation tarnishing their in-person reputation.
What did you think of the US election? I don’t mean the candidates and the outcome. What did you think of the election process? Should it be possible for national elections of this type to be done online? Last week the IET published a paper on internet voting in the UK, led by our good friend at the University of Surrey, Professor Steve Schneider. It’s well worth a read. As the paper explains, internet voting for statutory political elections is a uniquely challenging problem. Firstly, voting systems have exacting requirements; secondly, the stakes are high, with the threat of state-level interference.
I listened with interest to yesterday’s parliamentary committee on the proposed NHSX contact tracing app, which is being trialled on the Isle of Wight from today. You can see the recording here.
Much of the discussion concerned the decision to follow a centralised approach, in contrast to several other countries such as Germany, Switzerland and Ireland. Two key concerns were raised:
1. Can a centralised system be privacy respecting?
Of course the answer to this question is yes, but it depends on how data is collected and stored. Privacy-preserving techniques such as differential privacy are designed to allow data to be de-identified so that it can be analysed anonymously (e.g. for medical research), although there was no suggestion that NHSX is actually doing this.
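To make the idea concrete, here is a minimal sketch of the kind of technique involved: the Laplace mechanism, the basic building block of differential privacy, applied to a simple count query. This is purely illustrative – the function names and parameters are my own, and there is no suggestion that NHSX uses anything like this.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample zero-mean Laplace noise with the given scale (inverse-CDF method)."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(records: list, predicate, epsilon: float = 1.0) -> float:
    """Return a differentially private count of records matching predicate.

    A count query has sensitivity 1 (adding or removing one person changes
    the true answer by at most 1), so Laplace noise with scale 1/epsilon
    gives epsilon-differential privacy for the released figure.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

The point is that an analyst querying the noised counts can study the population while the noise masks any one individual’s presence in the data – the smaller the epsilon, the stronger the guarantee and the noisier the answer.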
The precise details of the NHSX app are not clear at this stage, but it seems that the approach will involve identifiers being shared between mobile devices when they come into close proximity. These identifiers will then be uploaded to a central service, both to support studying the epidemiology of COVID-19 and to facilitate notifying people who may be at risk, having been in close proximity to an infected person. Whilst the stated intention is for those identifiers to be anonymous, the parliamentary debate clearly showed there are a number of ways in which the identifiers could become more identifiable over time. Because the identifiers are persistent, they are likely to be pseudonymous at best.
By way of contrast, a large team of academics has developed an approach called DP-3T, which has apparently influenced designs in Germany and elsewhere. It uses ephemeral (short-lived) identifiers. The approach is not fully decentralised, however. When a user reports that they have COVID-19 symptoms, the identifiers that user’s device has been broadcasting when in close proximity to other devices are shared via a centralised service. In fact, they are broadcast to every device in the system, so that risk decisioning is made at the edges, not in the middle. This means that no central database of identifiers is needed (but presumably there will be a database of registered devices).
It also means there will be less scope for epidemiological research.
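For the curious, the core idea can be sketched in a few lines of Python. This is loosely modelled on the DP-3T design rather than taken from its specification: the key derivation, epoch labelling and identifier length below are illustrative assumptions, not the protocol’s actual parameters.

```python
import hashlib
import hmac
import os

EPOCHS_PER_DAY = 96  # assume one ephemeral ID per 15-minute epoch

def new_daily_key() -> bytes:
    """Each device generates a fresh secret key for the day."""
    return os.urandom(32)

def next_daily_key(key: bytes) -> bytes:
    """Derive tomorrow's key from today's by hashing: a one-way ratchet,
    so revealing one day's key does not reveal earlier days."""
    return hashlib.sha256(key).digest()

def ephemeral_ids(daily_key: bytes) -> list:
    """Derive the day's short-lived broadcast identifiers from the daily key."""
    return [
        hmac.new(daily_key, b"epoch-%d" % i, hashlib.sha256).digest()[:16]
        for i in range(EPOCHS_PER_DAY)
    ]

def at_risk(observed_ids: set, infected_daily_keys: list) -> bool:
    """Risk decisioning at the edge: the device re-derives the ephemeral IDs
    of reported-infected users and checks them against what it overheard."""
    for key in infected_daily_keys:
        if any(eid in observed_ids for eid in ephemeral_ids(key)):
            return True
    return False
```

Note what the central service does and does not see in this model: it relays the daily keys of people who report symptoms, but the matching against overheard identifiers happens entirely on each handset.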
All of this is way beyond the understanding of most people, including those tasked with providing parliamentary scrutiny. So how can the average person on the street, or the average peer in Westminster, be confident in the NHSX app? Well, apparently the NHSX app is going to be open sourced, and that is probably our greatest protection. It will mean you won’t need to rely on what NHSX says, because inevitably there will be universities, hackers, enthusiasts and others lining up to pick it apart.
2. Can a centralised system interoperate with the decentralised systems in other countries to allow cross border contact tracing?
It seems to us that “centralised versus decentralised” is a gross simplification of the potential interoperability issues. True, the primary issue does seem to be the way that identifiers are generated, shared and used in risk decisioning. But for cross-border contact tracing to be possible there will need to be alignment on a whole range of other things, including technical standards, legal requirements and perhaps even, dare I say it, liability. Of course, if the DP-3T model is adopted by many countries then it could become the de facto standard, in which case the NHSX app could be left isolated.
Will the NHSX app be an effective tool to help us get back to normal? This will depend entirely on how widely it is adopted, which in turn will require people to see that the benefits outweigh the costs. That’s a value exchange calculation that most people will not be able to make. How can they make a value judgment on the potential risks to their civil liberties of such a system? The average user is probably more likely to notice the impact on their phone’s battery life or when their Bluetooth headphones stop working.
The opening keynote at Identity Week in London was given by Oliver Dowden, the Minister for Implementation at the Cabinet Office and therefore the person in charge of the digital transformation of government. At Consult Hyperion we think digital identity is central to the digital transformation of government (and the digital transformation of everything else, for that matter), so I was looking forward to hearing the UK government’s vision for digital identity. I accompanied the Minister on his visit to the IDEMIA stand, where he was shown a range of attractive burgundy passports.
In his keynote, the Minister said that the UK is seen as being at the cutting edge of digital identity and that GOV.UK Verify is at the heart of that success.
(For foreign visitors, perhaps unfamiliar with this cutting edge position, a spirit of transparency requires me to note that back on 9th October 2018, Mr. Dowden gave written statement HCWS978 to Parliament, announcing that the government would stop funding Verify after 18 months, with the private sector responsible for funding after that.)
Given that the government spends around £1.5 billion per annum on “identity, fraud, error, debt, how much identity costs to validate, and how much proprietary hardware and software bought”, it’s obviously important for them to set an effective strategy. Now, members of the public, who don’t really know or care about digital ID, might be saying to themselves, “why can’t we just use ‘sign in with Apple’ to do our taxes?”, and this is a good point. Even if they are not saying it right now, they’ll be saying it soon, as they get used to Apple’s mandate that all apps that allow third-party sign-in must support it.
Right now you can’t use a GOV.UK Verify Identity Provider to log into your bank or any other private sector service provider. But in his speech the Minister said that he looks forward to a time when people can use a single login to “access their state pension and the savings account” and I have to say I agree with him. Obviously you’d want a different single login for gambling and pornography, but that’s already taken care of as, according to Sky News, “thanks to its ill-conceived porn block, the government has quietly blundered into the creation of a digital passport – then outsourced its development to private firms, without setting clear limits on how it is to be used”. One of these firms runs the world’s largest pornography site, Pornhub, so I imagine they know a thing or two about population-scale identity management.
Back to the Minister’s point though. Yes, it would be nice to have some sort of ID app on my phone and it would be great if my bank and the HMRC and Woking Council and LinkedIn would all let me log in with this ID. The interesting question is how you get to this login. Put a PIN in that and we’ll come back to it later.
The Minister made three substantive points in the speech. He talked about:
- The creation of a new Digital Identity Unit, a collaboration between DCMS and the Cabinet Office. The Unit will help foster co-operation between the public and private sectors, ensure the adoption of interoperable standards, specifications and schemes, and deliver on the outcome of the consultation.
- A consultation to be issued in the coming weeks on how to deliver the effective organisation of the digital identity market. Through this consultation the government will work with industry, particularly with sectors who have frequent user identity interactions, to ensure interoperable ‘rules of the road’ for identity.
- The start of engagement on the commercial framework for consuming digital identities from the private sector for the period from April 2020 to ensure the continued delivery of public services. The Government Digital Service will continue to ensure alignment of commercial models that are adopted by the developing identity market to build a flourishing ecosystem that delivers value for everyone.
The Minister was taken away on urgent business and therefore unable to stay for my speech, in which I suggested that the idea of a general-purpose digital identity might be quite a big bite to take at the problem. So it would make sense to look at who else might provide the “digital identities from the private sector” used for the delivery of public services. Assuming the current GOV.UK Verify identities fail to gain traction in the private sector, then I think there are two obvious private sector coalitions that might step in to do this for the government: the big banks and the big techs.
For a variety of reasons, I hope that the big banks are able to come together in response to the comments of Mark Carney, the Governor of the Bank of England, on the necessity for digital identity in the finance sector, and to develop some sort of financial services passport. I made some practical suggestions about this earlier in the year and have continued to discuss the concept with potential stakeholders. I think it stacks up, but we’ll have to see how things develop.
On the other hand, if the banks can’t get it together and the big techs come knocking, they are already showing off their solutions. I’ll readily admit that when the Minister first said “private sector identities”, the first thought to flash across my brain was “Apple”. But I wouldn’t be at all surprised to go over to the HMRC web site fairly soon to find a “log in with Amazon” and a “log in with Apple” next to a button with some incomprehensible waffle about eIDAS that I, and I’m sure most other normal consumers, will simply ignore.
How do you use Apple ID to log into the Inland Revenue? Easy: you log in as you do now (after sending off for the password, waiting for it to come in the post and that sort of thing) and then, once you are connected, tell them the Apple ID that you want to use in future. If you want to be “email@example.com” or whatever, it doesn’t matter: it’s just an identifier for the Revenue to recognise you by in the future. Then, next time you go to the Inland Revenue, you log in as firstname.lastname@example.org, something pops up on your iPhone, you put your thumb on it or look at it, and bingo, you’re logged in to fill out your PAYE.
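That linking step is easy to sketch. The class below is a hypothetical illustration (all names are invented; no real Apple or HMRC API is involved): the relying party never learns a password from the identity provider, it simply stores a mapping from an externally asserted (issuer, subject) pair to its own account record, so that later logins need only a verified assertion from that issuer.

```python
from typing import Optional

class RelyingParty:
    """Hypothetical relying party (a stand-in for a tax service) that lets
    users link an external login to an existing local account."""

    def __init__(self):
        self.accounts = {}        # local account id -> account data
        self.external_links = {}  # (issuer, subject) -> local account id

    def link(self, local_account_id: str, issuer: str, subject: str) -> None:
        """Called once, after the user has logged in the old-fashioned way,
        to record which external identity maps to this account."""
        self.external_links[(issuer, subject)] = local_account_id

    def login(self, issuer: str, subject: str) -> Optional[str]:
        """On later visits, a verified assertion of (issuer, subject) is
        enough; returns the local account id, or None if unlinked."""
        return self.external_links.get((issuer, subject))
```

In a real deployment the (issuer, subject) pair would come from a cryptographically verified token rather than being passed in directly, but the account-linking logic is essentially this simple.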
How time flies, GDPR has just had its first birthday!
This past year you will have been inundated with articles and blogs about GDPR and its impact on consumers and businesses alike. According to the UK’s Information Commissioner, Elizabeth Denham, GDPR and its UK implementation, the Data Protection Act (DPA) 2018, have marked a “seismic shift in privacy and information rights”. Individuals are now more aware of their information rights and haven’t been shy about exercising them. In the UK, the ICO received around 14,000 personal data breach reports and over 41,000 data protection concerns from the public between 25 May 2018 and 1 May 2019, compared to around 3,300 breach reports and 21,000 data protection concerns in the preceding year. Beyond Europe, the regulation has had a remarkable influence on other jurisdictions, which have either enacted or are in the process of enacting a ‘GDPR equivalent’ law – something similar is underway in Brazil, Australia, California, Japan and South Korea.
For all the good intentions of GDPR, some of its provisions contradict other, equally well-intentioned EU laws. Bank secrecy laws, on the one hand, require that customers’ personal data be protected and used only for the intended purpose(s), except where otherwise consented to by the customer. AMLD4/5, on the other hand, requires that identifying personal data in ‘suspicious transactions’ be passed on to the appropriate national authorities (without, of course, the customer’s consent or knowledge). Then PSD2 requires banks to open up customers’ data to authorised Third Party Providers (TPPs), subject to obtaining the customer’s consent. One issue that arises out of this is the seeming incongruity between the ‘explicit consent’ of PSD2’s Article 94 and GDPR’s (explicit) consent.
Under GDPR, consent is one of the lawful bases for processing personal data, subject to the strict requirements for obtaining, recording, and managing it, otherwise it’s deemed invalid. In some cases, a lack of good understanding of these rules has resulted in poor practices around consent processing. That is why organisations like the Kantara Initiative are leading the effort in developing specifications for ‘User Managed Access’ and ‘Consent Receipt’.
In addition, EU regulators have been weighing in to clarify some of the conundrums. For example, the Dutch DPA issued guidance on the interplay of PSD2 and GDPR, which shows that there is no straightforward answer to what might seem like a relatively simple question. The EDPB has also published an opinion on the interplay between GDPR and the slowly but surely evolving ePrivacy regulation. Suffice to say, correctly navigating the compliance requirements of all these laws is indeed challenging, but possible.
What will the second year of GDPR bring?
While regulators are keen to enforce the law, their priority is transparent co-operation, not penalties. The ICO has provided support tools and guidance, including a dedicated helpline and chat services to support SMEs. It is also in the process of “establishing a one-stop shop for SMEs, drawing together the expertise from across our regulatory teams to help us better support those organisations without the capacity or obligation to maintain dedicated in-house compliance resources.” However, for those who still choose to ‘wilfully or negligently break the law’, GDPR’s administrative fines may help to focus the mind on what is at stake, in addition to the ‘cleaning up’ costs afterwards. Supervisory Authorities will need time and resources to investigate and clear the backlog resulting from the EU-wide increase in information rights queries and complaints over the past year. The UK’s ICO and its Dutch and Norwegian counterparts are collaborating to harmonise their approaches and establish a “matrix” for calculating fines. France’s CNIL led the way with the $57 million Google fine earlier in the year; the ICO has confirmed that fines are coming for “a couple of very large cases that are in the pipeline”, and the Irish DPC also expects to levy “substantial” fines this summer.
A new but important principle in GDPR is the ‘accountability principle’, which states that the data controller is responsible for complying with the regulation and must be able to demonstrate compliance. So it is not enough to say ‘we have it’; you must be able to produce ‘appropriate evidence’ on demand to back it up. The ICO states in its ‘GDPR – one year on’ blog that “the focus for the second year of the GDPR must be beyond baseline compliance – organisations need to shift their focus to accountability with a real evidenced understanding of the risks to individuals in the way they process data and how those risks should be mitigated.” By now one would expect most organisations to have put in the effort required, beyond ticking boxes, to achieve an appropriate level of compliance with the regulation, so they can reap the reward of continued business growth borne of their customers’ trust and loyalty.
One of the methods of demonstrating GDPR accountability is a Data Protection Impact Assessment (DPIA) – a process by which organisations can systematically analyse, identify and minimise the data protection risks of a project or plan before going live. GDPR does not mandate a specific DPIA process, but it expects whatever methodology the data controller chooses to meet the requirements of Article 35(7).
At Consult Hyperion, we have a long track record of thinking about the risks associated with transactional data, so much so that we published, and continue to use, our own Structured Risk Analysis (SRA) methodology. Our approach, in response to the needs of our customers, has always been to describe technological risks in a language that allows the business owner, who ultimately owns the risk, to make a judgement. Building on this, we have developed a business-focused approach to GDPR-compliant DPIAs for the products we design, review or develop for our customers.
If you’re interested in finding out more, please contact: email@example.com