Is your mobile banking app exposed by someone else’s software?

This post was written in collaboration with Neal Michie, Director, Product Management, Verimatrix.

Banks are facing massive disruption and change from many directions. The rise of app-only banks has made compelling app services an imperative for traditional banks. Banks have, of course, been building mobile apps for several years. If they are not already, these apps will soon be the most important channel for engaging with and serving customers. However, mobile banking apps will also become the primary focus of hackers intent on getting access to other people’s information and money.

How have mobile banking apps evolved?

From lightweight apps supporting basic banking functions, they have evolved into full-service branches, including all manner of sophisticated features such as biometric identification, payments, loyalty and personal finance management driven by data science and machine learning.

Some of the things you might expect to find in these feature-rich mobile apps include:

  • Numerous APIs supporting the large number of features – creating a large attack surface through API Abuse.
  • A mixture of native, managed and web code – providing different levels of control over how sensitive data is handled.
  • Dozens of dependencies on third party SDKs and libraries – each outside of the direct control of the bank.
  • Hundreds of thousands of lines of code – making isolating and auditing critical security functionality very difficult.
  • Use of WebViews to render web-code – providing great flexibility but making the app more reliant on external services for its security.

Third-party components are a particular issue

The use of third-party components may expose banks to new security risks. If these risks are not carefully managed and the customer data is compromised, then banks risk regulatory fines and reputation damage, not to mention the inconvenience and worry caused to customers.

To give you an idea of the size of the issue – third-party components can provide all manner of functions such as remote user monitoring, instrumentation, error and crash reporting, profiling, binding the user to the device, analytics, cryptography, UI enhancement, financial charts and many more. Components may be proprietary or open source. They are integrated into the app, where they may gather or process data and connect to third-party backend servers that may or may not be under the control of the bank.

The risk isn’t just theoretical. The well-publicised attack on British Airways breached the airline through known vulnerabilities in third-party code. The business risk: a fine equivalent to 1.9% of revenue and long-term reputational harm – we are still talking about it two years later.

While not exhaustive, some of the main security issues to watch out for include:

  • Auditing: Banks may not have access to the source code of proprietary third-party SDKs and libraries. This makes the task of ensuring that the software is safe much more difficult. Penetration testing techniques can be used to monitor the behaviour of the component, but this is no substitute for a source code review.
  • Deployment model: Often third-party components need to connect to third-party backend services. Sometimes those backend services have to be operated by the third party; other times it may be possible to bring them in-house. Obviously, bringing them in-house gives the bank more control to ensure any sensitive data stays within its own infrastructure. Where this is not possible, the bank may unwittingly introduce a “data processor”, in GDPR terms, into their service without the necessary oversight.
  • Known vulnerabilities: Third-party components may in turn utilise various other proprietary or open source libraries. Some of these dependencies may have known vulnerabilities which leave the bank’s app exposed.
  • Potential information disclosure in transit: Third-party SDKs may not have adequate transport security enabled. For instance, TLS certificate pinning may be absent or weak, making any communication between the mobile app and the backend susceptible to man-in-the-middle attacks. This can potentially leak sensitive information in transit (see the pinning sketch after this list).
  • Potential information disclosure at rest: Third-party SDKs may gather and handle sensitive customer or bank information, and this may be cached in persistent storage without encryption or without being cleared from memory after use. A backup of the device or app can potentially expose sensitive information if not adequately protected.
  • Potential information disclosure due to misconfiguration: Backend service endpoints necessarily allow connections from the mobile apps. If these are misconfigured, they may leak sensitive information. A recent example was the exposure of personal data through misconfigured Google Firebase databases, including email addresses, usernames, passwords, phone numbers, full names, GPS information, IP addresses and street addresses.
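To make the in-transit risk concrete, here is a minimal certificate-pinning sketch in Python. It is illustrative only – the host, port and pinned fingerprint are placeholders – and a production mobile app would normally pin certificates through its platform networking stack rather than hand-rolled code like this.

```python
import hashlib
import socket
import ssl

# Placeholder values for illustration only.
PINNED_HOST = "api.example-bank.com"
PINNED_PORT = 443
# SHA-256 fingerprint of the certificate we expect to see (hex).
PINNED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"


def fetch_server_cert_fingerprint(host: str, port: int) -> str:
    """Connect over TLS and return the SHA-256 fingerprint of the leaf certificate."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest()


def connection_is_pinned(host: str, port: int, expected: str) -> bool:
    """Refuse to proceed unless the presented certificate matches the pin."""
    return fetch_server_cert_fingerprint(host, port) == expected


if __name__ == "__main__":
    if connection_is_pinned(PINNED_HOST, PINNED_PORT, PINNED_SHA256):
        print("Certificate matches the pin - safe to send sensitive data.")
    else:
        print("Pin mismatch - possible man-in-the-middle, abort the request.")
```

In practice pins are usually applied to the subject public key rather than the whole certificate, and a backup pin is kept so that certificates can be rotated without locking customers out.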

How can we mitigate the identified security risks?

  • Risk Assessment: A risk assessment of third-party components will help to determine whether using “black box” components is acceptable. Consult Hyperion’s Structured Risk Assessment (SRA) methodology is designed for just this, ensuring that technical decisions have the right business context.
  • Code review and penetration testing: Source code review and penetration testing should be a standard process in any bank, employed for every release. Banks should go beyond the use of automated scanning and analysis tools; these may help in catching common issues but will not cover everything. For third-party components, the options available include reverse engineering (which is costly) or dynamic testing, where the behaviour of the component and its external communications are monitored in real time as it is used.
  • Tamper detection and integrity protection: Banks should also utilise tamper and integrity protection to protect code and builds – often known as App Shielding or In-App Protection. This should cover the integration of any third-party components where possible – either separately or as a whole. By anchoring third-party libraries and obfuscating their boundaries, vulnerabilities become much harder to find and exploit (a simple integrity-check sketch follows this list). This additional layer of security can safeguard mobile banking apps and provide assurance against reverse engineering and runtime hacks.
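By way of a very rough illustration of the integrity idea (not a substitute for commercial App Shielding, which protects the code itself and resists runtime tampering), the sketch below checks bundled third-party components against known-good hashes at start-up. The file names and manifest are hypothetical.

```python
import hashlib
import sys
from pathlib import Path

# Hypothetical manifest: bundled third-party components and the SHA-256 digest
# of the exact build the bank reviewed and approved.
APPROVED_COMPONENTS = {
    "libs/analytics-sdk.bin": "replace-with-approved-sha256-hex",
    "libs/charting-sdk.bin": "replace-with-approved-sha256-hex",
}


def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(64 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_components(base_dir: Path) -> bool:
    """Fail closed if any approved component is missing or has been altered."""
    for relative_path, approved_digest in APPROVED_COMPONENTS.items():
        candidate = base_dir / relative_path
        if not candidate.exists() or sha256_of(candidate) != approved_digest:
            print(f"Integrity check failed for {relative_path}", file=sys.stderr)
            return False
    return True


if __name__ == "__main__":
    if not verify_components(Path(".")):
        sys.exit(1)  # refuse to start with tampered or unexpected components
```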

Conclusion

Protecting the personal and financial information of customers is a fundamental responsibility of banks, and doing so in mobile banking apps is more important than ever. The use of third-party components makes this more challenging. The contract a bank has with a third party may provide only very limited protection when things go wrong, and it will certainly not cover the detrimental risk of losing customer trust and business. Great care is therefore needed when choosing, integrating and deploying third-party components. On the other hand, third-party components allow banks to provide their customers with better services more quickly, so it is imperative that banks employ best practice to ensure they serve their customers well and maintain their trust.

About the Co-Author
Neal Michie

Neal Michie serves as Director of Product Management for Verimatrix's award-winning code protection solutions. Helping countless organisations instil trust in their IoT and mobile applications, Neal oversees Verimatrix's foundational security products that are relied upon by some of the world's largest organisations. He champions the need to position security as a first-order concern for IoT companies, seeking to elevate code protection to new heights by serving as a sales enabler and brand protector. Neal brings more than 18 years of software development experience and spent the last decade building highly secure software solutions, including the first to be fully certified by both Mastercard and Visa. A graduate of Heriot-Watt University with an advanced degree in electrical and electronics engineering, Neal is an active and enthusiastic participant in Mobey's expert groups.

No Delay to SCA

Since the FCA announced a further 6 month delay in the UK’s deadline for Strong Customer Authentication there’s been a general expectation that the EBA would follow suit and relax the date for the EEA. However, it now appears that won’t happen – the 31st December 2020 remains the key date and there won’t be any further relaxation in the rules.

This hasn’t been officially announced but appears to be the gist of a letter from the European Commission’s Executive Vice-President Valdis Dombrovskis, which makes clear that no delay is under consideration and that, in the Commission’s view, the Coronavirus pandemic and the subsequent rise in e-commerce make it more urgent to implement, not less. It looks like the Commission is not for turning, and with only a little over six months left to be prepared, any merchant or payment service provider that hasn’t been planning for this is likely to be in full panic mode.

At one level it’s hard to disagree with the Commission’s position – the deadline has already been shifted from last September to accommodate the industry’s inability to implement in time. Although, in fairness, it ought to be noted that the original requirements require a degree in semiotics to fully understand, and clarifications have been fitful and, on occasion, too late. However, there’s a degree of real-world pragmatism missing from the decision – the last thing the European economy needs right now is an e-commerce cliff edge right in the middle of the busiest shopping period of the year.

The divergence between the UK and Europe also starts to raise some interesting questions. PSD2 applies to countries within the EEA and not to transactions starting or finishing outside – and as of January 1st 2021 the UK will be fully outside. PSD2 will apply within the EEA ex-UK and within the UK ex-Europe but, barring some kind of passporting agreement, not between them. One option for desperate European e-tailers may be to shift operations to the UK where the SCA deadline is a further 9 months away. Of course, the same applies in reverse: logically there ought to be a compromise, but those seem thin on the ground.

Overall, then, the message to all organisations involved in electronic payments is to assume that SCA will be enforced from January 1st next year, and any firm that can’t support it should expect to see transactions declined. Merchants and PSPs may choose not to handle SCA, or may not be able to, but issuers will be ready and won’t want to upset the regulators. For any companies out there that don’t know what to do: come and talk to us. We can help guide you through the process – first by helping ensure you’re compliant and then by addressing the additional friction that SCA will introduce.

It isn’t too late to do something about SCA but it does very much look like we are at the eleventh hour.

Identity – Customer Centric Design

The team put on an excellent webinar this Thursday (May 21st, 2020) in the Tomorrow’s Transactions series. The focus was on Trust over IP, although digital identity and privacy were covered in the round.

The panellists were Joni Brennan of the DIACC (Digital ID & Authentication Council of Canada—full disclosure: a valued customer), long-time collaborator Andy Tobin of Evernym and our own Steve Pannifer and Justin Gage. Each of the panellists is steeped in expertise on the subject, gained from hard-won experience.

Joni and Andy presented, respectively, the DIACC and ToIP layered architectural models (largely congruent) for implementing digital identification services. The panellists agreed that no service could work without fully defined technical, business and governance structures. Another key point was that the problems of identification and privacy merge into one another. People need to make themselves known, but are reserved about making available a slew of personal information to organisations with whom they may seek no persistent relationship or do not fully trust.

At one point, it was mentioned that practical progress has been slow, even though the basic problem (to put one aspect crudely, why do I need so many passwords?) of establishing trust over digital networks has been defined for 20 years at least. It could be argued that Consult Hyperion has earned its living by designing, developing and deploying point solutions to the problem. I began to wonder why a general solution has been slow to arise, and speculated (to myself) that it was because the end-user has been ill-served. In particular, the user sign-up and sign-in experiences are inconsistent and usually horrible.

Therefore, I posed the question “What is the panel’s vision for how people will gain access to personalised digital services in 2030?” The responses were interesting (after momentary intakes of breath!) but time was short and no conclusions were reached.

I slept on the problem and came up with some tentative ideas. Firstly, when we are transacting with an organisation (from getting past a registration barrier to download some info, through buying things, to filing tax returns), everything on our screens is about the organisation (much of it irrelevant for our purposes) and nothing is about us. Why can’t our platforms present a prominent avatar representing us, clickable to view and edit the information we’ve recorded, and draggable onto register, sign-in or authorise fields in apps or browsers?

Now, there could be infinite variations of ‘me’ depending on how much personal information I want to give away; and the degree of assurance the organisation needs to conduct business with me (of course, it’s entirely possible there could be no overlap). I reckon I could get by with three variations, represented by three personas:

  • A pseudonym (I get tired of typing flintstone@bedrock.com just to access a café’s wifi; there are some guilty parties registering for our webinars too!)
  • Basic personal information (name, age, sex, address) for organisations I trust, with a need-to-know
  • All of the above, maybe more, but (at least, partly) attested by some trusted third party.

Obsessives could be given the ability to define as many options, with as many nuances, as they like; but complexity should be easily ignorable to avoid clutter for the average user.

I think it’s the major operating system providers that need to make this happen: essentially Apple, Google (Android) and Microsoft, preferably in a standard and portable way. For each we would set up an ordered list of our preferred authentication methods (PIN, facial recognition, etc.) and organisations would declare what is acceptable to them. The system would work out what works for both of us (a tiny matching sketch follows). If the organisation wants anything extra, say some kind of challenge/response, that would be up to them. Hopefully, that would be rare.
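Here is a tiny sketch of that negotiation, assuming the user keeps an ordered preference list and each organisation publishes the set of methods it will accept; the method and organisation names are purely illustrative.

```python
from typing import Optional, Sequence, Set

# My ordered preferences: try the first method both sides can live with.
USER_PREFERENCES = ["face_id", "fingerprint", "device_pin", "password"]

# What each (hypothetical) organisation declares it will accept.
ORG_ACCEPTS = {
    "cafe_wifi": {"device_pin", "password"},
    "my_bank": {"face_id", "fingerprint"},
    "tax_office": {"face_id", "fingerprint", "device_pin"},
}


def negotiate(preferences: Sequence[str], acceptable: Set[str]) -> Optional[str]:
    """Return the user's most-preferred method that the organisation accepts, if any."""
    for method in preferences:
        if method in acceptable:
            return method
    return None  # no overlap: the organisation falls back to its own challenge


for org, accepted in ORG_ACCEPTS.items():
    print(org, "->", negotiate(USER_PREFERENCES, accepted))
```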

The Apple Pay and Google Pay wallets go some way towards providing a solution. But sitting above the payment cards and boarding passes there needs to be the concept of a persona. At the moment, Apple and Google may be too invested in promulgating their own single customer views to see the need to take this extra step.

I sensed frustration from the panellists that everything was solvable, certainly technically. Governance (e.g. who is liable for what when it all goes wrong?) was taken to be a sticking point. True, but I think we need to put the average user front and centre. Focus groups with mocked-up user experiences would be a good start; we’d be happy to help with that!

Would you use the NHSX app?

I listened with interest to yesterday’s parliamentary committee on the proposed NHSX contact tracing app, which is being trialled on the Isle of Wight from today. You can see the recording here.

Much of the discussion concerned the decision to follow a centralised approach, in contrast to several other countries such as Germany, Switzerland and Ireland. Two key concerns were raised:

1. Can a centralised system be privacy respecting?
Of course the answer to this question is yes, but it depends on how data is collected and stored. Techniques such as differential privacy are designed to allow data to be de-identified so that it can be analysed anonymously (e.g. for medical research), although there was no suggestion that NHSX is actually doing this.
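For readers wondering what “de-identified but still analysable” might look like, here is a minimal differential-privacy sketch: a count query is answered with Laplace noise so that any one individual’s presence or absence barely changes the published answer. It is a textbook illustration only, not a description of anything NHSX is doing.

```python
import random


def noisy_count(true_count: int, epsilon: float) -> float:
    """Return a count perturbed with Laplace noise of scale 1/epsilon.

    A count query has sensitivity 1 (one person changes it by at most 1),
    so Laplace(1/epsilon) noise gives epsilon-differential privacy.
    """
    scale = 1.0 / epsilon
    # A Laplace sample is the difference of two independent exponentials.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise


# Example: publish how many users reported symptoms, without exposing any one user.
print(noisy_count(true_count=1375, epsilon=0.5))
```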

The precise details of the NHSX app are not clear at this stage, but it seems that the approach will involve identifiers being shared between mobile devices when they come into close proximity. These identifiers will then be uploaded to a central service to support studying the epidemiology of COVID-19 and to facilitate notifying people who may be at risk, having been in close proximity to an infected person. Whilst the stated intention is for those identifiers to be anonymous, the parliamentary debate clearly showed there are a number of ways that the identifiers could become more identifiable over time. Because the identifiers are persistent, they are likely to be pseudonymous at best.

By way of contrast, a large team of academics has developed an approach called DP-3T, which apparently has influenced designs in Germany and elsewhere. It uses ephemeral (short-lived) identifiers. The approach is not fully decentralised, however. When a user reports that they have COVID-19 symptoms, the ephemeral identifiers that user’s device has been broadcasting (or rather, the keys needed to recompute them) are shared via a centralised service. In fact, they are distributed to every device in the system so that risk decisioning is made at the edges, not in the middle. This means that no central database of observed identifiers is needed (although presumably there will be a database of registered devices).
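To make the ephemeral identifier idea concrete, here is a much-simplified sketch in the spirit of the DP-3T design: a secret day key is rotated by hashing, short-lived identifiers are derived from it, and a phone that later learns an infected user’s day key can recompute those identifiers and check them against what it overheard. Key sizes, rotation periods and encodings are simplified; the real protocol is specified in the DP-3T documents.

```python
import hashlib
import hmac
import os

EPHEMERAL_IDS_PER_DAY = 96  # e.g. a fresh identifier every 15 minutes


def next_day_key(day_key: bytes) -> bytes:
    """Rotate the secret day key by hashing it (old keys cannot be recovered)."""
    return hashlib.sha256(day_key).digest()


def ephemeral_ids(day_key: bytes) -> list:
    """Derive the day's short-lived broadcast identifiers from the day key."""
    return [
        hmac.new(day_key, f"ephid-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(EPHEMERAL_IDS_PER_DAY)
    ]


# Alice's phone broadcasts identifiers derived from her secret day key.
alice_day_key = os.urandom(32)
broadcast_today = ephemeral_ids(alice_day_key)

# Bob's phone stores whatever it overhears nearby (here, one of Alice's identifiers).
bobs_observations = {broadcast_today[42]}

# If Alice tests positive she uploads her day key; every phone can then check
# locally whether it was near her, so the matching happens at the edge.
matches = bobs_observations.intersection(ephemeral_ids(alice_day_key))
print(f"Bob was exposed: {bool(matches)}")
```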

It also means there will be less scope for epidemiological research.

All of this is way beyond the understanding of most people, including those tasked with providing parliamentary scrutiny. So how can the average person on the street, or the average peer in Westminster, be confident in the NHSX app? Well, apparently the NHSX app is going to be open sourced, and that probably is going to be our greatest protection. It will mean you won’t need to rely on what NHSX says: inevitably there will be universities, hackers, enthusiasts and others lining up to pick it apart.

2. Can a centralised system interoperate with the decentralised systems in other countries to allow cross border contact tracing?
It seems to us that framing this as simply centralised versus decentralised is a gross simplification of the potential interoperability issues. True, the primary issue does seem to be the way that identifiers are generated, shared and used in risk decisioning, but for cross-border contact tracing to be possible there will need to be alignment on a whole range of other things, including technical standards, legal requirements and perhaps even, dare I say it, liability. Of course, if the DP-3T model is adopted by many countries then it could become the de facto standard, in which case that could leave the NHSX app isolated.

Will the NHSX app be an effective tool to help us get back to normal? This will depend entirely on how widely it is adopted, which in turn will require people to see that the benefits outweigh the costs. That’s a value exchange calculation that most people will not be able to make. How can they make a value judgment on the potential risks to their civil liberties of such a system? The average user is probably more likely to notice the impact on their phone’s battery life or when their Bluetooth headphones stop working.

There’s a lot more that could be said and I’ll be discussing the topic further with Edgar Whitley, Nicky Hickman and Justin Gage on Thursday during our weekly webinar.

KYC at a distance

We live in interesting times. Whatever you think about the Coronavirus situation, social distancing will test our ability to rely on digital services. And one place where digital services continue to struggle is onboarding – establishing who your customer is in the first place.  

One of the main reasons for this is that regulated industries such as financial services are required to perform strict “know your customer” checks when onboarding customers, and risk substantial fines in the event of compliance failings. Understandably then, financial service providers need to be cautious in adopting new technology, especially where the risks are not well understood or where regulators are yet to give clear guidance.

Fortunately, a lot of work is being done. This includes the development of new identification solutions and an increasing recognition that this is a problem that needs to be solved.

The Paypers has recently published its “Digital Onboarding and KYC Report 2020”. It is packed full of insights into developments in this space, features several Consult Hyperion friends and is well worth a look.

You can download the report here: https://thepaypers.com/reports/digital-onboarding-and-kyc-report-2020

Fraudsters target loyalty schemes for easier gains

It has become practically impossible to keep up with the number of loyalty-related security breaches. In today’s edition of “Who Got Hit?”, we read that Tesco is sending security warnings to 600,000 Tesco Clubcard loyalty members following fraudulent activity[1]. The breach is suspected to be attackers trying to ‘brute-force’ their way into the loyalty system using stolen credentials, potentially from a different breach. In recent years, fraud associated with loyalty has been on the rise: according to a 2019 report by Forter, there was an 89% increase in loyalty-related fraud compared with the previous year.

Perhaps one explanation for such a rise is that the payment industry has become increasingly effective in securing the payment infrastructure and making it harder for criminals to steal money. Additionally, the amount of value sitting in customer loyalty accounts continues to rise. For example, Starbucks has over $1.6 billion of unspent value in customers’ loyalty card and wallet accounts. Such trends are increasingly turning criminals’ focus to ‘softer’ targets such as loyalty schemes, taking advantage of these systems’ weaker security to steal value which, if it cannot be redeemed as actual cash, can be converted into goods.

Loyalty fraudsters can loosely be categorised based on their motivations, technical expertise and level of access to the loyalty systems and processes. The table below outlines such a categorisation:


Strong Passwords are no Panacea!

Security experts often suggest implementing stronger security features such as multi-factor authentication and the use of strong passwords to protect loyalty schemes. These are welcome suggestions; however, it is not always realistic to implement expensive countermeasures just to protect loyalty points. A holistic approach to securing these systems and reducing fraud is required, since security controls fall on customers and fraudsters alike.

Colleagues at Consult Hyperion have called for closer alignment between payments and loyalty for years now. Card (and mobile) payments are a mature technology with acceptable levels of security, proven over decades. A seamless way of integrating loyalty into payments would allow loyalty schemes to take advantage of the robustness of the payment schemes. Despite clear benefits, such integration has been limited, perhaps due to the associated costs to the merchant or the inconvenience to the customer. But a lot is changing in the world of customer authentication. Recent advances such as FIDO2 and 3-D Secure 2.0 will allow strong customer authentication to be achieved within various contexts (including loyalty!), while maintaining a positive customer experience.
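To illustrate the pattern that FIDO2-style authentication relies on, and that loyalty schemes could in principle reuse, here is a rough challenge-response sketch: the scheme stores only a public key, the customer’s device signs a fresh challenge after local user verification, and a stolen password list is worthless. It is a simplification (no attestation, origin binding or counters) and assumes the Python cryptography package.

```python
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrolment: the customer's device creates a key pair; only the public key
# is registered with the loyalty scheme (nothing worth stealing server-side).
device_key = Ed25519PrivateKey.generate()
registered_public_key = device_key.public_key()

# Authentication: the scheme issues a fresh random challenge...
challenge = os.urandom(32)

# ...the device signs it after local user verification (PIN, biometric, etc.)...
assertion = device_key.sign(challenge)

# ...and the scheme verifies the signature against the registered public key.
try:
    registered_public_key.verify(assertion, challenge)
    print("Strong customer authentication succeeded")
except InvalidSignature:
    print("Authentication failed")
```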

Within Consult Hyperion, our subject matter experts bring a deep understanding of the relevant payments technologies, as well as decades of experience in assessing and designing secure systems. If you would like to know more, feel free to give us a call.

More detail can be found here

Consult Hyperion’s Live 5 for 2020

At Consult Hyperion we take a certain amount of enjoyment looking back over some of our most interesting projects around the world over the previous year or so, wrapping up thoughts on what we’re hearing in the market and spending some time thinking about the future. Each year we consolidate the themes and bring together our Live Five.

2020 is upon us and so it’s time for some more future gazing! Now, as in previous years, how can you pay any attention to our prognostications without first reviewing our previous attempts? In 2017 we highlighted regtech and PSD2, 2018 was open banking and conversational commerce, and for 2019 it was secure customer authentication and digital wallets — so we’re a pretty good weathervane for the secure transactions world! Now, let’s turn to what we see for this coming year.

Hello 2020

Our Live Five has once again been put together with particular regard to the views of our clients. They are telling us that over the next 12 months retailers, banks, regulators and their suppliers will focus on privacy as a proposition and on customer intimacy driven by hyper-personalisation and personalised payment options, underpinned by a focus on cyber-resilience. In the background, they want to do what they can to reduce their impact on the global environment. For our transit clients, there will be a particular focus on bringing these threads together to reduce congestion through flexible fare collection.

So here we go…

1. This year will see privacy as a consumer proposition. This is an easy prediction to make, because serious players are going to push it. We already see this happening with “Sign in with Apple” and more services in this mould are sure to follow. Until quite recently privacy was a hygiene factor that belonged in the “back office”. But with increasing industry and consumer concerns about privacy, regulatory drivers such as GDPR and the potential for a backlash against services that are seen to abuse personal data, privacy will be an integral part of new services. As part of this we expect to see organisations that collect large amounts of personal data looking at ways to monetise this trend by shifting to attribute exchange and anonymised data analytics. Banks are an obvious candidate for this type of innovation, but not the only one – one of our biggest privacy projects is for a mass transit operator, concerned by the amount of additional personal information they are able to collect on travellers as they migrate towards the acceptance of contactless payment cards at the faregate.

2. Underpinning all of this is the urgent need to address cyber-resilience. Not a week goes by without news of some breach or failure by a major organisation putting consumer data and transactions at risk. With the advent of data protection regulations such as GDPR, these issues are major threats to the stability and profitability of companies in all sectors. The first step to addressing this is to identify the threats and vulnerabilities in existing systems before deciding how and where to invest in countermeasures.

Our Structured Risk Analysis (SRA) process is designed to help our customers through this process to ensure that they are prepared for the potential issues that could undermine their businesses.

3. Privacy and Open Data, if correctly implemented and trusted by the consumer, will facilitate the hyper-personalisation of services, which in turn will drive customer intimacy. Many of us are familiar with Google telling us how long it will take us to get home, or to the gym, as we leave the office. Fewer of us will have experienced the pleasure of being pushed new financing options by the first round of Open Banking Fintechs, aimed at helping entrepreneurs to better manage their start-up’s finances.

We have already demonstrated to our clients that it is possible to use new technology in interesting ways to deliver hyper-personalisation in a privacy-enhancing way. Many of these depend on the standardisation of premium Open Banking APIs, i.e. APIs that extend the data shared by banks beyond that required by the regulators, into areas that can generate additional revenue for the bank. We expect to see the emergence of new lending and insurance services, linked to your current financial circumstances at the point of service, similar to those provided by Klarna.

4. One particular area where personalisation will have immediate impact is in giving consumers personalised payment options, with new technologies being deployed such as EMVCo’s Secure Remote Commerce (SRC) and the W3C’s Payment Request API. Today, most payment solutions are based around payment cards, but increasingly we will see direct-to-account (D2A) payment options such as the PSD2 payment APIs. Cards themselves will increasingly disappear, to be replaced by tokenized equivalents which can be deployed with enhanced security to a wide range of form factors – watches, smartphones, IoT devices, etc. The availability of D2A and tokenized solutions will vastly expand the range of payment options available to consumers, who will be able to choose the option most suitable for them in specific circumstances. Increasingly we expect to see the awkwardness and friction of the end-of-purchase payment disappear, as consumers select the payment methods that offer them the maximum convenience for the maximum reward. Real-time, cross-border settlement will power the ability to make many of our commerce transactions completely transparent. Many merchants are confused by the plethora of new payment services and are uncertain about which will bring them more customers and therefore which they should support. Traditionally they have turned to the processors for such advice, but mergers in this field are not necessarily leading to clear direction.
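As an illustration of what “tokenized equivalents” means, here is a toy token-vault sketch: the real card number is stored once, a random token is issued for a specific device, and only the token ever circulates. Real EMV payment tokenisation involves the schemes’ token service providers, domain restrictions and cryptograms; this shows only the core idea.

```python
import secrets
from typing import Dict, Optional


class ToyTokenVault:
    """Maps device-bound tokens to an underlying card number (PAN)."""

    def __init__(self) -> None:
        self._tokens: Dict[str, Dict[str, str]] = {}

    def issue_token(self, pan: str, device_id: str) -> str:
        """Issue a random token that stands in for the PAN on one device only."""
        token = "tok_" + secrets.token_hex(8)
        self._tokens[token] = {"pan": pan, "device_id": device_id}
        return token

    def detokenize(self, token: str, device_id: str) -> Optional[str]:
        """Only the vault can map a token back, and only for the right device."""
        entry = self._tokens.get(token)
        if entry and entry["device_id"] == device_id:
            return entry["pan"]
        return None  # a stolen token is useless in any other context


vault = ToyTokenVault()
watch_token = vault.issue_token(pan="4111111111111111", device_id="my-smartwatch")
print(vault.detokenize(watch_token, "my-smartwatch"))   # resolves on the right device
print(vault.detokenize(watch_token, "attacker-laptop"))  # None: domain restricted
```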

We know how to strategise, design and implement the new payment options to deliver value to all of the stakeholders and our track record in helping global clients to deliver population-scale solutions is a testament to our expertise and experience in this field.

5. In the transit sector, we can see how all of these issues come together. New pay-as-you-go systems based upon cards continue to roll out around the world. The leading edge of Automated Fare Collection (AFC) is, however, advancing. How travellers choose to identify themselves and how they choose to pay are, in principle, different decisions, and we expect to see more flexibility. Reducing congestion and improving air quality are of concern globally, and are best addressed by providing door-to-door journeys without reliance on private internal combustion engines. This will only prove popular when it is ultra-convenient. That means that payment for a whole journey (or collection of journeys) involving, say, bike/ride share, tram and train must be frictionless and support the young, old and in-between alike.

Moving people on to public transport by making it simple and convenient to pay is how we will help people to take practical steps towards sustainability.

So, there we go. Privacy-enhanced resilient infrastructure will deliver hyper-personalisation and give customers more safe payment choices. AFC will use this infrastructure to both deliver value and help the environment to the great benefit of all of us. It’s an exciting year ahead in our field!



Horizon Brief

I had the pleasure of attending a “Horizon Brief” organised by the Centre for the Study of Financial Innovation for Dentons. The well-informed speakers, ably chaired by Andrew Hilton (Director of the CSFI), were lawyer Dominic Grieve (previously the Attorney General and, until last week, Chair of Parliament’s Intelligence and Security Committee), lawyer Anton Moiseienko from Royal United Services Institute Centre for Financial Crime and Security, lawyer Richard Parlour (Chairman of the EU Task Force on Cybersecurity Policy for the Financial Sector) and lawyer Antonis Patrikos from Dentons’ Privacy and Cybersecurity Practice.

Margot James, the Minister for Digital, was quoted in The Daily Telegraph as saying that the UK must “get over” privacy and cyber security fears and adopt technology such as online identities. While this Minister was advocating online identities, another Minister was ending government funding for the government’s own Verify digital identity service. And more recently another Minister has scrapped the online age verification plan that would have at least bootstrapped digital identity into the mass market.

During the questions, I noted that it might seem the government has no actual strategy. As Mr. Grieve pointed out in response to my question, there is a tension at the heart of government strategy. I will paraphrase, but the issue is that the government wants to accumulate data, yet the accumulation of data raises the likelihood of cyberattack. How do we deal with this tension and make progress? This point was illustrated rather well later in the week, when Parliament’s Joint Human Rights Committee recommended that the Government should “explore the practicality and usefulness of creating a single online registry that would allow people to see, in real time, all the companies that hold personal data on them and what data they hold.”

The Chair of the Committee, the lawyer Harriet Harman, said “It should be simple to know what data is shared about individuals and it must be equally easy to correct or delete data held about us as it was to us to sign up to the service in the first place”. As far as I can see, this completely impractical, expensive and pointless mechanism for logging in to some government website to find out whether you signed up for the Wetherspoons loyalty scheme will be of no benefit whatsoever. The vast majority of the population neither know nor care what the Tesco Clubcard database holds about them, so long as they get money-off vouchers now and then. The Committee’s concerns about privacy are real and valid (and at Consult Hyperion we share them) but their proposed solution will not address them. Apart from anything else, what will stop hackers from getting into the database, finding out that you have an account at Barclays and then using this to phone you up and ask you to transfer your money into a “safe account”?

I wonder if the lawyers are aware that technologists can help resolve this fundamental paradox. Having had a few years’ experience in delivering highly secure systems to the financial sector, my colleagues at Consult Hyperion are familiar with a number of cryptographic techniques – such as homomorphic encryption, cryptographic blinding, zero-knowledge proofs and verifiable credentials – that can deliver apparently paradoxical results. It is possible to store data and perform computations on it without reading it, it is possible to determine that someone is over 18 without seeing their age and it is possible to find out whether you ate at a certain restaurant without disclosing your name.
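As a taste of how “over 18 without seeing the age” can work, here is a heavily simplified, verifiable-credential-style sketch: a trusted issuer signs only the derived claim, and the relying party verifies the signature without ever receiving a date of birth. Real schemes add selective disclosure, revocation and holder binding; this shows only the separation of roles, and assumes the Python cryptography package.

```python
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The trusted issuer (e.g. a bank that has already verified my identity).
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()  # known to relying parties

# The issuer signs only the derived claim, not my date of birth.
claim = json.dumps({"subject": "user-1234", "over_18": True}).encode()
signature = issuer_key.sign(claim)

# A website that needs an age check receives the claim and signature only.
try:
    issuer_public_key.verify(signature, claim)
    print("Accepted: the issuer vouches that this user is over 18")
except InvalidSignature:
    print("Rejected: claim not vouched for by the issuer")
```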

Right now, the use of these technologies is nothing more than a hygiene factor for the companies involved. But as legislation (and social pressure) steadily converts personal information into toxic waste, more and more companies will want to avoid it. Privacy will become part of the overall package that a company offers to its customers and we understand the technologies that can deliver it and how to deploy them at population scale. Give us a call – our number’s not a secret.

Identity Week

The opening keynote at Identity Week in London was given by Oliver Dowden, the Minister for Implementation at the Cabinet Office and therefore the person in charge of the digital transformation of government. At Consult Hyperion we think digital identity is central to the digital transformation of government (and the digital transformation of everything else, for that matter), so I was looking forward to hearing the UK government’s vision for digital identity. I accompanied the Minister on his visit to the IDEMIA stand, where he was shown a range of attractive burgundy passports.

In his keynote, the Minister said that the UK is seen as being at the cutting edge of digital identity and that GOV.UK Verify is at the heart of that success.

(For foreign visitors, perhaps unfamiliar with this cutting-edge position, a spirit of transparency requires me to note that back on 9th October 2018, Mr. Dowden gave written statement HCWS978 to Parliament, announcing that the government was going to stop funding Verify after 18 months, with the private sector responsible for funding after that.)

Given that the government spends around £1.5 billion per annum on “identity, fraud, error, debt, how much identity costs to validate, and how much proprietary hardware and software bought”, it’s obviously important for it to set an effective strategy. Now, members of the public, who don’t really know or care about digital ID, might be saying to themselves, “why can’t we just use ‘sign in with Apple’ to do our taxes?”, and this is a good point. Even if they are not saying it right now, they’ll be saying it soon, as they get used to Apple’s mandate that all apps that allow third-party sign-in must support it.

Right now you can’t use a GOV.UK Verify Identity Provider to log into your bank or any other private sector service provider. But in his speech the Minister said that he looks forward to a time when people can use a single login to “access their state pension and the savings account” and I have to say I agree with him. Obviously you’d want a different single login for gambling and pornography, but that’s already taken care of as, according to Sky News, “thanks to its ill-conceived porn block, the government has quietly blundered into the creation of a digital passport – then outsourced its development to private firms, without setting clear limits on how it is to be used”. One of these firms runs the world’s largest pornography site, Pornhub, so I imagine they know a thing or two about population-scale identity management.

Back to the Minister’s point though. Yes, it would be nice to have some sort of ID app on my phone and it would be great if my bank and the HMRC and Woking Council and LinkedIn would all let me log in with this ID. The interesting question is how you get to this login. Put a PIN in that and we’ll come back to it later.

The Minister made three substantive points in the speech. He talked about:

  • The creation of a new Digital Identity Unit, which is a collaboration between DCMS and the Cabinet Office. The Unit will help foster co-operation between the public and private sectors, ensure the adoption of interoperable standards, specifications and schemes, and deliver on the outcome of the consultation.
  • A consultation to be issued in the coming weeks on how to deliver the effective organisation of the digital identity market. Through this consultation the government will work with industry, particularly with sectors that have frequent user identity interactions, to ensure interoperable ‘rules of the road’ for identity.
  • The start of engagement on the commercial framework for consuming digital identities from the private sector for the period from April 2020 to ensure the continued delivery of public services. The Government Digital Service will continue to ensure alignment of commercial models that are adopted by the developing identity market to build a flourishing ecosystem that delivers value for everyone.

The Minister was taken away on urgent business and therefore unable to stay for my speech, in which I suggested that the idea of a general-purpose digital identity might be quite a big bite to take at the problem. So it would make sense to look at who else might provide the “digital identities from the private sector” used for the delivery of public services. Assuming the current GOV.UK Verify identities fail to gain traction in the private sector, then I think there are two obvious private sector coalitions that might step in to do this for the government: the big banks and the big techs.

For a variety of reasons, I hope that the big banks are able to come together and respond to the comments of Mark Carney, the Governor of the Bank of England, on the necessity for digital identity in the finance sector, by working together to develop some sort of financial services passport. I made some practical suggestions about this earlier in the year and have continued to discuss the concept with potential stakeholders. I think it stacks up, but we’ll have to see how things develop.

On the other hand, if the banks can’t get it together and the big techs come knocking, they are already showing off their solutions. I’ll readily admit that when the Minister first said “private sector identities”, the first thought to flash across my brain was “Apple”. But I wouldn’t be at all surprised to go over to the HMRC website fairly soon and find a “log in with Amazon” and a “log in with Apple” next to a button with some incomprehensible waffle about eIDAS that I, and most other normal consumers I’m sure, will simply ignore.

How do you use Apple ID to log into the Inland Revenue? Easy: you log in as you do now (after sending off for the password, waiting for it to come in the post and that sort of thing) and then, once you are connected, tell them the Apple ID that you want to use in future. If you want to be “jackdaniels@me.com” or whatever, it doesn’t matter: it’s just an identifier for the Revenue to recognise you by in future. Then next time you go to the Inland Revenue, you log in as jackdaniels@me.com, something pops up on your iPhone, you put your thumb on it or look at it, and bingo, you’re logged in to fill out your PAYE.
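A sketch of that linking step, on the assumption that the tax service simply records a verified external identifier against the existing account and recognises it on later visits; the references, identifiers and session model here are all hypothetical.

```python
from typing import Dict, Optional

# Existing taxpayer accounts, keyed by the reference the Revenue already uses.
accounts: Dict[str, dict] = {
    "UTR-0000000001": {"name": "J. Daniels", "linked_external_ids": set()},
}


def link_external_id(tax_reference: str, external_id: str) -> None:
    """Step 1: while logged in the old way, record the external identifier."""
    accounts[tax_reference]["linked_external_ids"].add(external_id)


def login_with_external_id(external_id: str, assertion_valid: bool) -> Optional[str]:
    """Step 2: later visits present a verified assertion for that identifier.

    `assertion_valid` stands in for the identity provider's own check
    (e.g. the user confirming the sign-in on their phone).
    """
    if not assertion_valid:
        return None
    for tax_reference, account in accounts.items():
        if external_id in account["linked_external_ids"]:
            return tax_reference
    return None


link_external_id("UTR-0000000001", "jackdaniels@me.com")
print(login_with_external_id("jackdaniels@me.com", assertion_valid=True))
```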

Yet another GDPR article – the story so far

How time flies: GDPR has just had its first birthday!

This past year you will have been inundated with articles and blogs about GDPR and its impact on consumers and businesses alike. According to the UK’s Information Commissioner, Elizabeth Denham, GDPR and its UK implementation, the Data Protection Act (DPA) 2018, have marked a “seismic shift in privacy and information rights”. Individuals are now more aware of their information rights and haven’t been shy about demanding them. In the UK, the ICO received around 14,000 personal data breach reports and over 41,000 data protection concerns from the public between 25 May 2018 and 1 May 2019, compared to around 3,300 breach reports and 21,000 data protection concerns in the preceding year. Beyond Europe, the regulation has had a remarkable influence in other jurisdictions, which have either enacted or are in the process of enacting ‘GDPR equivalent’ laws – something similar is underway in Brazil, Australia, California, Japan and South Korea.

For all the good intentions of GDPR, some of its provisions contradict other, equally well-intentioned EU laws. Bank secrecy laws, on one hand, require that customers’ personal data should be protected and used only for the intended purpose(s), except where otherwise consented to by the customer. AMLD4/5, on the other hand, requires that identifying personal data in ‘suspicious transactions’ should be passed on to the appropriate national authorities (of course without the customer’s consent or knowledge). Then PSD2 requires banks to open up customers’ data to authorised Third Party Providers (TPPs), subject to obtaining the customer’s consent. One issue that arises out of this is the seeming incongruity between PSD2’s Article 94 explicit consent and GDPR’s (explicit) consent.

Under GDPR, consent is one of the lawful bases for processing personal data, subject to strict requirements for obtaining, recording and managing it; otherwise it is deemed invalid. In some cases, a lack of good understanding of these rules has resulted in poor practices around consent processing. That is why organisations like the Kantara Initiative are leading the effort in developing specifications for ‘User Managed Access’ and ‘Consent Receipt’.
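To show the shape of the idea, here is an illustrative consent-receipt record loosely inspired by the Kantara work; the field names are my own simplification, not the normative specification.

```python
import json
import uuid
from datetime import datetime, timezone

# An illustrative consent receipt: a record the customer (and the controller) can keep
# as evidence of exactly what was consented to, when, and for what purpose.
consent_receipt = {
    "receipt_id": str(uuid.uuid4()),
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "data_controller": "Example Bank plc",          # hypothetical controller
    "data_subject": "customer-1234",
    "purposes": [
        {
            "purpose": "Share account data with an authorised TPP",
            "lawful_basis": "consent",
            "data_categories": ["account balance", "transaction history"],
            "retention": "until consent is withdrawn",
        }
    ],
    "withdrawal_method": "via the bank's consent dashboard",
}

print(json.dumps(consent_receipt, indent=2))
```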

In addition, EU regulators have been weighing in to clarify some of the conundrums. For example, the Dutch DPA issued guidance on the interplay of PSD2 and GDPR, which shows that there is no straightforward answer to what might seem like a relatively simple question. The EDPB has also published an opinion on the interplay between GDPR and the slowly but surely evolving ePrivacy regulation. Suffice to say, correctly navigating the compliance requirements of all these laws is indeed challenging, but possible.

What will the second year of GDPR bring?

While regulators are keen to enforce the law, their priority is transparent co-operation, not penalties. The ICO has provided support tools and guidance, including a dedicated helpline and chat services to support SMEs. It is also in the process of “establishing a one-stop shop for SMEs, drawing together the expertise from across our regulatory teams to help us better support those organisations without the capacity or obligation to maintain dedicated in-house compliance resources.” However, for those who still choose to ‘wilfully or negligently break the law’, GDPR’s administrative fines may help to focus the mind on what is at stake, in addition to the ‘cleaning up’ costs afterwards. Supervisory Authorities require time and resources to investigate and clear the backlog resulting from the EU-wide increase in information rights queries and complaints over the past year. The UK’s ICO and its Netherlands and Norwegian counterparts are collaborating to harmonise their approaches and establish a “matrix” for calculating fines. France’s CNIL has led the way with the $57 million Google fine earlier in the year; the ICO has confirmed that fines are coming for “a couple of very large cases that are in the pipeline”, and the Irish DPC also expects to levy substantial fines this summer.

A new but important principle in GDPR is the ‘accountability principle’, which states that the data controller is responsible for complying with the regulation and must be able to demonstrate compliance. So it is not enough to say ‘we comply’; you must be able to produce ‘appropriate evidence’ on demand to back it up. The ICO states in its ‘GDPR – one year on’ blog that “the focus for the second year of the GDPR must be beyond baseline compliance – organisations need to shift their focus to accountability with a real evidenced understanding of the risks to individuals in the way they process data and how those risks should be mitigated.” By now one would expect most organisations to have put in the effort required, beyond ticking boxes, to achieve an appropriate level of compliance with the regulation, so that they can reap the reward of continued business growth borne out of trust and loyalty from their customers.

One of the methods of demonstrating GDPR accountability is through a Data Protection Impact Assessment (DPIA) – a process by which organisations can systematically analyse, identify and minimise the data protection risks of a project or plan before going live. GDPR does not mandate a specific DPIA process, but it expects whichever methodology the data controller chooses to meet the requirements specified in Article 35(7).

At Consult Hyperion, we have a long track record of thinking about the risks associated with transactional data, so much so that we published, and continue to use, our own Structured Risk Analysis (SRA) methodology. Our approach, in response to the needs of our customers, has always been to describe the technological risks in a language that allows the business owner, who ultimately owns the risk, to make a judgement. Building on this, we have developed a business-focused approach to GDPR-compliant DPIAs for the products we design, review or develop for our customers.

If you’re interested in finding out more, please contact: sales@chyp.com

