As an example of creative thinking in promoting inclusion, I would like to highlight John Patrick Crichton-Stuart, 3rd Marquess of Bute, a thoroughly modern Victorian, educated by his mother until the age of 12. He was ridiculed by society for his progressive views in paying great attention to the education of his daughters as well as his sons. Considered the richest man of his time, his hobby was building the finest fairy tale castles. He also built a magnificent building for the medical school at the University of St Andrews and endowed the Bute Chair of Medicine. When the male anatomy lecturer refused to teach women, he simply hired a woman as an additional lecturer, to teach any students who wished to learn with her. In this way, he managed to provide an environment in which women and men could train alongside one another, without coming into conflict with the existing hierarchy. Perhaps surprisingly, we still have lessons to learn from his approach.
What did you think of the US election? I don’t mean the candidates and the outcome. What did you think of the election process? Should it be possible for national elections of this type to be done online? Last week the IET published a paper on internet voting in the UK, led by our good friend at the University of Surrey, Professor Steve Schneider. It’s well worth a read. As the paper explains, internet voting for statutory political elections is a uniquely challenging problem. Firstly, voting systems have exacting requirements; secondly, the stakes are high, with the threat of state-level interference.
As if lockdown were not bad enough, many of us are now faced with spending the next year with children unable to spend their Gap Year travelling the more exotic parts of the world. The traditional jobs within the entertainment and leisure sectors that could keep them busy, and paid for their travel, are no longer available. The opportunity to spend time with elderly relatives depends on the results of their last COVID-19 test.
I recognize that we are a lucky family to have such ‘problems’. However, they are representative of the issues we all face as we work hard to bring our families, companies and organizations out of lockdown. When can we open up our facilities to our employees, customers and visitors? What protection should we offer those employees who must, or choose to, work away from home? What is the impact of the CEO travelling abroad to meet new employees or customers, sign that large deal or deliver the keynote at that trade fair in Las Vegas?
The team put on an excellent webinar this Thursday (May 21st, 2020) in the Tomorrow’s Transactions series. The focus was on Trust over IP, although digital identity and privacy were covered in the round.
The panellists were Joni Brennan of the DIACC (Digital ID & Authentication Council of Canada—full disclosure: a valued customer), long-time collaborator Andy Tobin of Evernym and our own Steve Pannifer and Justin Gage. Each of the panellists is steeped in expertise on the subject, gained from hard-won experience.
Joni and Andy presented, respectively, the DIACC and ToIP layered architectural models (largely congruent) for implementing digital identification services. The panellists agreed that no service could work without fully defined technical, business and governance structures. Another key point was that the problems of identification and privacy merge into one another. People need to make themselves known, but are reserved about making available a slew of personal information to organisations with whom they may seek no persistent relationship or do not fully trust.
At one point, it was mentioned that practical progress has been slow, even though the basic problem (to put one aspect crudely, why do I need so many passwords?) of establishing trust over digital networks has been defined for 20 years at least. It could be argued that Consult Hyperion has earned its living by designing, developing and deploying point solutions to the problem. I began to wonder why a general solution has been slow to arise, and speculated (to myself) that it was because the end-user has been ill-served. In particular, the user sign-up and sign-in experiences are inconsistent and usually horrible.
Therefore, I posed the question “What is the panel’s vision for how people will gain access to personalised digital services in 2030?” The responses were interesting (after momentary intakes of breath!) but time was short and no conclusions were reached.
I slept on the problem and came up with some tentative ideas. Firstly, when we are transacting with an organisation (from getting past a registration barrier to download some info, through buying things, to filing tax returns), everything on our screens is about the organisation (much of it irrelevant for our purposes) and nothing is about us. Why can’t our platforms present a prominent avatar representing us, clickable to view and edit information we’ve recorded, and draggable onto register, sign-in or authorise fields in apps or browsers?
Now, there could be infinite variations of ‘me’ depending on how much personal information I want to give away; and the degree of assurance the organisation needs to conduct business with me (of course, it’s entirely possible there could be no overlap). I reckon I could get by with three variations, represented by three personas:
- A pseudonym (I get tired of typing firstname.lastname@example.org just to access a café’s wifi; there are some guilty parties registering for our webinars too!)
- Basic personal information (name, age, sex, address) for organisations I trust, with a need-to-know
- All of the above, maybe more, but (at least, partly) attested by some trusted third party.
Obsessives could be given the ability to define as many options, with as many nuances, as they like; but complexity should be easily ignorable to avoid clutter for the average user.
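To make the idea concrete, here is a minimal sketch (in Python, with entirely made-up attribute values and labels) of what those three personas might look like as a data structure, where each persona discloses only the attributes it is willing to share:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Persona:
    """One variation of 'me', holding only the attributes it discloses."""
    label: str
    attributes: dict = field(default_factory=dict)
    attested_by: Optional[str] = None  # trusted third party, if any

# The three variations suggested above (illustrative values only).
PERSONAS = {
    "pseudonym": Persona("pseudonym", {"handle": "coffee-drinker-42"}),
    "basic": Persona("basic", {"name": "A. User", "age": 42, "address": "..."}),
    "attested": Persona("attested", {"name": "A. User", "age": 42, "address": "..."},
                        attested_by="some-identity-provider"),
}

def disclose(persona_label: str) -> dict:
    """Return only the claims the chosen persona is willing to share."""
    p = PERSONAS[persona_label]
    claims = dict(p.attributes)
    if p.attested_by:
        claims["attested_by"] = p.attested_by
    return claims
```

The point of the structure is simply that the pseudonymous persona leaks nothing beyond a handle, while the attested persona carries a pointer to whoever vouches for it.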
I think it’s the major operating system providers that need to make this happen: essentially, Apple, Android and Microsoft, preferably in a standard and portable way. For each we would set up an ordered list of our preferred authentication methods (PIN, facial recognition, etc) and organisations would declare what is acceptable to them. The system would work out what works for both of us. If the organisation wants anything extra, say some kind of challenge/response, that would be up to them. Hopefully, that would be rare.
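As a sketch of that matching step, the negotiation could be as simple as taking the first entry in the user’s ordered preference list that also appears in the organisation’s declared set (the method names here are purely illustrative):

```python
def negotiate_method(user_prefs, org_accepts):
    """Pick the first method in the user's ordered preference list that the
    organisation declares acceptable; return None if there is no overlap."""
    acceptable = set(org_accepts)
    for method in user_prefs:
        if method in acceptable:
            return method
    return None

# Example: the user prefers facial recognition, then PIN;
# the organisation accepts PIN or password.
negotiate_method(["face", "pin"], ["password", "pin"])  # -> "pin"
```

Anything extra the organisation wants (a challenge/response, say) would then be layered on top of whatever this baseline negotiation produces.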
The Apple Pay and Google Pay wallets go some way towards providing a solution. But sitting above the payment cards and boarding passes there needs to be the concept of persona. At the moment, Apple and Google may be too invested in promulgating their own single customer views to see the need to take this extra step.
I sensed frustration from the panellists that everything was solvable, certainly technically. Governance (e.g. who is liable for what when it all goes wrong?) was taken to be a sticking point. True, but I think we need to put the average user front and centre. Focus groups with mocked-up user experiences would be a good start; we’d be happy to help with that!
I listened with interest to yesterday’s parliamentary committee on the proposed NHSX contact tracing app, which is being trialled on the Isle of Wight from today. You can see the recording here.
Much of the discussion concerned the decision to follow a centralised approach, in contrast to several other countries such as Germany, Switzerland and Ireland. Two key concerns were raised:
1. Can a centralised system be privacy respecting?
Of course the answer to this question is yes, but it depends on how data is collected and stored. Privacy-enhancing techniques such as differential privacy are designed to allow data to be de-identified so that it can be analysed anonymously (e.g. for medical research), although there was no suggestion that NHSX is actually doing this.
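To illustrate the general idea (and only the general idea; again, there is no suggestion NHSX uses this), classic randomised response is one of the simplest differentially private mechanisms. Each individual’s answer is noisy and therefore deniable, yet the population-level rate can still be estimated:

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth, otherwise a fair coin flip.
    Any single response is deniable, but the population rate is recoverable."""
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5

def estimate_rate(responses, p_truth: float = 0.75) -> float:
    """Invert the noise: observed = p_truth * true + (1 - p_truth) * 0.5."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_truth) * 0.5) / p_truth
```

Run over a large enough population, the analyst learns the aggregate infection (or any other) rate without being able to trust any individual answer.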
The precise details of the NHSX app are not clear at this stage but it seems that the approach will involve identifiers being shared between mobile devices when they come into close proximity. These identifiers will then be uploaded to a central service to support studying the epidemiology of COVID-19 and to facilitate notifying people who may be at risk, having been in close proximity to an infected person. Whilst the stated intention is for those identifiers to be anonymous, the parliamentary debate clearly showed there are a number of ways that the identifiers could become more identifiable over time. Because the identifiers are persistent, they are likely to be only pseudonymous at best.
By way of contrast, a large team of academics has developed an approach called DP-3T, which apparently has influenced designs in Germany and elsewhere. It uses ephemeral (short-lived) identifiers. The approach is not fully decentralised, however. When a user reports that they have COVID-19 symptoms, the list of ephemeral identifiers that user’s device has received, when coming into close proximity to other devices, is shared via a centralised service. In fact, they are broadcast to every device in the system, so that risk decisioning is made at the edges, not in the middle. This means that no central database of identifiers is needed (although presumably there will be a database of registered devices).
It also means there will be less scope for epidemiological research.
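For the curious, the core of the ephemeral-identifier idea can be sketched in a few lines. This is a deliberate simplification for illustration, not the actual DP-3T specification: identifiers are derived from a daily key, and matching against reported-infected users happens on the device, not in the middle:

```python
import hashlib
import os

def daily_key() -> bytes:
    """A fresh random secret per device per day."""
    return os.urandom(32)

def ephemeral_ids(day_key: bytes, slots: int = 96):
    """Derive short-lived broadcast identifiers from a daily key
    (one per 15-minute slot; a simplification of the real scheme)."""
    return [hashlib.sha256(day_key + slot.to_bytes(2, "big")).digest()[:16]
            for slot in range(slots)]

def at_risk(heard_ids, published_day_keys) -> bool:
    """Edge-side matching: rebuild the identifiers of reported-infected
    users from their published day keys and compare locally."""
    infected = {eid for key in published_day_keys for eid in ephemeral_ids(key)}
    return any(eid in infected for eid in heard_ids)
```

Because only the day keys of users who report symptoms are ever published, the central service never learns who met whom; the trade-off, as noted above, is that it also learns much less for epidemiological research.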
All of this is way beyond the understanding of most people, including those tasked with providing parliamentary scrutiny. So how can the average person on the street, or the average peer in Westminster, be confident in the NHSX app? Well, apparently the NHSX app is going to be open-sourced, and that is probably going to be our greatest protection. It will mean you won’t need to rely on what NHSX says: inevitably there will be universities, hackers, enthusiasts and others lining up to pick it apart.
2. Can a centralised system interoperate with the decentralised systems in other countries to allow cross border contact tracing?
It seems to us that whether a system is centralised or not is a gross simplification of the potential interoperability issues. True, the primary issue does seem to be the way that identifiers are generated, shared and used in risk decisioning. For cross border contact tracing to be possible there will need to be alignment on a whole range of other things including technical standards, legal requirements and perhaps even, dare I say it, liability. Of course, if the DP-3T model is adopted by many countries then it could become the de facto standard, in which case that could leave the NHSX app isolated.
Will the NHSX app be an effective tool to help us get back to normal? This will depend entirely on how widely it is adopted, which in turn will require people to see that the benefits outweigh the costs. That’s a value exchange calculation that most people will not be able to make. How can they make a value judgment on the potential risks to their civil liberties of such a system? The average user is probably more likely to notice the impact on their phone’s battery life or when their Bluetooth headphones stop working.
It’s that time of year again. I’ve had a chat with my colleagues at Consult Hyperion, gone back over my notes from the year’s events, taken a look at our most interesting projects around the world and brought together our “live five” for 2018. Now, as in previous years, I don’t expect you to pay any attention to our prognostications without first reviewing our previous attempts, otherwise you won’t have any basis for taking us seriously! So let’s begin by looking back over the last year and then we’ll take a shot at the new one!
This was the “live five” of technology-driven changes in the secure transactions field that we thought would have a real business impact over the previous year. In the spirit of openness and honesty and disclosure that we are famed for, let’s see how those predictions fared.
- RegTech. I think we did pretty well with this prediction. Interest in regtech has grown throughout the year and the ability of regtech to make real differences in major markets is established.
- Digital Identity. As we noted, one of the key regtechs, if not the key regtech, is digital identity. It did shoot up the agenda over the year and some interesting initiatives opened up.
- PSD2 (still). No commentary is needed!
- Paying on the Go. We thought that a key use of open APIs would be payments, and very likely mobile payments. MasterCard’s purchase of VocaLink would tend to support this view!
- Invisible POS. The shift from “check out to check in” paradigms is underway, but it is fair to observe that we did not see the number of launches we were expecting, as many of the projects remain in beta and are holding off to await the arrival of PSD2 (and the CMA remedies in the UK).
Not bad. In fact, pretty good. So now let’s take a look at where we think the action will be in the coming year in our corner of the transactions treehouse. My guess is that you’ll agree with four out of the five – if not… let us know!
From the perspective of our home base in the UK, the really big trend is easy to predict and wholly uncontroversial, since open banking is going to transform our industry. Thinking around this opens up a couple of adjacent areas as well. So…
- Open Banking. In the UK, the regulators’ determination to bring real competition to the financial services world means that we are about to see major disruption in the space. Last year I called this a “crossing of the streams” (in homage to Ghostbusters!) because there are three different initiatives coming together. The first stream is the PSD2 provisions for access to payment accounts. As you may recall, these include a set of proposals that are due to come into force in 2018. A group of those proposals are what we in the business call “XS2A”, the proposals which force banks to open up to permit the initiation of credit transfers (“push payments”) and account information queries. Even at a pure compliance level these PSD2 regulations pose significant questions for the structure of the existing payments industry. While PSD2 does not mandate APIs (I think – it’s all gotten a bit complicated, but as far as I know the screen-scrapers have fought a decent rearguard action), an open banking API is the obvious way to implement the PSD2 provisions.
The second stream is Her Majesty’s Treasury’s push for more competition in retail banking. This led to the creation of the Open Banking Working Group (OBWG), which published its report in 2016. The report set out a four-part framework, comprising:
- A data model (so that everyone knows what “account”, “amount”, “account holder” etc. mean);
- An API standard;
- A security standard;
- A governance model.
The third stream is the CMA report that triggered the remedies mentioned above. This envisages APIs to improve competition in retail banking by focusing on the use of APIs to obtain access to personal data that can be shared with third-parties to obtain better, more cost-effective services.
These streams are coming together to create an environment of what is now called Open Banking. And it’s a big deal. And it begins in January 2018 when the nine biggest banks open up their APIs and the UK becomes a fascinating and exciting laboratory for new services. Who will take advantage of this new environment? Well, in our opinion, it’s not the fintechs. And we are not the only ones who think this.
Much has been made of the rise of fintech [but] according to a report by the World Economic Forum (WEF), traditional banks are more vulnerable to competition from another source: tech giants like Amazon, Facebook, and Google.
As we have pointed out for some time, it is not at all obvious that what we refer to as the “challenger” banks in the UK (i.e., the new banks who have obtained licences in recent years) are really challengers at all. The era of the “challenger banks” is coming to an end as the internet giants compete to be the front end to the customer’s transactional financial services.
- Conversational Transactions. One class of application that will exploit API integration with banking and payment systems is chat, whether through standard messaging applications or “chatbot” interfaces. This is hardly a wild prediction, but we think that the early steps (e.g., Facebook Messenger’s recent UK payments launch) indicate a major shift in 2018. Right now, when my sons at University ask me for money on WhatsApp, I have to switch to Barclays Pingit to send the money. Not for much longer. And it is important to understand the roadmap here, because the link between conversational commerce and voice commerce is straightforward. It’s a small step from typing “Send £20 for the ticket” to saying “Send £20 for the ticket”.
- The Internet of Cars. Anyone who visited Mobile World Congress or CES or, I’m sure, many other events throughout the year, couldn’t have failed to notice the amount of work going on in the “internet of things” (we all understand just how important that will be) and how much of the IoT focus is on the automobile sector. You can see why this is: cars are expensive, so they can stand the cost of adding smart technology that can deliver new functionality. However, as Consult Hyperion have always said, doors are easy but locks are hard. It’s easy to connect the myriad systems in the modern car to the world, but it’s really hard to secure them. This is a great opportunity for organisations with skills in encryption, authentication, key management, operational security and so on to help the automobile industry. It’s one thing when your bank account gets hacked (because the bank has to give you your money back), but when the hackers are crashing cars for fun it’s another thing altogether. If we want our cars to engage in transactions then we have to be sure that the security infrastructure for those transactions is absolutely solid.
- Artificial Intelligence. Well, when it comes to money, and indeed absolutely everything else, there is no doubt that AI will be the most disruptive technology of our generation. We may be a long way from Terminators and HAL 9000, but the massive AI investments pouring into financial services around the world mean that the technology is going to transform our business, and soon. If you examine where banks are spending their AI budgets right now, machine learning is the main focus. An Infosys poll earlier in the year showed that two-thirds of banks were already spending in this area and this is no surprise. Banks have large quantities of data that in the past they have found difficult to extract wisdom from, and they have large transactional flows that they find difficult to manage in the context of increasing regulatory burdens. Machine learning systems excel at finding patterns and exceptions in such data, provided that they can be fed voracious quantities of raw material, so the main use of machine learning systems is currently fraud detection and prevention. This throws up an interesting strategic challenge for banks in the new Open Banking world, because there is a threat to risk management, information analysis and sales/marketing processes in the new environment, where banks may not get to see the data held by third-party providers but those providers have access to bank accounts.
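The “patterns and exceptions” point can be illustrated with a deliberately crude example. Real fraud systems use trained models over many features; a simple statistical outlier test on transaction amounts gives the flavour of what they do at vastly greater scale:

```python
from statistics import mean, stdev

def flag_exceptions(amounts, threshold=2.0):
    """Flag transactions whose amount is more than `threshold` standard
    deviations from the mean: a crude stand-in for the pattern-and-exception
    finding that machine learning fraud systems perform over many features."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > threshold * sigma]

# A run of everyday amounts with one anomaly stands out immediately.
flag_exceptions([10, 12, 11, 9, 10, 500])  # -> [500]
```

The interesting part, as the paragraph above notes, is not the statistics but who holds the data to run it over in an Open Banking world.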
- Tokens/ICOs. Well, those first four predictions are mainstream. But it’s fun to pick something out of left field (as our American cousins would say) by looking where technology might mean very different kinds of assets being used in transactions. We might well see a new kind of money emerge in the coming year. Not Bitcoin, but “tokens” (the basis of Initial Coin Offerings, or ICOs). When the current craziness is past and tokens become a regulated but wholly new kind of digital asset, a cross between corporate paper and a loyalty scheme, they will present an opportunity to remake markets in a new and better way. One might imagine a new version of the London Alternative Investment Market (AIM) where start-ups launch, but instead of issuing equity they create claims on their future in the form of tokens. The trading of these tokens is indistinguishable from the trading of electronic cash (because they are bearer instruments with no clearing or settlement) but there will be an additional transparency in corporate affairs because aspects of the transactions are public. The transparency obtained from using modern cryptography (e.g. homomorphic encryption and zero-knowledge proofs) in interesting ways is, as an aside, one of the reasons why we tend to think of the blockchain as a regtech, not a fintech.
All in all, the coming year will see much more disruption than might be apparent at first because the shift to open banking, starting in the UK, is what will drive the reshaping of the sector while at the same time the advance of AI into the transaction space (transactions of all types, from buying a train ticket to selling corporate bonds) begins to reshape the way we do business.
The publication by NIST of an updated version of its digital identity guidelines marks a significant change in its approach to identity management. It highlights the importance of implementing digital identity in context, with three different elements replacing the previously monolithic Level of Assurance. These Levels are the Identity Assurance Level for identity proofing, the Authenticator Assurance Level for authentication and the Federation Assurance Level for use in a federated environment. Criteria for each Assurance type run from Level 1 to Level 3. This is intended to provide greater flexibility in implementation, for example combining pseudonymity with strong authentication for privacy purposes. Although optional, federation is positively encouraged for reasons of user experience, cost and privacy.
Risk management features prominently in the guidelines, with risk assessments used to determine appropriate identity choices according to system requirements. Although the requirements are technology agnostic, they are prescriptive regarding the assurance levels required for particular purposes. One area in which the guidelines are particularly refreshing is in their approach to passwords. Drawing on research into passwords exposed during data breaches, the use of unwieldy complexity rules is discouraged. Instead, it is suggested that users should be allowed to make passwords as long as they wish, encouraging the use of pass phrases and excluding very short passwords.
Faced with restrictive rules, many users will select predictable passwords which just meet the system requirements but are easily guessed. It is suggested that passwords should be checked against a blacklist of obvious choices and known compromised passwords, to filter these out. Randomly-generated secrets are therefore preferred to user-generated secrets.
The guidelines also highlight the importance of usability, supporting the use of password managers and only requiring passwords to be changed when there is evidence of compromise. There is some flexibility regarding displaying passwords on screen, depending on the context. In order to maintain an adequate level of security, a mechanism for limiting the number of possible failed authentication attempts is required.
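Putting those password recommendations together, a minimal sketch might look like this (the blacklist entries and the minimum length here are illustrative; a real deployment would load a large list of known-breached passwords):

```python
# Hypothetical blacklist; real deployments load known-breached password lists.
BLACKLIST = {"password", "12345678", "qwertyuiop", "letmein1"}

MIN_LENGTH = 8  # a sensible minimum, with no composition rules imposed

def acceptable(password: str) -> bool:
    """Allow arbitrarily long pass phrases; reject only secrets that are
    too short or that match an obvious or known-compromised choice."""
    return len(password) >= MIN_LENGTH and password.lower() not in BLACKLIST
```

Note what is absent: no mandatory digits, symbols or mixed case, and no scheduled expiry, in line with the guidance that such rules push users towards predictable choices.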
This new, more person-centric approach from NIST follows on from UK government guidance published by GCHQ in 2016, advising ‘dramatic simplification’ of password management policies. This guidance also focused on achieving security by implementing processes which are easier for people to follow and therefore less susceptible to being undermined by users attempting to take short cuts through the system.
CHYP’s involvement in research has highlighted for us the difference between the way people say they behave and how they actually behave online. This kind of performativity may take the form of people describing how careful they are online (perhaps repeating recent official advice), while doing something conflicting on screen even as they are speaking. A similar effect can be seen when comparing figures produced from a user survey by the Gambling Commission, to usage statistics reported by gambling companies. The companies are able to draw statistics directly from their systems, while the survey figures are composed of gamblers’ reporting of their own behaviour. These discrepancies highlight the importance of observation when developing policies based on user behaviour.
It is encouraging to see a more effective approach to combination of privacy, security and usability in Identity Management being promoted at the highest levels. Even in local hospitals, it is now common to see screens showing simply ‘tap your pass or enter your passphrase’, where previously unpredictable processes were in place. Organisations such as FIDO have done a great deal to promote standardisation.
For a standalone organisation to adopt the new NIST rules would seem both positive and achievable. They are in any case intended to be used within the US government. However, where organisations are already working in partnership and have existing legacy agreements regarding security requirements, it may be necessary to revisit these and agree a new set of password rules to replace existing, outdated approaches. Standardisation and education can go a long way towards supporting this process, although for larger organisations and those with multiple partners, it may take longer.
Publications such as ‘Why Johnny can’t encrypt’ and ‘Users are not the enemy’ have long been recognised for highlighting enduring issues with implementing security software. While education is important, attempts to fundamentally change people will inevitably fail, resulting in escalating support costs and unpredictable security risks. People are simply not equipped to adjust that quickly. In comparison, machines are generally designed by people and comparatively easily modified. Even with the advent of AI, machines are likely to remain reasonably malleable.
Where most user interaction involves people and machines, security tends also to involve mathematics. The NIST guidelines prescribe the use of appropriate cryptography at every stage. This is essential to securing the system but does not of itself guarantee that the system will remain secure. Appropriate system design and implementation are crucial to ensuring secure operations. This is exemplified by the recent flaw discovered in the WPA2 WiFi protocol. A mathematical proof is available for the security of the protocol but there is a vulnerability in the key management, which is not covered by the proof.
As in any system, a mathematical proof has to be ‘situated’ to be useful. Effective risk modelling will take into account the wider context of the system, focusing in on the most critical areas for greater attention. This process may have to be revisited over time, as the surrounding environment evolves. The increasing interconnectedness of the Internet of Things will require greater attention to disconnection technologies to preserve system integrity over time.
How are mobile payments getting on in the UK? According to the most recent figures from Transport for London, mobile phones now account for about 8% of their contactless transactions, so clearly there are plenty of people who already use the phone in their hand rather than reach for the card in their pocket. Yet as many commentators have observed, out in the wider world — whether Android Pay or Tesco PayQwiq, Paym or Barclaycard Mobile — mobile payments seem to be facing something of a struggle to become mainstream.
With Consult Hyperion’s annual Tomorrow’s Transactions Forum coming up this week, we asked our good friends at Crescendo to use their array of clever Twitter sentiment analysis tools to give us an up-to-the-minute snapshot of the UK. They found that in conversations about mobile payments (which are dominated by Apple Pay, accounting for almost four-fifths of the conversations) there are roughly twice as many negative conversations as positive ones! Now that might be because people are quick to vent on Twitter when something doesn’t work properly but slower to praise when it does (I’m certainly guilty of this), but if we take the sentiment analysis at face value it seems to show that customers by and large like mobile payments when they work but are frustrated with the experience because it just doesn’t work the way it should and where it should.
There are a variety of reasons for this, ranging from gaps in the training of checkout staff to a failure of education (most people still don’t realise that the £30 limit that applies to contactless cards does not apply to contactless mobile payments so you can use your phone for your weekly shop) and confusion about acceptance (in some shops, for example, you can pay by contact with some cards but not pay with those same cards using mobile contactless).
Now, mobile payments is not all about mobile contactless. It’s about mobile-initiated transfer of money from one account (the consumer’s) to another account (the merchant’s). And while we use cards for this now (except in Starbucks, where we all use our app), with PSD2 on the horizon and MasterCard’s purchase of VocaLink we can certainly expect to see more direct-to-account credit transfers in the consumer marketplace. So we asked Crescendo to see if there’s any talk around this. They found that right now those conversations are dominated by Barclays Pingit and, while the negative comments still outweigh the positive comments, it is, rather interestingly, by a much smaller margin than for mobile contactless. I wonder if this is perhaps a weak signal that mobile payment apps will be more popular than mobile contactless taps?
Does any of this matter? Perhaps the way that mobile payments work now isn’t much of a guide to the way they will work in the future. Maybe tapping on things, whether a card or a phone or a wristband or anything else, is all a bit last year? Maybe it doesn’t matter whether people tap phones or cards because in time all payments will be going in-app (or in-browser) and that’s where we should be focusing for the future. The web’s standards body, the World Wide Web Consortium (W3C), is currently working on a standard for these payments and this will likely hasten the physical and virtual convergence.
You can hear about the status of the standardisation process from the W3C themselves at the 20th annual Consult Hyperion Tomorrow’s Transactions Forum in London this week. Oh, and you’ll hear all about the status of PSD2, the future for mass market payments, financial inclusion, innovative uses of the blockchain, privacy, the Internet of Things, transit payments and much else besides.
At this point I would normally implore you to head over to our web site to score a ticket for this unique event. But there’s no point today because all the tickets have been sold and there are no places left. If you’re one of the lucky few with a delegate place, see you Wednesday.
The BBC World Service has a podcast series called “50 things that made the modern economy” hosted by the economist Tim Harford. It features inventions ranging from COBOL and banks to antibiotics and, interestingly, M-PESA. This caught my attention because M-PESA is one of the Consult Hyperion projects from the last couple of decades that we might find ourselves chatting about at the forthcoming 20th annual Forum, Tomorrow’s Transactions 2017. The Forum will be held at the America Square conference centre in London on 26th/27th April and Kevin Amateshe, the current M-PESA product manager will be coming in from Nairobi to give us a detailed picture of where M-PESA is now and where it will be going next.
The Forum, thanks to the wonderful support from our friends at Vocalink, PaySafeGroup, WorldPay and Olswang, will once again provide a unique environment for learning, investigation, discussion and debate about the future of electronic transactions. The future of people, businesses and government in the post-industrial online and interconnected economy.
This year’s invited keynote will be given by Professor Lisa Servon, one of the world’s leading authorities on financial and social inclusion. All delegates will receive a copy of Lisa’s new book “The Unbanking of America: How the New Middle Class Survives”.
Incidentally, listening to the BBC podcast narrating the story of our good friends Nick Hughes and Susie Lonie (Susie will be at the Forum too if you’d like to come along and say hi to her) brought back many memories, so I decided to conduct a little bit of post-industrial archaeology and I tracked down the presentations on M-PESA that Nick Hughes and our very own Paul Makin (who led the original feasibility study for M-PESA!) gave at the Centre for the Study of Financial Innovation (CSFI) in November 2005, when M-PESA had 300 users and eight agents!!! As of today, it has 25 million users and 261,000 agents across 11 countries.
You can read them here….
See you all in April when we get together and try to work out what the next M-PESA will be!