But what if it’s something more important than money at stake? How can confidence be maintained? Remember the example discussed a couple of weeks ago. Staff at the South Warwickshire NHS Trust have been logging into their new government computer system. This system is being installed in all hospitals (and elsewhere) and since everyone in Britain will have their health records stored on it, it’s important that it is secure. Now, I know that it is secure because I went to the Parliamentary IT Committee (PITCOM) meeting at which the head of the project said that everyone who works for the NHS (which, from a risk analysis point of view, is essentially everyone in Britain) would get a super new smart card. This was very reassuring. He also said that getting access to medical records would not be as "simple as sticking a card in a machine".
No matter that some of the smart cards were sent out with the same PIN, and that the PIN was written on the backs of the cards. To be fair, the relevant government minister, Liam Byrne, said that this breach of procedures had not posed a significant risk to the confidentiality of patient information, and he used to work for Accenture, so that’s quite reassuring.
The system is now up and running, and NHS staff need to use their smart card to access the system. Well, they need to use someone’s smart card. In fact, at the South Warwickshire NHS Trust, staff just stick a card in a machine. Specifically, they stick their supervisor’s card into the system and then leave it logged in for them all to share, because they can’t be bothered to log in and log out every time they want to use the system. Not surprising, since logging in to the hospital’s new iSoft iPM patient administration system (PAS) takes 60-90 seconds on average, which is not acceptable in a busy A&E environment.
I’m interested in this case study because of what it says about the practical use of smart cards in large-scale government projects, so I googled around and found the meeting record where the acting head of IT for the Trust says that there’s no security problem around staff using other people’s smart cards to log in. So the trajectory appears to be "don’t worry, it’s secure because all the users have personal smart cards" to "there’s no risk in people sharing smart cards" to (someday soon, I’m sure) Scott McNealy’s famous "you have no privacy, get over it". As a spokesman for the British Medical Association’s GP IT subcommittee told Computer Weekly, the behaviour "[drove] a coach and horses through the so-called privacy in the new systems".
No one would, for one moment, argue that what the NHS is trying to do is simple. It isn’t. But I’m sure that this approach isn’t the best way to build and maintain public confidence. Nor are the statements from Connecting for Health, the actual programme responsible for the new NHS computer system. They had originally said that the sharing of smart cards would be treated as misconduct, requiring disciplinary procedures. Now, however, they’ve washed their hands of it saying that "responsibility for the security of patient information ultimately lies with individual Trusts, hospitals and NHS organisations" and not, apparently, with the people who designed the system.
How does this sort of thing happen? I don’t mean technically — we were advising clients several years ago that they should plan for the advent of PKI-based contactless smart cards if they wanted fast, secure transactions — but organisationally. Presumably the government people in charge of all of this, and their management consultants, had some idea about how it would all work but I don’t know how it was ever communicated or scrutinised properly. Wouldn’t transparency be the best way to build confidence for this kind of scheme? Explain how the smart cards (and the cryptography) work, use widely-understood standards, invite public comment and inspection, develop a public specification and then invite the industry to build competitive solutions. Telling people that something is going to be secure and asking them to simply trust you is sub-optimal in so many ways.
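To make the "explain how the smart cards (and the cryptography) work" point concrete, here is a minimal sketch of the challenge-response login pattern that card-based systems generally use: the server issues a fresh random nonce, the card proves it holds a secret by computing a keyed response, and the server verifies it. This is an illustrative stand-in, not the NHS design: a real PKI card would sign the nonce with its private key and the server would verify against the card's certificate; since the Python standard library has no asymmetric primitives, an HMAC over a provisioned secret stands in for the card's signing step here.

```python
import hashlib
import hmac
import os

def issue_challenge() -> bytes:
    """Server side: generate a fresh random nonce for each login attempt,
    so a captured response can't be replayed later."""
    return os.urandom(16)

def card_respond(card_secret: bytes, challenge: bytes) -> bytes:
    """Card side: prove possession of the secret by keying a MAC over the
    challenge (stand-in for a private-key signature on a real PKI card)."""
    return hmac.new(card_secret, challenge, hashlib.sha256).digest()

def server_verify(card_secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the expected response and compare in
    constant time to avoid timing leaks."""
    expected = hmac.new(card_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

secret = os.urandom(32)            # provisioned onto the card at issue time
nonce = issue_challenge()
response = card_respond(secret, nonce)
print(server_verify(secret, nonce, response))            # genuine card
print(server_verify(os.urandom(32), nonce, response))    # wrong card fails
```

The point of publishing something like this openly is that anyone can see what the card actually proves, and what it doesn't: none of it works if the card (or the session it opened) is simply handed around the ward.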
My opinions are my own (I think) and are presented solely in my capacity as an interested member of the general public.
[posted with ecto]