Lockstep

Mobile: +61 (0) 414 488 851
Email: swilson@lockstep.com.au

Safeguarding the pedigree of identifiers

The problem of identity takeover

The root cause of much identity theft and fraud today is the sad fact that customer reference numbers and personal identifiers are so easy to copy. Simple numerical data like bank account numbers and health IDs can be stolen from many different sources, and replayed in bogus transactions.

Our personal data nowadays is leaking more or less constantly, through breached databases, websites, online forms, call centres and so on, to such an extent that customer reference numbers on their own are no longer reliable. Privacy consequently suffers because customers are required to assert their identity through circumstantial evidence, like name and address, birth date, mother’s maiden name and other pseudo secrets. All this data in turn is liable to be stolen and used against us, leading to spiralling identity fraud.

To restore the reliability of personal identifiers, we need to know their pedigree. We need to know that a presented number is genuine, that it originated from a trusted authority, that it has been stored safely by its owner, and that it is being presented with the owner’s consent.

"Notarising" personal data in chip devices

There are ways of issuing personal data to a smart chip device that prevent those data from being stolen, copied and claimed by anyone else. One way to do so is to encapsulate and notarise personal data in a unique digital certificate issued to a chip. Today, a great many personal devices routinely embody cryptographically suitable chips for this purpose, including smart phones, SIM cards, “Secure Elements”, smartcards and many wearable computers.

Consider an individual named Smith to whom Organisation A has issued a unique customer reference number N. If N is saved in ordinary computer memory or something like a magnetic stripe card, then it has no pedigree. Once the number N is presented by the cardholder in a transaction, it looks like any other number. To better safeguard N in a chip device, it can be sealed into a digital certificate, as follows:

1. generate a fresh private-public key pair inside Smith’s chip
2. export the public key
3. create a digital certificate around the public key, with an attribute corresponding to N
4. have the certificate signed by (or on behalf of) Organisation A.
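The four steps can be sketched in miniature. The following stdlib-only Python uses tiny "textbook RSA" keys purely to show the flow; it is not secure, the reference number and key parameters are hypothetical, and in practice the private key would be generated and retained inside certified chip hardware:

```python
import hashlib
import json
import math

# Toy "textbook RSA" with small fixed primes - NOT secure; a real chip
# generates and holds the private key in tamper-resistant hardware and
# uses standard signature padding. Illustration only.
def toy_keypair(p, q):
    n, phi = p * q, (p - 1) * (q - 1)
    e = 3
    while math.gcd(e, phi) != 1:      # smallest usable public exponent
        e += 2
    return (n, e), (n, pow(e, -1, phi))   # (public key, private key)

def toy_sign(priv, message):
    n, d = priv
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def toy_verify(pub, message, sig):
    n, e = pub
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h

# Step 1: generate a fresh key pair "inside Smith's chip"
chip_pub, chip_priv = toy_keypair(2147483647, 2305843009213693951)

# Step 2: export the public key only; chip_priv never leaves the device

# Step 3: create a certificate around the public key, with an attribute for N
cert_body = json.dumps({
    "subject_public_key": chip_pub,
    "customer_reference": "N-12345678",     # hypothetical value of N
    "issuer": "Organisation A",
}, sort_keys=True).encode()

# Step 4: have the certificate signed by Organisation A
orgA_pub, orgA_priv = toy_keypair(999983, 1000003)
cert_signature = toy_sign(orgA_priv, cert_body)

# Anyone holding Organisation A's public key can now check N's pedigree
assert toy_verify(orgA_pub, cert_body, cert_signature)
```

A real deployment would express the certificate as X.509, with N carried in a subject attribute or extension.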

[Diagram: the pedigree triangle binding holder, identifier and device]

The result of coordinating these processes and technologies is a logical triangle that inextricably binds cardholder Smith to their reference number N and to a specific personally controlled device. The certificate signed by Organisation A attests to Smith’s ownership of both N and a particular key unique to the device. Keys generated inside the chip are retained internally, never divulged to outsiders. It is impossible to copy the private key to another device, so the triangle cannot be cloned, reproduced or counterfeited.

Note that this technique lies at the core of the EMV "Chip-and-PIN" system where the smart payment card digitally signs cardholder and transaction data, rendering it immune to replay, before sending it to the merchant terminal. See also my 2012 paper Calling for a uniform approach to card fraud, offline and on. Now we should generalise notarised personal data and digitally signed transactions beyond Card-Present payments into as much online business as possible.

Restoring privacy and consumer control

When Smith wants to present their personal number in an electronic transaction, instead of simply copying N out of memory (at which point it would lose its pedigree), Smith’s transaction software digitally signs the transaction using the certificate containing N. With standard security software, any third party can then verify that the transaction originated from a genuine chip holding the unique key certified by A as matching the number N.
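A hedged sketch of this presentation-and-verification flow, again with toy textbook RSA and invented names and values (a real deployment would use standard X.509 certificates and signature padding):

```python
import hashlib
import json
import math

# Toy textbook-RSA helpers (tiny keys, no padding - illustration only).
def keypair(p, q):
    n, phi = p * q, (p - 1) * (q - 1)
    e = 3
    while math.gcd(e, phi) != 1:
        e += 2
    return (n, e), (n, pow(e, -1, phi))

def sign(priv, msg):
    n, d = priv
    return pow(int.from_bytes(hashlib.sha256(msg).digest(), "big") % n, d, n)

def verify(pub, msg, sig):
    n, e = pub
    return pow(sig, e, n) == int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# Setup, as issued earlier: Organisation A has certified Smith's chip key
# against reference number N (all values hypothetical).
orgA_pub, orgA_priv = keypair(999983, 1000003)
chip_pub, chip_priv = keypair(2147483647, 2305843009213693951)
cert = json.dumps({"pub": chip_pub, "N": "N-12345678"}, sort_keys=True).encode()
cert_sig = sign(orgA_priv, cert)

# Presentation: instead of copying N out of memory, Smith's software signs
# the whole transaction with the chip-held private key.
txn = b'{"payee": "Organisation B", "amount": 120.00}'
txn_sig = sign(chip_priv, txn)

# Any third party holding Organisation A's public key can then verify:
def pedigree_ok(txn, txn_sig, cert, cert_sig, authority_pub):
    if not verify(authority_pub, cert, cert_sig):   # 1. cert genuinely from A
        return False
    chip_pub = tuple(json.loads(cert)["pub"])       # 2. key certified against N
    return verify(chip_pub, txn, txn_sig)           # 3. txn came from that chip

assert pedigree_ok(txn, txn_sig, cert, cert_sig, orgA_pub)
```

Note that the verifier never needs Smith's private key, only Organisation A's public key and the certificate; that is what preserves the pedigree of N end to end.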

Note that N doesn’t have to be a customer number or numeric identifier; it could be any personal data, such as a biometric template or a package of medical information like an allergy alert.

The capability to manage multiple key pairs and certificates, and to sign transactions with a nominated private key, is increasingly built into smart devices today. By narrowing down what you need to know about someone to a precise customer reference number or similar personal data item, we will reduce identity theft and fraud while radically improving privacy. This sort of privacy-enhancing technology is the key to a safe Internet of Things, and fortunately it is now widely available.

Posted in Smartcards, Security, PKI, Payments, Identity, Fraud, Biometrics

Postcard from Monterey 3 #CISmcc

Days 3 and 4 at CIS Monterey.

Andre Durand's Keynote

The main sessions at the Cloud Identity Summit (namely days three and four overall) kicked off with keynotes from Ping Identity chief Andre Durand, New Zealand technology commentator Ben Kepes, and Ping Technical Director Mark Diodati. I'd like to concentrate on Andre's speech for it was truly fresh.

Andre has an infectious enthusiasm for identity, and is a magnificent host to boot. As I recall, his CIS keynote last year in Napa was pretty simply a dedication to the industry he loves. Not that there's anything wrong with that. But this year he went a whole lot further, with a rich deep dive into some things we take for granted: identity tokens and the multitude of security domains that bound our daily lives.

It's famously been said that "identity is the new perimeter" and Andre says that view informs all they do at Ping. It's easy I think to read that slogan to mean security priorities (and careers) are moving from firewalls to IDAM, but the meaning runs deeper. Identity is meaningless without context, and each context has an edge that defines it. Identity is largely about boundaries, and closure.

  • MyPOV and as an aside: The move to "open" identities which has powered IDAM for over a decade is subject to natural limits that arise precisely because identities are perimeters. All identities are closed in some way. My identity as an employee means nothing beyond the business activities of my employer; my identity as an American Express Cardholder has no currency at stores that don't accept Amex; my identity as a Qantas OneWorld frequent flyer gets me nowhere at United Airlines (nor very far at American, much to my surprise). We discovered years ago that PKI works well in closed communities like government, pharmaceutical supply chains and the GSM network, but that general purpose identity certificates are hopeless. So we would do well to appreciate that "open" cross-domain identity management is actually a special case and that closed IDAM systems are the general case.

Andre reviewed the amazing zoo of hardware tokens we use from day to day. He gave scores of examples, including driver licenses of course but license plates too; house key, car key, garage opener, office key; the insignias of soldiers and law enforcement officers; airline tickets, luggage tags and boarding passes; the stamps on the arms of nightclub patrons and the increasingly sophisticated bracelets of theme park customers; and tattoos. Especially vivid was Andre's account of how his little girl on arriving at CIS during the set-up was not much concerned with all the potential playthings but was utterly rapt to get her ID badge, for it made her "official".

[Photo: tokens make us "official"]

Tokens indeed have always had talismanic powers.

Then we were given a fly-on-the-wall slide show of how Andre typically starts his day. By 7:30am he has accessed half a dozen token-controlled physical security zones, from his home and garage, through the road system, the car park, the office building, the elevator, the company offices and his own corner office. And he hasn't even logged into cyberspace yet! He left unsaid whether or not all these domains might be "federated".

  • MyPOV: Isn't it curious that we never seem to beg for 'Single Sign On' of our physical keys and spaces? I suspect we know instinctively that one-key-fits-all would be ridiculously expensive to retrofit and would require fantastical cooperation between physical property controllers. We only try to federate virtual domains because the most common "keys" - passwords - suck, and because we tend to underestimate the cost of cooperation amongst digital RPs.
[Photo: the properties of tokens]

Tokens are, as Andre reminded us, on hand when you need them, easy to use, easy to revoke, and hard to steal (at least without being noticed). And they're non-promiscuous in respect of the personal information they disclose about their bearers. It's a wondrous set of properties, which we should perhaps be more conscious of in our work. And tokens can be used off-line.

  • MyPOV: The point about tokens working offline is paramount. It's a largely forgotten value. Andre's compelling take on tokens makes for a welcome contrast to the rarely questioned predominance of the cloud. Managing and resolving identity in the cloud complicates architectures, concentrates more of our personal data, and puts privacy at risk (for it's harder to unweave all the traditionally independent tracks of our lives).

In closing, Andre asked a rhetorical question which was probably forming in most attendees' minds: What is the ultimate token? His answer had a nice twist. I thought he'd say it's the mobile device. With so much value now remote, multi-factor cloud access control is crucial; the smart phone is the cloud control du jour and could easily become the paragon of tokens. But no, Andre considers that a group of IDAM standards could be the future "universal token" insofar as they beget interoperability and portability.

He said of the whole IDAM industry "together we are networking identity". That's a lovely sentiment and I would never wish to spoil Andre Durand's distinctive inclusion, but on that point technically he's wrong, for really we are networking attributes! More on that below and in my previous #CISmcc diary notes.

The identity family tree

My own CISmcc talk came at the end of Day 4. I think it was well received; the tweet stream was certainly keen and picked up the points I most wanted to make. Attendance was great, for which I should probably thank Andre Durand, because he staged the Closing Beach Party straight afterwards.

I'll post an annotated copy of my slides shortly. In brief I presented my research on the evolution of digital identity. There are plenty of examples of how identity technologies and identification processes have improved over time, with steadily stronger processes, regulations and authenticators. It's fascinating too how industries adopt authentication features from one another. Internet banking for example took the one-time password fob from late 90's technology companies, and the Australian PKI de facto proof-of-identity rules were inspired by the standard "100 point check" mandated for account origination.

[Slide: the authentication family tree - bank identity evolves]

Clearly identity techniques shift continuously. What I want to do is systematise these shifts under a single unifying "phylogeny"; that is, a rigorously worked-out family tree. I once used the metaphor of a family tree in a training course to help people organise their thinking about authentication, but the inter-relationships between techniques were guesswork on my part. Now I'm curious if there is a real family tree that can explain the profusion of identities we have been working so long on simplifying, often to little avail.

[Slide: the authentication family tree]

True Darwinian evolution requires there to be replicators that correspond to the heritable traits. Evolution results when the proportions of those replicators in the "gene pool" drift over generations as survival pressures in the environment filter beneficial traits. The definition of Digital Identity as a set of claims or attributes provides a starting point for a Darwinian treatment. I observe that identity attributes are like "Memes" - the inherited units of culture first proposed by biologist Richard Dawkins. In my research I am trying to define sets of available "characters" corresponding to technological, business and regulatory features of our diverse identities, and I'm experimenting with phylogenetic modelling programs to see what patterns emerge in sets of character traits shared by those identities.
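As a toy of what such phylogenetic modelling might look like, here is a stdlib-only Python sketch; the schemes, character traits and scores are invented for illustration and are not results from the research:

```python
# Hypothetical "character" matrix for a few identity schemes: 1 = trait present.
# Traits and scores are invented for illustration only.
traits = ["chip-based", "government-vetted", "100-point check",
          "one-time password", "online-only enrolment"]

identities = {
    "EMV payment card": [1, 0, 1, 0, 0],
    "Internet banking": [0, 0, 1, 1, 0],
    "Australian PKI":   [1, 1, 1, 0, 0],
    "Social login":     [0, 0, 0, 0, 1],
}

def hamming(a, b):
    """Count of differing characters - a crude proxy for evolutionary distance."""
    return sum(x != y for x, y in zip(a, b))

# Pairwise distance matrix - the usual input to phylogenetic tree-building
names = list(identities)
dist = {(p, q): hamming(identities[p], identities[q])
        for p in names for q in names}

# Nearest relative of each scheme under this (toy) distance
for p in names:
    kin = min((q for q in names if q != p), key=lambda q: dist[p, q])
    print(f"{p:18s} closest to: {kin}")
```

A serious analysis would feed such a character matrix into dedicated phylogenetics software rather than simple pairwise distances, but the shape of the data is the same.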

[Slide: "memome" characters]

So what? A rigorous scientific model for identity evolution would have many benefits. First and foremost it would have explanatory power. I do not believe that as an industry we have a satisfactory explanation for the failure of such apparently good ideas as Information Cards. Nor for promising federation projects like the Australian banking sector's "Trust Centre" and "MAMBO" lifetime portable account number. I reckon we have been "over federating" identity; my hunch is that identities have evolved to fit particular niches in the business ecosystem to such an extent that taking a student ID for instance and using it to log on to a bank is like dropping a saltwater fish into a freshwater tank. A stronger understanding of how attributes are organically interrelated would help us better plan federated identity, and to even do "memetic engineering" of the attributes we really want to re-use between applications and contexts.

If a phylogenetic tree can be revealed, it would confirm the 'secret lives' of attributes and thereby lend more legitimacy to the Attributes Push (which coincidentally some of us first spotted at a previous CIS, in 2013). It would also provide evidence that identification risks in local environments are why identities have come to be the way they are. In turn, we could pay more respect to authentication's idiosyncrasies, instead of trying to pigeonhole them into four rigid Levels of Assurance. At Sunday's NSTIC session, CTO Paul Grassi floated the idea of getting rid of LOAs. That would be a bold move of course; it could be helped along by a fresh focus on attributes. And of course we kept hearing throughout CIS Monterey about the FIDO Alliance with its devotion to authentication through verified device attributes, and its strategy to stay away from the abstract business of identities.

[Slide: the authentication family tree and FIDO]

Reflections on CIS 2014

I spoke with many people at CIS about what makes this event so different. There's the wonderful family program of course, and the atmosphere that creates. And there's the paradoxical collegiality. Ping has always done a marvelous job of collaborating in various standards groups, and likewise with its conference: Ping's people work hard to create a professional, non-competitive environment. There are a few notable absentees of course but all the exhibitors and speakers I spoke to - including Ping's direct competitors - endorsed CIS as a safe and important place to participate in the identity community, and to do business.

But as a researcher and analyst, the Cloud Identity Summit is where I think you can see the future. People report hearing about things for the first time at a CIS, only to find those things coming true a year or two later. It's because there are so many influencers here.

Last year one example was the Attributes Push. This year, the focus on Attributes has become entirely mainstream. For example, the NSTIC pilot partner ID.me (a start-up business focused on improving veterans' access to online discounts through improved verification of entitlements) talks proudly of their ability to convey attributes and reduce the exposure of identity. And Paul Grassi proposes much more focus on Attributes from 2015.

Another example is the "Authorization Agent" (AZA) proposed for SSO in mobile platforms, which was brand new when Paul Madsen presented it at CIS Napa in 2013. Twelve months on, AZA has broadened into the Native Apps (NAPPS) OpenID Working Group.

Then there are the things that are nearly completely normalised. Take mobile devices. They figured in just about every CISmcc presentation, but were rarely called out. Mobile is simply the way things are now.

Hardware stores

So while the mobile form factor is taken for granted, the cryptographic goodies now standard in most handsets, and increasingly embedded in smart things and wearables, got a whole lot of express attention at CISmcc. I've already made much of Andre Durand's keynote on tokens. It was the same throughout the event.

    • There was a session on hybrid Physical and Logical Access Control Systems (PACS-LACS) featuring the US Government's PIV-I smartcard standard and the major ongoing R&D on that platform sponsored by DHS.
    • Companies like SecureKey are devoted to hardware-based keys, increasingly embedded in "street IDs" like driver licenses, and are working with numerous players deep in the SIM and smartcard supply chains.
    • The FIDO Alliance is fundamentally about hardware based identity security measures, leveraging embedded key pairs to attest to the pedigree of authenticator models and the attributes that they transmit on behalf of their verified users. FIDO promises to open up the latent authentication power of many hundreds of millions of devices already featuring Secure Elements of one kind or another. FIDO realises PKI the way nature intended all along.
    • The good old concept of "What You See Is What You Sign" (WYSIWYS) is making a comeback, with mobile platform players appreciating that users of smartphones need reliable cues in the UX as to the integrity of transaction data served up in their rich operating systems. Clearly some exciting R&D lies ahead.
    • In a world of formal standards, we should also acknowledge the informal standards around us - the benchmarks and conventions that represent the 'real way' to do things. Hardware based security is taken increasingly for granted. The FIDO protocols are based on key pairs that people just seem to assume (correctly) will be generated in the compliant devices during registration. And Apple with its Touch ID has helped to 'train' end users that biometric templates must never leave the safety of a controlled hardware end point. FIDO of course makes that a hard standard.

What's next?

In my view, the Cloud Identity Summit is the only not-to-be-missed event on the IDAM calendar. So long may it continue. And if CIS is where you go to see the future, what's next?

    • Judging by CISmcc, I reckon we're going to see entire sessions next year devoted to Continuous Authentication, in which signals are collected from wearables and the Internet of Things at large, to gain insights into the state of the user at every important juncture.
    • With the disciplined separation of abstract identities from concrete attributes, we're going to need a Digital Identity Stack for reference. FIDO's pyramid is on the right track, but it needs some work. I'm not sure the pyramid is the right visualisation; for one thing it evokes Maslow's Hierarchy of Needs in which the pinnacle corresponds to luxuries not essentials!
    • Momentum will grow around Relationships. Kantara's new Identity Relationship Management (IRM) WG was talked about in the CISmcc corridors. I am not sure we're all using the word in the same way, but it's a great trend, for Digital Identity is only really a means to an end, and it's the relationships identities support that make them important.

So there's much to look forward to!

See you again next year (I hope) in Monterey!

Posted in Smartcards, PKI, Language, Identity, Federated Identity, Cloud

Postcard from Monterey 2 #CISmcc

Second Day Reflections from CIS Monterey.

Follow along on Twitter at #CISmcc (for the Monterey Conference Centre).

The Attributes push

At CIS 2013 in Napa a year ago, several of us sensed a critical shift in focus amongst the identerati - from identity to attributes. OIX launched the Attributes Exchange Network (AXN) architecture, important commentators like Andrew Nash were saying, 'hey, attributes are more interesting than identity', and my own #CISnapa talk went so far as to argue we should forget about identity altogether. There was a change in the air, but still, it was all pretty theoretical.

Twelve months on, and the Attributes push has become entirely practical. If there was a Word Cloud for the NSTIC session, my hunch is that "attributes" would dominate over "identity". Several live NSTIC pilots are all about the Attributes.

ID.me is a new company started by US military veterans, with the aim of improving access for the veterans community to discounted goods and services and other entitlements. Founders Matt Thompson and Blake Hall are not identerati -- they're entirely focused on improving online access for their constituents to a big and growing range of retailers and services, and offer a choice of credentials for proving veterans' bona fides. It's central to the ID.me model that users reveal as little as possible about their personal identities, while having their veterans' status and entitlements established securely and privately.

Another NSTIC pilot Relying Party is the financial service sector infrastructure provider Broadridge. Adrian Chernoff, VP for Digital Strategy, gave a compelling account of the need to change business models to take maximum advantage of digital identity. Broadridge recently announced a JV with Pitney Bowes called Inlet, which will enable the secure sharing of discrete and validated attributes - like name, address and social security number - in an NSTIC compliant architecture.

Mind Altering

Yesterday I said in my #CISmcc diary that I hoped to change my mind about something here, and half way through Day 2, I was delighted it was already happening. I've got a new attitude about NSTIC.

Over the past six months, I had come to fear NSTIC had lost its way. It's hard to judge totally accurately when lurking on the webcast from Sydney (at 4:00am) but the last plenary seemed pedestrian to me. And I'm afraid to say that some NSTIC committees have got a little testy. But today's NSTIC session here was a turning point. Not only are there a number of truly exciting pilots showing real progress, but Jeremy Grant has credible plans for improving accountability and momentum, and the new technology lead Paul Grassi is thinking outside the box and speaking out of school. The whole program seems fresh all over again.

In a packed presentation, Grassi impressed me enormously on a number of points:

  • Firstly, he advocates a pragmatic NSTIC-focused extension of the old US government Authentication Guide NIST SP 800-63. Rather than a formal revision, a companion document might be most realistic. Along the way, Grassi really nailed an issue which we identity professionals need to talk about more: language. He said that there are words in 800-63 that are "never used anywhere else in systems development". No wonder, as he says, it's still "hard to implement identity"!
  • Incidentally I chatted some more with Andrew Hughes about language; he is passionate about terms, and highlights that our term "Relying Party" is an especially terrible distraction for Service Providers whose reason-for-being has nothing to do with "relying" on anyone!
  • Secondly, Paul Grassi wants to "get very aggressive on attributes", including emphasis on practical measurement (since that's really what NIST is all about). I don't think I need to say anything more about that than Bravo!
  • And thirdly, Grassi asked "What if we got rid of LOAs?!". This kind of iconoclastic thinking is overdue, and was floated as part of a broad push to revamp the way government's orthodox thinking on Identity Assurance is translated to the business world. Grassi and Grant don't say LOAs can or should be abandoned by government, but they do see that shoving the rounded business concepts of identity into government's square hole has not done anyone much credit.

Just one small part of NSTIC annoyed me today: the persistent idea that federation hubs are inherently simpler than one-to-one authentication. They showed the following classic sort of 'before and after' shots, where it seems self-evident that a hub (here the Federal Cloud Credential Exchange FCCX) reduces complexity. The reality is that multilateral brokered arrangements between RPs and IdPs are far more complex than simple bilateral direct contracts. And moreover, the new forms of agreements are novel and untested in real world business. The time and cost and unpredictability of working out these new arrangements is not properly accounted for and has often been fatal to identity federations.

[Diagram: "before" - many pairwise RP-IdP connections]
[Diagram: "after" - RPs and IdPs connected via the FCCX hub]


The dog barks and this time the caravan turns around

One of the top talking points at #CISmcc has of course been FIDO. The FIDO Alliance goes from strength to strength; we heard they have over 130 members now (remember it started with four or five members less than 18 months ago). On Saturday afternoon there was a packed-out FIDO showcase with six vendors showing real FIDO-ready products. And today there was a three hour deep dive into the two flagship FIDO protocols UAF (which enables better sharing of strong authentication signals such that passwords may be eliminated) and U2F (which standardises and strengthens Two Factor Authentication).

FIDO's marketing messages are improving all the time, thanks to a special focus on strategic marketing which was given its own working group. In particular, the Alliance is steadily clarifying the distinction between identity and authentication, and sticking adamantly to the latter. In other words, FIDO is really all about the attributes. FIDO leaves identity as a problem to be addressed further up the stack, and dedicates itself to strengthening the authentication signal sent from end-point devices to servers.

The protocol tutorials were excellent, going into detail about how "Attestation Certificates" are used to convey the qualities and attributes of authentication hardware (such as device model, biometric modality, security certifications, elapsed time since last user verification etc) thus enabling nice fine-grained policy enforcement on the RP side. To my mind, UAF and U2F show how nature intended PKI to have been used all along!
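The policy-enforcement idea can be sketched as follows; the field names and thresholds are illustrative inventions, not the actual FIDO attestation wire format:

```python
from dataclasses import dataclass

# A toy model of the qualities an attestation might convey about an
# authenticator, and an RP-side policy check over them. Field names and
# values are hypothetical, for illustration only.
@dataclass
class AttestedAuthenticator:
    device_model: str
    biometric_modality: str               # e.g. "fingerprint", "none"
    certification_level: int              # scheme-defined assurance level
    seconds_since_user_verification: int

def rp_policy_allows(a: AttestedAuthenticator) -> bool:
    """Example fine-grained policy: a recent biometric check on a
    sufficiently certified device."""
    return (a.certification_level >= 2
            and a.biometric_modality != "none"
            and a.seconds_since_user_verification <= 300)

fresh = AttestedAuthenticator("AcmeKey 2", "fingerprint", 3, 42)
stale = AttestedAuthenticator("AcmeKey 2", "fingerprint", 3, 8600)
print(rp_policy_allows(fresh), rp_policy_allows(stale))  # True False
```

The point is that the RP reasons over attested device attributes, never over the user's abstract identity.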

Some confusion remains as to why FIDO has two protocols. I heard some quiet calls for UAF and U2F to converge, yet that would seem to put the elegance of U2F at risk. And it's noteworthy that U2F is being taken beyond the original one time password 2FA, with at least one biometric vendor at the showcase claiming to use it instead of the heavier UAF.

Surprising use cases

Finally, today brought more fresh use cases from cohorts of users we socially privileged identity engineers for the most part rarely think about. Another NSTIC pilot partner is AARP, a membership organization providing "information, advocacy and service" to older people, retirees and other special needs groups. AARP's Jim Barnett gave a compelling presentation on the need to extend from the classic "free" business models of Internet services, to new economically sustainable approaches that properly protect personal information. Barnett stressed that "free" has been great and 'we wouldn't be where we are today without it' but it's just not going to work for health records for example. And identity is central to that.

There's so much more I could report if I had time. But I need to get some sleep before another packed day. All this changing my mind is exhausting.

Cheers again from Monterey.

Posted in Security, Privacy, PKI, Language, Identity, Federated Identity, e-health

Bob is dead

With apologies to Friedrich Nietzsche. Bob, the hero of many a crypto folk tale, is dead, and we have killed him.

We now know that in PKI, Alice's Relying Party is almost always a machine and not a human being. The idea that two strangers would use PKI to work out whether or not to trust one another was deeply distracting and led to the complexity that in the main stymied early PKI.

All of which might be academic except the utopian idea persists that identity frameworks can and should underpin stranger-to-stranger e-business. With NSTIC for instance I fear we are sleep walking into a repeat of Big PKI, when we could be solving a simpler problem: the robust and bilateral presentation of digital identity data in established contexts, without changing the existing relationships that cover almost all serious transactional business.

The following is an extract from a past paper of mine, "Public Key Superstructure" which was presented to the NIST IDTrust Workshop in 2008. There I examine the shortfalls and pitfalls of using signed email as a digital signature archetype.

E-mail not a killer application for PKI

A total lack of real applications would explain why e-mail became by default the most talked about PKI application. Many PKI vendors to this day continue to illustrate their services and train their users with imaginary scenarios where our heroes Alice and Bob breathlessly exchange signed e-mails. Like the passport metaphor, e-mail seems easily understood, but it manifestly has not turned out to be a ‘killer application’, and worse still, has contributed to a host of misunderstandings.

The story usually goes that Alice has received a secure e-mail from stranger Bob and wishes to work out if he is trustworthy. She double-clicks on his digital signature and certificate in order to identify his CA. And now the fun begins. If Alice is not immediately trusting of the CA (presumably by reputation) then she is expected to download the CP and CPS, read them, and satisfy herself that the registration processes and security standards are adequate for her needs.

Does this sort of rigmarole have any parallel in the real world? A simple e-mail with no other context is closely equivalent to a letter or fax sent on plain white paper. Under what circumstances should we take seriously a message sent on plain paper from a stranger, even if we could track down their name?

In truth, the vast majority of serious communications occurs not between strangers but in a rich existing context, where the receiver has already been qualified in some way by the sender as likely being the right party to contact. In e-business, routine transactions are not usually conducted by e-mail but instead use special purpose software or dedicated websites with purpose built content. Thus we see most of the digital signature action in cases such as e-prescriptions, customs broking, trade documentation, company returns, patent filing and electronic conveyancing.

Several important simplifying assumptions flow from the fact that most e-business has a rich context, and these should be heeded when planning PKI:

Emphasise straight-through processing

In spite of the common worked example of Alice and Bob exchanging e-mails, the receiver of most routine transactions – such as payment instructions, tax returns, medical records, import/export declarations, or votes – is not a human but instead is a machine. The notion that a person will examine digital certificates and chase down the CA and its practices is simply false in the vast majority of cases. One of PKI’s great strengths is the way it aids straight-through processing, so it has been a great pity that vendors, through their training and marketing materials, have stressed manual over automatic processing.

Play down Relying Party Agreements

The sender and receiver of digitally signed transactions are hardly ever un-related. This is in stark contrast to orthodox legal analyses of PKI which foundered on the supposed lack of contractual privity between Relying Party and CA. For example the Australian Government’s extensive investigation into legal liability in digital certificates after 76 pages still could not reach a firm conclusion about whether a “CA may owe a duty of care to a [Relying Party] who is not known to the CA” [http://www.egov.vic.gov.au/pdfs/publication_utz1508.pdf]. The fact is, this sort of scenario is entirely academic and should never have been given the level of attention that it was. The idea of a “Relying Party Agreement” to join in contract the RP and the CA is moot in all “closed” e-business settings where PKI is thriving. It is this lesson that needs to be generalised by PKI regulators, not the hypothetical model of “open” PKI where all parties are strangers.

Play down certificate path discovery

The fact that, in real life, parties transact in the context of some explicit scheme means that the receiver’s software can predict the type of certificate that will most often be used by senders. For instance, when doctors are using e-prescribing software, there is not going to be a wide choice of certificate options; indeed, the appropriate scheme root keys and certificates for authenticating a whole class of doctors will likely be installed at both the sending and receiving ends, at the same time as the software is. When a doctor writes a prescription, their private key can be programmatically selected by their client software and invoked to create a digital signature, according to business rules enshrined in the software design. And when such a transaction is received, the software of the pharmacist (or insurance company, government agency etc.) will similarly ‘know’ by design which classes of certificates are expected to verify the digital signature. In most transaction systems, all this logic can be settled at design time, which greatly simplifies the task of certificate path discovery, or eliminates it altogether. It is usually straightforward for the sender’s software to attach the whole certificate chain to the digital signature, safe in the knowledge that the receiver’s software will be configured with the necessary trust anchors (i.e. Root CA certificates) with which to parse the chain.
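To make the design-time logic concrete, here is a toy Python sketch of a receiver validating an attached certificate chain against a trust anchor pinned when the software was installed. HMACs stand in for real asymmetric signatures, and every name and key below is invented for the example; a real scheme would use X.509 certificates and RSA/ECDSA.

```python
import hashlib
import hmac
import json

# Toy stand-in for asymmetric signing: each issuer "signs" a payload
# with an HMAC key. All keys and names here are hypothetical.
def sign(issuer_key: bytes, payload: dict) -> str:
    data = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(issuer_key, data, hashlib.sha256).hexdigest()

def make_cert(subject: str, issuer: str, issuer_key: bytes) -> dict:
    payload = {"subject": subject, "issuer": issuer}
    return {"payload": payload, "sig": sign(issuer_key, payload)}

# Design time: both ends are configured with the scheme's root,
# so there is no runtime certificate path "discovery" at all.
ROOT_KEY = b"scheme-root-secret"
TRUST_ANCHOR = "Scheme Root CA"

def verify_chain(chain: list, issuer_keys: dict) -> bool:
    """Check every link, then confirm the chain ends at the pinned anchor."""
    for cert in chain:
        issuer = cert["payload"]["issuer"]
        if sign(issuer_keys[issuer], cert["payload"]) != cert["sig"]:
            return False
    return chain[-1]["payload"]["issuer"] == TRUST_ANCHOR

# The sender simply attaches the whole chain to the transaction.
ca_key = b"doctors-ca-secret"
issuer_keys = {TRUST_ANCHOR: ROOT_KEY, "Doctors CA": ca_key}
chain = [
    make_cert("Dr Smith", "Doctors CA", ca_key),
    make_cert("Doctors CA", TRUST_ANCHOR, ROOT_KEY),
]
print(verify_chain(chain, issuer_keys))  # prints: True
```

The point of the sketch is that the verifier never searches for a path: the expected issuer classes and the root are fixed by the scheme's design, exactly as in the e-prescribing case above.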

Posted in PKI, Identity, Federated Identity

An authentication family tree

How do we make best sense of the bewildering array of authenticators on the market? Most people are familiar with single factor versus two factor, but this simple dichotomy doesn’t help match technologies to applications. The reality is more complex. A family tree like the one sketched here may help navigate the complexity.

Different distinctions define various branch points. The first split is between what I call Transient authentication (i.e. access control), which tells whether a user is allowed to access a resource, and Persistent authentication, which lets a user leave a lasting mark (i.e. signature) on what they do, such as binding electronic transactions.

Working our way up the Transient branch, we see that most access controls are based either on shared secrets or biometrics. Dynamic shared secrets change with every session, either in a series of one time passwords or via challenge-response.
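As a minimal illustration of a dynamic shared secret, here is a Python sketch of HMAC-based challenge-response. The enrolled secret is invented for the example; the essential property is that the secret never crosses the wire, only a fresh challenge and the response to it, so a recorded session cannot be replayed.

```python
import hashlib
import hmac
import secrets

# Hypothetical secret established between client and server at enrolment.
SHARED_SECRET = b"enrolled-at-registration"

def respond(secret: bytes, challenge: bytes) -> str:
    # Client computes a one-time response bound to this session's challenge.
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

# Server side: issue a random challenge, then check the returned response.
challenge = secrets.token_bytes(16)           # fresh for every session
response = respond(SHARED_SECRET, challenge)  # computed on the client
ok = hmac.compare_digest(response, respond(SHARED_SECRET, challenge))
print(ok)  # prints: True
```

Because the challenge changes every session, an eavesdropper who captures one response gains nothing useful for the next login.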

On the biometric branch, we should distinguish those traits that can be left behind inadvertently in the environment and are more readily stolen. The safer biometrics are “clean” and leave no residue. Note that while the voice might be recorded without the speaker’s knowledge, I don't see it as a residual biometric in practice because voice recognition solutions usually use dynamic phrases that resist replay.

For persistent authentication, the only practical option today is PKI and digital signatures, technology which is available in an increasingly wide range of forms. Embedded certificates are commonplace in smartcards, cell phones, and other devices.

The foliage in the family tree indicates which technologies I believe will continue to thrive, and which seem more likely to be dead-ends.

I'd appreciate feedback. Is this useful? Does anyone know of other taxonomies?

Posted in Security, PKI, Biometrics

Simpler PKI is on the cards

PKI has a reputation for terrible complexity, but it is actually simpler than many mature domestic technologies.


It's interesting to ponder why PKI got to be (or look) so complicated. There have been at least two reasons. First, the word is frequently taken to mean the original overblown "Big PKI" general purpose identification schemes, with their generic and context-free passport grade ID checks, horrid user agreements and fine print. Yet there are alternative ways to deploy public key technology in closed authentication schemes, and indeed that is where it thrives; see http://lockstep.com.au/library/pki. Second, there is all that gory technical detail foisted on lay people in the infamous "PKI 101" sessions. Typical explanations start with a tutorial on asymmetric cryptography even before they tell you what PKI is for.

I've long wondered what it is about PKI that leads its advocates to train people into the ground. Forty-odd years ago when introducing the newfangled mag stripe banking card, I bet the sales reps didn't feel the need to explain electromagnetism and ferric oxide chemistry.

This line of thought leads to fresh models for 'domesticating' PKI by embedding it in plastic cards. By re-framing PKI keys and certificates as being means to an end and not ends in themselves, we can also:

― identify dramatically improved supply chains to deliver PKI's benefits
― re-cast the traditionally difficult business model for CAs, and
― demystify how PKI can support a plurality of IDs and apps.

Consider the layered complexity of the conventional plastic card, and the way the layers correspond to steps in the supply chain. At its most basic level, the card is based on solid state physics and Maxwell's Equations for electromagnetism. These govern the properties of ferric oxide crystals, which are manufactured as powders and coated onto bulk tape by chemical companies like BASF and 3M. The tape is applied to blank cards, which are distributed to different schemes for personalisation. Usually the cards are pre-printed in bulk with artwork and physical security features specific to the scheme. In general, personalisation in each scheme starts with user registration. Data is written to the mag stripe according to one of a handful of coding standards which differ a little between banks, airlines and other niches. The card is printed or embossed, and distributed.

[Figure: Lockstep PKI technology stack (ferric oxide)]

The variety of distinct schemes using magnetic stripe cards is almost limitless: bank cards, credit cards, government entitlements, health insurance, clubs, loyalty cards, gift cards, driver licences, employee ID, universities, professional associations etc etc. They all use the same ferromagnetic components delivered through a global supply chain, which at the lower layers is very specialised and delivered by only a handful of companies.

And needless to say, hardly anyone needs to know Maxwell's Equations to make sense of a mag stripe card.

The smartcard supply chain is very similar. The only technical difference is the core technology used to encode the user information. The theoretical foundations are cryptography instead of electromagnetism, and instead of bulk ferric oxide powders and tapes, specialist semiconductor companies fabricate the ICs and preload them with firmware. From that point on, the smartcard and mag stripe card supply chains overlap. In fact the end user in most cases can't tell the difference between the different generations of card technologies.

[Figure: Lockstep PKI technology stack (cryptography)]

Smartcards (and their kin: SIMs, USB keys and smartphones) are the natural medium for deploying PKI technology.

Re-framing PKI deployment like this ...

― decouples PK technology from the application and scheme layers, and tames the technical complexity; it shows where to draw the line in "PKI 101" training

― provides a model for transitioning from conventional "id" technology to PKI with minimum disruption of current business processes and supplier arrangements

― shows that it's perfectly natural for PKI to be implemented in closed communities of interest (schemes) and takes us away from the unhelpful orthodox Big PKI model

― suggests new "wholesale" business models for CAs; historically CAs found it difficult to sell certificates direct, but a clearly superior model is to provide certificates into the initialisation step

― demonstrates how easy to use PKI should be; that is, exactly as easy to use as the mag stripe card.

I once discussed this sort of bulk supply chain model at a conference in Tokyo. Someone in the audience asked me how many CAs I thought were needed worldwide. I said maybe three or four, and was greeted with incredulous laughter. But seriously, if certificates are reduced to digitally signed objects that bind a parcel of cardholder information to a key associated with a chip, why shouldn't certificates be manufactured by a fully automatic CA service on an outsourced managed service basis? It's no different from security printing, another specialised industry with the utmost "trust" requirements but none of the weird mystique that has bedevilled PKI.
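The "fully automatic CA" idea above reduces a certificate to nothing more than a signed object binding a parcel of cardholder information to a chip's key. Here is a toy Python sketch of such an issuance step, run in bulk with no human in the loop. An HMAC stands in for the CA's asymmetric signature, and the CA key, field names and sample data are all hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical signing key held by the outsourced, automated CA service.
CA_KEY = b"outsourced-ca-signing-key"

def issue_certificate(cardholder: dict, chip_public_key: str) -> dict:
    """Bind a parcel of cardholder information to a chip's key and sign it."""
    payload = {"cardholder": cardholder, "key": chip_public_key}
    data = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(CA_KEY, data, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

# Batch issuance during card personalisation: no manual vetting per cert,
# just like a security printing run.
certs = [
    issue_certificate({"name": f"Holder {i}", "ref": f"N-{i:05d}"}, f"chip-key-{i}")
    for i in range(3)
]
print(len(certs))  # prints: 3
```

Nothing in the loop requires per-certificate human judgement, which is the sense in which a handful of automated CA services could plausibly serve a very large number of card schemes.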

Posted in PKI, Identity

No such thing as a passport

What do you call it when a metaphor or analogy outgrows the word it is based on, thus co-opting that word to mean something quite new? Metaphors are meant to clarify complex concepts by letting people think of them in simpler terms. But if the detailed meaning is actually different, then the metaphor becomes misleading and dangerous.

I'm thinking of the idea of the electronic passport. Ever since the early days of Big PKI, there's been the beguiling idea of an electronic passport that will let the holder into all manner of online services and enable total strangers to "trust" one another online. Later Microsoft of course even named their digital identity service "Passport", and the word is still commonplace in discussing all manner of authentication solutions.

The idea is that the passport allows you to go wherever you like.

Yet there is no such thing.

A real world passport doesn't let you into any old country. It's not always sufficient; you often need a visa. You can't stay as long as you like in a foreign place. Some countries won't let you in at all if you carry the passport of an unfriendly nation. You need to complete a landing card and customs declarations specific to your particular journey. And finally, when you've got to the end of the arrivals queue, you are still at the mercy of an immigration officer who has the discretion to turn you away. As with all business, there is so much more going on here than personal identity.

So in the sense of the meaning important to the electronic passport metaphor, the "real" passport doesn't actually exist!

The simplistic notion of electronic passport is really deeply unhelpful. The dream and promise of general purpose digital certificates is what derailed PKI, for they're unwieldy, involve unprecedented mechanisms for conferring open-ended "trust", and are rarely useful on their own (ironically that's also a property of real passports). Think of the time and money wasted chasing the electronic passport when all along PKI technology was better suited to closed transactions. What matters in most transactions is not personal identity but rather, credentials specific to the business context. There never has been a single general purpose identity credential.

And now with "open" federated identity frameworks, we're sleep-walking into the same intractable problems, all because people have been seduced by a metaphor based on something that doesn't exist.

The well initiated understand that the Laws of Identity, OIX, NSTIC and the like involve a plurality of identities, and multiple attributes tuned to different contexts. Yet NSTIC in particular is still confused by many with a single new ID, a misunderstanding aided and abetted by NSTIC's promoters using terms like "interoperable" without care, and by casually 'imagining' that a student in future will log in to their bank using their student card.

Words are powerful and they're also malleable. Some might say I'm being too pedantic sticking to the traditional reality of the "passport". But no. It would be OK in my opinion for "passport" to morph into something more powerful and universal -- except that it can't. The real point in all of this is that multiple identities are an inevitable consequence of how identities evolve to suit distinct business contexts, and so the very idea of a digital passport is a bit delusional.

Posted in Security, PKI, Language, Internet, Identity, Culture