Putting verifiable credentials in the right hands

The development and evolution of a verifiable credential (VC) ecosystem depends on data issuers and secure wallets, but also on something that isn’t getting any attention: schemes to distribute credential data and metadata to the risk owner, the party which is using the credentials to make a risk assessment and decision.

The last step in this process is credential acceptance. Here, the risk owner consumes credentials as input to its own decision-making system, comprising processes and technologies assembled to address the risk owner’s specific risk management needs. Risk is individual, specific to each organization.

To consume these credentials, today’s enterprises must individually establish relationships with data providers, parties who may issue credentials themselves or, more commonly, act as data aggregators by drawing on multiple origins, other aggregators, and even data “generators” who sift and correlate data breadcrumbs to develop a useful signal.

In other words, each enterprise has to choose data pools that meet its needs, most of which have data of uncertain provenance and accuracy.

The platform alternative Lockstep suggests, the Data Verification Platform (DVP), provides risk owners with better-quality data by connecting them to one or more data distributors.

The risk owner must perform a number of steps:

  1. Establish a contractual relationship with a data distributor that defines the terms and conditions of credential use. There’s a lot of fine print here.
  2. Choose the VCs it requires for each use case and for each acceptance channel, online or offline.
  3. Acquire and deploy the necessary acceptance technology. As credential-sharing becomes an increasingly online activity, much of this will be via APIs. That said, there are compelling offline, in-person use cases where image scanners for reading QR codes and NFC readers may be employed. Both approaches connect to applications that route signals to risk-management systems and the VC distribution network (a minimal sketch of the online flow follows this list).
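
To make the API route in step 3 concrete, here is a minimal sketch of what a risk owner's online acceptance call might look like. The endpoint, field names, and distributor URL are assumptions for illustration, not part of any published DVP specification.

```typescript
// Minimal sketch of an online acceptance call, assuming a hypothetical
// distributor endpoint. DISTRIBUTOR_URL and /presentations/verify are
// illustrative, not part of any published DVP specification.

type PresentationRequest = {
  useCase: string;                    // e.g. "age-verification"
  channel: "online" | "in-person";
  requestedClaims: string[];          // only the claims this decision needs
};

type VerificationResult = {
  verified: boolean;                  // signature and status checks passed
  issuer: string;
  claims: Record<string, unknown>;
};

const DISTRIBUTOR_URL = "https://distributor.example/api"; // illustrative

// Send a presentation received over an online channel to the distributor
// for verification; QR and NFC channels would feed the same call from an app.
async function acceptPresentation(
  presentation: unknown,
  request: PresentationRequest,
): Promise<VerificationResult> {
  const response = await fetch(`${DISTRIBUTOR_URL}/presentations/verify`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ presentation, request }),
  });
  if (!response.ok) {
    throw new Error(`Verification failed with status ${response.status}`);
  }
  // The verified claims are then routed to the risk owner's own
  // decision-making system rather than stored wholesale.
  return (await response.json()) as VerificationResult;
}
```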

How hard is this step?

Compared to a wallet-based approach, developing a common acceptance infrastructure is straightforward. Wallets are essential components, but if they lack a set of scheme or network rules, or data distribution technology, or a brand, a data verification business model isn't possible. And without a business raison d'être, well, solutions that lack economic incentives nearly always fail.

Wallets are in danger of replicating the data-sourcing challenge today's risk owners face. As we said, each risk owner currently has to develop its own data sources, and the range of useful credentials is wide. To be useful and credible, each wallet must onboard the credentials it stores, and that onboarding process goes beyond the technical: it has to scale.

Let’s examine the difficulty of building out VC acceptance through the lens of what networks or schemes offer: consistent rules, technology to move and protect data, and the branding support that communicates what to expect.

The DVP approach borrows heavily from the processes established and evolved over decades by the card networks that have produced today’s global card acceptance footprint.

  • Rules. Rules are expressed through contractual obligations between the party distributing the VCs and the risk owner consuming those VCs. The rules require compliance with technical specifications and subsequent data handling responsibilities.
  • Tech. Both data subjects and risk owners require methods to engage in a VC-sharing dialog. For data subjects, the good news is they already possess smartphone-based security superpowers. And even for risk owners who must invest in this capability, the effort involved relies on well-understood online and mobile technologies. While there is much that needs building, neither the data subject nor the risk owner will face a heavy technology lift.
    • Data Subject Tech. Subjects are the party that VCs describe. Presentation of VCs by subjects should be straightforward: the subject grants the risk owner permission to access the VCs needed for the particular transaction, typically via a wallet interface. Using the Swiss army knife that is a smartphone, it is easy to present cryptographically secured QR codes on screen, read them with the camera, and exchange data over the NFC chipset. Device authentication is biometric. Go FIDO and passkeys! (A simple sketch of this subject-side flow follows this list.)
    • Risk Owner Tech. On the acceptance side, channel-specific tooling is required. For online interactions, APIs manage communications and credential-sharing between the risk owner's applications and the distribution network. For in-person interactions, borrowing tech from payment card acceptance, subjects and risk owners meet face to face at the edge of the VC network through apps, low-cost image scanners, NFC readers, or Bluetooth-enabled devices.
  • Brand. Setting and signaling expectations is essential to creating behavioral change. We're not talking about a logo here, although one will be used. We mean the entire process a user experiences every time they interact with a company or system. The Visa logo represents a company but, more importantly (and this is why the company has spent billions in advertising), it also communicates the expected user experience. Users of a verifiable data platform will require a similarly consistent experience that communicates the virtues of security, privacy, and control. The brand should promise that people can share their data with the same ease, privacy, and security as they already do when sharing their payment details.
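
As a rough illustration of the subject-side tech described above, the sketch below shows a browser-based wallet gating the release of requested claims behind a passkey (WebAuthn) check and sharing only what the risk owner asked for. The credential shape and filtering logic are hypothetical; real wallets will follow the relevant presentation standards.

```typescript
// Illustrative subject-side sketch: a browser-based wallet gates the release
// of requested claims behind a passkey (WebAuthn) check, then shares only
// the claims the risk owner asked for. The credential shape and filtering
// logic are hypothetical, not a wallet specification.

type StoredCredential = {
  issuer: string;
  claims: Record<string, unknown>;
};

// Ask the platform authenticator (biometric-backed passkey) to confirm the
// holder is present before anything leaves the wallet.
async function confirmHolder(challenge: Uint8Array): Promise<boolean> {
  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge,                      // supplied by the relying party
      userVerification: "required",   // biometric or device PIN
      timeout: 60_000,
    },
  });
  return assertion !== null;
}

// Release only the claims requested for this particular transaction.
async function present(
  credential: StoredCredential,
  requestedClaims: string[],
  challenge: Uint8Array,
): Promise<Record<string, unknown> | null> {
  if (!(await confirmHolder(challenge))) return null;
  return Object.fromEntries(
    Object.entries(credential.claims).filter(([name]) =>
      requestedClaims.includes(name),
    ),
  );
}
```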

Data minimization starts at the enterprise border

Credential acceptance can be the first step in data minimization. Data minimization practice starts at the organization's border: if you don't collect data from subjects in the first place, it can't be breached, grow stale, or demand costly security controls.

Opportunities abound for enterprise data minimization. Given the specificity needed for credential exchange (computers don't handle ambiguity well), there are use cases, such as age verification, where displaying all the data contained in a driver license should no longer be necessary. All an alcohol delivery service needs is the customer's name, address, an optional photo, and a calculated age validation. There is no need to share birthdate, license number, or physical attributes during the interaction, and no need to store this data.
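
As a hedged sketch of what that minimized exchange could look like, the example below derives an over-threshold age assertion inside the wallet and releases only the fields the delivery service needs. The field names are illustrative and not drawn from any driver license standard.

```typescript
// Hedged sketch of the alcohol-delivery example: the wallet derives an
// over-threshold age assertion and releases only the fields the service
// needs. Field names are illustrative, not taken from any license standard.

type FullLicense = {
  name: string;
  address: string;
  dateOfBirth: string;   // stays inside the wallet
  licenseNumber: string; // never shared
  photo?: string;        // optional, base64-encoded
};

type DeliveryClaims = {
  name: string;
  address: string;
  photo?: string;
  ageOver: number;       // e.g. 18 or 21, computed at presentation time
};

// Derive the minimal claim set; the birthdate itself is never disclosed.
function minimizeForDelivery(
  license: FullLicense,
  threshold: number,
  today: Date,
): DeliveryClaims | null {
  const dob = new Date(license.dateOfBirth);
  let age = today.getFullYear() - dob.getFullYear();
  const hadBirthdayThisYear =
    today.getMonth() > dob.getMonth() ||
    (today.getMonth() === dob.getMonth() && today.getDate() >= dob.getDate());
  if (!hadBirthdayThisYear) age -= 1;
  if (age < threshold) return null; // nothing to share
  return {
    name: license.name,
    address: license.address,
    photo: license.photo,
    ageOver: threshold,
  };
}
```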

Optimistic about acceptance infrastructure, mostly

Lockstep isn’t playing down the amount of work that will be involved, but is optimistic about the ability of enterprises and government agencies to deploy VC acceptance infrastructure. We know (mostly) what it needs to look like, and the core technical building blocks already exist. Smartphones, NFC, Bluetooth, QR, and APIs are beautiful things. Internet tech and ubiquitous low-cost hardware (when needed for in-person interactions) are well-understood tools. Most of us already have a wallet-capable smartphone.

But it all becomes a dog’s breakfast without technical, UX, and contractual standards. Wallets are a key component but their development uncomfortably straddles the tension between proprietary and open standards. And without a nation-spanning brand, consumer confusion will inject unnecessary friction, leading to uneven experience and skepticism, both of which are adoption killers.

Standards will play a critical role in providing users with a consistent experience across different interaction types. There will need to be standardized ground rules to guide the parties that deliver VCs to risk owners. There is plenty of work to do.

Questions to ask

The VC ecosystem is in its infancy. If you want it to grow, ask yourself the following questions. If you're a risk owner (and we all are), questions about data quality and availability, never mind cost, should occupy your mind.

When bringing on new data sources, ask:

  • What’s their provenance? Their recency? The refresh rate?
  • How is the data generated? Generated by algorithm? Correlated?
  • How is the data sourced? Sourced directly from the data origin? Gathered by a third party from government sources and repackaged?
  • How would you make this data better?
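
One illustrative way to keep these answers in view is to record them as structured metadata against each data source; the field names and example values below are assumptions, not a proposed standard.

```typescript
// Purely illustrative: capture the answers to the sourcing questions as
// structured metadata against each data source.

type DataSourceProfile = {
  name: string;
  provenance: "origin" | "aggregator" | "generated"; // where the data comes from
  generationMethod?: "algorithmic" | "correlated";   // only for generated data
  lastRefreshed: string;    // ISO 8601 date of the most recent refresh
  refreshInterval: string;  // e.g. "P7D" for weekly (ISO 8601 duration)
  repackagedFrom?: string;  // upstream government or origin source, if any
};

// Hypothetical example entry.
const exampleSource: DataSourceProfile = {
  name: "Example license registry feed",
  provenance: "aggregator",
  lastRefreshed: "2024-01-01",
  refreshInterval: "P7D",
  repackagedFrom: "state licensing authority",
};
```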

Lockstep’s Data Verification Platform is a scheme to rationalise and organise data flows between data originators such as government and the risk owners who rely on accurate data to guide decisions. Join us in conversation.

If you’d like to follow the development of the Data Verification Platform model, please subscribe for email updates.