
Biometrics and false advertising

Use of the word “unique” in biometrics constitutes false advertising.

There is little scientific basis for claiming that any of the common biometric traits is inherently “unique”. The iris is a notable exception, where the process of embryonic development of eye tissue is known to create random features. But there is little or no literature to suggest that finger vein patterns, gait or voice traits are highly distinctive and randomly distributed in ways that create what security people call "entropy". In fact, one of the gold standards in biometrics - fingerprinting - has been shown to rest more on centuries-old folklore than on science (see the work of Simon Cole).

But more to the point, even if a trait is highly distinctive, the vagaries of real-world measurement apparatus and conditions mean that every system commits false positives. Body parts age, sensors get grimy, lighting conditions change, and biometric systems must tolerate such variability. In turn, they make odd mistakes. In fact, consumer biometrics are usually tuned to deliberately increase the False Accept Rate, so as not to inconvenience too many bona fide users with a high False Reject Rate.

So no biometric system ever behaves like the trait is unique! Every system has a finite False Accept Rate; FARs of one or two percent are not uncommon. If one in fifty people are confused with someone else on a measured trait, how is that trait “unique”?
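
To make the arithmetic concrete, here is a back-of-the-envelope sketch of my own (illustrative figures only, not taken from any vendor or study) showing how a per-comparison False Accept Rate compounds when a single probe is searched against a whole enrolment database:

```python
# Back-of-the-envelope sketch (illustrative only): how a per-comparison
# False Accept Rate undermines any claim of "uniqueness" once a probe is
# compared against many enrolled templates.

def prob_at_least_one_false_accept(far: float, gallery_size: int) -> float:
    """Chance that an impostor probe falsely matches at least one of
    `gallery_size` enrolled templates, assuming independent comparisons,
    each with false accept probability `far`."""
    return 1.0 - (1.0 - far) ** gallery_size

if __name__ == "__main__":
    for far in (0.02, 0.01, 0.0001):        # 2%, 1% and 0.01% per comparison
        for n in (50, 1_000, 100_000):      # size of the enrolled population
            p = prob_at_least_one_false_accept(far, n)
            print(f"FAR={far:.4%}  gallery={n:>7,}  P(at least one false match)={p:.1%}")
```

Even at a 1% FAR, a gallery of just fifty people gives roughly a 40% chance of a false match somewhere; against 100,000 enrolments a collision is all but certain. That is not how a "unique" identifier behaves.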

The word "unique" should be banned in connection with biometrics. It's not accurate, and it's used to create overstatements in biometric product marketing.

This is not mere nit picking. The biometrics industry gets away with terrible hyperbole, aided and abetted by loose talk, lulling users into a false sense of security. Managers and strategists need to understand at every turn that there is no such thing as perfect security. Biometric systems fail. But when lay people hear “unique” they think that’s the end of the story. They’re not encouraged to look at the error rate specs and think deeply about what they really mean.

Exaggeration in use of the word "unique" is just the tip of the iceberg. Biometrics vendors are full of it:

Economical with the truth

    • Major palm vein vendors claim spectacular error rates of FAR = 0.00008% and FRR = 0.01%. Their brochures show these specs side-by-side, without any mention of the fact that these are best case figures, and utterly impossible to achieve together. I've been asking one vendor for their Detection Error Tradeoff (DET) curves for years but I'm told they're commercial in confidence. The vendor won't even cough up the Equal Error Rate. And why? Because the tradeoff is shocking.
    • The International Biometric Group in 2006 published the only palm vein DET curve I have managed to find, in its Comparative Biometric Testing Round 6 ("CBT 6"). Curiously this report is hard to find nowadays, but I have a copy if anyone wants to see it. The DET curves give the lie to the best-case vendor specs: when the palm vein system is tuned to its highest security setting, with a best possible False Match Rate of 0.0007%, the False Non-Match Rate deteriorates to 12%, or worse than one in ten. [Ref: CBT6 Executive Summary, p6] The sketch after this list illustrates why those two figures sit at opposite ends of a single threshold trade-off.
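
For readers who want to see the see-saw for themselves, here is a small, purely illustrative sketch (synthetic Gaussian score distributions, nothing like a real palm vein matcher). Sweeping the decision threshold shows that the best achievable False Match Rate and the best achievable False Non-Match Rate occur at different operating points, with the Equal Error Rate somewhere in between:

```python
# Illustrative DET-style sweep over a decision threshold. The score
# distributions are synthetic and arbitrary; the point is only that FMR and
# FNMR are two ends of one trade-off, so "best case" figures for each cannot
# be achieved at the same operating point.
import numpy as np

rng = np.random.default_rng(0)
genuine = rng.normal(loc=0.75, scale=0.10, size=100_000)    # true-match scores
impostor = rng.normal(loc=0.40, scale=0.10, size=100_000)   # non-match scores

thresholds = np.linspace(0.0, 1.0, 501)
fmr = np.array([(impostor >= t).mean() for t in thresholds])   # False Match Rate
fnmr = np.array([(genuine < t).mean() for t in thresholds])    # False Non-Match Rate

# Equal Error Rate: the threshold where the two error curves cross
eer_idx = int(np.argmin(np.abs(fmr - fnmr)))
print(f"EER ~ {fmr[eer_idx]:.2%} at threshold {thresholds[eer_idx]:.2f}")

# "Best case" figures are read off at *different* thresholds
print(f"loose  t={thresholds[150]:.2f}: FMR={fmr[150]:.2%}, FNMR={fnmr[150]:.2%}")
print(f"strict t={thresholds[350]:.2f}: FMR={fmr[350]:.2%}, FNMR={fnmr[350]:.2%}")
```

Quoting the loose-threshold FNMR next to the strict-threshold FMR, as the brochures do, describes a machine that cannot exist; the DET curve is the honest picture.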

Clueless about privacy

    • You'd think that biometric vendors would brush up on privacy. One of them attempted recently to calm fears over facial recognition by asserting that "a face is not, nor has it ever been, considered private". This red herring betrays a terrible misunderstanding of information privacy. Once faces are rendered personally identifiable by Online Social Networks (OSNs), and names are attached to the terabytes of hitherto anonymous snapshots in their stores, that data automatically becomes subject to privacy law in many jurisdictions. It's a scandal of the highest order: albums innocently uploaded into the cloud over many years, now suddenly rendered identifiable and trawled for commercially valuable intelligence, without consent, and without any explanation in the operators' Privacy Policies.

Ignoring published research

    • And you'd think that for such a research-intensive field (where many products are barely out of the lab) vendors would be up to date. Yet one of them has repeatedly claimed that biometric templates "are nearly impossible to be reverse engineered". This is either a lie or willful ignorance. The academic literature has many examples of facial and fingerprint templates being reverse engineered by successive approximation methods to create synthetic raw biometrics that generate matches with target templates. Tellingly, the untruth that templates can't be reversed has recently been repeated in connection with the possible theft of biometric data of all Israeli citizens. When passwords or keys or any normal security secrets are breached, the first thing we do is cancel them and re-issue the users with new ones, along with abject apologies for the inconvenience. But with biometrics, that's not an option. So no wonder vendors are so keen to stretch the truth about template security; to admit there is a risk of identity theft, without the ability to reinstate the biometrics of affected victims, would be catastrophic.

With more critical thinking, managers and biometric buyers would start to ask the tough questions, such as: How are you testing this system? How do real-life error rates compare with bench testing (which the FBI warns is always optimistic)? And what is the disaster recovery plan in the event that a criminal steals a user’s biometric?

Posted in Security, Language, Biometrics

Comments

Emilio Mordini, Mon 30 Jan 2012, 9:24am

About reverse engineering, you make the standard confusion between true reverse engineering and reproduction of a fake biometric sample. One should say, to be sure, that it is certainly possible to reconstruct a biometric sample from the biometric template, but it is impossible to reconstruct the original biometric feature; say, you can shape a fake face which cheats the sensor, but you cannot reconstruct the original face. This difference is paramount in terms of data protection. It goes without saying that templates must be protected if one wants to prevent spoofing attacks, but one cannot infer from the template medical or ethnic or whatever else information on the data subject. This changes the sensitivity of the template in comparison to analogue data (say, raw images).

Stephen Wilson, Mon 30 Jan 2012, 9:50am

Ok, but I'm talking about resistance to spoofing and identity theft. When vendors deny it's possible to reverse engineer the Subject's trait values, they seem to be denying the possibility of spoofing.

As you say, it is certainly possible to "reconstruct the biometric sample from the biometric template". Or in other words, it is possible to calculate (by Hill Climbing or successive approximation) a synthetic biometric sample that does not necessarily resemble the subject's actual face or fingerprint, but which is processed by the biometric algorithm so as to match the Subject's template. This has been demonstrated many times for different modalities and algorithms.
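
To illustrate the principle (not any particular product), here is a toy sketch of a hill climb against a black-box matcher. The matcher below is just cosine similarity over a feature vector standing in for a template; real systems extract templates from images through much more elaborate pipelines, but the attack pattern, perturb a candidate and keep whatever raises the reported score, is the one described in the literature:

```python
# Toy hill-climbing sketch against a black-box matcher. The "template" and
# the cosine-similarity matcher are stand-ins; the point is that an attacker
# who only sees match scores can iterate a synthetic sample until it is
# accepted, without ever reconstructing the subject's real trait.
import numpy as np

rng = np.random.default_rng(42)
DIM, THRESHOLD = 64, 0.95

enrolled_template = rng.normal(size=DIM)   # secret, held inside the "system"

def match_score(sample: np.ndarray) -> float:
    """Black-box matcher: cosine similarity against the hidden template."""
    return float(sample @ enrolled_template /
                 (np.linalg.norm(sample) * np.linalg.norm(enrolled_template)))

candidate = rng.normal(size=DIM)           # attacker's starting guess
score = match_score(candidate)
attempts = 0
while score < THRESHOLD and attempts < 50_000:
    trial = candidate + rng.normal(scale=0.05, size=DIM)   # random perturbation
    trial_score = match_score(trial)
    if trial_score > score:                                # keep only improvements
        candidate, score = trial, trial_score
    attempts += 1

print(f"accepted={score >= THRESHOLD}, score={score:.3f}, attempts={attempts}")
```

The accepted candidate matches the stored template well enough to be let in, yet it need not look anything like the enrolled person's actual face or fingerprint, which is precisely the distinction Emilio draws above and precisely the spoofing risk the vendors gloss over.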

Allevate, Fri 15 Mar 2013, 9:43pm

Biometrics without a doubt provide significant benefit by more accurately identifying individuals than conventional methods in a manner that is far more efficient. ROI is measurable and absolute.

You are correct, though, as in many nascent industries, there are claims being made that are, let's say, of an exaggerated nature.

Such claims will only serve to erode trust and hinder the adoption of, and the benefit realised by, such technology.

By calling the industry to account, you are helping to foster an atmosphere of plain and open speaking. This can only be positive.

Your efforts, in my opinion, are of great advantage to the biometrics industry, and surely will serve to encourage an honest dialogue which will help secure the place of this technology in our society. Once you cut through the "hyperbole" the facts remain: Improved process, greater risk mitigation, enhanced efficiency, tangible ROI.

Stephen Wilson, Sat 16 Mar 2013, 12:28pm

And once again a biometrics debate ends with a whimper: just more vapid, unsubstantiated claims. How, for example, is ROI "measurable and absolute" when experts say no biometric technology has predictable performance out in the real world? See http://lockstep.com.au/blog/2013/02/11/technological-imperialism.

Allevate is one of the self-styled leaders of biometrics on Twitter. Along with M2Sys and SecurLinx, they're enthusiastic #biometricchatters, but they're all strangely unwilling or unable to engage in the real issues. M2Sys in fact blocks me on Twitter and prevents me commenting on their website.

When I asked @Allevate to respond to my criticisms above (namely that "unique" is an exaggerated word, that vendors lie about FAR and FRR, and that they don't understand reverse engineering) he said "why would I defend other people's claims?". Well, I would expect he'd defend the claims of the biometrics industry because he believed in them, but also because serious debate is what marks serious information security. But no, it's a sad hallmark of the biometrics industry that there is hardly any debate, hardly any science, hardly any engagement.

I would happily put biometrics vendors in the same category as used car salesmen and ignore them, save for the fact that these guys' tall claims extend to saving the world's poor from the digital divide: http://lockstep.com.au/blog/2013/02/11/technological-imperialism. An industry with grandiose plans for billions of the world's most disadvantaged peoples needs to be held to account!
