
Calling for a moratorium on SM facial recognition

Further update 27 Nov 2012: It is reported that new facial recognition services invite you to upload photos to find look-alikes in pornography. That is, you can find out if the "girl next door" has a secret life. It's such an egregious threat to privacy that I call again for a moratorium on facial recognition. Mine will be neither a popular nor a politically correct view, but I reckon this technology is so intrinsically unsafe that we should suspend its use while we agree on ways to control its application.

I'd like to see a moratorium on commercial facial recognition.

It should be of acute concern that photos originally uploaded for personal use are being rendered personally identifiable, and put to secondary purposes by social media companies, who are silent in their Privacy Policies about what those purposes might be. The essence of privacy is control, and with facial recognition, people are utterly impotent: you might be identified through FR by virtue of being snapped at a party or in a public place by someone you never even met.

It’s clear that sites like Facebook have facial recognition bots poring over their image libraries, because this is how they generate tag suggestions. Crucially, when a user asks for a tag to be removed, Facebook does not automatically remove the underlying template that joins the distilled biometric data to the user’s name. That requires a separate and obscure request not mentioned in their Privacy Policy. [Update August 2012: earlier this year, Facebook made welcome changes. More information is provided now about facial recognition, and when tag suggestions are turned off, templates are indeed deleted. Their Privacy Policy however still leaves much to be desired, for it does not restrain Facebook's secondary use of biometric templates.]
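To make the mechanics concrete, here is a minimal sketch of how a tag-suggestion pipeline works in general. It is purely illustrative: the embeddings, names and threshold are invented, and real systems derive embeddings from trained face-recognition models, but the structure is the point.

```python
# Hypothetical sketch of a tag-suggestion pipeline. The stored
# "templates" are per-user face embeddings distilled from previously
# tagged photos (the biometric data discussed above). All values
# here are dummies.
import numpy as np

templates = {
    "alice": np.array([0.12, 0.85, 0.31]),
    "bob":   np.array([0.91, 0.05, 0.44]),
}

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def suggest_tag(face_embedding, threshold=0.9):
    """Return the best-matching user, or None if no stored template
    is similar enough to warrant a tag suggestion."""
    best_name, best_score = None, threshold
    for name, template in templates.items():
        score = cosine_similarity(face_embedding, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A face detected in a newly uploaded photo (dummy vector).
print(suggest_tag(np.array([0.13, 0.84, 0.30])))  # prints "alice"
```

Notice that removing a visible tag need not touch the `templates` store at all, which is precisely the concern: the tag and the biometric template are separate artefacts.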

Identifiable faces in photos are an incredible resource. Combined with image analysis for picking out features like place names, buildings and logos, FR enables social media companies to work out countless new connections to add to their commercial lifeblood. They will be able to infer what we like - the brands we wear, the cars we drive, the phones we use, the airlines we fly, the places we frequent - without us having to expressly 'Like' anything.

Many users may be unaware of the rich metadata that travels with their photos and supercharges these linkages, including data about when a photo was taken and, in many cases, where, thanks to GPS or geolocation in their camera phones. And then there is the metadata that the social media service adds, like the name of the user who uploaded the files. And from now on, who else is in them.
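For readers curious about just how much metadata rides along with an ordinary upload, here is a small sketch using the Pillow imaging library (recent versions). The file name is hypothetical; the GPS fields live in their own EXIF block, tag 0x8825, which many camera phones fill in automatically.

```python
# Dump the EXIF metadata embedded in a photo, including any GPS
# coordinates recorded by the camera. The file path is hypothetical.
from PIL import Image, ExifTags

img = Image.open("party_photo.jpg")
exif = img.getexif()

# Standard fields: capture date/time, camera make and model, etc.
for tag_id, value in exif.items():
    print(ExifTags.TAGS.get(tag_id, tag_id), ":", value)

# GPS data sits in a separate IFD (tag 0x8825).
gps = exif.get_ifd(0x8825)
for tag_id, value in gps.items():
    print(ExifTags.GPSTAGS.get(tag_id, tag_id), ":", value)
```

A few lines of code, and a stranger's photo yields when and where it was taken; the social media service then layers the uploader's name and, with facial recognition, everyone else's on top.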

By encouraging its members to tag their friends, and then making tag suggestions which are validated by their subjects, Facebook is crowd-sourcing the calibration of its FR algorithms. Even if users are wily enough to have the templates deleted, at the very least Facebook still benefits from the learning to improve its mathematics. All this volunteer testing and training by Facebook’s members is another example of the unfair bargain and false pretenses under which Facebook harvests Personal Information.
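Purely as an illustration of that feedback loop (the function names and update rule here are mine, not Facebook's), a confirmed tag amounts to a free, human-verified training label, and folding it back into the stored template can be as simple as a running average:

```python
# Illustrative only: each confirmed tag is a human-verified label
# that can refine the stored biometric template; each rejected tag
# is a useful "hard negative" for later recalibration. The running
# average below stands in for whatever retraining a real system does.
import numpy as np

templates = {"alice": np.array([0.12, 0.85, 0.31])}
hard_negatives = []

def on_tag_confirmed(name, face_embedding, rate=0.1):
    """Nudge the user's stored template toward the confirmed face."""
    templates[name] = (1 - rate) * templates[name] + rate * face_embedding

def on_tag_rejected(name, face_embedding):
    """Record the mismatch so the matcher can be recalibrated."""
    hard_negatives.append((name, face_embedding))
```

Every click on "yes, that's me" or "no, that's not" improves the matcher, whether or not the user ever intended to help train a biometric system.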

As we're seeing in Europe, it appears that current Data Protection laws will put the brakes on facial recognition. There is a straightforward threshold issue: facial recognition converts hitherto anonymous image data into Personally Identifiable Information (in enormous volumes) and thus OECD style Privacy Principles apply. The custodians of this PII must in many jurisdictions account for the necessity of collecting it, they must limit themselves in how they use & disclose it, and they must be transparent in their Privacy Policies about these matters. Collection and use of biometric data may also be subject to consent rules; it may be necessary for individuals to consent in advance to the creation of biometric templates, which is a thorny issue when so many photographs in Facebook were taken by other people and uploaded without the subjects even being aware of it.

Where is all this heading?

Automated facial recognition is a lot like granting social media companies x-ray vision into millions and millions of personal photo albums. As the FR bots do their work, it’s equivalent to magically tattooing names onto the foreheads of people in the photos. And then they can figure out where everyone was, at different points in time, who they were hanging with, and what they were doing. In effect, social media companies can stitch together global surveillance tapes.

Today those "tapes" will be patchy, but they will become steadily more complete and detailed over time, as users innocently upload more and more imagery, and as the biometric efficacy improves.
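To see why named photos compose into something tape-like, consider a toy example (all data invented): once each photo record carries names, a timestamp and a place, a simple pivot produces a per-person movement log.

```python
# Toy illustration of the "surveillance tape": pivot photo records,
# already annotated with identified people, into per-person timelines.
# All data here is invented.
from collections import defaultdict

photos = [
    {"time": "2012-06-01 21:03", "place": "Bondi Beach",    "people": ["alice", "bob"]},
    {"time": "2012-06-02 09:15", "place": "Sydney Airport", "people": ["alice"]},
]

timelines = defaultdict(list)
for photo in photos:
    for person in photo["people"]:
        others = [p for p in photo["people"] if p != person]
        timelines[person].append((photo["time"], photo["place"], others))

for person, events in timelines.items():
    for time, place, companions in sorted(events):
        print(person, time, place, "with", companions or "nobody")
```

Each new upload is one more frame, and the gaps close as coverage grows.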


Comments

Peter Baril, Mon 27 Jan 2014, 5:34pm

Apologies, I'd not seen this before tweeting earlier today.

Q. Regarding the distinctions you draw between 'commercial' and other applications of facial recognition, I'm trying to understand how the Facebook example you cite would differ from the ubiquitous street cams in a city like London, for example.

Though not quite 'NSA' equivalents, the private security and technology firms to whom the street cam networks are contracted would apply similar metadata analyses to footage from public places, raising an equal concern that private investigators, debt collectors and the like would keenly subscribe to such a 'service'.

Rather than split the two (commercial v. other), is there a single deeper constitutional and legislative framework that could apply to both contexts?

I know, very tough. The phrase 'reasonable expectation of privacy' is so context dependent.

Stephen Wilson, Tue 28 Jan 2014, 6:11am

Thanks Peter.

I suspect that the use of face recognition by London police falls under law enforcement exemptions to British or European data protection laws. That is, broadly, police are allowed to collect and to use PII outside the normal constraints of a Privacy Policy and Collection and Use Limitations, if their purpose is to fight crime. Of course we can have a separate debate about whether CCTV does deter crime or whether it really is useful in forensics, but those are not privacy questions per se.

So there is an underlying privacy framework in jurisdictions that have OECD style data protection laws. I'm referring to all of Europe, Australia, New Zealand and dozens of other countries; not the USA.

This type of framework is all about data protection or information privacy. It sidesteps philosophical questions of the self or data ownership; data protection laws in places like Australia don't even use the concepts of "public domain" or "private". Instead the focus is on controlling the flow of Personal Information.

The framework starts with a definition of Personal Information or PII, namely any data about a natural person whose identity is apparent or can be reasonably ascertained. Interestingly, the dominant definitions of PII internationally and in the US General Services Administration are essentially the same. Here's a recent progressive discussion of the uncertainty arising in the definition of PII around the potential for identifiability: http://lockstep.com.au/blog/2013/09/27/pii-or-not-pii.

The human rights oriented data protection framework then sets out principles that call for limits on the Collection of PII (only collect what's needed for an express purpose), Use and Disclosure (refrain from taking PII collected for one purpose and using it for unrelated purposes), Openness (tell people what PII you collect, how, when and why) and Access (provide people with access to whatever PII you have about them).

Note that this is where the American privacy regime departs from the rest of the world. For starters, there is no broad data protection statute in the US, only various sector-specific laws like HIPAA. Where there are data protection regulations, they tend to focus on determining practical harms from breaches, rather than affording in-principle protection of all PII as a right. And above all, there is no Collection Limitation principle in the American Fair Information Practice Principles (FIPPs). There is something of an American philosophy that no harm is done merely collecting Personal Information, and that it's fair for businesses to collect today and innovate tomorrow, to come up with new uses for the PII they have aggregated.

Internationally, privacy principles tend to be technology neutral and blind to the method of collection; it doesn't matter how PII comes to be in a company's or government's possession, the principles are the same. So in particular, privacy principles and privacy rights apply to metadata collected indirectly, and to the outputs of Big Data processes and biometric facial recognition (as discussed in my blog above). Hence Facebook in Europe was forbidden from automatically attaching names to anonymous photos without consent, because the creation of named records represents a collection of PII.

Finally, data protection laws have various exemptions, for law enforcement and other legal processes. This would be how CCTV cameras have proliferated in places like the UK with otherwise strong privacy protections. I'm not naive about the quality of the contracts written for private contractors running CCTV on behalf of police, but in theory there should be restraints against companies re-using PII they collect for any other purpose, like private investigations. It should be impossible for a third party to 'subscribe' to access CCTV data.

In closing, I'd like to see the human rights oriented privacy framework used as an armature for the necessary debate regarding national security surveillance. My biggest problem with the PRISM scandal and other Snowden revelations is that these operations were (and presumably still are) happening in secret. There is no security in obscurity: government surveillance should be transparent. If there is a need for intelligence agencies to trawl through telephone metadata, then let's see them make the case for the Collection Necessity! See also http://lockstep.com.au/blog/2013/12/18/proper-surveillance-debate.
