I’d like to see a moratorium on commercial facial recognition.
It should be of acute concern that photos originally uploaded for personal use are being rendered personally identifiable and put to secondary purposes by social media companies, which are silent in their Privacy Policies about what those purposes might be. The essence of privacy is control, and with facial recognition, people are utterly powerless: you might be identified through FR simply by being snapped at a party or in a public place by someone you have never even met.
It’s clear that sites like Facebook have facial recognition bots poring over their image libraries, because this is how they generate tag suggestions. Crucially, when a user asks for a tag to be removed, Facebook does not automatically remove the underlying template that joins the distilled biometric data to the user’s name. That requires a separate and obscure request not mentioned in their Privacy Policy. [Update August 2012: earlier this year, Facebook made welcome changes. More information is now provided about facial recognition, and when the tag suggestions feature is turned off, templates are indeed deleted. Their Privacy Policy, however, still leaves much to be desired, for it does not restrain Facebook’s secondary use of biometric templates.]
Identifiable faces in photos are an incredible resource. Combined with image analysis for picking out features like place names, buildings and logos, FR enables social media companies to work out countless new connections to add to their commercial lifeblood. They will be able to work out what we like – the brands we wear, the cars we drive, the phones we use, the airlines we fly, the places we frequent – without us ever having to expressly ‘Like’ anything.
Many users may be unaware of the rich metadata that travels with their photos and supercharges these linkages: data about when a photo was taken and, in many cases, where, thanks to GPS or geolocation in camera phones. Then there is the metadata that the social media service itself adds, like the name of the user who uploaded the files. And from now on, who else is in them.
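To illustrate just how precise this embedded metadata is: cameras record GPS position in the photo’s EXIF tags as degrees/minutes/seconds rationals, which a few lines of code turn into an exact map coordinate. A minimal sketch in Python (the function name and the sample values are hypothetical, standing in for what a real GPSLatitude/GPSLongitude tag contains):

```python
from fractions import Fraction

def dms_to_decimal(dms, ref):
    """Convert EXIF-style (degrees, minutes, seconds) rationals into
    signed decimal degrees. 'ref' is the hemisphere letter: N/S/E/W."""
    degrees, minutes, seconds = (
        float(Fraction(*x)) if isinstance(x, tuple) else float(x) for x in dms
    )
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative by convention.
    return -decimal if ref in ("S", "W") else decimal

# Invented example values, in the (numerator, denominator) form EXIF uses.
lat = dms_to_decimal(((37, 1), (48, 1), (3060, 100)), "S")
lon = dms_to_decimal(((144, 1), (57, 1), (4020, 100)), "E")
print(round(lat, 4), round(lon, 4))
```

A street address is recoverable from such a coordinate with any reverse-geocoding service – all from a file the user thought of as just a picture.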
By encouraging its members to tag their friends, and then making tag suggestions which are validated by their subjects, Facebook is crowd-sourcing the calibration of its FR algorithms. Even if users are wily enough to have their templates deleted, Facebook still benefits from the learning, which sharpens its matching algorithms. All this volunteer testing and training by Facebook’s members is another example of the unfair bargain and false pretenses under which Facebook harvests Personal Information.
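The calibration loop described above can be sketched in the abstract: every accepted or rejected tag suggestion is a free labelled example, and even a crude learner can use those labels to tune its match threshold. This is not Facebook’s actual method – just an illustrative Python sketch with invented data:

```python
def tune_threshold(feedback):
    """Pick the face-match distance threshold that best separates
    confirmed tags (True) from rejected ones (False).
    'feedback' is a list of (template_distance, user_confirmed) pairs --
    exactly the signal that accepting or rejecting a tag suggestion yields."""
    candidates = sorted({d for d, _ in feedback})
    best_t, best_correct = None, -1
    for t in candidates:
        # Predict "same person" whenever the distance is at most t.
        correct = sum((d <= t) == confirmed for d, confirmed in feedback)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

# Hypothetical feedback: users mostly confirmed the close matches.
feedback = [(0.20, True), (0.35, True), (0.40, True),
            (0.55, False), (0.70, False), (0.90, False)]
print(tune_threshold(feedback))  # -> 0.4 with this toy data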
As we’re seeing in Europe, it appears that current Data Protection laws will put the brakes on facial recognition. There is a straightforward threshold issue: facial recognition converts hitherto anonymous image data into Personally Identifiable Information (in enormous volumes), and thus OECD-style Privacy Principles apply. In many jurisdictions, the custodians of this PII must account for the necessity of collecting it, must limit how they use and disclose it, and must be transparent in their Privacy Policies about these matters. Collection and use of biometric data may also be subject to consent rules: it may be necessary for individuals to consent in advance to the creation of biometric templates, a thorny issue when so many photographs on Facebook were taken by other people and uploaded without the subjects even being aware of it.
Where is all this heading?
Automated facial recognition is a lot like granting social media companies x-ray vision into millions and millions of personal photo albums. As the FR bots do their work, it’s equivalent to magically tattooing names onto the foreheads of people in the photos. And then they can figure out where everyone was, at different points in time, who they were hanging with, and what they were doing. In effect, social media companies can stitch together global surveillance tapes.
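The “surveillance tape” is not a metaphorical stretch: once faces are named and photos are timestamped and geotagged, assembling a per-person movement log is a trivial join. A minimal sketch, with every name, place and timestamp invented for illustration:

```python
from collections import defaultdict

def build_timelines(photos):
    """Assemble a per-person movement log from tagged photos.
    Each photo record is (timestamp, place, [names identified by FR]);
    all data here is invented for illustration."""
    timelines = defaultdict(list)
    for timestamp, place, names in photos:
        for name in names:
            timelines[name].append((timestamp, place))
    # Sort each person's sightings chronologically.
    return {name: sorted(events) for name, events in timelines.items()}

photos = [
    ("2012-03-01T20:15", "Bar on Main St", ["alice", "bob"]),
    ("2012-03-01T23:40", "House party", ["alice", "carol"]),
    ("2012-03-02T09:05", "Airport", ["bob"]),
]
timelines = build_timelines(photos)
print(timelines["alice"])
```

Note that none of the subjects needs to have uploaded anything, or even hold an account: appearing in other people’s photos is enough to acquire a timeline.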
Today those “tapes” will be patchy, but they will become steadily more complete and detailed over time, as users innocently upload more and more imagery, and as the biometric efficacy improves.
Further update 27 Nov 2012: It is reported that new facial recognition services invite you to upload photos to find look-alikes in pornography. That is, you can find out whether the “girl next door” has a secret life. It’s such an egregious threat to privacy that I call again for a moratorium on facial recognition. Mine will be neither a popular nor a politically correct view, but I reckon this technology is so intrinsically unsafe that we should suspend its use while we agree on ways to control its application.