The Australian government is currently holding an inquiry into the federal Privacy Act 1988 (as amended in 2000). The inquiry has received public submissions, most of which have recently been published here. The inquiry is extremely wide-ranging. I made a short submission (PDF) focusing on technology neutrality, synthetic data, anonymity and metadata. This post is the first of three I plan to publish based on selected elements of my submission.
I do not subscribe to the popular view that the march of technology has “outpaced” privacy law. The foundations of contemporary privacy rules laid down by Brandeis and Warren in the nineteenth century ― in particular the right to be let alone ― remain as true today as they ever were. Successive generations of technology ― from the telegraph and portable cameras in the 1890s, through computerisation in the 1970s, to Big Data and AI today ― have made it easier to infringe privacy, and at the same time harder to notice or detect the intrusion, but the average citizen has always insisted that others basically mind their own business.
The technology-neutral regulatory framework instigated by the OECD’s privacy guidelines of 1980 has stood the test of time. The principles of Collection Limitation, Use & Disclosure Limitation, Purpose Specification and Openness predated the Internet, and yet in recent years they have been applied to rein in excessive application of, for example, search algorithms and biometrics, technologies that would have been considered pure sci-fi 40 years ago.
I frame privacy simply as restraint. Respecting others’ privacy – that is, letting them alone – is about refraining from knowing things about them. For digital businesses, it is about choosing not to collect data, not to run analytics or comb over digital breadcrumbs even if they are in the “public domain”. Privacy is less about what we do with data than what we do not do with it.
One of the most powerful contemporary ideas in data privacy is Collection by Creation (a phrase first coined, as best I can tell, by the Office of the Australian Privacy Commissioner in its work on data analytics and the Australian Privacy Principles). Because technology-neutral privacy laws are silent on the method of collection, it follows that any Personal Data that ends up in a computer system by any means has been collected and as such is subject to privacy protections. This goes for Personal Data automatically collected by IoT sensors or synthesised by software.
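To make the idea concrete, here is a minimal sketch in Python of Personal Data being collected by creation. Every name, reading and threshold below is invented for illustration; the point is simply that the inferred attribute never passes through a form or an interview, yet it ends up in the customer's record all the same.

```python
from dataclasses import dataclass, field

# A minimal sketch of "collection by creation". No person answers a question:
# raw smart-meter readings arrive automatically from an IoT sensor, and the
# software synthesises a brand-new attribute about the householder.

@dataclass
class CustomerRecord:
    customer_id: str
    meter_readings: list[float] = field(default_factory=list)         # collected by a sensor
    inferred_attributes: dict[str, str] = field(default_factory=dict)  # collected by creation

def infer_occupancy(record: CustomerRecord) -> None:
    """Derive a new piece of Personal Data from raw sensor readings."""
    # Hypothetical: average electricity use across the 1 a.m. to 6 a.m. readings.
    overnight_use = sum(record.meter_readings[0:6]) / 6
    if overnight_use > 1.5:
        # This label never existed until the software created it, yet once it
        # sits in the record it has been "collected" and attracts the same
        # privacy obligations as any other Personal Data.
        record.inferred_attributes["likely_night_shift_worker"] = "yes"

household = CustomerRecord("C-1001", meter_readings=[2.1, 2.3, 1.9, 2.0, 2.2, 1.8, 0.4, 0.3])
infer_occupancy(household)
print(household.inferred_attributes)  # {'likely_night_shift_worker': 'yes'}
```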
Indeed, the results of search algorithms, if they relate to identifiable individuals, constitute synthetic Personal Data. It is well known that Internet search results are sensitive to context and to the history and interests of the person making the query. Search results, despite appearances, are not discrete facts plucked from the public domain but are constructed by complex processes in the service of a greater and under-appreciated commercial purpose ― getting to know people intimately so as to better target advertising to them. And hence it was reasonable for the European Court of Justice to make its original Right To Be Forgotten (RTBF) ruling in the 2014 case of Google Spain v AEPD and Mario Costeja González. The decision embodied privacy principles that can moderate the flow of any Personal Data, even data created (and hence collected) by algorithms.
So technology-neutral data privacy rules are ironically impersonal! They operate objectively. From experience we tend to think of personal data collection in terms of forms, questionnaires, surveys and interviews, and that’s probably how the big digital companies would prefer we kept imagining data flows. But data collection is increasingly automatic, entirely untouched by human hands. If a computer program mining shopping data can predict which customers are pregnant ― without any marketing person asking any questions ― then surely those customers expect and deserve the same privacy as if they had been expressly interviewed in person. The good news is, they do. Data privacy laws govern Personal Data flows created by Big Data, artificial intelligence and robots, and will continue to govern information technologies as they develop.
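By way of illustration only, a few lines of Python show how such an inference might be synthesised from shopping data. The purchase matrix, labels and product categories are all invented, and a real system would be vastly more elaborate, but the inferred probability becomes Personal Data the moment it is written against a customer identifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy purchase-history matrix: each row is a customer, each column a product
# category (say, unscented lotion, prenatal vitamins, cotton wool, beer).
# All of the data and labels here are invented for illustration.
X = np.array([
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = customer known to be pregnant (training label)

model = LogisticRegression().fit(X, y)

# A new customer's basket, never the subject of any question or form:
new_customer = np.array([[1, 1, 0, 0]])
prob = model.predict_proba(new_customer)[0, 1]
print(f"Inferred probability of pregnancy: {prob:.2f}")

# Once this score is linked to a loyalty-card ID, it is Personal Data collected
# by creation, and is regulated exactly as if the customer had been asked directly.
```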
It’s like a regulatory superpower.