That is, what don't you need to know?
Lockstep's angle on privacy
After many years specialising in the field of Digital Identity, I came to realise that privacy and identity management are two sides of the same coin. Identity management means working out what you need to know about someone in order to do some sort of e-business with them, and privacy management is knowing what you don't need to know!
Lockstep's privacy practice is based around this very practical yet rigorous perspective. We bring deep experience in information systems design and governance to bear on complex and subtle privacy problems, bridging technology, business and compliance.
We have undertaken Privacy Impact Assessments for clients including:
- Confyrm Inc. (a US-based anti-fraud technology developer conducting an NSTIC Pilot)
- VANguard (the federal government e-authentication hub operated by the Department of Industry)
- Victorian Department of Health (multiple PIAs)
- Department of Finance's suite of public federal government web sites (one of the first PIAs to be done under the new Australian Privacy Principles)
- Victoria's Smart Meter program (managed by the Department of Primary Industries)
- Department of Foreign Affairs online passport application system
- Department of Finance Gender Balance on Boards database
- Australia Post
- Queensland Health
- HealthSMART, Department of Human Services (Victoria) state wide clinical application and client management system
- VicRoads' registration & licensing system overhaul "RandL".
We have done hands-on privacy design and engineering for the following clients (typically working within dev teams):
- Confyrm Inc.
- Australia Post
- Department of Foreign Affairs
- Department of Justice (Victoria)
- BusinessLink Ltd.
And we have delivered privacy research and/or training for:
- The Biometrics Institute
- Office of the [Federal] Privacy Commissioner
- Department of Education and Communities (NSW)
- Australian General Practice Network
- NSW Human Services Agencies' "HSNet"
Designing Privacy by Design
"Privacy by Design" is basically the same good idea as designing in quality or designing in security. Everyone is talking about PbD, but for most systems designers it remains a set of abstract principles. Lockstep has made real progress on design tools that distil PbD principles into actionable information management specifications. Our tools hybridise standard information security methodologies and architectural practices. They include:
- Information Flow Mapping templates
- Extended Information Asset Inventory
- Privacy Threat & Risk Assessment.
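To illustrate the spirit of these tools (the field names and checks below are our own invention, not Lockstep's published templates), an information flow map can be captured as structured data so that each flow can be screened against collection and disclosure principles:

```python
from dataclasses import dataclass

# Hypothetical, simplified flow record; a real template would capture
# far more context (retention, consent, jurisdiction, and so on).
@dataclass
class InformationFlow:
    source: str
    destination: str
    pii_elements: list
    purpose: str
    legal_basis: str = ""   # empty means no documented basis

def flag_flows(flows):
    """Return (flow, reason) pairs that warrant closer privacy analysis."""
    flagged = []
    for f in flows:
        if not f.legal_basis:
            flagged.append((f, "no documented legal basis"))
        if "health_record" in f.pii_elements:
            flagged.append((f, "sensitive PII in transit"))
    return flagged

flows = [
    InformationFlow("web_form", "crm", ["name", "email"],
                    "account creation", "consent"),
    InformationFlow("crm", "analytics", ["email", "health_record"],
                    "usage reporting"),
]

for flow, reason in flag_flows(flows):
    print(f"{flow.source} -> {flow.destination}: {reason}")
```

The value of holding the map as data rather than prose is that checks like these can be re-run whenever a flow is added or changed, which is the sense in which such templates make PbD principles actionable.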
Stephen was busy through 2013 presenting Lockstep's Privacy by Design methods to, amongst others, the IAPP, Swinburne University, the Institute of Information Managers, the Systems Administrators Guild, the Privacy Reform and Compliance Forum, the Australian Institute of Professional Intelligence Officers and the AusCERT security conference.
NSTIC Preliminary Privacy Evaluation Jan-Feb 2013
Stephen has been selected by the Privacy Coordination Subcommittee of the NSTIC* project to lead the preliminary privacy evaluation, road-testing both the NSTIC Privacy Evaluation Methodology and the project website http://www.idecosystem.org. [*US National Strategy for Trusted Identities in Cyberspace]
Advocates for Privacy and Privacy Enhancing Technologies
Lockstep Principal Stephen Wilson blogs regularly on privacy; search his posts at http://lockstep.com.au/blog/privacy.
Leveraging unique digital identity expertise, Lockstep has led the way in articulating a positive and robust vision for smart Privacy Enhancing Technologies. On this point, Stephen made a detailed submission to the 2005 Senate Inquiry into the Privacy Act, looking closely at intelligent authentication, smartcards and biometrics.
Stephen was an invited member of the Australia Law Reform Commission's Developing Technology Advisory Subcommittee convened during the Commission's review of the national privacy regime.
We have worked closely with privacy commissioners over many years to develop and articulate sophisticated positions on technology and privacy. For instance, we made a detailed submission to the Australian Privacy Commissioner on the development of guidelines for security practitioners. And Stephen was invited by the Commissioner to give a keynote speech at the launch of Privacy Awareness Week 2013.
In general, Lockstep resists the fatalistic view held by so many of our peers that technology has overtaken privacy; almost all privacy threats in today's digital technologies are anticipated by international Privacy Principles, and can be well managed by careful Privacy Engineering.
Sister company Lockstep Technologies undertakes award-winning R&D into innovative PETs.
Stephen's experience in privacy is summarised in the profile linked at the bottom of this page.
His numerous publications on the topic are gathered at the privacy section of the Lockstep library. Recent papers include:
- "Legal Limits to Data Re-Identification", Science Magazine, 8 February 2013
- "Designing Privacy By Design", IAPP Australia New Zealand Chapter Bulletin #40, February 2013
- "Facebook’s challenge to the Collection Limitation Principle" in Encyclopedia of Social Network Analysis and Mining, Rokne & Alhajj editors (in press)
- "Privacy Compliance Risks for Facebook", IEEE Technology and Society Magazine, Summer 2012
- "A bigger threat to patient privacy when doctors use Facebook", Journal of Medical Ethics, 20 Dec 2010
- "Public and yet still private", Online Banking Review, June 2010.
He has also made numerous submissions on privacy to government inquiries into:
- the Privacy Act
- the ill-conceived Human Services Access Card.
We have a sixth sense for how technologists think and work, which allows us to root out sometimes surprising privacy problems.
For example, for a state government client we discovered a particular systemic over-disclosure of PII, by intuiting how designers were going about transitioning third parties from fax and paper to electronic interfaces. The client routinely provides extracts of citizen records to a hundred or so authorised third parties. Different statutes govern different disclosures, restricting PII disclosures on a need-to-know basis. We knew that various third parties were moving from fax to electronic interfaces. When we saw that the upgrades were occurring over time and opportunistically, we judged it likely that support engineers would re-use existing FTP patterns for successive upgrades, instead of writing new interfaces on spec for each case. This re-use would lead to the PII extracts being broadened beyond what was really needed in each instance; in fact it turned out that essentially whole records were being exported. Here, reasonable engineering practice had the surprising side effect of compromising privacy; without our empathy for ICT work practices, no regular privacy compliance review could have foreseen the problem.
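The remedy in cases like this is straightforward to sketch: each electronic interface should export only the fields the recipient is authorised to receive, rather than re-using a whole-of-record extract. A minimal illustration (the recipient names and field lists are invented for the example):

```python
# Hypothetical whitelist of fields each third party may receive,
# reflecting the need-to-know restrictions of the governing statutes.
AUTHORISED_FIELDS = {
    "insurer_x": {"name", "policy_number"},
    "council_y": {"name", "address"},
}

def extract_for(recipient, record):
    """Disclose only the fields this recipient is authorised to see."""
    allowed = AUTHORISED_FIELDS.get(recipient, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "name": "Jo Citizen",
    "address": "1 Example St",
    "policy_number": "P123",
    "medical_notes": "…",   # never leaves the system unfiltered
}

print(extract_for("insurer_x", record))
```

The point is not the code itself but where the filter sits: a per-recipient whitelist applied at the export boundary makes need-to-know a property of the interface, so opportunistic re-use of the transfer pattern no longer broadens the disclosure.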
Privacy is so much more than security, or secrecy, or confidentiality
While most business people appreciate that good privacy compliance requires good information security, many organisations still struggle to identify tangible privacy controls; i.e. practical IT and e-security design features that pro-actively protect privacy and improve the organisation's privacy posture. As with security, privacy is now subject to special technical governance requirements, but there is a grave risk that an overly compliance-oriented approach can be expensive and ineffective. Businesses must be careful they can still 'see the wood for the trees'.
Bridging a gap
Most privacy advisers come from a legal and/or policy background, and look at privacy through the lens of compliance, or public policy. That's perfectly fine of course, yet the compliance perspective can fall short of engaging IT projects in their formative stages. To build privacy in, you need to understand in detail what privacy means for informatics requirements, architecture, software design, and security.
IT professionals sometimes underestimate privacy because they have long been told that "privacy is not a technology issue". They can presume that if they have security covered, then privacy will follow, and in any case, it seems to be someone else's responsibility! But many times we've seen this viewpoint morph into complacency, which in turn leads to privacy vulnerabilities in information systems that go undetected until it's too late.
Some examples help to illustrate the gap:
1. Security is not the same thing as privacy. Consider two highly secure organisations A and B, and suppose that A wishes to share data about its customers with B. Let's assume A and B are both secure to the highest standards. Then what's the problem? Simply, it doesn't matter how secure B is; under the law, personal information about A's customers cannot generally be transferred to B without those customers being informed, and without limits being placed on what B can do with it. Disclosure, or Secondary Usage, are not automatically OK just because the receiver is "secure".
2. Equally, secrecy is not the same thing as privacy. Too often the mistake is made that personal information found in the public domain can be exploited without the individuals' knowledge (this was the fundamental error made by Google in their StreetView wifi misadventure). But information privacy law doesn't much care where personal information comes from; the law doesn't even use the terms 'public' and 'private'. Instead the test is simply whether the information in question is personally identifiable. If it is identifiable, then there are limits on what an organisation can do with it, no matter how it was obtained.
3. Software designers can think that collecting personal information involves explicit forms or customer interviews, while the act of copying PI from another source, or generating new information from previously collected data, can be presumed to escape privacy law. Unfortunately these are misconceptions. The Privacy Act is blind to the manner of collection; no matter how PI comes to be in your information systems, you may be deemed to have collected it. And that includes processes whereby PI is generated from e.g. "Big Data" stores. When Sensitive PI is involved, special care is required, as discussed in a recent blog post.
Lockstep Consulting bridges the gap between 'technology' and the 'business'.
In addition to trust & privacy strategy development and Privacy Impact Assessments (PIAs), a truly unique offering of Lockstep's is what we call Privacy Engineering, a privacy-by-design approach that generates tailored, practical guidance for ICT architects, designers and project managers so they can build privacy controls into their systems. We work closely with our clients to fine-tune local design practices, building privacy controls in (as opposed to hoping to 'audit' them in). Privacy Engineering protects customer relations, pro-actively uncovers privacy problems, saves money by solving problems sooner, and enhances compliance. Special focus areas include audit logs and transaction histories, web forms, change management processes, and databases.
A little more detail on the approach is given in Babystep 14.
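To make the idea of a tangible privacy control concrete, here is one illustrative example in the audit-log focus area (the key handling and function names are hypothetical, for the sketch only): logging a keyed pseudonym of a customer identifier rather than the identifier itself, so that log entries remain linkable for investigations without exposing raw PII to everyone with log access:

```python
import hashlib
import hmac

# Hypothetical secret; in practice it would live in a key management
# system, not in source code, so logs alone cannot be re-identified.
LOG_PSEUDONYM_KEY = b"example-key-do-not-use"

def pseudonymise(customer_id: str) -> str:
    """Keyed hash: stable (hence linkable) but not directly identifying."""
    digest = hmac.new(LOG_PSEUDONYM_KEY, customer_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

def audit_log_entry(customer_id: str, action: str) -> str:
    """Record who did what, without writing the raw identifier to the log."""
    return f"user={pseudonymise(customer_id)} action={action}"

print(audit_log_entry("jane.doe@example.com", "record_viewed"))
```

Because the pseudonym is stable, investigators can still correlate a user's activity across entries, while re-identification requires access to the key, which is exactly the kind of design feature a compliance-only review tends not to surface.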
More sophisticated PIAs
For an example of our Privacy Impact Assessments, see http://www.smartmeters.vic.gov.au/resources/reports-and-consultations/lockstep-dpi-ami-pia-report.
Lockstep's PIAs are more technologically sophisticated than most. We focus on discovering issues and identifying privacy controls in a timely manner, as part of systems design. We have the experience and orientation to work with architects and developers, and even to be embedded in dev teams at the crucial stages of a project. Many PIAs, even if they manage to uncover significant technical issues, are conducted too late in the development life-cycle to make a real difference.
Lockstep keeps a watching brief over ongoing developments in PIA practices. Over time we have developed a flexible generalised PIA template in line with international best practices, and which we customise for each engagement. In particular, our approach adapts readily to different privacy regimes. An indicative table of contents is shown below.
- Introduction and overview
- Description of the project/system
- Information flows
- Privacy analysis
Each system has its own characteristic privacy issues. To underpin realistic recommendations, in this section we document the major issues. Subsections are varied by agreement with the client. We find that typically, the most prominent issues are Collection, Disclosure and Openness.
- Privacy risk assessment
- Privacy enhancing responses (if applicable)
Depending on where a client’s project is in the development lifecycle, it can be appropriate to provide substantive responses to the privacy analysis, in the form of design suggestions or process improvements.
- Compliance with Privacy Principles
Here we capture in detail the degree to which compliance with applicable privacy principles (as agreed with the client depending on their regulatory environment) is impacted by features and functions of the system.
- Use and Disclosure
- Data Quality
- Data Security
- Access and Correction
- Unique Identifiers
- Trans-border Data Flows
- Sensitive Information