The way I think about privacy

After many years specialising in the field of digital identity, I came to realise that privacy and identity are two sides of the same coin. Identity is what you need to know about someone in order to deal with them, and privacy is about what you don’t need to know while still being able to deal with them!

Privacy is about restraint.

Privacy is less about what you do with data than what you don’t do with it. 

That’s why privacy is more a policy issue than a technology one: strong privacy generally comes from leaving personal information alone (echoing the fundamental right of people “to be let alone”). Respecting privacy means choosing not to know things about your customers or users.

Lockstep’s privacy practice is based around this very practical yet rigorous perspective. We bring deep experience in information security, systems design, risk and governance to bear on complex and subtle privacy problems, bridging technology, business and compliance. 

Don’t sugar-coat privacy

Lockstep has a strong systems orientation towards privacy, recognising that privacy does create tensions with other system objectives like utility, cost and security. If privacy means choosing not to know things, then the tension is stark: information is power, and we all know how many digital businesses thrive on that power.

So respecting privacy comes with a cost.  

A mature sense of Privacy Engineering

All engineering is about resolving conflicting requirements.  Lockstep has innovated in Privacy Engineering and provides a truly rounded approach that takes our clients well beyond orthodox ‘Privacy by Design’. 

Sometimes “privacy engineering” means privacy for engineers, and that’s fine. Engineers are often in need of nuanced training in privacy. But Lockstep also sees the need for engineering privacy; that is, evaluating and calibrating privacy requirements alongside security, usability, compliance and cost.  

And just like its cousin security, privacy is never perfect. Privacy is just another objective in complex systems design. And it’s dynamic. The privacy performance of a design can change over time. So good privacy engineering practice sees us gauging privacy performance at a point in time, using tools like Privacy Impact Assessment, and then reviewing performance periodically, as things change.  

Privacy is not the same as secrecy

Many technologists continue to mistake privacy for secrecy or confidentiality, a limited perspective which can confine them to encryption and access control.  But developers and architects have so much more to offer privacy design, if they orientate themselves to managing personal information flows.

Few of us want to live our lives in secret. We mostly want others to know things about us, but to know only what they need to know, and to not abuse what they know.

Remember that the near universal definition of personal information or personal data is essentially any data which can—on its own or in concert with other data—be associated with a natural person. And modern privacy protection centres on data protection principles such as collection minimisation, use minimisation, disclosure minimisation, and openness. Privacy does not prohibit collection and use of personal data: it seeks to moderate these activities and make them apparent to those affected. 

It follows that privacy is the protection you need for data that is not secret! 

Privacy is not complicated

Common privacy principles in regulations worldwide — dating back to the OECD privacy framework of 1980 and now legislated in over 120 countries — are readily boiled down as follows:

  • Don’t collect personal information unless you really need it.
  • Don’t collect more personal information than you need, and do not re-purpose personal information.
  • Get rid of personal information when you no longer need it.
  • Tell people concerned what personal information you have about them and why.
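These principles translate directly into data-handling rules. As an illustration only (the field names and retention period below are hypothetical, not drawn from any Lockstep methodology), collection minimisation and disposal might be sketched like this:

```python
from datetime import date, timedelta

# Hypothetical illustration: the only fields a sign-up form is allowed to
# collect, and how long records may be kept. Names and periods are invented.
ALLOWED_FIELDS = {"email", "display_name"}   # collect only what you need
RETENTION = timedelta(days=365)              # get rid of it when no longer needed

def minimise(submitted: dict) -> dict:
    """Drop any submitted field we have no stated need for."""
    return {k: v for k, v in submitted.items() if k in ALLOWED_FIELDS}

def is_expired(collected_on: date, today: date) -> bool:
    """Flag records due for deletion under the retention rule."""
    return today - collected_on > RETENTION

# Over-collection (date_of_birth) is silently dropped before storage.
record = minimise({"email": "a@example.com", "display_name": "Ana",
                   "date_of_birth": "1990-01-01"})
```

The point is not the code itself but that each principle becomes a checkable rule rather than an abstract aspiration.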


We have delivered privacy strategy, research and analysis for:

  • AU Treasury’s Consumer Data Right open data regime (CDR)
  • The Biometrics Institute privacy Trust Mark (UK)
  • Consumer biometrics privacy strategy at U.S. health insurer Aetna
  • Australia Post’s digital identity trust strategy
  • Office of the [Federal] Privacy Commissioner (AU)
  • Department of Education and Communities (AU NSW)
  • Australian General Practice Network
  • CSIRO.

See also training clients below.

We have undertaken Privacy Impact Assessments (PIAs) for numerous clients including:

  • Attorney General’s biometric Face Verification Service (FVS)
  • NSW COVIDSafe QR code check-in system for contact tracing
  • NSW COVID Stimulus voucher programs
  • Confyrm Inc. (a U.S. based ID fraud management start-up)
  • U.S. National Strategy for Trusted Identities in Cyberspace (NSTIC)
  • Department of Foreign Affairs online passport application system
  • Victorian Department of Health cloud transformation program
  • AusIndustry’s VANguard credential verification hub
  • Department of Finance’s suite of public federal government web sites (one of the first PIAs to be done under the Australian Privacy Principles)
  • State of Victoria smart electricity meter program (one of the first major smart grid PIAs in the world)
  • NSW Births, Deaths & Marriages
  • The Australian National University
  • Australian Federal Police
  • Pharmacy Guild of Australia
  • Department of Finance Gender Balance on Boards database
  • Australia Post’s mobile and over-the-counter identity services
  • Queensland Health
  • Victoria Dept of Human Services state wide clinical application and client management system (HealthSMART)
  • Vicroads’ registration & licensing system overhaul “RandL”.


Lockstep delivers customised privacy training, especially for technologists and information security practitioners. Here’s an overview and sample of our full-day privacy masterclass format.

We have designed and delivered customised training for clients including:

  • Australian Prudential Regulation Authority (APRA)
  • The Digital Transformation Agency (DTA)
  • NSW Department of Education & Training
  • University of NSW Law School
  • Macquarie University Business School
  • ISACA (formerly the Information Systems Audit and Control Association)
  • The Office of the Federal Privacy Commissioner.


Stephen has been Privacy Track Chair for Identiverse / Cloud Identity Summit since 2015.

Selected privacy presentations are available below:

A Goldilocks point for Digitised Vaccination Certificates

Turing Trustworthy Digital Identity Conference, 13 September 2021

A Digital ‘Yellow Card’ using Community PKI certificates

IEEE International Symposium on Technology & Society (Public Interest Technologies), 2020

Update on the Privacy Trust Mark

Biometrics Institute Showcase Australia, Canberra, 14 November 2016

“Privacy Matters” Forum

NSW Privacy Commissioner, Privacy Awareness Week, Sydney, 8 May 2015

“Is a Biometrics Trust Mark Viable?”

Identity New Zealand, Wellington, 28 May 2015

“Privacy Master Class”

One day tutorial for infosec professionals, AusCERT 2015 Security Conference, Gold Coast Queensland, 2 June 2015

“Rationing Identity in the Internet of Things”

Cloud Identity Summit, San Diego, 9 June 2015

Designing Privacy by Design

“Privacy by Design” is basically the same good idea as designing in quality or designing in security. Everyone is talking about PbD but for most systems designers it remains a set of abstract principles. Lockstep has made real progress on design tools that reduce PbD principles down to actionable information management specifications. Our tools hybridise standard information security methodologies and architectural practices. They include:

  • Information Flow Mapping templates
  • Extended Information Asset Inventory
  • Privacy Threat & Risk Assessment.
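To illustrate the first of these, an information flow map can be captured as structured data, so that each flow records what is disclosed, to whom, for what purpose, and on what authority. The structure below is a generic sketch of the idea (the class, fields and example flows are invented for illustration, not Lockstep’s actual template):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One personal-information flow: who sends what to whom, and why.
    A generic sketch for illustration, not Lockstep's working template."""
    source: str
    recipient: str
    pii_items: tuple
    purpose: str
    authority: str

# Two hypothetical flows out of a citizen registry.
flows = [
    Flow("registry", "agency_a", ("name", "address"), "service delivery", "Statute A"),
    Flow("registry", "agency_b", ("name",), "identity verification", "Statute B"),
]

def items_disclosed_to(recipient: str) -> set:
    """Aggregate every PII item a given recipient receives, across all flows."""
    return {item for f in flows if f.recipient == recipient for item in f.pii_items}
```

Once flows are explicit like this, questions such as “what does agency B actually receive, and under what authority?” become queries rather than archaeology.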

Stephen has presented Lockstep’s Privacy Engineering and Design methods to, amongst others, the iapp, Swinburne University, the Institute of Information Managers, the Systems Administrators Guild, the Privacy Reform and Compliance Forum, the Australian Institute of Professional Intelligence Officers and the AusCERT security conference.

NSTIC Preliminary Privacy Evaluation Jan-Feb 2013

Stephen was selected by the Privacy Coordination Subcommittee of the NSTIC* project to lead the preliminary privacy evaluation, road-testing both the NSTIC Privacy Evaluation Methodology and the project website.

[*US National Strategy for Trusted Identities in Cyberspace].

Advocates for Privacy and Privacy Enhancing Technologies

We have long been active in the Australian and international privacy communities, advocating for better privacy design, better application of existing technology-neutral data protection laws, and adoption of verifiable credential technologies. Steve blogs regularly on privacy.

We recently made a detailed submission to the Australian Attorney-General’s 2020 inquiry into the Privacy Act.

Leveraging unique digital identity expertise, Lockstep has led the way in articulating a positive and robust vision for smart Privacy Enhancing Technologies. On this point, Stephen made a detailed submission to the 2005 Senate Inquiry into the Privacy Act, looking closely at intelligent authentication, smartcards and biometrics.

Stephen was an invited member of the Australia Law Reform Commission’s Developing Technology Advisory Subcommittee convened during the Commission’s review of the national privacy regime.

We have worked closely with privacy commissioners over many years to develop and articulate a sophisticated position on technology and privacy. For instance, we made a detailed submission to the Australian Privacy Commissioner on the development of guidelines for security practitioners. And Stephen was invited by the Commissioner to give a keynote speech at the launch of Privacy Awareness Week 2013.

In general, Lockstep resists the fatalistic view held by so many of our peers that technology has overtaken privacy; almost all privacy threats in today’s digital technologies are anticipated by international Privacy Principles, and can be well managed by careful Privacy Engineering.

Sister company Lockstep Technologies undertakes award-winning R&D into innovative PETs.

Track record

Stephen’s experience in privacy is summarised in the profile linked at the bottom of this page.

His numerous publications on the topic are gathered at the privacy section of the Lockstep library. Significant recent papers include:

  • “Big data held to privacy laws, too”, Nature, 26 March 2015
  • “Seeing privacy through the engineer’s eyes”, Privacy Law Bulletin, March 2015
  • “The collision between Big Data and privacy law”, Australian Journal of Telecommunications and the Digital Economy, Vol 2.3, Oct 2014
  • “Facebook and Personal Information”, S. Wilson & A. Johnston, in Encyclopedia of Social Network Analysis and Mining, Rokne & Alhajj editors, 2014
  • “Big Privacy: Rising to the challenge of Big Data”, Computers, Freedom, and Privacy Conference 2014, Washington DC, 9 June 2014
  • “Applying Information Privacy Norms to Re-Identification”, Harvard Law School Symposium on Law, Ethics & Science of Re-Identification Demonstrations, May 2013
  • “Legal Limits to Data Re-Identification”, Science Magazine, 8 February 2013
  • “A bigger threat to patient privacy when doctors use Facebook”, Journal of Medical Ethics, 20 Dec 2010.

He has also made numerous submissions on privacy to government inquiries into:

  • the Privacy Act
  • the ill-conceived Human Services Access Card, and
  • spyware.

Success stories

We have a sixth sense for how technologists think and work, which allows us to root out sometimes surprising privacy problems.

For example, for a state government client we discovered a systemic over-disclosure of PII by intuiting how designers were going about transitioning third parties from fax and paper to electronic interfaces. The client routinely provides extracts of citizen records to a hundred or so authorised third parties. Different statutes govern different disclosures, restricting PII disclosures on a need-to-know basis.

We knew that various third parties were moving from fax to electronic interfaces. When we saw that the upgrades were occurring over time and opportunistically, we judged it likely that support engineers would re-use existing FTP patterns for successive upgrades, instead of writing new interfaces on spec for each case. This re-use would broaden the PII extracts beyond what was really needed in each instance; in fact it turned out that essentially whole records were being exported. Here, reasonable engineering practice had the surprising side effect of compromising privacy; without our empathy for ICT work practices, no regular privacy compliance review could have foreseen the problem.
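The remedy in cases like this is straightforward once the problem is seen: filter each extract down to the recipient’s need-to-know fields before export, rather than re-using a whole-record feed. A minimal sketch of the idea (the recipient names and fields below are hypothetical, not from the engagement described):

```python
# Hypothetical need-to-know matrix: which fields each authorised third
# party may lawfully receive. Invented for illustration only.
NEED_TO_KNOW = {
    "agency_a": {"name", "date_of_birth"},
    "agency_b": {"name"},
}

def extract_for(recipient: str, record: dict) -> dict:
    """Return only the fields this recipient is authorised to receive.
    Unknown recipients get nothing rather than everything."""
    allowed = NEED_TO_KNOW.get(recipient, set())
    return {k: v for k, v in record.items() if k in allowed}

full = {"name": "Ana", "date_of_birth": "1990-01-01", "address": "1 Main St"}
```

The design choice that matters is the default: an unlisted recipient receives an empty extract, so re-using the export pathway can never silently broaden disclosure.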

Privacy is so much more than security, or secrecy, or confidentiality

While most business people appreciate that good privacy compliance requires good information security, many organisations still struggle to identify tangible privacy controls; i.e. practical IT and e-security design features that pro-actively protect privacy and improve the organisation’s privacy posture. As with security, privacy is now subject to special technical governance requirements, but there is a grave risk that an overly compliance-oriented approach can be expensive and ineffective. Businesses must be careful they can still ‘see the wood for the trees’.

Bridging a gap

Most privacy advisers come from a legal and/or policy background, and look at privacy through the lens of compliance, or public policy. That’s perfectly fine of course, yet the compliance perspective can fall short of engaging IT projects in their formative stages. To build privacy in, you need to understand in detail what privacy means for informatics requirements, architecture, software design, and security.

IT professionals sometimes underestimate privacy because they have long been told that “privacy is not a technology issue”. They can presume that if they have security covered, then privacy will follow, and in any case, it seems to be someone else’s responsibility! But many times we’ve seen this viewpoint morph into complacency, which in turn leads to privacy vulnerabilities in information systems that aren’t then detected until it’s too late.

Some examples help to illustrate the gap:

  1. Security is not the same thing as privacy. Consider two highly secure organisations A and B, and suppose that A wishes to share data about its customers with B. Let’s assume A and B are both secure to the highest standards. Then what’s the problem? Simply, it doesn’t matter how secure B is; under the law, personal information about A’s customers cannot generally be transferred to B without those customers being informed, and without limits being placed on what B can do with it. Disclosure and Secondary Usage are not automatically OK just because the receiver is “secure”.
  2. Equally, secrecy is not the same thing as privacy. Too often the mistake is made that personal information found in the public domain can be exploited without the individuals’ knowledge (this was the fundamental error made by Google in their StreetView wifi misadventure). But information privacy law doesn’t much care where personal information comes from; the law doesn’t even use the terms ‘public’ and ‘private’. Instead the test is simply whether the information in question is personally identifiable. If it is identifiable, then there are limits on what an organisation can do with it, no matter how it was obtained.
  3. Software designers can think that collecting personal information involves explicit forms or customer interviews, while the act of copying PI from another source, or generating new information from previously collected data can be presumed to escape privacy law. Unfortunately these are misconceptions. The Privacy Act is blind to the manner of collection; no matter how PI comes to be in your information systems, you may be deemed to have collected it. And that includes processes whereby PI is generated from e.g. “Big Data” stores. When Sensitive PI is involved, special care is required, as discussed at this recent blog post.

Lockstep Consulting bridges the gap between ‘technology’ and the ‘business’.

“Privacy Engineering”

In addition to trust & privacy strategy development and Privacy Impact Assessments (PIAs), a truly unique offering of Lockstep’s is what we call Privacy Engineering: a privacy-by-design approach that generates tailored, practical guidance for ICT architects, designers and project managers so they can build privacy controls into their systems. We work closely with our clients to fine-tune local design practices, building privacy controls in (as opposed to hoping to ‘audit’ them in). Privacy Engineering protects customer relations, pro-actively uncovers privacy problems, saves money by solving problems sooner, and enhances compliance. Special focus areas include audit logs and transaction histories, web forms, change management processes, and databases.
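In the audit-log focus area, for instance, one common control is to record events without writing raw identifiers into the log. The sketch below is illustrative only (the keyed-hash scheme, key handling and field names are assumptions for the example, not a prescribed Lockstep design):

```python
import hashlib
import hmac

# Illustrative only: pseudonymise identifiers before they reach the audit
# log, using a keyed hash so the log alone cannot re-identify the person.
LOG_KEY = b"rotate-me"   # hypothetical secret, managed outside the log system

def pseudonym(user_id: str) -> str:
    """Stable keyed pseudonym: same input yields the same token, but the
    raw identifier never appears in the log."""
    return hmac.new(LOG_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def audit_event(user_id: str, action: str) -> dict:
    """An audit record that carries a pseudonym, not the raw identifier."""
    return {"subject": pseudonym(user_id), "action": action}

event = audit_event("ana@example.com", "record_viewed")
```

Because the pseudonym is stable, the log still supports per-user audit trails; re-identification requires the separately held key, which is the privacy control.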

A little more detail on the approach is given in Babystep 14.

More sophisticated PIAs

Lockstep’s PIAs are more technologically sophisticated than most. We focus on discovering issues and identifying privacy controls in a timely manner, as part of systems design. We have the experience and orientation to work with architects and developers, and to even be embedded in dev teams at the crucial stages of a project. Many PIAs, even if they manage to uncover significant technical issues, are conducted too late in the development life-cycle to make a real difference.

Lockstep keeps a watching brief over ongoing developments in PIA practices. Over time we have developed a flexible, generalised PIA template in line with international best practice, which we customise for each engagement. In particular, our approach adapts readily to different privacy regimes. An indicative table of contents is shown below.

  • Introduction and overview
  • Description of the project/system
  • Information flows
  • Privacy analysis

Each system has its own characteristic privacy issues. To underpin realistic recommendations, in this section we document the major issues. Subsections are varied by agreement with the client. We find that typically, the most prominent issues are Collection, Disclosure and Openness.

  • Privacy risk assessment
  • Privacy enhancing responses (if applicable)

Depending on where a client’s project is in the development lifecycle, it can be appropriate to provide substantive responses to the privacy analysis, in the form of design suggestions or process improvements.

  • Compliance with Privacy Principles

Here we capture in detail the degree to which compliance with applicable privacy principles (as agreed with the client depending on their regulatory environment) is impacted by features and functions of the system.

  • Collection
  • Use and Disclosure
  • Data Quality
  • Data Security
  • Openness
  • Access and Correction
  • Unique Identifiers
  • Anonymity
  • Trans-border Data Flows
  • Sensitive Information
  • Recommendations
Steve Wilson privacy profile v8 7 Feb 2017