Lockstep

Award winning blockchain paper at HIMSSAP17

David Chou, CIO at Children’s Mercy Hospital Kansas City, and I wrote a paper “How Healthy is Blockchain Technology?” for the HIMSS Asia Pacific 17 conference in Singapore last week. The paper is a critical analysis of the strategic potential for current blockchains in healthcare applications, with a pretty clear conclusion that the technology is largely misunderstood, and on close inspection, not yet a good fit for e-health.

And we were awarded Best Paper at the conference!

The paper will be available soon from the conference website. The abstract and conclusions are below, and if you’d like a copy of the full paper in the meantime, please reach out to me at Steve@ConstellationR.com.

Abstract

Blockchain captured the imagination with a basket of compelling and topical security promises. Many of its properties – decentralization, security and the oft-claimed “trust” – are highly prized in healthcare, and as a result, interest in this technology is building in the sector. But on close inspection, first generation blockchain technology is not a solid fit for e-health. Born out of the anti-establishment cryptocurrency movement, public blockchains remove ‘people’ and ‘process’ from certain types of transactions, but their properties degrade or become questionable in regulated settings where people and process are realities. Having inspired a new wave of innovation, blockchain technology needs significant work before it addresses the broad needs of the health sector. This paper recaps what blockchain was for, what it does, and how it is evolving to suit non-payments use cases. We critically review a number of recent blockchain healthcare proposals, selected by a US Department of Health and Human Services innovation competition, and dissect the problems they are trying to solve.

Discussion

When considering whether first generation blockchain algorithms have a place in e-health, we should bear in mind what they were designed for and why. Bitcoin and Ethereum are intrinsically political and libertarian; their outright rejection of central authority is a luxury only possible in the rarefied world of cryptocurrency but is simply not rational in real world healthcare, where accountability, credentialing and oversight are essentials.

Despite blockchain’s ability to transact and protect pure “math-based money”, it is a mistake to think public blockchains create trust, much less that they might disrupt existing trust relationships and authority structures in healthcare. Blockchain was designed on the assumption that participants in a digital currency would not trust each other, nor want to know anything about each other (except for a wallet address). On its own, blockchain does not support any other real world data management.

The newer Synchronous Ledger Technologies – including R3 Corda, Microsoft’s Blockchain as a Service, Hyperledger Fabric and IBM’s High Security Blockchain Network – are driven by deep analysis of the strengths and weaknesses of blockchain, and then re-engineering architectures to deliver similar benefits in use cases more complex and more nuanced than lawless e-cash. The newer applications involve orchestration of data streams being contributed by multiple parties (often in “coopetition”) with no one leader or umpire. Like the original blockchain, these ledgers are much more than storage media; their main benefit is that they create agreement about certain states of the data. In healthcare, this consensus might be around the order of events in a clinical trial, the consent granted by patients to various data users, or the legitimacy of serial numbers in the pharmaceuticals supply chain.

Conclusion

We hope healthcare architects, strategic planners and CISOs will carefully evaluate how blockchain technologies across what is now a spectrum of solutions apply in their organizations, and understand the work entailed to bring solutions into production.

Blockchain is no silver bullet for the challenges in e-health. We find that current blockchain solutions will not dramatically change the way patient information is stored, because most people agree that personal information does not belong on blockchains. And it won’t dispel the semantic interoperability problems of e-health systems; these are outside the scope of what blockchain was designed to do.

However newer blockchain-inspired Synchronous Ledger Technologies show great potential to address nuanced security requirements in complex networks of cooperating/competing actors. The excitement around the first blockchain has been inspirational, and is giving way to earnest sector-specific R&D with benefits yet to come.

Blending security and privacy

An extract from my chapter “Blending the practices of Privacy and Information Security to navigate Contemporary Data Protection Challenges” in the new book “Trans-Atlantic Data Privacy Relations as a Challenge for Democracy”, Kloza & Svantesson (editors), Intersentia, 2017.

The relationship between privacy regulators and technologists can seem increasingly fraught. A string of adverse (and sometimes counter-intuitive) privacy findings against digital businesses – including the “Right to be Forgotten”, and bans on biometric-powered photo tag suggestions – have left some wondering if privacy and IT are fundamentally at odds. Technologists may be confused by these regulatory developments, and as a result, uncertain about their professional role in privacy management.

Several efforts are underway to improve technologists’ contribution to privacy. Most prominent is the “Privacy by Design” movement (PbD), while a newer discipline of ‘privacy engineering’ is also striving to emerge. A wide gap still separates the worlds of data privacy regulation and systems design. Privacy is still not often framed in a way that engineers can relate to. Instead, PbD’s pat generalisations overlook essential differences between security and privacy, and at the same time, fail to pick up on the substantive common ground, like the ‘Need to Know’ and the principle of Least Privilege.

There appears to be a systematic shortfall in the understanding that technologists and engineers collectively have of information privacy. IT professionals routinely receive privacy training now, yet time and time again, technologists seem to misinterpret basic privacy principles, for example by exploiting personal information found in the ‘public domain’ as if data privacy principles do not apply there, or by creating personal information through Big Data processes, evidently with little or no restraint.

See also ‘Google's wifi misadventure, and the gulf between IT and Privacy’, and ‘What stops Target telling you're pregnant?’.

The difficulty of engaging technologists in privacy is exacerbated by the many mixed messages which circulate about privacy, its relative importance, and purported social trends towards promiscuity or what journalist Jeff Jarvis calls ‘publicness’. For decades, mass media headlines have regularly announced the death of privacy. When US legal scholars Samuel Warren and Louis Brandeis developed some of the world’s first privacy jurisprudence in the 1880s, the social fabric was under threat from the new technologies of photography and the telegraph. In time, computers became the big concern. The cover of Newsweek magazine on 27 July 1970 featured a cartoon couple cowering before mainframe computers and communications technology, under the urgent upper case headline, ‘IS PRIVACY DEAD?’. Of course it’s a rhetorical question. And after a hundred years, the answer is still no.

In my new paper published as a chapter of the book “Trans-Atlantic Data Privacy Relations as a Challenge for Democracy”, I review how engineers tend collectively to regard privacy and explore how to make privacy more accessible to technologists. As a result, difficult privacy territory like social networking and Big Data may become clearer to non-lawyers, and the transatlantic compliance challenges might yield to data protection designs that are more fundamentally compatible across the digital ethos of Silicon Valley and the privacy activism of Europe.

Privacy is contentious today. There are legitimate debates about whether the information age has brought real changes to privacy norms or not. Regardless, with so much personal information leaking through breaches, accidents, or digital business practices, it’s often said that ‘the genie is out of the bottle’, meaning privacy has become hopeless. Yet in Europe and many jurisdictions, privacy rights attach to Personal Information no matter where it comes from. The threshold for data being counted as Personal Information (or equivalently in the US, ‘Personally Identifiable Information’) is low: any data about a person whose identity is readily apparent constitutes Personal Information in most places, regardless of where or how it originated, and without any reference to who might be said to ‘own’ the data. This is not obvious to engineers without legal training, who have formed a more casual understanding of what ‘private’ means. So it may strike them as paradoxical that the terms ‘public’ and ‘private’ don’t even figure in laws like Australia’s Privacy Act.

Probably the most distracting message for engineers is the well-intended suggestion ‘Privacy is not a Technology Issue’. In 2000, IBM chair Lou Gerstner was one of the first high-profile technologists to isolate privacy as a policy issue. The same trope (that such-and-such ‘is not a technology issue’) is widespread in online discourse. It usually means that multiple disciplines must be brought to bear on certain complex outcomes, such as safety, security or privacy. Unfortunately, engineers can take it to mean that privacy is covered by other departments, such as legal, and has nothing to do with technology at all.

In fact all of our traditional privacy principles are impacted by system design decisions and practices, and are therefore apt for engagement by information technologists. For instance, IT professionals are liable to think of ‘collection’ as a direct activity that solicits Personal Information, whereas under technology neutral privacy principles, indirect collection of identifiable audit logs or database backups should also count.

The most damaging thing that technologists hear about privacy could be the cynical idea that ‘Technology outpaces the Law’. While we should not underestimate how cyberspace will affect society and its many laws born in earlier ages, in practical day-to-day terms it is the law that challenges technology, not the other way round. The claim that the law cannot keep up with technology is often a rhetorical device used to embolden developers and entrepreneurs. New technologies can make it easier to break old laws, but the legal principles in most cases still stand. If privacy is the fundamental ‘right to be let alone’, then there is nothing intrinsic to technology that supersedes that right. It turns out that technology-neutral privacy laws framed over 30 years ago are powerful against very modern trespasses, like wi-fi snooping by Google and over-zealous use of biometrics by Facebook. So technology in general might only outpace policing.

We tend to sugar-coat privacy. Advocates try to reassure harried managers that ‘privacy is good for business’ but the same sort of naïve slogan only undermined the quality movement in the 1990s. In truth, what’s good for business is peculiar to each business. It is plainly the case that some businesses thrive without paying much attention to privacy, or even by mocking it.

Let’s not shrink from the reality that privacy creates tensions with other objectives of complex information systems. Engineering is all about resolving competing requirements. If we’re serious about ‘Privacy by Design’ and ‘Privacy Engineering’, we need to acknowledge the inherent tensions, and equip designers with the tools and the understanding to optimise privacy alongside all the other complexities of modern information systems.

A better appreciation of the nature of Personal Information and of technology-neutral data privacy rules should help to demystify European privacy rulings on matters such as facial recognition and the Right to be Forgotten. The treatment of privacy can then be lifted from a defensive compliance exercise to a properly balanced discussion of what organisations are seeking to get out of the data they have at their disposal.