THE LONG READ: Data privacy and the Covid-19 app


Today, on Data Privacy Day, Professor Eerke Boiten of DMU's Cyber Technology Institute looks at the evolution of the Government's Covid-19 app and its approach to personal data


Many of the data protection professionals that I know are great creative thinkers. That will sound odd if you think of data protection as a dull compliance exercise, or as a lazy excuse for not having nice things.

But creativity is essential when you get to what I think is the essence of modern data protection, which is where people think deeply about the reasons for the proposed data processing: in the data protection impact assessment (DPIA).

The main thing to remember about DPIA is that it asks two core questions:

- should we be doing this? and
- what could possibly go wrong?

It is especially this last question that requires ingenuity. Will databases be created that are attractive to hackers or insider attackers, as was reported this week from the Netherlands with their contact tracing databases? Or will managers be tempted by “function-creep” thoughts starting “If we collect this data anyway, we could also …”? Can the data collected be combined with other data to cause bad privacy effects?

The law (GDPR and UK Data Protection Act 2018) says we should do a DPIA whenever the proposed data processing is “risky” – large scale surveillance or profiling, extensive use of sensitive data, new technologies, automated decision making, etcetera. The DPIA then concentrates on identifying and mitigating the risks to “rights and freedoms” that arise from the data processing.

A DPIA is both a process and a “living document”. It isn’t just part of the requirements-gathering phase, but needs to be updated during the design and implementation of any system, e.g. on the basis of specific technology choices made – and subsequently as the context of operation changes.

Data privacy and the Covid-19 app

Just for today, I choose not to believe my contacts who tell me that actual DPIAs languish at the bottoms of drawers. So today I will look at one of the best examples of a DPIA that I know, see how it has changed over time, and assess where we are now. Its subject is the NHS Covid-19 “contact tracing” app (version 2), whose DPIA was originally dated August 2020.

This app measures the proximity of other phones running the same app by measuring the strength of Bluetooth communication signals, and uses that and other information to decide whether people are at risk of having been infected with Covid-19. It also allows people to record when they attend particular venues, such as restaurants, by scanning a QR code.
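To make that mechanism concrete, here is a minimal sketch in Python of how a risk score might be computed from Bluetooth measurements. The attenuation buckets, weights and alert threshold below are illustrative assumptions, not the values the NHS app actually uses.

```python
# Illustrative sketch only: a simplified exposure risk score in the spirit of
# Bluetooth-based contact tracing. All thresholds and weights are hypothetical,
# not those of the NHS COVID-19 app.

def attenuation_weight(attenuation_db: float) -> float:
    """Map signal attenuation (transmit power minus received strength, in dB)
    to a proximity weight: lower attenuation roughly means closer contact."""
    if attenuation_db < 55:
        return 1.0   # likely close contact
    if attenuation_db < 70:
        return 0.5   # medium range
    return 0.0       # probably too far away to matter

def exposure_risk(encounters: list[tuple[float, float]]) -> float:
    """Sum weighted exposure minutes over a day's encounters,
    where each encounter is (attenuation_db, duration_minutes)."""
    return sum(attenuation_weight(att) * minutes for att, minutes in encounters)

# Two encounters in one day: a close 10-minute one and a distant 30-minute one.
day = [(50.0, 10.0), (75.0, 30.0)]
ALERT_THRESHOLD = 15.0  # hypothetical weighted-minutes threshold
print(exposure_risk(day), exposure_risk(day) >= ALERT_THRESHOLD)  # 10.0 False
```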

When I was part of a group of academics asking questions about the first version of the NHS Covid app and its (initially absent) DPIA in April 2020, we worried about the health benefits of such an application in relation to the surveillance risks, and the additional risks imposed by a centralised solution – i.e., one where the phones send large amounts of data to a national data centre that then does the Covid-19 infection risk assessment.
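The architectural difference is easy to sketch. In a decentralised design, the matching of contacts happens on the phone itself; only users who test positive upload anything, and even then only their own broadcast identifiers. The toy code below (with made-up identifiers, not the real cryptographic scheme) shows where the sensitive computation lives:

```python
# Minimal sketch of decentralised exposure matching (in the spirit of DP-3T /
# the Google-Apple framework). Identifiers are toy values, not real crypto.

def exposed_locally(heard: set[bytes], published_positive: set[bytes]) -> bool:
    """Matching happens on the device: the list of identifiers this phone has
    overheard never leaves it. In a centralised design, `heard` would instead
    be uploaded to a national data centre for server-side matching."""
    return not heard.isdisjoint(published_positive)

# Identifiers this phone has overheard via Bluetooth (stay on the device).
heard_ids = {b"id-alice", b"id-bob"}
# Identifiers voluntarily published by users who tested positive.
positive_ids = {b"id-bob", b"id-carol"}

print(exposed_locally(heard_ids, positive_ids))  # True: we met "id-bob"
```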

Our efforts, and those of others promoting decentralised solutions, were sometimes caricatured as privacy-absolutist – as putting privacy ahead of human lives.

This is a very shallow view: any such app aims to constrain people’s freedom of movement on the basis of inferences made about their health, so it is, and needs to be, privacy-invasive by its very nature. And that is actually fine, because the right to life trumps the right to privacy.

The DPIA was published after some delay, and was found to be problematic; eventually the centralised app v1 was abandoned, and the decentralised v2 app appeared with a much stronger DPIA immediately attached. That DPIA was updated in December 2020 to cover improvements in technology and slight extensions of functionality, such as getting isolation support payments. One big brownie point there. In fact, I don’t think I’ve ever seen a DPIA that is more thorough or comprehensive.

The discussion about the effectiveness of the app in April focused on two issues: take-up of the app (would it really need to reach the unrealistic 60% to be effective?) and the precision of Bluetooth signal measurements as a proxy for proximity. Rumours at the time were that the latter was both the main reason for NCSC encouraging NHSX to deviate from the international consensus (which was moving towards decentralised solutions), and the eventual reason for abandoning app v1 – the v2 DPIA indeed says it was due to a lack of “reliability”. But we may actually have been looking in the wrong place for the real efficacy problems.
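To see why that precision worry was plausible, consider the textbook log-distance path-loss model, under which received signal strength falls off logarithmically with distance. The sketch below uses an assumed one-metre reference power and path-loss exponent; in reality both vary with phone model, pockets, bodies and walls, so the same reading can correspond to quite different distances:

```python
# Why Bluetooth signal strength is a noisy proxy for distance.
# Log-distance path-loss model: RSSI ~ P_1m - 10 * n * log10(d), hence
# d = 10 ** ((P_1m - RSSI) / (10 * n)).
# P_1m (signal at one metre) and the exponent n are assumed values here.

def estimated_distance_m(rssi_dbm: float, p_1m_dbm: float = -60.0,
                         n: float = 2.0) -> float:
    """Invert the path-loss model to estimate distance in metres."""
    return 10 ** ((p_1m_dbm - rssi_dbm) / (10 * n))

# The same -75 dBm reading maps to different distances as the environment
# exponent n varies (and n is never really known in the field):
for n in (1.8, 2.0, 2.5):
    print(f"n={n}: {estimated_distance_m(-75.0, n=n):.1f} m")
# prints roughly 6.8 m, 5.6 m and 4.0 m for one and the same reading
```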

One question that needs to be answered in any DPIA is whether the data processing is “necessary and proportionate”. This DPIA says “The app is a key part of the country’s ongoing Covid-19 response, aiming to extend the speed, precision and reach of the existing NHS Test and Trace service.”

Unfortunately, we currently lack reported data on the claimed “world-beating” nature of this overall system, and in particular on the app’s role within it.

What anecdotal information does come through suggests that the app generates few productive proximity-tracing notifications and few productive venue check-in alerts; what information we have suggests the Test and Trace service produces very little effect in the “Trace” dimension.

Realistically, given the number of downloads (e.g. 5M+ on Android) and the time elapsed, by now we should have seen the app’s positive effect. Given the government’s boastful approach to COVID measures in general, absence of reported positive news must imply negative news.

So: sorry, folks! An exemplary DPIA. Careful design to minimise collection of data, much of which is barely personal data, if at all. A location check-in solution that is infinitely superior to grubby hand-written lists on the bar or to pub chains’ quick software hacks. But in the end, a system which does not produce anywhere near the hoped-for amounts of helpful data, and where there’s doubt whether such outputs are then used productively by the surrounding systems, cannot claim to be “necessary”. Will the next revision of the DPIA have the courage to draw this distressing conclusion?

About the author:

Prof Eerke Boiten is the Director of the Cyber Technology Institute at De Montfort University, recognised as an Academic Centre of Excellence in Cyber Security Research by EPSRC and the National Cyber Security Centre.

He engages with the privacy consultancy community in practical research on DPIA and supervises several PhD students looking at technological aspects of privacy. The topic also appears regularly in his teaching on DMU’s MSc in Cyber Technology.

He can be contacted at eerke.boiten@dmu.ac.uk and you can find him on Twitter @eerkeboiten.

Posted on Thursday 28 January 2021
