NHS COVID-19 App - health vs privacy?

The launch of the NHS’s COVID-19 contact tracing app (“App”) is just around the corner, with a trial of the App currently underway on the Isle of Wight. The App is intended to play an important role in the contact tracing and testing approach proposed by the UK government, aimed at reducing transmission of the virus by notifying people who have come into contact with those who may be infected. A number of privacy concerns have been (and continue to be) raised about the App and the way it collects and uses personal data – are these justifiable concerns, or should privacy come second best as we attempt to ease the lockdown rules?

A centralised approach

Much of the criticism of the App has been directed at its design choice. The centralised design of the App means that data, such as data identifying other devices that your phone has come into close proximity with, will be stored in a central database. In contrast, a number of other countries have opted for a de-centralised design, where the relevant data is stored locally on users’ phones and is only exchanged between users’ phones when alerts are made.

The de-centralised approach is also favoured by Apple and Google, although the European Data Protection Board’s (“EDPB”) recent guidelines consider both approaches to be viable “provided adequate security measures are in place”.

The justification given by the National Cyber Security Centre (which has been supporting the development of the App) for using a centralised model is that it gives the public health authority data to help it understand how the disease appears to be spreading. 
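
To illustrate the distinction in broad terms, below is a minimal sketch of the two models. It is a hypothetical simplification for explanation only: the class and function names are our own assumptions and are not taken from the App or from the Apple/Google exposure notification framework. The point it shows is simply where the contact data lives and where the matching happens.

```python
# Minimal sketch, for illustration only: contrasting where contact data lives
# and where matching happens in a centralised vs a de-centralised design.
# All names here are hypothetical and are not taken from the NHS COVID-19 App
# or the Apple/Google exposure notification framework.

import uuid
from dataclasses import dataclass, field


@dataclass
class ProximityEvent:
    observer_id: str   # rotating identifier of the phone that made the observation
    observed_id: str   # rotating identifier received from the nearby phone


@dataclass
class CentralisedModel:
    """Events are uploaded to a central database; the server decides who to alert."""
    server_store: list = field(default_factory=list)

    def report_contact(self, event: ProximityEvent) -> None:
        self.server_store.append(event)  # the health authority can analyse this data

    def contacts_to_alert(self, infected_id: str) -> list:
        # Matching is done centrally, against everyone's uploaded events.
        return [e.observer_id for e in self.server_store if e.observed_id == infected_id]


@dataclass
class DecentralisedModel:
    """Events stay on the phone; only infected users' identifiers are published."""
    local_store: list = field(default_factory=list)

    def record_contact(self, event: ProximityEvent) -> None:
        self.local_store.append(event)  # nothing leaves the device at this point

    def exposed(self, published_infected_ids: set) -> bool:
        # Matching is done on the device, against a downloaded list of identifiers.
        return any(e.observed_id in published_infected_ids for e in self.local_store)


if __name__ == "__main__":
    alice, bob = str(uuid.uuid4()), str(uuid.uuid4())

    central = CentralisedModel()
    central.report_contact(ProximityEvent(observer_id=alice, observed_id=bob))
    print(central.contacts_to_alert(bob))   # the server works out that alice was exposed

    phone = DecentralisedModel()
    phone.record_contact(ProximityEvent(observer_id=alice, observed_id=bob))
    print(phone.exposed({bob}))             # alice's phone works it out locally
```

The trade-off described above falls out of this structure: the centralised model gives the health authority a view of the contact data for analysis, while the de-centralised model keeps that data on individual devices.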

Concerns from Isle of Wight trial DPIA​

The data protection impact assessment (DPIA) carried out by NHSX in relation to the NHS COVID-19 App on the Isle of Wight is currently being reviewed by the ICO, and it appears at first glance that there are a number of potential privacy issues with the App that would benefit from greater clarity, including:

  • Pseudonymous data: the word “anonymous” has been used by the government in relation to the data in public statements, and indeed within the DPIA itself. However, the DPIA confirms that the data is in fact pseudonymised personal data, which means that individuals could, at least in theory, be re-identified from the data if the correct identifier is used (a simplified illustration of this distinction follows the list below).
     
  • Systematic monitoring: the DPIA claims there is no systematic monitoring of publicly accessible areas carried out by the App on a large scale, but it is difficult to see how this is the case given that each downloaded instance of the App effectively monitors every other instance of the App in the vicinity.
     
  • Data subjects’ rights (e.g. subject access requests and right to erasure): the DPIA states that the App will “have a process enabling users to exercise data subject rights” but the DPIA later questions the technical practicality of its proposed method for subject access requests. Additionally, once data is held on government servers it appears that individuals cannot request that it be deleted. The DPIA refers to relying on Article 11 of the GDPR in order to avoid having to comply with data subjects’ rights, although it is not entirely clear that this would apply.
     
  • Risks redacted: the identified risks of the processing have been redacted in the DPIA. Whilst some level of detail on the risks may understandably be held back for security reasons, general identification of certain risks is likely to be seen as a necessary part of understanding the implications of the App. In particular, there have been concerns about re-identification of personal data and exploitation of the data, including by hackers, and we would expect users of the App to need to be made aware of this if these risks remain.
     
  • Data retention: the exact retention period for the data has not been set, nor have any criteria been given for setting one.
     
  • Third parties’ use of data: the DPIA states “there may be requests to process data from the App for research purposes, which may be linked with identifiable data”. This may leave the door open for sharing data with third parties for research purposes.
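
To make the first point above concrete, here is a minimal sketch of the difference between pseudonymised and anonymous data, assuming a simple keyed-hash scheme. The key, identifiers and function names are hypothetical and are not drawn from the DPIA or the App’s actual design; the point is only that whoever holds the key (or the relevant identifier) can re-link the data to an individual, which is why it remains personal data rather than anonymous data.

```python
# Minimal sketch, assuming a simple keyed-hash pseudonymisation scheme.
# The key, identifiers and data below are hypothetical examples only.

import hashlib
import hmac
from typing import Optional

SECRET_KEY = b"held-by-the-data-controller"   # hypothetical re-identification key


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()


def re_identify(pseudonym: str, candidates: list) -> Optional[str]:
    """Whoever holds the key can link a pseudonym back to an individual."""
    for candidate in candidates:
        if hmac.compare_digest(pseudonymise(candidate), pseudonym):
            return candidate
    return None


if __name__ == "__main__":
    stored = pseudonymise("943 476 5919")          # what a central store might hold
    print(re_identify(stored, ["943 476 5919"]))   # the key holder can re-identify
```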

 

Comment

From a data protection perspective at least, it is fair to say that any contact tracing application is likely to need to address privacy concerns, although doing so in a way that still gives appropriate weight to wider public health benefits (and encourages the majority of the population to download and use the App) may be a challenge. The centralised design of the App potentially creates issues in and of itself, and these are borne out in the detail of the DPIA. Clearly there ought to be benefits to enabling centralised data analysis, but if privacy laws are to be observed (which the Information Commissioner has said they should be) then, amongst other privacy requirements, we would expect users of the App to be made aware of the risks of using it and the impact this may have on their rights. How this looks in practice remains to be seen, and perhaps publication of key risks in the next UK-wide DPIA for the App will assist.
