Victoria Hordern

Google DeepMind Health under the GDPR Lens


On 3 July 2017, the Information Commissioner’s Office (ICO) ruled that the Royal Free NHS Foundation Trust (the Trust) had failed to comply with the Data Protection Act 1998 (DPA) when it provided the details of 1.6 million patients to Google DeepMind as part of the trial of a diagnosis and detection system for acute kidney injury, and required the Trust to sign an undertaking. The investigation brings together some of the most potent and controversial issues in data privacy today: sensitive health information and its use by the public sector to develop solutions, combined with innovative technology driven by a sophisticated global digital company. This analysis provides insight into the investigation into Google DeepMind, with a focus on how the General Data Protection Regulation (Regulation (EU) 2016/679) (GDPR) may affect the use of patient data going forward.

All About Testing

The area of focus in the ICO’s investigation was the clinical testing phase for Streams, the app that Google DeepMind was developing with the Trust to improve diagnosis of acute kidney injury. The development phase had already taken place, but it had used synthetic (dummy) data and so did not raise any data privacy issues. From the perspective of the Trust, and its obligations under the Health and Social Care Act 2012, any new technology must properly undergo clinical safety testing before being deployed. In order to properly test the clinical safety of a new technology, the Trust argued that personal data of patients had to be provided. Significantly, neither the current DPA nor the GDPR specifically refers to the way personal data can be used in a ‘testing phase’ before the benefit of a technology (whether for controller or individual or both) is provided. Importantly, however, the ICO did not rule in its covering letter to the Trust that testing activity cannot use personal data; it indicated instead that the issue required further exploration and should be considered more fully in a Data Protection Impact Assessment (DPIA).
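To illustrate why a development phase on synthetic data raises no data privacy issues, a minimal sketch follows of how dummy patient records might be fabricated. It is a hypothetical Python example only: the field names and value ranges are invented for illustration and bear no relation to the actual data used in developing Streams.

    import random
    import uuid
    from datetime import date, timedelta

    def generate_dummy_patient():
        """Fabricate one synthetic patient record. Every value is random,
        so no real individual is represented and no personal data is
        processed."""
        return {
            "patient_id": str(uuid.uuid4()),  # random identifier, not an NHS number
            "date_of_birth": date(1930, 1, 1) + timedelta(days=random.randint(0, 25000)),
            # Serum creatinine is a marker used in acute kidney injury
            # detection; values here are drawn from a plausible range
            # purely to exercise the software under test.
            "serum_creatinine_umol_l": round(random.uniform(45.0, 300.0), 1),
        }

    # A synthetic cohort large enough to exercise the detection logic.
    synthetic_cohort = [generate_dummy_patient() for _ in range(1000)]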

Lawful Grounds

Assuming that the use of patient personal data for testing new technology is not entirely off the table, what lawful grounds would be available to the Trust under the GDPR? Under Schedule 2 of the DPA, the legitimate interests ground can be relied upon by public authorities such as the Trust, so long as they can demonstrate that their legitimate interests are not overridden by the prejudice to the privacy rights of the patients. Under Article 6 of the GDPR, however, the Trust has bigger problems, since the GDPR specifically states that the legitimate interests ground is not available to public authorities in the performance of their tasks; it is for the legislator to provide by law the legal basis for public authorities to process personal data. Although nothing in the GDPR prevents a public authority from obtaining consent from individuals to data processing, the requirement that consent must be freely given means that it cannot be relied upon where there is a clear imbalance between the individual and the controller, which is in particular the case where the controller is a public authority. Consequently, the Trust would have to be able to prove that patient consent is freely given in order to overcome the implication in the GDPR that consents given to a public authority are unlikely to be freely given.
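The Article 6 position described above can be reduced to a simple decision model. The sketch below is a hypothetical illustration only (the function name and inputs are invented for this example) and is no substitute for a fact-specific legal analysis.

    def available_article_6_grounds(is_public_authority,
                                    performing_public_task,
                                    basis_in_eu_or_member_state_law,
                                    consent_demonstrably_freely_given):
        """Return the Article 6 grounds plausibly open to a body such as
        the Trust, on the analysis set out above."""
        grounds = []
        # Art. 6(1)(f): legitimate interests is expressly unavailable to
        # public authorities acting in the performance of their tasks.
        if not (is_public_authority and performing_public_task):
            grounds.append("legitimate interests (Art. 6(1)(f))")
        # Art. 6(1)(c)/(e): legal obligation and public task both require
        # a basis laid down in EU or Member State law.
        if basis_in_eu_or_member_state_law:
            grounds.append("legal obligation / public task (Art. 6(1)(c)/(e))")
        # Art. 6(1)(a): consent survives only if it is freely given despite
        # the imbalance between patient and public authority.
        if consent_demonstrably_freely_given:
            grounds.append("consent (Art. 6(1)(a))")
        return grounds

    # Absent a specific legal basis or demonstrably free consent, the
    # Trust is left with no ground at all:
    print(available_article_6_grounds(True, True, False, False))  # []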

So where does this leave the Trust?

Effectively, the processing must be specifically laid down in EU or Member State law to be lawful under Article 6 of the GDPR. This seems a tough conclusion, but it is backed up by the grounds for processing special categories of data (under Article 9(2)(h), (i) and (j) of the GDPR) which, although providing a broader description of acceptable purposes (preventive or occupational medicine, provision of health treatment, reasons of public interest in the area of public health, and scientific research purposes), still indicate that the processing must be necessary on the basis of EU or Member State law.

The one exception to this requirement appears to be processing for the purposes of preventive or occupational medicine, for the assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment, or the management of health or social care systems and services. Such processing can take place pursuant to a contract with a health professional (and not specifically under EU or Member State law), so long as the health professional is subject to obligations of professional secrecy under EU or Member State law, or the processing is by another person also subject to an obligation of secrecy under EU or Member State law.

This means that the Trust could meet Article 9 of the GDPR so long as it can demonstrate that the use of patient personal data in the testing phase of Streams was necessary for the purpose of preventive medicine (i.e. to prevent morbidity) or medical diagnosis (i.e. to diagnose kidney problems) and that the data was processed only on the basis of contracts with health professionals or other persons subject to obligations of secrecy under EU or Member State law.
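Put schematically, the two cumulative conditions described above might be modelled as follows (again a hypothetical sketch; the parameter names are invented for this illustration):

    def article_9_2_h_satisfied(necessary_for_preventive_medicine,
                                necessary_for_medical_diagnosis,
                                under_contract_with_health_professional,
                                handler_bound_by_secrecy_under_law):
        """Both limbs must be met: an acceptable health purpose AND
        processing by, or under contract with, someone bound by
        professional secrecy under EU or Member State law."""
        acceptable_purpose = (necessary_for_preventive_medicine
                              or necessary_for_medical_diagnosis)
        secrecy_condition = (under_contract_with_health_professional
                             or handler_bound_by_secrecy_under_law)
        return acceptable_purpose and secrecy_condition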

While it is not unreasonable to argue that those in the Trust who can access the data from the testing for Streams are likely to be either healthcare professionals or employees subject to equivalent obligations of secrecy, would this ground also cover Google DeepMind personnel?

It seems unlikely to. If Google DeepMind is in reality a controller (as some commentators have argued it is in the live phase, though probably not in the testing phase), it seems quite a stretch to argue that this ground applies, and the conundrum of the lawful ground becomes difficult to resolve. If, on the other hand, Google DeepMind is a processor (as the contracts between the Trust and Google DeepMind firmly state and the ICO accepted), then the requirement under Article 9 would arguably not apply to Google DeepMind’s access at all. This is because the act of disclosing personal data to a processor does not in itself give rise to a separate obligation on a controller to rely on a lawful ground.

In any event, if the Trust cannot fully rely on there being appropriate contracts in place with healthcare professionals or proper secrecy obligations, it is faced with either relying on explicit consent (and demonstrating that the consents given are wholly free) or relying on EU or Member State law that explicitly gives public health authorities the ability to use personal data for testing new technology (which may have significant public benefits) and to engage third-party technology companies to assist with the testing. It would be perverse if, amidst the enthusiasm for innovation in public healthcare and backing from the EU and European governments alike, the fundamentals of data protection law did not provide sufficient clarity and certainty for public authorities to explore new technologies that could deliver significant public benefits, ease the circumstances of those with health conditions and save lives. It is certainly to be hoped that the UK Government’s recent call for views on the necessary GDPR derogations will produce a clear way forward for public authorities on the use of sensitive personal data in these types of circumstances.

Wider Data Protection Compliance

And yet all such forays must be balanced against the need for other fundamentals of data protection compliance – points that the ICO raised in its letter to the Trust and that the Trust subsequently agreed to in the signed undertaking. To comply with the GDPR, these types of projects must devote sufficient time and resources to the following (captured as a simple checklist sketch after the list):

  • Transparency – ensuring that all affected individuals understand the purpose of the data use and how it will and could impact them.

  • Fairness – ensuring that all such uses of the personal data are within the expectations of individuals.

  • Proportionality – considering the amount and type of personal data which needs to be processed at a testing phase (as opposed to a live phase) and being able to justify the decision taken.

  • Individuals’ control of their personal information – giving individuals the right not to be part of the project from the outset and additionally giving participants further information about their rights if they choose to become part of the project.

  • Data security – putting appropriate contracts in place that comply with Article 28 of the GDPR with any processor(s) as well as suitable internal security measures.

  • DPIA – carrying out a comprehensive and balanced DPIA before the roll-out of the project, so that the Trust can demonstrate to the regulator, should it become necessary, that it has a written assessment taking account of all the issues, risks and mitigating steps before any actual processing of personal data takes place.
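Taken together, these points amount to a pre-launch checklist. The sketch below is a minimal, hypothetical rendering of them in Python; the item names and wording are illustrative only and are not drawn from any ICO template.

    # Hypothetical pre-launch checklist distilled from the points above.
    GDPR_PROJECT_CHECKLIST = {
        "transparency": "Affected individuals understand the purpose of the data use and its impact",
        "fairness": "Uses of the personal data fall within individuals' expectations",
        "proportionality": "Amount and type of data processed at the testing phase is justified",
        "individual_control": "Opt-out available from the outset; participants informed of their rights",
        "data_security": "Article 28 contracts with processors and internal security measures in place",
        "dpia": "Comprehensive, balanced DPIA written up before any processing begins",
    }

    def ready_to_process(items_satisfied):
        """Processing should not start until every item is satisfied."""
        return set(items_satisfied) >= GDPR_PROJECT_CHECKLIST.keys()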

While not specifically mentioned by the ICO, any parties embarking on this kind of public health big data project should always seriously consider consulting the relevant data protection authority (DPA). Although the GDPR abolishes the general requirement to notify processing to DPAs, there is still a requirement to consult a DPA where a DPIA indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk.

Indeed, even if a public health authority considers it is not technically required to consult the DPA, it could still be a prudent step, especially if private sector partners are involved in the processing, in order to demonstrate transparency and good faith should the matter be publicly criticised at a later stage. The Trust would furthermore need to be able to demonstrate accountability (through policies, procedures, training and audits) concerning how the patient personal data will be used in the context of testing (and later live use), as well as ensuring privacy by design is embedded in the technology from the beginning.
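The consultation logic of the last two paragraphs can be summarised in one small function (a hypothetical sketch, not regulatory guidance; the names are invented for this example):

    def dpa_consultation_position(dpia_shows_high_residual_risk,
                                  private_sector_partner_involved):
        """Distinguish mandatory prior consultation (Article 36 GDPR)
        from the merely prudent kind discussed above."""
        if dpia_shows_high_residual_risk:
            return "prior consultation with the DPA is mandatory (Art. 36)"
        if private_sector_partner_involved:
            return "consultation is prudent, to demonstrate transparency and good faith"
        return "no consultation required"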

The Role of Governments and Regulators

From what is publicly available, it appears that the Royal Free and Google DeepMind data use would need to meet additional compliance steps to satisfy GDPR requirements. This is not to say that all similar types of data use would be prohibited under the GDPR. But there is an obligation on governments and regulators to provide public health authorities with appropriate guidance and clarity so that they understand how they can use patient personal data lawfully under the GDPR in testing and live environments. This is essential if public health authorities are to use technology efficiently for the public good and in a way that preserves patient privacy. In this regard, in its covering letter to the Trust, the ICO recognised the benefits that can be achieved by using patient data for the wider public good and, where appropriate, supported the development of innovative technological solutions that help improve clinical care; however, such schemes must comply with data protection law.

This article was originally published in Data Protection Leader in August 2017.
