UK Officials Say Google DeepMind Failed to Follow Patient Data Rules

The UK's Information Commissioner's Office faults a London hospital trust and Google's DeepMind unit over how they handled patient data privacy in a clinical safety initiative.


The United Kingdom's Information Commissioner's Office (ICO) on Monday ruled that Google's DeepMind division failed to comply with data privacy laws when handling personal health data belonging to some 1.6 million patients.

The Royal Free London NHS Foundation Trust, a group of three hospitals, provided the data to Google as part of a clinical safety testing initiative connected to a new application for detecting, diagnosing and preventing acute kidney injury.

The agreement between the Royal Free and Google called for DeepMind to process partial patient records containing personally identifiable information belonging to the Royal Free's patients.

The work resulted in a mobile application known as Streams, which clinicians at the Royal Free began actively using earlier this year.

The ICO found that neither the Royal Free, as the controller of the data, nor Google DeepMind, as the data processor, followed UK data protection laws when handling the data.

For instance, DeepMind and Royal Free did not properly inform the 1.6 million patients that their personal health data was being used in a clinical safety initiative to develop a mobile application.

UK data privacy laws require organizations to have legitimate reasons for collecting and using personal data. Data controllers and processors must be clear about why they are collecting the data and how they plan to use it, the ICO said.

This is especially true when private data is used in a way that an individual might not reasonably expect. A patient who engaged with the Royal Free to receive emergency treatment or radiology services would not reasonably expect their data to be used to develop a mobile application, the ICO noted.

Yet a majority of the patients whose data was used did not receive proper notification of that use and were thus deprived of the opportunity to either give or withhold consent.

It is also not clear why Google DeepMind required, or was provided, records on as many as 1.6 million patients. Both organizations have offered high-level explanations but little evidence so far that so many partial patient records were needed for the stated testing purposes, according to ICO officials. "The Commissioner is not persuaded that it was necessary and proportionate to process 1.6 million partial patient records in order to test the clinical safety of the application," the ICO said.

The information sharing agreement between Royal Free and Google DeepMind also did not fully specify the measures that DeepMind had to take to protect the data it was processing or to minimize the amount of data that was accessible to it.

The ICO listed several other areas where the Royal Free and Google DeepMind failed to apply the required measures for protecting the privacy of sensitive patient data.

"There's no doubt the huge potential that creative use of data could have on patient care and clinical improvements," information commissioner Elizabeth Denham said in a prepared statement. "But the price of innovation does not need to be the erosion of fundamental privacy rights."

The ICO has stipulated several measures that the Royal Free, as the controller of the patient data, must implement to bring its data sharing agreement with Google DeepMind into compliance with UK data privacy laws.

Google did not respond immediately to a request for comment.

Jaikumar Vijayan

Vijayan is an award-winning independent journalist and tech content creation specialist covering data security and privacy, business intelligence, big data and data analytics.