With the NHS under continuing pressure, innovation and technology are increasingly being used to try to aid the swift diagnosis and treatment of patients. Although an admirable aim in principle, the development of clinical support tools is practically impossible without using existing patient data.

But how can this data be used in a safe way that ensures patient information is protected while clinical tools remain fit for purpose? This is a problem encountered by the Royal Free London NHS Foundation Trust. In 2015 it entered into a partnership with DeepMind, a British technology company acquired by Google in 2014. Although the company’s main focus is artificial intelligence, it partnered with the Royal Free to develop a tool called “Streams”, designed to help diagnose Acute Kidney Injury (AKI) at the earliest possible opportunity. Up to 40,000 patients die each year from conditions associated with AKI, and AKI costs the NHS in England over £1 billion per year, so the demand for such a clinical aid is clear (The Royal Free).

The issue lies in access to patient data to help test clinical apps such as this, and in particular in patient consent for the data to be used in this way. The Royal Free provided approximately 1.6 million patient records as part of a trial to develop the app, and on 3rd July 2017 the Information Commissioner’s Office (ICO) found that the data had not been handled appropriately and that patients had not been sufficiently informed that their data would be used for such testing. The ICO’s investigation revealed that the Royal Free had breached data protection law by not obtaining explicit consent. When a patient consents to their information being shared for a particular use, this forms a legal basis for that information sharing. The ICO concluded that implied consent was not sufficient in this case and that explicit consent should have been obtained from the individuals, which would have ensured that the conditions for processing sensitive personal information in Schedules 2 and 3 of the Data Protection Act 1998 were satisfied.

Interestingly, the Trust has not been fined; instead it has been asked to sign an undertaking committing to changes that will ensure data is handled lawfully under the Data Protection Act, not only for the DeepMind project but also for potential future trials. Elizabeth Denham, the Information Commissioner, stated that “The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used”. This appears to be the smartest way to approach such projects. Innovation is an essential tool for improving health and social care outcomes in the future and is something to be encouraged. The fact that the ICO appreciates the need for innovation and the benefits it can bring the NHS is a major step forward, and sensibly balancing this need with fundamental privacy rights could pave the way for the NHS in the digital world.