Department for Digital, Culture, Media & Sport
8 December 2021
The PHG Foundation welcomes the opportunity to comment on this consultation on reforms to the UK data protection regime. We are a health policy think tank with a focus on genomics and innovative health technologies, and over the last five years we have conducted significant research on the interaction between data protection law and genomic data in healthcare and research.
In 2017 we published the report Identification and genomic data, and in 2020 we published a comprehensive report from our year-long, ICO-funded research on the impact of the GDPR on genomic data processing in healthcare and scientific research. In that research we conducted legal analysis and stakeholder interviews, and convened an expert meeting of specialists in genomic data, health, research and data protection to identify the key impacts of the GDPR on genomic data processing. Alongside this we have also carried out significant research on the development of AI tools in healthcare, including the extent to which the GDPR requires machine learning in healthcare and research to be transparent, interpretable, or explainable. These resources are freely available on our website. Drawing on this and subsequent research, we have responded to the consultation questions and uploaded our answers to the online platform. However, we would also like to make some supplementary general comments that cut across multiple aspects of the proposals and different sections of the consultation document.
The importance of a continued ‘adequacy decision’ for the UK
An overarching concern is that the proposals risk diverging sufficiently from the European Union's standards that the UK will be adjudged to offer a lower (and inadequate) level of protection for personal data. This would jeopardise free flows of data between the UK and the EU, which are crucial to scientific research, and genomic research in particular. The UK does not need to maintain an exact copy of the EU GDPR, but we are concerned that some of the proposals could be viewed as sufficiently divergent to affect adequacy. For example, the proposal to adopt a statutory test for anonymisation could lead to a view that the UK regime defines the scope of 'personal data' fundamentally differently from the EU. If that scope is narrower, the UK regime will de facto be viewed as offering lower protection. In our research we identified a range of challenges that EU/EEA collaborators faced in agreeing and authorising international transfers of data outside the EEA. These included differing views about whether data are 'personal data' at all, which in certain cases caused significant harm to international research collaborations. At present the UK is not suffering the same level of friction, but this is at stake if reforms are brought forward without due regard for their impact on adequacy.
Evolution not revolution
Allied to this challenge, we advocate a gradual process of adjustment to our data protection laws, based on broad and deep consultation with relevant sectors, to ensure that the proposals do not unnecessarily impede international flows of data and do not lead to an unwarranted lowering of the protection afforded to fundamental data protection and privacy rights. Given the breadth and length of this consultation, it is difficult to do justice to the novel proposals within it. We hope that this consultation is a starting point for continuing engagement about what is being proposed, and not the final opportunity for comment. Where we have not commented, this reflects not a lack of familiarity with the context or relevant law, but a lack of resources to give each aspect the consideration it is due.
Improving data protection for research
We strongly welcome the focus of this consultation on the impact of data protection on scientific research processing. Many of the barriers identified, and a number of the suggestions in this area, are sensible. However, we question the strength of some of the claims made about the role of law as a barrier. In our research, we identified uncertainty and complexity as the key and overarching challenges for researchers, while recognising that both are inevitable in the application of a comprehensive, sector-agnostic new law. Our work highlights that the challenge is not the wording of the law (although there are undoubtedly aspects that could be better drafted) but the inevitable scope for argument about its proper application in a specific context, such as genomic research. Changing the law would be unlikely to significantly narrow that scope for argument. We therefore recommend that as much resource and effort as possible is put into developing specific guidance, in consultation with relevant industries and sectors, to address these challenges, rather than into changes to the legislation.
The importance of the ICO's role and specific guidance
In our research we found considerable approval for the ICO's track record in producing user-friendly and sensible guidance. What is needed are the resources for this guidance to be expanded and updated to address new challenges. An obvious area for continuing and updated ICO guidance is de-identification and technical approaches to privacy preservation. If the ICO can keep pace with the state of the art, this will give greater confidence to data controllers and data subjects about the measures put in place to reduce (though almost never eliminate) threats to privacy. We strongly reject the notion of a 'surfeit of guidance' in this regard. The ICO's position means that its guidance is more authoritative than the guidance, principles or technical documents that may be produced by non-regulatory bodies.
Innovation, trustworthiness and data governance
As a policy think-tank championing the role of innovation in improving population health, we welcome the Government's ambition to support innovation, particularly in our field of health. However, we believe that the UK's record is, and should continue to be, that of a world leader in safe, effective and ethical innovation. This requires proportionate regulation that ensures controllers, such as AI developers, act in ways that demonstrate their trustworthiness and maintain the support and confidence of the public. As we address in our latest research on changes to the regulation of confidential patient information during COVID-19, public trust and confidence are paramount. A loss of confidence can critically harm health research in particular, or even undermine faith in the healthcare system.
Data protection law is only part of this picture. Alongside other areas of law and governance, including the common law of confidentiality, consumer protection and professional negligence, the goal of any reforms should be to safeguard fundamental rights and freedoms while supporting ethical innovation. This extends to aspects largely outside the realm of data protection, notably the governance of non-personal data, the challenges of opaque or adaptive high-risk AI, and group, as opposed to individual, impacts. If this is successful, law and regulation will play an important role in supporting, not hindering, innovation.
The PHG Foundation has provided a comprehensive response, which can be downloaded here.