Assessing AI for early detection of oesophageal cancer
As part of an Innovate UK funded research collaboration, PHG Foundation led the analysis of the ethical and legal implications of using artificial intelligence to help detect oesophageal cancer. Project DELTA (integrateD diagnostic solution for EarLy deTection of oesophageal cAncer) aims to identify up to 50% of cases of oesophageal cancer earlier, using the innovative Cytosponge™-TFF3 test - the ‘sponge on a string’ - to diagnose Barrett’s oesophagus.
This initiative includes the Universities of Cambridge and Oxford, King’s College London, the PHG Foundation and Cyted.
Published paper: ‘Ethical and legal considerations influencing human involvement in the implementation of AI in a clinical pathway’, in Frontiers in Digital Health
PHG Foundation's work on the DELTA project is now complete. Reports will be published later in 2023. Below is the position statement arising from the project, which makes recommendations for future practice regarding novel risk stratification, screening and surveillance for oesophageal cancer.
Position statement from the DELTA project
Released: July 2023
The DELTA Project is a multistakeholder research project focusing on novel risk stratification, screening and surveillance for oesophageal cancer. Effective cancer prediction, prevention and treatment cannot be achieved without changes to existing health policies and regulations. This Position Statement summarises the changes that need to occur and makes recommendations for future practice.
The DELTA Project highlights the potential for personalised prevention of oesophageal cancer using a novel pathway encompassing three key elements:
- A novel risk algorithm to identify those at highest risk of developing oesophageal cancer
- A novel, non-invasive, nurse-administered, non-endoscopic sampling device (Cytosponge™)
- A novel pathway to evaluate and interpret cellular samples at scale incorporating laboratory immunohistochemical assays and an AI digital pathology tool
Much has been achieved in the move from research to a diagnostic test being implemented in the NHS. However, in order to detect patients with Barrett’s oesophagus at scale, a range of challenges needs to be addressed, including clarifying and streamlining the regulatory requirements, avoiding potential inequity by maximising patient access through the patient pathway, and actively targeting those at greatest need.
To do so will require:
1. Transparency and clarity from regulators, research ethics committees and research funders about the regulatory and ethical requirements and regulatory oversight for each of the elements making up the pathway (the risk algorithm, novel sampling device and AI digital pathology tool). This requires a clear roadmap for meeting regulatory requirements including:
- Proportionate evidence standards for novel AI and software devices informed by the extent to which they are intended to replace human intervention, including standards for clinical utility and performance monitoring
- Best practice standards to promote transparency and trustworthiness across the pathway
- Materials/guidance to support health professionals, including information about how the results have been generated and whether they are intended to supplement or replace clinical decision making
- Accessible information and materials for patients and publics to build wider trust and transparency
2. Alignment of these requirements by regulators, commissioners and other decision-makers so that they are consistent across the implementation pathway, streamlining the introduction of novel technologies that are intended to work together
3. Ensuring access to innovative pathways through the active efforts of developers, healthcare providers, regulators, and commissioners to:
- Support the development and integration of novel algorithms into electronic health records systems
- Support interoperable electronic health records systems and laboratory reporting infrastructures, particularly infrastructures to flag future screening or surveillance needs in individual patient records
- Monitor performance over time and support updates to the algorithms, which will be required as data improve, statistical techniques advance or requirements/guidelines change
- Facilitate long-term monitoring of device performance over the lifetime of the device (post-marketing surveillance)
4. Building health professionals’, patients’ and publics’ trust in novel technologies by:
- Reporting results to users (health professionals and patients) in ways that build trust and trustworthiness in these technologies including:
- Demonstrating that the tools can be used safely and that potential biases have been addressed
- Providing accessible individual-level results supported by further details that provide additional context, e.g. for decisions supported by predictive algorithms this might include details of saliency, confidence intervals or internal logic, to build trustworthiness
5. Resolving healthcare professionals’ concerns about liability for harm
- The continuing liability of health professionals for decisions made using algorithm-driven systems is a significant barrier to implementation at scale. Governments, MHRA and other regulators, ICO, the National Data Guardian, NICE and NHS England should clarify and align the responsibilities and obligations placed on manufacturers, developers and users, including what standards of care are appropriate and how those standards might be met, and should consider alternative methods for compensating patients who have been harmed. These could include extending existing clinical negligence schemes such as NHS Indemnity
6. Recognising how these initiatives can be harnessed to meet wider policy goals to reduce healthcare inequalities:
- Population stratification to screen for, detect and treat oesophageal cancer could serve as an exemplar for meeting wider policy goals such as Core20PLUS5, which seeks to diagnose 75% of cancers at stage 1 or 2 by 2028
On behalf of DELTA Project Collaborators including:
- Professor Rebecca Fitzgerald, Principal Investigator and Work Package 2 Lead, University of Cambridge
- Professor Julia Hippisley-Cox, Work Package 1 Lead, University of Oxford
- Marcel Gehrung, Work Package 3 Lead, Cyted AI
- Alison Hall, PHG Foundation
- Mimi McCord, Heartburn Cancer UK
- Alan Moss, Action Against Heartburn