Community Working Group

AI risk evaluation community group

A group of experts and members of the public who developed guidelines for the use of AI in working with patient data in Trusted Research Environments (TREs) to protect personal identity and information.

As interest in applying Artificial Intelligence (AI) to sensitive data grows within the data research community, the need to understand the unique privacy risks that AI poses has never been greater. The AI Risk Evaluation Community Group set out to create guidelines for using AI on patient data in Trusted Research Environments (TREs) that protect the identities of the individuals represented in the data.

The increasing availability of potentially sensitive medical data, such as brain imaging and genetic blood test results, within TREs for AI model development holds transformative potential for healthcare, but it also raises important data privacy and security concerns. To mitigate these risks, the AI Risk Evaluation Community Group brought together a team of experts and members of the public to develop comprehensive guidelines for the ethical use of AI on sensitive data in TREs. Their work built on previous initiatives, engaging diverse stakeholders to ensure the responsible integration of AI into clinical settings and the responsible use of medical data.

Project outputs

The AI Risk Evaluation Community Group conducted a four-part workshop series to assess and mitigate risks associated with AI models trained on patient data:

  • Public/Patient Workshop: What are the risks associated with AI model release? This workshop gathered patient perspectives on the risks of using their data for AI research.
  • Researcher Workshop: What are the most effective mitigation techniques? The second workshop brought together AI and clinical experts to discuss current AI methods, privacy-preserving techniques, and the privacy risks of implementing AI models, and to produce a set of best-practice recommendations for mitigating privacy risks in AI and applying effective privacy-preserving techniques.
  • Data Provider Workshop: What is the risk appetite of data providers? The third workshop engaged data providers to understand their risk appetite and any barriers or restrictions they place on AI training on their data.
  • Final Workshop: Developing guidelines and recommendations for TREs on assessing AI risk and implementing privacy-preserving methods. The final workshop brought together participants from the previous three sessions to consolidate their insights and co-develop comprehensive guidelines for AI risk assessment, along with recommendations for privacy-preserving methods within TREs.

Participation and collaboration

The AI Risk Evaluation Community Group comprised a diverse group of researchers (with expertise in neuroimaging, data security, AI development, and clinical research), data and infrastructure providers, and public representatives. The working group also included members of Dementias Platform UK (DPUK), which has built a community of data providers and researchers and plans to collaborate with networks such as the British Neuroscience Association, the Deep Dementia Phenotyping (DEMON) Network, and the UK Health Data Research Alliance. DPUK has extensive experience with neuroimaging and genomic data, and its ongoing AI model development involves close collaboration with data providers to ensure safe model deployment in clinical environments. This collective expertise and these ongoing initiatives laid a solid foundation for the community's objectives.

Ways of working

Four workshops were held over the funding period (November 2023 – March 2024), both in person and online, to maximise attendance and participation.

Group co-chairs

  • Prof Simon Thompson, Swansea University
  • Prof John Gallacher, Oxford University
  • Dr Timothy Rittman, Cambridge University
  • Lewis Hotchkiss, Swansea University

For enquiries, please email catrin.morris@swansea.ac.uk.
