DARE UK (Data and Analytics Research Environments UK) is on a mission to build a UK-wide network of Trusted Research Environments (TREs) to enable better, faster and more trustworthy sensitive data research for public benefit.
In the early stages of the programme, understanding public views about how sensitive data is made available for research was key to shaping the direction of the programme.
In 2022, as a crucial element of Phase 1 of the programme, we carried out a deliberative dialogue with 44 members of the public from across the four nations of the UK. A series of online workshops explored views towards how the UK’s data research infrastructure could work in a more joined-up, efficient and trustworthy way.
The dialogue resulted in six key recommendations, each of which has helped shape our programme activities in myriad ways. Here, we explore what we heard and how we’re responding.
Transparency
What we heard: Proactive transparency should be practised by those handling and using sensitive data for research.
How we’re responding: In 2024, we launched our new brand and website to better showcase the DARE UK programme to our audiences, particularly the public. A key priority was the inclusion of plain English descriptions and videos describing our work, to enhance understanding for those less familiar with sensitive data research.
As a founding partner of the Public Engagement in Data Research Initiative (PEDRI), DARE UK has also been involved in cross-sector efforts to raise public awareness about the use of sensitive data in research. In 2023, in direct response to this recommendation, we carried out a pilot public communications campaign in collaboration with PEDRI partners to test approaches to raising awareness. The findings will help inform future campaigns, and there is more to be done across the whole sector to increase awareness and understanding of sensitive data research.
Inclusion
What we heard: Public involvement and engagement should be inclusive and meaningful.
How we’re responding: In Phase 2, DARE UK’s Public Advisory Group (PAG) has expanded from five to 15 members, enabling us to include a wider variety of voices to help shape our work. The PAG meets quarterly to feed into our day-to-day activities, and members play a decision-making role in DARE UK on the Programme Board and as panel members for our open funding calls. To ensure the group is as inclusive as possible, our next round of recruitment will focus on bringing in more young people and more people from across all four nations of the UK.
In 2026, DARE UK will also officially be launching its new online public involvement network, ‘Our Data, Our Say’ – a dedicated place for the wider public to keep updated about our work and find out about opportunities to get involved.
Data security
What we heard: Efforts should be made to raise awareness of security processes to protect data, and make sure those processes remain fit for purpose.
How we’re responding: DARE UK’s TREvolution programme, coordinated by the University of Nottingham, is adopting shared standards and building trustworthy technical innovations to deliver better and faster research for public good. This involves collaboration to strengthen governance, security and transparency so that confidence is earned and maintained throughout the research process.
Simultaneously, our Early Adopter projects are testing the integration of solutions developed by TREvolution in existing TREs. The VISTA project, for example, is testing new approaches to semi-automating how research outputs are evaluated as safe to release from the NHS East of England Secure Data Environment (SDE). It’s also exploring methods for how to safely export trained artificial intelligence (AI) and machine learning (ML) models from the SDE. The FRIDGE project is exploring the creation of a ready-to-use TRE on the AI Research Resource (a suite of advanced supercomputers that provides AI-specialised compute capacity). This would enable the safe, secure use of sensitive data to develop AI models for research, while ensuring adherence to strict information governance requirements. Find out more about these and other DARE UK Early Adopters.
Unified approaches
What we heard: The processes and systems supporting data research across the UK should be unified in their approaches where possible.
How we’re responding: DARE UK operates on a model of co-creation. Phase 1 was all about listening to the community, including the public, to understand how to achieve a more efficient and trustworthy sensitive data research landscape. Phase 2 is a cross-sector collaboration to establish unified approaches to solving the priorities identified in Phase 1.
Since 2022, DARE UK has supported 19 community groups bringing together partners from across the data research landscape to co-create pieces of work to solve mutual problems. From developing open-source tools that facilitate the use of synthetic data in TREs, to developing learning modules to drive good practice in public involvement and engagement, these community groups are exploring innovative ideas to address the needs of the community. Each group has a public member as Co-chair to ensure approaches meet public expectations. You can find out more on the DARE UK Community Groups hub.
Standardisation
What we heard: Where feasible, processes enabling access to sensitive data for research should be standardised and centralised.
How we’re responding: In 2023, as part of Phase 1, DARE UK funded five Driver Projects to explore standardised approaches to running and governing TREs. In particular, ‘Standardised Architecture for Trusted Research Environments’ (SATRE), led by researchers at the University of Dundee and the Alan Turing Institute, compared the characteristics of existing UK TREs to bring them into alignment with a standardised TRE technical specification.
Standardisation is an ongoing community endeavour. The SATRE Community Group now oversees updates to the TRE specification. Meanwhile, a core focus of TREvolution is to explore and adopt existing standards – from technical requirements and security protocols, to ethical principles and governance processes – to build trust and enable TREs to work together more effectively.
Public benefit
What we heard: Sensitive data should be made available for research when it is in the public benefit.
How we’re responding: To explore the societal challenges that could be addressed if the UK’s data research infrastructure were more joined-up, in 2024 we hosted a workshop with researchers and members of the public to surface scientific use cases for linking sensitive data at scale. From improving the air we breathe by enhancing low emission zones, to transforming access to primary care for those who need it most, 52 distinct use cases were identified addressing a broad spectrum of opportunities to improve lives.
A final report sets out a curated shortlist of 10 use cases that demonstrate the breadth and depth of insights. They’re helping define future DARE UK funding opportunities by prioritising areas where linked sensitive data has the biggest potential to benefit society. The public have said they want their data to be used for public benefit – we’re making it happen.
What’s next?
Recently, we’ve been focused on getting public input on a rapidly evolving area of work in sensitive data research: the development of artificial intelligence (AI). We’ve just completed a series of workshops with members of the public in Birmingham to explore views towards the development and use of AI with sensitive data, including the training and development of AI models in TREs, and the use of AI capabilities to support sensitive data analysis and risk assessment. The workshops also served as a valuable opportunity to explore approaches for meaningful public involvement in the topic.
The findings, which will be published in the coming months, will inform future work to dig deeper into public views on AI in sensitive data research at a UK-wide scale. Stay tuned!
Find out more about how AI is being developed in TREs from DARE UK’s Technical Lead, Rob Baxter, in this video:
Sign up to our mailing list to be first to hear about our AI workshop findings and future work.