Introducing CDEI’s Portfolio of AI Assurance Techniques to the AI Standards Hub community

Blog post by:

James Scott – CDEI

On 7 June, DSIT’s Centre for Data Ethics & Innovation (CDEI) announced the launch of the Portfolio of AI Assurance Techniques. This blog explains the purpose of the Portfolio and the important connections between AI assurance and standardisation. It also highlights AI Standards Hub resources that can help you make the most of the Portfolio, and explains how you can submit a case study for the next iteration in September.

The Portfolio of AI Assurance Techniques

CDEI has identified a significant lack of knowledge and skills around assurance techniques as one of the key barriers to the adoption of AI assurance. Research participants reported that even when they want to assure their systems, they often don’t know what assurance techniques exist or how these might be applied in practice across different contexts and use cases.

We have created the Portfolio of AI Assurance Techniques to address this lack of knowledge and help industry practitioners navigate the AI assurance landscape as they develop trustworthy AI systems. The Portfolio showcases a variety of real-world case studies, encompassing technical, procedural, and socio-technical approaches across a range of industries. CDEI has mapped these techniques against the principles set out in the UK’s pro-innovation approach to AI regulation, to illustrate their potential role in supporting systematic AI governance. You can read the list of case studies here.

The Portfolio and the AI Standards Hub

The Portfolio is also well aligned with the work of the AI Standards Hub and its mission to advance trustworthy and responsible AI, with a focus on the role that standards can play as governance tools and innovation mechanisms. Standards underpin AI assurance and are a crucial tool for trustworthy AI, providing the criteria for measuring, evaluating, and communicating the trustworthiness of AI systems.

The AI Standards Hub offers a range of resources that can be used in conjunction with CDEI’s Portfolio to support anyone involved in designing, developing, deploying, or procuring AI-enabled systems to think about AI assurance. For example, the Observatory section on the Hub’s online platform provides a filterable database of AI standards that can underpin the assurance of AI systems, and the Hub’s dedicated e-learning platform offers an Introduction to AI Assurance module, co-developed by The Alan Turing Institute and CDEI, which spotlights the relationship between assurance and standards.

Want to get involved?

CDEI intends to continue growing and developing the Portfolio over time with new case studies, with the next iteration due in September. We welcome future submissions from organisations across all sectors, as well as ideas about how we can best position the Portfolio to ensure it meets its stated goal of supporting organisations in navigating the AI assurance landscape.

If you would like to submit case studies to the Portfolio, or would like further information, please get in touch at [email protected].
