How do you specify for trustworthiness in an autonomous system?

That is the question asked in our new research article “On Specifying for Trustworthiness”, published today in Communications of the ACM. The article represents the collaborative thinking of multi-disciplinary researchers from across all six TAS nodes and the hub, which emerged from a workshop held at the 2021 TAS All Hands Meeting. Their goal was to investigate this question by considering how these systems operate differently in varying contexts, and to begin creating a roadmap towards top-down specifications for trustworthy autonomous systems.

Autonomous systems are considered trustworthy when their design, engineering, and operation generate positive outcomes and mitigate potentially harmful ones. There are various techniques for demonstrating the trustworthiness of systems; common to all of them, however, is the need to formulate specifications. A specification is a detailed formulation that provides “a definitive description of a system for the purpose of developing or validating the system”, but writing specifications that capture ‘trust’ for autonomous systems is challenging, and the difficulty is compounded by the inherent uncertainty of the environments in which these systems operate.
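
As a simple illustration (our own example, not one drawn from the article), a conventional requirement for an automated vehicle might be written as a temporal-logic property such as

G (obstacle_detected → F brake_applied)

read as “whenever an obstacle is detected, the vehicle eventually brakes”. A notion like trust is far harder to pin down in a single formula of this kind, especially when the system operates in an uncertain environment.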

Framed around 10 intellectual challenges, the authors examined four different domains of autonomous systems: a single autonomous agent (for example, automated driving, UAVs), a group of autonomous agents (for example, swarms), an autonomous agent assisting a human (for example, AI in healthcare, human–robot interaction), and a group of autonomous agents collaborating with humans (for example, emergency situations, disaster relief).

Please do watch our video below or read the paper here to find out more. We’re really pleased to be part of this important and timely research area, recognised by our acceptance into this journal, which had an Impact Factor of 22.7 in 2023.

All authors from the University of Bristol are fully or partly funded via UKRI’s Trustworthy Autonomous Systems Node in Functionality under grant number EP/V026518/1.
