How do you specify for an autonomous system to be trustworthy?

That is the question asked in our new research article “On Specifying for Trustworthiness”, published today in the Communications of the ACM journal. The article represents the collaborative thinking of multi-disciplinary researchers from across all six TAS Nodes and the Hub, which emerged from a workshop held at the 2021 TAS All Hands Meeting. Their goal was to investigate this question by considering how these systems operate differently within varying contexts, and to begin the journey of creating a roadmap towards top-down specifications for trustworthy autonomous systems.

Autonomous systems are considered trustworthy when their design, engineering, and operation generate positive outcomes and mitigate potentially harmful ones. There are various techniques for demonstrating the trustworthiness of systems; however, common to all of them is the need to formulate specifications. A specification is a detailed formulation that provides “a definitive description of a system for the purpose of developing or validating the system”, but writing specifications that capture ‘trust’ for autonomous systems is challenging. This difficulty is compounded by the inherent uncertainty of the environments in which such systems operate.

Framed around 10 intellectual challenges, the authors examined four different domains of autonomous systems: a single autonomous agent (for example, automated driving, UAVs), a group of autonomous agents (for example, swarms), an autonomous agent assisting a human (for example, AI in healthcare, human–robot interaction), and a group of autonomous agents collaborating with humans (for example, emergency situations, disaster relief).

Please do watch our video below or read the paper here to find out more. We’re really pleased to be part of this important and timely research area, recognised by our acceptance into this journal, which had an Impact Factor of 22.7 in 2023.

All authors from the University of Bristol are fully or part funded via the UKRI’s Trustworthy Autonomous Systems Node in Functionality under grant number EP/V026518/1.

Focus groups: have your say

We are looking for participants for online focus groups as part of our SWARM project, taking place in early 2024. We want to gather a range of perspectives on the future use of nanoswarms in cancer treatment. The project is investigating the ethics and regulation of their first in-human clinical trial, with the aim of exploring how nanoswarm medicine should be regulated once this technology is ready for clinical trials.

Would you like to participate?

Focus groups will be taking place in early 2024. We are looking for:

  • Oncology healthcare professionals
  • Cancer patients
  • Regulators or policymakers in drug delivery/oncology
  • Nanomedicine researchers or developers
  • Other stakeholders including patient support and information groups, patient advisory committees, public health practitioners, professional associations for healthcare professionals, hospitals, cancer charities and family members/caregivers.

Volunteers must be aged 18 or over to take part. Please find more information on our website here, including a series of videos explaining the study and the concepts associated with it, such as “What is a nanoswarm?” below:

To take part in the focus groups, please complete this Expression of Interest Form. If you have any questions, please email swarm-study@bristol.ac.uk.