ARET: Adaptable Robots, Ethics, and Trust study

Background

In engineering, autonomous systems are computer-controlled technologies that are programmed to carry out certain tasks without the need for human involvement. The functionality of autonomous systems (what they are meant to do, what they do, and what they could do) can gradually change over time from a simpler to a more advanced state. While changes to a technology's functionality have traditionally been permitted only at slow time scales and under human control, advances in artificial intelligence (AI) mean that emerging autonomous systems can change their functionality on much faster time scales and without a human controller.

Systems with ‘evolving functionality’ could potentially operate autonomously in complex and dynamic real-world environments. However, their ability to adapt to changes in the environment, and in the system itself, makes their behaviour difficult to predict. These systems therefore raise significant ethical questions about whether and why people should trust them and consider them trustworthy.

Aim and research question

The ARET study explores how evolving functionality affects understandings of, and ethical concerns around, trust and trustworthiness in autonomous systems. We want to hear from a variety of people, including developers of these technologies, potential end users, and other stakeholders, to answer the following research question: How can and should autonomous systems with evolving functionality be ethically developed and deployed to ensure they are trustworthy?

Researchers

This study is being organised by Dr Arianna Manzini, Research Associate in Ethics of Autonomous Systems at the Centre for Ethics in Medicine (CEM) at the University of Bristol; and Prof Jonathan Ives, Deputy Director of CEM and study lead.

Would you like to participate?

If you are aged 18 or over, we would love to hear from you. You do not need any previous knowledge of autonomous systems to participate, as we will show you images and videos to introduce you to the technologies.

Your contribution would be very helpful! For more details, please contact Arianna Manzini at arianna.manzini@bristol.ac.uk, or complete this volunteer form and we will be in touch to schedule an interview.

Fun facts

  • In Old English, aret means to entrust
  • Arete is the term ancient Greek philosophers used to denote moral virtue and excellence

The ARET study is part of a larger UKRI-funded Trustworthy Autonomous Systems Node in Functionality research programme, which is a multidisciplinary collaboration between ethicists, sociologists, computer scientists and engineers working together to produce guidelines for the development of trustworthy autonomous systems with evolving functionality.

Ethical Approval

This project has been reviewed and approved by the University of Bristol Faculty of Engineering Research Ethics Committee (Ref: 0137).