Marie Farrell

Royal Academy of Engineering Research Fellow in Autonomy and Verification

What are your research interests?

I am interested in formal methods and how they can be used to verify complex, safety- and mission-critical systems.

What is the focus of your current research?

My Royal Academy of Engineering Research Fellowship is focused on Strong Software Reliability for Autonomous Space Robotics. This work aims to devise new ways of describing, analysing and assuring the autonomous behaviour of robotic space systems.

What are some projects or breakthroughs you wish to highlight?

To demonstrate that software controlling safety-critical systems behaves correctly and safely, it goes through a detailed verification phase. However, we must first define what the software should be verified against. What is safe behaviour? What is correct functionality? To answer these questions, developers define software requirements using, for example, NASA’s Formal Requirements Elicitation Tool (FRET), which expresses requirements in an unambiguous, logical way that can be understood both by developers and by software verification tools. We discovered that requirements for autonomous systems differ from those for classical systems because autonomous systems make decisions based on learning and/or interactions with an unpredictable physical environment. Specifically, these requirements contain probabilities, e.g., “robot shall correctly identify rocks with 80% accuracy”. Tools like FRET have traditionally not supported writing requirements with probabilities, so we have extended FRET to support them. This enables developers to accurately express requirements for autonomous systems, guiding the verification process and resulting in more reliable software.
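As an illustrative sketch (not the project's own notation), a probabilistic requirement of this kind can be captured in a probabilistic temporal logic such as PCTL, which is supported by probabilistic model checkers; the proposition name rockIdentified is a hypothetical placeholder:

```latex
% "Robot shall correctly identify rocks with 80% accuracy", read as:
% with probability at least 0.8, an encountered rock is eventually
% correctly identified.
P_{\geq 0.8} \left[\, \mathrm{F}\; \mathit{rockIdentified} \,\right]
```

Formalising the probability bound explicitly, rather than leaving "80% accuracy" in natural language, is what allows verification tools to check the requirement against a model of the system.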

My research specifically targets space robotic systems, but the results are certainly more widely applicable. In fact, the extensions to FRET can be used in the development of any complex critical system. This kind of generalisability makes the advancements in this project useful for the development of safe and reliable software. This has clear societal benefit, since the systems that we use every day are becoming more and more reliant on software.

I am secretary of the working group developing the IEEE P7009 Standard on Fail-Safe Design of Autonomous and Semi-Autonomous Systems. This effort has been ongoing for some time, and in October 2023 the draft standard was formally sent to ballot. I contributed to writing the standard and documented useful case studies to illustrate how it might be used.

What memberships and awards do you hold/have you held in the past?

Memberships: IEEE, ACM, London Mathematical Society (LMS)
Awards: Royal Academy of Engineering Research Fellowship, Irish Research Council Postgraduate Scholarship

What is the biggest challenge in Digital Trust and Security now?

For me, the biggest challenges in Digital Trust and Security relate to how we verify and assure autonomous systems. The advent of autonomy has made these processes more difficult and understanding the role that software requirements play will be crucial.

What real world challenges do you see Digital Trust and Security meeting in the next 25 years?

Although I see verification of autonomous robots as a big challenge, I also view it as an opportunity. For many years now, formal methods have experienced slow uptake in industry, and I wonder if the systems simply were not complex or critical enough to really let formal methods shine. Autonomous systems are, for me, a playground where we can demonstrate these powerful verification methods and make meaningful improvements to them. I would hope that, in the next 25 years, we will see robots in space that have been verified using a range of formal techniques.


Find out more about Marie's research here.