EDCC: International SafeAutonomy Workshop

EDCC 2024: 1st International Workshop on Safe Autonomous Systems

Welcome and greetings from the organizers

We are happy to announce the 1st International Workshop on Safe Autonomous Systems, which is being organized in the context of the ICON project “Layers of Protection Architecture for Autonomous Systems” (LOPAAS), a collaboration between Fraunhofer and the University of York / Assuring Autonomy International Programme.

With the SafeAutonomy Workshop, we want to bring together different groups to create synergies and establish a shared understanding of the challenges and solutions related to continuous safety assurance of autonomous systems (AS). We welcome you to join the workshop and help shape a reference framework for assuring autonomy.

The SafeAutonomy Workshop is co-located with the 19th European Dependable Computing Conference (EDCC) in Leuven, Belgium, from 8th to 11th April 2024.

Dr. Rasmus Adler and Philipp Schleiß from Fraunhofer

Dr. Richard Hawkins from the University of York / Assuring Autonomy International Programme

The Safe Autonomy workshop explores concepts, techniques and technology related to the continuous safety assurance of autonomous systems (AS). The first three workshops in the series ran under the name DREAMS (Dynamic Risk managEment for Autonomous Systems) and focused on dynamic risk management of AS. This remains an important aspect of safety assurance for AS, but for this workshop we also welcome a broad range of contributions in any related area.

AS have enormous potential to transform society. The key trait of AS is their ability to pursue and achieve their goals independently and without human guidance or intervention. In contexts where safety needs to be guaranteed, it is currently difficult to exploit autonomous systems to their full potential because of the difficulty of providing assurance that they will remain safe throughout operation. The assurance challenge increases when AS take advantage of machine learning to cope with the complexity of their mission and operating context, and when assuring AS in the context of systems of systems, where emergent behaviours and dynamic composition must be considered.

The Safe Autonomy workshop will explore a range of topics related to continuous safety assurance of AS including but not limited to:

  • Dynamic risk management
  • Situational awareness
  • Resilience
  • Human-machine interaction
  • Uncertainty management
  • Assurance cases
  • Virtual validation
  • Safety assessment

It invites experts, researchers, and practitioners for presentations and in-depth discussions about assuring autonomy, its relevance for specific use cases, its relation to existing regulatory frameworks and standardization activities, and solutions from systems and software engineering.

Safe Autonomy aims at bringing together communities from diverse disciplines, such as safety engineering, runtime adaptation, predictive modelling, control theory, and from different application domains such as automotive, healthcare, manufacturing, agriculture and critical infrastructures.

9:15 -- 10:00 Keynote: Continuous safety assurance of autonomous cyber-physical systems - challenges and directions, Speaker: Martin Törngren

10:00 -- 10:30 Presentation (1):

  • 10:00 --> (1) Defining an Effective Context for the Safe Operation of Autonomous Systems

10:30 -- 11:00 Coffee Break → Location: ELEC B91.200

11:00 -- 12:30 Presentations (2, 3, 4)

  • 11:00 --> (2) STARS: A Tool for Measuring Scenario Coverage When Testing Autonomous Robotic Systems
  • 11:30 --> (3) Providing Evidence for the Validity of the Virtual Verification of Automated Driving Systems
  • 12:00 --> (4) What level of power should we give an automation? —Adjusting the level of automation in HCPS

12:30 -- 14:00 Lunch Break → Location: ELEC B91.200

14:00 -- 15:00 Presentations (5, 6)

  • 14:00 --> (5) Towards Continuous Assurance Case Creation for ADS with the Evidential Tool Bus
  • 14:30 --> (6) A Physics-based Fault Tolerance Mechanism for UAVs’ Flight Controller

15:00 -- 15:30 Wrap up

15:30 -- 16:00 Coffee Break → Location: ELEC B91.200

16:00 -- 17:30 Working Session (optional): Which autonomy use cases can drive innovation and applied safety research?
Most discussions around autonomous systems seem to take place with respect to autonomous road vehicles. The automotive industry promised substantial progress in this field, but many predictions have not come true. Companies are stepping back and correcting their predictions. A major reason for this is the safety challenge. What does this mean for applied safety research? Are there domains and autonomy use cases that are less challenging from a safety perspective and that can be handled by means of the lessons learned from the automotive industry? How can research help the automotive industry most effectively to drive autonomy-based innovation?

All submissions will be peer-reviewed by at least three members of the program committee. They will be evaluated based on originality, contribution to the field, technical and presentation quality, and relevance to the workshop.

Please consider the following page limits:

  • Regular technical papers describing original theoretical or practical work (6-8 pages) 
  • Case studies describing practitioner experience or field studies (8-12 pages) 
  • PhD Forum papers describing objectives, methodology, and results at an early stage of research (6-8 pages) 
  • Position papers on challenges and emerging trends (6 pages)
  • Please consult Springer's authors' guidelines and use their proceedings templates, either for LaTeX or for Word, for the preparation of your paper
  • Please submit your paper via EasyChair: https://easychair.org/cfp/SafeAutonomy2024

 

Paper submission: Dec 18th, 2023 (new deadline: Jan 19th, 2024)

Author notification: Jan 26th, 2024

Camera-ready paper: Feb 5th, 2024

Workshop: Apr 8th, 2024

Organizers

  • Rasmus Adler (Fraunhofer IESE, Germany)
  • Richard Hawkins (University of York, UK)
  • Philipp Schleiß (Fraunhofer IKS, Germany)

Program Committee

  • Karl-Erik Arzen (Lund University, Sweden)
  • Patrik Feth (psiori, Germany)
  • Martin Fränzle (Carl von Ossietzky Universität Oldenburg, Germany)
  • Andrey Morozov (University of Stuttgart, Germany)
  • Ganesh Pai (KBR / NASA Ames Research Center, USA)
  • Davy Pissoort (Katholieke Universiteit Leuven, Belgium)
  • Ioannis Sorokos (Fraunhofer IESE, Germany)
  • Martin Törngren (KTH, Sweden)
  • Ran Wei (University of Cambridge, UK)

Review DREAMS Workshop 2022

The 2022 workshop was held in Zaragoza, Spain, and the proceedings can be found here.

Review DREAMS Workshop 2021

Videos and recordings of the talks at the DREAMS Workshop 2021

Keynote "Artificial Morality in Dynamic Risk Management for Autonomous Systems" (Dr. Rasmus Adler at EDCC 2021)

In this keynote, the research field of dynamic risk management is structured and viewed in relation to some topics of machine ethics and the ethics of risk. In this context, the Responsibility-Sensitive Safety (RSS) concept is linked to the Situation-Aware Dynamic Risk Assessment (SINADRA) concept.

"Architecture for Situation-Aware Dynamic Risk Assessment" (Jan Reich at EDCC 2021)

The talk is about the architectural building blocks required for performing situation-aware dynamic risk assessment (SINADRA).

If you want to know more about our research on safety topics at Fraunhofer IESE, also check our webpage: Dependable AI

"Handling Uncertainties of Data-Driven Models in Compliance with Safety Constraints for Autonomous Behavior" (Rasmus Adler at EDCC 2021)

This talk is about handling uncertainties of data-driven models with respect to safety constraints such as RSS. 

Proceedings and other talks

Other presentations of accepted workshop papers on the topics of “Service-Oriented Reconfiguration in Systems of Systems Assured by Dynamic Modular Safety Cases”, “Behavior Prediction of Cyber-Physical Systems for Dynamic Risk Assessment”, “Autonomic service operation for cloud applications: Safe actuation and risk management”, and “Evaluation of Human-in-the-Loop Learning based Autonomous Systems” were not recorded, so we have to refer to the proceedings.

The presentations and discussions about dynamic risk management for autonomous systems were recorded at an event organized in the context of “The Autonomous”.

Review DREAMS Workshop 2020

Videos and recordings of the talks at the DREAMS Workshop 2020

Keynote "A Safety Case plus SPIs Metric Approach for Self-Driving Car Safety" (Prof. Philip Koopman at EDCC 2020)

This keynote talk given by Philip Koopman puts dynamic risk management (DRM) into the big picture of safety assurance for self-driving vehicles. It focuses on the safety case and on monitoring its validity by means of safety performance indicators. DRM relates to the “runtime safety monitor” on slide 14 (minute 27).

External Talk: "Mathematical Risk Model for Assuring Functional Safety in Autonomous Vehicles" (Michael Woon at EDCC 2020)

External Talk: "Smart Safety – Safe detection of new hazards" (Dr. Detlev Richter at EDCC 2020)

This keynote talk by Detlev Richter from TÜV SÜD discusses dynamic risk management in the context of smart manufacturing. It highlights the need for dynamic risk management and provides a clear vision for its realization.
The working group for AI in smart manufacturing (of which Detlev Richter and Dr. Rasmus Adler are members) agreed on the need for DRM, and a related recommended action was formulated in the upcoming German standardization roadmap for AI.

 


The safety administration shells (green boxes at minute 16:30) relate to Digital Dependability Identities and their instantiation for functions, components, and systems in the production domain, as in this success story with Sick AG. This can enable automated risk reasoning in an IT backbone (see https://www.basys40.de/ and https://www.fab-os.org/).

External Talk: "Concepts of Dynamic Assurance for Trusted Autonomy" (Ganesh Pai at EDCC 2020)

This invited talk by Ganesh Pai provides an overview of dynamic assurance concepts and an example of how these concepts can be implemented in aviation. It also refers to several research papers for further reading.

External Talk: "Safety Cases for Adaptive Systems of Systems: State of the Art & Current Challenges" (Elham Mirzaei, Carsten Thomas, Mirko Conrad at EDCC 2020)

This paper presentation by Elham Mirzaei focuses on dynamic safety cases. The approach is strongly related to our research based on Conditional Safety Certificates (ConSerts) and Digital Dependability Identities.

Dynamic safety cases enable runtime reasoning about the safety of systems/components that are dynamically composed at runtime. They are also the basis for making a composition aware of the current risk, as they enable safe shared perception.

 

External Talk: "Enforcing Geofences for Managing Automated Transportation Risks in Production Sites" (Faiz Ui Muram at EDCC 2020)

This paper presentation by Faiz Ul Muram presents a simulation-based approach for identifying hazards during verification and validation in order to gain higher confidence in the safety of production sites. The simulation is done at design time (during V&V), but the fundamental idea is closely related to runtime assurance. Furthermore, the paper contributes to the topic of continuous (safety) engineering.