Dr. Holger Flühr has been with FH Joanneum since 2003 and has been a professor of avionics and air traffic control technology there since 2006. He has also temporarily served as head of the institute and of its degree programs. Before that, after years of research at the Karlsruhe Research Center (now part of KIT) and Graz University of Technology, he worked for a company developing high-frequency components for mobile phone applications. He is also a member of several advisory bodies and interest groups.

In recent weeks, newspapers and online media have increasingly reported on the 5G mobile communications standard and possible complications for airplanes. In short, there is a concern that 5G radio signals could interfere with aircraft during landing. And while aviation is generally considered very safe, takeoff and landing are among the more critical phases of a flight. So what is behind these reports? Mr. Flühr answered these and other questions in the podcast. At the beginning, he explained what avionics actually is and which basic ideas are used to design aircraft and their systems safely.

Further resources:

Either listen here, on Spotify or on the platform of your choice!

Andreas Gerstinger is a system safety expert with experience in various safety-critical domains, primarily in the air traffic control and railway industries. He works at Frequentis and is also a lecturer at the UAS Campus Vienna and the UAS Technikum Vienna.

In this episode, we talked about the Boeing 737 MAX crashes of October 2018 and March 2019, which killed more than 300 people in total. A system that was supposed to stabilize the aircraft's flight attitude (MCAS) was identified as the cause. We discussed in detail the circumstances that led to the flawed design of this system and ultimately to a failure of system safety.

Here are the documents and sources of additional information addressed in the podcast:

The papers mentioned in the episode can be found here:

Either listen here, on Spotify or on the platform of your choice!

Dr Siddartha Khastgir is the Head of Verification & Validation of collaborative autonomous vehicles (CAV) at WMG, University of Warwick, UK. His research areas in the CAV domain include test scenario generation, safety, simulation-based testing, and Safe AI, among many others. He has received numerous national and international awards for his research contributions, including the prestigious UKRI Future Leaders Fellowship, a seven-year fellowship focused on the safety evaluation of CAVs, and is a Forbes 30 Under 30 Europe list maker. He is also the project leader for the ASAM standardisation project OpenODD and an active participant in ASAM, SAE, ISO and UNECE discussions.

In this episode, we talked about the verification and validation of autonomous vehicles. This includes the advantages and challenges of simulations and how one research question raises several more. We also talked about low-speed autonomous driving and about the new standard ISO 22737 “Low-Speed Automated Driving (LSAD) systems”. He was the lead author of that standard, as well as of ISO 34503 “Taxonomy for ODD”, where ODD stands for operational design domain.

Further resources:

  • BSI PAS 1883 - The publicly available standard on how to define an ODD can be found here
  • ISO 22737:2021 - The new standard on low-speed autonomous vehicles can be found here
  • More on openODD can be found here
  • Check out Siddartha's website

Either listen here, on Spotify or on the platform of your choice!

Michael Schmid is a Technology Architect and Loss Prevention Specialist in the field of autonomous systems. His research focuses on preventing losses related to the use of Artificial Intelligence (AI) and making AI safe for use in everyday technology.

Previously, Michael worked on automation features in cars and self-driving software, and he developed a certification approach for automated vehicles. Michael has a Master's degree from the Massachusetts Institute of Technology (MIT) and is currently a PhD candidate in the Group for System Safety and Cybersecurity at MIT.

In this episode, Michael provided some insights into his research and explained why we need a systems approach to solve many of today's problems in technology. As an example, Michael and I discussed some of the challenges of autonomous cars, and he outlined a systems-based approach for their certification. Michael gave a quick overview of his current research on making AI-based technology safe and described some of his main ideas. STAMP, a new accident causality model developed by Nancy Leveson, Michael's supervisor at MIT, serves as the basis for his approach.

Additional sources of information:

  • To learn more about Michael, his projects and current work, or to download his Master's thesis on the certification of automated vehicles, visit his webpage: michael.systems
  • For info about STAMP and the next STAMP workshop go to: PSAS website

Either listen here, on Spotify or on the platform of your choice!