Autonomous Driving AI Pulls the Emergency Brake

From Otto von Guericke University Magdeburg | Translated by AI

Computer scientists at Otto von Guericke University Magdeburg (Germany) have developed an AI process that recognizes weather and environmental disturbances in sensor data from autonomous vehicles and triggers a safe reaction such as stopping in case of doubt.

The autonomous shuttle of the AULA-KI project during a demonstration of sensor data with artificial fog. (Image: University of Magdeburg)

"In rain, snow, fog or dense vegetation, autonomous vehicles find themselves in a situation that many people are familiar with from everyday life: you can still see, but not precisely enough to make safe decisions," says project manager Dr. Christoph Steup from the Faculty of Computer Science at the University of Magdeburg. It is precisely this threshold that the AULA-KI project makes technically measurable, the computer scientist continues: "An autonomous vehicle needs to know when it can still trust its own sensors and when it cannot. It is precisely this ability of the AI that we have been able to significantly improve."

Uncertainties Must Be Recognized

Uncertain sensor data does not mean a complete lack of information, but rather that the available measured values are less reliable. Raindrops and snowflakes can appear as phantom obstacles in the interference signals; fog attenuates or scatters signals; dense vegetation blocks lines of sight like a drawn curtain. It is crucial for the safety of an autonomous vehicle that it recognizes this uncertainty before it reacts incorrectly.
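The article does not disclose the project's actual algorithm. As an illustration only, one common way to flag weather-induced lidar noise is to check each point's local neighbor density: raindrops and snowflakes tend to produce sparse, isolated returns, while real obstacles produce dense clusters. A minimal sketch under that assumption (function name, radius and threshold are illustrative, not from the project):

```python
import numpy as np

def flag_sparse_returns(points, radius=0.5, min_neighbors=3):
    """Mark lidar points with few nearby neighbors as probable
    weather noise (rain/snow returns are typically isolated).

    points: (N, 3) array of x, y, z coordinates in meters.
    Returns a boolean mask: True = likely noise.
    """
    n = len(points)
    noise = np.zeros(n, dtype=bool)
    for i in range(n):
        # Count points within `radius` of point i, excluding the point itself.
        dists = np.linalg.norm(points - points[i], axis=1)
        neighbors = np.count_nonzero(dists < radius) - 1
        noise[i] = neighbors < min_neighbors
    return noise

# A dense cluster (a real obstacle) versus one isolated return (a "drop").
cloud = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                  [0.0, 0.1, 0.0], [0.1, 0.1, 0.0],
                  [5.0, 5.0, 2.0]])
mask = flag_sparse_returns(cloud)
```

The fraction of points flagged this way can also serve as a simple self-assessment signal: the higher it is, the less the vehicle should trust the current scan.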

First Tests With Autonomous Shuttle

The concepts developed have already been partially implemented in software and tested on a second-generation autonomous EasyMile EZ10 shuttle on the Galileo test field at the University of Magdeburg in the Port of Science. "The shuttle measures approx. 6.5 × 6.5 × 13 feet (about 2 × 2 × 4 meters), can carry up to six people and is equipped with a large number of sensors, including eight lidars and two cameras," explains Dr. Steup. Fog was detected very reliably, even though full compensation of the interference was only possible to a limited extent. Disturbances from rain and snow could not only be detected but also partially compensated for. Vegetation was likewise reliably detected as an influence, but in places proved so dense during testing that safe driving was no longer possible.

Autonomous driving will only gain widespread acceptance if it remains reliably safe even under challenging conditions. We provide a key building block for this by combining machine vision with machine self-assessment in our AI.

Dr. Christoph Steup, University Magdeburg

AI: When in Doubt, Better Be Careful

In the project, the main reaction strategy implemented in practice was a controlled stop in the event of severe disturbance. "In heavy rain or snowfall, we were not always able to fully correct the sensor data," explains the computer scientist. "But the AI can recognize very reliably that the data is disturbed, and that is crucial for safety." Under extreme conditions, the disturbance detection was in some cases so consistent that the shuttle stopped out of caution. "When in doubt, the system preferred to be too cautious rather than too risky," says the scientist. From a scientific point of view, this is a strong result, but at the same time a clear mandate for further development before regular operation.
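The "when in doubt, be careful" policy described above can be pictured as a simple threshold mapping from a disturbance score to a driving reaction. The scores, thresholds and reaction names below are illustrative assumptions, not the project's published policy; the only property taken from the article is that high uncertainty always ends in a controlled stop:

```python
def choose_reaction(disturbance_score, stop_threshold=0.8, slow_threshold=0.4):
    """Map a sensor self-assessment score in [0, 1] to a driving reaction.

    Thresholds are illustrative. The key safety property is monotone
    caution: the worse the sensors are rated, the more conservative
    the reaction, ending in a controlled stop.
    """
    if disturbance_score >= stop_threshold:
        return "controlled_stop"   # e.g. heavy snowfall, artificial fog
    if disturbance_score >= slow_threshold:
        return "reduce_speed"      # degraded but still usable data
    return "continue"              # sensors trustworthy

# Reactions for a clear, a moderately disturbed, and a severely disturbed scan.
reactions = [choose_reaction(s) for s in (0.1, 0.5, 0.9)]
```

Because the stop branch is checked first, any score at or above the stop threshold always halts the vehicle, mirroring the shuttle's observed behavior of stopping out of caution.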

About the Project

At the end of the project, the results were presented with partners from research and industry, including Götting KG, ITPower and the Fraunhofer Institute for Transportation and Infrastructure Systems IVI in Dresden, Germany.
The AULA-KI research project was funded for three years by the German Federal Ministry of Research, Technology and Space (BMFTR, formerly BMBF) with 1.03 million US dollars.
