How to simulate automated vehicle ECUs in the lab with artificial test data

Partner content

Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD) systems represent a major change for the automotive industry. Testing and validating the hardware and software systems is among the biggest challenges, as autonomous vehicles must be more reliable than a human driver. Human drivers primarily use their eyes to perceive the car’s surroundings. In contrast, most automotive suppliers rely on multiple sensors: camera, radar, lidar, and ultrasonic sensors.

To develop reliable and robust functions, it is crucial to validate the fusion and perception algorithms as well as the overall system. Various validation methods are available [1]. Test drives allow the entire autonomous vehicle to be validated, but they cover only a few critical situations and are very expensive compared with simulation.

Software-in-the-loop (SIL) and hardware-in-the-loop (HIL) simulations make it possible to test critical traffic scenarios for automated vehicles with an almost infinite number of parameter combinations, including weather conditions, lens effects, and error simulation for sensors.
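To make this concrete, the short Python sketch below enumerates such parameter combinations for a hypothetical cut-in scenario. The scenario name, parameter values, and the commented-out test-runner call are illustrative placeholders, not the API of any specific simulation tool.

from itertools import product

# Hypothetical parameter space for one traffic scenario
weather = ["clear", "rain", "fog"]
ego_speed_kph = [30, 50, 80]
sensor_fault = [None, "pixel_noise", "frame_drop"]

# Enumerate every combination for a SIL batch run
for w, speed, fault in product(weather, ego_speed_kph, sensor_fault):
    params = {"weather": w, "ego_speed_kph": speed, "sensor_fault": fault}
    # run_sil(scenario="cut_in", **params)   # placeholder test runner
    print(params)

Even this tiny grid yields 27 variants of a single maneuver; scaling it across scenarios is what makes simulation-based testing so much broader than test drives.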

In simulation, everything can be configured, from the vehicle and its sensors to the maneuvers, and even safety-critical scenarios can be run without risk. One of the greatest challenges, however, is calculating realistic sensor data in real time. For driver assistance systems, it is often sufficient to fall back on sensor-independent object lists based on ground truth data, which can be extracted easily and quickly from the traffic simulation. Autonomous vehicle architectures, in contrast, process the raw data collected by the sensor front ends in a central control unit [2]. Computing raw sensor data is much more complex, as it is based on the physical principles of the respective sensor, and the raw data formats differ considerably between sensor types.
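As an illustration of the simpler of these two representations, here is a minimal Python sketch of a sensor-independent object list taken from ground truth; the field names and units are assumptions made for the example, not the format of any particular tool.

from dataclasses import dataclass

@dataclass
class GroundTruthObject:
    object_id: int
    object_class: str   # e.g. "car", "pedestrian"
    x_m: float          # position relative to the ego vehicle, meters
    y_m: float
    vx_mps: float       # velocity, meters per second
    vy_mps: float

# One simulation step might yield a list like this, read directly
# from the traffic simulation's ground truth:
object_list = [
    GroundTruthObject(1, "car", 18.2, -0.4, -2.1, 0.0),
    GroundTruthObject(2, "pedestrian", 9.5, 3.1, 0.0, -1.2),
]

Because the simulator already knows every object's true state, producing such a list costs almost nothing, whereas rendering physically plausible camera pixels or radar returns for the same scene does not.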

HIL test benches make it possible to test real vehicle Electronic Control Units (ECUs) in the laboratory by stimulating them with recorded or artificial test data. The illustration below shows an example setup of open-/closed-loop HIL simulation and raw data injection for a front camera.

Figure: Example setup for open-/closed-loop HIL and raw data simulation

Here, the camera image sensor and lens are replaced by the HIL environment. The traffic and vehicle dynamics are simulated on a dSPACE real-time PC. In addition, the real-time PC is connected to the vehicle network (CAN, Ethernet, FlexRay, etc.) for bus simulation.
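As a sketch of the bus-simulation side, the snippet below uses the open-source python-can library with a virtual bus to show the basic pattern of feeding an ECU artificial frames; the message ID and payload encoding are made up for the example.

import can

# A virtual bus stands in for the HIL system's real CAN interface
bus = can.Bus(interface="virtual", channel="hil_demo")
msg = can.Message(
    arbitration_id=0x123,                 # made-up vehicle-speed frame
    data=[0x10, 0x27, 0, 0, 0, 0, 0, 0],  # e.g. 10000 raw = 100.00 km/h
    is_extended_id=False,
)
bus.send(msg)   # on a real HIL system this runs cyclically in real time
bus.shutdown()

In a real setup, the real-time PC generates such frames cyclically and deterministically so the ECU sees a plausible residual bus.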

The results of the traffic simulation are transferred to a powerful computer, which generates a three-dimensional representation of the environment. Based on this, the respective parameterized sensor models are calculated. These sensor models can be supplied by simulation and testing solution providers, such as dSPACE. Alternatively, sensor models from tier-1 suppliers can be integrated via the Open Simulation Interface (OSI). In addition, dSPACE supports other standards, such as OpenDRIVE for defining roads and OpenSCENARIO as a scenario description format.
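To give a feel for the OSI exchange format, the following sketch populates an OSI GroundTruth message with a single target vehicle, assuming the ASAM OSI Python protobuf bindings (package open-simulation-interface) are installed; the object values are arbitrary example data.

from osi3.osi_groundtruth_pb2 import GroundTruth

gt = GroundTruth()
gt.timestamp.seconds = 0
gt.timestamp.nanos = 0
obj = gt.moving_object.add()      # one simulated target vehicle
obj.id.value = 42
obj.base.position.x = 20.0        # 20 m ahead of the origin, meters
obj.base.position.y = 0.0
obj.base.dimension.length = 4.5   # typical passenger-car size, meters
obj.base.dimension.width = 1.8
obj.base.dimension.height = 1.5
payload = gt.SerializeToString()  # bytes handed to a sensor model

Because OSI defines these messages in a tool-neutral protobuf schema, a tier-1 supplier's sensor model and a simulation vendor's environment can exchange ground truth and sensor data without custom adapters.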

Raw sensor data is transferred to the dSPACE Environment Sensor Interface Unit via the GPU's DisplayPort interface. Further sensor model components are executed on this FPGA-based platform. The sensor data is then sent to the ECU via standard or proprietary interfaces like Maxim GMSL2, TI FPD-Link IV, MIPI A-PHY, or Ethernet.
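Dedicated hardware links such as GMSL2 or FPD-Link cannot be driven from a few lines of Python, but for Ethernet-based sensors the basic injection pattern can be sketched with a plain UDP socket; the endpoint address and payload here are purely illustrative.

import socket

frame_chunk = bytes(1024)   # placeholder for a chunk of raw pixel data
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(frame_chunk, ("192.168.1.50", 5005))  # hypothetical ECU endpoint
sock.close()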

Sensor simulation increases productivity when validating sensor-based ECUs in all phases of the development and test process.

[1] G. Sievers et al., “Driving Simulation Technologies for Sensor Simulation in SIL and HIL Environments,” Driving Simulation Conference Europe 2018 VR.

[2] O. Maschmann, “AI-in-the-Loop,” dSPACE Magazine 2/2019. https://www.dspace.com/en/inc/home/applicationfields/stories/zf-ai-in-the-loop.cfm

Dr. Gregor Sievers is a product manager and group lead in engineering services at dSPACE. He joins Thorsten Opperman, regional operations manager for the western region at dSPACE, at Sensors Converge in San Jose on September 21 to talk about “Autonomous Technologies: Twists and Turns in the Road Ahead.” For more information on the Sept. 21-23 Sensors Converge event and to register, go here.