Test and Validation of Multi-Sensor Systems

Interview with Dr. Raphael Pfeffer and Prasanna Kannan, IPG Automotive GmbH

We spoke to Raphael Pfeffer, manager of the “Innovation Management” team, and to Prasanna Kannan, senior engineer from the “Test Systems & Engineering” team, about a new modular test platform that enables virtual testing and validation of radar sensors. As a result, advanced driver assistance systems (ADAS) and highly automated driving functions can be validated at a faster pace.

The number of sensors and the amount of data that are processed inside the decision-making units of ADAS-enabled vehicles are constantly rising. What are the consequences of this development?

Pfeffer: The need for testing and the validation effort are increasing for users and, accordingly, for system developers. This increase is caused by the numerous interactions between the systems installed in the vehicle. In addition, many different existing technologies have to work together smoothly. Because of the high number of embedded systems that have to be considered, the vehicle is also referred to as a “system of systems”. When we then look at future level 3+ systems, that is, automated driving functions in which the driver no longer serves as a fallback level, the need for testing rises even further. The magnitude of required tests will go far beyond the leap we already made with “conventional” advanced driver assistance systems.


In cooperation with Keysight Technologies and Nordsys, IPG Automotive is designing and developing a modular test platform that speeds up the validation of ADAS and highly automated driving functions. Why is that so important?

Pfeffer: A vehicle contains a variety of sensors and sensor technologies, such as camera, radar, lidar or ultrasound. And vehicles differ: there are many distinct configurations and combinations, which is why the test system has to model them flexibly. In addition, the focus is on the simulation environment, which is why we need a realistic model of the surroundings. All characteristics of the sensors, such as the camera resolution, have to be modeled in sufficient detail. Only the interplay of these components, meaning a realistic simulation environment combined with the ability to model every sensor in the test system, makes this modular test platform and the creation of representative tests for highly automated driving functions possible.
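
To make this concrete, here is a minimal Python sketch of how such differing sensor setups might be captured parametrically. All class names and parameters are illustrative assumptions, not CarMaker's actual configuration format.

    # Illustrative parametric description of a vehicle's sensor set.
    from dataclasses import dataclass, field

    @dataclass
    class CameraConfig:
        resolution: tuple = (1920, 1080)  # pixels, matching the real sensor
        fov_deg: float = 120.0            # horizontal field of view
        frame_rate_hz: float = 30.0

    @dataclass
    class RadarConfig:
        max_range_m: float = 250.0        # maximum detection range
        fov_deg: float = 18.0
        update_rate_hz: float = 20.0

    @dataclass
    class VehicleSensorSetup:
        # One concrete configuration out of many possible combinations.
        cameras: list = field(default_factory=list)
        radars: list = field(default_factory=list)

    # Two differently equipped vehicles share the same modeling scheme.
    compact = VehicleSensorSetup(cameras=[CameraConfig()],
                                 radars=[RadarConfig()])
    premium = VehicleSensorSetup(
        cameras=[CameraConfig(), CameraConfig(fov_deg=60.0)],
        radars=[RadarConfig(), RadarConfig(max_range_m=80.0, fov_deg=90.0)])

The point of such a scheme is that one test system can cover many vehicle variants by swapping parameter sets rather than rebuilding the test setup.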

What can customers achieve with this system that they can’t do now?

Kannan: The open integration and test platform CarMaker allows them to test algorithms for autonomous driving functions at different levels of maturity. We can support users from the very beginning: when the algorithms are still at the prototyping or software stage, we can run model-in-the-loop or software-in-the-loop tests, for example. And since the system is continuously developed further until it reaches series production status, we can also test ECUs in our environment, for example with “over-the-air” (OTA) or “injection” techniques.
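
As an illustration of the software-in-the-loop idea, the following self-contained Python sketch closes the loop between a toy braking algorithm and a simple simulated scenario. The function names are hypothetical and do not represent CarMaker's API.

    # Toy ADAS function under test: brake if time-to-collision < 2 s.
    def emergency_brake(distance_m, ego_speed_mps):
        ttc = distance_m / ego_speed_mps if ego_speed_mps > 0 else float("inf")
        return ttc < 2.0

    # Minimal closed-loop scenario: the ego vehicle approaches a stationary
    # obstacle, and the algorithm's decision feeds back into the simulation.
    def run_sil_test(gap_m=60.0, speed_mps=20.0, dt=0.05):
        for _ in range(int(10.0 / dt)):        # simulate up to 10 seconds
            if emergency_brake(gap_m, speed_mps):
                speed_mps = max(0.0, speed_mps - 8.0 * dt)  # ~0.8 g braking
            gap_m -= speed_mps * dt
            if gap_m <= 0.0:
                return False                   # collision: test failed
            if speed_mps == 0.0:
                return True                    # stopped in time: test passed
        return True

    assert run_sil_test(), "vehicle should stop before the obstacle"

In a real MIL/SIL setup the toy scenario would be replaced by the full vehicle and environment simulation, but the structure of stimulus, algorithm decision, feedback and pass/fail criterion stays the same.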

Pfeffer: In addition, the concept of the Autonomous Drive Emulation (ADE) platform enables OTA testing for all modeled components in the test environment. This applies to sensors operating within the line of sight, such as radar, camera and lidar sensors, but also to V2X systems that do not require a free line of sight. We are therefore able to test the entire event chain of the sensors, both individually and within their network, and no compromises need to be made in the modeling.


Can you describe the structure of the platform architecture?

Kannan: Our ADE architecture handles sensor emulation in real time, using CarMaker sensor models running on the Xpack4 hardware platform. The camera is emulated via the Video Interface Box X, which emulates the image sensor, while the radar sensor is simulated with an OTA simulator.
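
The data flow Kannan describes can be pictured roughly as follows. This Python sketch uses invented stand-in classes for the image-injection and OTA interfaces, purely to show the per-step structure.

    # Per real-time step: the environment model yields ground truth, each
    # sensor model renders it, and the result is handed to the matching
    # emulator front end. All interfaces here are hypothetical stand-ins.
    class VideoInterface:
        # Stand-in for an image-injection box feeding the camera ECU.
        def inject_frame(self, frame):
            print(f"injecting {len(frame[0])}x{len(frame)} camera frame")

    class RadarTargetSimulator:
        # Stand-in for an over-the-air radar target simulator.
        def emit_targets(self, targets):
            for rng, vel, az in targets:
                print(f"radar target: {rng} m, {vel} m/s, {az} deg")

    def simulation_step(video_if, radar_sim):
        frame = [[0] * 1920 for _ in range(1080)]   # rendered camera image
        targets = [(55.0, -3.2, 1.5), (120.0, 0.0, -4.0)]  # range, speed, azimuth
        video_if.inject_frame(frame)
        radar_sim.emit_targets(targets)

    simulation_step(VideoInterface(), RadarTargetSimulator())

The essential property is that the ECUs under test receive physically plausible stimuli at their native interfaces, while all stimuli are generated from one consistent simulated environment.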

...
