Virtual testing of ADAS

Improved validation thanks to comprehensive sensor technologies 

Whether brake assist, collision avoidance systems, cruise control or park assist functions – we can no longer imagine our cars without advanced driver assistance systems (ADAS). In automated mode, they are able to influence the vehicle's acceleration, braking and steering. With visual, acoustic or haptic alerts, they support and relieve drivers before and during critical traffic situations, improve the safety of occupants and increase their comfort. Requirements for assistance systems will continue to rise, since they form the foundation for the next levels on the way to autonomous driving.


Chart showing the different automated driving levels
Illustration of the environmental detection of various sensors

Achieving complete environment recognition from the interplay of sensors 

Current ADAS capture the vehicle's environment, including all relevant objects. This is achieved by using a variety of sensors and by analyzing and combining the often complementary data they deliver. Camera, lidar (light detection and ranging), radar (radio detection and ranging) and ultrasonic systems are typically used in the following ways:

  • Camera systems recognize traffic signs, lane markings, people and other road users.
  • Ultrasonic systems detect objects in the vehicle’s closer surroundings.
  • Radar and lidar systems detect objects at long range.
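Combining these complementary strengths is the core idea behind sensor fusion: cameras classify objects well but estimate distance poorly, while radar measures range accurately but classifies poorly. As a minimal illustrative sketch (not CarMaker's API; the detection types and the nearest-angle association are assumptions for this example), fusing the two yields labeled objects with distances:

```python
from dataclasses import dataclass

# Hypothetical detection types for illustration; real ADAS stacks use
# far richer object models and association algorithms.
@dataclass
class CameraDetection:
    azimuth_deg: float   # bearing of the object relative to the vehicle
    label: str           # e.g. "pedestrian", "car", "traffic_sign"

@dataclass
class RadarDetection:
    azimuth_deg: float
    range_m: float       # radar provides an accurate distance measurement

def fuse(camera_dets, radar_dets, max_angle_diff_deg=2.0):
    """Toy fusion: pair each camera label with the radar range
    detected at a similar bearing (nearest-angle association)."""
    fused = []
    for cam in camera_dets:
        best = min(radar_dets,
                   key=lambda r: abs(r.azimuth_deg - cam.azimuth_deg),
                   default=None)
        if best and abs(best.azimuth_deg - cam.azimuth_deg) <= max_angle_diff_deg:
            fused.append((cam.label, best.range_m))
    return fused

cams = [CameraDetection(0.5, "pedestrian"), CameraDetection(-10.0, "car")]
radars = [RadarDetection(0.7, 35.0), RadarDetection(-9.8, 80.0)]
print(fuse(cams, radars))  # [('pedestrian', 35.0), ('car', 80.0)]
```

Production systems replace this nearest-angle matching with probabilistic tracking and filtering, but the principle is the same: each sensor contributes the quantity it measures best.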


Lidar systems emit laser pulses and determine the distance to stationary or moving objects in the vehicle's surroundings by measuring the time of flight of the backscattered light.
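The time-of-flight principle reduces to a simple formula: the pulse travels to the object and back, so the distance is half the round-trip path, d = c · t / 2. A minimal sketch of that calculation:

```python
# Speed of light (m/s); in air the value differs only negligibly.
SPEED_OF_LIGHT = 299_792_458.0

def lidar_distance(time_of_flight_s: float) -> float:
    """Distance to a target from the round-trip time of a laser pulse.

    The pulse covers the path to the object and back, so the one-way
    distance is half the total: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * time_of_flight_s / 2.0

# A pulse returning after about 667 nanoseconds corresponds to
# a target roughly 100 m away.
print(lidar_distance(667e-9))
```

This also shows why lidar timing electronics must be extremely precise: at these speeds, a timing error of a single nanosecond shifts the measured distance by about 15 cm.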

Testing tomorrow’s complex scenarios for autonomous vehicles with virtual test driving

The development process of ADAS involves a massive testing effort, since the required sensor functionalities need to be verified in all conceivable scenarios. Validation using virtual tests is therefore assuming an ever-greater role. Virtual test driving enables a variety of systems and components to be tested and optimized within the full vehicle early on.

With the CarMaker product family and the implementation of models for lidar systems, users have access to all sensor technologies in one simulation solution. Real-time capable virtual modeling and individual adaptation via a large number of parameters are possible for radar, camera, ultrasonic and lidar systems. In this way, sources of error can be identified and eliminated as early as the development stage, regardless of whether the tests focus on object detection or on the resulting decisions.

Specifications of lidar, radar, ultrasonic and camera sensors

At a glance

Advantages of virtual testing with the CarMaker product family and its models of all real sensor technologies:

  • Real-time capable testing of the models in interaction
  • Option of individual adjustments
  • Validation of all sensor functionalities
  • Early identification of sources of error in assistance systems

Find detailed information on the subject of lidar and the CarMaker product family here.

Contact our experts