Archive Newsletter 2016

Vehicle-in-the-Loop

Immersing Oneself in the Virtual World with Augmented Reality

Reproducible and cost-efficient testing of advanced driver assistance systems and automated driving functions in a real vehicle on the test track is a challenge that can be met with the vehicle-in-the-loop (VIL) method established by IPG Automotive. A real test vehicle driven on an open area is embedded into the virtual environment of CarMaker, including traffic, pedestrians, cyclists, road signs and lane markings. Systems based on ultrasonic, radar, LiDAR or camera technology can be tested. Virtual sensors collect information on traffic objects and transmit it to the electronic control units of the advanced driver assistance systems under test. The test vehicle reacts as specified by the system, e.g. with real braking.
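How such a virtual sensor might hand its detections to a control unit can be pictured with the following minimal sketch. It is illustrative only, not CarMaker's actual interface: the class names, fields and the simple time-to-collision rule are all assumptions made for this example.

import math
from dataclasses import dataclass

@dataclass
class TrafficObject:
    obj_id: int
    x: float      # position relative to the ego vehicle [m]
    y: float
    speed: float  # absolute speed [m/s]

class VirtualObjectSensor:
    """Hypothetical object-list sensor: reports the traffic objects
    that lie within its range and horizontal field of view."""
    def __init__(self, max_range=150.0, fov_deg=60.0):
        self.max_range = max_range
        self.half_fov = math.radians(fov_deg) / 2.0

    def detect(self, objects):
        detections = []
        for obj in objects:
            dist = math.hypot(obj.x, obj.y)
            bearing = math.atan2(obj.y, obj.x)
            if dist <= self.max_range and abs(bearing) <= self.half_fov:
                detections.append(obj)
        return detections

class AebEcuStub:
    """Stand-in for the ADAS control unit: requests braking when the
    time-to-collision with a detected object falls below a threshold."""
    def __init__(self, ttc_threshold=1.5):
        self.ttc_threshold = ttc_threshold

    def update(self, ego_speed, detections):
        for obj in detections:
            closing_speed = ego_speed - obj.speed
            if closing_speed > 0 and obj.x / closing_speed < self.ttc_threshold:
                return "BRAKE"   # in VIL, the real vehicle actuates its brakes
        return "NO_ACTION"

# One simulation step: a virtual detection triggers a real reaction.
sensor = VirtualObjectSensor()
ecu = AebEcuStub()
traffic = [TrafficObject(1, x=30.0, y=0.5, speed=0.0)]
print(ecu.update(ego_speed=25.0, detections=sensor.detect(traffic)))  # -> BRAKE

The essential point is the closed loop: the detections are virtual, but the resulting braking command is executed by the actual vehicle on the test track.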

This method allows for testing of diverse traffic scenarios, from parking maneuvers and turns at a busy intersection to emergency braking on the highway. To cover the widest possible range of scenarios, they can be easily adapted in virtual test driving, and components such as pedestrians, cyclists or parked cars can be added as needed. Professional test drivers as well as other test participants are thus able to experience the functions of the driver assistance system in a risk-free test environment. In addition, those involved in the development can experience the operability of novel systems as early as the concept phase.
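As a rough illustration of how such scenario variants might be generated programmatically, the sketch below sweeps the ego speed of a base scenario and adds optional traffic participants. The dictionary layout and all names are assumptions for illustration; real tools such as CarMaker use their own scenario formats.

import itertools

# Hypothetical scenario description; not an actual scenario file format.
base_scenario = {
    "name": "emergency_braking_highway",
    "ego_speed_kmh": 100,
    "participants": ["lead_vehicle"],
}

ego_speeds = [80, 100, 130]  # km/h
extra_participants = [[], ["cyclist"], ["pedestrian", "parked_car"]]

variants = []
for speed, extras in itertools.product(ego_speeds, extra_participants):
    scenario = dict(base_scenario)
    scenario["ego_speed_kmh"] = speed
    scenario["participants"] = base_scenario["participants"] + extras
    scenario["name"] = f"{base_scenario['name']}_{speed}kmh_{len(extras)}extra"
    variants.append(scenario)

# Nine variants from one base scenario, each ready for a virtual test drive.
for v in variants:
    print(v["name"], "->", v["participants"])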

The feeling of being in the middle of things – also called immersion – is key to an experience that is as realistic as possible. IPG Automotive achieves this by using augmented reality (AR) glasses with so-called optical see-through technology. These glasses are semitransparent, which gives the driver a direct view of the real-world environment, while virtual traffic objects are superimposed on the real-world visuals.

A head-tracking system ensures that the visuals on the display follow the movements of the driver's head. The system is mounted on the dashboard and detects the position and orientation of the head (the head pose) by tracking markers on the AR glasses with two cameras. The head pose is sent to the visualization tool IPGMovie in CarMaker, and the virtual camera in IPGMovie is oriented to match the driver's viewing direction. All relevant traffic objects are then displayed on the AR glasses from this perspective. To achieve a high degree of immersion, the images must be rendered precisely and with minimal delay.
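The tracking loop can be outlined schematically as follows. This is not IPG's implementation: the helper names read_head_pose and set_camera_pose are assumptions standing in for the marker-tracking system and the IPGMovie interface, and the 50 Hz update rate is likewise assumed. The point is that each frame applies the measured head pose to the virtual camera before rendering, and the per-frame latency budget must stay small for convincing immersion.

import time
from dataclasses import dataclass

@dataclass
class Pose:
    """Head pose: position in meters, orientation in radians."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

def read_head_pose() -> Pose:
    # Hypothetical stand-in for the marker tracker: a real system would
    # triangulate the glasses' markers from the two camera images.
    return Pose(0.0, 0.0, 1.2, 0.0, 0.0, 0.0)

def set_camera_pose(pose: Pose) -> None:
    # Hypothetical stand-in for the visualization interface: aligns the
    # virtual camera with the driver's viewing direction before rendering.
    pass

FRAME_PERIOD = 1.0 / 50.0  # assumed 50 Hz update rate, i.e. 20 ms per frame

for _ in range(500):  # run for roughly ten seconds
    start = time.perf_counter()
    set_camera_pose(read_head_pose())  # virtual objects render from this view
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_PERIOD:
        print(f"frame over budget: {elapsed * 1000:.1f} ms")  # immersion suffers
    time.sleep(max(0.0, FRAME_PERIOD - elapsed))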

Tests conducted in such a mixed-reality environment are especially suited for examining the human-machine interface of new advanced driver assistance systems. How long does it take the driver to get acquainted with the new system? Does the handover of the driving task between the driver and the assistance system work? Does the driver recognize the limits of the system? Thanks to the VIL method, these questions can be answered in early development phases without risking personal injury or material damage.
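One of these questions, whether the handover of the driving task works, can be made measurable. The sketch below is an assumed evaluation helper, not part of any IPG tool: given a hypothetical log of timestamped events from a VIL run, it computes the driver's reaction time between each takeover request and the first subsequent driver input.

# Hypothetical event log from a VIL run: (timestamp in seconds, event name).
events = [
    (12.0, "takeover_request"),      # system asks the driver to take over
    (13.4, "driver_steering_input"),
    (13.6, "driver_brake_input"),
    (25.0, "takeover_request"),
    (27.1, "driver_brake_input"),
]

def takeover_reaction_times(log):
    """Reaction time = first driver input after each takeover request."""
    times, request_ts = [], None
    for ts, name in log:
        if name == "takeover_request":
            request_ts = ts
        elif name.startswith("driver_") and request_ts is not None:
            times.append(round(ts - request_ts, 2))
            request_ts = None  # only the first reaction counts
    return times

print(takeover_reaction_times(events))  # -> [1.4, 2.1]

Comparing such reaction times across drivers and scenario variants gives an early, quantitative view of whether the handover concept works before any prototype reaches public roads.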