Our toolchain produces the high-fidelity, semi-synthetic 3D imagery required for sensor simulation and object recognition.
Scenes are built dynamically at runtime from AAI or customer-provided map data, with procedural placement of 3D objects loaded from our asset library. Our assets carry physical material information and are optimized for realistic sensor simulation. We ensure that the drivability information of the original map material is preserved at scene creation for use within the simulated test runs. Editing options are available, for example to insert obstructions such as a construction site.
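To illustrate how an obstruction edit can preserve drivability information elsewhere in the scene, here is a minimal sketch. The `LaneSegment`/`Scene` classes and the `insert_obstruction` helper are hypothetical illustrations, not AAI's actual data model: the idea is that marking a construction site non-drivable splits the affected lane span while leaving the rest of the original drivability data intact.

```python
from dataclasses import dataclass

@dataclass
class LaneSegment:
    # Hypothetical lane span along a road, measured in meters.
    lane_id: str
    s_start: float
    s_end: float
    drivable: bool = True

@dataclass
class Scene:
    segments: list

    def insert_obstruction(self, lane_id, s_start, s_end):
        """Split any overlapping segment and mark only the overlap non-drivable."""
        updated = []
        for seg in self.segments:
            # Untouched lanes and non-overlapping spans pass through unchanged.
            if seg.lane_id != lane_id or seg.s_end <= s_start or seg.s_start >= s_end:
                updated.append(seg)
                continue
            # Keep the drivable parts before and after the obstruction.
            if seg.s_start < s_start:
                updated.append(LaneSegment(lane_id, seg.s_start, s_start, seg.drivable))
            updated.append(LaneSegment(lane_id, max(seg.s_start, s_start),
                                       min(seg.s_end, s_end), drivable=False))
            if seg.s_end > s_end:
                updated.append(LaneSegment(lane_id, s_end, seg.s_end, seg.drivable))
        self.segments = updated

scene = Scene([LaneSegment("lane-1", 0.0, 100.0)])
scene.insert_obstruction("lane-1", 40.0, 60.0)  # e.g. a construction site at 40-60 m
```

After the edit, the lane consists of two drivable spans and one blocked span in between, so downstream test runs still see correct drivability everywhere outside the construction site.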
The original map data source (OpenDRIVE®, HD maps, and more) is automatically converted into our AAI-internal data format ATLAS and checked for errors in the drivability information. After errors are corrected and, if desired, lane information is manually adjusted, the system automatically builds the 3D meshes and scenery from the available information.
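One common class of drivability error in converted map data is a dangling lane link, where a lane references a successor that does not exist. The check below is a simplified, hypothetical illustration of this kind of validation; the lane dictionary layout and `check_lane_links` function are assumptions for the sketch, not the ATLAS format.

```python
def check_lane_links(lanes):
    """Report lanes whose successor references point to non-existent lanes."""
    ids = {lane["id"] for lane in lanes}
    errors = []
    for lane in lanes:
        for succ in lane.get("successors", []):
            if succ not in ids:
                errors.append(f"lane {lane['id']}: missing successor {succ}")
    return errors

lanes = [
    {"id": "a", "successors": ["b"]},
    {"id": "b", "successors": ["c"]},  # "c" does not exist: a dead link
]
errors = check_lane_links(lanes)
```

A real pipeline would run many such checks (gaps, overlaps, inconsistent widths) before mesh construction is allowed to proceed.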
We have developed optimized, GPU-based algorithms for procedural placement of vegetation and surroundings, enabling high-performance scenery creation at runtime during simulation.
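A common approach to GPU-friendly procedural placement is to derive each placement deterministically from a hash of the grid cell and a seed, so that every cell can be evaluated independently (one GPU thread per cell) with no shared state. The sketch below shows this pattern on the CPU; the function names, density value, and asset count are illustrative assumptions, not AAI's implementation.

```python
import hashlib

def hash01(*ints):
    """Deterministic value in [0, 1) from integer inputs (stand-in for a GPU hash)."""
    digest = hashlib.blake2b(repr(ints).encode(), digest_size=8).digest()
    return int.from_bytes(digest, "big") / 2**64

def scatter_vegetation(cell_x, cell_y, seed, density=0.3, cell_size=2.0):
    """Return (x, y, rotation, asset_index) for this grid cell, or None.

    Each cell is independent, so this maps directly to a compute shader
    with one thread per cell and no synchronization.
    """
    if hash01(seed, cell_x, cell_y, 0) >= density:
        return None  # cell stays empty
    x = (cell_x + hash01(seed, cell_x, cell_y, 1)) * cell_size  # jittered position
    y = (cell_y + hash01(seed, cell_x, cell_y, 2)) * cell_size
    rotation = hash01(seed, cell_x, cell_y, 3) * 360.0          # degrees
    asset = int(hash01(seed, cell_x, cell_y, 4) * 5)            # pick 1 of 5 assets
    return (x, y, rotation, asset)
```

Because the output depends only on the cell coordinates and the seed, the same scenery is reproduced on every simulation run, which matters for repeatable test cases.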
To provide our customers with diversity and localization options, we are constantly extending our asset library.
Upon request, scenery created from imported maps can be exported in FBX format for detailed editing or alternative use.
Synthetic imagery, together with the corresponding object labels from our Ground Truth sensor, can also be exported for training object-recognition algorithms. Contact us for more information!
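To show what such a labeled export might look like, here is a minimal sketch of one annotation record pairing a rendered frame with 2D bounding-box labels. The field names, file name, and box format are illustrative assumptions; the actual Ground Truth sensor output format is not shown here.

```python
import json

# Hypothetical export record: one rendered frame plus its object labels.
frame = {
    "image": "frame_000123.png",
    "objects": [
        {"class": "car",        "bbox": [412, 230, 118, 64]},  # x, y, w, h in pixels
        {"class": "pedestrian", "bbox": [640, 245, 28, 70]},
    ],
}

def to_annotation_json(record):
    """Serialize one frame's labels for consumption by a training pipeline."""
    return json.dumps(record, indent=2)
```

Because the labels come from the simulator's own object state rather than human annotation, they are pixel-accurate and essentially free to produce at scale.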