Coordinate Systems for Scenario Generation

Scenario generation is the process of building virtual scenarios from real-world vehicle data recorded by sensors such as GPS, IMU, camera, and lidar. The Scenario Builder for Automated Driving Toolbox™ support package enables you to create an accurate digital twin of a real-world scenario by fusing recorded data from multiple sensors. To generate virtual scenarios using Scenario Builder for Automated Driving Toolbox support package workflows, you must represent the recorded sensor data using one of these coordinate systems:

  • World — A fixed universal coordinate system in which all vehicles and their sensors are placed.

  • Vehicle — A coordinate system anchored to the ego vehicle. Typically, the origin of the vehicle coordinate system is on the ground, directly below the midpoint of the rear axle.

  • Sensor — A coordinate system specific to each sensor:

    • IMU — A local coordinate system that represents data in the ENU (east-north-up) or NED (north-east-down) frame.

    • GPS — A geographic coordinate system represented in latitude, longitude, and altitude.

    • Camera — A local coordinate system whose origin is located at the optical center of the camera.

    • Lidar — A local coordinate system whose origin is located at the center of the lidar sensor.

World Coordinate System

The world coordinate system is a fixed universal frame of reference for all the vehicles, sensors, and objects in a scene. In a multisensor system, each sensor captures data in its own coordinate system. You can use the world coordinate system as a reference to transform data from different sensors into a single coordinate system.

The Scenario Builder for Automated Driving Toolbox support package uses the right-handed Cartesian world coordinate system defined in ISO 8855, where the z-axis points up from the ground. Units are in meters.

Vehicle Coordinate System

The vehicle coordinate system is a human-defined coordinate system fixed at a point on the ego vehicle. You can position the origin of the vehicle coordinate system at different locations on the ego vehicle, such as the center of the rear axle. The vehicle coordinate system in the Scenario Builder for Automated Driving Toolbox support package follows the coordinate system used by Automated Driving Toolbox, where the x-axis is positive in the direction of ego vehicle movement, the y-axis is positive to the left relative to ego vehicle movement, and the z-axis points up from the ground. For more information, see Vehicle Coordinate System.
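For example, this minimal sketch (with hypothetical ego pose values) transforms a point from the vehicle coordinate system to the world coordinate system by rotating it by the ego yaw angle and translating it by the ego position:

    % Ego pose in the world frame (hypothetical values). Position is in
    % meters; yaw is measured counterclockwise from the world x-axis.
    egoPosition = [10 5 0];
    egoYaw = 30; % degrees

    % A point 2 m ahead of the vehicle origin, in vehicle coordinates.
    pointVehicle = [2 0 0];

    % Rotate about the z-axis by the ego yaw, then translate.
    R = [cosd(egoYaw) -sind(egoYaw) 0; sind(egoYaw) cosd(egoYaw) 0; 0 0 1];
    pointWorld = (R*pointVehicle')' + egoPosition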

Sensor Coordinate System

An automated driving system can contain sensors located anywhere on or in the vehicle. The mounting position of each sensor on the ego vehicle defines the origin of that sensor's coordinate system.

This figure shows an example placement and orientation of these sensors on an ego vehicle. However, an ego vehicle can contain sensors at positions and orientations different from those shown in the figure.

You can use data from one or more of these sensors, represented in their respective sensor coordinate systems, to generate virtual scenarios. To use sensor data in Scenario Builder for Automated Driving Toolbox workflows, you must first transform the data from the sensor coordinate system to the vehicle coordinate system.
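As a minimal sketch, assuming a hypothetical sensor mounted 1.5 m forward of the vehicle origin and 1.8 m above the ground, you can perform this transformation by using a rigidtform3d object:

    % Sensor-to-vehicle transformation: zero rotation (ZYX Euler angles,
    % in degrees) and a translation to the mounting position, in meters.
    sensorToVehicle = rigidtform3d([0 0 0],[1.5 0 1.8]);

    % Example points in the sensor coordinate system, in meters.
    ptsSensor = [5 1 -1.5; 10 -2 -1.6];

    % Transform the points into the vehicle coordinate system.
    ptsVehicle = transformPointsForward(sensorToVehicle,ptsSensor)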

IMU

The IMU sensor measures the position, orientation, acceleration, and angular velocity of the ego vehicle along its x-, y-, and z-axes in local coordinates in the ENU or NED frame.

To use IMU sensor data in Scenario Builder for Automated Driving Toolbox workflows, your data must be in the ENU coordinate frame. If your data is in the NED frame, you must first convert it to ENU. For more information on how to convert IMU data from the NED frame to the ENU frame, see the Ego Vehicle Localization Using GPS and IMU Fusion for Scenario Generation example. For more information on the ENU and NED local reference frames, see Orientation, Position, and Coordinate Convention (Navigation Toolbox).
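For position data, the conversion amounts to swapping the first two axes and negating the third. This minimal sketch, using hypothetical IMU position readings, shows the idea:

    % Hypothetical positions in the NED frame: [north east down], in meters.
    posNED = [100 50 -2; 101 50.5 -2.1];

    % Reorder to the ENU frame: [east north up].
    posENU = [posNED(:,2) posNED(:,1) -posNED(:,3)]

Converting orientation data requires a corresponding change to the rotation convention; the linked example shows the full procedure.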

GPS

The GPS sensor measures the position of the ego vehicle in the geographic coordinates of latitude, longitude, and altitude.

To use recorded GPS data in Scenario Builder for Automated Driving Toolbox workflows, you can represent it using the GPSData object. The GPSData object stores data in the geographic coordinate system.
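For example, this minimal sketch creates a GPSData object from hypothetical recorded readings. The name-value arguments shown here are an assumption; see the GPSData reference page for the exact signature.

    % Hypothetical GPS readings.
    ts = (0:0.1:0.3)'; % timestamps, in seconds
    lat = [42.2999; 42.3000; 42.3001; 42.3002]; % latitude, in degrees
    lon = [-71.3500; -71.3501; -71.3502; -71.3503]; % longitude, in degrees
    alt = [20.1; 20.1; 20.2; 20.2]; % altitude, in meters

    % Store the readings in a GPSData object (argument names assumed).
    gpsData = GPSData(TimeStamp=ts,Latitude=lat,Longitude=lon,Altitude=alt);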

You can also store trajectories created from GPS or other sensor data by using the Trajectory object. A Trajectory object represents the trajectory in the world coordinate system, in the ENU frame of reference.
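To express GPS readings in the world coordinate system, you can convert geographic coordinates to local ENU coordinates by using the latlon2local function. This minimal sketch, with hypothetical readings, uses the first reading as the world origin:

    % Hypothetical geographic coordinates.
    lat = [42.2999; 42.3000; 42.3001]; % degrees
    lon = [-71.3500; -71.3501; -71.3502]; % degrees
    alt = [20.1; 20.1; 20.2]; % meters

    % Use the first reading as the origin of the local ENU world frame.
    origin = [lat(1) lon(1) alt(1)];
    [xEast,yNorth,zUp] = latlon2local(lat,lon,alt,origin);
    waypoints = [xEast yNorth zUp] % trajectory waypoints, in meters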

Camera

A camera captures ego-centric on-road images in the camera coordinate system.

The origin of the camera coordinate system is located at the optical center of the camera. For more information, see Sensor Coordinate System.

Spatial coordinates enable you to specify a location in an image with greater granularity than pixel coordinates. For more information on the spatial coordinate system, see Spatial Coordinates.

The image coordinate system represents image locations in pixel units along the x- and y-axes. For more information, see Image Coordinate Systems.
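To relate the camera coordinate system to pixel coordinates, you can project a 3-D point onto the image by using the camera intrinsics. This minimal sketch assumes hypothetical intrinsic parameters:

    % Hypothetical intrinsics: focal lengths and principal point in pixels,
    % for a 1280-by-720 image ([rows columns]).
    intrinsics = cameraIntrinsics([800 800],[640 360],[720 1280]);

    % A point in the camera coordinate system (x right, y down, z forward),
    % in meters. An identity transform maps camera coordinates to themselves.
    pointCamera = [1 0.5 10];
    pixel = world2img(pointCamera,rigidtform3d,intrinsics)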

To use recorded camera data in Scenario Builder for Automated Driving Toolbox workflows, represent the data by using the CameraData object. You can store camera parameters such as intrinsics, extrinsics, and transformations in the CameraData object.

Images collected from cameras onboard the ego vehicle contain actor information, which you can use to extract actor tracks. You can store extracted actor tracks by using the ActorTrackData object. You can visualize the stored actor tracks in the vehicle coordinate system in a bird's-eye plot by using the plot and play object functions.

Lidar

The lidar sensor measures the distances of objects from the sensor and collects this information as a point cloud of 3-D points. These points are represented in the local coordinate system of the sensor, whose origin is located at the center of the sensor. For more information, see Sensor Coordinate System (Lidar Toolbox).

To use recorded lidar data in Scenario Builder for Automated Driving Toolbox workflows, represent the data using the LidarData object. You can store the lidar parameters such as sensor characteristics, mounting information, and transformations in the LidarData object.
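The mounting information defines the sensor-to-vehicle transformation for the point cloud. As a minimal sketch with a hypothetical mounting pose, you can apply such a transformation by using the pctransform function:

    % Hypothetical lidar mounting: rotated 2 degrees about the z-axis
    % (ZYX Euler angles, in degrees), mounted 1.2 m forward of and 1.9 m
    % above the vehicle origin.
    mountTform = rigidtform3d([2 0 0],[1.2 0 1.9]);

    % Point cloud in the lidar sensor frame (coordinates in meters).
    ptCloud = pointCloud([5 1 -1.8; 10 -2 -1.9; 7 0 -1.7]);

    % Transform the point cloud into the vehicle coordinate system.
    ptCloudVehicle = pctransform(ptCloud,mountTform);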

Point clouds collected from lidar sensors onboard the ego vehicle contain actor information, which you can use to extract actor tracks. You can store extracted actor tracks by using the ActorTrackData object. You can visualize the stored actor tracks in the vehicle coordinate system in a bird's-eye plot by using the plot and play object functions.
