Sensor Calibration Software

About Proton Engine

Integrating data from multiple sensors for sensor fusion applications requires establishing accurate relative position and orientation relationships among the sensors. Proton Engine is a semi-automated multi-sensor calibration package that precisely calculates the intrinsic and extrinsic parameters of LiDAR, camera, and radar sensors.

What is Calibration for?

Extrinsic calibration is a fundamental process that ensures the proper alignment and coordination of different sensors and equipment, such as cameras and LiDARs, in applications ranging from robotics to computer vision. The calibration process ensures that an obstacle detected by each sensor corresponds to the same object in 3D space. Accuracy in multi-sensor calibration is critical, as errors or mismatches can have hazardous consequences in certain applications. For example, a miscalibrated sensor pair could place a detected pedestrian at the wrong position, leading to a dangerous misjudgment.

Therefore, extrinsic calibration is vital in ensuring that various sensors and equipment are well-integrated and perform optimally, leading to better results, safety, and efficiency.
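To make the idea concrete, here is a minimal sketch (not Proton Engine's implementation) of what an extrinsic calibration result is used for: a rotation and translation that move a point detected in one sensor's frame into another sensor's frame. The rotation, translation, and point values below are placeholders for illustration only.

```python
import numpy as np

# Hypothetical extrinsic parameters: a 3x3 rotation and a translation (metres)
# that map points from the camera frame into the LiDAR frame.
# These values are placeholders, not real calibration results.
R_lidar_cam = np.array([
    [0.0,  0.0, 1.0],   # camera z (forward)  -> LiDAR x
    [-1.0, 0.0, 0.0],   # camera x (right)    -> LiDAR -y
    [0.0, -1.0, 0.0],   # camera y (down)     -> LiDAR -z
])
t_lidar_cam = np.array([1.2, 0.0, -0.3])

def camera_to_lidar(points_cam: np.ndarray) -> np.ndarray:
    """Transform Nx3 points from the camera frame into the LiDAR frame."""
    return points_cam @ R_lidar_cam.T + t_lidar_cam

# A pedestrian detected 10 m in front of the camera (camera z-axis forward)
# expressed in LiDAR coordinates, so both sensors refer to the same obstacle.
p_cam = np.array([[0.0, 0.0, 10.0]])
print(camera_to_lidar(p_cam))
```

If the rotation or translation is wrong, the same physical obstacle appears at two different 3D locations, which is exactly the mismatch calibration is meant to prevent.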

Use Case

Cameras are adept at capturing color, texture, and other rich information in 2D space, while LiDAR sensors record accurate distance readings in 3D space. By leveraging the strengths of both sensors and calibrating them precisely, the color information captured by cameras can be fused with the 3D distance readings captured by LiDAR sensors. This approach can significantly improve the accuracy and reliability of perception systems.
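The sketch below illustrates this fusion under stated assumptions: given a camera intrinsic matrix K, LiDAR-to-camera extrinsics (R, t), and an RGB image (all hypothetical inputs), each LiDAR point is projected into the image and assigned the color of the pixel it lands on. Lens distortion correction is omitted for brevity.

```python
import numpy as np

def colorize_lidar_points(points_lidar, image, K, R_cam_lidar, t_cam_lidar):
    """Assign an RGB color from `image` to each LiDAR point.

    points_lidar : Nx3 points in the LiDAR frame (metres)
    image        : HxWx3 RGB image from the calibrated camera
    K            : 3x3 camera intrinsic matrix
    R_cam_lidar, t_cam_lidar : extrinsics mapping LiDAR points into the camera frame
    All inputs are hypothetical placeholders for illustration.
    """
    # 1. Move the points into the camera frame using the extrinsic calibration.
    pts_cam = points_lidar @ R_cam_lidar.T + t_cam_lidar

    # 2. Keep only points in front of the camera (positive depth).
    in_front = pts_cam[:, 2] > 0.0
    pts_cam = pts_cam[in_front]

    # 3. Project to pixel coordinates with the intrinsic matrix (pinhole model).
    uvw = pts_cam @ K.T
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)

    # 4. Discard projections that fall outside the image.
    h, w = image.shape[:2]
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    colors = image[v[valid], u[valid]]            # RGB per surviving LiDAR point
    return points_lidar[in_front][valid], colors
```

The quality of the result depends directly on the calibration: a small extrinsic error smears colors from one object onto another.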

Autonomous driving applications rely heavily on multiple sensors, including radars, cameras, LiDARs, IMUs, and GNSS, to create a reliable multi-sensor fusion system that enables perception, localization, and vehicle control tasks. The seamless integration of these sensors requires accurate time synchronization and calibration.

Radar sensors use radio waves to detect objects and measure their distance, speed, and sometimes their direction. Integrating radars with other sensors requires estimating the relative 3D position and orientation.
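As a simplified, hypothetical illustration (2D and yaw-only for brevity), a radar detection given as range and azimuth can be converted to Cartesian coordinates and then moved into a common vehicle frame with the radar's extrinsic parameters:

```python
import numpy as np

def radar_detection_to_vehicle(range_m, azimuth_rad, yaw_rad, t_xy):
    """Convert one radar detection (range, azimuth) into the vehicle frame.

    yaw_rad : hypothetical radar-to-vehicle mounting yaw (extrinsic rotation)
    t_xy    : hypothetical radar position in the vehicle frame (extrinsic translation)
    """
    # Polar to Cartesian in the radar frame.
    p_radar = np.array([range_m * np.cos(azimuth_rad),
                        range_m * np.sin(azimuth_rad)])

    # 2D rigid transform into the vehicle frame.
    R = np.array([[np.cos(yaw_rad), -np.sin(yaw_rad)],
                  [np.sin(yaw_rad),  np.cos(yaw_rad)]])
    return R @ p_radar + np.asarray(t_xy)

# Example: a target 25 m away, 10 degrees to the left, seen by a radar mounted
# 3.5 m ahead of the vehicle origin and rotated 5 degrees relative to the vehicle axis.
print(radar_detection_to_vehicle(25.0, np.deg2rad(10.0), np.deg2rad(5.0), (3.5, 0.0)))
```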

Cameras mainly use two components to generate an image: a sensor and a lens. Integrating cameras with other sensing systems requires estimating both the relative 3D position and orientation and the 3D-to-2D projection parameters (known as the camera intrinsic parameters). Depending on the application, additional tuning of the camera sensor might be required, for instance defining an adequate white balance value in sunset scenarios or limiting the sensor's shutter speed for bright scenes.
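Intrinsic parameters are commonly estimated from images of a known pattern. The following is a minimal sketch using OpenCV's checkerboard calibration, assuming a 9x6 inner-corner board with 25 mm squares and a hypothetical folder of calibration images; it shows the generic technique, not Proton Engine's own procedure.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)        # inner corners of the (assumed) checkerboard
SQUARE_SIZE = 0.025     # metres, assumed square size

# 3D coordinates of the board corners in the board's own plane (z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for path in glob.glob("calib_images/*.png"):      # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Estimate the intrinsic matrix K and the lens distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix K:\n", K)
```

The resulting K and distortion coefficients are what a projection step (such as the camera-LiDAR fusion sketch above) consumes.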

LiDARs (Light Detection and Ranging) use laser light to measure distances and create highly detailed three-dimensional (3D) scans of their surroundings. Integrating LiDARs with other sensors requires estimating the relative 3D position and orientation.
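When two LiDARs observe overlapping parts of the scene, their relative pose can be refined by point-cloud registration. Below is a hypothetical sketch using Open3D's ICP with two assumed scan files and a rough initial guess; this illustrates a generic refinement technique, not necessarily Proton Engine's method.

```python
import numpy as np
import open3d as o3d

# Hypothetical scans from two LiDARs observing the same scene.
source = o3d.io.read_point_cloud("lidar_front.pcd")
target = o3d.io.read_point_cloud("lidar_rear.pcd")

# Rough initial guess for the front-to-rear extrinsic (placeholder values).
init = np.eye(4)
init[:3, 3] = [-2.0, 0.0, 0.1]

# Refine the 6-DoF extrinsic with point-to-point ICP:
# (source, target, max correspondence distance in metres, initial guess, estimator).
result = o3d.pipelines.registration.registration_icp(
    source, target, 0.5, init,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())

print("estimated 4x4 extrinsic:\n", result.transformation)
```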

IMUs (Inertial Measurement Units) provide information about an object's orientation and acceleration, and they integrate gyroscopes to measure angular velocity. As with radars and LiDARs, incorporating these sensors into a multi-sensor system requires estimating their relative 3D position and orientation.
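For instance, gyroscope angular-velocity readings can be integrated into an orientation estimate, which the extrinsic calibration then relates to the vehicle frame. The sketch below is a deliberately simplified example using SciPy rotations, assuming a constant sample rate and ignoring bias and noise.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def integrate_gyro(angular_velocities, dt):
    """Integrate gyroscope readings (Nx3, rad/s) into an orientation.

    dt is the (assumed constant) sample period in seconds.
    Bias, noise, and accelerometer aiding are deliberately ignored.
    """
    orientation = Rotation.identity()
    for omega in angular_velocities:
        # Each sample contributes a small body-frame rotation of angle |omega| * dt
        # about the axis omega / |omega| (rotation-vector form).
        orientation = orientation * Rotation.from_rotvec(np.asarray(omega) * dt)
    return orientation

# 1 second of a pure 10 deg/s yaw rate sampled at 100 Hz -> roughly 10 degrees of yaw.
omegas = np.tile([0.0, 0.0, np.deg2rad(10.0)], (100, 1))
print(integrate_gyro(omegas, 0.01).as_euler("xyz", degrees=True))
```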

GNSS (Global Navigation Satellite System) refers to the satellite navigation systems that provide users with precise location and time information anywhere on or near the Earth's surface. Integrating GNSS sensors requires estimating the relative 3D position and orientation as well as a time-synchronization mechanism.
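A minimal illustration of the time-synchronization side, under hypothetical timestamps: GNSS fixes arrive at their own rate, so another sensor's measurement is usually associated with a position by interpolating between the two nearest fixes. Orientation interpolation, which would need slerp, is omitted.

```python
import numpy as np

# Hypothetical GNSS fixes: timestamps (s) and positions in a local ENU frame (m).
gnss_t = np.array([0.0, 1.0, 2.0, 3.0])
gnss_xyz = np.array([
    [0.0,  0.0, 0.0],
    [5.1,  0.2, 0.0],
    [10.3, 0.5, 0.1],
    [15.2, 0.9, 0.1],
])

def gnss_position_at(t, times=gnss_t, positions=gnss_xyz):
    """Linearly interpolate the GNSS position at an arbitrary sensor timestamp."""
    return np.array([np.interp(t, times, positions[:, i]) for i in range(3)])

# A LiDAR scan timestamped at t = 1.37 s gets the interpolated vehicle position.
print(gnss_position_at(1.37))
```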