Sensor calibration is an essential link in the autonomous driving perception system and a necessary prerequisite for subsequent sensor fusion. Its purpose is to bring two or more sensors into a unified spatio-temporal coordinate system, which makes sensor fusion meaningful and is a key premise for perception and decision-making. Any sensor must be calibrated experimentally after manufacturing and installation to verify that it meets its design specifications and that its measurements are accurate. A sensor must also be calibrated after it is installed on the autonomous vehicle; furthermore, vibration and other factors during driving cause the sensor to drift from its original position, so calibration needs to be repeated at regular intervals. Self-driving cars rely on multiple types of sensors operating simultaneously for environmental perception and self-awareness, so the robustness and accuracy of these sensors are particularly important.
Camera calibration
The vehicle-mounted camera is installed on the vehicle at a particular angle and position. To map the environmental data collected by the camera to real objects in the driving environment, that is, to find the transformation between point coordinates in the image pixel coordinate system produced by the camera and point coordinates in the environment coordinate system, the camera must be calibrated.
● Establishment of camera model
Through the successive transformations between the environment coordinate system, the camera coordinate system, the image physical coordinate system, and the image pixel coordinate system, the overall transformation between the environment coordinate system and the image pixel coordinate system can be derived.
The intrinsic parameter matrix contains four constants: fx, fy, u0, and v0. They depend on the camera's design parameters, such as its focal length, principal point, and sensor, and are independent of external factors (such as the surrounding environment or the camera's position), so they are called the camera's intrinsic parameters. The intrinsic parameters are fixed when the camera leaves the factory. However, because of manufacturing tolerances, even cameras produced on the same production line have slightly different intrinsic parameters, so they usually need to be determined experimentally. Calibrating a monocular camera usually means determining its intrinsic parameters by experimental means.
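As a concrete illustration of how these four constants act, the pinhole model maps a point (Xc, Yc, Zc) in the camera coordinate system to pixel coordinates. A minimal sketch in Python, with made-up intrinsic values rather than real calibration results:

```python
def project_to_pixel(Xc, Yc, Zc, fx, fy, u0, v0):
    """Pinhole projection from the camera frame to pixel coordinates:
        u = fx * Xc / Zc + u0
        v = fy * Yc / Zc + v0
    """
    if Zc <= 0:
        raise ValueError("point must be in front of the camera (Zc > 0)")
    return fx * Xc / Zc + u0, fy * Yc / Zc + v0

# Illustrative (made-up) intrinsics: focal lengths in pixels, principal point.
fx, fy = 800.0, 800.0
u0, v0 = 320.0, 240.0
u, v = project_to_pixel(1.0, 0.5, 2.0, fx, fy, u0, v0)
print(u, v)  # 720.0 440.0
```

A point on the optical axis, such as (0, 0, 1), projects exactly onto the principal point (u0, v0), which is why u0 and v0 are part of the intrinsics.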
The extrinsic parameter matrix consists of a rotation matrix and a translation vector, which together describe how a point is transformed from the world coordinate system to the camera coordinate system. In computer vision, the process of determining the extrinsic parameters is usually called visual localization. After the on-board camera is installed on a self-driving car, its pose must be calibrated in the vehicle coordinate system. In addition, because of bumps and vibration, the pose of the on-board camera drifts slowly over time, so self-driving cars need to recalibrate the camera pose at regular intervals.
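The action of the extrinsics can be sketched as the rigid transform Pc = R * Pw + t. A minimal pure-Python example; the 90-degree rotation about the z-axis and the translation below are arbitrary illustrative choices:

```python
import math

def world_to_camera(Pw, R, t):
    """Apply extrinsics: camera-frame point Pc = R @ Pw + t."""
    return tuple(
        sum(R[i][j] * Pw[j] for j in range(3)) + t[i]
        for i in range(3)
    )

# Illustrative extrinsics: rotate 90 degrees about the z-axis, then translate.
a = math.pi / 2
R = [[math.cos(a), -math.sin(a), 0.0],
     [math.sin(a),  math.cos(a), 0.0],
     [0.0,          0.0,         1.0]]
t = (0.0, 0.0, 1.5)

Pc = world_to_camera((1.0, 0.0, 0.0), R, t)
print([round(c, 6) for c in Pc])  # [0.0, 1.0, 1.5]
```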
● Camera distortion correction
In actual use, a camera cannot perform perspective projection exactly according to the ideal pinhole camera model. Lens distortion is usually present: there is a certain optical error between the image an object point forms on the actual imaging plane and its ideal image. The distortion error consists mainly of radial distortion and tangential distortion.
Radial distortion: Because of the characteristics of the lens, light bends more or less strongly near the edge of the camera lens; this is called radial distortion. It is more pronounced in ordinary, inexpensive lenses. Radial distortion mainly comprises barrel distortion and pincushion distortion. Barrel distortion is a barrel-shaped outward bulging of the image caused by the structure of the lens elements and lens groups; it is usually easiest to notice with a wide-angle lens or at the wide-angle end of a zoom lens. Pincushion distortion is the phenomenon of the image “shrinking” toward the center; it is most noticeable at the telephoto end of a zoom lens.
Tangential distortion: It is caused by the lens not being parallel to the camera sensor plane (imaging plane), mostly as a result of assembly deviations when the lens is mounted to the lens module.
In computer vision, radial distortion has a very important impact on scene reconstruction. The autonomous driving system's perception of the environment requires the camera to reconstruct the surroundings with high precision; if distortion is not corrected, accurate environmental information cannot be obtained. For example, targets may appear in any region of the image, and if distortion is not corrected, the target positions and sizes obtained through vision are often inaccurate, which directly affects the driving safety of autonomous vehicles. In addition, self-driving cars carry multiple cameras at different locations; if radial distortion is not considered, corresponding features will be mismatched during image stitching and the stitched images will be blurred.
For general cameras, the radial distortion of the image is often described by a low-order polynomial model. Let (u, v) be the coordinates of the corrected point and (u', v') the coordinates of the uncorrected point; then the transformation between the two can be determined by the following formula:
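The formula is a polynomial in the distance r of the point from the principal point. A minimal sketch of the common two-coefficient version, assuming (u, v) is expressed in normalized image coordinates centered on the principal point; the coefficients k1 and k2 below are made-up illustrative values:

```python
def radial_distort(u, v, k1, k2):
    """Map a corrected (ideal) point (u, v) to the distorted point (u', v'):
        u' = u * (1 + k1*r^2 + k2*r^4)
        v' = v * (1 + k1*r^2 + k2*r^4)
    where r^2 = u^2 + v^2 in normalized coordinates.
    """
    r2 = u * u + v * v
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * factor, v * factor

# With zero coefficients the model is the identity.
print(radial_distort(0.5, 0.5, 0.0, 0.0))
# A small nonzero k1 moves the point radially, and the effect grows with r.
print(radial_distort(0.5, 0.5, 0.1, 0.0))
```

Correction inverts this mapping (typically numerically), since the polynomial has no simple closed-form inverse.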
Calibration of external parameters between cameras
In self-driving cars, in order to reduce the blind spots of perception as much as possible, multi-camera mode is often adopted. Determining the relative positional relationship between multiple cameras is called the external parameter calibration of the camera.
From another perspective, camera extrinsic calibration can also be viewed as a pose estimation problem. The relative pose [R|t] between two cameras has 6 degrees of freedom (3 for spatial position and 3 for rotation). In theory, as long as the two cameras simultaneously observe 3 common points in space, their relative pose can be recovered. The problem of recovering the relative pose between cameras from three pairs of corresponding points is called the Perspective-3-Point problem (P3P). In practice, more than 3 points are usually used to improve robustness, which generalizes the P3P problem to the PnP problem.
Initially, researchers solved the PnP problem with the direct linear transform. Later, to improve accuracy, they turned to minimizing the reprojection error and solving the PnP problem with nonlinear optimization, which led to the famous pose estimation method Bundle Adjustment (BA).
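The reprojection error minimized in these methods can be sketched as follows: transform each 3D point with a candidate pose [R|t], project it with the pinhole intrinsics, and measure its pixel distance to the observation. All numeric values below are illustrative:

```python
import math

def reproject_error(points3d, observed_px, R, t, fx, fy, u0, v0):
    """Root-mean-square reprojection error of a candidate pose [R|t]."""
    sq_sum = 0.0
    for (X, Y, Z), (ou, ov) in zip(points3d, observed_px):
        # World -> camera via the candidate extrinsics.
        Xc = R[0][0]*X + R[0][1]*Y + R[0][2]*Z + t[0]
        Yc = R[1][0]*X + R[1][1]*Y + R[1][2]*Z + t[1]
        Zc = R[2][0]*X + R[2][1]*Y + R[2][2]*Z + t[2]
        # Camera -> pixel via the pinhole intrinsics.
        u = fx * Xc / Zc + u0
        v = fy * Yc / Zc + v0
        sq_sum += (u - ou) ** 2 + (v - ov) ** 2
    return math.sqrt(sq_sum / len(points3d))

# Identity pose; observations were generated from the same model, so the
# error is exactly zero. An optimizer would adjust [R|t] to minimize this.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = (0.0, 0.0, 0.0)
pts = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0), (0.0, 1.0, 4.0)]
obs = [(320.0, 240.0), (720.0, 240.0), (320.0, 440.0)]
print(reproject_error(pts, obs, R, t, 800.0, 800.0, 320.0, 240.0))  # 0.0
```

BA optimizes exactly this kind of objective, jointly over poses (and, in full BA, over the 3D points as well).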

Calibration of lidar
Lidar is one of the main sensors of an autonomous driving platform and plays an important role in perception and positioning. Like cameras, lidar needs both intrinsic and extrinsic calibration before use. Intrinsic calibration determines the transformation between the coordinate systems of the internal laser emitters and the radar's own coordinate system; it is performed before the device leaves the factory and can be used directly. What the autonomous driving system must perform is extrinsic calibration, that is, determining the relationship between the lidar's own coordinate system and the vehicle body coordinate system.
The lidar and the vehicle body are rigidly connected, and the relative posture and displacement between the two are fixed. In order to establish the relative coordinate relationship between lidars and between lidars and vehicles, it is necessary to calibrate the lidar installation and convert the lidar data from the lidar coordinate system to the vehicle body coordinate system.
● Calibration between LiDAR and LiDAR
Self-driving cars sometimes carry multiple lidars, and the external environment acquired by each lidar must be accurately mapped into the vehicle body coordinate system. Therefore, when there are multiple lidars, their relative positions need to be calibrated.
There are many approaches to calibrating the extrinsic parameters between lidars. A commonly used one is to derive the coordinate transformation between the lidars indirectly from the transformations between each lidar and the car body.
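This indirect derivation can be sketched with 4×4 homogeneous transforms: if T_body_l1 and T_body_l2 map each lidar's coordinates into the body frame (already known from the lidar-to-body calibration), then T_l1_l2 = inv(T_body_l1) * T_body_l2 maps lidar-2 coordinates into the lidar-1 frame. A minimal pure-Python sketch; the mounting positions are made-up illustrative values:

```python
def compose(T1, T2):
    """4x4 matrix product T1 @ T2 (row-major nested lists)."""
    return [[sum(T1[i][k] * T2[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(T):
    """Inverse of a rigid transform: [R|t]^-1 = [R^T | -R^T t]."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    t = [-sum(Rt[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [Rt[0] + [t[0]], Rt[1] + [t[1]], Rt[2] + [t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def translation(x, y, z):
    """Pure-translation rigid transform."""
    return [[1.0, 0.0, 0.0, x],
            [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

# Made-up mounting positions of two lidars in the body frame.
T_body_l1 = translation(1.2, 0.0, 1.8)   # front lidar
T_body_l2 = translation(-0.8, 0.0, 1.8)  # rear lidar

# Lidar-2 -> lidar-1 extrinsics derived indirectly through the body frame.
T_l1_l2 = compose(invert_rigid(T_body_l1), T_body_l2)
print([row[3] for row in T_l1_l2[:3]])  # translation part: rear lidar is 2 m behind
```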
● Calibration of lidar and camera
On an autonomous vehicle, the lidar is rigidly connected to the vehicle, and the relative attitude and displacement between the two are fixed. Therefore, each data point obtained by a lidar scan has unique, corresponding position coordinates in the environment coordinate system. Similarly, the camera has unique position coordinates in the environment coordinate system, so there is a fixed coordinate transformation between the lidar and the camera. Joint calibration of lidar and camera extracts the corresponding feature points of a calibration object in the single-line lidar scan and the image, unifies multiple sensor coordinate systems such as the single-line lidar coordinates, camera coordinates, and image pixel coordinates, and thereby achieves spatial calibration between the lidar and the camera.
Once the camera extrinsic calibration and the lidar extrinsic calibration are complete, the relationship between the two is in fact fully determined, and lidar scan points can be projected into the image pixel coordinate system.
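That projection can be sketched end to end: apply the lidar-to-camera extrinsics, then the pinhole intrinsics. The extrinsic and intrinsic values below are made-up illustrative numbers:

```python
def lidar_point_to_pixel(P_lidar, R, t, fx, fy, u0, v0):
    """Project a lidar point into image pixel coordinates.

    Step 1 (extrinsics): P_cam = R @ P_lidar + t
    Step 2 (intrinsics): pinhole projection of P_cam
    """
    Xc = sum(R[0][j] * P_lidar[j] for j in range(3)) + t[0]
    Yc = sum(R[1][j] * P_lidar[j] for j in range(3)) + t[1]
    Zc = sum(R[2][j] * P_lidar[j] for j in range(3)) + t[2]
    if Zc <= 0:
        return None  # point is behind the camera, not visible
    return fx * Xc / Zc + u0, fy * Yc / Zc + v0

# Illustrative lidar->camera extrinsics: axes aligned, camera 0.2 m behind lidar.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = (0.0, 0.0, 0.2)
px = lidar_point_to_pixel((1.0, 0.5, 1.8), R, t, 800.0, 800.0, 320.0, 240.0)
print(px)  # (720.0, 440.0)
```

In practice this mapping is what lets lidar depth be attached to camera pixels, which is the basis of lidar-camera fusion.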
Just like the camera's intrinsic calibration, the extrinsic calibration between lidar and camera can also use a calibration-plate method.