I mounted an OS1-128 horizontally on the roof of the vehicle. During freeway driving, when approaching an Electronic Toll Collection (ETC) gateway, the structure can be clearly observed with no distortion in Ouster Studio, as shown in the figure on the left below.
However, while driving through the gateway, the top-view visualization shows that the gateway structure (located at the lower-right corner of the figure on the right above) appears distorted.
Could you help us understand what might be causing this distortion? In particular, we would like to know whether this issue is related to motion distortion, sensor configuration, or scanning geometry.
Since we plan to perform SLAM in freeway scenarios, this distortion is a concern for our localization and mapping performance. Are there any recommended configurations or mitigation strategies to avoid this issue?
Thanks for your question. The distortion you see is very likely due to the velocity of the vehicle while recording.
Essentially, since the LiDAR is rotating while the vehicle is moving forward, points recorded toward the end of a scan appear closer (when the vehicle is moving toward them) than points recorded at the beginning of the scan.
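To give a rough sense of scale (illustrative numbers only, not measured from your dataset), the smear on a static structure is roughly the distance the vehicle travels during one lidar revolution:

```python
# Back-of-the-envelope estimate of motion distortion per scan.
# The speed below is an assumed freeway speed, not taken from the recording.
speed_kmh = 100.0        # assumed vehicle speed
scan_rate_hz = 10.0      # OS1 rotation rate (configurable to 10 or 20 Hz)

speed_ms = speed_kmh / 3.6                 # ~27.8 m/s
travel_per_scan = speed_ms / scan_rate_hz  # distance covered during one revolution

print(f"Apparent smear per scan: {travel_per_scan:.2f} m")  # ~2.78 m at 100 km/h and 10 Hz
```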
Luckily, it’s straightforward to account for this kind of distortion: Ouster Studio Web’s SLAM processing and the Ouster SDK’s SLAM implementation both handle it.
In fact, if you visualize your dataset with ouster-cli using the slam command, e.g.
$ ouster-cli source my_dataset.osf slam viz
The underlying pipeline applies motion compensation (aka “dewarp”) to each scan.
The viz command automatically applies the updated poses produced by the slam stage and deskews each scan, removing the distortion during visualization.
In the upcoming Ouster SDK 0.16.0 release, the results will be even better due to a number of enhancements the team has made to the deskew method. We’re hoping to make this release available in the next few days.
Note that, at this time, Ouster Studio’s visualization does not apply the poses produced by SLAM, and as such does not yet compensate for motion. However, if you upload your dataset to Ouster Studio Web, you can download the SLAM-processed version and visualize it with ouster-cli to see the effect of the motion compensation.
In summary, what you are seeing is a natural consequence of any scanning (“rolling shutter”) lidar in motion, and compensating for this motion distortion is an algorithmically intensive task commonly solved by SLAM (Simultaneous Localization and Mapping).
Ouster Studio Desktop and the Web viewer display the lidar data in the sensor’s frame of reference (they do not apply motion compensation to the data), so the distortion in the data is expected and not an artifact.
SimpleViz in the Ouster SDK visualizes data in a world frame of reference. This is only possible by knowing the trajectory of the sensor through the world and then using this information (“poses”) to transform (“dewarp”) the data into the world frame.
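As a rough illustration of what that dewarp step does (a minimal numpy sketch with assumed array shapes, not the SDK’s actual implementation): each measurement column of a scan has its own pose, and the points from that column are transformed by that pose into the world frame.

```python
import numpy as np

def dewarp(xyz: np.ndarray, poses: np.ndarray) -> np.ndarray:
    """Transform sensor-frame points into the world frame, one column at a time.

    xyz:   (H, W, 3) points in the sensor frame (one column per firing instant)
    poses: (W, 4, 4) world-from-sensor poses, one per column
    """
    h, w, _ = xyz.shape
    out = np.empty_like(xyz)
    for col in range(w):
        R = poses[col, :3, :3]  # sensor orientation at the time this column fired
        t = poses[col, :3, 3]   # sensor position at that time
        out[:, col, :] = xyz[:, col, :] @ R.T + t
    return out
```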
Ouster provides multiple avenues to calculate these world frame poses (“run SLAM”) and view the results:
By uploading your data to a free Ouster Studio Web account and downloading the post-processed .OSF file to view with the Ouster SDK
With the SDK, using ouster-cli source FILE slam viz on the command line
With the SDK, using the SLAM API and a Python or C++ script
For more information about the SDK approaches, I would direct you to the tutorial here: Simple SLAM scripting with the SDK - #6 by yggdrasil, which covers this topic extensively. It shows how to apply a form of motion compensation to account for this artifact.
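For a rough idea of what the scripted SLAM API route can look like in Python, here is a minimal sketch. The class and module names below (open_source, KissBackend, XYZLut) are assumptions based on an older SDK release and have moved around between versions, so please follow the tutorial above for the API that matches your installed ouster-sdk.

```python
# Minimal SLAM-API sketch (assumed single-sensor source and assumed names;
# check the tutorial above for the exact API of your SDK version).
from ouster.sdk import open_source, client
from ouster.sdk.mapping.slam import KissBackend  # assumed module path

scans = open_source("my_dataset.osf")       # same file as in the CLI example
slam = KissBackend(scans.metadata)          # odometry backend that estimates poses
xyz_lut = client.XYZLut(scans.metadata)     # converts range images to Cartesian points

for scan in scans:
    scan_w_poses = slam.update(scan)        # attaches per-column 4x4 poses to the scan
    xyz = xyz_lut(scan_w_poses)             # (H, W, 3) points in the sensor frame
    world_xyz = dewarp(xyz, scan_w_poses.pose)  # per-column dewarp, as sketched earlier
```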