Point cloud distortion in OS1

In summary, what you are witnessing is a natural outcome with all scanning (“rolling shutter”) lidars in motion, and compensating for this motion distortion is an algorithmically intensive task commonly solved by SLAM (Simultaneous Localization and Mapping).
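To get a feel for the magnitude of the effect, a quick back-of-the-envelope calculation helps. The scan rate and platform speed below are illustrative assumptions, not values from your setup:

```python
# Rough estimate of rolling-shutter skew for a scanning lidar in motion.
# Assumed values (illustrative only):
scan_rate_hz = 10.0   # one full revolution every 100 ms (a common OS1 mode)
speed_m_s = 10.0      # platform moving at 10 m/s (~36 km/h)

revolution_time_s = 1.0 / scan_rate_hz
# Displacement of the sensor between the first and last column of one scan:
skew_m = speed_m_s * revolution_time_s
print(f"Sensor moves {skew_m:.1f} m during one revolution")  # → 1.0 m of warp
```

At these assumed values the sensor travels a full meter while capturing a single revolution, which is why the distortion is so visible in the raw point cloud.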

  • OusterStudio Desktop and the Web viewer display the lidar data in the sensor’s frame of reference (they do not apply motion compensation to the data), so the distortion in the data is expected and not an artifact.
  • SimpleViz in the Ouster SDK visualizes data in a world frame of reference. This is only possible by knowing the trajectory of the sensor through the world and then using this information (“poses”) to transform (“dewarp”) the data into the world frame.
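The dewarp step itself amounts to applying a rigid-body transform per measurement, using each measurement’s own pose along the trajectory. A minimal NumPy sketch of the idea (the function name and data layout are illustrative, not the SDK’s actual implementation):

```python
import numpy as np

def dewarp(points, poses):
    """Transform sensor-frame points into the world frame.

    points: (N, 3) xyz points, one row per measurement
    poses:  (N, 4, 4) homogeneous world-from-sensor pose for each
            measurement (e.g. interpolated along the SLAM trajectory)
    """
    homog = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4)
    # Apply each point's own pose: world = T_i @ [x, y, z, 1]
    world = np.einsum("nij,nj->ni", poses, homog)
    return world[:, :3]

# Toy example: two points measured at the origin of the sensor frame,
# but the sensor translated 1 m along x between the two measurements.
pts = np.zeros((2, 3))
poses = np.stack([np.eye(4), np.eye(4)])
poses[1, 0, 3] = 1.0  # second measurement taken 1 m further along x
print(dewarp(pts, poses))  # second point lands at x = 1 in the world frame
```

The hard part, of course, is estimating those per-measurement poses in the first place, which is exactly what SLAM provides.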

Ouster provides multiple avenues to calculate these world frame poses (“run SLAM”) and view the results:

  • By uploading your data to a free Ouster Studio Web account and downloading the post-processed .OSF file to view with the Ouster SDK
  • With the SDK, by running ouster-cli source FILE slam viz on the command line
  • With the SDK, using the SLAM API in a Python or C++ script

For more information about the SDK approaches, see the tutorial Simple SLAM scripting with the SDK - #6 by yggdrasil, which covers this topic extensively and shows how to apply a form of motion compensation to account for this artifact.