Multiple OS1 lidar scanners running SLAM together

Can two OS1-64 Rev 7 sensors work simultaneously with SLAM to produce a pose of a vehicle/robot (x/y/z/roll/pitch/yaw)? We are trying to reduce drift in a very long corridor.

Yes, you should be able to run SLAM against multiple lidar sensors with the current version of the SDK. There are a few extra details to pay attention to when implementing multi-lidar SLAM:

  • Make sure the extrinsics are properly defined for both sensors with respect to the vehicle/robot frame.
  • Enable time sync on both sensors to ensure the captured scans are well aligned in time.
  • [recommended] Download and install the latest FW 3.2, enable the ACCEL32_GYRO32_NMEA profile, and upgrade to ouster-sdk 0.16.1, which enables IMU-assisted motion compensation.
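To make the first point concrete, a sensor-to-vehicle extrinsic is just a 4x4 homogeneous transform. Here is a minimal numpy sketch that builds one from an assumed mounting position and yaw (the translations and angles are hypothetical examples, and real extrinsics should come from calibration):

```python
import numpy as np

def extrinsic_matrix(translation, yaw_deg):
    """Build a 4x4 homogeneous sensor-to-vehicle transform.

    Illustrative only: translation in meters, yaw about the vehicle
    z-axis in degrees. Real extrinsics should come from calibration.
    """
    yaw = np.radians(yaw_deg)
    T = np.eye(4)
    T[:3, :3] = np.array([
        [np.cos(yaw), -np.sin(yaw), 0.0],
        [np.sin(yaw),  np.cos(yaw), 0.0],
        [0.0,          0.0,         1.0],
    ])
    T[:3, 3] = translation
    return T

# Hypothetical mounting: front sensor 1.2 m ahead of the vehicle origin,
# rear sensor 0.8 m behind it and rotated 180 degrees.
T_front = extrinsic_matrix([1.2, 0.0, 0.5], 0.0)
T_rear = extrinsic_matrix([-0.8, 0.0, 0.5], 180.0)

# A point 1 m straight ahead of the rear sensor lands behind the vehicle origin.
p_sensor = np.array([1.0, 0.0, 0.0, 1.0])
p_vehicle = T_rear @ p_sensor
```

With both transforms expressed in the same vehicle frame, points from the two sensors fuse consistently.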

With SDK 0.16.1 you can capture a multi-sensor OSF using

ouster-cli source <source_url1>,<source_url2> save multi-lidar.osf

then run slam and visualize using one of the following commands:

ouster-cli source multi-lidar.osf slam viz --accum-every-m 1

# or produce a map file first 
ouster-cli source multi-lidar.osf slam save multi-lidar.ply # or multi-lidar.pcd
# then preview using ouster-cli or other pointcloud visualizer apps
ouster-cli source multi-lidar-000.ply viz
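As a quick sanity check that time sync is working, you can compare scan timestamps from the two sensors and match each scan to its nearest-in-time counterpart. This is an illustrative numpy sketch with made-up timestamps, not SDK code:

```python
import numpy as np

# Hypothetical scan timestamps in nanoseconds for two 10 Hz sensors;
# a shared clock keeps them within a few milliseconds of each other.
ts_a = np.array([0, 100, 200, 300]) * 1_000_000  # sensor A, ms -> ns
ts_b = np.array([2, 101, 203, 299]) * 1_000_000  # sensor B, slightly offset

# For each A scan, find the index of the nearest B scan in time.
pairs = np.argmin(np.abs(ts_a[:, None] - ts_b[None, :]), axis=1)

# Maximum pairing error; with good sync this stays well under a frame period.
max_skew_ms = np.max(np.abs(ts_a - ts_b[pairs])) / 1e6
```

If the skew approaches a full frame period (100 ms at 10 Hz), the sensors are not actually synchronized and the fused scans will be misaligned.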

If you captured multi-lidar.osf with FW 3.2, the SLAM command will automatically use the IMU data to correct the inherent motion distortion in the lidar data, which should greatly reduce the drift and improve your SLAM results. Also refer to this post for additional detail on getting the best outcome with our integrated SLAM solution.
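To give an intuition for what that motion compensation does: points measured later in a scan are captured from a slightly rotated sensor pose, and each point can be rotated back by the yaw accrued at its capture time. The sketch below is a simplified conceptual illustration (yaw-only, constant rate), not the SDK's actual algorithm:

```python
import numpy as np

def deskew_yaw(points, t_offsets, yaw_rate):
    """Undo rotation accumulated during one scan (conceptual sketch).

    points:    (N, 3) xyz in the sensor frame
    t_offsets: (N,) seconds since scan start for each point
    yaw_rate:  rad/s about z, e.g. an average gyro reading

    Rotates every point back by the yaw the sensor accrued by the time
    that point was measured, referencing the scan to its start pose.
    """
    corrected = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, t_offsets)):
        a = -yaw_rate * t  # rotate back by the accrued yaw
        R = np.array([[np.cos(a), -np.sin(a), 0.0],
                      [np.sin(a),  np.cos(a), 0.0],
                      [0.0,        0.0,       1.0]])
        corrected[i] = R @ p
    return corrected

# Hypothetical example: sensor yawing at pi rad/s; a point captured 0.5 s
# into the scan gets rotated back by 90 degrees.
pts = np.array([[0.0, 1.0, 0.0]])
deskewed = deskew_yaw(pts, np.array([0.5]), np.pi)
```

In a long featureless corridor this kind of distortion is exactly what compounds into drift, which is why the IMU-assisted path helps.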

Hope this helps!
