I have been testing the ouster-cli localization feature with the following commands:
ouster-cli source <ip> localize name.ply viz
ouster-cli source <ip> localize name.ply viz --global-map-flatten False
ouster-cli source <ip> localize name.ply save_trajectory name.csv viz
ouster-cli source <ip> localize name.ply viz --global-map-voxel-size 0.06
I would like to ask a few questions regarding the results:
In command #4, I specified a voxel value (--global-map-voxel-size 0.06). However, in the command window, a different value was displayed instead of the one I set. Could you please explain why this happens?
When I use save_trajectory name.csv or save_trajectory --tum name.csv, the output file only contains a single line. I expected multiple lines of x,y,z,qx,qy,qz,qw values because I was moving in a small circular path during testing. Why does the CSV file contain only one line? (I attached screenshots of both my visualization window and the saved CSV file for reference.)
When I read the --help output, I have difficulty understanding the description of save_trajectory.
Is save_trajectory not actually saving the trajectory of the sensor, but serving some other purpose? Could you please explain what the save_trajectory description means?
My understanding is that save_trajectory records the trajectory path. This leads to my most important question: is there a way to view this trajectory information in real time using the SDK? I understand that robots and similar systems can perform autonomous navigation with only xyz values, and since save_trajectory appears to store xyz values, I assume there must be some internal logic involved. Therefore, I would like to know whether the SDK provides a way to directly access this trajectory information in real time.
Any clarification or guidance would be greatly appreciated.
The --global-map-voxel-size option sets the voxel size used for the map visualization; this is different from the voxel size used to perform the localization. When the localize command is run with no options other than the map name, a default voxel size is used; you can control it with the --voxel-size or -v parameter, for example:
ouster-cli source <ip> localize -v 0.5 name.ply viz
Note that this is different from --global-map-voxel-size 0.06, which controls the level of detail of the map shown in the ouster-cli viz.
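If it helps, you can combine the two (this is simply your command #4 with the -v option from the example above added; I haven't run this exact line): the first value affects the localization itself, the second only the displayed map:
ouster-cli source <ip> localize -v 0.5 name.ply viz --global-map-voxel-size 0.06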
It appears there is a problem when chaining the save_trajectory command with the viz command. I'll create a ticket to fix this bug; for the time being, try using save_trajectory without viz (sorry for the inconvenience).
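In other words, something like the following (your command #3 without the viz stage) should produce a CSV with one pose per scan:
ouster-cli source <ip> localize name.ply save_trajectory name.csv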
The viz app does show the trajectory in real-time as a series of white dots.
Please note that the localization command is a very recent feature and we haven't yet provided enough examples of its use. The quality of the map used during localization greatly affects the outcome. Also note that the command currently may not be able to keep up with real-time localization on mid-range hardware; when this happens, the localization will lag behind the real position, leading to mismatches that the current approach can't recover from.
Regarding the voxel issue, it seems that I had misunderstood its usage. I truly appreciate your clarification on the correct way to use it.
Thank you also for your efforts to help resolve the second issue. I will test it again using the save_trajectory command without running the visualization (viz).
As for the third point, I believe I may not have phrased my question clearly before. I understand that the white dots represent the path I have moved along. Initially, when I used save_trajectory, I expected to see multiple lines of data but only got a single row; I understand now that this issue can be resolved.
However, what I really want to know is whether there is any other way to obtain the x, y, z coordinates of these trajectory points in real time using the SDK, without having to save them as a CSV file via save_trajectory.
Thank you once again for your support and guidance.
Now I understand your question. To get the trajectory position in real time you can use the SDK directly. Here is a quick example that shows how to achieve that:
from typing import List
import argparse

from ouster.sdk import open_source
from ouster.sdk.core import LidarScan
from ouster.sdk.mapping import SlamConfig, SlamEngine
from ouster.sdk.viz import SimpleViz
from scipy.spatial.transform import Rotation as R


def process(scans: List[LidarScan]) -> List[LidarScan]:
    # Run SLAM on the incoming scans; this updates the per-column poses
    # stored in each scan.
    scans = slam.update(scans)
    scan = scans[0]
    col = scan.get_first_valid_column()
    scan_ts = scan.timestamp[col]
    scan_pose = scan.pose[col]
    # Extract the translation (x, y, z) and orientation from the 4x4 pose matrix.
    px, py, pz = scan_pose[:3, 3]
    roll, pitch, yaw = R.from_matrix(scan_pose[:3, :3]).as_euler('xyz', degrees=True)
    print(f"idx = {scan.frame_id}; scan_ts = {scan_ts}; Tr: {px:.2f}, {py:.2f}, {pz:.2f}; "
          f"Roll: {roll:.2f}, Pitch: {pitch:.2f}, Yaw: {yaw:.2f}")
    return scans


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("source_url", help="The path to the file to be tested")
    args = parser.parse_args()

    # Open the sensor or recorded file; sensor_idx=-1 selects all sensors.
    source = open_source(args.source_url, sensor_idx=-1)

    # Configure the SLAM engine that estimates the trajectory.
    config = SlamConfig()
    config.backend = "kiss"
    config.min_range = 1.0
    config.max_range = 75.0
    config.voxel_size = 0.5
    slam = SlamEngine(source.sensor_info, config)

    # Apply process() to every batch of scans and visualize the result.
    processed_source = map(process, source)
    SimpleViz(source.metadata, accum_min_dist_meters=15,
              accum_max_num=20).run(processed_source)
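To run the example, save it as a script (the file name slam_poses.py below is just a placeholder) and point it at either a sensor hostname or a recorded file, for example:
python3 slam_poses.py <sensor-ip or recording.pcap>
The pose of each scan is printed to the console in real time while the SimpleViz window shows the scans.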