Run SLAM in a user-defined BASE coordinate system
It is often useful to track another pose on the moving system (e.g. a vehicle) than the LiDAR sensor pose. For example, we could prefer to track the GPS pose or any other special reference point. Moreover, if the LiDAR is not mounted flat on the vehicle (non-zero pitch or roll angles), SLAM outputs keypoint maps and a trajectory that are not flat either, which can be annoying if we want to process these pointclouds later.
This MR introduces the ability to add a BaseToLidarOffset transform describing the pose of the LiDAR sensor relative to a new pose, called BASE (often called base_link in the literature). Extracted keypoints are transformed to this new coordinate system so that maps are built in this new reference frame, and therefore SLAM tracks and outputs the BASE trajectory.
The introduction of this intermediary coordinate system will also ease future multi-sensor support in SLAM.
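
For illustration, here is a minimal C++/Eigen sketch of the idea (the variable names and mounting values below are made up for the example, not taken from the library): keypoints acquired in the LIDAR frame are expressed in the BASE frame using the base-to-LiDAR offset before being added to the maps.

```cpp
#include <Eigen/Geometry>
#include <cmath>
#include <vector>

int main()
{
  // Hypothetical BaseToLidarOffset: pose of the LiDAR sensor relative to BASE
  // (here, LiDAR mounted 1.2 m above BASE and pitched down by 10 degrees).
  Eigen::Isometry3d baseToLidar = Eigen::Isometry3d::Identity();
  baseToLidar.translate(Eigen::Vector3d(0.0, 0.0, 1.2));
  baseToLidar.rotate(Eigen::AngleAxisd(-10.0 * M_PI / 180.0, Eigen::Vector3d::UnitY()));

  // Keypoints extracted in the LIDAR frame (dummy values).
  std::vector<Eigen::Vector3d> lidarKeypoints = { {5.0, 0.0, 0.0}, {0.0, 3.0, 0.5} };

  // Express each keypoint in the BASE frame: p_base = T_base_lidar * p_lidar.
  std::vector<Eigen::Vector3d> baseKeypoints;
  baseKeypoints.reserve(lidarKeypoints.size());
  for (const auto& p : lidarKeypoints)
    baseKeypoints.push_back(baseToLidar * p);

  return 0;
}
```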
The ParaView plugin has been modified to expose this new parameter.
The ROS wrapping has been more extensively modified to use tf2_ros to get these transforms automatically from the TF server. By setting only an odometry frame (world frame) and a tracking frame (which can be the LiDAR sensor frame, or any user-defined frame linked to the LiDAR sensor frame through a valid TF tree), all transform parameters are received and set automatically, providing an easier interface for the user. To sum up, SLAM will compute the pose of tracking_frame in the odometry_frame coordinate system, using measurements from the input pointcloud frame.
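
As an illustration of this TF-based approach, here is a minimal ROS1 sketch (node and frame names are placeholders, not the actual wrapping code) showing how the transform between the tracking frame and the LiDAR frame can be fetched from the TF server with tf2_ros, instead of asking the user for an explicit calibration parameter:

```cpp
#include <ros/ros.h>
#include <tf2_ros/transform_listener.h>
#include <tf2/exceptions.h>
#include <geometry_msgs/TransformStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "tf_lookup_example");
  ros::NodeHandle nh;

  // Buffer + listener subscribe to /tf and /tf_static and cache the TF tree.
  tf2_ros::Buffer tfBuffer;
  tf2_ros::TransformListener tfListener(tfBuffer);

  // User-chosen frames (placeholder names): the pose of trackingFrame will be
  // estimated in the odometry frame, using points acquired in lidarFrame.
  const std::string trackingFrame = "base_link";
  const std::string lidarFrame    = "velodyne";

  try
  {
    // Pose of the LiDAR sensor relative to the tracking frame, waiting up to
    // 5 s for the TF tree to become available.
    geometry_msgs::TransformStamped baseToLidar =
      tfBuffer.lookupTransform(trackingFrame, lidarFrame, ros::Time(0), ros::Duration(5.0));
    ROS_INFO("Base to LiDAR translation: [%f, %f, %f]",
             baseToLidar.transform.translation.x,
             baseToLidar.transform.translation.y,
             baseToLidar.transform.translation.z);
  }
  catch (const tf2::TransformException& ex)
  {
    ROS_WARN("Could not get transform: %s", ex.what());
  }

  return 0;
}
```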