2.6 Sensor fusion and SLAM

Get the starter package

git clone -b gazebosim2 https://github.com/roboticslabs/mec_mobile.git

Sensor fusion with EKF

Sensor fusion is the process of combining data from multiple sensors (possibly of different types) to obtain a more accurate estimate of the system's state or its environment than any single sensor could provide. A Kalman Filter (KF) is a mathematical algorithm that estimates the internal state of a system (e.g., position, velocity) from noisy measurements. The standard Kalman Filter assumes the system is linear.

Real-world systems involving orientation, rotations, or non-linear sensor models (e.g., fusing IMU acceleration, odometry, GPS position, or magnetometer data) often do not follow purely linear equations. That's where the Extended Kalman Filter (EKF) comes in: it linearizes the system around the current estimate, so the Kalman Filter machinery can be applied to non-linear systems.
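For reference, the standard EKF predict and update steps look like this, where $\hat{x}$ is the state estimate, $P$ its covariance, $f$ and $h$ the non-linear motion and measurement models, $F_k$ and $H_k$ their Jacobians, and $Q_k$, $R_k$ the process and measurement noise covariances (background only; robot_localization implements all of this for us):

$$
\begin{aligned}
\hat{x}_{k|k-1} &= f(\hat{x}_{k-1|k-1}, u_k) \\
P_{k|k-1} &= F_k P_{k-1|k-1} F_k^\top + Q_k \\
K_k &= P_{k|k-1} H_k^\top \left(H_k P_{k|k-1} H_k^\top + R_k\right)^{-1} \\
\hat{x}_{k|k} &= \hat{x}_{k|k-1} + K_k \left(z_k - h(\hat{x}_{k|k-1})\right) \\
P_{k|k} &= (I - K_k H_k)\, P_{k|k-1}
\end{aligned}
$$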

Luckily, we don't have to worry much about its implementation, because we'll use the widely used robot_localization package; its EKF node executable is called ekf_node. You can install it with:

sudo apt install ros-jazzy-robot-localization

1. Create a new package

cd ~/ros2_ws/src/mec_mobile/
ros2 pkg create --build-type ament_cmake --license BSD-3-Clause mec_mobile_navigation
cd mec_mobile_navigation
mkdir -p config && cd config
touch ekf.yaml

Configuring the robot_localization package can be tricky at first (see the official documentation), but I already created an ekf.yaml file in the config folder, based on the official guidelines, that will do the job in this lesson.
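As an illustration, a minimal ekf.yaml might look like the sketch below. The topic names (/odom, /imu) and frame names are assumptions; match them to what your simulation actually publishes and to the values in the starter package:

```yaml
# ekf.yaml -- minimal sketch, not the full file from the starter package
ekf_filter_node:
  ros__parameters:
    frequency: 30.0            # filter update rate [Hz]
    two_d_mode: true           # planar robot: ignore z, roll, pitch
    publish_tf: true           # broadcast odom -> base_footprint

    map_frame: map
    odom_frame: odom
    base_link_frame: base_footprint
    world_frame: odom          # fuse continuous data in the odom frame

    # Wheel odometry input (topic name is an assumption)
    odom0: /odom
    # order: x, y, z, roll, pitch, yaw, vx, vy, vz,
    #        vroll, vpitch, vyaw, ax, ay, az
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    # IMU input (topic name is an assumption)
    imu0: /imu
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  true,  false, false]
```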

The ekf_node we configure here (ekf.yaml) will subscribe to the following topics (ROS message types are in parentheses):

  • The wheel odometry topic selected by odom0 in ekf.yaml (nav_msgs/Odometry).

  • The IMU topic selected by imu0 (sensor_msgs/Imu).

This ekf_node will publish data to the following topics:

  • /odometry/filtered : The smoothed odometry information (nav_msgs/Odometry) generated by fusing the IMU and wheel odometry data.

  • /tf : Coordinate transform from the odom frame (parent) to the base_footprint frame (child).

In short, robot_localization publishes a filtered odometry topic and a TF (odom -> base_footprint) based on the improved odometry estimate.

Create a spawn_robot.launch.py file. Since a robot cannot have two odometry TF broadcasters, we have to stop Gazebo from publishing its own. The easiest way is to simply not bridge the /tf topic in the parameter_bridge.

Add robot_localization (ekf_node) to the launch file and comment out the /tf bridging, as in the sketch below.
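Here is a minimal sketch of the relevant part of spawn_robot.launch.py. The /odom and /imu topic names and the ekf.yaml path are assumptions; the real launch file in the starter package bridges more topics:

```python
import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    pkg_share = get_package_share_directory('mec_mobile_navigation')

    # Gazebo <-> ROS bridge; /tf is commented out so that only
    # the EKF broadcasts the odom -> base_footprint transform
    gz_bridge = Node(
        package='ros_gz_bridge',
        executable='parameter_bridge',
        arguments=[
            '/odom@nav_msgs/msg/Odometry[gz.msgs.Odometry',
            '/imu@sensor_msgs/msg/Imu[gz.msgs.IMU',
            # '/tf@tf2_msgs/msg/TFMessage[gz.msgs.Pose_V',
        ],
        output='screen',
    )

    # robot_localization EKF, configured by ekf.yaml
    ekf = Node(
        package='robot_localization',
        executable='ekf_node',
        name='ekf_filter_node',
        parameters=[os.path.join(pkg_share, 'config', 'ekf.yaml'),
                    {'use_sim_time': True}],
        output='screen',
    )

    return LaunchDescription([gz_bridge, ekf])
```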

From the rqt_tf_tree tool we cannot tell which node is broadcasting a transform, but we can check which nodes are publishing the /tf topic:

ros2 topic info /tf --verbose

We will see two publisher nodes, ros_gz_bridge and robot_state_publisher, as we expected.

2. Tidy the launch file

For the launch file in the mec_mobile_navigation package, we move all the parameter_bridge topics into a YAML config file, gz_bridge.yaml.
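A sketch of what gz_bridge.yaml could look like, using the ros_gz_bridge config-file format. The topic names are assumptions and the actual list must match your simulation; note that /tf is deliberately left out:

```yaml
# gz_bridge.yaml -- sketch; topic names are assumptions
- ros_topic_name: "/cmd_vel"
  gz_topic_name: "/cmd_vel"
  ros_type_name: "geometry_msgs/msg/Twist"
  gz_type_name: "gz.msgs.Twist"
  direction: ROS_TO_GZ

- ros_topic_name: "/odom"
  gz_topic_name: "/odom"
  ros_type_name: "nav_msgs/msg/Odometry"
  gz_type_name: "gz.msgs.Odometry"
  direction: GZ_TO_ROS

- ros_topic_name: "/imu"
  gz_topic_name: "/imu"
  ros_type_name: "sensor_msgs/msg/Imu"
  gz_type_name: "gz.msgs.IMU"
  direction: GZ_TO_ROS
```

The parameter_bridge node can then be started with its config_file parameter pointing at this file instead of listing every topic as a command-line argument.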

Rebuild the workspace and try it!
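For example:

cd ~/ros2_ws
colcon build --symlink-install
source install/setup.bash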

As we expected, the tf_tree looks the same, but if we check the publishers of /tf, we'll now see the following nodes: ekf_filter_node and robot_state_publisher. We can also see that there is an /odometry/filtered topic published by the ekf_filter_node.

SLAM

Let's learn how to create a map of the robot's surroundings. In practice we use SLAM algorithms; SLAM stands for Simultaneous Localization and Mapping. It is a fundamental technique in robotics (and other fields) that allows a robot to:

  1. Build a map of an unknown environment (mapping).

  2. Track its own pose (position and orientation) within that map at the same time (localization).

1. SLAM Toolbox

In this lesson we will use the slam_toolbox package, which has to be installed first:
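sudo apt install ros-jazzy-slam-toolbox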

After installing it, we will create a new launch file for mapping. The slam_toolbox parameters are already in the config folder (slam_toolbox_mapping.yaml), and we'll use these parameters.

Let's create mapping.launch.py in the mec_mobile_navigation package:

mapping.launch.py

Let's also move the RViz-related actions from spawn_robot.launch.py into the new launch file.
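A minimal sketch of what mapping.launch.py can look like; the RViz config file name is an assumption, and the real file in the starter package may include more arguments and remappings:

```python
import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    pkg_share = get_package_share_directory('mec_mobile_navigation')

    # slam_toolbox in asynchronous online mapping mode,
    # configured from config/slam_toolbox_mapping.yaml
    slam_toolbox = Node(
        package='slam_toolbox',
        executable='async_slam_toolbox_node',
        name='slam_toolbox',
        parameters=[os.path.join(pkg_share, 'config',
                                 'slam_toolbox_mapping.yaml'),
                    {'use_sim_time': True}],
        output='screen',
    )

    # RViz, moved here from spawn_robot.launch.py
    # (the .rviz file name is an assumption)
    rviz = Node(
        package='rviz2',
        executable='rviz2',
        arguments=['-d', os.path.join(pkg_share, 'rviz', 'mapping.rviz')],
        output='screen',
    )

    return LaunchDescription([slam_toolbox, rviz])
```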

Build the workspace; we'll need two terminals to test it. In the first terminal we launch the simulation like before; this time it won't open RViz.
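Assuming spawn_robot.launch.py also lives in the mec_mobile_navigation package:

ros2 launch mec_mobile_navigation spawn_robot.launch.py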

And in another terminal we launch the new mapping.launch.py:
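ros2 launch mec_mobile_navigation mapping.launch.py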

Let's first take a look at rqt_tf_tree:

We can see an additional map frame above the odom odometry frame. We can also visualize this transformation in RViz:

The difference between the odom and map frames shows the accumulated drift of our odometry over time. Don't forget that this is already improved a lot thanks to the sensor fusion.

2. Build the map

With SLAM Toolbox we can also save the maps; we have two options:

  1. Save Map: The map is saved as a .pgm file and a .yaml file. This is a black-and-white image file that can be used by other ROS nodes for localization. Since it's only an image file, it's impossible to continue the mapping from such a file.

  2. Serialize Map: With this feature we can serialize (and later deserialize) SLAM Toolbox's pose graph, so it can be loaded again and the mapping can be continued. However, other ROS nodes won't be able to read it or use it for localization.

After saving a serialized map, we can load (deserialize) it next time.

And we can also load the map that is in the starter package of this lesson. We can use a custom node to deserialize the already saved map, similar to the Deserialize Map button in RViz:
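Instead of a custom node, the same thing can also be triggered from the command line through slam_toolbox's deserialization service. The map path below is a placeholder (given without a file extension), and match_type: 1 means "start at the first node"; check the DeserializePoseGraph service definition for the other options:

ros2 service call /slam_toolbox/deserialize_map slam_toolbox/srv/DeserializePoseGraph "{filename: '/path/to/serialized_map', match_type: 1}"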

Another way to save the .pgm and .yaml map for later use is the map_saver_cli tool of the navigation stack.

To use it, map_server has to be installed:
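sudo apt install ros-jazzy-nav2-map-server

Then, while mapping is running, the map can be saved with, e.g.:

ros2 run nav2_map_server map_saver_cli -f my_map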
