2.6 Sensor fusion and SLAM


Get the starter package

git clone -b gazebosim2 https://github.com/roboticslabs/mec_mobile.git

Sensor fusion with EKF

Sensor fusion is the process of combining data from multiple sensors (possibly of different types) to obtain a more accurate estimate of a system's state or its environment. A Kalman Filter (KF) is a mathematical algorithm that estimates the internal state of a system (e.g., position, velocity). The standard Kalman Filter assumes the system is linear.

Real-world systems involving orientation, rotations, or non-linear sensor models (e.g., fusing IMU acceleration, odometry, GPS position, magnetometer) often do not follow purely linear equations. That's where the Extended Kalman Filter (EKF) comes in: it linearizes the models around the current estimate, so the Kalman Filter machinery can be applied to non-linear systems.
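For reference, these are the standard EKF predict/update equations, where f and h are the non-linear motion and measurement models and F_k, H_k their Jacobians:

$$\hat{x}_{k|k-1} = f(\hat{x}_{k-1}, u_k), \qquad P_{k|k-1} = F_k P_{k-1} F_k^\top + Q_k$$

$$K_k = P_{k|k-1} H_k^\top \left(H_k P_{k|k-1} H_k^\top + R_k\right)^{-1}$$

$$\hat{x}_k = \hat{x}_{k|k-1} + K_k \left(z_k - h(\hat{x}_{k|k-1})\right), \qquad P_k = (I - K_k H_k)\, P_{k|k-1}$$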

Luckily, we don't have to bother too much with its implementation, because we'll use the widely used robot_localization package; its default node name is ekf_node. You can install it with:

sudo apt install ros-jazzy-robot-localization

1. Create a new package

cd ~/ros2_ws/src/mec_mobile/
ros2 pkg create --build-type ament_cmake --license BSD-3-Clause mec_mobile_navigation
cd mec_mobile_navigation
mkdir -p config && cd config
touch ekf.yaml

Configuring the robot_localization package can be tricky in the beginning (see the official documentation), but I already created an ekf.yaml file in the config folder based on the official guidelines that will do the job in this lesson.

The ekf_node we configure here (ekf.yaml) will subscribe to the following topics (ROS message types are in parentheses):

  • /odom (nav_msgs/Odometry)

  • /imu (sensor_msgs/Imu)

This ekf_node will publish data to the following topics:

  • /odometry/filtered (nav_msgs/Odometry): The smoothed odometry information generated by fusing the IMU and wheel odometry data.

  • /tf : Coordinate transform from the odom frame (parent) to the base_footprint frame (child).

In other words, robot_localization publishes a filtered odometry topic and a tf (odom -> base_footprint), giving us an improved odometry coordinate system.
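A minimal sketch of what such an ekf.yaml can look like, following the robot_localization documentation (the actual file in the starter package may fuse a different set of fields, so treat the values below as illustrative):

ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true            # planar robot: ignore z, roll and pitch
    publish_tf: true            # broadcast odom -> base_footprint
    odom_frame: odom
    base_link_frame: base_footprint
    world_frame: odom           # fuse in the odom frame (no GPS/map input)

    odom0: odom                 # wheel odometry topic
    # field order: x, y, z, roll, pitch, yaw,
    #              vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    imu0: imu                   # IMU topic
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  true,  false, false]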

Add robot_localization (ekf_node) to the spawn_robot.launch.py launch file and comment out the /tf bridge. Since a robot cannot have two odometry TFs, we have to stop Gazebo from publishing one; the easiest way is to simply not bridge /tf in the parameter_bridge:
    # Node to bridge /cmd_vel and /odom
    gz_bridge_node = Node(
        package="ros_gz_bridge",
        executable="parameter_bridge",
        arguments=[
            "/clock@rosgraph_msgs/msg/Clock[gz.msgs.Clock",
            "/cmd_vel@geometry_msgs/msg/Twist@gz.msgs.Twist",
            "/odom@nav_msgs/msg/Odometry@gz.msgs.Odometry",
            "/joint_states@sensor_msgs/msg/JointState@gz.msgs.Model",
            #"/tf@tf2_msgs/msg/TFMessage@gz.msgs.Pose_V",
            #"/camera/image@sensor_msgs/msg/Image@gz.msgs.Image",
            "/camera/camera_info@sensor_msgs/msg/CameraInfo@gz.msgs.CameraInfo",
            "imu@sensor_msgs/msg/Imu@gz.msgs.IMU",
        ],
        output="screen",
        parameters=[
            {'use_sim_time': LaunchConfiguration('use_sim_time')},
        ]
    )
    
    ekf_node = Node(
        package='robot_localization',
        executable='ekf_node',
        name='ekf_filter_node',
        output='screen',
        parameters=[
            os.path.join(pkg_mec_mobile_navigation, 'config', 'ekf.yaml'),
            {'use_sim_time': LaunchConfiguration('use_sim_time')},
             ]
    )
    
    launchDescriptionObject.add_action(ekf_node)
You can inspect the TF tree with:

ros2 run rqt_tf_tree rqt_tf_tree

From the rqt_tf_tree tool we cannot tell which node is broadcasting each transform, but we can check which nodes publish the /tf topic with: ros2 topic info /tf --verbose. We will see 2 publisher nodes, the ros_gz_bridge and the robot_state_publisher, as we expected.

2. Tidy the launch file

For the launch file in the mec_mobile_navigation package, we move all the parameter_bridge topics into a YAML config file, gz_bridge.yaml.
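ros_gz_bridge can read its topic list from a YAML file passed in through its config_file parameter. A minimal sketch of the format with two of our topics (the actual gz_bridge.yaml in the repo will list more, e.g. /clock, /joint_states and the camera and IMU topics from above):

- ros_topic_name: "/cmd_vel"
  gz_topic_name: "/cmd_vel"
  ros_type_name: "geometry_msgs/msg/Twist"
  gz_type_name: "gz.msgs.Twist"
  direction: ROS_TO_GZ

- ros_topic_name: "/odom"
  gz_topic_name: "/odom"
  ros_type_name: "nav_msgs/msg/Odometry"
  gz_type_name: "gz.msgs.Odometry"
  direction: GZ_TO_ROS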

cd ~/ros2_ws/src/mec_mobile/mec_mobile_navigation/
mkdir -p launch/ && cd launch
touch ekf_gazebo.launch.py
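Inside the new launch file, the bridge node can then point to this config file instead of listing every topic inline. A sketch of the relevant part, assuming the same package layout as before (the actual ekf_gazebo.launch.py may differ):

import os
from launch_ros.actions import Node
from launch.substitutions import LaunchConfiguration
from ament_index_python.packages import get_package_share_directory

pkg_mec_mobile_navigation = get_package_share_directory('mec_mobile_navigation')

# parameter_bridge reads its topic list from gz_bridge.yaml via config_file
gz_bridge_node = Node(
    package='ros_gz_bridge',
    executable='parameter_bridge',
    parameters=[
        {'config_file': os.path.join(pkg_mec_mobile_navigation, 'config', 'gz_bridge.yaml')},
        {'use_sim_time': LaunchConfiguration('use_sim_time')},
    ],
    output='screen',
)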
Rebuild the workspace and try it:

ros2 launch mec_mobile_navigation spawn_robot.launch.py

As expected, the tf_tree looks the same, but if we check the publishers of /tf we'll now see the following nodes: ekf_filter_node and robot_state_publisher. We can also see that there is an /odometry/filtered topic published by the ekf_filter_node.

SLAM

Let's learn how to create a map of the robot's surroundings. In practice we use SLAM algorithms; SLAM stands for Simultaneous Localization and Mapping. It is a fundamental technique in robotics (and other fields) that allows a robot to:

  1. Build a map of an unknown environment (mapping).

  2. Track its own pose (position and orientation) within that map at the same time (localization).

1. SLAM Toolbox

In this lesson we will use the slam_toolbox package, which has to be installed first:

sudo apt install ros-jazzy-slam-toolbox
# the following are optional
sudo apt install ros-jazzy-navigation2
sudo apt install ros-jazzy-nav2-bringup

After installing it we will create a new launch file for mapping. The slam_toolbox parameters are already in the config folder (slam_toolbox_mapping.yaml) and we'll use these parameters.
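For orientation, a slam_toolbox mapping configuration typically contains parameters like these (a sketch only; the values in slam_toolbox_mapping.yaml may differ):

slam_toolbox:
  ros__parameters:
    solver_plugin: solver_plugins::CeresSolver
    mode: mapping                  # build a new map (vs. localization)
    odom_frame: odom
    map_frame: map
    base_frame: base_footprint
    scan_topic: /scan
    resolution: 0.05               # map cell size in meters
    minimum_travel_distance: 0.2   # process a new scan after this much motion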

Let's create mapping.launch.py in the mec_mobile_navigation package:

mapping.launch.py
import os
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument, IncludeLaunchDescription
from launch.conditions import IfCondition
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import LaunchConfiguration, PathJoinSubstitution, Command
from launch_ros.actions import Node
from ament_index_python.packages import get_package_share_directory

def generate_launch_description():

    pkg_mec_mobile_navigation = get_package_share_directory('mec_mobile_navigation')

    rviz_launch_arg = DeclareLaunchArgument(
        'rviz', default_value='true',
        description='Open RViz'
    )

    rviz_config_arg = DeclareLaunchArgument(
        'rviz_config', default_value='mapping.rviz',
        description='RViz config file'
    )

    sim_time_arg = DeclareLaunchArgument(
        'use_sim_time', default_value='True',
        description='Flag to enable use_sim_time'
    )

    # Path to the Slam Toolbox launch file
    slam_toolbox_launch_path = os.path.join(
        get_package_share_directory('slam_toolbox'),
        'launch',
        'online_async_launch.py'
    )

    slam_toolbox_params_path = os.path.join(
        get_package_share_directory('mec_mobile_navigation'),
        'config',
        'slam_toolbox_mapping.yaml'
    )

    # Launch rviz
    rviz_node = Node(
        package='rviz2',
        executable='rviz2',
        arguments=['-d', PathJoinSubstitution([pkg_mec_mobile_navigation, 'rviz', LaunchConfiguration('rviz_config')])],
        condition=IfCondition(LaunchConfiguration('rviz')),
        parameters=[
            {'use_sim_time': LaunchConfiguration('use_sim_time')},
        ]
    )

    slam_toolbox_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(slam_toolbox_launch_path),
        launch_arguments={
                'use_sim_time': LaunchConfiguration('use_sim_time'),
                'slam_params_file': slam_toolbox_params_path,
        }.items()
    )

    launchDescriptionObject = LaunchDescription()

    launchDescriptionObject.add_action(rviz_launch_arg)
    launchDescriptionObject.add_action(rviz_config_arg)
    launchDescriptionObject.add_action(sim_time_arg)
    launchDescriptionObject.add_action(rviz_node)
    launchDescriptionObject.add_action(slam_toolbox_launch)

    return launchDescriptionObject

Let's also move the RViz functions from spawn_robot.launch.py to the new launch file; in spawn_robot.launch.py they are now commented out:

    #launchDescriptionObject.add_action(rviz_launch_arg)
    #launchDescriptionObject.add_action(rviz_config_arg)
    #launchDescriptionObject.add_action(rviz_node)
    launchDescriptionObject.add_action(ekf_node)

Rebuild the workspace; we'll need 2 terminals to test it. In the first terminal we launch the simulation like before (but this time it won't open RViz):

ros2 launch mec_mobile_navigation spawn_robot.launch.py

And in another terminal we launch the new mapping.launch.py:

ros2 launch mec_mobile_navigation mapping.launch.py

Let's first take a look at rqt_tf_tree:

ros2 run rqt_tf_tree rqt_tf_tree

We can see an additional map frame above the odom odometry frame. We can also visualize this transformation in RViz.

The difference between the odom and map frames shows the accumulated drift of our odometry over time. Don't forget that this is already much improved thanks to the sensor fusion.

2. Build the map

With SLAM Toolbox we can also save maps; we have two options (see the service-call examples after this list):

  1. Save Map: The map is saved as a .pgm file and a .yaml file. This is a black-and-white image file that can be used by other ROS nodes for localization. Since it's only an image file, it's impossible to continue the mapping from such a file.

  2. Serialize Map: With this feature we can serialize (and later deserialize) SLAM Toolbox's pose graph, so it can be loaded and the mapping can be continued. However, other ROS nodes won't be able to read it or use it for localization.
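Both buttons live on the SLAM Toolbox panel in RViz, and the same actions can be triggered from the command line through slam_toolbox's services; for example (the file names here are placeholders):

# save a .pgm + .yaml image map
ros2 service call /slam_toolbox/save_map slam_toolbox/srv/SaveMap "{name: {data: 'my_map'}}"
# serialize the pose graph so mapping can be continued later
ros2 service call /slam_toolbox/serialize_map slam_toolbox/srv/SerializePoseGraph "{filename: 'my_map_serial'}"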

After saving a serialized map, we can load (deserialize) it next time. We can also load the map that ships with the starter package of this lesson, using a custom node to deserialize the already saved map, similar to the Deserialize Map button in RViz:

ros2 run mec_mobile_navigation_py slam_toolbox_load_map
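Such a node is essentially a client of slam_toolbox's deserialize service. A hypothetical sketch of what slam_toolbox_load_map might look like (the actual implementation in mec_mobile_navigation_py may differ):

import rclpy
from rclpy.node import Node
from slam_toolbox.srv import DeserializePoseGraph

class SlamToolboxLoadMap(Node):
    def __init__(self):
        super().__init__('slam_toolbox_load_map')
        self.client = self.create_client(DeserializePoseGraph, '/slam_toolbox/deserialize_map')
        while not self.client.wait_for_service(timeout_sec=1.0):
            self.get_logger().info('waiting for /slam_toolbox/deserialize_map...')

    def load(self, filename):
        # filename points to the serialized pose graph saved earlier
        request = DeserializePoseGraph.Request()
        request.filename = filename
        # continue mapping from the first node of the loaded graph
        request.match_type = DeserializePoseGraph.Request.START_AT_FIRST_NODE
        future = self.client.call_async(request)
        rclpy.spin_until_future_complete(self, future)
        return future.result()

def main():
    rclpy.init()
    node = SlamToolboxLoadMap()
    node.load('my_map_serial')  # placeholder file name, adjust to your saved map
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()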

Another way to save the .pgm and .yaml map for later use is the map_saver_cli tool of the navigation stack:

ros2 run nav2_map_server map_saver_cli -f my_map

To use it, map_server has to be installed:

sudo apt install ros-jazzy-nav2-map-server
