Sensor Fusion in ROS 2 with robot_localization - ROS Developers OpenClass #209

Hi ROS Community,

To achieve accurate localization for any mobile robot, you need to reliably combine data from multiple sensors.
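To get an intuition for why fusing sensors helps, here is a minimal sketch (not part of the class materials) of the inverse-variance weighted fusion rule that underlies Kalman-style estimators like the one in robot_localization. The numbers are made up for illustration:

```python
# Minimal sketch: combine two noisy estimates of the same quantity,
# weighting each by the inverse of its variance. This is the optimal
# linear fusion rule for independent Gaussian measurements.

def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two scalar estimates."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return x, var

# Hypothetical example: wheel odometry says x = 2.0 m (variance 0.04),
# another estimate says x = 2.2 m (variance 0.01).
x, var = fuse(2.0, 0.04, 2.2, 0.01)
print(x, var)  # → 2.16 0.008
```

Note that the fused estimate lands closer to the more certain sensor, and its variance (0.008) is lower than either input's, which is the whole point of fusion.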

In this open class, you’ll learn how to perform sensor fusion in ROS 2 using the robot_localization package.

We’ll break down how robot_localization works, how to configure it for different sensor setups, and how to integrate it into a full ROS 2 navigation pipeline.
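As a taste of what configuring robot_localization looks like, here is a hedged sketch of an `ekf_filter_node` parameter file fusing wheel odometry and an IMU. The parameter names (`odom0`, `imu0`, `*_config`, `two_d_mode`, etc.) come from the package's documented interface; the topic names and which fields are fused are illustrative assumptions, not the class's actual configuration:

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0        # filter update rate in Hz
    two_d_mode: true       # planar robot: ignore z, roll, pitch
    publish_tf: true
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # First odometry source (topic name is an assumption)
    odom0: /odom
    # 15 booleans: [x, y, z, roll, pitch, yaw,
    #               vx, vy, vz, vroll, vpitch, vyaw,
    #               ax, ay, az] — here: fuse vx, vy, and yaw rate
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    # IMU source (topic name is an assumption)
    imu0: /imu
    # Fuse yaw, yaw rate, and x acceleration
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  true,  false, false]
```

A common design choice shown here is fusing only velocities (not absolute positions) from wheel odometry, so the EKF output stays continuous and drift is corrected by other sources.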

This free class is open to everyone and includes a ready-to-use, practical ROS project with code and simulation.

What you’ll learn:

  • Grasping ROS 2 publishers and timer-driven message broadcasting

  • Manipulating geometry_msgs/Twist message structures

  • Developing and deploying C++ nodes in ROS 2

  • Commanding robot motion via velocity instructions

  • Setting up CMakeLists.txt for C++ ROS 2 packages

The robot we’ll use in this class:

ROSBot XL (developed by Husarion)


How to join:

Save the link below to watch the live session on 2025-12-09, 17:00–18:00 UTC:
Sensor Fusion in ROS 2 | Open Class

Organizer

The Construct Robotics Institute
theconstruct.ai

New tutorials / classes are great!

But I really get a Python 2 vs. Python 3 vibe here, or, more recently, ROS 1 vs. ROS 2.

Developers were like: instead of iterating on this concept, let’s start all over again and surely everyone will adapt to the new thing quickly!

robot_localization has been “deprecated”[1] for over 6 years now, in favor of fuse.
Yet I don’t see much adoption of, or many tutorials for, the latter.

Just an observation, not blaming anyone. Hopefully the new tool is better and gets some tutorial love as well (or a migration guide).


  1. OK, deprecation might be a bit harsh; it was switched to maintenance mode ↩︎
