Seeking Code Samples for Indoor Warehouse Pose Estimation Using robot_localization on ROS2 Humble

Hello everyone,

I am currently working on an indoor warehouse positioning system using robot_localization in ROS2 Humble. As I’m new to robot_localization, I am eager to learn through practical examples and would greatly appreciate any code samples that include BAG files.

My project involves fusing data from several odometry sources, such as visual SLAM, LiDAR SLAM, and wheel odometry, to achieve accurate pose estimation. I’m particularly interested in seeing how sensor data fusion is configured and how robot_localization is set up in similar scenarios.
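For reference, a minimal sketch of an `ekf_node` parameter file for this kind of setup might look like the following. The topic names (`/wheel/odometry`, `/slam/odometry`) are placeholders you would remap to your own sources; the 15-element `_config` vectors follow robot_localization's order of x, y, z, roll, pitch, yaw, then their velocities, then linear accelerations.

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true          # indoor warehouse: planar motion assumption
    publish_tf: true

    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom         # continuous, locally accurate frame

    # Wheel odometry: fuse velocities (smooth but drifts)
    odom0: /wheel/odometry
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    # SLAM odometry (placeholder topic): fuse absolute x, y, yaw
    odom1: /slam/odometry
    odom1_config: [true,  true,  false,
                   false, false, true,
                   false, false, false,
                   false, false, false,
                   false, false, false]
    odom1_differential: false
```

This is only a starting point, not a tuned configuration; in particular, whether SLAM poses belong in the `odom`-frame EKF or a separate `map`-frame EKF depends on how your SLAM output drifts.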

If anyone has examples of code and corresponding BAG files, or any resources that could guide me in configuring and optimizing robot_localization for an indoor environment, it would be extremely helpful.

Thank you.


Hi Yusuke, late to this thread, but FusionCore may help with part of what you need.

It currently handles IMU + wheel odometry + GPS fusion natively on ROS 2 Jazzy. The interface for LiDAR and Visual SLAM odometry inputs exists (both accept nav_msgs/Odometry, which slam_toolbox and ORB-SLAM3 output), but I haven’t tested it with real SLAM data yet.

If you’re willing to test it on your setup and report back, I’d prioritize making it work for your exact configuration.

GitHub: https://github.com/manankharwar/fusioncore

I respond to issues within 24 hours.