Howdy! It's your Friendly Neighborhood Navigator here with an announcement in conjunction with NVIDIA's Isaac 3.0 Release! We've been working closely with NVIDIA this year to bring new capabilities to the mobile robotics community, and we have a new capability in Nav2: OpenNav Docking!
This package is a complete and generalized autonomous docking solution! Auto-docking has long been missing low-hanging fruit in the ROS ecosystem. While other packages exist, such as fetch_open_auto_dock and open-rmf, they lack the generalization across robot types, docks, situations, and detection methodologies to be directly applicable to non-Fetch robots or non-AprilTag detection pipelines. So, opennav_docking includes:
ChargingDock plugins to specify dock specifics like detection, staging poses, and charging state
A SimpleChargingDock plugin is provided which handles many common situations using standard ROS APIs like BatteryState and JointStates, which may work out of the box for a non-trivial number of potential users
An Action API to dock or undock a robot as a stand-alone feature and Behavior Tree nodes to compose this feature within your high level robot behaviors
A docking database allowing users to register many docks, of different types, across an environment
A continuous detection-control loop that uses live detections to refine the dock pose target
90%+ Unit Test Coverage + a Tutorial about how to set it up with examples
Retry mechanisms to dock even in failure conditions
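To make the ChargingDock idea concrete, here's a minimal sketch of how a staging pose can be derived from a dock's pose. The function name, convention (dock yaw pointing outward, away from the dock face), and default offset are illustrative assumptions, not the package's actual plugin API:

```python
import math

def staging_pose(dock_x, dock_y, dock_yaw, offset=0.7):
    """Place a staging pose `offset` meters in front of the dock,
    facing back toward it, so the final approach is a broadly
    straight motion. (Illustrative sketch only; the real plugin
    API and conventions differ.)"""
    sx = dock_x + offset * math.cos(dock_yaw)
    sy = dock_y + offset * math.sin(dock_yaw)
    # Face back toward the dock for the final approach
    syaw = dock_yaw + math.pi
    return sx, sy, syaw
```

The server first navigates to a pose like this, then hands control to the detection-in-the-loop approach stage.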
You can see this in action in the videos below!
These have >98% successful docking rates. The only failures are due to massive and intentionally caused delocalization error or putting the robots into particularly challenging or unusual situations to test its limits.
This is currently available in open-navigation/opennav_docking but will be migrated into the Nav2 stack directly in the coming days.
Thanks again to NVIDIA for sponsoring this project and allowing Open Navigation to open-source and support this for the entire community. The README for the project has a ton of great context, detail, and information (as you'd expect from a Nav2 package!) - in addition to the Configuration Guide and Tutorial that is live now at docs.nav2.org!
A quick update - I've just completed adding a non-charging dock plugin type (and associated logic) so that you can dock with conveyors, infrastructure, pallets, or whatever else your heart desires (and you can BYO-detector for), in addition to charging stations.
Check out a video of several companies that have adopted Nav2’s new docking server!
Neobotix!
Pal Robotics!
Robotnik!
NVIDIA / Segway!
Wasp Research Group, University of Malaga!
These companies are using the server's flexibility to work with AprilTags, 3D pose-detection AI, and 2D lidar ICP templates to dock with chargers and infrastructure like conveyor belts, using several types of omnidirectional and differential-drive robots!
Does this support holonomic non-charging docking? I want the final move to strafe too. I can strafe into staging, but it acts like a diff-drive robot during the visual-servoing stage of docking. I am using it with a Clearpath Ridgeback. I want to avoid unnecessary turning, as one of my dock locations is next to the frame outrigger.
Currently, strafing is not supported, but the omni motion model has already been shown to dock well without it when driving into the dock. It's also been shown on Ackermann and differential-drive platforms, both indoors and outdoors.
I can strafe into staging, but it acts like a diff-drive robot during the visual-servoing stage of docking.
If you can pre-strafe as part of the staging pose, that would be the lowest-friction way of docking.
I am using it with a Clearpath Ridgeback. I want to avoid unnecessary turning, as one of my dock locations is next to the frame outrigger.
It shouldn't attempt to turn. If you pre-strafe so that it's aligned with the dock for a broadly straight motion, then it'll provide a broadly straight motion, minus corrections for any minor misalignment between the staging pose and the detected dock feature (which you want).
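As a back-of-the-envelope check for the pre-strafe suggestion, the lateral error you'd want to strafe away before docking is just the robot's signed distance from the dock's approach axis. A small sketch (the function and its conventions are hypothetical, not part of the docking server):

```python
import math

def lateral_offset(robot_x, robot_y, dock_x, dock_y, dock_yaw):
    """Signed lateral distance of the robot from the dock's approach
    axis (dock_yaw points outward from the dock face; positive means
    left of the axis). Strafing by -lateral_offset before starting
    the dock action leaves only a straight approach to execute."""
    dx = robot_x - dock_x
    dy = robot_y - dock_y
    # Project the displacement onto the axis normal
    return -dx * math.sin(dock_yaw) + dy * math.cos(dock_yaw)
```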
I had to relax my collision-detection CostCritic and inflation radius because the robot's base_link crosses into the costmap near the outriggers. Because of this, I don't always align perfectly with the ArUco tag at staging before it moves on to docking. I am trying to make a shim to allow lateral movement during the docking phase, but so far I have had no luck.
What controller are you using? MPPI has a setting so that, near the goal, it bypasses collision scoring (except for actual, real collisions), letting you get right up close to obstacles in higher-cost space. You may want to look at the parameter values there.
Not perfectly aligning is perfectly OK; it should account for that. But what kind of errors are you seeing: 5 deg, 15 deg, 45 deg? Small angular differences will be handled by the docking controller with hardly perceptible angular corrections. Larger ones will obviously produce larger angular movements during the curve to dock, but that's also exactly what you want so that you actually dock with the target. Either way, it would never rotate in place.
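For intuition on why small errors produce gentle arcs rather than in-place rotation: the curve-to-dock behavior follows a smooth control law in the spirit of Park and Kuipers' graceful motion controller, where commanded curvature vanishes as the heading and bearing errors go to zero. A rough sketch, with gains and exact form as assumptions rather than the server's tuned implementation:

```python
import math

def graceful_control(r, theta, delta, k1=1.0, k2=3.0,
                     v_max=0.25, beta=0.4, lam=2.0):
    """Compute (velocity, curvature) toward a target at range r,
    where theta is the target's orientation relative to the line of
    sight and delta is the robot's heading error from that line.
    (Sketch of a Park/Kuipers-style smooth control law; gains are
    made up for illustration.)"""
    kappa = -(1.0 / r) * (
        k2 * (delta - math.atan(-k1 * theta))
        + (1.0 + k1 / (1.0 + (k1 * theta) ** 2)) * math.sin(delta)
    )
    # Slow down on high-curvature segments of the approach
    v = v_max / (1.0 + beta * abs(kappa) ** lam)
    return v, kappa
```

With zero heading and bearing error the curvature is zero (a straight drive at full speed); a 15-20 degree error just bends the approach into an arc and slows the robot, which is the behavior described above.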
Yes, MPPI. Probably around 15 to 20 degrees; it's pretty random and inconsistent. Sometimes it actually runs into the dock. I am only running it in simulation right now; I haven't received the robot yet. When it gets too close, the image disappears and it runs into the dock. Sometimes it will retry; other times it aborts.