Developing an autonomous weeding robot for orchards using ROS2 Jazzy

I’m developing an autonomous weeding robot for orchards using ROS2 Jazzy. The robot needs to navigate tree rows and weed close to trunks while keeping a 20 cm safety margin.
My approach:
- GPS (ideally RTK) for global path planning and navigation between rows
- Visual-inertial SLAM for precision control when working near trees: RTK accuracy degrades under the canopy (multipath, dropouts), so GNSS alone isn’t safe for 20 cm clearances
- Robust sensor fusion to hand off cleanly between the two modes
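To make "hand off" concrete, one option I’m weighing is inverse-variance fusion instead of a hard switch: weight each source by its reported variance, so VIO naturally dominates as RTK degrades under the canopy. A minimal per-axis sketch (plain Python, no ROS dependencies; the variance numbers are made-up for illustration, not from a real run):

```python
def fuse_1d(mean_a, var_a, mean_b, var_b):
    """Fuse two scalar Gaussian estimates by inverse-variance weighting."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    var = 1.0 / (w_a + w_b)
    mean = var * (w_a * mean_a + w_b * mean_b)
    return mean, var

# Illustration: degraded RTK near a trunk vs. a tight local VIO estimate
gps_x, gps_var = 10.0, 0.5 ** 2    # ~50 cm std dev under canopy
vio_x, vio_var = 10.2, 0.05 ** 2   # ~5 cm std dev

fused_x, fused_var = fuse_1d(gps_x, gps_var, vio_x, vio_var)
# fused_x lands near the VIO estimate; fused_var shrinks below both inputs
```

This is essentially what an EKF does per update step, so in practice I’d expect to express it as measurement covariances fed into a filter rather than hand-rolled code.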
The interesting challenge is transitioning smoothly between GPS-based navigation and VIO-based precision maneuvering as the robot approaches trees.
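Here’s the shape of the mode-switching logic I’m prototyping in simulation: a distance-gated switch with hysteresis, plus a short cross-fade so the pose estimate doesn’t jump at the transition. Plain Python with no ROS dependencies, and every threshold below is a placeholder I haven’t tuned:

```python
ENTER_VIO_DIST = 3.0  # switch to VIO within 3 m of a trunk (placeholder)
EXIT_VIO_DIST = 5.0   # switch back to GPS beyond 5 m (hysteresis gap)
BLEND_TIME = 2.0      # seconds to cross-fade poses between sources

class HandoffArbiter:
    """Distance-gated GPS/VIO arbitration with hysteresis and blending."""

    def __init__(self):
        self.mode = "gps"
        self.blend = 0.0  # 0.0 = all GPS, 1.0 = all VIO

    def update(self, dist_to_trunk, dt):
        # Hysteresis: separate enter/exit radii prevent mode chatter
        if self.mode == "gps" and dist_to_trunk < ENTER_VIO_DIST:
            self.mode = "vio"
        elif self.mode == "vio" and dist_to_trunk > EXIT_VIO_DIST:
            self.mode = "gps"
        # Ramp the blend weight toward the active mode over BLEND_TIME
        target = 1.0 if self.mode == "vio" else 0.0
        step = dt / BLEND_TIME
        if target > self.blend:
            self.blend = min(self.blend + step, target)
        elif target < self.blend:
            self.blend = max(self.blend - step, target)
        return self.mode

    def fused_xy(self, gps_xy, vio_xy):
        # Linear cross-fade of x/y (heading would need angle wrapping)
        w = self.blend
        return tuple((1.0 - w) * g + w * v for g, v in zip(gps_xy, vio_xy))
```

Curious whether people gate on distance like this, or on covariance reported by the GNSS driver, or let a dual-EKF setup handle it implicitly.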
Questions:
1. What VIO/SLAM packages work reliably with ROS2 Jazzy in outdoor agricultural settings?
2. How have others handled the handoff between GPS and visual odometry for hybrid localization?
3. Any recommendations for handling challenging visual conditions (varying sunlight, repetitive tree textures)?
I’m currently working in simulation; I’d love to hear from anyone who’s taken a similar system to hardware.