Howdy everyone, Your Friendly Neighborhood Navigator here for a special announcement before the holiday break
Open Navigation and NVIDIA have been collaborating over the past few months to more tightly integrate the Isaac Perceptor and ROS technologies (VSLAM, NvBlox visual collision avoidance, and localization) so you can perform lidar-free navigation using only stereo vision on the NVIDIA Jetson Orin or Thor platforms.
We’re excited to release a new tutorial showcasing how to set up and work with Vision-Only Navigation on the Nova Carter platform, which can be adapted to your own platforms (with a bit of legwork). You can find the documentation and a link to the scripts and resources below:
We also have a video demonstration, recorded at Polymath Robotics’ offices in San Francisco (thanks so much for letting us use the space!), showing the point-to-point navigation that makes up most autonomy applications using Isaac Perceptor, without lidars or depth-specific algorithms derived from the stereo camera data:
This is an exciting advancement for the Nav2 community, and I hope you check it out and use parts of the Isaac toolkit to empower your own applications.
Happy visual-izing,
Steve Macenski
