Assistance with ROS Builds and documentation

I'm looking to get a better understanding of, or documentation on, how to prepare packages for release, especially when they contain external library dependencies.

A few years back now, I released UBLOX DGNSS, a ROS 2-focused high-precision GPS driver for UBLOX F9P devices. Since then I've been fortunate to have others contribute to the project, and it's become quite an advanced driver. Many more features are potentially in the pipeline.

Presently, however, I've noticed that the build farm does not seem to be building the package properly: when the package is installed from that build, there is an issue with libusb. If I compile the code locally, it works.
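One quick way to narrow down an install-time libusb problem is to check whether the installed binary has any unresolved shared libraries. A minimal sketch (the helper name and the binary path are hypothetical; point it at the actual installed node):

```python
# Diagnostic sketch: list shared libraries that the dynamic linker
# cannot resolve for an installed binary, by parsing `ldd` output.
import subprocess


def missing_libs(binary_path):
    """Return the names of shared libraries that `ldd` reports as 'not found'."""
    result = subprocess.run(
        ["ldd", binary_path], capture_output=True, text=True, check=False
    )
    return [
        line.split()[0]
        for line in result.stdout.splitlines()
        if "not found" in line
    ]


if __name__ == "__main__":
    # A well-formed system binary should have no unresolved libraries;
    # run this against the node installed from the build farm debs instead.
    print(missing_libs("/bin/ls"))
```

If the installed node lists `libusb-1.0.so.0` (or similar) as not found, the runtime dependency is probably missing from the package metadata rather than the code.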

My concern is that this may be deterring people from leveraging the driver.

I also know that the documentation for the project needs to be improved, and that is not my strength. How can we engage with others who might be willing to assist us in that area? A few people have raised issues about how to set up high-precision GPS, as an example - we really need a manual that explains the relevant terms, environments and hardware configurations.

Then there is also the build process: how do we make sure that we have configured package.xml and CMakeLists.txt correctly? There have also been a number of build farm messages coming through for build failures, yet no one informed us that we needed to make changes to the configuration files.
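For context, here is a rough sketch of how an external C library like libusb-1.0 might be declared so that rosdep and the build farm can resolve it. The rosdep key `libusb-1.0-dev` and the target name `ublox_dgnss_node` are assumptions; check the rosdistro rosdep database for the exact key your platform uses.

In package.xml:

```xml
<!-- Assumed rosdep key; verify against the rosdistro database -->
<depend>libusb-1.0-dev</depend>
```

In CMakeLists.txt, resolving the library via pkg-config:

```cmake
# Locate libusb-1.0 through pkg-config rather than a hard-coded path
find_package(PkgConfig REQUIRED)
pkg_check_modules(LIBUSB REQUIRED libusb-1.0)

# Target name is illustrative; use your actual executable/library target
target_include_directories(ublox_dgnss_node PRIVATE ${LIBUSB_INCLUDE_DIRS})
target_link_libraries(ublox_dgnss_node ${LIBUSB_LIBRARIES})
```

The important part is that the package.xml entry is what the build farm uses to install the system package into the build and run environments; a library that is only referenced in CMakeLists.txt can compile locally yet fail on the farm.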

Guidance here would be appreciated.


I recently released my first package, rosidlcpp, so I can relate to your difficulties dealing with dependencies.

During my first release, I broke the build farm (specifically the reconfigure jobs that configure the jobs for all packages) by introducing a circular dependency. I'm still not sure what the exact cause was, but my best guess is that packages with message generators (which most of the packages in rosidlcpp are) can't depend on ament_cmake_ros. This was a pain to debug, and I ended up running a minimal version of the build farm to figure out and test a solution.

Once that was fixed, I had some missing dependencies, which were relatively easy to address. I received emails about the failing jobs, and by scrolling through the build logs, the missing dependencies became obvious. What I found most useful was to run the release job locally. After running the script once (and presumably watching it fail), it had set up the Dockerfile used for the build. I then manually added the missing dependency to the Dockerfile and ran docker build and docker run (the full commands are at the end of the shell script) to check that it now worked. However, this only works if you need to test a single package; I don't know of any way to locally test multiple dependent packages.
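As a sketch of that manual Dockerfile edit, assuming the missing dependency turned out to be the Ubuntu libusb development package (the package name here is an assumption; substitute whatever your build log reports as missing):

```dockerfile
# Hypothetical line added to the generated release-job Dockerfile.
# libusb-1.0-0-dev is an assumed package name for illustration only.
RUN apt-get update \
    && apt-get install -y --no-install-recommends libusb-1.0-0-dev \
    && rm -rf /var/lib/apt/lists/*
```

This is only a stopgap for local testing; the real fix is declaring the dependency in package.xml so rosdep installs it for you on the farm.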

I wish there were a way to locally test that a repository builds in a setup as close to the build farm as possible, without having to make a release (just using the source repository or a local copy). It would make it easier to iterate and update dependencies.

@TonyWelte the first time releasing something, with all the checks etc., can be a little daunting. I look at the logs and tbh am not sure half the time what they're actually telling me. I suspect a lot of the log output is the actual setup of the environment to do the compile for each target.

I've avoided Docker etc. and just run bare metal on different physical servers.

Under colcon I did create a test that I use to make sure the uncrustify checks can run locally. It would be nice, though, if it could be part of the IDE; I'm using Zed and Visual Studio Code now.

Yes, being able to confirm that a ROS Build Farm build is actually installable and will run would be nice. In my case, with libusb-1.0, it's so frustrating to find out your packages don't install when they compile locally. Not really sure what to do. I have also raised an issue against the ROS 2 GitHub - issue .. but am unsure if anyone will be able to assist.