robotdatapy: A Python package for loading/processing ROS1/2 bag data

I’ve been working on a tool for loading ROS data (and other robot data) into an easy-to-use interface (GitHub link here). I recently put together an introductory tutorial to make it easy to get started! The tutorial walks through how to load and manipulate image data, point cloud data, and pose data to produce this visualization:

(animated point cloud visualization)

robotdatapy (robot•data•py) aims to be a unified data loading architecture that enables:

  1. Loading data from different sources (ROS 1/2 bags, folders of images, text files, CSV files, etc.) into a common interface for core robot data types: pose data, image data, point cloud data, etc. (a rough usage sketch follows this list).

  2. Accessing different data sources at a specific time. Time synchronization is important in robotics applications and should be simple; robotdatapy makes it easy
    to do things like finding the pose of the robot at the time of an image (interpolating poses if necessary).

  3. Easy data manipulation (for example, transforming the frames that poses are expressed in).

  4. Saving modified data.
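
As a rough illustration of this workflow, here is a minimal sketch. The class and method names (PoseData, ImgData, from_bag, pose(t), img(t)) are assumptions based on the feature list above rather than a confirmed API - see the linked tutorial for the real interface.

```python
# Illustrative sketch only -- all robotdatapy names below are assumed,
# not taken from the actual API; consult the tutorial for real usage.
from robotdatapy.data import ImgData, PoseData  # assumed import path

bag = "path/to/bag"  # ROS 1 bag file or ROS 2 bag directory

# 1. Load pose and image data from different sources into common types.
pose_data = PoseData.from_bag(bag, topic="/odom")            # assumed constructor
img_data = ImgData.from_bag(bag, topic="/camera/image_raw")  # assumed constructor

# 2. Time-synchronized access: look up the robot pose at each image time,
#    interpolating between pose samples if necessary.
for t in img_data.times:                 # assumed list of image timestamps
    T_world_body = pose_data.pose(t)     # assumed 4x4 pose accessor
    img = img_data.img(t)                # assumed image accessor
```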

robotdatapy is available for installation by running:

pip install robotdatapy

Hope this might help with others’ robot data processing!


Thanks for sharing. I really like projects like this that make reading ROS 2 bag files more independent of the whole ROS stack.

Are you aware of the rosbags project and the associated rosbags-dataframe and rosbags-image? I think they provide similar functionality of reading from bag files and converting the data to more generic Python types.

Thanks @christian ! Yes! I think these are life-saving tools for sure, and I use rosbags and rosbags-image under the hood of robotdatapy (rosbags-dataframe is new to me, though - thanks for sharing!). robotdatapy is especially geared toward providing one-liners for loading common msg types (using rosbags and rosbags-image) and then easily manipulating them - for example, loading all /tf and /tf_static data in Python and handling transforms/interpolation so you can look up the transform between two frames at a specific time offline :slight_smile:
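
For a sense of what such a wrapper sits on top of, iterating /tf and /tf_static messages with rosbags alone looks roughly like this (standard rosbags AnyReader usage; the transform tree, lookups, and interpolation on top are the part a higher-level tool has to add):

```python
from pathlib import Path
from rosbags.highlevel import AnyReader

# Read raw /tf and /tf_static messages from a ROS 1 or ROS 2 bag.
with AnyReader([Path("path/to/bag")]) as reader:
    tf_conns = [c for c in reader.connections if c.topic in ("/tf", "/tf_static")]
    for conn, timestamp, rawdata in reader.messages(connections=tf_conns):
        msg = reader.deserialize(rawdata, conn.msgtype)
        for tf in msg.transforms:
            # Each TransformStamped carries parent/child frame ids plus a
            # translation and quaternion; building a time-indexed transform
            # tree and interpolating between stamps is left to the caller.
            print(tf.header.frame_id, tf.child_frame_id)
```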

This is cool work! How does it handle bags with large amounts of data (e.g. longish recordings with a lot of images)? One problem I’ve run into while doing similar work is that it takes forever to load the bags into memory.

@sgillen thank you! Unfortunately we don’t have the best solution for now :confused: - rosbags takes a long time to cycle through all the messages, and I do wonder if there’s something that could be done to speed that up.

A couple of minor built-in things make small improvements to load time: (1) messages are not converted to images until a specific image is requested - often I don’t use every single image, so this avoids unnecessarily converting images that aren’t used - and (2) if you know a specific time range, messages outside that range are not loaded. Your question actually made me look into this a bit more, and I think I should be using rosbags AnyReader’s start and stop params to maybe get even better performance - I’m doing this filtering within robotdatapy right now, but I wonder if rosbags could use it to filter/speed things up further upstream.
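
To illustrate the two ideas above, rosbags’ AnyReader.messages accepts start and stop timestamps (in nanoseconds), so messages outside the window are skipped by the reader, and the conversion to an OpenCV image can be deferred until a frame is actually requested. A minimal sketch, independent of robotdatapy’s internals (topic name and time window are placeholders):

```python
from pathlib import Path
from rosbags.highlevel import AnyReader
from rosbags.image import message_to_cvimage  # from the rosbags-image package

t0_ns = 1_700_000_000 * 10**9   # example window start (nanoseconds)
t1_ns = t0_ns + 60 * 10**9      # one-minute window

msgs = []  # deserialized Image messages, not yet converted to arrays
with AnyReader([Path("path/to/bag")]) as reader:
    img_conns = [c for c in reader.connections if c.topic == "/camera/image_raw"]
    # start/stop let the reader skip messages outside the time window.
    for conn, timestamp, rawdata in reader.messages(
        connections=img_conns, start=t0_ns, stop=t1_ns
    ):
        msgs.append((timestamp, reader.deserialize(rawdata, conn.msgtype)))

# Defer the (comparatively expensive) conversion to an OpenCV image
# until a specific frame is actually needed.
def image_at(i):
    timestamp, msg = msgs[i]
    return message_to_cvimage(msg, "bgr8")
```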