Simulating a ROS2 Biomimetic AUV: Gazebo, Dave, or Stonefish?

Hello there,

I'm working on a biomimetic AUV, and I'm looking into simulators. Is there a go-to ROS 2 simulator at the moment? I see Dave referenced, but it seems it hasn't fully moved to ROS 2? I'm debating between just faking interactions in Gazebo or going full simulation with Stonefish.

My goal is to build the higher-level stacks: path following, computer-vision-related tasks, etc.

Simulating sensors like depth, IMU, camera, and DVL would be great. But the biggest constraint for my use case is that my AUV produces thrust by flapping its wings and can produce asymmetrical thrust. It also generates a significant amount of lift that I could turn into forward movement with buoyancy differentials. It turns using “banked turns” like a plane does, controlling roll with the wing tips and pitch with the tail, which requires some level of hydrodynamics to calibrate correctly, as it loses lift the more it rolls, etc.
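To first order, the lift-loss-in-roll effect is just the lift vector tilting with the bank angle; only the cosine component keeps supporting the vehicle, while the sine component drives the turn. A minimal sketch (the `banked_lift` helper and the 10 N figure are made up for illustration):

```python
import math

def banked_lift(lift_n: float, roll_deg: float):
    """Split total wing lift into vertical (supporting) and horizontal
    (turn-inducing) components at a given bank angle.

    lift_n: total lift force in newtons (hypothetical value)
    roll_deg: bank/roll angle in degrees
    """
    phi = math.radians(roll_deg)
    vertical = lift_n * math.cos(phi)    # still counteracts weight/buoyancy
    horizontal = lift_n * math.sin(phi)  # centripetal force driving the turn
    return vertical, horizontal
```

At a 30° bank, only about 87% of the lift remains vertical, which is the loss the controller has to compensate for.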

I guess I could fake the thrust generated by the wings via invisible thrusters on each side, then map certain wing movements to different thrust forces? But what about roll and pitch via control surfaces? Is that something Gazebo would be able to simulate?
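For what it's worth, the invisible-thruster idea maps fairly directly onto gz-sim's Thruster system: one hidden propeller joint per side, each driven over its own thrust topic by whatever node translates wing gait into force. A rough sketch (the joint name, namespace, and all numbers below are placeholders, not values from any real model):

```xml
<!-- One hidden thruster per wing side; command each one with a
     gz.msgs.Double on /model/manta/joint/<joint_name>/cmd_thrust. -->
<plugin filename="gz-sim-thruster-system"
        name="gz::sim::systems::Thruster">
  <namespace>manta</namespace>                    <!-- placeholder -->
  <joint_name>left_thruster_joint</joint_name>    <!-- placeholder -->
  <use_angvel_cmd>false</use_angvel_cmd>          <!-- command force, not rpm -->
  <fluid_density>1025</fluid_density>             <!-- seawater, kg/m^3 -->
  <thrust_coefficient>0.005</thrust_coefficient>  <!-- made-up value -->
  <propeller_diameter>0.1</propeller_diameter>    <!-- made-up value -->
</plugin>
```

A second instance with a right-side joint gives you the asymmetric thrust for yaw; the gait node just publishes different thrust values per side.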

I was thinking of going with Stonefish. But that might be overkill, and I'm not sure even Stonefish could simulate the deforming meshes of the wings' skin as the wings move. For the URDF I had to go with solid links cut along the axes of the TFs, but the real wing is a mix of solid and compliant materials. And if a sim really tried to simulate hydrodynamics from the surfaces, it would go crazy with these solid chunks and their gaps, which are not how the actual robot behaves. So I figured that as long as I can fake the hydrodynamics, buoyancy, thrust, etc., that should let me work on path following and so on.

The URDF is done, with real inertia tags, collisions, etc. I was now moving on to implementing it in Gazebo, but if I have to learn things from zero, should I try to fake it in Gazebo first, or would my time be better spent starting from the get-go in something like Stonefish?

Thanks!!



Hi,

I am not sure Stonefish handles lift/drag surfaces, which are the core of your biomimetic system. The hydrodynamics in Stonefish use the actual collision shape to compute drag and friction, but no lift is considered.
Gazebo has a plugin for that; you have to attach it to each of the surfaces you want modeled. The lift and drag coefficients have to be given explicitly, as they are not computed from the actual shape (as far as I know).
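For context, the plugin presumably being referred to here is gz-sim's LiftDrag system, one instance per surface. A sketch of what attaching it to one wing might look like (the link name, coefficients, and geometry are placeholder numbers, not calibrated values):

```xml
<plugin filename="gz-sim-lift-drag-system"
        name="gz::sim::systems::LiftDrag">
  <link_name>left_wing</link_name>   <!-- placeholder -->
  <air_density>1025</air_density>    <!-- use water density despite the tag name -->
  <cla>4.0</cla>                     <!-- lift-curve slope, made up -->
  <cda>0.5</cda>                     <!-- drag slope, made up -->
  <alpha_stall>0.3</alpha_stall>     <!-- stall angle, rad -->
  <cla_stall>-1.0</cla_stall>        <!-- post-stall lift slope -->
  <cp>0.1 0.5 0</cp>                 <!-- center of pressure in link frame -->
  <area>0.05</area>                  <!-- surface area, m^2 -->
  <forward>1 0 0</forward>
  <upward>0 0 1</upward>
</plugin>
```

For the wing-tip and tail control surfaces, the same plugin also accepts `<control_joint_name>` and `<control_joint_rad_to_cl>`, which shift the lift coefficient with joint deflection.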


@_bernardo Largely agree with the other answer.

I recommend starting in Gazebo. Get your URDF spawning with the sensors and hydrodynamic fidelity you need so you can focus on the path-following and computer-vision tasks you mentioned.

If you are really interested in the simulation space, Simulation - Maritime Robotics Landscape has more information on the current state of simulation and links to a lot of the simulators available. More work is being done on the DAVE ROS 2 port, which you can read more about, or reach out to the team directly, here: https://dave-ros2.notion.site/.


Call me biased but:

I think you could do a very rough model of lift generation using the available Hydrodynamics and LiftDrag plugins. Gazebo can absolutely simulate pitched ailerons. The real challenge would be calibrating your sim against real-world observations. Worst case, you may have to write your own custom plugin, but a flapping robot sounds super cool and fully doable using the lift-drag plugin. In terms of sensors, Gazebo has the necessary sensors for everything except good cameras.

One of the hacks I've used in the past is to get all the data from Gazebo, but leave high-quality rendering as a separate task for a better external renderer (for my demo it was a Gaussian splat, but the same technique applies elsewhere). There is a proposal for such an external renderer currently being crafted by the core team.


Thank you for replying @OlivierKermorgant. I emailed the Stonefish folks asking about this; they said it might be doable with some tweaks on their side, which apparently are already in the works. But of course that might take time. Given the replies in this thread, I'll be putting the time into Gazebo. Thanks for the input!

Thank you @ivandor. Seems like the consensus is pretty strongly towards getting to work in Gazebo. Thanks for the links, I'll be checking them out. I'm very interested in the space. Thanks!

Thanks for your reply @arjo129 , I’m pretty convinced now to focus on Gazebo.
When you say that a flapping robot should be doable, do you mean simulating the thrust generated by the wing motions should be feasible, or just the other lift/hydrodynamics like the ailerons?
Thanks for the ideas on the camera pipeline!

If you use the lift-drag plugin together with the joint controller, it will automatically generate the lift forces for your robot once you start flapping its wings. This would be a poor man's approximation, but I suspect it could be a starter; as things evolve, you'll probably have to add more on top.
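The joint-controller-plus-lift-drag pairing just needs a gait generator feeding setpoints to each wing joint. A minimal, ROS-free sketch of the kind of setpoint stream you would publish (the function name, default amplitudes, and the turn-bias term are all made up):

```python
import math

def wing_setpoints(t: float, freq_hz: float = 1.0,
                   amp_rad: float = 0.6, turn_bias: float = 0.0):
    """Flapping-angle setpoints for the left/right wing joints.

    turn_bias in [-1, 1] scales the two amplitudes asymmetrically;
    the lift-drag plugin then turns that into asymmetric thrust.
    """
    base = math.sin(2.0 * math.pi * freq_hz * t)
    left = amp_rad * (1.0 - 0.5 * turn_bias) * base
    right = amp_rad * (1.0 + 0.5 * turn_bias) * base
    return left, right
```

You would then publish `left`/`right` to each joint's position-controller topic at a fixed rate (say 50 Hz) and tune amplitude, frequency, and bias against real-water tests.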


Thanks, I'll keep that in mind as I move forward. As long as it gives a rough estimate, I think it will let me work on things. I'll do periodic real-water tests for calibration anyway. Thanks for the info!

Hi @_bernardo,

From one ocean lover to another, your project is incredible!

My answer is a bit off topic for your current objective of a simulation for perception tasks, but maybe it's interesting for future development.

A friend of mine is currently working for his PhD on a manta ray inspired robot, similar to yours, but smaller.

During his master's thesis he worked on the electronics, a first control strategy, and ROSifying the system on Jazzy (manuscript here and GitHub code).

Right now he is interested in obtaining a natural bioinspired control via reinforcement learning.

He has struggled a lot to find a simulator able to properly capture the behaviour of a deformable link as an actuation system.

He told me about FishGym (paper here).

The first version is archived, but he told me the FishGym team is working on a new version.

This simulator mainly targets the control aspect of biomimetic underwater robots, not perception (and it is definitely not a ROS-related project). It could be interesting for future improvements of the control stack of your awesome manta.

Maybe their approach to simulate hydrodynamics can inspire you in the development of a soft-actuation underwater plug-in for Gazebo or extending the Stonefish library.

(I’ll try to bring him to the chat, maybe he can provide better comments)

From my side, I only worked with thrusters-based vehicles in simulation.

Currently I’m exploring Stonefish for (Opti-Acoustic) SLAM, so I’m also at the perception level ;).

Between Gazebo and Stonefish, I prefer the latter for its better rendering (awesome for optical cameras, a bit less so for acoustic ones) and its underwater-tailored sensors, physics, and rendering.

You have probably already read the Stonefish documentation. I love it because you can simulate different levels of turbidity, the colour attenuation/absorption effect, straightforward current and wave integration in the simulation environment, and the “marine snow”.

Also, given the model description, it takes care of the related underwater physics (centre of buoyancy, hydrodynamic coefficients, etc.).

But since you have already discussed with the maintainer, you know what is and isn't feasible for your specific problem.

From my experience, I tried to set up an underwater ROV in Gazebo one year ago.

It was my first attempt at underwater robot simulation, and I struggled a lot to fine-tune the hydrodynamics and the inertia of the system (the values computed from CAD software made the system behave very badly, and I had to simplify the physical mesh a lot).

I know the Gazebo community is constantly improving it, so things may have changed since last year.

If the others (with much more experience than me) suggest going for Gazebo, give it a try!

For perception-related tasks, HoloOcean and OceanSim are also very good, but much more demanding in GPU requirements (I have never used them because of that).

I can’t wait to see a first demo video of your robot in simulation !


Hey @AlePuglisi!,

Thanks a lot for the encouragement! I truly appreciate the support.

I took a look at your friend's work; interesting stuff! Is he still working on the project? If he wanted to share ideas, I'd be interested.

I had no idea about FishGym; I'll have to look further into it, especially if they are still working on improvements. Thanks for the tip.

I'm very new to underwater simulation, or to the simulation world at all, but I was also attracted to Stonefish because it seemed like the most complete tool. So for a given number of man-hours spent learning, I figured I'd get more out of getting things running in Stonefish. But from what I see, there would be value in having some sort of simulation running in Gazebo first, even with faked hydrodynamics, thrust, or other properties, so I can work on higher-level things and start sharing the whole simulation / ROS workspace for others to play with. My idea has always been to make the software open source.

So I'm leaning towards getting the basics running in Gazebo, learning and understanding how simulations work, and getting the basic control loop working properly, while progressively, even in parallel, working on Stonefish.

I'm honestly just winging it with ROS. I don't know a lot of best practices, or how to properly structure things, divide functions, or make proper use of services, lifecycles, etc. So I'm a bit reluctant to share the whole thing too early; I want others to actually get some value out of the software, not just struggle to make sense of it. But I'm getting there…

Thanks for taking the time to reply! My main motivation for all this is to build something that actually gets into the hands of those who care for the oceans, so if that's something you'd want to talk about, I'd be happy to connect.