
Juraj Obradovic edited this page May 25, 2023 · 2 revisions

Introduction

Robot system simulation has been recognized as one of the most important research tools in robotics development because it provides a complete environment in which to model, program, and simulate robots. The rapid advance of commercially available computing hardware has given modern simulator platforms a wide variety of opportunities:

  • accelerate the engineering design cycle, and reduce its costs
  • provide an accelerated, safe, and fully controlled virtual testing and verification environment
  • generate, expeditiously and at low cost, large amounts of training data for machine learning
  • facilitate the development of more intelligent robots
  • facilitate the understanding of human–robot interactions

Many different simulation platforms, both commercial and open source, are available for robot system simulation, depending on the research domain and specific use case. One of the most widely used simulators for general-purpose robotics is the open-source Gazebo simulator, developed alongside ROS. While it is great for most use cases, we found that for underwater and on-water robotic simulation it lacks support for realistic visual data. Many simulators offer a sandbox approach that provides a safe testing environment, but they lack tools for collecting data to train and validate machine learning algorithms. Our motivation is a simulator that can generate realistic environments, allowing closer-to-reality validation and verification of applications developed for maritime vehicles. Additionally, we want a simulator that can use XR (AR/VR) headsets to enable virtual remote vessel operation or to simulate a diving experience with human-robot interaction.

With that in mind, we decided to use Unity. Although it is not a robotics simulation tool, its game-engine capabilities offer high-end graphics, GPU utilization for parallelizable computations, simple test scene design, implementation, and configuration, and many built-in tools for physics simulation. Unity's popularity is backed by a broad and versatile community that provides many examples and solutions for common problems in game design. Among its many capabilities, a few caught our interest: ocean simulation with dynamic waves and object interaction, realistic lighting simulation, and ease of generating surrounding scenery. We developed our solution to connect ROS and the Unity simulator, communicating bidirectionally between them using the gRPC networking framework. With it we extended Unity's physics engine with water-body physics (buoyancy, currents, waves, etc.), robotic sensors, thrusters, motors, and data recording and visualization tools. The simulator in its current state is already used for most of our projects. More advanced features and sensors are in active development, and we plan to extend its domain of operation to other classes of robots, whether land-based or aerial.

Architecture overview

Communication between the robot simulator and the robot control algorithms is implemented as a separate project. Since the robotics community is currently transitioning from ROS to ROS 2, a ROS-agnostic communication solution is required so it can simply be plugged in as a separate node inside a distributed robotic system. On the other side, depending on the use case, one can choose among many different simulator solutions, either a full-blown simulator or a simulated part of a larger distributed system, e.g. a complex physical model implemented in Matlab that is part of a larger simulated world implemented in Unity. For exactly those reasons, gRPC is used as the communication framework. It offers simple service and message definition using Protocol Buffers and works across many different platforms implemented in different programming languages. Once the communication model (services, remote procedures, messages, etc.) is defined, it can be compiled to the programming language of choice, and only the service callbacks remain to be implemented on the server side. Once the communication model is established, ROS does not care whether the other side is Unity or some other simulator. The same is true conversely: Unity does not care whether ROS, ROS 2, or some other robotic backend is used. The ROS backend communicates with the MARUS client via gRPC, as shown in the next image. On the ROS side, one instantiates the gRPC adapter server, a ROS node which acts as a proxy. The ROS adapter node can receive data from MARUS and distribute it to the corresponding ROS topics. MARUS can also subscribe to a ROS topic and receive data directly from ROS. Unity - ROS communication
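The adapter node's routing role can be sketched as follows. This is a simplified, dependency-free illustration of the proxy pattern only; the class and method names are hypothetical and not the actual MARUS or gRPC API (a real deployment would implement this logic inside generated gRPC servicer callbacks and ROS publishers/subscribers):

```python
from collections import defaultdict
from typing import Callable, Dict, List

class AdapterProxy:
    """Minimal sketch of the adapter node's routing logic: messages
    arriving from the simulator are fanned out to every subscriber of
    the corresponding topic, and the same mechanism works in reverse."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        # A ROS node (or the MARUS client) registers interest in a topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: dict) -> None:
        # Data received over the wire is distributed to every subscriber.
        for callback in self._subscribers[topic]:
            callback(message)

# Usage: the simulator publishes an IMU reading; the ROS side receives it.
received = []
proxy = AdapterProxy()
proxy.subscribe("/imu", received.append)
proxy.publish("/imu", {"angular_velocity": [0.0, 0.0, 0.1]})
```

Because the proxy only routes opaque messages by topic name, neither side needs to know what backend sits on the other end, which is the ROS-agnostic property described above.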

gRPC offers data streaming capabilities for server requests and/or server responses. One of the most common communication patterns between ROS and a simulator is sensor data streaming. Once the gRPC node is running on the ROS side, Unity opens a session for sensor streaming and can stream sensor readings over the gRPC communication framework. Communication from ROS to MARUS is also possible, but it is initiated by MARUS: the simulator client sends a request stating that it wants to receive a data stream from a named ROS topic. ROS therefore cannot initiate the connection itself, but once established, communication is bidirectional.
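The server-streaming pattern described above can be sketched in plain Python. In real gRPC, a server-streaming RPC is implemented as a servicer method that yields one response message per item; here the topic data is a hypothetical in-memory stand-in rather than a live ROS topic:

```python
from typing import Dict, Iterator, List

# Hypothetical stand-in for data flowing through a ROS topic.
TOPIC_DATA: Dict[str, List[float]] = {"/depth": [2.0, 2.1, 2.3]}

def request_stream(topic: str) -> Iterator[dict]:
    """Sketch of a server-streaming RPC body: the MARUS client sends a
    request naming a ROS topic, and the ROS-side service yields one
    message per reading, which gRPC would deliver as a response stream."""
    for seq, value in enumerate(TOPIC_DATA.get(topic, [])):
        yield {"topic": topic, "seq": seq, "value": value}

# Client side: iterate over the stream until the server closes it.
messages = list(request_stream("/depth"))
```

Note that the client drives the exchange by sending the initial request, mirroring how MARUS, not ROS, opens the session.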

Simulator capabilities

Unity offers a simple and intuitive UI tool for virtual world generation, which makes it a perfect choice for creating robotic test scenes. Since it is a widely popular game engine, it comes with support for many platforms and with rendering, lighting, and physics in a 3D virtual environment, together with the Asset Store, where a large community shares code, models, and solutions. Specifically for marine robot simulation, a variety of sensors and sensor types are implemented and actively developed. The most commonly used ones, implemented and tested, are listed here:

  • IMU - measures linear acceleration, as well as angular velocity and orientation

  • GNSS - gives geolocation of the object

  • Camera - images and video footage

  • Depth - measures depth of the object underwater

  • Range - measures range to the object in front of it

  • 3D Lidar - gives 3D point cloud around it

  • Sonar - gives range and bearing to the object it follows

  • Sonar3D - using ray tracing, gives 3D point cloud

  • Sonar2D - using ray tracing, gives 2D point cloud

  • Acoustic modem - transmits and receives acoustic messages

  • AIS - ship tracking

For robot actuation, there are multiple controllers implemented, and more being developed:

  • Keyboard controller - generates speed using keyboard keys

  • Thruster - generates force from a given datasheet
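A common way to implement a datasheet-driven thruster is to linearly interpolate between the datasheet's sample points, mapping a normalized input command to output force. The sketch below illustrates that idea; the table values are made up for the example and are not from a real thruster datasheet:

```python
import bisect

# Illustrative datasheet: normalized command -> thrust in newtons.
COMMANDS = [-1.0, -0.5, 0.0, 0.5, 1.0]
THRUST_N = [-20.0, -8.0, 0.0, 10.0, 25.0]

def thrust_from_command(cmd: float) -> float:
    """Linearly interpolate the datasheet to get thrust for a command,
    clamping commands outside the tabulated range."""
    cmd = max(COMMANDS[0], min(COMMANDS[-1], cmd))
    i = bisect.bisect_right(COMMANDS, cmd)
    if i >= len(COMMANDS):
        return THRUST_N[-1]   # at or beyond the last sample
    if i == 0:
        return THRUST_N[0]    # at or below the first sample
    x0, x1 = COMMANDS[i - 1], COMMANDS[i]
    y0, y1 = THRUST_N[i - 1], THRUST_N[i]
    return y0 + (y1 - y0) * (cmd - x0) / (x1 - x0)
```

Real thruster curves are typically asymmetric between forward and reverse thrust, which is why the table is interpolated rather than modeled as a single gain.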

For 3D scene setup and visualization, a community-developed water shader was improved. It can generate water waves from which static pressure can be calculated at any point underwater. Based on that, a buoyancy force is calculated and applied to every object submerged in the water. Sea currents and other noises and disturbances can also be applied to objects in the scene. For more advanced ocean system features we purchased the "Crest Ocean System" asset from the community store. It offers simulated light transport including reflection, refraction, scattering, caustics approximation, and shadows; highly configurable wave generation (Gerstner or FFT); dynamic wave-object interaction; ocean currents; configurable underwater rendering; and many more features.

For visualization, we developed support for displaying objects common in robotics: coordinate systems, paths, point clouds, etc., similar to the ROS RViz 3D visualizer. One can also sync simulator coordinate systems to the ROS tf coordinate systems so that they are displayed in MARUS.

For many real-world applications, the simulator must be able to run faster or slower than real time. MARUS can set and control the simulation speed while the simulation is running. Faster-than-real-time simulation is used to run many test cases as fast as possible, but one has to be careful because faster simulations produce fewer frames per second and are therefore less accurate. Slower-than-real-time simulation is used when the computation time of a simulation subsystem is significant and the simulator cannot produce enough frames per second.

Future work will focus on extending the sensors, actuators, and simulation control tools toward more realistic and visually attractive simulation. We would also like to extend our toolbox to a wider domain of robotics, e.g. drones, wheeled robots, or legged robots, for simultaneous simulation of multi-physics robotic systems.
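The pressure-to-buoyancy chain described earlier follows basic hydrostatics: gauge pressure at a point grows linearly with its depth below the local (possibly wave-displaced) water surface, and by Archimedes' principle the net upward force on a submerged volume is ρ·g·V. A minimal sketch, assuming the wave model supplies the local surface height (the actual MARUS implementation samples the water shader instead):

```python
RHO_WATER = 1025.0  # seawater density, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def hydrostatic_pressure(point_y: float, surface_y: float) -> float:
    """Gauge pressure (Pa) at a point, given the local water surface
    height from the wave model; zero for points above the surface."""
    depth = surface_y - point_y
    return RHO_WATER * G * depth if depth > 0.0 else 0.0

def buoyancy_force(submerged_volume_m3: float) -> float:
    """Upward buoyancy force (N) on a submerged volume (Archimedes)."""
    return RHO_WATER * G * submerged_volume_m3
```

In practice the submerged volume is estimated per object (often per sample point on the hull) so that partially submerged bodies and wave-induced surface variation are handled correctly.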
