
Soft-Robot Control Environment (gym-softrobot)


The environment is designed to leverage a wide range of reinforcement learning methods for soft-robotics control. Our inspiration comes from slender-bodied living creatures, such as octopuses and snakes. The code is based on PyElastica, an open-source physics simulation for highly deformable slender structures. We intend this package to be easy to install and fully compatible with OpenAI Gym and other available RL algorithms.

Installation

pip install gym-softrobot

Some recent Python versions might not be compatible with the LLVM version. Please let us know via a GitHub issue if you run into this. At the current stage, we recommend Python 3.10.

To test the installation, we provide a few debugging scripts. The environment can be tested using the following commands.

python -m gym_softrobot.debug.make     # Make environment and run 10 steps
python -m gym_softrobot.debug.registry # Print gym-softrobot environment
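
Once installed, importing gym_softrobot registers the environments with Gymnasium. Below is a minimal interaction sketch; the environment id "OctoFlat-v0" is a placeholder assumption here, so replace it with any id printed by the registry script above.

# Minimal interaction sketch (environment id is a placeholder; list real ids with
# `python -m gym_softrobot.debug.registry`).
import gymnasium as gym
import gym_softrobot  # noqa: F401  (import registers the environments)

env = gym.make("OctoFlat-v0")
observation, info = env.reset(seed=0)
for _ in range(10):
    action = env.action_space.sample()  # random policy, for illustration only
    observation, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        observation, info = env.reset()
env.close()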

Requirements:

  • Python 3.10+
  • Gymnasium 1.0.0
  • PyElastica 0.3.2+
  • COOMM
  • Matplotlib (optional for display rendering and plotting)
  • POVray (optional for 3D rendering)

Rendering

We support two different rendering backends: POVray and Matplotlib. The default is POVray, but the configuration can be switched by adding the following lines.

import gym_softrobot
from gym_softrobot.config import RendererType

gym_softrobot.RENDERER_CONFIG = RendererType.MATPLOTLIB  # Default: POVRAY

POVray

To make good-looking 3D videos and figures, we use Vapory, a Python wrapper for POVray. POVray is not required to run the environment, but it is necessary to use the env.render() function as in a typical Gym environment.

If you would like to test POVray with gym-softrobot, use

python -m gym_softrobot.debug.render  # Render 10 frames using vapory

Matplotlib

We provide a secondary rendering tool using Matplotlib for quick debugging and sanity checks.
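
For example, a minimal sketch of switching to the Matplotlib backend before rendering might look like the following. The environment id is a placeholder as above, and this assumes env.render() behaves as described in the POVray section.

import gymnasium as gym
import gym_softrobot
from gym_softrobot.config import RendererType

# Switch the global renderer configuration before rendering (default is POVray).
gym_softrobot.RENDERER_CONFIG = RendererType.MATPLOTLIB

env = gym.make("OctoFlat-v0")  # placeholder id; see the registry debug script
env.reset(seed=0)
env.step(env.action_space.sample())
env.render()  # draws with Matplotlib instead of POVray
env.close()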

Reinforcement Learning Example

We tested the environment using Stable Baselines3 for centralized control. More advanced algorithms are still under development.
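
As a reference point, a training sketch with Stable Baselines3 might look like the following. The environment id is a placeholder assumption and the hyperparameters are illustrative, not tuned values from this repository.

# Hedged sketch: centralized control with PPO from Stable Baselines3.
import gymnasium as gym
import gym_softrobot  # noqa: F401  (import registers the environments)
from stable_baselines3 import PPO

env = gym.make("OctoFlat-v0")  # placeholder id
# "MlpPolicy" assumes a flat (Box) observation space; use "MultiInputPolicy" for Dict observations.
model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=100_000)
model.save("ppo_softrobot")
env.close()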

If you have your own algorithm that you would like to test with our environment, you are welcome to reach out to us.

Citation

Please use the following BibTeX entry to cite this work in your publications:

@misc{gym_softrobot,
  author = {Chia-Hsien Shih and Seung Hyun Kim and Mattia Gazzola},
  title = {Soft Robotics Environment for OpenAI Gym},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/skim0119/gym-softrobot}},
}

Environment Documentation

A description of each environment is available in the documentation.

Contribution

We are currently developing the package internally. We mainly use uv to manage the project setup. If you plan to commit your code, please use the following commands to set up the pre-commit hooks.

uv sync --all-groups
uv run pre-commit install

Author

GitHub Contributors Image
