Turtlebots leading to Collision. #6
Comments
According to your description, it seems that you have launched the training procedure. In the early stage of RL training, the policy tends to act randomly, which may lead to collisions with other robots or obstacles. If you just want to test the final performance, you should load the test models and adjust the parameters or file paths in 'main.py' accordingly. You should also check the terminal output to make sure the models are loaded successfully.
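For reference, here is a minimal sketch of what loading a test model could look like, assuming a PyTorch checkpoint; the path, function name, and log message below are placeholders for illustration, not the repository's actual API:

```python
import os
import torch

# Hypothetical checkpoint path; the real path is configured in main.py
TEST_MODEL_PATH = 'weights/policy_test.pth'

def load_test_model(policy_net, path=TEST_MODEL_PATH, device='cpu'):
    """Load trained policy weights so evaluation does not start from a random policy."""
    if not os.path.exists(path):
        raise FileNotFoundError(f'No checkpoint at {path}; check the file path in main.py')
    state_dict = torch.load(path, map_location=device)
    policy_net.load_state_dict(state_dict)
    policy_net.eval()  # evaluation mode: no training-only behaviour such as exploration noise
    print(f'[INFO] Loaded test model from {path}')  # should show up in the terminal output
    return policy_net
```

If no such load-success message appears in the terminal, the robots are most likely being driven by an untrained (random) policy, which would explain the collisions described in this issue.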
Thanks for the reply.
The parameters in the files are already tuned. Errors like this usually come from incomplete package installations or other environment problems. You can check the terminal output to be sure.
Hello,
When I try to launch (/model_based_version/launch/start.launch), the four agents (TurtleBots) spawn and drive straight toward the obstacles in the world (square_obstacle_more.world), which ends in collisions.
The same happens with start_one.launch (model_based_version) and start.launch (model_free_version).
The only difference I observed between launches is that the TurtleBots don't collide with each other (in most cases), but they do end up colliding with the obstacles in the world.
I assume I made a mistake somewhere while launching or following the procedure.
I hope someone can help me with this.
Thanks! @IamWangYunKai