Hello, a few days ago I started working on a project in which a drone autonomously avoids obstacles in simulation and travels from point A to point B. I have now reached the stage where I want to project points onto objects, to check that everything is in order before creating bounding boxes and, later, a dataset from them.
For now, I want to project a point onto the middle of the "Construction Cone" object, with the drone in its initial position when I open the simulation. The problem I ran into is that, following the logic of my transformation chain, the point appears to be projected incorrectly.
This is my first project using Gazebo and ROS (I'm still in college), and since I haven't found much guidance elsewhere, I thought I would ask here what the solution might be. Besides defining the rotation needed to align the camera with the world's Z axis, extracting the camera's intrinsic and extrinsic parameters, and reading the object's position and orientation, I don't know what else I could be missing to make the projection correct. Thank you for any suggestions.
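In case it helps to compare, here is a minimal sketch of the transformation chain described above (world point → camera frame → optical frame → pixel). It assumes a pinhole model with intrinsics `K` from the `CameraInfo` topic and the camera's world pose `(R_world_cam, t_world_cam)` from Gazebo; the function and variable names are mine, not from the linked repository. A common pitfall is skipping the rotation between the ROS camera *body* convention (x forward, z up) and the *optical* convention (z forward, y down), which is included explicitly here:

```python
import numpy as np

def project_world_point(p_world, R_world_cam, t_world_cam, K):
    """Project a 3D world point into pixel coordinates.

    p_world: 3-vector, point in the world frame (e.g. the cone's center).
    R_world_cam, t_world_cam: pose of the camera *body* frame in the world
        frame (e.g. from Gazebo's model/link state).
    K: 3x3 intrinsic matrix (from the camera's CameraInfo message).
    Returns (u, v) pixel coordinates, or None if the point is behind the camera.
    """
    # World -> camera body frame (invert the camera's world pose).
    p_cam_body = R_world_cam.T @ (p_world - t_world_cam)

    # ROS camera body frame (x forward, y left, z up) ->
    # optical frame (z forward, x right, y down); see REP 103.
    R_body_optical = np.array([[0.0, -1.0,  0.0],
                               [0.0,  0.0, -1.0],
                               [1.0,  0.0,  0.0]])
    p_optical = R_body_optical @ p_cam_body

    if p_optical[2] <= 0:
        return None  # point is behind the image plane

    # Pinhole projection: homogeneous pixel coords, then normalize by depth.
    uvw = K @ p_optical
    return uvw[:2] / uvw[2]
```

With the camera at the world origin and identity orientation, a point 2 m straight ahead should land on the principal point, which is a quick sanity check that the chain is wired up correctly before testing against the cone's actual pose.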
Here's my code so far:
https://github.com/4kaws/object_detection_drone/blob/master/README.md?plain=1
And here's an image with the projected point (the cone on which the point must be projected is the first on the right):