
Evaluating Generated grasps #4

Open
kavidey opened this issue Jul 7, 2022 · 2 comments

Comments

kavidey (Collaborator) commented Jul 7, 2022

Even without running the code on the real robot, testing & tuning on example grasps would be extremely useful.

@mayacakmak mentioned that we want to focus specifically on bagged and deformable objects.

kavidey (Collaborator, Author) commented Aug 1, 2022

I got data from one of the depth cameras on Tahoma and was able to test both the fully manual and the more automated grasping algorithms. They needed a bit of tuning, but most of it was resizing and downsampling the point cloud into the formats each algorithm expects; the algorithms themselves worked pretty well without many changes.
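For illustration, a minimal sketch of that kind of preprocessing, assuming Open3D is used for the point-cloud handling (the function name, voxel size, and workspace bounds below are placeholders, not the actual pipeline):

```python
import numpy as np
import open3d as o3d

def preprocess_cloud(points_xyz, voxel_size=0.005, workspace_min=None, workspace_max=None):
    """Downsample a raw depth-camera point cloud and optionally crop it to a workspace box."""
    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(np.asarray(points_xyz, dtype=np.float64))

    # Crop to the region of interest so the grasp code only sees the table/object area.
    if workspace_min is not None and workspace_max is not None:
        box = o3d.geometry.AxisAlignedBoundingBox(workspace_min, workspace_max)
        cloud = cloud.crop(box)

    # Voxel downsampling keeps roughly one point per voxel, reducing the cloud
    # to a density the grasp algorithms can handle quickly.
    cloud = cloud.voxel_down_sample(voxel_size=voxel_size)

    # Drop isolated depth noise.
    cloud, _ = cloud.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    return np.asarray(cloud.points)
```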

Blue is the fully manual algorithm (grey dots are where the user clicked)
[Screenshots: CleanShot 2022-08-01 at 07 58 27@2x, CleanShot 2022-08-01 at 07 59 07@2x]

Red is the automatic grasping algorithm (the user clicks on the center of the object and specifies a direction)
[Screenshots: CleanShot 2022-08-01 at 08 25 08@2x, CleanShot 2022-08-01 at 08 25 29@2x]
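Not the actual interface code, but a sketch of how a clicked pixel plus a depth reading could be turned into a 3D grasp center for the automatic algorithm, using the standard pinhole deprojection (function name, intrinsics, and direction are all illustrative assumptions):

```python
import numpy as np

def click_to_grasp_center(u, v, depth_m, fx, fy, cx, cy):
    """Deproject a clicked pixel (u, v) and its depth into a 3D point in the camera frame."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example: a click at pixel (320, 240) on a point 0.6 m away,
# with made-up intrinsics for a 640x480 depth camera.
center = click_to_grasp_center(320, 240, 0.6, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
approach_direction = np.array([0.0, 0.0, 1.0])  # user-specified direction, camera frame
print(center, approach_direction)
```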

markusgrotz commented

@kavidey thank you. Can we test that on the robot?
