Repository for our T-Mech paper "Where Shall I Touch? Vision-Guided Tactile Poking for Transparent Object Grasping".
This code is tested with Ubuntu 20.04, Python 3.7, PyTorch 1.10.1, and CUDA 11.4.
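A quick way to confirm your environment matches these versions is a check like the following (the expected values in the comments mirror the list above):

```python
import sys

import torch

print(sys.version)                # expect Python 3.7.x
print(torch.__version__)          # expect 1.10.1
print(torch.cuda.is_available())  # True if a compatible CUDA driver is found
```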
Driver for Intel RealSense.
Driver for UR5 robotic arm.
Driver for Robotiq Gripper.
A GelSight sensor, or another optical tactile sensor such as the GelTip or GelSight Wedge, is required.
(1) Use the Tsai hand-eye calibration method to calibrate the transformation between the RealSense camera and the UR5 robotic arm (a code sketch follows this list).
(2) Alternatively, you can place an ArUco marker on the table, first calculate the transformation between the RealSense camera and the marker, and then obtain the transformation between the marker and the robot base.
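For reference, here is a minimal sketch of option (1) built on OpenCV's Tsai solver. The function name, frame conventions, and the assumption that calibration poses have already been recorded are ours, not the repository's actual calibration script:

```python
import cv2
import numpy as np


def tsai_hand_eye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam):
    """Return the 4x4 camera-to-gripper transform from paired pose lists.

    Each argument is a list with one entry per calibration pose:
    3x3 rotation matrices or 3x1 translation vectors. For a fixed
    (eye-to-hand) camera, pass base-to-gripper poses instead, in which
    case the result is the camera-to-base transform.
    """
    R, t = cv2.calibrateHandEye(
        R_gripper2base, t_gripper2base,
        R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI,  # Tsai's method, as in option (1)
    )
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t.ravel()
    return T
```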
python3 contact_trigger.py
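contact_trigger.py itself is provided in this repository; purely as an illustration of the idea, the sketch below shows one common way a contact trigger for an optical tactile sensor can be built, by differencing live frames against a no-contact reference. The camera index and threshold are placeholder assumptions:

```python
import cv2

CONTACT_THRESHOLD = 12.0  # mean absolute pixel difference; tune per sensor


def detect_contact(reference, frame, threshold=CONTACT_THRESHOLD):
    """Return True when the tactile frame deviates enough from the reference."""
    diff = cv2.absdiff(
        cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY),
        cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY),
    )
    return float(diff.mean()) > threshold


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # many GelSight-style sensors appear as webcams
    ok, reference = cap.read()  # no-contact reference frame
    while ok:
        ok, frame = cap.read()
        if ok and detect_contact(reference, frame):
            print("contact detected")  # here the poking motion would be stopped
            break
    cap.release()
```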
python3 grasp_node_network.py
The pretrained model can be found here.
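Once downloaded, the checkpoint can be loaded with the usual PyTorch pattern; the file name and model class in the usage comment below are placeholders for whatever this repository defines:

```python
import torch


def load_pretrained(model, checkpoint_path, device="cpu"):
    """Load a saved state dict into the network and switch it to eval mode."""
    state = torch.load(checkpoint_path, map_location=device)
    model.load_state_dict(state)
    model.eval()
    return model

# Usage (names are placeholders):
#   net = load_pretrained(GraspNetwork(), "pretrained_model.pth")
```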
For more details, please visit our website or email me ([email protected]).
If you use our code or this repository, please cite our paper:
@article{jiang2022shall,
  title={Where Shall I Touch? Vision-Guided Tactile Poking for Transparent Object Grasping},
  author={Jiang, Jiaqi and Cao, Guanqun and Butterworth, Aaron and Do, Thanh-Toan and Luo, Shan},
  journal={IEEE/ASME Transactions on Mechatronics},
  year={2022},
  publisher={IEEE}
}