Towards Markerless Surgical Tool and Hand Pose Estimation: Synthetic Grasp Rendering

This grasp renderer is based on the Obman dataset generation pipeline. The synthetic grasps needed for this renderer can be generated with the Grasp Generator.

Our synthetic dataset is available on the project page.

Table of Contents

  • Setup
  • Demo
  • Render Grasps
  • Citations

Setup

Download and install code

git clone https://github.com/jonashein/grasp_renderer.git
cd grasp_renderer

Download and install prerequisites

Download Blender 2.82a:

wget https://download.blender.org/release/Blender2.82/blender-2.82a-linux64.tar.xz
tar -xvf blender-2.82a-linux64.tar.xz

Install dependencies using pip:

wget https://bootstrap.pypa.io/get-pip.py
blender-2.82a-linux64/2.82/python/bin/python3.7m get-pip.py 
blender-2.82a-linux64/2.82/python/bin/pip install -r requirements.txt
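
To confirm the dependencies landed in Blender's bundled interpreter rather than a system Python, a quick import check can be run with that interpreter. The helper below is a generic sketch; the module names in the example are placeholders, not the actual contents of requirements.txt.

```python
import importlib.util

def missing_modules(module_names):
    """Return the module names the current interpreter cannot import."""
    return [name for name in module_names if importlib.util.find_spec(name) is None]

# Example with placeholder names (json is stdlib, the other does not exist):
print(missing_modules(["json", "no_such_module"]))  # ['no_such_module']
```

Run it with blender-2.82a-linux64/2.82/python/bin/python3.7m, passing the top-level module names from requirements.txt; an empty list means the install succeeded.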

Download SURREAL assets

  • Go to SURREAL dataset request page
  • Create an account, and receive an email with a username and password for data download
  • Download SURREAL data dependencies using the following commands
cd assets/SURREAL
sh download_smpl_data.sh ../ username password
cd ..

Download SMPL model

  • Go to SMPL website
  • Create an account by clicking Sign Up and provide your information
  • Download and unzip SMPL for Python users, then copy the models folder to assets/models. Note that all code and data from this download falls under the SMPL license.

Download body+hand textures and grasp information

  • Request data on the ObMan webpage. You should receive a link that will allow you to download bodywithands.zip.
  • Download the texture zip file
  • Unzip it:
cd assets/textures
mv path/to/downloaded/bodywithands.zip .
unzip bodywithands.zip
cd ../../
  • Your structure should look like this:
grasp_renderer/
  assets/
    models/
      SMPLH_female.pkl
      basicModel_f_lbs_10_207_0_v1.0.2.fbx
      basicModel_m_lbs_10_207_0_v1.0.2.fbx
      ...
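
A quick way to spot missing files before rendering is a small existence check over the expected layout. This is a hedged sketch, not part of the repository: the REQUIRED_ASSETS list below covers only the subset of paths shown in the tree above and should be extended to match your full asset setup.

```python
from pathlib import Path

# Subset of the expected layout shown above; extend as needed.
REQUIRED_ASSETS = [
    "assets/models/SMPLH_female.pkl",
    "assets/models/basicModel_f_lbs_10_207_0_v1.0.2.fbx",
    "assets/models/basicModel_m_lbs_10_207_0_v1.0.2.fbx",
]

def missing_assets(repo_root="."):
    """Return the expected asset paths that do not exist under repo_root."""
    root = Path(repo_root)
    return [p for p in REQUIRED_ASSETS if not (root / p).is_file()]
```

Running missing_assets() from the grasp_renderer checkout and getting an empty list back suggests the SMPL assets are in place.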

Download MANO model

  • Go to MANO website
  • Create an account by clicking Sign Up and provide your information
  • Download Models and Code (the downloaded file should have the format mano_v*_*.zip). Note that all code and data from this download falls under the MANO license.
  • Unzip the file: unzip mano_v*_*.zip
  • Set the environment variable: export MANO_LOCATION=/path/to/mano_v*_*

Modify mano code to be Python3 compatible

  • Remove print 'FINITO' at the end of file webuser/smpl_handpca_wrapper.py (line 144)
-    print 'FINITO'
  • Replace import cPickle as pickle by import pickle
    • at top of webuser/smpl_handpca_wrapper.py (line 23)
    • at top of webuser/serialization.py (line 30)
-    import cPickle as pickle
+    import pickle
  • Fix pickle encoding
    • in webuser/smpl_handpca_wrapper.py (line 74)
-    smpl_data = pickle.load(open(fname_or_dict))
+    smpl_data = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
    • in webuser/serialization.py (line 90)
-    dd = pickle.load(open(fname_or_dict))
+    dd = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
  • Fix model paths in webuser/smpl_handpca_wrapper.py (line 81-84)
-    with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-        hand_l = load(f)
-    with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-        hand_r = load(f)
+    with open('/path/to/mano_v*_*/models/MANO_LEFT.pkl', 'rb') as f:
+        hand_l = load(f, encoding='latin1')
+    with open('/path/to/mano_v*_*/models/MANO_RIGHT.pkl', 'rb') as f:
+        hand_r = load(f, encoding='latin1')

At the time of writing, the current MANO version is 1.2, so use:

-    with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-        hand_l = load(f)
-    with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-        hand_r = load(f)
+    with open('/path/to/mano_v1_2/models/MANO_LEFT.pkl', 'rb') as f:
+        hand_l = load(f, encoding='latin1')
+    with open('/path/to/mano_v1_2/models/MANO_RIGHT.pkl', 'rb') as f:
+        hand_r = load(f, encoding='latin1')
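
The encoding='latin1' argument matters because the SMPL/MANO pickle files were written with Python 2, whose str type holds raw bytes; latin1 maps every byte value 0-255 to a code point, so binary payloads such as NumPy array buffers survive the decode unchanged. A minimal illustration of the pattern (not code from this repository) using protocol 2, the highest protocol Python 2 supports:

```python
import pickle

# Python 2 wrote the model files with protocol 2; loading them in Python 3
# with encoding='latin1' preserves byte-string contents losslessly.
data = pickle.dumps({"betas_shape": (10,), "name": "MANO_RIGHT"}, protocol=2)
restored = pickle.loads(data, encoding="latin1")
print(restored["name"])  # MANO_RIGHT
```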

Demo

COMING SOON.

The 3D drill model can be downloaded here.

Our synthetic dataset is available on the project page.

Render Grasps

To create samples for custom 3D models, generate the required grasps with the Grasp Generator and adjust the paths in the arguments accordingly:

blender-2.82a-linux64/blender -noaudio -t 8 -P grasp_renderer.py -- '{"max_grasps_per_object": 300, "renderings_per_grasp": 50, "split": "train", "grasp_folder": "assets/grasps/", "backgrounds_path": "assets/backgrounds/", "results_root": "datasets/synthetic/"}'
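
Blender forwards everything after the -- separator to the script unmodified via sys.argv, so grasp_renderer.py presumably parses the trailing JSON string itself. A minimal sketch of that pattern, assuming this is how the script consumes its configuration (the function name is hypothetical; the actual script may differ):

```python
import json
import sys

def parse_blender_args(argv):
    """Parse the JSON config string that follows the '--' separator."""
    if "--" not in argv:
        return {}
    return json.loads(argv[argv.index("--") + 1])

# Example mirroring the command above:
argv = ["blender", "-noaudio", "-P", "grasp_renderer.py", "--",
        '{"max_grasps_per_object": 300, "split": "train"}']
config = parse_blender_args(argv)
print(config["split"])  # train
```

Because the whole configuration travels as one shell-quoted JSON string, keep it in single quotes on the command line so the inner double quotes reach the script intact.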

Citations

If you find this code useful for your research, please consider citing:

  • the publication that this code was adapted for
@article{hein2021towards,
  title={Towards markerless surgical tool and hand pose estimation},
  author={Hein, Jonas and Seibold, Matthias and Bogo, Federica and Farshad, Mazda and Pollefeys, Marc and F{\"u}rnstahl, Philipp and Navab, Nassir},
  journal={International Journal of Computer Assisted Radiology and Surgery},
  volume={16},
  number={5},
  pages={799--808},
  year={2021},
  publisher={Springer}
}
  • the publication it builds upon and that this code was originally developed for
@inproceedings{hasson19_obman,
  title     = {Learning joint reconstruction of hands and manipulated objects},
  author    = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year      = {2019}
}
