A2J-Transformer

Introduction

This is the official implementation of the CVPR 2023 paper "A2J-Transformer: Anchor-to-Joint Transformer Network for 3D Interacting Hand Pose Estimation from a Single RGB Image".

Paper: A2J-Transformer: Anchor-to-Joint Transformer Network for 3D Interacting Hand Pose Estimation from a Single RGB Image

About our code

Updates

  • (2023-4-23) Deleted some compiled files in the dab_deformable_detr/ops folder.

Installation and Setup

Requirements

  • Our code is tested on Ubuntu 20.04 with NVIDIA 2080 Ti and NVIDIA 3090 GPUs; both PyTorch 1.7 and PyTorch 1.11 work.

  • Python>=3.7

    We recommend using Anaconda to create a conda environment:

    conda create --name a2j_trans python=3.7

    Then, activate the environment:

    conda activate a2j_trans
  • PyTorch>=1.7.1, torchvision>=0.8.2 (following the official PyTorch installation instructions)

    We recommend installing the following PyTorch and torchvision versions:

    conda install pytorch==1.7.1 torchvision==0.8.2 torchaudio==0.7.2 cudatoolkit=11.0 -c pytorch
  • Other requirements

    conda install tqdm numpy matplotlib scipy
    pip install opencv-python pycocotools
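
As a quick sanity check after installation, you can confirm the package versions and GPU visibility. The script below is a minimal, illustrative sketch and is not part of the repository:

    # env_check.py -- illustrative environment check (not shipped with the repo)
    import torch
    import torchvision
    import cv2
    import pycocotools  # verifies the pip install succeeded

    print("PyTorch:", torch.__version__)              # expect >= 1.7.1
    print("torchvision:", torchvision.__version__)    # expect >= 0.8.2
    print("OpenCV:", cv2.__version__)
    print("CUDA available:", torch.cuda.is_available())
    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))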

Compiling CUDA operators (following Deformable-DETR)

cd ./dab_deformable_detr/ops
sh make.sh
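
To verify that the build succeeded, you can try importing the compiled extension from Python. The module name below (MultiScaleDeformableAttention) is the one used by upstream Deformable-DETR and is assumed to apply here as well:

    # illustrative check; the extension name is assumed to follow Deformable-DETR
    try:
        import MultiScaleDeformableAttention as MSDA
        print("Deformable-attention CUDA extension loaded:", MSDA.__name__)
    except ImportError as err:
        print("Extension missing -- re-run `sh make.sh` in dab_deformable_detr/ops:", err)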

Usage

Dataset preparation

  • Please download the InterHand 2.6M dataset and organize it as follows:

    your_dataset_path/
    └── Interhand2.6M_5fps/
        ├── annotations/
        └── images/
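
    A short layout check such as the following (illustrative only; your_dataset_path is a placeholder for your actual location) can confirm the folders are in place before running the code:

        # illustrative dataset-layout check; your_dataset_path is a placeholder
        import os

        root = "your_dataset_path/Interhand2.6M_5fps"
        for sub in ("annotations", "images"):
            path = os.path.join(root, sub)
            print(path, "->", "found" if os.path.isdir(path) else "MISSING")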
    

Testing on InterHand 2.6M Dataset

  • Please download our pre-trained model and organize the code as follows:

    a2j-transformer/
    ├── dab_deformable_detr/
    ├── nets/
    ├── utils/
    ├── ...py
    ├── datalist/
    |   └── ...pkl
    └── output/
        └── model_dump/
            └── snapshot.pth.tar
    

    The datalist folder and its pkl files contain the dataset lists generated while running the code.

  • In config.py, set interhand_anno_dir and interhand_images_path to the absolute paths of the dataset annotations and images (an illustrative excerpt is sketched after this list).

  • In config.py, set cur_dir to the a2j-transformer code directory.

  • Run the following script:

    python test.py --gpu <your_gpu_ids>

    Alternatively, you can change gpu_ids directly in test.py.
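
The config.py edits above typically amount to pointing a few path variables at your local copies. The excerpt below is an illustrative sketch only; the placeholder paths and the surrounding layout of config.py will differ on your machine:

    # config.py (excerpt) -- placeholder values for illustration only
    cur_dir = "/home/user/a2j-transformer"                       # a2j-transformer code directory
    interhand_anno_dir = "/data/Interhand2.6M_5fps/annotations"  # absolute path to the annotations
    interhand_images_path = "/data/Interhand2.6M_5fps/images"    # absolute path to the images

With these set, python test.py --gpu 0 should load the pre-trained weights from output/model_dump/snapshot.pth.tar and run evaluation on InterHand 2.6M.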
