Eric Hedlin, Gopal Sharma, Shweta Mahajan, Xingzhe He, Hossam Isack, Abhishek Kar, Helge Rhodin, Andrea Tagliasacchi, Kwang Moo Yi
For more detailed information, visit our project page or read our paper.

We provide an interactive demo in Google Colab. It lets you upload custom images, then optimizes and visualizes the found keypoints on those images.
Create a conda environment using the provided requirements.yaml:
conda env create -f requirements.yaml
conda activate StableKeypoints
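After activation, you can run a quick sanity check. This sketch assumes the requirements include PyTorch (with a CUDA build for GPU training); adjust it if your requirements.yaml differs:

```python
import torch

# Verify that the environment resolved a working PyTorch install and
# that a GPU is visible before launching the optimization.
print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```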
The CelebA, Taichi, Human3.6M, DeepFashion, and CUB datasets can be downloaded from their respective websites.
Preprocessed data for CelebA and CUB can be found in Autolink's repository.
To use the code, run:
python3 -m unsupervised_keypoints.main [arguments]
- --dataset_loc: Path to the dataset.
- --dataset_name: Name of the dataset.
- --num_steps: Number of optimization steps (default 500; up to 10,000 for non-human datasets).
- --evaluation_method: Following baselines, the evaluation method varies by dataset:
  - CelebA: 'inter_eye_distance'
  - CUB: 'visible'
  - Taichi: 'mean_average_error' (renormalized per keypoint)
  - DeepFashion: 'pck'
  - Human3.6M: 'orientation_invariant'
- --save_folder: Output save location (default "outputs" inside the repo).
For example, to train and evaluate on CelebA in the wild:

python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name celeba_wild --evaluation_method inter_eye_distance --save_folder /path/to/save
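To run several datasets in one go, you can launch the same entry point in a loop. A minimal sweep sketch follows; apart from celeba_wild, the exact --dataset_name strings and all paths are assumptions, so check the repository's dataset loaders for the accepted names:

```python
import subprocess

# Dataset name -> evaluation method, following the pairing listed above.
# NOTE: apart from "celeba_wild", these --dataset_name strings are
# assumptions; the paths are placeholders for your local dataset copies.
RUNS = {
    "celeba_wild": "inter_eye_distance",
    "cub": "visible",
    "taichi": "mean_average_error",
    "deepfashion": "pck",
    "human3.6m": "orientation_invariant",
}

for name, eval_method in RUNS.items():
    subprocess.run(
        [
            "python3", "-m", "unsupervised_keypoints.main",
            "--dataset_loc", f"/path/to/{name}",
            "--dataset_name", name,
            "--evaluation_method", eval_method,
            "--save_folder", f"outputs/{name}",
        ],
        check=True,  # abort the sweep if any run fails
    )
```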
If you want to use a custom dataset, run:
python3 -m unsupervised_keypoints.main --dataset_loc /path/to/dataset --dataset_name custom
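The custom loader's exact expectations live in the repository's dataset code; as a rough sketch, assuming it reads common image formats directly from the directory passed via --dataset_loc, you can gather your images into a single folder like this:

```python
import shutil
from pathlib import Path

# Hypothetical preparation step: collect images into one folder to pass
# as --dataset_loc. We assume the custom loader reads common image
# formats straight from that directory; verify against the repo's code.
src = Path("/path/to/raw_images")  # wherever your images live now
dst = Path("/path/to/dataset")     # folder passed as --dataset_loc
dst.mkdir(parents=True, exist_ok=True)

for img in sorted(src.iterdir()):
    if img.suffix.lower() in {".jpg", ".jpeg", ".png"}:
        shutil.copy(img, dst / img.name)
```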
We provide the precomputed tokens here.
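These tokens are the learned text embeddings that the optimization produces. A minimal loading sketch, assuming the file is a PyTorch checkpoint (the filename and tensor layout here are hypothetical; inspect the downloaded file to confirm):

```python
import torch

# Hypothetical filename; substitute the actual file from the download.
tokens = torch.load("precomputed_tokens.pt", map_location="cpu")

print(type(tokens))
# If the object is a tensor of learned embeddings, its shape should be
# something like (num_tokens, embedding_dim); inspect it to confirm.
if torch.is_tensor(tokens):
    print("shape:", tokens.shape)
```

If you find our work useful, please cite: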
@article{hedlin2023keypoints,
  title={Unsupervised Keypoints from Pretrained Diffusion Models},
  author={Hedlin, Eric and Sharma, Gopal and Mahajan, Shweta and He, Xingzhe and Isack, Hossam and Kar, Abhishek and Rhodin, Helge and Tagliasacchi, Andrea and Yi, Kwang Moo},
  journal={arXiv preprint arXiv:2312.00065},
  year={2023}
}