This repository is the official PyTorch implementation of 3DILG (https://arxiv.org/abs/2205.13914).
Biao Zhang¹, Matthias Nießner², Peter Wonka¹

¹KAUST, ²Technical University of Munich
https://1zb.github.io/3DILG/static/video/pipeline.mp4
https://1zb.github.io/3DILG/static/video/uni_ar_wtitle.mp4
- Training of first stage
- Training of category-conditioned generation
- Preprocessed data
- Evaluation code of first stage model
- Pretrained models
- Code cleaning
Download the preprocessed data from here. In case the link is inaccessible, send me an email for the data. Uncompress occupancies.zip and surfaces.zip to somewhere on your hard disk; they are required in the training phase.
torchrun --nproc_per_node=4 run_vqvae.py --output_dir output/vqvae_512_1024_2048 --model vqvae_512_1024_2048 --batch_size 32 --num_workers 60 --lr 1e-3 --disable_eval --point_cloud_size 2048
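The first stage trains a VQ-VAE (the model name vqvae_512_1024_2048 reflects its configuration) that discretizes latent vectors against a learned codebook. A minimal NumPy sketch of the core nearest-codebook quantization step, not the repository's actual implementation:

```python
import numpy as np

def quantize(latents, codebook):
    """Map each latent vector to its nearest codebook entry (L2 distance)."""
    # latents: (N, D), codebook: (K, D)
    d2 = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (N, K)
    idx = d2.argmin(axis=1)  # discrete code indices, later modeled autoregressively
    return codebook[idx], idx

rng = np.random.default_rng(0)
codebook = rng.normal(size=(1024, 8))  # toy codebook: K=1024 entries, dim 8
latents = rng.normal(size=(5, 8))      # toy latents for 5 query points
quantized, codes = quantize(latents, codebook)
print(quantized.shape, codes.shape)  # (5, 8) (5,)
```

The dimensions here are illustrative only; the straight-through gradient estimator and codebook updates used in training are omitted.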
python eval.py
torchrun --nproc_per_node=4 run_class_cond.py --output_dir output/class_encoder_55_512_1024_24_K1024_vqvae_512_1024_2048 --model class_encoder_55_512_1024_24_K1024 --vqvae vqvae_512_1024_2048 --vqvae_pth output/vqvae_512_1024_2048/checkpoint-799.pth --batch_size 32 --num_workers 60 --point_cloud_size 2048
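The second stage trains a transformer that generates the discrete latent codes autoregressively, conditioned on a category. A minimal sketch of the per-step categorical sampling over codebook indices (hypothetical logits; the repository's decoding loop may differ):

```python
import numpy as np

def sample_next_code(logits, temperature=1.0, rng=None):
    """Sample one discrete code index from unnormalized logits."""
    rng = rng if rng is not None else np.random.default_rng()
    scaled = logits / temperature
    p = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    p /= p.sum()
    return int(rng.choice(len(logits), p=p))

rng = np.random.default_rng(0)
logits = rng.normal(size=1024)  # one step's scores over K=1024 codebook entries
code = sample_next_code(logits, temperature=0.8, rng=rng)
print(0 <= code < 1024)  # True
```

In a full decoding loop this step would be repeated, feeding each sampled code back into the transformer until the latent grid is complete.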
Pick a category id $CATEGORY_ID; the definitions can be found in shapenet.py. Run the following command to sample a shape; you will find a file sample.obj in the root folder.
python run_class_cond_sample.py --model_pth pretrained/class_encoder_55_512_1024_24_K1024_vqvae_512_1024_2048/checkpoint-399.pth --vqvae_pth pretrained/vqvae_512_1024_2048/checkpoint-799.pth --id $CATEGORY_ID
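The generated sample.obj is a standard Wavefront OBJ mesh. A minimal stdlib-only sketch of inspecting such a file's vertex and face records (toy inline mesh used here instead of an actual sample):

```python
def parse_obj(text):
    """Collect vertex positions and face index tuples from Wavefront OBJ text."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":    # vertex record: v x y z
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":  # face record: f i j k (1-based vertex indices)
            faces.append(tuple(int(p.split("/")[0]) for p in parts[1:]))
    return vertices, faces

# Tiny example mesh: a single triangle.
obj_text = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"
verts, faces = parse_obj(obj_text)
print(len(verts), len(faces))  # 3 1
```

For real use, a mesh library such as trimesh can load and render the file directly; the parser above is only meant to show the format.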
Contact Biao Zhang (@1zb) if you have any further questions. This repository is for academic research use only.
The architecture of our method is inspired by Taming Transformers and ViT.
@inproceedings{
zhang2022dilg,
title={3{DILG}: Irregular Latent Grids for 3D Generative Modeling},
author={Biao Zhang and Matthias Nie{\ss}ner and Peter Wonka},
booktitle={Advances in Neural Information Processing Systems},
editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
year={2022},
url={https://openreview.net/forum?id=RO0wSr3R7y-}
}
This project is under the CC-BY-NC 4.0 license.