The fine-tuning instructions are in [FINETUNE.md](FINETUNE.md).

## ⚠️ Our code is based on the [VideoMAE](https://github.com/MCG-NJU/VideoMAE) codebase.

<!-- We provide pre-trained and fine-tuned models in [MODEL_ZOO.md](MODEL_ZOO.md). -->

<!-- ## 👀 Visualization -->

<!-- We provide the script for visualization in [`vis.sh`](vis.sh). Colab notebook for better visualization is coming soon. -->

<!-- ## ☎️ Contact
Zhan Tong: [email protected] -->

## 👍 Acknowledgements

Thanks to [Ziteng Gao](https://sebgao.github.io/), Lei Chen, [Chongjian Ge](https://chongjiange.github.io/), and [Zhiyu Zhao](https://github.com/JerryFlymi) for their kind support.<br>
This project is built upon [MAE-pytorch](https://github.com/pengzhiliang/MAE-pytorch) and [BEiT](https://github.com/microsoft/unilm/tree/master/beit). Thanks to the contributors of these great codebases.

## 🔒 License

The majority of this project is released under the CC-BY-NC 4.0 license, as found in the [LICENSE](https://github.com/MCG-NJU/VideoMAE/blob/main/LICENSE) file. Portions of the project are available under separate license terms: [SlowFast](https://github.com/facebookresearch/SlowFast) and [pytorch-image-models](https://github.com/rwightman/pytorch-image-models) are licensed under the Apache 2.0 license, and [BEiT](https://github.com/microsoft/unilm/tree/master/beit) is licensed under the MIT license.

## ✏️ Citation

If you find this project helpful, please feel free to leave a star ⭐️ and cite our paper:

```
@inproceedings{tong2022videomae,
  title={Video{MAE}: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training},
  author={Zhan Tong and Yibing Song and Jue Wang and Limin Wang},
  booktitle={Advances in Neural Information Processing Systems},
  year={2022}
}
@article{salehi2024sigma,
  title={SIGMA: Sinkhorn-Guided Masked Video Modeling},
  author={Salehi, Mohammadreza and Dorkenwald, Michael and Thoker, Fida Mohammad and Gavves, Efstratios and Snoek, Cees GM and Asano, Yuki M},
  journal={European Conference on Computer Vision},
  year={2024}
}
```
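In a LaTeX paper, the entries above can then be cited by their keys. A minimal sketch, assuming the BibTeX entries are saved in a file named `references.bib` (the filename is an assumption, not part of this repository):

```latex
% Minimal sketch: the entries above are assumed to live in references.bib.
\documentclass{article}
\begin{document}
Our experiments build on VideoMAE~\cite{tong2022videomae}
and SIGMA~\cite{salehi2024sigma}.

\bibliographystyle{plain}
\bibliography{references} % references.bib (assumed filename)
\end{document}
```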