Code and instructions are coming soon.

[Project Page] [Paper (arXiv)] [BibTeX]

## Train your own models

Instructions to set up the codebase in your own environment are provided in SETUP_CODE and SETUP_DATA.

Configurations to train models can be found here.

## Citation

If you like our work, please consider giving it a star ⭐ and citing us:

```bibtex
@article{alkin2024upt,
    title={Universal Physics Transformers},
    author={Benedikt Alkin and Andreas Fürst and Simon Schmid and Lukas Gruber and Markus Holzleitner and Johannes Brandstetter},
    journal={arXiv preprint arXiv:2402.12365},
    year={2024}
}
```