enable TP fp8 allgather with PrepareFloat8ModuleInput #393

Merged: 1 commit merged into main on Jun 13, 2024
Conversation

wanchaol (Contributor)

This PR is a follow-up to enable fp8 all-gather in TP, after these PRs landed:

* pytorch/pytorch#128431
* pytorch-labs/float8_experimental#275

One needs to update their pytorch/float8_experimental checkout to include those changes in order to train with fp8.

Since fp8 is not yet enabled in our integration tests, there should be no issues on CI or in training runs that do not use fp8.
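For context, here is a minimal sketch of how `PrepareFloat8ModuleInput` might be wired into a tensor-parallel plan so the input all-gather runs in fp8. This is not this PR's actual diff: the module names (`attention`, `feed_forward`, the `wq`/`w1`-style submodules), the layouts, and the mesh size are illustrative assumptions, and it assumes a float8_experimental build that already includes pytorch-labs/float8_experimental#275.

```python
# Hypothetical sketch, not the PR's diff: swap PrepareModuleInput for
# PrepareFloat8ModuleInput so module inputs are cast to fp8 *before* the
# TP all-gather, letting the all-gather itself run in fp8.
from torch.distributed._tensor import Replicate, Shard
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.parallel import parallelize_module

from float8_experimental.float8_tensor_parallel import (
    Float8ColwiseParallel,
    Float8RowwiseParallel,
    PrepareFloat8ModuleInput,
)

tp_mesh = init_device_mesh("cuda", (8,))  # assumed 8-way TP mesh

# Illustrative per-block plan for a typical transformer block; the real
# plan in this repo may differ in names and layouts.
layer_plan = {
    "attention": PrepareFloat8ModuleInput(
        input_layouts=Shard(1),
        desired_input_layouts=Replicate(),  # this redistribution all-gathers in fp8
    ),
    "attention.wq": Float8ColwiseParallel(),
    "attention.wk": Float8ColwiseParallel(),
    "attention.wv": Float8ColwiseParallel(),
    "attention.wo": Float8RowwiseParallel(output_layouts=Shard(1)),
    "feed_forward": PrepareFloat8ModuleInput(
        input_layouts=Shard(1),
        desired_input_layouts=Replicate(),
    ),
    "feed_forward.w1": Float8ColwiseParallel(),
    "feed_forward.w2": Float8RowwiseParallel(output_layouts=Shard(1)),
}

# `model` is assumed to be a transformer whose linears were already
# swapped to float8 linears via float8_experimental.
for block in model.layers:
    parallelize_module(block, tp_mesh, layer_plan)
```

The design point is that `PrepareFloat8ModuleInput` casts the sharded input to fp8 first, so the subsequent Shard-to-Replicate redistribution moves fp8 bytes over the wire instead of bf16, roughly halving all-gather communication volume.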

facebook-github-bot added the CLA Signed label (managed by the Meta Open Source bot) on Jun 12, 2024
wanchaol merged commit e17e3b8 into main on Jun 13, 2024 (4 of 5 checks passed)
tianyu-l pushed a commit to tianyu-l/torchtitan_intern24 that referenced this pull request Aug 16, 2024
philippguevorguian pushed a commit to YerevaNN/YNNtitan that referenced this pull request Aug 17, 2024