
[pull] main from NVIDIA:main #24

Merged
merged 7 commits into phu0ngng:main from NVIDIA:main on Sep 6, 2024
Conversation

pull[bot] commented Sep 6, 2024

See Commits and Changes for more details.


Created by pull[bot]


dependabot[bot] and others added 7 commits on September 3, 2024
* Bump actions/download-artifact from 3 to 4.1.7 (#1154)

Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 3 to 4.1.7.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](actions/download-artifact@v3...v4.1.7)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
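The dependabot bump above amounts to changing the action pin in a workflow file from the v3 major tag to the patched 4.1.7 release. A minimal illustrative fragment (the workflow filename, step name, and artifact name are placeholders, not taken from this repository):

```yaml
# .github/workflows/ci.yml (illustrative fragment)
- name: Download build artifacts
  uses: actions/download-artifact@v4.1.7   # previously: actions/download-artifact@v3
  with:
    name: build-output
    path: ./artifacts
```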
* fp8 mha with rope

Signed-off-by: Xin Yao <[email protected]>

* avoid index select in cast ops

Signed-off-by: Xin Yao <[email protected]>

* avoid index select in fused_attn_fwd

Signed-off-by: Xin Yao <[email protected]>

* rename is_first_module_in_mha to fp8_output

Signed-off-by: Xin Yao <[email protected]>

* resolve comments

Signed-off-by: Xin Yao <[email protected]>

* resolve comments

Signed-off-by: Xin Yao <[email protected]>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* move transpose to backward for fp8 input

Signed-off-by: Xin Yao <[email protected]>

* fix ut

Signed-off-by: Xin Yao <[email protected]>

* resolve comments

Signed-off-by: Xin Yao <[email protected]>

* update argument list for CP

Signed-off-by: Xin Yao <[email protected]>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix for FA3

Signed-off-by: Xin Yao <[email protected]>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* remove unnecessary copy of scale_inv

Signed-off-by: Xin Yao <[email protected]>

* skip fp8 dpa/mha tests when fa3 is not available

Signed-off-by: Xin Yao <[email protected]>

* fix a merge bug

Signed-off-by: Xin Yao <[email protected]>

---------

Signed-off-by: Xin Yao <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
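The "avoid index select" commits above apply a general optimization: when the indices being gathered form a contiguous range, a slice (a view, in PyTorch) replaces a per-element `index_select`-style gather. A pure-Python analogue of the idea, not the actual kernel code:

```python
def gather(rows, indices):
    """index_select-style access: one lookup per index, materializes a new list."""
    return [rows[i] for i in indices]

def contiguous_view(rows, start, stop):
    """Slice-style access: when the indices form a contiguous range, a single
    slice avoids per-element gathering (in PyTorch a slice is a view, no copy)."""
    return rows[start:stop]
```

For a contiguous range the two produce the same elements; the slice form skips the per-index indirection.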
* Added offloading support for FP8 attention

Signed-off-by: Selvaraj Anandaraj <[email protected]>

* Update transformer_engine/pytorch/attention.py

Co-authored-by: Kirthi Shankar Sivamani <[email protected]>
Signed-off-by: Selvaraj Anandaraj <[email protected]>

* Fix

Signed-off-by: Kirthi Shankar Sivamani <[email protected]>

---------

Signed-off-by: Selvaraj Anandaraj <[email protected]>
Signed-off-by: Selvaraj Anandaraj <[email protected]>
Signed-off-by: Kirthi Shankar Sivamani <[email protected]>
Co-authored-by: Selvaraj Anandaraj <[email protected]>
Co-authored-by: Kirthi Shankar Sivamani <[email protected]>
* [TE/PyTorch][MoE] Add FP8 padding and unpadding module 

1. Add a multi-tensor padding kernel for FP8 with padding size = 16.
2. Add Fp8Padding and Fp8Unpadding modules.
3. Add padded GroupedLinear unit tests.

---------

Signed-off-by: beinggod <[email protected]>
Co-authored-by: Phuong Nguyen <[email protected]>
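The padding described in the commit above rounds each group's row count up to a multiple of 16, the alignment FP8 kernels want. A minimal pure-Python sketch of that size computation (the function names are illustrative, not the Transformer Engine API):

```python
def pad_to_multiple(n: int, multiple: int = 16) -> int:
    """Round n up to the nearest multiple (the FP8 padding here uses 16)."""
    return ((n + multiple - 1) // multiple) * multiple

def padded_sizes(tokens_per_expert, multiple=16):
    """Per-group padded row counts, as a multi-tensor padding pass would produce."""
    return [pad_to_multiple(n, multiple) for n in tokens_per_expert]
```

For example, per-expert token counts of [5, 16, 17] pad to [16, 16, 32]; the unpadding module would strip those extra rows again on the way out.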
suppress 128D warning from cudnn-frontend

Signed-off-by: Charlene Yang <[email protected]>
Revert "[C] Suppress 128-D warning from cudnn-frontend (#1158)"

This reverts commit 206c1d9.

Signed-off-by: Kirthi Shankar Sivamani <[email protected]>
@pull pull bot added the ⤵️ pull label Sep 6, 2024
@pull pull bot merged commit bdea56f into phu0ngng:main Sep 6, 2024
12 checks passed
6 participants