use new op in KTRegroupAsDict module (pytorch#2210)

Summary:
Pull Request resolved: pytorch#2210

# context
* the new op `permute_multi_embedding` outperforms the original op `permute_pooled_embs_auto_grad`
* this diff switches the `KTRegroupAsDict` module to the new op
* benchmark results: D58907223

# benchmark
* [traces](https://drive.google.com/drive/folders/1v_kD9n1jOkGUmYyix3-dUYiBDE_C3Hiv?usp=drive_link)
* previous prod {F1747994738}
* new prod {F1747994032}
* metrics

| Operator | GPU runtime | GPU memory | notes |
|---|---|---|---|
| **[previous prod] permute_pooled_embs** | 4.9 ms | 1.5 K | GPU-bound, does **NOT** allow duplicates, PT2-incompatible `pin_and_move` |
| **[new prod] permute_multi_embedding** | 2.0 ms | 1.0 K | both CPU and GPU runtime/memory improved, **ALLOWS** duplicates, PT2 friendly |

Reviewed By: dstaay-fb

Differential Revision: D53590566
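For readers unfamiliar with the regroup semantics, below is a minimal eager-mode sketch in plain PyTorch of what a KT regroup computes; it is a reference for the semantics only, not the fused kernel this commit adopts. The function name `regroup_reference`, the keys `f1`/`f2`/`f3`, and the dims are illustrative assumptions. It shows why an op that indexes its inputs can place the same key in multiple output groups (the "duplicates" case the new op allows and the old op did not).

```python
import torch
from typing import Dict, List


def regroup_reference(
    tensors: Dict[str, torch.Tensor],  # key -> [batch, dim] pooled embedding
    groups: List[List[str]],           # desired output grouping; keys may repeat
) -> List[torch.Tensor]:
    """Eager reference for the regroup semantics: each output tensor is the
    concatenation (along dim 1) of the requested keys' embeddings. Because
    we only read from the input dict, a key may appear in several groups."""
    return [torch.cat([tensors[k] for k in group], dim=1) for group in groups]


# usage: pooled embeddings for three keys, regrouped into two output tensors
B = 4
embs = {"f1": torch.randn(B, 8), "f2": torch.randn(B, 4), "f3": torch.randn(B, 4)}
out = regroup_reference(embs, [["f1", "f3"], ["f2", "f3"]])  # f3 duplicated
assert out[0].shape == (B, 12) and out[1].shape == (B, 8)
```

The fused op performs the same regrouping in a single kernel from precomputed permute metadata, which is where the runtime and memory wins in the table above come from.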
1 parent 07dd9b9 · commit 954d652 · 1 changed file with 56 additions and 44 deletions