Fix false alarm of Adagrad (pytorch#2601)
Summary: Pull Request resolved: pytorch#2601

Some non-Ads PyPer models use "Adagrad" for the sparse optimizer, even though this path is not supported. It happens to work with the current TTK sharder only because those models have no sparse parameters at all. So that the torchrec sharder honors the same behavior, this diff re-examines this path and makes it on par with the TTK sharder.

Differential Revision: D65957418

fbshipit-source-id: 70368b8a072b46a139834a9928f0e5f8f06a7fe0
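The intended behavior can be sketched as follows. This is a minimal illustration, not the actual torchrec code: the names `SUPPORTED_SPARSE_OPTIMS` and `validate_sparse_optimizer` are hypothetical, and the set of supported optimizer names is invented for the example. The point is that the unsupported-optimizer check should only fire when sparse parameters actually exist, which removes the false alarm for models that name "Adagrad" but have no sparse parameters.

```python
# Hypothetical sketch of the fixed validation path; names and the set of
# supported optimizers are illustrative, not the real torchrec API.
SUPPORTED_SPARSE_OPTIMS = {"EXACT_ROWWISE_ADAGRAD", "EXACT_SGD"}


def validate_sparse_optimizer(optim_name: str, sparse_params: list) -> None:
    # If the model has no sparse parameters, the sparse optimizer choice is
    # irrelevant, so an unsupported name like "Adagrad" must not raise.
    # This matches the (accidental) TTK sharder behavior described above.
    if not sparse_params:
        return
    if optim_name.upper() not in SUPPORTED_SPARSE_OPTIMS:
        raise ValueError(f"Unsupported sparse optimizer: {optim_name}")
```

With this shape, `validate_sparse_optimizer("Adagrad", [])` is a no-op, while the same optimizer name with real sparse parameters still fails loudly.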