Issues: pytorch/torchrec
In what scenarios should QuantManagedCollisionEmbeddingCollection be used? (#2724, opened Feb 5, 2025 by tiankongdeguiji)
Wrong local_world_size falls back to global world size with Slurm (#2667, opened Jan 7, 2025 by JacoCheung; see the environment-variable sketch after this list)
Why does dense_embedding_codegen_lookup_function have no DispatchKey::Autograd with DataParallel? (#2657, opened Dec 27, 2024 by imhandsome)
[Bug] state_dict returns wrong path when DMP is used as a submodule (#2584, opened Nov 22, 2024 by JacoCheung)
ShardedQuantEmbeddingBagCollection doesn't seem to be distributing the shards properly (#2575, opened Nov 21, 2024 by Hanyu-Li)
[Question/Bug] DP sharding parameters are inconsistent with others (#2563, opened Nov 18, 2024 by JacoCheung)
[Question] Does TorchRec support distributed checkpointing (DCP)? (#2534, opened Nov 4, 2024 by JacoCheung; see the DCP sketch after this list)
[Question] Is there gradient accumulation support for training? (#2332, opened Aug 22, 2024 by liuslnlp; see the accumulation sketch after this list)
How to share embeddings between an EmbeddingCollection and an EmbeddingBagCollection? (#2268, opened Aug 2, 2024 by tiankongdeguiji; see the sharing sketch after this list)
Does not work when using DATA_PARALLEL with FusedEmbeddingBagCollection (#2209, opened Jul 4, 2024 by imh966)
[Bug][Dynamic Embedding] Improper optimizer state_dict momentum2 key while constructing PSCollection (#2177, opened Jun 26, 2024 by JacoCheung; label: bug)
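
For #2667: a minimal sketch of one way to infer the local world size from launcher environment variables. `infer_local_world_size` is a hypothetical helper, not TorchRec's actual resolution logic; the point is that under Slurm (without torchrun) `LOCAL_WORLD_SIZE` may be unset, which is what triggers the global-world-size fallback the issue reports.

```python
import os


def infer_local_world_size(global_world_size: int) -> int:
    # Prefer the value set by torchrun/torchelastic, then Slurm's per-node
    # task count, and only then fall back to the global world size (the
    # fallback that #2667 reports as wrong on multi-node Slurm jobs).
    for var in ("LOCAL_WORLD_SIZE", "SLURM_NTASKS_PER_NODE"):
        value = os.environ.get(var)
        if value:
            # SLURM_NTASKS_PER_NODE can look like "4(x2)" on some clusters.
            return int(value.split("(")[0])
    return global_world_size
```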
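For #2534: a hedged sketch of saving and restoring a sharded TorchRec model with torch.distributed.checkpoint (DCP), assuming a recent PyTorch where `dcp.save`/`dcp.load` accept a `checkpoint_id` path. `model` and the checkpoint path are placeholders, not values from the issue.

```python
import torch.distributed.checkpoint as dcp

# model is assumed to be a torchrec DistributedModelParallel instance; its
# state_dict contains ShardedTensors, which DCP knows how to save per rank.
state = {"model": model.state_dict()}
dcp.save(state, checkpoint_id="/tmp/torchrec_ckpt")  # path is illustrative

# On restart: rebuild the sharded model first, then load in place.
state = {"model": model.state_dict()}
dcp.load(state, checkpoint_id="/tmp/torchrec_ckpt")
model.load_state_dict(state["model"])
```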
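For #2332: classic gradient accumulation only covers the dense parameters. This sketch uses a toy linear model so it runs standalone; the caveat (an assumption about TorchRec's design, not confirmed in the issue) is that fused sparse embedding optimizers apply their update during `backward()`, so embedding tables would still step every iteration regardless of this pattern.

```python
import torch

# Toy dense model so the sketch is self-contained.
model = torch.nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
accum_steps = 4  # hypothetical accumulation window

optimizer.zero_grad()
for step in range(16):
    batch = torch.randn(32, 8)
    loss = model(batch).pow(2).mean() / accum_steps  # scale so grads average
    loss.backward()  # dense .grad buffers accumulate across iterations
    if (step + 1) % accum_steps == 0:
        optimizer.step()       # one dense update per accum_steps batches
        optimizer.zero_grad()
```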
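For #2268: nothing here confirms an official sharing mechanism, but one workaround sketch is to keep a single EmbeddingCollection and derive EBC-style pooled embeddings from its sequence output, so both consumers read the same table. The table name, feature name, and sizes below are made up.

```python
import torch
from torchrec import EmbeddingCollection, EmbeddingConfig
from torchrec.sparse.jagged_tensor import KeyedJaggedTensor

ec = EmbeddingCollection(
    tables=[
        EmbeddingConfig(
            name="t_user",            # hypothetical shared table
            embedding_dim=16,
            num_embeddings=1000,
            feature_names=["user_id"],
        )
    ]
)

kjt = KeyedJaggedTensor.from_lengths_sync(
    keys=["user_id"],
    values=torch.tensor([1, 2, 3, 4]),
    lengths=torch.tensor([2, 2]),  # two samples with two ids each
)

jt = ec(kjt)["user_id"]            # JaggedTensor: one row per lookup id
pooled = torch.segment_reduce(     # sum-pool per sample, like an EBC would
    jt.values(), "sum", lengths=jt.lengths()
)
```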