Hi fbgemm team, while using torchrec `EmbeddingCollection` with the Adam optimizer, I found that `ec.fused_optimizer.state_dict()` returns nothing but momentum tensors. `lr`, `decay`, etc., which are normally accessible from a plain `torch.optim.Adam`, are gone:
```python
>>> model.fused_optimizer.state_dict()['state']['embeddings.product_table.weight'].keys()
dict_keys(['product_table.momentum1', 'product_table.exp_avg_sq'])
# only a `state` entry; there is no `param_groups` containing lr, beta1, etc.
```
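For comparison, this is what a plain (non-fused) `torch.optim.Adam` exposes; the `param_groups` entry below is exactly the metadata that is missing from the fused state dict (the tiny `Linear` module is just for illustration):

```python
import torch

layer = torch.nn.Linear(4, 4)
opt = torch.optim.Adam(layer.parameters(), lr=1e-3, betas=(0.9, 0.999), weight_decay=0.01)

print(opt.state_dict().keys())
# dict_keys(['state', 'param_groups'])
print(opt.state_dict()['param_groups'][0].keys())
# includes 'lr', 'betas', 'eps', 'weight_decay', 'amsgrad', 'params', ...
```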
This metadata is required when I need to dump and reload the model.
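As a stopgap I am carrying the hyperparameters around myself, next to the fused state. A rough sketch of that workaround (the checkpoint layout and the `optimizer_kwargs` dict are just my own convention, and it assumes `fused_optimizer.load_state_dict()` accepts the structure returned by `state_dict()` above):

```python
import torch

def save_checkpoint(model, optimizer_kwargs, path):
    # optimizer_kwargs is the dict originally passed when fusing the optimizer,
    # e.g. {"lr": 1e-3, "beta1": 0.9, "beta2": 0.999, "weight_decay": 0.0};
    # it has to be stored manually because it is not in the fused state dict.
    torch.save(
        {
            "model": model.state_dict(),
            "fused_optim_state": model.fused_optimizer.state_dict(),
            "optimizer_kwargs": optimizer_kwargs,
        },
        path,
    )

def load_checkpoint(model, path):
    # model must already be constructed (and its optimizer fused) with the same
    # hyperparameters; they are returned so the caller can verify/reuse them.
    ckpt = torch.load(path)
    model.load_state_dict(ckpt["model"])
    model.fused_optimizer.load_state_dict(ckpt["fused_optim_state"])
    return ckpt["optimizer_kwargs"]
```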