I'm currently experimenting with a new type of positional encoding in the SwinTransformer repository. Due to hardware limitations (only 2 GPUs available), I run my training in resumed sessions to cover a full 300 epochs. Each session runs for 12 hours, during which I can complete about 13 epochs.
To ensure the model doesn't see the same samples in the same order in each resumed session, I change the seed on every resume; without changing the seed, the model would be exposed to the same data ordering in every 12-hour run.
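For context, here is a minimal sketch (not the repo's actual code) of how this kind of per-resume reseeding can be wired up with a PyTorch `DistributedSampler`; the names `base_seed` and `start_epoch` are illustrative, not Swin Transformer config options:

```python
# Sketch: derive a fresh seed per resumed 12-hour session, and let the
# sampler reshuffle per epoch via set_epoch(). Assumes PyTorch only.
import random

import numpy as np
import torch
from torch.utils.data import TensorDataset
from torch.utils.data.distributed import DistributedSampler


def seed_everything(seed: int) -> None:
    """Seed Python, NumPy and PyTorch RNGs for this training session."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)


base_seed = 0
start_epoch = 13                              # first epoch of this resumed session
seed_everything(base_seed + start_epoch)      # new seed per resume (illustrative scheme)

# Note: DistributedSampler already reshuffles each epoch when set_epoch()
# is called, even with a fixed base seed.
dataset = TensorDataset(torch.arange(8).float())
sampler = DistributedSampler(dataset, num_replicas=1, rank=0,
                             shuffle=True, seed=base_seed)
for epoch in range(start_epoch, start_epoch + 2):
    sampler.set_epoch(epoch)
    print(epoch, list(iter(sampler)))         # different index order per epoch
```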
Observed behavior:
- First epoch of each resume: there is a significant jump in accuracy.
- Subsequent epochs: after the initial jump, accuracy improvements are minimal.
Could this pattern be related to the gradient scaler? I'm curious whether the scaler state might be affecting training dynamics when resuming, or if there's another underlying cause.
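One thing I'm checking is whether the AMP `GradScaler` state is carried across sessions. Below is a hedged sketch of how the scaler is commonly checkpointed and restored alongside the model and optimizer; the checkpoint keys (`"model"`, `"optimizer"`, `"scaler"`, `"epoch"`) and file name are illustrative and may not match the repo's exact format:

```python
# Sketch: save/restore torch.cuda.amp.GradScaler state so a resumed session
# keeps its learned loss scale instead of restarting the scale warm-up.
import torch

model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

# --- saving at the end of a 12-hour session ---
checkpoint = {
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "scaler": scaler.state_dict(),   # includes the current loss scale
    "epoch": 13,
}
torch.save(checkpoint, "checkpoint.pth")

# --- resuming at the start of the next session ---
ckpt = torch.load("checkpoint.pth", map_location="cpu")
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
scaler.load_state_dict(ckpt["scaler"])   # without this, the loss scale resets
start_epoch = ckpt["epoch"] + 1
```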