Scheduler (reduce lr at epochs 218, 245, i.e. batches 400k, 450k)
```python
# lf = lambda x: 1 - x / epochs  # linear ramp to zero
# lf = lambda x: 10 ** (-2 * x / epochs)  # exp ramp to lr0 * 1e-2
lf = lambda x: 1 - 10 ** (hyp['lrf'] * (1 - x / epochs))  # inv exp ramp to lr0 * 1e-2
scheduler = optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lf, last_epoch=start_epoch - 1)
```
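For reference, the darknet-style step schedule that the "reduce lr at epochs 218, 245" comment describes can be written as a plain multiplier function. This is a hedged sketch, not the repo's code; the milestones and `gamma=0.1` are the conventional 10x drops implied by the comment:

```python
def step_lf(epoch, milestones=(218, 245), gamma=0.1):
    """Step-schedule multiplier: lr0 is scaled by gamma at each milestone.

    Sketch of the darknet-style schedule; milestones/gamma are assumptions
    based on the "reduce lr at epochs 218, 245" comment, not repo code.
    """
    return gamma ** sum(epoch >= m for m in milestones)
```

With this function, `step_lf(0)` is 1.0, the multiplier drops to 0.1 at epoch 218 and to 0.01 at epoch 245, and it stays at 0.01 afterwards, which is the behavior the questions below ask about.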
At which epochs does the lr drop, other than 218 and 245? Does it stop decreasing at 245? Can we tune this schedule for a custom dataset? What should we do when the loss stagnates?
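Note that the active `lf` above is a smooth curve, not a step schedule, so the lr decreases at every epoch rather than only at 218 and 245. A minimal sketch of its values, assuming `epochs = 273` (the original YOLOv3 273-epoch run) and `hyp['lrf'] = -2.0` (a hypothetical value consistent with the `lr0 * 1e-2` comment):

```python
epochs = 273
lrf = -2.0  # hypothetical; matches the "lr0 * 1e-2" comment, check hyp['lrf'] in your config

def lf(x):
    # inverse-exponential multiplier applied to lr0 at epoch x
    return 1 - 10 ** (lrf * (1 - x / epochs))

for e in (0, 100, 218, 245, 272):
    print(e, lf(e))  # lf(0) ≈ 0.99, decreasing smoothly toward 0 at epoch 273
```

Under these assumptions the multiplier starts near 0.99, decays monotonically, and reaches exactly 0 at `x = epochs`, so the final-epoch lr is effectively zero rather than a fixed floor.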