Hi Guanxing,

I noticed that lr_scheduler is set to False in the ManiGaussian/conf/method/ManiGaussian_BC.yaml config file, resulting in a constant learning rate of 0.0005 throughout training. However, the paper says, "We also adopt a cosine scheduler with a warmup in the first 3k steps".

Which lr_scheduler setting produces the best performance as reported in the paper?

Regards,
Bowen
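For reference, the schedule described in the paper could look something like the sketch below (a minimal illustration using PyTorch's LambdaLR; the 3k warmup steps and 0.0005 base LR come from the question, while the function name, total_steps, and the decay-to-zero endpoint are assumptions, not values from the repo):

```python
import math
import torch

def make_warmup_cosine(optimizer, warmup_steps=3000, total_steps=100000):
    # Hypothetical helper: linear warmup over the first 3k steps,
    # then cosine decay for the remaining steps. total_steps is assumed.
    def lr_lambda(step):
        if step < warmup_steps:
            # Scale linearly from 0 up to the base LR during warmup.
            return step / max(1, warmup_steps)
        # Cosine decay from the base LR down to 0 afterwards.
        progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
        return 0.5 * (1.0 + math.cos(math.pi * progress))
    return torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)

model = torch.nn.Linear(4, 4)  # placeholder module for illustration
optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)  # base LR from the config
scheduler = make_warmup_cosine(optimizer)  # call scheduler.step() once per training step
```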