
Question about lr_scheduler #18

Open
cheng052 opened this issue Aug 5, 2024 · 0 comments

Hi Guanxing,

I noticed that `lr_scheduler` is set to `False` in the `ManiGaussian/conf/method/ManiGaussian_BC.yaml` config file, resulting in a constant learning rate of 0.0005 during training. However, the paper says, "We also adopt a cosine scheduler with a warmup in the first 3k steps".
Which `lr_scheduler` setting produces the best performance, as reported in the paper?

Regards,
Bowen
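
For reference, here is a minimal sketch of what the paper's description ("cosine scheduler with a warmup in the first 3k steps") would look like as a step-to-LR function. The base LR of 5e-4 is taken from the config mentioned above; `total_steps=100_000` is an assumption for illustration only, not a value from the paper or repo:

```python
import math

def lr_at_step(step, base_lr=5e-4, warmup_steps=3_000, total_steps=100_000):
    """Cosine LR decay with linear warmup (sketch, assumed hyperparameters)."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr over the first 3k steps.
        return base_lr * step / warmup_steps
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(lr_at_step(0))        # 0.0 at the very first step
print(lr_at_step(3_000))    # 0.0005 at the end of warmup
```

A function like this can be plugged into PyTorch via `torch.optim.lr_scheduler.LambdaLR` by dividing out `base_lr`, whereas `lr_scheduler: False` in the config would correspond to the constant-LR case.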
