
After the total loss stabilized, it decreased again once training was resumed #306

Closed
xl4533 opened this issue May 29, 2019 · 4 comments


@xl4533

xl4533 commented May 29, 2019

I transferred the trained COCO model to a new dataset for training, and it behaved normally at first. I set 3000 epochs, but after a few hundred epochs the total loss was almost unchanged. I stopped training at that point, but when I resumed from the saved model, the total loss dropped again. Why does this happen? Does the learning rate drop to a very low level after a certain epoch?

@glenn-jocher
Member

@xl4533 there is a learning rate scheduler that updates the LR depending on the current epoch and the hyperparameters set in train.py. If you change the number of training epochs after starting training, this may affect your results.
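
A minimal sketch of why this matters, assuming a milestone-based scheduler like PyTorch's `MultiStepLR` whose milestones are computed as fractions of the total epoch count (the exact scheduler, fractions, and `gamma` used in train.py may differ):

```python
import torch

# Hypothetical model/optimizer for illustration only.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

epochs = 3000  # total epoch count set before training

# Milestones are derived from `epochs`, so changing the total epoch
# count (e.g. on resume) moves the points where the LR is reduced.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer,
    milestones=[round(epochs * 0.8), round(epochs * 0.9)],  # assumed fractions
    gamma=0.1,  # LR multiplied by 0.1 at each milestone (assumed value)
)

for epoch in range(epochs):
    # ... training loop ...
    scheduler.step()  # advance the schedule once per epoch
```

If the scheduler is rebuilt from scratch on resume (or built with a different `epochs`), the LR can jump back up or a milestone drop can replay, which would explain the loss falling again; restoring the scheduler state (e.g., its `last_epoch`) keeps the schedule consistent across restarts.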

@glenn-jocher
Member

See #238

@xl4533
Author

xl4533 commented May 30, 2019

Thank you very much. I'm adjusting these parameters now to see how performance changes.

@glenn-jocher
Member

@xl4533 you're welcome! Feel free to experiment with different hyperparameters and let us know if you have any more questions. Good luck with your adjustments! 🚀
