
Optimisation does not converge #2

Open

MaxCamPi opened this issue Jul 6, 2018 · 1 comment

MaxCamPi commented Jul 6, 2018

Could you please comment on why the optimisation in example.py (1d) does not converge? The resulting GP is nevertheless a fairly good approximation.

Thank you!

Epoch 9988: 0 %. Loss: -20.101368667360404
Epoch 9989: 0 %. Loss: -38.94774396791739
Epoch 9990: 0 %. Loss: -39.56470825273037
Epoch 9991: 0 %. Loss: -12.318895364429565
Epoch 9992: 0 %. Loss: -19.47904221404525
Epoch 9993: 0 %. Loss: -36.77490064730489
Epoch 9994: 0 %. Loss: 43.20385940066373
Epoch 9995: 0 %. Loss: 20.232737011431396
Epoch 9996: 0 %. Loss: -34.98050499495804
Epoch 9997: 0 %. Loss: -32.932921142632466
Epoch 9998: 0 %. Loss: 739.1653327084057
Epoch 9999: 0 %. Loss: -39.727655479961854
Epoch 10000: 0 %. Loss: -35.62156909134845

maka89 (Owner) commented Sep 21, 2021

Because it's using the Adam optimizer. Using a lower learning rate or a different optimizer should do the trick (but might give an overfit).
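
For anyone hitting the same oscillating loss: below is a minimal sketch of both suggestions, assuming a PyTorch-style training loop (the actual loop in example.py may be structured differently). The names `model`, `negative_log_likelihood`, `x_train`, `y_train`, and `num_epochs` are illustrative placeholders, not this project's API.

```python
import torch

# Option 1: keep Adam but use a smaller learning rate so the loss
# stops jumping around near the optimum.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(num_epochs):
    optimizer.zero_grad()
    loss = negative_log_likelihood(model, x_train, y_train)  # hypothetical loss function
    loss.backward()
    optimizer.step()

# Option 2: switch to a quasi-Newton optimizer such as L-BFGS, which
# typically converges smoothly on small GP marginal-likelihood problems
# (at the risk of overfitting the hyperparameters).
optimizer = torch.optim.LBFGS(model.parameters(), lr=0.1, max_iter=50)

def closure():
    optimizer.zero_grad()
    loss = negative_log_likelihood(model, x_train, y_train)
    loss.backward()
    return loss

optimizer.step(closure)
```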
