# DeepLearning-Optimizers

Visualization of popular Deep Learning optimizers built upon Gradient Descent.

## Cost Function

$z = x^2 - y^2$, or equivalently in parameter notation, $J = \theta_1^2 - \theta_2^2$.
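
This surface is a saddle, which makes it a useful stress test: the gradient pulls iterates toward the origin along $\theta_1$ but pushes them away along $\theta_2$. A minimal NumPy sketch of the cost and its analytic gradient (function names here are illustrative, not taken from this repo):

```python
import numpy as np

def cost(theta1, theta2):
    """Saddle-shaped cost surface: J = theta1^2 - theta2^2."""
    return theta1**2 - theta2**2

def grad(theta1, theta2):
    """Analytic gradient: (dJ/dtheta1, dJ/dtheta2) = (2*theta1, -2*theta2)."""
    return np.array([2.0 * theta1, -2.0 * theta2])
```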

## Optimizers

### Vanilla Gradient Descent
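
Plain gradient descent follows the raw gradient at a fixed learning rate: $\theta \leftarrow \theta - \alpha \nabla J(\theta)$. A minimal sketch of one update step, using the `grad` convention from the sketch above (parameter names are illustrative):

```python
def gradient_descent_step(theta, grad_fn, lr=0.1):
    # theta_new = theta - lr * gradient(theta)
    return theta - lr * grad_fn(*theta)
```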

### Momentum
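
Momentum accumulates an exponentially decaying velocity of past gradients, which damps oscillations and accelerates travel along consistently downhill directions. A sketch under the same conventions (initialize `velocity` to zeros):

```python
def momentum_step(theta, velocity, grad_fn, lr=0.1, beta=0.9):
    # v <- beta * v + lr * g;  theta <- theta - v
    velocity = beta * velocity + lr * grad_fn(*theta)
    return theta - velocity, velocity
```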

### AdaGrad
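
AdaGrad scales the learning rate per parameter by the square root of the lifetime sum of squared gradients, so frequently updated parameters take smaller steps. A sketch (initialize `accum` to zeros):

```python
import numpy as np

def adagrad_step(theta, accum, grad_fn, lr=0.1, eps=1e-8):
    g = grad_fn(*theta)
    accum = accum + g**2    # ever-growing sum of squared gradients
    return theta - lr * g / (np.sqrt(accum) + eps), accum
```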

### AdaDelta
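
AdaDelta replaces AdaGrad's ever-growing sum with exponentially decaying averages of squared gradients and squared updates, removing the need for a hand-tuned learning rate. A sketch (initialize `eg2` and `edx2` to zeros):

```python
import numpy as np

def adadelta_step(theta, eg2, edx2, grad_fn, rho=0.95, eps=1e-6):
    g = grad_fn(*theta)
    eg2 = rho * eg2 + (1 - rho) * g**2                   # running E[g^2]
    dx = -np.sqrt(edx2 + eps) / np.sqrt(eg2 + eps) * g   # RMS-scaled step
    edx2 = rho * edx2 + (1 - rho) * dx**2                # running E[dx^2]
    return theta + dx, eg2, edx2
```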

### Adam
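
Adam combines momentum's first-moment estimate with an RMSProp-style second-moment estimate, with bias correction for the zero-initialized averages. A sketch (`t` counts steps starting from 1; initialize `m` and `v` to zeros):

```python
import numpy as np

def adam_step(theta, m, v, t, grad_fn, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    g = grad_fn(*theta)
    m = beta1 * m + (1 - beta1) * g       # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * g**2    # biased second-moment estimate
    m_hat = m / (1 - beta1**t)            # bias-corrected estimates
    v_hat = v / (1 - beta2**t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```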

## Costs Comparison
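
One way to reproduce such a comparison is to trace $J$ per iteration for each stepper above and plot the curves together. The snippet below traces vanilla gradient descent, reusing `cost`, `grad`, and `gradient_descent_step` from the sketches above; matplotlib and the starting point are assumptions, not the repo's actual settings:

```python
import numpy as np
import matplotlib.pyplot as plt

theta = np.array([1.0, 0.001])   # start near the saddle, slightly off the ridge
costs = []
for _ in range(50):
    costs.append(cost(*theta))
    theta = gradient_descent_step(theta, grad, lr=0.1)

plt.plot(costs, label="Vanilla GD")   # repeat with the other steppers for a full comparison
plt.xlabel("iteration"); plt.ylabel("cost J"); plt.legend(); plt.show()
```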
