Optimizers

Gradient descent

Momentum

Nesterov momentum

Adagrad

RMSprop

Adam
class pykitml.Adam(learning_rate, decay_rate=1, beta1=0.9, beta2=0.9)

    This class implements Adam optimization.

    __init__(learning_rate, decay_rate=1, beta1=0.9, beta2=0.9)

        Parameters:
        - learning_rate (float) – Learning rate for the optimizer.
        - decay_rate (float) – Decay rate for the learning rate.
        - beta1 (float) – Exponential decay rate for the first moment (mean) of the gradients. Should be between 0 and 1.
        - beta2 (float) – Exponential decay rate for the second moment (mean of squared gradients). Should be between 0 and 1.
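Adam keeps exponentially decaying averages of past gradients and past squared gradients, controlled by beta1 and beta2 respectively. To make the parameters concrete, here is a minimal NumPy sketch of the standard Adam update rule (Kingma and Ba, 2014); it illustrates what an Adam optimizer computes in general, not pykitml's internal implementation, and the name adam_step is illustrative.

    import numpy as np

    def adam_step(param, grad, m, v, t,
                  learning_rate=0.001, beta1=0.9, beta2=0.9, eps=1e-8):
        # Update the biased moment estimates.
        m = beta1 * m + (1 - beta1) * grad       # first moment (mean of gradients)
        v = beta2 * v + (1 - beta2) * grad**2    # second moment (mean of squared gradients)
        # Correct the bias from initializing m and v at zero.
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        # Scale the step by the running gradient statistics.
        param = param - learning_rate * m_hat / (np.sqrt(v_hat) + eps)
        return param, m, v

    # One update step (t=1) on a toy parameter vector.
    w = np.array([0.5, -0.3])
    g = np.array([0.1, 0.02])
    m, v = np.zeros_like(w), np.zeros_like(w)
    w, m, v = adam_step(w, g, m, v, t=1, learning_rate=0.01)

To use the optimizer, construct it and pass it to a model's train() method. The following sketch follows the style of pykitml's MNIST example; the layer sizes, hyperparameter values, and training arguments are illustrative assumptions, not recommendations.

    import os.path
    import pykitml as pk
    from pykitml.datasets import mnist

    # Download and load the MNIST dataset using pykitml's dataset helpers.
    if not os.path.exists('mnist.pkl'):
        mnist.get()
    training_data, training_targets, testing_data, testing_targets = mnist.load()

    # A feed-forward network: 784 inputs, one hidden layer, 10 outputs
    # (layer sizes chosen for illustration).
    model = pk.NeuralNetwork([784, 100, 10])

    # Train with Adam; a decay_rate below 1 shrinks the learning rate
    # every decay_freq epochs.
    model.train(
        training_data=training_data,
        targets=training_targets,
        batch_size=50,
        epochs=1200,
        optimizer=pk.Adam(learning_rate=0.012, decay_rate=0.95),
        testing_data=testing_data,
        testing_targets=testing_targets,
        testing_freq=30,
        decay_freq=15
    )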