From: https://tensorflow.google.cn/api_docs/python/tf/train/AdamOptimizer#minimize
Optimizer that implements the Adam algorithm.
__init__
__init__(
    learning_rate=0.001,
    beta1=0.9,
    beta2=0.999,
    epsilon=1e-08,
    use_locking=False,
    name='Adam'
)
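The defaults match the hyperparameters recommended in the original Adam paper (Kingma & Ba, 2014). A minimal construction sketch, spelling out each default explicitly (so it is equivalent to tf.train.AdamOptimizer()):

import tensorflow as tf  # TensorFlow 1.x API

optimizer = tf.train.AdamOptimizer(
    learning_rate=0.001,
    beta1=0.9,       # exponential decay rate for the 1st-moment estimates
    beta2=0.999,     # exponential decay rate for the 2nd-moment estimates
    epsilon=1e-08,   # small constant added for numerical stability
    use_locking=False,
    name='Adam'
)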
Methods (commonly used)
apply_gradients
apply_gradients(
    grads_and_vars,
    global_step=None,
    name=None
)
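apply_gradients() is the second half of an explicit two-step update: first call compute_gradients(), optionally transform the gradients, then apply them. This is the pattern to use when gradients need processing before the update, e.g. clipping. A sketch, assuming loss is a scalar tensor already defined in the graph; the clipping threshold 5.0 is an illustrative value:

optimizer = tf.train.AdamOptimizer(learning_rate=0.001)
grads_and_vars = optimizer.compute_gradients(loss)
# Clip each gradient; skip variables with no gradient (g is None).
clipped = [(tf.clip_by_value(g, -5.0, 5.0), v)
           for g, v in grads_and_vars if g is not None]
global_step = tf.train.get_or_create_global_step()
# Passing global_step increments it by one after the variables are updated.
train_op = optimizer.apply_gradients(clipped, global_step=global_step)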
minimize
minimize(
    loss,
    global_step=None,
    var_list=None,
    gate_gradients=GATE_OP,
    aggregation_method=None,
    colocate_gradients_with_ops=False,
    name=None,
    grad_loss=None
)
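minimize() simply combines compute_gradients() and apply_gradients() in one call; if global_step is passed, it is incremented by one after the variables have been updated.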
Example:
train_op = tf.train.AdamOptimizer(learning_rate).minimize(loss)
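A fuller runnable sketch of the same pattern, using a toy least-squares loss (the names w and loss and the learning rate 0.1 are illustrative, not from the docs):

import tensorflow as tf  # TensorFlow 1.x API

# Toy problem: fit a single weight w so that w * 3.0 ≈ 6.0.
w = tf.Variable(0.0)
loss = tf.square(w * 3.0 - 6.0)

global_step = tf.train.get_or_create_global_step()
train_op = tf.train.AdamOptimizer(learning_rate=0.1).minimize(
    loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train_op)  # one Adam update per run
    print(sess.run([w, global_step]))  # w approaches 2.0, global_step == 200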