Setting the Learning Rate in TensorFlow (Python Learning Notes)

# Learning rate set to 1

import tensorflow as tf

training_steps = 10
learning_rate = 1

x = tf.Variable(tf.constant(5, dtype=tf.float32), name='x')
y = tf.square(x)

train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(y)

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    for i in range(training_steps):
        sess.run(train_op)
        x_value = sess.run(x)
        print("After %s iters:x%s is %f." % (i + 1, i + 1, x_value))

Output:

After 1 iters:x1 is -5.000000.

After 2 iters:x2 is 5.000000.

After 3 iters:x3 is -5.000000.

After 4 iters:x4 is 5.000000.

After 5 iters:x5 is -5.000000.

After 6 iters:x6 is 5.000000.

After 7 iters:x7 is -5.000000.

After 8 iters:x8 is 5.000000.

After 9 iters:x9 is -5.000000.

After 10 iters:x10 is 5.000000.
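The oscillation above follows directly from the update rule. The gradient of y = x² is 2x, so with learning rate 1 each step computes x ← x − 1·2x = −x: the parameter flips sign forever and never approaches the minimum at 0. A minimal pure-Python sketch of the same update (no TensorFlow needed):

```python
# Manual gradient descent on y = x^2 with learning rate 1.
# The gradient of x^2 is 2x, so x -= lr * 2x with lr = 1 gives
# x -> -x: the iterate just flips sign between 5 and -5.
x = 5.0
lr = 1.0
trajectory = []
for _ in range(10):
    grad = 2 * x          # dy/dx for y = x^2
    x = x - lr * grad     # gradient descent step
    trajectory.append(x)

print(trajectory)  # alternates: [-5.0, 5.0, -5.0, 5.0, ...]
```

Any learning rate above 1 would make the iterates diverge outright, which is one way training loss ends up as nan.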

# Learning rate set to 0.001

import tensorflow as tf

training_steps = 1000
learning_rate = 0.001

x = tf.Variable(tf.constant(5, dtype=tf.float32), name='x')
y = tf.square(x)

train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(y)

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    for i in range(training_steps):
        sess.run(train_op)
        if i % 100 == 0:
            x_value = sess.run(x)
            print("After %s iters:x%s is %f." % (i + 1, i + 1, x_value))

Output:

After 1 iters:x1 is 4.990000.

After 101 iters:x101 is 4.084646.

After 201 iters:x201 is 3.343555.

After 301 iters:x301 is 2.736923.

After 401 iters:x401 is 2.240355.

After 501 iters:x501 is 1.833880.

After 601 iters:x601 is 1.501153.

After 701 iters:x701 is 1.228794.

After 801 iters:x801 is 1.005850.

After 901 iters:x901 is 0.823355.
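Here each step multiplies x by (1 − 2·lr) = 0.998, so the iterate has the closed form xₙ = 5·0.998ⁿ and shrinks only geometrically: even after all 1000 steps x is still about 0.675, far from the optimum. A quick sanity check of the closed form (pure Python; the values agree with the TensorFlow run above up to small float32 rounding differences):

```python
# With lr = 0.001, each gradient step computes
#   x <- x - lr * 2x = (1 - 2*lr) * x = 0.998 * x,
# so after n steps x_n = 5 * 0.998**n.
x0, lr = 5.0, 0.001
for n in (1, 101, 201, 901, 1000):
    print(n, x0 * (1 - 2 * lr) ** n)
```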

# Exponentially decaying learning rate

import tensorflow as tf

training_steps = 100
global_step = tf.Variable(0)

# Start at 0.1 and multiply by 0.96 after every training step.
learning_rate = tf.train.exponential_decay(0.1, global_step, 1, 0.96, staircase=True)

x = tf.Variable(tf.constant(5, dtype=tf.float32), name="x")
y = tf.square(x)

train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(y, global_step=global_step)

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    for i in range(training_steps):
        sess.run(train_op)
        if i % 10 == 0:
            learning_rate_value = sess.run(learning_rate)
            x_value = sess.run(x)
            print("After %s iters:x%s is %f,learning rate is %f." % (i + 1, i + 1, x_value, learning_rate_value))

Output:

After 1 iters:x1 is 4.000000,learning rate is 0.096000.

After 11 iters:x11 is 0.690561,learning rate is 0.063824.

After 21 iters:x21 is 0.222583,learning rate is 0.042432.

After 31 iters:x31 is 0.106405,learning rate is 0.028210.

After 41 iters:x41 is 0.065548,learning rate is 0.018755.

After 51 iters:x51 is 0.047625,learning rate is 0.012469.

After 61 iters:x61 is 0.038558,learning rate is 0.008290.

After 71 iters:x71 is 0.033523,learning rate is 0.005511.

After 81 iters:x81 is 0.030553,learning rate is 0.003664.

After 91 iters:x91 is 0.028727,learning rate is 0.002436.
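The decay schedule itself is simple: tf.train.exponential_decay computes learning_rate · decay_rate^(global_step / decay_steps), and with staircase=True the exponent uses integer division so the rate drops in discrete steps. With decay_steps = 1, both variants reduce to 0.1 · 0.96^step. A hand-rolled sketch of the same formula (the function name below is mine, not TensorFlow's):

```python
# Reimplementation of the exponential-decay schedule:
#   lr = initial_rate * decay_rate ** (global_step / decay_steps)
# With staircase=True the exponent is floored (integer division);
# with decay_steps = 1 the two variants coincide: 0.1 * 0.96**step.
def exponential_decay(initial_rate, global_step, decay_steps, decay_rate, staircase=False):
    exponent = (global_step // decay_steps) if staircase else (global_step / decay_steps)
    return initial_rate * decay_rate ** exponent

# minimize() increments global_step before the value is printed, so
# "After 1 iters" reports the schedule evaluated at step 1:
print(exponential_decay(0.1, 1, 1, 0.96, staircase=True))   # ~0.096
print(exponential_decay(0.1, 11, 1, 0.96, staircase=True))  # ~0.063824
```

The decaying schedule gets the best of both earlier runs: large early steps for fast progress, small late steps for stable convergence, reaching x ≈ 0.03 in under 100 iterations.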

Author: Monica_Zzz
