Using TensorBoard

Implementing a simple linear regression model in TensorFlow

These notes were made while studying Stanford's TensorFlow course.
The model is linear regression.

First, load the data:

import tensorflow as tf
import utils
import matplotlib.pyplot as plt


data_file = "data/birth_life.txt"

# Step 1: read in data from the .txt file
# data is a numpy array of shape (190, 2), each row is a datapoint
data, n_samples = utils.read_birth_life_data(data_file)

#print(data)
#print(n_samples)

# visualize the data
plt.figure()
plt.scatter(data[:, 0], data[:, 1], linewidth=3)
plt.xlabel('birth rate')
plt.ylabel('life expectancy')
plt.show()
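
Here utils.read_birth_life_data comes from the course's utils.py. If that file is not available, a minimal sketch of a loader with the same interface might look like the following (this assumes the .txt file is tab-separated, has one header line, and lists country, birth rate, and life expectancy per row):

import numpy as np

def read_birth_life_data(filename):
    """Return a float32 array of shape (n_samples, 2) and the sample count.

    Assumes a tab-separated file with one header line and columns:
    country, birth rate, life expectancy.
    """
    with open(filename, 'r') as f:
        lines = f.readlines()[1:]              # skip the header line
    data = []
    for line in lines:
        fields = line.strip().split('\t')
        data.append([float(fields[1]), float(fields[2])])
    data = np.asarray(data, dtype=np.float32)
    return data, len(data)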

Figure: distribution of the data

Next, build the model and start solving for the parameters:

# Build the model
# Step 2: create placeholders for X (birth rate) and Y (life expectancy)
X = tf.placeholder(tf.float32, name='X')
Y = tf.placeholder(tf.float32, name='Y')

# Step 3: create weight and bias, initialized to 0
w = tf.get_variable('weights', initializer=tf.constant(0.0))
b = tf.get_variable('bias', initializer=tf.constant(0.0))

# Step 4: construct model to predict Y (life expectancy from birth rate)
Y_predicted = w * X + b 

# Step 5: use the square error as the loss function
loss = tf.square(Y - Y_predicted, name='loss')

# Step 6: using gradient descent with a learning rate of 0.001 to minimize loss
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001).minimize(loss)
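
For intuition, .minimize(loss) above is shorthand for computing the gradients and then applying the plain gradient-descent update w ← w - 0.001 * d(loss)/dw (and likewise for b). The equivalent two-step form, shown only as an illustration (the names gd and train_op_equiv are not used elsewhere), would be:

# Equivalent two-step form of the .minimize(loss) call above; the applied
# update is w <- w - 0.001 * d(loss)/dw and b <- b - 0.001 * d(loss)/db.
gd = tf.train.GradientDescentOptimizer(learning_rate=0.001)
grads_and_vars = gd.compute_gradients(loss, var_list=[w, b])
train_op_equiv = gd.apply_gradients(grads_and_vars)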

Then add the graph to a Session and run the computation:

log_path = './graphs/basic_models'
#merged_summary_op = tf.merge_all_summaries()
#writer = tf.summary.FileWriter(log_path, tf.get_default_graph())
with tf.Session() as sess:
    # Step 7: initialize the necessary variables, in this case, w and b
    sess.run(tf.global_variables_initializer()) 
    with tf.name_scope('running_variable'):  # omitting this name_scope raised an error, so it matters here
        tf.summary.scalar('loss_run', loss)
        tf.summary.scalar('weight_run', w)
        tf.summary.scalar('bias_run', b)

    merged_summary_op = tf.summary.merge_all()

    writer = tf.summary.FileWriter(log_path, sess.graph)
    # Step 8: train the model
    for i in range(100): # run 100 epochs
        for x, y in data:
            # Session runs the optimizer to minimize the loss and fetches the
            # merged summary so it can be written to the event file
            summary, _ = sess.run([merged_summary_op, optimizer], feed_dict={X: x, Y: y})
        writer.add_summary(summary, i)

    # Step 9: output the values of w and b
    w_out, b_out = sess.run([w, b]) 

    # Plot the results
    plt.figure()
    plt.scatter(data[:,0], data[:,1], linewidth=5, label=u'original data')
    plt.plot(data[:,0], data[:,0]*w_out + b_out, c='r', linewidth=3, label=u'predicted line')
    plt.xlabel('birth rate')
    plt.ylabel('life expectancy')
    plt.show()

writer.close()
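
Note that the loop above only writes the summary computed for the last sample of each epoch. As a small variation (not part of the original notes), the numeric loss can also be fetched on every step so the average loss per epoch gets printed; this would replace the inner training loop inside the same Session:

    # Variation on the training loop: also fetch the loss value so the
    # average loss per epoch can be reported.
    for i in range(100):
        total_loss = 0.0
        for x, y in data:
            summary, _, l = sess.run([merged_summary_op, optimizer, loss],
                                     feed_dict={X: x, Y: y})
            total_loss += l
        writer.add_summary(summary, i)
        print('Epoch {0}: average loss {1}'.format(i, total_loss / n_samples))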

The whole dataset could actually be processed in one vectorized step; looping over individual samples here is just for demonstration (see the sketch below).
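
A minimal sketch of that vectorized version: since X and Y were created without a fixed shape, the whole data array can be fed in a single call, and tf.reduce_mean turns the elementwise squared errors into a mean squared error, so each sess.run performs one full-batch gradient step:

# Full-batch (vectorized) variant: one gradient step per sess.run call,
# computed on the mean squared error over all samples.
mean_loss = tf.reduce_mean(loss, name='mean_loss')
batch_optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001).minimize(mean_loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(100):
        _, l = sess.run([batch_optimizer, mean_loss],
                        feed_dict={X: data[:, 0], Y: data[:, 1]})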

Figure: regression result

Finally, you can use TensorBoard to inspect what was recorded during training.
Run: tensorboard --logdir=./graphs/basic_models
Then open http://pcname:6006 in a browser (the machine name followed by port 6006) to see the results.

Figure: training results displayed in TensorBoard
