Getting Started with TensorFlow

The previous post covered installing the CPU-only build of TensorFlow. Now let's get a first taste of TensorFlow; this article serves as an introductory tutorial.

Runtime environment: PyCharm
Libraries: numpy, tensorflow
OS: Ubuntu 16.04

A brief description of the example:

  • Given two row vectors x1 and x2, stacking them gives a 2*100 array x = (x1; x2).
  • Given one row vector y1, it forms a 1*100 array y = (y1).

We want to fit the plane y = w1 * x1 + w2 * x2 + b, i.e. y = w * x + b, where y is a 1*100 row vector, w = (w1, w2) is a 1*2 row vector of weights, and x is the 2*100 array built from x1 and x2.

In short, the example boils down to recovering the weight vector w and the intercept constant b.
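
Since this is an ordinary linear least-squares problem, the fit can also be checked in closed form before bringing in gradient descent. Here is a minimal NumPy sketch, assuming the same synthetic data as the full example below (this check is illustrative and not part of the original code):

import numpy as np

x_data = np.float32(np.random.rand(2, 100))      # 2*100 inputs in [0, 1)
y_data = np.dot([0.100, 0.200], x_data) + 0.300  # 1*100 targets

# Append a row of ones so the intercept b is estimated as a third coefficient.
A = np.vstack([x_data, np.ones(100)]).T          # 100*3 design matrix
coeffs, _, _, _ = np.linalg.lstsq(A, y_data, rcond=None)
print(coeffs)  # approximately [0.1, 0.2, 0.3], i.e. (w1, w2, b)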

The complete example

# -*- coding: utf-8 -*-

import numpy as np
import tensorflow as tf 

# Use NumPy to generate phony data: 100 points in total
x_data = np.float32(np.random.rand(2, 100))
y_data = np.dot([0.100, 0.200], x_data) + 0.300

print x_data  # a 2*100 array of random floats drawn uniformly from [0, 1)
print "=========================================="
print y_data  # the 1*2 array (0.100, 0.200) dotted with the 2*100 array yields a 1*100 array, i.e. a row vector

# Build a linear model
b = tf.Variable(tf.zeros([1]))  # an all-zeros array of shape [1]: the intercept

w = tf.Variable(tf.random_uniform([1, 2], -1.0, 1.0))  # a 1*2 array with entries drawn uniformly from -1 to 1

y = tf.matmul(w, x_data) + b  # matmul multiplies matrix `a` by matrix `b`: here w (1*2) times x_data (2*100) gives a 1*100 result

# Minimize the mean squared error
loss = tf.reduce_mean(tf.square(y - y_data))  # reduce_mean averages over all elements: the mean squared error (MSE) loss
optimizer = tf.train.GradientDescentOptimizer(0.5)  # optimizer implementing the gradient descent algorithm, learning rate 0.5
train = optimizer.minimize(loss)

# Initialize the variables
init = tf.global_variables_initializer()  # op that initializes all global variables in the graph

# Launch the graph
with tf.Session() as s:  # the with block closes the session automatically
    s.run(init)

    # Fit the plane
    for step in xrange(0, 201):
        s.run(train)
        if step % 20 == 0:
            print step, s.run(w), s.run(b)  # prints the step number and the current values of w and b
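
The listing above targets TensorFlow 1.x under Python 2.7 (hence the print statements, xrange, and tf.Session). On TensorFlow 2.x, where Session and tf.train.GradientDescentOptimizer no longer exist, a rough sketch of the same model (an illustrative port, not from the original article) would look like this:

import numpy as np
import tensorflow as tf  # assumes TensorFlow 2.x

x_data = np.float32(np.random.rand(2, 100))
y_data = np.dot([0.100, 0.200], x_data) + 0.300

w = tf.Variable(tf.random.uniform([1, 2], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
optimizer = tf.keras.optimizers.SGD(learning_rate=0.5)  # gradient descent, learning rate 0.5

for step in range(201):
    with tf.GradientTape() as tape:          # records ops for automatic differentiation
        y = tf.matmul(w, x_data) + b
        loss = tf.reduce_mean(tf.square(y - y_data))
    grads = tape.gradient(loss, [w, b])      # dL/dw and dL/db
    optimizer.apply_gradients(zip(grads, [w, b]))
    if step % 20 == 0:
        print(step, w.numpy(), b.numpy())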

Run output

/usr/bin/python2.7 /home/zyl-ai/workspace/helloPy/tensorflowProf4.py
[[ 0.69691771  0.90084547  0.27959362  0.34221876  0.19552153  0.4307428
   0.38117537  0.51119637  0.73647881  0.97722107  0.56137401  0.99700052
   0.28862211  0.20491832  0.22333002  0.73179799  0.18757772  0.02784749
   0.82264501  0.94203383  0.96716905  0.88041455  0.96700621  0.24717875
   0.57799053  0.52826047  0.50474381  0.10885102  0.13200606  0.77814502
   0.98665661  0.38291314  0.58590317  0.72920609  0.82347041  0.78729755
   0.41934684  0.72114462  0.19211003  0.22703528  0.89434481  0.08553855
   0.24389517  0.51333666  0.28480646  0.33580768  0.7284947   0.71873719
   0.27972257  0.15386614  0.73494393  0.24989586  0.4692221   0.7170316
   0.7909314   0.20893854  0.99824864  0.84300399  0.13301195  0.4855516
   0.20430492  0.86111021  0.24074088  0.24305044  0.78865021  0.98579538
   0.35449961  0.38192883  0.61099184  0.17600653  0.26319867  0.49845859
   0.31929463  0.71052152  0.50170356  0.78615308  0.46412373  0.47416937
   0.07297904  0.48616529  0.1963145   0.23710462  0.59903282  0.4258056
   0.89570701  0.3185131   0.62712681  0.34775648  0.20015973  0.16079451
   0.52140051  0.73232579  0.19221835  0.94783974  0.59515399  0.65783632
   0.86610734  0.42973098  0.32846236  0.21681556]
 [ 0.68703651  0.50859827  0.83003438  0.48162606  0.9932605   0.95521986
   0.15459256  0.73663855  0.76626265  0.8708542   0.48917714  0.41975257
   0.80114317  0.54009885  0.76583833  0.93792844  0.21421161  0.58028859
   0.55490202  0.70533133  0.19701815  0.44978765  0.28408217  0.15471935
   0.10218457  0.84436542  0.59460068  0.35081577  0.36495432  0.64114326
   0.1205805   0.08691627  0.51914626  0.4699555   0.38629299  0.5997259
   0.18897435  0.71435636  0.88409257  0.39914566  0.61984813  0.81699961
   0.73582977  0.40669578  0.98568249  0.34302354  0.93530601  0.5670042
   0.07325921  0.80608678  0.91499317  0.44408143  0.87602544  0.460879
   0.53837919  0.32247829  0.55442405  0.4626714   0.6699028   0.23995908
   0.59526259  0.00169737  0.8921811   0.6765843   0.54639554  0.10472193
   0.63367522  0.37715322  0.5413183   0.96324778  0.98257691  0.02974691
   0.65997386  0.33004865  0.01015346  0.93281806  0.25911108  0.70406365
   0.36173737  0.28565016  0.78313518  0.97908878  0.47217387  0.42415264
   0.48899943  0.65197045  0.72566658  0.25601232  0.14285764  0.37608632
   0.39126918  0.43389732  0.03204375  0.27617961  0.299777    0.26443332
   0.08594334  0.41918558  0.78253973  0.13964291]]
==========================================
[ 0.50709907  0.4918042   0.49396624  0.43054709  0.51820425  0.53411825
  0.36903605  0.49844735  0.52690041  0.57189295  0.45397283  0.48365057
  0.48909084  0.4285116   0.47550067  0.56076549  0.3616001   0.41884247
  0.4932449   0.53526965  0.43612053  0.47799898  0.45351706  0.35566175
  0.37823597  0.52169913  0.46939452  0.38104826  0.38619147  0.50604315
  0.42278176  0.35567457  0.46241957  0.46691171  0.45960564  0.49867494
  0.37972955  0.51498573  0.49602952  0.40253266  0.51340411  0.47195378
  0.47155547  0.43267282  0.52561714  0.40218548  0.55991067  0.48527456
  0.3426241   0.47660397  0.55649303  0.41380587  0.5221273   0.46387896
  0.48676898  0.38538951  0.51070967  0.47683468  0.44728176  0.39654697
  0.43948301  0.38645049  0.50251031  0.4596219   0.48814413  0.41952392
  0.462185    0.41362353  0.46936284  0.51025021  0.52283525  0.35579524
  0.46392424  0.43706188  0.35220105  0.56517892  0.39823459  0.48822967
  0.37964538  0.40574656  0.47625849  0.51952822  0.45433806  0.42741109
  0.48737059  0.4622454   0.507846    0.38597811  0.3485875   0.39129672
  0.43039389  0.46001204  0.32563059  0.4500199   0.4194708   0.4186703
  0.4037994   0.42681021  0.48935418  0.34961014]
2017-08-28 09:39:06.907407: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
0 [[-0.27469051  0.69584221]] [ 0.52409148]
20 [[ 0.01080259  0.26104224]] [ 0.31455979]
40 [[ 0.08151589  0.20737803]] [ 0.30584651]
60 [[ 0.09552376  0.19993299]] [ 0.30241087]
80 [[ 0.09868889  0.19942255]] [ 0.30100557]
100 [[ 0.09954762  0.19966199]] [ 0.30042145]
120 [[ 0.09982692  0.19984138]] [ 0.30017698]
140 [[ 0.09993018  0.19993044]] [ 0.30007437]
160 [[ 0.09997115  0.19997025]] [ 0.30003127]
180 [[ 0.09998797  0.19998741]] [ 0.30001312]
200 [[ 0.09999496  0.19999468]] [ 0.30000553]

Process finished with exit code 0

Summary

The final fitted result, w = (0.09999496, 0.19999468) and b = 0.30000553, closely matches the true weight vector (0.100, 0.200) and intercept 0.300 used to generate the data: exactly the w and b we set out to recover.
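
To make the update that optimizer.minimize performs at each step concrete, here is a minimal NumPy re-implementation of the same gradient-descent loop, using hand-derived gradients of the MSE loss (an illustrative sketch, not part of the original example):

import numpy as np

x = np.float32(np.random.rand(2, 100))
y = np.dot([0.100, 0.200], x) + 0.300

w = np.random.uniform(-1.0, 1.0, (1, 2))  # same init as the TF version
b = np.zeros(1)
lr = 0.5                                  # learning rate

for step in range(201):
    pred = w.dot(x) + b                       # forward pass: 1*100 predictions
    err = pred - y                            # residuals
    grad_w = 2.0 * err.dot(x.T) / x.shape[1]  # dL/dw for L = mean(err**2)
    grad_b = 2.0 * err.mean()                 # dL/db
    w -= lr * grad_w                          # gradient-descent update
    b -= lr * grad_b
    if step % 20 == 0:
        print(step, w, b)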

Reference:
TensorFlow official documentation
