This chapter applies what we learned in the previous two chapters to implement linear regression and logistic regression.
Linear Regression in TensorFlow
(Due to formatting issues, viewing this post in a desktop browser is recommended.)
Solution
Here is the full code:
"""
Simple linear regression example in TensorFlow
This program tries to predict the number of thefts from
the number of fires in the city of Chicago
"""
from __future__ import print_function

import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
import xlrd
DATA_FILE = '../data/fire_theft.xls'
# Step 1: read in data from the .xls file
book = xlrd.open_workbook(DATA_FILE, encoding_override='utf-8')
sheet = book.sheet_by_index(0)
data = np.asarray([sheet.row_values(i) for i in range(1, sheet.nrows)])
n_samples = sheet.nrows - 1
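The rows read from the spreadsheet form a two-column array, one (fires, thefts) pair per row. As a sanity check, the column slicing this layout implies can be tried on a small stand-in array (the values below are made up for illustration, not taken from the dataset):

```python
import numpy as np

# Hypothetical stand-in for the (fires, thefts) rows read from the .xls file
data = np.asarray([[6.2, 29.0],
                   [9.5, 44.0],
                   [10.5, 36.0]])

fires = data[:, 0]   # input feature X, one value per sample
thefts = data[:, 1]  # label Y, one value per sample

print(fires.shape, thefts.shape)  # each is a 1-D array of length n_samples
```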
# Phase 1: Assemble the graph
# Step 2: create placeholders for input X (number of fires) and label Y (number of thefts)
X = tf.placeholder(tf.float32, name='X')
Y = tf.placeholder(tf.float32, name='Y')
# Step 3: create weight and bias, initialized to 0
# name your variables w and b
w = tf.Variable(0.0, name='weight')
b = tf.Variable(0.0, name='bias')
# Step 4: predict Y (number of thefts) from the number of fires
# name your variable Y_predicted
Y_predicted = w * X + b
# Step 5: use the squared error as the loss function
# name your variable loss
loss = tf.square(Y_predicted - Y)
# Step 6: use gradient descent with a learning rate of 0.001 to minimize loss
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001).minimize(loss)
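For the quadratic loss (Y_predicted - Y)^2, each optimizer step moves w and b against the gradient, scaled by the learning rate. A minimal NumPy sketch of one such update (the sample values here are made up for illustration):

```python
# One hand-computed gradient descent step for loss = (w*x + b - y)^2
lr = 0.001
w, b = 0.0, 0.0
x, y = 10.0, 30.0  # one illustrative (fires, thefts) sample

# gradients of the squared error follow from the chain rule
y_pred = w * x + b
grad_w = 2 * (y_pred - y) * x
grad_b = 2 * (y_pred - y)

w -= lr * grad_w
b -= lr * grad_b

new_loss = (w * x + b - y) ** 2
old_loss = (0.0 * x + 0.0 - y) ** 2
print(new_loss < old_loss)  # prints True: the step reduces the loss
```

This is exactly the update `GradientDescentOptimizer(...).minimize(loss)` applies to every trainable variable on each `sess.run` call.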
# Phase 2: Train our model
with tf.Session() as sess:
    # Step 7: initialize the necessary variables, in this case, w and b
    sess.run(tf.global_variables_initializer())
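The remaining training phase feeds each (x, y) pair into the graph for a number of epochs. That loop can be sketched in plain NumPy, mirroring what the TensorFlow session would do per sample; the synthetic data and epoch count here are assumptions for illustration:

```python
import numpy as np

# Synthetic "fires vs. thefts" data with known slope 1.5 and intercept 10
rng = np.random.default_rng(0)
x_data = rng.uniform(0, 15, size=100)
y_data = 1.5 * x_data + 10 + rng.normal(0, 1, size=100)

w, b = 0.0, 0.0
lr = 0.001
for epoch in range(100):              # mirrors "for i in range(num_epochs)"
    for x, y in zip(x_data, y_data):  # mirrors one sess.run per (x, y) pair
        err = w * x + b - y
        w -= lr * 2 * err * x
        b -= lr * 2 * err

print(w, b)  # w should land near 1.5 and b near 10
```

With a constant learning rate, per-sample SGD like this wanders slightly around the least-squares optimum rather than settling exactly on it, which is why the fitted values only approximate the true slope and intercept.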