In the previous post we looked at machine learning for simple addition; this time let's see whether machine learning can handle a sine curve :)
That is, given the blue points (160 of them), can we predict the positions of the red points (40 of them)?
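The 160/40 split above comes from sampling 5 full sine cycles of 40 points each and holding out the last 20% of cycles for testing. A minimal NumPy sketch of that setup (my own hypothetical reconstruction of the `GeneData.GenData_sin` helper, which isn't shown in the post):

```python
import numpy as np

def GenData_sin(points_per_cycle, cycles, train_ratio):
    """Sample `cycles` full sine cycles; the first `train_ratio`
    fraction of the points goes to training, the rest to testing.
    Hypothetical stand-in for GeneData.GenData_sin."""
    x = np.linspace(0, 2 * np.pi * cycles,
                    points_per_cycle * cycles, endpoint=False)
    y = np.sin(x)
    split = int(points_per_cycle * cycles * train_ratio)
    # Column vectors, matching the [None, 1] placeholders used below
    to_col = lambda a: a.reshape(-1, 1).astype(np.float32)
    return (to_col(x[:split]), to_col(y[:split]),
            to_col(x[split:]), to_col(y[split:]))

x_train, y_train, x_test, y_test = GenData_sin(40, 5, 0.8)
print(x_train.shape, x_test.shape)  # (160, 1) (40, 1)
```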
Let's first try the most familiar single-layer network:
The formula it learns is a straight line, nowhere near a sine curve. The loss does keep decreasing, bottoming out at around 1, which looks decent at first glance, but take a look at how our loss function is actually defined:
import tensorflow as tf
import GeneData  # the author's data-generation helper

learning_rate = 0.01  # assumed value; not shown in the original

# Every sine cycle has 40 points; 5*0.8 cycles for training, 5*0.2 for testing
x_train, y_train, x_test, y_test = GeneData.GenData_sin(40, 5, 0.8)

X = tf.placeholder(tf.float32, [None, 1], name="X")
Y = tf.placeholder(tf.float32, [None, 1], name="Y")

with tf.name_scope('output') as scope:
    W = tf.Variable(tf.random_normal([1, 1]), name="weights")
    b = tf.Variable(tf.random_normal([1]), name="bias")
    model = tf.matmul(X, W) + b

# Mean squared error between prediction and target
loss = tf.reduce_mean(tf.pow(model - Y, 2), name="loss")
train = tf.train.AdamOptimizer(learning_rate).minimize(loss)
It's the mean of the squared differences. Considering that all our y values satisfy -1 < y < 1, the squared errors should be fairly small, so a loss of 1 is actually quite large!
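To put that loss value in perspective, here's a quick standalone NumPy check (my own sketch, not from the original post): because the targets lie in (-1, 1), even the trivial model that always predicts 0 gets an MSE of about 0.5, and the least-squares straight line barely improves on that, since the sine averages out over whole cycles. So a line fit stalls around 0.5, and a loss near 1 is genuinely bad.

```python
import numpy as np

# 4 full sine cycles of 40 points each (the training portion)
x = np.linspace(0, 2 * np.pi * 4, 160, endpoint=False)
y = np.sin(x)

# Always predicting 0 already achieves an MSE of about 0.5
mse_zero = np.mean((0 - y) ** 2)

# The best-fit straight line (least squares) does barely better:
# over whole cycles the sine averages out, so the line is nearly flat
w, b = np.polyfit(x, y, 1)
mse_line = np.mean((w * x + b - y) ** 2)

print(mse_zero, mse_line)  # both close to 0.5
```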
OK then, let's try a multilayer network instead:
n_hidden_1 = 512
n_hidden_2 = 512
n_hidden_3 = 512
n_hidden_4 = 512
n_hidden_5 = 512
n_hidden_6 = 512
n_input = 1
n_output = 1
def multilayer_perceptron(x, weights, biases):
    # Hidden layer with RELU activation
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'])