Softmax regression and its implementation (TensorFlow)

In the earlier post "Logistic Regression", we briefly mentioned softmax regression. This post first introduces the basic principle of softmax regression, then compares softmax regression with logistic regression and explains how the two are related, and finally implements the algorithm with the open-source TensorFlow library and applies it to handwritten digit (MNIST) recognition.

The principle of softmax
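A brief sketch of the idea: for a K-class problem, softmax regression computes one score (logit) per class from the input x, using a weight matrix W and a bias vector b (the same symbols that appear in the code below), and then normalizes the scores into a probability distribution with the softmax function:

\[
P(y = i \mid x) = \mathrm{softmax}(Wx + b)_i = \frac{e^{w_i^{\top} x + b_i}}{\sum_{j=1}^{K} e^{w_j^{\top} x + b_j}}, \qquad i = 1, \dots, K
\]

where w_i is the weight vector for class i. The class with the largest probability is taken as the prediction, and the parameters are learned by minimizing the cross-entropy between the predicted distribution and the one-hot label.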

Softmax versus logistic regression
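In short, logistic regression is the two-class special case of softmax regression. With K = 2, the shared factor in the two exponentials cancels and the softmax probability collapses to the sigmoid form:

\[
P(y = 1 \mid x) = \frac{e^{w_1^{\top} x + b_1}}{e^{w_0^{\top} x + b_0} + e^{w_1^{\top} x + b_1}} = \frac{1}{1 + e^{-\left((w_1 - w_0)^{\top} x + (b_1 - b_0)\right)}} = \sigma(w^{\top} x + b)
\]

with w = w_1 - w_0 and b = b_1 - b_0. Conversely, softmax regression can be viewed as the generalization of logistic regression to multiple mutually exclusive classes, which is exactly the MNIST setting with its ten digit classes.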

Implementing softmax regression in TensorFlow to recognize handwritten digits
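The complete script below follows four steps: load the MNIST data, define the forward model y = softmax(xW + b), define the cross-entropy loss and train it with mini-batch gradient descent, and finally evaluate accuracy on the test set. Note that it uses the TensorFlow 1.x API (tf.placeholder, tf.InteractiveSession, and the tensorflow.examples.tutorials input pipeline), so it will not run unmodified on TensorFlow 2.x.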

#!/usr/bin/env python
# @Time    : 3/28/17 11:14 PM
# @Author  : SunXiangguo
# @version : Anaconda3.6+Ubuntu_16.04_STL_64
# @File    : 22.py
# @Software: PyCharm
"""A very simple MNIST classifier.
"""
from tensorflow.examples.tutorials.mnist import input_data
import tensorflow as tf

mnist = input_data.read_data_sets("MNIST_data", one_hot=True)
print(mnist.train.images.shape, mnist.train.labels.shape)  # expect (55000, 784) and (55000, 10)

sess = tf.InteractiveSession()

# step 1: define the forward computation (the model)
x = tf.placeholder(tf.float32, [None, 784])  # flattened 28x28 input images
W = tf.Variable(tf.zeros([784, 10]))         # weights, initialized to zero
b = tf.Variable(tf.zeros([10]))              # biases, one per class
y = tf.nn.softmax(tf.matmul(x, W) + b)       # predicted class probabilities

# step 2: define the loss function (cross-entropy)
y_ = tf.placeholder(tf.float32, [None, 10])  # one-hot ground-truth labels
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))

# step 3: train the model iteratively with mini-batch gradient descent
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)
tf.global_variables_initializer().run()
for i in range(1000):
    batch_xs, batch_ys = mnist.train.next_batch(100)  # mini-batch of 100 samples
    train_step.run({x: batch_xs, y_: batch_ys})

# step 4: evaluate the trained model on the test data
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))    # compare predicted and true classes
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))  # fraction of correct predictions
print(accuracy.eval({x: mnist.test.images, y_: mnist.test.labels}))

The resulting accuracy on the test set is about 92%.
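As a side note: computing -sum(y_ * log(y)) on top of an explicit softmax, as above, can become numerically unstable when some predicted probabilities underflow toward zero. A common, more stable variant is to let TensorFlow fuse the softmax and the cross-entropy and work directly on the raw logits. A minimal sketch, assuming the same x, W, b, and y_ defined above:

# sketch: numerically more stable loss on the raw logits
logits = tf.matmul(x, W) + b
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))

Predictions can still be taken as tf.argmax(logits, 1), since softmax is monotonic and does not change the argmax.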
