[TensorFlow] TensorFlow Beginner Tutorial (Part 2) (A Real Introduction! Super Simple!)

Today I started week 2 of deeplearning.ai's Introduction to TensorFlow for Artificial Intelligence, Machine Learning, and Deep Learning. Without further ado, here are my notes for the week 2 material.

The goal for week 2 is to build a neural network that can classify Fashion MNIST images or handwritten digits 0~9. Fashion MNIST is a dataset with 10 label classes and 70,000 images in total: 60,000 training images and 10,000 test images, each 28*28 pixels. The exercise also requires stopping training via a callback once accuracy reaches a given threshold. The code is below.
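As a quick sanity check (my own aside, not part of the graded notebook), Fashion MNIST can be loaded through tf.keras.datasets.fashion_mnist to confirm the split sizes and image dimensions quoted above:

import tensorflow as tf

# Load Fashion MNIST and inspect the shapes of the train/test splits.
fashion_mnist = tf.keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

print(train_images.shape)      # (60000, 28, 28) -- 60,000 training images, 28*28 pixels each
print(test_images.shape)       # (10000, 28, 28) -- 10,000 test images
print(len(set(train_labels)))  # 10 -- ten clothing classes, labelled 0..9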

In the course you learned how to do classification using Fashion MNIST, a dataset containing items of clothing. There’s another, similar dataset called MNIST which has items of handwriting – the digits 0 through 9.

Write an MNIST classifier that trains to 99% accuracy or above, and does it without a fixed number of epochs – i.e. you should stop training once you reach that level of accuracy.

Some notes:

1. It should succeed in less than 10 epochs, so it is okay to change epochs= to 10, but nothing larger.
2. When it reaches 99% or greater it should print out the string “Reached 99% accuracy so cancelling training!”
3. If you add any additional variables, make sure you use the same names as the ones used in the class.
4. I’ve started the code for you below – how would you finish it?

import tensorflow as tf
from os import path, getcwd, chdir

# DO NOT CHANGE THE LINE BELOW. If you are developing in a local
# environment, then grab mnist.npz from the Coursera Jupyter Notebook
# and place it inside a local folder and edit the path to that location
path = f"{getcwd()}/../tmp2/mnist.npz"
# GRADED FUNCTION: train_mnist
def train_mnist():
    # Please write your code only where you are indicated.
    # please do not remove # model fitting inline comments.

    # YOUR CODE SHOULD START HERE
    class myCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
            # Called at the end of every epoch (one full pass over the 60,000
            # training images). Check the training accuracy reported in `logs`;
            # once it reaches 99% we are happy with the result and stop training.
            # Note: in the Coursera (TF 1.x) environment the metric key is 'acc'.
            if logs.get('acc') > 0.99:
                print("\nReached 99% accuracy so cancelling training!")
                self.model.stop_training = True
    callbacks = myCallback()
    # YOUR CODE SHOULD END HERE

    mnist = tf.keras.datasets.mnist

    (x_train, y_train),(x_test, y_test) = mnist.load_data(path=path)
    # YOUR CODE SHOULD START HERE
    # Normalise pixel values from the 0-255 range to 0-1
    x_train = x_train / 255.0
    x_test = x_test / 255.0
    # YOUR CODE SHOULD END HERE
    model = tf.keras.models.Sequential([
        # YOUR CODE SHOULD START HERE
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        # Flatten the 28*28 input image into a 1-D vector of 784 values
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        # Hidden layer with 512 neurons, randomly initialised weights, ReLU activation
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)
        # Output layer with 10 units, one per class; softmax turns the raw scores
        # into a probability distribution over the 10 classes, and the prediction
        # is the class with the highest probability
        # YOUR CODE SHOULD END HERE
    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    
    # model fitting
    history = model.fit(# YOUR CODE SHOULD START HERE
        x_train,y_train,epochs=10,callbacks=[callbacks]
              # YOUR CODE SHOULD END HERE
    )
    # model fitting
    return history.epoch, history.history['acc'][-1]
    
train_mnist()
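Since the comments above only describe softmax informally, here is a tiny illustration (my own aside, assuming TF 2.x eager mode) of what the output layer actually produces: a probability distribution that sums to 1, rather than a hard 0/1 vector.

import tensorflow as tf

# Raw scores (logits) for three hypothetical classes
logits = tf.constant([[2.0, 1.0, 0.1]])
probs = tf.nn.softmax(logits)
print(probs.numpy())        # approx [[0.659 0.242 0.099]] -- largest score gets the largest probability
print(probs.numpy().sum())  # approx 1.0 -- the probabilities sum to one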

The full listing above is the Jupyter (Coursera notebook) version. To run it locally instead, use the following code:

import tensorflow as tf
from os import path, getcwd, chdir

# DO NOT CHANGE THE LINE BELOW. If you are developing in a local
# environment, then grab mnist.npz from the Coursera Jupyter Notebook
# and place it inside a local folder and edit the path to that location
path = f"{getcwd()}/../tmp2/mnist.npz"


# GRADED FUNCTION: train_mnist
def train_mnist():
    # Please write your code only where you are indicated.
    # please do not remove # model fitting inline comments.

    # YOUR CODE SHOULD START HERE
    class myCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
            # Called at the end of every epoch (one full pass over the 60,000
            # training images). Check the training accuracy reported in `logs`;
            # once it reaches 99% we stop training.
            # Note: in TF 2.x the metric key is 'accuracy', not 'acc'.
            if logs.get('accuracy') > 0.99:
                print("\nReached 99% accuracy so cancelling training!")
                self.model.stop_training = True

    callbacks = myCallback()
    # YOUR CODE SHOULD END HERE

    mnist = tf.keras.datasets.mnist

    # Locally, load_data() downloads MNIST automatically, so the path defined
    # above is not strictly needed here.
    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    # YOUR CODE SHOULD START HERE
    x_train = x_train / 255.0
    x_test = x_test / 255.0
    # YOUR CODE SHOULD END HERE
    model = tf.keras.models.Sequential([
        # YOUR CODE SHOULD START HERE
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        # Flatten the 28*28 input image into a 1-D vector of 784 values
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        # Hidden layer with 512 neurons, randomly initialised weights, ReLU activation
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)
        # Output layer with 10 units, one per class; softmax turns the raw scores
        # into a probability distribution over the 10 classes, and the prediction
        # is the class with the highest probability
        # YOUR CODE SHOULD END HERE
    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # model fitting
    history = model.fit(  # YOUR CODE SHOULD START HERE
        x_train, y_train, epochs=10, callbacks=[callbacks]
        # YOUR CODE SHOULD END HERE
    )
    # model fitting
    return history.epoch, history.history['accuracy'][-1]


train_mnist()
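Aside from how the data is loaded, the only real difference between the two versions is the metric key in the callback: older Keras/TF 1.x (the Coursera notebook runtime) logs training accuracy under 'acc', while TF 2.x logs it under 'accuracy'. If you are unsure which runtime you are on, a defensive lookup (my own tweak, not part of the graded notebook) works in both:

import tensorflow as tf

class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # Try the TF 2.x key first, then fall back to the TF 1.x key
        acc = logs.get('accuracy', logs.get('acc'))
        if acc is not None and acc > 0.99:
            print("\nReached 99% accuracy so cancelling training!")
            self.model.stop_training = True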

To evaluate the trained model on the test set, use the following function:

model.evaluate(test_images, test_labels)
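Note that test_images and test_labels are the names used in the course's Fashion MNIST notebook; in the code above the held-out split is x_test / y_test, and both model and that split live inside train_mnist(). A minimal sketch (my own addition, placed after model.fit inside the function) would be:

    # Inside train_mnist(), after the call to model.fit(...):
    # evaluate() returns the loss followed by each compiled metric,
    # here [loss, accuracy] on the 10,000 held-out test images.
    test_loss, test_acc = model.evaluate(x_test, y_test)
    print("Test accuracy:", test_acc)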

That is about it. The final output should look like the screenshot below; you can see that training stops as soon as the model reaches 99% accuracy.
[Screenshot: training log ending with "Reached 99% accuracy so cancelling training!"]
Some reference materials:
Course 1 - Part 4 - Lesson 2 - Notebook.ipynb
Course 1 - Part 4 - Lesson 4 - Notebook.ipynb
Exercise2-Answer.ipynb
