tensorflow4-5

Table of Contents

The standard recipe for building a neural network

Steps
Reproducing the Iris example in code
Training on the MNIST data
Training on the Fashion-MNIST data

Building the structure with the six-step method

# Building a neural network with the TensorFlow API tf.keras: the standard recipe
# The six-step method:
# import
# train, test                        (prepare training and test data)
# model = tf.keras.models.Sequential (build the structure)
# model.compile                      (configure the training method)
# model.fit                          (run the training loop)
# model.summary                      (print the architecture and parameter counts)


model = tf.keras.models.Sequential([network layers])
Layer examples:
Flatten layer: tf.keras.layers.Flatten() flattens the input features into a one-dimensional array; it involves no computation or trainable parameters
Dense (fully connected) layer: tf.keras.layers.Dense(number of neurons, activation='activation function',
                          kernel_regularizer=which regularizer)
    activation options: relu, softmax, sigmoid, tanh
    regularizer options: tf.keras.regularizers.l1(), tf.keras.regularizers.l2()
Convolutional layer: tf.keras.layers.Conv2D(filters=number of kernels, kernel_size=kernel size,
                          strides=stride, padding='valid' or 'same')
LSTM layer: tf.keras.layers.LSTM()
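To make the Flatten and Dense descriptions concrete, here is a small numpy sketch of what those two layers compute (the 28x28 input shape and the random weights are made up for illustration):

```python
import numpy as np

# Flatten: reshape a batch of 28x28 "images" to (batch, 784); no parameters.
x = np.random.rand(2, 28, 28)
flat = x.reshape(x.shape[0], -1)          # shape (2, 784)

# Dense(3, activation='softmax') computes y = softmax(x @ W + b)
rng = np.random.default_rng(0)
W = rng.normal(size=(784, 3))
b = np.zeros(3)
logits = flat @ W + b
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)
# each row of probs is now a probability distribution over the 3 classes
```
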


model.compile(optimizer=optimizer,
             loss=loss function,
             metrics=['accuracy metric'])   # configure the training method
optimizer options:
    'sgd' or tf.keras.optimizers.SGD(learning_rate=learning rate, momentum=momentum)
    'adagrad' or tf.keras.optimizers.Adagrad(learning_rate=learning rate)
    'adadelta' or tf.keras.optimizers.Adadelta(learning_rate=learning rate)
    'adam' or tf.keras.optimizers.Adam(learning_rate=learning rate, beta_1=0.9, beta_2=0.999)
loss options:
    'mse' or tf.keras.losses.MeanSquaredError()
    'sparse_categorical_crossentropy' (sparse categorical crossentropy), or tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False); from_logits asks whether the model outputs raw logits, i.e. values that have not yet been turned into a probability distribution (e.g. by softmax)
metrics options:
    'accuracy': both y_ (label) and y (prediction) are plain numbers
    'categorical_accuracy': both y_ and y are one-hot/probability distributions, e.g. y_=[0,1,0], y=[0.26,0.5,0.24]
    'sparse_categorical_accuracy': y_ is an integer label and y is a probability distribution, e.g. y_=[1], y=[0.25,0.5,0.24]
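A quick hand computation in numpy (with made-up numbers) shows what sparse categorical crossentropy and sparse_categorical_accuracy mean when from_logits=False, i.e. when y is already a probability distribution:

```python
import numpy as np

# Sparse categorical crossentropy for one sample:
# y_true is an integer label, y_pred is a probability distribution.
y_true = 1
y_pred = np.array([0.25, 0.5, 0.25])
loss = -np.log(y_pred[y_true])            # -log(0.5), about 0.693

# sparse_categorical_accuracy: the prediction is counted as correct
# when the argmax of the distribution equals the integer label.
acc = float(np.argmax(y_pred) == y_true)  # 1.0
```
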


model.fit(training-set features, training-set labels,
         batch_size= ,epochs= ,
         validation_data=(test-set features, test-set labels),
         validation_split=fraction of the training set held out for validation,
         validation_freq=run validation every this many epochs)
(pass either validation_data or validation_split, not both)
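One subtlety worth remembering: Keras takes the validation_split samples from the end of the arrays you pass in, before any shuffling. A minimal numpy sketch (the 0.2 split is just an example):

```python
import numpy as np

x = np.arange(10)            # pretend these are 10 training samples
split = 0.2
n_val = int(len(x) * split)

# Keras holds out the LAST fraction of the data for validation.
x_train_part, x_val = x[:-n_val], x[-n_val:]
# x_val is the last 2 samples
```

This is why shuffling the data before calling fit (as the Iris example below does) matters: otherwise the validation set could consist of a single class.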

model.summary()

Building a fully connected network on the Iris dataset with Sequential and with a model class

# Iris dataset: Sequential implementation
import tensorflow as tf
from sklearn import datasets
import numpy as np

x_train = datasets.load_iris().data
y_train = datasets.load_iris().target

# Shuffle features and labels with the same seed so they stay aligned
np.random.seed(129)
np.random.shuffle(x_train)
np.random.seed(129)
np.random.shuffle(y_train)
tf.random.set_seed(129)

model = tf.keras.models.Sequential([
    # One Dense layer: 3 classes, softmax output, L2 weight regularization
    tf.keras.layers.Dense(3, activation='softmax', kernel_regularizer=tf.keras.regularizers.l2())
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),  # 'lr' is deprecated
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
              metrics=['sparse_categorical_accuracy'])
model.fit(x_train, y_train, batch_size=32, epochs=500, validation_split=0.2, validation_freq=20)
model.summary()



Epoch 1/500
4/4 [==============================] - 0s 1ms/step - loss: 2.3442 - sparse_categorical_accuracy: 0.2917
Epoch 2/500
4/4 [==============================] - 0s 500us/step - loss: 1.3050 - sparse_categorical_accuracy: 0.2500
Epoch 3/500
4/4 [==============================] - 0s 1000us/step - loss: 0.9678 - sparse_categorical_accuracy: 0.3833
...
Epoch 20/500
4/4 [==============================] - 0s 26ms/step - loss: 0.6441 - sparse_categorical_accuracy: 0.7417 - val_loss: 0.5166 - val_sparse_categorical_accuracy: 0.8667
...
Epoch 60/500
4/4 [==============================] - 0s 3ms/step - loss: 0.5454 - sparse_categorical_accuracy: 0.7333 - val_loss: 0.4056 - val_sparse_categorical_accuracy: 1.0000
...
Epoch 100/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3451 - sparse_categorical_accuracy: 0.9417 - val_loss: 0.3864 - val_sparse_categorical_accuracy: 1.0000
...
Epoch 180/500
4/4 [==============================] - 0s 4ms/step - loss: 0.5772 - sparse_categorical_accuracy: 0.7583 - val_loss: 0.3747 - val_sparse_categorical_accuracy: 1.0000
...
Epoch 240/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3610 - sparse_categorical_accuracy: 0.9167 - val_loss: 0.3924 - val_sparse_categorical_accuracy: 0.9667
...
Epoch 259/500
4/4 [==============================] - 0s 498us/step - loss: 0.3807 - sparse_categorical_accuracy: 0.8917
Epoch 260/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3664 - sparse_categorical_accuracy: 0.9250 - val_loss: 0.3590 - val_sparse_categorical_accuracy: 1.0000
Epoch 261/500
4/4 [==============================] - 0s 250us/step - loss: 0.4286 - sparse_categorical_accuracy: 0.8833
Epoch 262/500
4/4 [==============================] - 0s 250us/step - loss: 0.3434 - sparse_categorical_accuracy: 0.9750
Epoch 263/500
4/4 [==============================] - 0s 252us/step - loss: 0.3629 - sparse_categorical_accuracy: 0.9583
Epoch 264/500
4/4 [==============================] - 0s 502us/step - loss: 0.3674 - sparse_categorical_accuracy: 0.9333
Epoch 265/500
4/4 [==============================] - 0s 251us/step - loss: 0.3422 - sparse_categorical_accuracy: 0.9500
Epoch 266/500
4/4 [==============================] - 0s 501us/step - loss: 0.3386 - sparse_categorical_accuracy: 0.9750
Epoch 267/500
4/4 [==============================] - 0s 250us/step - loss: 0.3423 - sparse_categorical_accuracy: 0.9667
Epoch 268/500
4/4 [==============================] - 0s 500us/step - loss: 0.3441 - sparse_categorical_accuracy: 0.9667
Epoch 269/500
4/4 [==============================] - 0s 500us/step - loss: 0.3498 - sparse_categorical_accuracy: 0.9250
Epoch 270/500
4/4 [==============================] - 0s 500us/step - loss: 0.3522 - sparse_categorical_accuracy: 0.9250
Epoch 271/500
4/4 [==============================] - 0s 498us/step - loss: 0.3367 - sparse_categorical_accuracy: 0.9500
Epoch 272/500
4/4 [==============================] - 0s 498us/step - loss: 0.3723 - sparse_categorical_accuracy: 0.9250
Epoch 273/500
4/4 [==============================] - 0s 500us/step - loss: 0.3879 - sparse_categorical_accuracy: 0.8833
Epoch 274/500
4/4 [==============================] - 0s 502us/step - loss: 0.3529 - sparse_categorical_accuracy: 0.9417
Epoch 275/500
4/4 [==============================] - 0s 502us/step - loss: 0.3911 - sparse_categorical_accuracy: 0.8667
Epoch 276/500
4/4 [==============================] - 0s 250us/step - loss: 0.3666 - sparse_categorical_accuracy: 0.9333
Epoch 277/500
4/4 [==============================] - 0s 252us/step - loss: 0.3804 - sparse_categorical_accuracy: 0.9333
Epoch 278/500
4/4 [==============================] - 0s 500us/step - loss: 0.3839 - sparse_categorical_accuracy: 0.9083
Epoch 279/500
4/4 [==============================] - 0s 500us/step - loss: 0.4556 - sparse_categorical_accuracy: 0.8250
Epoch 280/500
4/4 [==============================] - 0s 3ms/step - loss: 0.4012 - sparse_categorical_accuracy: 0.9000 - val_loss: 0.3661 - val_sparse_categorical_accuracy: 1.0000
Epoch 281/500
4/4 [==============================] - 0s 250us/step - loss: 0.3525 - sparse_categorical_accuracy: 0.9500
Epoch 282/500
4/4 [==============================] - 0s 500us/step - loss: 0.3676 - sparse_categorical_accuracy: 0.9250
Epoch 283/500
4/4 [==============================] - 0s 250us/step - loss: 0.3495 - sparse_categorical_accuracy: 0.9417
Epoch 284/500
4/4 [==============================] - 0s 250us/step - loss: 0.3836 - sparse_categorical_accuracy: 0.8750
Epoch 285/500
4/4 [==============================] - 0s 502us/step - loss: 0.3488 - sparse_categorical_accuracy: 0.9333
Epoch 286/500
4/4 [==============================] - 0s 500us/step - loss: 0.3429 - sparse_categorical_accuracy: 0.9583
Epoch 287/500
4/4 [==============================] - 0s 250us/step - loss: 0.3531 - sparse_categorical_accuracy: 0.9583
Epoch 288/500
4/4 [==============================] - 0s 500us/step - loss: 0.3862 - sparse_categorical_accuracy: 0.9167
Epoch 289/500
4/4 [==============================] - 0s 499us/step - loss: 0.4095 - sparse_categorical_accuracy: 0.8750
Epoch 290/500
4/4 [==============================] - 0s 500us/step - loss: 0.3438 - sparse_categorical_accuracy: 0.9500
Epoch 291/500
4/4 [==============================] - 0s 500us/step - loss: 0.3622 - sparse_categorical_accuracy: 0.9333
Epoch 292/500
4/4 [==============================] - 0s 502us/step - loss: 0.4040 - sparse_categorical_accuracy: 0.9000
Epoch 293/500
4/4 [==============================] - 0s 502us/step - loss: 0.4098 - sparse_categorical_accuracy: 0.8667
Epoch 294/500
4/4 [==============================] - 0s 500us/step - loss: 0.3577 - sparse_categorical_accuracy: 0.9083
Epoch 295/500
4/4 [==============================] - 0s 500us/step - loss: 0.3468 - sparse_categorical_accuracy: 0.9667
Epoch 296/500
4/4 [==============================] - 0s 499us/step - loss: 0.3377 - sparse_categorical_accuracy: 0.9667
Epoch 297/500
4/4 [==============================] - 0s 500us/step - loss: 0.4084 - sparse_categorical_accuracy: 0.8917
Epoch 298/500
4/4 [==============================] - 0s 500us/step - loss: 0.3620 - sparse_categorical_accuracy: 0.9417
Epoch 299/500
4/4 [==============================] - 0s 501us/step - loss: 0.3384 - sparse_categorical_accuracy: 0.9750
Epoch 300/500
4/4 [==============================] - 0s 2ms/step - loss: 0.3343 - sparse_categorical_accuracy: 0.9583 - val_loss: 0.3573 - val_sparse_categorical_accuracy: 1.0000
Epoch 301/500
4/4 [==============================] - 0s 502us/step - loss: 0.3400 - sparse_categorical_accuracy: 0.9750
Epoch 302/500
4/4 [==============================] - 0s 500us/step - loss: 0.3366 - sparse_categorical_accuracy: 0.9667
Epoch 303/500
4/4 [==============================] - 0s 502us/step - loss: 0.3518 - sparse_categorical_accuracy: 0.9500
Epoch 304/500
4/4 [==============================] - 0s 500us/step - loss: 0.3628 - sparse_categorical_accuracy: 0.9333
Epoch 305/500
4/4 [==============================] - 0s 502us/step - loss: 0.3383 - sparse_categorical_accuracy: 0.9500
Epoch 306/500
4/4 [==============================] - 0s 748us/step - loss: 0.3579 - sparse_categorical_accuracy: 0.9417
Epoch 307/500
4/4 [==============================] - 0s 498us/step - loss: 0.3796 - sparse_categorical_accuracy: 0.8833
Epoch 308/500
4/4 [==============================] - 0s 502us/step - loss: 0.3435 - sparse_categorical_accuracy: 0.9583
Epoch 309/500
4/4 [==============================] - 0s 498us/step - loss: 0.3754 - sparse_categorical_accuracy: 0.9167
Epoch 310/500
4/4 [==============================] - 0s 250us/step - loss: 0.4025 - sparse_categorical_accuracy: 0.8583
Epoch 311/500
4/4 [==============================] - 0s 499us/step - loss: 0.3522 - sparse_categorical_accuracy: 0.9500
Epoch 312/500
4/4 [==============================] - 0s 500us/step - loss: 0.3894 - sparse_categorical_accuracy: 0.9083
Epoch 313/500
4/4 [==============================] - 0s 250us/step - loss: 0.3459 - sparse_categorical_accuracy: 0.9250
Epoch 314/500
4/4 [==============================] - 0s 498us/step - loss: 0.3451 - sparse_categorical_accuracy: 0.9667
Epoch 315/500
4/4 [==============================] - 0s 498us/step - loss: 0.4474 - sparse_categorical_accuracy: 0.8083
Epoch 316/500
4/4 [==============================] - 0s 497us/step - loss: 0.3475 - sparse_categorical_accuracy: 0.9333
Epoch 317/500
4/4 [==============================] - 0s 500us/step - loss: 0.3681 - sparse_categorical_accuracy: 0.9333
Epoch 318/500
4/4 [==============================] - 0s 498us/step - loss: 0.3749 - sparse_categorical_accuracy: 0.9167
Epoch 319/500
4/4 [==============================] - 0s 501us/step - loss: 0.3411 - sparse_categorical_accuracy: 0.9583
Epoch 320/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3436 - sparse_categorical_accuracy: 0.9667 - val_loss: 0.3617 - val_sparse_categorical_accuracy: 1.0000
Epoch 321/500
4/4 [==============================] - 0s 497us/step - loss: 0.4260 - sparse_categorical_accuracy: 0.8333
Epoch 322/500
4/4 [==============================] - 0s 500us/step - loss: 0.3352 - sparse_categorical_accuracy: 0.9750
Epoch 323/500
4/4 [==============================] - 0s 500us/step - loss: 0.3657 - sparse_categorical_accuracy: 0.9000
Epoch 324/500
4/4 [==============================] - 0s 500us/step - loss: 0.3436 - sparse_categorical_accuracy: 0.9583
Epoch 325/500
4/4 [==============================] - 0s 250us/step - loss: 0.5385 - sparse_categorical_accuracy: 0.7417
Epoch 326/500
4/4 [==============================] - 0s 501us/step - loss: 0.3473 - sparse_categorical_accuracy: 0.9500
Epoch 327/500
4/4 [==============================] - 0s 500us/step - loss: 0.3489 - sparse_categorical_accuracy: 0.9500
Epoch 328/500
4/4 [==============================] - 0s 250us/step - loss: 0.3708 - sparse_categorical_accuracy: 0.9083
Epoch 329/500
4/4 [==============================] - 0s 250us/step - loss: 0.3421 - sparse_categorical_accuracy: 0.9417
Epoch 330/500
4/4 [==============================] - 0s 251us/step - loss: 0.3318 - sparse_categorical_accuracy: 0.9667
Epoch 331/500
4/4 [==============================] - 0s 500us/step - loss: 0.3430 - sparse_categorical_accuracy: 0.9583
Epoch 332/500
4/4 [==============================] - 0s 500us/step - loss: 0.3723 - sparse_categorical_accuracy: 0.9000
Epoch 333/500
4/4 [==============================] - 0s 250us/step - loss: 0.3759 - sparse_categorical_accuracy: 0.9167
Epoch 334/500
4/4 [==============================] - 0s 500us/step - loss: 0.3930 - sparse_categorical_accuracy: 0.9083
Epoch 335/500
4/4 [==============================] - 0s 499us/step - loss: 0.3612 - sparse_categorical_accuracy: 0.9250
Epoch 336/500
4/4 [==============================] - 0s 500us/step - loss: 0.3903 - sparse_categorical_accuracy: 0.8917
Epoch 337/500
4/4 [==============================] - 0s 250us/step - loss: 0.4307 - sparse_categorical_accuracy: 0.8500
Epoch 338/500
4/4 [==============================] - 0s 500us/step - loss: 0.3478 - sparse_categorical_accuracy: 0.9667
Epoch 339/500
4/4 [==============================] - 0s 498us/step - loss: 0.3607 - sparse_categorical_accuracy: 0.9250
Epoch 340/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3713 - sparse_categorical_accuracy: 0.9083 - val_loss: 0.5187 - val_sparse_categorical_accuracy: 0.7333
Epoch 341/500
4/4 [==============================] - 0s 500us/step - loss: 0.4559 - sparse_categorical_accuracy: 0.8417
Epoch 342/500
4/4 [==============================] - 0s 498us/step - loss: 0.3410 - sparse_categorical_accuracy: 0.9417
Epoch 343/500
4/4 [==============================] - 0s 499us/step - loss: 0.3335 - sparse_categorical_accuracy: 0.9833
Epoch 344/500
4/4 [==============================] - 0s 500us/step - loss: 0.3440 - sparse_categorical_accuracy: 0.9667
Epoch 345/500
4/4 [==============================] - 0s 500us/step - loss: 0.3458 - sparse_categorical_accuracy: 0.9417
Epoch 346/500
4/4 [==============================] - 0s 248us/step - loss: 0.3532 - sparse_categorical_accuracy: 0.9333
Epoch 347/500
4/4 [==============================] - 0s 753us/step - loss: 0.3764 - sparse_categorical_accuracy: 0.9000
Epoch 348/500
4/4 [==============================] - 0s 500us/step - loss: 0.3542 - sparse_categorical_accuracy: 0.9333
Epoch 349/500
4/4 [==============================] - 0s 501us/step - loss: 0.3310 - sparse_categorical_accuracy: 0.9667
Epoch 350/500
4/4 [==============================] - 0s 500us/step - loss: 0.3353 - sparse_categorical_accuracy: 0.9583
Epoch 351/500
4/4 [==============================] - 0s 500us/step - loss: 0.3567 - sparse_categorical_accuracy: 0.9417
Epoch 352/500
4/4 [==============================] - 0s 499us/step - loss: 0.3338 - sparse_categorical_accuracy: 0.9750
Epoch 353/500
4/4 [==============================] - 0s 500us/step - loss: 0.3431 - sparse_categorical_accuracy: 0.9417
Epoch 354/500
4/4 [==============================] - 0s 248us/step - loss: 0.3525 - sparse_categorical_accuracy: 0.9417
Epoch 355/500
4/4 [==============================] - 0s 248us/step - loss: 0.3388 - sparse_categorical_accuracy: 0.9667
Epoch 356/500
4/4 [==============================] - 0s 248us/step - loss: 0.4194 - sparse_categorical_accuracy: 0.8917
Epoch 357/500
4/4 [==============================] - 0s 500us/step - loss: 0.4286 - sparse_categorical_accuracy: 0.8250
Epoch 358/500
4/4 [==============================] - 0s 500us/step - loss: 0.3944 - sparse_categorical_accuracy: 0.9000
Epoch 359/500
4/4 [==============================] - 0s 500us/step - loss: 0.4019 - sparse_categorical_accuracy: 0.8833
Epoch 360/500
4/4 [==============================] - 0s 3ms/step - loss: 0.5128 - sparse_categorical_accuracy: 0.7917 - val_loss: 0.3637 - val_sparse_categorical_accuracy: 1.0000
Epoch 361/500
4/4 [==============================] - 0s 500us/step - loss: 0.3635 - sparse_categorical_accuracy: 0.9167
Epoch 362/500
4/4 [==============================] - 0s 248us/step - loss: 0.3348 - sparse_categorical_accuracy: 0.9583
Epoch 363/500
4/4 [==============================] - 0s 500us/step - loss: 0.3755 - sparse_categorical_accuracy: 0.8917
Epoch 364/500
4/4 [==============================] - 0s 500us/step - loss: 0.3741 - sparse_categorical_accuracy: 0.9417
Epoch 365/500
4/4 [==============================] - 0s 499us/step - loss: 0.3321 - sparse_categorical_accuracy: 0.9750
Epoch 366/500
4/4 [==============================] - 0s 499us/step - loss: 0.3323 - sparse_categorical_accuracy: 0.9583
Epoch 367/500
4/4 [==============================] - 0s 251us/step - loss: 0.3730 - sparse_categorical_accuracy: 0.9417
Epoch 368/500
4/4 [==============================] - 0s 499us/step - loss: 0.3468 - sparse_categorical_accuracy: 0.9583
Epoch 369/500
4/4 [==============================] - 0s 500us/step - loss: 0.3337 - sparse_categorical_accuracy: 0.9667
Epoch 370/500
4/4 [==============================] - 0s 748us/step - loss: 0.3490 - sparse_categorical_accuracy: 0.9500
Epoch 371/500
4/4 [==============================] - 0s 503us/step - loss: 0.4840 - sparse_categorical_accuracy: 0.8167
Epoch 372/500
4/4 [==============================] - 0s 498us/step - loss: 0.4399 - sparse_categorical_accuracy: 0.8500
Epoch 373/500
4/4 [==============================] - 0s 498us/step - loss: 0.3932 - sparse_categorical_accuracy: 0.9000
Epoch 374/500
4/4 [==============================] - 0s 252us/step - loss: 0.3417 - sparse_categorical_accuracy: 0.9417
Epoch 375/500
4/4 [==============================] - 0s 500us/step - loss: 0.3405 - sparse_categorical_accuracy: 0.9500
Epoch 376/500
4/4 [==============================] - 0s 499us/step - loss: 0.3427 - sparse_categorical_accuracy: 0.9417
Epoch 377/500
4/4 [==============================] - 0s 248us/step - loss: 0.3389 - sparse_categorical_accuracy: 0.9583
Epoch 378/500
4/4 [==============================] - 0s 503us/step - loss: 0.3756 - sparse_categorical_accuracy: 0.9083
Epoch 379/500
4/4 [==============================] - 0s 250us/step - loss: 0.4027 - sparse_categorical_accuracy: 0.9083
Epoch 380/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3577 - sparse_categorical_accuracy: 0.9333 - val_loss: 0.3531 - val_sparse_categorical_accuracy: 1.0000
Epoch 381/500
4/4 [==============================] - 0s 502us/step - loss: 0.3382 - sparse_categorical_accuracy: 0.9667
Epoch 382/500
4/4 [==============================] - 0s 502us/step - loss: 0.3355 - sparse_categorical_accuracy: 0.9500
Epoch 383/500
4/4 [==============================] - 0s 499us/step - loss: 0.3716 - sparse_categorical_accuracy: 0.9250
Epoch 384/500
4/4 [==============================] - 0s 250us/step - loss: 0.3529 - sparse_categorical_accuracy: 0.9500
Epoch 385/500
4/4 [==============================] - 0s 250us/step - loss: 0.4587 - sparse_categorical_accuracy: 0.7750
Epoch 386/500
4/4 [==============================] - 0s 498us/step - loss: 0.3993 - sparse_categorical_accuracy: 0.8917
Epoch 387/500
4/4 [==============================] - 0s 498us/step - loss: 0.3556 - sparse_categorical_accuracy: 0.9333
Epoch 388/500
4/4 [==============================] - 0s 499us/step - loss: 0.3324 - sparse_categorical_accuracy: 0.9750
Epoch 389/500
4/4 [==============================] - 0s 250us/step - loss: 0.3522 - sparse_categorical_accuracy: 0.9333
Epoch 390/500
4/4 [==============================] - 0s 502us/step - loss: 0.3427 - sparse_categorical_accuracy: 0.9583
Epoch 391/500
4/4 [==============================] - 0s 501us/step - loss: 0.3631 - sparse_categorical_accuracy: 0.9417
Epoch 392/500
4/4 [==============================] - 0s 499us/step - loss: 0.3508 - sparse_categorical_accuracy: 0.9583
Epoch 393/500
4/4 [==============================] - 0s 500us/step - loss: 0.3915 - sparse_categorical_accuracy: 0.8750
Epoch 394/500
4/4 [==============================] - 0s 500us/step - loss: 0.3883 - sparse_categorical_accuracy: 0.8750
Epoch 395/500
4/4 [==============================] - 0s 498us/step - loss: 0.3415 - sparse_categorical_accuracy: 0.9417
Epoch 396/500
4/4 [==============================] - 0s 498us/step - loss: 0.3585 - sparse_categorical_accuracy: 0.8917
Epoch 397/500
4/4 [==============================] - 0s 250us/step - loss: 0.3379 - sparse_categorical_accuracy: 0.9583
Epoch 398/500
4/4 [==============================] - 0s 500us/step - loss: 0.3442 - sparse_categorical_accuracy: 0.9500
Epoch 399/500
4/4 [==============================] - 0s 500us/step - loss: 0.3531 - sparse_categorical_accuracy: 0.9333
Epoch 400/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3324 - sparse_categorical_accuracy: 0.9667 - val_loss: 0.3664 - val_sparse_categorical_accuracy: 1.0000
Epoch 401/500
4/4 [==============================] - 0s 750us/step - loss: 0.3318 - sparse_categorical_accuracy: 0.9750
Epoch 402/500
4/4 [==============================] - 0s 500us/step - loss: 0.3504 - sparse_categorical_accuracy: 0.9500
Epoch 403/500
4/4 [==============================] - 0s 500us/step - loss: 0.3353 - sparse_categorical_accuracy: 0.9500
Epoch 404/500
4/4 [==============================] - 0s 499us/step - loss: 0.3334 - sparse_categorical_accuracy: 0.9667
Epoch 405/500
4/4 [==============================] - 0s 500us/step - loss: 0.3300 - sparse_categorical_accuracy: 0.9750
Epoch 406/500
4/4 [==============================] - 0s 501us/step - loss: 0.3324 - sparse_categorical_accuracy: 0.9833
Epoch 407/500
4/4 [==============================] - 0s 500us/step - loss: 0.3444 - sparse_categorical_accuracy: 0.9250
Epoch 408/500
4/4 [==============================] - 0s 502us/step - loss: 0.3433 - sparse_categorical_accuracy: 0.9500
Epoch 409/500
4/4 [==============================] - 0s 499us/step - loss: 0.3800 - sparse_categorical_accuracy: 0.9000
Epoch 410/500
4/4 [==============================] - 0s 500us/step - loss: 0.3584 - sparse_categorical_accuracy: 0.9250
Epoch 411/500
4/4 [==============================] - 0s 498us/step - loss: 0.3636 - sparse_categorical_accuracy: 0.9000
Epoch 412/500
4/4 [==============================] - 0s 498us/step - loss: 0.3399 - sparse_categorical_accuracy: 0.9250
Epoch 413/500
4/4 [==============================] - 0s 500us/step - loss: 0.3366 - sparse_categorical_accuracy: 0.9500
Epoch 414/500
4/4 [==============================] - 0s 498us/step - loss: 0.3316 - sparse_categorical_accuracy: 0.9750
Epoch 415/500
4/4 [==============================] - 0s 500us/step - loss: 0.3392 - sparse_categorical_accuracy: 0.9417
Epoch 416/500
4/4 [==============================] - 0s 502us/step - loss: 0.3436 - sparse_categorical_accuracy: 0.9500
Epoch 417/500
4/4 [==============================] - 0s 500us/step - loss: 0.3512 - sparse_categorical_accuracy: 0.9250
Epoch 418/500
4/4 [==============================] - 0s 499us/step - loss: 0.3506 - sparse_categorical_accuracy: 0.9583
Epoch 419/500
4/4 [==============================] - 0s 502us/step - loss: 0.3388 - sparse_categorical_accuracy: 0.9500
Epoch 420/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3887 - sparse_categorical_accuracy: 0.9000 - val_loss: 0.3605 - val_sparse_categorical_accuracy: 0.9667
Epoch 421/500
4/4 [==============================] - 0s 502us/step - loss: 0.3496 - sparse_categorical_accuracy: 0.9417
Epoch 422/500
4/4 [==============================] - 0s 500us/step - loss: 0.3401 - sparse_categorical_accuracy: 0.9583
Epoch 423/500
4/4 [==============================] - 0s 502us/step - loss: 0.3629 - sparse_categorical_accuracy: 0.9500
Epoch 424/500
4/4 [==============================] - 0s 498us/step - loss: 0.3448 - sparse_categorical_accuracy: 0.9500
Epoch 425/500
4/4 [==============================] - 0s 498us/step - loss: 0.3672 - sparse_categorical_accuracy: 0.9250
Epoch 426/500
4/4 [==============================] - 0s 499us/step - loss: 0.3709 - sparse_categorical_accuracy: 0.9000
Epoch 427/500
4/4 [==============================] - 0s 503us/step - loss: 0.3709 - sparse_categorical_accuracy: 0.9000
Epoch 428/500
4/4 [==============================] - 0s 500us/step - loss: 0.4602 - sparse_categorical_accuracy: 0.8083
Epoch 429/500
4/4 [==============================] - 0s 500us/step - loss: 0.3857 - sparse_categorical_accuracy: 0.8750
Epoch 430/500
4/4 [==============================] - 0s 751us/step - loss: 0.3688 - sparse_categorical_accuracy: 0.9167
Epoch 431/500
4/4 [==============================] - 0s 502us/step - loss: 0.3593 - sparse_categorical_accuracy: 0.9417
Epoch 432/500
4/4 [==============================] - 0s 502us/step - loss: 0.3621 - sparse_categorical_accuracy: 0.9417
Epoch 433/500
4/4 [==============================] - 0s 250us/step - loss: 0.5701 - sparse_categorical_accuracy: 0.8083
Epoch 434/500
4/4 [==============================] - 0s 502us/step - loss: 0.3565 - sparse_categorical_accuracy: 0.9333
Epoch 435/500
4/4 [==============================] - 0s 500us/step - loss: 0.3346 - sparse_categorical_accuracy: 0.9667
Epoch 436/500
4/4 [==============================] - 0s 250us/step - loss: 0.3458 - sparse_categorical_accuracy: 0.9333
Epoch 437/500
4/4 [==============================] - 0s 500us/step - loss: 0.3996 - sparse_categorical_accuracy: 0.8833
Epoch 438/500
4/4 [==============================] - 0s 500us/step - loss: 0.3398 - sparse_categorical_accuracy: 0.9583
Epoch 439/500
4/4 [==============================] - ETA: 0s - loss: 0.3650 - sparse_categorical_accuracy: 0.937 - 0s 752us/step - loss: 0.3278 - sparse_categorical_accuracy: 0.9667
Epoch 440/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3489 - sparse_categorical_accuracy: 0.9333 - val_loss: 0.3702 - val_sparse_categorical_accuracy: 0.9667
Epoch 441/500
4/4 [==============================] - 0s 748us/step - loss: 0.3779 - sparse_categorical_accuracy: 0.9000
Epoch 442/500
4/4 [==============================] - 0s 500us/step - loss: 0.3388 - sparse_categorical_accuracy: 0.9333
Epoch 443/500
4/4 [==============================] - 0s 500us/step - loss: 0.3370 - sparse_categorical_accuracy: 0.9833
Epoch 444/500
4/4 [==============================] - 0s 500us/step - loss: 0.3249 - sparse_categorical_accuracy: 0.9667
Epoch 445/500
4/4 [==============================] - 0s 500us/step - loss: 0.3550 - sparse_categorical_accuracy: 0.9250
Epoch 446/500
4/4 [==============================] - 0s 500us/step - loss: 0.3450 - sparse_categorical_accuracy: 0.9500
Epoch 447/500
4/4 [==============================] - 0s 252us/step - loss: 0.3631 - sparse_categorical_accuracy: 0.9250
Epoch 448/500
4/4 [==============================] - 0s 500us/step - loss: 0.3714 - sparse_categorical_accuracy: 0.9000
Epoch 449/500
4/4 [==============================] - 0s 502us/step - loss: 0.4089 - sparse_categorical_accuracy: 0.8833
Epoch 450/500
4/4 [==============================] - 0s 498us/step - loss: 0.3278 - sparse_categorical_accuracy: 0.9750
Epoch 451/500
4/4 [==============================] - 0s 500us/step - loss: 0.3653 - sparse_categorical_accuracy: 0.9417
Epoch 452/500
4/4 [==============================] - 0s 502us/step - loss: 0.3716 - sparse_categorical_accuracy: 0.9000
Epoch 453/500
4/4 [==============================] - 0s 252us/step - loss: 0.3400 - sparse_categorical_accuracy: 0.9417
Epoch 454/500
4/4 [==============================] - 0s 248us/step - loss: 0.3866 - sparse_categorical_accuracy: 0.8750
Epoch 455/500
4/4 [==============================] - 0s 252us/step - loss: 0.5304 - sparse_categorical_accuracy: 0.7500
Epoch 456/500
4/4 [==============================] - 0s 502us/step - loss: 0.3903 - sparse_categorical_accuracy: 0.8917
Epoch 457/500
4/4 [==============================] - 0s 498us/step - loss: 0.3580 - sparse_categorical_accuracy: 0.9250
Epoch 458/500
4/4 [==============================] - 0s 500us/step - loss: 0.3566 - sparse_categorical_accuracy: 0.9583
Epoch 459/500
4/4 [==============================] - 0s 502us/step - loss: 0.3705 - sparse_categorical_accuracy: 0.9333
Epoch 460/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3404 - sparse_categorical_accuracy: 0.9667 - val_loss: 0.5495 - val_sparse_categorical_accuracy: 0.6667
Epoch 461/500
4/4 [==============================] - 0s 498us/step - loss: 0.3500 - sparse_categorical_accuracy: 0.9500
Epoch 462/500
4/4 [==============================] - 0s 499us/step - loss: 0.3279 - sparse_categorical_accuracy: 0.9833
Epoch 463/500
4/4 [==============================] - 0s 250us/step - loss: 0.3330 - sparse_categorical_accuracy: 0.9583
Epoch 464/500
4/4 [==============================] - 0s 501us/step - loss: 0.3355 - sparse_categorical_accuracy: 0.9500
Epoch 465/500
4/4 [==============================] - 0s 498us/step - loss: 0.3496 - sparse_categorical_accuracy: 0.9333
Epoch 466/500
4/4 [==============================] - 0s 500us/step - loss: 0.3501 - sparse_categorical_accuracy: 0.9500
Epoch 467/500
4/4 [==============================] - 0s 498us/step - loss: 0.3316 - sparse_categorical_accuracy: 0.9750
Epoch 468/500
4/4 [==============================] - 0s 501us/step - loss: 0.3526 - sparse_categorical_accuracy: 0.9417
Epoch 469/500
4/4 [==============================] - 0s 500us/step - loss: 0.3440 - sparse_categorical_accuracy: 0.9167
Epoch 470/500
4/4 [==============================] - 0s 500us/step - loss: 0.3557 - sparse_categorical_accuracy: 0.9000
Epoch 471/500
4/4 [==============================] - 0s 498us/step - loss: 0.3610 - sparse_categorical_accuracy: 0.9417
Epoch 472/500
4/4 [==============================] - 0s 499us/step - loss: 0.3400 - sparse_categorical_accuracy: 0.9250
Epoch 473/500
4/4 [==============================] - 0s 498us/step - loss: 0.3309 - sparse_categorical_accuracy: 0.9583
Epoch 474/500
4/4 [==============================] - 0s 500us/step - loss: 0.3279 - sparse_categorical_accuracy: 0.9667
Epoch 475/500
4/4 [==============================] - 0s 250us/step - loss: 0.3269 - sparse_categorical_accuracy: 0.9667
Epoch 476/500
4/4 [==============================] - 0s 500us/step - loss: 0.3402 - sparse_categorical_accuracy: 0.9667
Epoch 477/500
4/4 [==============================] - 0s 500us/step - loss: 0.3322 - sparse_categorical_accuracy: 0.9583
Epoch 478/500
4/4 [==============================] - 0s 500us/step - loss: 0.3299 - sparse_categorical_accuracy: 0.9750
Epoch 479/500
4/4 [==============================] - 0s 250us/step - loss: 0.3560 - sparse_categorical_accuracy: 0.9000
Epoch 480/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3351 - sparse_categorical_accuracy: 0.9750 - val_loss: 0.3622 - val_sparse_categorical_accuracy: 1.0000
Epoch 481/500
4/4 [==============================] - 0s 500us/step - loss: 0.3538 - sparse_categorical_accuracy: 0.9167
Epoch 482/500
4/4 [==============================] - 0s 249us/step - loss: 0.3268 - sparse_categorical_accuracy: 0.9833
Epoch 483/500
4/4 [==============================] - 0s 252us/step - loss: 0.3828 - sparse_categorical_accuracy: 0.9000
Epoch 484/500
4/4 [==============================] - 0s 500us/step - loss: 0.3285 - sparse_categorical_accuracy: 0.9500
Epoch 485/500
4/4 [==============================] - 0s 500us/step - loss: 0.3524 - sparse_categorical_accuracy: 0.9417
Epoch 486/500
4/4 [==============================] - 0s 500us/step - loss: 0.3288 - sparse_categorical_accuracy: 0.9667
Epoch 487/500
4/4 [==============================] - 0s 500us/step - loss: 0.3758 - sparse_categorical_accuracy: 0.9250
Epoch 488/500
4/4 [==============================] - 0s 501us/step - loss: 0.3593 - sparse_categorical_accuracy: 0.9333
Epoch 489/500
4/4 [==============================] - 0s 498us/step - loss: 0.3314 - sparse_categorical_accuracy: 0.9667
Epoch 490/500
4/4 [==============================] - 0s 500us/step - loss: 0.3312 - sparse_categorical_accuracy: 0.9583
Epoch 491/500
4/4 [==============================] - 0s 252us/step - loss: 0.3884 - sparse_categorical_accuracy: 0.8583
Epoch 492/500
4/4 [==============================] - 0s 250us/step - loss: 0.3378 - sparse_categorical_accuracy: 0.9500
Epoch 493/500
4/4 [==============================] - 0s 498us/step - loss: 0.3355 - sparse_categorical_accuracy: 0.9667
Epoch 494/500
4/4 [==============================] - 0s 502us/step - loss: 0.3585 - sparse_categorical_accuracy: 0.9333
Epoch 495/500
4/4 [==============================] - 0s 250us/step - loss: 0.3358 - sparse_categorical_accuracy: 0.9583
Epoch 496/500
4/4 [==============================] - 0s 498us/step - loss: 0.3432 - sparse_categorical_accuracy: 0.9500
Epoch 497/500
4/4 [==============================] - 0s 500us/step - loss: 0.3888 - sparse_categorical_accuracy: 0.8667
Epoch 498/500
4/4 [==============================] - 0s 501us/step - loss: 0.4201 - sparse_categorical_accuracy: 0.8833
Epoch 499/500
4/4 [==============================] - 0s 250us/step - loss: 0.4596 - sparse_categorical_accuracy: 0.8000
Epoch 500/500
4/4 [==============================] - 0s 3ms/step - loss: 0.3406 - sparse_categorical_accuracy: 0.9417 - val_loss: 0.3603 - val_sparse_categorical_accuracy: 0.9667
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_2 (Dense)              (None, 3)                 15        
=================================================================
Total params: 15
Trainable params: 15
Non-trainable params: 0
_________________________________________________________________
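The 15 parameters in the summary come from the single Dense layer: 4 iris input features × 3 output neurons, plus 3 biases. A quick check in plain Python (the helper function is illustrative, not part of Keras):

```python
def dense_param_count(in_features: int, units: int) -> int:
    """Parameters in a fully connected layer: weight matrix plus bias vector."""
    return in_features * units + units

# Iris: 4 input features, 3 classes -> 4*3 + 3 = 15, matching "Param #" above.
print(dense_param_count(4, 3))  # -> 15
```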
# Iris: class-based implementation by subclassing tf.keras.Model
import tensorflow as tf
from sklearn import datasets
import numpy as np
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense

x_train = datasets.load_iris().data
y_train = datasets.load_iris().target

np.random.seed(129)
np.random.shuffle(x_train)
np.random.seed(129)
np.random.shuffle(y_train)
tf.random.set_seed(129)

# Only the model definition (the IrisModel class below) differs from the Sequential version above
class IrisModel(Model):
    def __init__(self):
        super(IrisModel, self).__init__()
        # softmax added: without it the raw outputs are fed to a loss that
        # expects probabilities (from_logits=False), and training stalls
        # (see the flat 0.3333 accuracy in the log below)
        self.d1 = Dense(3, activation='softmax')
    def call(self, x):
        y = self.d1(x)
        return y
model = IrisModel()

model.compile(optimizer=tf.keras.optimizers.SGD(lr=0.1),
             loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
             metrics = ['sparse_categorical_accuracy'])
model.fit(x_train,y_train,batch_size=32,epochs=500,validation_split=0.2,validation_freq=20)
model.summary()
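The compile call above sets `from_logits=False`, which tells the loss that the network's outputs are already a probability distribution; the Dense layer must therefore end in a softmax. A small NumPy sketch of what the loss computes in that case (logits are made up for illustration, no TensorFlow required):

```python
import numpy as np

def softmax(z):
    """Turn raw scores (logits) into a probability distribution."""
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

# Hypothetical raw outputs for one iris sample, and its true class label.
logits = np.array([2.0, 1.0, 0.1])
true_class = 0

# Sparse categorical cross-entropy with from_logits=False expects
# probabilities, so a softmax must have been applied first:
probs = softmax(logits)
loss = -np.log(probs[true_class])
print(round(loss, 3))  # ≈ 0.417
```

Feeding raw logits to a loss configured with `from_logits=False` computes `-log` of values that are not probabilities, which is why a network without a final softmax cannot train properly under this setting.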

Epoch 1/500
4/4 [==============================] - 0s 751us/step - loss: 5.7534 - sparse_categorical_accuracy: 0.3000
Epoch 2/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
...
Epoch 20/500
4/4 [==============================] - 0s 21ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
...
(epochs 3–270 omitted: from epoch 2 onward the loss is stuck at 5.7972 and sparse_categorical_accuracy at 0.3333, with val_loss 7.1404 / val_sparse_categorical_accuracy 0.2333 at every 20-epoch validation; the network is not learning in this run)
Epoch 271/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 272/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 273/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 274/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 275/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 276/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 277/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 278/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 279/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 280/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 281/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 282/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 283/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 284/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 285/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 286/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 287/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 288/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 289/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 290/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 291/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 292/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 293/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 294/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 295/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 296/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 297/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 298/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 299/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 300/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 301/500
4/4 [==============================] - 0s 251us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 302/500
4/4 [==============================] - 0s 751us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 303/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 304/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 305/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 306/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 307/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 308/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 309/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 310/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 311/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 312/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 313/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 314/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 315/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 316/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 317/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 318/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 319/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 320/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 321/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 322/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 323/500
4/4 [==============================] - 0s 748us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 324/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 325/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 326/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 327/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 328/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 329/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 330/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 331/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 332/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 333/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 334/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 335/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 336/500
4/4 [==============================] - 0s 752us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 337/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 338/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 339/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 340/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 341/500
4/4 [==============================] - 0s 252us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 342/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 343/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 344/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 345/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 346/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 347/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 348/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 349/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 350/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 351/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 352/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 353/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 354/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 355/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 356/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 357/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 358/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 359/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 360/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 361/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 362/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 363/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 364/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 365/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 366/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 367/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 368/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 369/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 370/500
4/4 [==============================] - 0s 251us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 371/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 372/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 373/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 374/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 375/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 376/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 377/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 378/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 379/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 380/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 381/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 382/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 383/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 384/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 385/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 386/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 387/500
4/4 [==============================] - 0s 749us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 388/500
4/4 [==============================] - 0s 249us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 389/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 390/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 391/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 392/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 393/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 394/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 395/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 396/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 397/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 398/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 399/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 400/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 401/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 402/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 403/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 404/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 405/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 406/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 407/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 408/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 409/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 410/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 411/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 412/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 413/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 414/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 415/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 416/500
4/4 [==============================] - 0s 751us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 417/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 418/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 419/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 420/500
4/4 [==============================] - 0s 9ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 421/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 422/500
4/4 [==============================] - 0s 1ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 423/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 424/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 425/500
4/4 [==============================] - 0s 751us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 426/500
4/4 [==============================] - 0s 249us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 427/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 428/500
4/4 [==============================] - 0s 248us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 429/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 430/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 431/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 432/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 433/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 434/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 435/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 436/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 437/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 438/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 439/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 440/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 441/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 442/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 443/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 444/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 445/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 446/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 447/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 448/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 449/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 450/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 451/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 452/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 453/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 454/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 455/500
4/4 [==============================] - 0s 248us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 456/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 457/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 458/500
4/4 [==============================] - 0s 752us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 459/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 460/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 461/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 462/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 463/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 464/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 465/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 466/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 467/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 468/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 469/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 470/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 471/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 472/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 473/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 474/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 475/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 476/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 477/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 478/500
4/4 [==============================] - 0s 502us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 479/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 480/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Epoch 481/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 482/500
4/4 [==============================] - 0s 750us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 483/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 484/500
4/4 [==============================] - 0s 242us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 485/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 486/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 487/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 488/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 489/500
4/4 [==============================] - 0s 249us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 490/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 491/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 492/500
4/4 [==============================] - 0s 501us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 493/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 494/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 495/500
4/4 [==============================] - 0s 250us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 496/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 497/500
4/4 [==============================] - 0s 499us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 498/500
4/4 [==============================] - 0s 500us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 499/500
4/4 [==============================] - 0s 498us/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333
Epoch 500/500
4/4 [==============================] - 0s 3ms/step - loss: 5.7972 - sparse_categorical_accuracy: 0.3333 - val_loss: 7.1404 - val_sparse_categorical_accuracy: 0.2333
Model: "iris_model_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_3 (Dense)              multiple                  15        
=================================================================
Total params: 15
Trainable params: 15
Non-trainable params: 0
_________________________________________________________________
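The "Total params: 15" reported by model.summary() can be checked by hand: a Dense layer with 4 input features and 3 output neurons has 4×3 weights plus 3 biases. A minimal sketch of that arithmetic (the helper name dense_params is ours, not a Keras API):

```python
# Parameter count for a fully connected (Dense) layer:
# every input connects to every unit, plus one bias per unit.
def dense_params(n_inputs, n_units):
    return n_inputs * n_units + n_units

# The Iris model above: 4 input features -> Dense(3)
print(dense_params(4, 3))  # → 15, matching "Total params: 15"
```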

Building a neural network for the MNIST dataset with Sequential and with a subclassed Model

# Training on the MNIST dataset with Sequential: compared with the Iris version,
# only where the packages are imported and where the class is defined differ
import tensorflow as tf
from matplotlib import pyplot as plt
from tensorflow.keras.layers import Flatten,Dense
from tensorflow.keras import Model

mnist = tf.keras.datasets.mnist
(x_train,y_train),(x_test,y_test) = mnist.load_data()

# Visualization
plt.imshow(x_train[0], cmap='gray')  # render the first sample as a grayscale image
plt.show()
# print('x_train[0]:\n', x_train[0])
# print('y_train[0]:\n', y_train[0])
# print('x_train.shape:\n', x_train.shape)
# print('y_train.shape:\n', y_train.shape)

# Normalize pixel values from [0, 255] to [0, 1]
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),                       # flatten 28x28 images into 784-dim vectors
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')  # 10 digit classes
])

model.compile(optimizer='adam',
              # from_logits=False because the softmax layer already outputs probabilities
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
              metrics=['sparse_categorical_accuracy'])

model.fit(x_train, y_train, batch_size=32, epochs=5,
          validation_data=(x_test, y_test), validation_freq=1)
model.summary()
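The from_logits=False choice matters: with probability outputs, the per-sample sparse categorical crossentropy is simply the negative log of the probability the model assigns to the true class. A small sketch of that computation (sparse_cce is our own illustrative helper, not a Keras function):

```python
import math

# SparseCategoricalCrossentropy with from_logits=False expects the model to
# output probabilities (e.g. after a softmax layer); the per-sample loss is
# -log(probability assigned to the true class label).
def sparse_cce(probs, label):
    return -math.log(probs[label])

probs = [0.1, 0.8, 0.1]               # hypothetical softmax output for one sample
print(round(sparse_cce(probs, 1), 4))  # true class 1 → -log(0.8) ≈ 0.2231
```

A confident correct prediction gives a loss near 0, while a confident wrong prediction (low probability on the true class) makes the loss blow up.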

(figure: the first training image, x_train[0], rendered in grayscale)

x_train[0]:
 [[  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   3  18  18  18 126 136
  175  26 166 255 247 127   0   0   0   0]
 [  0   0   0   0   0   0   0   0  30  36  94 154 170 253 253 253 253 253
  225 172 253 242 195  64   0   0   0   0]
 [  0   0   0   0   0   0   0  49 238 253 253 253 253 253 253 253 253 251
   93  82  82  56  39   0   0   0   0   0]
 [  0   0   0   0   0   0   0  18 219 253 253 253 253 253 198 182 247 241
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0  80 156 107 253 253 205  11   0  43 154
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0  14   1 154 253  90   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0 139 253 190   2   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0  11 190 253  70   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  35 241 225 160 108   1
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0  81 240 253 253 119
   25   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0  45 186 253 253
  150  27   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0  16  93 252
  253 187   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0 249
  253 249  64   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0  46 130 183 253
  253 207   2   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  39 148 229 253 253 253
  250 182   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0  24 114 221 253 253 253 253 201
   78   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0  23  66 213 253 253 253 253 198  81   2
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0  18 171 219 253 253 253 253 195  80   9   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0  55 172 226 253 253 253 253 244 133  11   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0 136 253 253 253 212 135 132  16   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]]
y_train[0]:
 5
x_train.shape:
 (60000, 28, 28)
y_train.shape:
 (60000,)
Epoch 1/5
1875/1875 [==============================] - 1s 788us/step - loss: 0.2548 - sparse_categorical_accuracy: 0.9283 - val_loss: 0.1371 - val_sparse_categorical_accuracy: 0.9566
Epoch 2/5
1875/1875 [==============================] - 1s 731us/step - loss: 0.1122 - sparse_categorical_accuracy: 0.9667 - val_loss: 0.0933 - val_sparse_categorical_accuracy: 0.9715
Epoch 3/5
1875/1875 [==============================] - 1s 756us/step - loss: 0.0774 - sparse_categorical_accuracy: 0.9767 - val_loss: 0.0826 - val_sparse_categorical_accuracy: 0.9757
Epoch 4/5
1875/1875 [==============================] - 1s 730us/step - loss: 0.0581 - sparse_categorical_accuracy: 0.9823 - val_loss: 0.0721 - val_sparse_categorical_accuracy: 0.9778
Epoch 5/5
1875/1875 [==============================] - 1s 766us/step - loss: 0.0450 - sparse_categorical_accuracy: 0.9865 - val_loss: 0.0705 - val_sparse_categorical_accuracy: 0.9773
Model: "sequential_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_2 (Flatten)          (None, 784)               0         
_________________________________________________________________
dense_8 (Dense)              (None, 128)               100480    
_________________________________________________________________
dense_9 (Dense)              (None, 10)                1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0
_________________________________________________________________
# Training on the MNIST dataset with a custom Model subclass; the imports and the class definition replace the Sequential block
import tensorflow as tf
from matplotlib import pyplot as plt
from tensorflow.keras.layers import Flatten,Dense
from tensorflow.keras import Model

mnist = tf.keras.datasets.mnist
(x_train,y_train),(x_test,y_test) = mnist.load_data()

# Visualization
plt.imshow(x_train[0],cmap='gray')  # draw the first sample as a grayscale image
plt.show()
# print('x_train[0]:\n',x_train[0])
# print('y_train[0]:\n',y_train[0])
# print('x_train.shape:\n',x_train.shape)
# print('y_train.shape:\n',y_train.shape)

# Normalize pixel values to [0,1], then train via a custom Model subclass
x_train,x_test=x_train/255,x_test/255

# model = tf.keras.models.Sequential([
#     tf.keras.layers.Flatten(),
#     tf.keras.layers.Dense(128,activation = 'relu'),
#     tf.keras.layers.Dense(10,activation = 'softmax')
# ])
class MnistModel(Model):
    def __init__(self):
        super(MnistModel,self).__init__()
        self.flatten = Flatten()
        self.d1 = Dense(128,activation='relu')
        self.d2 = Dense(10,activation='softmax')
    def call(self,x):
        x = self.flatten(x)
        x = self.d1(x)
        y = self.d2(x)
        return y
model = MnistModel()

model.compile(optimizer='adam',
             loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
             metrics = ['sparse_categorical_accuracy'])

model.fit(x_train,y_train,batch_size=32,epochs = 5,validation_data=(x_test,y_test),validation_freq=1)
model.summary()

(grayscale plot of x_train[0] rendered here)

Epoch 1/5
1841/1875 [============================>.] - ETA: 0s - loss: 0.2586 - sparse_categorical_accuracy: 0.9273WARNING:tensorflow:Callbacks method `on_test_batch_end` is slow compared to the batch time (batch time: 0.0000s vs `on_test_batch_end` time: 0.0010s). Check your callbacks.
1875/1875 [==============================] - 1s 757us/step - loss: 0.2569 - sparse_categorical_accuracy: 0.9278 - val_loss: 0.1385 - val_sparse_categorical_accuracy: 0.9586
Epoch 2/5
1875/1875 [==============================] - 1s 708us/step - loss: 0.1130 - sparse_categorical_accuracy: 0.9660 - val_loss: 0.0953 - val_sparse_categorical_accuracy: 0.9713
Epoch 3/5
1875/1875 [==============================] - 1s 684us/step - loss: 0.0777 - sparse_categorical_accuracy: 0.9759 - val_loss: 0.0797 - val_sparse_categorical_accuracy: 0.9749
Epoch 4/5
1875/1875 [==============================] - 1s 713us/step - loss: 0.0577 - sparse_categorical_accuracy: 0.9821 - val_loss: 0.0744 - val_sparse_categorical_accuracy: 0.9764
Epoch 5/5
1875/1875 [==============================] - 1s 738us/step - loss: 0.0452 - sparse_categorical_accuracy: 0.9858 - val_loss: 0.0729 - val_sparse_categorical_accuracy: 0.9774
Model: "mnist_model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_3 (Flatten)          multiple                  0         
_________________________________________________________________
dense_10 (Dense)             multiple                  100480    
_________________________________________________________________
dense_11 (Dense)             multiple                  1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0
_________________________________________________________________
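In the subclassed model's summary, Output Shape reads "multiple" because a subclassed Model only learns its input shape when it is first called, so Keras cannot report one static shape per layer. The parameter counts, however, can be checked by hand; as a quick sanity check (pure arithmetic, no TensorFlow needed):

```python
# Check the Param # column of the summary above by hand:
# a Dense layer holds (inputs * units) weights plus `units` biases.
flatten_params = 0                  # Flatten has nothing to learn
d1_params = 784 * 128 + 128         # 28*28=784 inputs -> Dense(128)
d2_params = 128 * 10 + 10           # Dense(128) -> Dense(10)
total_params = flatten_params + d1_params + d2_params
print(total_params)                 # 101770, matching "Total params: 101,770"
```
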

Building networks for the Fashion-MNIST dataset, both ways

# Fashion-MNIST: 60,000 clothing images with labels for training, 10,000 for testing
# Sequential implementation of the classifier
fashion = tf.keras.datasets.fashion_mnist
(x_train,y_train),(x_test,y_test)=fashion.load_data()
x_train,x_test=x_train/255,x_test/255

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128,activation='relu'),
    tf.keras.layers.Dense(10,activation = 'softmax')
])

model.compile(optimizer = 'adam',
             loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
             metrics=['sparse_categorical_accuracy'])

model.fit(x_train,y_train,batch_size=32,epochs=20,validation_data=(x_test,y_test),validation_freq=1)
model.summary()
Epoch 1/20
1875/1875 [==============================] - 2s 804us/step - loss: 0.4995 - sparse_categorical_accuracy: 0.8233 - val_loss: 0.4309 - val_sparse_categorical_accuracy: 0.8440
Epoch 2/20
1875/1875 [==============================] - 1s 788us/step - loss: 0.3771 - sparse_categorical_accuracy: 0.8638 - val_loss: 0.3976 - val_sparse_categorical_accuracy: 0.8577
Epoch 3/20
1875/1875 [==============================] - 1s 751us/step - loss: 0.3401 - sparse_categorical_accuracy: 0.8749 - val_loss: 0.3840 - val_sparse_categorical_accuracy: 0.8616
Epoch 4/20
1875/1875 [==============================] - 1s 761us/step - loss: 0.3135 - sparse_categorical_accuracy: 0.8852 - val_loss: 0.3755 - val_sparse_categorical_accuracy: 0.8668
Epoch 5/20
1875/1875 [==============================] - 1s 769us/step - loss: 0.2968 - sparse_categorical_accuracy: 0.8905 - val_loss: 0.3509 - val_sparse_categorical_accuracy: 0.8735
Epoch 6/20
1875/1875 [==============================] - 1s 754us/step - loss: 0.2835 - sparse_categorical_accuracy: 0.8959 - val_loss: 0.3465 - val_sparse_categorical_accuracy: 0.8769
Epoch 7/20
1875/1875 [==============================] - 1s 785us/step - loss: 0.2698 - sparse_categorical_accuracy: 0.9004 - val_loss: 0.3757 - val_sparse_categorical_accuracy: 0.8669
Epoch 8/20
1875/1875 [==============================] - 1s 768us/step - loss: 0.2591 - sparse_categorical_accuracy: 0.9037 - val_loss: 0.3392 - val_sparse_categorical_accuracy: 0.8817
Epoch 9/20
1875/1875 [==============================] - 1s 767us/step - loss: 0.2485 - sparse_categorical_accuracy: 0.9073 - val_loss: 0.3617 - val_sparse_categorical_accuracy: 0.8732
Epoch 10/20
1875/1875 [==============================] - 1s 764us/step - loss: 0.2411 - sparse_categorical_accuracy: 0.9096 - val_loss: 0.3451 - val_sparse_categorical_accuracy: 0.8789
Epoch 11/20
1875/1875 [==============================] - 1s 794us/step - loss: 0.2312 - sparse_categorical_accuracy: 0.9131 - val_loss: 0.3348 - val_sparse_categorical_accuracy: 0.8803
Epoch 12/20
1875/1875 [==============================] - 1s 773us/step - loss: 0.2247 - sparse_categorical_accuracy: 0.9162 - val_loss: 0.3273 - val_sparse_categorical_accuracy: 0.8855
Epoch 13/20
1875/1875 [==============================] - 1s 778us/step - loss: 0.2194 - sparse_categorical_accuracy: 0.9178 - val_loss: 0.3330 - val_sparse_categorical_accuracy: 0.8862
Epoch 14/20
1875/1875 [==============================] - 1s 756us/step - loss: 0.2127 - sparse_categorical_accuracy: 0.9215 - val_loss: 0.3308 - val_sparse_categorical_accuracy: 0.8855
Epoch 15/20
1875/1875 [==============================] - 1s 794us/step - loss: 0.2069 - sparse_categorical_accuracy: 0.9226 - val_loss: 0.3287 - val_sparse_categorical_accuracy: 0.8905
Epoch 16/20
1875/1875 [==============================] - 2s 879us/step - loss: 0.1980 - sparse_categorical_accuracy: 0.9255 - val_loss: 0.3385 - val_sparse_categorical_accuracy: 0.8876
Epoch 17/20
1875/1875 [==============================] - 2s 850us/step - loss: 0.1952 - sparse_categorical_accuracy: 0.9271 - val_loss: 0.3411 - val_sparse_categorical_accuracy: 0.8890
Epoch 18/20
1875/1875 [==============================] - 2s 861us/step - loss: 0.1915 - sparse_categorical_accuracy: 0.9274 - val_loss: 0.3442 - val_sparse_categorical_accuracy: 0.8854
Epoch 19/20
1875/1875 [==============================] - 2s 870us/step - loss: 0.1853 - sparse_categorical_accuracy: 0.9301 - val_loss: 0.3409 - val_sparse_categorical_accuracy: 0.8879
Epoch 20/20
1875/1875 [==============================] - 2s 905us/step - loss: 0.1815 - sparse_categorical_accuracy: 0.9319 - val_loss: 0.3518 - val_sparse_categorical_accuracy: 0.8842
Model: "sequential_12"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_10 (Flatten)         (None, 784)               0         
_________________________________________________________________
dense_24 (Dense)             (None, 128)               100480    
_________________________________________________________________
dense_25 (Dense)             (None, 10)                1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0
_________________________________________________________________
# Fashion-MNIST classification with a custom Model subclass
fashion = tf.keras.datasets.fashion_mnist
(x_train,y_train),(x_test,y_test)=fashion.load_data()
x_train,x_test=x_train/255,x_test/255


class FashionModel(Model):
    def __init__(self):
        super(FashionModel,self).__init__()
        self.flatten = Flatten()
        self.d1 = Dense(128,activation='relu')
        self.d2 = Dense(10,activation='softmax')
    def call(self,x):
        x = self.flatten(x)
        x = self.d1(x)
        y = self.d2(x)
        return y
    
model = FashionModel()
model.compile(optimizer = 'adam',
             loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
             metrics=['sparse_categorical_accuracy'])

model.fit(x_train,y_train,batch_size=32,epochs=5,validation_data=(x_test,y_test),validation_freq=1)
model.summary()
Epoch 1/5
1875/1875 [==============================] - 1s 747us/step - loss: 0.2809 - sparse_categorical_accuracy: 0.8945 - val_loss: 0.3431 - val_sparse_categorical_accuracy: 0.8786
Epoch 2/5
1875/1875 [==============================] - 1s 682us/step - loss: 0.2669 - sparse_categorical_accuracy: 0.9009 - val_loss: 0.3473 - val_sparse_categorical_accuracy: 0.8795
Epoch 3/5
1875/1875 [==============================] - 1s 682us/step - loss: 0.2564 - sparse_categorical_accuracy: 0.9039 - val_loss: 0.3260 - val_sparse_categorical_accuracy: 0.8864
Epoch 4/5
1875/1875 [==============================] - 1s 723us/step - loss: 0.2463 - sparse_categorical_accuracy: 0.9088 - val_loss: 0.3532 - val_sparse_categorical_accuracy: 0.8796
Epoch 5/5
1875/1875 [==============================] - 1s 683us/step - loss: 0.2376 - sparse_categorical_accuracy: 0.9112 - val_loss: 0.3333 - val_sparse_categorical_accuracy: 0.8833
Model: "sequential_10"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_8 (Flatten)          (None, 784)               0         
_________________________________________________________________
dense_20 (Dense)             (None, 128)               100480    
_________________________________________________________________
dense_21 (Dense)             (None, 10)                1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0
_________________________________________________________________

Extending the network boilerplate

1. Build your own dataset to tackle problems in your domain
2. Data augmentation to enlarge the dataset
3. Checkpointing: save and restore the model parameters to resume training
4. Parameter extraction: write the weights to a text file
5. acc/loss visualization to inspect training progress
6. An application that recognizes objects in images

Building a custom dataset from raw images

# 1. Practice on the MNIST images
# A custom dataset replaces the load_data() call, so the same recipe works for data from your own domain
# generated(image directory, label file) returns the input features and the labels
from PIL import Image
import numpy as np
import pandas as pd
import tensorflow as tf
import os
def generated(path,txt):
    f = open(txt,'r')
    contents = f.readlines()
    f.close()
    x,y_=[],[]
    for content in contents:
        values = content.split()
        img_path = path+values[0]
        img = Image.open(img_path)
        img = np.array(img.convert('L'))
        img = img/255
        x.append(img)
        y_.append(values[1])
        print('loading:'+content)
    
    x = np.array(x)
    y_ = np.array(y_)
    y_ = y_.astype(np.int64)
    return x,y_
# Python's built-in data types, for reference:
# strings: str
# numbers: int, float, complex
# sequences: list, tuple, range(start,end,step)
# mappings: dict
# sets: set, frozenset
# boolean: bool
# binary: bytes, bytearray, memoryview
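As a minimal sketch of what generated() above does per line: the label file is assumed to hold one "<filename> <label>" pair per line (the filename below is made up), and the labels arrive as strings, which is why the function ends with astype(np.int64):

```python
import numpy as np

# One line of the (hypothetical) label file: "<image filename> <label>"
content = '28755_0.jpg 5\n'
values = content.split()                      # whitespace split also drops the newline
img_path = 'mnist_train_jpg_60000/' + values[0]
y_ = np.array([values[1]])                    # the label is still the string '5' here
y_ = y_.astype(np.int64)                      # convert to integers, as generated() does
print(img_path, y_[0])
```
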

train_path = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_train_jpg_60000/'
train_txt = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_train_jpg_60000.txt'
# file for the training-set input features
x_train_savepath = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_x_train.npy'
# file for the training-set labels
y_train_savepath = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_y_train.npy'

x_train,y_train = generated(train_path,train_txt)
test_path = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_test_jpg_10000/'
test_txt = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_test_jpg_10000.txt'
x_test_savepath = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_x_test.npy'
y_test_savepath = 'D:/tensorflow笔记/class4/MNIST_FC/mnist_image_label/mnist_y_test.npy'

x_test,y_test = generated(test_path,test_txt)
# # Check whether the saved datasets already exist; skipped here because this run already generated them
# if os.path.exists(x_train_savepath) and os.path.exists(y_train_savepath) and os.path.exists(x_test_savepath) and os.path.exists(y_test_savepath):
#     print('------------------load datasets----------------')
#     x_train_save = np.load(x_train_savepath)
#     y_train = np.load(y_train_savepath)
#     x_test_save = np.load(x_test_savepath)
#     y_test = np.load(y_test_savepath)
#     x_train = np.reshape(x_train_save,(len(x_train_save),28,28))
#     x_test = np.reshape(x_test_save,(len(x_test_save),28,28))
# else:
#     print('-----------------generated dataset--------------')
#     x_train,y_train = generated(train_path,train_txt)
#     x_test,y_test = generated(test_path,test_txt)
#     print('----------------save datasets--------------------')
#     x_train_save = np.reshape(x_train,(len(x_train),-1))
#     x_test_save = np.reshape(x_test,(len(x_test),-1))
#     np.save(x_train_savepath,x_train_save)
#     np.save(y_train_savepath,y_train)
#     np.save(x_test_savepath,x_test_save)
#     np.save(y_test_savepath,y_test)

print('----------------save datasets--------------------')
x_train_save = np.reshape(x_train,(len(x_train),-1))
x_test_save = np.reshape(x_test,(len(x_test),-1))
np.save(x_train_savepath,x_train_save)
np.save(y_train_savepath,y_train)
np.save(x_test_savepath,x_test_save)
np.save(y_test_savepath,y_test)

----------------save datasets--------------------
y_train = y_train.astype(np.int64)
y_test = y_test.astype(np.int64)
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128,activation='relu'),
    tf.keras.layers.Dense(10,activation='softmax')
])
model.compile(optimizer = 'adam',
              loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
              metrics = ['sparse_categorical_accuracy']
)
model.fit(x_train,y_train,batch_size=32,epochs = 5,validation_data=(x_test,y_test),validation_freq = 1)
model.summary()

Epoch 1/5
1875/1875 [==============================] - 4s 2ms/step - loss: 0.2628 - sparse_categorical_accuracy: 0.9242 - val_loss: 0.1342 - val_sparse_categorical_accuracy: 0.9626
Epoch 2/5
1875/1875 [==============================] - 4s 2ms/step - loss: 0.1149 - sparse_categorical_accuracy: 0.9658 - val_loss: 0.1027 - val_sparse_categorical_accuracy: 0.9698
Epoch 3/5
1875/1875 [==============================] - 4s 2ms/step - loss: 0.0771 - sparse_categorical_accuracy: 0.9771 - val_loss: 0.0830 - val_sparse_categorical_accuracy: 0.9758
Epoch 4/5
1875/1875 [==============================] - 4s 2ms/step - loss: 0.0563 - sparse_categorical_accuracy: 0.9830 - val_loss: 0.0902 - val_sparse_categorical_accuracy: 0.9724
Epoch 5/5
1875/1875 [==============================] - 4s 2ms/step - loss: 0.0445 - sparse_categorical_accuracy: 0.9861 - val_loss: 0.0747 - val_sparse_categorical_accuracy: 0.9771
Model: "sequential_16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_14 (Flatten)         (None, 784)               0         
_________________________________________________________________
dense_32 (Dense)             (None, 128)               100480    
_________________________________________________________________
dense_33 (Dense)             (None, 10)                1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0
_________________________________________________________________

Data augmentation: simple image deformations make the model robust to real-world distortion

# 2. Data augmentation with the custom dataset
# Augmentation enlarges the dataset; for images it means simple deformations,
# which help the model cope with distortions caused by different shooting angles
# image_gen_train = tf.keras.preprocessing.image.ImageDataGenerator( 
#     rescale = factor every pixel is multiplied by,
#     rotation_range = random rotation range in degrees,
#     width_shift_range = random horizontal shift,
#     height_shift_range = random vertical shift,
#     horizontal_flip = whether to flip horizontally at random,
#     zoom_range = random zoom range [1-n, 1+n]
# )
# image_gen_train.fit(x_train) expects 4-D input, so x_train must first gain a (single) channel dimension

# Applied to real data: Fashion-MNIST
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator

fashion = tf.keras.datasets.fashion_mnist
(x_train,y_train),(x_test,y_test)=fashion.load_data()
x_train,x_test=x_train/255,x_test/255

#-------------------- new code --------------------#
x_train = x_train.reshape(x_train.shape[0],28,28,1)
image_gen_train = tf.keras.preprocessing.image.ImageDataGenerator( 
    rescale = 1./1.,            # for raw images use 1./255 to scale pixels into [0,1]
    rotation_range = 45,        # random rotation of up to 45 degrees
    width_shift_range = .15,    # random horizontal shift
    height_shift_range = .15,   # random vertical shift
    horizontal_flip = True,     # random horizontal flips
    zoom_range = 0.5            # random zoom in [1-0.5,1+0.5]
)
image_gen_train.fit(x_train)    # fit() expects 4-D input, hence the single-channel reshape above
#-------------------- end new code ----------------#

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128,activation='relu'),
    tf.keras.layers.Dense(10,activation = 'softmax')
])

model.compile(optimizer = 'adam',
             loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
             metrics=['sparse_categorical_accuracy'])

# -------------------- changed: feed augmented batches via image_gen_train.flow() --------------------#
model.fit(image_gen_train.flow(x_train,y_train,batch_size=32),epochs=5,validation_data=(x_test,y_test),validation_freq=1)
model.summary()
Epoch 1/5
1875/1875 [==============================] - ETA: 0s - loss: 1.5140 - sparse_categorical_accuracy: 0.4547WARNING:tensorflow:Model was constructed with shape (None, None, None, None) for input Tensor("flatten_15_input_1:0", shape=(None, None, None, None), dtype=float32), but it was called on an input with incompatible shape (None, 28, 28).
1875/1875 [==============================] - 20s 11ms/step - loss: 1.5140 - sparse_categorical_accuracy: 0.4547 - val_loss: 0.9860 - val_sparse_categorical_accuracy: 0.6289
Epoch 2/5
1875/1875 [==============================] - 20s 11ms/step - loss: 1.2866 - sparse_categorical_accuracy: 0.5373 - val_loss: 0.9626 - val_sparse_categorical_accuracy: 0.6456
Epoch 3/5
1875/1875 [==============================] - 24s 13ms/step - loss: 1.2122 - sparse_categorical_accuracy: 0.5632 - val_loss: 0.8978 - val_sparse_categorical_accuracy: 0.6772
Epoch 4/5
1875/1875 [==============================] - 26s 14ms/step - loss: 1.1635 - sparse_categorical_accuracy: 0.5772 - val_loss: 0.9645 - val_sparse_categorical_accuracy: 0.6618
Epoch 5/5
1875/1875 [==============================] - 24s 13ms/step - loss: 1.1341 - sparse_categorical_accuracy: 0.5889 - val_loss: 0.9016 - val_sparse_categorical_accuracy: 0.6515
Model: "sequential_17"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_15 (Flatten)         (None, None)              0         
_________________________________________________________________
dense_34 (Dense)             (None, 128)               100480    
_________________________________________________________________
dense_35 (Dense)             (None, 10)                1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0
_________________________________________________________________
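To make the shift parameters concrete, here is a numpy-only sketch of a random horizontal shift like width_shift_range=0.15: pick a shift of up to 15% of the width and pad the vacated columns with zeros. ImageDataGenerator's real transform also handles interpolation and fill modes, so this is only an illustration:

```python
import numpy as np

def random_width_shift(img, shift_range=0.15, rng=None):
    # Shift a 2-D image left/right by a random fraction of its width,
    # filling the vacated columns with zeros (cf. width_shift_range).
    rng = rng or np.random.default_rng(0)
    h, w = img.shape
    shift = int(rng.uniform(-shift_range, shift_range) * w)
    out = np.zeros_like(img)
    if shift >= 0:
        out[:, shift:] = img[:, :w - shift]
    else:
        out[:, :w + shift] = img[:, -shift:]
    return out

img = np.random.default_rng(1).random((28, 28))   # stand-in for one grayscale image
shifted = random_width_shift(img)
```
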

Checkpointing: saving and restoring model parameters

# 3. Checkpointing: loading and saving the model
# Load saved weights:
# model.load_weights(path)
# Save weights automatically through a callback:
# cp_callback = tf.keras.callbacks.ModelCheckpoint(
#     filepath = path,
#     save_weights_only = True/False,
#     save_best_only = True/False
# )
# history = model.fit(...,callbacks = [cp_callback])

# Training on the MNIST dataset with Sequential, now with checkpointing added
import tensorflow as tf
from matplotlib import pyplot as plt
from tensorflow.keras.layers import Flatten,Dense
from tensorflow.keras import Model
import os

mnist = tf.keras.datasets.mnist
(x_train,y_train),(x_test,y_test) = mnist.load_data()
x_train,x_test=x_train/255,x_test/255

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128,activation = 'relu'),
    tf.keras.layers.Dense(10,activation = 'softmax')
])

model.compile(optimizer='adam',
             loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
             metrics = ['sparse_categorical_accuracy'])
#-------------------- the new part --------------------#
checkpoint_save_path = 'D:/tensorflow笔记/class4/MNIST_FC/mnist.ckpt'  # saving a ckpt also writes a matching .index file
if os.path.exists(checkpoint_save_path + '.index'):
    print('-----------load model----------')
    model.load_weights(checkpoint_save_path)

cp_callback = tf.keras.callbacks.ModelCheckpoint(
    filepath = checkpoint_save_path,
    save_weights_only = True,
    save_best_only= True)
history = model.fit(x_train,y_train,batch_size = 32,epochs = 5,
                    validation_data=(x_test,y_test),validation_freq=1,
                   callbacks = [cp_callback])

model.summary()

Extracting and printing the parameters

# 4. Extract the parameters and write them to a text file
# model.trainable_variables returns the model's trainable parameters
# Print formatting: np.set_printoptions(threshold=element count beyond which output is elided)
# Append the following to the checkpointing code above
np.set_printoptions(threshold=np.inf)  # np.inf means never elide, so the printout has no '...'
print(model.trainable_variables)
file = open('./weights.txt','w')
for v in model.trainable_variables:
    file.write(str(v.name) + '\n')
    file.write(str(v.shape) + '\n')
    file.write(str(v.numpy())+'\n')
file.close()


Visualizing acc/loss

# 5. acc/loss visualization
history = model.fit(...)
loss = history.history['loss']                                # training loss
val_loss = history.history['val_loss']                        # validation loss
acc = history.history['sparse_categorical_accuracy']          # training accuracy
val_acc = history.history['val_sparse_categorical_accuracy']  # validation accuracy
plt.subplot(1,2,1)
plt.plot(acc,label='training acc')
plt.plot(val_acc,label='validation acc')
plt.title('acc')
plt.legend()

plt.subplot(1,2,2)
plt.plot(loss,label='training loss')
plt.plot(val_loss,label='validation loss')
plt.title('loss')
plt.legend()
plt.show()

Application: recognizing the digit in a real image

# 6. The application: feed in an image and get the digit it shows
result = model.predict(input_features,batch_size=batch_size)

# Rebuild the model structure
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128,activation='relu'),
    tf.keras.layers.Dense(10,activation='softmax')
])
# Load the saved weights
model.load_weights(model_save_path)
# Predict
prenum = int(input('input the number of test picture:'))
for i in range(prenum):
    image_path = input('the path of the picture:')
    img = Image.open(image_path)
    img = img.resize((28,28),Image.ANTIALIAS)
    img_arr = np.array(img.convert('L'))
    
    img_arr = 255-img_arr  # invert white background/black digit to black background/white digit
    img_arr = img_arr/255.0
    x_predict = img_arr[tf.newaxis,...]
    result = model.predict(x_predict)
    pred = tf.argmax(result,axis=1)
    print('\n')
    tf.print(pred)
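The preprocessing inside the loop can be sketched without PIL or TensorFlow; the array below stands in for the loaded 28x28 grayscale image (the stroke position is made up). The key steps are inverting white-background/black-digit into the black-background/white-digit convention MNIST was trained on, scaling to [0,1], and adding the batch dimension predict() expects:

```python
import numpy as np

# Stand-in for np.array(img.convert('L')): a white page with a dark stroke
img_arr = np.full((28, 28), 255, dtype=np.uint8)
img_arr[8:20, 12:16] = 0                        # hypothetical pen stroke

img_arr = 255 - img_arr                         # invert to black background, white stroke
x_predict = (img_arr / 255.0)[np.newaxis, ...]  # scale to [0,1], add batch dim
print(x_predict.shape)                          # (1, 28, 28)
```
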

