TensorFlow 2.0 Tutorial 7: Building Neural Network Models with the Keras Model Class

Keras models are represented as classes, and we can define our own model by subclassing the Python class tf.keras.Model. In the subclass we need to override two methods: __init__() (the constructor, for initialization) and call(input) (the forward pass), and we can also add custom methods as needed. When the model is relatively complex, this subclassing approach is the recommended way to build a network.

General structure of a model definition

class MyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()     # under Python 2, use super(MyModel, self).__init__()
        # Add initialization code here (including the layers used in call), e.g.
        # self.layer1 = tf.keras.layers.BuiltInLayer(...)
        # self.layer2 = MyCustomLayer(...)

    def call(self, input):
        # Add the forward-pass code here (process the input and return the output), e.g.
        # x = self.layer1(input)
        # output = self.layer2(x)
        return output

    # Custom methods can also be added
    def fun(self):
        pass
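Filling in the template above, a minimal working subclass might look like the following sketch (the layer sizes and attribute names here are illustrative choices, not from the original):

```python
import tensorflow as tf

class MLP(tf.keras.Model):
    def __init__(self):
        super().__init__()
        # layers are created in __init__ so their weights are tracked by the model
        self.hidden = tf.keras.layers.Dense(units=8, activation='relu')
        self.out_layer = tf.keras.layers.Dense(units=1)

    def call(self, inputs):
        x = self.hidden(inputs)
        return self.out_layer(x)

model = MLP()
pred = model(tf.ones((4, 3)))   # batch of 4 samples with 3 features each
```

Because the layers are assigned as attributes in __init__, Keras automatically tracks their variables, so model.trainable_variables, fit(), and save() all work without extra bookkeeping.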
import tensorflow as tf

x = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = tf.constant([5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
x.shape
TensorShape([6])
y.shape
TensorShape([6])

1. Building the Model

class Linear(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(
            units=1,
            activation=None,
        )

    def call(self, inputs):
        output = self.dense(inputs)
        return output
model = Linear()
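A subclassed model can also be called directly like a function to run a forward pass, which is handy for debugging before training. One caveat worth noting: Dense expects inputs with at least a batch and a feature dimension, so a 1-D tensor like the x above needs reshaping for a direct call. A minimal sketch (repeating the Linear class from above so it is self-contained):

```python
import tensorflow as tf

class Linear(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(units=1, activation=None)

    def call(self, inputs):
        return self.dense(inputs)

model = Linear()
# Dense expects a feature dimension, so reshape (6,) -> (6, 1)
x = tf.reshape(tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0]), (-1, 1))
pred = model(x)   # forward pass with randomly initialized weights
```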

2. Configuring the Model

# Define the training parameters
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.05),  # specify the optimizer
    loss='mse',                                             # specify the loss function
)
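The string 'mse' is shorthand that Keras resolves to the mean-squared-error loss; passing explicit optimizer and loss objects is equivalent and makes the configuration easier to tune. A sketch using a plain Dense layer in a Sequential model (the reshape to (6, 1) is our addition so the input has an explicit feature dimension):

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(units=1)])
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.05),
    loss=tf.keras.losses.MeanSquaredError(),  # equivalent to loss='mse'
)

x = tf.reshape(tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0]), (-1, 1))
y = tf.reshape(tf.constant([5.0, 6.0, 7.0, 8.0, 9.0, 10.0]), (-1, 1))
history = model.fit(x, y, epochs=2, verbose=0)  # loss should already be falling
```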

3. Training the Model

%%time
model.fit(x, y, epochs=100, verbose=2)
Train on 6 samples
Epoch 1/100
6/6 - 0s - loss: 148.4280
Epoch 2/100
6/6 - 0s - loss: 54.4314
Epoch 3/100
6/6 - 0s - loss: 20.7244
Epoch 4/100
6/6 - 0s - loss: 8.6092
Epoch 5/100
6/6 - 0s - loss: 4.2277
...
Epoch 98/100
6/6 - 0s - loss: 0.0593
Epoch 99/100
6/6 - 0s - loss: 0.0571
Epoch 100/100
6/6 - 0s - loss: 0.0551
Wall time: 436 ms
<tensorflow.python.keras.callbacks.History at 0x10d235f52b0>
model.summary()
Model: "linear"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                multiple                  2         
=================================================================
Total params: 2
Trainable params: 2
Non-trainable params: 0
_________________________________________________________________
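The 2 parameters reported by summary() are the single kernel weight w and the bias b of the Dense(units=1) layer, i.e. the model computes y = wx + b. A quick sketch confirming the count (building the layer standalone is our illustration, not part of the tutorial's flow):

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(units=1)
layer.build(input_shape=(None, 1))  # one input feature: kernel (1, 1) plus bias (1,)
n_params = sum(int(tf.size(v)) for v in layer.trainable_variables)  # 1 + 1 = 2
```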

4. Evaluating the Model

model.evaluate(x, y, verbose=0)
0.06549862772226334

5. Making Predictions

model.predict(x)
array([[ 4.5528536],
       [ 5.689129 ],
       [ 6.825404 ],
       [ 7.9616795],
       [ 9.097955 ],
       [10.23423  ]], dtype=float32)
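The predictions are close to the true relation behind the training data, y = x + 4 (each y equals its x plus 4). As a sanity check, the exact least-squares solution the layer should converge to with more training can be computed directly with NumPy (this check is our addition, not part of the original tutorial):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([5.0, 6.0, 7.0, 8.0, 9.0, 10.0])

# design matrix with a constant column for the intercept
A = np.stack([x, np.ones_like(x)], axis=1)
(w, b), *_ = np.linalg.lstsq(A, y, rcond=None)
# w -> 1.0 (slope) and b -> 4.0 (intercept), matching y = x + 4
```

After 100 epochs the fitted layer is still slightly off this optimum (predictions range from about 4.55 to 10.23 rather than 5 to 10), which is consistent with the final loss of about 0.055.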