Official Example Explained 4.2 (antirectifier.py) - Keras Study Notes 4

antirectifier.py

This example replaces the ReLU activation function with an Antirectifier layer.

Keras Examples Index

Annotated code

'''The example demonstrates how to write custom layers for Keras.

We build a custom activation layer called 'Antirectifier',
which modifies the shape of the tensor that passes through it.
We need to specify two methods: `compute_output_shape` and `call`.

Note that the same result can also be achieved via a Lambda layer.

Because our custom layer is written with primitives from the Keras
backend (`K`), our code can run both on TensorFlow and Theano.
'''

from __future__ import print_function
import keras
from keras.models import Sequential
from keras import layers
from keras.datasets import mnist
from keras import backend as K


class Antirectifier(layers.Layer):
    '''This is the combination of a sample-wise
    L2 normalization with the concatenation of the
    positive part of the input with the negative part
    of the input. The result is a tensor of samples that are
    twice as large as the input samples.

    It can be used in place of a ReLU.

    # Input shape
        2D tensor of shape (samples, n)

    # Output shape
        2D tensor of shape (samples, 2*n)

    # Theoretical justification
        When applying ReLU, assuming that the distribution
        of the previous output is approximately centered around 0.,
        you are discarding half of your input. This is inefficient.

        Antirectifier returns all-positive outputs like ReLU,
        but without discarding any data.

        Tests on MNIST show that Antirectifier allows training networks
        with half as many parameters, yet with comparable
        classification accuracy, as an equivalent ReLU-based network.
    '''

    def compute_output_shape(self, input_shape):
        shape = list(input_shape)
        assert len(shape) == 2  # only valid for 2D tensors
        shape[-1] *= 2  # the feature dimension doubles after concatenation
        return tuple(shape)

    def call(self, inputs):
        inputs -= K.mean(inputs, axis=1, keepdims=True)  # center each sample around 0
        inputs = K.l2_normalize(inputs, axis=1)  # sample-wise L2 normalization
        pos = K.relu(inputs)  # positive part of the input
        neg = K.relu(-inputs)  # negative part of the input, sign-flipped to be positive
        return K.concatenate([pos, neg], axis=1)  # shape (samples, 2*n)
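
As the module docstring notes, the same result can also be achieved via a Lambda layer instead of a Layer subclass. Below is a minimal sketch of that alternative, reusing the imports already at the top of the script; the names antirectifier and antirectifier_output_shape are illustrative and not part of the original script:

def antirectifier(x):
    # Same computation as Antirectifier.call: center, L2-normalize,
    # then concatenate the positive and negative parts.
    x -= K.mean(x, axis=1, keepdims=True)
    x = K.l2_normalize(x, axis=1)
    return K.concatenate([K.relu(x), K.relu(-x)], axis=1)

def antirectifier_output_shape(input_shape):
    shape = list(input_shape)
    shape[-1] *= 2  # feature dimension doubles
    return tuple(shape)

# Drop-in replacement for Antirectifier() in the model below; output_shape
# is provided so backends that cannot infer shapes (e.g. Theano) still work.
antirectifier_lambda = layers.Lambda(antirectifier,
                                     output_shape=antirectifier_output_shape)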

# global parameters
batch_size = 128
num_classes = 10
epochs = 40

# the data, shuffled and split between train and test sets
(x_train, y_train), (x_test, y_test) = mnist.load_data()

x_train = x_train.reshape(60000, 784)
x_test = x_test.reshape(10000, 784)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
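
For reference, keras.utils.to_categorical turns a vector of integer class labels into a binary class matrix with one-hot rows. A tiny standalone check (the example labels are illustrative):

import numpy as np
from keras.utils import to_categorical

labels = np.array([0, 2, 1])  # integer class labels
print(to_categorical(labels, num_classes=3))
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]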

# build the model
model = Sequential()
model.add(layers.Dense(256, input_shape=(784,)))
model.add(Antirectifier())
model.add(layers.Dropout(0.1))
model.add(layers.Dense(256))
model.add(Antirectifier())
model.add(layers.Dropout(0.1))
model.add(layers.Dense(num_classes))
model.add(layers.Activation('softmax'))
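
Not part of the original script, but a quick way to check the docstring's parameter-count claim is model.summary(), which prints per-layer output shapes and parameter counts. Note that Antirectifier itself adds no parameters while doubling the width fed to the next layer:

model.summary()  # e.g. the first Dense(256) on 784 inputs contributes 784*256 + 256 = 200,960 parameters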

# compile the model
model.compile(loss='categorical_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

# train the model
model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          verbose=1,
          validation_data=(x_test, y_test))
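
The script relies on validation_data for test-set metrics during training; an explicit final evaluation is not in the original, but a minimal sketch would be:

score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])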

# next, compare with an equivalent network
# with 2x bigger Dense layers and ReLU (see the sketch below)
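
The script stops at that comment and leaves the baseline to the reader. Below is a minimal sketch of such a comparison model, reusing the variables defined above and assuming "2x bigger" means 512-unit Dense layers; the name baseline is illustrative:

# Hypothetical ReLU baseline with 512-unit Dense layers (2x 256):
baseline = Sequential()
baseline.add(layers.Dense(512, activation='relu', input_shape=(784,)))
baseline.add(layers.Dropout(0.1))
baseline.add(layers.Dense(512, activation='relu'))
baseline.add(layers.Dropout(0.1))
baseline.add(layers.Dense(num_classes, activation='softmax'))
baseline.compile(loss='categorical_crossentropy',
                 optimizer='rmsprop',
                 metrics=['accuracy'])
baseline.fit(x_train, y_train,
             batch_size=batch_size,
             epochs=epochs,
             verbose=1,
             validation_data=(x_test, y_test))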

Running the code

C:\ProgramData\Anaconda3\python.exe E:/keras-master/examples/antirectifier.py
Using TensorFlow backend.
60000 train samples
10000 test samples

Instructions for updating:
dim is deprecated, use axis instead
Train on 60000 samples, validate on 10000 samples
Epoch 1/40

  128/60000 [..............................] - ETA: 3:49 - loss: 2.2937 - acc: 0.1406
  512/60000 [..............................] - ETA: 1:02 - loss: 2.1965 - acc: 0.3789
 1024/60000 [..............................] - ETA: 34s - loss: 2.0866 - acc: 0.5488
...
60000/60000 [==============================] - 9s 148us/step - loss: 0.6021 - acc: 0.9137 - val_loss: 0.1515 - val_acc: 0.9613
Epoch 2/40

  128/60000 [..............................] - ETA: 8s - loss: 0.1386 - acc: 0.9609
  640/60000 [..............................] - ETA: 7s - loss: 0.1699 - acc: 0.9500
 1152/60000 [..............................] - ETA: 7s - loss: 0.1501 - acc: 0.9592
...
59264/60000 [============================>.] - ETA: 0s - loss: 0.0045 - acc: 0.9985
59776/60000 [============================>.] - ETA: 0s - loss: 0.0045 - acc: 0.9985
60000/60000 [==============================] - 8s 136us/step - loss: 0.0045 - acc: 0.9985 - val_loss: 0.0887 - val_acc: 0.9824

Process finished with exit code 0

Detailed introduction to Keras

English: https://keras.io/

Chinese: http://keras-cn.readthedocs.io/en/latest/

Example downloads

https://github.com/keras-team/keras

https://github.com/keras-team/keras/tree/master/examples

Full project download

For readers without download credits, add QQ 452205574 to get access to the shared folder.

It includes: the code, datasets (images), generated models, library installation files, and more.
