About Nesterov Accelerated Gradient
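The example below turns on Nesterov Accelerated Gradient (NAG) simply by passing nesterov=True to Keras's SGD optimizer. Conceptually, NAG evaluates the gradient at a "look-ahead" point instead of at the current weights, which usually damps oscillations compared with plain momentum. The sketch below shows one update step in roughly the form used by Keras 2.x SGD; the names w, velocity, grad_fn, lr, and momentum are illustrative only, not part of the Keras API.

import numpy as np

def nag_step(w, velocity, grad_fn, lr=0.01, momentum=0.9):
    # Illustrative sketch of a Nesterov-momentum update (Keras 2.x-style SGD):
    # grad_fn(w) returns the gradient of the loss at the current weights w.
    g = grad_fn(w)
    velocity = momentum * velocity - lr * g   # momentum-smoothed step
    w = w + momentum * velocity - lr * g      # extra "look-ahead" correction vs. plain momentum
    return w, velocity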
Softmax multi-class classification with a multilayer perceptron (MLP)
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.optimizers import SGD
import keras
import numpy as np

# Generate dummy data: 20-dimensional features with random labels from 10 classes (one-hot encoded)
x_train = np.random.random((1000, 20))
y_train = keras.utils.to_categorical(np.random.randint(10, size=(1000, 1)), num_classes=10)
x_test = np.random.random((100, 20))
y_test = keras.utils.to_categorical(np.random.randint(10, size=(100, 1)), num_classes=10)

# Two hidden layers of 64 ReLU units, each followed by 50% dropout,
# and a 10-way softmax output layer
model = Sequential()
model.add(Dense(64, activation='relu', input_dim=20))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))

# SGD with Nesterov momentum and a small learning-rate decay
sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

model.fit(x_train, y_train, epochs=20, batch_size=128)
score = model.evaluate(x_test, y_test, batch_size=128)
print("score: ", score)
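A note on the decay argument above: in Keras 2.x, SGD shrinks the effective learning rate as training progresses, roughly lr_t = lr / (1 + decay * iterations), where iterations counts update steps. A tiny sketch of that schedule (illustrative only, not Keras code):

lr0, decay = 0.01, 1e-6
effective_lr = lambda iterations: lr0 / (1.0 + decay * iterations)
print(effective_lr(0), effective_lr(10000))  # decay is tiny here, so the learning rate barely changes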
Run output:
Using TensorFlow backend.
Epoch 1/20
2019-02-28 10:37:16.147684: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX AVX2
128/1000 [==>...........................] - ETA: 0s - loss: 2.4224 - acc: 0.0859
1000/1000 [==============================] - 0s 149us/step - loss: 2.3773 - acc: 0.1000
Epoch 2/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3494 - acc: 0.0703
1000/1000 [==============================] - 0s 11us/step - loss: 2.3454 - acc: 0.0870
Epoch 3/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3077 - acc: 0.1094
1000/1000 [==============================] - 0s 11us/step - loss: 2.3437 - acc: 0.0870
Epoch 4/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3307 - acc: 0.0938
1000/1000 [==============================] - 0s 11us/step - loss: 2.3364 - acc: 0.0980
Epoch 5/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3100 - acc: 0.1094
1000/1000 [==============================] - 0s 10us/step - loss: 2.3171 - acc: 0.1100
Epoch 6/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3120 - acc: 0.1172
1000/1000 [==============================] - 0s 11us/step - loss: 2.3188 - acc: 0.1040
Epoch 7/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3345 - acc: 0.0859
1000/1000 [==============================] - 0s 12us/step - loss: 2.3167 - acc: 0.1030
Epoch 8/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3246 - acc: 0.0625
1000/1000 [==============================] - 0s 11us/step - loss: 2.3150 - acc: 0.0970
Epoch 9/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3137 - acc: 0.1562
1000/1000 [==============================] - 0s 11us/step - loss: 2.3083 - acc: 0.1070
Epoch 10/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.2797 - acc: 0.0781
1000/1000 [==============================] - 0s 12us/step - loss: 2.3075 - acc: 0.1090
Epoch 11/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.3197 - acc: 0.1016
1000/1000 [==============================] - 0s 11us/step - loss: 2.3028 - acc: 0.1030
Epoch 12/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.2950 - acc: 0.1250
1000/1000 [==============================] - 0s 11us/step - loss: 2.2958 - acc: 0.1240
Epoch 13/20
128/1000 [==>...........................] - ETA: 0s - loss: 2.2962 - acc: 0.1172
1000/1000 [==============================] - 0s 11us/step - loss: 2.3070 - acc: 0.1080
Epoch 14/20
... (remaining output truncated)
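As expected for randomly generated labels over 10 classes, the accuracy hovers around chance level (about 0.10) throughout training; the example demonstrates the model definition and training API rather than a learnable task.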