Implementing a Multilayer Perceptron on the Pima Indians Diabetes Dataset with Keras

Machine learning is one approach to achieving artificial intelligence, and deep learning is one technique for implementing machine learning. Large volumes of data supply the raw material for deep learning; with sufficient data as a foundation, deep learning techniques can produce smarter algorithms, which has driven the field's rapid growth.

Many deep learning platforms and libraries are available today; here we use Keras in Python to introduce deep learning. Unlike R, Python is a language that can be used both for model development and in production. Compared with Java, Python offers scientific-computing libraries such as SciPy and dedicated machine learning libraries such as scikit-learn.

Python hosts several popular platforms for building deep learning models, such as TensorFlow, which originated at Google, and CNTK, open-sourced by Microsoft. Keras provides a concise, easy-to-use API on top of both platforms for developing deep learning models. This article shows how to create a model with Keras and bring it into a real project.

Software Environment and Prerequisites

The environment is based on Python 3.6.1 with the following libraries installed: SciPy, NumPy, Matplotlib, Pandas, and scikit-learn. For installation instructions, see each library's official website. The deep learning libraries TensorFlow, CNTK, and Keras are also required.

Machine Learning

A grounding in machine learning helps a great deal with deep learning — for example, implementing cross-validation with scikit-learn. It is not a prerequisite, but it will make the learning process easier.
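The cross-validation mentioned here can be illustrated without any framework. The sketch below shows the k-fold index splitting that scikit-learn's KFold performs; the helper name `kfold_indices` is hypothetical, not a scikit-learn API:

```python
import numpy as np

def kfold_indices(n_samples, k):
    """Yield (train_idx, test_idx) pairs for k consecutive folds."""
    folds = np.array_split(np.arange(n_samples), k)
    for i in range(k):
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, folds[i]

splits = list(kfold_indices(10, 5))
print(len(splits))   # 5 train/test splits
print(splits[0][1])  # first test fold: [0 1]
```

Each sample appears in exactly one test fold, so every fold's evaluation score uses data the model never trained on.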

Deep Learning

Practicing deep learning does not require mastering the underlying algorithms and their mathematical foundations, but that knowledge speeds up understanding of the algorithms.

The path runs from an initial interest in deep learning, to commanding the deep learning resources available in Python, to applying deep learning in real projects:

  1. Learn how to build and evaluate a neural network model
  2. Learn how to build a deep learning model using more advanced techniques
  3. Learn how to build models for image or text data
  4. Learn how to improve model performance

The Deep Learning Ecosystem

  • CNTK (an open-source deep learning toolkit from Microsoft)
  • TensorFlow (a fast numerical computation library for Python, open-sourced by Google)
  • Keras (a Python library built on top of CNTK and TensorFlow for quickly creating deep learning models)

Configuring the Keras Backend

Keras is a lightweight API. It does not itself implement the mathematical operations deep learning requires; instead, it provides a consistent interface to efficient numerical computation libraries, called backends. If both CNTK and TensorFlow are installed, Keras can be configured to use either one. The simplest way to configure the backend is to add or edit the Keras configuration file in your home directory, ~/.keras/keras.json. By default, Keras uses TensorFlow as its backend.
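Assuming a default installation, a minimal ~/.keras/keras.json selecting TensorFlow might look like the following (change "backend" to "cntk" to switch backends; the other fields shown are the usual defaults):

```json
{
    "floatx": "float32",
    "epsilon": 1e-07,
    "backend": "tensorflow",
    "image_data_format": "channels_last"
}
```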

Building a Deep Learning Model with Keras

When using Keras in a deep learning project, you can focus your effort on constructing the model. The Sequential model, a linear stack of network layers, is one of the most common model types in deep learning. When creating a Sequential model, you add layers in the order in which you want the computation to run. Once the definition is complete, you compile the model, which lets the underlying framework optimize its computation. At compile time you can specify the loss function and the optimizer used to fit the model.

Once the model is compiled, it can be trained on a dataset. Training can process the data one batch at a time (when the dataset is large, it is split into several batches and fed in over multiple passes) or run over the entire dataset at once. Training is where all the computation happens. After training completes, the model can be used to make predictions on new data.
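The batch-wise iteration described above can be sketched with plain NumPy. Keras performs this internally when you pass batch_size to fit; the helper below is illustrative only:

```python
import numpy as np

def iterate_minibatches(x, y, batch_size):
    """Yield successive (x, y) batches covering the dataset once (one epoch)."""
    for start in range(0, len(x), batch_size):
        yield x[start:start + batch_size], y[start:start + batch_size]

x = np.zeros((768, 8))  # same shape as the Pima dataset inputs
y = np.zeros(768)

batches = list(iterate_minibatches(x, y, batch_size=10))
print(len(batches))          # 77 batches per epoch: ceil(768 / 10)
print(batches[-1][0].shape)  # the final batch holds the 8 leftover rows: (8, 8)
```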

The steps for building a deep learning model with Keras are as follows:

  1. Define the model: create a Sequential model and add configured layers
  2. Compile the model: specify the loss function and optimizer, then call the model's compile function
  3. Train the model: call the model's fit function
  4. Make predictions: call functions such as evaluate or predict on new data

Cloud GPU Computing

A Multilayer Perceptron Example

All attributes in the Pima Indians diabetes dataset are numeric, so they can be used directly as the inputs and output of a neural network.

A partial sample of the data follows (the left column is the row index):

  1 6,148,72,35,0,33.6,0.627,50,1
  2 1,85,66,29,0,26.6,0.351,31,0
  3 8,183,64,0,0,23.3,0.672,32,1
  4 1,89,66,23,94,28.1,0.167,21,0
  5 0,137,40,35,168,43.1,2.288,33,1
  6 5,116,74,0,0,25.6,0.201,30,0
  7 3,78,50,32,88,31,0.248,26,1
  8 10,115,0,0,0,35.3,0.134,29,0
  9 2,197,70,45,543,30.5,0.158,53,1
 10 8,125,96,0,0,0,0.232,54,1
 11 4,110,92,0,0,37.6,0.191,30,0
 12 10,168,74,0,0,38,0.537,34,1
 13 10,139,80,0,0,27.1,1.441,57,0
 14 1,189,60,23,846,30.1,0.398,59,1
 15 5,166,72,19,175,25.8,0.587,51,1
 16 7,100,0,0,0,30,0.484,32,1
 17 0,118,84,47,230,45.8,0.551,31,1
 18 7,107,74,0,0,29.6,0.254,31,1
 19 1,103,30,38,83,43.3,0.183,33,0
 20 1,115,70,30,96,34.6,0.529,32,1
 21 3,126,88,41,235,39.3,0.704,27,0
 22 8,99,84,0,0,35.4,0.388,50,0
 23 7,196,90,0,0,39.8,0.451,41,1
 24 9,119,80,35,0,29,0.263,29,1
 25 11,143,94,33,146,36.6,0.254,51,1
 26 10,125,70,26,115,31.1,0.205,41,1
 27 7,147,76,0,0,39.4,0.257,43,1
 28 1,97,66,15,140,23.2,0.487,22,0
 29 13,145,82,19,110,22.2,0.245,57,0
 30 5,117,92,0,0,34.1,0.337,38,0
 31 5,109,75,26,0,36,0.546,60,0
 32 3,158,76,36,245,31.6,0.851,28,1
 33 3,88,58,11,54,24.8,0.267,22,0
 34 6,92,92,0,0,19.9,0.188,28,0
 35 10,122,78,31,0,27.6,0.512,45,0
 36 4,103,60,33,192,24,0.966,33,0
 37 11,138,76,0,0,33.2,0.42,35,0
 38 9,102,76,37,0,32.9,0.665,46,1
 39 2,90,68,42,0,38.2,0.503,27,1
 40 4,111,72,47,207,37.1,1.39,56,1
 41 3,180,64,25,70,34,0.271,26,0
 42 7,133,84,0,0,40.2,0.696,37,0
 43 7,106,92,18,0,22.7,0.235,48,0
 44 9,171,110,24,240,45.4,0.721,54,1
 45 7,159,64,0,0,27.4,0.294,40,0
 46 0,180,66,39,0,42,1.893,25,1
 47 1,146,56,0,0,29.7,0.564,29,0
 48 2,71,70,27,0,28,0.586,22,0
 49 7,103,66,32,0,39.1,0.344,31,1
 50 7,105,0,0,0,0,0.305,24,0
 51 1,103,80,11,82,19.4,0.491,22,0
 52 1,101,50,15,36,24.2,0.526,26,0
 53 5,88,66,21,23,24.4,0.342,30,0
 54 8,176,90,34,300,33.7,0.467,58,1
 55 7,150,66,42,342,34.7,0.718,42,0
 56 1,73,50,10,0,23,0.248,21,0
 57 7,187,68,39,304,37.7,0.254,41,1
 58 0,100,88,60,110,46.8,0.962,31,0
 59 0,146,82,0,0,40.5,1.781,44,0
 60 0,105,64,41,142,41.5,0.173,22,0
 61 2,84,0,0,0,0,0.304,21,0
 62 8,133,72,0,0,32.9,0.27,39,1
 63 5,44,62,0,0,25,0.587,36,0
 64 2,141,58,34,128,25.4,0.699,24,0
 65 7,114,66,0,0,32.8,0.258,42,1
 66 5,99,74,27,0,29,0.203,32,0
 67 0,109,88,30,0,32.5,0.855,38,1
 68 2,109,92,0,0,42.7,0.845,54,0
 69 1,95,66,13,38,19.6,0.334,25,0
 70 4,146,85,27,100,28.9,0.189,27,0
 71 2,100,66,20,90,32.9,0.867,28,1
 72 5,139,64,35,140,28.6,0.411,26,0
 73 13,126,90,0,0,43.4,0.583,42,1
 74 4,129,86,20,270,35.1,0.231,23,0
 75 1,79,75,30,0,32,0.396,22,0
 76 1,0,48,20,0,24.7,0.14,22,0
 77 7,62,78,0,0,32.6,0.391,41,0
 78 5,95,72,33,0,37.7,0.37,27,0
 79 0,131,0,0,0,43.2,0.27,26,1
 80 2,112,66,22,0,25,0.307,24,0
 81 3,113,44,13,0,22.4,0.14,22,0
 82 2,74,0,0,0,0,0.102,22,0
 83 7,83,78,26,71,29.3,0.767,36,0
 84 0,101,65,28,0,24.6,0.237,22,0
 85 5,137,108,0,0,48.8,0.227,37,1
 86 2,110,74,29,125,32.4,0.698,27,0
 87 13,106,72,54,0,36.6,0.178,45,0
 88 2,100,68,25,71,38.5,0.324,26,0
 89 15,136,70,32,110,37.1,0.153,43,1
 90 1,107,68,19,0,26.5,0.165,24,0
 91 1,80,55,0,0,19.1,0.258,21,0
 92 4,123,80,15,176,32,0.443,34,0
 93 7,81,78,40,48,46.7,0.261,42,0
 94 4,134,72,0,0,23.8,0.277,60,1
 95 2,142,82,18,64,24.7,0.761,21,0
 96 6,144,72,27,228,33.9,0.255,40,0
 97 2,92,62,28,0,31.6,0.13,24,0
 98 1,71,48,18,76,20.4,0.323,22,0
 99 6,93,50,30,64,28.7,0.356,23,0
100 1,122,90,51,220,49.7,0.325,31,1
101 1,163,72,0,0,39,1.222,33,1
102 1,151,60,0,0,26.1,0.179,22,0
103 0,125,96,0,0,22.5,0.262,21,0
104 1,81,72,18,40,26.6,0.283,24,0
105 2,85,65,0,0,39.6,0.93,27,0
106 1,126,56,29,152,28.7,0.801,21,0
107 1,96,122,0,0,22.4,0.207,27,0
108 4,144,58,28,140,29.5,0.287,37,0
109 3,83,58,31,18,34.3,0.336,25,0
110 0,95,85,25,36,37.4,0.247,24,1
111 3,171,72,33,135,33.3,0.199,24,1
112 8,155,62,26,495,34,0.543,46,1
113 1,89,76,34,37,31.2,0.192,23,0
114 4,76,62,0,0,34,0.391,25,0
115 7,160,54,32,175,30.5,0.588,39,1
116 4,146,92,0,0,31.2,0.539,61,1
117 5,124,74,0,0,34,0.22,38,1
118 5,78,48,0,0,33.7,0.654,25,0
119 4,97,60,23,0,28.2,0.443,22,0
120 4,99,76,15,51,23.2,0.223,21,0
121 0,162,76,56,100,53.2,0.759,25,1
122 6,111,64,39,0,34.2,0.26,24,0
123 2,107,74,30,100,33.6,0.404,23,0
124 5,132,80,0,0,26.8,0.186,69,0
125 0,113,76,0,0,33.3,0.278,23,1
126 1,88,30,42,99,55,0.496,26,1
127 3,120,70,30,135,42.9,0.452,30,0
128 1,118,58,36,94,33.3,0.261,23,0
129 1,117,88,24,145,34.5,0.403,40,1
130 0,105,84,0,0,27.9,0.741,62,1
131 4,173,70,14,168,29.7,0.361,33,1
132 9,122,56,0,0,33.3,1.114,33,1
133 3,170,64,37,225,34.5,0.356,30,1
134 8,84,74,31,0,38.3,0.457,39,0
135 2,96,68,13,49,21.1,0.647,26,0
136 2,125,60,20,140,33.8,0.088,31,0
137 0,100,70,26,50,30.8,0.597,21,0
138 0,93,60,25,92,28.7,0.532,22,0
139 0,129,80,0,0,31.2,0.703,29,0
140 5,105,72,29,325,36.9,0.159,28,0
141 3,128,78,0,0,21.1,0.268,55,0
142 5,106,82,30,0,39.5,0.286,38,0
143 2,108,52,26,63,32.5,0.318,22,0
144 10,108,66,0,0,32.4,0.272,42,1
145 4,154,62,31,284,32.8,0.237,23,0
146 0,102,75,23,0,0,0.572,21,0
147 9,57,80,37,0,32.8,0.096,41,0
148 2,106,64,35,119,30.5,1.4,34,0
149 5,147,78,0,0,33.7,0.218,65,0
150 2,90,70,17,0,27.3,0.085,22,0
151 1,136,74,50,204,37.4,0.399,24,0
152 4,114,65,0,0,21.9,0.432,37,0
153 9,156,86,28,155,34.3,1.189,42,1
154 1,153,82,42,485,40.6,0.687,23,0
155 8,188,78,0,0,47.9,0.137,43,1
156 7,152,88,44,0,50,0.337,36,1
157 2,99,52,15,94,24.6,0.637,21,0
158 1,109,56,21,135,25.2,0.833,23,0
159 2,88,74,19,53,29,0.229,22,0
160 17,163,72,41,114,40.9,0.817,47,1
161 4,151,90,38,0,29.7,0.294,36,0
162 7,102,74,40,105,37.2,0.204,45,0
163 0,114,80,34,285,44.2,0.167,27,0
164 2,100,64,23,0,29.7,0.368,21,0
165 0,131,88,0,0,31.6,0.743,32,1
166 6,104,74,18,156,29.9,0.722,41,1
167 3,148,66,25,0,32.5,0.256,22,0
168 4,120,68,0,0,29.6,0.709,34,0
169 4,110,66,0,0,31.9,0.471,29,0
170 3,111,90,12,78,28.4,0.495,29,0
171 6,102,82,0,0,30.8,0.18,36,1
172 6,134,70,23,130,35.4,0.542,29,1
173 2,87,0,23,0,28.9,0.773,25,0
174 1,79,60,42,48,43.5,0.678,23,0
175 2,75,64,24,55,29.7,0.37,33,0
176 8,179,72,42,130,32.7,0.719,36,1
177 6,85,78,0,0,31.2,0.382,42,0
178 0,129,110,46,130,67.1,0.319,26,1
179 5,143,78,0,0,45,0.19,47,0
180 5,130,82,0,0,39.1,0.956,37,1
181 6,87,80,0,0,23.2,0.084,32,0
182 0,119,64,18,92,34.9,0.725,23,0
183 1,0,74,20,23,27.7,0.299,21,0
184 5,73,60,0,0,26.8,0.268,27,0
185 4,141,74,0,0,27.6,0.244,40,0
186 7,194,68,28,0,35.9,0.745,41,1
187 8,181,68,36,495,30.1,0.615,60,1
188 1,128,98,41,58,32,1.321,33,1
189 8,109,76,39,114,27.9,0.64,31,1
190 5,139,80,35,160,31.6,0.361,25,1
191 3,111,62,0,0,22.6,0.142,21,0
192 9,123,70,44,94,33.1,0.374,40,0
193 7,159,66,0,0,30.4,0.383,36,1
194 11,135,0,0,0,52.3,0.578,40,1
195 8,85,55,20,0,24.4,0.136,42,0
196 5,158,84,41,210,39.4,0.395,29,1
197 1,105,58,0,0,24.3,0.187,21,0
198 3,107,62,13,48,22.9,0.678,23,1
199 4,109,64,44,99,34.8,0.905,26,1
200 4,148,60,27,318,30.9,0.15,29,1
201 0,113,80,16,0,31,0.874,21,0
202 1,138,82,0,0,40.1,0.236,28,0
203 0,108,68,20,0,27.3,0.787,32,0
204 2,99,70,16,44,20.4,0.235,27,0
205 6,103,72,32,190,37.7,0.324,55,0
206 5,111,72,28,0,23.9,0.407,27,0
207 8,196,76,29,280,37.5,0.605,57,1
208 5,162,104,0,0,37.7,0.151,52,1
209 1,96,64,27,87,33.2,0.289,21,0
210 7,184,84,33,0,35.5,0.355,41,1
211 2,81,60,22,0,27.7,0.29,25,0
212 0,147,85,54,0,42.8,0.375,24,0
213 7,179,95,31,0,34.2,0.164,60,0
214 0,140,65,26,130,42.6,0.431,24,1
215 9,112,82,32,175,34.2,0.26,36,1
216 12,151,70,40,271,41.8,0.742,38,1
217 5,109,62,41,129,35.8,0.514,25,1
218 6,125,68,30,120,30,0.464,32,0
219 5,85,74,22,0,29,1.224,32,1
220 5,112,66,0,0,37.8,0.261,41,1
221 0,177,60,29,478,34.6,1.072,21,1
222 2,158,90,0,0,31.6,0.805,66,1
223 7,119,0,0,0,25.2,0.209,37,0
224 7,142,60,33,190,28.8,0.687,61,0
225 1,100,66,15,56,23.6,0.666,26,0
226 1,87,78,27,32,34.6,0.101,22,0
227 0,101,76,0,0,35.7,0.198,26,0
228 3,162,52,38,0,37.2,0.652,24,1
229 4,197,70,39,744,36.7,2.329,31,0
230 0,117,80,31,53,45.2,0.089,24,0
231 4,142,86,0,0,44,0.645,22,1
232 6,134,80,37,370,46.2,0.238,46,1
233 1,79,80,25,37,25.4,0.583,22,0
234 4,122,68,0,0,35,0.394,29,0
235 3,74,68,28,45,29.7,0.293,23,0
236 4,171,72,0,0,43.6,0.479,26,1
237 7,181,84,21,192,35.9,0.586,51,1
768 1,93,70,31,0,30.4,0.315,23,0

The code is as follows:

from keras.models import Sequential
from keras.layers import Dense
import numpy as np

# Set the random seed for reproducibility
np.random.seed(7)

# Load the dataset (768 rows, 9 comma-separated numeric columns, no header)
dataset = np.loadtxt('diabetes.csv', delimiter=',')

# Split into input features x (8 columns) and output label y (last column)
x = dataset[:, 0:8]
y = dataset[:, 8]

# Create the model: 8 inputs -> 12 -> 8 -> 1
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Compile the model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Train the model
model.fit(x=x, y=y, epochs=150, batch_size=10)

# Evaluate the model on the training data
scores = model.evaluate(x=x, y=y)

print('\n%s : %.2f%%' % (model.metrics_names[1], scores[1] * 100))

The training output is as follows:

Using TensorFlow backend.
Epoch 1/150
2019-04-12 11:30:08.237361: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-04-12 11:30:12.882887: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 0 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:04:00.0
totalMemory: 11.93GiB freeMemory: 3.52GiB
2019-04-12 11:30:13.101248: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 1 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:05:00.0
totalMemory: 11.93GiB freeMemory: 11.81GiB
2019-04-12 11:30:13.324349: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 2 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:08:00.0
totalMemory: 11.93GiB freeMemory: 11.81GiB
2019-04-12 11:30:13.603787: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 3 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:09:00.0
totalMemory: 11.93GiB freeMemory: 11.81GiB
2019-04-12 11:30:13.824838: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:897] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2019-04-12 11:30:13.826103: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 4 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:83:00.0
totalMemory: 11.93GiB freeMemory: 11.81GiB
2019-04-12 11:30:14.135486: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:897] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2019-04-12 11:30:14.137340: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 5 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:84:00.0
totalMemory: 11.93GiB freeMemory: 11.81GiB
2019-04-12 11:30:14.398701: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:897] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2019-04-12 11:30:14.400809: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 6 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:87:00.0
totalMemory: 11.93GiB freeMemory: 11.81GiB
2019-04-12 11:30:14.736456: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:897] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2019-04-12 11:30:14.738358: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 7 with properties: 
name: GeForce GTX TITAN X major: 5 minor: 2 memoryClockRate(GHz): 1.076
pciBusID: 0000:88:00.0
totalMemory: 11.93GiB freeMemory: 11.81GiB
2019-04-12 11:30:14.746155: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1471] Adding visible gpu devices: 0, 1, 2, 3, 4, 5, 6, 7
2019-04-12 11:30:20.124691: I tensorflow/core/common_runtime/gpu/gpu_device.cc:952] Device interconnect StreamExecutor with strength 1 edge matrix:
2019-04-12 11:30:20.124747: I tensorflow/core/common_runtime/gpu/gpu_device.cc:958]      0 1 2 3 4 5 6 7 
2019-04-12 11:30:20.124755: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 0:   N Y Y Y N N N N 
2019-04-12 11:30:20.124762: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 1:   Y N Y Y N N N N 
2019-04-12 11:30:20.124768: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 2:   Y Y N Y N N N N 
2019-04-12 11:30:20.124774: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 3:   Y Y Y N N N N N 
2019-04-12 11:30:20.124780: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 4:   N N N N N Y Y Y 
2019-04-12 11:30:20.124785: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 5:   N N N N Y N Y Y 
2019-04-12 11:30:20.124791: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 6:   N N N N Y Y N Y 
2019-04-12 11:30:20.124796: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 7:   N N N N Y Y Y N 
2019-04-12 11:30:20.126661: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 3241 MB memory) -> physical GPU (device: 0, name: GeForce GTX TITAN X, pci bus id: 0000:04:00.0, compute capability: 5.2)
2019-04-12 11:30:23.234804: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 11432 MB memory) -> physical GPU (device: 1, name: GeForce GTX TITAN X, pci bus id: 0000:05:00.0, compute capability: 5.2)
2019-04-12 11:30:32.654921: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:2 with 11432 MB memory) -> physical GPU (device: 2, name: GeForce GTX TITAN X, pci bus id: 0000:08:00.0, compute capability: 5.2)
2019-04-12 11:30:40.710842: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:3 with 11432 MB memory) -> physical GPU (device: 3, name: GeForce GTX TITAN X, pci bus id: 0000:09:00.0, compute capability: 5.2)
2019-04-12 11:30:50.302588: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:4 with 11432 MB memory) -> physical GPU (device: 4, name: GeForce GTX TITAN X, pci bus id: 0000:83:00.0, compute capability: 5.2)
2019-04-12 11:31:00.362614: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:5 with 11432 MB memory) -> physical GPU (device: 5, name: GeForce GTX TITAN X, pci bus id: 0000:84:00.0, compute capability: 5.2)
2019-04-12 11:31:09.670400: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:6 with 11432 MB memory) -> physical GPU (device: 6, name: GeForce GTX TITAN X, pci bus id: 0000:87:00.0, compute capability: 5.2)
2019-04-12 11:31:18.858837: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:7 with 11432 MB memory) -> physical GPU (device: 7, name: GeForce GTX TITAN X, pci bus id: 0000:88:00.0, compute capability: 5.2)
768/768 [==============================] - 83s 109ms/step - loss: 3.7048 - acc: 0.5977
Epoch 2/150
768/768 [==============================] - 0s 300us/step - loss: 0.9408 - acc: 0.5885
Epoch 3/150
768/768 [==============================] - 0s 456us/step - loss: 0.7518 - acc: 0.6432
Epoch 4/150
768/768 [==============================] - 0s 377us/step - loss: 0.7113 - acc: 0.6628
Epoch 5/150
768/768 [==============================] - 0s 411us/step - loss: 0.6812 - acc: 0.6758
Epoch 6/150
768/768 [==============================] - 0s 411us/step - loss: 0.6503 - acc: 0.6810
Epoch 7/150
768/768 [==============================] - 0s 392us/step - loss: 0.6493 - acc: 0.6719
Epoch 8/150
768/768 [==============================] - 0s 340us/step - loss: 0.6366 - acc: 0.6849
Epoch 9/150
768/768 [==============================] - 0s 338us/step - loss: 0.6242 - acc: 0.6914
Epoch 10/150
768/768 [==============================] - 0s 381us/step - loss: 0.6297 - acc: 0.6784
Epoch 11/150
768/768 [==============================] - 0s 424us/step - loss: 0.6476 - acc: 0.6706
Epoch 12/150
768/768 [==============================] - 0s 408us/step - loss: 0.6398 - acc: 0.6784
Epoch 13/150
768/768 [==============================] - 0s 395us/step - loss: 0.6258 - acc: 0.6810
Epoch 14/150
768/768 [==============================] - 0s 404us/step - loss: 0.6191 - acc: 0.6953
Epoch 15/150
768/768 [==============================] - 0s 428us/step - loss: 0.6027 - acc: 0.6914
Epoch 16/150
768/768 [==============================] - 0s 406us/step - loss: 0.5879 - acc: 0.7018
Epoch 17/150
768/768 [==============================] - 0s 400us/step - loss: 0.5854 - acc: 0.7005
Epoch 18/150
768/768 [==============================] - 0s 417us/step - loss: 0.6012 - acc: 0.6849
Epoch 19/150
768/768 [==============================] - 0s 409us/step - loss: 0.5806 - acc: 0.7109
Epoch 20/150
768/768 [==============================] - 0s 391us/step - loss: 0.5798 - acc: 0.7174
Epoch 21/150
768/768 [==============================] - 0s 394us/step - loss: 0.5687 - acc: 0.7161
Epoch 22/150
768/768 [==============================] - 0s 397us/step - loss: 0.5818 - acc: 0.6966
Epoch 23/150
768/768 [==============================] - 0s 398us/step - loss: 0.5734 - acc: 0.7083
Epoch 24/150
768/768 [==============================] - 0s 343us/step - loss: 0.5679 - acc: 0.7305
Epoch 25/150
768/768 [==============================] - 0s 347us/step - loss: 0.5577 - acc: 0.7344
Epoch 26/150
768/768 [==============================] - 0s 377us/step - loss: 0.5702 - acc: 0.7044
Epoch 27/150
768/768 [==============================] - 0s 339us/step - loss: 0.5556 - acc: 0.7240
Epoch 28/150
768/768 [==============================] - 0s 376us/step - loss: 0.5558 - acc: 0.7292
Epoch 29/150
768/768 [==============================] - 0s 411us/step - loss: 0.5739 - acc: 0.7135
Epoch 30/150
768/768 [==============================] - 0s 405us/step - loss: 0.5607 - acc: 0.7214
Epoch 31/150
768/768 [==============================] - 0s 419us/step - loss: 0.5685 - acc: 0.7161
Epoch 32/150
768/768 [==============================] - 0s 415us/step - loss: 0.5636 - acc: 0.7148
Epoch 33/150
768/768 [==============================] - 0s 400us/step - loss: 0.5520 - acc: 0.7201
Epoch 34/150
768/768 [==============================] - 0s 410us/step - loss: 0.5492 - acc: 0.7318
Epoch 35/150
768/768 [==============================] - 0s 396us/step - loss: 0.5507 - acc: 0.7201
Epoch 36/150
768/768 [==============================] - 0s 420us/step - loss: 0.5610 - acc: 0.7083
Epoch 37/150
768/768 [==============================] - 0s 392us/step - loss: 0.5349 - acc: 0.7383
Epoch 38/150
768/768 [==============================] - 0s 388us/step - loss: 0.5405 - acc: 0.7227
Epoch 39/150
768/768 [==============================] - 0s 417us/step - loss: 0.5451 - acc: 0.7253
Epoch 40/150
768/768 [==============================] - 0s 420us/step - loss: 0.5445 - acc: 0.7214
Epoch 41/150
768/768 [==============================] - 0s 419us/step - loss: 0.5435 - acc: 0.7357
Epoch 42/150
768/768 [==============================] - 0s 411us/step - loss: 0.5381 - acc: 0.7409
Epoch 43/150
768/768 [==============================] - 0s 420us/step - loss: 0.5311 - acc: 0.7526
Epoch 44/150
768/768 [==============================] - 0s 410us/step - loss: 0.5333 - acc: 0.7422
Epoch 45/150
768/768 [==============================] - 0s 383us/step - loss: 0.5314 - acc: 0.7539
Epoch 46/150
768/768 [==============================] - 0s 387us/step - loss: 0.5276 - acc: 0.7539
Epoch 47/150
768/768 [==============================] - 0s 421us/step - loss: 0.5320 - acc: 0.7357
Epoch 48/150
768/768 [==============================] - 0s 429us/step - loss: 0.5330 - acc: 0.7396
Epoch 49/150
768/768 [==============================] - 0s 360us/step - loss: 0.5324 - acc: 0.7500
Epoch 50/150
768/768 [==============================] - 0s 493us/step - loss: 0.5264 - acc: 0.7383
Epoch 51/150
768/768 [==============================] - 1s 840us/step - loss: 0.5281 - acc: 0.7500
Epoch 52/150
768/768 [==============================] - 1s 730us/step - loss: 0.5304 - acc: 0.7474
Epoch 53/150
768/768 [==============================] - 1s 762us/step - loss: 0.5387 - acc: 0.7422
Epoch 54/150
768/768 [==============================] - 1s 766us/step - loss: 0.5372 - acc: 0.7240
Epoch 55/150
768/768 [==============================] - 1s 769us/step - loss: 0.5220 - acc: 0.7513
Epoch 56/150
768/768 [==============================] - 1s 812us/step - loss: 0.5277 - acc: 0.7422
Epoch 57/150
768/768 [==============================] - 1s 728us/step - loss: 0.5307 - acc: 0.7357
Epoch 58/150
768/768 [==============================] - 1s 751us/step - loss: 0.5225 - acc: 0.7526
Epoch 59/150
768/768 [==============================] - 1s 762us/step - loss: 0.5119 - acc: 0.7630
Epoch 60/150
768/768 [==============================] - 1s 690us/step - loss: 0.5334 - acc: 0.7318
Epoch 61/150
768/768 [==============================] - 1s 846us/step - loss: 0.5276 - acc: 0.7409
Epoch 62/150
768/768 [==============================] - 1s 733us/step - loss: 0.5169 - acc: 0.7604
Epoch 63/150
768/768 [==============================] - 1s 748us/step - loss: 0.5419 - acc: 0.7305
Epoch 64/150
768/768 [==============================] - 1s 728us/step - loss: 0.5312 - acc: 0.7422
Epoch 65/150
768/768 [==============================] - 1s 712us/step - loss: 0.5197 - acc: 0.7487
Epoch 66/150
768/768 [==============================] - 1s 847us/step - loss: 0.5056 - acc: 0.7539
Epoch 67/150
768/768 [==============================] - 1s 732us/step - loss: 0.5151 - acc: 0.7409
Epoch 68/150
768/768 [==============================] - 1s 671us/step - loss: 0.5128 - acc: 0.7539
Epoch 69/150
768/768 [==============================] - 1s 738us/step - loss: 0.5132 - acc: 0.7487
Epoch 70/150
768/768 [==============================] - 1s 750us/step - loss: 0.5375 - acc: 0.7266
Epoch 71/150
768/768 [==============================] - 1s 863us/step - loss: 0.5176 - acc: 0.7383
Epoch 72/150
768/768 [==============================] - 1s 735us/step - loss: 0.5160 - acc: 0.7500
Epoch 73/150
768/768 [==============================] - 1s 719us/step - loss: 0.5165 - acc: 0.7448
Epoch 74/150
768/768 [==============================] - 1s 698us/step - loss: 0.5101 - acc: 0.7630
Epoch 75/150
768/768 [==============================] - 1s 821us/step - loss: 0.5092 - acc: 0.7591
Epoch 76/150
768/768 [==============================] - 1s 718us/step - loss: 0.5103 - acc: 0.7578
Epoch 77/150
768/768 [==============================] - 1s 698us/step - loss: 0.5161 - acc: 0.7630
Epoch 78/150
768/768 [==============================] - 1s 697us/step - loss: 0.5129 - acc: 0.7552
Epoch 79/150
768/768 [==============================] - 1s 818us/step - loss: 0.5135 - acc: 0.7513
Epoch 80/150
768/768 [==============================] - 1s 762us/step - loss: 0.5096 - acc: 0.7617
Epoch 81/150
768/768 [==============================] - 1s 748us/step - loss: 0.5051 - acc: 0.7708
Epoch 82/150
768/768 [==============================] - 1s 832us/step - loss: 0.5042 - acc: 0.7578
Epoch 83/150
768/768 [==============================] - 1s 750us/step - loss: 0.4994 - acc: 0.7643
Epoch 84/150
768/768 [==============================] - 1s 743us/step - loss: 0.4968 - acc: 0.7643
Epoch 85/150
768/768 [==============================] - 1s 828us/step - loss: 0.5048 - acc: 0.7487
Epoch 86/150
768/768 [==============================] - 1s 751us/step - loss: 0.5051 - acc: 0.7552
Epoch 87/150
768/768 [==============================] - 1s 780us/step - loss: 0.4988 - acc: 0.7591
Epoch 88/150
768/768 [==============================] - 1s 767us/step - loss: 0.4994 - acc: 0.7669
Epoch 89/150
768/768 [==============================] - 1s 788us/step - loss: 0.5037 - acc: 0.7773
Epoch 90/150
768/768 [==============================] - 1s 780us/step - loss: 0.5094 - acc: 0.7513
Epoch 91/150
768/768 [==============================] - 1s 797us/step - loss: 0.5024 - acc: 0.7578
Epoch 92/150
768/768 [==============================] - 1s 785us/step - loss: 0.5058 - acc: 0.7500
Epoch 93/150
768/768 [==============================] - 1s 815us/step - loss: 0.4993 - acc: 0.7656
Epoch 94/150
768/768 [==============================] - 1s 760us/step - loss: 0.4970 - acc: 0.7708
Epoch 95/150
768/768 [==============================] - 1s 789us/step - loss: 0.5033 - acc: 0.7500
Epoch 96/150
768/768 [==============================] - 1s 852us/step - loss: 0.4907 - acc: 0.7760
Epoch 97/150
768/768 [==============================] - 1s 709us/step - loss: 0.5002 - acc: 0.7721
Epoch 98/150
768/768 [==============================] - 1s 710us/step - loss: 0.4902 - acc: 0.7669
Epoch 99/150
768/768 [==============================] - 1s 790us/step - loss: 0.4902 - acc: 0.7669
Epoch 100/150
768/768 [==============================] - 1s 708us/step - loss: 0.4838 - acc: 0.7812
Epoch 101/150
768/768 [==============================] - 1s 794us/step - loss: 0.4893 - acc: 0.7747
Epoch 102/150
768/768 [==============================] - 1s 736us/step - loss: 0.4985 - acc: 0.7630
Epoch 103/150
768/768 [==============================] - 1s 738us/step - loss: 0.4991 - acc: 0.7591
Epoch 104/150
768/768 [==============================] - 1s 771us/step - loss: 0.4923 - acc: 0.7930
Epoch 105/150
768/768 [==============================] - 1s 737us/step - loss: 0.5294 - acc: 0.7500
Epoch 106/150
768/768 [==============================] - 1s 779us/step - loss: 0.4903 - acc: 0.7826
Epoch 107/150
768/768 [==============================] - 1s 739us/step - loss: 0.4905 - acc: 0.7721
Epoch 108/150
768/768 [==============================] - 1s 734us/step - loss: 0.4968 - acc: 0.7747
Epoch 109/150
768/768 [==============================] - 1s 807us/step - loss: 0.4874 - acc: 0.7669
Epoch 110/150
768/768 [==============================] - 1s 703us/step - loss: 0.4909 - acc: 0.7682
Epoch 111/150
768/768 [==============================] - 1s 748us/step - loss: 0.4842 - acc: 0.7826
Epoch 112/150
768/768 [==============================] - 1s 675us/step - loss: 0.4937 - acc: 0.7799
Epoch 113/150
768/768 [==============================] - 0s 629us/step - loss: 0.4954 - acc: 0.7578
Epoch 114/150
768/768 [==============================] - 1s 708us/step - loss: 0.4913 - acc: 0.7617
Epoch 115/150
768/768 [==============================] - 1s 722us/step - loss: 0.4902 - acc: 0.7773
Epoch 116/150
768/768 [==============================] - 1s 768us/step - loss: 0.4936 - acc: 0.7747
Epoch 117/150
768/768 [==============================] - 1s 662us/step - loss: 0.4904 - acc: 0.7604
Epoch 118/150
768/768 [==============================] - 1s 820us/step - loss: 0.4874 - acc: 0.7852
Epoch 119/150
768/768 [==============================] - 1s 725us/step - loss: 0.4819 - acc: 0.7682
Epoch 120/150
768/768 [==============================] - 1s 727us/step - loss: 0.4940 - acc: 0.7786
Epoch 121/150
768/768 [==============================] - 1s 756us/step - loss: 0.4912 - acc: 0.7799
Epoch 122/150
768/768 [==============================] - 1s 706us/step - loss: 0.4861 - acc: 0.7734
Epoch 123/150
768/768 [==============================] - 1s 800us/step - loss: 0.4836 - acc: 0.7669
Epoch 124/150
768/768 [==============================] - 1s 733us/step - loss: 0.4836 - acc: 0.7721
Epoch 125/150
768/768 [==============================] - 1s 730us/step - loss: 0.4865 - acc: 0.7760
Epoch 126/150
768/768 [==============================] - 1s 793us/step - loss: 0.4795 - acc: 0.7786
Epoch 127/150
768/768 [==============================] - 1s 720us/step - loss: 0.4882 - acc: 0.7721
Epoch 128/150
768/768 [==============================] - 1s 789us/step - loss: 0.4724 - acc: 0.7786
Epoch 129/150
768/768 [==============================] - 1s 725us/step - loss: 0.4809 - acc: 0.7773
Epoch 130/150
768/768 [==============================] - 1s 738us/step - loss: 0.4735 - acc: 0.7852
Epoch 131/150
768/768 [==============================] - 1s 749us/step - loss: 0.4823 - acc: 0.7682
Epoch 132/150
768/768 [==============================] - 1s 734us/step - loss: 0.4813 - acc: 0.7839
Epoch 133/150
768/768 [==============================] - 1s 820us/step - loss: 0.4829 - acc: 0.7695
Epoch 134/150
768/768 [==============================] - 1s 722us/step - loss: 0.4843 - acc: 0.7734
Epoch 135/150
768/768 [==============================] - 1s 701us/step - loss: 0.4769 - acc: 0.7773
Epoch 136/150
768/768 [==============================] - 1s 690us/step - loss: 0.4729 - acc: 0.7826
Epoch 137/150
768/768 [==============================] - 1s 672us/step - loss: 0.4674 - acc: 0.7826
Epoch 138/150
768/768 [==============================] - 1s 833us/step - loss: 0.4805 - acc: 0.7839
Epoch 139/150
768/768 [==============================] - 1s 771us/step - loss: 0.4643 - acc: 0.7917
Epoch 140/150
768/768 [==============================] - 1s 723us/step - loss: 0.4804 - acc: 0.7852
Epoch 141/150
768/768 [==============================] - 1s 722us/step - loss: 0.4725 - acc: 0.7839
Epoch 142/150
768/768 [==============================] - 1s 681us/step - loss: 0.4825 - acc: 0.7799
Epoch 143/150
768/768 [==============================] - 1s 837us/step - loss: 0.4753 - acc: 0.7721
Epoch 144/150
768/768 [==============================] - 0s 643us/step - loss: 0.4763 - acc: 0.7747
Epoch 145/150
768/768 [==============================] - 1s 720us/step - loss: 0.4867 - acc: 0.7682
Epoch 146/150
768/768 [==============================] - 1s 736us/step - loss: 0.4919 - acc: 0.7747
Epoch 147/150
768/768 [==============================] - 1s 709us/step - loss: 0.4826 - acc: 0.7839
Epoch 148/150
768/768 [==============================] - 1s 767us/step - loss: 0.4699 - acc: 0.7786
Epoch 149/150
768/768 [==============================] - 1s 720us/step - loss: 0.4762 - acc: 0.7643
Epoch 150/150
768/768 [==============================] - 1s 854us/step - loss: 0.4753 - acc: 0.7812
768/768 [==============================] - 0s 212us/step

acc : 79.30%

The main steps are summarized below.

  • Define the model

In Keras, a model is defined as a sequence of layers. Create a Sequential model and add layers one at a time until you are satisfied with the network topology. Choosing the number and type of layers is a hard problem; finding the optimal topology is a trial-and-error process, and running a series of experiments is a good way to home in on the best structure. In general, the network needs to be large enough to capture the structure of the problem.

In Keras, fully connected layers are usually defined with the Dense class. Network weights are typically initialized to small random numbers drawn from a uniform distribution; in this example, random numbers between 0 and 0.05 are used, which was the default uniform weight initialization in Keras. Small random numbers drawn from a Gaussian distribution also work.
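The uniform initialization described above can be sketched in NumPy, assuming the [0, 0.05) range the text mentions and the shape of the example model's first Dense layer (8 inputs, 12 units):

```python
import numpy as np

# Draw one weight per input/unit pair for an 8-input, 12-unit layer,
# uniformly from [0, 0.05), as the text describes.
rng = np.random.default_rng(7)
weights = rng.uniform(low=0.0, high=0.05, size=(8, 12))

print(weights.shape)                                   # (8, 12)
print(weights.min() >= 0.0 and weights.max() < 0.05)   # True
```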

  • Compile the model

After the model is defined, it must be compiled. Compilation lets the model make efficient use of the numerical computation wrapped by Keras; based on the backend, Keras automatically selects the best way to train the model and run predictions. At compile time, you must specify some properties required for training. Training a neural network means finding the best set of weights for making predictions on the problem.

When compiling, you must specify the loss function used to evaluate a set of weights (loss), the optimizer used to search the network's weight space (optimizer), and optionally the metrics to collect and report during training.
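The binary cross-entropy loss the example compiles with can be sketched in NumPy: the mean of -[y·log(p) + (1-y)·log(1-p)] over all samples, with predictions clipped away from 0 and 1 to keep the logarithm finite. This is a minimal illustration, not the backend's implementation:

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy between labels and predicted probabilities."""
    p = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))

y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.9, 0.1, 0.8, 0.3])

print(round(binary_crossentropy(y_true, y_pred), 4))  # 0.1976
```

The loss shrinks toward zero as the predicted probabilities approach the true labels, which is exactly what the optimizer drives down during training.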

  • Train the model

Once compiled, the model is ready for computation. Before the model can be used to predict on new data, it must be trained, which is done by calling the model's fit function.

Training iterates over the dataset a fixed number of times given by the epochs parameter, which must be specified. You also set batch_size, the number of instances processed before each update of the network's weights. Suitable values for both parameters are usually found through trial and error.
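With the parameters used in this article's example (768 rows, epochs=150, batch_size=10), the implied number of weight updates is simple arithmetic:

```python
import math

n_samples, epochs, batch_size = 768, 150, 10

# One weight update per batch; the last, smaller batch still counts.
updates_per_epoch = math.ceil(n_samples / batch_size)  # 77
total_updates = updates_per_epoch * epochs             # 11550

print(updates_per_epoch, total_updates)
```

Larger batch sizes mean fewer, smoother updates per epoch; smaller ones mean more frequent, noisier updates, which is one reason these values are tuned experimentally.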

  • Evaluate the model

The neural network above was trained on the entire dataset, and here its performance is evaluated on that same dataset. This only reflects the model's accuracy on the training data; it says nothing about how the algorithm will predict on new data. Normally the data would be split into a training set and an evaluation set, with the model trained on one and evaluated on the other, to estimate its performance on new data.
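The recommended split can be sketched as a shuffled 80/20 partition of row indices; the helper `train_test_indices` below is hypothetical, standing in for utilities like scikit-learn's train_test_split:

```python
import numpy as np

def train_test_indices(n_samples, test_ratio=0.2, seed=7):
    """Return (train_idx, test_idx): a shuffled, disjoint split of row indices."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(n_samples)
    n_test = int(n_samples * test_ratio)
    return indices[n_test:], indices[:n_test]

train_idx, test_idx = train_test_indices(768)
print(len(train_idx), len(test_idx))  # 615 153
```

The model would then be fit on `x[train_idx]` and evaluated on `x[test_idx]`, so the reported accuracy reflects data the model has never seen.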

The model's accuracy can be measured with its evaluate function — here applied to the training set. This function generates predictions for each input/output pair and collects scores, including the average loss and any configured metrics, such as accuracy.
