Classification and Regression in Keras

Chinese New Year came and went, and I spent about half a month off.
Today, a look at regression and classification in Keras.
These notes follow Mofan Python's course on Bilibili, which is concise and easy to follow. Below I record the code from the course and the problems I ran into; the course link is below.

link

Code:


# Regressor example

import numpy as np
np.random.seed(1337)  # fix the random seed so every run generates the same random numbers; the seed value itself is arbitrary, it only pins the starting point of the generator
from keras.models import Sequential  # build the model layer by layer, in order
from keras.layers import Dense  # fully connected layer
import matplotlib.pyplot as plt

# create some data
X = np.linspace(-1, 1, 200)
np.random.shuffle(X)  # shuffle the sequence in place, like shuffling a deck of cards
Y = 0.5 * X + 2 + np.random.normal(0, 0.05, (200,))

# plot data
plt.scatter(X, Y)
plt.show()

X_train, Y_train = X[:160], Y[:160]  # first 160 points for training (see the slicing example below)
X_test, Y_test = X[160:], Y[160:]    # remaining 40 points for testing

# build a neural network from the first layer to the last layer
model = Sequential()
model.add(Dense(units=1, input_dim=1))  # a single fully connected layer

# choose loss function and optimizing method
model.compile(loss='mse', optimizer='sgd')  # mean squared error loss with the SGD optimizer

# training
print('Training-------------')
for step in range(301):
    cost = model.train_on_batch(X_train, Y_train)  # train on one batch of training data
    if step % 100 == 0:
        print('train cost:', cost)

# test
print('\nTesting-------------')
cost = model.evaluate(X_test, Y_test, batch_size=40)
print('test cost:', cost)
W, b = model.layers[0].get_weights()  # the network has a single layer, hence layers[0]
print('Weight=', W, '\nbiases=', b)

# plotting the prediction
Y_pred = model.predict(X_test)
plt.scatter(X_test, Y_test)
plt.plot(X_test, Y_pred)
plt.show()
Training-------------
train cost: 4.0225005
train cost: 0.073238626
train cost: 0.00386274
train cost: 0.002643449

Testing-------------

40/40 [==============================] - 0s 375us/step
test cost: 0.0031367032788693905
Weight= [[0.4922711]] 
biases= [1.9995022]
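The learned weight (≈0.49) and bias (≈2.00) are close to the 0.5 and 2 used to generate the data. Since the model is a single linear unit trained with MSE, they can be sanity-checked against an ordinary least-squares fit of the same training split; a small sketch using np.polyfit with the same seed and data generation as above:

```python
import numpy as np

np.random.seed(1337)
X = np.linspace(-1, 1, 200)
np.random.shuffle(X)
Y = 0.5 * X + 2 + np.random.normal(0, 0.05, (200,))

# closed-form least-squares fit (degree 1) on the same 160 training points
w, b = np.polyfit(X[:160], Y[:160], 1)
print(w, b)  # both close to the 0.5 and 2 used to generate the data
```

Gradient descent on MSE converges toward this same closed-form solution, which is why 301 batch updates land so near it.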

The following snippet illustrates the X[:160], Y[:160] slicing used above:

import numpy as np
X = np.linspace(-1, 1, 6)
x = X[:2]
y = X[2:]
print(X)
print(x)
print(y)
[-1.  -0.6 -0.2  0.2  0.6  1. ]
[-1.  -0.6]
[-0.2  0.2  0.6  1. ]
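Hard-coding 160 works for this toy set; a slightly more general sketch computes the split index from a train fraction instead:

```python
import numpy as np

X = np.linspace(-1, 1, 200)
Y = 0.5 * X + 2

split = int(0.8 * len(X))  # 80% train / 20% test, i.e. 160 / 40 here
X_train, X_test = X[:split], X[split:]
Y_train, Y_test = Y[:split], Y[split:]
print(X_train.shape, X_test.shape)
```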


The first figure below is the scatter plot of the generated dataset; the second is the fitted line drawn over the test points.

[Figure: the generated dataset]


Next up is classification with Keras.

# Classifier example

import numpy as np
np.random.seed(1337)  # fix the random seed for reproducible runs
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential  # build the model layer by layer, in order
from keras.layers import Dense, Activation  # fully connected layer and activation

# downloads MNIST to '~/.keras/datasets/' on the first call
# X_train shape (60000, 28, 28), y_train shape (60000,); the test set has 10000 samples
(X_train, y_train), (X_test, y_test) = mnist.load_data()

# data pre-processing
X_train = X_train.reshape(X_train.shape[0], -1) / 255  # flatten to (60000, 784) and normalize to [0, 1]
X_test = X_test.reshape(X_test.shape[0], -1) / 255
y_train = np_utils.to_categorical(y_train, num_classes=10)  # convert integer labels to one-hot vectors
y_test = np_utils.to_categorical(y_test, num_classes=10)

# another way to build your neural net
model = Sequential([
    Dense(32, input_dim=784),
    Activation('relu'),
    Dense(10),
    Activation('softmax')
])

# another way to define your optimizer
rmsprop = RMSprop(lr=0.001, rho=0.9, epsilon=1e-08, decay=0.0)

# we add metrics to get more results we want to see
model.compile(
    optimizer=rmsprop,
    loss='categorical_crossentropy',
    metrics=['accuracy'],
)

print('Training------')

# another way to train the model
model.fit(X_train, y_train, epochs=2, batch_size=32)

print('\nTesting-----')

# evaluate the model with the metrics we defined earlier
loss, accuracy = model.evaluate(X_test, y_test)

print('test loss:', loss)
print('test accuracy:', accuracy)

Epoch 1/2
60000/60000 [==============================] - 3s 48us/step - loss: 0.3435 - accuracy: 0.9046
Epoch 2/2
60000/60000 [==============================] - 3s 48us/step - loss: 0.1946 - accuracy: 0.9440

Testing-----

10000/10000 [==============================] - 0s 27us/step
test loss: 0.17407772153392434
test accuracy: 0.9513000249862671
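The final Dense(10) plus softmax turns 10 raw scores into class probabilities that sum to 1, and the predicted digit is the argmax. A minimal NumPy softmax as an illustrative sketch (not Keras's exact implementation; the max subtraction is the standard numerical-stability trick):

```python
import numpy as np

def softmax(z):
    # subtract the max before exponentiating for numerical stability,
    # then normalize so the outputs sum to 1
    e = np.exp(z - np.max(z))
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs, probs.sum())  # probabilities summing to 1; largest score wins
```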

Notes on the code:

np_utils.to_categorical converts an integer label vector into a binary matrix of shape (nb_samples, nb_classes).
With num_classes = 10, labels like [1, 2, 3, ..., 4] become:
[[0,1,0,0,0,0,0,0,0,0]
[0,0,1,0,0,0,0,0,0,0]
[0,0,0,1,0,0,0,0,0,0]
...
[0,0,0,0,1,0,0,0,0,0]]
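Under the hood this one-hot conversion is just indexing into an identity matrix: row i of np.eye(num_classes) is the one-hot vector for class i. A minimal NumPy sketch of the same idea (illustrative, not Keras's actual implementation):

```python
import numpy as np

def to_one_hot(labels, num_classes):
    # row i of the identity matrix is the one-hot vector for class i
    return np.eye(num_classes, dtype=int)[labels]

onehot = to_one_hot(np.array([1, 2, 3]), 10)
print(onehot)  # shape (3, 10); a single 1 per row at the label's index
```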


Before training, the learning process must be configured via compile, which takes three arguments:

  • optimizer: either the name of a predefined optimizer, such as rmsprop or adagrad, or an instance of an Optimizer class
  • loss: the objective function to minimize, either the name of a predefined loss such as categorical_crossentropy or mse, or a custom loss function
  • metrics: for classification problems this is typically metrics=['accuracy']. A metric can be the name of a predefined metric or a user-defined function; a metric function should return a single tensor, or a dict mapping metric_name to metric_value.
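As a sanity check on what the categorical_crossentropy loss computes on one-hot targets, here is a pure-NumPy version as an illustrative sketch (the eps clipping guards against log(0); Keras's internal implementation differs in details):

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    # y_true: one-hot targets, y_pred: predicted class probabilities
    # loss per sample is -log(probability assigned to the true class);
    # return the mean over samples
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

y_true = np.array([[0, 1, 0], [1, 0, 0]])
y_pred = np.array([[0.1, 0.8, 0.1], [0.7, 0.2, 0.1]])
ce = categorical_crossentropy(y_true, y_pred)
print(ce)  # -(log 0.8 + log 0.7) / 2
```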