Contents (note: this post was written in JupyterLab)
Preliminary imports
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
Load, standardize, and split the data
housing=fetch_california_housing()
scaler=StandardScaler()
x_data=scaler.fit_transform(housing.data)
x_train_full,x_test,y_train_full,y_test=train_test_split(x_data,housing.target)
x_train,x_valid,y_train,y_valid=train_test_split(x_train_full,y_train_full)
x_train.shape,x_valid.shape
((11610, 8), (3870, 8))
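As a quick sanity check (not in the original notebook), the standardized features should have roughly zero mean and unit variance per column. Random data stands in for housing.data here so the snippet runs without downloading the dataset:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
x_raw = rng.normal(loc=5.0, scale=3.0, size=(1000, 8))  # stand-in for housing.data

scaler = StandardScaler()
x_data = scaler.fit_transform(x_raw)

print(np.allclose(x_data.mean(axis=0), 0.0, atol=1e-9))  # True: columns centered
print(np.allclose(x_data.std(axis=0), 1.0))              # True: unit variance
```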
import tensorflow as tf
Create a function that builds and compiles the model
Its parameters are the hyperparameters the scikit-learn wrapper will later accept (choose them yourself); it returns a compiled model.
def build_model(n_hidden=2, n_neurons=30, learning_rate=0.1, input_shape=[8]):
    model = tf.keras.models.Sequential()
    model.add(tf.keras.layers.InputLayer(input_shape=input_shape))
    # stack n_hidden fully connected hidden layers
    for layer in range(n_hidden):
        model.add(tf.keras.layers.Dense(n_neurons, activation='relu', kernel_initializer='he_normal'))
    model.add(tf.keras.layers.Dense(1))  # single output for regression
    optimizer = tf.keras.optimizers.SGD(learning_rate=learning_rate, momentum=0.9, nesterov=True)
    model.compile(loss='mse', optimizer=optimizer)
    return model
The scikit-learn wrapper
Note: tf.keras.wrappers.scikit_learn was deprecated and later removed from TensorFlow; in recent versions the standalone scikeras package provides an equivalent KerasRegressor.
# wrap the Keras model as a scikit-learn regressor
keras_reg=tf.keras.wrappers.scikit_learn.KerasRegressor(build_model)
Model training
Notes:
The wrapper's fit method forwards its arguments to the underlying model, just like calling model.fit().
EarlyStopping: stops training early to prevent overfitting; patience is the number of epochs to wait for the monitored loss/metric to improve before stopping, and restore_best_weights restores the weights from the best epoch once training stops.
# early stopping to prevent overfitting
earlystop=tf.keras.callbacks.EarlyStopping(patience=10,restore_best_weights=True)
keras_reg.fit(x_train,y_train,epochs=100,validation_data=(x_valid,y_valid),callbacks=[earlystop])
keras_reg.score(x_test,y_test)
Epoch 1/100
363/363 [==============================] - 3s 4ms/step - loss: 1.7924 - val_loss: 1.3685
Epoch 2/100
363/363 [==============================] - 1s 4ms/step - loss: 1.3301 - val_loss: 1.3647
Epoch 3/100
363/363 [==============================] - 1s 4ms/step - loss: 1.3344 - val_loss: 1.3750
Epoch 4/100
363/363 [==============================] - 1s 4ms/step - loss: 1.3357 - val_loss: 1.3937
Epoch 5/100
363/363 [==============================] - 1s 4ms/step - loss: 1.3349 - val_loss: 1.3666
Epoch 6/100
363/363 [==============================] - 3s 7ms/step - loss: 1.3334 - val_loss: 1.4313
Epoch 7/100
363/363 [==============================] - 3s 9ms/step - loss: 1.3374 - val_loss: 1.3662
Epoch 8/100
363/363 [==============================] - 3s 9ms/step - loss: 1.3325 - val_loss: 1.3770
Epoch 9/100
363/363 [==============================] - 5s 14ms/step - loss: 1.3330 - val_loss: 1.4188
Epoch 10/100
363/363 [==============================] - 3s 9ms/step - loss: 1.3321 - val_loss: 1.3701
Epoch 11/100
363/363 [==============================] - 3s 9ms/step - loss: 1.3315 - val_loss: 1.3959
Epoch 12/100
363/363 [==============================] - 5s 14ms/step - loss: 1.3353 - val_loss: 1.3920
162/162 [==============================] - 0s 2ms/step - loss: 1.3511
-1.3510503768920898
The score of about -1.35 is the negative test-set MSE: scikit-learn's convention is that higher scores are better, so the wrapper negates the loss.
Search the hyperparameter space with GridSearchCV or RandomizedSearchCV
Note: RandomizedSearchCV is recommended, since it samples a fixed number of combinations and scales far better to large search spaces than an exhaustive grid.
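A sketch of the randomized search itself. So that the snippet runs without TensorFlow, a plain SGDRegressor stands in for keras_reg; with the wrapped Keras model the RandomizedSearchCV call is identical, except param_distribs would be keyed on build_model's arguments (n_hidden, n_neurons, learning_rate):

```python
from scipy.stats import randint, reciprocal
from sklearn.datasets import make_regression
from sklearn.linear_model import SGDRegressor
from sklearn.model_selection import RandomizedSearchCV

x, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=42)

# with the Keras wrapper the keys would be build_model's arguments, e.g.
# {"n_hidden": randint(1, 4), "n_neurons": randint(10, 100),
#  "learning_rate": reciprocal(3e-4, 3e-2)}
param_distribs = {
    "alpha": reciprocal(1e-6, 1e-2),  # log-uniform: good for scale-like hyperparameters
    "max_iter": randint(500, 2000),
}

rnd_search = RandomizedSearchCV(
    SGDRegressor(random_state=42),    # stand-in for keras_reg
    param_distribs,
    n_iter=10,                        # sample 10 combinations instead of a full grid
    cv=3,
    random_state=42,
)
rnd_search.fit(x, y)
print(sorted(rnd_search.best_params_))  # → ['alpha', 'max_iter']
```

With keras_reg, fit keyword arguments such as epochs, validation_data, and callbacks can still be passed through rnd_search.fit, just as in the single training run above.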
