LearningRateScheduler
Import: from keras.callbacks import LearningRateScheduler
Purpose: automatically adjusts the learning rate over the course of training.
Usage:
keras.callbacks.LearningRateScheduler(schedule, verbose=0)
Parameters:

Parameter | Type | Meaning
---|---|---
schedule | function | Takes the epoch index (an integer, counted from 0) and returns a new learning rate (a float)
verbose | int | 0: quiet (the default is usually fine); 1: print a message each time the learning rate is updated
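Any plain function mapping the epoch index to a float satisfies the schedule contract. As a minimal sketch (the name `step_decay` and its rates are illustrative, not from the Keras API), here is a step schedule that halves the learning rate every 10 epochs:

```python
# Hypothetical step schedule: halve the learning rate every `every` epochs.
def step_decay(epoch, initial_lr=1e-3, drop=0.5, every=10):
    # epoch // every counts how many drops have occurred so far
    return initial_lr * drop ** (epoch // every)

print(step_decay(0))   # initial learning rate
print(step_decay(10))  # after the first drop
```

Passing `step_decay` to `LearningRateScheduler(step_decay)` would then apply these rates at the start of each epoch.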
Example:
```python
from keras.callbacks import LearningRateScheduler
from keras.optimizers import RMSprop

initial_learningrate = 2e-3

def lr_decay(epoch):
    # Exponential decay: shrink the initial learning rate by 1% per epoch
    return initial_learningrate * 0.99 ** epoch

model.compile(loss="categorical_crossentropy",
              optimizer=RMSprop(lr=initial_learningrate),
              metrics=['accuracy'])

history = model.fit_generator(
    train_datagen.flow(X_train, Y_train, batch_size=batch_size),
    steps_per_epoch=100,
    epochs=epochs,
    callbacks=[LearningRateScheduler(lr_decay)],
    validation_data=valid_datagen.flow(X_valid, Y_valid),
    validation_steps=50,
    verbose=2)
```
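The decay function itself needs no Keras to inspect: the sketch below recomputes the schedule from the example standalone, which is a quick way to sanity-check the rates the callback will set each epoch.

```python
initial_learningrate = 2e-3

def lr_decay(epoch):
    # Same exponential decay as the example: 1% reduction per epoch
    return initial_learningrate * 0.99 ** epoch

# Print the learning rate for the first few epochs
for epoch in range(3):
    print(epoch, lr_decay(epoch))
```

Epoch 0 starts at 2e-3 and each subsequent epoch multiplies by 0.99, so the rate decreases monotonically.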