In Keras, model.evaluate() returns the loss value & metrics values

This article explains how to use the model.evaluate() function in Keras, covering its parameters and how to interpret its return value. The function computes the model's loss value and the specified evaluation metrics (such as accuracy) on the test set.


Keras official documentation: https://keras.io/models/model/#evaluate


In Keras, model.evaluate() returns the loss value and the metric values you selected when compiling the model (for example, accuracy).


evaluate

evaluate(x=None, y=None, batch_size=None, verbose=1, sample_weight=None, steps=None)

Returns the loss value & metrics values for the model in test mode.

Computation is done in batches.

Arguments

  • x: Numpy array of test data (if the model has a single input), or list of Numpy arrays (if the model has multiple inputs). If input layers in the model are named, you can also pass a dictionary mapping input names to Numpy arrays. x can be None (default) if feeding from framework-native tensors (e.g. TensorFlow data tensors).
  • y: Numpy array of target (label) data (if the model has a single output), or list of Numpy arrays (if the model has multiple outputs). If output layers in the model are named, you can also pass a dictionary mapping output names to Numpy arrays. y can be None (default) if feeding from framework-native tensors (e.g. TensorFlow data tensors).
  • batch_size: Integer or None. Number of samples per evaluation step. If unspecified, batch_size will default to 32.
  • verbose: 0 or 1. Verbosity mode. 0 = silent, 1 = progress bar.
  • sample_weight: Optional Numpy array of weights for the test samples, used for weighting the loss function. You can either pass a flat (1D) Numpy array with the same length as the input samples (1:1 mapping between weights and samples), or in the case of temporal data, you can pass a 2D array with shape (samples, sequence_length), to apply a different weight to every timestep of every sample. In this case you should make sure to specify sample_weight_mode="temporal" in compile().
  • steps: Integer or None. Total number of steps (batches of samples) before declaring the evaluation round finished. Ignored with the default value of None.
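
As a quick sketch of the sample_weight argument (the toy data and model below are assumptions for illustration, not part of the Keras docs), a flat 1D array with one weight per sample re-weights each sample's contribution to the test loss:

```python
import numpy as np
from tensorflow import keras

# Toy binary-classification data (hypothetical, for illustration only).
x_test = np.random.rand(50, 4).astype("float32")
y_test = np.random.randint(0, 2, size=(50,))

# Flat (1D) weights, one per sample (the 1:1 mapping described above):
# up-weight the first 10 samples.
weights = np.ones(50, dtype="float32")
weights[:10] = 2.0

model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="sgd", loss="binary_crossentropy",
              metrics=["accuracy"])

# The weights scale each sample's contribution to the reported loss.
loss, acc = model.evaluate(x_test, y_test, sample_weight=weights,
                           batch_size=10, verbose=0)
```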

Returns

Scalar test loss (if the model has a single output and no metrics) or list of scalars (if the model has multiple outputs and/or metrics). The attribute model.metrics_names will give you the display labels for the scalar outputs.
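
To make the return value concrete, here is a minimal sketch (the toy data and architecture are assumptions for illustration) showing that the returned list lines up with model.metrics_names:

```python
import numpy as np
from tensorflow import keras

# Toy binary-classification data (hypothetical, for illustration only).
x_test = np.random.rand(100, 8).astype("float32")
y_test = np.random.randint(0, 2, size=(100,))

model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# One scalar per output: the loss first, then each compiled metric.
results = model.evaluate(x_test, y_test, batch_size=32, verbose=0)
print(model.metrics_names)  # display labels for the scalars below
print(results)              # [test_loss, test_accuracy]
```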

The steps for Keras hyperparameter tuning with TensorBoard are as follows:

1. Import the necessary libraries and modules:

```python
import tensorflow as tf
from tensorflow import keras
from tensorboard.plugins.hparams import api as hp
```

2. Define the hyperparameter search space:

```python
HP_UNITS = hp.HParam('units', hp.Discrete([16, 32, 64]))
HP_DROPOUT = hp.HParam('dropout', hp.RealInterval(0.1, 0.2))
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))
METRIC_ACCURACY = 'accuracy'
```

3. Define the model-training function:

```python
def train_model(hparams):
    model = keras.Sequential([
        keras.layers.Dense(hparams[HP_UNITS], activation='relu'),
        keras.layers.Dropout(hparams[HP_DROPOUT]),
        keras.layers.Dense(10)
    ])
    model.compile(optimizer=hparams[HP_OPTIMIZER],
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=[METRIC_ACCURACY])
    model.fit(x_train, y_train, epochs=10)
    _, accuracy = model.evaluate(x_test, y_test)
    return accuracy
```

4. Define the TensorBoard logging function:

```python
def run(run_dir, hparams):
    with tf.summary.create_file_writer(run_dir).as_default():
        hp.hparams(hparams)  # log the hyperparameters for this trial
        accuracy = train_model(hparams)
        tf.summary.scalar(METRIC_ACCURACY, accuracy, step=1)
```

5. Run the hyperparameter search:

```python
session_num = 0
for units in HP_UNITS.domain.values:
    for dropout_rate in (HP_DROPOUT.domain.min_value, HP_DROPOUT.domain.max_value):
        for optimizer in HP_OPTIMIZER.domain.values:
            hparams = {
                HP_UNITS: units,
                HP_DROPOUT: dropout_rate,
                HP_OPTIMIZER: optimizer
            }
            run_name = "run-%d" % session_num
            print('-- Starting trial: %s' % run_name)
            print({h.name: hparams[h] for h in hparams})
            run('logs/hparam_tuning/' + run_name, hparams)
            session_num += 1
```

6. Launch TensorBoard:

```shell
tensorboard --logdir logs/hparam_tuning
```