Keras: Usage of metrics (evaluation metrics)

Usage of metrics

A metric is a function that is used to judge the performance of your model. Metric functions are to be supplied in the metrics parameter when a model is compiled.

model.compile(loss='mean_squared_error',
              optimizer='sgd',
              metrics=['mae', 'acc'])

from keras import metrics

model.compile(loss='mean_squared_error',
              optimizer='sgd',
              metrics=[metrics.mae, metrics.categorical_accuracy])

A metric function is similar to a loss function, except that the results from evaluating a metric are not used when training the model.

You can either pass the name of an existing metric, or pass a Theano/TensorFlow symbolic function (see Custom metrics).

Arguments
  • y_true: True labels. Theano/TensorFlow tensor.
  • y_pred: Predictions. Theano/TensorFlow tensor of the same shape as y_true.
Returns

Single tensor value representing the mean of the output array across all datapoints.
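To see what this return value looks like, you can evaluate a built-in metric directly on small tensors. This is a minimal sketch, assuming the TensorFlow backend and the keras.backend helpers variable, eval and mean; the example inputs are arbitrary.

import numpy as np
from keras import backend as K
from keras import metrics

y_true = K.variable(np.array([[0., 1.], [1., 0.]]))
y_pred = K.variable(np.array([[0.1, 0.9], [0.4, 0.6]]))

# binary_accuracy returns a per-sample tensor; averaging it gives
# the single value Keras reports during training.
per_sample = metrics.binary_accuracy(y_true, y_pred)
print(K.eval(K.mean(per_sample)))  # 0.5: the first sample is fully correct, the second is not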


Available metrics

binary_accuracy

binary_accuracy(y_true, y_pred)

categorical_accuracy

categorical_accuracy(y_true, y_pred)

sparse_categorical_accuracy

sparse_categorical_accuracy(y_true, y_pred)

top_k_categorical_accuracy

top_k_categorical_accuracy(y_true, y_pred, k=5)

sparse_top_k_categorical_accuracy

sparse_top_k_categorical_accuracy(y_true, y_pred, k=5)
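The two top-k metrics default to k=5. Because only functions of (y_true, y_pred) can go in the metrics list, a common way to use a different k is to wrap the metric in a small named function and pass the wrapper to compile. A sketch (the name top_3_accuracy is arbitrary, and model is assumed to be defined as in the earlier examples):

from keras import metrics

def top_3_accuracy(y_true, y_pred):
    # same as top_k_categorical_accuracy, but with k=3 instead of the default 5
    return metrics.top_k_categorical_accuracy(y_true, y_pred, k=3)

model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy', top_3_accuracy])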

Custom metrics

Custom metrics can be passed at the compilation step. The function would need to take (y_true, y_pred) as arguments and return a single tensor value.

import keras.backend as K

def mean_pred(y_true, y_pred):
    return K.mean(y_pred)

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy', mean_pred])
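Any other function built from backend ops works the same way. For instance, a sketch of a binary accuracy variant with a stricter decision threshold (the 0.7 value and the function name are purely illustrative):

import keras.backend as K

def strict_binary_accuracy(y_true, y_pred):
    # count a prediction as positive only when it exceeds 0.7
    y_pred_labels = K.cast(K.greater(y_pred, 0.7), K.floatx())
    return K.mean(K.equal(y_true, y_pred_labels))

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy', strict_binary_accuracy])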
Keras is a popular deep learning framework for building and training neural networks. It provides easy-to-use, high-level APIs, which makes it a good choice for beginners and experts alike.

The `layers` module provides a wide range of layers that can be used to construct neural networks. Some common layers include:
  • `Dense`: a fully connected layer that applies a linear transformation to the input data.
  • `Conv2D`: a 2D convolutional layer that applies filters to the input data to extract features.
  • `MaxPooling2D`: a pooling layer that downsamples the input data by taking the maximum value of each patch.
  • `Dropout`: a regularization layer that randomly drops units during training to prevent overfitting.

The `models` module provides a way to organize layers into a complete neural network. To create a model, you define the layers that make it up, then compile it with an optimizer, a loss function, and the metrics to track during training. For example:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(64, activation='relu', input_dim=100))
model.add(Dense(10, activation='softmax'))

model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

In this example, a Sequential model is built from two Dense layers: the first has 64 units with the relu activation, the second has 10 units with the softmax activation. The model is compiled with the rmsprop optimizer, the categorical_crossentropy loss, and the accuracy metric to track during training.
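The list above also mentions convolutional, pooling, and dropout layers; a minimal sketch of how they fit together (assuming 28x28 single-channel inputs, e.g. MNIST-sized images, and an arbitrary choice of layer sizes) looks like this:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dropout, Dense

model = Sequential()
model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))  # extract local features
model.add(MaxPooling2D(pool_size=(2, 2)))                                  # downsample feature maps
model.add(Dropout(0.25))                                                   # regularize during training
model.add(Flatten())
model.add(Dense(10, activation='softmax'))

model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])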