loss functions

This article introduces the loss functions available in Keras.
The official Keras documentation on loss functions can be found here.

A loss function (also called an objective function or optimization score function) is one of the two required parameters for compiling a model, for example:
例如:

from keras import losses
model.compile(loss='mean_squared_error', optimizer='sgd')
model.compile(loss=losses.mean_squared_error, optimizer='sgd')

In Keras, you can either pass the name of an existing loss function directly, or pass a TensorFlow/Theano symbolic function. The symbolic function must accept the following two arguments and return a scalar loss value for each data point.

y_true: True labels. TensorFlow/Theano tensor.
y_pred: Predictions. TensorFlow/Theano tensor of the same shape as y_true.

The actual objective being optimized is the mean of the per-datapoint losses, i.e. the mean of the output array across all data points.
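To make the "per-sample loss, then mean" behavior concrete, here is a plain NumPy sketch of the shape a custom loss function takes. In real Keras code you would use backend ops (e.g. keras.backend) instead of NumPy so the function stays symbolic and differentiable; the function name here is illustrative, not part of the Keras API.

```python
import numpy as np

def custom_abs_error(y_true, y_pred):
    # Average over the last axis -> one scalar loss per sample,
    # mirroring the (y_true, y_pred) signature Keras expects.
    return np.mean(np.abs(y_pred - y_true), axis=-1)

y_true = np.array([[0.0, 1.0], [1.0, 0.0]])
y_pred = np.array([[0.1, 0.9], [0.9, 0.3]])

per_sample = custom_abs_error(y_true, y_pred)  # shape (2,): one loss per sample
objective = per_sample.mean()                  # the value the optimizer minimizes
```

The returned array has one entry per sample; Keras then averages it into the single scalar that gradient descent actually minimizes.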

Available loss functions
  • mean_squared_error
mean_squared_error(y_true, y_pred)

Mean squared error (MSE) is one of the most classic and widely used loss functions.

MSE = (1/n) * Σ_{i=1}^{n} (y_true_i - y_pred_i)^2
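A quick NumPy check of the formula above (the sample values are made up for illustration):

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# MSE = (1/n) * sum of squared differences
n = len(y_true)
mse = np.sum((y_true - y_pred) ** 2) / n  # -> 0.375 for these values
```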

  • mean_absolute_error
mean_absolute_error(y_true, y_pred)

Mean absolute error (MAE): the average of the absolute differences between predictions and true values.
  • mean_absolute_percentage_error
mean_absolute_percentage_error(y_true, y_pred)
  • mean_squared_logarithmic_error
mean_squared_logarithmic_error(y_true, y_pred)
  • squared_hinge
squared_hinge(y_true, y_pred)
  • hinge
hinge(y_true, y_pred)
  • categorical_hinge
categorical_hinge(y_true, y_pred)
  • logcosh
logcosh(y_true, y_pred)

Logarithm of the hyperbolic cosine of the prediction error.
log(cosh(x)) is approximately equal to (x ** 2) / 2 for small x and to abs(x) - log(2) for large x. This means that logcosh works mostly like the mean squared error, but is not as strongly affected by the occasional wildly incorrect prediction.
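The two approximations can be verified numerically. The sketch below uses a numerically stable rewriting of log(cosh(x)); the helper name is illustrative:

```python
import numpy as np

def logcosh(x):
    # Stable form: log(cosh(x)) = |x| + log1p(exp(-2|x|)) - log(2),
    # which avoids overflow in cosh for large |x|.
    return np.abs(x) + np.log1p(np.exp(-2.0 * np.abs(x))) - np.log(2.0)

small, large = 0.01, 20.0
near_mse = small ** 2 / 2            # small-x approximation
near_mae = abs(large) - np.log(2.0)  # large-x approximation
```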

Arguments

y_true: tensor of true targets.
y_pred: tensor of predicted targets.

Returns

Tensor with one scalar loss entry per sample.

  • categorical_crossentropy
categorical_crossentropy(y_true, y_pred)
  • sparse_categorical_crossentropy
sparse_categorical_crossentropy(y_true, y_pred)
  • binary_crossentropy
binary_crossentropy(y_true, y_pred)
  • kullback_leibler_divergence
kullback_leibler_divergence(y_true, y_pred)
  • poisson
poisson(y_true, y_pred)
  • cosine_proximity
cosine_proximity(y_true, y_pred)

Note: when using the categorical_crossentropy loss, your targets should be in categorical format (e.g. if you have 10 classes, the target for each sample should be a 10-dimensional vector that is all zeros except for a 1 at the index corresponding to the class of the sample). To convert integer targets into categorical targets, you can use the Keras utility to_categorical:
from keras.utils.np_utils import to_categorical
categorical_labels = to_categorical(int_labels, num_classes=None)
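If Keras isn't at hand, the same one-hot conversion can be sketched in plain NumPy; the function below is a hypothetical stand-in for to_categorical, not part of Keras itself:

```python
import numpy as np

def to_categorical_sketch(int_labels, num_classes=None):
    # Infer the number of classes from the labels if not given,
    # then index into an identity matrix to build one-hot rows.
    labels = np.asarray(int_labels, dtype=int)
    if num_classes is None:
        num_classes = labels.max() + 1
    return np.eye(num_classes, dtype=np.float32)[labels]

one_hot = to_categorical_sketch([0, 2, 1], num_classes=3)
```

Each row of the result is all zeros except for a 1 at the index of that sample's class.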