Loss function, cost function, and objective function

This post explains the differences and relationships among loss functions, cost functions, and objective functions in machine learning. A loss function measures prediction error on a single data point, e.g. the square loss and the hinge loss; a cost function aggregates the loss over the whole training set, possibly plus a regularization term, e.g. mean squared error and the SVM cost function; the objective function is the most general concept, covering any function optimized during training, including the maximum likelihood objective. The three play different roles in the optimization process.

https://stats.stackexchange.com/questions/179026/objective-function-cost-function-loss-function-are-they-the-same-thing

These are not very strict terms and they are highly related. However:

A loss function is usually defined on a single data point, its prediction, and its label, and measures the penalty. For example:
square loss $l(f(x_i|\theta), y_i) = \left(f(x_i|\theta) - y_i\right)^2$, used in linear regression
hinge loss $l(f(x_i|\theta), y_i) = \max(0,\, 1 - f(x_i|\theta)\, y_i)$, used in SVM
0/1 loss $l(f(x_i|\theta), y_i) = 1 \iff f(x_i|\theta) \neq y_i$, used in theoretical analysis and in the definition of accuracy
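The three losses above can be sketched as plain per-point functions (a minimal illustration, not tied to any particular library; the hinge loss assumes labels in {-1, +1}):

```python
def square_loss(f_x, y):
    """Squared error on a single point; used in linear regression."""
    return (f_x - y) ** 2

def hinge_loss(f_x, y):
    """Hinge loss for a label y in {-1, +1}; used in SVMs.
    Zero when the prediction is on the correct side of the margin."""
    return max(0.0, 1.0 - f_x * y)

def zero_one_loss(f_x, y):
    """1 if the prediction disagrees with the label, else 0."""
    return int(f_x != y)
```

Note that the 0/1 loss is not differentiable, which is why the smooth surrogates above are what gets optimized in practice.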
A cost function is usually more general. It might be a sum of loss functions over your training set plus some model complexity penalty (regularization). For example:
Mean Squared Error $MSE(\theta) = \frac{1}{N} \sum_{i=1}^N \left(f(x_i|\theta) - y_i\right)^2$
SVM cost function $SVM(\theta) = \|\theta\|^2 + C \sum_{i=1}^N \xi_i$ (there are additional constraints connecting $\xi_i$ with $C$ and with the training set)
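As a sketch of the "sum of losses plus complexity penalty" pattern, here is the MSE with an optional L2 term (the `lam` regularization weight is a hypothetical parameter, added to show the penalty term):

```python
def mse_cost(predictions, targets, theta=None, lam=0.0):
    """Mean squared error over the training set, plus an optional
    L2 complexity penalty lam * ||theta||^2."""
    n = len(predictions)
    data_term = sum((p - t) ** 2 for p, t in zip(predictions, targets)) / n
    penalty = lam * sum(w * w for w in theta) if theta else 0.0
    return data_term + penalty
```

With `lam=0` this is the plain MSE cost; with `lam > 0` it has the same shape as the regularized SVM cost above, where $\|\theta\|^2$ plays the role of the complexity penalty.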
An objective function is the most general term for any function that you optimize during training. For example, the probability of generating the training set in the maximum likelihood approach is a well-defined objective function, but it is neither a loss function nor a cost function (however, you could define an equivalent cost function). For example:
MLE is a type of objective function (which you maximize)
Divergence between classes can be an objective function, but it is hardly a cost function, unless you define something artificial, like 1-Divergence, and name it a cost
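To make the MLE example concrete, here is a minimal sketch of a Gaussian log-likelihood (a standard formula, written from scratch for illustration): it is an objective to maximize, and its negation is an equivalent cost to minimize.

```python
import math

def gaussian_log_likelihood(xs, mu, sigma):
    """Log-likelihood of the data xs under N(mu, sigma^2).
    Maximizing this is MLE; minimizing its negation is the
    equivalent cost-function formulation."""
    n = len(xs)
    return (-0.5 * n * math.log(2 * math.pi * sigma ** 2)
            - sum((x - mu) ** 2 for x in xs) / (2 * sigma ** 2))
```

Note the second term: for fixed `sigma`, maximizing the log-likelihood in `mu` is the same as minimizing a sum of square losses, which is one way the "equivalent cost function" in the text arises.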
Long story short, I would say that:

A loss function is a part of a cost function which is a type of an objective function.
