CS231N assignment1 SVM

from cs231n.classifiers.linear_svm import svm_loss_naive
A linear classifier (SVM) has two parts:
1. a score function that maps the raw data to class scores, i.e. the function f(x, W)
2. a loss function that quantifies the agreement between the predicted scores and the ground-truth labels
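With the conventions used in the code below ($x_i$ stored as a row with $D$ entries, $W$ of shape $(D, C)$), the score function is the linear map

$$f(x_i, W) = x_i W,$$

which yields one score per class.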

margin:

The SVM loss function wants the score of the correct class y_i to be larger than the incorrect class scores by at least Δ (delta). If this is not the case, we will accumulate loss.
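In symbols, the loss for the $i$-th example, with score vector $s = f(x_i, W)$, is

$$L_i = \sum_{j \neq y_i} \max\left(0,\ s_j - s_{y_i} + \Delta\right).$$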

example

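To make the arithmetic concrete (using $\Delta = 10$ for illustration): suppose the scores are $s = [13, -7, 11]$ and the first class is the correct one ($y_i = 0$). Then

$$L_i = \max(0,\ -7 - 13 + 10) + \max(0,\ 11 - 13 + 10) = 0 + 8 = 8.$$

The second class is already more than $\Delta$ below the correct score, so it contributes nothing; the third class is within the margin and contributes 8.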

1. loss function

In cs231n/classifiers/linear_svm.py:

svm_loss_naive

The SVM wants the correct class's score to be higher than each incorrect class's score by at least a fixed margin Δ.
How the SVM loss is computed:

  Inputs:
  - W: A numpy array of shape (D, C) containing weights.
  - X: A numpy array of shape (N, D) containing a minibatch of data.
  - y: A numpy array of shape (N,) containing training labels; y[i] = c means
    that X[i] has label c, where 0 <= c < C.
  - reg: (float) regularization strength

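The gradient accumulated in the loop below follows from differentiating $L_i$ with respect to the columns of $W$ (here $\mathbb{1}(\cdot)$ is the indicator function and $w_j$ is the $j$-th column of $W$):

$$\nabla_{w_j} L_i = \mathbb{1}\left(s_j - s_{y_i} + \Delta > 0\right) x_i, \qquad \nabla_{w_{y_i}} L_i = -\left(\sum_{j \neq y_i} \mathbb{1}\left(s_j - s_{y_i} + \Delta > 0\right)\right) x_i.$$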

  dW = np.zeros(W.shape)              # gradient accumulator, shape (D, C)
  num_train = X.shape[0]              # N
  num_classes = W.shape[1]            # C
  loss = 0.0
  for i in range(num_train):          # over examples 0..N-1
    scores = X[i].dot(W)              # class scores, shape (C,)
    correct_class_score = scores[y[i]]
    for j in range(num_classes):      # over classes 0..C-1
      if j == y[i]:
        continue                      # the correct class never contributes
      margin = scores[j] - correct_class_score + 1  # note delta = 1
      if margin > 0:
        loss += margin
        dW[:, j] += X[i]              # push the offending class's weights up
        dW[:, y[i]] -= X[i]           # push the correct class's weights down
  # average over the batch and add L2 regularization
  loss /= num_train
  dW /= num_train
  loss += 0.5 * reg * np.sum(W * W)
  dW += reg * W
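A quick way to sanity-check the analytic gradient is a numeric gradient check. A minimal sketch, assuming the loop above sits inside the assignment's svm_loss_naive(W, X, y, reg) function returning (loss, dW) (the toy shapes, seed, and probed entry below are illustrative choices):

  import numpy as np

  np.random.seed(0)
  W = np.random.randn(4, 3) * 0.01    # D=4 features, C=3 classes (toy sizes)
  X = np.random.randn(5, 4)           # N=5 examples
  y = np.random.randint(3, size=5)

  loss, dW = svm_loss_naive(W, X, y, reg=0.1)

  # central-difference estimate of d(loss)/dW for one arbitrary entry
  h = 1e-5
  i, j = 2, 1
  W[i, j] += h
  loss_plus, _ = svm_loss_naive(W, X, y, reg=0.1)
  W[i, j] -= 2 * h
  loss_minus, _ = svm_loss_naive(W, X, y, reg=0.1)
  W[i, j] += h                        # restore W
  numeric = (loss_plus - loss_minus) / (2 * h)
  print(numeric, dW[i, j])            # the two values should agree closely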

svm_loss_vectorized

loss

  num_train = X.shape[0]
  scores = X.dot(W)                                   # (N, C)
  yi_scores = scores[np.arange(num_train), y]         # (N,) correct-class scores
  margins = np.maximum(0, scores - yi_scores[:, np.newaxis] + 1)  # delta = 1
  margins[np.arange(num_train), y] = 0                # correct class contributes nothing
  loss = np.mean(np.sum(margins, axis=1))
  loss += 0.5 * reg * np.sum(W * W)
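The gradient can be vectorized in the same spirit. Below is a minimal sketch of one common approach (as in the mlxai post linked under References), reusing margins and num_train from above; the helper name binary is mine:

  # 1.0 where a margin is positive, 0.0 elsewhere: shape (N, C)
  binary = (margins > 0).astype(float)
  # correct-class column: minus the number of positive margins in that row
  binary[np.arange(num_train), y] = -np.sum(binary, axis=1)
  dW = X.T.dot(binary) / num_train                    # shape (D, C)
  dW += reg * W                                       # from the 0.5 * reg * sum(W*W) term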

#### References
cs231n linear classifier SVM
https://mlxai.github.io/2017/01/06/vectorized-implementation-of-svm-loss-and-gradient-update.html
