CS231n-Assignment 1

  • SVM:

Using NumPy and Python was quite hard for me at first; this post may help with solving the homework:

https://mlxai.github.io/2017/01/06/vectorized-implementation-of-svm-loss-and-gradient-update.html
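The key idea in the linked post is to compute all margins at once instead of looping over samples and classes. Below is a minimal sketch of a fully vectorized hinge loss and gradient; the function name, argument shapes `(D, C)` / `(N, D)`, and the L2 regularization term follow the assignment's usual conventions, but this is my own sketch, not the official solution.

```python
import numpy as np

def svm_loss_vectorized(W, X, y, reg):
    """Multiclass SVM (hinge) loss, fully vectorized.

    W: (D, C) weights, X: (N, D) data, y: (N,) labels, reg: L2 strength.
    """
    N = X.shape[0]
    scores = X.dot(W)                                # (N, C)
    correct = scores[np.arange(N), y][:, None]       # (N, 1) correct-class scores
    margins = np.maximum(0, scores - correct + 1.0)  # delta = 1
    margins[np.arange(N), y] = 0                     # the correct class contributes no margin
    loss = margins.sum() / N + reg * np.sum(W * W)

    # Gradient: each positive margin adds +x_i to its class column
    # and -x_i to the correct-class column.
    binary = (margins > 0).astype(float)             # (N, C)
    binary[np.arange(N), y] = -binary.sum(axis=1)
    dW = X.T.dot(binary) / N + 2 * reg * W
    return loss, dW
```

A quick numerical gradient check (perturb one weight, compare the finite difference with `dW`) is the easiest way to convince yourself the vectorized gradient is right.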

  • Softmax:

It is not easy at first to see what p and L are differentiated with respect to. In fact, both L and p are differentiated with respect to the scores s_i, so the chain rule runs through the scores.
See more in:
https://www.youtube.com/watch?v=mlaLLQofmR8
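Working through that derivative gives a very clean result: for cross-entropy loss on softmax probabilities, dL/ds_i = p_i - 1{i == y}. A minimal sketch for a single sample (function name is mine), with the max-shift trick for numerical stability:

```python
import numpy as np

def softmax_ce_grad(scores, y):
    """Gradient of the cross-entropy loss w.r.t. the raw scores.

    For L = -log p_y with p = softmax(s): dL/ds_i = p_i - 1{i == y}.
    scores: (C,) raw class scores, y: index of the correct class.
    """
    s = scores - scores.max()            # shift for numerical stability
    p = np.exp(s) / np.exp(s).sum()
    grad = p.copy()
    grad[y] -= 1.0                       # subtract 1 at the correct class
    return grad
```

Note the gradient entries always sum to zero, since the p_i sum to one.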

Remember: in this assignment we do not implement an actual SVM; we just use the multiclass SVM (hinge) loss to train a linear classifier. It is a loss function, not a prediction function.

Here, by contrast, softmax acts as the prediction function, and we can also treat it as a non-linear activation like ReLU. We then use the cross-entropy function to compute the loss on the softmax probabilities.

  • Two-Layer Neural Network:

Why don't we apply softmax at the end when predicting?
Because every entry of the softmax shares the same denominator, and the exponential is a monotonic function, so we can simply take the class with the maximum raw score.
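This is easy to verify numerically: the argmax over raw scores and over softmax probabilities is always the same class (the scores below are made-up example values):

```python
import numpy as np

scores = np.array([2.0, -1.0, 0.5, 3.2])  # raw class scores for one sample

# Softmax: shared denominator, monotonic exponent.
p = np.exp(scores - scores.max())
p /= p.sum()

# The predicted class is the same either way, so at test time
# we can skip the softmax and take argmax over the raw scores.
assert np.argmax(scores) == np.argmax(p)
print(np.argmax(scores))  # -> 3
```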

Derive the gradient of each weight and bias:


I don't expect everyone to work the whole derivation out, but it may build some intuition.
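The derivation above can be sketched in code. Assuming the assignment's architecture (affine - ReLU - affine - softmax) and its usual parameter names `W1, b1, W2, b2`, a minimal forward/backward pass without regularization might look like this (my own sketch, not the official solution):

```python
import numpy as np

def two_layer_backward(X, y, W1, b1, W2, b2):
    """Loss and gradients for an affine-ReLU-affine-softmax net.

    X: (N, D) data, y: (N,) labels; returns loss, dW1, db1, dW2, db2.
    """
    N = X.shape[0]
    hidden = np.maximum(0, X.dot(W1) + b1)   # (N, H) ReLU activations
    scores = hidden.dot(W2) + b2             # (N, C)

    # Softmax cross-entropy loss, with the max-shift for stability.
    shifted = scores - scores.max(axis=1, keepdims=True)
    p = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    loss = -np.log(p[np.arange(N), y]).mean()

    # Backward pass: dL/dscores = (p - onehot) / N, then the chain rule.
    dscores = p.copy()
    dscores[np.arange(N), y] -= 1.0
    dscores /= N
    dW2 = hidden.T.dot(dscores)
    db2 = dscores.sum(axis=0)
    dhidden = dscores.dot(W2.T)
    dhidden[hidden <= 0] = 0                 # ReLU gate kills dead units
    dW1 = X.T.dot(dhidden)
    db1 = dhidden.sum(axis=0)
    return loss, dW1, db1, dW2, db2
```

Each gradient here is just the softmax gradient from the previous section pushed backward through the chain rule, with the ReLU acting as a binary gate.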
