Support Vector Machines (SVM)
1 Introduction
SVM = hinge loss + Kernel method
Hinge loss: L(f(x^n), y^n) = max(0, 1 − y^n f(x^n)), with labels y^n ∈ {−1, +1}
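A minimal NumPy sketch of the hinge loss above (the function name `hinge_loss` is just for illustration):

```python
import numpy as np

def hinge_loss(f_x, y):
    """Hinge loss max(0, 1 - y * f(x)); label y is in {-1, +1}."""
    return np.maximum(0.0, 1.0 - y * f_x)

# Correct with margin >= 1 -> zero loss
print(hinge_loss(2.0, 1))   # 0.0
# Inside the margin -> small positive loss
print(hinge_loss(0.5, 1))   # 0.5
# Misclassified -> loss grows linearly
print(hinge_loss(-1.0, 1))  # 2.0
```

Note the loss is zero only once the example is classified correctly *with margin*, which is what pushes SVM toward large-margin solutions.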
2 Linear SVM
step 1: f(x) = Σ_i w_i x_i + b = [w; b]·[x; 1] = w·x, absorbing the bias b into w by appending a constant feature 1 to x
step 2: L(f) = Σ_n max(0, 1 − y^n f(x^n)) + λ||w||², which is a convex function
step 3: gradient descent (since the loss is convex, it converges to the global minimum)
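The three steps can be sketched as subgradient descent on toy data. Everything here (data, learning rate, λ, iteration count) is an illustrative assumption; the bias is absorbed into w via a constant feature and, for simplicity of the sketch, regularized along with w:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy linearly separable data with labels in {-1, +1}
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
X_aug = np.hstack([X, np.ones((200, 1))])  # step 1: absorb bias b into w

w = np.zeros(3)
lam, lr = 1e-3, 0.1
for _ in range(200):
    margins = y * (X_aug @ w)
    # step 3: subgradient of hinge loss is -y*x where margin < 1, else 0
    active = margins < 1
    grad = -(y[active, None] * X_aug[active]).sum(axis=0) / len(y) + 2 * lam * w
    w -= lr * grad

acc = np.mean(np.sign(X_aug @ w) == y)
```

Because the hinge loss is not differentiable at margin = 1, this uses a subgradient; in practice the kink is harmless and the convex objective still decreases reliably.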
Reference: "Deep Learning using Linear Support Vector Machines" (ICML 2013)
- ε^n: slack variables
- With slack variables, minimizing the loss becomes a constrained optimization problem solvable by Quadratic Programming (QP)
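Written out, the constrained form behind the slack variables (a standard reformulation, equivalent to minimizing the hinge loss with ε^n = max(0, 1 − y^n f(x^n))) is:

```latex
\min_{w,\,\varepsilon}\;\lambda\lVert w\rVert^{2} + \sum_{n}\varepsilon^{n}
\quad\text{s.t.}\quad
y^{n} f(x^{n}) \ge 1 - \varepsilon^{n},\qquad
\varepsilon^{n} \ge 0 \quad \forall n
```

The objective is quadratic in w and all constraints are linear, which is exactly the shape a QP solver expects.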
3 Dual Representation
- step 1: writing w = Σ_n α_n x^n = Xα gives f(x) = wᵀx = αᵀXᵀx = Σ_n α_n (x^n · x) = Σ_n α_n K(x^n, x), the kernel trick
- steps 2, 3: find {α₁*, …, α_N*} minimizing the loss function
4 Kernel Trick
- K(x, z) = ϕ(x)·ϕ(z), an inner product in feature space
- Radial Basis Function (RBF) Kernel: K(x, z) = exp(−½||x − z||²) = ϕ(x)·ϕ(z), where ϕ maps into an infinite-dimensional feature space
- Sigmoid Kernel: K(x,z)=tanh(x⋅z)
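A quick sketch of the two kernels above in NumPy (function names are illustrative):

```python
import numpy as np

def rbf_kernel(x, z):
    """RBF kernel exp(-1/2 ||x - z||^2); implicit infinite-dim feature map."""
    return np.exp(-0.5 * np.sum((x - z) ** 2))

def sigmoid_kernel(x, z):
    """Sigmoid kernel tanh(x . z)."""
    return np.tanh(np.dot(x, z))

x = np.array([1.0, 0.0])
z = np.array([0.0, 1.0])
print(rbf_kernel(x, x))  # 1.0, since the distance is zero
print(rbf_kernel(x, z))  # exp(-1), since ||x - z||^2 = 2
```

Note the RBF kernel is always in (0, 1] and equals 1 only when x = z, so it acts as a similarity score between points.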