Overall introduction of the topic, the keywords, and the references.
Part 1: Introduction to SVM (5 min)
First show what kind of classifier the SVM is, via graphs and examples; introduce the notion of support vectors and how they are used.
SVM characteristics: the largest-margin principle, the geometric margin versus the functional margin, the derivation, and how training finally reduces to a QP problem (see the sketch at the end of this part).
SVM applications.
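As a companion to Part 1, here is a minimal sketch of the standard textbook derivation (not a formulation taken from the talk itself): the functional margin, the geometric margin, and the hard-margin primal QP, where w is the weight vector, b the bias, and (x_i, y_i) the training examples with labels y_i in {-1, +1}.

\hat{\gamma}_i = y_i\,(w^\top x_i + b), \qquad \gamma_i = \frac{y_i\,(w^\top x_i + b)}{\lVert w \rVert}

Fixing the functional margin to 1 and maximizing the geometric margin 1/\lVert w \rVert gives the convex quadratic program (QP)

\min_{w,\, b}\ \tfrac{1}{2}\lVert w \rVert^2 \quad \text{s.t.} \quad y_i\,(w^\top x_i + b) \ge 1, \quad i = 1, \dots, n,

whose solution depends only on the examples whose constraints are active, i.e. the support vectors.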
Part 2: Kernel methods and the kernel matrix (5 min)
Why the kernel method is needed and what it is used for.
The notion of kernel functions.
Kernels as similarity functions.
The connection to inner products in feature space.
An example (see the sketch at the end of this part).
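A minimal sketch of the kernel-as-similarity idea in Part 2, assuming a Gaussian (RBF) kernel; the function names rbf_kernel and gram_matrix and the gamma value are illustrative choices, not taken from the talk.

import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    # k(x, z) = exp(-gamma * ||x - z||^2): a similarity score in (0, 1]
    # that equals the inner product <phi(x), phi(z)> in an implicit feature space.
    return np.exp(-gamma * np.sum((x - z) ** 2))

def gram_matrix(X, kernel):
    # Kernel (Gram) matrix K with K[i, j] = k(x_i, x_j); this n-by-n matrix
    # is all the information the SVM dual needs about the training data.
    n = X.shape[0]
    return np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
K = gram_matrix(X, rbf_kernel)
print(K)                                          # symmetric, ones on the diagonal
print(np.all(np.linalg.eigvalsh(K) >= -1e-10))    # positive semidefinite, as a valid kernel must be

Any symmetric positive semidefinite similarity function can be plugged in the same way, which is what makes the kernel matrix the natural object to combine linearly in Part 4.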
Part 3: The SMO algorithm
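A minimal sketch of the two-variable update at the heart of SMO, in the spirit of Platt's simplified SMO for the soft-margin SVM dual; the random choice of the second index and the fixed max_passes stopping rule are simplifications of the real working-set heuristics, and none of this is code from the paper or the talk.

import numpy as np

def simplified_smo(K, y, C=1.0, tol=1e-3, max_passes=10):
    # K: precomputed kernel (Gram) matrix, y: labels in {-1, +1}.
    # Optimizes the dual variables alpha two at a time so that the
    # equality constraint sum_i alpha_i * y_i = 0 is always preserved.
    n = len(y)
    alpha, b, passes = np.zeros(n), 0.0, 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            E_i = (alpha * y) @ K[:, i] + b - y[i]           # prediction error on example i
            # Work on alpha_i only if it violates the KKT conditions
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                E_j = (alpha * y) @ K[:, j] + b - y[j]
                a_i, a_j = alpha[i], alpha[j]
                # Box [L, H] that keeps both multipliers feasible
                if y[i] != y[j]:
                    L, H = max(0.0, a_j - a_i), min(C, C + a_j - a_i)
                else:
                    L, H = max(0.0, a_i + a_j - C), min(C, a_i + a_j)
                eta = 2.0 * K[i, j] - K[i, i] - K[j, j]      # curvature along the pair (must be negative)
                if L == H or eta >= 0:
                    continue
                # Analytic one-dimensional optimum for alpha_j, clipped to the box
                alpha[j] = np.clip(a_j - y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - a_j) < 1e-5:
                    continue
                alpha[i] = a_i + y[i] * y[j] * (a_j - alpha[j])   # restore sum_i alpha_i y_i = 0
                # Recompute the bias from whichever multiplier is strictly inside (0, C)
                b1 = b - E_i - y[i] * (alpha[i] - a_i) * K[i, i] - y[j] * (alpha[j] - a_j) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i) * K[i, j] - y[j] * (alpha[j] - a_j) * K[j, j]
                b = b1 if 0 < alpha[i] < C else (b2 if 0 < alpha[j] < C else (b1 + b2) / 2.0)
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b

The attraction for Part 4 is that each step only touches two dual variables and needs no general-purpose QP solver, which is why the paper works to recover this structure for the regularized MKL dual.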
Part 4: Multiple kernel learning (20 min)
Why MKL: the need for the flexibility to combine different kernel matrices linearly.
What MKL yields and where the difficulty lies.
The formulation of the MKL problem.
The notion of the support kernel machine (SKM) and its derivation: primal problem --> dual problem --> KKT conditions --> some conclusions.
Kernelization --> the kernelized problem formulation.
Equivalence of the two formulations.
Why it cannot be solved directly: the dual is a non-smooth convex problem, so SMO does not apply as-is.
Regularizing the SKM: the dual problem --> solving the Moreau-Yosida (MY) regularized problem using SMO (see the sketch at the end of this part).
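As a compact reference for the formulations Part 4 walks through, a sketch in the style of the Bach-Lanckriet-Jordan MKL setup; the exact constants, constraints, and notation in the talk may differ, so treat this as an illustration of the structure. Here K_1, ..., K_m are the base kernel matrices, \eta_j their nonnegative combination weights, w_j the block of weights associated with the j-th feature map, x_{ji} the j-th block of example x_i, and d_j > 0 fixed block scales.

K_\eta = \sum_{j=1}^{m} \eta_j K_j, \qquad \eta_j \ge 0

\min_{\{w_j\},\, b,\, \xi}\ \ \tfrac{1}{2}\Big(\sum_{j=1}^{m} d_j \lVert w_j \rVert_2\Big)^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.}\quad y_i\Big(\sum_{j=1}^{m} w_j^\top x_{ji} + b\Big) \ge 1 - \xi_i,\ \ \xi_i \ge 0

The squared sum of block norms behaves like a block 1-norm, driving whole blocks (kernels) to zero, but it also makes the dual convex yet non-smooth; adding a Moreau-Yosida term of the form \tfrac{1}{2}\sum_j a_j \lVert w_j \rVert_2^2 smooths the dual enough for an SMO-style solver to apply, which is the route described above.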
Conclusion
The authors propose a novel dual formulation that makes SMO applicable to the MY-regularized non-smooth convex problem.
Their simulations show that the SMO-based solver is faster than the standard Mosek solver.
Summary: this post introduces the support vector machine (SVM), including what kind of classifier it is, the role of support vectors, its characteristics, and its applications. It then covers kernel methods and the kernel matrix, the SMO algorithm, and multiple kernel learning (MKL). A novel dual formulation is proposed to apply SMO to the non-smooth convex problem, and simulations show that SMO is faster than the standard Mosek solver.