Classification of optimization problems (the objective of each is written out after this list):
L2-regularized L1-loss SVM: SVM with an L2 regularizer and the L1 (hinge) loss
L2-regularized L2-loss SVM: SVM with an L2 regularizer and the squared hinge (L2) loss
L2-regularized Logistic Regression: logistic regression with an L2 regularizer
L1-regularized L2-loss Support Vector Classification: SVC with an L1 regularizer and the squared hinge (L2) loss
L1-regularized Logistic Regression: logistic regression with an L1 regularizer
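For reference, these correspond to the standard LIBLINEAR objectives (y_i in {-1, +1}, C > 0 the penalty parameter):

    \min_w \tfrac{1}{2} w^T w + C \sum_i \max(0,\ 1 - y_i w^T x_i)              (L2-regularized, L1-loss)
    \min_w \tfrac{1}{2} w^T w + C \sum_i \max(0,\ 1 - y_i w^T x_i)^2            (L2-regularized, L2-loss)
    \min_w \tfrac{1}{2} w^T w + C \sum_i \log\bigl(1 + e^{-y_i w^T x_i}\bigr)   (L2-regularized, logistic)
    \min_w \|w\|_1 + C \sum_i \max(0,\ 1 - y_i w^T x_i)^2                       (L1-regularized, L2-loss)
    \min_w \|w\|_1 + C \sum_i \log\bigl(1 + e^{-y_i w^T x_i}\bigr)              (L1-regularized, logistic)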
Solvers:
1. L2-regularized L1- and L2-loss SVM (solved in the dual; a sketch of the coordinate update follows the references below)
References:
1.1 A Dual Coordinate Descent Method for Large-scale Linear SVM (Dual)
Note: SVMperf solves the L1-loss SVM.
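A minimal sketch of the dual coordinate descent update from paper 1.1, restricted to the L1-loss case where each dual variable lives in the box [0, C]. The pure-NumPy function dcd_l1loss_svm and its defaults are illustrative assumptions, not LIBLINEAR's actual implementation; for the L2-loss case the paper adds D_ii = 1/(2C) to the diagonal of Q and removes the upper bound C.

    import numpy as np

    def dcd_l1loss_svm(X, y, C=1.0, epochs=10, seed=0):
        # Dual coordinate descent sketch for the L2-regularized L1-loss SVM.
        # X: (n, d) dense array, y: labels in {-1, +1}.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        alpha = np.zeros(n)
        w = np.zeros(d)
        Qii = np.einsum('ij,ij->i', X, X)          # Q_ii = x_i^T x_i
        for _ in range(epochs):
            for i in rng.permutation(n):
                if Qii[i] == 0.0:
                    continue
                G = y[i] * (w @ X[i]) - 1.0        # partial derivative of the dual at coordinate i
                # projected gradient, respecting the box constraint 0 <= alpha_i <= C
                if alpha[i] == 0.0:
                    PG = min(G, 0.0)
                elif alpha[i] == C:
                    PG = max(G, 0.0)
                else:
                    PG = G
                if PG != 0.0:
                    a_old = alpha[i]
                    alpha[i] = min(max(alpha[i] - G / Qii[i], 0.0), C)
                    w += (alpha[i] - a_old) * y[i] * X[i]  # keep w = sum_i alpha_i y_i x_i
        return w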
2. L2-regularized Logistic Regression (solved in the primal; the gradient and Hessian used by the solver are given after the references)
References:
2.1 Trust region Newton method for large-scale logistic regression
http://www.csie.ntu.edu.tw/~cjlin/papers/logistic.pdf
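The trust region Newton method (TRON) of paper 2.1 works on the primal objective f(w) = \tfrac{1}{2} w^T w + C \sum_i \log(1 + e^{-y_i w^T x_i}). With \sigma(t) = 1/(1 + e^{-t}), its gradient and Hessian are

    \nabla f(w)   = w + C \sum_i \bigl(\sigma(y_i w^T x_i) - 1\bigr) y_i x_i
    \nabla^2 f(w) = I + C\, X^T D X, \qquad D_{ii} = \sigma(y_i w^T x_i)\bigl(1 - \sigma(y_i w^T x_i)\bigr)

Each iteration approximately solves a trust-region Newton subproblem; the Hessian is never formed explicitly, only Hessian-vector products \nabla^2 f(w)\,v = v + C\,X^T (D (X v)) are needed.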
3. L2-regularized L2-loss SVM (solved in the primal; the coordinate-wise Newton step is sketched after the references)
References:
3.1 LIBLINEAR: A Library for Large Linear Classification
3.2 Coordinate descent method for large-scale L2-loss linear SVM
http://www.csie.ntu.edu.tw/~cjlin/papers/cdl2.pdf (solves the L2-loss SVM in the primal).
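Roughly, paper 3.2 minimizes f(w) = \tfrac{1}{2} w^T w + C \sum_{i \in I(w)} b_i(w)^2, with b_i(w) = 1 - y_i w^T x_i and I(w) = \{ i : b_i(w) > 0 \}, by cycling through the coordinates of w: each update of w_j takes a Newton-type step d = -D_j'(0)/D_j''(0) followed by a line search, where D_j(z) = f(w + z e_j) and

    D_j'(0)  = w_j - 2C \sum_{i \in I(w)} y_i x_{ij}\, b_i(w)
    D_j''(0) = 1 + 2C \sum_{i \in I(w)} x_{ij}^2

(the squared hinge loss is differentiable but not twice differentiable, so D_j''(0) is a generalized second derivative).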
4. Multi-class SVM by Crammer and Singer (the formulation is written out after the references)
References:
4.1 On the learnability and design of output codes for multiclass problems
4.2 LIBLINEAR: A Library for Large Linear Classification
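The Crammer-Singer formulation trains the k class weight vectors jointly:

    \min_{w_1,\dots,w_k,\ \xi}\ \tfrac{1}{2} \sum_{m=1}^{k} \|w_m\|^2 + C \sum_i \xi_i
    \text{s.t.}\quad w_{y_i}^T x_i - w_m^T x_i \ge e_i^m - \xi_i \ \ \forall m, i, \qquad
    e_i^m = \begin{cases} 0 & m = y_i \\ 1 & m \ne y_i \end{cases}

Prediction is \arg\max_m w_m^T x; LIBLINEAR solves this problem in the dual.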
5. L1-regularized L2-loss Support Vector Machines
References:
5.1 LIBLINEAR: A Library for Large Linear Classification
6. L1-regularized Logistic Regression
References:
6.1 LIBLINEAR: A Library for Large Linear Classification