Magical softmax regression: I struggled with it all evening and couldn't crack it, got up at 3 a.m. to keep going, and just now finally got it working. I can say I understand it thoroughly now, haha. The code experiments with the softmax regression algorithm from Andrew Ng's fourth lecture, with reference to http://ufldl.stanford.edu/wiki/index.php/Softmax_Regression
It finally converged to this result. So satisfying.
sample 0: 0.983690,0.004888,0.011422,likelihood:-0.016445
sample 1: 0.940236,0.047957,0.011807,likelihood:-0.061625
sample 2: 0.818187,0.001651,0.180162,likelihood:-0.200665
sample 3: 0.000187,0.999813,0.000000,likelihood:-0.000187
sample 4: 0.007913,0.992087,0.000000,likelihood:-0.007945
sample 5: 0.001585,0.998415,0.000000,likelihood:-0.001587
sample 6: 0.020159,0.000001,0.979840,likelihood:-0.020366
sample 7: 0.018230,0.000000,0.981770,likelihood:-0.018398
sample 8: 0.025072,0.000000,0.974928,likelihood:-0.025392
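Each row above lists the predicted probability of classes 1 through 3, followed by the natural log of the probability assigned to the sample's true class (e.g. ln(0.983690) ≈ -0.016445 for sample 0). For reference, here is a minimal sketch of how the softmax hypothesis produces such probabilities, assuming (as the 2x4 shape of theta below suggests) the K-1 parameterization from Ng's lecture, where the last class is the baseline with score 0. The function and variable names here are my own, not part of the program below:

#include <math.h>

// Sketch only: softmax probabilities for one sample x[4] given
// theta[2][4], with class 3 as the baseline (score fixed at 0).
void softmax_probs(const double theta[2][4], const double x[4], double p[3])
{
    double score[3] = {0.0, 0.0, 0.0};      // baseline class keeps score 0
    for (int k = 0; k < 2; k++)
        for (int j = 0; j < 4; j++)
            score[k] += theta[k][j] * x[j]; // theta_k . x
    double denom = 0.0;
    for (int k = 0; k < 3; k++)
        denom += exp(score[k]);
    for (int k = 0; k < 3; k++)
        p[k] = exp(score[k]) / denom;       // p(y = k+1 | x)    // the printed likelihood would then be log(p[true class - 1])
}

The full program follows: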
#include "stdio.h"
#include "math.h"
double matrix[9][4] = {   // 9 training samples, 4 features each
    {1, 47, 76, 24},      // x0 = 1 is the intercept term
    {1, 46, 77, 23},
    {1, 48, 74, 22},
    {1, 34, 76, 21},
    {1, 35, 75, 24},
    {1, 34, 77, 25},
    {1, 55, 76, 21},
    {1, 56, 74, 22},
    {1, 55, 72, 22},
};
double result[] = {1, 1, 1, 2, 2, 2, 3, 3, 3};  // true class label of each sample
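// theta: one 4-component parameter vector per non-baseline class; with 3
// classes only K-1 = 2 are needed, since softmax is overparameterized and
// one class's parameters can be fixed at zero (see the UFLDL link above).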
double theta[2][4]={
{0.3,0.3,0.01,0.01},