Explaining the Neural Network Cost Function in Machine Learning
First, recall the cost function for (regularized) logistic regression:
J\left(\theta\right)=-\frac{1}{m}\sum_{i=1}^{m}\left[y^{\left(i\right)}\log{h_\theta\left(x^{\left(i\right)}\right)}+\left(1-y^{\left(i\right)}\right)\log{\left(1-h_\theta\left(x^{\left(i\right)}\right)\right)}\right]+\frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2
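The formula above can be sketched directly in NumPy. This is a minimal illustration, not code from the course: `sigmoid` and `logistic_cost` are hypothetical helper names, and following the common convention (as in Ng's course) the bias parameter θ₀ is excluded from the regularization sum.

```python
import numpy as np

def sigmoid(z):
    """Logistic function h_theta(x) = 1 / (1 + e^{-theta^T x})."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(theta, X, y, lam):
    """Regularized logistic regression cost J(theta).

    X: (m, n) design matrix with a leading column of ones,
    y: (m,) labels in {0, 1}, lam: regularization strength lambda.
    theta[0] (the bias term) is left out of the penalty, a common
    convention assumed here.
    """
    m = X.shape[0]
    h = sigmoid(X @ theta)                                   # h_theta(x^(i)) for all i
    cross_entropy = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    reg = lam / (2 * m) * np.sum(theta[1:] ** 2)             # (lambda / 2m) * sum theta_j^2
    return cross_entropy + reg
```

With θ = 0 every prediction is 0.5, so the unregularized cost reduces to log 2, which is a handy sanity check.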
According to the earlier section 6-4 (section numbers refer to Andrew Ng's Machine Learning course), the logistic regression cost function was defined as:
J\left(\theta\right)=\frac{1}{m}\sum_{i=1}^{m}Cost\left(h_\theta\left(x^{\left(i\right)}\right),y^{\left(i\right)}\right)