1. Why do w and x become vectors once we write w⋅x ≡ ∑_j w_j x_j?
The first change is to write ∑jwjxj as a dot product, w⋅x≡∑jwjxj , where w and x are vectors whose components are the weights and inputs, respectively.
The dot product of two vectors is a scalar, and for two vectors of the same dimension it is exactly the sum of the component-wise products. So the rewrite above simply treats the weights and inputs as vectors: w⋅x = ∑_j w_j x_j.
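A minimal sketch of this equivalence (the vectors w and x below are arbitrary illustrative values, not from the book):

```python
# The dot product w.x is just the sum of component-wise products:
# w.x = sum_j w_j * x_j
w = [0.5, -1.2, 3.0]   # weight vector (example values)
x = [2.0, 1.0, -0.5]   # input vector (example values)

dot = sum(w_j * x_j for w_j, x_j in zip(w, x))
print(dot)  # 0.5*2.0 + (-1.2)*1.0 + 3.0*(-0.5) = -1.7
```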
2. Implementing NAND, AND, and OR gates with perceptrons.
we have a perceptron with two inputs, each with weight −2, and an overall bias of 3. Then we see that input 00 produces output 1, since (−2)∗0+(−2)∗0+3=3 is positive. Here, I’ve introduced the ∗ symbol to make the multiplications explicit. Similar calculations show that the inputs 01 and 10 produce output 1. But the input 11 produces output 0, since (−2)∗1+(−2)∗1+3=−1 is negative. And so our perceptron implements a NAND gate!
The above is the NAND gate as implemented in the e-book. By analogy, an AND gate can be built by setting the bias to −3 and the weights of x1 and x2 to 2. Input 00 gives 2×0 + 2×0 + (−3) = −3, which is negative. Input 01 gives 2×0 + 2×1 + (−3) = −1, negative (and likewise for 10). Input 11 gives 2×1 + 2×1 + (−3) = 1, positive. This implements an AND gate.
For an OR gate, set the bias to −1 and both weights to 2. Then input 00 gives a negative sum (output 0), while inputs 01, 10, and 11 all give positive sums (output 1).
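The three gates above can be sketched as code, assuming the perceptron rule from the text (output 1 if w⋅x + b is positive, else 0) and using the weight/bias values worked out above:

```python
def perceptron(weights, bias, inputs):
    # Perceptron rule from the text: fire iff w.x + b > 0.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

def nand(x1, x2):          # book's values: weights (-2, -2), bias 3
    return perceptron((-2, -2), 3, (x1, x2))

def and_gate(x1, x2):      # values derived above: weights (2, 2), bias -3
    return perceptron((2, 2), -3, (x1, x2))

def or_gate(x1, x2):       # values derived above: weights (2, 2), bias -1
    return perceptron((2, 2), -1, (x1, x2))

# Print the full truth table for all three gates.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", nand(x1, x2), and_gate(x1, x2), or_gate(x1, x2))
```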
3. What is exp?
exp denotes the exponential function with base e, i.e. exp(x) = e^x (not the natural logarithm, which is its inverse).
4. How is this partial-derivative formula obtained? (unresolved)
Δoutput ≈ ∑_j (∂output/∂w_j) Δw_j + (∂output/∂b) Δb
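One way to read this formula (a standard calculus fact, not specific to the book): it is the first-order Taylor expansion, i.e. the total differential, of output viewed as a function of all the weights and the bias:

```latex
% output is a function of w_1, ..., w_n and b; perturb each slightly:
\Delta\,\mathrm{output}
  \equiv \mathrm{output}(w_1 + \Delta w_1, \dots, w_n + \Delta w_n, b + \Delta b)
       - \mathrm{output}(w_1, \dots, w_n, b)
  \approx \sum_j \frac{\partial\,\mathrm{output}}{\partial w_j}\,\Delta w_j
        + \frac{\partial\,\mathrm{output}}{\partial b}\,\Delta b
```

The approximation holds when all the Δw_j and Δb are small, which is exactly the regime in which gradient descent operates.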