Machine Learning
Average article quality score: 95
檀檀吸甲烷
School of Information Management, Nanjing University
Attention Mechanism
From the University of Waterloo. 1 Attention Overview; 1.1 RNN's challenges: long-range dependencies (how to deal with them: combine the RNN with some attention mechanism); gradient vanishing and explosion; a large number of training steps (we should unroll it for as many steps as ne…
Original · 2021-10-05 01:10:03 · 222 views · 0 comments
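The preview above motivates attention as a remedy for the RNN's long-range dependency problem. A minimal sketch of scaled dot-product attention in NumPy (the function names and array shapes are illustrative assumptions, not taken from the article):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # scores[i, j]: similarity of query i to key j, scaled by sqrt(d_k).
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    # Output: each query gets a weighted average of the values.
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))  # 2 queries of dimension 4
K = rng.standard_normal((3, 4))  # 3 keys
V = rng.standard_normal((3, 4))  # 3 values
out, w = scaled_dot_product_attention(Q, K, V)
```

Because every query can attend directly to every position, the path between distant tokens is constant-length, which is exactly what the truncated preview means by combining the RNN with an attention mechanism.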
Machine_Learning_Regularization
In this chapter, we introduce regularization methods with some hands-on Python code. Contents: Regularization; Lasso; Ridge; Elastic Net; Lasso: a real case; GridSearchCV; LassoCV; LassoLarsIC; Source. Why regularization: in the bias…
Original · 2021-10-16 11:09:32 · 237 views · 0 comments
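The contents list above covers Lasso, Ridge, and Elastic Net. As a small illustration of what regularization does, here is a NumPy-only sketch of closed-form ridge regression (the data and the `ridge_fit` helper are invented for this example; the article itself uses scikit-learn estimators such as `LassoCV` and `GridSearchCV`):

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    # Closed-form ridge solution: w = (X^T X + alpha * I)^{-1} X^T y.
    # Larger alpha penalizes ||w||^2 more, shrinking coefficients toward 0.
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

rng = np.random.default_rng(42)
X = rng.standard_normal((50, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.standard_normal(50)

w_small = ridge_fit(X, y, alpha=0.01)   # nearly the ordinary least-squares fit
w_big = ridge_fit(X, y, alpha=1000.0)   # heavily shrunk toward zero
```

Comparing `w_small` and `w_big` shows the bias-variance trade-off the preview alludes to: a larger penalty biases the coefficients toward zero in exchange for lower variance.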
Activation Functions in Neural Networks
Activation functions: an activation function transforms the weighted sum of a node's inputs into its output. Sigmoid: the sigmoid function is also known as the logistic function. It maps the real line into $(0, 1)$, and its derivative is non-negative at every point of its domain. When used as a neural-network activation function, it is mostly applied to hidden-layer outputs. $S(x) = \frac{1}{1+e^{-x}}$, $S'(x) = \frac{e^{-x}}{(1+e^{-x})^2} = S(x)\cdot(1-S(x))$ …
Original · 2021-06-23 15:29:51 · 222 views · 0 comments
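The two formulas in the preview translate directly into code. A minimal sketch (function names are my own):

```python
import numpy as np

def sigmoid(x):
    # S(x) = 1 / (1 + e^{-x}): maps the real line into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # S'(x) = S(x) * (1 - S(x)): non-negative everywhere,
    # with its maximum value of 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)
```

Note that `sigmoid_grad` never exceeds 0.25, which is one reason deep sigmoid networks suffer from the vanishing-gradient problem mentioned in the attention article above.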
Perceptrons and single layer neural nets
An introduction to ANNs, Threshold Perceptron Learning, Sigmoid Perceptron Learning, the Perceptron Algorithm, and some related theorems.
Original · 2021-10-10 13:41:33 · 245 views · 0 comments