Blog (3)
[Repost] ReLU (Rectified Linear Units) Activation Function
Repost source: http://www.cnblogs.com/neopenx/p/4453161.html — the ReLU (Rectified Linear Units) activation function. Reference paper: Deep Sparse Rectifier Neural Networks (an interesting read). Origins: traditional activation functions, studies of neuron firing rates in the brain, and sparse activation. Traditional sigmoid-family activation…
2016-10-26 10:03:14 638
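The repost above contrasts ReLU with the traditional sigmoid family; a minimal NumPy sketch of the two activations (function names are illustrative, not from the original post):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) — negatives are clipped to zero,
    # which is the source of the sparse activations the paper discusses.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Traditional sigmoid: smooth, saturating, never exactly zero.
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))     # negatives clipped to 0; positives pass through unchanged
print(sigmoid(x))  # all outputs strictly between 0 and 1
```

Note how roughly half of the ReLU outputs on symmetric input are exactly zero, whereas the sigmoid outputs are all nonzero; this sparsity is a key point of the referenced paper.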
[Original] Paper Reading Notes: Deep Metric Learning via Lifted Structured Feature Embedding
1. Explaining "Lifted" and "Structured". Structured: compare structured learning, i.e. predicting structured objects rather than scalar discrete or real values. For example, the problem of translating a natural language…
2016-10-24 22:18:17 2127
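The note above concerns the lifted structured loss from the referenced paper. A minimal NumPy sketch of that loss, assuming a precomputed pairwise distance matrix and a list of positive pairs (function and parameter names are illustrative, not from the paper's code):

```python
import numpy as np

def lifted_structured_loss(dist, pos_pairs, margin=1.0):
    # dist: (n, n) symmetric matrix of pairwise embedding distances.
    # pos_pairs: list of (i, j) index pairs known to be similar.
    # For each positive pair, every negative of *both* anchors enters a
    # smooth log-sum-exp hinge against the margin ("lifting" the pair
    # into the full batch structure), then a squared hinge is applied.
    n = dist.shape[0]
    positives = set()
    for i, j in pos_pairs:
        positives.add((i, j))
        positives.add((j, i))
    total = 0.0
    for i, j in pos_pairs:
        neg_terms = []
        for a in (i, j):                      # both anchors of the pair
            for k in range(n):
                if k != i and k != j and (a, k) not in positives:
                    neg_terms.append(margin - dist[a, k])
        # log-sum-exp over all negatives of the pair, plus the positive distance
        J_ij = np.log(np.sum(np.exp(neg_terms))) + dist[i, j]
        total += max(0.0, J_ij) ** 2
    return total / (2 * len(pos_pairs))
```

With well-separated clusters (small positive distances, large negative distances relative to the margin) the loss approaches zero, which is the behavior the embedding is trained toward.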
[Repost] Mean Average Precision (MAP)
Precision (see also: precision and recall) is the fraction of the documents retrieved that are relevant to the user's information need: precision = |{relevant documents} ∩ {retrieved documen…
2016-10-11 15:02:13 2185
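The excerpt above is cut off mid-formula; the standard definitions it starts from can be sketched directly. A minimal implementation of average precision and MAP for ranked retrieval (function names are illustrative):

```python
def average_precision(ranked, relevant):
    # ranked: list of doc ids in retrieval order.
    # relevant: set of doc ids relevant to the query.
    # AP = mean of precision@k taken at each rank k where a relevant doc appears,
    # divided by the total number of relevant docs.
    hits, precisions = 0, []
    for k, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            precisions.append(hits / k)   # precision@k at this hit
    return sum(precisions) / len(relevant) if relevant else 0.0

def mean_average_precision(runs):
    # runs: list of (ranked, relevant) pairs, one per query; MAP is their mean AP.
    return sum(average_precision(r, rel) for r, rel in runs) / len(runs)

# Relevant docs d1 and d3 ranked at positions 1 and 3:
# AP = (1/1 + 2/3) / 2 = 5/6
ap = average_precision(["d1", "d2", "d3", "d4"], {"d1", "d3"})
```

Dividing by |relevant| (rather than by the number of hits) penalizes relevant documents that were never retrieved, which is the convention used in IR evaluation.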