[Original] Learning RNN from scratch (derivation of RNN parameters)
It has been more than a year since my last original post, and I finally have some new notes to share. This post explains the basic principles of RNNs and gives a detailed derivation of the RNN parameter updates (backpropagation); if you want to know how the RNN parameters are derived, it is worth reading carefully. As my time was limited, the notes below may contain omissions, and corrections are welcome. The post works through a very classic RNN code example in detail. The RNN's…
2016-06-28 09:49:01 11304 4
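The excerpt above cuts off before the derivation itself; as a hedged sketch (using standard vanilla-RNN notation, which may differ from the post's own symbols), the forward recurrence and the backpropagation-through-time gradient for the recurrent weights are:

```latex
h_t = \tanh\!\left(W_{xh} x_t + W_{hh} h_{t-1}\right), \qquad
\hat{y}_t = W_{hy} h_t
```

```latex
\frac{\partial L}{\partial W_{hh}}
  = \sum_{t} \frac{\partial L_t}{\partial h_t}
    \sum_{k=1}^{t}
    \left( \prod_{j=t}^{k+1} \operatorname{diag}\!\left(1 - h_j^{2}\right) W_{hh} \right)
    \frac{\partial^{+} h_k}{\partial W_{hh}}
```

Here the product runs with decreasing $j$, each factor $\operatorname{diag}(1 - h_j^{2})\, W_{hh}$ is the Jacobian $\partial h_j / \partial h_{j-1}$ obtained by differentiating through $\tanh$, and $\partial^{+}$ denotes the immediate partial derivative that treats $h_{k-1}$ as a constant.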
[Repost] Hinton's Dropout in 3 Lines of Python
Hinton's Dropout in 3 Lines of Python. How to install Dropout into a neural network by only changing 3 lines of Python. Posted by iamtrask on July 28, 2015. Summary: Dropout is a vital…
2016-06-30 15:20:30 1602
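The repost's three lines are truncated in the excerpt above; as a minimal sketch of the same idea, here is inverted dropout in NumPy (the function name, shapes, and keep probability are illustrative assumptions, not taken from the post):

```python
import numpy as np

def dropout(h, keep_prob, rng):
    """Inverted dropout: zero each activation with probability 1 - keep_prob,
    and scale the survivors by 1/keep_prob so the expected activation
    is unchanged at test time."""
    mask = (rng.random(h.shape) < keep_prob) / keep_prob
    return h * mask

rng = np.random.default_rng(0)
h = np.ones((4, 5))          # a layer of activations, all 1.0 for clarity
out = dropout(h, 0.8, rng)   # each entry is either 0.0 or 1/0.8
```

At training time this is applied to hidden activations on each forward pass; at test time the layer is simply used as-is, which is the point of the inverted scaling.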
[Repost] Cross entropy
In information theory, the cross entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits…
2016-06-24 19:15:28 2419
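The truncated definition above translates directly into code; a small sketch using base-2 logarithms so that the result is in bits, as the excerpt says (names are illustrative):

```python
import numpy as np

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log2(q_i): the average number of bits needed
    to encode events drawn from p using a code optimized for q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log2(q))

# One-hot true distribution against a uniform guess over two outcomes:
bits = cross_entropy([1.0, 0.0], [0.5, 0.5])  # exactly 1 bit
```

When p == q, cross entropy reduces to the entropy of p, which is its minimum over q.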
[Repost] RNN computation
Link: http://karpathy.github.io/2015/05/21/rnn-effectiveness/
RNN computation. So how do these things work? At the core, RNNs have a deceptively simple API: They accept an input vector x and give you an…
2016-06-12 16:33:20 1718
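The linked post describes the RNN API in exactly those terms: a `step` method that takes an input vector x and returns an output vector y while updating a hidden state. A minimal sketch of such a cell (sizes, initialization scale, and class layout are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

class RNN:
    """Minimal vanilla RNN cell: step() consumes an input vector x,
    updates the hidden state h, and returns an output vector y."""
    def __init__(self, input_size, hidden_size, output_size):
        self.W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.01
        self.W_xh = rng.standard_normal((hidden_size, input_size)) * 0.01
        self.W_hy = rng.standard_normal((output_size, hidden_size)) * 0.01
        self.h = np.zeros(hidden_size)

    def step(self, x):
        # update the hidden state from the previous state and the input,
        # then project it to the output vector
        self.h = np.tanh(self.W_hh @ self.h + self.W_xh @ x)
        return self.W_hy @ self.h

rnn = RNN(input_size=3, hidden_size=8, output_size=2)
y = rnn.step(np.ones(3))
```

Because `self.h` persists between calls, the output of each `step` depends on the entire history of inputs, which is what distinguishes an RNN from a feedforward layer.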
[Repost] The Max Trick when Computing Softmax
The softmax function appears in many machine learning algorithms. The idea is, if you have a set of values, to scale them so they sum to 1.0 and therefore can be interpreted as probabilities. For exampl…
2016-06-12 11:16:21 1402
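The "max trick" the title refers to is subtracting max(x) before exponentiating: this leaves the result mathematically unchanged (the factor exp(-max(x)) appears in both numerator and denominator) but prevents exp() from overflowing on large inputs. A short sketch:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax: shift by max(x) so the largest
    exponent is exp(0) = 1, then normalize to sum to 1.0."""
    x = np.asarray(x, dtype=float)
    e = np.exp(x - np.max(x))
    return e / e.sum()

# A naive exp(1000) would overflow to inf; the shifted version is exact:
probs = softmax([1000.0, 1000.0])  # [0.5, 0.5]
```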
AdaBoost face detection program
2012-06-04
The Art of Computer Programming
2012-05-27
Network chat source code
2012-05-14
Microsoft SQL Server 2008 Step by Step
2010-04-22
Signals and Systems lab report: linear system analysis
2008-12-12
Database practicum paper: a library management system
2008-11-27
Computer graphics lab code, all in C++ / OpenGL
2008-11-24