Copyright notice: This is an original article: http://blog.csdn.net/programmer_wei/article/details/51002379
The function scan in the deep-learning Python library Theano is a form of iteration, so it can be used in loop-like scenarios. Note that during computation, scan can access the outputs of the previous n steps, which makes it well suited to RNNs.
First, let's look at Example 1:
import theano
import theano.tensor as T
k = T.iscalar("k")
A = T.vector("A")
# Symbolic description of the result
result, updates = theano.scan(fn=lambda prior_result, A: prior_result * A,
                              outputs_info=T.ones_like(A),
                              non_sequences=A,
                              n_steps=k)
# We only care about A**k, but scan has provided us with A**1 through A**k.
# Discard the values that we don't care about. Scan is smart enough to
# notice this and not waste memory saving them.
final_result = result[-1]
# compiled function that returns A**k
power = theano.function(inputs=[A,k], outputs=final_result, updates=updates)
print(power(range(10),2))
print(power(range(10),4))
Output:
[ 0. 1. 4. 9. 16. 25. 36. 49. 64. 81.]
[ 0.00000000e+00 1.00000000e+00 1.60000000e+01 8.10000000e+01
2.56000000e+02 6.25000000e+02 1.29600000e+03 2.40100000e+03
4.09600000e+03 6.56100000e+03]
The algorithm above is equivalent to:
result = 1
for i in range(k):
    result = result * A
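As a quick sanity check of this equivalence, here is a plain NumPy sketch (NumPy stands in for Theano's symbolic tensors; the function name `power_np` is made up for illustration):

```python
import numpy as np

def power_np(A, k):
    """Elementwise A**k via repeated multiplication, mirroring the scan loop."""
    result = np.ones_like(A, dtype=float)  # plays the role of outputs_info
    for _ in range(k):                     # plays the role of n_steps
        result = result * A                # the body of fn
    return result

A = np.arange(10, dtype=float)
print(power_np(A, 2))  # [ 0.  1.  4.  9. 16. 25. 36. 49. 64. 81.]
```

This matches the first line of output above, and `power_np(A, 4)` matches the second.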
In Example 1 above:
fn: defines a lambda function that implements the cumulative product A**k.
outputs_info: the initial state of scan's output, i.e. the initial value set for result. Here it is a vector of ones with the same shape as A, which is what makes the cumulative product work.
non_sequences: describes the non-sequence inputs, i.e. A is a fixed input that is passed unchanged to fn at every step.
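To make these three arguments concrete, the following pure-Python sketch emulates what scan does in the no-sequence case (the function name scan_like is hypothetical, for illustration only):

```python
def scan_like(fn, outputs_info, non_sequences, n_steps):
    """Toy emulation of theano.scan with no sequences:
    feed the previous output plus the non-sequence input into fn each step."""
    prior_result = outputs_info          # initial state from outputs_info
    results = []
    for _ in range(n_steps):             # iterate n_steps times
        prior_result = fn(prior_result, non_sequences)
        results.append(prior_result)     # scan keeps every step's output
    return results                       # results[-1] is the final value

# Elementwise cumulative product, as in Example 1, on a 2-element "vector":
out = scan_like(lambda prior, A: [p * a for p, a in zip(prior, A)],
                outputs_info=[1.0, 1.0], non_sequences=[2.0, 3.0], n_steps=3)
print(out[-1])  # [8.0, 27.0]
```

Like the real scan, this returns every intermediate result, which is why Example 1 takes result[-1] to keep only A**k.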