Keras (1): GAN Code Analysis

Source code:

    def minb_disc(x):
        diffs = K.expand_dims(x, 3) - K.expand_dims(K.permute_dimensions(x, [1, 2, 0]), 0)
        abs_diffs = K.sum(K.abs(diffs), 2)
        x = K.sum(K.exp(-abs_diffs), 2)

        return x

    def lambda_output(input_shape):
        return input_shape[:2]

    num_kernels = 3  # for simplicity, changed to 3 here; originally 100
    dim_per_kernel = 5

    M = Dense(num_kernels * dim_per_kernel, bias=False, activation=None)
    MBD = Lambda(minb_disc, output_shape=lambda_output)
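
Before stepping through it, here is what `minb_disc` computes, re-written as a minimal NumPy sketch (illustration only; `minb_disc_np` is a hypothetical name, and the input shape is assumed to be `(batch, num_kernels, dim_per_kernel)` as in the test code below). For every pair of samples in the batch it takes the L1 distance along each kernel's row, then sums `exp(-distance)` over the batch:

```python
import numpy as np

def minb_disc_np(x):
    # x: (B, K, D) = (batch, num_kernels, dim_per_kernel)
    diffs = x[:, :, :, None] - x.transpose(1, 2, 0)[None, :, :, :]  # (B, K, D, B)
    abs_diffs = np.abs(diffs).sum(axis=2)   # (B, K, B): pairwise L1 distances
    return np.exp(-abs_diffs).sum(axis=2)   # (B, K): similarity to the whole batch

x = np.random.random((4, 3, 5))
out = minb_disc_np(x)
print(out.shape)  # (4, 3): one value per sample per kernel
```

Each output entry is at least 1, because a sample's distance to itself is 0 and contributes exp(0) = 1 to the sum.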

Test code:

    # This helper prints the symbolic expression and its value after each step;
    # hehe is the randomly generated initial input data (assigned in __main__ below).
    def printpp(src, i, x_temp):
        print(pp(src))
        res = src.eval({x_temp: hehe})
        print(res)
        print("res_shape: ", res.shape, " \n ======================%d" % i)
        return i + 1


    if __name__ == "__main__":
        import keras.backend as K
        from keras.models import Model
        from keras.layers.core import Dense, Lambda, Reshape
        from keras.layers import Input, merge
        from theano import pp
        import numpy as np

        hehe = 1  # placeholder; real data is assigned below before the Lambda layer runs


        def minb_disc(x):
            i = 1
            x_temp = x
            i = printpp(src=x_temp, i=i, x_temp=x_temp)

            x_temp1 = K.permute_dimensions(x_temp, [1, 2, 0])
            i = printpp(src=x_temp1, i=i, x_temp=x_temp)

            x_temp2 = K.expand_dims(x_temp1, 0)
            i = printpp(src=x_temp2, i=i, x_temp=x_temp)

            x_temp3 = K.expand_dims(x, 3)
            i = printpp(src=x_temp3, i=i, x_temp=x_temp)

            # diffs = K.expand_dims(x, 3) - K.expand_dims(K.permute_dimensions(x, [1, 2, 0]), 0)
            diffs = x_temp3 - x_temp2
            i = printpp(src=diffs, i=i, x_temp=x_temp)

            beforesum = K.abs(diffs)
            i = printpp(src=beforesum, i=i, x_temp=x_temp)

            abs_diffs = K.sum(beforesum, 2)
            i = printpp(src=abs_diffs, i=i, x_temp=x_temp)

            x = K.sum(K.exp(-abs_diffs), 2)

            return x


        def lambda_output(input_shape):
            return input_shape[:2]


        num_kernels = 3
        dim_per_kernel = 5

        x1 = np.random.random(size=(1, 10))
        x = Input(shape=(10,))
        M = Dense(num_kernels * dim_per_kernel, bias=False, activation=None)
        MBD = Lambda(minb_disc, output_shape=lambda_output)

        x_mbd = M(x)
        model1 = Model(input=x, output=x_mbd)
        result1 = model1.predict(x=x1)
        print(result1.shape, " \n ======================1")

        x_mbd = Reshape((num_kernels, dim_per_kernel))(x_mbd)
        model2 = Model(input=x, output=x_mbd)
        result2 = model2.predict(x=x1)
        hehe = result2  # from this step on, hehe holds real data and feeds the MBD (Lambda) layer
        print(result2.shape, " \n ======================2")

        x_mbd = MBD(x_mbd)
        model3 = Model(input=x, output=x_mbd)
        result3 = model3.predict(x=x1)

        print(result3.shape, " \n ======================3")

        y = merge([x, x_mbd], mode='concat')
        model4 = Model(input=x, output=y)
        result4 = model4.predict(x=x1)
        print(result4.shape, " \n ======================4")
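
The four `predict` calls above can be cross-checked with a plain NumPy mock of the same shape chain (a sketch; the random matrix `W` is a stand-in for the Dense kernel, and the bias is omitted to match `bias=False`):

```python
import numpy as np

num_kernels, dim_per_kernel = 3, 5
x1 = np.random.random((1, 10))
W = np.random.random((10, num_kernels * dim_per_kernel))

step1 = x1 @ W                                          # Dense:   (1, 15)
step2 = step1.reshape(-1, num_kernels, dim_per_kernel)  # Reshape: (1, 3, 5)
diffs = step2[:, :, :, None] - step2.transpose(1, 2, 0)[None, :, :, :]
step3 = np.exp(-np.abs(diffs).sum(axis=2)).sum(axis=2)  # MBD:     (1, 3)
step4 = np.concatenate([x1, step3], axis=1)             # concat:  (1, 13)
print(step1.shape, step2.shape, step3.shape, step4.shape)
```

The final shape `(1, 13)` is the 10-dimensional input concatenated with the 3 minibatch-discrimination features.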

To debug Keras code you need to understand the structure of the computation graph: every layer is a subgraph, and a model is assembled from multiple layers. In plumbing terms, each layer is a section of pipe, and a model is a drainage system built out of those sections. Calling compile and fit amounts to pouring water (the data) into the system after the whole thing has been assembled, while predict lets you build just part of the system and pour water in to test that part directly. Pay particular attention to the Lambda layer here: the body of the function it wraps (`def minb_disc(x)` in this example) is also laying pipe, i.e. building the graph, which is easy to overlook.
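
The plumbing analogy can be sketched with plain Python functions (a toy illustration only, not the Keras API): composing the layers assembles the pipe system once, and data only flows when the composed function is finally called:

```python
def build_pipe(*layers):
    """Compose layer functions into one pipeline (the "graph building" step)."""
    def pipeline(data):
        for layer in layers:   # data flows through each pipe section in order
            data = layer(data)
        return data
    return pipeline

double = lambda v: v * 2   # one pipe section
add_one = lambda v: v + 1  # another pipe section

model = build_pipe(double, add_one)  # system assembled; no data has flowed yet
print(model(3))  # "pouring water in": 3 -> 6 -> 7
```

Note that `build_pipe` runs each layer's body only when `model(...)` is called, just as the statements inside `minb_disc` build symbolic expressions rather than computing values immediately.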
