Study Notes GAN 002: DCGAN

Ian J. Goodfellow's paper: https://arxiv.org/abs/1406.2661

Two networks. G (Generator): the generator network takes random noise z and produces a sample G(z) from it. D (Discriminator): the discriminator network judges whether a sample is real; given an input sample x, it outputs D(x), the probability that x is real. D(x) = 1 means the sample is certainly real; D(x) = 0 means it cannot be real.

During training, the generator G tries to produce realistic samples that fool the discriminator D, while D tries to tell G's samples apart from the real ones. In the ideal state, G produces samples G(z) that D cannot distinguish from real data, so D(G(z)) = 0.5. At that point the generator G can be used on its own to produce samples.

The objective:

min_G max_D V(D, G) = E_{x~p_data(x)}[log D(x)] + E_{z~p_z(z)}[log(1 - D(G(z)))]

It has two terms. x is a real sample, z is the noise fed into G, and G(z) is the sample G produces. D(x) is D's estimate of the probability that a real sample is real; the closer to 1 the better. D(G(z)) is D's estimate of the probability that a generated sample is real. G wants D(G(z)) as large as possible, which makes V(D, G) smaller: hence min over G. D wants D(x) large and D(G(z)) small, which makes V(D, G) larger: hence max over D.
1. x sampled from data -> differentiable function D -> D(x) tries to be near 1
2. Input noise z -> differentiable function G -> x sampled from model -> D -> D tries to make D(G(z)) near 0, G tries to make D(G(z)) near 1
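The two expectations in V(D, G) can be estimated on a minibatch of discriminator scores. As a sanity check, here is a minimal pure-Python sketch (the `value_fn` helper and the sample scores are my own illustration, not from the paper) showing that a confident discriminator pushes V(D, G) up, while at the ideal point D(x) = D(G(z)) = 0.5 the value settles at 2*log(0.5) = -log 4:

```python
import math

def value_fn(d_real, d_fake):
    """Monte-Carlo estimate of V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))].

    d_real: discriminator scores D(x) on real samples
    d_fake: discriminator scores D(G(z)) on generated samples
    """
    term_real = sum(math.log(p) for p in d_real) / len(d_real)
    term_fake = sum(math.log(1.0 - p) for p in d_fake) / len(d_fake)
    return term_real + term_fake

# A confident discriminator (D(x) near 1, D(G(z)) near 0) makes V large
# (close to 0), which is what max_D drives toward ...
print(value_fn([0.9, 0.95], [0.05, 0.1]))
# ... while at the equilibrium D(x) = D(G(z)) = 0.5, V = 2*log(0.5) = -log 4
print(round(value_fn([0.5, 0.5], [0.5, 0.5]), 4))  # -1.3863
```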

Both D and G are trained with stochastic gradient descent.

Algorithm 1 Minibatch stochastic gradient descent training of generative adversarial nets. The number of steps to apply to the discriminator, k, is a hyperparameter. We used k = 1, the least expensive option, in our experiments.
for number of training iterations do
    for k steps do
        Sample minibatch of m noise samples {z(1), ..., z(m)} from noise prior p_g(z)
        Sample minibatch of m examples {x(1), ..., x(m)} from data generating distribution p_data(x)
        Update the discriminator by ascending its stochastic gradient:
            ∇_θd (1/m) Σ_{i=1}^{m} [log D(x(i)) + log(1 - D(G(z(i))))]
    end for
    Sample minibatch of m noise samples {z(1), ..., z(m)} from noise prior p_g(z)
    Update the generator by descending its stochastic gradient:
            ∇_θg (1/m) Σ_{i=1}^{m} log(1 - D(G(z(i))))
end for
The gradient-based updates can use any standard gradient-based learning rule. We used momentum in our experiments.

Step one trains D: the larger V(D, G), the better, so D's gradient is ascended (increased). Step two trains G: the smaller V(D, G), the better, so G's gradient is descended (decreased). The two steps alternate.
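To make the ascend/descend pair concrete, here is a sketch for a toy 1-D model. The parameterisation D(x) = sigmoid(u*x + c), G(z) = w*z + b is my own illustration, not the paper's; the code computes the minibatch objective from Algorithm 1 and the analytic gradients that D ascends and G descends:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def objective(u, c, w, b, xs, zs):
    """Minibatch objective from Algorithm 1:
    (1/m) * sum_i [ log D(x_i) + log(1 - D(G(z_i))) ]
    with toy models D(x) = sigmoid(u*x + c) and G(z) = w*z + b."""
    m = len(xs)
    total = 0.0
    for x, z in zip(xs, zs):
        gz = w * z + b
        total += math.log(sigmoid(u * x + c)) + math.log(1.0 - sigmoid(u * gz + c))
    return total / m

def grads(u, c, w, b, xs, zs):
    """Analytic gradients of the objective. D ASCENDS on (u, c);
    G DESCENDS on (w, b), which only touch the log(1 - D(G(z))) term."""
    m = len(xs)
    gu = gc = gw = gb = 0.0
    for x, z in zip(xs, zs):
        d_real = sigmoid(u * x + c)           # D(x)
        gz = w * z + b                        # G(z)
        d_fake = sigmoid(u * gz + c)          # D(G(z))
        gu += (1 - d_real) * x - d_fake * gz  # d/du of both log terms
        gc += (1 - d_real) - d_fake           # d/dc of both log terms
        gw += -d_fake * u * z                 # d/dw of log(1 - D(G(z)))
        gb += -d_fake * u                     # d/db of log(1 - D(G(z)))
    return gu / m, gc / m, gw / m, gb / m
```

One training iteration with k = 1 is then the discriminator ascent step `u += lr * gu; c += lr * gc` followed by the generator descent step `w -= lr * gw; b -= lr * gb`. The analytic gradients can be checked against finite differences of `objective`.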

How DCGAN works: https://arxiv.org/abs/1511.06434, by Alec Radford et al.
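A core ingredient of the DCGAN generator is upsampling with fractionally-strided (transposed) convolutions instead of fully connected layers. Here is a minimal 1-D sketch of that operation (a toy illustration, not the paper's 2-D implementation) showing how a short input is spread out into a longer output:

```python
def conv_transpose1d(x, kernel, stride):
    """1-D fractionally-strided ("transposed") convolution: each input value
    stamps a scaled copy of the kernel into the output at stride-spaced
    positions. Output length = (len(x) - 1) * stride + len(kernel)."""
    out = [0.0] * ((len(x) - 1) * stride + len(kernel))
    for i, v in enumerate(x):
        for j, k in enumerate(kernel):
            out[i * stride + j] += v * k
    return out

# Two inputs, kernel of ones, stride 2: the stamped copies overlap at index 2.
print(conv_transpose1d([1.0, 2.0], [1.0, 1.0, 1.0], stride=2))
# -> [1.0, 1.0, 3.0, 2.0, 2.0]
```

The DCGAN generator stacks several such layers (in 2-D, with learned kernels and batch normalization) to grow a noise vector into a full-resolution image.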
