Deep Learning Notes, July 27: GAN


Preface

These are the deep learning notes for July 27, divided into three chapters:

  • Basic Idea of GAN: Generator vs. Discriminator; Can Generator Learn by Itself——Yes; Can Discriminator Generate——Yes;
  • Conditional Generation by GAN: Conditional GAN;
  • Unsupervised Conditional Generation: Approach 1: Direct Transformation; Approach 2: Projection to Common Space.

一、Basic Idea of GAN


1、Generator vs. Discriminator

(1)、Generator

Learns to generate the object at the component level.

(2)、Discriminator

Evaluates the whole object and picks the best one.

(3)、Algorithm

  • Initialize the generator (G) and the discriminator (D);
  • In each training iteration:
    1. Fix G and update D: the discriminator learns to assign high scores to real objects and low scores to generated objects;
    2. Fix D and update G: the generator learns to "fool" the discriminator.

  • Learning D:

    • Sample m examples $\{x^1, x^2, \dots, x^m\}$ from the database;
    • Sample m noise samples $\{z^1, z^2, \dots, z^m\}$ from a distribution;
    • Obtain generated data $\{\tilde{x}^1, \tilde{x}^2, \dots, \tilde{x}^m\}$, where $\tilde{x}^i = G(z^i)$;
    • Update discriminator parameters $\theta_d$ to maximize $\tilde{V}$:
      1. $\tilde{V} = \frac{1}{m}\sum_{i=1}^{m}\log D(x^i) + \frac{1}{m}\sum_{i=1}^{m}\log\big(1 - D(\tilde{x}^i)\big)$;
      2. $\theta_d + \eta \nabla \tilde{V}(\theta_d) \to \theta_{d+1}$.
  • Learning G:

    • Sample m noise samples $\{z^1, z^2, \dots, z^m\}$ from a distribution;
    • Update generator parameters $\theta_g$ to maximize $\tilde{V}$:
      1. $\tilde{V} = \frac{1}{m}\sum_{i=1}^{m}\log\big(D(G(z^i))\big)$;
      2. $\theta_g + \eta \nabla \tilde{V}(\theta_g) \to \theta_{g+1}$.
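The alternating updates above can be sketched in a 1-D toy setting. The Gaussian data distribution, the linear generator, and the logistic discriminator below are illustrative assumptions, not part of the original notes:

```python
import numpy as np

rng = np.random.default_rng(0)
m, eta = 256, 0.5

# Discriminator D(x) = sigmoid(w*x + b); Generator G(z) = a*z + c.
w, b = 1.0, 0.0          # theta_d
a, c = 1.0, 0.0          # theta_g

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def V_d(w, b, x, x_fake):
    """V~ = mean log D(x) + mean log(1 - D(x~))."""
    return (np.mean(np.log(sigmoid(w * x + b)))
            + np.mean(np.log(1.0 - sigmoid(w * x_fake + b))))

# --- Learning D: sample real and generated data, ascend V~(theta_d).
x = rng.normal(2.0, 0.5, m)            # m examples from the "database"
z = rng.normal(0.0, 1.0, m)            # m noise samples
x_fake = a * z + c                     # x~^i = G(z^i)

before_d = V_d(w, b, x, x_fake)
d_real = sigmoid(w * x + b)
d_fake = sigmoid(w * x_fake + b)
# Analytic gradients of V~ with respect to (w, b).
grad_w = np.mean((1 - d_real) * x) - np.mean(d_fake * x_fake)
grad_b = np.mean(1 - d_real) - np.mean(d_fake)
w, b = w + eta * grad_w, b + eta * grad_b      # gradient *ascent*
assert V_d(w, b, x, x_fake) > before_d

# --- Learning G: sample fresh noise, ascend V~(theta_g) = mean log D(G(z)).
z = rng.normal(0.0, 1.0, m)
d_fake = sigmoid(w * (a * z + c) + b)
before_g = np.mean(np.log(d_fake))
grad_a = np.mean((1 - d_fake) * w * z)
grad_c = np.mean((1 - d_fake) * w)
a, c = a + eta * grad_a, c + eta * grad_c
assert np.mean(np.log(sigmoid(w * (a * z + c) + b))) > before_g
```

With a small enough step size both objectives increase after their respective updates, which is exactly the alternating maximization the algorithm prescribes.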

(4)、Pros & Cons

|      | Generator | Discriminator |
|------|-----------|---------------|
| Pros | Easy to generate, even with a deep model | Considers the big picture |
| Cons | Only imitates the appearance; hard to learn the correlation between components | Generation is not always feasible, especially when the model is deep |

2、Can Generator learn by itself——Yes

Train an auto-encoder: its NN decoder maps a code vector to an object, so the decoder alone can serve as a generator. NN Decoder = Generator.
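As a minimal sketch of the idea, where a random linear map stands in for a trained NN decoder (an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
code_dim, obj_dim = 8, 64

# Placeholder for a trained NN decoder's weights (assumption).
decoder_W = rng.normal(size=(obj_dim, code_dim))

def decode(z):
    """NN Decoder = Generator: code vector z -> generated object."""
    return np.tanh(decoder_W @ z)

z = rng.normal(size=code_dim)   # sample a code from the prior
obj = decode(z)                 # generated object, no discriminator needed
assert obj.shape == (obj_dim,)
```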

3、Can Discriminator generate——Yes

The discriminator is a function $D: X \to \mathbb{R}$:

  • Input: an object x (e.g. an image);
  • Output $D(x)$: a scalar representing how "good" the object x is.
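Since the discriminator only scores objects, one (expensive) way to generate with it is to search for the highest-scoring candidate, $\tilde{x} = \arg\max_x D(x)$. A brute-force sketch, where the quadratic scorer is a toy assumption standing in for a trained discriminator:

```python
import numpy as np

def D(x):
    """Toy scalar score of how 'good' object x is; peaks at x = 3."""
    return -(x - 3.0) ** 2

# "Generate" by enumerating candidates and keeping the best-scored one.
candidates = np.linspace(-5.0, 5.0, 1001)
best = candidates[np.argmax(D(candidates))]
assert abs(best - 3.0) < 1e-3
```

In practice this argmax is intractable for images, which is one reason the generator network exists: it learns to produce high-scoring objects directly.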

二、Conditional Generation by GAN

1、Conditional GAN


(1)、Traditional supervised approach

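The weakness of the traditional supervised approach can be shown with a toy calculation: when one condition has several distinct correct outputs, the MSE-optimal single prediction is their average, i.e. a blurry in-between. The two "correct outputs" below are illustrative assumptions:

```python
import numpy as np

# Two distinct valid outputs for the same condition (assumption),
# e.g. the same object rendered facing left vs. facing right.
targets = np.array([[1.0, 0.0],
                    [0.0, 1.0]])

def mse(pred):
    """Mean squared error of one prediction against both valid targets."""
    return np.mean((targets - pred) ** 2)

best_mse_output = targets.mean(axis=0)          # the regression optimum
assert np.allclose(best_mse_output, [0.5, 0.5]) # matches neither target
assert mse(best_mse_output) < mse(targets[0])   # yet it scores best on MSE
```

The average beats either sharp output on MSE while being a valid instance of neither, which is why plain regression produces blurry results and a conditional GAN is used instead.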

(2)、Discriminator

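In a conditional GAN, the discriminator scores the (condition, object) pair, so it can reject a realistic object that ignores the condition. A toy lookup-based sketch, where the scoring rule is an assumption standing in for a trained network:

```python
# Toy stand-in for a trained conditional discriminator (assumption).
REALISTIC = {"cat_image", "train_image"}

def D(condition, x):
    """Score 1 only if x is realistic AND matches the condition."""
    realistic = x in REALISTIC
    matches = x.startswith(condition)
    return float(realistic and matches)

assert D("cat", "cat_image") == 1.0      # real and matching
assert D("cat", "train_image") == 0.0    # realistic but wrong condition
assert D("cat", "cat_scribble") == 0.0   # matches but not realistic
```

A discriminator that saw only the object could not penalize the second case, which is why the pair is scored jointly.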


三、Unsupervised Conditional Generation

Transform an object from one domain to another without paired data.

1、Approach 1:Direct Transformation

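Direct transformation is typically trained CycleGAN-style: alongside the adversarial loss, a cycle-consistency term $\lVert G_{Y\to X}(G_{X\to Y}(x)) - x\rVert$ forces the transformed object to keep the input's content. A sketch with toy linear generators that happen to be exact inverses (an assumption, standing in for trained networks):

```python
import numpy as np

# Toy generators between domains X and Y (assumption): exact inverses.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])
G_XY = lambda x: A @ x                  # domain X -> domain Y
G_YX = lambda y: np.linalg.inv(A) @ y   # domain Y -> domain X

x = np.array([1.0, -2.0])
# Cycle-consistency loss: going X -> Y -> X should recover the input.
cycle_loss = np.linalg.norm(G_YX(G_XY(x)) - x)
assert cycle_loss < 1e-9
```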

2、Approach 2:Projection to Common Space

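In the common-space approach, encoders from both domains project objects into a shared latent space, and transfer happens by decoding an X-domain code with the Y-domain decoder. A toy sketch where orthogonal linear maps stand in for trained encoder/decoder networks (an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
# Orthogonal matrix: Q.T is its exact inverse, a toy trained encoder pair.
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))

EN_X = lambda x: Q.T @ x       # domain X -> common latent space
DE_Y = lambda z: Q @ z + 1.0   # latent -> domain Y (shift = toy "style")

x = rng.normal(size=4)
y = DE_Y(EN_X(x))              # unsupervised X -> Y transfer
assert np.allclose(y, x + 1.0) # latent content kept; Y "style" applied
```

The point of the sketch: because both domains share one latent space, content survives the round trip while the target decoder imposes the target domain's appearance.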

