Coursera - Andrew Ng - Deep Learning - Course 4: Convolutional Neural Networks - Week 1 - Programming Assignment 2

This post covers Course 4 of Andrew Ng's Deep Learning specialization on Coursera, Convolutional Neural Networks, specifically the second programming assignment of week 1. It walks through the main steps of creating the model in TensorFlow: initializing parameters, forward propagation, and computing the cost. The assignment involves implementing helper functions for a TensorFlow model, building and training a fully functioning ConvNet in TensorFlow, and applying it to a classification problem.

Contents of this post:

Coursera Andrew Ng Deep Learning specialization,

Course 4: Convolutional Neural Networks

Week 1: Foundations of Convolutional Neural Networks

Programming assignment notes (a record of my mistakes).
 

Convolutional Neural Networks: Application

In this assignment, you will:

  • Implement helper functions that you will use when implementing a TensorFlow model
  • Implement a fully functioning ConvNet using TensorFlow
  • Build and train a ConvNet in TensorFlow for a classification problem

1.0 - TensorFlow model

1.1 - Create placeholders

TensorFlow requires that you create placeholders for the input data that will be fed into the model when running the session.

X = tf.placeholder(tf.float32, shape=[None, n_H0, n_W0, n_C0])
Y = tf.placeholder(tf.float32, shape=[None, n_y])
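
For reference, here is a minimal sketch of wrapping these two lines in a helper (the function name create_placeholders follows the assignment's convention; TensorFlow 1.x API is assumed):

import tensorflow as tf

def create_placeholders(n_H0, n_W0, n_C0, n_y):
    # X holds a batch of input images; None lets the batch size vary at run time
    X = tf.placeholder(tf.float32, shape=[None, n_H0, n_W0, n_C0])
    # Y holds the corresponding one-hot labels with n_y classes
    Y = tf.placeholder(tf.float32, shape=[None, n_y])
    return X, Y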

1.2 - Initialize parameters

You will initialize weights/filters W1 and W2 using tf.contrib.layers.xavier_initializer(seed = 0). You don't need to worry about bias variables, as you will soon see that TensorFlow functions take care of the bias.

Note also that you will only initialize the weights/filters for the conv2d functions.

TensorFlow initializes the layers for the fully connected part automatically. 

W1 = tf.get_variable("W1", [4, 4, 3, 8], initializer=tf.contrib.layers.xavier_initializer(seed=0))
W2 = tf.get_variable("W2", [2, 2, 8, 16], initializer=tf.contrib.layers.xavier_initializer(seed=0))
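
Putting the two filters together, a minimal initialize_parameters sketch could look like the following (the returned dictionary layout is an assumption for illustration; TensorFlow 1.x API):

def initialize_parameters():
    # W1: 8 filters of size 4x4 over 3 input channels
    W1 = tf.get_variable("W1", [4, 4, 3, 8],
                         initializer=tf.contrib.layers.xavier_initializer(seed=0))
    # W2: 16 filters of size 2x2 over the 8 channels produced by the first conv layer
    W2 = tf.get_variable("W2", [2, 2, 8, 16],
                         initializer=tf.contrib.layers.xavier_initializer(seed=0))
    # Biases are omitted on purpose: TensorFlow's layer functions handle them
    parameters = {"W1": W1, "W2": W2}
    return parameters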

1.3 - Forward propagation

In TensorFlow, there are built-in functions that carry out the convolution steps for you.

  • tf.nn.conv2d(X, W1, strides = [1,s,s,1], padding = 'SAME'): given an input X and a group of filters W1, this function convolves W1's filters on X.
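
To show where this section is heading, here is a hedged sketch of the full forward pass (CONV2D -> RELU -> MAXPOOL -> CONV2D -> RELU -> MAXPOOL -> FLATTEN -> FULLYCONNECTED). The pooling windows (8x8 and 4x4), the stride of 1 for the convolutions, and the 6 output classes are assumptions matching the assignment's architecture; treat them as illustrative:

def forward_propagation(X, parameters):
    W1 = parameters["W1"]
    W2 = parameters["W2"]

    # First block: convolution with stride 1 and 'SAME' padding, ReLU, then 8x8 max-pool
    Z1 = tf.nn.conv2d(X, W1, strides=[1, 1, 1, 1], padding='SAME')
    A1 = tf.nn.relu(Z1)
    P1 = tf.nn.max_pool(A1, ksize=[1, 8, 8, 1], strides=[1, 8, 8, 1], padding='SAME')

    # Second block: convolution with stride 1 and 'SAME' padding, ReLU, then 4x4 max-pool
    Z2 = tf.nn.conv2d(P1, W2, strides=[1, 1, 1, 1], padding='SAME')
    A2 = tf.nn.relu(Z2)
    P2 = tf.nn.max_pool(A2, ksize=[1, 4, 4, 1], strides=[1, 4, 4, 1], padding='SAME')

    # Flatten and map to 6 class scores; no activation here because the softmax
    # is applied later when computing the cost
    F = tf.contrib.layers.flatten(P2)
    Z3 = tf.contrib.layers.fully_connected(F, 6, activation_fn=None)
    return Z3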
