Blog posts (13)
[Original] 268. Missing Number
Given an array containing n distinct numbers taken from 0, 1, 2, ..., n, find the one that is missing from the array. For example, given nums = [0, 1, 3], return 2. Follow up: Could you implement a solution using only O(1) extra space complexity and O(n) runtime…
2020-10-26 11:21:17 146
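The follow-up above is usually met with the XOR trick: XOR-ing every index 0..n with every array element cancels each number that is present, leaving the missing one. A minimal sketch (the function name `missing_number` is mine, not from the post):

```python
def missing_number(nums):
    """Return the number in 0..n that is missing from nums (n = len(nums)).

    x ^ x == 0, so XOR-ing all indices 0..n with all elements cancels
    every present value, leaving the missing one.
    O(n) runtime, O(1) extra space.
    """
    result = len(nums)  # start with n, the one index the loop skips
    for i, v in enumerate(nums):
        result ^= i ^ v
    return result
```

For the example in the post, `missing_number([0, 1, 3])` evaluates to 2.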
[Original] A summary of problems encountered while using TensorFlow
1. With TensorBoard: "no scalar data was found". Solution link: https://github.com/tensorflow/tensorflow/issues/2353. When using tf.summary to record the validation-set loss, "no scalar data was found" kept appearing. with tf.Session(graph=g) as sess: ...
2019-06-07 08:10:31 214
[Original] Mask R-CNN: A Close Reading
1. Introduction In principle Mask R-CNN is an intuitive extension of Faster R-CNN, yet constructing the mask branch properly is critical for good results. Most importantly, Faster R-CNN was not designed fo...
2018-05-11 17:55:22 382
[Repost] Welcome to the CSDN-markdown Editor 2
Welcome to blogging with the Markdown editor. This Markdown editor is adapted from StackEdit; writing blog posts with it brings a whole new experience: concise Markdown and extended-Markdown syntax; code-block highlighting; image links and image upload; LaTeX math formulas; UML sequence diagrams and flowcharts; offline blogging; import and export of Markdown files; rich keyboard shortcuts. Shortcuts: bold Ctrl + B, italic Ctrl + I, quote Ctrl
2017-12-22 16:00:14 176
[Original] deeplearning.ai-lecture4-week1-Convolutional Neural Networks: Step by Step
Convolutional Neural Networks: Step by Step Welcome to Course 4's first assignment! In this assignment, you will implement convolutional (CONV) and pooling (POOL) layers in numpy, including both forw
2017-12-21 21:09:07 198
[Repost] deeplearning.ai-lecture1-building deep neural network steps
This assignment mainly implements some "helper functions" in preparation for building a two-layer network and an L-layer network next. The steps for implementing a two-layer or deep network are as follows: Step 1: Initialize the parameters of a two-layer network and of an L-layer network, respectively. Step 2: Implement forward propagation: 1. Compute the linear part of the network's forward pass, i.e. Z[l]. 2. Implement the ReLU and sigmoid activation functions
2017-12-12 20:11:00 176
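The two forward-propagation steps above can be sketched in numpy; these helper names follow the course convention but are my reconstruction, not the post's code:

```python
import numpy as np

def linear_forward(A, W, b):
    """Linear part of a layer's forward pass: Z[l] = W[l] A[l-1] + b[l].

    A -- activations from the previous layer, shape (n_prev, m)
    W -- weight matrix, shape (n_curr, n_prev)
    b -- bias vector, shape (n_curr, 1), broadcast across the m examples
    """
    Z = W @ A + b
    return Z

def relu(Z):
    """Element-wise max(0, Z)."""
    return np.maximum(0, Z)

def sigmoid(Z):
    """Element-wise 1 / (1 + e^(-Z))."""
    return 1 / (1 + np.exp(-Z))
```

With W of shape (1, 2), A of shape (2, 3), and b of shape (1, 1), `linear_forward` returns a (1, 3) matrix of pre-activations Z.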
[Original] deeplearning.ai-lecture1-building deep neural network-summary
First, a summary map. 1. Parameter initialization for an L-layer network: returns the parameters W(1)…W(L-1) of each layer. def initialize_parameters_deep(layer_dims): for l in range(1,L): parameters['W'+str(l)]=np.random.randn(layer_dims[l],layer_dims[
2017-12-12 19:56:38 342
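The truncated loop above is the course's standard small-random-weights initialization; a sketch of the full function, assuming numpy and the course's 0.01 scaling factor:

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    """Initialize weights and biases for an L-layer network.

    layer_dims -- list of layer sizes, input layer first.
    Returns a dict with W1..W(L-1) and b1..b(L-1), where Wl has shape
    (layer_dims[l], layer_dims[l-1]) and bl has shape (layer_dims[l], 1).
    """
    parameters = {}
    L = len(layer_dims)
    for l in range(1, L):
        # Small random weights break symmetry; biases can start at zero.
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters
```

For example, `initialize_parameters_deep([5, 4, 3])` yields W1 of shape (4, 5), b1 of shape (4, 1), W2 of shape (3, 4), and b2 of shape (3, 1).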
[Original] deeplearning.ai-lecture2-week1-regularization-homework
Regularization Welcome to the second assignment of this week. Deep Learning models have so much flexibility and capacity that overfitting can be a serious problem if the training dataset is not big
2017-12-12 19:52:01 336
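The main technique in that assignment is L2 regularization, which adds (λ / 2m) · Σ‖W[l]‖² to the cost so large weights are penalized. A sketch of the cost term (the function name and parameter layout are mine, matching the initialization dict above):

```python
import numpy as np

def l2_regularization_cost(parameters, lambd, m):
    """L2 penalty: (lambd / (2*m)) * sum of squared weight entries.

    parameters -- dict holding W1, b1, W2, b2, ... ; by convention only
    the W matrices are regularized, not the biases.
    m -- number of training examples.
    """
    L = len(parameters) // 2  # number of (W, b) layer pairs
    total = sum(np.sum(np.square(parameters['W' + str(l)]))
                for l in range(1, L + 1))
    return (lambd / (2 * m)) * total
```

With W1 = [[1, 2]] and W2 = [[2]], λ = 0.1, m = 10, the penalty is (0.1 / 20) · (1 + 4 + 4) = 0.045.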
[Original] deeplearning.ai-lecture2-week1-Initialization-homework
Initialization Welcome to the first assignment of "Improving Deep Neural Networks". Training your neural network requires specifying an initial value of the weights. A well-chosen initialization met
2017-12-12 19:50:14 487 1
[Original] deeplearning.ai-lecture2-week1-Gradient Checking-homework
Gradient Checking Welcome to the final assignment for this week! In this assignment you will learn to implement and use gradient checking. You are part of a team working to make mobile payments avai
2017-12-12 19:48:38 516
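Gradient checking compares the analytic derivative with a two-sided finite difference, grad ≈ (J(θ+ε) − J(θ−ε)) / (2ε). A one-dimensional sketch (function names are mine, not the assignment's):

```python
def gradient_check(J, dJ, theta, epsilon=1e-7):
    """Relative error between the analytic derivative dJ(theta) and the
    centered numerical approximation of J around theta."""
    grad_approx = (J(theta + epsilon) - J(theta - epsilon)) / (2 * epsilon)
    grad = dJ(theta)
    # Relative difference; values near 1e-7 or below suggest the
    # analytic gradient is implemented correctly.
    return abs(grad - grad_approx) / max(abs(grad) + abs(grad_approx), 1e-12)
```

For example, checking J(θ) = θ² against dJ(θ) = 2θ at θ = 3 gives a relative error close to zero.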
[Original] deeplearning.ai-lecture2-week3-Tensorflow Tutorial-homework
TensorFlow Tutorial Welcome to this week's programming assignment. Until now, you've always used numpy to build neural networks. Now we will step you through a deep learning framework that will allow
2017-12-12 19:44:18 965
[Original] deeplearning.ai lecture2-week2-optimization methods
Optimization Methods Until now, you've always used Gradient Descent to update the parameters and minimize the cost. In this notebook, you will learn more advanced optimi
2017-12-11 19:51:11 918
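One of the advanced optimizers that notebook covers is gradient descent with momentum, which smooths updates with an exponentially weighted average of past gradients. A scalar sketch under that assumption (names are illustrative):

```python
def momentum_step(param, grad, velocity, learning_rate=0.01, beta=0.9):
    """One momentum update:
    v = beta * v + (1 - beta) * grad;  param = param - lr * v.
    Returns the updated parameter and velocity.
    """
    velocity = beta * velocity + (1 - beta) * grad
    param = param - learning_rate * velocity
    return param, velocity
```

Starting from velocity 0 with gradient 2.0, learning rate 0.1, and beta 0.9, one step gives v = 0.2 and moves the parameter by 0.02.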
[Repost] A Review on Deep Learning Techniques Applied to Semantic Segmentation (translated) - (1)
Original links: http://blog.csdn.net/u011771047/article/details/72779221 http://blog.csdn.net/u014451076/article/details/71101850 This part covers: the abstract; 1. Introduction; 2. Terminology and background concepts. Abstract: Semantic image segmentation is attracting ever more interest from computer vision and machine learning researchers. A growing number of emerging
2017-11-16 10:51:29 1566