NN-DL
hello_pig1995 (making a little progress every day)
neural-networks-and-deep-learning more_data.py
The main work here is just training with different amounts of data and then plotting the results. Excerpt: """more_data ~~~~~~~~~ Plot graphs to illustrate the performance of MNIST when different size training sets are used.""" # Standard library: import json, random, sys … (2016-07-22)
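To show the shape of such an experiment, here is a small self-contained sketch, not taken from the repository: synthetic Gaussian data and a nearest-centroid classifier stand in for MNIST and the book's network, so every name below is illustrative. It measures test accuracy as the training set grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    """Two synthetic Gaussian classes, n points each (a stand-in for MNIST)."""
    x = np.vstack([rng.normal(-1.0, 2.0, (n, 10)),
                   rng.normal(1.0, 2.0, (n, 10))])
    y = np.array([0] * n + [1] * n)
    return x, y

def nearest_centroid_accuracy(n_train):
    """Train a nearest-centroid classifier on n_train points per class."""
    x_tr, y_tr = make_data(n_train)
    x_te, y_te = make_data(500)
    centroids = np.array([x_tr[y_tr == c].mean(axis=0) for c in (0, 1)])
    dists = np.linalg.norm(x_te[:, None, :] - centroids[None, :, :], axis=2)
    return (dists.argmin(axis=1) == y_te).mean()

# More training data generally means better test accuracy:
for n in (5, 50, 500):
    print(n, nearest_centroid_accuracy(n))
```

Plotting these accuracies against n gives the same kind of curve more_data.py draws for MNIST.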
neural-networks-and-deep-learning weight_initialization.py
weight_initialization tests the network under different weight-initialization schemes. It turns out that the default scheme, i.e. the one with relatively smaller variance, gives better results, which is easy to see by comparing the two curves. The first function trains the networks and saves the results; the second plots the accuracy curves. Excerpt: """weight_initialization ~~~~~~~~~~~~~~~~~~~~~~~~ This program s…""" (2016-07-21)
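As a rough illustration of why the smaller-variance initialization helps, here is a minimal numpy sketch (not from the repository) comparing the spread of a neuron's weighted input z = w·x under plain N(0, 1) weights versus weights scaled by 1/sqrt(n_in):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in = 1000                              # hypothetical fan-in of one neuron
x = rng.standard_normal(n_in)            # fixed input with ~unit-variance entries

trials = 2000
W_plain = rng.standard_normal((trials, n_in))   # plain N(0, 1) weights
W_scaled = W_plain / np.sqrt(n_in)              # N(0, 1/n_in) weights

z_plain = W_plain @ x    # std grows like sqrt(n_in), so sigmoids saturate
z_scaled = W_scaled @ x  # std stays near 1, keeping sigmoids responsive

print(z_plain.std(), z_scaled.std())
```

With z spread over tens of units, a sigmoid neuron is almost always saturated and learns slowly, which is the effect the accuracy curves in this post show.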
neural-networks-and-deep-learning valley2.py
Yet another valley. Excerpt: """valley2.py ~~~~~~~~~~ Plots a function of two variables to minimize. The function is a fairly generic valley function. Note that this is a duplicate of valley.py, but omits labels on the axi…""" (2016-07-21)
neural-networks-and-deep-learning valley.py
A valley. Excerpt: """valley ~~~~~~ Plots a function of two variables to minimize. The function is a fairly generic valley function.""" #### Libraries # Third party libraries: from matplotlib.ticker import LinearLoca… (2016-07-21)
neural-networks-and-deep-learning false_minimum.py
Local minima everywhere. Excerpt: """false_minimum ~~~~~~~~~~~~~ Plots a function of two variables with many false minima.""" #### Libraries # Third party libraries: from matplotlib.ticker import LinearLocator # Note that axes3d… (2016-07-21)
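A minimal sketch of this kind of surface plot (the exact function in false_minimum.py may differ; this is just a generic bowl plus an oscillation, which produces many false minima):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import numpy as np
import matplotlib.pyplot as plt

# A slow quadratic bowl plus a fast sine ripple: many local minima.
X, Y = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))
Z = 0.1 * (X**2 + Y**2) + np.sin(2 * X) * np.sin(2 * Y)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(X, Y, Z, cmap="viridis", linewidth=0)
ax.set_xlabel("x")
ax.set_ylabel("y")
fig.savefig("false_minimum.png")
```

Gradient descent started in one of the ripples settles into the nearest dimple rather than the bottom of the bowl.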
neural-networks-and-deep-learning backprop_magnitude_nabla.py
Apparently meant to illustrate the vanishing-gradient problem. Excerpt: """backprop_magnitude_nabla ~~~~~~~~~~~~~~~~~~~~~~~~ Using backprop2 I constructed a 784-30-30-30-30-30-10 network to classify MNIST data. I ran ten mini-batches of size 100, with eta =…""" (2016-07-21)
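The effect the excerpt describes can be reproduced in a few lines of numpy: build a deep sigmoid network of the same 784-30-30-30-30-30-10 shape, backpropagate once, and print the norm of the bias gradient delta at each layer. All weights, inputs, and targets below are random and purely illustrative, not the numbers from the original run.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

sizes = [784, 30, 30, 30, 30, 30, 10]
weights = [rng.standard_normal((m, n)) / np.sqrt(n)
           for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal((m, 1)) for m in sizes[1:]]

# Forward pass on one random input.
a = rng.standard_normal((784, 1))
activations, zs = [a], []
for w, b in zip(weights, biases):
    z = w @ a + b
    zs.append(z)
    a = sigmoid(z)
    activations.append(a)

# Backward pass with a quadratic cost against a random target.
y = rng.standard_normal((10, 1))
delta = (activations[-1] - y) * a * (1 - a)
norms = [np.linalg.norm(delta)]        # output layer first
for l in range(2, len(sizes)):
    sp = sigmoid(zs[-l]) * (1 - sigmoid(zs[-l]))
    delta = (weights[-l + 1].T @ delta) * sp
    norms.append(np.linalg.norm(delta))

# Printed from the earliest hidden layer to the output layer:
print([f"{n:.5f}" for n in reversed(norms)])
```

Each backward step multiplies by a sigmoid derivative of at most 0.25, so the earlier layers see much smaller gradients, which is exactly the pattern the script plots.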
neural-networks-and-deep-learning misleading_gradient_contours.py
This is really just a contour plot, with no essential difference from the earlier misleading_gradient; the contour levels simply select which values get lines drawn. Excerpt: """misleading_gradient_contours ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Plots the contours of the function from misleading_gradient.py""" #### Lib… (2016-07-21)
neural-networks-and-deep-learning misleading_gradient.py
Gradient descent. (2016-07-21)
neural-networks-and-deep-learning network3.py
Essentially a Theano implementation of network2, and more efficient. (2016-07-21)
neural-networks-and-deep-learning test.py
Tests of the various imported modules. Excerpt: import mnist_loader; import matplotlib; import matplotlib.pyplot as plt; import numpy as np; from sklearn.decomposition import RandomizedPCA; training_data, test_inputs, actual_test_results… (2016-07-21)
neural-networks-and-deep-learning network2.py
This code adds quite a few improvements, such as the cross-entropy loss, a regularization term, and a better weight-initialization scheme. Excerpt: """network2.py ~~~~~~~~~~~~~~ An improved version of network.py, implementing the stochastic gradient descent learning algorithm for a feedforward neural netw…""" (2016-07-20)
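The regularized cross-entropy cost that network2.py introduces can be sketched in a few lines of numpy. The function name and the toy numbers below are mine, not the module's API; the formula is C = -(1/n) Σ [y ln a + (1-y) ln(1-a)] + (λ/2n) Σ w²:

```python
import numpy as np

def cross_entropy_cost(a, y, weights, lmbda, n):
    """Regularized cross-entropy over n training examples:
    C = -1/n * sum[y ln a + (1-y) ln(1-a)] + lmbda/(2n) * sum w^2."""
    ce = -np.sum(np.nan_to_num(y * np.log(a) + (1 - y) * np.log(1 - a))) / n
    reg = 0.5 * (lmbda / n) * sum(np.sum(w ** 2) for w in weights)
    return ce + reg

# Tiny check on a single training example (hypothetical numbers):
a = np.array([0.9, 0.1])                 # network output
y = np.array([1.0, 0.0])                 # target
weights = [np.array([[0.5, -0.5]])]      # one toy weight matrix
cost = cross_entropy_cost(a, y, weights, lmbda=0.1, n=1)
print(cost)
```

The np.nan_to_num guard mirrors the trick the book's code uses to avoid log(0) blowing up when an output saturates at exactly 0 or 1.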
neural-networks-and-deep-learning network.py
"""network.py~~~~~~~~~~A module to implement the stochastic gradient descent learningalgorithm for a feedforward neural network. Gradients are calculatedusing backpropagation. Note that I have fo原创 2016-07-20 21:26:56 · 352 阅读 · 0 评论 -
neural-networks-and-deep-learning mnist_svm.py
This simply calls svm.SVC from sklearn to train and then classify. Reference: [1] http://scikit-learn.org/stable/modules/generated/sklearn.svm.SVC.html Excerpt: """mnist_svm ~~~~~~~~~ A classifier program for recognizing handwritten digits from…""" (2016-07-20)
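As a compact stand-in for what mnist_svm.py does (it trains on the full MNIST set loaded via mnist_loader), this sketch trains the same sklearn classifier on sklearn's bundled 8x8 digits; the train/test split size here is arbitrary:

```python
from sklearn import datasets, svm

# sklearn's small built-in digits dataset stands in for MNIST here.
digits = datasets.load_digits()
n_train = 1000

clf = svm.SVC()  # default RBF kernel, as with a plain SVC() call
clf.fit(digits.data[:n_train], digits.target[:n_train])

predictions = clf.predict(digits.data[n_train:])
accuracy = (predictions == digits.target[n_train:]).mean()
print("accuracy: %.3f" % accuracy)
```

Even with default hyperparameters the SVM baseline is strong, which is the point the original script makes against the naive baselines.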
neural-networks-and-deep-learning mnist-loader
I recently noticed that my laptop still had this folder of code I had never read, and it seemed worth a look. This is the second code walkthrough I have written; the first was GibbsLDA++, which was quite interesting. So let's analyze it, starting with the simplest file. cPickle and gzip are convenient tools for compressed storage: cPickle handles serialization, gzip handles compression. Excerpt: def load_data(): """Return the MNIST data as a tuple co…""" (2016-07-14)
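The cPickle/gzip combination is easy to demonstrate with the standard library alone (in Python 3, pickle replaces cPickle). The toy dictionary and filename below are illustrative; mnist.pkl.gz is stored the same way:

```python
import gzip
import pickle  # Python 3's pickle replaces Python 2's cPickle

# Round-trip a toy object: pickle serializes, gzip compresses.
data = {"images": [[0.0] * 784], "labels": [7]}

with gzip.open("toy.pkl.gz", "wb") as f:
    pickle.dump(data, f)

with gzip.open("toy.pkl.gz", "rb") as f:
    restored = pickle.load(f)

print(restored["labels"])  # [7]
```

load_data in mnist_loader does exactly this read half, then unpacks the tuple of training, validation, and test sets.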
neural-networks-and-deep-learning mnist_pca.py
The main job of this program is to run PCA on the 28x28 images and then reconstruct them. Excerpt: """mnist_pca ~~~~~~~~~ Use PCA to reconstruct some of the MNIST test digits.""" # My libraries: import mnist_loader # Third-party libraries: import matplotlib; import mat… (2016-07-16)
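The script itself uses sklearn, but the project-then-reconstruct idea is a few lines of plain numpy. This sketch uses random stand-in data of the right shape (200 flattened 28x28 images); the component count k is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 784))  # stand-in for flattened MNIST digits

# PCA via SVD: project onto the top-k principal components, then map back.
k = 30
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
components = Vt[:k]                    # (k, 784) principal directions
codes = (X - mean) @ components.T      # (200, k) low-dimensional codes
X_rec = codes @ components + mean      # (200, 784) reconstructions

err = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
print("relative reconstruction error: %.3f" % err)
```

On real digits the reconstructions from a few dozen components are already recognizable, which is what the script's side-by-side plots show.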
neural-networks-and-deep-learning expand_mnist.py
There is not much to this one: it loads the previously saved file, shifts each image by one pixel up, down, left, and right, then serializes and stores the result. Excerpt: """expand_mnist.py ~~~~~~~~~~~~~~~~~~ Take the 50,000 MNIST training images, and create an expanded set of 250,000 images, by displacing each training image…""" (2016-07-20)
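The one-pixel displacement can be sketched with np.roll; note this wraps pixels around the edge, whereas the original script zero-fills the vacated row or column, so treat this as an approximation:

```python
import numpy as np

def expand_image(img):
    """Return the original 28x28 image plus four copies shifted by one
    pixel (up, down, left, right).  np.roll wraps at the border; the
    original expand_mnist.py zero-fills instead."""
    shifts = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # (rows, cols)
    expanded = [img]
    for dr, dc in shifts:
        expanded.append(np.roll(img, (dr, dc), axis=(0, 1)))
    return expanded

img = np.zeros((28, 28))
img[14, 14] = 1.0                 # a single lit pixel in the center
copies = expand_image(img)
print(len(copies))                # 5: the original plus four shifted versions
```

Applied to all 50,000 training images this yields the 250,000-image expanded set the docstring mentions.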
neural-networks-and-deep-learning mnist_average_darkness.py
Classifies each image by its average darkness (sum of pixel values). The accuracy is very low. Excerpt: """mnist_average_darkness ~~~~~~~~~~~~~~~~~~~~~~ A naive classifier for recognizing handwritten digits from the MNIST data set. The program classifies digits based on how dark…""" (2016-07-20)
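A classifier in the spirit of mnist_average_darkness fits in a dozen lines of numpy: average the total darkness of each digit class on the training set, then label a test image with the class whose average is nearest. The toy data below is made up to show the mechanics:

```python
import numpy as np

def avg_darkness_classifier(train_imgs, train_labels, test_imgs):
    """Label each test image with the class whose average total darkness
    (sum of pixel values) is closest to the test image's own darkness."""
    classes = np.unique(train_labels)
    darkness = train_imgs.sum(axis=1)
    avgs = np.array([darkness[train_labels == d].mean() for d in classes])
    test_darkness = test_imgs.sum(axis=1)
    return classes[np.abs(test_darkness[:, None] - avgs[None, :]).argmin(axis=1)]

# Toy demo: "1"s use little ink, "8"s use a lot (hypothetical data).
train_imgs = np.vstack([np.full((5, 784), 0.1), np.full((5, 784), 0.5)])
train_labels = np.array([1] * 5 + [8] * 5)
test_imgs = np.array([np.full(784, 0.12), np.full(784, 0.48)])
preds = avg_darkness_classifier(train_imgs, train_labels, test_imgs)
print(preds)  # [1 8]
```

Since many digits use similar amounts of ink, this single-feature rule separates MNIST classes poorly, hence the very low accuracy.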
neural-networks-and-deep-learning multiple_eta.py
This file is also fairly simple: fix three different values of eta, i.e. three learning rates, train separate networks with the default weight initialization, and after 30 epochs record the results in result. The plot function draws only the validation cost. You can see that with a large eta the network fails to learn, and with a small eta it learns slowly. Excerpt: """multiple_eta ~~~~~~~~~~~~~~~ This program shows h…""" (2016-07-21)
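The "too large fails, too small crawls" behavior shows up even on a 1-D quadratic. This sketch (my own toy, not the script's networks; the three eta values are arbitrary) runs plain gradient descent on C(x) = x² with three learning rates:

```python
def gd_trajectory(eta, steps=30, x0=10.0):
    """Minimize C(x) = x^2 (gradient 2x) by plain gradient descent,
    returning the cost after each step."""
    x = x0
    costs = []
    for _ in range(steps):
        x -= eta * 2 * x
        costs.append(x ** 2)
    return costs

# Three learning rates, echoing multiple_eta.py's comparison:
for eta in (0.01, 0.25, 1.05):
    print(f"eta={eta}: final cost {gd_trajectory(eta)[-1]:.3g}")
```

The update is x ← (1 - 2η)x, so η = 0.01 shrinks the cost slowly, η = 0.25 converges fast, and η = 1.05 makes |1 - 2η| > 1 and the cost diverges, matching the three validation-cost curves the script plots.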