pytorch
just__we
TASK 4

    from torch import nn

    class simpleNet(nn.Module):
        '''Defines a simple three-layer fully connected neural network; every layer is linear.'''
        def __init__(self, in_dim, n_hidden_1, n_hidden_2, out_dim):
            super(simpleNet, self).__init__()
            ...

Original post · 2019-08-13 22:15:40 · 123 reads · 0 comments
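The class above is cut off after `__init__`. A minimal sketch of how it could be completed, assuming the layer names (`layer1`..`layer3`) and ReLU activations, which are not in the original post:

```python
import torch
from torch import nn

class simpleNet(nn.Module):
    """A simple three-layer fully connected network; every layer is linear."""
    def __init__(self, in_dim, n_hidden_1, n_hidden_2, out_dim):
        super(simpleNet, self).__init__()
        self.layer1 = nn.Linear(in_dim, n_hidden_1)
        self.layer2 = nn.Linear(n_hidden_1, n_hidden_2)
        self.layer3 = nn.Linear(n_hidden_2, out_dim)

    def forward(self, x):
        # ReLU between the linear layers is an assumed choice
        x = torch.relu(self.layer1(x))
        x = torch.relu(self.layer2(x))
        return self.layer3(x)

# Shape check: a batch of 4 flattened 28x28 images -> 10 class scores
net = simpleNet(784, 300, 100, 10)
out = net(torch.randn(4, 784))
print(out.shape)
```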
TASK 1 (PyTorch, session 8)

1. What is PyTorch? PyTorch is a deep learning tensor library optimized for both GPUs and CPUs.
2. Installing PyTorch: pip install torch (the PyPI package is named torch, not pytorch).
3. The workflow in code: CNN digit recognition

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import torch.optim as optim
    from torchv...

Original post · 2019-08-07 12:43:04 · 107 reads · 0 comments
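The CNN itself is cut off above. A hedged sketch of a small digit-recognition CNN using those imports; the architecture (two conv layers, two linear layers) and the random stand-in batch are assumptions, since the original network is not shown:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

class DigitCNN(nn.Module):
    """Two conv layers followed by two linear layers, sized for 28x28 inputs."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.fc1 = nn.Linear(32 * 7 * 7, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # 28x28 -> 14x14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # 14x14 -> 7x7
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        return self.fc2(x)

# One training step on a fake batch (stands in for an MNIST DataLoader)
model = DigitCNN()
opt = optim.SGD(model.parameters(), lr=0.01)
images = torch.randn(8, 1, 28, 28)
labels = torch.randint(0, 10, (8,))
loss = F.cross_entropy(model(images), labels)
opt.zero_grad()
loss.backward()
opt.step()
print(loss.item())
```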
TASK 2

    import torch

    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

    # N is the batch size, D_in is the input dimension
    # H is the hidden dimension, D_out is the output dimension
    N, D_in, H, D_out = 64, 1000, 100, 10

    # Create random input and output data
    x = torch.randn(N, D_in, device=devic...

Original post · 2019-08-09 17:36:46 · 205 reads · 0 comments
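The snippet above follows the classic PyTorch "two-layer network from raw tensors" tutorial and is cut off mid-line. A sketch of how such an example typically continues, with hand-derived gradients; the learning rate, iteration count, and squared-error loss are assumptions:

```python
import torch

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# N: batch size, D_in: input dim, H: hidden dim, D_out: output dim
N, D_in, H, D_out = 64, 1000, 100, 10

x = torch.randn(N, D_in, device=device)
y = torch.randn(N, D_out, device=device)

# Randomly initialized weights for a linear -> ReLU -> linear network
w1 = torch.randn(D_in, H, device=device)
w2 = torch.randn(H, D_out, device=device)

lr = 1e-6
losses = []
for t in range(50):
    # Forward pass
    h = x.mm(w1)
    h_relu = h.clamp(min=0)
    y_pred = h_relu.mm(w2)

    # Squared-error loss
    loss = (y_pred - y).pow(2).sum().item()
    losses.append(loss)

    # Backward pass: gradients derived by hand, no autograd
    grad_y_pred = 2.0 * (y_pred - y)
    grad_w2 = h_relu.t().mm(grad_y_pred)
    grad_h_relu = grad_y_pred.mm(w2.t())
    grad_h = grad_h_relu.clone()
    grad_h[h < 0] = 0          # ReLU passes gradient only where h > 0
    grad_w1 = x.t().mm(grad_h)

    # Gradient descent update
    w1 -= lr * grad_w1
    w2 -= lr * grad_w2

print(losses[0], losses[-1])
```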
TASK 6

Gradient descent (Gradient Descent). Gradient descent currently comes in three variants: standard gradient descent (GD, Gradient Descent), stochastic gradient descent (SGD, Stochastic Gradient Descent), and batch gradient descent (BGD, Batch Gradient Descent).
1. Standard gradient descent (GD): suppose the model parameters to be learned are W and the cost function is J(W); then the...

Reposted · 2019-08-19 23:20:32 · 197 reads · 0 comments
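The standard GD update is W ← W − η·∇J(W). A minimal sketch of that rule, using an illustrative quadratic cost J(W) = ||W − target||² (the cost, target values, and learning rate are assumptions for demonstration):

```python
import torch

# Minimize J(W) = ||W - target||^2 with the plain update W <- W - lr * dJ/dW
target = torch.tensor([3.0, -1.0])
W = torch.zeros(2, requires_grad=True)
lr = 0.1

for step in range(100):
    J = ((W - target) ** 2).sum()
    J.backward()                 # dJ/dW = 2 * (W - target)
    with torch.no_grad():
        W -= lr * W.grad         # standard gradient descent update
    W.grad.zero_()

print(W.detach())  # converges toward [3.0, -1.0]
```

SGD and BGD use the same update; they differ only in how much data is used to estimate ∇J(W) at each step (one sample vs. a batch).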
TASK 3

    import torch
    import torch.nn as nn
    import torchvision.datasets as dsets
    import torchvision.transforms as transforms
    from torch.autograd import Variable

    input_size = 784
    num_classes = 10
    num_epochs = ...

Original post · 2019-08-12 19:59:26 · 111 reads · 0 comments
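The 784-input / 10-class setup above matches flattened 28x28 MNIST images. A hedged sketch of one training step for such a model; using a single nn.Linear (softmax regression) and random stand-in tensors instead of the real dataset are assumptions. Note that torch.autograd.Variable has been a no-op wrapper since PyTorch 0.4, so plain tensors suffice:

```python
import torch
import torch.nn as nn

input_size, num_classes = 784, 10

# Softmax regression: a single linear layer; CrossEntropyLoss applies
# log-softmax internally, so the model outputs raw scores
model = nn.Linear(input_size, num_classes)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

images = torch.randn(32, input_size)          # stands in for a DataLoader batch
labels = torch.randint(0, num_classes, (32,))

loss = criterion(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(loss.item())
```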
TASK 7

    from __future__ import print_function
    import argparse
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import torch.optim as optim
    from torchvision import datasets, transforms
    ...

Original post · 2019-08-21 19:42:08 · 116 reads · 0 comments
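The argparse import suggests a command-line training script. A sketch of the argparse pattern such scripts use for hyperparameters; the flag names and defaults here are assumptions, not taken from the original post:

```python
import argparse

parser = argparse.ArgumentParser(description='MNIST training example')
parser.add_argument('--batch-size', type=int, default=64,
                    help='input batch size for training')
parser.add_argument('--epochs', type=int, default=10,
                    help='number of epochs to train')
parser.add_argument('--lr', type=float, default=0.01,
                    help='learning rate')
parser.add_argument('--no-cuda', action='store_true',
                    help='disable CUDA training')

# Passing an explicit list mimics `python train.py --epochs 3 --no-cuda`
args = parser.parse_args(['--epochs', '3', '--no-cuda'])
print(args.batch_size, args.epochs, args.lr, args.no_cuda)
```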
TASK 5

Dropout: during the training of a deep network, dropout temporarily removes each neural-network unit from the network with some fixed probability.

    def dropout(x, level, noise_shape=None, seed=None):
        """Sets entries in `x` to zero at random, while scaling the entire tensor.
        #...

Reposted · 2019-08-16 21:05:24 · 130 reads · 0 comments
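The function body above is cut off. A minimal from-scratch sketch of inverted dropout, which matches the docstring (zero entries at random, scale the survivors); the simplified signature is an assumption, and real code would use torch.nn.functional.dropout:

```python
import torch

def dropout(x, level):
    """Inverted dropout: zero each entry with probability `level`, then scale
    the survivors by 1 / (1 - level) so the expected value is unchanged."""
    if not 0.0 <= level < 1.0:
        raise ValueError('level must be in [0, 1)')
    mask = (torch.rand_like(x) >= level).float()  # 1 = keep, 0 = drop
    return x * mask / (1.0 - level)

x = torch.ones(1000)
y = dropout(x, 0.5)
print(y.mean())  # roughly 1.0, since dropped zeros are offset by 2x scaling
```

Because of the inverted scaling, no rescaling is needed at inference time; the network simply uses all units.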