cs231n
mrcoderrev
cs231n Assignment 2: PyTorch

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader
from torch.utils.data import sampler
import torchvision.datasets as dset
import torchvision.tran...
```
Original · 2020-03-14 22:27:56 · 849 views · 0 comments
cs231n Assignment 2: Convolutional Networks

```python
def conv_forward_naive(x, w, b, conv_param):
    """
    A naive implementation of the forward pass for a convolutional layer.

    The input consists of N data points, each with C channels, height H ...
```
Original · 2020-03-03 17:37:15 · 542 views · 0 comments
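The preview above is truncated; a minimal NumPy sketch of such a naive convolution forward pass, assuming the standard cs231n layout (`x` of shape `(N, C, H, W)`, filters `w` of shape `(F, C, HH, WW)`, and `conv_param` holding `'stride'` and `'pad'`), might look like:

```python
import numpy as np

def conv_forward_naive(x, w, b, conv_param):
    """Naive forward pass for a convolutional layer (explicit loops)."""
    N, C, H, W = x.shape
    F, _, HH, WW = w.shape
    stride, pad = conv_param['stride'], conv_param['pad']
    # Zero-pad the spatial dimensions only.
    xp = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)), mode='constant')
    H_out = 1 + (H + 2 * pad - HH) // stride
    W_out = 1 + (W + 2 * pad - WW) // stride
    out = np.zeros((N, F, H_out, W_out))
    for n in range(N):                      # each data point
        for f in range(F):                  # each filter
            for i in range(H_out):          # each output row
                for j in range(W_out):      # each output column
                    window = xp[n, :, i*stride:i*stride+HH, j*stride:j*stride+WW]
                    out[n, f, i, j] = np.sum(window * w[f]) + b[f]
    return out
```

Triple-for-loop code like this is slow but easy to check against the vectorized version later in the assignment.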
cs231n Assignment 2: Fully-Connected Neural Nets

fc_net.py:
```python
from builtins import range
from builtins import object
import numpy as np
from cs231n.layers import *
from cs231n.layer_utils import *

class TwoLayerNet(object):
    """
    A two-layer fu...
```
Original · 2020-02-27 16:59:16 · 661 views · 0 comments
cs231n Assignment 2: Dropout

```python
def dropout_forward(x, dropout_param):
    """
    Performs the forward pass for (inverted) dropout.

    Inputs:
    - x: Input data, of any shape
    - dropout_param: A dictionary with the following...
```
Original · 2020-02-26 21:09:24 · 232 views · 0 comments
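The docstring is cut off; a minimal sketch of inverted dropout, assuming the cs231n convention that `dropout_param['p']` is the probability of *keeping* a unit and `dropout_param['mode']` is `'train'` or `'test'`:

```python
import numpy as np

def dropout_forward(x, dropout_param):
    """Forward pass for inverted dropout."""
    p, mode = dropout_param['p'], dropout_param['mode']
    if mode == 'train':
        # Keep each unit with probability p and rescale by 1/p now,
        # so that test time needs no scaling at all.
        mask = (np.random.rand(*x.shape) < p) / p
        out = x * mask
    else:
        # Test time is a no-op for inverted dropout.
        mask = None
        out = x
    return out, (dropout_param, mask)
```

Rescaling at train time (the "inverted" part) keeps the expected activation the same in both modes.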
cs231n Assignment 2: Batch Normalization

```python
from builtins import range
import numpy as np

def affine_forward(x, w, b):
    """
    Computes the forward pass for an affine (fully-connected) layer.

    The input x has shape (N, d_1, ..., d_k) ...
```
Original · 2020-02-24 19:49:46 · 487 views · 0 comments
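The preview only shows the affine layer, but the post is about batch normalization; as a sketch of the training-time batchnorm forward pass (function name `batchnorm_forward_train` is mine, not from the post):

```python
import numpy as np

def batchnorm_forward_train(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization over an (N, D) minibatch.

    Normalize each feature to zero mean / unit variance using the
    batch statistics, then apply the learned scale and shift.
    """
    mu = x.mean(axis=0)                    # per-feature batch mean, shape (D,)
    var = x.var(axis=0)                    # per-feature batch variance, shape (D,)
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta            # learned scale gamma and shift beta
```

At test time one would use running averages of `mu` and `var` instead of batch statistics.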
cs231n Assignment 1: Image Features Exercise

```python
# Use the validation set to tune the learning rate and regularization strength
from cs231n.classifiers.linear_classifier import LinearSVM

learning_rates = [1e-9, 1e-8, 1e-7]
regularization_strengths...
```
Original · 2020-01-20 09:56:22 · 304 views · 0 comments
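The tuning loop itself is cut off; as a sketch of the grid-search pattern, with a stand-in scoring function in place of actually training `LinearSVM`, and hypothetical regularization values (the post's own list is truncated):

```python
import itertools

learning_rates = [1e-9, 1e-8, 1e-7]
regularization_strengths = [1e4, 1e5]  # hypothetical; the original list is truncated

def train_and_eval(lr, reg):
    # Stand-in for training LinearSVM and measuring validation accuracy;
    # a real run would fit the classifier on the training split here.
    return lr * 1e7 / (1.0 + reg / 1e5)

best_val, best_params = -1.0, None
for lr, reg in itertools.product(learning_rates, regularization_strengths):
    val_acc = train_and_eval(lr, reg)
    if val_acc > best_val:
        best_val, best_params = val_acc, (lr, reg)
```

The pattern is always the same: evaluate every `(lr, reg)` pair on the validation set and keep the best, never touching the test set during tuning.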
cs231n Assignment 1: two_layer_net

```python
from __future__ import print_function
from builtins import range
from builtins import object
import numpy as np
import matplotlib.pyplot as plt
from past.builtins import xrange

class TwoLayerNet(obj...
```
Original · 2019-12-16 15:52:10 · 373 views · 0 comments
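The class body is truncated; the core of `TwoLayerNet`'s score computation can be sketched as a free function (affine → ReLU → affine, the architecture this assignment uses):

```python
import numpy as np

def two_layer_forward(X, W1, b1, W2, b2):
    """Score function of a two-layer net: affine -> ReLU -> affine."""
    hidden = np.maximum(0, X.dot(W1) + b1)  # first affine layer + ReLU
    scores = hidden.dot(W2) + b2            # unnormalized class scores
    return scores
```

The loss would then be a softmax over `scores` plus L2 regularization on `W1` and `W2`.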
cs231n Assignment 1: Softmax

```python
def softmax_loss_naive(W, X, y, reg):
    """
    Softmax loss function, naive implementation (with loops)

    Inputs have dimension D, there are C classes, and we operate on minibatches of N exa...
```
Original · 2019-11-28 14:54:21 · 323 views · 0 comments
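The body is cut off; a minimal sketch of the looped softmax loss and gradient, assuming the usual cs231n shapes (`W` is `(D, C)`, `X` is `(N, D)`, `y` holds the correct class indices):

```python
import numpy as np

def softmax_loss_naive(W, X, y, reg):
    """Softmax loss and gradient, computed with explicit loops."""
    loss = 0.0
    dW = np.zeros_like(W)
    N, C = X.shape[0], W.shape[1]
    for i in range(N):
        scores = X[i].dot(W)
        scores -= scores.max()  # shift for numeric stability; softmax is unchanged
        probs = np.exp(scores) / np.sum(np.exp(scores))
        loss += -np.log(probs[y[i]])
        for c in range(C):
            # gradient of cross-entropy: (p_c - 1{c == y_i}) * x_i
            dW[:, c] += (probs[c] - (c == y[i])) * X[i]
    loss = loss / N + reg * np.sum(W * W)
    dW = dW / N + 2 * reg * W
    return loss, dW
```

A quick sanity check: with `W = 0` the probabilities are uniform, so the loss should be `log(C)`.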
cs231n Assignment 1: SVM

```python
def svm_loss_naive(W, X, y, reg):
    """
    Structured SVM loss function, naive implementation (with loops).

    Inputs have dimension D, there are C classes, and we operate on minibatches of N...
```
Original · 2019-11-21 22:38:36 · 190 views · 0 comments
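The body is cut off; a minimal sketch of the looped multiclass SVM loss and gradient, assuming the same shapes as above and a fixed margin of 1:

```python
import numpy as np

def svm_loss_naive(W, X, y, reg):
    """Structured SVM loss and gradient, computed with explicit loops."""
    N, C = X.shape[0], W.shape[1]
    loss = 0.0
    dW = np.zeros_like(W)
    for i in range(N):
        scores = X[i].dot(W)
        correct = scores[y[i]]
        for c in range(C):
            if c == y[i]:
                continue
            margin = scores[c] - correct + 1  # hinge with margin = 1
            if margin > 0:
                loss += margin
                # each violated margin pushes the wrong class up
                # and the correct class down by x_i
                dW[:, c] += X[i]
                dW[:, y[i]] -= X[i]
    loss = loss / N + reg * np.sum(W * W)
    dW = dW / N + 2 * reg * W
    return loss, dW
```

A quick sanity check: with `W = 0` every margin is exactly 1, so the loss should be `C - 1`.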
Linear Classification (SVM/Softmax) / Loss Functions / Optimization

f(x, W) = Wx + b
- W: everything learned from training is stored in W.
- bias: it does not interact with the training data; it only contributes data-independent preference values (e.g., differences introduced by class imbalance in the dataset).
- Pros: easy to use and understand.
- Cons: struggles with multimodal data, e.g., when one class appears in several different regions of the input space.

Original · 2019-11-19 22:32:42 · 520 views · 0 comments
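The score function f(x, W) = Wx + b is tiny in code; a sketch with C classes over D-dimensional inputs (the helper name `linear_scores` is mine):

```python
import numpy as np

def linear_scores(x, W, b):
    """Linear classifier scores: one row of W per class, plus a
    data-independent bias per class."""
    # W: (C, D), x: (D,), b: (C,) -> scores: (C,)
    return W.dot(x) + b
```

Each row of `W` acts as a template for one class, and `b` shifts the scores independently of the input.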
cs231n Assignment 1: KNN

Note: the Manhattan distance depends on the choice of coordinate system (each element of the vector may carry real-world meaning):

$d_{1}(I_{1}, I_{2}) = \sum_{p}|I_{1}^{p} - I_{2}^{p}|$

The Euclidean distance's ranking of distances is not affected by the coordinate system:

$d_{2}(I_{1}, I_{2}) = \sqrt{\sum_{p}(I_{1}^{p} - I_{2}^{p})^{2}}$

Original · 2019-10-29 10:55:34 · 139 views · 0 comments
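Both distances translate directly into NumPy, treating each image as a flat vector of pixels:

```python
import numpy as np

def l1_distance(I1, I2):
    """Manhattan (L1) distance: sum of absolute pixel differences."""
    return np.sum(np.abs(I1 - I2))

def l2_distance(I1, I2):
    """Euclidean (L2) distance: square root of summed squared differences."""
    return np.sqrt(np.sum((I1 - I2) ** 2))
```

In the KNN classifier these are computed between a test image and every training image, and the labels of the k nearest neighbors are voted on.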