
Original: [Untitled]

[Code] [Untitled]

2023-09-27 08:50:43 52

Original: BRAT deployment notes and troubleshooting

find txtann -name '*.txt' | sed -e 's|.txt|.ann|g' | xargs touch    (txtann is the annotation directory)

2023-09-21 17:16:34 124
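For readers who prefer not to shell out, here is a hedged Python equivalent of that one-liner, assuming txtann is the directory holding the .txt documents and that brat only needs an empty .ann file beside each one:

from pathlib import Path

# Create the empty .ann annotation file that brat expects next to every .txt document.
for txt in Path("txtann").rglob("*.txt"):
    txt.with_suffix(".ann").touch()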

Reposted: K-fold cross-validation

Cross-validation

2023-02-03 10:17:24 311

Original: [Untitled]

Single-layer neural network

2023-01-09 19:11:37 104

Original: XOR gate (complete)

import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
orgate = torch.tensor([0,1,1,1], dtype=torch.float32)
def OR(X):
    w = torch.tensor([-0.5,1,1], dtype=torch.float32)  # b, w1, w2
    zhat = torch.mv(X, w)
    yha…

2021-02-03 17:36:10 2325
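The preview above is cut off by the listing. A minimal self-contained sketch of the idea the post builds towards, composing an XOR gate from OR, NAND and AND single-layer gates, with the first column of X serving as the bias input; the sigmoid-then-0.5-threshold activation is borrowed from the AND-gate entries further down and is an assumption here:

import torch

# First column is the constant 1 used as the bias input; the other two are x1, x2.
X = torch.tensor([[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=torch.float32)

def gate(X, w):
    """Single-layer gate: linear combination, sigmoid, then a 0.5 threshold."""
    zhat = torch.mv(X, w)                        # z = b*1 + w1*x1 + w2*x2
    return (torch.sigmoid(zhat) >= 0.5).float()  # equivalent to thresholding zhat at 0

def OR(X):
    return gate(X, torch.tensor([-0.5, 1.0, 1.0]))    # b, w1, w2

def NAND(X):
    return gate(X, torch.tensor([0.7, -0.5, -0.5]))

def AND(X):
    return gate(X, torch.tensor([-0.2, 0.15, 0.15]))

def XOR(X):
    # XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2)); re-attach the bias column.
    hidden = torch.stack([torch.ones(X.shape[0]), OR(X), NAND(X)], dim=1)
    return AND(hidden)

print(XOR(X))   # expected: tensor([0., 1., 1., 0.])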

Original: XOR gate

import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
orgate = torch.tensor([0,1,1,1], dtype=torch.float32)
def OR(X):
    w = torch.tensor([-0.5,1,1], dtype=torch.float32)  # b, w1, w2
    zhat = torch.mv(X, w)
    yhat…

2021-02-03 17:19:51 484

Original: OR gate

import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
orgate = torch.tensor([[0],[1],[1],[1]], dtype=torch.float32)
w = torch.tensor([-0.5,1,1], dtype=torch.float32)  # b, w1, w2
def orAdd(X, w):
    zhat = torch.mv(X, w)…

2021-02-03 09:12:34 176

Original: NAND gate

import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
nandgate = torch.tensor([[1],[1],[1],[0]], dtype=torch.float32)
w = torch.tensor([0.7,-0.5,-0.5], dtype=torch.float32)  # b, w1, w2
def nanAdd(X, w):
    zhat = torch.…

2021-02-03 09:07:51 225

Original: Plotting the AND gate

import torch
import matplotlib.pyplot as plt
import seaborn as sns
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
sigma = torch.tensor([0.4502, 0.4875, 0.4875, 0.5250])
andgate = torch.tensor([int(x) for x in sigma>=0.5], dt…

2021-02-03 08:38:48 255
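The preview stops before any plotting happens. A hedged sketch of how such a figure could be produced: scatter the (x1, x2) inputs coloured by the thresholded output, and overlay the line where z = -0.2 + 0.15*x1 + 0.15*x2 is zero. The styling and the boundary overlay are assumptions, not the post's exact code:

import torch
import matplotlib.pyplot as plt
import seaborn as sns

X = torch.tensor([[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=torch.float32)
w = torch.tensor([-0.2, 0.15, 0.15])            # b, w1, w2
sigma = torch.sigmoid(torch.mv(X, w))           # ~[0.4502, 0.4875, 0.4875, 0.5250]
andgate = [int(x) for x in sigma >= 0.5]        # [0, 0, 0, 1]

# Scatter the (x1, x2) inputs, coloured and styled by the gate output.
sns.scatterplot(x=X[:, 1].numpy(), y=X[:, 2].numpy(), hue=andgate, style=andgate, s=120)

# Decision boundary: -0.2 + 0.15*x1 + 0.15*x2 = 0  ->  x2 = (0.2 - 0.15*x1) / 0.15
x1 = torch.linspace(-0.5, 1.5, 50)
plt.plot(x1.numpy(), ((0.2 - 0.15 * x1) / 0.15).numpy(), "k--", label="decision boundary")
plt.legend()
plt.show()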

Original: Logistic regression

# Logistic regression
import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
w = torch.tensor([-0.2,0.15,0.15], dtype=torch.float32)
def LogisticR(X, w):
    zhat = torch.mv(X, w)        # the linear regression part first
    sigma = torch.sigmoid(zhat)  # then the logistic part
    andhat = …

2021-02-03 07:59:15 120
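The preview cuts off inside LogisticR. A plausible completion, assuming the post thresholds the sigmoid output at 0.5 as the AND-gate entries do:

import torch

X = torch.tensor([[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=torch.float32)
w = torch.tensor([-0.2, 0.15, 0.15], dtype=torch.float32)   # b, w1, w2

def LogisticR(X, w):
    zhat = torch.mv(X, w)                     # linear part
    sigma = torch.sigmoid(zhat)               # logistic part
    andhat = torch.tensor([int(x) for x in sigma >= 0.5], dtype=torch.float32)
    return sigma, andhat

sigma, andhat = LogisticR(X, w)
print(sigma)    # tensor([0.4502, 0.4875, 0.4875, 0.5250])
print(andhat)   # tensor([0., 0., 0., 1.])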

Original: Summary

import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
w = torch.tensor([-0.2,0.15,0.15], dtype=torch.float32)  # b, w1, w2
z = torch.tensor([-0.2, -0.05, -0.05, 0.1], dtype=torch.float32)
def LineaR(X, w):
    zhat = tor…

2021-02-02 17:49:41 180 1

Original: softmax, dim=? (be careful)

import torch
from torch.nn import functional as F
X = torch.tensor([[0,0],[1,0],[0,1],[1,1]], dtype=torch.float32)
torch.random.manual_seed(420)
dense = torch.nn.Linear(2,3)
zhat = dense(X)
print(zhat)
sigma = F.softmax(zhat, dim=1)
print(sigma)
# sigma = …

2021-02-02 17:42:38 115
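To make the warning in the title concrete, a small hedged comparison of dim=1 and dim=0, using the same seed and layer shape as the preview (exact printed values are omitted):

import torch
from torch.nn import functional as F

X = torch.tensor([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=torch.float32)
torch.random.manual_seed(420)
dense = torch.nn.Linear(2, 3)      # 4 samples -> 3 "class" scores each
zhat = dense(X)                    # shape (4, 3)

# dim=1: softmax across the 3 class scores of each sample -> every row sums to 1.
print(F.softmax(zhat, dim=1).sum(dim=1))   # four values, all ~1

# dim=0: softmax across the 4 samples for each class -> every column sums to 1,
# which is usually NOT what a classifier wants.
print(F.softmax(zhat, dim=0).sum(dim=0))   # three values, all ~1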

Original: F.torch.sigmoid

Binary classification
import torch
from torch.nn import functional as F
X = torch.tensor([[0,0],[1,0],[0,1],[1,1]], dtype=torch.float32)
torch.random.manual_seed(420)
dense = torch.nn.Linear(2,1)
zhat = dense(X)
sigma = F.torch.sigmoid(zhat)
y = [int(x) for x in sigm…

2021-02-02 16:34:27 1590
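The preview breaks off inside the list comprehension. A hedged completion; note that F.torch.sigmoid only reaches torch.sigmoid through the functional module's own torch import, so calling torch.sigmoid directly is the cleaner spelling:

import torch
from torch.nn import functional as F

X = torch.tensor([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=torch.float32)
torch.random.manual_seed(420)
dense = torch.nn.Linear(2, 1)          # one output unit -> binary classification
zhat = dense(X)                        # shape (4, 1)
sigma = torch.sigmoid(zhat)            # what F.torch.sigmoid ends up calling
y = [int(x) for x in sigma >= 0.5]     # threshold the probabilities at 0.5
print(sigma)
print(y)                               # labels depend on the randomly initialised weights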

Original: torch.nn.Linear

import torch
X1 = torch.tensor([[0,0],[1,0],[0,1],[1,1]], dtype=torch.float32)
torch.random.manual_seed(420)
output = torch.nn.Linear(2,1)
zhat = output(X1)
print(zhat)
print(output.weight)
print("*"*50)
print(output.bias)

2021-02-02 11:58:47 106

Original: PyTorch step function, AND gate circuit

# Step function, AND gate circuit
import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
andgate = torch.tensor([[0],[0],[0],[1]], dtype=torch.float32)
w = torch.tensor([-0.2,0.15,0.15], dtype=torch.float32)  # b, w1, w2
def LinearRwithsign(X, w…

2021-02-02 11:57:41 912
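A plausible completion of LinearRwithsign, assuming the step activation is a simple zhat >= 0 threshold, which reproduces the andgate targets shown in the preview:

import torch

X = torch.tensor([[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=torch.float32)
andgate = torch.tensor([[0], [0], [0], [1]], dtype=torch.float32)
w = torch.tensor([-0.2, 0.15, 0.15], dtype=torch.float32)   # b, w1, w2

def LinearRwithsign(X, w):
    zhat = torch.mv(X, w)                 # linear combination
    # Step activation: 1 where zhat >= 0, else 0.
    andhat = torch.tensor([int(x >= 0) for x in zhat], dtype=torch.float32)
    return zhat, andhat

zhat, andhat = LinearRwithsign(X, w)
print(zhat)     # tensor([-0.2000, -0.0500, -0.0500,  0.1000])
print(andhat)   # tensor([0., 0., 0., 1.]), matching the AND truth table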

Original: PyTorch sigmoid function, AND gate circuit

# Sigmoid function, AND gate circuit
import torch
X = torch.tensor([[1,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
andgate = torch.tensor([[0],[0],[0],[1]], dtype=torch.float32)
w = torch.tensor([-0.2,0.15,0.15], dtype=torch.float32)  # b, w1, w2
def LogisticR(X, w)…

2021-02-02 11:56:32 613

Original: 2021-02-01

import torch
X = torch.tensor([[1.,0,0],[1,1,0],[1,0,1],[1,1,1]], dtype=torch.float32)
w = torch.tensor([-0.2,0.15,0.15], dtype=torch.float32)  # b, w1, w2
z = torch.tensor([-0.2, -0.05, -0.05, 0.1], dtype=torch.float32)
def LineaR(X, w):
    zhat = torc…

2021-02-01 22:19:42 81

Original: PyTorch study notes

Getting started with gradient descent
import torch
def gradDescent(X, y, eps=torch.tensor(0.01, requires_grad=True), numIt=1000):
    m, n = X.shape
    weights = torch.zeros(n, 1, requires_grad=True)
    for k in range(numIt):
        grad = torch.mm(X.t(), (torch.mm(X, weights…

2021-01-31 20:43:41 92
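The preview ends inside the update loop. A minimal runnable sketch of batch gradient descent for least squares in the same spirit; the update rule, the synthetic data, and the dropped requires_grad flags (unnecessary when the gradient is written out by hand) are all assumptions, not the post's exact code:

import torch

def gradDescent(X, y, eps=0.01, numIt=1000):
    """Batch gradient descent for least-squares linear regression."""
    m, n = X.shape
    weights = torch.zeros(n, 1)
    for k in range(numIt):
        # Gradient of 0.5/m * ||X w - y||^2 with respect to w.
        grad = torch.mm(X.t(), torch.mm(X, weights) - y) / m
        weights = weights - eps * grad
    return weights

# Tiny synthetic check: y = 0.5 + 2*x1 + 3*x2, with a bias column of ones.
torch.manual_seed(0)
X = torch.cat([torch.ones(100, 1), torch.rand(100, 2)], dim=1)
true_w = torch.tensor([[0.5], [2.0], [3.0]])
y = torch.mm(X, true_w)
print(gradDescent(X, y, eps=0.1, numIt=5000))   # should approach [[0.5], [2.0], [3.0]]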

Original: PyTorch study notes

Week 1 notes (2), Lesson 2: tensor indexing, slicing, concatenation, and dimension adjustment
import torch
import numpy as np

2021-01-30 12:21:37 219 3
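The preview shows only the imports; a short hedged tour of the operations the lesson title names, with shapes noted in the comments:

import torch

t = torch.arange(24).reshape(2, 3, 4)

# Indexing and slicing
print(t[0, 1, 2])        # single element -> tensor(6)
print(t[:, 0, :])        # first row of every block -> shape (2, 4)
print(t[..., ::2])       # every other column -> shape (2, 3, 2)

# Concatenation and stacking
a, b = torch.zeros(2, 3), torch.ones(2, 3)
print(torch.cat([a, b], dim=0).shape)    # (4, 3)
print(torch.stack([a, b], dim=0).shape)  # (2, 2, 3), stack adds a new dimension

# Dimension adjustment
print(t.reshape(6, 4).shape)             # (6, 4)
print(a.unsqueeze(0).shape)              # (1, 2, 3)
print(a.unsqueeze(0).squeeze(0).shape)   # (2, 3)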

Original: PyTorch study notes

PyTorch study notes, week 1: matrix multiplication
import torch
import numpy as np
t1 = torch.arange(1,7).reshape(2,3)
t2 = torch.arange(1,10).reshape(3,3)
torch.mm(t1,t2)

2021-01-30 11:20:23 434 3

Original: Complex networks in MATLAB, study log (1), April 1, 2019

Complex networks in MATLAB, study log (1), April 1, 2019
Prim's algorithm
clc; clear;
a = zeros(7);    % first create a 7x7 zero matrix
a(1,2)=50; a(1,3)=60; a(2,4)=65; a(2,5)=40; a(3,4)=52;
a(3,7)=45; a(4,5)=50; a(4,6)=30; a(4,7)=42; a(5,6)=70;   % turn these entries into the edge weights of the graph
a = a + a' …

2019-04-01 21:42:06 1152 1
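For readers without MATLAB, a hedged Python sketch of Prim's algorithm on the same 7-node weight matrix; the edge list is copied from the preview, while the heap-based formulation is an implementation choice, not the post's code:

import heapq

# Undirected weighted graph from the post (1-indexed nodes, same edges as the MATLAB matrix).
edges = [(1, 2, 50), (1, 3, 60), (2, 4, 65), (2, 5, 40), (3, 4, 52),
         (3, 7, 45), (4, 5, 50), (4, 6, 30), (4, 7, 42), (5, 6, 70)]

adj = {i: [] for i in range(1, 8)}
for u, v, w in edges:
    adj[u].append((w, v))
    adj[v].append((w, u))

def prim(adj, start=1):
    """Prim's algorithm: grow a minimum spanning tree outward from `start`."""
    visited = {start}
    heap = [(w, start, v) for w, v in adj[start]]
    heapq.heapify(heap)
    tree, total = [], 0
    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue                      # edge leads back into the tree, skip it
        visited.add(v)
        tree.append((u, v, w))
        total += w
        for w2, v2 in adj[v]:
            if v2 not in visited:
                heapq.heappush(heap, (w2, v, v2))
    return tree, total

tree, total = prim(adj)
print(tree)    # MST edges in the order they were added
print(total)   # total MST weight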
