Neural Network Programming - Deep Learning with PyTorch

This section contains my study notes for the Bilibili series "Neural Network Programming - Deep Learning with PyTorch"; I update it when I have time. My current focus is on learning TensorFlow.

1. Using CUDA

CUDA is a parallel computing platform developed by NVIDIA and widely used for deep learning computation, especially parallel computing: a large computation is broken into many small, independent computations that are carried out by different processing units without interfering with one another.

import torch
print(torch.__version__)     # 1.4.0+cpu
torch.cuda.is_available()    # False
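
When a CUDA-capable GPU is present, tensors and computations can be moved onto it; otherwise everything stays on the CPU. Below is a minimal sketch of this common device-selection pattern (the names device and x are just illustrative):

import torch

# Pick the GPU if CUDA is available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Create a tensor and move it to the selected device
x = torch.tensor([1.0, 2.0, 3.0])
x = x.to(device)

print(x.device)   # "cuda:0" on a GPU machine, "cpu" otherwise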

2. Tensors

2.1 Introduction

If accessing a single element requires at least n indices, the structure is an n-dimensional tensor (nd-tensor).

| Indices required | Computer science | Mathematics |
| ---------------- | ---------------- | ----------- |
| 0                | number           | scalar      |
| 1                | array            | vector      |
| 2                | 2d-array         | matrix      |
| n                | nd-array         | nd-tensor   |
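
As a quick check of this correspondence (a minimal sketch with arbitrary values), each row of the table can be reproduced in PyTorch; dim() reports how many indices are needed:

import torch

scalar = torch.tensor(3)                  # 0 indices -> number / scalar
vector = torch.tensor([1, 2, 3])          # 1 index   -> array / vector
matrix = torch.tensor([[1, 2], [3, 4]])   # 2 indices -> 2d-array / matrix
nd     = torch.ones(2, 3, 4)              # 3 indices -> nd-array / nd-tensor

print(scalar.dim(), vector.dim(), matrix.dim(), nd.dim())   # 0 1 2 3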

2.2 Rank, Axes, and Shape

  • Rank is the number of dimensions present within the tensor, which is also the minimum number of indices needed to access an element of the tensor. For example, the following all describe the same thing:
    • rank-2 tensor
    • matrix
    • 2d-array
    • 2d-tensor
  • The shape of a tensor is usually written as n*m*q etc., where n, m and q are the lengths of the individual axes. The rank tells us how many axes a tensor has, and the lengths of those axes give us the shape, as illustrated in the sketch below.
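
For instance (a minimal sketch using an arbitrarily chosen 2*3*4 tensor), the rank is the number of axes and the shape lists the length of each axis:

import torch

t = torch.ones(2, 3, 4)   # three axes of length 2, 3 and 4

print(t.shape)            # torch.Size([2, 3, 4]) -- the length of each axis
print(len(t.shape))       # 3 -- the rank, i.e. the number of axes
print(t.dim())            # 3 -- PyTorch reports the rank directly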

2.3 Reshaping in PyTorch

Call the tensor's reshape() method.

reshape() returns a new tensor with the requested shape; the variable it is called on is not modified, and the elements themselves stay unchanged.

2.4 Example

  1. Define a data structure:
dd = [
    [0,1,2],
    [3,4,5],
    [6,7,8]
]
  2. Convert the 3*3 matrix (a nested Python list) into a PyTorch tensor:
t = torch.tensor(dd)
t
tensor([[0, 1, 2],
        [3, 4, 5],
        [6, 7, 8]])
  3. Get the tensor's shape:
t.shape
torch.Size([3, 3])
  4. Reshape the tensor to a 1*9 tensor:
t.reshape(1,9)
tensor([[0, 1, 2, 3, 4, 5, 6, 7, 8]])

Important: reshaping to (1, 9) doesn't change the number of axes (the result still has two axes), and reshape() doesn't modify the variable it is called on, so the reshaped result must be assigned to another variable if it is wanted.

t.shape # t.shape remains unchanged after t.reshape(1,9)
torch.Size([3, 3])
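
To keep the reshaped result, assign it to a new variable; and if a genuinely rank-1 tensor is wanted, reshape to a single axis. A minimal sketch continuing with the tensor t from the example above (the names t2 and t1 are just illustrative):

t2 = t.reshape(1, 9)   # keep the reshaped rank-2 tensor
t2.shape               # torch.Size([1, 9])

t1 = t.reshape(9)      # a single axis -> rank-1 tensor
t1.shape               # torch.Size([9])

t.shape                # torch.Size([3, 3]) -- the original is untouched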