Getting Started with PyTorch - 1. A Review of Deep Learning and an Introduction to PyTorch

01 A Quick Introduction to Neural Networks

Modeling, Inference, Learning

What is deep learning?


What is a neural network?

Activation functions


Feedforward neural networks


Convolutional neural networks


Recurrent Neural Networks (RNNs)


Seq2Seq with Attention


What can PyTorch do?


02 Building Deep Learning Models with PyTorch

  • Deep learning frameworks: TensorFlow, PyTorch, Caffe, MXNet, Keras
  • Comparing PyTorch with other frameworks:

(1) PyTorch uses a dynamic computation graph, while TensorFlow uses a static computation graph.
(2) PyTorch code is easy to read and stays very close to native Python, so it does not feel like learning a completely new language.
(3) It is backed by Facebook and has an active community.
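
A dynamic graph means the graph is built while the code runs, so ordinary Python control flow can decide what gets computed. A minimal sketch (not from the original lesson) illustrating this:

import torch

x = torch.randn(3, requires_grad=True)
y = x
# Ordinary Python loops and ifs can depend on tensor values,
# because the graph is rebuilt on every forward pass.
for _ in range(3):
    if y.sum() > 0:
        y = y * 2
    else:
        y = y + 1
loss = y.sum()
loss.backward()   # gradients flow through whichever branches actually ran
print(x.grad)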

- PyTorch

PyTorch is a Python-based scientific computing library. Its key features:
① It is similar to NumPy, but it can use GPUs;
② It can be used to define deep learning models and to train and use them flexibly.
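
As a quick illustration of ①, here is a minimal sketch (not part of the original lesson) that creates tensors on the GPU when CUDA is available and falls back to the CPU otherwise:

import torch

# Pick the GPU if one is available, otherwise use the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.rand(5, 3, device=device)   # created directly on the chosen device
b = torch.rand(5, 3).to(device)       # or moved there after creation
print((a + b).device)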

Common operations
(1) Construct an uninitialized 5x3 matrix:

import torch

x = torch.empty(5, 3)
print(x)

Result:

tensor([[0.0000e+00, 0.0000e+00, 0.0000e+00],
        [0.0000e+00, 4.7339e+30, 1.4347e-19],
        [2.7909e+23, 1.8037e+28, 1.7237e+25],
        [9.1041e-12, 6.2609e+22, 4.7428e+30],
        [3.8001e-39, 0.0000e+00, 0.0000e+00]])

(2) Construct a randomly initialized matrix:

x = torch.rand(5, 3)
print(x)

Result:

tensor([[0.4821, 0.3854, 0.8517],
        [0.7962, 0.0632, 0.5409],
        [0.8891, 0.6112, 0.7829],
        [0.0715, 0.8069, 0.2608],
        [0.3292, 0.0119, 0.2759]])

(3) Construct a matrix of all zeros with dtype long:

x = torch.zeros(5, 3, dtype=torch.long)
print(x)

Result:

tensor([[0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]])

(4) Construct a tensor directly from data:

x = torch.tensor([5.5, 3])
print(x)

Result:

tensor([5.5000, 3.0000])

(5) Construct a tensor from an existing tensor. These methods reuse the properties of the original tensor, such as its data type, unless new values are provided.

x = x.new_ones(5, 3, dtype=torch.double)    # new_* methods take in sizes
print(x)
x = torch.randn_like(x, dtype=torch.float)  # override dtype!
print(x)                                    # result has the same size

Result:

tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
tensor([[ 1.4793, -2.4772,  0.9738],
        [ 2.0328,  1.3981,  1.7509],
        [-0.7931, -0.0291, -0.6803],
        [-1.2944, -0.7352, -0.9346],
        [ 0.5917, -0.5149, -1.8149]])

(6) Addition:

y = torch.rand(5, 3)
print(x + y)

Result:

tensor([[ 1.7113, -1.5490,  1.4009],
        [ 2.4590,  1.6504,  2.6889],
        [-0.3609,  0.4950, -0.3357],
        [-0.5029, -0.3086, -0.1498],
        [ 1.2850, -0.3189, -0.8868]])

(7) Another form of addition:

print(torch.add(x, y))

Result:

tensor([[ 1.7113, -1.5490,  1.4009],
        [ 2.4590,  1.6504,  2.6889],
        [-0.3609,  0.4950, -0.3357],
        [-0.5029, -0.3086, -0.1498],
        [ 1.2850, -0.3189, -0.8868]])

(8) In-place addition:

# adds x to y
y.add_(x)
print(y)

Result:

tensor([[ 1.7113, -1.5490,  1.4009],
        [ 2.4590,  1.6504,  2.6889],
        [-0.3609,  0.4950, -0.3357],
        [-0.5029, -0.3086, -0.1498],
        [ 1.2850, -0.3189, -0.8868]])

(9) If a tensor contains only one element, the .item() method extracts its value as a Python number.

x = torch.randn(1)
print(x)
print(x.item())

Result:

tensor([0.4726])
0.4726296067237854

(10) Converting between NumPy and Tensors
Converting between a Torch Tensor and a NumPy array is very easy. The Torch Tensor and the NumPy array share the same underlying memory, so changing one also changes the other. First convert a Torch Tensor to a NumPy array, then build a Tensor from a NumPy array:

a = torch.ones(5)
print(a)
b = a.numpy()
print(b)
a.add_(1)
print(a)
print(b)
import numpy as np
a = np.ones(5)
b = torch.from_numpy(a)
np.add(a, 1, out=a)
print(a)
print(b)

Result:

tensor([1., 1., 1., 1., 1.])
[1. 1. 1. 1. 1.]
tensor([2., 2., 2., 2., 2.])
[2. 2. 2. 2. 2.]
[2. 2. 2. 2. 2.]
tensor([2., 2., 2., 2., 2.], dtype=torch.float64)

Lesson 1 review mind map

Implementing a two-layer neural network with NumPy
A fully connected ReLU network with one hidden layer and no bias, trained to predict y from x using an L2 loss.
This implementation uses NumPy for everything: the forward pass, the loss, and the backward pass.
A NumPy ndarray is an ordinary n-dimensional array. It knows nothing about deep learning, gradients, or computation graphs; it is simply a data structure for numerical computation.
import numpy as np

# N is batch size; D_in is input dimension;
# H is hidden dimension; D_out is output dimension.
N, D_in, H, D_out = 64, 1000, 100, 10

# Create random input and output data
x = np.random.randn(N, D_in)
y = np.random.randn(N, D_out)

# Randomly initialize weights
w1 = np.random.randn(D_in, H)
w2 = np.random.randn(H, D_out)

learning_rate = 1e-6
for t in range(500):
    # Forward pass: compute predicted y
    h = x.dot(w1)
    h_relu = np.maximum(h, 0)
    y_pred = h_relu.dot(w2)

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    print(t, loss)

    # Backprop to compute gradients of w1 and w2 with respect to loss.
    # loss = sum((y_pred - y) ** 2), so d(loss)/d(y_pred) = 2 * (y_pred - y)
    grad_y_pred = 2.0 * (y_pred - y)
    # Chain rule back through the second layer, the ReLU, and the first layer
    grad_w2 = h_relu.T.dot(grad_y_pred)
    grad_h_relu = grad_y_pred.dot(w2.T)
    grad_h = grad_h_relu.copy()
    grad_h[h < 0] = 0
    grad_w1 = x.T.dot(grad_h)

    # Update weights
    w1 -= learning_rate * grad_w1
    w2 -= learning_rate * grad_w2

Result:

0 34399246.46047344
1 29023199.257758312
2 25155679.85447208
3 20344203.603057466
4 14771404.625789404
5 9796072.99431371
6 6194144.749997159
7 3948427.3657580013
8 2637928.1726997104
9 1879876.2597949505
10 1424349.925182723
...
497 7.463699242750524e-07
498 7.112838972272693e-07
499 6.778580009634641e-07
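
For comparison, here is a minimal sketch (not from the original lesson) of the same two-layer network written with PyTorch tensors instead of NumPy arrays; changing device lets the identical code run on a GPU:

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Same shapes as the NumPy version: batch size, input, hidden, output dims.
N, D_in, H, D_out = 64, 1000, 100, 10
x = torch.randn(N, D_in, device=device)
y = torch.randn(N, D_out, device=device)
w1 = torch.randn(D_in, H, device=device)
w2 = torch.randn(H, D_out, device=device)

learning_rate = 1e-6
for t in range(500):
    # Forward pass: the same matrix multiplications, now as torch ops
    h = x.mm(w1)
    h_relu = h.clamp(min=0)
    y_pred = h_relu.mm(w2)

    loss = (y_pred - y).pow(2).sum().item()
    if t % 100 == 0:
        print(t, loss)

    # Manual backprop, mirroring the NumPy version line by line
    grad_y_pred = 2.0 * (y_pred - y)
    grad_w2 = h_relu.t().mm(grad_y_pred)
    grad_h_relu = grad_y_pred.mm(w2.t())
    grad_h = grad_h_relu.clone()
    grad_h[h < 0] = 0
    grad_w1 = x.t().mm(grad_h)

    # Gradient descent update
    w1 -= learning_rate * grad_w1
    w2 -= learning_rate * grad_w2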