PyTorch: Getting Started and in Practice, Lesson 1

Lesson 1

What is PyTorch?
PyTorch is a Python-based scientific computing library with the following characteristics:
It is similar to NumPy, but it can make use of GPUs.
It lets you define deep learning models and train and use them flexibly.
Tensors
A Tensor is similar to NumPy's ndarray; the key difference is that a Tensor can run on a GPU to accelerate computation.
In[1]:

from __future__ import print_function
import torch

Construct an uninitialized 5x3 matrix:
In[2]:

x = torch.empty(5, 3)
print(x)
tensor([[0.0000e+00, 0.0000e+00, 0.0000e+00],
	        [0.0000e+00, 4.7339e+30, 1.4347e-19],
	        [2.7909e+23, 1.8037e+28, 1.7237e+25],
	        [9.1041e-12, 6.2609e+22, 4.7428e+30],
	        [3.8001e-39, 0.0000e+00, 0.0000e+00]])

Construct a randomly initialized matrix:
In[3]:

x = torch.rand(5, 3)
print(x)
tensor([[0.4821, 0.3854, 0.8517],
        [0.7962, 0.0632, 0.5409],
        [0.8891, 0.6112, 0.7829],
        [0.0715, 0.8069, 0.2608],
        [0.3292, 0.0119, 0.2759]])

Construct a matrix of all zeros with dtype long:
In[4]:

x = torch.zeros(5, 3, dtype=torch.long)
print(x)
tensor([[0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]])

Construct a tensor directly from data:
In[5]:

x = torch.tensor([5.5, 3])
print(x)
tensor([5.5000, 3.0000])

You can also create a tensor from an existing tensor. These methods reuse properties of the input tensor, e.g. its dtype, unless new values are provided.
In[6]:

x = x.new_ones(5, 3, dtype=torch.double)      # new_* methods take in sizes
print(x)

x = torch.randn_like(x, dtype=torch.float)    # override dtype!
print(x)                                      # result has the same size
tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
tensor([[ 1.4793, -2.4772,  0.9738],
        [ 2.0328,  1.3981,  1.7509],
        [-0.7931, -0.0291, -0.6803],
        [-1.2944, -0.7352, -0.9346],
        [ 0.5917, -0.5149, -1.8149]])

Get the shape of a tensor:
In[7]:

print(x.size())
torch.Size([5, 3])
Note
``torch.Size`` is in fact a tuple, so it supports all tuple operations.
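
For instance, you can unpack or index it like any other tuple. A quick illustration with the ``x`` from above:

rows, cols = x.size()   # unpack like an ordinary tuple
print(rows, cols)       # 5 3
print(x.size()[0])      # indexing works too: 5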

Operations

There are many kinds of tensor operations. Let's start with addition.
In[8]:

y = torch.rand(5, 3)
print(x + y)
tensor([[ 1.7113, -1.5490,  1.4009],
        [ 2.4590,  1.6504,  2.6889],
        [-0.3609,  0.4950, -0.3357],
        [-0.5029, -0.3086, -0.1498],
        [ 1.2850, -0.3189, -0.8868]])

Another way of writing addition:
In[9]:

print(torch.add(x, y))
tensor([[ 1.7113, -1.5490,  1.4009],
        [ 2.4590,  1.6504,  2.6889],
        [-0.3609,  0.4950, -0.3357],
        [-0.5029, -0.3086, -0.1498],
        [ 1.2850, -0.3189, -0.8868]])

Addition: supplying an output tensor as an argument
In[10]:

result = torch.empty(5, 3)
torch.add(x, y, out=result)
print(result)

tensor([[ 1.7113, -1.5490,  1.4009],
        [ 2.4590,  1.6504,  2.6889],
        [-0.3609,  0.4950, -0.3357],
        [-0.5029, -0.3086, -0.1498],
        [ 1.2850, -0.3189, -0.8868]])
In-place addition
In[11]:

# adds x to y
y.add_(x)
print(y)
tensor([[ 1.7113, -1.5490,  1.4009],
        [ 2.4590,  1.6504,  2.6889],
        [-0.3609,  0.4950, -0.3357],
        [-0.5029, -0.3086, -0.1498],
        [ 1.2850, -0.3189, -0.8868]])
Note
Any operation that mutates a tensor in place is post-fixed with an ``_``. For example: ``x.copy_(y)`` and ``x.t_()`` will change ``x``.
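
A quick sketch on throwaway tensors showing two such in-place operations:

a = torch.zeros(2, 2)
b = torch.ones(2, 2)
a.copy_(b)    # copies the values of b into a, in place
a.t_()        # transposes a in place
print(a)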

All kinds of NumPy-style indexing work on PyTorch tensors.
In[12]:

print(x[:, 1])
tensor([-2.4772,  1.3981, -0.0291, -0.7352, -0.5149])
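
A few more NumPy-style indexing patterns, as a quick illustration:

print(x[0])          # first row
print(x[1:3, :2])    # rows 1-2, first two columns
print(x[x > 0])      # boolean mask: all positive entries, as a 1-D tensor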

Resizing: if you want to resize/reshape a tensor, you can use torch.view:
In[13]:

x = torch.randn(4, 4)
y = x.view(16)
z = x.view(-1, 8)  # the size -1 is inferred from other dimensions
print(x.size(), y.size(), z.size())
torch.Size([4, 4]) torch.Size([16]) torch.Size([2, 8])
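
Note that ``view`` does not copy data: the new tensor shares the same underlying storage as the original, so modifying one modifies the other. A small check:

a = torch.zeros(2, 3)
b = a.view(6)
b[0] = 7.0
print(a[0, 0])   # tensor(7.) -- same storage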

If you have a tensor with a single element, use the .item() method to get its value as a Python number.
In[14]:

x = torch.randn(1)
print(x)
print(x.item())
tensor([0.4726])
0.4726296067237854

Further reading

Tensor operations of all kinds, including transposing, indexing, slicing, mathematical operations, linear algebra, and random numbers, are documented at https://pytorch.org/docs/torch.
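
A taste of a few of the operations mentioned above (the full list lives in the docs linked here):

a = torch.randn(2, 3)
print(a.t())                         # transposing
print(a[:, 0])                       # indexing / slicing
print(a.mean(), a.std())             # mathematical operations
print(torch.mm(a, a.t()))            # linear algebra: matrix multiply
print(torch.randint(0, 10, (2, 2)))  # random numbers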

Converting between NumPy arrays and Torch Tensors

Converting a Torch Tensor to a NumPy array and vice versa is very easy.

The Torch Tensor and the NumPy array share their underlying memory, so changing one will also change the other.

Converting a Torch Tensor to a NumPy array
In[15]:

a = torch.ones(5)
print(a)
tensor([1., 1., 1., 1., 1.])

In[16]:

b = a.numpy()
print(b)
[1. 1. 1. 1. 1.]

Changing the Torch Tensor in place also changes the values in the NumPy array:
In[17]:

a.add_(1)
print(a)
print(b)
tensor([2., 2., 2., 2., 2.])
[2. 2. 2. 2. 2.]

Converting a NumPy ndarray to a Torch Tensor
In[18]:

import numpy as np
a = np.ones(5)
b = torch.from_numpy(a)
np.add(a, 1, out=a)
print(a)
print(b)
[2. 2. 2. 2. 2.]
tensor([2., 2., 2., 2., 2.], dtype=torch.float64)

All Tensors on the CPU support converting to a NumPy array and back.
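
As a quick round-trip sketch, the conversion preserves the dtype and keeps sharing memory in both directions:

t = torch.arange(4, dtype=torch.int32)
n = t.numpy()               # NumPy view of the same memory, dtype int32
t2 = torch.from_numpy(n)    # back to a Tensor, still sharing memory
t.add_(1)
print(n, t2)                # both show the in-place change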

CUDA Tensors
Tensors can be moved to another device with the .to method.
In[19]:

# let us run this cell only if CUDA is available
# We will use ``torch.device`` objects to move tensors in and out of GPU
if torch.cuda.is_available():
    device = torch.device("cuda")          # a CUDA device object
    y = torch.ones_like(x, device=device)  # directly create a tensor on GPU
    x = x.to(device)                       # or just use strings ``.to("cuda")``
    z = x + y
    print(z)
    print(z.to("cpu", torch.double))       # ``.to`` can also change dtype together!

Warm-up: a two-layer network implemented in NumPy
A fully-connected ReLU network with one hidden layer and no biases, trained to predict y from x using an L2 loss.

This implementation uses nothing but NumPy to compute the forward pass, the loss, and the backward pass.

A NumPy ndarray is just a generic n-dimensional array. It knows nothing about deep learning, gradients, or computation graphs; it is simply a data structure for numerical computation.
In[20]:


import numpy as np

# N is batch size; D_in is input dimension;
# H is hidden dimension; D_out is output dimension.
N, D_in, H, D_out = 64, 1000, 100, 10

# Create random input and output data
x = np.random.randn(N, D_in)
y = np.random.randn(N, D_out)

# Randomly initialize weights
w1 = np.random.randn(D_in, H)
w2 = np.random.randn(H, D_out)

learning_rate = 1e-6
for t in range(500):
    # Forward pass: compute predicted y
    h = x.dot(w1)
    h_relu = np.maximum(h, 0)
    y_pred = h_relu.dot(w2)

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    print(t, loss)

    # Backprop to compute gradients of w1 and w2 with respect to loss.
    # loss = sum((y_pred - y)**2), so dloss/dy_pred = 2 * (y_pred - y)
    grad_y_pred = 2.0 * (y_pred - y)
    # y_pred = h_relu.dot(w2), so dloss/dw2 = h_relu^T . dloss/dy_pred
    grad_w2 = h_relu.T.dot(grad_y_pred)
    # dloss/dh_relu = dloss/dy_pred . w2^T
    grad_h_relu = grad_y_pred.dot(w2.T)
    # ReLU passes the gradient through where h > 0 and zeroes it where h < 0
    grad_h = grad_h_relu.copy()
    grad_h[h < 0] = 0
    # h = x.dot(w1), so dloss/dw1 = x^T . dloss/dh
    grad_w1 = x.T.dot(grad_h)

    # Update weights
    w1 -= learning_rate * grad_w1
    w2 -= learning_rate * grad_w2
0 34399246.46047344
1 29023199.257758312
2 25155679.85447208
3 20344203.603057466
4 14771404.625789404
... (iterations 5 through 496 omitted) ...
497 7.463699242750524e-07
498 7.112838972272693e-07
499 6.778580009634641e-07

PyTorch: Tensors
This time we use PyTorch tensors to implement the forward pass of the network, compute the loss, and run the backward pass.

A PyTorch Tensor is very similar to a NumPy ndarray. The biggest difference is that a PyTorch Tensor can run on either the CPU or the GPU; to compute on a GPU, you move the Tensor to a CUDA device.
In[21]:


import torch


dtype = torch.float
device = torch.device("cpu")
# device = torch.device("cuda:0") # Uncomment this to run on GPU

# N is batch size; D_in is input dimension;
# H is hidden dimension; D_out is output dimension.
N, D_in, H, D_out = 64, 1000, 100, 10

# Create random input and output data
x = torch.randn(N, D_in, device=device, dtype=dtype)
y = torch.randn(N, D_out, device=device, dtype=dtype)

# Randomly initialize weights
w1 = torch.randn(D_in, H, device=device, dtype=dtype)
w2 = torch.randn(H, D_out, device=device, dtype=dtype)

learning_rate = 1e-6
for t in range(500):
    # Forward pass: compute predicted y
    h = x.mm(w1)
    h_relu = h.clamp(min=0)
    y_pred = h_relu.mm(w2)

    # Compute and print loss
    loss = (y_pred - y).pow(2).sum().item()
    print(t, loss)

    # Backprop to compute gradients of w1 and w2 with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_w2 = h_relu.t().mm(grad_y_pred)
    grad_h_relu = grad_y_pred.mm(w2.t())
    grad_h = grad_h_relu.clone()
    grad_h[h < 0] = 0
    grad_w1 = x.t().mm(grad_h)

    # Update weights using gradient descent
    w1 -= learning_rate * grad_w1
    w2 -= learning_rate * grad_w2
0 31704728.0
1 25331164.0
2 22378086.0
3 19262238.0
4 15348289.0
... (iterations 5 through 480 omitted) ...
481 3.737308725249022e-05
482 3.669063517008908e-05
483 3.630801438703202e-05