The softmax function is the normalized exponential function: it maps a vector of real-valued logits to a probability distribution.
>>> import torch
>>> import torch.nn.functional as F
>>> logits = torch.rand(2,2)
>>> pred = F.softmax(logits, dim=1)
>>> logits
tensor([[0.4140, 0.4571],
        [0.9392, 0.6504]])
>>> pred
tensor([[0.4892, 0.5108],
        [0.5717, 0.4283]])
With `dim=1`, the normalization is applied row by row: each row of `pred` sums to 1. (Passing `dim=0` would instead normalize each column.)
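As a sanity check, the row-wise softmax above can be reproduced by hand from the definition softmax(x)_i = exp(x_i) / Σ_j exp(x_j). This is a minimal sketch using the same logits values shown in the transcript:

```python
import torch
import torch.nn.functional as F

# The logits printed in the REPL session above.
logits = torch.tensor([[0.4140, 0.4571],
                       [0.9392, 0.6504]])

# Manual softmax along dim=1: exponentiate, then divide each row by its sum.
manual = torch.exp(logits) / torch.exp(logits).sum(dim=1, keepdim=True)

# F.softmax does the same computation (with better numerical stability).
pred = F.softmax(logits, dim=1)

print(torch.allclose(manual, pred))
print(pred.sum(dim=1))  # each row sums to 1
```

Note that `F.softmax` internally subtracts the row maximum before exponentiating, so it stays stable even for large logits where the naive `torch.exp` version would overflow.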
![softmax formula](https://img-blog.csdnimg.cn/20200717224023470.png?x-oss-process=image/watermark,type_ZmFuZ3poZW5naGVpdGk,shadow_10,text_aHR0cHM6Ly9ibG9nLmNzZG4ubmV0L3p4ODk3MTU3NjU4,size_16,color_FFFFFF,t_70)
![softmax example](https://img-blog.csdnimg.cn/20200717224036227.png?x-oss-process=image/watermark,type_ZmFuZ3poZW5naGVpdGk,shadow_10,text_aHR0cHM6Ly9ibG9nLmNzZG4ubmV0L3p4ODk3MTU3NjU4,size_16,color_FFFFFF,t_70)
[Reference 1](https://blog.csdn.net/NDHuaErFeiFei/article/details/106034348)
[Reference 2](https://blog.csdn.net/weixin_45281949/article/details/103282148)