pytorch
A study of PyTorch's cross-entropy loss function
`nn.CrossEntropyLoss()` conveniently avoids numerical overflow and underflow (see the official documentation). My own implementation:

```python
import torch

def cross_entropy(y_hat, y):
    # y_hat: (n, num_classes) logits; y: (n,) true class indices.
    # Log of the softmax numerator: the logit of the true class.
    numerator = y_hat[range(len(y_hat)), y]
    # Log of the softmax denominator: log-sum-exp over the classes.
    denominator = torch.log(torch.sum(torch.exp(y_hat), 1))
    return -numerator + denominator
```

Original post, 2021-11-22 21:06:31
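A quick sanity check, as a sketch (the seed, shapes, and the +1000 shift are my own assumptions, not from the post): the hand-written formula should match `nn.CrossEntropyLoss`, while the naive `log(sum(exp(...)))` overflows on large logits where the built-in stays finite:

```python
import torch
from torch import nn

torch.manual_seed(0)
y_hat = torch.randn(4, 3)           # logits: 4 samples, 3 classes
y = torch.tensor([0, 2, 1, 1])      # true class indices

def naive_ce(y_hat, y):
    # The same formula as the hand-written cross_entropy, averaged over the batch.
    return (-y_hat[range(len(y_hat)), y]
            + torch.log(torch.sum(torch.exp(y_hat), 1))).mean()

manual = naive_ce(y_hat, y)
builtin = nn.CrossEntropyLoss()(y_hat, y)   # default reduction='mean'
print(torch.allclose(manual, builtin))      # True: they agree on moderate logits

# Shifting every logit by +1000 leaves the true loss unchanged, but the
# naive version overflows (exp(1000) == inf) while the built-in stays finite.
manual_shifted = naive_ce(y_hat + 1000, y)
builtin_shifted = nn.CrossEntropyLoss()(y_hat + 1000, y)
print(manual_shifted, builtin_shifted)
```

This is exactly the failure mode the post alludes to: `nn.CrossEntropyLoss` computes log-softmax with the max-subtraction trick, so it is safe where the textbook formula is not.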
Softmax classification
Parameter initialization also matters a lot. Learned the usage of `zip` and `enumerate`, and PyTorch's broadcasting mechanism.

```python
import torch
import torchvision
from torch.utils import data
from torchvision import transforms
from utils import *  # the author's own helper module

W = torch.normal(0, 1, (784, 10), requires_grad=True)
b = torch.zeros(10, requires_grad=True)
# The excerpt breaks off here, mid-token: "bat..."
```

Original post, 2021-11-22 16:40:04
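A minimal sketch of how these pieces fit together in softmax regression, assuming the 784-feature MNIST-style setup implied by the shapes above (`softmax`, `net`, and the fake input are my own illustrative names, not from the post):

```python
import torch

# Same parameter shapes as the excerpt (784 inputs, 10 classes).
W = torch.normal(0, 0.01, (784, 10), requires_grad=True)
b = torch.zeros(10, requires_grad=True)

def softmax(X):
    X_exp = torch.exp(X)
    partition = X_exp.sum(1, keepdim=True)  # shape (n, 1)
    return X_exp / partition                # (n, 10) / (n, 1) broadcasts

def net(X):
    # b has shape (10,) and broadcasts across the batch dimension.
    return softmax(X.reshape(-1, 784) @ W + b)

X = torch.rand(2, 1, 28, 28)   # two fake MNIST-like images
probs = net(X)
print(probs.shape)             # torch.Size([2, 10])
print(probs.sum(1))            # each row sums to 1
```

Note the two broadcasts: dividing the `(n, 10)` exponentials by the `(n, 1)` row sums, and adding the `(10,)` bias to every row of the batch, which is exactly the mechanism the next post describes.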
PyTorch broadcasting
As in NumPy, two tensors are broadcastable if:

1. Each tensor has at least one dimension.
2. When iterating over the dimension sizes, starting at the trailing dimension (right to left), the dimension sizes must either be equal, one of them is 1, or one of them does not exist.

How is the broadcast performed? Dimensions wi... (the excerpt is truncated here)

Original post, 2021-11-21 16:04:20
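The two conditions can be checked with a small example of my own (the shapes are arbitrary, not from the post):

```python
import torch

x = torch.ones(5, 3, 2)
y = torch.ones(3, 1)

# Right to left: 2 vs 1 (one is 1), 3 vs 3 (equal), 5 vs nothing
# (dimension does not exist) -> broadcastable, result shape (5, 3, 2).
print((x + y).shape)        # torch.Size([5, 3, 2])

# Trailing sizes 2 and 3 are unequal and neither is 1 -> not broadcastable.
broadcast_failed = False
try:
    torch.ones(2) + torch.ones(3)
except RuntimeError:
    broadcast_failed = True
print(broadcast_failed)     # True
```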