17,18 Gradients of Common Functions; Gradients of Activation Functions (Sigmoid, Tanh, ReLU)
1. Gradients of Common Functions

1.1 Common Functions

2. Activation Functions and Their Gradients

2.1 Activation Functions and Their Derivatives

Sigmoid / Logistic

The sigmoid function and its derivative:

sigma(x) = 1 / (1 + e^(-x)),    sigma'(x) = sigma(x) * (1 - sigma(x))

In PyTorch, `torch.sigmoid` evaluates it elementwise. First build a test input:

```python
# -*- coding: UTF-8 -*-
import torch

a = torch.linspace(-100, 100, 10)
print(a)
"""
Output:
tensor([-100.0000,  -77.7778,  -55.5556,  -33.3333,  -11.1111,   11.1111,
          33.3333,   55.5556,   77.7778,  100.0000])
"""
```
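A short sketch (not from the original text) of what the section is building toward: applying `torch.sigmoid` to the tensor above, and checking the closed-form derivative sigma'(x) = sigma(x)(1 - sigma(x)) against PyTorch's autograd. The variable names here are illustrative.

```python
import torch

a = torch.linspace(-100, 100, 10)
s = torch.sigmoid(a)
# At the extremes sigmoid saturates: values near 0 for large
# negative inputs and near 1 for large positive inputs.
print(s)

# Verify the analytic derivative at x = 0 with autograd:
# sigma(0) = 0.5, so sigma'(0) = 0.5 * (1 - 0.5) = 0.25
x = torch.tensor(0.0, requires_grad=True)
y = torch.sigmoid(x)
y.backward()
print(x.grad)  # tensor(0.2500)
```

The saturation visible at the ends of the tensor is exactly why the sigmoid's gradient vanishes for large |x|: sigma'(x) = sigma(x)(1 - sigma(x)) goes to 0 whenever sigma(x) approaches 0 or 1.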