Comparing pytorch with torch: pytorch replaces all of the Container modules with autograd, so there is no need for ConcatTable, CAddTable, and the like. Plain symbolic expressions do the job. Instead of
output = nn.CAddTable():forward({input1, input2})
you simply write
output = input1 + input2
Much simpler.
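A minimal sketch of the point above, using the current tensor API (`requires_grad=True`) rather than the old `Variable` wrapper: plain `+` is a symbolic op that autograd tracks, so no CAddTable-style container is needed and gradients flow through it automatically.

```python
import torch

# Two tensors that autograd will track.
input1 = torch.ones(2, 3, requires_grad=True)
input2 = torch.full((2, 3), 2.0, requires_grad=True)

# Plain addition replaces nn.CAddTable():forward({input1, input2}).
output = input1 + input2

# Gradients flow through the add like any other op.
output.sum().backward()

print(output)       # all entries 3.0
print(input1.grad)  # all entries 1.0
```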
As the figure below shows, a pytorch network module only carries .weight and .bias; the .gradInput and .output buffers that torch modules kept around are gone.
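This is easy to check directly. A small sketch: a pytorch layer exposes its parameters as .weight and .bias, and there is no stored .gradInput or .output attribute on the module.

```python
import torch.nn as nn

# Any parameterized layer works; Linear is the simplest.
layer = nn.Linear(4, 2)

# Parameters are plain attributes of the module.
print(layer.weight.shape)  # torch.Size([2, 4])
print(layer.bias.shape)    # torch.Size([2])

# The torch-era buffers are simply not there.
print(hasattr(layer, 'gradInput'))  # False
print(hasattr(layer, 'output'))     # False
```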
Example:
import torch
from torch.autograd import Variable
import torch.nn as nn
import torch.nn.functional as F

class MNISTConvNet(nn.Module):
    def __init__(self):
        # this is the place where you instantiate all your modules
        # you can later access them using the same