# MxNet Learning Notes (3): About Symbol

## Symbol

- NDArray:
  - Straightforward.
  - Easy to work with native language features (for loop, if-else condition, ...) and libraries (numpy, ...).
  - Easy step-by-step code debugging.
- Symbol:
  - Provides almost all functionalities of NDArray, such as `+`, `*`, `sin`, `reshape`, etc.
  - Easy to save, load and visualize.
  - Easy for the backend to optimize the computation and memory usage.
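The trade-off above can be illustrated with a toy deferred-evaluation sketch in plain Python (no MXNet required; the `Sym` class and everything in it are invented purely for illustration): an NDArray-style expression computes immediately, while a Symbol-style expression only records a graph and computes when explicitly evaluated — which is what lets a backend see the whole graph and optimize it.

```python
# Toy illustration of eager (NDArray-like) vs. deferred (Symbol-like) styles.
# Everything here is invented for demonstration; it is not MXNet code.

class Sym:
    """A node in a tiny expression graph: records ops, computes on demand."""
    def __init__(self, name=None, op=None, args=()):
        self.name, self.op, self.args = name, op, args

    def __add__(self, other):
        return Sym(op=lambda x, y: x + y, args=(self, other))

    def __mul__(self, other):
        return Sym(op=lambda x, y: x * y, args=(self, other))

    def eval(self, **bindings):
        if self.op is None:                      # a leaf variable
            return bindings[self.name]
        vals = [s.eval(**bindings) for s in self.args]
        return self.op(*vals)

# Eager style: the result exists as soon as the line runs.
eager = 2 * 3 + 1          # computed immediately -> 7

# Symbolic style: build the graph first, evaluate later.
a, b = Sym('a'), Sym('b')
c = a * b + a              # nothing computed yet; just a graph
print(c.eval(a=2, b=3))    # 8
```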

## Building Basic Expressions from Symbols

```python
import mxnet as mx

a = mx.sym.Variable('a')
b = mx.sym.Variable('b')

# element-wise multiplication
d = a * b
# matrix multiplication
e = mx.sym.dot(a, b)
# reshape
f = mx.sym.reshape(d + e, shape=(1, 4))
g = mx.sym.broadcast_to(f, shape=(2, 4))
```

## Visualizing the Network

MxNet provides a convenient API for visualizing networks. For example, in the code above the final symbol is `g`, so we can visualize `g` as follows:

```python
mx.viz.plot_network(g).view()
```

## Building Neural Networks from Symbols

```python
net = mx.sym.Variable('data')
net = mx.sym.FullyConnected(data=net, name='fc1', num_hidden=128)
net = mx.sym.Activation(data=net, name='relu1', act_type="relu")
net = mx.sym.FullyConnected(data=net, name='fc2', num_hidden=10)
net = mx.sym.SoftmaxOutput(data=net, name='out')
```

```python
def ConvFactory(data, num_filter, kernel, stride=(1, 1), pad=(0, 0), name=None, suffix=''):
    conv = mx.sym.Convolution(data=data, num_filter=num_filter, kernel=kernel,
                              stride=stride, pad=pad,
                              name='conv_%s%s' % (name, suffix))
    bn = mx.sym.BatchNorm(data=conv, name='bn_%s%s' % (name, suffix))
    act = mx.sym.Activation(data=bn, act_type='relu',
                            name='relu_%s%s' % (name, suffix))
    return act

prev = mx.sym.Variable(name="Previous Output")
conv_comp = ConvFactory(data=prev, num_filter=64, kernel=(7, 7), stride=(2, 2))
shape = {"Previous Output": (128, 3, 28, 28)}
```

```python
net = mx.sym.Variable('data')
fc1 = mx.sym.FullyConnected(data=net, name='fc1', num_hidden=128)
net = mx.sym.Activation(data=fc1, name='relu1', act_type="relu")
out1 = mx.sym.SoftmaxOutput(data=net, name='softmax')
out2 = mx.sym.LinearRegressionOutput(data=net, name='regression')
group = mx.sym.Group([out1, out2])
group.list_outputs()
```


## Operating on Symbols

- To see which symbols a network contains, call `xxx.list_arguments()`; to see only the outputs, call `xxx.list_outputs()`.
- When a layer is created, its argument names are assigned automatically, but you can specify a name manually like this:
```python
net = mx.symbol.Variable('data')
w = mx.symbol.Variable('myweight')
net = mx.symbol.FullyConnected(data=net, weight=w, name='fc1', num_hidden=128)
net.list_arguments()
```
- After a symbol is defined, you can bind data to it with `bind` and then run the computation:
```python
# The symbol c below is built from variables a and b
# (it was left undefined in the original snippet).
a = mx.sym.Variable('a')
b = mx.sym.Variable('b')
c = a + b

gpu_device = mx.gpu()  # Change this to mx.cpu() in absence of GPUs.

ex_gpu = c.bind(ctx=gpu_device, args={'a': mx.nd.ones([3, 4], gpu_device) * 2,
                                      'b': mx.nd.ones([3, 4], gpu_device) * 3})
ex_gpu.forward()
ex_gpu.outputs[0].asnumpy()

# eval combines bind and forward in a single call:
ex = c.eval(ctx=mx.cpu(), a=mx.nd.ones([2, 3]), b=mx.nd.ones([2, 3]))
print('number of outputs = %d\nthe first output = \n%s' % (
    len(ex), ex[0].asnumpy()))
```
