Neural Network Learning Tools

I currently work at an AI company. My day-to-day work centers on adapting models to different platforms, model conversion, and quantization. Although this is model-related, my grasp of model structures and underlying principles is not deep enough, so I want to use spare time to build up knowledge of model architectures and principles and to follow recent advances. In the process, I have run into considerable obstacles in both:

1. acquiring knowledge;
2. reproducing code.

The main obstacles are:

1. not enough time to track the latest developments (day-to-day work rarely demands it either);
2. insufficient visualization support in the major frameworks (especially PyTorch);
3. limited familiarity with the frameworks' commonly used functions.

For these three problems, my approaches are: problem 3 can only be addressed by reading documentation and other people's code; problem 1 is handled well enough by following https://www.paperswithcode.com/; for problem 2, I have put together a workable learning path.

1. Where visualization is needed, prefer TensorFlow: the model structure is easy to inspect there (method described below).
2. A small PyTorch visualization tool: https://codechina.csdn.net/lvsolo/pytorch-summary

For papers that already have a reference implementation, try to find a TensorFlow version and visualize it in TensorFlow:

import tensorflow as tf
from tensorflow.python.framework import graph_util

# Freeze the graph: fold variables into constants so it can be saved as a .pb
output_graph_def = graph_util.convert_variables_to_constants(
    retinanet.sess,
    retinanet.sess.graph_def,
    output_node_names=['regressor/conv2d_4/BiasAdd', 'regressor/conv2d_9/BiasAdd'])
with tf.gfile.GFile('retina.pb', 'wb') as f:
    f.write(output_graph_def.SerializeToString())

The saved .pb file can be opened with Netron. The only parameter that must be obtained in advance is output_node_names; after studying the paper, the output node can be located in the code, and

print(output_node.name)

then yields the output node's name.
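When the output node is not obvious from the code, a minimal fallback (a sketch; `list_graph_ops` and the `sess` argument are hypothetical names, assuming a TF1-style session) is to print every operation name in the graph and spot the output by eye:

```python
# Hypothetical helper, assuming `sess` is an active TF1-style tf.Session.
# Prints every operation name in the graph so candidate output nodes
# (needed for output_node_names above) can be identified by inspection.
def list_graph_ops(sess):
    for op in sess.graph.get_operations():
        print(op.name)
```

Output nodes are usually near the end of the list, often with names like `.../BiasAdd` or `.../Softmax`.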
The currently usable PyTorch visualization tool: https://codechina.csdn.net/lvsolo/pytorch-summary
At present it produces tensor-shape output like the following; it does not yet output the model's connection structure:

----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 112, 112]           9,408
       BatchNorm2d-2         [-1, 64, 112, 112]             128
              ReLU-3         [-1, 64, 112, 112]               0
         MaxPool2d-4           [-1, 64, 56, 56]               0
            Conv2d-5           [-1, 64, 56, 56]          36,928
       BatchNorm2d-6           [-1, 64, 56, 56]             128
              ReLU-7           [-1, 64, 56, 56]               0
        BasicBlock-8           [-1, 64, 56, 56]               0
            Conv2d-9           [-1, 64, 56, 56]          36,928
      BatchNorm2d-10           [-1, 64, 56, 56]             128
             ReLU-11           [-1, 64, 56, 56]               0
       BasicBlock-12           [-1, 64, 56, 56]               0
           Conv2d-13          [-1, 128, 28, 28]          73,856
      BatchNorm2d-14          [-1, 128, 28, 28]             256
           Conv2d-15          [-1, 128, 28, 28]           8,192
      BatchNorm2d-16          [-1, 128, 28, 28]             256
             ReLU-17          [-1, 128, 28, 28]               0
       BasicBlock-18          [-1, 128, 28, 28]               0
           Conv2d-19          [-1, 128, 28, 28]         147,584
      BatchNorm2d-20          [-1, 128, 28, 28]             256
             ReLU-21          [-1, 128, 28, 28]               0
       BasicBlock-22          [-1, 128, 28, 28]               0
           Conv2d-23          [-1, 256, 14, 14]         295,168
      BatchNorm2d-24          [-1, 256, 14, 14]             512
           Conv2d-25          [-1, 256, 14, 14]          32,768
      BatchNorm2d-26          [-1, 256, 14, 14]             512
             ReLU-27          [-1, 256, 14, 14]               0
       BasicBlock-28          [-1, 256, 14, 14]               0
           Conv2d-29          [-1, 256, 14, 14]         590,080
      BatchNorm2d-30          [-1, 256, 14, 14]             512
             ReLU-31          [-1, 256, 14, 14]               0
       BasicBlock-32          [-1, 256, 14, 14]               0
           Conv2d-33            [-1, 512, 7, 7]       1,180,160
      BatchNorm2d-34            [-1, 512, 7, 7]           1,024
           Conv2d-35            [-1, 512, 7, 7]         131,072
      BatchNorm2d-36            [-1, 512, 7, 7]           1,024
             ReLU-37            [-1, 512, 7, 7]               0
       BasicBlock-38            [-1, 512, 7, 7]               0
           Conv2d-39            [-1, 512, 7, 7]       2,359,808
      BatchNorm2d-40            [-1, 512, 7, 7]           1,024
             ReLU-41            [-1, 512, 7, 7]               0
       BasicBlock-42            [-1, 512, 7, 7]               0
           ResNet-43  [[-1, 128, 28, 28], [-1, 256, 14, 14], [-1, 512, 7, 7]]               0
           Conv2d-44            [-1, 256, 7, 7]         131,328
           Conv2d-45            [-1, 256, 7, 7]         590,080
           Conv2d-46          [-1, 256, 14, 14]          65,792
           Conv2d-47          [-1, 256, 14, 14]         590,080
           Conv2d-48          [-1, 256, 28, 28]          33,024
           Conv2d-49          [-1, 256, 28, 28]         590,080
           Conv2d-50            [-1, 256, 4, 4]       1,179,904
             ReLU-51            [-1, 256, 4, 4]               0
           Conv2d-52            [-1, 256, 2, 2]         590,080
FeaturePyramidNet-53  [[-1, 256, 28, 28], [-1, 256, 14, 14], [-1, 256, 7, 7], [-1, 256, 4, 4], [-1, 256, 2, 2]]               0
          Anchors-54              [-1, 9441, 4]               0
           Conv2d-55          [-1, 256, 28, 28]         590,080
             ReLU-56          [-1, 256, 28, 28]               0
           Conv2d-57          [-1, 256, 28, 28]         590,080
             ReLU-58          [-1, 256, 28, 28]               0
           Conv2d-59          [-1, 256, 28, 28]         590,080
             ReLU-60          [-1, 256, 28, 28]               0
           Conv2d-61          [-1, 256, 28, 28]         590,080
             ReLU-62          [-1, 256, 28, 28]               0
           Conv2d-63           [-1, 18, 28, 28]          41,490
          Sigmoid-64           [-1, 18, 28, 28]               0
ClassificationNet-65              [-1, 7056, 2]               0
           Conv2d-66          [-1, 256, 14, 14]         590,080
             ReLU-67          [-1, 256, 14, 14]               0
           Conv2d-68          [-1, 256, 14, 14]         590,080
             ReLU-69          [-1, 256, 14, 14]               0
           Conv2d-70          [-1, 256, 14, 14]         590,080
             ReLU-71          [-1, 256, 14, 14]               0
           Conv2d-72          [-1, 256, 14, 14]         590,080
             ReLU-73          [-1, 256, 14, 14]               0
           Conv2d-74           [-1, 18, 14, 14]          41,490
          Sigmoid-75           [-1, 18, 14, 14]               0
ClassificationNet-76              [-1, 1764, 2]               0
           Conv2d-77            [-1, 256, 7, 7]         590,080
             ReLU-78            [-1, 256, 7, 7]               0
           Conv2d-79            [-1, 256, 7, 7]         590,080
             ReLU-80            [-1, 256, 7, 7]               0
           Conv2d-81            [-1, 256, 7, 7]         590,080
             ReLU-82            [-1, 256, 7, 7]               0
           Conv2d-83            [-1, 256, 7, 7]         590,080
             ReLU-84            [-1, 256, 7, 7]               0
           Conv2d-85             [-1, 18, 7, 7]          41,490
          Sigmoid-86             [-1, 18, 7, 7]               0
ClassificationNet-87               [-1, 441, 2]               0
           Conv2d-88            [-1, 256, 4, 4]         590,080
             ReLU-89            [-1, 256, 4, 4]               0
           Conv2d-90            [-1, 256, 4, 4]         590,080
             ReLU-91            [-1, 256, 4, 4]               0
           Conv2d-92            [-1, 256, 4, 4]         590,080
             ReLU-93            [-1, 256, 4, 4]               0
           Conv2d-94            [-1, 256, 4, 4]         590,080
             ReLU-95            [-1, 256, 4, 4]               0
           Conv2d-96             [-1, 18, 4, 4]          41,490
          Sigmoid-97             [-1, 18, 4, 4]               0
ClassificationNet-98               [-1, 144, 2]               0
           Conv2d-99            [-1, 256, 2, 2]         590,080
            ReLU-100            [-1, 256, 2, 2]               0
          Conv2d-101            [-1, 256, 2, 2]         590,080
            ReLU-102            [-1, 256, 2, 2]               0
          Conv2d-103            [-1, 256, 2, 2]         590,080
            ReLU-104            [-1, 256, 2, 2]               0
          Conv2d-105            [-1, 256, 2, 2]         590,080
            ReLU-106            [-1, 256, 2, 2]               0
          Conv2d-107             [-1, 18, 2, 2]          41,490
         Sigmoid-108             [-1, 18, 2, 2]               0
ClassificationNet-109                [-1, 36, 2]               0
          Conv2d-110          [-1, 256, 28, 28]         590,080
            ReLU-111          [-1, 256, 28, 28]               0
          Conv2d-112          [-1, 256, 28, 28]         590,080
            ReLU-113          [-1, 256, 28, 28]               0
          Conv2d-114          [-1, 256, 28, 28]         590,080
            ReLU-115          [-1, 256, 28, 28]               0
          Conv2d-116          [-1, 256, 28, 28]         590,080
            ReLU-117          [-1, 256, 28, 28]               0
          Conv2d-118           [-1, 36, 28, 28]          82,980
   RegressionNet-119              [-1, 7056, 4]               0
          Conv2d-120          [-1, 256, 14, 14]         590,080
            ReLU-121          [-1, 256, 14, 14]               0
          Conv2d-122          [-1, 256, 14, 14]         590,080
            ReLU-123          [-1, 256, 14, 14]               0
          Conv2d-124          [-1, 256, 14, 14]         590,080
            ReLU-125          [-1, 256, 14, 14]               0
          Conv2d-126          [-1, 256, 14, 14]         590,080
            ReLU-127          [-1, 256, 14, 14]               0
          Conv2d-128           [-1, 36, 14, 14]          82,980
   RegressionNet-129              [-1, 1764, 4]               0
          Conv2d-130            [-1, 256, 7, 7]         590,080
            ReLU-131            [-1, 256, 7, 7]               0
          Conv2d-132            [-1, 256, 7, 7]         590,080
            ReLU-133            [-1, 256, 7, 7]               0
          Conv2d-134            [-1, 256, 7, 7]         590,080
            ReLU-135            [-1, 256, 7, 7]               0
          Conv2d-136            [-1, 256, 7, 7]         590,080
            ReLU-137            [-1, 256, 7, 7]               0
          Conv2d-138             [-1, 36, 7, 7]          82,980
   RegressionNet-139               [-1, 441, 4]               0
          Conv2d-140            [-1, 256, 4, 4]         590,080
            ReLU-141            [-1, 256, 4, 4]               0
          Conv2d-142            [-1, 256, 4, 4]         590,080
            ReLU-143            [-1, 256, 4, 4]               0
          Conv2d-144            [-1, 256, 4, 4]         590,080
            ReLU-145            [-1, 256, 4, 4]               0
          Conv2d-146            [-1, 256, 4, 4]         590,080
            ReLU-147            [-1, 256, 4, 4]               0
          Conv2d-148             [-1, 36, 4, 4]          82,980
   RegressionNet-149               [-1, 144, 4]               0
          Conv2d-150            [-1, 256, 2, 2]         590,080
            ReLU-151            [-1, 256, 2, 2]               0
          Conv2d-152            [-1, 256, 2, 2]         590,080
            ReLU-153            [-1, 256, 2, 2]               0
          Conv2d-154            [-1, 256, 2, 2]         590,080
            ReLU-155            [-1, 256, 2, 2]               0
          Conv2d-156            [-1, 256, 2, 2]         590,080
            ReLU-157            [-1, 256, 2, 2]               0
          Conv2d-158             [-1, 36, 2, 2]          82,980
   RegressionNet-159                [-1, 36, 4]               0
          SubNet-160  [[-1, 9441, 2], [-1, 9441, 4]]               0
       RetinaNet-161               [[-1], [-1]]               0
================================================================
Total params: 32,903,630
Trainable params: 32,903,630
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 38997339272428.49
Params size (MB): 125.52
Estimated Total Size (MB): 38997338357760.00
----------------------------------------------------------------