(15) TensorFlow: tensor implementation of a fully connected layer

Tensor implementation of a fully connected layer: building a single network layer

  1. Define the weight tensor 𝑾 and the bias tensor 𝒃.
  2. Use the batch matrix multiplication function tf.matmul() to compute the linear part of the layer, 𝑿@𝑾.
  3. Add the bias vector 𝒃 to the result of 𝑿@𝑾.
  4. Pass the result through an activation function.
import tensorflow as tf

x = tf.random.normal([2, 567])  # simulate 2 samples with 567 features each
w1 = tf.Variable(tf.random.truncated_normal([567, 250], stddev=0.5))  # initialize the weights W
b1 = tf.Variable(tf.zeros([250]))  # initialize the bias b
o1 = tf.matmul(x, w1) + b1  # compute X@W + b
o1 = tf.nn.relu(o1)  # apply the ReLU activation function
print(o1)

out:
tf.Tensor(
[[ 0.          5.6923637   0.          0.          5.924873    6.430141
  ...
   2.9483936   6.7834206   8.562119    0.        ]
 [ 0.          0.          9.827619    0.          0.         11.907226
  ...
  22.77712     5.178497    0.         21.299566  ]], shape=(2, 250), dtype=float32)

The exact values are random and will differ on every run; what matters is the output shape (2, 250) and that ReLU has set every negative entry to 0.
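
For comparison only (this is not part of the original example), the same computation can be written with the high-level tf.keras.layers.Dense layer, which creates and manages 𝑾 and 𝒃 internally. A minimal sketch, assuming the same input and output dimensions as above:

import tensorflow as tf

x = tf.random.normal([2, 567])                       # 2 samples, 567 features, as above
fc = tf.keras.layers.Dense(250, activation='relu')   # internally builds W [567, 250] and b [250]
o1 = fc(x)                                           # computes relu(x @ W + b); W and b are created on the first call
print(o1.shape)                                      # (2, 250)
print(fc.kernel.shape, fc.bias.shape)                # (567, 250) (250,)

Both versions produce a layer output of shape (2, 250); the tensor version simply makes each step of the computation explicit.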