tensorflow2.0
cici_iii
keep coding
tf2: Gradients do not exist for variables when minimizing the loss.
WARNING:tensorflow:Gradients do not exist for variables when minimizing the loss.

Case 1 (incorrect usage): compute loss1 and loss2 in the model's forward pass and return them, then combine them into loss afterwards:

with tf.GradientTape() as tape:
    loss1, loss2 = model(user_list, item_list, labels_list)
    loss = precision1 * loss1 + preci…

Original post · 2021-04-04 15:09:40 · 3374 views · 2 comments
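The preview is truncated, but the warning it describes typically goes away when the model call and the loss combination both happen inside the tape, so every trainable variable contributing to `loss` is watched. A minimal sketch (the `TwoLossModel` class is a hypothetical stand-in for the post's `model`):

```python
import tensorflow as tf

# Hypothetical two-output model standing in for the post's `model`.
class TwoLossModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(1)

    def call(self, x):
        pred = self.dense(x)
        loss1 = tf.reduce_mean(tf.square(pred))
        loss2 = tf.reduce_mean(tf.abs(pred))
        return loss1, loss2

model = TwoLossModel()
x = tf.random.normal((4, 3))

# Correct pattern: call the model AND combine the losses inside the tape,
# so every variable used to produce `loss` is recorded.
with tf.GradientTape() as tape:
    loss1, loss2 = model(x)
    loss = 0.5 * loss1 + 0.5 * loss2

grads = tape.gradient(loss, model.trainable_variables)
# Every gradient exists, so no "Gradients do not exist" warning is raised.
assert all(g is not None for g in grads)
```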
tensorflow2.0 Dataset creation and usage
1. Creating a Dataset

# from_tensor_slices can accept a numpy.ndarray, a tuple, or a dict
dataset = tf.data.Dataset.from_tensor_slices(np.arange(10).reshape((5, 2)))
dataset = tf.data.Dataset.from_tensor_slices(([1, 2, 3, 4, 5, 6], [10, 20, 30, 40, 50, 60]))
dataset = tf.data.Dataset.from_tensor_slices({"x": [1, 2…

Original post · 2021-03-30 11:12:38 · 888 views · 0 comments
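A runnable sketch of the three input forms from the preview; in each case `from_tensor_slices` slices along the first axis, yielding one element per row (the dict values are shortened here for brevity):

```python
import numpy as np
import tensorflow as tf

# from_tensor_slices accepts an ndarray, a tuple, or a dict,
# slicing each along the first axis to produce dataset elements.
ds_array = tf.data.Dataset.from_tensor_slices(np.arange(10).reshape((5, 2)))
ds_tuple = tf.data.Dataset.from_tensor_slices(([1, 2, 3], [10, 20, 30]))
ds_dict = tf.data.Dataset.from_tensor_slices({"x": [1, 2], "y": [3, 4]})

first = next(iter(ds_array))   # first row of the ndarray: [0 1]
pair = next(iter(ds_tuple))    # a tuple of scalars: (1, 10)
record = next(iter(ds_dict))   # a dict of scalars: {"x": 1, "y": 3}
```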
Tensorflow2.0 tf.function and AutoGraph mode
A quick note, to be expanded over time.

1. Functions

# behaves like a TensorFlow op
@tf.function
def add(a, b):
    return a + b

# Even when plain numbers are passed in, the computation is basic Python
# arithmetic, but the return type becomes a Tensor.
print(add(1, 2))  # tf.Tensor(3, shape=(), dtype=int32)
print(add(tf.ones([2, 2]), tf.ones([2, 2])))  # fast matrix computation
# tf.Tensor(…

Original post · 2021-03-29 21:33:29 · 183 views · 2 comments
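The snippet above, restored to runnable form; `tf.function` traces the Python function into a graph, and both Python scalars and tensors come back as Tensors:

```python
import tensorflow as tf

@tf.function
def add(a, b):
    # Traced into a graph on first call; retraced per input signature.
    return a + b

r1 = add(1, 2)                              # tf.Tensor(3, shape=(), dtype=int32)
r2 = add(tf.ones([2, 2]), tf.ones([2, 2]))  # elementwise matrix addition
```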
Tensorflow2.0 model building and training
Building a model

class Encoder(layers.Layer):
    def __init__(self, latent_dim=32, intermediate_dim=64, name="encoder", **kwargs):
        super(Encoder, self).__init__(name=name, **kwargs)
        '''
        w_init = tf.random_normal_initializer()
        self.w =…

Original post · 2021-03-29 20:51:53 · 448 views · 0 comments
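The preview cuts off mid-constructor. A minimal sketch in the same spirit, keeping the `latent_dim`/`intermediate_dim`/`name` signature from the preview; the two `Dense` layers in the body are an assumption, since the original implementation is truncated:

```python
import tensorflow as tf
from tensorflow.keras import layers

class Encoder(layers.Layer):
    """Subclassed layer: maps inputs to a latent representation."""

    def __init__(self, latent_dim=32, intermediate_dim=64, name="encoder", **kwargs):
        super().__init__(name=name, **kwargs)
        # Assumed body: the original post's weight-creation code is cut off.
        self.hidden = layers.Dense(intermediate_dim, activation="relu")
        self.out = layers.Dense(latent_dim)

    def call(self, inputs):
        return self.out(self.hidden(inputs))

enc = Encoder()
z = enc(tf.random.normal((8, 16)))  # batch of 8, feature dim 16
assert z.shape == (8, 32)           # projected to latent_dim=32
```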
PyTorch and TensorFlow 2.0 method equivalents
Tensor initialization
pytorch: xavier_uniform_()
tf2.0: GlorotUniform()

def confirm(weight):
    mean = np.sum(weight) / dim
    print("mean: {}".format(mean))
    square_sum = np.sum((mean - weight) ** 2)
    print("variance: {}".format(square_sum / dim))

dim = 1000000
w = n…

Original post · 2021-03-23 15:10:07 · 681 views · 0 comments
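A sketch of the equivalence check the preview starts: `tf.keras.initializers.GlorotUniform` is TF2's counterpart of PyTorch's `xavier_uniform_`, drawing from Uniform(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)), so the sample mean should be near 0 and the variance near 2 / (fan_in + fan_out). The shape `(dim, 1)` is an assumption, since the preview truncates at `w = n…`:

```python
import numpy as np
import tensorflow as tf

dim = 1_000_000
# GlorotUniform ~ Uniform(-limit, limit), limit = sqrt(6 / (fan_in + fan_out)).
# With shape (dim, 1): fan_in = dim, fan_out = 1.
w = tf.keras.initializers.GlorotUniform()(shape=(dim, 1)).numpy().ravel()

mean = np.sum(w) / dim
var = np.sum((w - mean) ** 2) / dim
# Expected: mean ~ 0, var ~ limit**2 / 3 = 2 / (dim + 1) ~ 2e-6
print("mean: {}".format(mean))
print("variance: {}".format(var))
```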