Doing BERT text classification with TF2 and transformers, I hit a bug: `if list(y_pred.keys())[0] == "loss":` raises `AttributeError: 'Tensor' object has no attribute 'keys'`

The model architecture I was using when I ran into this problem:

class TFBertForMultilabelClassification(TFBertPreTrainedModel):

    def __init__(self, config, *inputs, **kwargs):
        super().__init__(config, *inputs, **kwargs)
        self.num_labels = config.num_labels

        self.bert = TFBertMainLayer(config, name='bert')
        self.dropout = tf.keras.layers.Dropout(config.hidden_dropout_prob)
        # note: sigmoid is the usual choice for multi-label; softmax kept as in my original code
        self.classifier = tf.keras.layers.Dense(config.num_labels, activation='softmax')

    def call(self, inputs, **kwargs):
        outputs = self.bert(inputs, **kwargs)

        pooled_output = outputs[1]  # pooled [CLS] representation

        pooled_output = self.dropout(pooled_output, training=kwargs.get('training', False))
        logits = self.classifier(pooled_output)

        return logits  # a bare tensor, not a dict-like ModelOutput

model = TFBertForMultilabelClassification.from_pretrained(model_path, num_labels=num_classes)

With this model, the following error is thrown:

    if list(y_pred.keys())[0] == "loss":
    AttributeError: 'Tensor' object has no attribute 'keys'
Whereas using the architecture packaged by transformers itself:

model = TFBertForSequenceClassification.from_pretrained(model_path, num_labels=num_classes)

does not raise the error.
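The difference comes down to the return type: the stock TFBertForSequenceClassification returns a dict-like ModelOutput, while my custom call() returns a bare tensor. The sketch below is my own simplified stand-in for the selection logic in modeling_tf_utils.py (the names and details are illustrative, not the library's actual code), showing why one passes the check and the other does not:

```python
# Hypothetical, simplified stand-in for the y_pred selection step in
# transformers' modeling_tf_utils.py: it assumes the model output is
# dict-like and calls .keys() on it.
def first_non_loss_output(y_pred):
    # Pick the first output tensor that is not the loss.
    if list(y_pred.keys())[0] == "loss":
        return list(y_pred.values())[1]
    return list(y_pred.values())[0]


# Dict-like output, as the stock transformers model returns: works.
print(first_non_loss_output({"loss": 0.3, "logits": [0.1, 0.9]}))


# Bare tensor, as the custom call() returns: no .keys(), so AttributeError.
class FakeTensor:  # stand-in for tf.Tensor
    pass


try:
    first_non_loss_output(FakeTensor())
except AttributeError as exc:
    print(exc)
```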
After some analysis, I solved it like this: modify the code at C:\Users\xiao013.li\Anaconda3\lib\site-packages\transformers\modeling_tf_utils.py, line 1427, to:

# If the labels are a single tensor, match them to the first non-loss tensor in the output
try:
    if list(y_pred.keys())[0] == "loss":
        y_pred = y_pred[1]
    else:
        y_pred = y_pred[0]
except AttributeError:
    # y_pred is already a bare tensor; use it as-is
    pass

The program then runs normally!
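Patching site-packages works, but the change is lost on every transformers upgrade. An alternative, assuming the same keys()-based check, is to leave the library alone and make the custom call() return a dict (e.g. `return {'logits': logits}`), so the output is dict-like and the original library code succeeds. A pure-Python sketch of the idea (the dispatcher below is my simplified stand-in with the try/except guard from the patch above, not the library's real code):

```python
# Simplified stand-in for the y_pred selection logic, with the
# AttributeError guard from the patch above; names are illustrative.
def pick_y_pred(y_pred):
    try:
        if list(y_pred.keys())[0] == "loss":
            y_pred = list(y_pred.values())[1]
        else:
            y_pred = list(y_pred.values())[0]
    except AttributeError:
        pass  # bare tensor: use it as-is
    return y_pred


# If call() returns a dict, even the unpatched keys() check passes:
logits = [0.2, 0.8]
assert pick_y_pred({"logits": logits}) == logits  # dict output: selected
assert pick_y_pred(logits) == logits              # bare output: passthrough
```

Either way the training loop receives the logits; the dict-return approach just keeps the fix inside your own model code.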
