With BERT out, can RoBERTa be far behind? pytorch-transformers already ships BertForTokenClassification.
But for those of us pushing ahead in industry, updates this slow are unbearable,
so we wrote our own RobertaForTokenClassification class, ready to use!
Here is the code:
class RobertaForTokenClassification(BertPreTrainedModel):
    r"""
        **labels**: (`optional`) ``torch.LongTensor`` of shape ``(batch_size, sequence_length)``:
            Labels for computing the token classification loss.
            Indices should be in ``[0, ..., config.num_labels - 1]``.

    Outputs: `Tuple` comprising various elements depending on the configuration (config) and inputs:
        **loss**: (`optional`, returned when ``labels`` is provided) ``torch.FloatTensor`` of shape ``(1,)``:
            Classification loss.
        **scores**: ``torch.FloatTensor`` of shape ``(batch_size, sequence_length, config.num_labels)``:
            Classification scores (before SoftMax).
    """
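Before wiring the full class together, it helps to see what the token-classification head itself does. A minimal sketch, assuming the usual pytorch-transformers pattern (dropout, then a linear layer over the encoder's sequence output, then a cross-entropy loss restricted to non-padding positions); the `hidden_size`, `num_labels`, and random tensors below are illustrative, not from the post:

```python
import torch
import torch.nn as nn
from torch.nn import CrossEntropyLoss

hidden_size, num_labels = 768, 9  # e.g. a 9-label NER tag set (illustrative)

dropout = nn.Dropout(0.1)
classifier = nn.Linear(hidden_size, num_labels)

# Stand-in for RobertaModel(config)(input_ids)[0], the per-token hidden states
batch_size, seq_len = 2, 5
sequence_output = torch.randn(batch_size, seq_len, hidden_size)
attention_mask = torch.tensor([[1, 1, 1, 0, 0],
                               [1, 1, 1, 1, 1]])
labels = torch.randint(0, num_labels, (batch_size, seq_len))

# One score vector per token: shape (batch_size, seq_len, num_labels)
logits = classifier(dropout(sequence_output))

# Compute the loss only over real tokens, skipping padding
loss_fct = CrossEntropyLoss()
active = attention_mask.view(-1) == 1
loss = loss_fct(logits.view(-1, num_labels)[active],
                labels.view(-1)[active])
```

In the real class this head sits in `forward`, applied to `outputs[0]` from the underlying `RobertaModel`; masking the loss matters because padding tokens would otherwise drag the gradient toward meaningless labels.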