Representing words as embeddings
The code is as follows:
import torch
import torch.nn as nn

# Map each word to an integer index
word_to_ix = {'hello': 0, 'world': 1}
# Wrap the index in a LongTensor; nn.Embedding expects integer indices
lookup_tensor = torch.tensor([word_to_ix['hello']], dtype=torch.long)
# Embedding table: len(word_to_ix) rows, one 5-dimensional vector per word
embeds = nn.Embedding(len(word_to_ix), 5)
hello_embed = embeds(lookup_tensor)
print(hello_embed)
tensor([[-1.8915, -0.8448, 1.7506, -1.7749, 1.4395]],
grad_fn=<EmbeddingBackward>)
The embedding weights are randomly initialized, so the exact values will differ between runs; they are trainable parameters updated by backpropagation, as the grad_fn shows.
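Since an embedding lookup accepts a whole tensor of indices, several words can be fetched in one call. This is a minimal sketch building on the same `word_to_ix` table; the seed is only to make the sketch deterministic and is not part of the original code:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # assumption: fixed seed, only for reproducibility here

word_to_ix = {'hello': 0, 'world': 1}
embeds = nn.Embedding(len(word_to_ix), 5)

# Pass a tensor of several indices to look up multiple words at once
indices = torch.tensor([word_to_ix['hello'], word_to_ix['world']], dtype=torch.long)
both = embeds(indices)
print(both.shape)  # one 5-dimensional row per input index
```

The result has shape (2, 5): one embedding vector per index, which is how a whole sentence of word indices is typically mapped to vectors in one call.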