The nn.Embedding layer error "index out of range in self" explained

Error details

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-383-d67388d2e4cc> in <module>
      1 output_emb = myEmbed(total_words = total_words, embedding_dim = 8)
      2 word_vector = torch.tensor(word_vector, dtype=torch.long).clone().detach()
----> 3 output = output_emb(word_vector)
      4 print(output)
      5 # word_vector

/opt/anaconda3/envs/py36/lib/python3.6/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
    720             result = self._slow_forward(*input, **kwargs)
    721         else:
--> 722             result = self.forward(*input, **kwargs)
    723         for hook in itertools.chain(
    724                 _global_forward_hooks.values(),

<ipython-input-382-10f2ec94e0ae> in forward(self, sentences_idx)
      4         self.embed = nn.Embedding(total_words,embedding_dim)
      5     def forward(self,sentences_idx):
----> 6         return self.embed(sentences_idx).clone().detach()

/opt/anaconda3/envs/py36/lib/python3.6/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
    720             result = self._slow_forward(*input, **kwargs)
    721         else:
--> 722             result = self.forward(*input, **kwargs)
    723         for hook in itertools.chain(
    724                 _global_forward_hooks.values(),

/opt/anaconda3/envs/py36/lib/python3.6/site-packages/torch/nn/modules/sparse.py in forward(self, input)
    124         return F.embedding(
    125             input, self.weight, self.padding_idx, self.max_norm,
--> 126             self.norm_type, self.scale_grad_by_freq, self.sparse)
    127 
    128     def extra_repr(self) -> str:

/opt/anaconda3/envs/py36/lib/python3.6/site-packages/torch/nn/functional.py in embedding(input, weight, padding_idx, max_norm, norm_type, scale_grad_by_freq, sparse)
   1812         # remove once script supports set_grad_enabled
   1813         _no_grad_embedding_renorm_(weight, input, max_norm, norm_type)
-> 1814     return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
   1815 
   1816 

IndexError: index out of range in self
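
The error itself is generic: F.embedding raises "IndexError: index out of range in self" whenever any index in the input tensor is negative or >= num_embeddings (the number of rows in the lookup table). A minimal sketch that reproduces it, independent of the code below (the numbers are illustrative):

import torch
import torch.nn as nn

# An embedding table with 5 rows: valid indices are 0..4.
embed = nn.Embedding(num_embeddings=5, embedding_dim=8)

ok = torch.tensor([0, 3, 4])
print(embed(ok).shape)          # torch.Size([3, 8]) -- works

try:
    bad = torch.tensor([2, 5])  # 5 >= num_embeddings, so it is out of range
    embed(bad)
except IndexError as e:
    print(e)                    # index out of range in self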

Code that triggers the error

  1. Data preprocessing: count the total number of words and map them to dictionaries;
sentences = ['It is a good day.','how are you?','I want to study the nn.embedding.','I want to elmate my pox.','the experience that I have done today is my favriate experience.']
sentences = [sentence.split() for sentence in sentences]
all_words = []
total_words = 0
for sentence in sentences:
    all_words += [ words for words in sentence ]
no_repeat_words = set(all_words)
total_words = len(no_repeat_words)  
word_to_idx = {word: i+1 for i, word in enumerate(no_repeat_words)}
word_to_idx['<unk>'] = 0
idx_to_word = {i+1: word for i, word in enumerate(no_repeat_words)}
print('all_words:',all_words)
print('no_repeat_words:',no_repeat_words)
print('idx_to_word:',idx_to_word)
print('word_to_idx:',word_to_idx)
print('total_words',total_words)


>>>all_words: ['It', 'is', 'a', 'good', 'day.', 'how', 'are', 'you?', 'I', 'want', 'to', 'study', 'the', 'nn.embedding.', 'I', 'want', 'to', 'elmate', 'my', 'pox.', 'the', 'experience', 'that', 'I', 'have', 'done', 'today', 'is', 'my', 'favriate', 'experience.']
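
The root cause is already visible in this preprocessing step: word_to_idx assigns the vocabulary words the indices 1..total_words and reserves 0 for '<unk>', so total_words + 1 distinct indices are in use, while the traceback shows the module was built with nn.Embedding(total_words, embedding_dim), whose valid rows are only 0..total_words-1. Feeding the largest index therefore falls outside the table. A minimal sketch of a corrected module follows; the class body is reconstructed from the traceback above (treat the exact original definition as an assumption), and it assumes the preprocessing code above has already been run:

import torch
import torch.nn as nn

class myEmbed(nn.Module):
    def __init__(self, total_words, embedding_dim):
        super().__init__()
        # total_words + 1 rows, so that index 0 ('<unk>') and index
        # total_words (the largest value word_to_idx can produce) are both valid.
        self.embed = nn.Embedding(total_words + 1, embedding_dim)

    def forward(self, sentences_idx):
        return self.embed(sentences_idx)

# Look up one sentence (ragged sentences would need padding before batching).
idx = torch.tensor([word_to_idx[w] for w in sentences[0]], dtype=torch.long)
output_emb = myEmbed(total_words, embedding_dim=8)
print(output_emb(idx).shape)    # torch.Size([5, 8]) for 'It is a good day.'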