Successfully resolving gensim\utils.py:1209: UserWarning: detected Windows; aliasing chunkize to chunkize_serial

Contents

Problem

Analysis

Solution


 

 

Problem

gensim\utils.py:1209: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
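
On an affected gensim version the warning appears as soon as gensim is imported on Windows. A minimal reproduction sketch (nothing beyond the import is needed; the path and line number will match your own installation):

    # On Windows, importing gensim is enough to trigger the warning:
    #   gensim\utils.py:1209: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
    import gensim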

 

Analysis

gensim\utils.py:1209: UserWarning: Windows was detected, so chunkize is being aliased to chunkize_serial. On Windows gensim cannot use its multiprocessing-based chunkize helper, so it silently falls back to the single-process implementation chunkize_serial; everything still works, just without that particular parallel optimization.
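
To illustrate what the aliasing means in practice, a small sketch (the chunk sizes are just an example): on Windows, gensim's chunkize falls back to the serial implementation, so it behaves exactly like chunkize_serial.

    from gensim import utils

    # chunkize splits an iterable into chunks of `chunksize` items; on Windows
    # it is the serial fallback, i.e. the same behaviour as utils.chunkize_serial.
    chunks = list(utils.chunkize(range(10), chunksize=4))
    print(chunks)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]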

 

 

Solution

This message is only a warning; even if you do nothing about it, it will not affect your code. If you want to get rid of it, update gensim to the latest version, or filter the warning before importing gensim (see the sketch below). The line that emits it is:
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
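
A minimal sketch of both options (the pip command and the filter pattern are the standard ones, not specific to this article; in the affected versions the warning is emitted when gensim is first imported, so the filter has to be installed before that import):

    # Option 1: upgrade gensim, e.g.  pip install --upgrade gensim
    # Option 2: silence just this warning before the first import of gensim.
    import warnings

    warnings.filterwarnings(
        'ignore',
        message='detected Windows; aliasing chunkize to chunkize_serial',
        category=UserWarning,
    )

    import gensim  # the UserWarning is no longer printed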

 

 

 

A related question from the comments: the same warning shows up, immediately followed by a multiprocessing pickling error.

    def parse_corpus(infile, outfile):
        '''parse the corpus of the infile into the outfile'''
        space = ' '
        i = 0

        def tokenize(text):
            return [lemma(token) for token in text.split()]

        with open(outfile, 'w', encoding='utf-8') as fout:
            # wiki = WikiCorpus(infile, lemmatize=False, dictionary={})  # WikiCorpus, gensim's Wikipedia corpus class
            wiki = WikiCorpus(infile, tokenizer_func=tokenize, dictionary={})  # WikiCorpus, gensim's Wikipedia corpus class
            for text in wiki.get_texts():
                fout.write(space.join(text) + '\n')
                i += 1
                if i % 10000 == 0:
                    logger.info('Saved ' + str(i) + ' articles')

The error:

    D:\软件\python\lib\site-packages\gensim\utils.py:1333: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
      warnings.warn("detected %s; aliasing chunkize to chunkize_serial" % entity)
    Traceback (most recent call last):
      File "D:\pythonFiles\图灵\Python_project\self_learn\大语言模型\WikiExtractor.py", line 52, in <module>
        parse_corpus(infile, outfile)
      File "D:\pythonFiles\图灵\Python_project\self_learn\大语言模型\WikiExtractor.py", line 29, in parse_corpus
        for text in wiki.get_texts():
      File "D:\软件\python\lib\site-packages\gensim\corpora\wikicorpus.py", line 693, in get_texts
        for tokens, title, pageid in pool.imap(_process_article, group):
      File "D:\软件\python\lib\multiprocessing\pool.py", line 870, in next
        raise value
      File "D:\软件\python\lib\multiprocessing\pool.py", line 537, in _handle_tasks
        put(task)
      File "D:\软件\python\lib\multiprocessing\connection.py", line 211, in send
        self._send_bytes(_ForkingPickler.dumps(obj))
      File "D:\软件\python\lib\multiprocessing\reduction.py", line 51, in dumps
        cls(buf, protocol).dump(obj)
    AttributeError: Can't pickle local object 'parse_corpus.<locals>.tokenize'

How can this be fixed?
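
One likely fix, as a minimal sketch (assuming gensim 4.x, where WikiCorpus accepts tokenizer_func; the tokenizer body and the file names below are placeholders, not the original lemma-based code): get_texts() hands articles to worker processes, and on Windows multiprocessing has to pickle the tokenizer. A function defined inside parse_corpus cannot be pickled, which is exactly what the AttributeError says, so define the tokenizer at module level. Note that gensim calls tokenizer_func with four arguments: (text, token_min_len, token_max_len, lower).

    import logging

    from gensim.corpora import WikiCorpus

    logger = logging.getLogger(__name__)

    # Module-level tokenizer: multiprocessing can pickle top-level functions,
    # unlike the nested parse_corpus.<locals>.tokenize from the traceback.
    def tokenize(text, token_min_len=2, token_max_len=15, lower=True):
        if lower:
            text = text.lower()
        # Placeholder tokenization; plug lemmatization back in here if needed.
        return [tok for tok in text.split() if token_min_len <= len(tok) <= token_max_len]

    def parse_corpus(infile, outfile):
        '''Parse the Wikipedia dump in infile into plain text in outfile.'''
        wiki = WikiCorpus(infile, tokenizer_func=tokenize, dictionary={})
        with open(outfile, 'w', encoding='utf-8') as fout:
            for i, text in enumerate(wiki.get_texts(), start=1):
                fout.write(' '.join(text) + '\n')
                if i % 10000 == 0:
                    logger.info('Saved %d articles', i)

    if __name__ == '__main__':
        # The __main__ guard matters on Windows: multiprocessing workers
        # re-import this module, and the guard keeps them from re-running it.
        logging.basicConfig(level=logging.INFO)
        parse_corpus('enwiki-latest-pages-articles.xml.bz2', 'wiki_texts.txt')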