I had been running my Scrapy code fine until I got an error saying my module could not be found when calling `scrapy crawl <file>`. I don't remember changing anything important; the error appeared out of nowhere.
I reinstalled Scrapy, and now there is a new error:
2019-05-27 17:39:19 [scrapy.core.engine] INFO: Spider opened
Unhandled error in Deferred:
2019-05-27 17:39:19 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
File "c:\users\Me\virtual_workspace\lib\site-packages\scrapy\crawler.py", line 172, in crawl
return self._crawl(crawler, *args, **kwargs)
File "c:\users\Me\virtual_workspace\lib\site-packages\scrapy\crawler.py", line 176, in _crawl
d = crawler.crawl(*args, **kwargs)
File "c:\users\Me\virtual_workspace\lib\site-packages\twisted\internet\defer.py", line 1613, in unwindGenerator
return _cancellableInlineCallbacks(gen)
File "c:\users\Me\virtual_workspace\lib\site-packages\twisted\internet\defer.py", line 1529, in _cancellableInlineCallbacks
_inlineCallbacks(None, g, status)
--- ---
File "c:\users\Me\virtual_workspace\lib\site-packages\twisted\internet\defer.py", line 1418, in _inlineCallbacks
result = g.send(result)
File "c:\users\Me\virtual_workspace\lib\site-packages\scrapy\crawler.py", line 82, in crawl
yield self.engine.open_spider(self.spider, start_requests)
builtins.ImportError: DLL load failed: The specified module could not be found.
2019-05-27 17:39:19 [twisted] CRITICAL:
Traceback (most recent call last):
File "c:\users\Me\virtual_workspace\lib\site-packages\twisted\internet\defer.py", line 1418, in _inlineCallbacks
result = g.send(result)
File "c:\users\Me\virtual_workspace\lib\site-packages\scrapy\crawler.py", line 82, in crawl
yield self.engine.open_spider(self.spider, start_requests)
ImportError: DLL load failed: The specified module could not be found.
I checked the file directory and crawler.py is still there. Some other posts said to install pywin32, but I already had it, so I reinstalled it; that didn't help. I even copied the base constructor into my spider and it still doesn't work. Any help is appreciated.
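In case it helps, here is a small check I can run inside the same virtualenv to see which import actually raises the "DLL load failed" error (an assumption on my part: the traceback doesn't name the module, but pywin32's `pywintypes`/`win32api`/`win32file` are common culprits on Windows):

```python
def check_imports(module_names):
    """Try importing each module; return {name: True if it imported, False otherwise}."""
    results = {}
    for name in module_names:
        try:
            __import__(name)
            results[name] = True
        except ImportError:
            # "DLL load failed" surfaces here as an ImportError
            results[name] = False
    return results

# On Windows, run this in the virtualenv that Scrapy uses
# (module names are my guess, not something the traceback confirms):
print(check_imports(["pywintypes", "win32api", "win32file"]))
```

Whichever module prints `False` is the one whose DLL isn't being found.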
My simplified code:
^{pr2}$