Traceback (most recent call last):
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\utils\defer.py", line 161, in maybeDeferred_coro
result = f(*args, **kw)
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\pydispatch\robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\extensions\corestats.py", line 31, in spider_closed
elapsed_time = finish_time - self.start_time
TypeError: unsupported operand type(s) for -: 'datetime.datetime' and 'NoneType'
2021-10-15 14:19:58 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'log_count/ERROR': 2, 'log_count/INFO': 9}
2021-10-15 14:19:58 [scrapy.core.engine] INFO: Spider closed (shutdown)
Unhandled error in Deferred:
2021-10-15 14:19:59 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 177, in crawl
return self._crawl(crawler, *args, **kwargs)
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 181, in _crawl
d = crawler.crawl(*args, **kwargs)
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1656, in unwindGenerator
return _cancellableInlineCallbacks(gen)
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1571, in _cancellableInlineCallbacks
_inlineCallbacks(None, g, status)
--- <exception caught here> ---
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1445, in _inlineCallbacks
result = current_context.run(g.send, result)
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 91, in crawl
yield self.engine.open_spider(self.spider, start_requests)
pymongo.errors.AutoReconnect: connection closed
2021-10-15 14:19:59 [twisted] CRITICAL:
Traceback (most recent call last):
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1445, in _inlineCallbacks
result = current_context.run(g.send, result)
File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 91, in crawl
yield self.engine.open_spider(self.spider, start_requests)
pymongo.errors.AutoReconnect: connection closed
Process finished with exit code 1
Opinions online vary widely on this error, but in my case it was caused by the data inside the Docker MongoDB instance. Upgrading MongoDB to the latest version resolved it. (The `TypeError` about `self.start_time` being `None` is just a secondary symptom: `open_spider` failed with `pymongo.errors.AutoReconnect` before the start time was ever recorded.)
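If upgrading MongoDB is not an option, `pymongo.errors.AutoReconnect` is a transient error: the driver reconnects on the next operation, so the common advice is to retry the failed call a few times with a short delay. Below is a minimal, generic retry-decorator sketch of that pattern. It uses only the standard library (and a hypothetical `flaky()` function raising `ConnectionError` as a stand-in for a Mongo operation raising `AutoReconnect`), so it runs without a live MongoDB; in a real pipeline you would pass `pymongo.errors.AutoReconnect` as the exception type.

```python
import time


def retry_on(exc_type, attempts=3, delay=0.1):
    """Retry a callable when exc_type is raised, sleeping between tries.

    Mirrors the usual handling for pymongo.errors.AutoReconnect: the
    driver reconnects on the next operation, so a short retry loop often
    recovers a transient "connection closed" failure.
    """
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for i in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except exc_type:
                    if i == attempts - 1:
                        raise  # out of retries, propagate the error
                    time.sleep(delay)
        return wrapper
    return decorator


# Hypothetical stand-in for a Mongo write: fails twice, then succeeds.
calls = {"n": 0}

@retry_on(ConnectionError, attempts=3, delay=0.01)
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("connection closed")
    return "ok"

print(flaky())  # -> ok (after two retried failures)
```

Note this only papers over transient drops; if the server is genuinely unhealthy (as with the corrupted Docker volume here), the final attempt still raises and the underlying data or version problem must be fixed.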