On the Scrapy + MongoDB error pymongo.errors.AutoReconnect: connection closed


Traceback (most recent call last):
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\utils\defer.py", line 161, in maybeDeferred_coro
    result = f(*args, **kw)
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\pydispatch\robustapply.py", line 55, in robustApply
    return receiver(*arguments, **named)
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\extensions\corestats.py", line 31, in spider_closed
    elapsed_time = finish_time - self.start_time
TypeError: unsupported operand type(s) for -: 'datetime.datetime' and 'NoneType'
2021-10-15 14:19:58 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'log_count/ERROR': 2, 'log_count/INFO': 9}
2021-10-15 14:19:58 [scrapy.core.engine] INFO: Spider closed (shutdown)
Unhandled error in Deferred:
2021-10-15 14:19:59 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 177, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 181, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1656, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1571, in _cancellableInlineCallbacks
    _inlineCallbacks(None, g, status)
--- <exception caught here> ---
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1445, in _inlineCallbacks
    result = current_context.run(g.send, result)
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 91, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
pymongo.errors.AutoReconnect: connection closed

2021-10-15 14:19:59 [twisted] CRITICAL: 
Traceback (most recent call last):
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1445, in _inlineCallbacks
    result = current_context.run(g.send, result)
  File "C:\Users\li.pinliang\anaconda3\lib\site-packages\scrapy\crawler.py", line 91, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
pymongo.errors.AutoReconnect: connection closed

Process finished with exit code 1

Explanations online vary, but in my case the cause was the data inside the docker mongodb container. (The TypeError about start_time at the top of the log is only a knock-on effect: engine.open_spider fails with AutoReconnect before the spider_opened signal ever fires, so corestats is left with start_time = None when the spider shuts down.)

Upgrading mongodb to the latest version made the problem go away.
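For anyone hitting the same error, a minimal sketch of a Scrapy pipeline that verifies the MongoDB connection up front may help narrow things down. This is not from the original setup: the setting names MONGO_URI and MONGO_DATABASE and the 5-second timeout are assumptions. The ping command forces one round trip, so a dead or broken mongod is reported immediately with a readable log line instead of surfacing as an unhandled AutoReconnect inside engine.open_spider.

    import pymongo
    from pymongo.errors import PyMongoError

    class MongoPipeline:
        """Stores scraped items in MongoDB; checks the connection on spider open."""

        def __init__(self, mongo_uri, mongo_db):
            self.mongo_uri = mongo_uri
            self.mongo_db = mongo_db

        @classmethod
        def from_crawler(cls, crawler):
            # MONGO_URI / MONGO_DATABASE are hypothetical setting names.
            return cls(
                mongo_uri=crawler.settings.get("MONGO_URI", "mongodb://localhost:27017"),
                mongo_db=crawler.settings.get("MONGO_DATABASE", "scrapy"),
            )

        def open_spider(self, spider):
            # Fail fast: 5 s server selection timeout instead of pymongo's 30 s default.
            self.client = pymongo.MongoClient(self.mongo_uri, serverSelectionTimeoutMS=5000)
            try:
                # "ping" forces one round trip, so an unreachable or corrupted
                # mongod is reported here rather than on the first insert.
                self.client.admin.command("ping")
            except PyMongoError as exc:
                spider.logger.error("MongoDB unreachable at %s: %s", self.mongo_uri, exc)
                raise
            self.db = self.client[self.mongo_db]

        def close_spider(self, spider):
            self.client.close()

        def process_item(self, item, spider):
            # One collection per spider; insert the item as a plain dict.
            self.db[spider.name].insert_one(dict(item))
            return item

If the ping keeps failing, inspecting the container output with docker logs and then recreating the container from a newer mongo image (which is what fixed it here) is a reasonable next step.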
