2021-01-30

[scrapy.core.engine] ERROR: Error while obtaining start requests

2021-01-30 01:37:52 [scrapy.core.engine] ERROR: Error while obtaining start requests
Traceback (most recent call last):
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/scrapy/core/engine.py", line 129, in _next_request
    request = next(slot.start_requests)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/scrapy_redis/spiders.py", line 83, in next_requests
    data = fetch_one(self.redis_key)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/client.py", line 1329, in lpop
    return self.execute_command('LPOP', name)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/client.py", line 668, in execute_command
    return self.parse_response(connection, command_name, **options)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/client.py", line 680, in parse_response
    response = connection.read_response()
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/connection.py", line 629, in read_response
    raise response
redis.exceptions.ResponseError: MOVED 6534 192.168.1.22:6380
2021-01-30 01:37:52 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method RedisMixin.spider_idle of <ZceSpider1Spider 'zce_spider1' at 0x7f4b4c81eeb8>>
Traceback (most recent call last):
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/scrapy/utils/signal.py", line 32, in send_catch_log
    response = robustApply(receiver, signal=signal, sender=sender, *arguments, **named)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
    return receiver(*arguments, **named)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/scrapy_redis/spiders.py", line 121, in spider_idle
    self.schedule_next_requests()
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/scrapy_redis/spiders.py", line 115, in schedule_next_requests
    for req in self.next_requests():
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/scrapy_redis/spiders.py", line 83, in next_requests
    data = fetch_one(self.redis_key)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/client.py", line 1329, in lpop
    return self.execute_command('LPOP', name)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/client.py", line 668, in execute_command
    return self.parse_response(connection, command_name, **options)
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/client.py", line 680, in parse_response
    response = connection.read_response()
  File "/home/hdfs/.virtualenvs/crawlvenv/lib/python3.7/site-packages/redis/connection.py", line 629, in read_response
    raise response
redis.exceptions.ResponseError: MOVED 6534 192.168.1.22:6380
2021-01-30 01:37:52 [scrapy.core.engine] INFO: Closing spider (finished)
2021-01-30 01:37:52 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'elapsed_time_seconds': 0.020661,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2021, 1, 29, 17, 37, 52, 433352),
 'log_count/DEBUG': 4,
 'log_count/ERROR': 2,
 'log_count/INFO': 12,
 'memusage/max': 65904640,
 'memusage/startup': 65904640,
 'start_time': datetime.datetime(2021, 1, 29, 17, 37, 52, 412691)}
2021-01-30 01:37:52 [scrapy.core.engine] INFO: Spider closed (finished)

Cause: the Redis server is running in cluster mode, but the spider imported a single-node package. `redis.exceptions.ResponseError: MOVED 6534 192.168.1.22:6380` is a Redis Cluster redirection: the key's hash slot (6534) is served by node 192.168.1.22:6380, and the plain redis-py client used by scrapy_redis does not follow the redirect — it raises the reply as an error. The fix is to import the cluster-aware spider. The spider code had the wrong import:

from scrapy_redis.spiders import RedisCrawlSpider

Change it to:

from scrapy_redis_cluster.spiders import RedisCrawlSpider

and the error above goes away.
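For background, the slot number in the `MOVED` reply comes from Redis Cluster's key hashing: slot = CRC16(key) mod 16384, where CRC16 is the CCITT/XModem variant. A minimal sketch of that computation (the function names `crc16` and `key_slot` are illustrative, not from any library):

```python
def crc16(data: bytes) -> int:
    """CRC16-CCITT (XModem): poly 0x1021, init 0 -- the variant Redis Cluster uses."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF
    return crc

def key_slot(key: str) -> int:
    """Map a key to one of the 16384 cluster hash slots.

    Redis Cluster honors "hash tags": if the key contains {...},
    only the substring inside the first non-empty braces is hashed,
    so related keys can be forced onto the same slot/node.
    """
    k = key.encode()
    start = k.find(b"{")
    if start != -1:
        end = k.find(b"}", start + 1)
        if end != -1 and end != start + 1:
            k = k[start + 1:end]
    return crc16(k) % 16384
```

This is why a cluster-aware client can recover from `MOVED`: it recomputes or caches the slot-to-node map and reissues the command against the right node, while a single-node client simply surfaces the reply as a `ResponseError`.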
