Scrapy "No module named _sqlite3" error

1. Symptom: running Scrapy fails with No module named _sqlite3, even though `import sqlite3` works fine in an interactive Python session.


2016-09-27 09:34:14 [scrapy] INFO: Scrapy 1.1.3 started (bot: IBDP)

2016-09-27 09:34:14 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'IBDP.spiders', 'LOG_LEVEL': 'INFO', 'CONCURRENT_REQUESTS': 1, 'SPIDER_MODULES': ['IBDP.spiders'], 'BOT_NAME': 'IBDP', 'DOWNLOAD_DELAY': 10}

2016-09-27 09:34:14 [scrapy] INFO: Enabled extensions:

['scrapy.extensions.logstats.LogStats',

 'scrapy.extensions.telnet.TelnetConsole',

 'scrapy.extensions.corestats.CoreStats']

2016-09-27 09:34:14 [scrapy] INFO: Enabled downloader middlewares:

['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',

 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',

 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',

 'scrapy.downloadermiddlewares.retry.RetryMiddleware',

 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',

 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',

 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',

 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',

 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',

 'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware',

 'scrapy.downloadermiddlewares.stats.DownloaderStats']

2016-09-27 09:34:14 [scrapy] INFO: Enabled spider middlewares:

['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',

 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',

 'scrapy.spidermiddlewares.referer.RefererMiddleware',

 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',

 'scrapy.spidermiddlewares.depth.DepthMiddleware']

2016-09-27 09:34:14 [scrapy] INFO: Enabled item pipelines:

['IBDP.pipelines.MySQLPipeline']

2016-09-27 09:34:14 [scrapy] INFO: Spider opened

Unhandled error in Deferred:

2016-09-27 09:34:14 [twisted] CRITICAL: Unhandled error in Deferred:



Traceback (most recent call last):

  File "/usr/local/python2.7/lib/python2.7/site-packages/scrapy/commands/crawl.py", line 57, in run

    self.crawler_process.crawl(spname, **opts.spargs)

  File "/usr/local/python2.7/lib/python2.7/site-packages/scrapy/crawler.py", line 163, in crawl

    return self._crawl(crawler, *args, **kwargs)

  File "/usr/local/python2.7/lib/python2.7/site-packages/scrapy/crawler.py", line 167, in _crawl

    d = crawler.crawl(*args, **kwargs)

  File "/usr/local/python2.7/lib/python2.7/site-packages/Twisted-15.2.1-py2.7-linux-x86_64.egg/twisted/internet/defer.py", line 1274, in unwindGenerator

    return _inlineCallbacks(None, gen, Deferred())

--- <exception caught here> ---

  File "/usr/local/python2.7/lib/python2.7/site-packages/Twisted-15.2.1-py2.7-linux-x86_64.egg/twisted/internet/defer.py", line 1128, in _inlineCallbacks

    result = g.send(result)

  File "/usr/local/python2.7/lib/python2.7/site-packages/scrapy/crawler.py", line 90, in crawl

    six.reraise(*exc_info)

  File "/usr/local/python2.7/lib/python2.7/site-packages/scrapy/crawler.py", line 74, in crawl

    yield self.engine.open_spider(self.spider, start_requests)

exceptions.ImportError: No module named _sqlite3

2016-09-27 09:34:14 [twisted] CRITICAL: 
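A likely explanation for the "works in python, fails in Scrapy" split is that two different interpreters are involved: the one where `import sqlite3` succeeded may not be the one Scrapy runs under. A minimal diagnostic sketch (run it with the same python that launches Scrapy; the exact paths printed will differ per machine):

```python
# Print which interpreter is running, then try to load the _sqlite3
# C extension -- the exact module the traceback above complains about.
import sys

print(sys.executable)  # the interpreter Scrapy actually uses

try:
    import _sqlite3  # compiled extension behind the sqlite3 stdlib module
    print("_sqlite3 OK:", _sqlite3.__file__)
except ImportError as exc:
    print("_sqlite3 missing:", exc)  # matches the Scrapy failure
```

If this prints "_sqlite3 missing", that interpreter was built without the SQLite development headers, which is what the fix below addresses.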



2. Solution:

Install sqlite-devel, then rebuild and reinstall Python. The root cause is that Python was compiled on a machine without the SQLite development headers, so the _sqlite3 extension was never built; recompiling after installing the headers fixes it. Note that a `./configure` run is needed before `make` so the build picks up the newly installed headers (set `--prefix` to match the existing install location, which the traceback above shows as /usr/local/python2.7):

yum install sqlite-devel

cd Python-2.7.11
./configure --prefix=/usr/local/python2.7

make

make install
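After `make install`, a quick sanity check with the rebuilt interpreter confirms the extension was compiled in, by actually exercising SQLite rather than just importing it:

```python
# Open an in-memory SQLite database and run a round-trip query.
# If _sqlite3 was compiled into this interpreter, this runs cleanly;
# otherwise the import itself raises ImportError.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (1)")
row = conn.execute("SELECT x FROM t").fetchone()
print(row[0])  # 1
conn.close()
```

Once this works, the Scrapy crawl should get past the spider-open step shown in the log.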
