Problem 1: The project uses the Scrapy crawler framework together with a task-scheduling framework (APScheduler). During scheduling, the following error was reported:
Traceback (most recent call last):
File "/usr/local/python3/lib/python3.6/site-packages/apscheduler/executors/base.py", line 125, in run_job
retval = job.func(*job.args, **job.kwargs)
File "/root/pyproject/douyin/douyin/main.py", line 20, in tick_challenge
subprocess.Popen("scrapy crawl categoryVideoSpider")
File "/usr/local/python3/lib/python3.6/subprocess.py", line 707, in __init__
restore_signals, start_new_session)
File "/usr/local/python3/lib/python3.6/subprocess.py", line 1326, in _execute_child
raise child_exception_type(errno_num, err_msg)
FileNotFoundError: [Errno 2] No such file or directory: 'scrapy crawl xxxspider'
Cause: although Scrapy was installed via pip, no symlink to the scrapy executable was created, so the command could not be found at execution time.
Solution: create a symlink so that the scrapy command is on the PATH, e.g. `ln -s /usr/local/python3/bin/scrapy /usr/bin/scrapy` (the source path here is an assumption based on the Python install path in the traceback; adjust it to wherever pip actually placed the scrapy script).
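Independent of the PATH/symlink issue, the traceback also shows the command being passed to `subprocess.Popen` as a single string. On POSIX, without `shell=True`, that whole string is treated as one executable name, which by itself produces exactly the `FileNotFoundError: [Errno 2] No such file or directory: 'scrapy crawl ...'` seen above. A minimal sketch (using `echo` as a harmless stand-in for `scrapy`):

```python
import subprocess

# A plain string (without shell=True) is taken as ONE executable
# name, so Popen("scrapy crawl categoryVideoSpider") searches PATH
# for a program literally called "scrapy crawl categoryVideoSpider".
string_form_failed = False
try:
    subprocess.Popen("echo hello")  # no program named "echo hello"
except FileNotFoundError:
    string_form_failed = True

# Correct: pass the program and its arguments as a list ...
rc_list = subprocess.Popen(["echo", "hello"]).wait()

# ... or keep the single string but let the shell split it.
rc_shell = subprocess.Popen("echo hello", shell=True).wait()
```

So even with the symlink in place, the scheduler job should invoke `subprocess.Popen(["scrapy", "crawl", "categoryVideoSpider"])` (or pass `shell=True`) rather than the single-string form.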