Python Study Notes_25: Celery
1、Environment setup on Linux
a、Upgrade the Python version
tar -zxvf Python-3.5.2.tgz
cd Python-3.5.2
./configure
make
make install
Point the default python at the new build:
mv /usr/bin/python /usr/bin/python_bak
ln -s /usr/local/bin/python3.5 /usr/bin/python
b、Install pip
tar -zxvf pip-1.5.4.tar.gz
cd pip-1.5.4
python setup.py install
python -m pip install --upgrade pip
c、Install the celery and redis modules
pip install celery
pip install redis
Note: Celery does not support Python versions below 2.7. Upgrade Python first, and after upgrading Python, upgrade the matching pip as well.
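Since the note above pins a minimum interpreter version, a small guard at the top of a script can fail fast on an unsupported interpreter (a minimal sketch, not part of the original scripts):

```python
import sys

# Abort early if the interpreter predates Celery's supported floor (2.7, per the note above).
if sys.version_info < (2, 7):
    raise RuntimeError("Celery needs Python 2.7+; found %s" % sys.version.split()[0])
print("Python %d.%d is new enough" % sys.version_info[:2])
```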
2、Starting a worker
First create a file named test1.py:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time    : 2018\5\27 0027 17:12
# @Author  : xiexiaolong
# @File    : test1.py
from celery import Celery

broker = "redis://:myredis@193.112.92.65:6379/2"
backend = "redis://:myredis@193.112.92.65:6379/3"

app = Celery("demon1", broker=broker, backend=backend)

@app.task
def add(x, y):
    return x + y
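The broker and backend strings follow the Redis URL pattern redis://:password@host:port/db_number. A quick sketch with the standard library shows how the sample URL above decomposes:

```python
from urllib.parse import urlsplit

# Decompose the sample broker URL from test1.py
parts = urlsplit("redis://:myredis@193.112.92.65:6379/2")
print(parts.password)          # myredis  (the Redis AUTH password)
print(parts.hostname)          # 193.112.92.65
print(parts.port)              # 6379
print(parts.path.lstrip("/"))  # 2  (the Redis database number)
```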
Start the worker:
celery -A test1 worker -l info
If everything is in place, the log shows the worker starting successfully.
Then write Python code that calls the task:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time    : 2018\5\28 0028 1:04
# @Author  : xiexiaolong
# @File    : test.py
import time

from test1 import add

re = add.delay(10, 20)
time.sleep(5)
print(re.result)
print(re.status)
Result:
D:\python\venv\Scripts\python.exe D:/python/0527/test.py
30
SUCCESS
Process finished with exit code 0
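The fixed time.sleep(5) above is a crude way to wait; with Celery you can call re.get(timeout=5) instead. The underlying idea is just polling until a result appears, which can be sketched in plain Python (no broker needed for this sketch):

```python
import time

def wait_for(poll, timeout=5.0, interval=0.1):
    """Poll `poll()` until it returns a non-None value or `timeout` elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        value = poll()
        if value is not None:
            return value
        time.sleep(interval)
    raise TimeoutError("no result within %.1fs" % timeout)

# Simulate a task result that only becomes available after a short delay.
start = time.monotonic()
result = wait_for(lambda: 30 if time.monotonic() - start > 0.3 else None)
print(result)  # 30
```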
3、Multiple workers, multiple processes
Start several workers; when a task is dispatched, a route can be specified so that a particular worker executes it.
First create a configuration file, celeryconfig.py, that defines the Redis connection and the queue/routing information:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time    : 2018\5\28 0028 1:01
# @Author  : xiexiaolong
# @File    : celeryconfig.py
from kombu import Exchange, Queue

BROKER_URL = "redis://:myredis@193.112.92.65:6379/2"
CELERY_RESULT_BACKEND = "redis://:myredis@193.112.92.65:6379/3"

# Define the queues
CELERY_QUEUES = (
    Queue("default", Exchange("default"), routing_key="default"),
    Queue("for_task_A", Exchange("for_task_A"), routing_key="for_task_A"),
    Queue("for_task_B", Exchange("for_task_B"), routing_key="for_task_B"),
)

# Route taskA to the for_task_A queue and taskB to the for_task_B queue
CELERY_ROUTES = {
    'tasks.taskA': {"queue": "for_task_A", "routing_key": "for_task_A"},
    'tasks.taskB': {"queue": "for_task_B", "routing_key": "for_task_B"},
}
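When a task is published, CELERY_ROUTES is consulted by task name; anything without an entry falls back to the default queue (named "celery" unless configured otherwise). The effective dispatch decision is essentially a dict lookup, sketched here in plain Python with the queue names from the config above:

```python
CELERY_ROUTES = {
    'tasks.taskA': {"queue": "for_task_A", "routing_key": "for_task_A"},
    'tasks.taskB': {"queue": "for_task_B", "routing_key": "for_task_B"},
}

def queue_for(task_name, default="celery"):
    """Return the queue a task will be published to, falling back to the default queue."""
    return CELERY_ROUTES.get(task_name, {}).get("queue", default)

print(queue_for("tasks.taskA"))  # for_task_A
print(queue_for("tasks.add"))    # celery (no route defined, so the default queue)
```

This is also why, later in this post, a task with no route entry sits unexecuted until a worker consuming the default queue is started.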
Then create tasks.py:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time    : 2018\5\28 0028 1:01
# @Author  : xiexiaolong
# @File    : tasks.py
from celery import Celery

app = Celery()
app.config_from_object("celeryconfig")

@app.task
def taskA(x, y):
    return x + y

@app.task
def taskB(x, y, z):
    return x + y + z

@app.task
def add(x, y):
    return x + y
Then start two workers:
celery -A tasks worker -l info -n workerA.%h -Q for_task_A
celery -A tasks worker -l info -n workerB.%h -Q for_task_B
Run test.py:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time    : 2018\5\28 0028 1:04
# @Author  : xiexiaolong
# @File    : test.py
from tasks import taskA, taskB, add

re1 = taskA.delay(10, 20)
print(re1.result)

re2 = taskB.delay(100, 200, 300)
print(re2.result)

re3 = add.delay(1000, 2000)
print(re3.result)
Result:
D:\python\venv\Scripts\python.exe D:/python/0527/test.py
30
600
None
Process finished with exit code 0
Analysis: the third result is None because no worker executed the task. Only workerA and workerB were started, and neither consumes the default queue. Start the default worker:
celery -A tasks worker -l info -n worker.%h -Q celery
Then run test.py again.
Result:
D:\python\venv\Scripts\python.exe D:/python/0527/test.py
30
600
3000
Process finished with exit code 0
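As an alternative to starting a worker on the built-in "celery" queue, unrouted tasks can be pointed at the "default" queue that celeryconfig.py already declares, using Celery's old-style default-queue settings (a hedged config sketch matching the setting style used above; a worker would then be started with -Q default):

```python
# Send tasks with no CELERY_ROUTES entry (e.g. tasks.add) to the declared "default" queue
# instead of the built-in "celery" queue.
CELERY_DEFAULT_QUEUE = "default"
CELERY_DEFAULT_EXCHANGE = "default"
CELERY_DEFAULT_ROUTING_KEY = "default"
```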
4、Celery and scheduled tasks
Add scheduled tasks to celeryconfig.py; they will then run automatically at the specified intervals:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time    : 2018\5\28 0028 1:01
# @Author  : xiexiaolong
# @File    : celeryconfig.py
from kombu import Exchange, Queue

BROKER_URL = "redis://:myredis@193.112.92.65:6379/2"
CELERY_RESULT_BACKEND = "redis://:myredis@193.112.92.65:6379/3"

# Define the queues
CELERY_QUEUES = (
    Queue("default", Exchange("default"), routing_key="default"),
    Queue("for_task_A", Exchange("for_task_A"), routing_key="for_task_A"),
    Queue("for_task_B", Exchange("for_task_B"), routing_key="for_task_B"),
)

# Route taskA to the for_task_A queue and taskB to the for_task_B queue
CELERY_ROUTES = {
    'tasks.taskA': {"queue": "for_task_A", "routing_key": "for_task_A"},
    'tasks.taskB': {"queue": "for_task_B", "routing_key": "for_task_B"},
}

CELERY_TIMEZONE = 'UTC'

CELERYBEAT_SCHEDULE = {
    'taskA_schedule': {
        'task': 'tasks.taskA',
        'schedule': 2,
        'args': (5, 6),
    },
    'taskB_schedule': {
        'task': 'tasks.taskB',
        'schedule': 20,
        'args': (10, 20, 30),
    },
    'add_schedule': {
        'task': 'tasks.add',
        'schedule': 5,
        'args': (1, 2),
    },
}
Analysis: 'schedule': 2 means the task fires every 2 seconds.
A beat process must be running to dispatch the scheduled tasks:
celery -A tasks beat
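Plain integers in CELERYBEAT_SCHEDULE are interpreted as seconds. The schedule field also accepts timedelta objects and cron-style schedules via celery.schedules.crontab; a hedged variant of the config above (the entry names here are illustrative):

```python
from datetime import timedelta
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    # Equivalent to 'schedule': 2, but explicit about the unit
    'taskA_schedule': {
        'task': 'tasks.taskA',
        'schedule': timedelta(seconds=2),
        'args': (5, 6),
    },
    # Cron-style: every day at 02:30 (in CELERY_TIMEZONE)
    'taskB_nightly': {
        'task': 'tasks.taskB',
        'schedule': crontab(hour=2, minute=30),
        'args': (10, 20, 30),
    },
}
```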