Using Celery
- A more robust way to use Celery is to organize it as a Python package. Create a package for the Celery service, tentatively named proj. The directory layout is as follows:
[root@shanwu proj]# tree
.
├── celery.py    # creates the Celery instance
├── config.py    # configuration file
├── __init__.py  # package init
└── tasks.py     # task functions
- Write tasks.py:
#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
from proj.celery import app

@app.task
def add(x, y):
    return x + y
- Write config.py:
#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
CELERY_RESULT_BACKEND = 'redis://192.168.238.144:6379/2'
BROKER_URL = 'redis://192.168.238.144:6379/1'
- Write celery.py:
#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
from celery import Celery

app = Celery('proj', include=['proj.tasks'])
app.config_from_object('proj.config')

if __name__ == '__main__':
    app.start()
This time, no broker or backend is passed directly when the app is created; both are loaded from the configuration file via config_from_object.
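For comparison, the same app could also be created with the connection URLs passed directly, without a separate config module (a minimal sketch reusing the Redis URLs from config.py above):
from celery import Celery

# equivalent setup with broker/backend inlined (illustrative only)
app = Celery('proj',
             broker='redis://192.168.238.144:6379/1',
             backend='redis://192.168.238.144:6379/2',
             include=['proj.tasks'])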
- Write a script that calls add:
from proj.tasks import add

r = add.delay(2, 2)
print(r.ready())   # False until the worker has finished the task
print(r.result)    # None until the result is available
print(r.get())     # blocks until the result is ready, then prints 4
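These calls only return a result once a worker is consuming the queue. Assuming the parent directory of the proj package is the current working directory, a worker can be started with a command like:
celery -A proj worker -l info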
- The final output in the worker log:
[2018-05-27 18:11:36,103: INFO/MainProcess] Received task: proj.tasks.add[f9ec0b29-cecc-4eb4-a7b0-e5f5ad8d6779]
[2018-05-27 18:11:36,114: INFO/ForkPoolWorker-1] Task proj.tasks.add[f9ec0b29-cecc-4eb4-a7b0-e5f5ad8d6779] succeeded in 0.008252049000020634s: 4
- Routing tasks to specific queues
Edit the code in tasks.py again:
from celery import Celery

app = Celery()
app.config_from_object("celeryconfig")

@app.task
def taskA(x, y):
    return x + y

@app.task
def taskB(x, y, z):
    return x + y + z

@app.task
def add(x, y):
    return x + y
- In tasks.py above, a Celery object is created first and configured from celeryconfig.py; then three tasks are defined: taskA, taskB, and add. Next, look at the celeryconfig.py file:
from kombu import Exchange, Queue

BROKER_URL = "redis://192.168.238.144:6379/0"
CELERY_RESULT_BACKEND = "redis://192.168.238.144:6379/0"

CELERY_QUEUES = (
    Queue("default", Exchange("default"), routing_key="default"),
    Queue("for_task_A", Exchange("for_task_A"), routing_key="task_a"),
    Queue("for_task_B", Exchange("for_task_B"), routing_key="task_b"),
)

CELERY_ROUTES = {
    'tasks.taskA': {"queue": "for_task_A", "routing_key": "task_a"},
    'tasks.taskB': {"queue": "for_task_B", "routing_key": "task_b"},
}
- In celeryconfig.py, the broker and result backend are set first. Then three message Queues are defined, each with its corresponding Exchange (when Redis is used as the broker, the Exchange name must be the same as the Queue name) and its routing_key value.
- Now start a worker on one host that consumes only messages from the for_task_A queue; this is specified with the -Q <queue name> option when starting the worker:
celery -A tasks worker -l info -n worker.%h -Q for_task_A
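As an aside, %h in the node name is expanded to the worker's hostname, and -Q accepts a comma-separated list if one worker should consume several queues, for example:
celery -A tasks worker -l info -n worker.%h -Q for_task_A,for_task_B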
- Then trigger taskA from another host. Change into the project directory that contains the code, start a Python shell, and run the following to launch taskA:
from tasks import *
task_A_re = taskA.delay(100,200)
[2018-05-27 18:38:59,898: INFO/MainProcess] Received task: tasks.taskA[62afb8e2-e630-408a-911c-841006f0df6e]
[2018-05-27 18:38:59,908: INFO/ForkPoolWorker-1] Task tasks.taskA[62afb8e2-e630-408a-911c-841006f0df6e] succeeded in 0.007786133000081463s: 300
[2107] 27 May 18:38:59.967 * 10 changes in 300 seconds. Saving...
[2107] 27 May 18:38:59.970 * Background saving started by pid 2815
[2815] 27 May 18:39:00.069 * DB saved on disk
[2815] 27 May 18:39:00.069 * RDB: 4 MB of memory used by copy-on-write
[2107] 27 May 18:39:00.076 * Background saving terminated with success
- After the code above runs, the taskA message is sent to the for_task_A queue immediately, and the already-running worker (worker.atsgxxx) executes taskA right away. A task can also be sent to a specific queue on a per-call basis, as the sketch below shows.
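Besides the static CELERY_ROUTES mapping, a single call can be routed explicitly with apply_async (a minimal sketch; the variable name is illustrative):
from tasks import taskA

# route this one call to for_task_A regardless of CELERY_ROUTES
result = taskA.apply_async(args=(100, 200), queue="for_task_A", routing_key="task_a")
print(result.get())   # 300 once the worker on for_task_A has run it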
- When the add task is executed, it is routed to Celery's default queue, which is named celery.
- Because celeryconfig.py does not route this task to any specific Queue, the message goes to the default Queue named celery, but no worker is consuming that queue yet. So start another worker to process tasks from the celery queue:
celery -A tasks worker -l info -n worker.%h -Q celery
Then check the state of the add result again; it changes from PENDING to SUCCESS, for example as in the snippet below.
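A quick way to observe this from a Python shell (a sketch; the result variable is illustrative):
from tasks import add

r = add.delay(1, 2)
print(r.status)   # PENDING while no worker is consuming the 'celery' queue
# ... start the second worker as shown above ...
print(r.status)   # SUCCESS
print(r.get())    # 3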
Scheduler (scheduled and periodic tasks)
- Configure config.py:
#!/usr/bin/env python
# -*- coding:utf-8 -*-
from __future__ import absolute_import
from datetime import timedelta

CELERY_RESULT_BACKEND = 'redis://192.168.238.144:6379/5'
BROKER_URL = 'redis://192.168.238.144:6379/6'
CELERY_TIMEZONE = 'Asia/Shanghai'

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'proj.tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (16, 16)
    },
}
- Start a worker with the beat scheduler embedded (-B):
celery -A proj worker -B -l info
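Here -B embeds the beat scheduler in the worker process, which is convenient for testing (the Beat lines in the output below come from it); beat can also be run as its own process, e.g.:
celery -A proj beat -l info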
Output:
[2018-05-27 18:49:15,556: INFO/ForkPoolWorker-2] Task proj.tasks.add[eb6529d8-eaa8-4079-bcbb-bed4cd5fd4c1] succeeded in 0.01149054600045929s: 32
[2018-05-27 18:49:45,446: INFO/Beat] Scheduler: Sending due task add-every-30-seconds (proj.tasks.add)
[2018-05-27 18:49:45,449: INFO/MainProcess] Received task: proj.tasks.add[f392507d-da19-4088-82c0-b3a8a39d6679]
[2018-05-27 18:49:45,452: INFO/ForkPoolWorker-2] Task proj.tasks.add[f392507d-da19-4088-82c0-b3a8a39d6679] succeeded in 0.0016302439998980844s: 32
- Celery also supports crontab-style schedules; import crontab from celery.schedules and change the schedule entry in config.py:
'schedule': crontab(hour='11', minute='5', day_of_week='sun'),
- This runs every Sunday at 11:05. The complete entry is sketched below.
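The modified config.py entry might look like this (a sketch that keeps the task, args, and entry name from above and only swaps the schedule):
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {   # same entry as before, only the schedule changes
        'task': 'proj.tasks.add',
        'schedule': crontab(hour='11', minute='5', day_of_week='sun'),
        'args': (16, 16)
    },
}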
Output:
[2018-05-27 19:05:00,011: INFO/Beat] Scheduler: Sending due task add-every-30-seconds (proj.tasks.add)
[2018-05-27 19:05:00,026: INFO/MainProcess] Received task: proj.tasks.add[38064e83-bc2e-45df-a345-9affcb6713a5]
[2018-05-27 19:05:00,035: INFO/ForkPoolWorker-2] Task proj.tasks.add[38064e83-bc2e-45df-a345-9affcb6713a5] succeeded in 0.007052731999465323s: 32