Using Celery (with Redis)
A brief introduction to Celery
Celery is a full-featured, plug-and-play asynchronous task queue system. It is suited to asynchronous workloads: time-consuming operations such as sending email, uploading files, or processing images can be executed asynchronously so that users don't have to wait, improving the user experience.
Celery's architecture has three parts: a message broker, task execution units (workers), and a task result store.
A single Celery system can contain many workers and brokers.
Celery itself does not implement a message queue, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and Amazon SQS (MongoDB was only supported as a broker in older versions).
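The flow between the three parts, using the Redis databases this post uses throughout (db 14 as the broker, db 15 as the result store):

caller --task.delay()--> broker (Redis db 14) --> worker executes the task --> result store (Redis db 15)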
Installing Celery
pip install celery
# version 5.1.2 is installed here
# on Windows, the default prefork pool doesn't work, so install gevent to use as the execution pool
pip install gevent
Basic usage
test.py
from celery import Celery

# broker is the connection URL of the task queue
# backend is the connection URL of the result store
app = Celery('test', broker='redis://127.0.0.1:6379/14', backend='redis://127.0.0.1:6379/15')

@app.task
def ces():
    print('this is a test')
    return 'test result'
Starting the worker
celery -A test worker -l info
# on Windows, specify the gevent pool (note that P is uppercase)
celery -A test worker -l info -P gevent
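On Windows, the solo pool is a possible alternative to gevent; it executes one task at a time:

celery -A test worker -l info -P solo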
run.py (don't name this file celery.py, or it will shadow the celery package that test.py imports)
from test import ces
# enqueue the task; this returns immediately
ces.delay()
Running run.py sends the task to the broker; the worker picks it up and executes it.
Using Celery as a package
Directory layout
celerytest/
├── config.py      # configuration file
├── __init__.py
├── main.py        # main program
└── test/          # a module containing tasks
    └── tasks.py   # the task file; the name must stay tasks.py so autodiscover_tasks can find it
config.py
# connection URL of the task queue (Celery 4+ lowercase setting names)
broker_url = 'redis://127.0.0.1:6379/14'
# connection URL of the result store
result_backend = 'redis://127.0.0.1:6379/15'
main.py
from celery import Celery

app = Celery('test')
# load settings from the config module
app.config_from_object('celerytest.config')
# automatically discover tasks.py in the listed packages
app.autodiscover_tasks(['celerytest.test', ])
test/tasks.py
from celerytest.main import app

# name gives the task an explicit name; if omitted, the function's dotted module path is used
@app.task(name='test_work')
def test_one():
    print('this')  # this output appears only in the terminal where the worker is running
    return 'result_one'

@app.task
def test_two():
    return 'result_two'
Starting the worker
celery -A celerytest.main worker -l info
# on Windows
celery -A celerytest.main worker -l info -P gevent
Output
-------------- celery@LAPTOP-RNBL2NMP v5.1.2 (cliffs)
--- ***** -----
-- ******* ---- Windows-10-10.0.19041-SP0 2022-06-18 11:20:25
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         test:0x20409b68208
- ** ---------- .> transport:   redis://127.0.0.1:6379/14
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 16 (gevent)  # 16 green threads ready to execute tasks
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery

[tasks]  # the list of registered tasks
. celerytest.test.tasks.test_two  # unnamed task (default dotted-path name)
. test_work                       # explicitly named task

[2022-06-18 11:20:25,334: INFO/MainProcess] Connected to redis://127.0.0.1:6379/14
[2022-06-18 11:20:25,340: INFO/MainProcess] mingle: searching for neighbors
[2022-06-18 11:20:26,381: INFO/MainProcess] mingle: all alone
[2022-06-18 11:20:26,397: INFO/MainProcess] celery@LAPTOP-RNBL2NMP ready.
Enqueueing tasks
from celerytest.test import tasks
ret1 = tasks.test_one.delay()
ret2 = tasks.test_two.delay()
print(f'ret1:{ret1}\nid:{ret1.id}\n')
print(f'ret2:{ret2}\nid:{ret2.id}')
Output
ret1:984be058-520a-4f80-b19e-d1d3dfb3a4ec
id:984be058-520a-4f80-b19e-d1d3dfb3a4ec
ret2:91f54762-9891-42bc-8b66-b514402b1885
id:91f54762-9891-42bc-8b66-b514402b1885
Worker output (in the terminal)
# task one
[2022-06-18 22:43:43,720: INFO/MainProcess] Received task: test_work[984be058-520a-4f80-b19e-d1d3dfb3a4ec]
[2022-06-18 22:43:43,727: WARNING/MainProcess] this
[2022-06-18 22:43:43,731: INFO/MainProcess] Task test_work[984be058-520a-4f80-b19e-d1d3dfb3a4ec] succeeded in 0.0s: 'result_one'
# task two
[2022-06-18 22:43:43,732: INFO/MainProcess] Received task: celerytest.test.tasks.test_two[91f54762-9891-42bc-8b66-b514402b1885]
[2022-06-18 22:43:43,734: INFO/MainProcess] Task celerytest.test.tasks.test_two[91f54762-9891-42bc-8b66-b514402b1885] succeeded in 0.0s: 'result_two'
Values stored in the Redis result database
# keys are named celery-task-meta-<task id>
# the result of task two
127.0.0.1:6379[15]> get celery-task-meta-91f54762-9891-42bc-8b66-b514402b1885
{
"status": "SUCCESS",
"result": "result_two",
"traceback": null,
"children": [],
"date_done": "2022-06-18T14:43:43.733130",
"task_id": "91f54762-9891-42bc-8b66-b514402b1885"
}
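For illustration, the stored value can also be read back directly with the redis-py client. A minimal sketch, assuming the redis package is installed and the task id from above:

import json
import redis

# connect to the result database (db 15)
r = redis.Redis(host='127.0.0.1', port=6379, db=15)
raw = r.get('celery-task-meta-91f54762-9891-42bc-8b66-b514402b1885')
meta = json.loads(raw)
print(meta['status'], meta['result'])  # SUCCESS result_two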
Retrieving the result
Note: waiting for the result turns the asynchronous call back into a synchronous one.
Method 1: poll ready() until the task has finished
from time import sleep
from celerytest.test import tasks

ret = tasks.test_one.delay()
print(f'finished: {ret.ready()}')
while not ret.ready():
    sleep(0.5)
print(f'finished: {ret.ready()}')
print(ret.get())
Method 2: build an AsyncResult from the task id
from celery.result import AsyncResult
from celerytest.test import tasks
from celerytest.main import app

ret1 = tasks.test_one.delay()
async_task = AsyncResult(id=ret1.id, app=app)
print(async_task.successful())  # whether the task succeeded (True/False)
print(async_task.get())         # the task's return value
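Note that get() itself blocks until the result arrives, so the polling loop in method 1 is optional. get() also accepts a timeout so the caller doesn't wait forever:

# raises celery.exceptions.TimeoutError if no result arrives within 10 seconds
print(async_task.get(timeout=10))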
Passing arguments
Just pass the arguments to delay().
test/tasks.py
from celerytest.main import app

@app.task
def test(num1, num2):
    return num1 + num2
run.py
from celerytest.test import tasks
tasks.test.delay(5, 6)
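delay() is a shortcut for apply_async(), which additionally accepts execution options. A small sketch using the standard countdown option:

from celerytest.test import tasks

# same as tasks.test.delay(5, 6), but the task runs no earlier than 10 seconds from now
ret = tasks.test.apply_async(args=(5, 6), countdown=10)
print(ret.get())  # 11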
Scheduled tasks with Celery
Note: scheduled execution does not work on Windows.
Directory layout (same as above)
celerytest/
├── config.py      # configuration file
├── __init__.py
├── main.py        # main program
└── test/          # a module containing tasks
    └── tasks.py   # the task file; the name must stay tasks.py
schedule
config.py
from datetime import timedelta

# connection URL of the task queue
broker_url = 'redis://127.0.0.1:6379/14'
# connection URL of the result store
result_backend = 'redis://127.0.0.1:6379/15'

beat_schedule = {
    'test': {                                      # name of this schedule entry
        'task': 'celerytest.test.tasks.test_two',  # the task, by its registered name
        'schedule': timedelta(seconds=30),         # run every 30 seconds
        'args': ()                                 # positional arguments for the task
    }
}
crontab
You can also use crontab, but then the timezone must be set.
config.py
from celery.schedules import crontab

# connection URL of the task queue
broker_url = 'redis://127.0.0.1:6379/14'
# connection URL of the result store
result_backend = 'redis://127.0.0.1:6379/15'

# crontab entries are interpreted in this timezone
timezone = 'Asia/Shanghai'

beat_schedule = {
    'crontab_test': {
        'task': 'celerytest.test.tasks.test_two',
        'schedule': crontab(minute=0, hour=12),  # every day at 12:00
        'args': ()
    }
}
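A few more crontab patterns for reference; all of these use standard crontab arguments:

from celery.schedules import crontab

crontab()                                  # every minute
crontab(minute='*/15')                     # every 15 minutes
crontab(minute=30, hour=7, day_of_week=1)  # every Monday at 7:30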
Running
Note: once a beat_schedule is configured, start the worker with the -B flag so an embedded beat scheduler runs alongside it.
celery -A celerytest.main worker -l info -B
# on Windows
celery -A celerytest.main worker -l info -B -P gevent
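The -B flag embeds the scheduler in the worker process, which is convenient for development; the Celery documentation recommends running beat as its own process in production:

celery -A celerytest.main worker -l info
celery -A celerytest.main beat -l info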