Celery is a Python-based distributed task queue. Typical uses:
1. Tasks that should run asynchronously, e.g. time-consuming jobs
2. Periodic (scheduled) tasks
Celery workflow:
Task producer (an on-demand call or a scheduled task) ->
Message broker (e.g. RabbitMQ) ->
Task consumer: Worker ->
Result backend
Installation
Celery:
pip install celery
Or with specific extras:
pip install "celery[librabbitmq,redis,json]"
RabbitMQ:
http://blog.topspeedsnail.com/archives/4750
Using Celery
Structure it as an application
The directory layout looks like this:
mq_proj/
    __init__.py
    celery.py
    tasks.py
    config.py
celery.py:
#!/usr/bin/python
# -*- coding: utf-8 -*-
from __future__ import absolute_import, unicode_literals

from celery import Celery

app = Celery('mq_proj',
             broker='amqp://guest:guest@localhost:5672//',  # message broker
             backend='redis://localhost',                   # result backend
             include=['mq_proj.tasks'])                     # register the tasks in tasks.py

app.config_from_object('mq_proj.config')  # load Celery settings from config.py

if __name__ == '__main__':
    app.start()
tasks.py:
#!/usr/bin/python
# -*- coding: utf-8 -*-
from __future__ import absolute_import, unicode_literals

from .celery import app


@app.task
def remove(path):
    '''
    Remove a file.
    :param path: path of the file to remove
    :return: True on success
    '''
    print(path)
    # TODO: actually delete the file
    return True
config.py:
#!/usr/bin/python
# -*- coding: utf-8 -*-
import os
from datetime import timedelta

# Project root, two directories above this file. Note that plain string
# concatenation (dirname(...) + '../../') would yield a malformed path
# such as '/path/to/mq_proj../../'; os.path.join builds it correctly.
BASE_DIR = os.path.abspath(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..', '..'))

PATH_TEST = 'path_test'

CELERY_ENABLE_UTC = False                  # do not use UTC
CELERY_TIMEZONE = 'Asia/Shanghai'
CELERY_TASK_RESULT_EXPIRES = 60 * 60 * 24  # task results expire after one day
CELERYD_LOG_FILE = BASE_DIR + "/log/celery/celery.log"   # worker log path
CELERYBEAT_LOG_FILE = BASE_DIR + "/log/celery/beat.log"  # beat log path
CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']  # accepted content types; msgpack (fast) and json (cross-platform) are common choices
CELERY_RESULT_SERIALIZER = 'json'    # serializer for task results
CELERY_TASK_SERIALIZER = 'msgpack'   # serialize task messages with msgpack

CELERYBEAT_SCHEDULE = {
    'remove-every-30-seconds': {
        'task': 'mq_proj.tasks.remove',     # task to run
        'schedule': timedelta(seconds=30),  # run interval
        'args': (PATH_TEST,),               # args must be a tuple; note the trailing comma
    },
}
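One detail in the schedule entry deserves emphasis: `args` must be a tuple, and a one-element tuple needs a trailing comma. `(PATH_TEST)` is just a parenthesized string, so Celery would unpack it incorrectly when calling the task. A quick standard-library check of the entry:

```python
from datetime import timedelta

PATH_TEST = 'path_test'

entry = {
    'task': 'mq_proj.tasks.remove',
    'schedule': timedelta(seconds=30),
    'args': (PATH_TEST,),  # trailing comma makes this a 1-tuple
}

print(type((PATH_TEST)).__name__)         # str -- parentheses alone do nothing
print(type((PATH_TEST,)).__name__)        # tuple
print(entry['schedule'].total_seconds())  # 30.0
```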
Start the beat scheduler (the periodic task producer):
celery -A mq_proj beat
Start a worker process:
celery -A mq_proj worker -l info
tasks.remove will then run automatically every 30 seconds.
Django
The django-celery package lets you manage Celery tasks from the Django admin.