Celery is a simple, flexible, and reliable distributed asynchronous message/task queue written in Python; with it you can easily run tasks asynchronously.
Use cases:
1. Long-running work: hand the job to Celery so it runs in the background, then collect the return value once it finishes.
2. Periodic tasks: Celery supports task scheduling, and can distribute scheduled work across machines/processes/threads via task queues.
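Periodic scheduling like this is usually driven by Celery beat. Below is a hedged sketch of a beat schedule placed in Django settings; the task paths (`myapp.tasks.add`, `myapp.tasks.send_report`) are hypothetical examples, not part of this project:

```python
# settings.py -- illustrative Celery beat schedule (assumes the CELERY_
# settings prefix configured later in this guide).
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'myapp.tasks.add',   # hypothetical task path
        'schedule': 30.0,            # run every 30 seconds
        'args': (2, 3),
    },
    'nightly-report': {
        'task': 'myapp.tasks.send_report',      # hypothetical task path
        'schedule': crontab(hour=2, minute=0),  # every day at 02:00
    },
}
```

The schedule only fires when a beat process is running alongside the worker, e.g. `celery -A tasks beat --loglevel=info`.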
Install Celery
pip install celery
Start the Celery worker
Linux:
celery -A tasks worker --loglevel=info
Windows (Celery 4+ does not support the default prefork pool on Windows, so install eventlet and use it as the pool):
pip install eventlet
celery -A tasks worker --loglevel=info -P eventlet
Configure Celery in settings.py
CELERY_BROKER_URL = 'redis://localhost:6379/3'    # message broker (Redis database 3)
CELERY_ACCEPT_CONTENT = ['json']                  # accept only JSON-serialized messages
CELERY_RESULT_BACKEND = 'db+sqlite:///result.db'  # store task results in SQLite via SQLAlchemy
CELERY_TASK_SERIALIZER = 'json'                   # serialize task payloads as JSON
Configure Celery in the project package's __init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
Create a celery.py file in the project package (next to settings.py)
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'CeleryPro.settings')
# 'CeleryPro' is the Django project name
app = Celery('CeleryPro')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
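With the files above in place, the project layout looks roughly like the following (assuming the project name CeleryPro and one app, here called myapp for illustration):

```
CeleryPro/
├── CeleryPro/
│   ├── __init__.py      # imports celery_app (see above)
│   ├── celery.py        # the file created in this step
│   └── settings.py      # holds the CELERY_* settings
└── myapp/
    └── tasks.py         # task modules found by autodiscover_tasks()
```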
Imports needed in each app's task module (tasks.py):
from __future__ import absolute_import, unicode_literals
from celery import shared_task