celery+django
Commands:
celery -A dd worker -l info  # start a worker for async (queued) tasks
# On Windows, Celery needs the eventlet pool: pip install eventlet
celery -A dd worker -P eventlet --loglevel=info
celery -A dd beat -l info  # start the beat scheduler for periodic tasks
Project structure: dd/ is the Django project, celeryy/ is the Django app that holds the tasks.
Add the following to settings.py:
# Celery settings
CELERY_BROKER_URL = 'amqp://celery:password123@ip:5672/my_vhost'  # RabbitMQ broker
# CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0'  # Redis broker
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'  # store task results in Redis
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
# CELERY_RESULT_BACKEND = 'amqp'  # RabbitMQ result backend
CELERY_TIMEZONE = 'Asia/Shanghai'
Async tasks:
Create dd/celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'dd.settings')
# app = Celery('dd',broker='amqp://celery:password123@ip:5672/my_vhost')
app = Celery('dd')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
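To make sure this app is loaded when Django starts (so that @shared_task decorated tasks can find it), the standard companion to the file above is an import in the project's dd/__init__.py:

```python
# dd/__init__.py
from __future__ import absolute_import, unicode_literals

# Import the Celery app when Django starts, so shared_task
# decorators bind to it automatically.
from .celery import app as celery_app

__all__ = ('celery_app',)
```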
Periodic tasks:
Add the following block to dd/celery.py:
# periodic tasks
from celery.schedules import crontab
from datetime import timedelta  # timedelta lives in datetime, not celery.schedules

app.conf.update(
    beat_schedule={
        'sum-task': {
            'task': 'celeryy.tasks.add',
            'schedule': timedelta(seconds=20),
            'args': (7, 8),
        },
        'send-report': {
            'task': 'celeryy.tasks.report',
            'schedule': crontab(hour=4, minute=30, day_of_week=1),
        },
    }
)
Reference:
https://my.oschina.net/37Y37/blog/1920149