Building a Personal Blog with Django --- Integrating the Celery 4.3.0 Task Queue with Routing into Django 2.1.7

Celery is a simple, flexible, and reliable distributed system for processing large volumes of messages, and it ships with the tools needed to operate and maintain such a system. It is a task queue focused on real-time processing that also supports task scheduling. This article demonstrates how to integrate Celery into a Django project and route different tasks to different queues.

Install dependencies
pip3 install celery
pip3 install django-celery
Versions
celery                             4.3.0            
django-celery                      3.3.1
Celery terminology
  • Task: a plain Python function.
  • Queue: tasks waiting to be executed are placed on a queue.
  • Worker: a separate process that pulls tasks off a queue and executes them.
  • Broker: dispatches messages between callers and workers; here it is Redis, which must be deployed in advance. (A minimal sketch tying these pieces together follows this list.)
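To make the roles concrete, here is a minimal, self-contained sketch; the module name, task, and broker URL are illustrative only and not part of this project:

# demo_tasks.py -- illustrative only, not part of the blog project
from celery import Celery

# The broker (Redis here) carries messages between callers and workers
app = Celery('demo_tasks', broker='redis://127.0.0.1:6379/0')

# A task is just a Python function registered with the app
@app.task
def add(x, y):
    return x + y

# A caller enqueues work with add.delay(2, 3); a worker started with
#   celery -A demo_tasks worker -l info
# picks the message up from the queue and runs the function.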
Project directory structure
After the steps below, the project contains a jia_celery app holding tasks.py, celery.py, and celeryconfig.py alongside the default Django app files.
Create a new app
python3 manage.py startapp jia_celery
Register the apps in the project's settings.py
INSTALLED_APPS = (
  ...
  'jia_celery',
  'djcelery',
)
Create tasks.py in the jia_celery directory
# Create your tasks here
from jia_celery.celery import app as celery_app


# Task functions
@celery_app.task
def my_task1():
    print("Task my_task1 is running...")


@celery_app.task
def my_task2():
    print("Task my_task2 is running...")


@celery_app.task
def my_task3():
    print("Task my_task3 is running...")


@celery_app.task
def my_task4():
    print("Task my_task4 is running...")
Create celery.py in the jia_celery directory
from celery import Celery
from jia_celery import celeryconfig

# Create the Celery app; configuration is attached below
app = Celery('jia_celery.tasks')

# Load settings from a separate configuration module
app.config_from_object(celeryconfig)

# Auto-discover tasks in the listed apps
app.autodiscover_tasks(['jia_celery'])
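The tasks in this tutorial do not touch the Django ORM, so the app above works without loading Django settings. If your tasks need to use models, the usual pattern is to point Celery at the settings module at the top of celery.py, before creating the app; the MyBlog.settings path below is inferred from the project name shown in the terminal output and may need adjusting:

import os

# Make Django settings available to the worker before tasks import models
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'MyBlog.settings')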
Create celeryconfig.py in the jia_celery directory
from kombu import Exchange, Queue

# Result backend
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/9'

# Broker
BROKER_URL = 'redis://127.0.0.1:6379/8'


# Task routing: map each task to a queue
CELERY_ROUTES = ({
    'jia_celery.tasks.my_task1': {'queue': 'queue1'},
    'jia_celery.tasks.my_task2': {'queue': 'queue1'},
    'jia_celery.tasks.my_task3': {'queue': 'queue2'},
    'jia_celery.tasks.my_task4': {'queue': 'queue2'},
    },
)
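The kombu import is only needed if the queues are declared explicitly. Celery creates queue1 and queue2 on demand from the routes and the worker's -Q option, but an optional explicit declaration (sketched below, appended to celeryconfig.py) documents the exchange and routing key in one place:

# Optional: declare the queues explicitly instead of relying on auto-creation
CELERY_QUEUES = (
    Queue('queue1', Exchange('queue1', type='direct'), routing_key='queue1'),
    Queue('queue2', Exchange('queue2', type='direct'), routing_key='queue2'),
)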
Run database migrations
python3 manage.py migrate
Start Redis
redis-server
Start a worker for queue1
celery -A jia_celery worker -l info -Q queue1
Arithmetic@qingjiajiadeMBP MyBlog % celery -A jia_celery worker -l info -Q queue1
 
 -------------- celery@qingjiaowosuanshujiadeMacBook-Pro.local v4.3.0 (rhubarb)
---- **** ----- 
--- * ***  * -- Darwin-19.0.0-x86_64-i386-64bit 2019-11-26 21:41:26
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         jia_celery.tasks:0x10d626940
- ** ---------- .> transport:   redis://127.0.0.1:6379/8
- ** ---------- .> results:     redis://127.0.0.1:6379/9
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> queue1           exchange=queue1(direct) key=queue1
                

[tasks]
  . jia_celery.tasks.my_task1
  . jia_celery.tasks.my_task2
  . jia_celery.tasks.my_task3
  . jia_celery.tasks.my_task4

[2019-11-26 21:41:27,005: INFO/MainProcess] Connected to redis://127.0.0.1:6379/8
[2019-11-26 21:41:27,017: INFO/MainProcess] mingle: searching for neighbors
[2019-11-26 21:41:28,043: INFO/MainProcess] mingle: all alone
[2019-11-26 21:41:28,062: INFO/MainProcess] celery@qingjiaowosuanshujiadeMacBook-Pro.local ready.
Test tasks 1 and 2
Arithmetic@qingjiajiadeMBP MyBlog % python3
Python 3.6.6 (v3.6.6:4cf1f54eb7, Jun 26 2018, 19:50:54) 
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from jia_celery.tasks import *
>>> my_task1.delay()
<AsyncResult: 7bce3af5-5d5f-4b45-b317-7c47b81c4391>
>>> my_task2.delay()
<AsyncResult: 6879f8e9-8bdc-4d47-9baf-5aa7e3c303af>
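Because CELERY_RESULT_BACKEND points at Redis, the returned AsyncResult can also be inspected; a quick sketch (these tasks return None, so get() simply yields None):

>>> result = my_task1.delay()
>>> result.ready()          # True once the worker has finished the task
>>> result.get(timeout=10)  # blocks until the result is stored in Redis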
Check the execution log of the queue1 worker
[2019-11-26 21:42:02,424: INFO/MainProcess] Received task: jia_celery.tasks.my_task1[7bce3af5-5d5f-4b45-b317-7c47b81c4391]  
[2019-11-26 21:42:02,426: WARNING/ForkPoolWorker-8] Task my_task1 is running...
[2019-11-26 21:42:02,430: INFO/ForkPoolWorker-8] Task jia_celery.tasks.my_task1[7bce3af5-5d5f-4b45-b317-7c47b81c4391] succeeded in 0.00366099600068992s: None
[2019-11-26 21:42:21,639: INFO/MainProcess] Received task: jia_celery.tasks.my_task2[6879f8e9-8bdc-4d47-9baf-5aa7e3c303af]  
[2019-11-26 21:42:21,642: WARNING/ForkPoolWorker-2] Task my_task2 is running...
[2019-11-26 21:42:21,646: INFO/ForkPoolWorker-2] Task jia_celery.tasks.my_task2[6879f8e9-8bdc-4d47-9baf-5aa7e3c303af] succeeded in 0.0050763219987857156s: None

If you dispatch my_task3 or my_task4 at this point, nothing shows up in the queue1 worker's log: those tasks are routed to queue2, and this worker only consumes queue1.
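To see tasks 3 and 4 execute, start a second worker that consumes queue2 in another terminal and dispatch the tasks the same way as before; a sketch:

celery -A jia_celery worker -l info -Q queue2

>>> from jia_celery.tasks import *
>>> my_task3.delay()
>>> my_task4.delay()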
