Celery with a Redis Sentinel broker, integrated with Django

1. prj.py holds the Celery configuration

Note:

Some tasks take a long time to run. Even with several worker processes listening on a single queue, if every incoming task happens to be long-running, all of the processes get tied up and urgent tasks cannot be picked up. This walkthrough therefore sets up separate queues, each consumed by its own worker processes.

# _*_ coding:utf-8 _*_
import os

from celery import Celery
from kombu import Exchange, Queue

os.environ["DJANGO_SETTINGS_MODULE"] = "settings"  # adjust to your project's settings module

app = Celery("mycelery")  # Sentinel (HA) setup
username = "redis"  # Redis ACL username
password = "m-SDFdfle90IUD&)+U"  # Redis auth password

# Broker: list every Sentinel node, separated by semicolons
app.conf.broker_url = 'sentinel://{usr}:{pwd}@127.0.0.1:9111;sentinel://{usr}:{pwd}@127.0.0.1:9222;sentinel://{usr}:{pwd}@127.0.0.1:9333'.format(
    usr=username,
    pwd=password,
)
app.conf.broker_transport_options = {'master_name': "mymaster", 'visibility_timeout': 43200}

# Result backend: where task state and return values are stored
app.conf.result_backend = 'sentinel://{usr}:{pwd}@127.0.0.1:9111/0;sentinel://{usr}:{pwd}@127.0.0.1:9222/0;sentinel://{usr}:{pwd}@127.0.0.1:9333/0'.format(
    usr=username,
    pwd=password,
)

app.conf.result_backend_transport_options = {
    'master_name': "mymaster",
    'retry_policy': {
        'timeout': 5.0
    }
}


app.conf.timezone = 'Asia/Shanghai'
app.conf.enable_utc = True


default_exchange = Exchange('default', type='direct')
special_exchange = Exchange('special', type='direct')

app.conf.task_queues = (
    Queue('default', default_exchange, routing_key='default'),
    Queue('special', special_exchange, routing_key='special')
)
# Default queue configuration
app.conf.task_default_queue = 'default'
app.conf.task_default_exchange = 'default'
app.conf.task_default_routing_key = 'default'

# Anything without an explicit route falls back to the default queue
app.conf.task_routes = {
    'special.tasks.*': {'queue': 'special'},  # tasks in special/tasks.py go to the special queue
}

# Auto-discover tasks registered in each Django app's tasks.py
app.autodiscover_tasks()
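
For the standard Django wiring described in the docs linked in step 8, the project package's __init__.py should import this app so it is loaded when Django starts and @shared_task binds to it. A minimal sketch, assuming the project package is named myprj and the file above lives at myprj/prj.py (both names are assumptions of this example):

# myprj/__init__.py (hypothetical path)
# Load the Celery app when Django starts so @shared_task uses it.
from .prj import app as celery_app

__all__ = ("celery_app",)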


2. /myapp/tasks.py — task for the Django app myapp

# -*- coding:utf-8 -*-
import time

from celery import shared_task


@shared_task
def add(x, y):
    # Simulate a short unit of work
    time.sleep(1)
    print(x, "add", y, x + y)
    return x + y

3. /special/tasks.py — task for the Django app special

# -*- coding:utf-8 -*-
import time

from celery import shared_task


@shared_task
def special_add(x, y):
    # Simulate a short unit of work
    time.sleep(1)
    print(x, "special_add", y, x + y)
    return x + y

4. Start two worker processes, one for each queue

celery -A prj worker -l info -Q default -n default@%h
celery -A prj worker -l info -Q special -n special@%h

You can start more than one worker per queue if needed.
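
To confirm which queues each running worker is actually consuming from, Celery's built-in inspect command can be used (assuming the app module is prj, as in step 1):

celery -A prj inspect active_queues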

5. Running with the eventlet pool

celery -A prj worker -l info -Q default -P eventlet -c 500 -n node3@%h

eventlet must be installed first:
pip3 install eventlet
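
The eventlet pool runs tasks as green threads, so -c 500 means up to 500 concurrent greenlets; this pays off for I/O-bound tasks rather than CPU-bound ones. A hypothetical example of such a task (fetch_url and the requests dependency are assumptions of this sketch, not part of the project above):

# -*- coding:utf-8 -*-
import requests  # assumption: installed separately via pip

from celery import shared_task


@shared_task
def fetch_url(url):
    # Network-bound work: many of these can overlap under eventlet
    return requests.get(url, timeout=10).status_code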

6. Test running the tasks

Call add from myapp and special_add from special:

from myapp.tasks import add
from special.tasks import special_add

add.delay(2, 3)
special_add.delay(2, 3)

# Waiting for the result asynchronously, with a progress callback:
def on_raw_message(body):
    print(body)
    print("*****process********")

result = add.delay(2, 3)
print("******start*******")
print(result.get(on_message=on_raw_message, propagate=False))
print("********wait end********")

7. Inspecting worker logs

To be added.
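
In the meantime, when the workers run under supervisord (steps 9 and 10), their output goes to the stdout log files configured there, which can simply be tailed:

tail -f /var/logs/test_celery.log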

8. Reference: integrating Celery with Django

https://docs.celeryproject.org/en/stable/django/first-steps-with-django.html#using-celery-with-django

9. supervisord config: multiple processes consuming the default queue

[program:test_celery]
directory=/var/www/myprj
command=/usr/local/python3/bin/celery -A prj worker -l INFO -Q default -P eventlet -c 500 -n default_%(process_num)02d@%%h
autostart=true
startsecs=5
user=test
numprocs=3
process_name=test_celery_%(process_num)02d
redirect_stderr=true
stdout_logfile_maxbytes=50MB
stdout_logfile_backups=20
stdout_logfile=/var/logs/test_celery.log

10. supervisord config: multiple processes consuming the special queue

[program:test_celery_special]
directory=/var/www/myprj
command=/usr/local/python3/bin/celery -A prj worker -l INFO -Q special -P eventlet -c 500 -n special_%(process_num)02d@%%h
autostart=true
startsecs=5
user=test
numprocs=3
process_name=test_celery_special_%(process_num)02d
redirect_stderr=true
stdout_logfile_maxbytes=50MB
stdout_logfile_backups=20
stdout_logfile=/var/logs/test_celery_special.log
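
After adding or changing these program sections, supervisord has to pick them up; the usual sequence is:

supervisorctl reread
supervisorctl update
supervisorctl status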

11. Fetching a task result by its task ID

# -*- coding: utf-8 -*-
import os
import sys
import time

p = os.path.abspath(os.path.dirname(__file__))
sys.path.insert(0, os.path.split(p)[0])

from celery.result import AsyncResult
from prj import app  # noqa: F401 - loads the configured Celery app so AsyncResult can reach the backend
from myapp.tasks import add


def run_task():
    "Kick off a task and hand its ID back to the calling application."
    async_result = add.delay(2, 3)
    return async_result.task_id


def get_task_result(task_id):
    "Let an external application look up the result by task ID."
    result = AsyncResult(task_id)

    return dict(
        status=result.status,  # SUCCESS | FAILURE | PENDING
        data=result.result,
        traceback=result.traceback,
    )


if __name__ == "__main__":
    # Test: poll until the task leaves the PENDING state
    task_id = run_task()

    result = get_task_result(task_id)

    while result["status"] == "PENDING":
        result = get_task_result(task_id)
        print(result)
        time.sleep(1)

    print(result)
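
To hand results to a browser or another service, a thin Django view can wrap the same lookup; a minimal sketch (the view name and module are assumptions of this example):

# views.py (hypothetical) - poll task state over HTTP
from celery.result import AsyncResult
from django.http import JsonResponse


def task_status(request, task_id):
    result = AsyncResult(task_id)
    return JsonResponse({
        "status": result.status,  # SUCCESS | FAILURE | PENDING
        "data": result.result if result.successful() else None,
        "traceback": result.traceback,
    })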
