How can multiple Celery apps use the same Redis server as their broker?

Run multiple Celeries on a single Redis

Celery is a great tool for running asynchronous Django tasks, but it can be complicated to configure. One use case I often face is running multiple web applications on the same server, each with their own Celery daemon.

The web apps are typically completely unrelated and may be running different versions of Django, Celery and other packages in separate virtualenvs. For this reason, I also want to keep their Celery backends separated.

There are quite a few ways to do this:

+ Use a database server (SQL or NoSQL) as Celery's backend, and use separate databases for each app. This is often the default solution, but not the most effective one.
+ Use a message queue server as Celery's backend, and use separate queues for each app. You will need to choose a queue server and install it just for this purpose.
+ Use Redis as Celery's backend, and use separate Redis database numbers for each app. You will need to coordinate so that each app has a unique database number, which can be quite tiresome (see the sketch after this list).
+ Use Redis, and use the same database number but separate queue names for each app. Simple and effective! This is also nice for local development, since you usually already have a Redis server running on your laptop and it needs no extra configuration.
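
For comparison, the database-number approach would look something like this; the /1 and /2 database numbers are arbitrary examples that you would have to coordinate by hand across every app on the server:

# First app's settings.py -- hand-picked database number
BROKER_URL = 'redis://localhost:6379/1'

# Second app's settings.py -- must not collide with the first
BROKER_URL = 'redis://localhost:6379/2'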

Using Redis with separate queue names

To use a local Redis server as the Celery backend, all you need in Django's settings.py is this:

BROKER_URL = 'redis://'
CELERY_DEFAULT_QUEUE = 'myapp'

Another web application would then use a different name for the queue:

BROKER_URL = 'redis://'
CELERY_DEFAULT_QUEUE = 'anotherapp'
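
The tasks themselves need no special routing: anything dispatched with .delay() or .apply_async() and no explicit queue goes to the app's default queue. A minimal sketch, assuming a hypothetical add task:

from celery import shared_task

@shared_task
def add(x, y):
    return x + y

# No queue given, so the message lands on CELERY_DEFAULT_QUEUE
# ('myapp' in the first app, 'anotherapp' in the second).
add.delay(2, 3)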

When you run the Celery daemon processes, each just needs to know which queue it's watching:

manage.py celeryd -Q myapp
manage.py celeryd -Q anotherapp
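
The celeryd management command is the old django-celery style; on newer Celery versions the equivalent is the celery worker subcommand (proj here is a placeholder for the module that defines your Celery application):

celery -A proj worker -Q myapp
celery -A proj worker -Q anotherapp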

How is all this stored in the Redis database? List the keys (for example with the Redis KEYS * command) and you will see a new entry appear:

1) "_kombu.binding.celery"

Under that key, which is a SET, you can see the names of all the configured queues as something like this (use the Redis SMEMBERS command):

1) "celery\x06\x16\x06\x16myapp"
2) "celery\x06\x16\x06\x16anotherapp"

Whenever a new task is created, a Redis key temporarily appears for its queue:

2) "myapp"

As the Celery daemon picks it up, the key is immediately deleted so you won't usually see it unless you stop the daemon.
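
You can also inspect all of this from Python instead of redis-cli. A small sketch using the redis-py client, assuming the default local server and database 0 used by the settings above:

import redis

r = redis.Redis()  # redis://localhost:6379/0, matching BROKER_URL above

# The binding SET that kombu maintains for all known queues
print(r.smembers('_kombu.binding.celery'))

# Each queue is a plain Redis LIST named after the queue; with the
# daemon stopped, pending messages accumulate here
print(r.llen('myapp'))            # number of waiting messages
print(r.lrange('myapp', 0, -1))   # the raw message payloads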

Note that the queue names are used as top-level Redis keys without any prefix, so you should choose them wisely.

Reposted from: https://www.cnblogs.com/shengulong/p/10992921.html
