This scheme was originally meant for my Sora, but due to some issues I've decided to drop celery. Still, since people have asked how to manage multiple machines with it, here is the write-up:
Architecture:
The celery app used as the example here is myapp:
root@workgroup0:~/celeryapp# ls myapp
agent.py celery.py config.py __init__.py
root@workgroup0:~/celeryapp#
Shared code:
celery.py (note: 172.16.77.175 is the IP address of the task-publishing node):
from __future__ import absolute_import

from celery import Celery

app = Celery('myapp',
             broker='amqp://guest@172.16.77.175//',
             backend='amqp://guest@172.16.77.175//',
             include=['myapp.agent'])
app.config_from_object('myapp.config')

if __name__ == '__main__':
    app.start()
config.py:
from __future__ import absolute_import

from kombu import Queue, Exchange

CELERY_TASK_RESULT_EXPIRES = 3600
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_DEFAULT_EXCHANGE = 'agent'
CELERY_DEFAULT_EXCHANGE_TYPE = 'direct'

# one queue per worker machine, all bound to the direct 'agent' exchange
# (note: the setting must be spelled CELERY_QUEUES, not CELERT_QUEUES --
# a misspelled setting is silently ignored)
CELERY_QUEUES = (
    Queue('machine1', exchange=Exchange('agent', type='direct'), routing_key='machine1'),
    Queue('machine2', exchange=Exchange('agent', type='direct'), routing_key='machine2'),
)
__init__.py: