This setup is intended for testing purposes only, and uses the official First Steps with Celery and First Steps with Django guides as references.
Installation
Step 1. Install the EPEL repository
Use yum search epel to find the exact package name for your distribution, then use yum install to install it.
Step 2. Install Erlang
RabbitMQ requires Erlang to run. Install it with yum install erlang.
Step 3. Install RabbitMQ
RabbitMQ is the message broker we will use. Install it with yum install rabbitmq-server.
Step 4. Install Celery
Installing Celery is easy: pip install celery
Step 5. Install Django
Installing Django is just as easy: pip install django
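Putting steps 1 through 5 together, the whole install sequence on a CentOS/RHEL box might look like the following. Note that the EPEL package name used here, epel-release, is an assumption; confirm it with yum search epel on your distribution.

```shell
# Install the EPEL repository (package name may differ per distribution)
sudo yum install epel-release

# Erlang is required by RabbitMQ
sudo yum install erlang

# RabbitMQ, the message broker
sudo yum install rabbitmq-server

# Celery and Django from PyPI
sudo pip install celery django
```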
Configuration
First, set up a very basic Django project: django-admin startproject proj
Then create a new proj/proj/celery.py module that defines the Celery instance:
File: proj/proj/celery.py
from __future__ import absolute_import
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
from django.conf import settings
app = Celery('proj')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
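One detail worth noting in the module above: os.environ.setdefault only fills in DJANGO_SETTINGS_MODULE when it is not already set, so a value exported explicitly in the shell takes precedence. A quick stdlib-only sketch:

```python
import os

# Start from a clean slate so the behavior is reproducible.
os.environ.pop('DJANGO_SETTINGS_MODULE', None)

# When the variable is unset, setdefault assigns it.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
print(os.environ['DJANGO_SETTINGS_MODULE'])  # proj.settings

# When the variable is already set, setdefault leaves it alone.
os.environ['DJANGO_SETTINGS_MODULE'] = 'proj.settings_test'
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
print(os.environ['DJANGO_SETTINGS_MODULE'])  # proj.settings_test
```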
Then you need to import this app in your proj/proj/__init__.py
module. This ensures that the app is loaded when Django starts so that the @shared_task
decorator (mentioned later) will use it:
File: proj/proj/__init__.py
from __future__ import absolute_import
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
Next, create a demoapp app in your proj project: python manage.py startapp demoapp
Then create a tasks.py module in demoapp.
IMPORTANT: Make sure the file is named tasks.py. Otherwise, it will not be autodiscovered.
The tasks you write will probably live in reusable apps, and reusable apps cannot depend on the project itself, so you also cannot import your app instance directly. The @shared_task decorator lets you create tasks without having any concrete app instance:
File: demoapp/tasks.py
from __future__ import absolute_import
from celery import shared_task
@shared_task
def add(x, y):
    return x + y

@shared_task
def mul(x, y):
    return x * y

@shared_task
def xsum(numbers):
    return sum(numbers)
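The @shared_task decorator does not change what these functions return when they are called directly, so their logic can be sanity-checked without a broker. Plain-Python equivalents of the three tasks:

```python
# Plain-Python equivalents of the tasks above; calling the decorated
# versions directly (add(2, 3), not add.delay(2, 3)) behaves the same.
def add(x, y):
    return x + y

def mul(x, y):
    return x * y

def xsum(numbers):
    return sum(numbers)

print(add(2, 3))        # 5
print(mul(4, 5))        # 20
print(xsum([1, 2, 3]))  # 6
```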
Next, configure the broker and result backend for Celery. Since we have already told Celery to read its configuration from Django's settings (proj/proj/settings.py), add the following to that file:
BROKER_URL = 'amqp://guest:guest@localhost:5672//'
CELERY_RESULT_BACKEND = 'amqp'
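The BROKER_URL above follows the usual URL layout amqp://user:password@host:port/vhost; the trailing // means the default RabbitMQ virtual host "/". A stdlib-only sketch of how the pieces decompose:

```python
from urllib.parse import urlsplit

# Break the broker URL into its components.
url = urlsplit('amqp://guest:guest@localhost:5672//')
print(url.scheme)                  # amqp
print(url.username, url.password)  # guest guest
print(url.hostname, url.port)      # localhost 5672
print(url.path)                    # //  (the default vhost "/")
```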
Test
RabbitMQ
Start RabbitMQ in the background: rabbitmq-server -detached
Stop RabbitMQ: rabbitmqctl stop
Check RabbitMQ's status: rabbitmqctl status
Celery
Start a Celery worker process: celery -A proj worker -l info
IMPORTANT: To test your @shared_task functions in tasks.py, you must use python manage.py shell to start the Python shell. That loads the configuration in proj/proj/__init__.py.
If you start a plain shell with python instead, proj/proj/__init__.py is never imported, so @shared_task will not function properly. For example, calling .ready() on a task result will raise AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'.
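Assuming the worker and RabbitMQ are both running, an end-to-end check from python manage.py shell might look like the following transcript (ready() may return False at first if the worker has not finished yet):

```
>>> from demoapp.tasks import add
>>> result = add.delay(4, 4)
>>> result.ready()       # True once the worker has processed the task
>>> result.get(timeout=10)
8
```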