Using Celery

1.1 Introduction to Celery

Celery is a simple, flexible, and reliable distributed task system written in Python for processing large numbers of tasks. It supports both real-time processing and task scheduling.

  • user: the client program, which tells Celery to execute a task.
  • broker: stores pending tasks (backed by RabbitMQ or Redis).
  • worker: executes tasks.
  • backend: stores task results.

Use cases:

  • Long-running tasks: with Celery, the task is added to the broker queue and a task ID is returned to the user immediately. Once the task is in the broker, a worker picks it up and processes it; when the task finishes, the result is written to the backend. To check the result, the user looks it up in the backend by task ID.
  • Scheduled (periodic) tasks.
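The flow above can be sketched with a toy, stdlib-only simulation. The names `submit` and `worker_step` are illustrative, not Celery APIs; the point is only the user → broker → worker → backend hand-off:

```python
import queue
import uuid

broker = queue.Queue()  # holds pending tasks
backend = {}            # maps task_id -> result

def submit(func, *args):
    # user side: enqueue the task and return a task id immediately
    task_id = str(uuid.uuid4())
    broker.put((task_id, func, args))
    return task_id

def worker_step():
    # worker side: take one task off the broker, run it, store the result
    task_id, func, args = broker.get()
    backend[task_id] = func(*args)

tid = submit(lambda x, y: x + y, 4, 4)
worker_step()
print(backend[tid])  # → 8
```

In real Celery the broker and backend are external services (Redis/RabbitMQ) and the worker is a separate process, but the hand-off is the same.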

1.2 Quick Start

# c1.py
import time
from celery import Celery

app = Celery('tasks', broker='redis://127.0.0.1:6379', backend='redis://127.0.0.1:6379')


@app.task
def test(x, y):
    time.sleep(10)
    return x + y


@app.task
def test2(x, y):
    time.sleep(10)
    return x + y

# c2.py
from c1 import test

result = test.delay(4, 4)
print(result.id)
# c3.py
from celery.result import AsyncResult
from c1 import app

result_project = AsyncResult(id="2c3cfdfb-6d2b-45d5-9bc5-d80e7bbb68ee", app=app)

# print(result_project.status)
print(result_project.get())
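Beyond `status` and `get()`, `AsyncResult` also offers `ready()` and a blocking `get(timeout=...)`. A sketch, assuming Redis is reachable and a worker is running (it will not run otherwise):

```python
# sketch: needs a running Redis broker/backend and a started worker
from celery.result import AsyncResult
from c1 import app, test

result = test.delay(4, 4)
res = AsyncResult(id=result.id, app=app)

print(res.status)           # PENDING while queued, SUCCESS when done
print(res.ready())          # True once the worker has finished
print(res.get(timeout=30))  # block up to 30s, then return 8 (or raise)
```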

Environment setup:

  1. Install Redis
  2. pip install eventlet (on Windows)
  3. pip install celery==5.0.5

Run c1.py as a worker (terminal command):

celery --app=c1 worker -P eventlet -l INFO  # add -P eventlet on Windows
or
celery -A c1 worker -l info -P eventlet  # add -P eventlet on Windows

1.3 Scheduled Tasks

1. Have Celery execute a task at a specified time

# c4.py
import datetime
from c1 import test2
"""
from datetime import datetime
 
v1 = datetime(2017, 4, 11, 3, 0, 0)
print(v1)
 
v2 = datetime.utcfromtimestamp(v1.timestamp())
print(v2)
 
"""
ctime = datetime.datetime.now()
utc_ctime = datetime.datetime.utcfromtimestamp(ctime.timestamp())

s10 = datetime.timedelta(seconds=10)
ctime_x = utc_ctime + s10

# schedule with apply_async at a specific (UTC) time
result = test2.apply_async(args=[2, 3], eta=ctime_x)
print(result.id)
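Computing a UTC eta by hand is easy to get wrong; `apply_async` also accepts a `countdown` in seconds, which expresses the same delay directly. A minimal sketch (the `apply_async` calls are commented out because they need a live broker):

```python
import datetime

# timezone-aware equivalent of the utc_ctime + s10 computed above
eta = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(seconds=10)
# result = test2.apply_async(args=[2, 3], eta=eta)

# simpler: let Celery add the delay for you
# result = test2.apply_async(args=[2, 3], countdown=10)  # run ~10s from now
```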

2. Crontab-style periodic tasks

from celery import Celery
from celery.schedules import crontab
 
app = Celery('tasks', broker='redis://127.0.0.1:6379', backend='redis://127.0.0.1:6379', include=['proj.s1', ])
app.conf.timezone = 'Asia/Shanghai'
app.conf.enable_utc = False
 
app.conf.beat_schedule = {
    # 'add-every-10-seconds': {
    #     'task': 'proj.s1.add1',
    #     'schedule': 10.0,
    #     'args': (16, 16)
    # },
    'add-every-12-seconds': {
        'task': 'proj.s1.add1',
        'schedule': crontab(minute=42, hour=8, day_of_month=11, month_of_year=4),
        'args': (16, 16)
    },
}
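`crontab()` takes the usual cron fields (`minute`, `hour`, `day_of_week`, `day_of_month`, `month_of_year`); a few common patterns as a reference (config fragment, not executed here):

```python
from celery.schedules import crontab

crontab()                                  # every minute
crontab(minute=0, hour='*/3')              # every three hours, on the hour
crontab(minute=30, hour=7)                 # every day at 07:30
crontab(minute=30, hour=7, day_of_week=1)  # every Monday at 07:30
```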

2.1 Using Celery in Django

2.1.1 Directory structure

django_celery_demo
├── app01
│   ├── __init__.py
│   ├── admin.py
│   ├── apps.py
│   ├── migrations
│   ├── models.py
│   ├── tasks.py
│   ├── tests.py
│   └── views.py
├── db.sqlite3
├── django_celery_demo
│   ├── __init__.py
│   ├── celery.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── manage.py
├── red.py
└── templates

2.1.2 Contents of django_celery_demo/celery.py

import os
from celery import Celery

# point Celery at the Django settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_celery_demo.settings')

app = Celery('django_celery_demo')


app.config_from_object('django.conf:settings', namespace='CELERY')

# auto-discover tasks.py in every registered app
app.autodiscover_tasks()
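The upstream Celery/Django example also defines a small debug task directly on the app, which is handy for checking that the worker is wired up correctly (appended to celery.py):

```python
@app.task(bind=True)
def debug_task(self):
    # bind=True makes the task instance available as `self`
    print(f'Request: {self.request!r}')
```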

2.1.3 Contents of django_celery_demo/__init__.py

from .celery import app as celery_app

__all__ = ('celery_app',)

2.1.4 Contents of app01/tasks.py

from celery import shared_task


@shared_task
def add(x, y):
    return x + y


@shared_task
def mul(x, y):
    return x * y


@shared_task
def xsum(numbers):
    return sum(numbers)

2.1.5 Contents of django_celery_demo/settings.py

# ######################## Celery configuration ########################
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
CELERY_TASK_SERIALIZER = 'json'
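A few optional settings are commonly added next to the ones above; the values here are assumptions, adjust them for your deployment:

```python
# optional extras (all consumed via the CELERY_ namespace)
CELERY_TIMEZONE = 'Asia/Shanghai'   # timezone used when evaluating schedules
CELERY_ENABLE_UTC = False
CELERY_RESULT_EXPIRES = 3600        # seconds before results are purged from the backend
```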

2.1.6 Contents of app01/views.py

from django.shortcuts import render, HttpResponse
from app01 import tasks
from django_celery_demo import celery_app
from celery.result import AsyncResult


def index(request):
    result = tasks.add.delay(1, 8)
    print(result)
    return HttpResponse('...')


def check(request):
    task_id = request.GET.get('task')
    result = AsyncResult(id=task_id, app=celery_app)  # note: 'async' is a reserved word in Python 3.7+
    if result.successful():
        data = result.get()
        print('Success', data)
    else:
        print('Task is still pending or running')

    return HttpResponse('...')

2.1.7 Contents of django_celery_demo/urls.py

"""django_celery_demo URL Configuration

The `urlpatterns` list routes URLs to views. For more information please see:
    https://docs.djangoproject.com/en/1.11/topics/http/urls/
Examples:
Function views
    1. Add an import:  from my_app import views
    2. Add a URL to urlpatterns:  url(r'^$', views.home, name='home')
Class-based views
    1. Add an import:  from other_app.views import Home
    2. Add a URL to urlpatterns:  url(r'^$', Home.as_view(), name='home')
Including another URLconf
    1. Import the include() function: from django.conf.urls import url, include
    2. Add a URL to urlpatterns:  url(r'^blog/', include('blog.urls'))
"""
from django.conf.urls import url
from django.contrib import admin
from app01 import views

urlpatterns = [
    url(r'^admin/', admin.site.urls),
    url(r'^index/', views.index),
    url(r'^check/', views.check),
]

2.2.1 Periodic Tasks

1. Install

pip install django-celery-beat

2. Register the app


INSTALLED_APPS = (
    ...,
    'django_celery_beat',
)

3. Run database migrations to create the periodic-task tables

python manage.py migrate

4. Configure the periodic tasks

  • Option 1: configure in code
# django_celery_demo/celery.py
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_celery_demo.settings')

app = Celery('django_celery_demo')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')


app.conf.beat_schedule = {
    'add-every-5-seconds': {
        'task': 'app01.tasks.add',
        'schedule': 5.0,
        'args': (16, 16)
    },
}


# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
  • Option 2: create the schedule entries in the database (e.g. through the Django admin)

5. Start the beat process, which publishes due tasks in the background

celery --app=django_celery_demo beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler

6. Start a worker to execute the tasks

celery --app=django_celery_demo worker -P eventlet -l INFO  # add -P eventlet on Windows

If the worker runs into pool/process problems (common on Windows), use the solo pool:

celery -A django_celery_demo worker -l info --pool=solo

Official reference: see the Celery documentation.

Difference between task and shared_task:
shared_task does not depend on a specific Celery app object; once the module is loaded, the task automatically binds to the current app, so a single task can be associated with multiple Celery apps.
