Celery: not as complex as it looks

Foreword:

I've been sending out résumés these past few days, and it's disheartening. Will something as luxurious as a job ever land on me? Ugh. I had briefly worked with Celery before and put "some familiarity with Celery" on my résumé, so yesterday I went back over it. This post is mainly about using Celery with Django.

 

Reference: https://www.cnblogs.com/alex3714/p/6351797.html

Reference: https://blog.csdn.net/shunqixing/article/details/79534524

 

 

First, problems you may run into:

1. ERROR: Pidfile (celerybeat.pid) already exists. Seems we're already running? (pid: 11068)

Delete the celerybeat.pid file in the project directory and the error goes away.

2. On Windows 10 the worker errors out at startup, complaining that three arguments were expected but none were given.

Run pip3 install eventlet and adjust the worker command accordingly (shown below).

 

 

Periodic-task schedule reference (crontab examples and their meanings):

  • crontab(): execute every minute.
  • crontab(minute=0, hour=0): execute daily at midnight.
  • crontab(minute=0, hour='*/3'): execute every three hours: midnight, 3am, 6am, 9am, noon, 3pm, 6pm, 9pm.
  • crontab(minute=0, hour='0,3,6,9,12,15,18,21'): same as the previous.
  • crontab(minute='*/15'): execute every 15 minutes.
  • crontab(day_of_week='sunday'): execute every minute (!) on Sundays.
  • crontab(minute='*', hour='*', day_of_week='sun'): same as the previous.
  • crontab(minute='*/10', hour='3,17,22', day_of_week='thu,fri'): execute every ten minutes, but only between 3-4 am, 5-6 pm, and 10-11 pm on Thursdays or Fridays.
  • crontab(minute=0, hour='*/2,*/3'): execute every even hour and every hour divisible by three; that is, every hour except 1am, 5am, 7am, 11am, 1pm, 5pm, 7pm, 11pm.
  • crontab(minute=0, hour='*/5'): execute every hour divisible by 5; note this triggers at 3pm, not 5pm (3pm is "15" on the 24-hour clock, which is divisible by 5).
  • crontab(minute=0, hour='*/3,8-17'): execute every hour divisible by 3, and every hour during office hours (8am-5pm).
  • crontab(0, 0, day_of_month='2'): execute on the second day of every month.
  • crontab(0, 0, day_of_month='2-30/2'): execute on every even-numbered day.
  • crontab(0, 0, day_of_month='1-7,15-21'): execute during the first and third weeks of the month.
  • crontab(0, 0, day_of_month='11', month_of_year='5'): execute on the eleventh of May every year.
  • crontab(0, 0, month_of_year='*/3'): execute in the first month of every quarter.
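The string specs above ('*/3', '3,17,22', '8-17') expand to sets of allowed values. Below is a minimal, illustrative sketch of that expansion; it is not Celery's actual parser (which also handles names like 'sun' and 'thu'):

```python
def expand_field(spec, max_value):
    """Expand a crontab-style field spec ('*', '*/n', 'a-b', 'a-b/n',
    comma-separated lists) into the set of matching integer values.
    Illustrative sketch only, not Celery's real parser."""
    values = set()
    for part in spec.split(','):
        step = 1
        if '/' in part:
            part, step_text = part.split('/')
            step = int(step_text)
        if part == '*':
            start, stop = 0, max_value - 1
        elif '-' in part:
            start_text, stop_text = part.split('-')
            start, stop = int(start_text), int(stop_text)
        else:
            start = stop = int(part)
        values.update(range(start, stop + 1, step))
    return values

# hour='*/3' expands to {0, 3, 6, 9, 12, 15, 18, 21}: midnight, 3am, ... 9pm
print(sorted(expand_field('*/3', 24)))
```

This also makes the trickier rows easy to check, e.g. hour='*/2,*/3' is the union of the even hours and the multiples of three.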

Celery's architecture has three parts: the message broker, the task execution units (workers), and the task result store.

  • Message broker: Celery provides no messaging service of its own, but it integrates easily with third-party brokers, including RabbitMQ, Redis, MongoDB (experimental), Amazon SQS (experimental), CouchDB (experimental), SQLAlchemy (experimental), Django ORM (experimental), and IronMQ. RabbitMQ or Redis is recommended as the message queue.
  • Task execution unit: the worker is Celery's unit of task execution; workers run concurrently across distributed system nodes.
  • Task result store: holds the results of tasks executed by workers. Celery supports several storage backends, including AMQP, Redis, memcached, MongoDB, SQLAlchemy, Django ORM, Apache Cassandra, and IronCache.

Getting started:

Install the components:

sudo yum install redis

sudo service redis start

sudo pip3 install celery



# you may also need this:
# pip3 install -U "celery[redis]"

2. Standalone usage

tasks.py

from celery import Celery
 
app = Celery('tasks',
             broker='redis://127.0.0.1:6379/0',
             backend='redis://127.0.0.1:6379/0')
 
@app.task
def add(x,y):
    print("running...",x,y)
    return x+y

Run: celery -A tasks worker --loglevel=info         (on Windows 10: celery -A tasks worker --loglevel=info -P eventlet)

Then open another terminal and enter a Python shell:

>>> from tasks import add

>>> add.delay(4, 4)

Both terminals react, and you're good.

 

In the second terminal you can also run result = add.delay(4, 4), then:

result.ready()                whether the task has finished

result.get()                  fetch the result

result.state                  fetch the task state
 

3. Using with Django

Create a project named proj:

- proj/
  - proj/__init__.py
  - proj/settings.py
  - proj/urls.py
  - proj/wsgi.py
- manage.py

proj/proj/mycelery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
 
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
 
app = Celery('proj')
 
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
 
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

Add to proj/proj/__init__.py:

from __future__ import absolute_import, unicode_literals
 
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .mycelery import app as celery_app
 
__all__ = ['celery_app']

Configure in settings:

CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
CELERY_TASK_SERIALIZER = 'json'

redis://127.0.0.1:6379/0 means Redis database 0 is used. If several Celery apps all point at the same database, their tasks will get mixed together; it's best to give each Celery instance a database of its own.
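For example, to keep two projects' queues separate, point each project's settings.py at its own Redis database number (hypothetical settings; the /1 and /2 numbers are arbitrary):

```python
# Project A's settings.py: use Redis database 1
CELERY_BROKER_URL = 'redis://127.0.0.1:6379/1'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/1'

# Project B's settings.py would use database 2 instead:
# CELERY_BROKER_URL = 'redis://127.0.0.1:6379/2'
# CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/2'
```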

 

Create a Django app named celery_task, and add a tasks.py file in the app directory:

├── celery_task
│   ├── admin.py
│   ├── apps.py
│   ├── __init__.py
│   ├── migrations
│   │   └── __init__.py
│   ├── models.py
│   ├── tasks.py
│   ├── tests.py
│   └── views.py
├── db.sqlite3
├── manage.py
├── proj
│   ├── mycelery.py
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
└── templates


urls.py
from django.contrib import admin
from django.urls import path, re_path
from celery_task import views

urlpatterns = [
    path('admin/', admin.site.urls),
    path('add/', views.get_add),
    path('add/status', views.get_status)
]
tasks.py
from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
 
@shared_task
def mul(x, y):
    return x * y
 
@shared_task
def xsum(numbers):
    return sum(numbers)
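The post never shows the views.py that urls.py points at; given the get_add/get_status routes and the JSON responses shown further down, it presumably looks something like the sketch below (the response keys mirror the sample output; treat this as an illustration, not the author's exact code):

```python
# celery_task/views.py (a sketch; the original post omits this file)
from celery.result import AsyncResult
from django.http import JsonResponse

from .tasks import add


def get_add(request):
    # Enqueue the task; delay() returns immediately with an AsyncResult.
    result = add.delay(4, 4)
    # get() blocks until the worker finishes, so the page can show the
    # result right away, as in the sample output below.
    return JsonResponse({"job ID": result.id, "job Result": result.get(timeout=10)})


def get_status(request):
    # Look the task up by the id passed as ?id=...
    result = AsyncResult(request.GET.get('id'))
    return JsonResponse({"task status:": result.state, "task result:": result.result})
```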

Run Celery: celery -A proj.mycelery worker -l info     (on Windows 10: celery -A proj.mycelery worker -l info -P eventlet)

 

Start Django and visit http://127.0.0.1:8000/add. The worker terminal prints the result, and the web page shows:

{"job ID": "bd390ffb-d4d6-4d89-932b-793223434b10", "job Result": 8}

Then visit http://127.0.0.1:8000/add/status?id=bd390ffb-d4d6-4d89-932b-793223434b10

The page shows {"task status:": "SUCCESS", "task result:": 8}

 

Periodic tasks with Celery in Django

1. Writing the schedule by hand

Add the schedule configuration to settings.

Run a task at a fixed interval:

from datetime import timedelta

CELERY_BEAT_SCHEDULE = {
    'add-every-3-seconds': {
        'task': 'celery_task.tasks.add',
        'schedule': timedelta(seconds=3),
        'args': (16, 16)
    },
}
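Under the hood, beat repeatedly asks each schedule entry whether it is due; for a timedelta schedule that boils down to comparing elapsed time against the interval. A simplified sketch of that decision (not celery.beat's real implementation):

```python
from datetime import datetime, timedelta

def is_due(last_run, interval, now):
    """Return (due_now, seconds_until_next) for an interval schedule.
    Simplified sketch of how celery beat reasons about timedelta schedules."""
    elapsed = (now - last_run).total_seconds()
    remaining = interval.total_seconds() - elapsed
    return remaining <= 0, max(remaining, 0.0)

start = datetime(2024, 1, 1, 12, 0, 0)
print(is_due(start, timedelta(seconds=3), start + timedelta(seconds=2)))  # (False, 1.0)
print(is_due(start, timedelta(seconds=3), start + timedelta(seconds=3)))  # (True, 0.0)
```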

Run at a fixed time of day; note that Celery works in UTC by default:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    # Executes every Monday morning at 7:30 A.M
    'add-every-monday-morning': {
        'task': 'celery_task.tasks.add',
        'schedule': crontab(hour=7, minute=30),
        'args': (16, 16),
    },
}

Alternatively, add the schedule in mycelery.py; if the tasks don't belong to this Django project, the form is much the same.

The complete mycelery.py:

from __future__ import absolute_import, unicode_literals
import os
from datetime import timedelta
from celery import Celery
from celery.schedules import crontab


os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# Set the schedule after config_from_object(), which resets any
# configuration made before it was called.
app.conf.beat_schedule = {
    'sum-task': {
        'task': 'celery_task.tasks.add',
        'schedule': timedelta(seconds=20),
        'args': (5, 6)
    },
    'send-report': {
        'task': 'celery_task.tasks.report',
        'schedule': crontab(hour=4, minute=30, day_of_week=1),
    }
}

timedelta is an object from the datetime module (from datetime import timedelta). Its arguments are: days, seconds, microseconds, milliseconds, minutes, hours, and weeks.

crontab's arguments are: minute, hour, day_of_week, day_of_month, and month_of_year.
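For reference, a quick check of how the timedelta arguments combine into one interval:

```python
from datetime import timedelta

# 1 hour + 30 minutes + 20 seconds = 5420 seconds
interval = timedelta(hours=1, minutes=30, seconds=20)
print(interval.total_seconds())  # 5420.0
```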


Start Celery (if it's still running from before there's no need to restart it, and Django doesn't need to be running).

Run: celery -A proj.mycelery worker -l info      (on Windows 10: celery -A proj.mycelery worker -l info -P eventlet)

In another terminal start celery -A proj.mycelery beat -l info; both terminals will then print output at every interval.

2. Using django-celery-beat: add periodic tasks through the Django admin (the worker still has to be started by hand)

 

  1. pip3 install django-celery-beat
  2. Register it in settings:

    INSTALLED_APPS = (
            ...,
            'django_celery_beat',
        )
  3. python manage.py migrate
  4. celery -A proj beat -l info -S django           # best run this last
  5. Visit the Django admin interface to set up periodic tasks; you'll see three new tables in the admin.

In my case the task is add, with arguments [1, 8].

Run the worker: celery -A proj.mycelery worker -l info      (on Windows 10: celery -A proj.mycelery worker -l info -P eventlet).

In another terminal run celery -A proj beat -l info -S django.

Both terminals then show output again.

In my testing, celery beat has to be restarted every time a task is added or modified; otherwise the beat process won't pick up the new configuration.

 
