Celery Scheduled Tasks

Directory tree



celery_app
    |
    |-- __init__.py  # Celery application file
    |
    |-- celeryconfig.py  # Celery application config file
    |
    |-- task1.py  # task file 1
    |
    |-- task2.py  # task file 2
    

File contents

The contents of __init__.py are as follows:

from celery import Celery

app = Celery('demo')

# Load the configuration module via the Celery instance
app.config_from_object('celery_app.celeryconfig')
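
As a side note, the same app could also be configured without a separate config module, by passing the broker and result backend straight to the Celery() constructor. This is only a minimal sketch of an equivalent inline setup (using the same Redis URLs as celeryconfig.py below), not what this project uses:

from celery import Celery

# Inline alternative to config_from_object(): broker and backend
# are passed directly when the app is created.
app = Celery('demo',
             broker='redis://localhost:6379/1',
             backend='redis://localhost:6379/2')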

The contents of celeryconfig.py are as follows:

from datetime import timedelta
from celery.schedules import crontab


BROKER_URL = 'redis://localhost:6379/1'

CELERY_RESULT_BACKEND = 'redis://localhost:6379/2'

# Whether to discard task results (discarding them improves efficiency)
# CELERY_IGNORE_RESULT = True

# Specify the timezone. The default is UTC; since Celery's timezone support is
# not great in some versions, I chose not to set it.
# CELERY_TIMEZONE = 'Asia/Shanghai'  # default is UTC


# Import the specified task modules
CELERY_IMPORTS = (
    'celery_app.task1',
    'celery_app.task2'
)

# Scheduled tasks
CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'celery_app.task1.add',  # task name
        'schedule': timedelta(seconds=10),  # run the task every 10 seconds
        'args': (10, 100)
    },
    'task2': {
        'task': 'celery_app.task1.multiply',
        'schedule': crontab(hour=22, minute=57),  # run the task every day at 22:57
        'args': (10, 200)
    }
}

"""
crontab()  每分钟

crontab(minute=0, hour=0)  每天的0时0分

crontab(minute=0, hour='*/3')  每三小时

crontab(day_of_week='sunday')  周日的每一小时

crontab(minute='*',hour='*', day_of_week='sun') 与上面相同

crontab(minute=0, hour='*/3,8-17') 每三个小时  8时到17时的每小时


solar(event, latitude, longitude)  
event表示日落日出,latitude为纬度,北纬为+,longitude为经度,东经为+。
'task2': {
        'task': 'celery_app.task1.multiply',
        'schedule': solar('sunset', -37.81753, 144.96715),
        'args': (10, 200)
    }
"""

The contents of task1.py are as follows:

import time
from celery_app import app


@app.task
def add(x, y):
    time.sleep(4)
    return x + y

The contents of task2.py are as follows:

import time

from celery_app import app

@app.task
def multiply(x, y):
    time.sleep(4)
    return x * y
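
Independent of beat, both tasks can also be triggered by hand from a Python shell, which is a quick way to check that the worker and the result backend are wired up correctly. A minimal sketch (run it with the worker already started):

from celery_app.task1 import add
from celery_app.task2 import multiply

# delay() sends the task to the broker; get() blocks until the worker
# stores the result in the backend (each task sleeps 4 seconds).
r1 = add.delay(10, 100)
r2 = multiply.delay(10, 200)
print(r1.get(timeout=10))   # 110
print(r2.get(timeout=10))   # 2000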

Startup

First, trigger the Celery scheduled tasks with beat:

ubantu@ubantu-virtual-machine:~/celery_learning$ celery beat -A celery_app -l info
celery beat v4.2.1 (windowlicker) is starting.
__    -    ... __   -        _
LocalTime -> 2019-02-14 22:56:34
Configuration ->
    . broker -> redis://localhost:6379/1
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%INFO
    . maxinterval -> 5.00 minutes (300s)
[2019-02-14 22:56:34,485: INFO/MainProcess] beat: Starting...
[2019-02-14 22:56:34,543: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)
[2019-02-14 22:56:44,516: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)
[2019-02-14 22:56:54,518: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)
[2019-02-14 22:57:04,518: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)

  • Problem: looking at the output timestamps, we can see that beat sends task1 for us every 10 seconds.
  • Our task2 is scheduled for 22:57, but the output has clearly passed that time and the task was never sent.
  • We will explain why this happens below.

Then start the worker

ubantu@ubantu-virtual-machine:~/celery_learning$ celery worker -A celery_app -l info
 
 -------------- celery@ubantu-virtual-machine v4.2.1 (windowlicker)
---- **** ----- 
--- * ***  * -- Linux-4.13.0-36-generic-x86_64-with-Ubuntu-16.04-xenial 2019-02-14 22:56:37
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         demo:0x7fa2b1f567f0
- ** ---------- .> transport:   redis://localhost:6379/1
- ** ---------- .> results:     redis://localhost:6379/2
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . celery_app.task1.add
  . celery_app.task2.multiply

[2019-02-14 22:56:37,910: INFO/MainProcess] Connected to redis://localhost:6379/1
[2019-02-14 22:56:37,934: INFO/MainProcess] mingle: searching for neighbors
[2019-02-14 22:56:38,979: INFO/MainProcess] mingle: all alone
[2019-02-14 22:56:39,014: INFO/MainProcess] celery@ubantu-virtual-machine ready.
[2019-02-14 22:56:39,342: INFO/MainProcess] Received task: celery_app.task1.add[8a96ea99-99f8-4f1b-b161-9c08c627524c]  
[2019-02-14 22:56:43,376: INFO/ForkPoolWorker-1] Task celery_app.task1.add[8a96ea99-99f8-4f1b-b161-9c08c627524c] succeeded in 4.032755204998466s: 110
[2019-02-14 22:56:44,526: INFO/MainProcess] Received task: celery_app.task1.add[7f8c4d7a-9e55-4629-9dc2-fcc68bc88393]  
[2019-02-14 22:56:48,536: INFO/ForkPoolWorker-1] Task celery_app.task1.add[7f8c4d7a-9e55-4629-9dc2-fcc68bc88393] succeeded in 4.007728853999652s: 110
[2019-02-14 22:56:54,528: INFO/MainProcess] Received task: celery_app.task1.add[393287f8-fa39-4237-a44c-34a719e3b243]  
[2019-02-14 22:56:58,538: INFO/ForkPoolWorker-1] Task celery_app.task1.add[393287f8-fa39-4237-a44c-34a719e3b243] succeeded in 4.0078285769996s: 110
[2019-02-14 22:57:04,528: INFO/MainProcess] Received task: celery_app.task1.add[c45aeea1-ed99-473b-b8dd-b45484c12daa]  
[2019-02-14 22:57:08,538: INFO/ForkPoolWorker-1] Task celery_app.task1.add[c45aeea1-ed99-473b-b8dd-b45484c12daa] succeeded in 4.006356116000461s: 110
^C
worker: Hitting Ctrl+C again will terminate all running tasks!

worker: Warm shutdown (MainProcess)
ubantu@ubantu-virtual-machine:~/celery_learning$ 
  • From the output we can see that beat sent 4 scheduled tasks in total, and the worker picked up all 4 from the message queue and produced 4 results, so everything works. Since a result backend is configured, each result can also be looked up afterwards by its task id, as shown in the sketch below.
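
A minimal sketch of looking up one of the results by id, using a task id taken from the worker log above:

from celery.result import AsyncResult
from celery_app import app

# The id comes from the "Received task" / "succeeded" lines in the log.
res = AsyncResult('8a96ea99-99f8-4f1b-b161-9c08c627524c', app=app)
print(res.status)   # 'SUCCESS'
print(res.result)   # 110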

Why the task2 scheduled task failed to fire, and how to fix it:

Explanation:

The problem above has nothing to do with our code; some versions of Celery simply handle timezones poorly.

Fix:

Do not set a timezone; instead, manually subtract 8 hours (the Asia/Shanghai UTC offset) from the intended time when writing the schedule, as in the configuration below:

CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'celery_app.task1.add',
        'schedule': timedelta(seconds=10),  # run the task every 10 seconds
        'args': (10, 100)
    },
    'task2': {
        'task': 'celery_app.task1.multiply',
        'schedule': crontab(hour=17-8, minute=13),  # run the task every day at 17:13 local time (17 - 8 = 9 UTC)
        'args': (10, 200)
    }
}
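
An alternative that is often suggested instead of hand-subtracting 8 hours is to disable UTC and set the local timezone explicitly. Whether this behaves correctly depends on the Celery version (which is exactly the issue described above), so treat the following as a sketch rather than a guaranteed fix:

# Version-dependent alternative: let Celery do the timezone conversion itself.
CELERY_ENABLE_UTC = False
CELERY_TIMEZONE = 'Asia/Shanghai'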

Results

First, trigger the Celery scheduled tasks with beat:

(celery) ubantu@ubantu-virtual-machine:~/celery_learning$ celery beat -A celery_app -l info
celery beat v4.2.1 (windowlicker) is starting.
__    -    ... __   -        _
LocalTime -> 2019-02-15 17:12:36
Configuration ->
    . broker -> redis://localhost:6379/1
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%INFO
    . maxinterval -> 5.00 minutes (300s)
[2019-02-15 17:12:36,834: INFO/MainProcess] beat: Starting...
[2019-02-15 17:12:36,875: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)
[2019-02-15 17:12:46,850: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)
[2019-02-15 17:12:56,851: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)
[2019-02-15 17:13:00,001: INFO/MainProcess] Scheduler: Sending due task task2 (celery_app.task2.multiply)
[2019-02-15 17:13:06,851: INFO/MainProcess] Scheduler: Sending due task task1 (celery_app.task1.add)

  • We can see that at 17:13 our scheduled task task2 was triggered, which means the problem above is solved. A quick way to sanity-check such a crontab entry locally is shown in the sketch below.
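
To sanity-check a crontab entry locally without waiting for the wall clock, the schedule object itself can be queried. A minimal sketch, assuming Celery 4.x's is_due() API:

from datetime import datetime, timedelta
from celery.schedules import crontab

entry = crontab(hour=17 - 8, minute=13)
# is_due() takes the last run time and returns (is_due, next_time_to_check);
# here we pretend the entry last ran an hour ago.
due, next_check = entry.is_due(datetime.utcnow() - timedelta(hours=1))
print(due, next_check)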

Then start the worker

(celery) ubantu@ubantu-virtual-machine:~/celery_learning$ celery worker -A celery_app -l info
 
 -------------- celery@ubantu-virtual-machine v4.2.1 (windowlicker)
---- **** ----- 
--- * ***  * -- Linux-4.13.0-36-generic-x86_64-with-Ubuntu-16.04-xenial 2019-02-15 17:12:38
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         demo:0x7f1925322978
- ** ---------- .> transport:   redis://localhost:6379/1
- ** ---------- .> results:     redis://localhost:6379/2
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . celery_app.task1.add
  . celery_app.task2.multiply

[2019-02-15 17:12:38,858: INFO/MainProcess] Connected to redis://localhost:6379/1
[2019-02-15 17:12:38,883: INFO/MainProcess] mingle: searching for neighbors
[2019-02-15 17:12:39,924: INFO/MainProcess] mingle: all alone
[2019-02-15 17:12:39,974: INFO/MainProcess] celery@ubantu-virtual-machine ready.
[2019-02-15 17:12:40,111: INFO/MainProcess] Received task: celery_app.task1.add[fb9b0a5e-ddf5-43eb-9d74-8d27fab601ab]  
[2019-02-15 17:12:44,148: INFO/ForkPoolWorker-1] Task celery_app.task1.add[fb9b0a5e-ddf5-43eb-9d74-8d27fab601ab] succeeded in 4.032896642998821s: 110
[2019-02-15 17:12:46,857: INFO/MainProcess] Received task: celery_app.task1.add[ce73c202-8f81-4341-baf0-0c529d4ed374]  
[2019-02-15 17:12:50,863: INFO/ForkPoolWorker-1] Task celery_app.task1.add[ce73c202-8f81-4341-baf0-0c529d4ed374] succeeded in 4.003673937000713s: 110
[2019-02-15 17:12:56,863: INFO/MainProcess] Received task: celery_app.task1.add[0c656984-3eff-4dc1-851f-628b46f14f0e]  
[2019-02-15 17:13:00,008: INFO/MainProcess] Received task: celery_app.task2.multiply[c0e10191-33ba-4b17-812b-f9ace3537671]  
[2019-02-15 17:13:00,874: INFO/ForkPoolWorker-1] Task celery_app.task1.add[0c656984-3eff-4dc1-851f-628b46f14f0e] succeeded in 4.007771483997203s: 110
[2019-02-15 17:13:04,036: INFO/ForkPoolWorker-2] Task celery_app.task2.multiply[c0e10191-33ba-4b17-812b-f9ace3537671] succeeded in 4.02402895799969s: 2000
[2019-02-15 17:13:06,862: INFO/MainProcess] Received task: celery_app.task1.add[be38f258-cc30-4b2c-a0a6-8568c86d2a6d] 
  • At 17:13:04 we received task2's result, 2000. Since the task itself simulates a 4-second delay, the result arrives at 17:13:04 rather than exactly at 17:13:00.