2019-12-17 Celery Scheduled Tasks

1. Directory structure

project                        # project root
    ├── celery_tasks           # celery-related files
    │   ├── __init__.py        # package init
    │   ├── celery_config.py   # configuration
    │   ├── async_tasks.py     # asynchronous tasks
    │   └── periodic_tasks.py  # periodic tasks
    └── client.py              # application

2. __init__.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time         : 
# @Author       : 
# @Mail         : 
# @File         : __init__.py
# @Description  : celery module

from celery import Celery

# create the Celery application and load its settings
# from celery_tasks/celery_config.py
celery_app = Celery(__name__)
celery_app.config_from_object('celery_tasks.celery_config')

if __name__ == '__main__':
    #celery_app.start(argv=['celery', 'worker', '-l', 'info', '-f', 'logs/celery.log'])
    pass
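
Loading a config module via config_from_object is just one option; settings can also be applied inline through conf.update. A minimal sketch, assuming the same Redis URLs used in celery_config.py below:

from celery import Celery

celery_app = Celery(__name__)
# apply settings directly instead of loading a config module
celery_app.conf.update(
    broker_url='redis://127.0.0.1:6379/8',
    result_backend='redis://127.0.0.1:6379/9',
    timezone='Asia/Shanghai',
)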

3. celery_config.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time         : 
# @Author       : 
# @Mail         : 
# @File         : celery_config.py
# @Description  : celery configuration

# crontab gives precise, cron-style control over when tasks run
from celery.schedules import crontab

# the default broker is RabbitMQ; here we use Redis
redis_server = '127.0.0.1'
# broker: where the task queue is stored
broker_url = 'redis://{}:6379/8'.format(redis_server)
# result backend: where task results are stored; if you do not need
# results, leave this unset to save some overhead
result_backend = 'redis://{}:6379/9'.format(redis_server)

# timezone
timezone = 'Asia/Shanghai'
# do not let celery hijack the root logger (it does by default)
worker_hijack_root_logger = False
# how long task results are kept before expiring (seconds)
result_expires = 60

# register the task modules
imports = [
    'celery_tasks.async_tasks',        # tasks registered by module path
    'celery_tasks.periodic_tasks'
]

# the beat schedule
beat_schedule = {
    # schedule entry name
    'test-every-10-seconds': {
        # the task to call
        'task': 'celery_tasks.periodic_tasks.test_periodic',
        # how often to run it; here, every 10 seconds.
        # crontab can be used instead, e.g.
        # 'schedule': crontab(hour=7, minute=30, day_of_week=1)
        # (see the sketch after this listing)
        'schedule': 10.0,
        # positional arguments for the task
        'args': ()
    },
}


if __name__ == '__main__':
    pass
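
The crontab import above allows cron-style schedules instead of a plain interval. As a sketch (not part of the original config), a beat_schedule entry that runs test_periodic every Monday at 7:30 would look like:

from celery.schedules import crontab

beat_schedule = {
    'test-every-monday-morning': {
        'task': 'celery_tasks.periodic_tasks.test_periodic',
        # 7:30 every Monday (day_of_week=1)
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': ()
    },
}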

4. async_tasks.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time         : 
# @Author       : 
# @Mail         : 
# @File         : async_tasks.py
# @Description  : asynchronous tasks

from celery import shared_task

# import the celery_app instance and register tasks through it
from celery_tasks import celery_app

# shared_task suits code that has no celery_app of its own,
# and tasks that several apps need to call
@shared_task
def test_shared():
    print("I am shared task.")
    return "succ"

# bind the test function to celery_app
@celery_app.task
def test():
    print("I am test task.")

# a task with a return value
@celery_app.task
def test_func():
    print("I am returning")
    return "succ"
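
Besides .delay(), any of these tasks can be invoked with .apply_async(), which accepts execution options. A brief sketch, assuming the module above is importable:

from celery_tasks.async_tasks import test_func

# delay() is shorthand for apply_async() with no options
test_func.delay()

# apply_async() accepts options, e.g. run roughly 10 seconds from now
test_func.apply_async(countdown=10)

# positional arguments are passed via args
test_func.apply_async(args=())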

5. periodic_tasks.py

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time         : 
# @Author       : 
# @Mail         : 
# @File         : periodic_tasks.py
# @Description  : periodic tasks

from celery_tasks import celery_app

# bind the task
@celery_app.task
def test_periodic():
    print("test periodic task succ")
    

6. Running

cd into the project directory, i.e. the parent directory of celery_tasks.

Start a worker with:

celery -A celery_tasks worker -l info

If it starts successfully, you will see output like this:

~/test_celery$ celery -A celery_tasks worker -l info
 
 -------------- celery@xx v4.3.0 (rhubarb)
---- **** ----- 
--- * ***  * -- Linux-4.18.0-15-generic-x86_64-with-Ubuntu-18.04-bionic 2019-12-17 16:20:03
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         celery_tasks:0x7fc4f241e278
- ** ---------- .> transport:   redis://127.0.0.1:6379/8
- ** ---------- .> results:     redis://127.0.0.1:6379/9
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . celery_tasks.async_tasks.test
  . celery_tasks.async_tasks.test_func
  . celery_tasks.async_tasks.test_shared
  . celery_tasks.periodic_tasks.test_periodic

[2019-12-17 16:20:03,896: INFO/MainProcess] Connected to redis://127.0.0.1:6379/8
[2019-12-17 16:20:03,904: INFO/MainProcess] mingle: searching for neighbors
[2019-12-17 16:20:04,923: INFO/MainProcess] mingle: all alone
[2019-12-17 16:20:04,937: INFO/MainProcess] celery@xx ready.

The entries under [tasks] above are the tasks we registered.

To test them, start a Python shell in the project directory:

:~/test_celery$ python
Python 3.6.8 (default, Jan 14 2019, 11:02:34) 
[GCC 8.0.1 20180414 (experimental) [trunk revision 259383]] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import celery_tasks.async_tasks as at  # import the task module
>>> result = at.test.delay()               # invoke the task
>>> result.ready()                         # has the task finished?
True
>>> result.get()                           # fetch the result; test() returns nothing
>>> result = at.test_func.delay()
>>> result.ready()
True
>>> result.get()                           # test_func() returns a result
'succ'
>>> 

Meanwhile the worker terminal prints the following; the worker received both tasks and executed them successfully:

[2019-12-17 16:38:27,273: INFO/MainProcess] Received task: celery_tasks.async_tasks.test[e6475ba2-4d91-453e-a8a1-5e91f6a7e662]  
[2019-12-17 16:38:27,275: WARNING/ForkPoolWorker-4] I am test task.
[2019-12-17 16:38:27,282: INFO/ForkPoolWorker-4] Task celery_tasks.async_tasks.test[e6475ba2-4d91-453e-a8a1-5e91f6a7e662] succeeded in 0.006814451997342985s: None
[2019-12-17 16:38:44,838: INFO/MainProcess] Received task: celery_tasks.async_tasks.test_func[2529b5c5-91c8-4d67-89f6-0336035358b8]  
[2019-12-17 16:38:44,841: WARNING/ForkPoolWorker-3] I am returning
[2019-12-17 16:38:44,846: INFO/ForkPoolWorker-3] Task celery_tasks.async_tasks.test_func[2529b5c5-91c8-4d67-89f6-0336035358b8] succeeded in 0.005664669999532634s: 'succ'
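
When testing interactively like this, a few other AsyncResult conveniences are useful. A sketch, assuming the same shell session:

result = at.test_func.delay()

# block until the task finishes, or raise after 5 seconds
print(result.get(timeout=5))   # 'succ'

# inspect the task state: PENDING, SUCCESS, FAILURE, ...
print(result.state)            # 'SUCCESS'

# the task id, useful for looking the result up later
print(result.id)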

7. Starting the scheduled tasks

Start the beat scheduler, writing its log to the given file, with:

celery beat -A celery_tasks -l info -f log/periodic_tasks.log

The output looks like this:

:~/test_celery$ celery beat -A celery_tasks -l info -f log/periodic_tasks.log
celery beat v4.3.0 (rhubarb) is starting.
__    -    ... __   -        _
LocalTime -> 2019-12-17 16:47:42
Configuration ->
    . broker -> redis://127.0.0.1:6379/8
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> log/periodic_tasks.log@%INFO
    . maxinterval -> 5.00 minutes (300s)

The worker terminal then shows that it receives the periodic task test_periodic every 10 seconds and runs it successfully:

[2019-12-17 16:47:52,237: WARNING/ForkPoolWorker-3] test periodic task succ
[2019-12-17 16:47:52,238: INFO/ForkPoolWorker-3] Task celery_tasks.periodic_tasks.test_periodic[f0865f15-e0f4-4648-be72-e426a9aeae15] succeeded in 0.0017050319984264206s: None
[2019-12-17 16:48:02,237: INFO/MainProcess] Received task: celery_tasks.periodic_tasks.test_periodic[34f197ef-c189-4266-99c5-504514bcc2d2]  
[2019-12-17 16:48:02,238: WARNING/ForkPoolWorker-4] test periodic task succ
[2019-12-17 16:48:02,240: INFO/ForkPoolWorker-4] Task celery_tasks.periodic_tasks.test_periodic[34f197ef-c189-4266-99c5-504514bcc2d2] succeeded in 0.0013659879987244494s: None
[2019-12-17 16:48:12,237: INFO/MainProcess] Received task: celery_tasks.periodic_tasks.test_periodic[ed671fdf-0809-469b-8273-0de97eb98a9a]  
[2019-12-17 16:48:12,239: WARNING/ForkPoolWorker-3] test periodic task succ
[2019-12-17 16:48:12,240: INFO/ForkPoolWorker-3] Task celery_tasks.periodic_tasks.test_periodic[ed671fdf-0809-469b-8273-0de97eb98a9a] succeeded in 0.0012746300017170142s: None

8. Other

You can start the worker and beat together with:

celery -A celery_tasks worker -B -l info

You can start a worker in the background with (YOUR-TASK-NAME is the worker node name):

celery multi start YOUR-TASK-NAME -A celery_tasks -l info

Stop or restart that worker with:

celery multi stop/restart YOUR-TASK-NAME
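
The directory tree in section 1 lists client.py as the application, though its contents are not shown in the original. A hypothetical minimal version, just to show how the tasks would be called from application code:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# client.py - hypothetical application entry point (not in the original)

from celery_tasks.async_tasks import test, test_func

if __name__ == '__main__':
    # fire-and-forget
    test.delay()

    # wait for a result
    result = test_func.delay()
    print(result.get(timeout=5))   # prints 'succ'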
