Using Celery for Asynchronous and Periodic Tasks with Django on Windows 10


How Celery's asynchronous and periodic tasks differ:

  • Asynchronous: passive; the task runs only when some user action triggers it
  • Periodic: active; the task runs automatically at the scheduled time, regardless of user activity
  • If an asynchronous task is declared in the beat schedule, it becomes a periodic task
  • A periodic task can also be invoked on demand by a user, as an asynchronous task

In this article the periodic tasks are declared in celery.py; once the beat process is started, they run automatically.

Required packages
django==3.1.4
celery==5.0.5
django-celery-beat==2.2.0
Django project layout
  • The required .py files have already been added to the tree
MicInputSys
├── inputsysapp
│	├─── __init__.py
│	├─── apps.py
│	├─── models.py
│	├─── tasks.py
│	├─── views.py
│	├─── migrations
│ 	│	└── __init__.py
├── MicInputSys
│ 	├── __init__.py
│ 	├── asgi.py
│ 	├── celery.py
│ 	├── settings.py
│ 	├── urls.py
│ 	└── wsgi.py
└── manage.py
Configuration in settings.py
  • settings.py is located at: MicInputSys/MicInputSys/settings.py
# Celery configuration
CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0'  # broker: Redis as the message transport
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'  # result backend, also Redis here
CELERY_RESULT_SERIALIZER = 'json'  # result serialization format
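Other options can be set through the same `CELERY_` prefix. A few commonly used extras, as a sketch; the values here are illustrative assumptions, not part of the original project:

```python
# Optional additions to the same settings.py (illustrative values)
CELERY_TIMEZONE = 'Asia/Shanghai'    # timezone beat uses when evaluating crontab schedules
CELERY_TASK_SERIALIZER = 'json'      # serializer for task messages sent to the broker
CELERY_ACCEPT_CONTENT = ['json']     # content types the worker is allowed to deserialize
```

With `namespace='CELERY'` in celery.py, these map onto Celery's `timezone`, `task_serializer`, and `accept_content` settings respectively.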
Configuration in celery.py
  • celery.py is located at: MicInputSys/MicInputSys/celery.py
# Keep imports compatible across Python 2 and 3 (harmless on Python 3)
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from datetime import timedelta
from celery.schedules import crontab

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'MicInputSys.settings')  # point Celery at the Django settings

app = Celery('MicInputSys')

app.config_from_object('django.conf:settings', namespace='CELERY')  # read settings keys prefixed with CELERY_

app.autodiscover_tasks()  # discover tasks.py in every installed app

# Declare the periodic tasks
app.conf.beat_schedule = {
    'inputsysapp_tasks_add': {              # entry name, freely chosen
        "task": "inputsysapp.tasks.add",    # dotted path of the task function
        "schedule": timedelta(seconds=30),  # every 30 seconds, counted from beat startup
        # "args": (1, 3),                   # positional arguments
        # "kwargs": {'name': '张三'},        # keyword arguments
    },
    'inputsysapp_tasks_delete': {
        "task": "inputsysapp.tasks.delete",
        "schedule": crontab(minute='*/1'),  # every minute, at second 0 of each minute
        # "args": (),
        # "kwargs": {},
    },
}
Configuration in __init__.py
  • __init__.py is located at: MicInputSys/MicInputSys/__init__.py
# Keep imports compatible across Python 2 and 3
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ['celery_app']
Adding tasks in tasks.py
  • tasks.py is located at: MicInputSys/inputsysapp/tasks.py
from __future__ import absolute_import, unicode_literals
from celery import shared_task
import redis
rc = redis.Redis('127.0.0.1', 6379)  # shared Redis connection used by the tasks

@shared_task
def add(*args, **kwargs):
    x, y = args
    print(kwargs['name'])
    print(x, y)
    raw = rc.get('number')
    number = int(raw.decode('utf8')) if raw else 0  # read the key once, default to 0
    number += 100
    rc.set('number', number)
    print(number)

@shared_task
def delete(*args, **kwargs):
    raw = rc.get('number')
    number = int(raw.decode('utf8')) if raw else 0  # read the key once, default to 0
    number -= 100
    rc.set('number', number)
    print(number)
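The counter logic shared by add and delete is a plain read-modify-write. A minimal sketch with a dict standing in for the Redis connection (the `store` dict and `bump` helper are made up here for illustration):

```python
store = {}  # stands in for the Redis connection

def bump(key, delta):
    # read-modify-write, mirroring what add() and delete() do against Redis
    number = int(store.get(key, 0))
    number += delta
    store[key] = number
    return number

bump('number', 100)    # what add() does
bump('number', 100)
bump('number', -100)   # what delete() does
print(store['number'])  # -> 100
```

Note that this pattern is not atomic: if several workers run it concurrently, updates can be lost between the read and the write. Redis itself offers an atomic increment, `rc.incrby('number', 100)`, which avoids the race.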
Starting the services
  • Start the worker
celery -A MicInputSys.celery worker -l info		# start a worker
# console output
 -------------- celery@SHMPZD201131L v5.0.5 (singularity)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2021-02-26 13:59:54
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         MicInputSys:0x23c26e79a30
- ** ---------- .> transport:   redis://127.0.0.1:6379/0
- ** ---------- .> results:     redis://127.0.0.1:6379/0
- *** --- * --- .> concurrency: 12 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]		# every registered task is listed here; if the list is empty, startup failed
  . inputsysapp.tasks.add
  . inputsysapp.tasks.delete

[2021-02-26 13:59:55,861: INFO/MainProcess] Connected to redis://127.0.0.1:6379/0
[2021-02-26 14:02:31,106: INFO/MainProcess] mingle: searching for neighbors
[2021-02-26 14:02:31,849: INFO/SpawnPoolWorker-1] child process 10140 calling self.run()
[2021-02-26 14:02:31,885: INFO/SpawnPoolWorker-5] child process 24456 calling self.run()
[2021-02-26 14:02:31,929: INFO/SpawnPoolWorker-4] child process 10120 calling self.run()
[2021-02-26 14:02:31,931: INFO/SpawnPoolWorker-7] child process 5564 calling self.run()
[2021-02-26 14:02:31,937: INFO/SpawnPoolWorker-2] child process 9704 calling self.run()
[2021-02-26 14:02:32,029: INFO/SpawnPoolWorker-6] child process 7024 calling self.run()
[2021-02-26 14:02:32,038: INFO/SpawnPoolWorker-3] child process 14360 calling self.run()
[2021-02-26 14:02:32,061: INFO/SpawnPoolWorker-8] child process 14872 calling self.run()
[2021-02-26 14:02:32,119: INFO/SpawnPoolWorker-9] child process 20952 calling self.run()
[2021-02-26 14:03:21,272: INFO/MainProcess] Received task: inputsysapp.tasks.add[27522b95-675f-47af-94ba-d762ead41199]
[2021-02-26 14:03:21,275: INFO/MainProcess] Received task: inputsysapp.tasks.delete[2760bd33-c7bc-44c2-944e-c54b77c35cee]
[2021-02-26 14:03:21,279: INFO/SpawnPoolWorker-1] Task inputsysapp.tasks.add[27522b95-675f-47af-94ba-d762ead41199] succeeded in 0.016000000294297934s: None
[2021-02-26 14:03:21,281: INFO/SpawnPoolWorker-5] Task inputsysapp.tasks.delete[2760bd33-c7bc-44c2-944e-c54b77c35cee] succeeded in 0.016000000294297934s: None
  • Start beat
celery -A MicInputSys beat -l info		# start the beat scheduler
# console output
celery beat v5.0.5 (singularity) is starting.
__    -    ... __   -        _
LocalTime -> 2021-02-26 13:58:13
Configuration ->
    . broker -> redis://127.0.0.1:6379/0
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%INFO
    . maxinterval -> 5.00 minutes (300s)
[2021-02-26 13:58:13,493: INFO/MainProcess] beat: Starting...
[2021-02-26 13:58:15,550: INFO/MainProcess] Scheduler: Sending due task inputsysapp_tasks_add (inputsysapp.tasks.add)
[2021-02-26 13:58:45,542: INFO/MainProcess] Scheduler: Sending due task inputsysapp_tasks_add (inputsysapp.tasks.add)
[2021-02-26 13:59:00,000: INFO/MainProcess] Scheduler: Sending due task inputsysapp_tasks_delete (inputsysapp.tasks.delete)
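For local development the two processes can also be combined: the worker's -B option embeds a beat scheduler in the worker process. This is a development convenience only; with several workers each would run its own beat and schedule tasks multiple times.

```shell
# run worker and embedded beat in a single process (development only)
celery -A MicInputSys worker -B -l info
```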
Calling an asynchronous task from a view
  • Create a view class in views.py, located at: MicInputSys/inputsysapp/views.py
from rest_framework.response import Response
from rest_framework.views import APIView
from inputsysapp import tasks

class TestAPIView(APIView):
    def get(self, request):
        number = tasks.add.delay(1, 3, name='aaa')
        print(number)
        # prints the task_id, e.g. 0bfb2616-14bc-4ee9-b734-a111f11fc378
        # the result can be inspected in Redis with: get celery-task-meta-TASK_ID
        # e.g.: get celery-task-meta-0bfb2616-14bc-4ee9-b734-a111f11fc378
        return Response({'code': 200, 'msg': 'async task submitted'})
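The Redis key mentioned in the comments is simply the task id with a fixed prefix. A small sketch of building it, using the example id from the comments above:

```python
task_id = "0bfb2616-14bc-4ee9-b734-a111f11fc378"

# key under which the Redis result backend stores this task's outcome
key = f"celery-task-meta-{task_id}"
print(key)  # -> celery-task-meta-0bfb2616-14bc-4ee9-b734-a111f11fc378
```

Rather than reading Redis directly, the higher-level interface is `celery.result.AsyncResult(task_id)`, which exposes the state via `.status` and the return value via `.get()` through whatever result backend is configured.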

That is a simple use of Celery with Django. If anything is unclear, questions are welcome in the comments!
