Celery - Questions and Notes on the Concurrency Behavior of group

After reading the official blog I still had a few questions about how group behaves, so I ran some experiments and recorded them here:

Question 1: What does the concurrency of group actually refer to?

The question: start 10 workers (in this setup, a single Celery worker node whose prefork pool has 10 processes, ForkPoolWorker-1 through ForkPoolWorker-10; each pool process is called a "worker" below). In the ordinary case, when 10 tasks are pushed one after another, are they picked up and executed by the 10 workers at the same time? In the group case, when 10 tasks are submitted as one group, are they handled by a single worker in one pass, or do the 10 workers process the 10 tasks simultaneously?

Answer: In the ordinary case, with 10 workers running, pushing 10 tasks onto the broker one after another results in a single worker processing the 10 tasks in sequence. In the group case, with 10 workers running, pushing in one group that contains 10 tasks results in the 10 workers processing the 10 tasks simultaneously.

Test for the ordinary case: generate 10 asynchronous tasks and observe whether the 10 tasks are piled onto the broker all at once or placed there one by one, and whether the 10 workers fire at the same time or a single worker executes all of the tasks.

from proj.tasks import add
from celery import group

if __name__ == '__main__':
    for i in range(10):
        # delay() sends the task, but get() blocks until it has finished,
        # so the next task is only sent after the previous one completes.
        res = add.delay(i, i)
        print(res.get())

Test output:

python client.py
0
...

Checking the active tasks on the worker: when the 10 tasks are triggered one after another, a single worker fetches the current task and executes it; only one task is active at a time.

celery -A proj.tasks inspect active
-> celery@k8s-master: OK
    * {'args': [0, 0], 'time_start': 1593677240.2096095, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': 'c505b9d5-e6ba-4dfe-bcff-ece91fc39252', 'worker_pid': 10990}

On the server side we can see that one task is received and executed to completion before the second task is accepted; the tasks run strictly one after another.

[2020-07-02 16:07:20,208: INFO/MainProcess] Received task: proj.tasks.add[c505b9d5-e6ba-4dfe-bcff-ece91fc39252]
[2020-07-02 16:07:25,222: INFO/ForkPoolWorker-10] Task proj.tasks.add[c505b9d5-e6ba-4dfe-bcff-ece91fc39252] succeeded in 5.012286833s: 0
[2020-07-02 16:07:25,225: INFO/MainProcess] Received task: proj.tasks.add[6f93e8da-cd33-4960-8bff-4fb92361fe20]
[2020-07-02 16:07:30,234: INFO/ForkPoolWorker-10] Task proj.tasks.add[6f93e8da-cd33-4960-8bff-4fb92361fe20] succeeded in 5.007919535s: 2
[2020-07-02 16:07:30,237: INFO/MainProcess] Received task: proj.tasks.add[bab2a232-9ccc-4c95-9e55-7e2b70eddf0c]
[2020-07-02 16:07:35,245: INFO/ForkPoolWorker-10] Task proj.tasks.add[bab2a232-9ccc-4c95-9e55-7e2b70eddf0c] succeeded in 5.007347468s: 4
...

Test for the group case: generate 10 asynchronous tasks inside a single group and observe whether one worker fetches the 10 tasks from the group and executes them in sequence, or the 10 workers take tasks from the group and execute them all at once.

from proj.tasks import add
from celery import group

if __name__ == '__main__':
    # All ten signatures are packed into one group and sent together.
    res = group(add.s(i, i) for i in range(10))()
    print(res.get())

Checking the active tasks on the worker: the 10 workers fetch the tasks contained in the group at the same time and execute them together; the tasks start and finish in lockstep.

celery -A proj.tasks inspect active
-> celery@k8s-master: OK
    * {'args': [9, 9], 'time_start': 1593678890.5774257, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '86a881dd-e389-40ec-9660-68a64cea624d', 'worker_pid': 12323}
    * {'args': [1, 1], 'time_start': 1593678890.544256, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': 'b4febc78-0d88-45a4-986d-e991d21087cd', 'worker_pid': 12322}
    * {'args': [4, 4], 'time_start': 1593678890.5624309, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': 'f814cfb0-a682-408b-896e-015247503927', 'worker_pid': 12329}
    * {'args': [7, 7], 'time_start': 1593678890.5720773, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '7032c8c3-6098-4425-87ed-94c0b025e4c9', 'worker_pid': 12325}
    * {'args': [6, 6], 'time_start': 1593678890.5692344, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '4cb111e1-8ae9-4c00-a84a-4166f7b11adf', 'worker_pid': 12327}
    * {'args': [2, 2], 'time_start': 1593678890.5505195, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '6ab3edda-d984-44c1-b4b1-410cb7d392ec', 'worker_pid': 12324}
    * {'args': [0, 0], 'time_start': 1593678890.5383272, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '2d52fb6d-c5c3-4717-8e2d-8ec600e2f986', 'worker_pid': 12330}
    * {'args': [8, 8], 'time_start': 1593678890.5748217, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '49a18ff7-5273-458d-9da1-b4b7bb894610', 'worker_pid': 12321}
    * {'args': [5, 5], 'time_start': 1593678890.5663877, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '1f67b692-c5ff-4fb0-9d64-6ec6b2fe9d03', 'worker_pid': 12328}
    * {'args': [3, 3], 'time_start': 1593678890.5566044, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '8d194b13-231f-4e24-8625-0916888715b9', 'worker_pid': 12326}

Server-side log:

[2020-07-02 16:34:50,537: INFO/MainProcess] Received task: proj.tasks.add[2d52fb6d-c5c3-4717-8e2d-8ec600e2f986]
[2020-07-02 16:34:50,543: INFO/MainProcess] Received task: proj.tasks.add[b4febc78-0d88-45a4-986d-e991d21087cd]
[2020-07-02 16:34:50,549: INFO/MainProcess] Received task: proj.tasks.add[6ab3edda-d984-44c1-b4b1-410cb7d392ec]
[2020-07-02 16:34:50,555: INFO/MainProcess] Received task: proj.tasks.add[8d194b13-231f-4e24-8625-0916888715b9]
[2020-07-02 16:34:50,561: INFO/MainProcess] Received task: proj.tasks.add[f814cfb0-a682-408b-896e-015247503927]
[2020-07-02 16:34:50,565: INFO/MainProcess] Received task: proj.tasks.add[1f67b692-c5ff-4fb0-9d64-6ec6b2fe9d03]
[2020-07-02 16:34:50,568: INFO/MainProcess] Received task: proj.tasks.add[4cb111e1-8ae9-4c00-a84a-4166f7b11adf]
[2020-07-02 16:34:50,571: INFO/MainProcess] Received task: proj.tasks.add[7032c8c3-6098-4425-87ed-94c0b025e4c9]
[2020-07-02 16:34:50,573: INFO/MainProcess] Received task: proj.tasks.add[49a18ff7-5273-458d-9da1-b4b7bb894610]
[2020-07-02 16:34:50,576: INFO/MainProcess] Received task: proj.tasks.add[86a881dd-e389-40ec-9660-68a64cea624d]
[2020-07-02 16:34:55,545: INFO/ForkPoolWorker-10] Task proj.tasks.add[2d52fb6d-c5c3-4717-8e2d-8ec600e2f986] succeeded in 5.006528486s: 0
[2020-07-02 16:34:55,550: INFO/ForkPoolWorker-2] Task proj.tasks.add[b4febc78-0d88-45a4-986d-e991d21087cd] succeeded in 5.006150627s: 2
[2020-07-02 16:34:55,556: INFO/ForkPoolWorker-4] Task proj.tasks.add[6ab3edda-d984-44c1-b4b1-410cb7d392ec] succeeded in 5.006233327s: 4
[2020-07-02 16:34:55,562: INFO/ForkPoolWorker-6] Task proj.tasks.add[8d194b13-231f-4e24-8625-0916888715b9] succeeded in 5.00582992s: 6
[2020-07-02 16:34:55,568: INFO/ForkPoolWorker-9] Task proj.tasks.add[f814cfb0-a682-408b-896e-015247503927] succeeded in 5.005885954s: 8
[2020-07-02 16:34:55,571: INFO/ForkPoolWorker-8] Task proj.tasks.add[1f67b692-c5ff-4fb0-9d64-6ec6b2fe9d03] succeeded in 5.005210609s: 10
[2020-07-02 16:34:55,575: INFO/ForkPoolWorker-5] Task proj.tasks.add[7032c8c3-6098-4425-87ed-94c0b025e4c9] succeeded in 5.003435009s: 14
[2020-07-02 16:34:55,576: INFO/ForkPoolWorker-7] Task proj.tasks.add[4cb111e1-8ae9-4c00-a84a-4166f7b11adf] succeeded in 5.006741387s: 12
[2020-07-02 16:34:55,581: INFO/ForkPoolWorker-3] Task proj.tasks.add[86a881dd-e389-40ec-9660-68a64cea624d] succeeded in 5.003766181s: 18
[2020-07-02 16:34:55,581: INFO/ForkPoolWorker-1] Task proj.tasks.add[49a18ff7-5273-458d-9da1-b4b7bb894610] succeeded in 5.006813998s: 16

Summary: firing asynchronous calls of a task one after another causes each task to be picked up by the MainProcess and executed by a single worker; only after it has finished is the next task fired and the process repeated. In this test that is a direct consequence of the client code: each res.get() blocks until the task has completed, so there is never more than one task waiting on the broker. To have several tasks picked up by the MainProcess at once and executed concurrently, they can be packed together with group.
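
As a minimal sketch (assuming the same proj.tasks.add task as above; this variant is not part of the original experiments), the blocking can also be avoided without group by firing all of the delay() calls first, keeping the AsyncResult objects, and only calling get() afterwards. The 10 tasks then reach the broker back to back and the 10 workers can pick them up concurrently.

from proj.tasks import add

if __name__ == '__main__':
    # Send every task first; delay() returns an AsyncResult immediately.
    results = [add.delay(i, i) for i in range(10)]
    # Only now wait for the results, after all 10 tasks are on the broker.
    print([res.get() for res in results])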

Question 2: What is the difference between starting different tasks one after another and packing them with group before starting them?

Answer: Within a single Python script, starting several tasks one after another, whether or not they are of the same type, still results in the main program receiving and executing the tasks one at a time. Tasks packed into a group, on the other hand, can be of the same type or of different types, and they are executed concurrently.

Starting several tasks one after another

from proj.tasks import add, less, mul, exc
from celery import group

if __name__ == '__main__':
    for i in range(10):
        # Each task is sent and then waited on before the next call,
        # so only one task is ever in flight at a time.
        res_add = add.delay(i, i)
        print(res_add.get())

        res_less = less.delay(10, i)
        print(res_less.get())

        res_mul = mul.delay(i, i)
        print(res_mul.get())

        res_exc = exc.delay(i, 10)
        print(res_exc.get())

Checking the active tasks on the worker, only a single task is executing on a single worker at any given moment.

celery -A proj.tasks inspect active
-> celery@k8s-master: OK
    * {'args': [10, 1], 'time_start': 1593680790.34875, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': 'ac8f4901-fb3a-4eba-ab61-282b83e47f28', 'worker_pid': 13990}

Server-side log:

[2020-07-02 17:14:07,796: INFO/ForkPoolWorker-3] Task proj.tasks.add[28661b19-20d3-45c1-bdc5-7dd11db5eeff] succeeded in 0.012954075999s: 0
[2020-07-02 17:14:07,799: INFO/MainProcess] Received task: proj.tasks.less[eb7ddfa4-dbfb-48e0-a8a7-7a0b220880f7]
[2020-07-02 17:14:07,805: INFO/ForkPoolWorker-3] Task proj.tasks.less[eb7ddfa4-dbfb-48e0-a8a7-7a0b220880f7] succeeded in 0.00412611799948s: 10
[2020-07-02 17:14:07,808: INFO/MainProcess] Received task: proj.tasks.mul[b10aed1d-0624-4999-9664-d34a85b36874]
[2020-07-02 17:14:07,811: INFO/ForkPoolWorker-3] Task proj.tasks.mul[b10aed1d-0624-4999-9664-d34a85b36874] succeeded in 0.0022069000006s: 0
[2020-07-02 17:14:07,816: INFO/MainProcess] Received task: proj.tasks.exc[0a5c6b5a-5d62-4d8c-a916-60de24716e9b]
[2020-07-02 17:14:07,819: INFO/ForkPoolWorker-3] Task proj.tasks.exc[0a5c6b5a-5d62-4d8c-a916-60de24716e9b] succeeded in 0.00149728500037s: 0
[2020-07-02 17:14:07,824: INFO/MainProcess] Received task: proj.tasks.add[38a111a4-4638-4eea-ae19-a98ca55e5ea1]
[2020-07-02 17:14:07,827: INFO/ForkPoolWorker-3] Task proj.tasks.add[38a111a4-4638-4eea-ae19-a98ca55e5ea1] succeeded in 0.00158845900114s: 2
[2020-07-02 17:14:07,831: INFO/MainProcess] Received task: proj.tasks.less[6f7bb279-dd29-4867-9bb9-55e482ba5e45]
[2020-07-02 17:14:07,836: INFO/ForkPoolWorker-3] Task proj.tasks.less[6f7bb279-dd29-4867-9bb9-55e482ba5e45] succeeded in 0.00343750799948s: 9
[2020-07-02 17:14:07,839: INFO/MainProcess] Received task: proj.tasks.mul[c90664c7-c444-47ee-9756-9545c4d8cc5d]
[2020-07-02 17:14:07,843: INFO/ForkPoolWorker-3] Task proj.tasks.mul[c90664c7-c444-47ee-9756-9545c4d8cc5d] succeeded in 0.00330464900071s: 1

Packing several tasks together

from proj.tasks import add, less, mul, exc
from celery import group

if __name__ == '__main__':
    # A group can mix signatures of different task types.
    res = group(add.s(1, 1), less.s(10, 3), mul.s(3, 1), exc.s(4, 10), add.s(5, 10))()
    print(res.get())

Checking what the worker has picked up: all 5 tasks are fetched at the same time and are executing.

celery -A proj.tasks inspect active
-> celery@k8s-master: OK
    * {'args': [10, 3], 'time_start': 1593682284.8107345, 'name': 'proj.tasks.less', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.less', 'id': '9325508a-cbb8-40f4-b58b-55dcc421fa3b', 'worker_pid': 15680}
    * {'args': [4, 10], 'time_start': 1593682284.8200846, 'name': 'proj.tasks.exc', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.exc', 'id': '75a9ecb0-7299-4747-b398-3d229c5237f9', 'worker_pid': 15676}
    * {'args': [3, 1], 'time_start': 1593682284.8166103, 'name': 'proj.tasks.mul', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.mul', 'id': '4005e4dc-4b3b-490d-93a0-a21b1742f8fc', 'worker_pid': 15674}
    * {'args': [5, 10], 'time_start': 1593682284.8231542, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '8c15f143-8dd1-4a99-a53d-edebe458c23f', 'worker_pid': 15677}
    * {'args': [1, 1], 'time_start': 1593682284.804, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '2fc41380-c644-4b92-a3b7-0ff0657af0b0', 'worker_pid': 15681}

Server-side log: the tasks are received together and then executed concurrently.

[2020-07-02 17:31:24,802: INFO/MainProcess] Received task: proj.tasks.add[2fc41380-c644-4b92-a3b7-0ff0657af0b0]
[2020-07-02 17:31:24,809: INFO/MainProcess] Received task: proj.tasks.less[9325508a-cbb8-40f4-b58b-55dcc421fa3b]
[2020-07-02 17:31:24,815: INFO/MainProcess] Received task: proj.tasks.mul[4005e4dc-4b3b-490d-93a0-a21b1742f8fc]
[2020-07-02 17:31:24,819: INFO/MainProcess] Received task: proj.tasks.exc[75a9ecb0-7299-4747-b398-3d229c5237f9]
[2020-07-02 17:31:24,822: INFO/MainProcess] Received task: proj.tasks.add[8c15f143-8dd1-4a99-a53d-edebe458c23f]
[2020-07-02 17:31:29,852: INFO/ForkPoolWorker-10] Task proj.tasks.add[2fc41380-c644-4b92-a3b7-0ff0657af0b0] succeeded in 5.048288559s: 2
[2020-07-02 17:31:29,854: INFO/ForkPoolWorker-9] Task proj.tasks.less[9325508a-cbb8-40f4-b58b-55dcc421fa3b] succeeded in 5.043351114s: 7
[2020-07-02 17:31:29,856: INFO/ForkPoolWorker-3] Task proj.tasks.mul[4005e4dc-4b3b-490d-93a0-a21b1742f8fc] succeeded in 5.039019935s: 3
[2020-07-02 17:31:29,857: INFO/ForkPoolWorker-6] Task proj.tasks.add[8c15f143-8dd1-4a99-a53d-edebe458c23f] succeeded in 5.033975073s: 15
[2020-07-02 17:31:29,858: INFO/ForkPoolWorker-5] Task proj.tasks.exc[75a9ecb0-7299-4747-b398-3d229c5237f9] succeeded in 5.038492595s: 0

Question 3: How can concurrency be achieved without using group?

Answer: run several client scripts at the same time to simulate concurrency. Write four client scripts as follows:

Script 1

from proj.tasks import add, less, mul, exc
from celery import group

if __name__ == '__main__':
    res_add = add.delay(5, 5)
    print(res_add.get())

Script 2

from proj.tasks import add, less, mul, exc
from celery import group

if __name__ == '__main__':
    res = less.delay(10, 3)
    print(res.get())

Script 3

from proj.tasks import add, less, mul, exc
from celery import group

if __name__ == '__main__':
    res = mul.delay(1, 1)
    print(res.get())

Script 4

from proj.tasks import add, less, mul, exc
from celery import group

if __name__ == '__main__':
    res = exc.delay(1, 1)
    print(res.get())
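
One possible way to launch the four scripts at roughly the same time is sketched below; the file names client_add.py through client_exc.py are placeholders, since the original post only labels the scripts Script 1 through Script 4.

import subprocess
import sys

# Hypothetical file names for the four client scripts above.
scripts = ['client_add.py', 'client_less.py', 'client_mul.py', 'client_exc.py']

if __name__ == '__main__':
    # Start all four clients without waiting, so their delay() calls overlap.
    procs = [subprocess.Popen([sys.executable, script]) for script in scripts]
    # Wait for every client (and therefore every task) to finish.
    for proc in procs:
        proc.wait()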

Running the four scripts at the same time and then checking the worker shows all 4 tasks being fetched and executed together.

celery -A proj.tasks inspect active
-> celery@k8s-master: OK
    * {'args': [5, 5], 'time_start': 1593683563.1538188, 'name': 'proj.tasks.add', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.add', 'id': '8400c80b-b11a-4070-bcc7-66850d793c86', 'worker_pid': 16399}
    * {'args': [1, 1], 'time_start': 1593683566.5997114, 'name': 'proj.tasks.exc', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.exc', 'id': '1336abc3-dcd7-4738-96ad-edf43f85d855', 'worker_pid': 16405}
    * {'args': [10, 3], 'time_start': 1593683564.411672, 'name': 'proj.tasks.less', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.less', 'id': 'a7fa5ec4-208f-4b05-944a-c364199ea4a8', 'worker_pid': 16402}
    * {'args': [1, 1], 'time_start': 1593683565.535897, 'name': 'proj.tasks.mul', 'delivery_info': {'priority': 0, 'redelivered': None, 'routing_key': 'celery', 'exchange': u''}, 'hostname': 'celery@k8s-master', 'acknowledged': True, 'kwargs': {}, 'type': 'proj.tasks.mul', 'id': '164c782e-b578-483c-a651-2229f0d22dce', 'worker_pid': 16406}

Server-side log:

[2020-07-02 17:53:31,472: INFO/MainProcess] Received task: proj.tasks.add[f4e15c12-752e-460d-a231-083b2124ff5b]
[2020-07-02 17:53:32,594: INFO/MainProcess] Received task: proj.tasks.less[77799810-123e-427a-b274-cd79fea5f205]
[2020-07-02 17:53:33,569: INFO/MainProcess] Received task: proj.tasks.mul[f140054b-1685-4203-a3d3-7db7c04b93b2]
[2020-07-02 17:53:34,552: INFO/MainProcess] Received task: proj.tasks.exc[10a126fd-c201-425e-967f-5bab52692fc9]
[2020-07-02 17:53:36,483: INFO/ForkPoolWorker-9] Task proj.tasks.add[f4e15c12-752e-460d-a231-083b2124ff5b] succeeded in 5.009999946s: 10
[2020-07-02 17:53:37,605: INFO/ForkPoolWorker-2] Task proj.tasks.less[77799810-123e-427a-b274-cd79fea5f205] succeeded in 5.010251414s: 7
[2020-07-02 17:53:38,580: INFO/ForkPoolWorker-5] Task proj.tasks.mul[f140054b-1685-4203-a3d3-7db7c04b93b2] succeeded in 5.009864301s: 1
[2020-07-02 17:53:39,563: INFO/ForkPoolWorker-8] Task proj.tasks.exc[10a126fd-c201-425e-967f-5bab52692fc9] succeeded in 5.009972664s: 1

Appendix

The tasks.py used in the experiments is as follows:

from __future__ import absolute_import, unicode_literals, print_function
from .config import app

import os
import time

@app.task
def add(x, y):
    time.sleep(5)
    return x + y

@app.task
def mul(x, y):
    time.sleep(5)
    return x * y

@app.task
def less(x, y):
    time.sleep(5)
    return x - y

@app.task
def exc(x, y):
    time.sleep(5)
    return x / y

@app.task
def xsum(numbers):
    return sum(numbers)

@app.task
def temp():
    # Calls xsum() directly (not via delay), so it runs inline inside this task.
    return [xsum(range(10)), xsum(range(100))]

@app.task
def temp1():
    # Calls add() directly as well; each call sleeps 5 seconds inside this task.
    return [add(i, i) for i in range(10)]

@app.task
def tsum(numbers):
    return sum(numbers)

@app.task
def log_error(request, exc, traceback):
    with open(os.path.join('/var/errors', request.id), 'a') as fh:
        # request.id (not the undefined name task_id) identifies the failed task.
        print('--\n\n{0} {1} {2}'.format(request.id, exc, traceback), file=fh)

@app.task
def on_chord_error(request, exc, traceback):
    print('Task {0!r} raised error: {1!r}'.format(request.id, exc))
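
tasks.py imports app from proj/config.py, which the original post does not show. Below is a minimal sketch of what such a module could look like; the Redis broker and result-backend URLs are assumptions (any supported broker and backend would work), and the worker in these experiments was presumably started with a pool of 10 processes, e.g. celery -A proj.tasks worker --concurrency=10 --loglevel=info.

# proj/config.py -- a minimal sketch, not the author's actual configuration.
from __future__ import absolute_import, unicode_literals
from celery import Celery

app = Celery(
    'proj',
    broker='redis://localhost:6379/0',   # assumed broker URL
    backend='redis://localhost:6379/1',  # assumed result backend URL
    include=['proj.tasks'],              # make sure the tasks module is loaded
)

if __name__ == '__main__':
    app.start()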