Python asyncio vs. Celery / Django + Celery + asyncio: daemonic processes are not allowed to have children

I have seen similar questions to this before, but those were about running multiprocessing directly rather than an executor, so I'm not sure how to solve this.

I am using:

    celery==4.1.1
    django-celery==3.2.1
    django-celery-beat==1.0.1
    django-celery-results==1.0.1

My script is below; I have trimmed it down to show only the relevant code.

    @asyncio.coroutine
    def snmp_get(ip, oid, snmp_user, snmp_auth, snmp_priv):
        results = []
        snmpEngine = SnmpEngine()
        errorIndication, errorStatus, errorIndex, varBinds = yield from getCmd(
            ...
        )
        ...
        for varBind in varBinds:
            results.append(' = '.join([x.prettyPrint() for x in varBind]))
        snmpEngine.transportDispatcher.closeDispatcher()
        return results

    def create_link_data_record(link_data):
        obj = LinkData.objects.create(
            ...
        )
        return 'data polled for {} record {} created'.format(link_data.hostname, obj.id)

    async def retrieve_data(link, loop):
        from concurrent.futures import ProcessPoolExecutor
        executor = ProcessPoolExecutor(2)

        poll_interval = 60
        results = []

        # credentials:
        ...

        print('polling data for {} on {}'.format(hostname, link_mgmt_ip))

        # create link data obj
        link_data = LinkDataObj()
        ...

        # first poll for speeds
        download_speed_data_poll1 = await snmp_get(link_mgmt_ip, down_speed_oid % link_index, snmp_user, snmp_auth, snmp_priv)

        # check we were able to poll
        if 'timeout' in str(get_snmp_value(download_speed_data_poll1)).lower():
            return 'timeout trying to poll {} - {}'.format(hostname, link_mgmt_ip)

        upload_speed_data_poll1 = await snmp_get(link_mgmt_ip, up_speed_oid % link_index, snmp_user, snmp_auth, snmp_priv)

        # wait for poll interval
        await asyncio.sleep(poll_interval)

        # second poll for speeds
        download_speed_data_poll2 = await snmp_get(link_mgmt_ip, down_speed_oid % link_index, snmp_user, snmp_auth, snmp_priv)
        upload_speed_data_poll2 = await snmp_get(link_mgmt_ip, up_speed_oid % link_index, snmp_user, snmp_auth, snmp_priv)

        # create deltas for speed
        down_delta = int(get_snmp_value(download_speed_data_poll2)) - int(get_snmp_value(download_speed_data_poll1))
        up_delta = int(get_snmp_value(upload_speed_data_poll2)) - int(get_snmp_value(upload_speed_data_poll1))
        ...

        results.append(await loop.run_in_executor(executor, create_link_data_record, link_data))
        return results
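Note in passing that `create_link_data_record` is a synchronous Django ORM call, which is why it is handed off via `run_in_executor` rather than awaited directly; the `ProcessPoolExecutor` created above is also what the traceback further down points at, because inside a Celery worker it cannot spawn child processes. Since the ORM call is I/O-bound, a thread pool can perform the same hand-off without spawning processes. A minimal, self-contained sketch of that swap (the `blocking_db_write` stand-in is hypothetical, not from the original script):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def blocking_db_write(value):
    # stand-in for create_link_data_record: any blocking call works here
    return 'record created for {}'.format(value)

async def retrieve(loop, executor, value):
    # same hand-off as the failing code, but threads are permitted
    # inside a daemonic Celery worker while child processes are not
    return await loop.run_in_executor(executor, blocking_db_write, value)

loop = asyncio.new_event_loop()
executor = ThreadPoolExecutor(2)
try:
    print(loop.run_until_complete(retrieve(loop, executor, 'link-1')))
finally:
    loop.close()
```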

    def get_link_data():
        link_data = LinkTargets.objects.all()

        # create loop
        loop = asyncio.get_event_loop()
        if asyncio.get_event_loop().is_closed():
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(asyncio.new_event_loop())

        # create tasks
        tasks = [asyncio.ensure_future(retrieve_data(link, loop)) for link in link_data]
        if tasks:
            start = time.time()
            done, pending = loop.run_until_complete(asyncio.wait(tasks))
            loop.close()
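The `get_event_loop()` / `is_closed()` dance above is fragile: once `loop.close()` runs, the next call grabs the closed loop again. One common pattern, shown here only as a sketch rather than a required change, is to create a fresh loop per invocation and always close it in a `finally`:

```python
import asyncio

def run_polling(make_coros):
    # fresh loop per call, so a previously closed loop is never reused
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        tasks = [asyncio.ensure_future(c) for c in make_coros(loop)]
        if tasks:
            done, pending = loop.run_until_complete(asyncio.wait(tasks))
            return done
    finally:
        loop.close()  # always release the loop, even on error
```

This would be called as, e.g., `run_polling(lambda loop: [retrieve_data(link, loop) for link in LinkTargets.objects.all()])`.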

The error below references the `run_in_executor` code:

    [2018-05-24 14:13:00,840: ERROR/ForkPoolWorker-3] Task exception was never retrieved
    future: exception=AssertionError('daemonic processes are not allowed to have children',)>
    Traceback (most recent call last):
      File "/itapp/itapp/monitoring/jobs/link_monitoring.py", line 209, in retrieve_data
        link_data.last_change = await loop.run_in_executor(executor, timestamp, (link_data.link_target_id, link_data.service_status))
      File "/usr/local/lib/python3.6/asyncio/base_events.py", line 639, in run_in_executor
        return futures.wrap_future(executor.submit(func, *args), loop=self)
      File "/usr/local/lib/python3.6/concurrent/futures/process.py", line 466, in submit
        self._start_queue_management_thread()
      File "/usr/local/lib/python3.6/concurrent/futures/process.py", line 427, in _start_queue_management_thread
        self._adjust_process_count()
      File "/usr/local/lib/python3.6/concurrent/futures/process.py", line 446, in _adjust_process_count
        p.start()
      File "/usr/local/lib/python3.6/multiprocessing/process.py", line 103, in start
        'daemonic processes are not allowed to have children'
    AssertionError: daemonic processes are not allowed to have children
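What the assertion means: Celery's default prefork pool runs its workers as daemonic `multiprocessing` processes, and `multiprocessing` forbids a daemonic process from spawning children of its own, which is exactly what `ProcessPoolExecutor` does lazily on its first `submit()`. A minimal sketch reproducing the same error outside Celery:

```python
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def use_process_pool():
    # submit() lazily spawns the pool's worker processes; because this
    # function already runs inside a daemonic process, start() raises
    # AssertionError: daemonic processes are not allowed to have children
    with ProcessPoolExecutor(2) as executor:
        executor.submit(print, 'never runs')

if __name__ == '__main__':
    worker = multiprocessing.Process(target=use_process_pool, daemon=True)
    worker.start()  # the daemon flag mimics a Celery prefork worker
    worker.join()
```

Common workarounds are to use a `ThreadPoolExecutor` as sketched earlier, or to start the worker with a non-forking pool (for example `celery worker --pool=solo`, or a gevent/eventlet pool), so the task body does not run inside a daemonic child.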

`asyncio` and `Celery` are both tools for asynchronous programming in Python, but they differ in purpose and design.

**asyncio (asynchronous I/O):**

- An asynchronous I/O library in the Python standard library, focused on single-threaded concurrency. It suits I/O-bound work such as network requests and file access: with the async/await syntax, code reads sequentially while the event loop and callbacks interleave many tasks, improving responsiveness.
- Well suited to latency-sensitive scenarios such as real-time chat applications or web servers.
- asyncio tasks are normally started and managed by the program itself.

**Celery:**

- A distributed task queue built on message passing: time-consuming, complex, or multi-machine work is pushed onto a queue asynchronously. Celery supports several message brokers, such as RabbitMQ and Redis.
- Its core idea is to split work into independently executable units that background workers run, so developers can focus on business logic rather than task scheduling and failure handling.
- Celery provides rich error-handling and monitoring tools, plus cross-language support, so consumers written in other languages can interoperate with it.
- Suited to scenarios that need high availability and reliability, such as batch data processing, sending email, and crawlers.

**Differences in summary:**

1. asyncio targets single-machine, local, lightweight asynchronous programming, while Celery focuses on managing and executing distributed tasks.
2. asyncio optimizes concurrency inside a single application; Celery dispatches asynchronous tasks across processes and hosts.
3. With asyncio, the programmer manages task scheduling and error handling; Celery offers a more complete task-management solution.
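To make the contrast concrete, a minimal sketch of each style (the `fetch` coroutine, `add` task, and broker URL are illustrative, not from the original post):

```python
# asyncio: in-process concurrency on one machine, driven by an event loop.
import asyncio

async def fetch(n):
    await asyncio.sleep(1)  # stand-in for an I/O-bound call
    return n * 2

loop = asyncio.get_event_loop()
print(loop.run_until_complete(asyncio.gather(fetch(1), fetch(2))))  # [2, 4]

# Celery: work is serialized to a broker and executed later by separate
# worker processes, possibly on other machines (requires a running broker
# and a worker started with `celery -A tasks worker`).
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y

add.delay(2, 3)  # enqueue and return immediately; a worker executes it
```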