Python: possibly losing data with a multiprocessing Queue()

Suppose I have the following example, where I create a daemon worker process and try to communicate with it through an event flag:

from multiprocessing import Process, Event, Queue
import time

def reader(data):
    input_queue = data[0]
    e = data[1]
    output_queue = data[2]

    while True:
        if not e.is_set():               # if there is a signal to start
            msg = input_queue.get()      # read from the queue
            output_queue.put(msg)        # copy to output_queue
            if msg == 'DONE':            # signal to stop
                e.set()                  # signal that the worker is done

def writer(count, queue):
    # write 'count' numbers into the queue
    for ii in range(count):
        queue.put(ii)
    queue.put('DONE')

if __name__ == '__main__':
    input_queue = Queue()    # writer() writes to it, reader() reads from it
    output_queue = Queue()
    e = Event()
    e.set()

    reader_p = Process(target=reader, args=((input_queue, e, output_queue),))
    reader_p.daemon = True
    reader_p.start()         # launch reader() as a separate Python process

    for count in [10**4, 10**5, 10**6]:
        _start = time.time()
        writer(count, input_queue)   # send a lot of stuff to reader()
        e.clear()                    # unset the event, signalling the worker to start
        e.wait()                     # wait for the reader to finish

        # fetch results from output_queue:
        results = []
        while not output_queue.empty():
            results += [output_queue.get()]
        print(len(results))          # check how many results we have
        print("Sending %s numbers to Queue() took %s seconds" % (count,
              (time.time() - _start)))

I use an input queue and an output queue; in this example the worker simply copies the data to the output, and I fetch it later in the program. Everything seems to work until the data length reaches 10k (is that actually a queue size limit, in bytes?), but when I try to copy more elements, the number of results I get back varies randomly and is far smaller than the number sent.

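This is not a byte limit on the queue itself (a plain `Queue()` is unbounded by default); the counts vary because of how `put()` and `empty()` interact. `put()` only hands the object to a background feeder thread, which pickles it and writes it into a pipe later, so `empty()` on the receiving side can return `True` while items are still in flight; the standard library documents `qsize()`, `empty()`, and `full()` as unreliable for exactly this reason. A minimal, timing-dependent sketch of that race (single process, no workers involved):

```python
from multiprocessing import Queue
import time

q = Queue()
q.put(1)
# put() returned, but the feeder thread may not have written the item
# to the underlying pipe yet, so this may print True or False - it is a race.
print(q.empty())

time.sleep(0.5)          # give the feeder thread time to flush
print(q.empty())         # should print False once the item reaches the pipe
```

A drain loop built on `while not output_queue.empty():` therefore stops as soon as it happens to observe an in-flight moment, which is why the result count looks random and is always an undercount.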

Update: I have now tried sharing the data between three workers. I have checked that all of them are doing work, but the data loss has not stopped:

import multiprocessing
from multiprocessing import Process, Event, Queue
import time

def reader(data):
    input_queue = data[0]
    e = data[1]
    output_queue = data[2]

    while True:
        if not e.is_set():                   # if there is a signal to start
            #if not output_queue.empty():    # hangs for some reason
            msg = input_queue.get()          # read from the queue
            output_queue.put(msg)            # copy to output_queue
            #print("1")
            if msg == 'DONE':                # signal to stop
                e.set()                      # signal that there is no more data
                print("done")

def reader1(data):
    input_queue = data[0]
    e = data[1]
    output_queue = data[2]

    while True:
        if not e.is_set():                   # if there is a signal to start
            msg = input_queue.get()          # read from the queue
            output_queue.put(msg)            # copy to output_queue
            #print("2")
            if msg == 'DONE':                # signal to stop
                e.set()                      # signal that there is no more data
                print("done")

def reader2(data):
    input_queue = data[0]
    e = data[1]
    output_queue = data[2]

    while True:
        if not e.is_set():                   # if there is a signal to start
            msg = input_queue.get()          # read from the queue
            output_queue.put(msg)            # copy to output_queue
            #print("3")
            if msg == 'DONE':                # signal to stop
                e.set()                      # signal that there is no more data
                print("done")

def writer(count, queue):
    # write 'count' numbers into the queue
    for ii in range(count):
        queue.put(ii)
    queue.put('DONE')

if __name__ == '__main__':
    # I do not use a Manager, as it makes everything extremely slow
    #m = multiprocessing.Manager()
    #input_queue = m.Queue()

    input_queue = Queue()    # writer() writes to it, the readers read from it
    output_queue = Queue()
    e = Event()
    e.set()

    reader_p = Process(target=reader, args=((input_queue, e, output_queue),))
    reader_p.daemon = True
    reader_p.start()         # launch reader() as a separate Python process

    reader_p1 = Process(target=reader1, args=((input_queue, e, output_queue),))
    reader_p1.daemon = True
    reader_p1.start()

    reader_p2 = Process(target=reader2, args=((input_queue, e, output_queue),))
    reader_p2.daemon = True
    reader_p2.start()

    for count in [10**4, 10**5, 10**6]:
        _start = time.time()
        writer(count, input_queue)   # send a lot of stuff to the readers
        e.clear()                    # unset the event, signalling the workers to start
        e.wait()                     # wait for a reader to finish

        # fetch results from output_queue:
        results = []
        while not output_queue.empty():
            results += [output_queue.get()]
        print(len(results))          # check how many results we have
        print("Sending %s numbers to Queue() took %s seconds" % (count,
              (time.time() - _start)))

As a result, only sometimes does even the second stage complete correctly:

done
10001
Sending 10000 numbers to Queue() took 0.37468671798706055 seconds
done
18354
Sending 100000 numbers to Queue() took 1.2723915576934814 seconds
done
34807
Sending 1000000 numbers to Queue() took 9.1871018409729 seconds
done
10001
Sending 10000 numbers to Queue() took 0.37137532234191895 seconds
done
100001
Sending 100000 numbers to Queue() took 2.5747978687286377 seconds
done
217034
Sending 1000000 numbers to Queue() took 12.640174627304077 seconds
