Python Inter-Process Communication: Queue / Pipe

First published on the WeChat official account; search WeChat for: 猿说python

I. Foreword

1. In the previous article we looked at communication between threads;

2. Threads can communicate through the Queue module, and processes can also communicate through a Queue — but it is not the thread Queue. The inter-process Queue pickles the data and passes it to the other process, and it is used for communication between a parent process and its children, or between children of the same parent;

Using Queue for communication between threads:

```python
# Import the thread-related modules
import threading
import queue

q = queue.Queue()
```

Using Queue for communication between processes (suitable for many processes):

```python
# Import the process-related modules
from multiprocessing import Process
from multiprocessing import Queue

q = Queue()
```

Using Pipe for communication between processes (suitable for exactly two processes, one-to-one):

```python
# Import the process-related modules
from multiprocessing import Process
from multiprocessing import Pipe

pipe = Pipe()
```

II. Using Queue / Pipe for inter-process communication in Python

Python provides several ways for processes to communicate; the two main ones are Queue and Pipe. Queue is used for communication among multiple processes, while Pipe is used for communication between exactly two processes;

1. Using Queue for inter-process communication. Queue provides two main methods:

put(): inserts an item into the queue. It takes two optional arguments, block and timeout; see the official documentation for details.

get(): reads and removes one item from the queue. It likewise takes the optional block and timeout arguments; see the official documentation for details.
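As a quick sketch of how block and timeout behave (the values here are illustrative), a bounded multiprocessing.Queue raises queue.Full or queue.Empty when the wait fails — note the exception types come from the standard queue module:

```python
# A minimal single-process sketch of block/timeout on multiprocessing.Queue.
import queue                       # supplies the Full / Empty exception types
from multiprocessing import Queue

q = Queue(maxsize=1)               # capacity of one item
q.put('first')                     # succeeds immediately

try:
    # block=True with a timeout: waits up to 0.1s, then raises queue.Full
    q.put('second', True, 0.1)
except queue.Full:
    print('queue is full')

print(q.get())                     # blocks until an item is available

try:
    # block=False: raise queue.Empty at once if nothing is queued
    q.get(False)
except queue.Empty:
    print('queue is empty')
```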

```python
#!/usr/bin/env python
# -*- coding:utf-8 -*-
"""
@Author: 何以解忧
@Blog: shuopython.com
@WeChat Official Account: 猿说python
@Github: www.github.com
@File: python_process_queue.py
@Time: 2019/12/21 21:25
"""
from multiprocessing import Process
from multiprocessing import Queue
import os, time, random

# Code executed by the writer process
def proc_write(q, urls):
    print('Process is write....')
    for url in urls:
        q.put(url)
        print('put %s to queue... ' % url)
        time.sleep(random.random())

# Code executed by the reader process
def proc_read(q):
    print('Process is reading...')
    while True:
        url = q.get(True)
        print('Get %s from queue' % url)

if __name__ == '__main__':
    # The parent process creates the Queue and passes it to each child
    q = Queue()
    proc_write1 = Process(target=proc_write, args=(q, ['url_1', 'url_2', 'url_3']))
    proc_write2 = Process(target=proc_write, args=(q, ['url_4', 'url_5', 'url_6']))
    proc_reader = Process(target=proc_read, args=(q,))
    # Start the child processes
    proc_write1.start()
    proc_write2.start()
    proc_reader.start()
    # Wait for the writers to finish
    proc_write1.join()
    proc_write2.join()
    # proc_reader runs an infinite loop, so force it to stop
    proc_reader.terminate()
    print("main")
```

Output:

Process is write....
put url_1 to queue...
Process is write....
put url_4 to queue...
Process is reading...
Get url_1 from queue
Get url_4 from queue
put url_5 to queue...
Get url_5 from queue
put url_2 to queue...
Get url_2 from queue
put url_3 to queue...
Get url_3 from queue
put url_6 to queue...
Get url_6 from queue
main
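The example above has to terminate() the reader because its loop never ends. A common alternative pattern (a sketch, not from the original article) is to put a sentinel value on the queue so the reader exits on its own and join() returns normally:

```python
# Sketch of a cleaner shutdown: the parent puts a sentinel (None here)
# on the queue, and the reader breaks out of its loop when it sees it.
from multiprocessing import Process, Queue

def proc_read(q):
    while True:
        url = q.get(True)
        if url is None:            # sentinel: no more data is coming
            break
        print('Get %s from queue' % url)

if __name__ == '__main__':
    q = Queue()
    reader = Process(target=proc_read, args=(q,))
    reader.start()
    for url in ['url_1', 'url_2']:
        q.put(url)
    q.put(None)                    # tell the reader to stop
    reader.join()                  # returns instead of hanging forever
    print("main")
```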

2. Using Pipe for inter-process communication

Pipe is typically used between two processes, one at each end of the pipe. Pipe() returns a pair (conn1, conn2) representing the two ends. It takes a duplex argument, which defaults to True (full-duplex: both ends can send and receive); if duplex=False, conn1 can only receive and conn2 can only send. Each connection likewise provides two methods:

send(): send a message;

recv(): receive a message;

```python
from multiprocessing import Process
from multiprocessing import Pipe
import os, time, random

# Code executed by the sending process
def proc_send(pipe, urls):
    for url in urls:
        print('Process is send :%s' % url)
        pipe.send(url)
        time.sleep(random.random())

# Code executed by the receiving process
def proc_recv(pipe):
    while True:
        print('Process rev:%s' % pipe.recv())
        time.sleep(random.random())

if __name__ == '__main__':
    # The parent process creates the pipe and passes one end to each child
    pipe = Pipe()
    p1 = Process(target=proc_send, args=(pipe[0], ['url_' + str(i) for i in range(10)]))
    p2 = Process(target=proc_recv, args=(pipe[1],))
    # Start the child processes
    p1.start()
    p2.start()
    p1.join()
    # proc_recv runs an infinite loop, so force it to stop
    p2.terminate()
    print("main")
```

Output:

Process is send :url_0
Process rev:url_0
Process is send :url_1
Process rev:url_1
Process is send :url_2
Process rev:url_2
Process is send :url_3
Process rev:url_3
Process is send :url_4
Process rev:url_4
Process is send :url_5
Process is send :url_6
Process is send :url_7
Process rev:url_5
Process is send :url_8
Process is send :url_9
Process rev:url_6
main
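The duplex parameter described above can be sketched in a few lines (within a single process for brevity): with duplex=False the first connection is receive-only and the second is send-only, and using an end in the wrong direction raises OSError:

```python
# One-way pipe sketch: duplex=False returns (receive end, send end).
from multiprocessing import Pipe

recv_end, send_end = Pipe(duplex=False)
send_end.send('hello')
print(recv_end.recv())     # prints 'hello'

try:
    send_end.recv()        # wrong direction: this end is write-only
except OSError as e:
    print('cannot recv on the send end:', e)
```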

III. Can queue.Queue handle inter-process communication?

We can also test whether the thread-oriented queue.Queue is able to carry communication between processes. Example code:

```python
from multiprocessing import Process
# from multiprocessing import Queue  # inter-process Queue — do not confuse the two
import queue  # queue.Queue is for threads — do not confuse the two
import time

def p_put(q, *args):
    q.put(args)
    print('Has put %s' % args)

def p_get(q, *args):
    print('%s wait to get...' % args)
    print(q.get())
    print('%s got it' % args)

if __name__ == "__main__":
    q = queue.Queue()
    p1 = Process(target=p_put, args=(q, 'p1',))
    p2 = Process(target=p_get, args=(q, 'p2',))
    p1.start()
    p2.start()
```

It fails immediately with an exception:

Traceback (most recent call last):
  File "E:/Project/python_project/untitled10/123.py", line 38, in
    p1.start()
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)
  File "G:\ProgramData\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: can't pickle _thread.lock objects
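The failure happens because queue.Queue holds internal thread locks that cannot be pickled when the object is shipped to a child process. A minimal sketch of the fix (same put/get roles, simplified arguments) is to switch to multiprocessing.Queue:

```python
# The fix: multiprocessing.Queue is designed to be shared between
# processes; queue.Queue only works between threads of one process.
from multiprocessing import Process, Queue

def p_put(q, name):
    q.put(name)
    print('Has put %s' % name)

def p_get(q, name):
    print('%s got %s' % (name, q.get()))   # blocks until an item arrives

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=p_put, args=(q, 'p1'))
    p2 = Process(target=p_get, args=(q, 'p2'))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
```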

