1. Common process methods
# Start the child process
p.start()
# Wait for the child: the main process exits only after the child finishes
p.join()
# Return True if the process is still running, False otherwise
p.is_alive()
# Daemon process: exits when the main process exits; defaults to False
p.daemon = True
# Set the process name
p.name = "My_Process"
# Terminate the child process
p.terminate()
p.join()
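A minimal sketch of these methods in action (the process name `My_Process` is just the example value from above):

```python
import time
from multiprocessing import Process

def worker():
    time.sleep(0.5)

if __name__ == '__main__':
    p = Process(target=worker, name='My_Process')
    p.daemon = True          # must be set before start()
    p.start()
    print(p.is_alive())      # True: the child is still sleeping
    p.join()                 # block until the child finishes
    print(p.is_alive())      # False: the child has exited
    print(p.name)            # My_Process
```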
2. Creating processes from a function
import time
import multiprocessing

def process(index):
    time.sleep(index)
    print(f'Process: {index}')

if __name__ == '__main__':
    print(f'CPU number: {multiprocessing.cpu_count()}')
    for i in range(5):
        p = multiprocessing.Process(target=process, args=[i])
        p.start()
    for p in multiprocessing.active_children():
        print(f'Child process name: {p.name} id: {p.pid}')
3. Creating processes by subclassing Process
import time
from multiprocessing import Process

class MyProcess(Process):
    def __init__(self, loop):
        super().__init__()
        self.loop = loop

    def run(self):
        for count in range(self.loop):
            time.sleep(1)
            print(f'Pid: {self.pid} LoopCount: {count}')

if __name__ == '__main__':
    for i in range(2, 5):
        p = MyProcess(i)
        p.start()
4. Locks
See: Python multithreading: 4 Locks
1. Mutex locks
See: Python multithreading: 4.1 Mutex locks
2. Reentrant locks
See: Python multithreading: 4.2 Reentrant locks
3. Locking strategies that prevent deadlock
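The referenced notes cover threads; `multiprocessing` exposes the same interface. A minimal sketch of a `multiprocessing.Lock` protecting a shared counter (the counter uses `multiprocessing.Value`, since a plain int would not be shared across processes):

```python
from multiprocessing import Process, Lock, Value

def add(counter, lock, n):
    for _ in range(n):
        with lock:                  # acquire/release via context manager
            counter.value += 1      # read-modify-write must be locked

if __name__ == '__main__':
    lock = Lock()
    counter = Value('i', 0)         # shared int in shared memory
    workers = [Process(target=add, args=(counter, lock, 1000)) for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(counter.value)            # 4000 with the lock; often less without it
```

Note that `counter.value += 1` is not atomic even though `Value` carries its own lock internally: the read and the write are separate operations, so an external lock is still required.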
5. Shared data
Processes are isolated from one another, so they do not share data by default; to exchange data between processes, use Queue or Pipe.
6. Inter-process communication
1. Queue
2. Pipe
Type | Definition |
---|---|
One-way pipe | Pipe(duplex=False) |
Two-way pipe | Pipe() |
from multiprocessing import Process, Pipe

class Consumer(Process):
    def __init__(self, pipe):
        Process.__init__(self)
        self.pipe = pipe

    def run(self):
        self.pipe.send('Consumer Words')
        print(f'Consumer Received: {self.pipe.recv()}')

class Producer(Process):
    def __init__(self, pipe):
        Process.__init__(self)
        self.pipe = pipe

    def run(self):
        print(f'Producer Received: {self.pipe.recv()}')
        self.pipe.send('Producer Words')

if __name__ == '__main__':
    # Pipe() returns a two-way (duplex) pipe by default
    pipe = Pipe()
    # Hand one end of the pipe to each process
    p = Producer(pipe[0])
    c = Consumer(pipe[1])
    p.daemon = c.daemon = True
    p.start()
    c.start()
    p.join()
    c.join()
    print('Main Process Ended')
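The Queue mechanism from 6.1 can be sketched the same way; here is a minimal producer/consumer pair using `multiprocessing.Queue`, with `None` as a sentinel to signal the end of the stream (the sentinel convention is this example's choice, not part of the Queue API):

```python
from multiprocessing import Process, Queue

def producer(queue):
    for i in range(3):
        queue.put(i)          # put() is process-safe
    queue.put(None)           # sentinel: tells the consumer to stop

def consumer(queue):
    while True:
        item = queue.get()    # blocks until an item is available
        if item is None:
            break
        print(f'Consumed: {item}')

if __name__ == '__main__':
    q = Queue()
    p = Process(target=producer, args=(q,))
    c = Process(target=consumer, args=(q,))
    p.start()
    c.start()
    p.join()
    c.join()
```

Unlike a Pipe, which has exactly two ends, a Queue can safely be shared by many producer and consumer processes.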
7. Semaphores
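This section is a stub in the notes; as a sketch, `multiprocessing.Semaphore(n)` lets at most n processes hold it at once, which is how the concurrency cap mentioned in section 8 could be built by hand:

```python
import time
from multiprocessing import Process, Semaphore

def task(sem, index):
    with sem:                          # at most 2 processes run this body at once
        print(f'Task {index} running')
        time.sleep(1)

if __name__ == '__main__':
    sem = Semaphore(2)                 # allow 2 concurrent holders
    procs = [Process(target=task, args=(sem, i)) for i in range(5)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```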
8. Process pools
Problem:
Suppose there are 10,000 tasks, each of which must run in its own process, with the next process starting as soon as the previous one finishes, and the number of concurrent processes must be capped.
Solution:
This can be built from Process and Semaphore, but doing so is tedious, so we use a process pool instead.
import time
from multiprocessing import Pool

def function(index):
    print(f'Start process: {index}')
    time.sleep(3)
    print(f'End process {index}')

if __name__ == '__main__':
    print('Main Process started')
    with Pool(processes=3) as pool:
        # Option 1: apply_async (non-blocking; needs close() + join() below)
        # [pool.apply_async(function, args=(i,)) for i in range(4)]
        # Option 2: map (blocks until every task finishes)
        pool.map(function, range(4))
        # close() stops new submissions; join() then waits for the workers.
        # Calling join() without close() first raises ValueError.
        pool.close()
        pool.join()
    print('Main Process ended')