I am writing a daemon that spawns several other child processes. After I run the stop script, the main process keeps running when it is supposed to exit, which confuses me.
import daemon, signal
import os
import time
from multiprocessing import Process, cpu_count, JoinableQueue
from http import httpserv
from worker import work

class Manager:
    """
    This manager starts the http server process and worker
    processes, and creates the input/output queues that keep the
    processes working together nicely.
    """
    def __init__(self):
        self.NUMBER_OF_PROCESSES = cpu_count()

    def start(self):
        self.i_queue = JoinableQueue()
        self.o_queue = JoinableQueue()

        # Create worker processes
        self.workers = [Process(target=work,
                                args=(self.i_queue, self.o_queue))
                        for i in range(self.NUMBER_OF_PROCESSES)]
        for w in self.workers:
            w.daemon = True
            w.start()

        # Create the http server process
        self.http = Process(target=httpserv, args=(self.i_queue, self.o_queue))
        self.http.daemon = True
        self.http.start()

        # Keep the current process from returning
        self.running = True
        while self.running:
            time.sleep(1)

    def stop(self):
        print "quitting ..."

        # Stop accepting new requests from users
        os.kill(self.http.pid, signal.SIGINT)

        # Wait for all requests in the output queue to be delivered
        self.o_queue.join()

        # Put a None sentinel into the input queue to signal the
        # worker processes to terminate
        self.i_queue.put(None)
        for w in self.workers:
            w.join()
        self.i_queue.join()

        # Let the main process return
        self.running = False


manager = Manager()

context = daemon.DaemonContext()
context.signal_map = {
    signal.SIGHUP: lambda signum, frame: manager.stop(),
}

context.open()
manager.start()
The stop script is just a one-liner, os.kill(pid, signal.SIGHUP). After that, the child processes (the worker processes and the http server process) end nicely, but the main process just stays there, and I have no idea what is keeping it from returning.
Answer:
You create the http server process but never join() it. What happens if, instead of doing an os.kill() to stop the http server process, you send it a stop-processing sentinel (None, like the one you send to the workers) and then do a self.http.join()?
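A minimal, self-contained sketch of that pattern, using a stand-in function in place of your httpserv (whose internals are not shown, so the sentinel handling inside it is an assumption):

```python
import multiprocessing

def fake_server(i_queue):
    # Stand-in for httpserv: serve until the None sentinel arrives.
    while True:
        item = i_queue.get()
        if item is None:
            break
        # ... handle a request here ...

if __name__ == "__main__":
    q = multiprocessing.Queue()
    http = multiprocessing.Process(target=fake_server, args=(q,))
    http.start()
    q.put(None)   # sentinel instead of os.kill()
    http.join()   # the parent reaps the child and knows it is gone
    print(http.exitcode)  # 0: clean exit
```

The point of join() here is not just politeness: it lets the parent observe that the child actually exited before it tears down shared queues.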
Update: you also need to send the None sentinel to the input queue once for each worker. You could try:
for w in self.workers:
    self.i_queue.put(None)
for w in self.workers:
    w.join()
Note: the reason you need two loops is that if you put the None into the queue in the same loop that does the join(), a worker other than w may pick up that None, so joining w would cause the caller to block.
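The two-loop version can be demonstrated end to end with a trivial worker (a sketch; the worker body here is made up, not your work function):

```python
import multiprocessing

def sentinel_worker(i_queue):
    # Consume items until this worker sees a None sentinel.
    while True:
        item = i_queue.get()
        if item is None:
            break

if __name__ == "__main__":
    q = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=sentinel_worker, args=(q,))
               for _ in range(4)]
    for w in workers:
        w.start()
    # First loop: one sentinel per worker, so every worker can exit.
    for w in workers:
        q.put(None)
    # Second loop: only now is it safe to join each worker in turn.
    for w in workers:
        w.join()
    print(all(w.exitcode == 0 for w in workers))  # True
```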
You don't show the code for the worker or the http server, so I am assuming they are well behaved about calling task_done and so on, and that each worker quits when it sees a None instead of get()ing more things from the input queue.
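For reference, a "well behaved" worker of the kind assumed above might look like this sketch (the doubling step is a placeholder for real work, not your actual work function):

```python
from multiprocessing import JoinableQueue, Process

def well_behaved_worker(i_queue, o_queue):
    # Pull work until the None sentinel arrives. Call task_done()
    # for every get(), including the sentinel, so the parent's
    # i_queue.join() can complete.
    while True:
        item = i_queue.get()
        if item is None:
            i_queue.task_done()
            break
        o_queue.put(item * 2)  # stand-in for real work
        i_queue.task_done()

if __name__ == "__main__":
    i_q, o_q = JoinableQueue(), JoinableQueue()
    w = Process(target=well_behaved_worker, args=(i_q, o_q))
    w.start()
    i_q.put(21)
    i_q.put(None)
    i_q.join()  # returns once the worker has task_done()'d everything
    w.join()
    print(o_q.get())  # 42
```

If a worker ever get()s an item without a matching task_done(), the parent's i_queue.join() blocks forever, which is one common way a main process gets stuck at shutdown.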
Also note that JoinableQueue.task_done() has at least one open, hard-to-reproduce issue that may bite you.
Tags: python, multiprocessing, daemon