Python multiprocessing by example: how do I use multiprocessing with class instances in Python?

I am trying to create a class that can run a separate process to do work that takes a long time, launch a bunch of these from the main module, and then wait for them all to finish. I want to launch the processes once and then keep feeding them things to do, rather than creating and destroying processes. For example, maybe I have 10 servers running the dd command, and then I want them all to scp a file, etc.

My ultimate goal is to create a class for each system that keeps track of the information of the system it is tied to, such as IP address, logs, runtime, etc. But that class must be able to launch a system command, return execution to the caller while the command runs, and then follow up on the result of the system command later.

My attempts are failing because I cannot send an instance method of the class over to the child process via pickle; instance methods are not picklable. I have tried to work around the problem in various ways, but I can't figure it out. How can my code be patched to do this? What good is multiprocessing if you can't send over anything useful?

Is there any good documentation on using multiprocessing with class instances? The only way I can get the multiprocessing module to work is with simple functions; every attempt to use it inside a class instance has failed. Maybe I should pass events instead? I don't understand how to do that yet.
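
To make the failure concrete, here is a minimal sketch of what I mean (my own illustration, assuming Python 2, where bound methods have no default pickle support):

import pickle

def plain_function():
    return "ok"

class Thing(object):
    def method(self):
        return "ok"

# Module-level functions pickle by reference, so this works:
pickle.dumps(plain_function)

# Bound instance methods are not picklable in Python 2, so this raises:
try:
    pickle.dumps(Thing().method)
except Exception as e:
    print "can't pickle the method:", e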

import multiprocessing
import sys
import re

class ProcessWorker(multiprocessing.Process):
    """
    This class runs as a separate process to execute worker's commands in parallel
    Once launched, it remains running, monitoring the task queue, until "None" is sent
    """

    def __init__(self, task_q, result_q):
        multiprocessing.Process.__init__(self)
        self.task_q = task_q
        self.result_q = result_q
        return

    def run(self):
        """
        Overloaded function provided by multiprocessing.Process. Called upon start() signal
        """
        proc_name = self.name
        print '%s: Launched' % (proc_name)
        while True:
            next_task_list = self.task_q.get()
            if next_task_list is None:
                # Poison pill means shutdown
                print '%s: Exiting' % (proc_name)
                self.task_q.task_done()
                break
            next_task = next_task_list[0]
            print '%s: %s' % (proc_name, next_task)
            args = next_task_list[1]
            kwargs = next_task_list[2]
            answer = next_task(*args, **kwargs)
            self.task_q.task_done()
            self.result_q.put(answer)
        return
# End of ProcessWorker class

class Worker(object):
    """
    Launches a child process to run commands from derived classes in separate processes,
    which sit and listen for something to do
    This base class is called by each derived worker
    """

    def __init__(self, config, index=None):
        self.config = config
        self.index = index

        # Launch the ProcessWorker for anything that has an index value
        if self.index is not None:
            self.task_q = multiprocessing.JoinableQueue()
            self.result_q = multiprocessing.Queue()

            self.process_worker = ProcessWorker(self.task_q, self.result_q)
            self.process_worker.start()
            print "Got here"
            # Process should be running and listening for functions to execute
        return

    def enqueue_process(target):  # No self, since it is a decorator
        """
        Used to place a command target from this class object into the task_q
        NOTE: Any function decorated with this must use fetch_results() to get the
        target task's result value
        """
        def wrapper(self, *args, **kwargs):
            self.task_q.put([target, args, kwargs])  # FAIL: target is a class instance method and can't be pickled!
        return wrapper

    def fetch_results(self):
        """
        After all processes have been spawned by multiple modules, this command
        is called on each one to retrieve the results of the call.
        This blocks until the execution of the item in the queue is complete
        """
        self.task_q.join()          # Wait for the task to finish
        return self.result_q.get()  # Return the result

    @enqueue_process
    def run_long_command(self, command):
        print "I am running command '%s' as a separate process" % command
        # In here, I will launch a subprocess to run a long-running system command
        # p = Popen(command), etc
        # p.wait(), etc
        return

    def close(self):
        self.task_q.put(None)
        self.task_q.join()

if __name__ == '__main__':
    config = ["some value", "something else"]
    index = 7
    workers = []
    for i in range(5):
        worker = Worker(config, index)
        worker.run_long_command("ls /")
        workers.append(worker)
    for worker in workers:
        worker.fetch_results()

    # Do more work... (this would actually be done in a distributor in another class)

    for worker in workers:
        worker.close()
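
(One commonly suggested workaround, sketched below as an assumption rather than a confirmed fix, with a hypothetical DispatchWorker class: put only picklable data on the queue, e.g. a method-name string plus arguments, and have the child process resolve the name with getattr on an object it owns.)

class DispatchWorker(multiprocessing.Process):
    """
    Sketch: the child owns the methods, so only plain picklable data
    (a method name, args, kwargs) ever crosses the queue.
    """
    def __init__(self, task_q, result_q):
        multiprocessing.Process.__init__(self)
        self.task_q = task_q
        self.result_q = result_q

    def run_long_command(self, command):
        # A real version would Popen(command) here and wait for it
        return "ran: %s" % command

    def run(self):
        while True:
            task = self.task_q.get()
            if task is None:                  # poison pill means shutdown
                self.task_q.task_done()
                break
            method_name, args, kwargs = task  # all plain, picklable objects
            answer = getattr(self, method_name)(*args, **kwargs)
            self.task_q.task_done()
            self.result_q.put(answer)

The parent would then call task_q.put(['run_long_command', ('ls /',), {}]) instead of putting a bound method on the queue.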

EDIT: I tried moving the ProcessWorker class, and the creation of the multiprocessing queues, outside of the Worker class, and then tried to manually pickle the worker instance. Even that doesn't work, and I get the error:

    RuntimeError: Queue objects should only be shared between processes through inheritance

But I am only passing references to those queues into the worker instance? I'm missing something fundamental. Here is the modified code from the main section:

import pickle  # needed for the pickle.dumps() below

if __name__ == '__main__':
    config = ["some value", "something else"]
    index = 7
    workers = []
    for i in range(1):
        task_q = multiprocessing.JoinableQueue()
        result_q = multiprocessing.Queue()
        process_worker = ProcessWorker(task_q, result_q)
        # Worker.__init__ modified to accept the process and queues from outside
        worker = Worker(config, index, process_worker, task_q, result_q)
        something_to_look_at = pickle.dumps(worker)  # FAIL: Doesn't like queues??
        process_worker.start()
        worker.run_long_command("ls /")
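
For what it's worth, here is a minimal sketch (my own, not from the original post) of what "shared through inheritance" means: a queue handed to the Process constructor is inherited when the child is forked and works fine, while pickling that same queue after the fact (directly, or inside an object like Worker above) raises exactly the RuntimeError shown:

import multiprocessing
import pickle

def child(q):
    q.put("hello from the child")

if __name__ == '__main__':
    q = multiprocessing.Queue()

    # OK: passed at Process creation time, so the child inherits it on fork
    p = multiprocessing.Process(target=child, args=(q,))
    p.start()
    print q.get()
    p.join()

    # Not OK: a queue refuses to be pickled outside of process spawning
    try:
        pickle.dumps(q)
    except RuntimeError as e:
        print "as expected:", e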
