How does locking work with Python's manager.dict()?

A manager.dict() allows you to share a dictionary across processes and perform thread-safe operations on it.

In my case, a coordinator process creates the shared dict with m elements, and each of n worker processes reads from and writes to a single dict key.

Does manager.dict() have a single lock for the whole dict, or m locks, one for each key?

Is there an alternative way to share m elements with n workers, other than a shared dict, when the workers do not need to communicate with each other?

Solution

After some experimentation, I can say there is only one lock per manager dict.

Here is the code that proves it:

import time
import multiprocessing as mp


def process_f(key, shared_dict):
    # Build a large payload so the write to the shared dict takes a while.
    values = [i for i in range(64 * 1024 * 1024)]
    print("Writing {}...".format(key))
    a = time.time()
    shared_dict[key] = values
    b = time.time()
    print("released {} in {}ms".format(key, (b - a) * 1000))


def main():
    process_manager = mp.Manager()
    n = 5
    keys = [i for i in range(n)]
    shared_dict = process_manager.dict({i: i * i for i in keys})
    pool = mp.Pool(processes=n)
    for i in range(n):
        pool.apply_async(process_f, (keys[i], shared_dict))
    time.sleep(20)


if __name__ == '__main__':
    main()

output:

Writing 4...
Writing 3...
Writing 1...
Writing 2...
Writing 0...
released 4 in 3542.7968502ms
released 0 in 4416.22900963ms
released 1 in 6247.48706818ms
released 2 in 7926.97191238ms
released 3 in 9973.71196747ms

Process finished with exit code 0

The steadily increasing write times show the workers waiting on the single lock: each write must finish before the next one can acquire it.
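Given that single lock, when the workers do not need to communicate with each other, one option is to skip the shared dict entirely: pass each worker its element as a task argument and collect the per-key result through the AsyncResult object that apply_async returns. A minimal sketch of that pattern (the square worker and the key/value payload here are illustrative, not from the original post):

```python
import multiprocessing as mp


def square(key, value):
    # Pure per-task function: each worker receives its own input and
    # returns its own result, so no shared state or lock is involved.
    return key, value * value


def main():
    n = 5
    items = {i: i for i in range(n)}  # the m elements to distribute
    with mp.Pool(processes=n) as pool:
        # Each AsyncResult carries one worker's (key, result) pair back
        # to the coordinator; workers never touch a common structure.
        handles = [pool.apply_async(square, (k, v)) for k, v in items.items()]
        return dict(h.get() for h in handles)


if __name__ == "__main__":
    print(main())
```

Another variant, if the workers must write somewhere shared, is to give each worker its own manager.dict() (one per key), so the locks are independent.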
