How to automatically kill a process that uses too much memory with Python?

The situation: I have a website that allows people to execute arbitrary code in a different language (specifically, an esolang I created), using a Python interpreter on a shared-hosting server. I run this code in a separate process which is given a time limit of 60 seconds.

The problem: you can do stuff like (the Python equivalent of) 10**(10**10), which rapidly consumes far more memory than I have been allotted. Apparently it also locks up Apache, or makes it take too long to respond, so I have to restart it.

I have seen this question, but the given answer uses Perl, which I do not know at all, so I would like an answer in Python. The OS is Linux, for the record.

Specifically, I want the following characteristics:

Runs automatically

Force-kills any process that exceeds some memory limit like 1MB or 100MB

Kills any process spawned by my code that is more than 24 hours old

I use this piece of code (in a Django view) to create the process and run it (proxy_prgm is a Manager so I can retrieve data from the program that's interpreting the esolang code):

import multiprocessing

prgmT[uid] = multiprocessing.Process(
    target=proxy_prgm.runCatch,
    args=(steps,),
    name="program run")
prgmT[uid].start()
prgmT[uid].join(60)  # time limit of 1 minute
if prgmT[uid].is_alive():
    prgmT[uid].terminate()
    proxy_prgm.stop()

If you need more details, don't hesitate to ask, or to tell me what I should add.

Solution

Another approach that might work: using resource.setrlimit() (more details in this other StackOverflow answer). It seems that by doing so you can set a memory limit on a process and its subprocesses; you will have to figure out how to handle the limit being hit, though. I don't have personal experience using it, but hopefully it would stop Apache from locking up on you.
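Here's a minimal sketch of that idea, assuming a 100 MB cap; interpret() is a hypothetical stand-in for your proxy_prgm.runCatch, so adapt the names and the limit to your setup:

import multiprocessing
import resource

MEM_LIMIT = 100 * 1024 * 1024  # assumed 100 MB cap; tune to what your host allots

def interpret(steps):
    # Hypothetical stand-in for proxy_prgm.runCatch(steps).
    x = 10 ** (10 ** 10)  # would normally eat gigabytes of RAM

def run_limited(steps):
    # Set the limit inside the child, so the Apache/Django parent keeps its
    # normal limits; rlimits are inherited by anything the child forks.
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT, MEM_LIMIT))
    try:
        interpret(steps)
    except MemoryError:
        print("program killed: memory limit exceeded")

if __name__ == "__main__":
    p = multiprocessing.Process(target=run_limited, args=(1000,),
                                name="program run")
    p.start()
    p.join(60)  # keep the existing 60-second time limit
    if p.is_alive():
        p.terminate()

With RLIMIT_AS in place, the oversized allocation fails with a MemoryError inside the child instead of dragging the whole server down, and you can catch it there and report the failure back to the user.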
