What's the best way to kill a function (that is still running) after a given amount of time in Python? These are two approaches I have found so far:
Say this is our base function:
import time

def foo():
    a_long_time = 10000000
    time.sleep(a_long_time)

TIMEOUT = 5  # seconds
1. Multiprocessing Approach
import multiprocessing

if __name__ == '__main__':
    p = multiprocessing.Process(target=foo, name="Foo")
    p.start()
    p.join(TIMEOUT)
    if p.is_alive():
        print('function terminated')
        p.terminate()
        p.join()
2. Signal Approach
import signal

class TimeoutException(Exception):
    pass

def timeout_handler(signum, frame):
    raise TimeoutException

signal.signal(signal.SIGALRM, timeout_handler)
signal.alarm(TIMEOUT)

try:
    foo()
except TimeoutException:
    print('function terminated')
What are the advantages and disadvantages in terms of scope, safety and usability of these two methods? Are there any better approaches?
Solution
Well, as always, it depends.
As you probably have already verified, both these methods work. I would say it depends on your application and correct implementation (your signalling method is a bit wrong...)
Both methods can be considered "safe" if implemented correctly. It depends on whether your main program, outside the foo function, needs to do something in the meantime, or whether it can just sit and wait for foo to either complete or time out. The signalling method does not allow any parallel processing, as your main program will be inside foo() until it either completes or times out. BUT you then need to defuse the signal. If your foo completes in one second, your main program leaves the try/except structure, and four seconds later ... kaboom ... an exception is raised and probably goes uncaught. Not good.
try:
    foo()
    signal.alarm(0)
except TimeoutException:
    print("function terminated")
solves the problem.
I would personally prefer the multiprocessing approach. It is simpler and does not require signals and exception handling that in theory can go wrong if your program execution is not where you expect it to be when a signal is raised. If it is ok for your program to wait in join(), then you are done. However, if you want to do something in the main process while you wait, you can enter a loop, track time in a variable, check if over timeout and if so, terminate the process. You would just use join with a tiny timeout to "peek" if the process is still running.
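For example, a polling loop along those lines might look like this (just a rough sketch, reusing foo and TIMEOUT from the question; the short join is the "peek"):
import multiprocessing
import time

if __name__ == '__main__':
    p = multiprocessing.Process(target=foo, name="Foo")
    p.start()
    started = time.monotonic()
    while p.is_alive():
        # do something useful in the main process here
        p.join(0.1)  # "peek": returns immediately once the process has exited
        if time.monotonic() - started > TIMEOUT:
            print('function terminated')
            p.terminate()
            p.join()
            break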
Another method, depending on your foo(), is to use threads with a class or a global variable. If your foo keeps processing commands instead of possibly waiting for a long time for a command to finish, you can add an if clause there:
def foo():
    global please_exit_now
    while True:
        do_stuff()            # placeholder for one chunk of work
        do_more_stuff()       # placeholder for another chunk of work
        if foo_is_ready:      # placeholder condition: the real work is done
            break
        if please_exit_now is True:
            please_exit_now = False
            return
    finalise_foo()            # placeholder for any clean-up
    return
If do_stuff and do_more_stuff complete in a reasonable amount of time, you could then process things in your main program and just set the global please_exit_now to True, and your thread would eventually notice that and exit.
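The main-program side of that could be an ordinary thread plus the flag, something like this (a minimal sketch, assuming foo is the loop version above and reusing TIMEOUT from the question; a thread cannot be killed forcibly, it can only be asked to return, so this only works if each iteration finishes reasonably quickly):
import threading

please_exit_now = False

t = threading.Thread(target=foo, name="Foo")
t.start()

t.join(TIMEOUT)             # wait up to TIMEOUT seconds for foo to finish
if t.is_alive():
    please_exit_now = True  # ask foo to return at its next check
    t.join()                # foo notices the flag after its current iteration
    print('function terminated')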
I would probably just go for your multiprocessing and join, though.
Hannu