Last Updated on September 12, 2022
You can shut down the ProcessPoolExecutor automatically by using a context manager.
In this tutorial you will discover how to use context managers with process pools in Python.
Let’s get started.
Need to Automatically Shut Down the ProcessPoolExecutor
The ProcessPoolExecutor in Python provides a pool of reusable processes for executing ad hoc tasks.
You can submit tasks to the process pool by calling the submit() function and passing in the function you wish to execute on another process. You can also submit tasks by calling the map() function, specifying the function to execute and an iterable of items to which your function will be applied.
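For example, a minimal sketch of both approaches might look like the following (the square() function is a hypothetical task used only for illustration):

from concurrent.futures import ProcessPoolExecutor

# hypothetical task function used only for illustration
def square(x):
    return x * x

if __name__ == '__main__':
    executor = ProcessPoolExecutor()
    # submit() schedules a single call and returns a Future
    future = executor.submit(square, 3)
    print(future.result())  # 9
    # map() applies the function to each item in an iterable
    for result in executor.map(square, [1, 2, 3]):
        print(result)  # 1, 4, 9
    executor.shutdown()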
Once all of your tasks are finished, you must close the ProcessPoolExecutor in order to release all of the resources held by the worker processes, such as the memory used by each process.
You can shut down a ProcessPoolExecutor by calling the shutdown() function directly.
For example:
...
# shutdown a process pool
executor.shutdown()
If you forget to shut down the process pool, the worker processes will remain active, waiting for work, and their resources will not be released, perhaps until your program closes.
This raises the question of whether we can automatically close the process pool when we are finished using it.
How can we automatically shut down the ProcessPoolExecutor?
How to Use the ProcessPoolExecutor Context Manager
A ProcessPoolExecutor can be shut down automatically by using the context manager.
Objects in Python can implement the context manager interface, which lets you use the “with” statement to define a block in which you intend to use the object. A clean-up function is then called automatically once the block is exited.
This is a common pattern when opening a socket connection or a local file, for example:
...
# example of a generic context manager
with open('path/to/file.txt') as file:
    # do things
# file is closed automatically
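Under the covers, a context manager is simply an object that implements the __enter__() and __exit__() methods. A minimal sketch of the protocol (the class name and print statements are illustrative only):

# minimal sketch of the context manager protocol
class ManagedResource:
    def __enter__(self):
        # called when the "with" block is entered
        print('acquiring resource')
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # called when the "with" block is exited, even on error
        print('releasing resource')

with ManagedResource() as resource:
    print('using resource')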
The ProcessPoolExecutor can be used via a context manager.
When the block of the context manager is exited, the shutdown() function on the ProcessPoolExecutor will be called with the default arguments. It will wait for all tasks to complete and will not cancel scheduled tasks.
For example:
...
# create a process pool
with ProcessPoolExecutor() as executor:
    # submit tasks
This is functionally equivalent to the following manual creation and shutdown of the ProcessPoolExecutor:
...
try:
    # create a process pool
    executor = ProcessPoolExecutor()
    # do things...
finally:
    # shutdown a process pool
    executor.shutdown()
This is also the same as the following, with the default arguments to the shutdown() function stated explicitly:
...
try:
    # create a process pool
    executor = ProcessPoolExecutor()
    # do things...
finally:
    # shutdown a process pool
    executor.shutdown(wait=True, cancel_futures=False)
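If you prefer that queued tasks be cancelled at shutdown rather than run to completion, you can call shutdown() yourself with cancel_futures=True (available in Python 3.9 and later) instead of relying on the context manager. A minimal sketch:

...
try:
    # create a process pool
    executor = ProcessPoolExecutor()
    # do things...
finally:
    # cancel queued tasks that have not started, then shut down
    executor.shutdown(wait=True, cancel_futures=True)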
The context manager is so convenient that, as with opening socket connections and files, using it with the ProcessPoolExecutor is a best practice and should be considered the default usage pattern.
Now that we know how to automatically shut down a ProcessPoolExecutor using the context manager, let’s look at a worked example.
Example of the ProcessPoolExecutor Context Manager
Let’s develop a worked example using the ProcessPoolExecutor context manager.
First, let’s define a task that will sleep for a moment to simulate a computational task.
The task() target task function takes a unique name, calls the sleep() function for a fraction of a second, then returns the unique name provided as an argument.
# task that works for a moment
def task(name):
    sleep(0.5)
    return name
Next, we can create the process pool using the context manager.
We will configure the ProcessPoolExecutor to have ten worker processes to match the ten tasks we intend to execute concurrently.
...
# start the process pool
with ProcessPoolExecutor(10) as executor:
    # ...
Next, we can submit our tasks to the process pool for execution.
We will use the map() function, which will apply our task() function to each value in an iterable, in this case a range of integers from 0 to 9.
We will iterate over the results in a for loop, in the order that the tasks were submitted, and report each result with a print statement.
...
# submit tasks and process results
for result in executor.map(task, range(10)):
    # report the result
    print(f'Task done: {result}')
The process pool will then be shut down automatically for us by the context manager once we are finished with it and leave the “with” block.
Tying this together, the complete example of using the ProcessPoolExecutor context manager is listed below.
# SuperFastPython.com
# example of using the process pool with a context manager
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# mock task that works for a moment
def task(name):
    sleep(0.5)
    return name

# demonstrate the context manager
def main():
    # start the process pool
    with ProcessPoolExecutor(10) as executor:
        # submit tasks and process results
        for result in executor.map(task, range(10)):
            # report the result
            print(f'Task done: {result}')
    # process pool is shutdown automatically
    # all done, continue on
    print('Done!')

# entry point
if __name__ == '__main__':
    main()
Running the example first creates the process pool using the context manager with ten worker processes.
We then submit tasks to the process pool by calling the map() function and process the results in the order that the tasks were submitted, as they are completed.
The result of each task is then reported with a print statement.
Once all tasks are completed, the process pool is closed automatically by the ProcessPoolExecutor context manager, which calls the shutdown() function.
The program is then free to continue.
Task done: 0
Task done: 1
Task done: 2
Task done: 3
Task done: 4
Task done: 5
Task done: 6
Task done: 7
Task done: 8
Task done: 9
Done!
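As an aside, once the “with” block is exited the executor has been shut down and cannot be reused; attempting to submit another task will raise a RuntimeError. A minimal sketch of this behavior (the error message shown in the comment is approximate):

...
# start the process pool
with ProcessPoolExecutor(10) as executor:
    executor.submit(task, 0)
# the pool is now shut down
try:
    executor.submit(task, 1)
except RuntimeError as e:
    # e.g. 'cannot schedule new futures after shutdown'
    print(f'Cannot reuse the executor: {e}')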
Further Reading
This section provides additional resources that you may find helpful.
Books
- ProcessPoolExecutor Jump-Start, Jason Brownlee (my book!)
- Concurrent Futures API Interview Questions
- ProcessPoolExecutor PDF Cheat Sheet
I also recommend specific chapters from the following books:
- Effective Python, Brett Slatkin, 2019.
- See Chapter 7: Concurrency and Parallelism
- Python in a Nutshell, Alex Martelli, et al., 2017.
- See Chapter 14: Threads and Processes
Guides
- Python ProcessPoolExecutor: The Complete Guide
- Python ThreadPoolExecutor: The Complete Guide
- Python Multiprocessing: The Complete Guide
- Python Pool: The Complete Guide
APIs
- concurrent.futures - Launching parallel tasks, Python documentation: https://docs.python.org/3/library/concurrent.futures.html
References
- Thread (computing), Wikipedia.
- Process (computing), Wikipedia.
- Thread Pool, Wikipedia.
- Futures and promises, Wikipedia.
Takeaways
You now know how to use the ProcessPoolExecutor context manager.
Do you have any questions about how to use the context manager?
Ask your question in the comments below and I will do my best to answer.