Is relying on __del__() for cleanup in Python unreliable?

There are a few problems with using __del__ to run code.

For one, it only runs promptly if you're actively keeping track of references, and even then there's no guarantee it will run immediately unless you're manually kicking off garbage collection throughout your code. I don't know about you, but automatic garbage collection has pretty much spoiled me when it comes to accurately tracking references. And even if you are super diligent in your own code, you're also relying on everyone else who uses your code being just as diligent about reference counts.
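
To make that concrete, here's a minimal sketch (Foo and the extra list are just placeholder names) of how a single stray reference postpones __del__; the prompt timing below is what CPython's refcounting happens to give you, and other implementations may wait far longer:

>>> class Foo:
...     def __del__(self):
...         print('finalized')
... 
>>> x = Foo()
>>> extra = [x]        # a second, easily forgotten reference
>>> del x              # nothing printed: the list still holds the object
>>> extra.clear()      # only now does the refcount reach zero
finalized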

Two, there are lots of instances where __del__ is never going to run. Was there an exception while objects were being initialized and created? Did the interpreter exit? Is there a circular reference somewhere? Yep, lots that could go wrong here and very few ways to cleanly and consistently deal with it.
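
For instance, a reference cycle keeps reference counts from ever reaching zero, so __del__ has to wait for the cyclic collector - and before Python 3.4 (PEP 442), objects with __del__ caught in a cycle were never collected at all. A minimal sketch, with Node as a made-up class name:

import gc

class Node:
    def __del__(self):
        print('finalized')

a = Node()
b = Node()
a.other = b       # a and b now refer to each other
b.other = a
del a, b          # the cycle keeps both refcounts above zero; nothing printed yet
gc.collect()      # the cyclic collector eventually finalizes both (Python 3.4+)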

Three, even if it does run, any exception raised inside it is swallowed rather than propagated (CPython just reports it on stderr), so you can't handle failures from it like you can with other code. It's also nearly impossible to guarantee that the __del__ methods of various objects will run in any particular order. So the most common use case for destructors - cleaning up and deleting a bunch of objects - is kind of pointless and unlikely to go as planned.
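
You can see the exception behaviour yourself - CPython reports the error from __del__ on stderr as "Exception ignored in ..." but never propagates it, so the surrounding try/except is useless (Broken is a hypothetical name):

class Broken:
    def __del__(self):
        raise RuntimeError('cleanup failed')

try:
    b = Broken()
    del b                     # __del__ raises, but the exception is swallowed
except RuntimeError:
    print('never reached')    # this handler does not fire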

If you actually want code to run, there are much better mechanisms -- context managers, signals/slots, events, etc.
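
For comparison, a bare-bones context manager (ManagedResource is just a sketch name) gives you a cleanup point you can actually reason about, and exceptions inside the block propagate normally:

class ManagedResource:
    def __enter__(self):
        print('acquired')
        return self

    def __exit__(self, exc_type, exc_value, tb):
        print('released')    # runs even if the body of the with block raises
        return False         # don't suppress exceptions

with ManagedResource() as res:
    print('using the resource')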


If you're using CPython, then __del__ fires perfectly reliably and predictably as soon as an object's reference count hits zero. The docs at https://docs.python.org/3/c-api/intro.html state:

When an object’s reference count becomes zero, the object is deallocated. If it contains references to other objects, their reference count is decremented. Those other objects may be deallocated in turn, if this decrement makes their reference count become zero, and so on.

You can easily test and see this immediate cleanup happening yourself:

>>> class Foo:
...     def __del__(self):
...         print('Bye bye!')
... 
>>> x = Foo()
>>> x = None
Bye bye!
>>> for i in range(5):
...     print(Foo())
... 
<__main__.Foo object at 0x7f037e6a0550>
Bye bye!
<__main__.Foo object at 0x7f037e6a0550>
Bye bye!
<__main__.Foo object at 0x7f037e6a0550>
Bye bye!
<__main__.Foo object at 0x7f037e6a0550>
Bye bye!
<__main__.Foo object at 0x7f037e6a0550>
Bye bye!
>>>

(Though if you want to test stuff involving __del__ at the REPL, be aware that the last evaluated expression's result gets stored as _, which counts as a reference.)
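
For instance, continuing with the Foo class above (the exact address will differ), a bare Foo() at the prompt stays alive through _ until the next displayed result rebinds it:

>>> Foo()              # the result is kept alive by _
<__main__.Foo object at 0x7f037e6a0550>
>>> 1 + 1              # _ is rebound, so the old Foo finally goes away
Bye bye!
2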

In other words, if your code is strictly going to be run in CPython, relying on __del__ is safe.


You're observing the typical issue with finalizers in garbage-collected languages. Java has it, C# has it, and they all provide a scope-based cleanup mechanism, like Python's with statement, to deal with it.

The main issue is that the garbage collector is responsible for cleaning up and destroying objects. In C++ an object gets destroyed when it goes out of scope, so you can use RAII and have well-defined semantics. In Python the object goes out of scope but lives on for as long as the GC likes. Depending on your Python implementation this can differ: CPython, with its refcounting-based GC, is rather benign (so you rarely see issues), while PyPy, IronPython and Jython might keep an object alive for a very long time.

For example:

def bad_code(filename):
    # the file object created by open() is never explicitly closed
    return open(filename, 'r').read()

for i in range(10000):
    bad_code('some_file.txt')

bad_code leaks a file handle. In CPython it doesn't matter. The refcount drops to zero and it is deleted right away. In PyPy or IronPython you might get IOErrors or similar issues, as you exhaust all available file descriptors (up to ulimit on Unix or 509 handles on Windows).

Scope-based cleanup with a context manager and with is preferable if you need to guarantee cleanup: you know exactly when your objects will be finalized. But sometimes you cannot enforce this kind of scoped cleanup easily. That's when you might use __del__, atexit or similar constructs to make a best effort at cleaning up. It is not reliable, but better than nothing.
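
For instance, bad_code above becomes unproblematic on every implementation once the file handle is tied to a with block (good_code is just a name for the fixed version):

def good_code(filename):
    with open(filename, 'r') as f:    # closed as soon as the block exits
        return f.read()

for i in range(10000):
    good_code('some_file.txt')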

You can either burden your users with explicit cleanup or enforce explicit scopes, or you can take the gamble with __del__ and see some oddities now and then (especially during interpreter shutdown).