When using unittest.mock.patch, why is autospec not True by default?

The clearest way to explain this is to quote the documentation on the downsides of auto-speccing and why you should be careful when using it:

This isn’t without caveats and limitations however, which is why it is not the default behaviour. In order to know what attributes are available on the spec object, autospec has to introspect (access attributes) the spec. As you traverse attributes on the mock a corresponding traversal of the original object is happening under the hood. If any of your specced objects have properties or descriptors that can trigger code execution then you may not be able to use autospec. On the other hand it is much better to design your objects so that introspection is safe [4].

A more serious problem is that it is common for instance attributes to be created in the __init__() method and not to exist on the class at all. autospec can’t know about any dynamically created attributes and restricts the api to visible attributes.

I think the key takeaway here is this line: "autospec can’t know about any dynamically created attributes and restricts the api to visible attributes".

To make this more explicit, here is an example taken from the documentation that shows where auto-speccing breaks:

>>> class Something:
...   def __init__(self):
...     self.a = 33
...
>>> with patch('__main__.Something', autospec=True):
...   thing = Something()
...   thing.a
...
Traceback (most recent call last):
  ...
AttributeError: Mock object has no attribute 'a'

As you can see, auto-speccing has no idea that the attribute a is created when instantiating your Something object.
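
The documentation's suggested workaround is simply to set the required attribute on the mock after it has been created. A minimal sketch of that, extending the example above:

>>> with patch('__main__.Something', autospec=True):
...   thing = Something()
...   thing.a = 33
...   thing.a
...
33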

There is nothing wrong with assigning a value to your instance attribute.

Observe the functional example below:

import unittest
from unittest.mock import patch

def some_external_thing():
    pass

def something(x):
    return x

class MyRealClass:
    def __init__(self):
        self.a = some_external_thing()

    def test_thing(self):
        return something(self.a)



class MyTest(unittest.TestCase):
    def setUp(self):
        self.my_obj = MyRealClass()

    @patch('__main__.some_external_thing')
    @patch('__main__.something')
    def test_my_things(self, mock_something, mock_some_external_thing):
        # Replace the instance attribute set in setUp with the mocked value,
        # so the real some_external_thing() does not influence the assertion.
        mock_some_external_thing.return_value = "there be dragons"
        self.my_obj.a = mock_some_external_thing.return_value
        self.my_obj.test_thing()

        mock_something.assert_called_once_with("there be dragons")


if __name__ == '__main__':
    unittest.main()

So, I'm just saying that for my test case I want to make sure the some_external_thing() function does not affect the behaviour of my unit test, which is why I assign the mocked value to my instance attribute via self.my_obj.a = mock_some_external_thing.return_value.
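
If you also want signature checking on the patched functions, the same test should work with autospec enabled on both decorators, since the caveat about dynamically created attributes only applies when autospeccing the class itself, not these module-level functions. A sketch of how that method of MyTest could look (untested, it follows directly from the example above):

    @patch('__main__.some_external_thing', autospec=True)
    @patch('__main__.something', autospec=True)
    def test_my_things_autospec(self, mock_something, mock_some_external_thing):
        mock_some_external_thing.return_value = "there be dragons"
        self.my_obj.a = mock_some_external_thing.return_value
        self.my_obj.test_thing()

        # With autospec the mock keeps something's real signature, so calling
        # it with the wrong arguments would raise a TypeError here.
        mock_something.assert_called_once_with("there be dragons")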


Answering my own question many years later: another reason is speed.

Depending on how complex your object is, autospec can slow your tests down significantly. I've found this particularly when patching Django models.
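
A rough way to see this for yourself is to time patching a class with many attributes with and without autospec. This is a hypothetical micro-benchmark (the class and the numbers are purely illustrative, not a real Django model):

import timeit
from unittest import mock

# Hypothetical stand-in for a complex object: a class with many methods.
Complicated = type(
    "Complicated",
    (),
    {f"method_{i}": (lambda self: None) for i in range(200)},
)

def with_autospec():
    with mock.patch(f"{__name__}.Complicated", autospec=True):
        pass

def without_autospec():
    with mock.patch(f"{__name__}.Complicated"):
        pass

# autospec has to introspect all 200 methods on every patch, so expect it
# to be noticeably slower than the plain MagicMock replacement.
print("autospec=True :", timeit.timeit(with_autospec, number=100))
print("autospec=False:", timeit.timeit(without_autospec, number=100))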


The act of autospeccing itself can execute code, for example via the invocation of descriptors.

>>> from unittest import mock
>>> class A:
...     @property
...     def foo(self):
...         print("rm -rf /")
...
>>> a = A()
>>> with mock.patch("__main__.a", autospec=False) as m:
...     pass
...
>>> with mock.patch("__main__.a", autospec=True) as m:
...     pass
...
rm -rf /

Therefore, this is too problematic a feature to enable by default, and it remains opt-in.
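
Opting in is then a per-patch decision, made where you know introspection is safe (for example a plain module-level function), and it buys you signature checking. A small illustrative sketch (the greet function is made up for this example):

>>> def greet(name):
...     return f"hello {name}"
...
>>> with mock.patch("__main__.greet", autospec=True):
...     greet()
...
Traceback (most recent call last):
  ...
TypeError: missing a required argument: 'name'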