How to force/ensure class attributes are a specific type?

You can use a property, as the other answers suggest. So, if you want to constrain a single attribute, say "bar", and constrain it to an integer, you could write code like this:

class Foo(object):
    def _get_bar(self):
        return self.__bar
    def _set_bar(self, value):
        if not isinstance(value, int):
            raise TypeError("bar must be set to an integer")
        self.__bar = value
    bar = property(_get_bar, _set_bar)

And this works:

>>> f = Foo()
>>> f.bar = 3
>>> f.bar
3
>>> f.bar = "three"
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 6, in _set_bar
TypeError: bar must be set to an integer
>>> 

(There is also a newer way of writing properties, using the "property" built-in as a decorator on the getter method - but I prefer the old way, as shown above; the decorator form is sketched below for reference.)
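
For reference, a minimal sketch of the same class using the decorator syntax:

class Foo(object):
    @property
    def bar(self):
        return self.__bar
    @bar.setter
    def bar(self, value):
        if not isinstance(value, int):
            raise TypeError("bar must be set to an integer")
        self.__bar = value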

Of course, if you have lots of attributes on your classes and want to protect all of them in this way, it starts to get verbose. Nothing to worry about: Python's introspection abilities allow one to create a class decorator that automates this with a minimum of lines.

def getter_setter_gen(name, type_):
    def getter(self):
        return getattr(self, "__" + name)
    def setter(self, value):
        if not isinstance(value, type_):
            raise TypeError(f"{name} attribute must be set to an instance of {type_}")
        setattr(self, "__" + name, value)
    return property(getter, setter)

def auto_attr_check(cls):
    new_dct = {}
    for key, value in cls.__dict__.items():
        if isinstance(value, type):
            value = getter_setter_gen(key, value)
        new_dct[key] = value
    # Creates a new class, using the modified dictionary as the class dict:
    return type(cls)(cls.__name__, cls.__bases__, new_dct)

And you just use auto_attr_check as a class decorator, and declare the attributes you want checked in the class body, setting each equal to the type it should be constrained to:

>>> @auto_attr_check
... class Foo(object):
...     bar = int
...     baz = str
...     bam = float
... 
>>> f = Foo()
>>> f.bar = 5; f.baz = "hello"; f.bam = 5.0
>>> f.bar = "hello"
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 6, in setter
TypeError: bar attribute must be set to an instance of <class 'int'>
>>> f.baz = 5
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 6, in setter
TypeError: baz attribute must be set to an instance of <class 'str'>
>>> f.bam = 3 + 2j
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 6, in setter
TypeError: bam attribute must be set to an instance of <class 'float'>
>>> 


Since Python 3.6 (PEP 526 variable annotations), you can use type hints to indicate that a class attribute should be of a particular type. Then you could include a static type checker such as MyPy as part of your continuous integration process to check that all the type contracts are respected.

For example, for the following Python script:

class Foo:
    x: int
    y: int

foo = Foo()
foo.x = "hello"

MyPy would give the following error:

6: error: Incompatible types in assignment (expression has type "str", variable has type "int")

If you want types to be enforced at runtime, you could use the enforce package. From the README:

>>> import enforce
>>>
>>> @enforce.runtime_validation
... def foo(text: str) -> None:
...     print(text)
>>>
>>> foo('Hello World')
Hello World
>>>
>>> foo(5)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/william/.local/lib/python3.5/site-packages/enforce/decorators.py", line 106, in universal
    _args, _kwargs = enforcer.validate_inputs(parameters)
  File "/home/william/.local/lib/python3.5/site-packages/enforce/enforcers.py", line 69, in validate_inputs
    raise RuntimeTypeError(exception_text)
enforce.exceptions.RuntimeTypeError: 
  The following runtime type errors were encountered:
       Argument 'text' was not of type <class 'str'>. Actual type was <class 'int'>.

In general, this is not a good idea for the reasons that @yak mentioned in his comment. You are basically preventing the user from supplying valid arguments that have the correct attributes/behavior but are not in the inheritance tree you hard-coded in.

Disclaimer aside, there are a few options available for what you are trying to do. The main issue is that there are no private attributes in Python. So if you just have a plain old object reference, say self._a, you cannot guarantee that the user won't set it directly, even though you have provided a setter that does type checking for it. The options below demonstrate how to really enforce the type checking.

Override __setattr__

This method is only convenient for a (very) small number of attributes. The __setattr__ method is what gets called when you use dot notation to assign a regular attribute. For example,

class A:
    def __init__(self, a0):
        self.a = a0

If we now do A(0).a = 32, it would call A(0).__setattr__('a', 32) under the hood. In fact, self.a = a0 in __init__ uses self.__setattr__ as well. You can use this to enforce the type check:

class A:
    def __init__(self, a0):
        self.a = a0
    def __setattr__(self, name, value):
        if name == 'a' and not isinstance(value, int):
            raise TypeError('A.a must be an int')
        super().__setattr__(name, value)

The disadvantage of this method is that you have to have a separate if name == ... for each type you want to check (or if name in ... to check multiple names for a given type). The advantage is that it is the most straightforward way to make it nearly impossible for the user to circumvent the type check.
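
If you do need to check more than a couple of attributes this way, a small mapping from attribute names to required types keeps __setattr__ manageable. Here is a minimal sketch of that idea (the _types name is just illustrative):

class A:
    _types = {'a': int, 'b': str}   # hypothetical map of attribute name -> required type
    def __init__(self, a0, b0):
        self.a = a0
        self.b = b0
    def __setattr__(self, name, value):
        expected = self._types.get(name)
        if expected is not None and not isinstance(value, expected):
            raise TypeError(f'A.{name} must be an instance of {expected.__name__}')
        super().__setattr__(name, value)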

Make a property

Properties replace your normal attribute with a descriptor object (usually created by using a decorator). Descriptors can have __get__ and __set__ methods that customize how the underlying attribute is accessed. This is sort of like taking the corresponding if branch in __setattr__ and putting it into a method that will run just for that attribute. Here is an example:

class A:
    def __init__(self, a0):
        self.a = a0
    @property
    def a(self):
        return self._a
    @a.setter
    def a(self, value):
        if not isinstance(value, int):
            raise TypeError('A.a must be an int')
        self._a = value

A slightly different way of doing the same thing can be found in @jsbueno's answer.

While using a property this way is nifty and mostly solves the problem, it does present a couple of issues. The first is that you have a "private" _a attribute that the user can modify directly, bypassing your type check. This is almost the same problem as using a plain getter and setter, except that now a is accessible as the "correct" attribute that redirects to the setter behind the scenes, making it less likely that the user will mess with _a. The second issue is that you have a superfluous getter to make the property work as read-write. These issues are the subject of this question.
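
To make the first issue concrete, here is what bypassing the check looks like in an interactive session, using the class A defined just above:

>>> obj = A(1)
>>> obj._a = "not an int"    # writes the backing attribute directly, skipping the setter
>>> obj.a
'not an int'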

Create a True Setter-Only Descriptor

This solution is probably the most robust overall. It is suggested in the accepted answer to the question mentioned above. Basically, instead of using a property, which has a bunch of frills and conveniences that you can not get rid of, create your own descriptor (and decorator) and use that for any attributes that require type checking:

class SetterProperty:
    def __init__(self, func, doc=None):
        self.func = func
        self.__doc__ = doc if doc is not None else func.__doc__
    def __set__(self, obj, value):
        return self.func(obj, value)

class A:
    def __init__(self, a0):
        self.a = a0
    @SetterProperty
    def a(self, value):
        if not isinstance(value, int):
            raise TypeError('A.a must be an int')
        self.__dict__['a'] = value

The setter stashes the actual value directly into the __dict__ of the instance to avoid recursing into itself indefinitely. This makes it possible to get the attribute's value without supplying an explicit getter. Since the descriptor a does not have the __get__ method, the search will continue until it finds the attribute in __dict__. This ensures that all sets go through the descriptor/setter while gets allow direct access to the attribute value.
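
A quick interactive check of that behaviour, using the class A just defined:

>>> obj = A(5)
>>> obj.a                    # no __get__ on the descriptor, so the lookup falls through to __dict__
5
>>> obj.a = "oops"
Traceback (most recent call last):
  ...
TypeError: A.a must be an int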

If you have a large number of attributes that require a check like this, you can move the line self.__dict__['a'] = value into the descriptor's __set__ method:

class ValidatedSetterProperty:
    def __init__(self, func, name=None, doc=None):
        self.func = func
        self.__name__ = name if name is not None else func.__name__
        self.__doc__ = doc if doc is not None else func.__doc__
    def __set__(self, obj, value):
        self.func(obj, value)                    # run the validation
        obj.__dict__[self.__name__] = value      # store the validated value on the instance

class A:
    def __init__(self, a0):
        self.a = a0
    @ValidatedSetterProperty
    def a(self, value):
        if not isinstance(value, int):
            raise TypeError('A.a must be an int')

Update

Python 3.6 does this for you almost out of the box, through the PEP 487 descriptor protocol enhancements (the __set_name__ hook): https://docs.python.org/3.6/whatsnew/3.6.html#pep-487-descriptor-protocol-enhancements
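
As a rough sketch of what that buys you: with __set_name__ the descriptor learns its own attribute name automatically, so it no longer has to be passed in or read from the setter function. The TypedAttribute name below is just illustrative:

class TypedAttribute:
    # Set-only descriptor: validates the type on assignment, and learns its
    # attribute name via __set_name__ (called automatically at class creation).
    def __init__(self, type_):
        self.type_ = type_
    def __set_name__(self, owner, name):
        self.name = name
    def __set__(self, obj, value):
        if not isinstance(value, self.type_):
            raise TypeError(f'{self.name} must be an instance of {self.type_.__name__}')
        obj.__dict__[self.name] = value

class A:
    a = TypedAttribute(int)
    def __init__(self, a0):
        self.a = a0

As before, reads fall through to the instance __dict__ because the descriptor defines no __get__, while every assignment goes through the type check.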

TL;DR

For a very small number of attributes that need type-checking, override __setattr__ directly. For a larger number of attributes, use the setter-only descriptor as shown above. Using properties directly for this sort of application introduces more problems than it solves.