Starting the Celery worker throws "no attribute 'worker_state_db'"

The bug appears if an exception is raised while parsing the settings, for example when Django's SECRET_KEY (or any other setting) is read from an environment variable:

SECRET_KEY = os.environ['SECRET_KEY']

To solve the problem, you can switch back to a hard-coded value:

SECRET_KEY = "asdfasdfasdf"

or use:

SECRET_KEY = os.environ.get('SECRET_KEY', '')

You can also find out which setting caused the problem if you comment out the following line in the celery.py file and start the worker again:

app.config_from_object('django.conf:settings', namespace='CELERY')
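
For reference, that line normally lives in a celery.py that looks roughly like the sketch below (the project name proj is just a placeholder). Per the tip above, temporarily comment out the config_from_object call and restart the worker to see which setting is actually failing:

    import os

    from celery import Celery

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

    app = Celery('proj')

    # Temporarily comment this out to surface the original settings error:
    # app.config_from_object('django.conf:settings', namespace='CELERY')

    app.autodiscover_tasks()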

I got the same error when one of the keys was missing from my config.json, which was being loaded in settings.py (effectively a missing key in settings.py).
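
As a minimal sketch of how that can happen (the file and key names are just examples), a settings.py like this raises a KeyError at import time when the key is missing, and the worker only shows the unrelated worker_state_db message:

    import json

    # settings.py (sketch) -- load project configuration from a JSON file
    with open('config.json') as f:
        config = json.load(f)

    # Raises KeyError at import time if "SECRET_KEY" is missing from config.json
    SECRET_KEY = config['SECRET_KEY']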

The error text is completely unrelated to the real cause. Hope that helps!


I would like to add two things:

  1. This also applies when you load settings from any configuration file, not just Django's; the question is purely about Celery.

  2. Some explanation of the origin of this cryptic error:

worker_state_db is a setting with a default value, so you shouldn't have to set it manually. The exception is raised because the settings object is simply empty and doesn't contain any values, not even the defaults; in other words, the default config was never loaded. Somehow in Celery, the original parsing exception (the one that actually caused the problem) is not propagated to stderr when the worker starts. Hence, you get a message that tells you nothing about the real cause or a possible solution.

How to fix it? For example, if you have a celeryconfig.py next to your Celery app module and you load settings from it via:

app.config_from_object('path.to.your.celery.module.celeryconfig')

Check your whole celeryconfig.py file for anything that could raise an exception or make the parser fail (incompatible setting values?).
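
One quick way to surface the original exception (a sketch, reusing the placeholder module path from above) is to import the config module directly, outside of Celery, so the error propagates with its full traceback:

    import importlib

    # Importing the config module directly raises the original exception
    # (e.g. KeyError, SyntaxError) instead of the cryptic worker_state_db one.
    importlib.import_module('path.to.your.celery.module.celeryconfig')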


I got this over and over while trying to configure Celery in a dockerized Django project. The solution was to include env_file in docker-compose.yaml so the variables from .env are available inside the container:

  celery:
    ...
    env_file: .env