Prioritizing queues among multiple queues in celery?

You can partially achieve this by defining multiple queues for the worker when starting it.

You can do it with the following command; refer to the Celery routing documentation for more details.

celery -A proj worker -l info -Q Q1,Q2

This approach has a problem, though: it does not give you fallback-style priority, because workers listening to multiple queues distribute their resources evenly among them.

Hence, your requirement of processing only the 'high priority queue' while there is still something in the 'normal priority queue' cannot be achieved this way. You can mitigate it by allocating more workers (maybe 75%) to the 'high priority queue' and 25% to the 'normal priority queue', or a different share based on your workload, as sketched below.
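For example, a minimal sketch of such a split (the queue names, concurrency values, and node names are illustrative assumptions, not anything Celery mandates):

celery -A proj worker -l info -Q high_priority -c 3 -n high@%h
celery -A proj worker -l info -Q high_priority,normal_priority -c 1 -n normal@%h

The first worker consumes only from the high-priority queue, while the second consumes from both, so normal-priority tasks still get processed but most capacity stays reserved for high-priority work.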


This is now possible with Celery >= 4.1.1 + the Redis transport (probably earlier versions too). You just need to set a broker transport option in your celeryconfig.py module. This setting was implemented in Kombu 4.0.0.

broker_transport_options = {
    'visibility_timeout': 1200,  # this doesn't affect priority, but it's part of the Redis transport config
    'queue_order_strategy': 'priority',
}

It's also possible to specify this via an environment variable.

For a worker started with $ celery -A proj worker -l info -Q Q1,Q2, the idle worker will check Q1 first and execute Q1 tasks, if available, before checking Q2.
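For reference, here is a minimal celeryconfig.py sketch that puts this together; the broker URL, task names, and route mapping are assumptions for illustration:

# celeryconfig.py
broker_url = 'redis://localhost:6379/0'

broker_transport_options = {
    'queue_order_strategy': 'priority',  # consume the -Q queues in the order they are listed
}

# Route tasks so urgent work lands in Q1 and the rest in Q2 (hypothetical task names)
task_routes = {
    'proj.tasks.urgent_task': {'queue': 'Q1'},
    'proj.tasks.regular_task': {'queue': 'Q2'},
}

With this configuration, a worker started with -Q Q1,Q2 drains Q1 before it touches Q2.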


As a bonus (slightly off topic): this also works with Airflow 1.10.2 workers, except that the queue order does not seem to be preserved from the command line. Using 'queue_order_strategy': 'sorted' and naming your queues appropriately works (Q1, Q2 would sort correctly); a sketch follows below. Airflow's pool-based priority is not preserved between DAGs, so this really helps!
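For completeness, a minimal sketch of that 'sorted' variant (same transport-option mechanism as above):

broker_transport_options = {
    'queue_order_strategy': 'sorted',  # consume queues in alphabetical order of queue name
}

With this strategy, queues named Q1 and Q2 are consumed in that order regardless of how they are listed on the command line.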