How does a Celery worker consuming from multiple queues decide which to consume from first?

From my testing, it processes multiple queues round-robin style.

If I use this test code:

from celery import task
import time


@task
def my_task(item_id):
    # Simulate some work, then report which queue the item came from.
    time.sleep(0.5)
    print('Processing item "%s"...' % item_id)


def add_items_to_queue(queue_name, items_count):
    # Publish items_count tasks to the named queue.
    for i in range(items_count):
        my_task.apply_async(('%s-%d' % (queue_name, i),), queue=queue_name)


add_items_to_queue('queue1', 10)
add_items_to_queue('queue2', 10)
add_items_to_queue('queue3', 5)

And start the worker with (using django-celery):

`manage.py celery worker -Q queue1,queue2,queue3`
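
(For completeness: with a standalone Celery app rather than django-celery, the equivalent command would be something like the following, where `proj` is a placeholder for your app module:

`celery -A proj worker -Q queue1,queue2,queue3`)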

It outputs:

Processing item "queue1-0"...
Processing item "queue3-0"...
Processing item "queue2-0"...
Processing item "queue1-1"...
Processing item "queue3-1"...
Processing item "queue2-1"...
Processing item "queue1-2"...
Processing item "queue3-2"...
Processing item "queue2-2"...
Processing item "queue1-3"...
Processing item "queue3-3"...
Processing item "queue2-3"...
Processing item "queue1-4"...
Processing item "queue3-4"...
Processing item "queue2-4"...
Processing item "queue1-5"...
Processing item "queue2-5"...
Processing item "queue1-6"...
Processing item "queue2-6"...
Processing item "queue1-7"...
Processing item "queue2-7"...
Processing item "queue1-8"...
Processing item "queue2-8"...
Processing item "queue1-9"...
Processing item "queue2-9"...

So it pulls one item from each queue before moving on to the next queue1 item, even though all of the queue1 tasks were published before the queue2 and queue3 tasks.

Note: As @WarLord pointed out, this exact behavior only holds when CELERYD_PREFETCH_MULTIPLIER is set to 1. If it's greater than 1, items are fetched from the queues in batches. For example, with 4 worker processes and the prefetch multiplier set to 4, 16 items are reserved right off the bat, so you won't get exactly the output above, but consumption will still roughly follow round-robin.
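
As a minimal sketch of how to pin the multiplier to 1 and reproduce the strict round-robin output above (the uppercase name matches the django-celery setup used here; the lowercase name is its newer equivalent):

# settings.py (django-celery era): each worker process reserves only
# one task at a time, so the queues are drained round-robin.
CELERYD_PREFETCH_MULTIPLIER = 1

# With a standalone Celery app, the newer lowercase setting does the same:
# app.conf.worker_prefetch_multiplier = 1

# With 4 worker processes (-c 4) and a multiplier of 4, the worker
# reserves 4 * 4 = 16 messages up front, which is why the output above
# only holds when the multiplier is 1.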


NOTE: This answer is deprecated: the latest versions of Celery work very differently from how they did in 2013...

A worker consuming from several queues consumes tasks in FIFO order, and that FIFO order is maintained across the multiple queues too.

Example:

Queue1: (t1, t2, t5, t7)
Queue2: (t0, t3, t4, t6)

Assuming t0 through t7 represents the order in which the tasks were published,

the order of consumption is: t0, t1, t2, t3, t4, t5, t6, t7
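
If you want to try this scenario yourself, here is a sketch (reusing my_task from the test code above) that publishes the tasks in exactly that global order, alternating between the two queues; starting a worker on both queues then shows the order in which they are actually consumed:

# Publish t0..t7 in global order, routed so that
# queue1 ends up with (t1, t2, t5, t7) and queue2 with (t0, t3, t4, t6).
publish_order = [
    ('t0', 'queue2'), ('t1', 'queue1'), ('t2', 'queue1'), ('t3', 'queue2'),
    ('t4', 'queue2'), ('t5', 'queue1'), ('t6', 'queue2'), ('t7', 'queue1'),
]

for task_name, queue_name in publish_order:
    my_task.apply_async((task_name,), queue=queue_name)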