Join multiple async generators in Python

You can use the wonderful aiostream library. It'll look like this:

import asyncio
from aiostream import stream


async def test1():
    for _ in range(5):
        await asyncio.sleep(0.1)
        yield 1


async def test2():
    for _ in range(5):
        await asyncio.sleep(0.2)
        yield 2


async def main():
    combine = stream.merge(test1(), test2())

    async with combine.stream() as streamer:
        async for item in streamer:
            print(item)


asyncio.run(main())

Result:

1
1
2
1
1
2
1
2
2
2

If you wanted to avoid the dependency on an external library (or as a learning exercise), you could merge the async iterators using a queue:

def merge_async_iters(*aiters):
    # merge async iterators, proof of concept
    queue = asyncio.Queue(1)
    async def drain(aiter):
        # push everything from one iterator into the shared queue
        async for item in aiter:
            await queue.put(item)
    async def merged():
        # yield from the queue until every drain task has finished
        while not all(task.done() for task in tasks):
            yield await queue.get()
    tasks = [asyncio.create_task(drain(aiter)) for aiter in aiters]
    return merged()

This passes the test from Mikhail's answer, but it's not perfect: it doesn't propagate the exception if one of the async iterators raises; the error is simply lost. Also, if the task that exhausts the merged generator returned by merge_async_iters() gets cancelled, or if that generator is not exhausted to the end, the individual drain tasks are left hanging.
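
To make the first limitation concrete, here is a quick sketch run against the proof of concept above (the good() and bad() generators are hypothetical, just for illustration). The RuntimeError never reaches the consumer; asyncio only logs a "Task exception was never retrieved" warning when the failed drain task is garbage-collected:

import asyncio

async def good():
    for i in range(3):
        await asyncio.sleep(0.1)
        yield i

async def bad():
    await asyncio.sleep(0.15)
    raise RuntimeError("boom")
    yield  # unreachable, but makes this an async generator

async def main():
    # likely prints 0, 1, 2; the RuntimeError is silently swallowed
    # because merged() only checks task.done(), never task.exception()
    async for item in merge_async_iters(good(), bad()):
        print(item)

asyncio.run(main())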

A more complete version could handle the first issue by detecting an exception in a drain task and transmitting it through the queue. The second issue can be resolved by having the merged generator cancel the drain tasks as soon as the iteration is abandoned. With those changes, the resulting code looks like this:

def merge_async_iters(*aiters):
    # all drain tasks funnel their items through this bounded queue;
    # run_count tracks how many of them are still running
    queue = asyncio.Queue(1)
    run_count = len(aiters)
    cancelling = False

    async def drain(aiter):
        nonlocal run_count
        try:
            async for item in aiter:
                await queue.put((False, item))
        except Exception as e:
            if not cancelling:
                # forward the exception through the queue so that
                # merged() can re-raise it in the consuming task
                await queue.put((True, e))
            else:
                raise
        finally:
            run_count -= 1

    async def merged():
        try:
            while run_count:
                raised, next_item = await queue.get()
                if raised:
                    cancel_tasks()
                    raise next_item
                yield next_item
        finally:
            # runs when the consumer abandons the iteration as well as
            # on normal exhaustion; either way, stop the drain tasks
            cancel_tasks()

    def cancel_tasks():
        nonlocal cancelling
        cancelling = True
        for t in tasks:
            t.cancel()

    tasks = [asyncio.create_task(drain(aiter)) for aiter in aiters]
    return merged()

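As a quick check, reusing the hypothetical good() and bad() generators from the sketch above, the exception now surfaces in the consuming task, and the remaining drain task gets cancelled:

async def main():
    try:
        async for item in merge_async_iters(good(), bad()):
            print(item)
    except RuntimeError as e:
        # likely prints 0, then "caught: boom"
        print("caught:", e)

asyncio.run(main())
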
Different approaches to merging async iterators can be found in this answer, and also this one, where the latter allows for adding new streams mid-stride. The complexity and subtlety of these implementations show that, while it is useful to know how to write one, actually doing so is best left to well-tested external libraries such as aiostream that cover all the edge cases.