
Aiohttp limits the number of simultaneous connections #7

@r00tdaemon

Thanks for your post on aiohttp - https://pawelmhm.github.io/asyncio/python/aiohttp/2016/04/22/asyncio-aiohttp.html

When I tried to run the client/server code locally to benchmark it, I noticed the server was receiving connections in batches of approximately 100, which seriously limits performance.
I found this Stack Overflow answer explaining why it happens - https://stackoverflow.com/a/43857526 (aiohttp's `TCPConnector` defaults to a limit of 100 simultaneous connections).
The script in the blog can be modified as follows to fix it:

# modified fetch function with semaphore
import asyncio
from aiohttp import TCPConnector, ClientSession

async def fetch(url, session):
    async with session.get(url) as response:
        delay = response.headers.get("DELAY")
        date = response.headers.get("DATE")
        print("{}:{} with delay {}".format(date, response.url, delay))
        return await response.text()


async def bound_fetch(sem, url, session):
    # Getter function with semaphore.
    async with sem:
        await fetch(url, session)


async def run(r):
    url = "http://localhost:8000/{}"
    tasks = []
    # create instance of Semaphore
    sem = asyncio.Semaphore(1000)

    # Create a client session so we don't open a new connection
    # for each request; limit=0 disables the connector's own cap.
    conn = TCPConnector(limit=0)
    async with ClientSession(connector=conn) as session:
        for i in range(r):
            # pass Semaphore and session to every GET request
            task = asyncio.ensure_future(bound_fetch(sem, url.format(i), session))
            tasks.append(task)

        await asyncio.gather(*tasks)

number = 10000
asyncio.run(run(number))
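The semaphore pattern above can be checked with a stdlib-only sketch that records peak concurrency (the `worker` coroutine and the `state` dict are illustrative, not part of the blog's code):

```python
import asyncio

async def worker(sem, state):
    # Acquire the semaphore before doing any "work", exactly as
    # bound_fetch does before calling fetch.
    async with sem:
        state["active"] += 1
        state["peak"] = max(state["peak"], state["active"])
        await asyncio.sleep(0)  # yield, letting other tasks run
        state["active"] -= 1

async def main():
    sem = asyncio.Semaphore(5)  # allow at most 5 workers at once
    state = {"active": 0, "peak": 0}
    await asyncio.gather(*(worker(sem, state) for _ in range(50)))
    return state["peak"]

peak = asyncio.run(main())
print(peak)  # never exceeds the semaphore value of 5
```

With `Semaphore(5)`, fifty tasks are scheduled but at most five hold the semaphore at any moment; the same mechanism caps in-flight requests at 1000 in the modified script.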
