I'm downloading images using aiohttp, and was wondering if there is a way to limit the number of open requests that haven't finished. This is the code I currently have:
import asyncio
import aiohttp

async def get_images(url, session):
    chunk_size = 100
    # Print statement to show when a request is being made.
    print(f'Making request to {url}')
    async with session.get(url=url) as r:
        with open('path/name.png', 'wb') as file:
            while True:
                chunk = await r.content.read(chunk_size)
                if not chunk:
                    break
                file.write(chunk)

# List of urls to get images from
urls = [...]

conn = aiohttp.TCPConnector(limit=3)
loop = asyncio.get_event_loop()
session = aiohttp.ClientSession(connector=conn, loop=loop)
loop.run_until_complete(asyncio.gather(*(get_images(url, session=session) for url in urls)))
The problem is, I threw in a print statement to show me when each request is being made, and it makes nearly 21 requests at once instead of the 3 I want to limit it to (i.e., once an image finishes downloading, it can move on to the next URL in the list to fetch). I'm just wondering what I'm doing wrong here.
Solution:

Your limit is configured correctly. You made a mistake while debugging.
As Mikhail Gerasimov pointed out in the comment, you put your print() call in the wrong place: it has to be inside the session.get() context.
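The effect is easy to reproduce with a stdlib-only sketch (no aiohttp here; an asyncio.Semaphore of 3 stands in for the TCPConnector limit, and the fake_get name and timings are made up for illustration): all twenty tasks log "Making request" almost immediately, yet the counter inside the limited section never climbs past three.

```python
import asyncio

async def fake_get(i, sem, counter):
    # This line runs right away for every task, just like the
    # misplaced print before session.get() in the question.
    print(f'CLIENT | Making request to url {i}')
    async with sem:  # stands in for the TCPConnector limit of 3
        counter['active'] += 1
        counter['peak'] = max(counter['peak'], counter['active'])
        # Only here is a "connection" actually held.
        print(f'CLIENT | Downloading url {i}')
        await asyncio.sleep(0.01)  # simulated download
        counter['active'] -= 1

async def main():
    sem = asyncio.Semaphore(3)
    counter = {'active': 0, 'peak': 0}
    await asyncio.gather(*(fake_get(i, sem, counter) for i in range(20)))
    return counter['peak']

peak = asyncio.run(main())
print('Peak concurrency inside the limit:', peak)  # stays at 3
```

So a log line printed before entering the limited context only tells you the task was scheduled, not that a connection exists yet.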
To be sure the limit is respected, I tested your code against a simple logging server, and the test shows that the server receives exactly the number of connections you set in the TCPConnector. Here is the test:
import asyncio
import aiohttp

loop = asyncio.get_event_loop()

class SilentServer(asyncio.Protocol):
    def connection_made(self, transport):
        # We will know when the connection is actually made:
        print('SERVER |', transport.get_extra_info('peername'))

async def get_images(url, session):
    chunk_size = 100
    # This log doesn't guarantee that we will connect,
    # session.get() will freeze if you reach the TCPConnector limit
    print(f'CLIENT | Making request to {url}')
    async with session.get(url=url) as r:
        while True:
            chunk = await r.content.read(chunk_size)
            if not chunk:
                break

urls = [f'http://127.0.0.1:1337/{x}' for x in range(20)]
conn = aiohttp.TCPConnector(limit=3)
session = aiohttp.ClientSession(connector=conn, loop=loop)

async def test():
    await loop.create_server(SilentServer, '127.0.0.1', 1337)
    await asyncio.gather(*(get_images(url, session=session) for url in urls))

loop.run_until_complete(test())
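The same server-side check can be done with the stdlib alone. In this sketch (the handle and client names are made up, and an asyncio.Semaphore plays the role of TCPConnector(limit=3)), a loopback server counts how many connections are open at once:

```python
import asyncio

active = 0  # connections currently open on the server
peak = 0    # highest number seen at any one time

async def handle(reader, writer):
    # Server side: track concurrent connections.
    global active, peak
    active += 1
    peak = max(peak, active)
    await asyncio.sleep(0.02)  # hold the connection briefly
    active -= 1
    writer.close()
    await writer.wait_closed()

async def client(sem, port):
    async with sem:  # client-side cap, like TCPConnector(limit=3)
        reader, writer = await asyncio.open_connection('127.0.0.1', port)
        await reader.read()  # wait until the server closes
        writer.close()
        await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle, '127.0.0.1', 0)
    port = server.sockets[0].getsockname()[1]  # OS-assigned free port
    sem = asyncio.Semaphore(3)
    await asyncio.gather(*(client(sem, port) for _ in range(20)))
    server.close()
    await server.wait_closed()

asyncio.run(main())
print('Peak concurrent connections:', peak)  # never above 3
```

Counting on the server, rather than logging on the client, measures what actually matters: how many sockets are open simultaneously.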