Asyncio Module: Asynchronous Programming With Async/Await
TL;DR
Asyncio enables asynchronous programming with async/await syntax, allowing single-threaded concurrent execution that is ideal for I/O-bound tasks like web requests and file operations.
Interesting!
Asyncio can handle thousands of concurrent connections with minimal memory overhead; a single asyncio application can often outperform a traditional threaded server by avoiding context-switching costs.
Basic Async Concepts
python code snippet start
import asyncio
import time

async def fetch_data(name, delay):
    print(f"Starting {name}")
    await asyncio.sleep(delay)  # Simulates an I/O operation
    print(f"Finished {name}")
    return f"Data from {name}"

async def main():
    # Sequential execution - slow
    start = time.time()
    result1 = await fetch_data("API 1", 2)
    result2 = await fetch_data("API 2", 1)
    print(f"Sequential: {time.time() - start:.1f}s")

    # Concurrent execution - fast
    start = time.time()
    results = await asyncio.gather(
        fetch_data("API 3", 2),
        fetch_data("API 4", 1)
    )
    print(f"Concurrent: {time.time() - start:.1f}s")

# Run the async function
asyncio.run(main())
python code snippet end
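One detail worth knowing about `asyncio.gather` is that it returns results in the order the awaitables were passed in, not in the order they finish. A minimal, runnable sketch (the `work` coroutine and labels are illustrative):

```python
import asyncio

async def work(label, delay):
    # Simulate an I/O wait, then return a labeled result
    await asyncio.sleep(delay)
    return label

async def main():
    # "slow" finishes last but still appears first in the results,
    # because gather preserves the order of its arguments
    return await asyncio.gather(work("slow", 0.2), work("fast", 0.1))

results = asyncio.run(main())
print(results)  # ['slow', 'fast']
```

This ordering guarantee is what lets you zip `gather` results back to the inputs that produced them.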
HTTP Requests with aiohttp
python code snippet start
import aiohttp
import asyncio

async def fetch_url(session, url):
    try:
        async with session.get(url) as response:
            return await response.text()
    except Exception as e:
        return f"Error fetching {url}: {e}"

async def fetch_multiple_urls():
    urls = [
        "https://httpbin.org/delay/1",
        "https://httpbin.org/delay/2",
        "https://httpbin.org/delay/1"
    ]
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_url(session, url) for url in urls]
        results = await asyncio.gather(*tasks)
        return results

# This completes in ~2 seconds instead of ~4 seconds
results = asyncio.run(fetch_multiple_urls())
python code snippet end
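Fanning out an unbounded number of requests can overwhelm a server, so a common refinement is to cap how many run at once with `asyncio.Semaphore`. The sketch below simulates the request with `asyncio.sleep` so it runs without a network (the `fetch_simulated` helper and example URLs are illustrative stand-ins, not part of aiohttp):

```python
import asyncio

async def fetch_simulated(sem, url):
    # Acquire the semaphore before starting the "request"
    async with sem:
        await asyncio.sleep(0.1)  # Stand-in for session.get(url)
        return f"ok: {url}"

async def fetch_all(urls, limit=2):
    sem = asyncio.Semaphore(limit)  # At most `limit` requests in flight
    return await asyncio.gather(*(fetch_simulated(sem, u) for u in urls))

urls = [f"https://example.com/{i}" for i in range(5)]
results = asyncio.run(fetch_all(urls))
print(results)
```

In real aiohttp code you would pass the session alongside the semaphore and wrap the `session.get` call in the same `async with sem:` block.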
Task Management
python code snippet start
import asyncio

async def background_task(name):
    while True:
        print(f"{name} working...")
        await asyncio.sleep(1)

async def main():
    # Create background tasks
    task1 = asyncio.create_task(background_task("Worker 1"))
    task2 = asyncio.create_task(background_task("Worker 2"))

    # Run for 5 seconds then cancel
    await asyncio.sleep(5)
    task1.cancel()
    task2.cancel()

    # Wait for cancellation to complete
    try:
        await task1
        await task2
    except asyncio.CancelledError:
        print("Tasks cancelled")

asyncio.run(main())
python code snippet end
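Cancellation is also how asyncio implements timeouts: `asyncio.wait_for` cancels the wrapped coroutine if it does not finish in time and raises `asyncio.TimeoutError`. A small runnable sketch (the `slow_operation` coroutine is illustrative):

```python
import asyncio

async def slow_operation():
    # Pretends to be a long-running I/O call
    await asyncio.sleep(10)
    return "done"

async def main():
    try:
        # Give the operation 0.1s before it is cancelled
        return await asyncio.wait_for(slow_operation(), timeout=0.1)
    except asyncio.TimeoutError:
        return "timed out"

outcome = asyncio.run(main())
print(outcome)  # timed out
```

This is the usual way to bound how long any awaited operation may block your coroutine.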
Producer-Consumer Pattern
python code snippet start
import asyncio

async def producer(queue):
    for i in range(5):
        await asyncio.sleep(1)
        await queue.put(f"item-{i}")
        print(f"Produced item-{i}")
    await queue.put(None)  # Signal completion

async def consumer(name, queue):
    while True:
        item = await queue.get()
        if item is None:
            # Re-queue the sentinel so the other consumer also stops
            await queue.put(None)
            queue.task_done()
            break
        print(f"{name} consumed {item}")
        await asyncio.sleep(0.5)
        queue.task_done()

async def main():
    queue = asyncio.Queue(maxsize=2)
    # Start producer and consumers
    await asyncio.gather(
        producer(queue),
        consumer("Consumer-1", queue),
        consumer("Consumer-2", queue)
    )

asyncio.run(main())
python code snippet end
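Sentinels work, but they must be propagated to every consumer. An alternative shutdown sketch uses `queue.join()` to wait until all items are processed, then cancels the idle consumers directly (the `consumer` signature with a shared `results` list is an illustrative choice, not a required pattern):

```python
import asyncio

async def consumer(queue, results):
    while True:
        item = await queue.get()
        results.append(item)
        queue.task_done()  # Mark the item as fully processed

async def main():
    queue = asyncio.Queue()
    results = []
    workers = [asyncio.create_task(consumer(queue, results)) for _ in range(2)]
    for i in range(5):
        await queue.put(i)
    await queue.join()  # Returns once task_done() has been called for every item
    for w in workers:
        w.cancel()  # No sentinel needed: cancel consumers waiting on get()
    return results

processed = asyncio.run(main())
print(sorted(processed))  # [0, 1, 2, 3, 4]
```

This keeps the consumer loop free of shutdown logic at the cost of cancelling tasks explicitly.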
Asyncio transforms I/O-bound applications by enabling concurrent execution without the complexity of threading.
The async/await syntax used throughout this module was introduced in PEP 492, which established the foundation for modern Python asynchronous programming. For creating asynchronous data sources elegantly, asynchronous generators combine async def with yield for powerful iteration patterns. If you want structured concurrency and backend flexibility, check out the asyncio vs anyio comparison. Asyncio integrates well with urllib for HTTP requests and logging for async debugging. For error handling in async code, see exception handling patterns and JSON processing for API data handling. Under the hood, asyncio provides non-blocking operations using the socket module.
Reference: Python Asyncio Module Documentation