Python's asyncio module enables writing concurrent code using the async/await syntax. It's perfect for I/O-bound operations like network requests, file operations, and database queries.
The Core Concepts
Asyncio is built on three main ideas:
- Coroutines - functions defined with `async def`
- Event loop - runs and manages coroutines
- Tasks - wrap coroutines for concurrent execution
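A minimal sketch tying the three pieces together (the function names here are just illustrative):

```python
import asyncio

async def greet():          # a coroutine: defined with async def
    await asyncio.sleep(0)  # yield control back to the event loop
    return "hi"

async def main():
    task = asyncio.create_task(greet())  # a task: schedules the coroutine
    print(await task)

asyncio.run(main())  # the event loop: runs main() to completion
```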
Your First Coroutine
```python
import asyncio

async def say_hello():
    print("Hello...")
    await asyncio.sleep(1)  # Non-blocking sleep
    print("...World!")

# Run the coroutine
asyncio.run(say_hello())
```

The `await` keyword pauses the coroutine, allowing other tasks to run while waiting.
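To see that handoff in action, here is a small sketch in which two coroutines take turns at each suspension point (the ping/pong names are purely illustrative):

```python
import asyncio

async def ping():
    for _ in range(2):
        print("ping")
        await asyncio.sleep(0)  # suspension point: lets pong() run

async def pong():
    for _ in range(2):
        print("pong")
        await asyncio.sleep(0)

async def main():
    await asyncio.gather(ping(), pong())

asyncio.run(main())  # output interleaves: ping, pong, ping, pong
```

`asyncio.sleep(0)` is the cheapest way to yield control; without the `await`, each coroutine would print both lines before the other ran at all.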
Running Multiple Coroutines
```python
import asyncio

async def fetch_data(name, delay):
    print(f"Fetching {name}...")
    await asyncio.sleep(delay)
    print(f"Got {name}!")
    return f"{name} data"

async def main():
    # Run concurrently with gather
    results = await asyncio.gather(
        fetch_data("users", 2),
        fetch_data("posts", 1),
        fetch_data("comments", 1.5),
    )
    print(f"Results: {results}")

asyncio.run(main())
```

All three "fetches" run concurrently, so the total time is ~2 seconds, not 4.5.
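You can check the timing claim yourself by wrapping the `gather` call with `time.perf_counter`; this sketch reuses a trimmed `fetch_data` with shortened delays so it runs quickly:

```python
import asyncio
import time

async def fetch_data(name, delay):
    await asyncio.sleep(delay)
    return f"{name} data"

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(
        fetch_data("users", 0.5),
        fetch_data("posts", 0.3),
    )
    elapsed = time.perf_counter() - start
    # Elapsed tracks the longest delay (~0.5s), not the sum (0.8s)
    print(f"{len(results)} results in {elapsed:.2f}s")
    return elapsed

asyncio.run(main())
```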
Tasks for Fire-and-Forget
```python
import asyncio

async def background_job():
    while True:
        print("Working...")
        await asyncio.sleep(5)

async def main():
    # Create a task that runs in the background
    task = asyncio.create_task(background_job())
    # Do other work
    await asyncio.sleep(12)
    # Cancel when done
    task.cancel()

asyncio.run(main())
```

Async Context Managers
```python
import asyncio

class AsyncResource:
    async def __aenter__(self):
        print("Acquiring resource...")
        await asyncio.sleep(0.1)
        return self

    async def __aexit__(self, *args):
        print("Releasing resource...")
        await asyncio.sleep(0.1)

async def main():
    async with AsyncResource() as resource:
        print("Using resource")

asyncio.run(main())
```

Async Iterators
```python
import asyncio

async def countdown(n):
    while n > 0:
        yield n
        await asyncio.sleep(0.5)
        n -= 1

async def main():
    async for num in countdown(5):
        print(num)

asyncio.run(main())
```

Timeouts
```python
import asyncio

async def slow_operation():
    await asyncio.sleep(10)
    return "done"

async def main():
    try:
        result = await asyncio.wait_for(
            slow_operation(),
            timeout=2.0,
        )
    except asyncio.TimeoutError:
        print("Operation timed out!")

asyncio.run(main())
```

Semaphores for Rate Limiting
```python
import asyncio

async def fetch_url(semaphore, url):
    async with semaphore:
        print(f"Fetching {url}")
        await asyncio.sleep(1)  # Simulate request
        return f"Response from {url}"

async def main():
    # Limit to 3 concurrent requests
    semaphore = asyncio.Semaphore(3)
    urls = [f"https://api.example.com/{i}" for i in range(10)]
    tasks = [fetch_url(semaphore, url) for url in urls]
    results = await asyncio.gather(*tasks)
    print(f"Got {len(results)} responses")

asyncio.run(main())
```

When to Use asyncio
Good fits:
- Network requests (HTTP, websockets)
- Database queries
- File I/O
- Any I/O-bound waiting
Not ideal for:
- CPU-bound work (use `multiprocessing` instead)
- Simple scripts without concurrency needs
- Code that must stay synchronous
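For CPU-bound work specifically, one option is to hand the computation to a process pool and await the result from the event loop. A sketch under that assumption (the function and numbers are illustrative):

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_heavy(n):
    # Pure-Python number crunching: holds the GIL, so threads wouldn't help
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    # A process pool sidesteps the GIL for CPU-bound work
    with ProcessPoolExecutor() as pool:
        return await loop.run_in_executor(pool, cpu_heavy, 100_000)

if __name__ == "__main__":
    print(asyncio.run(main()))
```

The `if __name__ == "__main__"` guard matters here: process pools may re-import the main module in worker processes.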
Common Patterns
Running sync code in an async context:
```python
import asyncio
import time

def blocking_io():
    # Some blocking operation
    time.sleep(1)
    return "done"

async def main():
    loop = asyncio.get_running_loop()
    # None means the default thread pool executor
    result = await loop.run_in_executor(None, blocking_io)
    print(result)

asyncio.run(main())
```

Queues for producer/consumer:
```python
import asyncio

async def producer(queue):
    for i in range(5):
        await queue.put(f"item-{i}")
        await asyncio.sleep(0.5)
    await queue.put(None)  # Signal done

async def consumer(queue):
    while True:
        item = await queue.get()
        if item is None:
            break
        print(f"Consumed: {item}")

async def main():
    queue = asyncio.Queue()
    await asyncio.gather(
        producer(queue),
        consumer(queue),
    )

asyncio.run(main())
```

Summary
asyncio is Python's answer to concurrent I/O. The async/await syntax keeps it readable, and the event loop handles the complexity of switching between tasks. Start with `asyncio.run()` and `asyncio.gather()`, then explore tasks and semaphores as you need more control.