Introduction

Event-driven programming enables non-blocking execution, making applications responsive and scalable. Python’s AsyncIO and Aiohttp provide powerful tools for handling concurrent tasks efficiently.

πŸ”Ή Why use event-driven programming?
βœ” Handles multiple tasks concurrently on a single thread
βœ” Keeps the CPU busy instead of idling during I/O waits
βœ” Optimizes I/O-bound applications

This guide explores AsyncIO and Aiohttp, covering:
βœ… AsyncIO fundamentals
βœ… Aiohttp for async web requests
βœ… Building real-world async applications


1️⃣ Understanding AsyncIO

What is AsyncIO?

AsyncIO is Python’s built-in library for asynchronous programming. It enables applications to:
βœ” Run multiple tasks concurrently on a single thread
βœ” Use an event loop to manage execution
βœ” Avoid blocking on slow I/O operations

Key Concepts

βœ” Coroutines: Functions defined with async def; calling one returns a coroutine object
βœ” Event Loop: Schedules and runs coroutines
βœ” Tasks & Futures: Wrap coroutines so they can be scheduled and awaited concurrently

Example: Running a Simple Coroutine

import asyncio

async def greet():
    print("Hello,")
    await asyncio.sleep(1)
    print("AsyncIO!")

asyncio.run(greet())

βœ” await suspends the coroutine without blocking the event loop
βœ” asyncio.run() starts the event loop and runs the coroutine to completion
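
The Tasks concept above can be seen directly with asyncio.create_task(), which wraps a coroutine in a Task and schedules it on the event loop immediately. A minimal sketch (say_later is an illustrative name):

import asyncio

async def say_later(message, delay):
    await asyncio.sleep(delay)
    print(message)

async def main():
    # create_task() schedules the coroutine right away and returns a Task
    task = asyncio.create_task(say_later("Hello from a Task", 1))
    print("Task scheduled, doing other work...")
    await task  # suspend until the Task finishes

asyncio.run(main())

Because the Task starts as soon as it is created, "Task scheduled..." prints before the message from the coroutine.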


2️⃣ Managing Multiple Tasks with AsyncIO

Creating Multiple Coroutines

async def task1():
    await asyncio.sleep(2)
    print("Task 1 completed")

async def task2():
    await asyncio.sleep(1)
    print("Task 2 completed")

async def main():
    await asyncio.gather(task1(), task2())

asyncio.run(main())

πŸ”Ή Key Insights:
βœ” asyncio.gather() runs the coroutines concurrently and collects their results
βœ” Task 2 finishes first because its sleep is shorter, even though it was passed second
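
Running this prints completions in finish order, not call order, and the whole run takes about 2 seconds (the longest sleep) rather than 3, because the waits overlap:

Task 2 completed
Task 1 completed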


3️⃣ Using Aiohttp for Async Web Requests

Why Aiohttp?

Aiohttp is an asynchronous HTTP client/server library designed for:
βœ” Non-blocking API requests
βœ” Handling many concurrent connections on a single thread
βœ” Web scraping & real-time data fetching

Example: Fetching Data Asynchronously

import aiohttp  
import asyncio

async def fetch_data(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.text()

async def main():
    url = "https://jsonplaceholder.typicode.com/todos/1"
    data = await fetch_data(url)
    print(data)

asyncio.run(main())

πŸ”Ή How it works:
βœ” aiohttp.ClientSession() manages connection pooling and HTTP sessions
βœ” await response.text() reads the response body without blocking the event loop
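
Because this endpoint returns JSON, the body can also be parsed directly with response.json(). A small variant of the example above (fetch_json is an illustrative name):

async def fetch_json(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.json()  # parse the body as JSON

async def main():
    todo = await fetch_json("https://jsonplaceholder.typicode.com/todos/1")
    print(todo["title"])

asyncio.run(main())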


4️⃣ Handling Timeouts and Errors

Setting a Timeout for HTTP Requests

async def fetch_with_timeout(url):
    try:
        async with aiohttp.ClientSession() as session:
            # ClientTimeout(total=3) caps the whole request at 3 seconds
            async with session.get(url, timeout=aiohttp.ClientTimeout(total=3)) as response:
                return await response.text()
    except asyncio.TimeoutError:
        print("Request timed out!")

asyncio.run(fetch_with_timeout("https://httpbin.org/delay/5"))

πŸ”Ή Key Takeaways:
βœ” aiohttp.ClientTimeout(total=3) cancels requests that take longer than 3 seconds
βœ” Catching asyncio.TimeoutError keeps a slow endpoint from crashing the program
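
Timeouts are only one failure mode; connection failures and bad status codes surface as aiohttp.ClientError. A minimal sketch handling both (fetch_safely is an illustrative name):

async def fetch_safely(url):
    try:
        async with aiohttp.ClientSession() as session:
            async with session.get(url, timeout=aiohttp.ClientTimeout(total=3)) as response:
                response.raise_for_status()  # raise ClientResponseError on 4xx/5xx
                return await response.text()
    except asyncio.TimeoutError:
        print("Request timed out!")
    except aiohttp.ClientError as e:
        print(f"Request failed: {e}")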


5️⃣ Building an Async Web Scraper

Fetching Multiple Pages Concurrently

async def fetch_page(session, url):
    async with session.get(url) as response:
        return await response.text()

async def scrape_pages():
    urls = ["https://example.com", "https://jsonplaceholder.typicode.com/todos/1"]
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_page(session, url) for url in urls]
        results = await asyncio.gather(*tasks)
        for result in results:
            print(result[:100])  # Print first 100 characters

asyncio.run(scrape_pages())

βœ” Fetches multiple pages concurrently over one shared session
βœ” Overlaps network waits, so total time is roughly that of the slowest request
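
When scraping many URLs, it is good practice to cap how many requests run at once so the target server is not overwhelmed. A hedged sketch using asyncio.Semaphore (the function names and the limit of 10 are illustrative choices):

async def fetch_limited(semaphore, session, url):
    async with semaphore:  # allow at most N requests in flight at once
        async with session.get(url) as response:
            return await response.text()

async def scrape_many(urls):
    semaphore = asyncio.Semaphore(10)  # illustrative limit; tune for the target server
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_limited(semaphore, session, url) for url in urls]
        return await asyncio.gather(*tasks)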


Conclusion

Python’s AsyncIO and Aiohttp make event-driven programming powerful and efficient.

Key Takeaways:

βœ… Use AsyncIO for non-blocking execution
βœ… Leverage Aiohttp for async HTTP requests
βœ… Handle timeouts & errors gracefully
βœ… Optimize performance with async tasks

πŸ“Œ Start using event-driven programming today to build scalable, high-performance applications! πŸš€