When I first saw Python decorators, they looked like dark magic. That little @ symbol sitting above a function felt like an incantation I wasn't supposed to understand. But here's the thing—decorators are just functions. Once that clicked, everything else fell into place.
Let me walk you through my journey of demystifying decorators.
What Are Decorators, Really?
A decorator is a function that takes another function and extends its behavior without modifying the original function's code. That's it. No magic.
The @decorator syntax is just shorthand:
```python
@my_decorator
def say_hello():
    print("Hello!")

# This is exactly the same as:
def say_hello():
    print("Hello!")

say_hello = my_decorator(say_hello)
```

The second version makes it clearer: we're passing `say_hello` to `my_decorator`, and whatever comes back replaces the original function.
The Basic Decorator Pattern
Let's build a decorator from scratch. The pattern always looks like this:
```python
def my_decorator(func):
    def wrapper(*args, **kwargs):
        # Do something before
        result = func(*args, **kwargs)
        # Do something after
        return result
    return wrapper
```

Here's what's happening:

- `my_decorator` receives a function as its argument
- It defines a new function called `wrapper` that will replace the original
- `wrapper` calls the original function and can do things before/after
- We return `wrapper` (not calling it, just returning the function itself)
Let's see it in action:
```python
def shout(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        return result.upper() if isinstance(result, str) else result
    return wrapper

@shout
def greet(name):
    return f"hello, {name}"

print(greet("world"))  # HELLO, WORLD
```

The `greet` function now runs through `shout`'s wrapper, which uppercases the result.
Why *args, **kwargs?
You'll notice every wrapper uses *args, **kwargs. This is crucial—it lets your decorator work with any function, regardless of its signature:
```python
def log_call(func):
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__} with args={args}, kwargs={kwargs}")
        return func(*args, **kwargs)
    return wrapper

@log_call
def add(a, b):
    return a + b

@log_call
def greet(name, enthusiastic=False):
    return f"Hello, {name}{'!' if enthusiastic else ''}"

add(2, 3)                          # Calling add with args=(2, 3), kwargs={}
greet("Owen", enthusiastic=True)   # Calling greet with args=('Owen',), kwargs={'enthusiastic': True}
```

Without `*args, **kwargs`, you'd have to write a different decorator for every function signature. That's not fun.
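To see why, here's a deliberately broken variant (a hypothetical `bad_log_call`, just for illustration) whose wrapper takes no parameters:

```python
def bad_log_call(func):
    def wrapper():  # no *args, **kwargs: nothing can be forwarded
        print(f"Calling {func.__name__}")
        return func()
    return wrapper

@bad_log_call
def subtract(a, b):
    return a - b

try:
    subtract(5, 2)
except TypeError as e:
    # The call fails before the wrapper body even runs,
    # because wrapper() accepts no arguments
    print(f"Broke immediately: {e}")
```

The wrapper works only for zero-argument functions; everything else blows up at the call site.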
The functools.wraps Lifesaver
Here's something that bit me early on. Try this:
```python
def my_decorator(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@my_decorator
def say_hello():
    """A friendly greeting function."""
    print("Hello!")

print(say_hello.__name__)  # wrapper (!)
print(say_hello.__doc__)   # None (!!)
```

The decorated function loses its identity! The `__name__` becomes "wrapper" and the docstring vanishes. This breaks debugging, logging, and documentation tools.
The fix is functools.wraps:
```python
from functools import wraps

def my_decorator(func):
    @wraps(func)  # This copies metadata from func to wrapper
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@my_decorator
def say_hello():
    """A friendly greeting function."""
    print("Hello!")

print(say_hello.__name__)  # say_hello ✓
print(say_hello.__doc__)   # A friendly greeting function. ✓
```

Always use `@wraps(func)` in your decorators. It's one line that saves you from mysterious bugs later.
Decorators with Arguments
What if you want to configure your decorator? Like @retry(times=3) instead of just @retry?
You need to add another layer. The decorator that takes arguments returns the actual decorator:
```python
from functools import wraps

def repeat(times):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(times):
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator

@repeat(times=3)
def say_hello():
    print("Hello!")

say_hello()
# Hello!
# Hello!
# Hello!
```

It's functions all the way down:

1. `repeat(times=3)` is called and returns `decorator`
2. `decorator` is applied to `say_hello` and returns `wrapper`
3. `say_hello` is now `wrapper`
This nesting confused me at first. The trick is to think of it as: the outer function captures configuration, the middle function captures the original function, and the inner function does the actual work.
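To make the three layers concrete, here's the same `repeat` decorator applied step by step by hand, without the @ syntax (the decorator is repeated so the sketch is self-contained):

```python
from functools import wraps

def repeat(times):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(times):
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator

def say_hello():
    print("Hello!")

# @repeat(times=3) performs exactly these two calls:
decorator = repeat(times=3)        # outer call: captures the configuration
say_hello = decorator(say_hello)   # middle call: captures the function
say_hello()                        # inner wrapper does the actual work
```

Nothing about the @ syntax is special; it's just these calls in a fixed order.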
Class-Based Decorators
You can also use classes as decorators. This is useful when your decorator needs to maintain state:
```python
from functools import wraps

class CountCalls:
    def __init__(self, func):
        wraps(func)(self)  # copy func's metadata onto the instance
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        print(f"{self.func.__name__} has been called {self.count} times")
        return self.func(*args, **kwargs)

@CountCalls
def process_data():
    print("Processing...")

process_data()  # process_data has been called 1 times
process_data()  # process_data has been called 2 times
process_data()  # process_data has been called 3 times
```

The class implements `__call__`, which makes instances callable like functions. When you use `@CountCalls`, Python creates a `CountCalls` instance wrapping your function, and every call goes through `__call__`.
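A nice side effect: the call counter is ordinary instance state, so you can read it from outside. A condensed, print-free variant of the same idea:

```python
from functools import wraps

class CountCalls:
    def __init__(self, func):
        wraps(func)(self)  # copy metadata onto the instance
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

@CountCalls
def ping():
    return "pong"

ping()
ping()
print(ping.count)     # 2 -- state persists between calls
print(ping.__name__)  # ping -- metadata survived via wraps
```

With a plain closure-based decorator you'd need `nonlocal` or a function attribute to get the same effect; a class makes the state explicit.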
Here's a more practical class-based decorator with arguments:
```python
import time
from functools import wraps

class RateLimit:
    def __init__(self, max_calls, period):
        self.max_calls = max_calls
        self.period = period
        self.calls = []

    def __call__(self, func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            now = time.time()
            # Remove calls outside the period
            self.calls = [t for t in self.calls if now - t < self.period]
            if len(self.calls) >= self.max_calls:
                raise RuntimeError(f"Rate limit exceeded: {self.max_calls} calls per {self.period}s")
            self.calls.append(now)
            return func(*args, **kwargs)
        return wrapper

@RateLimit(max_calls=3, period=60)
def call_api():
    print("API called")
```

Practical Examples
Let's look at decorators you'll actually use in real projects.
Timing Decorator
Measuring how long functions take:
```python
import time
from functools import wraps

def timer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f} seconds")
        return result
    return wrapper

@timer
def slow_operation():
    time.sleep(1)
    return "done"

slow_operation()  # slow_operation took 1.0012 seconds
```

Logging Decorator
Add logging to any function without touching its code:
```python
import logging
from functools import wraps

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def log_calls(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        logger.info(f"Calling {func.__name__} with args={args}, kwargs={kwargs}")
        try:
            result = func(*args, **kwargs)
            logger.info(f"{func.__name__} returned {result!r}")
            return result
        except Exception as e:
            logger.exception(f"{func.__name__} raised {type(e).__name__}: {e}")
            raise
    return wrapper

@log_calls
def divide(a, b):
    return a / b

divide(10, 2)  # Logs the call and result
divide(10, 0)  # Logs the call and the exception, then re-raises ZeroDivisionError
```

Caching Decorator
Avoid recomputing expensive results:
```python
from functools import wraps

def cache(func):
    cached_results = {}

    @wraps(func)
    def wrapper(*args):
        if args in cached_results:
            print(f"Cache hit for {args}")
            return cached_results[args]
        result = func(*args)
        cached_results[args] = result
        return result
    return wrapper

@cache
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(10))  # Computed
print(fibonacci(10))  # Cache hit for (10,)
```

In practice, use `@functools.lru_cache`, which handles edge cases and has a maximum size:
```python
from functools import lru_cache

@lru_cache(maxsize=128)
def expensive_computation(n):
    # Results are cached automatically
    return n ** n
```

Retry Decorator
Handle flaky operations gracefully:
```python
import random
import time
from functools import wraps

def retry(max_attempts=3, delay=1.0, exceptions=(Exception,)):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as e:
                    last_exception = e
                    if attempt < max_attempts:
                        print(f"Attempt {attempt} failed: {e}. Retrying in {delay}s...")
                        time.sleep(delay)
            raise last_exception
        return wrapper
    return decorator

@retry(max_attempts=3, delay=0.5, exceptions=(ConnectionError, TimeoutError))
def fetch_data(url):
    # Might fail due to network issues
    if random.random() < 0.7:
        raise ConnectionError("Network hiccup")
    return {"data": "success"}
```

Validation Decorator
Enforce preconditions:
```python
from functools import wraps

def validate_types(**expected_types):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            # Check keyword arguments
            for arg_name, expected_type in expected_types.items():
                if arg_name in kwargs:
                    if not isinstance(kwargs[arg_name], expected_type):
                        raise TypeError(
                            f"{arg_name} must be {expected_type.__name__}, "
                            f"got {type(kwargs[arg_name]).__name__}"
                        )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@validate_types(name=str, age=int)
def create_user(name, age):
    return {"name": name, "age": age}

create_user(name="Owen", age=25)    # Works
create_user(name="Owen", age="25")  # TypeError: age must be int, got str
```

Note that this simple version only checks keyword arguments; positional arguments pass through unvalidated.

Stacking Decorators
You can apply multiple decorators to a single function. They're applied bottom-up:
```python
@decorator_a
@decorator_b
@decorator_c
def my_function():
    pass

# Equivalent to:
my_function = decorator_a(decorator_b(decorator_c(my_function)))
```

The innermost decorator (`@decorator_c`) wraps the function first, then `@decorator_b` wraps that, and finally `@decorator_a` wraps the whole thing.
Here's a practical example:
```python
@timer
@log_calls
@retry(max_attempts=3)
def fetch_user(user_id):
    # Fetch from API
    pass
```

When you call fetch_user:

1. `timer` starts its stopwatch
2. `log_calls` logs the call
3. `retry` attempts the actual function up to 3 times
4. `log_calls` logs the result
5. `timer` prints the elapsed time
Order matters! If you swapped timer and retry, each retry attempt would be timed separately instead of timing the whole retry loop.
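You can watch the ordering directly with two minimal tracing decorators (hypothetical names, just for illustration):

```python
from functools import wraps

def outer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("outer: before")
        result = func(*args, **kwargs)
        print("outer: after")
        return result
    return wrapper

def inner(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("inner: before")
        result = func(*args, **kwargs)
        print("inner: after")
        return result
    return wrapper

@outer
@inner
def task():
    print("task body")

task()
# outer: before
# inner: before
# task body
# inner: after
# outer: after
```

Applied bottom-up, but executed top-down: the decorator listed first runs its "before" code first and its "after" code last.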
When to Use Decorators
Decorators shine when you have cross-cutting concerns—functionality that applies to many functions:
- Logging: Log every function call in a module
- Timing: Profile specific functions during development
- Caching: Memoize expensive computations
- Retrying: Handle transient failures in network calls
- Authentication: Enforce access control on API endpoints
- Validation: Check inputs before processing
- Rate limiting: Prevent abuse of external services
The rule of thumb: if you find yourself copy-pasting the same before/after code into multiple functions, that's a decorator waiting to happen.
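As a sketch of that refactor, here's repeated before/after boilerplate hoisted into a single decorator (a hypothetical `transactional`, with prints standing in for real transaction calls):

```python
from functools import wraps

def transactional(func):
    # The begin/commit/rollback code used to be copy-pasted
    # into every function that touched the database
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("begin transaction")
        try:
            result = func(*args, **kwargs)
            print("commit")
            return result
        except Exception:
            print("rollback")
            raise
    return wrapper

@transactional
def save_order(order):
    print(f"saving {order}")

save_order("order-42")
# begin transaction
# saving order-42
# commit
```

The function body keeps only its own logic; the shared before/after code lives in one place.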
Key Takeaways
- Decorators are just functions that take a function and return a (usually different) function
- Always use `@functools.wraps` to preserve function metadata
- Use `*args, **kwargs` in your wrapper to work with any function signature
- Decorators with arguments need an extra layer of nesting
- Class-based decorators are useful when you need to maintain state
- Stacking order matters—decorators apply bottom-up
The @ syntax is just syntactic sugar. Once you see past it, decorators become another tool in your Python toolkit—a powerful one that helps you write cleaner, more maintainable code.