Composing functions is a core technique in functional programming that lets you build complex operations by chaining simple, reusable functions — each taking an input, transforming it, and passing the result to the next. In Python, function composition creates elegant, readable data processing pipelines, avoids intermediate variables/lists, and promotes modularity/testability. In 2026, composition remains essential — powering clean ETL, feature engineering, data validation, ML preprocessing, and Polars/pandas method chaining, where small pure functions combine into powerful transformations without side effects or memory bloat.
Here’s a complete, practical guide to composing functions in Python: manual composition, compose() helpers, functools.reduce, real-world pipelines, and modern best practices with type hints, Polars/pandas chaining, performance, and testing.
Manual composition — nest function calls for simple pipelines.
def double(x: float) -> float:
    return x * 2

def square(x: float) -> float:
    return x ** 2

def add_one(x: float) -> float:
    return x + 1

# Compose manually: add_one, then double, then square
result = square(double(add_one(5)))  # (5 + 1) * 2 = 12, then 12 ** 2 = 144
print(result)  # 144
Reusable compose() function — general-purpose composition from right to left (g then f).
from typing import Callable, Any
from functools import reduce

def compose(*functions: Callable) -> Callable:
    """Compose functions right to left: compose(f, g)(x) == f(g(x))."""
    return lambda x: reduce(lambda acc, f: f(acc), reversed(functions), x)

# Usage
pipeline = compose(square, double, add_one)
print(pipeline(5))  # 144

# Alternative: left-to-right composition with reduce
def compose_ltr(*functions: Callable) -> Callable:
    """Compose functions left to right: compose_ltr(f, g)(x) == g(f(x))."""
    return lambda x: reduce(lambda acc, f: f(acc), functions, x)

pipeline_ltr = compose_ltr(add_one, double, square)
print(pipeline_ltr(5))  # same result: 144
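Because compose_ltr accepts any number of callables, a pipeline can also be assembled dynamically from a list of steps built at runtime. A minimal sketch (the include_square flag is a hypothetical configuration value, not from the original):

```python
from functools import reduce
from typing import Callable

def compose_ltr(*functions: Callable) -> Callable:
    """Left-to-right composition, as defined above."""
    return lambda x: reduce(lambda acc, f: f(acc), functions, x)

def add_one(x: float) -> float:
    return x + 1

def double(x: float) -> float:
    return x * 2

def square(x: float) -> float:
    return x ** 2

# Build the step list at runtime, e.g. from configuration.
steps = [add_one, double]
include_square = True  # hypothetical config flag
if include_square:
    steps.append(square)

pipeline = compose_ltr(*steps)
print(pipeline(5))  # (5 + 1) * 2 = 12, then 12 ** 2 = 144
```

This is the "dynamic composition from a list of functions" pattern: the pipeline's shape is data, so steps can be toggled, reordered, or loaded from config without touching the composition machinery.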
Real-world pattern: composing transformations in pandas/Polars pipelines — chain small functions for cleaning/feature engineering.
import numpy as np
import pandas as pd

def normalize(col):
    return (col - col.mean()) / col.std()

def clip_outliers(col, lower=0.05, upper=0.95):
    return col.clip(lower=col.quantile(lower), upper=col.quantile(upper))

def log_transform(col):
    return np.log1p(col)

# Compose the pipeline: normalize first, then clip outliers, then log-transform
transform_pipeline = compose(log_transform, clip_outliers, normalize)

df = pd.DataFrame({'value': [1, 2, 1000, 4, -10]})
df['transformed'] = transform_pipeline(df['value'])
print(df)
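The same pipeline can also be expressed with pandas' built-in Series.pipe, which chains steps left to right in reading order. A sketch using the same three functions:

```python
import numpy as np
import pandas as pd

def normalize(col: pd.Series) -> pd.Series:
    return (col - col.mean()) / col.std()

def clip_outliers(col: pd.Series, lower: float = 0.05, upper: float = 0.95) -> pd.Series:
    return col.clip(lower=col.quantile(lower), upper=col.quantile(upper))

def log_transform(col: pd.Series) -> pd.Series:
    return np.log1p(col)

df = pd.DataFrame({'value': [1, 2, 1000, 4, -10]})

# Series.pipe applies steps left to right, mirroring
# compose(log_transform, clip_outliers, normalize), which applies right to left.
df['transformed'] = (
    df['value']
    .pipe(normalize)
    .pipe(clip_outliers)
    .pipe(log_transform)
)
print(df)
```

The .pipe form reads in execution order, which many teams find easier to review than right-to-left compose calls; the two are equivalent here.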
Best practices make function composition safe, readable, and performant:

- Prefer small, pure functions: no side effects, single responsibility.
- Use Polars method chaining: df.with_columns(...).filter(...).group_by(...) is native composition and often faster.
- Use functools.reduce for dynamic composition from a list of functions.
- Add type hints: def compose(*fs: Callable[[Any], Any]) -> Callable[[Any], Any].
- Preserve metadata: apply functools.wraps when a composition returns a named wrapper function.
- Consider libraries with composition utilities, such as toolz (compose, compose_left) or funcy.
- Avoid deep nesting: limit manual nesting to 3 to 5 functions; refactor complex pipelines into named steps.
- Test compositions: unit-test each function, plus an integration test for the whole pipeline.
- Use generator-based composition for lazy pipelines: map(f, filter(g, data)).
- Use Polars expressions: pl.col('value').map_elements(normalize) gives composable lazy operations.
- Note that Python has no built-in pipe operator; pipeline syntax such as value | double | square comes from third-party libraries that overload operators.
- Profile pipelines with timeit or line_profiler on the composed functions.
- Document pipelines: docstrings with input/output types and the ordered steps.
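The generator-based tip above can be sketched with only the standard library. The step names (parse, is_valid) are illustrative, not from the original:

```python
def parse(line: str) -> int:
    """Hypothetical step: parse one text record into an int."""
    return int(line.strip())

def is_valid(n: int) -> bool:
    """Hypothetical filter: keep only non-negative values."""
    return n >= 0

def double(n: int) -> int:
    return n * 2

data = [" 1 ", "-5", "3 ", "10"]

# map/filter compose lazily: nothing executes until the result is
# consumed, and no intermediate lists are materialized.
lazy_pipeline = map(double, filter(is_valid, map(parse, data)))
print(list(lazy_pipeline))  # [2, 6, 20]
```

Because each stage yields items one at a time, this pattern keeps memory flat even on large inputs, which is the "lazy pipelines" payoff the tip refers to.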
Composing functions builds complex pipelines from simple, reusable pieces — manual nesting, compose() helpers, reduce, or Polars chaining. In 2026, prefer small pure functions, type hints, Polars lazy chaining, and testing each step. Master composition, and you’ll write modular, testable, memory-efficient Python code that transforms data elegantly and scalably.
Next time you need a multi-step transformation — compose functions. It’s Python’s cleanest way to say: “Chain these steps — each does one thing well.”