breakpoint() is a built-in debugging function (Python 3.7+) that pauses execution and drops you into an interactive debugger — the modern, zero-import replacement for pdb.set_trace(). It respects the PYTHONBREAKPOINT environment variable, allowing you to choose any debugger (pdb, ipdb, pudb, web-pdb, remote-pdb, etc.) without changing code. In 2026, breakpoint() remains the go-to debugging tool for data science notebooks, scripts, pipelines (Dask, Prefect, Airflow), ML training loops, async code, and production services — enabling quick inspection of variables, stack traces, and state at any point, with seamless integration into VS Code, PyCharm, Jupyter, and remote debugging workflows.
Here’s a complete, practical guide to using breakpoint() in Python: basic usage, environment variable customization, popular debuggers, real-world patterns (earthquake data debugging, Dask pipeline inspection), and modern best practices with conditional breakpoints, remote debugging, IDE integration, and alternatives (logging, assertions, pdb/ipdb).
Basic breakpoint() — insert anywhere to pause and debug interactively.
import pandas as pd

def process_earthquake_data(df):
    print("Starting processing...")
    breakpoint()  # execution pauses here, drops into debugger
    strong = df[df['mag'] >= 7.0]
    breakpoint()  # inspect 'strong' DataFrame
    mean_mag = strong['mag'].mean()
    print(f"Mean magnitude of strong events: {mean_mag}")
    return mean_mag

# Run the function
df = pd.read_csv('earthquakes.csv')
process_earthquake_data(df)
At each breakpoint(), you enter the debugger (default: pdb). Common commands:
h or help — list commands
l or list — show surrounding code
p var — print variable value
pp df.head() — pretty-print DataFrame head
n or next — step over next line
s or step — step into function call
c or continue — continue until next breakpoint
q or quit — exit debugger
where or bt — show stack trace
up / down — navigate stack frames
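These commands can also be scripted: when stdin is not a terminal, pdb reads its commands from it, which lets you replay a debugging session non-interactively (useful in CI or for reproducing a bug report). A minimal sketch using only the standard library:

```python
import subprocess
import sys
import textwrap

# A tiny script that hits a breakpoint; we drive pdb by piping
# commands to stdin: 'p mags' prints the variable, 'c' continues.
script = textwrap.dedent("""
    mags = [7.1, 7.8, 6.9]
    breakpoint()
    print("done")
""")

proc = subprocess.run(
    [sys.executable, "-c", script],
    input="p mags\nc\n",          # debugger commands, one per line
    capture_output=True, text=True,
)
print(proc.stdout)  # contains the printed list and 'done'
```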
Custom debugger via PYTHONBREAKPOINT — switch debuggers without code changes.
# Terminal / environment
export PYTHONBREAKPOINT=ipdb.set_trace # ipdb (richer pdb)
export PYTHONBREAKPOINT=pudb.set_trace # pudb (full-screen curses UI)
export PYTHONBREAKPOINT=web_pdb.set_trace # web-pdb (browser-based)
export PYTHONBREAKPOINT=0 # disable all breakpoints
# Or in code (overrides env var)
import os
os.environ['PYTHONBREAKPOINT'] = 'ipdb.set_trace'
def my_func():
    breakpoint()  # now uses ipdb automatically
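Under the hood, breakpoint() simply calls sys.breakpointhook() (PEP 553), and you can replace that hook entirely — for example, to log the call site instead of pausing. Note that assigning sys.breakpointhook directly bypasses PYTHONBREAKPOINT. The hits list and logging_hook name here are illustrative, not a standard API:

```python
import sys

hits = []  # record of breakpoint sites, for demonstration

def logging_hook(*args, **kwargs):
    # Frame 1 is the caller of breakpoint(); the builtin adds no Python frame.
    frame = sys._getframe(1)
    hits.append((frame.f_code.co_name, frame.f_lineno))
    print(f"breakpoint hit in {frame.f_code.co_name} at line {frame.f_lineno}")

sys.breakpointhook = logging_hook  # replaces the default pdb behavior

def compute(x):
    y = x * 2
    breakpoint()  # calls logging_hook instead of starting pdb
    return y

print(compute(21))  # prints the hook message, then 42
```

Restore the default afterwards with `sys.breakpointhook = sys.__breakpointhook__`.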
Real-world pattern: debugging Dask pipeline for earthquake data — inspect intermediate DataFrames.
import dask.dataframe as dd
# parse_dates is needed so the .dt accessor works on 'time' below
ddf = dd.read_csv('usgs/*.csv', assume_missing=True, parse_dates=['time'])
strong = ddf[ddf['mag'] >= 7.0]
breakpoint() # inspect 'strong' Dask DataFrame (compute head, dtypes, etc.)
enriched = strong.assign(
    year=strong['time'].dt.year,
    country=strong['place'].str.split(',').str[-1].str.strip()
)
breakpoint() # check enriched columns, partitions
agg = enriched.groupby('country')['mag'].mean().persist()
breakpoint() # inspect persisted aggregation graph
result = agg.compute()
print(result.head(10))
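In long pipelines you usually want to pause only when the data looks wrong, not on every batch. A pure-Python sketch of that pattern — validate_batch and the PIPELINE_DEBUG flag are hypothetical names for illustration, not part of Dask or any library:

```python
import os

DEBUG = os.environ.get("PIPELINE_DEBUG") == "1"  # hypothetical opt-in flag

def validate_batch(rows):
    # Flag records with missing or impossible magnitudes.
    bad = [r for r in rows if r.get("mag") is None or r["mag"] < 0]
    if DEBUG and bad:
        breakpoint()  # pause only when debugging is on AND the batch is suspect
    return [r for r in rows if r not in bad]

clean = validate_batch([{"mag": 7.2}, {"mag": -1.0}, {"mag": None}])
print(clean)  # only the valid record survives
```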
Best practices for breakpoint() in Python & data workflows:
Insert conditionally — if debug: breakpoint() keeps pauses out of normal runs.
Prefer ipdb or pudb — richer interfaces than plain pdb; set PYTHONBREAKPOINT=ipdb.set_trace globally.
Use VS Code/PyCharm integration — both IDEs stop at breakpoint() when running under their debuggers.
Use remote debugging — web-pdb or remote-pdb for Jupyter, containers, and clusters.
Disable in production — set PYTHONBREAKPOINT=0, or guard calls with if __debug__.
Fall back to pdb.set_trace() — for code that must run on Python < 3.7 or where an explicit import is preferred.
Use breakpoint() in tests — pause inside a failing case to inspect its state.
Combine with pprint — pprint.pprint(df.head()) inside the debugger for readable output of complex objects.
Use interact — drop from the debugger into a full interactive Python shell.
Use post-mortem debugging — pdb.pm() or pdb.post_mortem() to inspect state after an unhandled exception.
Use alias — define custom command shortcuts in pdb/ipdb.
Performance — a disabled breakpoint() is a cheap no-op call with negligible runtime cost.
Alternatives — from IPython import embed; embed() for an IPython shell, or import code; code.interact() for a basic console.
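The production safety net above is easy to verify: with PYTHONBREAKPOINT=0, every breakpoint() call becomes a no-op and the script runs straight through. A quick self-contained check:

```python
import os
import subprocess
import sys
import textwrap

# A script with breakpoints in a loop; with PYTHONBREAKPOINT=0 it
# runs to completion with no debugger prompt at all.
script = textwrap.dedent("""
    for mag in [7.1, 7.8]:
        breakpoint()      # skipped entirely when disabled
        print(mag)
""")

env = dict(os.environ, PYTHONBREAKPOINT="0")
proc = subprocess.run(
    [sys.executable, "-c", script],
    capture_output=True, text=True, env=env,
)
print(proc.stdout)  # both magnitudes, no '(Pdb)' prompt
```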
breakpoint() pauses execution and opens an interactive debugger — zero-import, customizable via PYTHONBREAKPOINT, perfect for inspecting variables, stack, and state in data pipelines, scripts, or services. In 2026, use ipdb/pudb for rich UIs, set env var globally, integrate with IDEs, disable in production, and combine with pprint for complex objects. Master breakpoint(), and you’ll debug Python code faster, deeper, and more reliably — from notebook to production.
Next time you need to inspect running code — drop a breakpoint(). It’s Python’s cleanest way to say: “Pause here — let me look inside.”