next() is Python’s built-in function for manually advancing an iterator — retrieving the next item with next(it), or returning a default value if exhausted. It’s the low-level counterpart to the for loop, enabling explicit control over iteration, sentinel-based loops, and integration with generators, files, and custom iterators. In 2026, next() remains essential in data science (chunked processing in pandas/Polars/Dask), software engineering (custom iterators, stream processing), and functional programming — supporting sentinel termination, lazy evaluation, and safe advancement with next(it, default) to avoid StopIteration.
Here’s a complete, practical guide to using next() in Python: basic advancement, sentinel mode, real-world patterns (earthquake chunked reading, stream processing, custom generators), and modern best practices with type hints, error handling, performance, and integration with Dask/Polars/pandas/NumPy/asyncio/itertools.
Basic next() — advance iterator, raise StopIteration if exhausted.
it = iter([1, 2, 3, 4, 5])
print(next(it)) # 1
print(next(it)) # 2
print(next(it)) # 3
print(list(it)) # [4, 5] (remaining items)
# Exhaust iterator
try:
    next(it)  # raises StopIteration
except StopIteration:
    print("Iterator exhausted")
Sentinel mode — next(it, default) returns default instead of raising error.
it = iter([1, 2, 3])
print(next(it, "end")) # 1
print(next(it, "end")) # 2
print(next(it, "end")) # 3
print(next(it, "end")) # "end" (no error)
Real-world pattern: chunked reading & processing large earthquake data — manual iterator control.
import dask.dataframe as dd
ddf = dd.read_csv('earthquakes/*.csv', blocksize='64MB')
# Get iterator over partitions
chunks = iter(ddf.to_delayed())
# Process chunks one by one
while True:
    try:
        chunk = next(chunks).compute()
        strong = chunk[chunk['mag'] >= 7.0]
        if not strong.empty:
            print(f"Strong events in chunk: {len(strong)}")
            print(strong[['time', 'mag', 'place']].head())
    except StopIteration:
        print("All chunks processed")
        break
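The same sentinel-based loop works without Dask. Below is a dependency-free sketch using only the standard library; the inline CSV data and the iter_chunks helper are illustrative assumptions, standing in for the earthquake files above.

```python
# Chunked CSV processing with next(chunks, None) as the sentinel.
# The sample data and iter_chunks() are assumptions for illustration.
import csv
import io
from itertools import islice

def iter_chunks(reader, size):
    """Yield lists of up to `size` rows from a csv reader."""
    while True:
        chunk = list(islice(reader, size))
        if not chunk:
            return
        yield chunk

data = io.StringIO(
    "time,mag,place\n"
    "2026-01-01,7.2,Chile\n"
    "2026-01-02,4.1,Japan\n"
    "2026-01-03,7.8,Alaska\n"
)

chunks = iter_chunks(csv.DictReader(data), size=2)
strong_events = []
while (chunk := next(chunks, None)) is not None:  # None means exhausted
    strong_events += [row for row in chunk if float(row["mag"]) >= 7.0]

print([row["place"] for row in strong_events])  # ['Chile', 'Alaska']
```

Using next(chunks, None) instead of try/except keeps the loop condition and the termination check on a single line, which is why sentinel mode is the usual choice for stream-style loops.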
Best practices for next() in Python & data workflows.
Prefer for item in iterable when consuming an entire sequence; use next() for manual control or first-item extraction.
Modern tip: use Polars df.iter_rows() for row iteration and Dask ddf.to_delayed() for chunk iteration.
Use next(it, default) — safe advancement with a fallback value.
Add type hints — from typing import Iterator, TypeVar; T = TypeVar('T'); def get_next(it: Iterator[T], default: T | None = None) -> T | None.
Handle StopIteration in manual loops with try/except.
Use next(iter(seq)) — get the first item of any iterable (consumes it; plain sequences need iter() first).
Use next(enumerate(seq)) — get the first (index, item) pair.
Use next(filter(pred, seq)) — first matching item.
Use next(map(func, seq)) — first transformed item.
Use itertools.islice(it, n) — take the first n items.
Use next(g) — advance a generator to its next yield.
Use next(f) — read the next line from an open file.
Use await anext(ait) — advance an async iterator (built-in since Python 3.10).
Use next(it, None) — test whether an iterator has items left (note: this consumes one item).
Use next(tee(it)[0]) — peek at the first item via an itertools.tee copy.
Use next(chain(it1, it2)) — first item from chained iterables.
Use next(cycle(it)) — first item from an infinite cycle.
Use next(repeat(value)) — always returns value.
Use next(accumulate(seq)) — first cumulative result (equal to the first element).
Use next(dropwhile(pred, seq)) — first item for which the predicate is false.
next(iterator[, default]) retrieves the next item from an iterator — raises StopIteration if exhausted, or returns default if provided. In 2026, use sentinel mode for streams/files, next() for first-item extraction, and integrate with Dask/Polars/itertools for chunked/streaming data. Master next(), and you’ll control iteration precisely and efficiently in any Python workflow.
Next time you need the next item from an iterator — use next(). It’s Python’s cleanest way to say: “Advance the iterator — give me the next value, or a default if done.”