Saving timeit Output in Python 2026 with Efficient Code
Running timeit benchmarks is useful, but saving the results is essential for tracking performance improvements over time. In 2026, properly saving and comparing timeit output helps you maintain a history of optimizations and make data-driven decisions.
This guide (March 15, 2026) shows modern ways to save, store, and analyze timeit results effectively.
TL;DR — Key Takeaways 2026
- Save `timeit` results to compare performance before and after changes
- Use JSON, CSV, or Markdown for easy tracking
- Automate saving with a reusable benchmark function
- Include metadata (date, Python version, hardware) for context
- Free-threading and faster runtimes make consistent saving even more important
1. Simple Way to Save timeit Output
```python
import timeit
import json
import platform
from datetime import datetime

def benchmark_and_save(name, stmt, setup="", number=1000, repeat=7):
    timer = timeit.Timer(stmt=stmt, setup=setup)
    times = timer.repeat(repeat=repeat, number=number)
    best = min(times) / number  # best total time divided by loop count
    result = {
        "test_name": name,
        "best_time_per_loop": best,
        "unit": "seconds",
        "number_of_loops": number,
        "repeats": repeat,
        "timestamp": datetime.now().isoformat(),
        "python_version": platform.python_version(),
    }
    # Append one JSON object per line (JSON Lines format)
    with open("benchmarks.json", "a") as f:
        f.write(json.dumps(result) + "\n")
    print(f"{name}: {best*1_000_000:.2f} µs per loop")
    return best

# Usage
benchmark_and_save("sum_range", "sum(range(100000))")
benchmark_and_save("numpy_sum",
                   "np.sum(arr**2)",
                   setup="import numpy as np; arr = np.arange(100000)")
```
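Once results accumulate, you will want to read them back. A minimal sketch for loading the JSON Lines file written above (the helper names `load_benchmarks` and `history` are illustrative, not part of any library):

```python
import json

def load_benchmarks(path="benchmarks.json"):
    """Return a list of result dicts, one per saved benchmark run."""
    results = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:  # skip any blank lines
                results.append(json.loads(line))
    return results

def history(results, test_name):
    """All saved timings (seconds per loop) for one test, oldest first."""
    return [r["best_time_per_loop"] for r in results
            if r["test_name"] == test_name]
```

Because each line is an independent JSON object, appending never corrupts earlier records, and the file stays readable even if a run is interrupted.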
2. Saving to CSV for Easy Comparison
```python
import csv
from datetime import datetime

def save_to_csv(name, best_time):
    with open("benchmarks.csv", "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([
            datetime.now().strftime("%Y-%m-%d %H:%M"),
            name,
            f"{best_time*1_000_000:.2f}",
            "µs per loop",
        ])

# Usage after a benchmark — pass seconds per loop, as returned
# by benchmark_and_save(); the function converts to µs itself
save_to_csv("List Comprehension", 0.00124567)  # saved as 1245.67 µs
save_to_csv("NumPy Vectorized", 0.00004532)    # saved as 45.32 µs
```
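To compare runs saved in this CSV layout (timestamp, name, time in µs, unit), a small sketch like the following works; the helper name `speedup` is illustrative:

```python
import csv

def speedup(path, name):
    """Ratio old/new for the last two saved timings of `name` (>1 = faster)."""
    times = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            # column layout: timestamp, name, time_us, unit
            if len(row) >= 3 and row[1] == name:
                times.append(float(row[2]))
    if len(times) < 2:
        raise ValueError(f"need at least two saved runs for {name!r}")
    old, new = times[-2], times[-1]
    return old / new
```

Running this before and after an optimization gives you a concrete speedup number instead of an impression.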
3. Best Practices for Saving timeit Results in 2026
- Always save before and after any optimization
- Include timestamp and environment info (Python version, hardware)
- Use JSON for structured data or CSV for easy spreadsheets
- Automate saving with a helper function
- Version your benchmark files (e.g., benchmarks_v2.json)
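The environment info mentioned above can be collected automatically from the standard library. A sketch (the function name `environment_metadata` is illustrative):

```python
import platform
from datetime import datetime

def environment_metadata():
    """Context worth saving alongside every benchmark result."""
    return {
        "timestamp": datetime.now().isoformat(),
        "python_version": platform.python_version(),
        "implementation": platform.python_implementation(),  # e.g. CPython
        "machine": platform.machine(),                       # e.g. x86_64, arm64
        "system": platform.system(),                         # e.g. Linux, Darwin
    }
```

Merging this dict into each saved result makes it possible to tell later whether a timing change came from your code or from a new interpreter or machine.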
Conclusion — Saving timeit Output in 2026
Saving timeit results is a key part of a professional performance workflow. In 2026, developers who consistently record and compare benchmarks can track improvements over time, avoid regression, and make confident optimization decisions. Make saving your timeit results a habit — it pays off quickly.
Next steps:
- Create a reusable `benchmark_and_save()` function for your projects
- Related articles: Using timeit in Python 2026 • Efficient Python Code 2026