Project 21: Sequence and Series Convergence Lab

Build a convergence analyzer that turns infinite processes into defensible finite computations.

Quick Reference

  • Difficulty: Level 2: Intermediate (The Developer)
  • Time Estimate: 1 week
  • Main Programming Language: Python
  • Alternative Programming Languages: Julia, Rust, C++
  • Coolness Level: Level 3: Genuinely Clever
  • Business Potential: 1. The “Resume Gold” (Educational/Personal Brand)
  • Knowledge Area: Numerical Analysis / Foundations
  • Main Book: “Concrete Mathematics, 2nd Edition” by Graham, Knuth, and Patashnik

1. Learning Objectives

  1. Define convergence and divergence criteria operationally.
  2. Build robust stopping rules for iterative approximations.
  3. Distinguish asymptotic trend from finite-window artifacts.
  4. Explain convergence failures with quantitative evidence.

2. All Theory Needed (Per-Concept Breakdown)

Concept A: Partial Sums and Limits

Fundamentals

A sequence is an ordered numeric process indexed by step count. A series is the running accumulation of a sequence's terms. In practice, ML optimization traces and Monte Carlo estimators are sequence/series objects under the hood.

Deep Dive into the concept

Convergence is not “values stop changing visually.” It is an epsilon-based guarantee: eventually, values remain within tolerance of a limit. A robust implementation therefore needs both numerical diagnostics (delta windows, trend checks) and theory-aware checks (ratio/root hints for applicable families). Slow convergence and divergence can look similar on short windows, so reporting asymptotic class and confidence bands is essential.

How this fits this project

This project uses finite approximations of infinite objects and forces explicit error accounting.

Minimal concrete example

PSEUDOCODE
s_prev = 0
for n in 1..N:
  s = update_partial_sum(n)
  delta = abs(s - s_prev)
  s_prev = s
  record delta in rolling window
  if windowed_delta_below_tolerance and theoretical_test_supports:
    mark likely converged
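A runnable version of the loop above might look as follows. The function name `likely_converged` and the rolling-window scheme are illustrative choices, not part of the project's specified API:

```python
from collections import deque

def partial_sums(terms):
    """Yield the running partial sums of an iterable of terms."""
    s = 0.0
    for t in terms:
        s += t
        yield s

def likely_converged(terms, tol=1e-12, window=10, max_n=10**6):
    """Flag likely convergence when the last `window` one-step
    deltas all fall below `tol` (a heuristic, not a proof)."""
    deltas = deque(maxlen=window)
    s_prev, s = None, None
    for n, s in enumerate(partial_sums(terms), start=1):
        if s_prev is not None:
            deltas.append(abs(s - s_prev))
        s_prev = s
        if len(deltas) == window and max(deltas) < tol:
            return True, s
        if n >= max_n:
            break
    return False, s
```

For the geometric series with r = 0.5 this flags convergence once ten successive deltas drop below tolerance; the harmonic series exhausts the term budget without passing at this tolerance, but a looser tolerance would produce a false pass, which is exactly why the theoretical check is also required.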

Concept B: Error Bounds and Stopping Rules

Fundamentals

A stopping rule transforms abstract tolerance into a deterministic condition. Error bounds justify when termination is safe.

Deep Dive into the concept

Single-step thresholds are brittle because oscillatory and slowly divergent processes can pass them transiently. Better rules combine rolling windows, monotonicity checks, and known bounds where available (for example, geometric remainders). In ML this maps directly to optimizer early stopping and sampling budgets.
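Where a family admits a closed-form remainder, the stopping rule can be derived rather than guessed. A sketch for the geometric case, using the tail bound |a|·|r|^N / (1 − |r|) for the sum of terms from index N on (the function name is illustrative):

```python
import math

def geometric_terms_needed(a, r, target_err):
    """Smallest N such that the geometric tail sum_{k>=N} a*r**k
    satisfies |tail| <= |a| * |r|**N / (1 - |r|) <= target_err."""
    assert 0 < abs(r) < 1, "bound only valid for |r| < 1"
    # |a|*|r|**N/(1-|r|) <= eps  =>  N >= log(eps*(1-|r|)/|a|) / log(|r|)
    n = math.log(target_err * (1 - abs(r)) / abs(a)) / math.log(abs(r))
    return max(0, math.ceil(n))
```

Pairing a bound like this with the rolling-window diagnostic gives a rule that is both empirically and theoretically justified, in line with the two-check approach above.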

Concept C: Numerical Artifacts

Fundamentals

Finite precision can mimic convergence or create fake oscillations.

Deep Dive into the concept

Large partial sums with tiny increments suffer cancellation and resolution limits: once an increment falls below the spacing of representable values around the running sum, additions are silently lost. Logging precision mode, magnitude ranges, and compensated summation options prevents false trust in results.
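One standard compensated option is Kahan summation, sketched below; it carries a running correction for the low-order bits that plain accumulation discards:

```python
def kahan_sum(terms):
    """Kahan compensated summation: track a correction term c
    holding the rounding error of each individual addition."""
    s, c = 0.0, 0.0
    for t in terms:
        y = t - c            # apply the stored correction
        tmp = s + y          # low-order bits of y may be lost here
        c = (tmp - s) - y    # recover exactly what was lost
        s = tmp
    return s
```

Summing 0.1 a hundred thousand times, `kahan_sum` lands much closer to `math.fsum`'s correctly rounded result than the naive running sum does, which is the kind of comparison the precision-mode toggle should surface.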


3. Build Blueprint

  1. Implement generators for arithmetic, geometric, harmonic, and alternating families.
  2. Implement partial-sum trackers and rolling convergence diagnostics.
  3. Add asymptotic classification output and plotting.
  4. Add precision-mode toggles and compare behaviors.
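Step 1 can be prototyped with plain Python generators; the names below are illustrative, not a required API:

```python
import itertools
import math

def geometric(a=1.0, r=0.5):
    """Terms a, a*r, a*r**2, ..."""
    while True:
        yield a
        a *= r

def harmonic():
    """Terms 1, 1/2, 1/3, ... (the series diverges)."""
    for n in itertools.count(1):
        yield 1.0 / n

def alternating_harmonic():
    """Terms 1, -1/2, 1/3, ... (the series converges to ln 2)."""
    for n in itertools.count(1):
        yield (-1.0) ** (n + 1) / n
```

Generators keep the families lazy, so the partial-sum tracker in step 2 can consume exactly as many terms as its stopping rule demands.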

4. Real-World Outcome (Target)

$ python series_lab.py --series geometric --r 0.7 --n 120

Series: geometric
Closed-form limit: 3.333333
S_120: 3.333333
Absolute error: 2.1e-19
Convergence: PASS

5. Core Design Notes from Main Guide

Core Question

“When does an infinite process become a reliable finite computation?”

Common Pitfalls

  • False convergence from one-step deltas
  • Precision artifacts at large n
  • Using linear plots only for asymptotic diagnosis
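The first pitfall is easy to reproduce: the harmonic series diverges, yet its one-step delta 1/n eventually dips under any fixed threshold. A small illustration (the function name is ours):

```python
def first_false_pass(tol):
    """Index n at which the harmonic one-step delta 1/n first
    drops below tol -- a naive single-step rule would declare
    convergence here, though the series diverges (H_n ~ ln n)."""
    n = 1
    while 1.0 / n >= tol:
        n += 1
    return n
```

With tol = 1e-3 this fires at n = 1001, while the partial sums keep growing like ln n long afterward, making it a ready-made divergence counterexample for the Definition of Done.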

Definition of Done

  • Supports at least six sequence/series families
  • Emits convergence classification with evidence
  • Includes at least one divergence counterexample
  • Documents stopping-rule tradeoffs

6. Extensions

  1. Add adaptive term budgeting for target error.
  2. Add symbolic closed-form detection for simple families.
  3. Compare compensated vs naive summation.
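For extension 1, families with a priori remainder bounds are the natural starting point. For an alternating series whose terms decrease in magnitude, the alternating series estimation theorem bounds the error by the first omitted term; a sketch for the alternating harmonic series (function names are illustrative):

```python
import math

def alternating_harmonic_budget(target_err):
    """Smallest N with 1/(N+1) <= target_err: by the alternating
    series estimation theorem, N terms of sum (-1)**(n+1)/n then
    approximate ln 2 within target_err."""
    return math.ceil(1.0 / target_err) - 1

def partial_alternating_harmonic(n_terms):
    """Naive partial sum of the alternating harmonic series."""
    return sum((-1.0) ** (n + 1) / n for n in range(1, n_terms + 1))
```

The same pattern, solving a remainder bound for N up front, applies to the geometric family and gives the analyzer a deterministic term budget for a requested target error.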