Project 11: Audio Lab - Real-Time Audio Effects Chain

Build a real-time audio effects chain and measure latency, CPU usage, and audio quality.

Quick Reference

| Attribute | Value |
|-----------|-------|
| Difficulty | Level 4: Expert |
| Time Estimate | 20-40 hours |
| Main Programming Language | C++ |
| Alternative Programming Languages | C |
| Coolness Level | Level 5: Pure Magic |
| Business Potential | 4. The “Product” Model |
| Prerequisites | C/C++ basics, Teensyduino setup, basic electronics, ability to use a multimeter/logic analyzer |
| Key Topics | audio DSP, buffers, real-time constraints |

1. Learning Objectives

By completing this project, you will:

  1. Explain the core question for this project in your own words.
  2. Implement the main workflow and validate it with measurements.
  3. Handle at least two failure modes and document recovery.
  4. Produce a deterministic report that matches hardware behavior.

2. All Theory Needed (Per-Concept Breakdown)

Audio Sampling, DSP Blocks, and Real-Time Constraints

Fundamentals: Digital audio uses a fixed sample rate, and DSP blocks process samples or blocks of samples. Real-time audio means each block must be processed before the next arrives. If processing overruns the deadline, you hear clicks or dropouts. This project makes those deadlines measurable.

Deep dive into the concept: Audio systems are defined by sample rate, block size, and latency. The Teensy Audio Library uses fixed block sizes, giving a fixed processing budget per block. DSP effects like filters, delays, and reverbs each add CPU cost and require careful gain staging to avoid clipping. At high complexity, CPU usage can exceed the budget, causing audio artifacts. Audio I/O adds constraints: I2S relies on accurate clocks, and USB audio depends on host scheduling. You must measure CPU usage and latency under load and verify stability over time. This project builds an effects chain, measures performance, and validates audio quality using deterministic test tones and logs.

How this fits into the project: This concept directly drives the implementation choices and validation steps in this project.

Definitions & key terms

  • Sample rate: Number of audio samples per second.
  • Block size: Number of samples processed as a group.
  • Latency: Time delay from input to output.
  • DSP: Digital Signal Processing operations on audio data.

Mental model diagram (ASCII)

Input -> ADC/I2S -> DSP Blocks -> Output

How it works (step-by-step)

  1. Configure audio I/O and sample rate.
  2. Implement DSP blocks (filter, delay, mix).
  3. Measure CPU usage and latency.
  4. Listen and verify for dropouts or distortion.

Minimal concrete example

#include <Audio.h>

AudioInputI2S in;
AudioFilterBiquad filt;
AudioOutputI2S out;
AudioConnection c1(in, 0, filt, 0);
AudioConnection c2(filt, 0, out, 0);

void setup() { AudioMemory(12); filt.setLowpass(0, 1000); }  // allocate blocks; stage 0: 1 kHz lowpass
void loop() {}  // audio runs in interrupts; nothing to do here

Common misconceptions

  • If it compiles, it will run in real time.
  • More effects always sound better.
  • Audio dropouts are random noise.

Check-your-understanding questions

  • How do block size and sample rate affect CPU budget?
  • Why do filters sometimes become unstable?
  • How can you measure end-to-end latency?

Check-your-understanding answers

  • They define the time available to process each block.
  • Poor coefficient choices can make IIR filters unstable.
  • Inject a pulse and measure time between input and output.

Real-world applications

  • Audio effects pedals
  • Embedded instruments
  • Real-time audio analysis

Where you’ll apply it

Throughout this project: configuring audio I/O, budgeting the DSP chain, and validating CPU and latency measurements.

References

  • Teensy Audio Library documentation
  • DSP textbooks

Key insights

Audio quality is a direct result of deterministic timing and DSP design.

Summary

Real-time audio is a deadline-driven pipeline; miss the deadline and you hear it.

Homework/Exercises to practice the concept

  • Measure CPU usage for one vs three DSP blocks.
  • Create a delay effect and verify latency in samples.

Solutions to the homework/exercises

  • Use AudioProcessorUsage() to log CPU percent.
  • Measure delay by sending an impulse and counting samples.

3. Project Specification

3.1 What You Will Build

Build a real-time audio effects chain and measure latency, CPU usage, and audio quality.

3.2 Functional Requirements

  1. Build a chain with at least 3 audio effects.
  2. Measure CPU usage and audio latency.
  3. Detect and log buffer underruns.
  4. Provide an audio quality report.

3.3 Non-Functional Requirements

  • Performance: Meet the target timing/throughput for the project.
  • Reliability: Detect errors and recover without undefined behavior.
  • Usability: Provide clear logs and a repeatable workflow.

3.4 Example Usage / Output

./P11-audio-lab-real-time-audio-effects-chain --run

3.5 Data Formats / Schemas / Protocols

CSV with columns: block_id, cpu_pct, underruns, latency_ms

3.6 Edge Cases

  • CPU overload causing dropouts
  • Clipping due to gain staging
  • Sample rate mismatch

3.7 Real World Outcome

You will run the project and see deterministic logs and measurements that match physical hardware behavior.

3.7.1 How to Run (Copy/Paste)

cd project-root
make
./P11-audio-lab-real-time-audio-effects-chain --run

3.7.2 Golden Path Demo (Deterministic)

Use a fixed input configuration and a known test signal. Capture output for 60 seconds and verify it matches expected values.

3.7.3 If CLI: exact terminal transcript

$ ./P11-audio-lab-real-time-audio-effects-chain --run --seed 42
[INFO] Audio Lab - Real-Time Audio Effects Chain starting
[INFO] Report saved to data/report.csv
[INFO] Status: OK
$ echo $?
0

Failure Demo (Deterministic)

$ ./P11-audio-lab-real-time-audio-effects-chain --run --missing-device
[ERROR] Device not detected
$ echo $?
2

4. Solution Architecture

4.1 High-Level Design

Inputs -> Acquisition -> Processing -> Output/Log

4.2 Key Components

| Component | Responsibility | Key Decisions |
|-----------|----------------|---------------|
| Acquisition | Configure peripherals and capture data | Use stable clock settings |
| Processing | Convert raw data to meaningful values | Apply calibration/filters |
| Output/Log | Emit reports and logs | CSV for reproducibility |

4.3 Data Structures (No Full Code)

struct Sample {
    uint32_t timestamp_us;
    uint32_t value;
    uint32_t flags;
};

4.4 Algorithm Overview

Key Algorithm: Measurement + Report

  1. Initialize hardware and verify configuration.
  2. Capture data and record timestamps.
  3. Compute metrics and write report.

Complexity Analysis:

  • Time: O(n) in samples
  • Space: O(n) for log storage

5. Implementation Guide

5.1 Development Environment Setup

# Arduino IDE + Teensyduino must be installed
# Optional CLI workflow
arduino-cli core update-index
arduino-cli core install teensy:avr

5.2 Project Structure

project-root/
├── src/
│   ├── main.ino
│   ├── hw_config.h
│   └── measurements.cpp
├── tools/
│   └── analyze.py
├── data/
│   └── samples.csv
└── README.md

5.3 The Core Question You’re Answering

“How do I keep audio processing deterministic under real load?”

5.4 Concepts You Must Understand First

Stop and research these before coding:

  1. Audio DSP, buffers, and real-time constraints
  2. Data logging and measurement techniques
  3. Basic timing math and error analysis

5.5 Questions to Guide Your Design

  1. Which effect chain yields stable CPU usage?
  2. How will you measure latency?
  3. How will you prevent clipping?

5.6 Thinking Exercise

Estimate processing time per block and compare to measured CPU usage.

5.7 The Interview Questions They’ll Ask

  1. What is Nyquist frequency?
  2. Why does block size affect latency?
  3. How do you measure audio dropouts?

5.8 Hints in Layers

  • Start with a single effect and add more gradually.
  • Use the audio library CPU metrics.
  • Use a test tone for measurement.

5.9 Books That Will Help

| Topic | Book | Chapter |
|-------|------|---------|
| DSP basics | The Scientist and Engineer’s Guide to DSP | Ch. 1-4 |
| Embedded systems | Making Embedded Systems | Ch. 7 |
| Audio systems | Designing Audio Effect Plug-Ins | Ch. 2 |

5.10 Implementation Phases

Phase 1: Foundation (8 hours)

Goals:

  • Set up audio I/O
  • Implement first effect

Tasks:

  1. Set up audio I/O
  2. Implement first effect

Checkpoint: Audio passes through

Phase 2: Core Functionality (12 hours)

Goals:

  • Add more effects
  • Measure CPU/latency

Tasks:

  1. Add more effects
  2. Measure CPU/latency

Checkpoint: Stable effect chain

Phase 3: Polish (8 hours)

Goals:

  • Quality tests
  • Optimize DSP

Tasks:

  1. Quality tests
  2. Optimize DSP

Checkpoint: Final report

5.11 Key Implementation Decisions

| Decision | Options | Recommendation | Rationale |
|----------|---------|----------------|-----------|
| Buffering | Single buffer, double buffer | Double buffer | Avoids data loss during processing |
| Logging format | CSV, binary | CSV | Human-readable while still scriptable |
| Clock speed | Default, overclock | Default | Keeps peripherals in spec |


6. Testing Strategy

6.1 Test Categories

| Category | Purpose | Examples |
|----------|---------|----------|
| Unit Tests | Validate math, parsing, and conversions | Timer math, CRC checks |
| Integration Tests | Verify peripherals and pipelines | DMA -> buffer -> log |
| Edge Case Tests | Handle boundary conditions | Brownout, missing sensor |

6.2 Critical Test Cases

  1. Pass-through: feed a known test tone through the chain and verify the output matches it (no dropouts, expected gain).
  2. CPU budget: run the 3-effect chain and confirm CPU usage stays below the per-block budget.
  3. Underrun logging: deliberately overload the chain and confirm underruns are detected and logged.
  4. Latency: inject an impulse and verify the measured delay matches the expected sample count.

6.3 Test Data

Use a fixed test input pattern and record outputs to data/report.csv

7. Common Pitfalls & Debugging

7.1 Frequent Mistakes

| Pitfall | Symptom | Solution |
|---------|---------|----------|
| CPU overload | Clicks and dropouts | Reduce effect count or per-block complexity |
| Poor gain staging | Clipping and distortion | Leave headroom between stages |
| Sample rate mismatch | Pitch shift or noise | Verify I2S clock configuration |

7.2 Debugging Strategies

  • Log AudioProcessorUsage() and AudioMemoryUsageMax() at regular intervals to catch slow CPU or memory creep.
  • Feed a known test tone and inspect the captured output instead of debugging by ear alone.
  • Toggle a GPIO around the processing code and watch it on a logic analyzer to see the real timing per block.

7.3 Performance Traps

Large buffers improve stability but increase latency. Measure both throughput and jitter to choose the right size.


8. Extensions & Challenges

8.1 Beginner Extensions

  • Add a tone control (bass/treble) stage to the chain.
  • Log per-effect CPU usage and compare the costs.

8.2 Intermediate Extensions

  • Add a delay or reverb effect and measure its CPU and latency cost.
  • Add runtime parameter control (potentiometer or serial commands) for each effect.

8.3 Advanced Extensions

  • Write a custom DSP block and integrate it into the chain.
  • Compare USB audio output against I2S output for latency and stability.


9. Real-World Connections

9.1 Industry Applications

  • Guitar effects pedals and multi-effect units
  • Embedded musical instruments and synthesizers
  • Real-time audio analysis and monitoring systems

9.2 Open Source Connections

  • Teensy Audio Library: the open source, block-based DSP framework this project builds on

9.3 Interview Relevance

Expect interview questions on sampling theory (Nyquist), real-time deadlines and scheduling, block-based DSP, and how you measured latency and CPU usage — the same questions listed in Section 5.7.


10. Resources

10.1 Essential Reading

  • Teensy Audio Library documentation
  • The Scientist and Engineer’s Guide to DSP (free online)
  • Designing Audio Effect Plug-Ins (see Section 5.9)

10.2 Video Resources

  • Embedded systems timing walkthrough (YouTube)
  • Teensy hardware deep dive (Conference talk)

10.3 Tools & Documentation

  • Teensyduino: Toolchain for Teensy boards
  • Logic Analyzer: Timing verification
  • Multimeter: Voltage and current measurement



11. Self-Assessment Checklist

11.1 Understanding

  • I can explain the main concept without notes.
  • I can explain why the measurements match (or do not match) expectations.
  • I understand at least one tradeoff made in this project.

11.2 Implementation

  • All functional requirements are met.
  • All critical test cases pass.
  • Logs and reports are reproducible.
  • Edge cases are handled.

11.3 Growth

  • I documented lessons learned.
  • I can explain this project in a job interview.
  • I identified one improvement for next iteration.

12. Submission / Completion Criteria

Minimum Viable Completion: A 3-effect chain that passes audio cleanly, with CPU usage and latency measured and logged to data/report.csv.

Full Completion: All functional requirements met — effects chain, CPU and latency measurements, underrun detection and logging, and an audio quality report — with the deterministic demo reproducing the documented transcript.

Excellence (Going Above & Beyond): At least one advanced extension completed, plus a documented optimization that measurably reduces CPU usage or latency.