AI Prediction & Neural Networks: From Math to Machine

Goal: Deeply understand how machines “learn” by building the mathematical engines from scratch. Move beyond “import torch” to understanding tensors, gradients, backpropagation, and optimization algorithms at the implementation level.


Learning Path Overview

This directory contains expanded, comprehensive guides for each project in the AI Prediction & Neural Networks learning path. Each project builds upon the previous ones, taking you from simple perceptrons to complete deep learning architectures.

Project Index

| # | Project | Difficulty | Time | Key Concepts |
|---|---------|------------|------|--------------|
| 1 | The Manual Neuron | Beginner | Weekend | Perceptrons, Weights, Bias, Step Function |
| 2 | Gradient Descent Visualizer | Intermediate | Weekend | Optimization, Derivatives, Learning Rate |
| 3 | Linear Regression Engine | Intermediate | 1 Week | Vectors, MSE, Batch Gradient Descent |
| 4 | The Spam Filter | Intermediate | 1 Week | Sigmoid, Cross-Entropy, Bag of Words |
| 5 | The Autograd Engine | Expert | 2 Weeks | Computational Graphs, Chain Rule, Backprop |
| 6 | Fraud Detection MLP | Advanced | 1 Week | Hidden Layers, ReLU, Class Imbalance |
| 7 | Convolutional Kernel Explorer | Intermediate | Weekend | Convolution, Kernels, Feature Maps |
| 8 | MNIST From First Principles | Expert | 2 Weeks | Softmax, Multi-class, Vectorization |
| 9 | CNN From Scratch | Master | 3 Weeks | Conv Layers, Pooling, Spatial Invariance |
| 10 | RNN Character Generator | Master | 3 Weeks | Hidden State, BPTT, Sequence Modeling |
| 11 | BrainInABox Library | Master | 4 Weeks | API Design, Abstraction, Framework Building |
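To give a taste of where the path starts, Project 1's core idea (a weighted sum plus bias, passed through a step function) fits in a few lines of NumPy. The AND-gate weights below are illustrative values chosen by hand, not part of the project spec:

```python
import numpy as np

def perceptron(x, w, b):
    """A single perceptron: fire (1) if the weighted sum plus bias exceeds zero."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Hand-picked weights that make the neuron compute logical AND
w = np.array([1.0, 1.0])
b = -1.5

outputs = [perceptron(np.array(x), w, b) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]
print(outputs)  # → [0, 0, 0, 1]
```

In Project 1 you replace the hand-picked weights with ones found by the perceptron learning rule.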

Recommended Learning Paths

Path A: Foundation to Deep Learning (Complete)

P1 → P2 → P3 → P4 → P5 → P6 → P8

This path builds core understanding of neural networks from single neurons to multi-layer perceptrons.

Path B: Computer Vision Focus

P1 → P2 → P5 → P7 → P8 → P9

Focus on image processing and convolutional neural networks.

Path C: NLP/Sequence Focus

P1 → P2 → P4 → P5 → P10

Focus on text processing and recurrent neural networks (ancestors of LLMs).

Path D: Framework Developer

P1 → P2 → P5 → P6 → P11

Focus on building your own deep learning library.


Prerequisites

Before starting this learning path, you should have:

  1. Python proficiency - Comfortable with classes, functions, and data structures
  2. Basic linear algebra - Understanding of vectors, matrices, and dot products
  3. Calculus fundamentals - Understanding of derivatives (can be learned alongside)
  4. NumPy basics - Familiarity with array operations (will be reinforced)
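As a self-check for prerequisites 2 and 4, you should be able to read a snippet like this one without trouble (the numbers are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])      # an input vector
W = np.array([[0.5, -1.0, 0.0],
              [2.0,  0.0, 1.0]])   # a 2x3 weight matrix

# Matrix-vector product: each output entry is the dot product
# of one row of W with x
y = W @ x
print(y)  # the two dot products: -1.5 and 5.0
```

If any line here is mysterious, spend a day with NumPy's array basics before Project 1; everything in this path is built from operations like this.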

Core Concepts Covered

| Concept Cluster | Projects | What You’ll Internalize |
|-----------------|----------|-------------------------|
| Tensors & Linear Algebra | P1, P3 | Matrix multiplication is the engine of AI |
| Optimization | P2, P3 | Gradient descent finds minima in high-dimensional spaces |
| Automatic Differentiation | P5 | The chain rule enables learning through layers |
| Architectures | P6, P9, P10 | Different structures for different data types |
| Loss Functions | P3, P4, P8 | Defining “badness” shapes what the model learns |
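The optimization cluster, at its simplest, is a loop like this sketch: minimize f(w) = (w − 3)² by repeatedly stepping against its derivative. This toy one-dimensional example is not from any specific project, but it is the same update rule the later projects apply to millions of weights:

```python
# Minimize f(w) = (w - 3)**2 with gradient descent.
# The derivative is df/dw = 2 * (w - 3).
w = 0.0    # starting guess
lr = 0.1   # learning rate: step size along the negative gradient
for _ in range(100):
    grad = 2 * (w - 3)
    w -= lr * grad

print(round(w, 4))  # → 3.0 (the minimum)
```

Too large a learning rate makes this loop diverge; too small and it crawls. Project 2 is about seeing exactly that trade-off.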

What Makes These Projects Different

Unlike tutorials that have you copy-paste PyTorch code:

  1. No frameworks - You build everything from scratch using only NumPy
  2. Math-first - Every algorithm is derived from first principles
  3. Visual intuition - Diagrams and visualizations throughout
  4. Real applications - Fraud detection, spam filtering, handwriting recognition
  5. Progressive complexity - Each project builds on previous knowledge

Essential Reading

| Book | Focus Areas |
|------|-------------|
| “Grokking Deep Learning” by Andrew Trask | Intuitive explanations, great for beginners |
| “Neural Networks and Deep Learning” by Michael Nielsen | Free online, excellent MNIST walkthrough |
| “Deep Learning” by Goodfellow, Bengio, and Courville | The comprehensive textbook |
| “Deep Learning with Python” by François Chollet | Practical applications |

After Completing This Path

You will:

  • Understand what happens inside PyTorch/TensorFlow
  • Debug neural networks by inspecting gradients
  • Design custom architectures for specific problems
  • Optimize models for resource-constrained environments
  • Read and implement papers from scratch

You will transition from “AI user” to “AI engineer.”