Learn Node.js: From Zero to Runtime Master
Goal: Deeply understand Node.js—from the event loop and streams to building production-grade tools, frameworks, and understanding the V8 engine integration.
Why Node.js Matters
Node.js brought JavaScript to the server and changed how we think about I/O. It’s not just “JavaScript on the backend”—it’s a fundamentally different approach to handling concurrent operations. While other languages use threads, Node.js uses an event loop with non-blocking I/O.
After completing these projects, you will:
- Understand the event loop deeply (phases, microtasks, timers)
- Master streams and handle gigabytes of data with minimal memory
- Build HTTP servers, frameworks, and real-time applications
- Create CLI tools and developer utilities
- Understand how Node.js integrates with V8 and libuv
- Write native addons when JavaScript isn’t enough
Core Concept Analysis
The Node.js Architecture
┌─────────────────────────────────────────────────────────────────────────┐
│ Your JavaScript Code │
├─────────────────────────────────────────────────────────────────────────┤
│ Node.js Bindings │
│ (JavaScript ↔ C++ bridge layer) │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌─────────────────────────────┐ ┌─────────────────────────────┐ │
│ │ V8 Engine │ │ libuv │ │
│ │ (JavaScript execution) │ │ (Event loop, async I/O) │ │
│ │ │ │ │ │
│ │ - JIT compilation │ │ - Event loop │ │
│ │ - Memory management │ │ - Thread pool │ │
│ │ - Garbage collection │ │ - File system ops │ │
│ │ │ │ - Network I/O │ │
│ └─────────────────────────────┘ └─────────────────────────────┘ │
│ │
├─────────────────────────────────────────────────────────────────────────┤
│ Operating System │
│ (epoll, kqueue, IOCP, etc.) │
└─────────────────────────────────────────────────────────────────────────┘
The Event Loop (Heart of Node.js)
┌───────────────────────────┐
┌─▶│ timers │ ← setTimeout, setInterval callbacks
│ └─────────────┬─────────────┘
│ ┌─────────────▼─────────────┐
│ │ pending callbacks │ ← I/O callbacks deferred from previous loop
│ └─────────────┬─────────────┘
│ ┌─────────────▼─────────────┐
│ │ idle, prepare │ ← internal use only
│ └─────────────┬─────────────┘ ┌───────────────┐
│ ┌─────────────▼─────────────┐ │ incoming: │
│ │ poll │◀─────┤ connections, │
│ └─────────────┬─────────────┘ │ data, etc. │
│ ┌─────────────▼─────────────┐ └───────────────┘
│ │ check │ ← setImmediate callbacks
│ └─────────────┬─────────────┘
│ ┌─────────────▼─────────────┐
│ │ close callbacks │ ← socket.on('close', ...)
│ └─────────────┬─────────────┘
│ │
│ ┌─────────────▼─────────────┐
│ │ process.nextTick() │ ← Runs between each phase
│ │ Promise microtasks │
└──┴───────────────────────────┘
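To see the phase rules concretely, here is a small script you can run as-is; nextTick and promise microtasks drain before the loop advances, while the relative order of the timeout and immediate callbacks is not guaranteed when scheduled from the main module:
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
Promise.resolve().then(() => console.log('promise'));
process.nextTick(() => console.log('nextTick'));
console.log('sync');
// Output:
// sync
// nextTick
// promise
// timeout + immediate, in either order (inside an I/O callback,
// setImmediate reliably fires before setTimeout)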
Fundamental Concepts
- Event-Driven Architecture: Everything in Node.js is event-based
- Events are emitted
- Listeners respond to events
- The event loop coordinates everything
- Non-Blocking I/O: Operations don’t wait
- File reads don’t block
- Network requests don’t block
- Callbacks/Promises notify when done
- Single-Threaded (with caveats):
- Your JavaScript runs on one thread
- libuv uses a thread pool for blocking operations (fs, crypto, dns)
- Worker threads available for CPU-intensive tasks
- Streams: Process data piece by piece
- Readable: Data source (file, HTTP response)
- Writable: Data destination (file, HTTP request)
- Duplex: Both (TCP socket)
- Transform: Modify data as it passes through (gzip)
- Buffers: Raw binary data
- JavaScript strings are Unicode
- Buffers are raw bytes
- Essential for binary protocols, files, network
- Module System:
  - CommonJS: require() / module.exports
  - ES Modules: import / export
  - Resolution algorithm: how Node finds modules
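To make the Buffers point concrete, the same text has a different length as a UTF-16 string than as UTF-8 bytes:
const buf = Buffer.from('héllo', 'utf8');
console.log('héllo'.length);       // 5 (UTF-16 code units)
console.log(buf.length);           // 6 (bytes; 'é' is 2 bytes in UTF-8)
console.log(buf.toString('hex'));  // 68c3a96c6c6f
console.log(buf[1]);               // 195 (raw byte access: 0xc3)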
Key Terminology
| Term | Definition |
|---|---|
| Event Loop | The mechanism that handles async operations |
| Callback | Function called when async operation completes |
| Stream | Abstract interface for flowing data |
| Buffer | Fixed-length sequence of bytes |
| EventEmitter | Object that emits named events |
| Middleware | Function that processes requests in a chain |
| libuv | C library providing async I/O |
| V8 | Google’s JavaScript engine |
| N-API | Stable API for native addons |
| REPL | Read-Eval-Print Loop |
Project Progression Map
Level 1 (Beginner) Level 2 (Intermediate) Level 3 (Advanced)
───────────────────── ──────────────────────── ─────────────────────
┌─────────────────┐ ┌─────────────────────┐ ┌──────────────────┐
│ 1. CLI Task │ │ 6. Real-time Chat │ │ 12. Package │
│ Manager │───────▶ │ Server │──────▶ │ Manager │
└─────────────────┘ └─────────────────────┘ └──────────────────┘
│ │ │
▼ ▼ ▼
┌─────────────────┐ ┌─────────────────────┐ ┌──────────────────┐
│ 2. File │ │ 7. Job Queue │ │ 13. Build Tool │
│ Watcher │───────▶ │ System │──────▶ │ (Bundler) │
└─────────────────┘ └─────────────────────┘ └──────────────────┘
│ │ │
▼ ▼ ▼
┌─────────────────┐ ┌─────────────────────┐ ┌──────────────────┐
│ 3. HTTP Server │ │ 8. Database │ │ 14. Test │
│ from Scratch │───────▶ │ Connection Pool │──────▶ │ Runner │
└─────────────────┘ └─────────────────────┘ └──────────────────┘
│ │ │
▼ ▼ ▼
┌─────────────────┐ ┌─────────────────────┐ ┌──────────────────┐
│ 4. Static File │ │ 9. Stream Data │ │ 15. Process │
│ Server │───────▶ │ Processor │──────▶ │ Manager │
└─────────────────┘ └─────────────────────┘ └──────────────────┘
│ │ │
▼ ▼ ▼
┌─────────────────┐ ┌─────────────────────┐ ┌──────────────────┐
│ 5. Mini Web │ │ 10. HTTP Client │ │ 16. Native │
│ Framework │───────▶ │ Library │──────▶ │ Addon │
└─────────────────┘ └─────────────────────┘ └──────────────────┘
│
▼
┌─────────────────────┐
│ 11. Logger │
│ Library │
└─────────────────────┘
Project 1: CLI Task Manager
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 2: Practical but Forgettable
- Business Potential: 1. The “Resume Gold”
- Difficulty: Level 1: Beginner
- Knowledge Area: CLI Development / File I/O
- Software or Tool: CLI Application
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A command-line task manager that stores tasks in a JSON file. Users can add, list, complete, and delete tasks. Includes argument parsing, colored output, and data persistence.
Why it teaches Node.js: CLI tools are the “hello world” of Node.js. You’ll learn the module system, file I/O, process arguments, and how to structure a Node.js application—all without the complexity of networking.
Core challenges you’ll face:
- Argument parsing → maps to understanding process.argv
- File read/write → maps to fs module basics
- JSON serialization → maps to data persistence patterns
- User feedback → maps to stdout/stderr and exit codes
Key Concepts:
- Node.js Modules: “Node.js Design Patterns” Chapter 2 - Casciaro
- File System Operations: Node.js Documentation - fs module
- Process Object: “Node.js in Action” Chapter 3 - Cantelon et al.
- Difficulty: Beginner
- Time estimate: Weekend
- Prerequisites: Basic JavaScript, command line familiarity
Real world outcome:
$ task add "Learn Node.js event loop"
✓ Task added: Learn Node.js event loop (ID: 1)
$ task add "Build a web server"
✓ Task added: Build a web server (ID: 2)
$ task list
Tasks:
[ ] 1. Learn Node.js event loop
[ ] 2. Build a web server
$ task complete 1
✓ Completed: Learn Node.js event loop
$ task list
Tasks:
[✓] 1. Learn Node.js event loop
[ ] 2. Build a web server
$ task list --pending
Pending Tasks:
[ ] 2. Build a web server
$ task delete 1
✓ Deleted: Learn Node.js event loop
$ task stats
Statistics:
- Total: 1
- Completed: 0
- Pending: 1
Implementation Hints:
Project structure:
task-cli/
├── bin/
│ └── task.js # Entry point (shebang line)
├── lib/
│ ├── commands/ # One file per command
│ │ ├── add.js
│ │ ├── list.js
│ │ ├── complete.js
│ │ └── delete.js
│ ├── storage.js # JSON file read/write
│ └── utils.js # Colors, formatting
├── data/
│ └── tasks.json # Persisted data
└── package.json
Key patterns to learn:
// 1. Understanding process.argv
// $ node task.js add "My task"
// process.argv = ['/path/to/node', '/path/to/task.js', 'add', 'My task']
// (argv[0] and argv[1] are absolute paths; your arguments start at index 2)
// 2. Synchronous vs Async file I/O
// Start with sync (easier), then refactor to async
// 3. Making it executable
// Add shebang: #!/usr/bin/env node
// Add bin field to package.json
// npm link to install globally
Questions to consider:
- How do you handle the case where the data file doesn’t exist yet?
- How do you parse arguments like --pending or -p?
- How do you ensure atomic writes (no data corruption)? (a sketch follows these questions)
- How do you handle concurrent runs of the CLI?
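One common answer to the atomic-write question is the write-then-rename pattern; a minimal sketch (the temp-file naming is illustrative):
const fs = require('fs');
const path = require('path');

function saveTasksAtomic(filepath, tasks) {
  // Write to a temp file in the same directory, then rename over the
  // target. rename(2) is atomic on POSIX filesystems, so a reader never
  // sees a half-written tasks.json even if the process crashes mid-write.
  const tmp = path.join(path.dirname(filepath), `.tasks-${process.pid}.tmp`);
  fs.writeFileSync(tmp, JSON.stringify(tasks, null, 2));
  fs.renameSync(tmp, filepath);
}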
Learning milestones:
- Commands parse correctly → You understand process.argv
- Data persists between runs → You understand file I/O
- CLI is globally installable → You understand npm bin scripts
- Errors are handled gracefully → You understand Node.js error patterns
Project 2: File Watcher
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 3: Genuinely Clever
- Business Potential: 2. The “Micro-SaaS / Pro Tool”
- Difficulty: Level 1: Beginner
- Knowledge Area: Event Emitters / File System
- Software or Tool: File Watching Utility
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A file watcher that monitors directories for changes and executes commands when files are modified. Like nodemon or chokidar, but built from scratch.
Why it teaches Node.js: This project introduces you to EventEmitters—the foundation of Node.js’s event-driven architecture. You’ll also learn about fs.watch, its quirks, and why libraries like chokidar exist.
Core challenges you’ll face:
- fs.watch behavior → maps to understanding OS-level file events
- Event debouncing → maps to handling rapid successive events
- Recursive watching → maps to directory tree traversal
- Cross-platform differences → maps to Node.js abstraction challenges
Key Concepts:
- EventEmitter Pattern: “Node.js Design Patterns” Chapter 3 - Casciaro
- fs.watch vs fs.watchFile: Node.js Documentation - File System
- Debouncing: “JavaScript: The Good Parts” - Douglas Crockford
- Difficulty: Beginner
- Time estimate: Weekend
- Prerequisites: Project 1, understanding of events
Real world outcome:
$ watcher --dir ./src --exec "npm test"
👀 Watching ./src for changes...
[10:23:45] Changed: src/index.js
[10:23:45] Running: npm test
> test passed
[10:24:12] Changed: src/utils.js
[10:24:12] Running: npm test
> test passed
[10:24:30] Added: src/new-file.js
[10:24:30] Running: npm test
> test passed
[10:25:01] Deleted: src/old-file.js
[10:25:01] Running: npm test
> test passed
Implementation Hints:
The EventEmitter pattern:
const EventEmitter = require('events');
class FileWatcher extends EventEmitter {
constructor(directory, options = {}) {
super();
this.directory = directory;
this.options = options;
}
start() {
// Set up fs.watch
// Emit events: 'change', 'add', 'delete', 'error'
}
stop() {
// Clean up watchers
}
}
// Usage
const watcher = new FileWatcher('./src');
watcher.on('change', (filepath) => {
console.log(`Changed: ${filepath}`);
});
watcher.start();
Key challenges:
- fs.watch fires multiple times: a single save can trigger 2-3 events
  - Solution: debounce with a small delay (100-300ms); see the sketch after this list
- Different events on different OSes: macOS, Linux, and Windows behave differently
  - Solution: normalize events into add, change, delete
- Recursive watching: fs.watch's recursive option historically worked only on macOS and Windows (recent Node versions add Linux support)
  - Solution: walk the directory tree and watch each subdirectory on older Linux setups
- Handling renames: a rename appears as delete + add
  - Solution: use a short delay to correlate delete/add pairs
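A minimal per-file debounce for the duplicate-event problem (the 200ms default is illustrative):
function debounce(fn, delay = 200) {
  const timers = new Map(); // one pending timer per file path
  return (event, filepath) => {
    clearTimeout(timers.get(filepath));
    timers.set(filepath, setTimeout(() => {
      timers.delete(filepath);
      fn(event, filepath);
    }, delay));
  };
}

// Usage: collapse the 2-3 events fs.watch fires per save into one
const onChange = debounce((event, file) => {
  console.log(`Changed: ${file}`);
});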
Questions:
- Why does fs.watch not report which file changed on some systems?
- How do you handle watching thousands of files efficiently?
- What happens if a watched directory is deleted?
Learning milestones:
- Basic file changes detected → You understand fs.watch
- Events are properly debounced → You understand timing issues
- Works across platforms → You understand abstraction
- Memory doesn’t grow unbounded → You understand cleanup
Project 3: HTTP Server from Scratch
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 3: Genuinely Clever
- Business Potential: 1. The “Resume Gold”
- Difficulty: Level 2: Intermediate
- Knowledge Area: HTTP Protocol / Networking
- Software or Tool: HTTP Server
- Main Book: “Node.js in Action” by Cantelon, Rajlich, Harter
What you’ll build: An HTTP server using only the built-in http module—no Express, no frameworks. You’ll parse requests, route them, handle query strings, and send proper responses.
Why it teaches Node.js: Express and other frameworks hide so much. Building an HTTP server from scratch teaches you what’s actually happening: request/response objects, headers, body parsing, and the HTTP protocol itself.
Core challenges you’ll face:
- Request parsing → maps to understanding HTTP message format
- Body parsing → maps to handling streams and buffers
- Routing → maps to URL parsing and pattern matching
- Response handling → maps to headers, status codes, content types
Key Concepts:
- HTTP Module: “Node.js in Action” Chapter 4 - Cantelon et al.
- HTTP Protocol: “HTTP: The Definitive Guide” Chapters 1-3 - Gourley
- Request/Response Streams: Node.js Documentation - HTTP
- Difficulty: Intermediate
- Time estimate: 1 week
- Prerequisites: Project 1, basic understanding of HTTP
Real world outcome:
// Your HTTP server
const server = new MyHTTPServer();
server.get('/users', (req, res) => {
res.json([
{ id: 1, name: 'Alice' },
{ id: 2, name: 'Bob' }
]);
});
server.get('/users/:id', (req, res) => {
res.json({ id: req.params.id, name: 'User ' + req.params.id });
});
server.post('/users', async (req, res) => {
const body = await req.json(); // Parse JSON body
res.status(201).json({ id: 3, ...body });
});
server.use((req, res, next) => {
console.log(`${req.method} ${req.url}`);
next();
});
server.listen(3000, () => {
console.log('Server running on http://localhost:3000');
});
$ curl http://localhost:3000/users
[{"id":1,"name":"Alice"},{"id":2,"name":"Bob"}]
$ curl http://localhost:3000/users/42
{"id":"42","name":"User 42"}
$ curl -X POST http://localhost:3000/users \
-H "Content-Type: application/json" \
-d '{"name":"Charlie"}'
{"id":3,"name":"Charlie"}
Implementation Hints:
The raw http module:
const http = require('http');
const server = http.createServer((req, res) => {
// req is an IncomingMessage (readable stream)
// res is a ServerResponse (writable stream)
console.log(req.method); // GET, POST, etc.
console.log(req.url); // /users?page=1
console.log(req.headers); // { host: '...', ... }
res.statusCode = 200;
res.setHeader('Content-Type', 'application/json');
res.end(JSON.stringify({ hello: 'world' }));
});
server.listen(3000);
Building on top:
- URL Parsing: Use the URL class

  const url = new URL(req.url, `http://${req.headers.host}`);
  console.log(url.pathname);     // /users
  console.log(url.searchParams); // URLSearchParams { 'page' => '1' }

- Body Parsing: Request is a stream

  function parseBody(req) {
    return new Promise((resolve, reject) => {
      const chunks = [];
      req.on('data', chunk => chunks.push(chunk));
      req.on('end', () => {
        const body = Buffer.concat(chunks).toString();
        resolve(JSON.parse(body));
      });
      req.on('error', reject);
    });
  }

- Route Matching: Pattern matching with params

  // /users/:id should match /users/42
  // and extract { id: '42' }
  // (a sketch follows this list)

- Middleware Chain: Next function pattern

  const middlewares = [];
  function runMiddleware(req, res, index = 0) {
    if (index >= middlewares.length) return;
    middlewares[index](req, res, () => {
      runMiddleware(req, res, index + 1);
    });
  }
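For the route-matching item, a first pass that avoids regexes entirely (Project 5 builds the fuller regex-based router):
function matchRoute(pattern, pathname) {
  // '/users/:id' vs '/users/42' -> { id: '42' }, or null on no match
  const patternParts = pattern.split('/').filter(Boolean);
  const pathParts = pathname.split('/').filter(Boolean);
  if (patternParts.length !== pathParts.length) return null;
  const params = {};
  for (let i = 0; i < patternParts.length; i++) {
    if (patternParts[i].startsWith(':')) {
      params[patternParts[i].slice(1)] = pathParts[i];
    } else if (patternParts[i] !== pathParts[i]) {
      return null;
    }
  }
  return params;
}

console.log(matchRoute('/users/:id', '/users/42')); // { id: '42' }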
Learning milestones:
- Basic requests handled → You understand request/response
- Bodies parse correctly → You understand streams
- Routes with params work → You understand URL parsing
- Middleware chain works → You understand composition
Project 4: Static File Server
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 2: Practical but Forgettable
- Business Potential: 2. The “Micro-SaaS / Pro Tool”
- Difficulty: Level 2: Intermediate
- Knowledge Area: Streams / HTTP
- Software or Tool: Static File Server
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A static file server that serves files from a directory, handles MIME types correctly, supports range requests (for video seeking), and streams files efficiently.
Why it teaches Node.js: Static file serving is where streams shine. You’ll learn to pipe files directly to responses without loading them entirely into memory—essential for serving large files.
Core challenges you’ll face:
- MIME type detection → maps to content negotiation
- Stream piping → maps to efficient I/O
- Range requests → maps to partial content (HTTP 206)
- Security (path traversal) → maps to input validation
Key Concepts:
- Streams: “Node.js Design Patterns” Chapter 6 - Casciaro
- HTTP Range Requests: RFC 7233
- MIME Types: “HTTP: The Definitive Guide” Chapter 17 - Gourley
- Difficulty: Intermediate
- Time estimate: 1 week
- Prerequisites: Project 3, understanding of streams
Real world outcome:
$ static-serve ./public --port 8080
📁 Serving ./public on http://localhost:8080
# In browser or curl:
$ curl -I http://localhost:8080/styles.css
HTTP/1.1 200 OK
Content-Type: text/css
Content-Length: 1234
Cache-Control: public, max-age=3600
ETag: "abc123"
$ curl -I http://localhost:8080/video.mp4
HTTP/1.1 200 OK
Content-Type: video/mp4
Content-Length: 104857600
Accept-Ranges: bytes
# Range request (video seeking)
$ curl -H "Range: bytes=0-1023" http://localhost:8080/video.mp4
HTTP/1.1 206 Partial Content
Content-Range: bytes 0-1023/104857600
Content-Length: 1024
# Directory listing
$ curl http://localhost:8080/images/
<!DOCTYPE html>
<html>
<body>
<h1>Index of /images/</h1>
<ul>
<li><a href="logo.png">logo.png</a> (24KB)</li>
<li><a href="banner.jpg">banner.jpg</a> (156KB)</li>
</ul>
</body>
</html>
Implementation Hints:
Stream-based file serving:
const fs = require('fs');
const path = require('path');
function serveFile(req, res, filepath) {
const stat = fs.statSync(filepath);
const mimeType = getMimeType(filepath);
res.setHeader('Content-Type', mimeType);
res.setHeader('Content-Length', stat.size);
// Stream the file (don't load into memory)
const stream = fs.createReadStream(filepath);
stream.pipe(res);
stream.on('error', (err) => {
res.statusCode = 500;
res.end('Internal Server Error');
});
}
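The getMimeType helper above is left to you; a lookup-table sketch covering common types (extend the map as needed):
const MIME_TYPES = {
  '.html': 'text/html',
  '.css': 'text/css',
  '.js': 'application/javascript',
  '.json': 'application/json',
  '.png': 'image/png',
  '.jpg': 'image/jpeg',
  '.svg': 'image/svg+xml',
  '.mp4': 'video/mp4'
};

function getMimeType(filepath) {
  // Default to octet-stream so browsers download unknown types
  return MIME_TYPES[path.extname(filepath).toLowerCase()]
    || 'application/octet-stream';
}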
Range request handling:
function serveWithRange(req, res, filepath) {
const stat = fs.statSync(filepath);
const range = req.headers.range;
if (range) {
// Parse: "bytes=0-1023"
const [start, end] = range.replace('bytes=', '').split('-');
const startByte = parseInt(start, 10);
const endByte = end ? parseInt(end, 10) : stat.size - 1;
res.statusCode = 206;
res.setHeader('Content-Range', `bytes ${startByte}-${endByte}/${stat.size}`);
res.setHeader('Content-Length', endByte - startByte + 1);
fs.createReadStream(filepath, { start: startByte, end: endByte }).pipe(res);
} else {
// Normal full response
res.setHeader('Accept-Ranges', 'bytes');
fs.createReadStream(filepath).pipe(res);
}
}
Security - prevent path traversal:
function safePath(baseDir, requestedPath) {
  const base = path.resolve(baseDir);
  // Resolve relative to the base; the './' prefix stops an absolute
  // request path like "/etc/passwd" from escaping the base directory
  const resolved = path.resolve(base, '.' + path.sep + requestedPath);
  // Compare against base + separator so a sibling like "/public-evil"
  // cannot pass a naive startsWith("/public") check
  if (resolved !== base && !resolved.startsWith(base + path.sep)) {
    throw new Error('Path traversal attempt');
  }
  return resolved;
}
Learning milestones:
- Files stream without loading to memory → You understand piping
- MIME types are correct → You understand content types
- Range requests work → You understand partial content
- Path traversal is blocked → You understand security
Project 5: Mini Web Framework
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 4: Hardcore Tech Flex
- Business Potential: 4. The “Open Core” Infrastructure
- Difficulty: Level 2: Intermediate
- Knowledge Area: Web Framework Design
- Software or Tool: Web Framework
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A minimal Express-like web framework with routing, middleware, request/response extensions, template rendering, and error handling.
Why it teaches Node.js: After using Express, building your own framework reveals how it works. You’ll understand middleware composition, context objects, and the patterns that make Node.js web development productive.
Core challenges you’ll face:
- Middleware composition → maps to function chaining patterns
- Router design → maps to trie or regex-based routing
- Context objects → maps to request/response extensions
- Error handling → maps to centralized error middleware
Key Concepts:
- Middleware Pattern: “Node.js Design Patterns” Chapter 5 - Casciaro
- Express Source Code: Study the actual Express.js repository
- Koa Design: Koa.js documentation - Philosophy
- Difficulty: Intermediate
- Time estimate: 2 weeks
- Prerequisites: Projects 3 and 4
Real world outcome:
// Your framework in action
const app = new MiniExpress();
// Middleware
app.use(async (ctx, next) => {
const start = Date.now();
await next();
const ms = Date.now() - start;
ctx.set('X-Response-Time', `${ms}ms`);
});
app.use(bodyParser());
app.use(cookieParser());
// Routes
app.get('/', (ctx) => {
ctx.body = { message: 'Hello World' };
});
app.get('/users/:id', async (ctx) => {
const user = await User.findById(ctx.params.id);
if (!user) {
ctx.throw(404, 'User not found');
}
ctx.body = user;
});
app.post('/users', async (ctx) => {
const user = await User.create(ctx.request.body);
ctx.status = 201;
ctx.body = user;
});
// Error handling
app.onError((err, ctx) => {
ctx.status = err.status || 500;
ctx.body = { error: err.message };
console.error(err.stack);
});
// Template rendering
app.get('/page', (ctx) => {
ctx.render('home', { title: 'Welcome' });
});
app.listen(3000);
Implementation Hints:
Middleware composition (Koa-style):
function compose(middlewares) {
return function(ctx) {
let index = -1;
function dispatch(i) {
if (i <= index) {
return Promise.reject(new Error('next() called multiple times'));
}
index = i;
let fn = middlewares[i];
if (!fn) return Promise.resolve();
try {
return Promise.resolve(fn(ctx, () => dispatch(i + 1)));
} catch (err) {
return Promise.reject(err);
}
}
return dispatch(0);
};
}
Context object pattern:
class Context {
constructor(req, res) {
this.req = req;
this.res = res;
this.request = new Request(req);
this.response = new Response(res);
this.params = {};
this.state = {}; // For passing data between middleware
}
get body() { return this.response.body; }
set body(val) { this.response.body = val; }
get status() { return this.response.status; }
set status(val) { this.response.status = val; }
throw(status, message) {
const err = new Error(message);
err.status = status;
throw err;
}
}
Router with param extraction:
class Router {
constructor() {
this.routes = [];
}
add(method, path, handler) {
// Convert /users/:id to regex and param names
const paramNames = [];
const pattern = path.replace(/:(\w+)/g, (_, name) => {
paramNames.push(name);
return '([^/]+)';
});
this.routes.push({
method,
pattern: new RegExp(`^${pattern}$`),
paramNames,
handler
});
}
match(method, path) {
for (const route of this.routes) {
if (route.method !== method) continue;
const match = path.match(route.pattern);
if (match) {
const params = {};
route.paramNames.forEach((name, i) => {
params[name] = match[i + 1];
});
return { handler: route.handler, params };
}
}
return null;
}
}
Learning milestones:
- Middleware chain works → You understand composition
- Routes with params work → You understand pattern matching
- Errors are handled centrally → You understand error propagation
- Response helpers work → You understand API design
Project 6: Real-time Chat Server
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 4: Hardcore Tech Flex
- Business Potential: 3. The “Service & Support” Model
- Difficulty: Level 2: Intermediate
- Knowledge Area: WebSockets / Real-time
- Software or Tool: Chat Application
- Main Book: “Node.js in Action” by Cantelon, Rajlich, Harter
What you’ll build: A real-time chat server with rooms, private messages, typing indicators, and presence detection—using WebSockets without Socket.io.
Why it teaches Node.js: WebSockets showcase Node.js’s event-driven nature. You’ll handle many concurrent connections efficiently and understand why Node.js is excellent for real-time applications.
Core challenges you’ll face:
- WebSocket protocol → maps to upgrade handshake and framing
- Connection management → maps to tracking active clients
- Broadcasting → maps to event distribution patterns
- Presence detection → maps to heartbeats and timeouts
Key Concepts:
- WebSocket Protocol: RFC 6455
- Real-time Patterns: “Node.js in Action” Chapter 8 - Cantelon et al.
- EventEmitter for Pub/Sub: “Node.js Design Patterns” Chapter 3
- Difficulty: Intermediate
- Time estimate: 1-2 weeks
- Prerequisites: Project 3, understanding of WebSockets
Real world outcome:
Server Terminal:
$ node chat-server.js
🚀 Chat server running on ws://localhost:8080
[10:23:01] alice joined #general
[10:23:15] bob joined #general
[10:23:20] alice: Hello everyone!
[10:23:25] bob: Hey alice!
[10:23:30] bob started typing in #general
[10:23:35] bob: How are you?
[10:24:00] charlie joined #random
[10:24:05] alice -> charlie (private): Hey, join #general!
Client (Web Browser Console):
> chat.connect('ws://localhost:8080', 'alice')
Connected as alice
> chat.join('general')
Joined #general
> chat.send('general', 'Hello everyone!')
Message sent
> chat.onMessage((msg) => console.log(msg))
{ room: 'general', user: 'bob', text: 'Hey alice!' }
> chat.getUsers('general')
['alice', 'bob']
> chat.privateMessage('charlie', 'Hey, join #general!')
Private message sent
Implementation Hints:
WebSocket server from scratch (or use ws library):
const http = require('http');
const crypto = require('crypto');
const server = http.createServer();
server.on('upgrade', (req, socket, head) => {
// WebSocket handshake
const key = req.headers['sec-websocket-key'];
const hash = crypto
.createHash('sha1')
.update(key + '258EAFA5-E914-47DA-95CA-C5AB0DC85B11')
.digest('base64');
socket.write([
'HTTP/1.1 101 Switching Protocols',
'Upgrade: websocket',
'Connection: Upgrade',
`Sec-WebSocket-Accept: ${hash}`,
'',
''
].join('\r\n'));
// Now socket is a WebSocket connection
handleWebSocket(socket);
});
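Once upgraded, incoming data arrives as WebSocket frames, not raw text. A minimal sketch that decodes only small (under 126 bytes) masked text frames; real code must also handle the longer length encodings, fragmentation, and control frames (RFC 6455 §5):
function decodeTextFrame(buffer) {
  // Byte 0: FIN flag + opcode; byte 1: mask bit + 7-bit payload length
  const isMasked = Boolean(buffer[1] & 0x80); // client frames must be masked
  const payloadLength = buffer[1] & 0x7f;     // only valid when < 126
  if (!isMasked || payloadLength >= 126) {
    throw new Error('Minimal decoder: small masked frames only');
  }
  const mask = buffer.slice(2, 6);            // 4-byte masking key
  const payload = buffer.slice(6, 6 + payloadLength);
  const decoded = Buffer.alloc(payloadLength);
  for (let i = 0; i < payloadLength; i++) {
    decoded[i] = payload[i] ^ mask[i % 4];    // unmask byte by byte
  }
  return decoded.toString('utf8');
}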
Chat server architecture:
class ChatServer {
constructor() {
this.rooms = new Map(); // room -> Set of connections
this.users = new Map(); // connection -> user info
this.connections = new Set(); // all connections
}
handleConnection(socket) {
this.connections.add(socket);
socket.on('message', (data) => {
const msg = JSON.parse(data);
this.handleMessage(socket, msg);
});
socket.on('close', () => {
this.handleDisconnect(socket);
});
}
broadcast(room, message, exclude = null) {
const connections = this.rooms.get(room) || new Set();
for (const socket of connections) {
if (socket !== exclude) {
socket.send(JSON.stringify(message));
}
}
}
// Presence with heartbeats
startHeartbeat(socket) {
const interval = setInterval(() => {
if (socket.isAlive === false) {
this.handleDisconnect(socket);
socket.terminate();
return;
}
socket.isAlive = false;
socket.ping();
}, 30000);
socket.on('pong', () => {
socket.isAlive = true;
});
}
}
Message protocol:
// Client → Server
{ type: 'join', room: 'general' }
{ type: 'leave', room: 'general' }
{ type: 'message', room: 'general', text: 'Hello!' }
{ type: 'private', to: 'bob', text: 'Hey!' }
{ type: 'typing', room: 'general' }
// Server → Client
{ type: 'joined', room: 'general', users: ['alice', 'bob'] }
{ type: 'user_joined', room: 'general', user: 'charlie' }
{ type: 'message', room: 'general', user: 'alice', text: 'Hello!' }
{ type: 'typing', room: 'general', user: 'bob' }
Learning milestones:
- WebSocket handshake works → You understand the upgrade protocol
- Messages broadcast to rooms → You understand pub/sub patterns
- Private messages work → You understand connection routing
- Presence detection works → You understand heartbeats
Project 7: Job Queue System
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 4: Hardcore Tech Flex
- Business Potential: 4. The “Open Core” Infrastructure
- Difficulty: Level 3: Advanced
- Knowledge Area: Event Loop / Async Patterns
- Software or Tool: Job Queue (like Bull/BullMQ)
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A job queue system that processes background tasks with retry logic, priority queues, rate limiting, and job scheduling—like Bull or Agenda.
Why it teaches Node.js: Job queues test your understanding of the event loop, async patterns, and concurrency. You’ll learn how to process work efficiently without blocking and how to handle failures gracefully.
Core challenges you’ll face:
- Event loop awareness → maps to not blocking the main thread
- Retry logic → maps to exponential backoff patterns
- Priority queues → maps to heap data structures
- Persistence → maps to surviving restarts
Key Concepts:
- Event Loop Deep Dive: Node.js Documentation - Event Loop
- Async Patterns: “Node.js Design Patterns” Chapter 4 - Casciaro
- Queue Data Structures: “Algorithms” Chapter 2 - Sedgewick
- Difficulty: Advanced
- Time estimate: 2-3 weeks
- Prerequisites: Projects 1-3, understanding of event loop
Real world outcome:
// Your job queue
const queue = new JobQueue({
concurrency: 5,
storage: './jobs.db' // SQLite for persistence
});
// Define job processors
queue.process('email', async (job) => {
await sendEmail(job.data.to, job.data.subject, job.data.body);
return { sent: true };
});
queue.process('resize-image', async (job) => {
const result = await sharp(job.data.input)
.resize(job.data.width, job.data.height)
.toFile(job.data.output);
return result;
});
// Add jobs
const job1 = await queue.add('email', {
to: 'user@example.com',
subject: 'Welcome!',
body: 'Thanks for signing up'
}, {
priority: 'high',
attempts: 3,
backoff: { type: 'exponential', delay: 1000 }
});
// Scheduled job
await queue.add('email', { ... }, {
delay: 60000 // Run in 1 minute
});
// Recurring job
await queue.add('cleanup', {}, {
repeat: { cron: '0 0 * * *' } // Daily at midnight
});
// Events
queue.on('completed', (job, result) => {
console.log(`Job ${job.id} completed:`, result);
});
queue.on('failed', (job, err) => {
console.log(`Job ${job.id} failed:`, err.message);
console.log(`Attempts: ${job.attemptsMade}/${job.opts.attempts}`);
});
// Dashboard
await queue.getStats();
// {
// waiting: 10,
// active: 5,
// completed: 150,
// failed: 2,
// delayed: 3
// }
Implementation Hints:
Job queue architecture:
┌─────────────────────────────────────────────────────────────┐
│ Job Queue │
├─────────────────────────────────────────────────────────────┤
│ │
│ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ │
│ │ Waiting │───▶│ Active │───▶│Completed│ │ Failed │ │
│ │ Queue │ │ Jobs │ │ Jobs │ │ Jobs │ │
│ └─────────┘ └────┬────┘ └─────────┘ └────▲────┘ │
│ │ │ │ │
│ │ └─────────────────────────────┘ │
│ │ (on failure) │
│ │ │
│ ┌────▼────┐ │
│ │ Delayed │ (scheduled jobs wait here) │
│ │ Queue │ │
│ └─────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
Core implementation patterns:
class JobQueue extends EventEmitter {
constructor(options) {
super();
this.concurrency = options.concurrency || 1;
this.active = 0;
this.processors = new Map();
this.queues = {
waiting: new PriorityQueue(),
delayed: [],
active: new Set(),
completed: [],
failed: []
};
}
async add(type, data, opts = {}) {
const job = {
id: generateId(),
type,
data,
opts,
attemptsMade: 0,
createdAt: Date.now(),
processAt: Date.now() + (opts.delay || 0)
};
if (opts.delay) {
this.scheduleDelayed(job);
} else {
this.queues.waiting.enqueue(job, opts.priority);
this.tick();
}
return job;
}
tick() {
// Process jobs without blocking event loop
setImmediate(() => {
while (this.active < this.concurrency && !this.queues.waiting.isEmpty()) {
const job = this.queues.waiting.dequeue();
this.processJob(job);
}
});
}
async processJob(job) {
this.active++;
this.queues.active.add(job);
try {
const processor = this.processors.get(job.type);
const result = await processor(job);
job.completedAt = Date.now();
job.result = result;
this.queues.completed.push(job);
this.emit('completed', job, result);
} catch (err) {
job.attemptsMade++;
if (job.attemptsMade < job.opts.attempts) {
// Retry with backoff
const delay = this.calculateBackoff(job);
job.processAt = Date.now() + delay;
this.scheduleDelayed(job);
} else {
job.failedAt = Date.now();
job.error = err.message;
this.queues.failed.push(job);
this.emit('failed', job, err);
}
} finally {
this.active--;
this.queues.active.delete(job);
this.tick(); // Process next job
}
}
}
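calculateBackoff and scheduleDelayed are called above but not shown; plausible sketches to add to the class (one timer per delayed job is the simplest scheme, not the most scalable):
calculateBackoff(job) {
  const { type = 'exponential', delay = 1000 } = job.opts.backoff || {};
  if (type === 'fixed') return delay;
  // Exponential: delay, 2*delay, 4*delay, ... per attempt already made
  return delay * Math.pow(2, job.attemptsMade - 1);
}

scheduleDelayed(job) {
  this.queues.delayed.push(job);
  const wait = Math.max(0, job.processAt - Date.now());
  setTimeout(() => {
    const i = this.queues.delayed.indexOf(job);
    if (i === -1) return; // job was removed in the meantime
    this.queues.delayed.splice(i, 1);
    this.queues.waiting.enqueue(job, job.opts.priority);
    this.tick();
  }, wait);
}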
Learning milestones:
- Jobs process concurrently → You understand async coordination
- Failed jobs retry with backoff → You understand error recovery
- Delayed jobs run on time → You understand timers
- Queue survives restarts → You understand persistence
Project 8: Database Connection Pool
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 3: Genuinely Clever
- Business Potential: 3. The “Service & Support” Model
- Difficulty: Level 3: Advanced
- Knowledge Area: Resource Management / Async
- Software or Tool: Connection Pool Library
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A generic connection pool for database connections, with min/max connections, health checks, timeout handling, and efficient connection reuse.
Why it teaches Node.js: Connection pools are a classic resource management problem. You’ll learn async patterns for acquiring/releasing resources, handling timeouts, and managing limited resources efficiently.
Core challenges you’ll face:
- Resource lifecycle → maps to create, validate, destroy patterns
- Async waiting → maps to queue pending requests
- Timeout handling → maps to Promise.race patterns
- Pool sizing → maps to dynamic scaling based on demand
Key Concepts:
- Resource Pooling: “Node.js Design Patterns” Chapter 4 - Casciaro
- Semaphore Pattern: “Operating System Concepts” Chapter 6 - Silberschatz
- Connection Pool Internals: generic-pool npm package source code
- Difficulty: Advanced
- Time estimate: 1-2 weeks
- Prerequisites: Projects 3 and 7, understanding of async/await
Real world outcome:
// Your connection pool
const pool = new ConnectionPool({
create: async () => {
// Factory to create new connections
const conn = await pg.connect('postgres://localhost/mydb');
return conn;
},
destroy: async (conn) => {
await conn.end();
},
validate: async (conn) => {
// Health check
try {
await conn.query('SELECT 1');
return true;
} catch {
return false;
}
},
min: 2, // Minimum connections to keep
max: 10, // Maximum connections
acquireTimeout: 5000,
idleTimeout: 30000,
healthCheckInterval: 10000
});
// Using the pool
async function getUser(id) {
const conn = await pool.acquire();
try {
const result = await conn.query('SELECT * FROM users WHERE id = $1', [id]);
return result.rows[0];
} finally {
pool.release(conn);
}
}
// Or with automatic release
async function getUser(id) {
return pool.use(async (conn) => {
const result = await conn.query('SELECT * FROM users WHERE id = $1', [id]);
return result.rows[0];
});
}
// Pool stats
pool.getStats();
// {
// total: 5,
// available: 3,
// waiting: 0,
// acquired: 2
// }
// Graceful shutdown
await pool.drain();
await pool.clear();
Implementation Hints:
Pool architecture:
class ConnectionPool {
constructor(options) {
this.factory = {
create: options.create,
destroy: options.destroy,
validate: options.validate
};
this.options = {
min: options.min || 0,
max: options.max || 10,
acquireTimeout: options.acquireTimeout || 10000,
idleTimeout: options.idleTimeout || 30000
};
this.pool = []; // Available connections
this.acquired = new Set(); // In-use connections
this.waiting = []; // Pending acquire requests
this.creating = 0; // Connections being created
}
get size() {
return this.pool.length + this.acquired.size + this.creating;
}
async acquire() {
// Try to get an existing connection
while (this.pool.length > 0) {
const conn = this.pool.pop();
if (await this.validate(conn)) {
this.acquired.add(conn);
return conn;
}
await this.destroy(conn);
}
// Create new if under max
if (this.size < this.options.max) {
return this.createConnection();
}
// Wait for one to be released
return this.waitForConnection();
}
release(conn) {
this.acquired.delete(conn);
// Give to waiting request if any
if (this.waiting.length > 0) {
const { resolve } = this.waiting.shift();
this.acquired.add(conn);
resolve(conn);
} else {
// Return to pool
conn.lastUsed = Date.now();
this.pool.push(conn);
}
}
async waitForConnection() {
  return new Promise((resolve, reject) => {
    // Keep a reference to the waiter object so the timeout can find
    // and remove exactly this entry (the stored resolve is a wrapper,
    // so searching by the raw resolve function would never match)
    const waiter = {
      resolve: (conn) => {
        clearTimeout(timeout);
        resolve(conn);
      },
      reject
    };
    const timeout = setTimeout(() => {
      const index = this.waiting.indexOf(waiter);
      if (index !== -1) {
        this.waiting.splice(index, 1);
        reject(new Error('Acquire timeout'));
      }
    }, this.options.acquireTimeout);
    this.waiting.push(waiter);
  });
}
// Higher-level API
async use(fn) {
const conn = await this.acquire();
try {
return await fn(conn);
} finally {
this.release(conn);
}
}
}
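The min and idleTimeout options imply maintenance work the skeleton omits; a sketch of warmup plus idle reaping, assuming a start() method you call after construction:
async start() {
  // Warm the pool up to the configured minimum
  while (this.size < this.options.min) {
    const conn = await this.factory.create();
    conn.lastUsed = Date.now();
    this.pool.push(conn);
  }
  // Periodically destroy idle connections above the minimum
  this.reaper = setInterval(() => {
    const now = Date.now();
    while (
      this.pool.length > 0 &&
      this.size > this.options.min &&
      now - this.pool[0].lastUsed > this.options.idleTimeout
    ) {
      this.factory.destroy(this.pool.shift());
    }
  }, this.options.idleTimeout).unref(); // don't keep the process alive
}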
Learning milestones:
- Connections are reused → You understand pooling
- Waiting requests get connections → You understand queueing
- Timeouts work correctly → You understand Promise.race
- Pool maintains min connections → You understand warmup
Project 9: Stream Data Processor
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 4: Hardcore Tech Flex
- Business Potential: 4. The “Open Core” Infrastructure
- Difficulty: Level 3: Advanced
- Knowledge Area: Streams / Data Processing
- Software or Tool: ETL Pipeline Tool
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A stream-based data processing pipeline that can transform, filter, and aggregate large datasets (CSV, JSON Lines, logs) without loading them entirely into memory.
Why it teaches Node.js: Streams are Node.js’s superpower for handling large data. You’ll learn about backpressure, Transform streams, and how to process gigabytes of data with constant memory usage.
Core challenges you’ll face:
- Transform streams → maps to custom stream implementations
- Backpressure handling → maps to flow control
- Pipeline composition → maps to stream chaining
- Error propagation → maps to stream error handling
Key Concepts:
- Streams Deep Dive: “Node.js Design Patterns” Chapter 6 - Casciaro
- Backpressure: Node.js Documentation - Backpressure in Streams
- Pipeline Pattern: stream.pipeline() documentation
- Difficulty: Advanced
- Time estimate: 2 weeks
- Prerequisites: Project 4, understanding of streams
Real world outcome:
// Your stream processing library
const pipeline = new DataPipeline();
// Process a 10GB CSV file with constant memory
await pipeline
.source(csvReader('./huge-file.csv'))
.transform(row => ({
...row,
fullName: `${row.firstName} ${row.lastName}`,
age: parseInt(row.age)
}))
.filter(row => row.age >= 18)
.batch(1000) // Collect into batches
.transform(async batch => {
// Async transform: enrich from database
const ids = batch.map(r => r.id);
const extra = await db.query('SELECT * FROM extra WHERE id IN (?)', [ids]);
return batch.map(r => ({ ...r, extra: extra.find(e => e.id === r.id) }));
})
.flatten() // Back to individual records
.sink(jsonWriter('./output.jsonl'))
.run();
// Stats during processing
pipeline.on('progress', (stats) => {
console.log(`Processed: ${stats.processed} | Rate: ${stats.rate}/sec | Memory: ${stats.memory}MB`);
});
// Example output:
// Processed: 10,000 | Rate: 5,234/sec | Memory: 45MB
// Processed: 20,000 | Rate: 5,189/sec | Memory: 47MB
// ...
// Processed: 50,000,000 | Rate: 5,102/sec | Memory: 48MB
// Done! Total: 50,000,000 records in 2h 43m
Implementation Hints:
Custom Transform stream:
const { Transform } = require('stream');
class MapTransform extends Transform {
constructor(fn, options = {}) {
super({ ...options, objectMode: true });
this.fn = fn;
}
_transform(chunk, encoding, callback) {
try {
const result = this.fn(chunk);
if (result instanceof Promise) {
result
.then(r => callback(null, r))
.catch(callback);
} else {
callback(null, result);
}
} catch (err) {
callback(err);
}
}
}
Backpressure-aware pipeline:
const { pipeline } = require('stream/promises');
async function processFile(input, output) {
await pipeline(
fs.createReadStream(input),
csvParser(), // Parse CSV to objects
new MapTransform(transform), // Your transforms
new FilterTransform(predicate),
jsonStringify(), // Objects to JSON lines
fs.createWriteStream(output)
);
}
Batch transform (accumulate N items):
class BatchTransform extends Transform {
constructor(size) {
super({ objectMode: true });
this.size = size;
this.batch = [];
}
_transform(chunk, encoding, callback) {
this.batch.push(chunk);
if (this.batch.length >= this.size) {
this.push(this.batch);
this.batch = [];
}
callback();
}
_flush(callback) {
if (this.batch.length > 0) {
this.push(this.batch);
}
callback();
}
}
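FilterTransform, used in the pipeline above but never defined, drops chunks that fail a predicate (reusing the Transform import from earlier):
class FilterTransform extends Transform {
  constructor(predicate) {
    super({ objectMode: true });
    this.predicate = predicate;
  }
  _transform(chunk, encoding, callback) {
    // Passing the chunk forwards it; calling back with no
    // arguments silently drops it
    if (this.predicate(chunk)) {
      callback(null, chunk);
    } else {
      callback();
    }
  }
}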
Learning milestones:
- Memory stays constant with large files → You understand streaming
- Slow consumers don’t overwhelm fast producers → You understand backpressure
- Errors propagate correctly → You understand pipeline error handling
- Async transforms work → You understand async streams
Project 10: HTTP Client Library
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 3: Genuinely Clever
- Business Potential: 3. The “Service & Support” Model
- Difficulty: Level 2: Intermediate
- Knowledge Area: HTTP / Networking
- Software or Tool: HTTP Client (like axios/got)
- Main Book: “Node.js in Action” by Cantelon, Rajlich, Harter
What you’ll build: An HTTP client library with a nice API, automatic retries, timeouts, interceptors, streaming support, and connection pooling—like axios or got, but from scratch.
Why it teaches Node.js: The built-in http module is low-level. Building a high-level client teaches you about the request/response lifecycle, connection reuse, and how to build ergonomic APIs.
Core challenges you’ll face:
- Request/response handling → maps to http.request() patterns
- Redirects → maps to following Location headers
- Timeouts → maps to aborting requests
- Connection reuse → maps to HTTP keep-alive
Key Concepts:
- HTTP Client: “Node.js in Action” Chapter 5 - Cantelon et al.
- Keep-Alive: Node.js Documentation - HTTP Agent
- Abort Controller: Node.js Documentation - AbortController
- Difficulty: Intermediate
- Time estimate: 1-2 weeks
- Prerequisites: Project 3
Real world outcome:
// Your HTTP client
const client = new HttpClient({
baseURL: 'https://api.example.com',
timeout: 5000,
headers: {
'User-Agent': 'MyApp/1.0'
}
});
// Interceptors
client.interceptors.request.use((config) => {
config.headers['Authorization'] = `Bearer ${getToken()}`;
return config;
});
client.interceptors.response.use(
(response) => response,
(error) => {
if (error.status === 401) {
return refreshTokenAndRetry(error.config);
}
throw error;
}
);
// Simple requests
const response = await client.get('/users');
console.log(response.data); // Parsed JSON
// With options
const response = await client.post('/users', {
data: { name: 'Alice', email: 'alice@example.com' },
timeout: 10000,
retry: { attempts: 3, backoff: 'exponential' }
});
// Streaming response
const stream = await client.get('/large-file', { stream: true });
stream.pipe(fs.createWriteStream('./file'));
// Parallel requests
const [users, posts] = await Promise.all([
client.get('/users'),
client.get('/posts')
]);
// Request cancellation
const controller = new AbortController();
setTimeout(() => controller.abort(), 1000);
try {
await client.get('/slow-endpoint', { signal: controller.signal });
} catch (err) {
if (err.name === 'AbortError') {
console.log('Request cancelled');
}
}
Implementation Hints:
Core request function:
const http = require('http');
const https = require('https');
async function request(url, options = {}) {
return new Promise((resolve, reject) => {
const parsedUrl = new URL(url);
const protocol = parsedUrl.protocol === 'https:' ? https : http;
const req = protocol.request({
hostname: parsedUrl.hostname,
port: parsedUrl.port,
path: parsedUrl.pathname + parsedUrl.search,
method: options.method || 'GET',
headers: options.headers || {},
agent: options.agent // For connection pooling
}, (res) => {
// Handle response
const chunks = [];
res.on('data', chunk => chunks.push(chunk));
res.on('end', () => {
const body = Buffer.concat(chunks).toString();
resolve({
status: res.statusCode,
headers: res.headers,
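// Simplified: assumes a JSON body; a real client would check the
// response Content-Type header before calling JSON.parse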
data: JSON.parse(body)
});
});
});
req.on('error', reject);
// Timeout
if (options.timeout) {
req.setTimeout(options.timeout, () => {
req.destroy(new Error('Request timeout'));
});
}
// Send body
if (options.data) {
req.write(JSON.stringify(options.data));
}
req.end();
});
}
Connection pooling with Agent:
const keepAliveAgent = new http.Agent({
keepAlive: true,
maxSockets: 10,
maxFreeSockets: 5,
timeout: 60000
});
// Use agent for all requests
request(url, { agent: keepAliveAgent });
Retry logic:
async function requestWithRetry(url, options) {
const { attempts = 3, backoff = 1000 } = options.retry || {};
for (let attempt = 1; attempt <= attempts; attempt++) {
try {
return await request(url, options);
} catch (err) {
if (attempt === attempts) throw err;
if (!isRetryable(err)) throw err;
const delay = backoff * Math.pow(2, attempt - 1);
await sleep(delay);
}
}
}
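The sleep and isRetryable helpers above are left undefined; minimal versions (which errors count as retryable is a policy choice):
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

function isRetryable(err) {
  // Network-level failures are usually worth retrying...
  if (['ECONNRESET', 'ETIMEDOUT', 'ECONNREFUSED'].includes(err.code)) {
    return true;
  }
  // ...as are 429 and 5xx responses; other 4xx client errors are not
  return err.status === 429 || (err.status >= 500 && err.status < 600);
}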
Learning milestones:
- Basic requests work → You understand http.request
- Timeouts cancel correctly → You understand abort patterns
- Connections are reused → You understand keep-alive
- Interceptors chain correctly → You understand middleware pattern
Project 11: Logger Library
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 2: Practical but Forgettable
- Business Potential: 3. The “Service & Support” Model
- Difficulty: Level 2: Intermediate
- Knowledge Area: Streams / Formatting
- Software or Tool: Logger (like pino/winston)
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A high-performance logger with structured logging (JSON), log levels, multiple transports (console, file, HTTP), log rotation, and child loggers with context.
Why it teaches Node.js: Logging is a great study in streams and performance. You’ll learn why synchronous logging blocks the event loop and how async/streaming loggers achieve high performance.
Core challenges you’ll face:
- Stream-based output → maps to non-blocking writes
- Log rotation → maps to file management
- Structured logging → maps to JSON serialization performance
- Context propagation → maps to child logger patterns
Key Concepts:
- Logging Patterns: Pino documentation - Philosophy
- Async vs Sync Logging: “Node.js Design Patterns” Chapter 6
- Writable Streams: Node.js Documentation - Stream
- Difficulty: Intermediate
- Time estimate: 1 week
- Prerequisites: Project 9
Real world outcome:
// Your logger
const logger = new Logger({
level: 'info',
transports: [
new ConsoleTransport({ colors: true }),
new FileTransport({
filename: './logs/app.log',
rotation: {
maxSize: '10MB',
maxFiles: 5
}
}),
new HttpTransport({
url: 'https://logs.example.com/ingest',
batchSize: 100,
flushInterval: 5000
})
]
});
// Basic logging
logger.info('Server started', { port: 3000 });
logger.error('Failed to connect', { host: 'db.example.com', error: err.message });
// Child logger with context
const reqLogger = logger.child({ requestId: 'abc123', userId: 'user456' });
reqLogger.info('Processing request'); // Includes requestId and userId
// Levels
logger.debug('Detailed info'); // Only if level <= debug
logger.info('Normal info');
logger.warn('Warning');
logger.error('Error');
logger.fatal('Fatal error');
// Pretty console output:
// [2024-01-15T10:23:45.123Z] INFO: Server started
// port: 3000
// JSON file output:
// {"level":"info","time":"2024-01-15T10:23:45.123Z","msg":"Server started","port":3000}
// Performance (non-blocking)
for (let i = 0; i < 100000; i++) {
logger.info('High-throughput logging', { index: i });
}
// Doesn't block - logs are buffered and written asynchronously
Implementation Hints:
High-performance logger architecture:
class Logger {
constructor(options) {
  // Accept a level name ('info') or an already-converted number,
  // since child() passes the numeric level back in
  this.level = typeof options.level === 'number'
    ? options.level
    : this.levelToNumber(options.level || 'info');
  this.transports = options.transports || [];
  this.context = options.context || {};
}
log(level, msg, data = {}) {
if (this.levelToNumber(level) < this.level) return;
const entry = {
level,
time: new Date().toISOString(),
msg,
...this.context,
...data
};
// Non-blocking: queue to transports
for (const transport of this.transports) {
transport.log(entry);
}
}
child(context) {
return new Logger({
level: this.level,
transports: this.transports,
context: { ...this.context, ...context }
});
}
info(msg, data) { this.log('info', msg, data); }
error(msg, data) { this.log('error', msg, data); }
// ... other levels
}
File transport with rotation:
class FileTransport {
constructor(options) {
this.filename = options.filename;
this.maxSize = parseSize(options.rotation?.maxSize || '10MB');
this.maxFiles = options.rotation?.maxFiles || 5;
this.stream = null;
this.currentSize = 0;
this.openStream();
}
openStream() {
this.stream = fs.createWriteStream(this.filename, { flags: 'a' });
this.currentSize = fs.existsSync(this.filename)
? fs.statSync(this.filename).size
: 0;
}
log(entry) {
const line = JSON.stringify(entry) + '\n';
// Check rotation
if (this.currentSize + line.length > this.maxSize) {
this.rotate();
}
this.stream.write(line);
this.currentSize += line.length;
}
rotate() {
this.stream.end();
// Rename app.log -> app.log.1, app.log.1 -> app.log.2, etc.
for (let i = this.maxFiles - 1; i >= 1; i--) {
const from = i === 1 ? this.filename : `${this.filename}.${i - 1}`;
const to = `${this.filename}.${i}`;
if (fs.existsSync(from)) {
fs.renameSync(from, to);
}
}
this.openStream();
}
}
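The parseSize helper ('10MB' to bytes) is assumed above; one way to write it:
function parseSize(size) {
  if (typeof size === 'number') return size;
  const match = /^(\d+(?:\.\d+)?)\s*(B|KB|MB|GB)$/i.exec(size);
  if (!match) throw new Error(`Invalid size: ${size}`);
  const units = { B: 1, KB: 1024, MB: 1024 ** 2, GB: 1024 ** 3 };
  return Math.floor(parseFloat(match[1]) * units[match[2].toUpperCase()]);
}

console.log(parseSize('10MB')); // 10485760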
Learning milestones:
- Logging doesn’t block → You understand async I/O
- Files rotate correctly → You understand file management
- Child loggers work → You understand context propagation
- Transports are composable → You understand transport pattern
Project 12: Package Manager (mini npm)
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript, Rust
- Coolness Level: Level 5: Pure Magic (Super Cool)
- Business Potential: 5. The “Industry Disruptor”
- Difficulty: Level 4: Expert
- Knowledge Area: Package Management / Module Resolution
- Software or Tool: Package Manager
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A package manager that can install packages from npm, resolve dependencies, handle version conflicts, and create a proper node_modules structure.
Why it teaches Node.js: Understanding how npm works gives you deep insight into the Node.js module system. You’ll learn about package.json, lockfiles, the resolution algorithm, and why node_modules is structured the way it is.
Core challenges you’ll face:
- Dependency resolution → maps to graph algorithms (topological sort)
- Version constraint solving → maps to semver matching
- Registry interaction → maps to HTTP client patterns
- node_modules structure → maps to module resolution algorithm
Key Concepts:
- Module Resolution: Node.js Documentation - Modules: Resolution algorithm
- Semver: The semver specification
- npm Architecture: npm documentation - How npm works
- Difficulty: Expert
- Time estimate: 1 month
- Prerequisites: Projects 1, 9, 10
Real world outcome:
$ minipm init
Created package.json
$ minipm install express
Resolving dependencies...
express@4.18.2
├── accepts@1.3.8
│ ├── mime-types@2.1.35
│ └── negotiator@0.6.3
├── body-parser@1.20.1
│ ├── bytes@3.1.2
│ └── ...
└── ... (30 packages)
Downloading packages...
[================] 30/30 packages
Installing...
express@4.18.2 -> node_modules/express
accepts@1.3.8 -> node_modules/accepts
...
Added 30 packages in 2.3s
$ minipm install lodash@4.17.0
+ lodash@4.17.0
$ minipm list
├── express@4.18.2
│ ├── accepts@1.3.8
│ └── ...
└── lodash@4.17.0
$ minipm outdated
Package Current Wanted Latest
lodash 4.17.0 4.17.21 4.17.21
$ minipm update lodash
Updated lodash 4.17.0 -> 4.17.21
Implementation Hints:
Package resolution algorithm:
class Resolver {
constructor(registry) {
this.registry = registry;
this.resolved = new Map(); // package@version -> metadata
}
async resolve(name, versionConstraint) {
// Get package metadata from registry
const metadata = await this.registry.getPackage(name);
// Find best version matching constraint
const version = this.findBestVersion(metadata.versions, versionConstraint);
const key = `${name}@${version}`;
if (this.resolved.has(key)) {
return this.resolved.get(key);
}
const pkg = metadata.versions[version];
this.resolved.set(key, pkg);
// Recursively resolve dependencies
const deps = pkg.dependencies || {};
for (const [depName, depVersion] of Object.entries(deps)) {
await this.resolve(depName, depVersion);
}
return pkg;
}
findBestVersion(versions, constraint) {
// Match semver constraint: ^4.0.0, ~4.0.0, >=4.0.0, etc.
const available = Object.keys(versions);
return semver.maxSatisfying(available, constraint);
}
}
node_modules layout:
node_modules/
├── express/ # Direct dependency
│ └── package.json
├── accepts/ # Hoisted from express's deps
├── body-parser/ # Hoisted
└── mime-types/ # Hoisted
Hoisting algorithm:
function createLayout(resolved) {
const layout = new Map();
for (const [key, pkg] of resolved) {
// key is "<name>@<version>"; split on the last '@' so scoped
// packages like "@scope/pkg@1.0.0" parse correctly
const at = key.lastIndexOf('@');
const name = key.slice(0, at);
const version = key.slice(at + 1);
// Try to hoist to top level
const existing = layout.get(name);
if (!existing) {
// Hoist to node_modules/<name>
layout.set(name, { version, path: `node_modules/${name}` });
} else if (existing.version !== version) {
// Version conflict - nest under dependent
// node_modules/<dependent>/node_modules/<name>
}
}
return layout;
}
Registry API:
class Registry {
constructor(url = 'https://registry.npmjs.org') {
this.url = url;
this.cache = new Map();
}
async getPackage(name) {
if (this.cache.has(name)) {
return this.cache.get(name);
}
const response = await fetch(`${this.url}/${name}`);
const data = await response.json();
this.cache.set(name, data);
return data;
}
async downloadTarball(url, dest) {
// Download and extract .tgz file
}
}
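For the lockfile milestone, the simplest approach is a flat JSON snapshot of everything the Resolver pinned. The shape below is illustrative, not npm's actual lockfile format; dist.tarball and dist.integrity are real fields in registry metadata:
const fs = require('fs');

function writeLockfile(resolved, filename = 'minipm-lock.json') {
  const lock = { lockfileVersion: 1, packages: {} };
  for (const [key, pkg] of resolved) {
    // key is "<name>@<version>" as produced by the Resolver
    lock.packages[key] = {
      resolved: pkg.dist?.tarball,
      integrity: pkg.dist?.integrity,
      dependencies: pkg.dependencies || {}
    };
  }
  fs.writeFileSync(filename, JSON.stringify(lock, null, 2));
}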
Learning milestones:
- Resolve a single package with deps → You understand dependency trees
- Handle version conflicts → You understand semver and hoisting
- Create proper node_modules → You understand module resolution
- Generate lockfile → You understand reproducible builds
Project 13: Build Tool (Bundler)
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript, Rust
- Coolness Level: Level 5: Pure Magic (Super Cool)
- Business Potential: 5. The “Industry Disruptor”
- Difficulty: Level 4: Expert
- Knowledge Area: AST / Bundling
- Software or Tool: Module Bundler (like esbuild/webpack)
- Main Book: “Compilers: Principles and Practice” by Dave & Dave
What you’ll build: A JavaScript bundler that resolves imports, parses code into an AST, transforms it, and produces a single output file—like a mini esbuild or webpack.
Why it teaches Node.js: Bundlers are where parsing, AST manipulation, and performance optimization come together. You’ll understand how modern JavaScript tools work under the hood.
Core challenges you’ll face:
- Module resolution → maps to finding import targets
- AST parsing → maps to understanding code structure
- Dependency graph → maps to determining bundle order
- Code generation → maps to producing output from AST
Key Concepts:
- AST Parsing: esprima/acorn documentation
- Module Bundling: Webpack concepts documentation
- Code Generation: “Engineering a Compiler” Chapter 11 - Cooper
- Difficulty: Expert
- Time estimate: 1 month
- Prerequisites: Project 12, understanding of ASTs
Real world outcome:
$ minibundle src/index.js --outfile dist/bundle.js
Resolving modules...
src/index.js
├── src/utils.js
├── node_modules/lodash-es/lodash.js
└── src/components/Button.js
└── src/styles.css
Transforming...
- Converted ESM to CommonJS wrapper
- Minified 4 modules
- Processed 1 CSS file
Output:
dist/bundle.js (45.2 KB, 12.1 KB gzipped)
dist/bundle.css (2.3 KB)
Done in 0.8s
// src/index.js
import { debounce } from 'lodash-es';
import { Button } from './components/Button';
import './styles.css';
const button = new Button();
button.onClick(debounce(() => console.log('clicked'), 300));
// dist/bundle.js (simplified output)
(function(modules) {
var cache = {};
function require(id) {
if (cache[id]) return cache[id].exports;
var module = cache[id] = { exports: {} };
modules[id](module, module.exports, require);
return module.exports;
}
require(0);
})({
0: function(module, exports, require) {
var lodash = require(1);
var Button = require(2);
// ...
},
1: function(module, exports, require) {
// lodash debounce function
},
2: function(module, exports, require) {
// Button component
}
});
Implementation Hints:
Bundler pipeline:
┌─────────────┐ ┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ Resolve │───▶│ Parse │───▶│ Transform │───▶│ Generate │
│ Modules │ │ to AST │ │ AST │ │ Code │
└─────────────┘ └─────────────┘ └─────────────┘ └─────────────┘
Module resolution:
const fs = require('fs');
const path = require('path');

function resolveModule(importPath, fromFile) {
  // Relative import: ./utils
  // (extension probing, e.g. trying .js and /index.js, is omitted here)
  if (importPath.startsWith('.')) {
    return path.resolve(path.dirname(fromFile), importPath);
  }
  // Package import: lodash
  // Walk up the directory tree looking for node_modules
  let dir = path.dirname(fromFile);
  while (true) {
    const candidate = path.join(dir, 'node_modules', importPath);
    if (fs.existsSync(candidate)) {
      return resolvePackageMain(candidate);
    }
    const parent = path.dirname(dir);
    if (parent === dir) break; // reached the filesystem root (also on Windows)
    dir = parent;
  }
  throw new Error(`Cannot resolve: ${importPath}`);
}
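And a minimal sketch of the resolvePackageMain helper referenced above, honoring only the legacy "main" field (real resolvers also handle the "exports" map):
function resolvePackageMain(modulePath) {
  const pkgJson = path.join(modulePath, 'package.json');
  if (fs.existsSync(pkgJson)) {
    const pkg = JSON.parse(fs.readFileSync(pkgJson, 'utf-8'));
    return path.join(modulePath, pkg.main || 'index.js');
  }
  return path.join(modulePath, 'index.js'); // fallback when no package.json
}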
AST-based import extraction:
const acorn = require('acorn');
function extractImports(code) {
const ast = acorn.parse(code, { sourceType: 'module', ecmaVersion: 2022 });
const imports = [];
for (const node of ast.body) {
if (node.type === 'ImportDeclaration') {
imports.push({
source: node.source.value,
specifiers: node.specifiers.map(s => ({
imported: s.imported?.name || 'default',
local: s.local.name
}))
});
}
}
return imports;
}
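Tying resolution and parsing together into a dependency graph; a sketch assuming fs plus the resolveModule and extractImports helpers above, producing roughly the module array generateBundle below consumes:
function buildGraph(entryFile) {
  const modules = [];
  const seen = new Map(); // absolute path → numeric module id
  function visit(file) {
    if (seen.has(file)) return seen.get(file);
    const id = modules.length;
    seen.set(file, id);
    modules.push(null); // reserve the slot before recursing
    const code = fs.readFileSync(file, 'utf-8');
    const deps = {};
    for (const imp of extractImports(code)) {
      deps[imp.source] = visit(resolveModule(imp.source, file));
    }
    modules[id] = { file, code, deps };
    return id;
  }
  visit(entryFile); // the entry module always gets id 0
  return modules;
}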
Bundle generation:
function generateBundle(modules) {
const moduleCode = modules.map((mod, id) => {
return `${id}: function(module, exports, require) {\n${mod.code}\n}`;
}).join(',\n');
return `
(function(modules) {
var cache = {};
function require(id) {
if (cache[id]) return cache[id].exports;
var module = cache[id] = { exports: {} };
modules[id](module, module.exports, require);
return module.exports;
}
require(0);
})({
${moduleCode}
});
`;
}
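The Transform box in the pipeline above (ESM → CommonJS) isn't shown in the hints. Here is one hedged approach that rewrites each import textually using the start/end offsets acorn records on every node; it glosses over namespace imports and live bindings, and assumes idFor maps an import specifier to the numeric module id from the dependency graph:
function transformModule(code, ast, idFor) {
  // Walk import declarations in reverse source order so earlier
  // string offsets stay valid while we splice
  let out = code;
  for (const node of [...ast.body].reverse()) {
    if (node.type !== 'ImportDeclaration') continue;
    const names = node.specifiers.map(s =>
      s.type === 'ImportSpecifier'
        ? `${s.imported.name}: ${s.local.name}`
        : `default: ${s.local.name}` // assumes defaults live on exports.default
    );
    const replacement =
      `const { ${names.join(', ')} } = require(${idFor(node.source.value)});`;
    out = out.slice(0, node.start) + replacement + out.slice(node.end);
  }
  return out;
}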
Learning milestones:
- Imports are resolved correctly → You understand module resolution
- AST is parsed → You understand code structure
- Single bundle is generated → You understand code generation
- Dead code is eliminated → You understand tree shaking basics
Project 14: Test Runner
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript
- Coolness Level: Level 3: Genuinely Clever
- Business Potential: 3. The “Service & Support” Model
- Difficulty: Level 3: Advanced
- Knowledge Area: Testing / Process Management
- Software or Tool: Test Runner (like Jest/Vitest)
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A test runner with test discovery, assertions, before/after hooks, parallel execution, watch mode, and coverage reporting—like a mini Jest.
Why it teaches Node.js: Test runners combine many Node.js concepts: file system traversal, child processes for isolation, async coordination, and the vm module for sandboxing.
Core challenges you’ll face:
- Test discovery → maps to glob patterns and file system
- Assertion library → maps to comparison and error formatting
- Test isolation → maps to vm module or child processes
- Parallel execution → maps to worker threads or process pool
Key Concepts:
- VM Module: Node.js Documentation - vm
- Worker Threads: Node.js Documentation - Worker Threads
- Code Coverage: v8-coverage and c8 documentation
Difficulty: Advanced Time estimate: 2-3 weeks Prerequisites: Projects 2, 7
Real world outcome:
// tests/math.test.js
import { describe, it, expect, beforeEach } from 'minitest';
describe('Math operations', () => {
let calculator;
beforeEach(() => {
calculator = new Calculator();
});
it('should add numbers', () => {
expect(calculator.add(2, 3)).toBe(5);
});
it('should throw on division by zero', () => {
expect(() => calculator.divide(10, 0)).toThrow('Division by zero');
});
it('should handle async operations', async () => {
const result = await calculator.fetchAndAdd('https://api.example.com/number');
expect(result).toBeGreaterThan(0);
});
});
$ minitest
PASS tests/math.test.js
Math operations
✓ should add numbers (2ms)
✓ should throw on division by zero (1ms)
✓ should handle async operations (45ms)
PASS tests/string.test.js
String operations
✓ should concatenate (1ms)
✓ should trim whitespace (1ms)
FAIL tests/api.test.js
API tests
✓ should fetch users (120ms)
✗ should handle errors (15ms)
Expected: 404
Received: 500
at tests/api.test.js:25:14
Test Suites: 2 passed, 1 failed, 3 total
Tests: 6 passed, 1 failed, 7 total
Time: 0.34s
$ minitest --watch
Watching for changes...
[Enter] to run all, [p] to filter by pattern
$ minitest --coverage
...
Coverage:
Statements: 85.2%
Branches: 72.1%
Functions: 90.0%
Lines: 85.2%
Implementation Hints:
Test runner architecture:
const fs = require('fs');
const vm = require('vm');
const { glob } = require('glob'); // npm package, for test discovery

class TestRunner {
constructor() {
this.suites = [];
this.results = [];
}
async discover(patterns) {
const files = await glob(patterns);
for (const file of files) {
await this.loadTestFile(file);
}
}
async loadTestFile(file) {
  // Run the test file in an isolated context via the vm module.
  // Caveat: runInNewContext only accepts classic scripts, so `import`
  // syntax must be transpiled to CJS first (or use vm.SourceTextModule
  // behind --experimental-vm-modules).
  const code = fs.readFileSync(file, 'utf-8');
  const context = this.createTestContext(file);
  vm.runInNewContext(code, context, { filename: file });
}
createTestContext(file) {
const suite = { file, tests: [], beforeEach: [], afterEach: [] };
this.suites.push(suite);
return {
describe: (name, fn) => {
suite.name = name;
fn();
},
it: (name, fn) => {
suite.tests.push({ name, fn });
},
beforeEach: (fn) => suite.beforeEach.push(fn),
afterEach: (fn) => suite.afterEach.push(fn),
expect: createExpect(),
// ... other globals
};
}
async run() {
for (const suite of this.suites) {
await this.runSuite(suite);
}
this.report();
}
async runSuite(suite) {
for (const test of suite.tests) {
try {
for (const before of suite.beforeEach) await before();
await test.fn();
for (const after of suite.afterEach) await after();
this.results.push({ suite, test, passed: true });
} catch (err) {
this.results.push({ suite, test, passed: false, error: err });
}
}
}
}
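For the parallel-execution challenge, a minimal sketch using worker threads; runner-worker.js is a hypothetical script that loads one test file, runs it, and posts its results back:
const { Worker } = require('worker_threads');
const os = require('os');

async function runInParallel(files, concurrency = os.cpus().length) {
  const queue = [...files];
  const results = [];
  // N consumers pulling from a shared queue caps concurrency at N
  await Promise.all(Array.from({ length: concurrency }, async () => {
    while (queue.length > 0) {
      const file = queue.shift();
      results.push(await runFileInWorker(file));
    }
  }));
  return results;
}

function runFileInWorker(file) {
  return new Promise((resolve, reject) => {
    const worker = new Worker('./runner-worker.js', { workerData: { file } });
    worker.once('message', resolve); // worker posts its suite results back
    worker.once('error', reject);
  });
}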
Assertion library:
const { AssertionError } = require('assert');
const { isDeepStrictEqual: deepEqual } = require('util');

function createExpect() {
  return function expect(actual) {
    return {
      toBe(expected) {
        if (actual !== expected) {
          throw new AssertionError({
            message: `Expected: ${expected}\nReceived: ${actual}`,
            expected,
            actual
          });
        }
      },
      toEqual(expected) {
        if (!deepEqual(actual, expected)) {
          throw new AssertionError({
            message: 'Values are not deeply equal',
            expected,
            actual
          });
        }
      },
      toThrow(message) {
        let threw = false;
        try {
          actual();
        } catch (err) {
          threw = true;
          if (message && !err.message.includes(message)) {
            throw new AssertionError({
              message: `Expected error containing "${message}", got "${err.message}"`
            });
          }
        }
        if (!threw) {
          throw new AssertionError({ message: 'Expected function to throw' });
        }
      }
    };
  };
}
Learning milestones:
- Tests are discovered → You understand glob patterns
- Tests run in isolation → You understand vm module
- Assertions work → You understand comparison logic
- Parallel execution works → You understand worker coordination
Project 15: Process Manager (mini PM2)
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript (Node.js)
- Alternative Programming Languages: TypeScript, Rust
- Coolness Level: Level 5: Pure Magic (Super Cool)
- Business Potential: 4. The “Open Core” Infrastructure
- Difficulty: Level 4: Expert
- Knowledge Area: Process Management / Clustering
- Software or Tool: Process Manager
- Main Book: “Node.js Design Patterns” by Mario Casciaro
What you’ll build: A process manager that can start, stop, restart, and monitor Node.js applications—with clustering, log management, and graceful restarts.
Why it teaches Node.js: Process managers demonstrate mastery of child processes, the cluster module, IPC, and system-level Node.js programming. This is how you run Node.js in production.
Core challenges you’ll face:
- Child process management → maps to spawn, exec, fork
- Clustering → maps to utilizing multiple CPU cores
- Graceful shutdown → maps to signal handling
- Daemon mode → maps to detaching from terminal
Key Concepts:
- Child Processes: Node.js Documentation - Child Process
- Cluster Module: Node.js Documentation - Cluster
- Signal Handling: “Linux System Programming” Chapter 10 - Love
Difficulty: Expert Time estimate: 3-4 weeks Prerequisites: Projects 7, 11
Real world outcome:
$ minipm start app.js --name my-app --instances 4
[PM] Starting my-app with 4 instances...
[PM] my-app:0 online (pid: 12340)
[PM] my-app:1 online (pid: 12341)
[PM] my-app:2 online (pid: 12342)
[PM] my-app:3 online (pid: 12343)
[PM] my-app started successfully
$ minipm list
┌────────┬────┬─────────┬────────┬─────────┬──────────┐
│ Name │ ID │ Status │ CPU │ Memory │ Restarts │
├────────┼────┼─────────┼────────┼─────────┼──────────┤
│ my-app │ 0 │ online │ 0.5% │ 45 MB │ 0 │
│ my-app │ 1 │ online │ 0.3% │ 43 MB │ 0 │
│ my-app │ 2 │ online │ 0.4% │ 44 MB │ 0 │
│ my-app │ 3 │ online │ 0.2% │ 42 MB │ 0 │
└────────┴────┴─────────┴────────┴─────────┴──────────┘
$ minipm logs my-app
[my-app:0] Server listening on port 3000
[my-app:1] Server listening on port 3000
[my-app:2] Received request: GET /users
[my-app:3] Received request: GET /posts
$ minipm restart my-app --graceful
[PM] Gracefully restarting my-app...
[PM] Sending SIGTERM to my-app:0
[PM] my-app:0 exited gracefully
[PM] my-app:0 restarted (pid: 12350)
[PM] Sending SIGTERM to my-app:1
...
[PM] Restart complete
$ minipm stop my-app
[PM] Stopping my-app...
[PM] All instances stopped
$ minipm monit
┌─────────────────────────────────────────────────────────┐
│ my-app 4 instances │
├─────────────────────────────────────────────────────────┤
│ CPU ████████░░░░░░░░░░░░ 38% │
│ MEM ███░░░░░░░░░░░░░░░░░ 175 MB / 8 GB │
│ REQ ████████████████░░░░ 1,234 req/s │
└─────────────────────────────────────────────────────────┘
Implementation Hints:
Process manager architecture:
┌─────────────────────────────────────────────────────────────┐
│ minipm daemon │
│ │
│ ┌─────────────┐ │
│ │ Master │ (Manages all applications) │
│ └──────┬──────┘ │
│ │ │
│ ┌──────▼──────┐ │
│ │ App │ (One per application) │
│ │ Manager │ │
│ └──────┬──────┘ │
│ │ │
│ ┌──────┼──────┬──────┐ │
│ ▼ ▼ ▼ ▼ │
│ ┌───┐ ┌───┐ ┌───┐ ┌───┐ │
│ │W 0│ │W 1│ │W 2│ │W 3│ (Worker processes) │
│ └───┘ └───┘ └───┘ └───┘ │
│ │
└─────────────────────────────────────────────────────────────┘
│ ▲
│ IPC │ Logs
▼ │
┌─────────┐ ┌──────┴──────┐
│ minipm │ │ Log files │
│ CLI │ │ │
└─────────┘ └─────────────┘
Using cluster module:
const cluster = require('cluster');
const os = require('os');
class AppManager {
constructor(script, options = {}) {
  this.script = script;
  this.args = options.args || []; // forwarded to the worker processes
  this.instances = options.instances || os.cpus().length;
  this.workers = new Map();
}
start() {
  // setupMaster was renamed setupPrimary in Node 16; both names work
  cluster.setupMaster({
    exec: this.script,
    args: this.args
  });
for (let i = 0; i < this.instances; i++) {
this.forkWorker(i);
}
// Auto-restart on crash; a real manager must also distinguish
// intentional exits (stop / graceful restart) from crashes here
cluster.on('exit', (worker, code, signal) => {
  const id = this.workers.get(worker.process.pid);
  console.log(`Worker ${id} died. Restarting...`);
  this.forkWorker(id);
});
}
forkWorker(id) {
const worker = cluster.fork({ INSTANCE_ID: id });
this.workers.set(worker.process.pid, id);
worker.on('message', (msg) => {
this.handleWorkerMessage(id, msg);
});
return worker;
}
gracefulRestart() {
  // Restart workers one at a time. Note: cluster.workers is a plain
  // object keyed by worker id, not an array, so spread alone won't work
  const workers = Object.values(cluster.workers);
  let index = 0;
  const restartNext = () => {
    if (index >= workers.length) return;
    const worker = workers[index++];
    worker.send('shutdown');
    // Force-kill if the worker hasn't drained its requests in time
    const timer = setTimeout(() => worker.kill('SIGTERM'), 5000);
    worker.once('exit', () => {
      clearTimeout(timer);
      this.forkWorker(index - 1);
      restartNext();
    });
  };
  restartNext();
}
}
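The daemon-mode challenge boils down to a detached spawn with redirected stdio; a minimal sketch:
const fs = require('fs');
const { spawn } = require('child_process');

function daemonize(script, outLog, errLog) {
  const out = fs.openSync(outLog, 'a');
  const err = fs.openSync(errLog, 'a');
  const child = spawn(process.execPath, [script], {
    detached: true,             // new process group, survives terminal close
    stdio: ['ignore', out, err] // redirect output to log files
  });
  child.unref(); // let the CLI process exit without waiting on the child
  return child.pid;
}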
Worker process (in your app):
// Graceful shutdown handling
process.on('message', (msg) => {
if (msg === 'shutdown') {
// Stop accepting new connections
server.close(() => {
// Finish existing requests
process.exit(0);
});
}
});
process.on('SIGTERM', () => {
  // Same graceful shutdown path as the 'shutdown' message
  server.close(() => process.exit(0));
});
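One way to feed the minipm monit view: each worker self-reports usage over IPC. A sketch using only built-in process introspection (the 'stats' message shape is made up here):
if (process.send) { // only present when forked with an IPC channel
  let lastCpu = process.cpuUsage();
  setInterval(() => {
    const delta = process.cpuUsage(lastCpu); // µs of CPU since last sample
    lastCpu = process.cpuUsage();
    process.send({
      type: 'stats',
      rss: process.memoryUsage().rss,                       // resident memory, bytes
      cpuPercent: ((delta.user + delta.system) / 1e6) * 100 // % of one core over 1s
    });
  }, 1000).unref(); // don't keep the worker alive just for the heartbeat
}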
Learning milestones:
- Processes spawn and manage → You understand child_process
- Clustering works → You understand multi-core utilization
- Graceful restart works → You understand signal handling
- Monitoring shows stats → You understand process introspection
Project 16: Native Addon (N-API)
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: C/C++
- Alternative Programming Languages: Rust (with neon)
- Coolness Level: Level 5: Pure Magic (Super Cool)
- Business Potential: 4. The “Open Core” Infrastructure
- Difficulty: Level 5: Master
- Knowledge Area: Native Extensions / V8 Integration
- Software or Tool: Native Addon
- Main Book: “Node.js API (N-API)” Documentation
What you’ll build: A native addon in C++ that exposes high-performance functionality to JavaScript—like a faster image processor or crypto function.
Why it teaches Node.js: Sometimes JavaScript isn’t fast enough. Native addons let you drop to C++ for performance-critical code. You’ll understand how Node.js integrates with V8 and the costs of crossing the JS/C++ boundary.
Core challenges you’ll face:
- N-API bindings → maps to stable native interface
- Memory management → maps to preventing leaks across boundary
- Async operations → maps to libuv thread pool integration
- Type conversion → maps to JS values to C++ and back
Key Concepts:
- N-API: Node.js Documentation - N-API
- node-addon-api: C++ wrapper for N-API
- libuv Integration: libuv documentation
Difficulty: Master Time estimate: 3-4 weeks Prerequisites: C/C++ knowledge, understanding of Node.js internals
Real world outcome:
// Using your native addon
const addon = require('./build/Release/image_processor.node');
// Synchronous (blocks event loop - use sparingly)
const grayscale = addon.toGrayscale(imageBuffer);
// Asynchronous (uses libuv thread pool)
addon.resize(imageBuffer, 800, 600, (err, resized) => {
if (err) throw err;
fs.writeFileSync('resized.jpg', resized);
});
// Promise-based async
const blurred = await addon.blur(imageBuffer, 5);
// Performance comparison
console.time('native');
for (let i = 0; i < 1000; i++) {
addon.hash(data); // Native SHA256
}
console.timeEnd('native');
// native: 15ms
console.time('js');
for (let i = 0; i < 1000; i++) {
crypto.createHash('sha256').update(data).digest();
}
console.timeEnd('js');
// js: 250ms
Implementation Hints:
Project structure:
native-addon/
├── binding.gyp # Build configuration
├── package.json
├── src/
│ ├── addon.cc # Addon entry point
│ ├── image_processor.cc
│ └── image_processor.h
├── lib/
│ └── index.js # JavaScript wrapper
└── build/ # Compiled output
binding.gyp:
{
"targets": [
{
"target_name": "image_processor",
"sources": ["src/addon.cc", "src/image_processor.cc"],
"include_dirs": [
"<!@(node -p \"require('node-addon-api').include\")"
],
"defines": ["NAPI_DISABLE_CPP_EXCEPTIONS"]
}
]
}
Basic N-API addon (using node-addon-api wrapper):
// src/addon.cc
#include <napi.h>
// Synchronous function
Napi::Value ToGrayscale(const Napi::CallbackInfo& info) {
Napi::Env env = info.Env();
// Get buffer argument
Napi::Buffer<uint8_t> buffer = info[0].As<Napi::Buffer<uint8_t>>();
uint8_t* data = buffer.Data();
size_t length = buffer.Length();
// Process (example: grayscale conversion, assuming packed 24-bit RGB)
// `i + 2 < length` guards against reading past the end of the buffer
for (size_t i = 0; i + 2 < length; i += 3) {
  uint8_t gray = (data[i] + data[i+1] + data[i+2]) / 3;
  data[i] = data[i+1] = data[i+2] = gray;
}
// Return same buffer (modified in place)
return buffer;
}
// Async function using AsyncWorker
class ResizeWorker : public Napi::AsyncWorker {
public:
// NOTE: a production addon should hold a Napi::Reference to `buffer` so
// the GC cannot reclaim it while Execute() runs on the thread pool
ResizeWorker(Napi::Buffer<uint8_t>& buffer, int width, int height,
             Napi::Function& callback)
  : Napi::AsyncWorker(callback), inputData(buffer.Data()),
    inputLength(buffer.Length()), width(width), height(height) {}
// Runs on libuv thread pool (not main thread!)
void Execute() override {
// CPU-intensive work here
outputData = resize(inputData, inputLength, width, height, &outputLength);
}
// Runs on main thread after Execute completes
void OnOK() override {
Napi::HandleScope scope(Env());
Callback().Call({
Env().Null(),
Napi::Buffer<uint8_t>::Copy(Env(), outputData, outputLength)
});
free(outputData);
}
private:
uint8_t* inputData;
size_t inputLength;
int width, height;
uint8_t* outputData;
size_t outputLength;
};
Napi::Value Resize(const Napi::CallbackInfo& info) {
Napi::Buffer<uint8_t> buffer = info[0].As<Napi::Buffer<uint8_t>>();
int width = info[1].As<Napi::Number>().Int32Value();
int height = info[2].As<Napi::Number>().Int32Value();
Napi::Function callback = info[3].As<Napi::Function>();
ResizeWorker* worker = new ResizeWorker(buffer, width, height, callback);
worker->Queue();
return info.Env().Undefined();
}
// Module initialization
Napi::Object Init(Napi::Env env, Napi::Object exports) {
exports.Set("toGrayscale", Napi::Function::New(env, ToGrayscale));
exports.Set("resize", Napi::Function::New(env, Resize));
return exports;
}
NODE_API_MODULE(image_processor, Init)
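The lib/index.js wrapper from the project layout above can promisify the callback-based native export; a minimal sketch:
// lib/index.js — thin JavaScript wrapper over the compiled addon
const { promisify } = require('util');
const addon = require('../build/Release/image_processor.node');

module.exports = {
  toGrayscale: addon.toGrayscale,
  // resize(buffer, width, height, cb) → resize(buffer, width, height): Promise
  resize: promisify(addon.resize)
};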
Learning milestones:
- Simple sync function works → You understand N-API basics
- Async function with thread pool works → You understand AsyncWorker
- Memory is managed correctly → You understand buffer handling
- Performance beats pure JS → You understand when native makes sense
Project Comparison Table
| Project | Difficulty | Time | Depth of Understanding | Fun Factor | Business Value |
|---|---|---|---|---|---|
| 1. CLI Task Manager | Beginner | Weekend | ⭐⭐ | ⭐⭐ | Resume Gold |
| 2. File Watcher | Beginner | Weekend | ⭐⭐⭐ | ⭐⭐⭐ | Micro-SaaS |
| 3. HTTP Server | Intermediate | 1 week | ⭐⭐⭐⭐ | ⭐⭐⭐ | Resume Gold |
| 4. Static File Server | Intermediate | 1 week | ⭐⭐⭐ | ⭐⭐ | Micro-SaaS |
| 5. Mini Web Framework | Intermediate | 2 weeks | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Open Core |
| 6. Real-time Chat | Intermediate | 1-2 weeks | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Service |
| 7. Job Queue | Advanced | 2-3 weeks | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Open Core |
| 8. Connection Pool | Advanced | 1-2 weeks | ⭐⭐⭐⭐ | ⭐⭐⭐ | Service |
| 9. Stream Processor | Advanced | 2 weeks | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Open Core |
| 10. HTTP Client | Intermediate | 1-2 weeks | ⭐⭐⭐ | ⭐⭐⭐ | Service |
| 11. Logger Library | Intermediate | 1 week | ⭐⭐⭐ | ⭐⭐ | Service |
| 12. Package Manager | Expert | 1 month | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Disruptor |
| 13. Build Tool | Expert | 1 month | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Disruptor |
| 14. Test Runner | Advanced | 2-3 weeks | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Service |
| 15. Process Manager | Expert | 3-4 weeks | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Open Core |
| 16. Native Addon | Master | 3-4 weeks | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Open Core |
Recommended Learning Paths
For Beginners (New to Node.js)
Start with: Project 1 → Project 2 → Project 3 → Project 4
This path teaches you:
- Basic Node.js: modules, file I/O, CLI
- EventEmitters and async patterns
- HTTP fundamentals without frameworks
- Streams for efficient I/O
Time: 3-4 weeks
For Web Developers (Know Express, Want Depth)
Start with: Project 5 → Project 6 → Project 7 → Project 9
This path teaches you:
- How Express actually works
- Real-time with WebSockets
- Background job processing
- Stream-based data processing
Time: 6-8 weeks
For Backend Engineers (Production Systems)
Start with: Project 7 → Project 8 → Project 14 → Project 15
This path teaches you:
- Job queues for async processing
- Resource pooling and management
- Testing infrastructure
- Production process management
Time: 8-10 weeks
For Systems Programmers (Deep Internals)
Start with: Project 12 → Project 13 → Project 15 → Project 16
This path teaches you:
- Module resolution and npm internals
- AST parsing and code generation
- Process management and clustering
- Native C++ integration
Time: 3-4 months
Final Capstone Project: Full Node.js Platform
- File: LEARN_NODEJS_DEEP_DIVE.md
- Main Programming Language: JavaScript/TypeScript (Node.js)
- Alternative Programming Languages: None (pure Node.js mastery)
- Coolness Level: Level 5: Pure Magic (Super Cool)
- Business Potential: 5. The “Industry Disruptor”
- Difficulty: Level 5: Master
- Knowledge Area: Full Node.js Stack
- Software or Tool: Complete Development Platform
- Main Book: All previous resources combined
What you’ll build: A complete Node.js development platform combining everything you’ve learned—a web framework, CLI tools, process management, job queues, and development tooling (bundler, test runner).
Why it teaches Node.js: This is the ultimate test of Node.js mastery. You’ll combine all the concepts into a cohesive platform that could power real applications.
Components to integrate:
- Web framework (from Project 5)
- WebSocket support (from Project 6)
- Job queue (from Project 7)
- Stream processing (from Project 9)
- HTTP client (from Project 10)
- Logging (from Project 11)
- Build tool (from Project 13)
- Test runner (from Project 14)
- Process manager (from Project 15)
Real world outcome:
# Development
$ myplatform dev
Starting development server...
✓ Built in 0.3s
✓ Server running at http://localhost:3000
✓ Watching for changes...
# Testing
$ myplatform test
Running 47 tests...
✓ All tests passed (2.1s)
# Production
$ myplatform build
Building for production...
✓ Bundled 127 modules
✓ Minified (234KB → 67KB gzipped)
$ myplatform start --cluster 4
Starting 4 workers...
✓ Worker 0 ready (pid 12340)
✓ Worker 1 ready (pid 12341)
✓ Worker 2 ready (pid 12342)
✓ Worker 3 ready (pid 12343)
$ myplatform status
┌─────────┬────────┬────────┬─────────┐
│ Worker │ Status │ CPU │ Memory │
├─────────┼────────┼────────┼─────────┤
│ 0 │ online │ 2.3% │ 78 MB │
│ 1 │ online │ 1.8% │ 75 MB │
│ 2 │ online │ 2.1% │ 76 MB │
│ 3 │ online │ 1.9% │ 74 MB │
└─────────┴────────┴────────┴─────────┘
Jobs: 45 queued, 5 active, 1,234 completed
Time estimate: 2-3 months Prerequisites: Complete at least 12 of the earlier projects
Learning milestones:
- All components work independently → You’ve mastered each concept
- Components integrate seamlessly → You understand system design
- Platform is developer-friendly → You understand DX
- Platform handles production load → You understand performance
Summary
| # | Project | Main Language |
|---|---|---|
| 1 | CLI Task Manager | JavaScript (Node.js) |
| 2 | File Watcher | JavaScript (Node.js) |
| 3 | HTTP Server from Scratch | JavaScript (Node.js) |
| 4 | Static File Server | JavaScript (Node.js) |
| 5 | Mini Web Framework | JavaScript (Node.js) |
| 6 | Real-time Chat Server | JavaScript (Node.js) |
| 7 | Job Queue System | JavaScript (Node.js) |
| 8 | Database Connection Pool | JavaScript (Node.js) |
| 9 | Stream Data Processor | JavaScript (Node.js) |
| 10 | HTTP Client Library | JavaScript (Node.js) |
| 11 | Logger Library | JavaScript (Node.js) |
| 12 | Package Manager (mini npm) | JavaScript (Node.js) |
| 13 | Build Tool (Bundler) | JavaScript (Node.js) |
| 14 | Test Runner | JavaScript (Node.js) |
| 15 | Process Manager (mini PM2) | JavaScript (Node.js) |
| 16 | Native Addon (N-API) | C/C++ |
| Capstone | Full Node.js Platform | JavaScript/TypeScript |
Additional Resources
Books
- “Node.js Design Patterns” by Mario Casciaro & Luciano Mammino
- “Node.js in Action” by Mike Cantelon et al.
- “Distributed Systems with Node.js” by Thomas Hunter II
- “Node Cookbook” by Bethany Griggs
Documentation
- Node.js Official Documentation (nodejs.org/docs)
- libuv documentation (docs.libuv.org)
- V8 Developer Guide
Source Code to Study
- Express: Simple, readable codebase
- Koa: Modern async/await patterns
- Fastify: Performance-optimized patterns
- Pino: High-performance logging
- node-postgres: Connection pooling patterns
- Bull: Job queue implementation
Videos/Talks
- “The Node.js Event Loop” by Bert Belder
- “Streams Deep Dive” by Matteo Collina
- “Understanding the Node.js Event Loop” by Daniel Khan
Node.js isn’t just JavaScript on the server—it’s a different way of thinking about I/O. Master the event loop, embrace streams, and you’ll build systems that handle thousands of concurrent connections with elegance.