Streams & Buffers
Streams let you process data piece by piece instead of loading everything into memory. Buffers hold binary data. Together they enable efficient handling of files, network requests, and real-time data.
Interactive Visualization

const fs = require('fs');

const stream = fs.createReadStream('file.txt');

stream.on('data', (chunk) => {
  console.log('Chunk:', chunk.length);
});

stream.on('end', () => {
  console.log('Done!');
});

[Stream Pipeline animation, step 1 of 6: file.txt (readable) emits chunk1, chunk2, chunk3; file stream created but not yet flowing]
Key Insight: Streams process data in chunks (16 KB by default; file streams use 64 KB). Memory efficient: the entire file is never loaded!
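Since chunk size matters here, a minimal sketch of tuning it via the highWaterMark option ('file.txt' is a placeholder name):

const fs = require('fs');

// Ask for 1 KB chunks instead of the 64 KB default for file streams
const stream = fs.createReadStream('file.txt', { highWaterMark: 1024 });

stream.on('data', (chunk) => {
  console.log(`Got ${chunk.length} bytes`); // at most 1024 per chunk
});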
Key Points
- Streams process data in chunks (memory efficient)
- Four types: Readable, Writable, Duplex, Transform (see the Duplex sketch after this list)
- Buffers are fixed-size chunks of binary data
- Backpressure: slow consumer signals fast producer to pause
- pipe() handles backpressure automatically
- highWaterMark controls the internal buffer size (16 KB default; 64 KB for file streams)
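Readable, Writable, and Transform streams all appear in the examples below; Duplex does not, so here is a rough illustrative sketch (not from the original): a Duplex stream implements independent readable and writable sides, shown here as an in-memory echo.

const { Duplex } = require("stream");

// Minimal Duplex sketch: the writable side pushes chunks
// straight into the readable side (an in-memory echo)
const echo = new Duplex({
  write(chunk, encoding, callback) {
    this.push(chunk); // make written data readable
    callback();
  },
  read(size) {
    // Data arrives via write(); nothing to pull here
  },
  final(callback) {
    this.push(null); // end the readable side when writing ends
    callback();
  }
});

echo.on("data", (chunk) => console.log(chunk.toString()));
echo.write("hello"); // prints "hello"
echo.end();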
Code Examples
Reading a File Stream
const fs = require("fs");

// BAD: Loads entire file into memory
const data = fs.readFileSync("huge.txt");

// GOOD: Stream processes chunks
const stream = fs.createReadStream("huge.txt");

stream.on("data", (chunk) => {
  console.log(`Received ${chunk.length} bytes`);
});

stream.on("end", () => {
  console.log("Done reading");
});
Streams process large files without loading into memory
Piping Streams
const fs = require("fs");
const zlib = require("zlib");

// Read → Compress → Write
fs.createReadStream("input.txt")
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream("output.gz"));

// pipe() handles backpressure!
// If write is slow, read pauses
pipe() chains streams with automatic backpressure
Transform Stream
const { Transform } = require("stream");

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Modify data as it flows through
    const upper = chunk.toString().toUpperCase();
    callback(null, upper);
  }
});

process.stdin
  .pipe(upperCase)
  .pipe(process.stdout);

// Type lowercase → outputs UPPERCASE
Transform streams modify data in flight
Buffer Basics
// Create buffer from string
const buf1 = Buffer.from("Hello");
console.log(buf1); // <Buffer 48 65 6c 6c 6f>

// Create empty buffer
const buf2 = Buffer.alloc(10);

// Buffer operations
buf1.toString(); // "Hello"
buf1.length;     // 5
buf1[0];         // 72 (ASCII 'H')

// Concatenate buffers
const combined = Buffer.concat([buf1, buf2]);
Buffers hold raw binary data
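A related gotcha (called out under Common Mistakes below): Buffer.alloc() zero-fills the memory, while Buffer.allocUnsafe() skips the zeroing. A quick sketch of the difference:

// alloc() zero-fills: the safe default
const safe = Buffer.alloc(8);
console.log(safe); // <Buffer 00 00 00 00 00 00 00 00>

// allocUnsafe() is faster but may contain leftover
// memory contents until you overwrite it
const unsafe = Buffer.allocUnsafe(8);
unsafe.fill(0); // only safe to read after filling/writing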
Backpressure Handling
const readable = getReadableStream();
const writable = getWritableStream();

readable.on("data", (chunk) => {
  // write() returns false if buffer full
  const ok = writable.write(chunk);
  if (!ok) {
    // Pause reading until drain
    readable.pause();
    writable.once("drain", () => {
      readable.resume();
    });
  }
});

// Or just use pipe() which does this!
Manual backpressure handling (or use pipe)
HTTP Streaming
const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
  // Stream file directly to response
  const stream = fs.createReadStream("video.mp4");
  res.writeHead(200, { "Content-Type": "video/mp4" });
  stream.pipe(res);
  // Video starts playing immediately!
  // No need to load entire file first
}).listen(3000);
Stream large files in HTTP responses
Common Mistakes
- Loading entire files into memory instead of streaming
- Ignoring backpressure (causes memory issues)
- Not handling stream errors (crashes the server; see the pipeline() sketch after this list)
- Confusing Buffer.alloc() (zero-filled) with Buffer.allocUnsafe() (uninitialized memory)
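For the error-handling mistake, one common remedy is stream.pipeline(), which chains streams like pipe() but forwards an error from any stage to a single callback and destroys all the streams on failure. A sketch, reusing the gzip example from above:

const fs = require("fs");
const zlib = require("zlib");
const { pipeline } = require("stream");

// pipeline() reports the first error from any stage;
// bare pipe() would not propagate it down the chain
pipeline(
  fs.createReadStream("input.txt"),
  zlib.createGzip(),
  fs.createWriteStream("output.gz"),
  (err) => {
    if (err) {
      console.error("Pipeline failed:", err);
    } else {
      console.log("Pipeline succeeded");
    }
  }
);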
Interview Tips
- Know the four stream types and when to use each
- Explain backpressure and how pipe() handles it
- Understand why streams are memory-efficient
- Be able to implement a custom Transform stream