Lab 12: Streams & Buffers

Time: 30 minutes | Level: Practitioner | Docker: docker run -it --rm node:20-alpine sh

Overview

Master Node.js streams: Readable, Writable, Transform, and Duplex streams; piping; backpressure; stream.pipeline; and Buffer operations (alloc/from/concat/copy/subarray).


Step 1: Readable Streams

const { Readable } = require('node:stream');

// Create from iterable
const fromArray = Readable.from([1, 2, 3, 4, 5]);
fromArray.on('data', chunk => process.stdout.write(String(chunk) + ' '));
fromArray.on('end', () => console.log('(end)'));

// Create custom Readable
class NumberStream extends Readable {
  constructor(start, end, options) {
    super({ ...options, objectMode: true });
    this.current = start;
    this.end = end;
  }
  _read() {
    if (this.current <= this.end) {
      this.push(this.current++);
    } else {
      this.push(null); // Signal end
    }
  }
}

const nums = new NumberStream(1, 5);
(async () => {
  for await (const n of nums) {
    process.stdout.write(n + ' ');
  }
  console.log();
})();

// Consuming with async iteration (recommended modern approach)
async function readAll(stream) {
  const chunks = [];
  for await (const chunk of stream) chunks.push(chunk);
  return chunks;
}

Step 2: Writable Streams

💡 Always handle backpressure! If write() returns false, wait for the 'drain' event before writing more.


Step 3: Transform Streams


Step 4: Duplex Streams


Step 5: Pipeline


Step 6: Buffer Deep Dive
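
A sketch covering the core Buffer operations from the overview. The key subtlety: subarray returns a view over the same memory, so mutating the view mutates the parent, while copy writes bytes into a separate buffer.

```javascript
// Allocation and creation
const zeros = Buffer.alloc(4);                 // 4 zero-filled bytes
const hi = Buffer.from('hi');                  // from a UTF-8 string
const joined = Buffer.concat([hi, Buffer.from('!!')]);
console.log(joined.toString());                // hi!!

// subarray shares memory with the parent buffer
const view = joined.subarray(0, 2);
view[0] = 0x48; // 'H' — visible through the parent
console.log(joined.toString());                // Hi!!

// copy writes bytes into an existing, separate buffer
const target = Buffer.alloc(2);
hi.copy(target);
console.log(target.toString());                // hi

// Encoding
console.log(hi.toString('base64'));            // aGk=
```

Prefer subarray over the deprecated buf.slice; both share memory, but slice's name misleadingly suggests a copy.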


Step 7: Backpressure and highWaterMark


Step 8: Capstone — Stream Pipeline

Run verification:

📸 Verified Output:


Summary

| Stream Type | Direction          | Use Case                                  |
| ----------- | ------------------ | ----------------------------------------- |
| Readable    | Source (outward)   | File read, HTTP response, data generation |
| Writable    | Sink (inward)      | File write, HTTP request, data consumption |
| Transform   | Both               | Compression, encryption, parsing          |
| Duplex      | Both (independent) | TCP sockets, crypto streams               |
| PassThrough | Both (transparent) | Monitoring, spy streams                   |

| Buffer Method           | Description             |
| ----------------------- | ----------------------- |
| Buffer.alloc(n)         | Allocate n zero bytes   |
| Buffer.from(str)        | Create from string      |
| Buffer.concat([b1, b2]) | Concatenate buffers     |
| buf.subarray(start, end) | Slice (shared memory)  |
| buf.copy(target)        | Copy to another buffer  |
| buf.toString('base64')  | Encode to base64        |
