Stream Module
Streams handle data in chunks (for example, reading a large file without loading it all into memory). They are useful for files, network I/O, or large data sets. There are four types: Readable, Writable, Duplex (both readable and writable), and Transform (modifies data as it passes through).
Key Concepts
- Readable Stream: Data source (e.g., reading a file).
- Writable Stream: Data destination (e.g., writing file).
- Piping: Connect readable to writable easily.
- Events: Such as 'data' (new chunk), 'end' (done), and 'error'.
- Buffering: Data arrives in small pieces (the default chunk size, or highWaterMark, is 64 KB for file streams).
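The event flow above can be sketched without touching the disk: Node's Readable.from() builds a stream from any iterable, so the 'data' and 'end' events can be observed directly.

```javascript
import { Readable } from 'stream';

// Readable.from turns any iterable into a Readable stream,
// so no file is needed to watch the events fire.
const chunks = [];
const readable = Readable.from(['alpha', 'beta', 'gamma']);

readable.on('data', (chunk) => chunks.push(chunk)); // one event per chunk
readable.on('end', () => console.log('got', chunks.length, 'chunks')); // prints "got 3 chunks"
```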
Common Commands
First, import the fs module, which provides the stream-creation methods: import * as fs from "fs";
- Readable Stream (Read file in chunks):
const readStream = fs.createReadStream('largefile.txt', 'utf8')
readStream.on('data', (chunk) => {
console.log(chunk)
})
readStream.on('end', () => {
console.log('Done reading')
})
- Explanation: Reads the file bit by bit.
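Readable streams are also async iterables, so the same chunk-by-chunk read can be written with for await...of instead of 'data'/'end' listeners. A minimal sketch, using an in-memory stream so it runs without a file:

```javascript
import { Readable } from 'stream';

// Consume a readable stream with for await...of instead of event listeners.
async function readAll(stream) {
  let text = '';
  for await (const chunk of stream) text += chunk; // each iteration yields one chunk
  return text;
}

// In-memory stand-in for fs.createReadStream('largefile.txt', 'utf8')
readAll(Readable.from(['chunk1 ', 'chunk2'])).then((text) => console.log(text));
```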
- Writable Stream (Write to file):
const writeStream = fs.createWriteStream('output.txt')
writeStream.write('Some data\n')
writeStream.write('More data')
writeStream.end()
- Explanation: Writes chunks to the file; end() signals that no more data is coming.
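Two details worth knowing, sketched below with the same placeholder file name: end() can take a final chunk, and the 'finish' event fires once everything has been flushed.

```javascript
import * as fs from 'fs';

// 'output.txt' is a placeholder file name.
const writeStream = fs.createWriteStream('output.txt');
writeStream.write('Some data\n');
writeStream.end('More data'); // end() accepts an optional final chunk
writeStream.on('finish', () => console.log('All data written')); // all chunks flushed
```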
- Piping (Copy file):
const readStream = fs.createReadStream('source.txt')
const writeStream = fs.createWriteStream('destination.txt')
readStream.pipe(writeStream)
- Explanation: Automatically sends data from the readable to the writable, handling backpressure for you.
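One caveat with pipe(): errors are not forwarded between streams, so each one needs its own 'error' handler. Node's stream.pipeline() forwards errors and cleans up all streams when any stage fails. A minimal sketch with placeholder file names:

```javascript
import * as fs from 'fs';
import { pipeline } from 'stream';

// Create a sample source so the sketch is self-contained
// ('source.txt' / 'destination.txt' are placeholder names).
fs.writeFileSync('source.txt', 'hello stream');

pipeline(
  fs.createReadStream('source.txt'),
  fs.createWriteStream('destination.txt'),
  (err) => {
    // Single callback catches an error from ANY stage of the pipeline.
    if (err) console.error('Copy failed:', err);
    else console.log('Copy done');
  }
);
```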
- Transform Stream (e.g., compress with zlib, built-in module):
import * as zlib from 'zlib'
fs.createReadStream('file.txt')
.pipe(zlib.createGzip())
.pipe(fs.createWriteStream('file.txt.gz'))
- Explanation: Reads, compresses, and writes the gzipped file.
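Transform streams can also be custom-built by supplying a transform() function. A minimal sketch of a hypothetical upper-casing transform, driven by an in-memory source so it runs without files:

```javascript
import { Transform, Readable } from 'stream';

// A custom Transform: upper-cases each chunk as it passes through.
// (zlib's gzip above is a ready-made Transform that works the same way.)
const upcase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer by default; convert, modify, pass along.
    callback(null, chunk.toString().toUpperCase());
  }
});

let out = '';
Readable.from(['hello ', 'streams'])
  .pipe(upcase)
  .on('data', (chunk) => { out += chunk; })
  .on('end', () => console.log(out)); // prints "HELLO STREAMS"
```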
