Chapter 10: Streams
10.1 Readable Streams
Streams let you process data in chunks rather than loading everything into memory at once, which is essential for handling large files efficiently.
const fs = require('fs');

// Create readable stream (chunks arrive as strings because an encoding is set)
const readStream = fs.createReadStream('large-file.txt', 'utf8');

// Handle data chunks
readStream.on('data', (chunk) => {
  console.log('Received chunk:', chunk.length, 'characters');
  // Process chunk
});

// Handle end of stream
readStream.on('end', () => {
  console.log('File read complete');
});

// Handle errors
readStream.on('error', (err) => {
  console.error('Stream error:', err);
});
Stream Benefits:
- Memory efficient (data is processed chunk by chunk, so the whole file never has to fit in memory; see the sketch after this list)
- Faster (processing starts before all the data has been read)
- Can handle files larger than available memory
- Composable (streams can be piped together)
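As a rough illustration of the memory benefit, the sketch below serves a file over HTTP in two ways; the file name and port are placeholders. The commented-out readFile version buffers the entire file before sending a single byte, while the streaming version holds only one chunk in memory at a time.

const fs = require('fs');
const http = require('http');

http.createServer((req, res) => {
  // Buffered approach (for comparison): reads the whole file into memory first
  // fs.readFile('large-file.txt', (err, data) => {
  //   if (err) return res.end('Error reading file');
  //   res.end(data);
  // });

  // Streaming approach: chunks flow to the response as they are read
  const stream = fs.createReadStream('large-file.txt');
  stream.on('error', () => {
    res.statusCode = 500;
    res.end('Error reading file');
  });
  stream.pipe(res);
}).listen(3000);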
Stream Events:
- data: Emitted when a chunk of data is available
- end: Emitted when there is no more data to read
- error: Emitted when an error occurs
- close: Emitted when the stream and its underlying resource have been closed (all four events are wired up in the sketch below)
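A minimal sketch that attaches all four handlers to one readable stream (file name assumed; without an encoding, chunks arrive as Buffers, so length is in bytes). On a file stream, 'close' normally fires after 'end', once the underlying file descriptor has been released.

const fs = require('fs');

const stream = fs.createReadStream('large-file.txt');

stream.on('data', (chunk) => console.log('data:', chunk.length, 'bytes'));
stream.on('end', () => console.log('end: no more data'));
stream.on('error', (err) => console.error('error:', err.message));
stream.on('close', () => console.log('close: underlying resource released'));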
10.2 Writable Streams
Writable streams allow writing data incrementally.
const fs = require('fs');

// Create writable stream
const writeStream = fs.createWriteStream('output.txt', 'utf8');

// Write data
writeStream.write('Hello ');
writeStream.write('World');
writeStream.end(); // Close stream

// Handle events
writeStream.on('finish', () => {
  console.log('Write complete');
});

writeStream.on('error', (err) => {
  console.error('Write error:', err);
});
Piping Streams:

// Pipe readable to writable
const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');

readStream.pipe(writeStream);

// Or with a transform stream in the middle (see section 10.3)
readStream
  .pipe(transformStream)
  .pipe(writeStream);
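One caveat worth noting: pipe() does not forward errors between streams, so each stream in the chain still needs its own 'error' handler. The built-in stream.pipeline() helper cleans this up by destroying every stream and reporting the first error through a single callback. A minimal sketch with assumed file names:

const { pipeline } = require('stream');
const fs = require('fs');

pipeline(
  fs.createReadStream('input.txt'),
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);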
10.3 Transform Streams
Transform streams modify data as it flows through.
const { Transform } = require('stream');

// Create transform stream
const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    // Transform data
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

// Use transform
readStream
  .pipe(upperCaseTransform)
  .pipe(writeStream);
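To make this runnable end to end, one possible wiring (file names assumed) combines the uppercase transform with pipeline() from section 10.2. Passing the result as the second argument to callback() is equivalent to calling this.push() first.

const { Transform, pipeline } = require('stream');
const fs = require('fs');

const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    // callback(error, data) pushes the transformed chunk downstream
    callback(null, chunk.toString().toUpperCase());
  }
});

pipeline(
  fs.createReadStream('input.txt'),
  upperCaseTransform,
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) console.error('Transform pipeline failed:', err);
    else console.log('Upper-cased copy written to output.txt');
  }
);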
10.4 Stream Best Practices
The following practices help avoid the most common stream pitfalls:
- Always handle error events (errors are not forwarded automatically between piped streams)
- Use pipe() or pipeline() for simple read-to-write operations
- Handle backpressure, which occurs when data arrives faster than the destination can consume it (see the sketch after this list)
- Close streams properly (end() for writables, destroy() to abort)
- Use streams rather than buffering when processing large data sets
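Backpressure shows up when write() returns false, which signals that the destination's internal buffer is full. The sketch below (file names assumed) handles it manually by pausing the reader until the writer emits 'drain'; pipe() and pipeline() do this bookkeeping automatically.

const fs = require('fs');

const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');

readStream.on('data', (chunk) => {
  const canContinue = writeStream.write(chunk);
  if (!canContinue) {
    // Destination buffer is full: stop reading until it drains
    readStream.pause();
    writeStream.once('drain', () => readStream.resume());
  }
});

readStream.on('end', () => writeStream.end());
readStream.on('error', (err) => console.error('Read error:', err));
writeStream.on('error', (err) => console.error('Write error:', err));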