The stream Module in Node.js
The stream module in Node.js is a core module that provides an abstract interface for working with streaming data. Streams are a fundamental concept in Node.js: they let you process large amounts of data in chunks without loading it all into memory at once. This is particularly useful for tasks like reading from files, writing to files, or handling HTTP requests and responses.
Key Concepts
Streams: Represent a sequence of data that can be read from or written to. Streams are instances of the EventEmitter class and emit events related to data processing.
Stream Types: There are four main types of streams:
- Readable: Streams that you can read data from (e.g., reading from a file or HTTP request).
- Writable: Streams that you can write data to (e.g., writing to a file or HTTP response).
- Duplex: Streams that are both readable and writable (e.g., a TCP socket).
- Transform: A type of duplex stream that can modify or transform the data as it is read or written.
Core Stream Classes and Methods
1. Readable Streams
Readable streams represent a source of data that can be consumed. Examples include reading from a file or receiving data from an HTTP request.
Common Methods:
- stream.read(): Reads data from the stream, returning the next chunk of data, or null if no data is currently available.
- stream.pipe(destination): Pipes the readable stream into a writable stream. This is a convenient way to transfer data from one stream to another.
Example:
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
readableStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});

readableStream.on('end', () => {
  console.log('No more data.');
});
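The example above consumes data in flowing mode via the data event. The stream.read() method is typically used in paused mode together with the readable event; a minimal sketch of that pattern (assuming the same file.txt):
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');

readableStream.on('readable', () => {
  // read() returns chunks from the internal buffer until it is
  // drained, at which point it returns null.
  let chunk;
  while ((chunk = readableStream.read()) !== null) {
    console.log(`Read ${chunk.length} bytes of data.`);
  }
});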
2. Writable Streams
Writable streams represent a destination to which data can be written. Examples include writing to a file or sending data in an HTTP response.
Common Methods:
- stream.write(chunk): Writes data to the stream. The chunk can be a string or a Buffer.
- stream.end(): Signals that no more data will be written to the stream and optionally writes a final chunk of data.
Example:
const fs = require('fs');
const writableStream = fs.createWriteStream('file.txt');
writableStream.write('Hello, World!\n');
writableStream.end();
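Because end() accepts an optional final chunk, the same example can be collapsed into a single call; a small variation for illustration:
const fs = require('fs');
const writableStream = fs.createWriteStream('file.txt');

// Equivalent to write('Hello, World!\n') followed by end().
writableStream.end('Hello, World!\n');

writableStream.on('finish', () => {
  console.log('All data has been flushed to file.txt.');
});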
3. Duplex Streams
Duplex streams are both readable and writable. They allow for bidirectional communication. Examples include network sockets.
Example:
const { Duplex } = require('stream');

const duplexStream = new Duplex({
  read(size) {
    // Implement the read logic: push data to the readable side
    this.push('Hello, World!');
    this.push(null); // Signal end of stream
  },
  write(chunk, encoding, callback) {
    // Implement the write logic
    console.log(`Received chunk: ${chunk.toString()}`);
    callback();
  }
});

duplexStream.on('data', (chunk) => {
  console.log(`Data: ${chunk.toString()}`);
});

duplexStream.write('Hello!');
duplexStream.end();
4. Transform Streams
Transform streams are a type of duplex stream that can modify or transform the data as it is being read or written.
Example:
const { Transform } = require('stream');

const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    // Transform the chunk to uppercase
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});
process.stdin.pipe(transformStream).pipe(process.stdout);
In this example, the transform stream converts all input data to uppercase before it is output to the terminal.
Common Events
- data: Emitted when a chunk of data is available to read (for readable streams).
- end: Emitted when there is no more data to read (for readable streams).
- error: Emitted when an error occurs in the stream.
- finish: Emitted when all data has been flushed to the underlying resource (for writable streams).
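For example, a writable stream might handle these events as follows (a minimal sketch; output.txt is a hypothetical file name):
const fs = require('fs');
const writableStream = fs.createWriteStream('output.txt');

writableStream.on('error', (err) => {
  console.error('Stream error:', err);
});

writableStream.on('finish', () => {
  console.log('All data has been flushed.');
});

writableStream.write('some data\n');
writableStream.end();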
Use Cases
- File Handling: Efficiently read from and write to large files without loading the entire file into memory.
- HTTP Requests/Responses: Handle large HTTP request bodies or responses by streaming data instead of buffering it all at once (see the sketch after this list).
- Data Transformation: Modify or transform data as it is being processed using transform streams.
- Real-Time Data Processing: Process data in real-time from sources such as network connections or user input.
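As a rough illustration of the HTTP use case, a server can stream a file directly into the response instead of buffering it in memory first (a minimal sketch; the file name and port are placeholders):
const http = require('http');
const fs = require('fs');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  // The response object is a writable stream, so a file can be
  // piped into it chunk by chunk.
  fs.createReadStream('largeFile.txt').pipe(res);
});

server.listen(3000);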
Example Use Case
Streaming File Copy
You might use streams to efficiently copy a large file from one location to another:
const fs = require('fs');
const readStream = fs.createReadStream('largeFile.txt');
const writeStream = fs.createWriteStream('copyOfLargeFile.txt');
readStream.pipe(writeStream);
writeStream.on('finish', () => {
  console.log('File copy completed.');
});
In this example, the pipe() method is used to connect the readable stream (source file) to the writable stream (destination file), allowing data to be copied efficiently.
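Note that pipe() on its own does not forward errors from the source stream to the destination. The stream module also provides pipeline(), which destroys all streams on failure and reports the error in one callback; a minimal sketch of the same copy using it:
const fs = require('fs');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('largeFile.txt'),
  fs.createWriteStream('copyOfLargeFile.txt'),
  (err) => {
    if (err) {
      console.error('File copy failed:', err);
    } else {
      console.log('File copy completed.');
    }
  }
);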
Summary
- Streams: Handle data in chunks rather than all at once, which is more efficient for large amounts of data.
- Readable Streams: Provide a way to read data.
- Writable Streams: Provide a way to write data.
- Duplex Streams: Support both reading and writing.
- Transform Streams: Modify or transform data as it is processed.
- Events: Streams emit events like data, end, error, and finish to signal different stages of data processing.