Node.js Streams
Streams are a powerful feature of Node.js for handling large amounts of data efficiently. A stream is an object that lets data be read or written as a continuous flow, without loading the entire dataset into memory. This is especially useful when working with files, network requests, or other large data operations.
Key Features of Streams
- Efficient Data Handling: Streams process data in chunks, so large datasets can be handled without exhausting system memory (see the sketch after this list).
- Types of Streams: Node.js provides four types of streams: Readable, Writable, Duplex, and Transform.
- Asynchronous Processing: Streams support asynchronous operations, ensuring non-blocking behavior when dealing with data.
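To illustrate the memory point, here is a minimal sketch contrasting reading a whole file at once with streaming it in chunks. The file name big.log is just a placeholder; any large file works.
const fs = require('fs');
// Whole-file read: the entire file is buffered in memory at once
fs.readFile('big.log', (err, data) => {
  if (err) throw err;
  console.log(`Loaded ${data.length} bytes in a single buffer`);
});
// Streamed read: only one chunk is held in memory at a time
let total = 0;
fs.createReadStream('big.log')
  .on('data', (chunk) => { total += chunk.length; })
  .on('end', () => console.log(`Streamed ${total} bytes in chunks`));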
Types of Streams
1. Readable Streams
A Readable stream is a source of data that can be read from. Examples include files and HTTP responses.
const fs = require('fs');
// Open file.txt as a readable stream; 'utf8' delivers chunks as strings
const readableStream = fs.createReadStream('file.txt', 'utf8');
readableStream.on('data', (chunk) => {
  console.log(`Received chunk: ${chunk}`);
});
readableStream.on('end', () => {
  console.log('Stream ended');
});
- The data event is emitted when a chunk of data is available to be read.
- The end event is emitted when no more data is available.
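The data event above uses the stream's flowing mode. As an alternative sketch, a readable stream can also be consumed in paused mode, pulling chunks explicitly with read() whenever the readable event fires (file.txt is again a placeholder):
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt', 'utf8');
// In paused mode, 'readable' signals that data can be pulled explicitly
readableStream.on('readable', () => {
  let chunk;
  while ((chunk = readableStream.read()) !== null) {
    console.log(`Read chunk of ${chunk.length} characters`);
  }
});
readableStream.on('end', () => {
  console.log('No more data');
});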
2. Writable Streams
A Writable stream is a destination for data that can be written to. Examples include files and HTTP requests.
const fs = require('fs');
const writableStream = fs.createWriteStream('output.txt');
writableStream.write('Hello, World!\n'); // write a chunk to the stream
writableStream.end(); // signal that no more data will be written
- The write() method writes data to the stream.
- The end() method signals the end of the stream.
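One detail worth knowing: write() returns false when the stream's internal buffer is full, and the drain event fires once it is safe to write again. The following is a rough sketch of respecting that backpressure signal; the file name and line count are arbitrary choices for illustration.
const fs = require('fs');
const writableStream = fs.createWriteStream('output.txt');
function writeMany(stream, count) {
  let i = 0;
  function writeChunk() {
    let ok = true;
    while (i < count && ok) {
      // write() returns false once the internal buffer is full
      ok = stream.write(`line ${i}\n`);
      i++;
    }
    if (i < count) {
      // Wait for 'drain' before writing more
      stream.once('drain', writeChunk);
    } else {
      stream.end();
    }
  }
  writeChunk();
}
writeMany(writableStream, 100000);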
3. Duplex Streams
Duplex streams can both read and write data. A common example of a duplex stream is a TCP socket, which can both send and receive data.
const net = require('net');
const server = net.createServer((socket) => {
  socket.write('Hello client!\n'); // write to the socket
  socket.on('data', (data) => { // and read from the same socket
    console.log(`Received from client: ${data}`);
  });
});
server.listen(8080, () => {
  console.log('Server listening on port 8080');
});
- The socket object in the example is a duplex stream, since it can both send and receive data.
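For completeness, here is a small companion sketch of the client side, where the connecting socket is equally a duplex stream; the port and host simply match the server above.
const net = require('net');
// The client socket is also duplex: write a message and read the reply
const client = net.connect(8080, 'localhost', () => {
  client.write('Hello server!\n');
});
client.on('data', (data) => {
  console.log(`Received from server: ${data}`);
  client.end(); // close the connection after the first reply
});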
4. Transform Streams
A Transform stream is a type of Duplex stream that modifies the data as it is being read and written. A common example is the zlib stream, which can compress or decompress data.
const zlib = require('zlib');
const fs = require('fs');
const gzip = zlib.createGzip(); // transform stream that compresses data
const input = fs.createReadStream('input.txt');
const output = fs.createWriteStream('output.txt.gz');
input.pipe(gzip).pipe(output);
- In the example above, data is compressed by the gzip transform stream before being written to the output file.
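Transform streams can also be written by hand using the stream module's Transform class. Here is a brief sketch; the upper-casing logic is just an illustrative placeholder for whatever modification you need.
const { Transform } = require('stream');
// A minimal custom Transform that upper-cases each chunk passing through
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // First argument is an error (or null); second is the transformed chunk
    callback(null, chunk.toString().toUpperCase());
  }
});
process.stdin.pipe(upperCase).pipe(process.stdout);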
Stream Methods and Properties
1. pipe()
The pipe() method is used to pass data from one stream to another. It is commonly used to connect a readable stream to a writable stream.
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
const writableStream = fs.createWriteStream('output.txt');
readableStream.pipe(writableStream); // copy file.txt to output.txt chunk by chunk
- The pipe() method pipes data from the readable stream into the writable stream, letting it flow through seamlessly.
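One caveat with pipe() is that it does not forward errors between streams. The built-in stream.pipeline() function (available since Node.js 10) connects streams the same way but reports any failure to a single callback and cleans up all the streams. Here is a brief sketch reusing the gzip example; the file names are placeholders.
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');
// Errors from any stage end up in the single callback at the end
pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);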
2. setEncoding()
The setEncoding() method specifies the encoding for the stream's data. It is mostly used with readable streams.
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
readableStream.setEncoding('utf8'); // without this, chunks would be Buffer objects
readableStream.on('data', (chunk) => {
  console.log(chunk); // The chunk is a UTF-8 string
});
- Calling setEncoding('utf8') ensures that the data is returned as a string in UTF-8 format.
3. pause() and resume()
Readable streams have pause() and resume() methods to control the flow of data.
const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
readableStream.on('data', (chunk) => {
  console.log(chunk);
  readableStream.pause(); // stop the flow of 'data' events
  setTimeout(() => {
    readableStream.resume(); // resume after 1 second
  }, 1000);
});
- pause() stops the flow of data, and resume() resumes it.
Summary
Streams in Node.js provide an efficient way to handle large datasets by allowing data to be read and written in chunks. They come in different types, such as Readable, Writable, Duplex, and Transform streams, each serving a specific purpose. Methods like pipe(), setEncoding(), pause(), and resume() help manage the flow of data in streams. Streams are essential for building scalable applications that process large amounts of data, such as reading files or handling HTTP requests.