Node.js Streams

Streams in Node.js are powerful objects that allow you to efficiently read data from a source and write it to a destination. Instead of loading everything into memory at once, streams process data in chunks, making them highly efficient for large files and real-time operations.

There are four main types of streams in Node.js:

  • Readable: Used for reading data.
  • Writable: Used for writing data.
  • Duplex: Supports both reading and writing.
  • Transform: A type of duplex stream where the output is computed based on the input.

Each stream in Node.js is an instance of EventEmitter and can emit various events. The most commonly used events include:

  • data – emitted when a chunk of data is available to read.
  • end – emitted when there is no more data to read.
  • error – emitted when an error occurs while reading or writing.
  • finish – emitted by writable streams when all data has been flushed to the underlying system.

Reading from a Stream

Create a file named input.txt with the following content:

UpdateGadh is one of the best online tutorial websites to learn different technologies in an easy and efficient manner.

Now, create a JavaScript file main.js with this code:

var fs = require("fs");  
var data = '';  

// Create a readable stream  
var readerStream = fs.createReadStream('input.txt');  

// Set encoding  
readerStream.setEncoding('utf8');  

// Handle stream events  
readerStream.on('data', function(chunk) {  
   data += chunk;  
});  

readerStream.on('end', function(){  
   console.log(data);  
});  

readerStream.on('error', function(err){  
   console.log(err.stack);  
});  

console.log("Program Ended");  

Run the program:

node main.js

Because the stream reads asynchronously, you'll see "Program Ended" printed first, followed by the content of input.txt.

Writing to a Stream

Now, let’s write data to a file using streams. Create main.js:

var fs = require("fs");  
var data = 'A Solution for all Technology';  

// Create a writable stream  
var writerStream = fs.createWriteStream('output.txt');  

// Write data with UTF8 encoding  
writerStream.write(data, 'utf8');  

// Mark end of file  
writerStream.end();  

// Stream events  
writerStream.on('finish', function() {  
    console.log("Write completed.");  
});  

writerStream.on('error', function(err){  
   console.log(err.stack);  
});  

console.log("Program Ended");  

Run it:

node main.js

This will create an output.txt file containing the written data. As with reading, "Program Ended" appears before "Write completed." because the write finishes asynchronously.

Piping Streams

Piping allows the output of one stream to be passed directly as input to another. Here’s an example of copying content from one file to another:

var fs = require("fs");  

// Read from input.txt and write to output.txt  
var readerStream = fs.createReadStream('input.txt');  
var writerStream = fs.createWriteStream('output.txt');  

readerStream.pipe(writerStream);  

console.log("Program Ended");  

When you run this, output.txt will contain the same data as input.txt.

Chaining Streams

Chaining lets you connect multiple stream operations together. It’s commonly used with piping to perform tasks like file compression and decompression.

Compress a file:

var fs = require("fs");  
var zlib = require('zlib');  

// Compress input.txt to input.txt.gz  
fs.createReadStream('input.txt')  
  .pipe(zlib.createGzip())  
  .pipe(fs.createWriteStream('input.txt.gz'));  

console.log("File Compressed.");  

Run:

node main.js

This will create a compressed file named input.txt.gz. (The "File Compressed." message prints immediately; the actual compression finishes asynchronously.)

Decompress a file:

var fs = require("fs");  
var zlib = require('zlib');  

// Decompress input.txt.gz to input.txt  
fs.createReadStream('input.txt.gz')  
  .pipe(zlib.createGunzip())  
  .pipe(fs.createWriteStream('input.txt'));  

console.log("File Decompressed.");  

Run again:

node main.js

This will restore the original input.txt from the compressed file.


