Node.js Streams
In Node.js, streams are used to read or write data continuously. There are four types of streams:
- Readable – used for read operations
- Writable – used for write operations
- Duplex – used for both read and write operations
- Transform – a type of duplex stream where the output is computed from the input
Reading from a Stream:
Let’s create a file “input.txt” with the content below.
Salesforce Drillers is the world's best online learning space and one of the world's leading training providers. We partner with companies and individuals to address their unique needs, providing training and consultancy that helps working professionals to achieve their career goals.
Let’s create readStream.js with the code below.
var fs1 = require("fs");
var data1 = '';

// Create a readable stream
var readerStream1 = fs1.createReadStream('input.txt');

// Set the encoding to be utf8
readerStream1.setEncoding('utf8');

// Handle stream events --> data, end, and error
readerStream1.on('data', function(chunk) {
    data1 += chunk;
});

readerStream1.on('end', function() {
    console.log(data1);
});

readerStream1.on('error', function(err) {
    console.log(err.stack);
});

console.log("Program Ended");
The output of this program is shown below. Note that “Program Ended” is printed first, because the stream events fire asynchronously after the synchronous code finishes.
Salesforce Drillers is the world's best online learning space and one of the world's leading training providers. We partner with companies and individuals to address their unique needs, providing training and consultancy that helps working professionals to achieve their career goals.
Writing to a Stream:
Let’s create writeStream.js with the code below.
var fs1 = require("fs");
var data = 'Hello World';

// Create a writable stream
var writerStream1 = fs1.createWriteStream('output.txt');

// Write the data to the stream
writerStream1.write(data, 'utf8');

// Mark the end of file
writerStream1.end();

// Handle stream events --> finish, and error
writerStream1.on('finish', function() {
    console.log("File Write completed.");
});

writerStream1.on('error', function(err) {
    console.log(err.stack);
});

console.log("All Program Completed");
Running this program creates output.txt containing “Hello World” and prints “All Program Completed” followed by “File Write completed.” — again, the synchronous log runs before the asynchronous finish event.
Piping a Stream:
Piping means the output of one stream is used as the input of another — for example, reading from one file stream and writing to another. Create a read stream and a write stream, then call the pipe method as “readerStream.pipe(writerStream)”. This reads from one stream and writes it directly into the other.
Chaining a Stream:
Chaining is another way of performing pipes: the output of one pipe is used as the input of the next. We can perform multiple operations in sequence with chaining, as below.
var fs = require("fs");
var zlib = require('zlib');

// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
    .pipe(zlib.createGzip())
    .pipe(fs.createWriteStream('input.txt.gz'));

console.log("File Compressed.");
In the above code we create a read stream, pipe it through zlib compression, and then pipe the compressed output into a write stream.