Node Streams: An Early Look

Node.js applications can efficiently route and handle input and output data using streams. By using streaming, developers can improve the performance, scalability, and maintainability of Node.js applications that deal with massive volumes of data. This tutorial explains the different types of streams in Node.js, how to use them, and how stream piping and chaining work.

What Are Streams in Node.js?

Streams are abstract interfaces for working with data that can be read or written sequentially. They are a crucial feature in Node.js for managing the flow of data between input and output sources.

Streams are vital in Node.js because they make handling large quantities of data efficient. Instead of loading everything into memory at once, a stream processes data in chunks as it becomes available. This means data can be streamed in real time from a source (such as a file or a network socket) to a destination (such as a response object or another file) without ever holding the entire data set in memory.

For instance, streams can read from or write to files, network connections, stdin/stdout, and many other data sources and sinks.

1. The Stream Module

The stream module is a core Node.js module that offers a way to manage streaming data. It provides a set of APIs for creating, reading from, and writing to streams.
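
For illustration, here is a minimal sketch (assuming Node.js 12 or later, where Readable.from() is available) showing the core classes exported by the stream module and a tiny in-memory readable stream:

const { Readable, Writable, Duplex, Transform } = require('stream');

// Build a readable stream from an in-memory array of strings
const source = Readable.from(['streaming ', 'data ', 'in ', 'chunks\n']);

// Pipe it into process.stdout, which is a writable stream
source.pipe(process.stdout);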

2. The Node.js Stream API

The Node.js Stream API is a collection of classes, methods, events, and options for creating, reading from, and writing to streams.

The primary elements of the Stream API are:

1. Stream Classes:

The Stream API provides classes for working with Node.js streams, including the Readable, Writable, Duplex, and Transform classes. Each class represents a different type of stream with its own capabilities.

2. Stream Methods:

The Stream API offers several methods for working with streams, such as the pipe() method for connecting readable and writable streams and the on('data', ...) listener for handling 'data' events.

3. Events:

Streams emit events such as 'data', 'end', 'error', and 'finish', which are provided by the Stream API. These events are used to react to different stages of stream processing.

4. Stream Options:

Options are used to customize streams, such as setting the highWaterMark for writable streams or the encoding for readable streams.
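
The snippet below is a brief sketch tying these four elements together: the fs stream classes, the pipe() method, the 'data', 'end', and 'error' events, and the encoding and highWaterMark options (the file names example.txt and copy.txt are placeholders):

const fs = require('fs');

// Options: read as utf8 strings, in chunks of at most 16 KB
const readable = fs.createReadStream('example.txt', {
  encoding: 'utf8',
  highWaterMark: 16 * 1024,
});

// Events: react to data chunks, end of stream, and errors
readable.on('data', (chunk) => console.log(`chunk of ${chunk.length} characters`));
readable.on('end', () => console.log('no more data'));
readable.on('error', (err) => console.error(err));

// Methods: pipe() connects the readable stream to a writable stream
readable.pipe(fs.createWriteStream('copy.txt'));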

Node Stream Types

Each of the four stream types (Readable, Writable, Duplex, and Transform) has a particular use in Node.js applications.

Let's examine each type, starting with readable streams.

1. Readable Stream

Readable streams are used to read data from a source, such as a file or a network socket. They emit a 'data' event when fresh data becomes available and an 'end' event when the stream is finished. 'fs.createReadStream()' and 'http.IncomingMessage' are two examples of readable streams in Node.js, used for reading files and HTTP requests, respectively.

Let's use an example to understand readable streams further.

const fs = require('fs');

// Create a readable stream from a file
const readStream = fs.createReadStream('example.txt', { encoding: 'utf8' });

// Handle the stream's 'data' events
readStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});

// Handle the 'end' event emitted by the stream
readStream.on('end', () => {
  console.log('End of file reached.');
});

// Handle errors emitted by the stream
readStream.on('error', (err) => {
  console.error(`Error: ${err}`);
});

Output

Received 19 bytes of data.
End of file reached.

In this example, we create a readable stream from a file called 'example.txt' using the fs module. We set the encoding option to 'utf8' so the data is read as strings.

Each time a chunk of data is read from the file, the stream emits a 'data' event, which we handle. In this case, we simply log the number of bytes received.

We also handle the stream's 'end' event, which fires when the end of the file is reached. Finally, we log any stream errors to the console.
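
As a side note, readable streams are also async iterable (in Node.js 10 and later), so the same file could be consumed with a for await...of loop instead of event listeners. A sketch, assuming the same example.txt file:

const fs = require('fs');

async function readFile() {
  const readStream = fs.createReadStream('example.txt', { encoding: 'utf8' });
  try {
    // Each iteration yields the next chunk of the file
    for await (const chunk of readStream) {
      console.log(`Received ${chunk.length} characters of data.`);
    }
    console.log('End of file reached.');
  } catch (err) {
    console.error(`Error: ${err}`);
  }
}

readFile();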

2. Writable Stream

Writable streams are used to write data to a destination, such as a file or a network socket. They provide the 'write()' method for writing data and the 'end()' method for signaling the end of the stream. Node.js provides 'fs.createWriteStream()' for writing to files and 'http.ServerResponse' for writing HTTP responses as examples of these streams.

A Node Js Writable Stream example:

const fs = require('fs');

// Create a writable stream
const writeStream = fs.createWriteStream('output.txt');

// write data to file
writeStream.write('Hello from write stream')

// ending writable stream
writeStream.end();

// Handle stream events
writeStream.on('finish', () => {
    console.log(`Write Stream Finished!`);
})

writeStream.on('error', (error) => {
    console.error(`Write Stream error: ${error}`);
})

Output

Write Stream Finished!

In this example, we create a writable stream to a file called 'output.txt' using the fs module.

Then, we write a line of text to the stream using the write() method and close the stream with the end() method.

We also handle the stream's 'finish' event, which fires once all the data has been flushed to the file. Finally, we log any stream errors to the console.

3. Duplex Stream

Duplex streams can both read and write data. They can be used for tasks such as proxying data between network sockets. Because they derive from both 'Readable' and 'Writable', duplex streams have the methods of both.

Example of Duplex Stream:

const { Duplex } = require('stream');

const myDuplex = new Duplex({
  // Writable side: log whatever is written to the stream
  write(chunk, encoding, callback) {
    console.log(chunk.toString());
    callback();
  },
  // Readable side: push the letters A-Z, then end the stream
  read(size) {
    if (this.currentCharCode > 90) {
      this.push(null);
      return;
    }
    this.push(String.fromCharCode(this.currentCharCode++));
  }
});

// Start at character code 65 ('A')
myDuplex.currentCharCode = 65;

// stdin feeds the writable side; the readable side feeds stdout
process.stdin.pipe(myDuplex).pipe(process.stdout);

In this example, we use the Duplex class from the stream module to build a new duplex stream. Every time data is written to the stream, the write method is invoked, which simply logs the data chunk to the console. Every time the stream is read from, the read method is invoked; it pushes ASCII characters starting at character code 65 ('A') and, once the code passes 90 ('Z'), signals the end of the stream by pushing null.

Then, we pipe the standard input stream (process.stdin) into our duplex stream and pipe the duplex stream into the standard output stream (process.stdout). As a result, anything typed into the console is written to the duplex stream and logged by its write method, while the duplex stream's readable output (the letters A through Z) is sent to the console.

4. Transform Stream

A transform stream is a type of duplex stream that can modify data as it passes through. Transform streams can be used for tasks such as compression, encryption, or validation. Because they inherit from the 'Duplex' class, they are both readable and writable. When data is written to a transform stream, the transform function is called to modify it before it is emitted on the readable side.

Let's look at a transform stream example in Node.js.

const fs = require('fs');

// Importing stream APIs
const { Transform, pipeline } = require('stream');

// Create a readable stream
const readableStream = fs.createReadStream('input.txt');

// Create a writable stream
const writableStream = fs.createWriteStream('output.txt');

// Set the encoding to be utf8.
readableStream.setEncoding('utf8');

// Transform chunk into uppercase
const uppercaseWordProcessing = new Transform({
    transform(chunk, encoding, callback) {
        console.log(`Data to be transformed: ${chunk}`);
        callback(null, chunk.toString().toUpperCase());
    }
});

// We could connect the streams manually with pipe():
//
// readableStream
//     .pipe(uppercaseWordProcessing)
//     .pipe(writableStream)
//
// Instead, we use the pipeline API, which pipes a series of streams
// together and notifies us when the pipeline is fully completed.
pipeline(readableStream, uppercaseWordProcessing, writableStream, (error) => {
    if (error) {
        console.error(`Error occurred while transforming stream: ${error}`);
    } else {
        console.log('Pipeline succeeded!');
    }
});

// Handle stream events
readableStream.on('end', () => {
    console.log(`Read Stream Ended!`);
})

readableStream.on('error', (error) => {
    console.error(`Read Stream Ended with an error: ${error}`);
})

writableStream.on('finish', () => {
    console.log(`Write Stream Finished!`);
})

writableStream.on('error', (error) => {
    console.error(`Write Stream error: ${error}`);
})

Output

Data to be transformed: My name is john doe
Read Stream Ended!
Write Stream Finished!
Pipeline succeeded!

In this example, we create a transform stream called 'uppercaseWordProcessing' using the built-in 'Transform' class from the stream module. Its transform function converts each chunk of incoming data to uppercase with the string's 'toUpperCase' method and then calls the 'callback' with the transformed chunk to signal that processing of that chunk is complete.

We then use the pipeline API to pass the readable file stream through our transform stream and into the writable file stream. As a result, everything read from 'input.txt' is converted to uppercase and written to 'output.txt'.

Now that we know the different stream types, let's move on to the advantages of using Node.js streams.

Benefits of Node.js Streams

Netflix, NASA, Uber, Walmart, and other well-known Node.js users that handle vast amounts of data rely on streams to manage, support, and run their apps more effectively. The benefits of using Node streams in your applications are listed below.

1. Modularity: Node Streams may be piped together and merged, enabling the division of large data processing jobs into smaller, more manageable components. As a result, code could get easier to read and maintain.

2. Backpressure management: When the destination cannot keep up, streams manage backpressure by automatically slowing down the data source, which avoids buffer overflows and other performance problems (a manual backpressure sketch follows this list).

3. Performance: Streams can be quicker and more efficient than approaches that read or write the complete data set at once, since they handle data in chunks. This is especially helpful for real-time applications that need high throughput and low latency.

4. Flexibility: A wide variety of data sources and destinations, including files, network sockets, and HTTP requests and responses, may be handled by streams. Because of this, streams are a dynamic tool for processing data in various scenarios.

5. Memory efficiency: Streams can handle large amounts of data without holding it all in memory at once. This means streams can process data and files too big to fit in memory.
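
Here is a minimal sketch of manual backpressure handling (the file name big-output.txt and the line count are made up for illustration); pipe() and pipeline() perform this pause-and-resume dance automatically:

const fs = require('fs');

const writeStream = fs.createWriteStream('big-output.txt');

function writeLines(count) {
  let i = 0;
  function writeMore() {
    let ok = true;
    while (i < count && ok) {
      // write() returns false when the internal buffer is full
      ok = writeStream.write(`line ${i}\n`);
      i++;
    }
    if (i < count) {
      // Wait for the buffer to drain before writing the rest
      writeStream.once('drain', writeMore);
    } else {
      writeStream.end();
    }
  }
  writeMore();
}

writeLines(1000000);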

Overall, using streams can enhance the speed, scalability, and maintainability of Node.js applications that deal with significant volumes of data.

With the benefits covered, let's look at two techniques central to working with Node streams: chaining and piping.

Chaining Node Streams

Chaining in Node streams is a technique for tying several stream operations together via method chaining on a readable stream. Chaining makes it simple to build a pipeline of stream operations that process or transform data as it passes through.

Each call to pipe() returns the destination stream, which can be further configured or piped into other Node streams, allowing you to chain stream operations. The resulting operations run in sequence as data passes through the pipeline.

An illustration of how to create a pipeline of stream operations using chaining is as follows:

const fs = require('fs');

// Create a readable stream from a file
const readStream = fs.createReadStream('input.txt');

// Create a writable stream to a file
const writeStream = fs.createWriteStream('output.txt');

// Define transform stream operations
const transformStream1 = // ...
const transformStream2 = // ...
const transformStream3 = // ...

// Chain stream operations to transform the data
readStream
  .pipe(transformStream1)
  .pipe(transformStream2)
  .pipe(transformStream3)
  .pipe(writeStream);

In this Node stream example, the fs module creates a readable stream from a file. The data is then transformed by a series of stream operations, each connected to the next with the pipe() method.

We define each transform stream operation separately and pass it to pipe() as an argument. Each intermediate stream must be both readable and writable, so transform or duplex streams are typically used.

Finally, we create a writable stream to a file and connect it to the end of the pipeline with pipe().
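
To make the chaining idea concrete, here is a sketch of a real chain using the built-in zlib module: the file is uppercased by a transform stream and then gzip-compressed before being written out (input.txt and output.txt.gz are assumed file names):

const fs = require('fs');
const zlib = require('zlib');
const { Transform } = require('stream');

// A transform stream that uppercases each chunk
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// Chain: file -> uppercase -> gzip -> compressed file
fs.createReadStream('input.txt')
  .pipe(upperCase)
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'))
  .on('finish', () => console.log('Chained pipeline finished.'));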

Benefits of Chaining Node Streams

  • Process flexibility
  • Reusability
  • Enhanced efficiency
  • Simple bug fixing

Drawbacks of Chaining Node Streams

  • Complex
  • Steep learning curve
  • Inadequate compatibility
  • Flow control challenges

Piping Node Streams

A readable stream can be connected to a writable stream using the pipe() method in Node.js streaming. pipe() is called on the readable stream and accepts the destination writable stream as an argument.

pipe() sets up listeners on the readable stream's 'data' and 'end' events and automatically forwards data to the writable stream until the end of the source is reached. This makes it simple to chain several streams together and build a data-processing pipeline.

Here is a usage example for the pipe() method:

const fs = require('fs');

// Create a readable stream from a file
const readStream = fs.createReadStream('input.txt');

// Make a stream that is writable to a file
const writeStream = fs.createWriteStream('output.txt');

// Connect the write stream to the readable stream through pipe
readStream.pipe(writeStream);

// Handle errors emitted by either stream
readStream.on('error', (err) => {
  console.error(`Error reading file: ${err}`);
});

writeStream.on('error', (err) => {
  console.error(`Error writing file: ${err}`);
});

In this example, we first use the fs module to create a readable stream and a writable stream. The readable stream is then connected to the writable stream using the pipe() method.

We also handle any errors that either stream may emit using on('error') handlers. Note that pipe() does not forward errors from the source stream to the destination, which is why each stream needs its own error handler.
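
Because of that limitation, many applications prefer the promise-based pipeline() from the stream/promises module (available in Node.js 15 and later), which forwards errors from every stream in the chain and destroys all of them on failure. A sketch, again assuming input.txt and output.txt:

const fs = require('fs');
const { pipeline } = require('stream/promises');

async function copyFile() {
  try {
    // pipeline() wires the streams together and handles errors and cleanup
    await pipeline(
      fs.createReadStream('input.txt'),
      fs.createWriteStream('output.txt')
    );
    console.log('Pipeline succeeded!');
  } catch (err) {
    console.error(`Pipeline failed: ${err}`);
  }
}

copyFile();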

Benefits of Piping Node Streams

  • Quick processing
  • Simple to Use
  • Modular code management
  • Handles the backpressure

Drawbacks of Piping Node Streams

  • Steep learning curve
  • Limited error handling (errors are not forwarded between piped streams)
  • Difficult debugging
  • Complicated control flow

Conclusion

Node.js streams let developers handle incoming and outgoing data with ease. Thanks largely to better memory management, applications that use streams can achieve outstanding Node.js performance even when processing large volumes of data.

As a result, a top Node.js app development company such as bosctechlabs.com can take advantage of these stream types to develop apps with appealing, user-friendly interfaces. The choice of streams in Node.js depends on the application's specific requirements, its design objectives, and the expertise of the Node professionals who use these components to produce high-quality, user-friendly apps.

Frequently Asked Questions (FAQs)

1. What is a stream in Node.js?

In Node.js, a stream is an abstract interface for working with streaming data. The node:stream module provides the API for implementing the stream interface. Node.js offers many stream objects; for example, a request received by an HTTP server is a readable stream and process.stdout is a writable stream.
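
A small illustration of those two built-in streams (the port number 3000 is arbitrary):

const http = require('http');

const server = http.createServer((req, res) => {
  // req (http.IncomingMessage) is a readable stream;
  // res (http.ServerResponse) and process.stdout are writable streams
  req.pipe(process.stdout);   // echo the request body to the terminal
  res.end('received\n');
});

server.listen(3000);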

2. How do you write a stream to a file in Node?

First, create a writable stream for the target file. Data can then be written to the stream periodically, all at once, or as it becomes available from a server or another operation. Once all the data has been written, the stream can be closed with end().
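
A minimal sketch of that workflow (log.txt is a placeholder file name):

const fs = require('fs');

const out = fs.createWriteStream('log.txt');

// Write chunks as the data becomes available
out.write('first line\n');
out.write('second line\n');

// end() closes the stream and can optionally take a final chunk
out.end('last line\n');

out.on('finish', () => console.log('All data flushed to log.txt'));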

3. What does pipeline() do in Node streams?

The pipeline() function from the node:stream module pipes a series of streams together, forwards errors from any stream in the chain, destroys all of the streams when the pipeline fails or finishes, and calls a callback (or resolves a promise, in the stream/promises variant) once the whole pipeline is complete.

4. What does “stream” in a file mean?

A file stream is a sequence of bytes used to store file data. A file typically has only one file stream, the default data stream. On file systems that permit multiple data streams, however, a single file can have many streams; the unnamed default data stream is one of them.

