Working with Streams

Node.js streams process data in chunks for efficient I/O.

Understanding Node.js Streams

Node.js streams are a mechanism for handling I/O efficiently. They let you process data piece by piece rather than loading it all at once, which is especially useful for amounts of data that would not fit comfortably in memory. Streams can be used for both reading and writing data, making them a versatile tool in Node.js applications.

Streams are part of the Node.js core and are exposed through the stream module. There are four main types of streams (see the sketch after the list):

  • Readable: Used to read data in chunks.
  • Writable: Used to write data.
  • Duplex: Combines both Readable and Writable streams.
  • Transform: A type of Duplex stream where the output is computed based on the input.
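
These four base classes are exported directly by the stream module. The short sketch below simply imports them to show where they live; the console.log call is purely illustrative.

```js
// The four base stream classes are exported by the built-in 'stream' module.
const { Readable, Writable, Duplex, Transform } = require('stream');

console.log(
  [Readable, Writable, Duplex, Transform].map((cls) => cls.name)
); // [ 'Readable', 'Writable', 'Duplex', 'Transform' ]
```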

Creating a Readable Stream

A readable stream allows you to read data from a source. For example, you can create a readable stream from a file using the fs module:
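
A minimal sketch of that idea is shown below; the file name input.txt and the utf8 encoding are placeholder choices, not requirements of the API.

```js
const fs = require('fs');

// fs.createReadStream returns a Readable stream that emits the file in chunks.
const readStream = fs.createReadStream('input.txt', { encoding: 'utf8' });

readStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} characters of data`);
});

readStream.on('end', () => {
  console.log('No more data to read.');
});

readStream.on('error', (err) => {
  console.error('Error while reading:', err);
});
```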

Creating a Writable Stream

A writable stream allows you to write data to a destination. You can create a writable stream to a file as follows:
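
A minimal sketch follows; the file name output.txt and the strings being written are placeholders.

```js
const fs = require('fs');

// fs.createWriteStream returns a Writable stream backed by the file.
const writeStream = fs.createWriteStream('output.txt');

writeStream.write('Hello, ');
writeStream.write('world!\n');

// end() signals that nothing more will be written and flushes the stream.
writeStream.end(() => {
  console.log('Finished writing to output.txt');
});
```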

Using Pipe for Stream Chaining

One of the most powerful features of streams is the ability to pipe them together. The pipe method allows you to take the output of one stream and pass it as input to another, making it easy to chain operations:
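
The sketch below illustrates one common chain: reading a file, gzip-compressing it with the built-in zlib module, and writing the compressed result. The file names are placeholders.

```js
const fs = require('fs');
const zlib = require('zlib');

// Read input.txt, gzip it, and write the compressed result, chunk by chunk.
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip()) // Transform stream: compresses each chunk
  .pipe(fs.createWriteStream('input.txt.gz'))
  .on('finish', () => {
    console.log('Compression complete.');
  });
```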

Implementing a Transform Stream

Transform streams are a type of duplex stream where the output is modified based on the input. You can implement a transform stream by extending the Transform class:
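
A minimal sketch of a custom transform is shown below; the class name UpperCaseTransform and the stdin-to-stdout usage are illustrative choices, not required by the API.

```js
const { Transform } = require('stream');

// A Transform stream that upper-cases every chunk passing through it.
class UpperCaseTransform extends Transform {
  _transform(chunk, encoding, callback) {
    // Push the modified chunk downstream, then signal this chunk is done.
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

// Example usage: pipe stdin through the transform to stdout.
process.stdin.pipe(new UpperCaseTransform()).pipe(process.stdout);
```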
