Working with Streams
Node.js streams process data in chunks, enabling efficient I/O.
What are Node.js Streams?
Node.js streams are a powerful way to handle reading from and writing to files, network communications, and other kinds of end-to-end data exchange efficiently. Streams are instances of EventEmitter and can be readable, writable, or both. Because they process data in chunks, you can start working with data before it has fully loaded, which makes I/O operations faster and more memory-efficient.
Types of Streams
Node.js provides four fundamental stream types:
- Readable: Streams from which data can be read. For example, reading data from a file or network.
- Writable: Streams to which data can be written. For example, writing data to a file.
- Duplex: Streams that are both Readable and Writable, such as TCP sockets.
- Transform: A type of Duplex stream whose output is computed from its input, such as compression with zlib.
Creating a Readable Stream
Let's create a simple readable stream using the fs module to read data from a file.
Creating a Writable Stream
Now, let's create a writable stream to write data to a file.
Piping Streams
Piping is a great way to connect readable streams to writable streams. It allows you to take data from one stream and pass it to another, making it easy to build complex data processing pipelines.
Handling Stream Events
Streams are instances of EventEmitter and emit several events that can be handled to perform specific actions:
- data: Emitted by readable streams when a chunk of data is available.
- end: Emitted by readable streams when there is no more data to consume.
- error: Emitted when an error occurs.
- finish: Emitted by writable streams when all data has been flushed to the underlying system.
Here is how you can handle these events: