How to wrap a buffer as a stream2 Readable stream?

How can I transform a Node.js buffer into a Readable stream using the stream2 interface?

I already found this answer and the stream-buffers module, but that module is based on the stream1 interface.


The easiest way is probably to create a new PassThrough stream instance, and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the first stream.

var stream = require('stream');

// Initiate the source
var bufferStream = new stream.PassThrough();

// Write your buffer
bufferStream.end(Buffer.from('Test data.'));

// Pipe it to something else (e.g. stdout)
bufferStream.pipe(process.stdout);
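
Since a PassThrough is also writable, you can push several chunks into it before ending it. A minimal sketch of that variation (the chunk contents are just illustrative):

var stream = require('stream');

var bufferStream = new stream.PassThrough();
bufferStream.write(Buffer.from('chunk one, '));
bufferStream.write(Buffer.from('chunk two.'));
bufferStream.end(); // no more data will be written

bufferStream.pipe(process.stdout);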

As natevw suggested, it's even more idiomatic to use a stream.PassThrough, and end it with the buffer:

var stream = require('stream');

var buffer = Buffer.from('foo');
var bufferStream = new stream.PassThrough();
bufferStream.end(buffer);
bufferStream.pipe(process.stdout);

This is also how buffers are converted/piped in vinyl-fs.
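
For example, here is a minimal sketch of the same technique feeding a file write stream instead of stdout (the output file name is illustrative):

var fs = require('fs');
var stream = require('stream');

var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from('foo'));

// 'output.txt' is just an example path
bufferStream.pipe(fs.createWriteStream('output.txt'));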


Here is a modern, simple approach that is usable anywhere you would use fs.createReadStream(), but without having to write the data to a file first.

const { Duplex } = require('stream'); // native Node module

function bufferToStream(myBuffer) {
    const tmp = new Duplex();
    tmp.push(myBuffer); // queue the whole buffer as a single chunk
    tmp.push(null);     // signal end-of-stream
    return tmp;
}

const myReadableStream = bufferToStream(your_buffer);
  • The bufferToStream() helper is re-usable, though each stream it returns, like any Readable, can only be consumed once.
  • The buffer and the stream exist only in memory, without writing to local storage.
  • I often use this approach when the actual file is stored on some cloud service and our API acts as a go-between; files are never written to local disk.
  • I have found this to be very reliable no matter the buffer (up to 10 MB) or the destination, as long as it accepts a Readable stream. Larger files should be streamed in chunks rather than buffered whole in memory.
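
As a side note, on Node 12.3+ stream.Readable.from() can do the same job in one line. Note the array wrapper: passing the buffer directly would iterate it byte by byte, while wrapping it in an array emits it as a single chunk. A minimal sketch:

const { Readable } = require('stream');

// Wrap the buffer in an array so it is emitted as one chunk
const readableFromBuffer = Readable.from([Buffer.from('Test data.')]);
readableFromBuffer.pipe(process.stdout);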