get-stream
Get a stream as a string, Buffer, ArrayBuffer or array
Features
- Works in any JavaScript environment (Node.js, browsers, etc.).
- Supports text streams, binary streams and object streams.
- Supports async iterables.
- Can set a maximum stream size.
- Returns partially read data when the stream errors.
- Fast.
Install
npm install get-stream
Usage
Node.js streams
import fs from 'node:fs';
import getStream from 'get-stream';
const stream = fs.createReadStream('unicorn.txt');
console.log(await getStream(stream));
/*
,,))))))));,
__)))))))))))))),
\|/ -\(((((''''((((((((.
-*-==//////(('' . `)))))),
/|\ ))| o ;-. '((((( ,(,
( `| / ) ;))))' ,_))^;(~
| | | ,))((((_ _____------~~~-. %,;(;(>';'~
o_); ; )))(((` ~---~ `:: \ %%~~)(v;(`('~
; ''''```` `: `:::|\,__,%% );`'; ~
| _ ) / `:|`----' `-'
______/\/~ | / /
/~;;.____/;;' / ___--,-( `;;;/
/ // _;______;'------~~~~~ /;;/\ /
// | | / ; \;;,\
(<_ | ; /',/-----' _>
\_| ||_ //~;~~~~~~~~~
`\_| (,~~
\~\
~~
*/
Web streams
import getStream from 'get-stream';
const {body: readableStream} = await fetch('https://example.com');
console.log(await getStream(readableStream));
This works in any browser, even the ones not supporting ReadableStream.values()
yet.
Async iterables
import {opendir} from 'node:fs/promises';
import {getStreamAsArray} from 'get-stream';
const asyncIterable = await opendir(directory);
console.log(await getStreamAsArray(asyncIterable));
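Because any async iterable is accepted, a plain async generator works as well. A minimal sketch (the generator and its chunks are made up for illustration):
import getStream from 'get-stream';
// Hypothetical async generator yielding text chunks
async function * generateChunks() {
	yield 'uni';
	yield 'corn';
}
console.log(await getStream(generateChunks()));
//=> 'unicorn'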
API
The following methods read the stream's contents and return them as a promise.
getStream(stream, options?)
stream
Type: stream.Readable, ReadableStream, or AsyncIterable<string | Buffer | ArrayBuffer | DataView | TypedArray>
options
Type: Options
Get the given stream as a string.
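For example, a small sketch reading a web ReadableStream as a string (the Blob-backed stream is only a convenient way to create one for illustration):
import getStream from 'get-stream';
// Any web ReadableStream works; this one is created from an in-memory Blob
const readableStream = new Blob(['unicorn']).stream();
console.log(await getStream(readableStream));
//=> 'unicorn'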
getStreamAsBuffer(stream, options?)
Get the given stream as a Node.js Buffer.
import fs from 'node:fs';
import {getStreamAsBuffer} from 'get-stream';
const stream = fs.createReadStream('unicorn.png');
console.log(await getStreamAsBuffer(stream));
getStreamAsArrayBuffer(stream, options?)
Get the given stream as an ArrayBuffer.
import {getStreamAsArrayBuffer} from 'get-stream';
const {body: readableStream} = await fetch('https://example.com');
console.log(await getStreamAsArrayBuffer(readableStream));
getStreamAsArray(stream, options?)
Get the given stream as an array. Unlike other methods, this supports streams of objects.
import {getStreamAsArray} from 'get-stream';
const {body: readableStream} = await fetch('https://example.com');
console.log(await getStreamAsArray(readableStream));
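Because this is the method that supports object streams, here is a minimal sketch with an in-memory object-mode stream (the objects are made up for illustration):
import {Readable} from 'node:stream';
import {getStreamAsArray} from 'get-stream';
// Readable.from() with non-string values creates an object-mode stream
const objectStream = Readable.from([{name: 'unicorn'}, {name: 'rainbow'}]);
console.log(await getStreamAsArray(objectStream));
//=> [{name: 'unicorn'}, {name: 'rainbow'}]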
options
Type: object
maxBuffer
Type: number
Default: Infinity
Maximum length of the stream. If exceeded, the promise will be rejected with a MaxBufferError.
Depending on the method, the length is measured with string.length, buffer.length, arrayBuffer.byteLength or array.length.
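As a sketch of how an oversized stream can be handled (the 50-character limit and the file name are arbitrary, and this assumes the MaxBufferError export from this package):
import fs from 'node:fs';
import getStream, {MaxBufferError} from 'get-stream';
const stream = fs.createReadStream('unicorn.txt');
try {
	// Rejects once more than 50 characters have been read
	await getStream(stream, {maxBuffer: 50});
} catch (error) {
	if (error instanceof MaxBufferError) {
		// error.bufferedData should hold what was read before the limit was hit (see Errors below)
		console.log('Stream exceeded 50 characters:', error.bufferedData);
	} else {
		throw error;
	}
}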
Errors
If the stream errors, the returned promise will be rejected with the error. Any contents already read from the stream are available as error.bufferedData, which is a string, a Buffer, an ArrayBuffer, or an array depending on the method used.
import getStream from 'get-stream';
try {
	await getStream(streamThatErrorsAtTheEnd('unicorn'));
} catch (error) {
	console.log(error.bufferedData);
	//=> 'unicorn'
}
Browser support
For this module to work in browsers, a bundler must be used that either:
- Supports the exports.browser field in package.json
- Strips or ignores node:* imports
Most bundlers (such as Webpack) support either of these.
Additionally, browsers support web streams and async iterables, but not Node.js streams.
Tips
Alternatives
If you do not need the maxBuffer option, error.bufferedData, or browser support, you can use the following methods instead of this package. The snippets below reuse the stream variable created in the first one.
streamConsumers.text()
import fs from 'node:fs';
import {text} from 'node:stream/consumers';
const stream = fs.createReadStream('unicorn.txt', {encoding: 'utf8'});
console.log(await text(stream))
streamConsumers.buffer()
import {buffer} from 'node:stream/consumers';
console.log(await buffer(stream))
streamConsumers.arrayBuffer()
import {arrayBuffer} from 'node:stream/consumers';
console.log(await arrayBuffer(stream))
readable.toArray()
console.log(await stream.toArray())
Array.fromAsync()
If your environment supports it:
console.log(await Array.fromAsync(stream))
Non-UTF-8 encoding
When all of the following conditions apply:
- getStream() is used (as opposed to getStreamAsBuffer() or getStreamAsArrayBuffer())
- The stream is binary (not text)
- The stream's encoding is not UTF-8 (for example, it is UTF-16, hexadecimal, or Base64)
Then the stream must be decoded using a transform stream like TextDecoderStream or b64.
import getStream from 'get-stream';
const textDecoderStream = new TextDecoderStream('utf-16le');
const {body: readableStream} = await fetch('https://example.com');
console.log(await getStream(readableStream.pipeThrough(textDecoderStream)));
Blobs
getStreamAsArrayBuffer() can be used to create Blobs.
import fs from 'node:fs';
import {getStreamAsArrayBuffer} from 'get-stream';
const stream = fs.createReadStream('unicorn.txt');
console.log(new Blob([await getStreamAsArrayBuffer(stream)]));
JSON streaming
getStreamAsArray() can be combined with JSON streaming utilities to parse JSON incrementally.
import fs from 'node:fs';
import {compose as composeStreams} from 'node:stream';
import {getStreamAsArray} from 'get-stream';
import streamJson from 'stream-json';
import streamJsonArray from 'stream-json/streamers/StreamArray.js';
const stream = fs.createReadStream('big-array-of-objects.json');
console.log(await getStreamAsArray(
	composeStreams(stream, streamJson.parser(), streamJsonArray.streamArray()),
));
Benchmarks
Node.js stream (100 MB, binary)
- getStream(): 142ms
- text(): 139ms
- getStreamAsBuffer(): 106ms
- buffer(): 83ms
- getStreamAsArrayBuffer(): 105ms
- arrayBuffer(): 81ms
- getStreamAsArray(): 24ms
- stream.toArray(): 21ms
Node.js stream (100 MB, text)
- getStream(): 90ms
- text(): 89ms
- getStreamAsBuffer(): 127ms
- buffer(): 192ms
- getStreamAsArrayBuffer(): 129ms
- arrayBuffer(): 195ms
- getStreamAsArray(): 89ms
- stream.toArray(): 90ms
Web ReadableStream (100 MB, binary)
- getStream(): 223ms
- text(): 221ms
- getStreamAsBuffer(): 182ms
- buffer(): 153ms
- getStreamAsArrayBuffer(): 171ms
- arrayBuffer(): 155ms
- getStreamAsArray(): 83ms
Web ReadableStream (100 MB, text)
- getStream(): 141ms
- text(): 139ms
- getStreamAsBuffer(): 91ms
- buffer(): 80ms
- getStreamAsArrayBuffer(): 89ms
- arrayBuffer(): 81ms
- getStreamAsArray(): 21ms
FAQ
How is this different from concat-stream?
This module accepts a stream instead of being one, and it returns a promise instead of using a callback. The API is simpler and only supports returning a string, a Buffer, an ArrayBuffer, or an array. It doesn't rely on fragile type inference: you explicitly choose what you want. It also doesn't depend on the large readable-stream package.
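To make the difference concrete, here is a rough side-by-side sketch (callback style with concat-stream versus promise style with this package; the unicorn.txt file is just a placeholder):
import fs from 'node:fs';
import concatStream from 'concat-stream';
import getStream from 'get-stream';
// concat-stream: the consumer is itself a writable stream and takes a callback
fs.createReadStream('unicorn.txt').pipe(concatStream(contents => {
	console.log(contents.toString());
}));
// get-stream: the stream is passed in and a promise is returned
console.log(await getStream(fs.createReadStream('unicorn.txt')));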
Related
- get-stdin - Get stdin as a string or buffer
- into-stream - The opposite of this package