libxget-js
Non-interactive, chunk-based web content retriever
Installing
Via NPM:
# as a dependency
npm install libxget
# as a command
npm install -g libxget
This installs a CLI tool accessible on your path as the xget command.
# Check if the xget command has been installed and accessible on your path
$ xget -v
v0.10.0
Usage
CLI
The xget command utilizes the library to retrieve web content in chunks, as specified.
# Normal
xget https://google.com/doodle.png
# Write to output file
xget https://google.com/doodle.png image.png
# Piping output
xget https://myserver.io/runtime.log --no-bar | less
# Stream response in real time (e.g. watching a movie)
xget https://service.com/movie.mp4 | vlc -
Use --help to see the full usage documentation.
Programmatically
import fs from "fs";
import xget from "libxget";

xget("https://github.com/microsoft/TypeScript/archive/master.zip", {
  chunks: 10,
  retries: 10,
}).pipe(fs.createWriteStream("master.zip"));
This downloads the master branch of the TypeScript repository with 10 simultaneous chunk connections, retrying each one up to a maximum of 10 times.
How it works
|progress| |=========|
/- xresilient[axios] -> || part || -> || cache || -\
/- xresilient[axios] -> || part || -> || cache || -\
/- xresilient[axios] -> || part || -> || cache || -\
URL -> xresilient[axios] -> || part || -> || cache || -> chunkmerger [ -> hasher ] -> output
\- xresilient[axios] -> || part || -> || cache || -/
\- xresilient[axios] -> || part || -> || cache || -/
\- xresilient[axios] -> || part || -> || cache || -/
|progress| |=========|
xget, using the axios library, first issues a GET request and, from its abruptly ended response, infers whether or not the server supports byte ranges via the Accept-Ranges or Content-Range headers. If it does, xget opens N connections, each feeding in a non-overlapping segment of the resource. To retry broken connections, xget wraps each request generator in an xresilient stream, ensuring proper retries and probable completion of each chunk stream. The streams are piped and tracked through the progress bar and into a caching stream; all chunks are then merged sequentially, in place and in order, piped through an optional hasher and finally into the output.
The purpose of the caching stream is to ensure that other chunks can begin downloading while the merger is still writing previous chunks. This decouples the download speed from the write speed: received chunks are buffered in memory up to a maximum cache limit. The cacher also sits after the progress bar, so the bar measures the download speed rather than the disk write speed in the (unlikely) case where the disk is slower than the network.
The purpose of the hasher is to check the integrity of the merged chunks while they are being written, instead of making that a separate process. This is very useful for large-file downloads, since the file only has to be processed once, while downloading.
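For instance, a minimal sketch of wiring these pieces together through the options (the URL, file name and cache size here are only illustrative):

import fs from "fs";
import xget from "libxget";

const request = xget("https://example.com/big-file.iso", {
  chunks: 8,          // number of simultaneous, non-overlapping connections
  cache: true,        // buffer received chunks in memory ahead of the merger
  cacheSize: 2 ** 26, // illustrative 64 MiB cap on the in-memory cache
  hash: "sha256",     // run the hasher while the chunks are merged
});

request.pipe(fs.createWriteStream("big-file.iso"));

See xget.getHash() below for reading the computed digest.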
API
xget(url[, options])
- url: <string>
- options: <XGETOptions>
- Returns: <XGETStream>
<a id='xgetoptions'></a> XGETOptions <sub>extends AxiosOpts</sub>: Object
- chunks: <number> Maximum number of non-overlapping chunk connections. Default: 5
- retries: <number> Number of retries for each chunk. Default: 5
- timeout: <number> Network response timeout (ms). Default: 20000
- start: <number> Position to start feeding the stream from. Default: 0
- auto: <boolean> Whether or not to start the request automatically or wait for a request.start() call (useful when chaining events you want to fire in order). Default: true
- size: <number> Number of bytes to stream off the response.
- hash: <string> Hash algorithm used to create a crypto.Hash instance that computes the stream hash.
- cache: <boolean> Whether or not to use an in-memory cache to enable read-aheads of pending chunks.
- cacheSize: <number> Custom maximum cache size (bytes).
- use: <object> Key-value pairs of middlewares with which to pipe the response object through; keys are strings, values are Transformer-generating functions. (Alternatively, use the xget.use() method.)
- with: <object> Key-value pairs of middlewares with which to pipe the dataslice object through; keys are strings, values are functions whose return values are accessible within the store. (Alternatively, use the xget.with() method.)
- headHandler: <HeadHandler> An interceptor for the initial headers, useful for programmatically defining a range offset.
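As a sketch of the object forms of these options (the URL, file names, tags, offset and middleware bodies are illustrative; the equivalent xget.use() and xget.with() methods are described below):

import fs from "fs";
import zlib from "zlib";
import xget from "libxget";

xget("https://example.com/data.bin", {
  chunks: 4,
  timeout: 30000,
  // skip the first KiB when the server accepts byte ranges (illustrative offset)
  headHandler: ({ acceptsRanges }) => (acceptsRanges ? 1024 : 0),
  // compress the content in real time (mirrors the xget.use() example below)
  use: { gzip: () => zlib.createGzip() },
  // store a value retrievable later via the 'set' event or xget.store
  with: { startedAt: () => Date.now() },
}).pipe(fs.createWriteStream("data.bin.gz"));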
<a id='xgetstore'></a> xget.store: Map
A map whose keys and values are the tags and return values of the middlewares processed within the with stack of the xget object.
xget(URL)
  .with("variable", () => 5)
  .once("set", (store) => {
    /*
     * `store` is a map whose keys and values directly match the tags and return
     * values of a with() call or the with object in the xget options
     */
    console.log(store.get("variable")); // 5
  })
  .pipe(FILE);
xget.ended: Boolean
A readonly property that tells whether or not the xget instance has ended.
xget.loaded: Boolean
A readonly property that tells whether or not the xget instance has been loaded.
xget.bytesRead: Number
A readonly property that tells how many bytes have been processed by the underlying streams.
<a id='xgetstream'></a> Class: XGETStream <sub>extends stream.Readable</sub>
The core multi-chunk request instance.
new xget.XGETStream(url[, options])
- url: <string>
- options: <XGETOptions>
- Returns: <XGETStream>
Event: 'end'
The 'end' event is emitted after the data from the URL has been fully flushed.
Event: 'set'
- store: <xget.store> The shared internal data store.

The 'set' event is emitted after all the middlewares defined in the with option of the XGETOptions, or via the xget.with() method, have been executed.
This event is fired after the 'loaded' event.
Event: 'error'
- err: <Error> The error instance.

The 'error' event is emitted once a chunk has met its maximum number of retries, at which point the other chunk connections are abruptly destroyed.
Event: 'retry'
- retrySlice:
  - meta: <boolean> Whether or not the error causing the retry occurred while getting the URL metadata, i.e. before any streams are employed.
  - index: <number> The index count of the chunk.
  - retryCount: <number> The number of retry iterations so far.
  - maxRetries: <number> The maximum number of retries possible.
  - bytesRead: <number> The number of bytes previously read (if any).
  - totalBytes: <number> The total number of bytes that are to be read by the stream.
  - lastErr: <Error> The error emitted by the previous stream.
  - store: <xget.store> The shared internal data store.
The 'retry' event is emitted by every chunk once it has been re-initialized underneath.
Based on the spec of the xresilient module, chunks are re-initialized once an error event is encountered.
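A minimal sketch of observing retries (assuming, as the field list above suggests, that the retrySlice object is passed as the sole listener argument; the URL and file name are placeholders):

import fs from "fs";
import xget from "libxget";

xget("https://example.com/flaky-download.bin", { retries: 10 })
  .on("retry", ({ meta, index, retryCount, maxRetries, lastErr }) => {
    const scope = meta ? "metadata" : `chunk #${index}`;
    console.warn(`retrying ${scope} (${retryCount}/${maxRetries}): ${lastErr.message}`);
  })
  .pipe(fs.createWriteStream("flaky-download.bin"));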
Event: 'loaded'
- loadData: <LoadData> The pre-computed config for the loaded data slice.

This is emitted right after the initial headers have been received, preprocessed, parsed and used to tailor the configuration for the chunk setup.
This loadData contains information such as the actual size of the remote file and whether or not the server supports multiple connections, chunking, file resumption, etc.
This event is fired after the headHandler is called and prior to the 'set' event.
xget.start()
- Returns: <boolean>
Starts the request process if options.auto was set to false.
Returns true if the request was started, false if it had already been started.
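A minimal sketch of deferred starting (the URL and file name are placeholders), useful when listeners should be attached before any network activity:

import fs from "fs";
import xget from "libxget";

const request = xget("https://example.com/archive.tar.gz", { auto: false });

request.once("loaded", ({ totalSize, chunkable }) => {
  console.log(`size: ${totalSize} bytes, chunkable: ${chunkable}`);
});

request.pipe(fs.createWriteStream("archive.tar.gz"));

console.log(request.start()); // true: the request had not been started yet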
xget.getHash([encoding])
Calculates the digest of all data that has been processed by the library and its middleware transformers. This creates a deep copy of the internal state of the current crypto.Hash object, from which the digest is calculated.
This ensures you can get a hash of the data processed so far even while still streaming from the URL response.
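For instance, a sketch that samples the running digest while the download is still in progress (assuming getHash accepts a crypto.Hash-style digest encoding such as 'hex'; the URL, file name and interval are illustrative):

import fs from "fs";
import xget from "libxget";

const request = xget("https://example.com/archive.zip", { hash: "sha256" });

// Safe to call repeatedly: the digest is computed from a copy of the internal hash state.
const probe = setInterval(() => {
  console.log(`bytes read: ${request.bytesRead}, partial sha256: ${request.getHash("hex")}`);
}, 5000);

request
  .once("end", () => {
    clearInterval(probe);
    console.log(`final sha256: ${request.getHash("hex")}`);
  })
  .pipe(fs.createWriteStream("archive.zip"));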
xget.getHashAlgorithm()
- Returns: <string>
Returns the hash algorithm if any is in use.
xget.setHeadHandler(fn)
- fn: <HeadHandler> Handler to be set.
- Returns: <boolean> Whether or not the handler was successfully set.

Sets an interceptor for the initial headers, useful for programmatically defining a range offset.
Returns false if the request has already been loaded, true if successfully set.
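A sketch of resuming into an existing partial file by offsetting the stream programmatically (the URL and file name are placeholders; the handler receives the HeadHandler props described below):

import fs from "fs";
import xget from "libxget";

const file = "movie.mp4";
const alreadyDownloaded = fs.existsSync(file) ? fs.statSync(file).size : 0;

const request = xget("https://example.com/movie.mp4");

// Returns false if the request has already been loaded.
request.setHeadHandler(({ acceptsRanges, totalSize }) =>
  acceptsRanges && alreadyDownloaded < totalSize ? alreadyDownloaded : 0
);

request.pipe(fs.createWriteStream(file, { flags: "a" }));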
xget.setCacheCapacity(size)
- size: <number>
- Returns: <XGETStream>

Sets the maximum capacity of the internal cache.
<a id='xgetuse'></a> xget.use(tag, handler)
- tag: <string>
- handler: <UseMiddlewareFn>
- Returns: <XGETStream>
Add a named handler to the use middleware stack whose return value would be used to transform the response stream in a series of pipes.
The handler method is called after the stream has been requested and we start pumping the underlying request instances for a data response stream.
The core expects the handler to return a stream.Duplex instance (a readable, writable stream) used to transform or pass through the raw data streams along the way.
// Example: compressing the response content in real time
xget(URL)
  .use("compressor", () => zlib.createGzip())
  .pipe(createWriteStreamSomehow());
<a id='xgetwith'></a> xget.with(tag, handler)
- tag: <string>
- handler: <WithMiddlewareFn>
- Returns: <XGETStream>
Add a named handler to the with middleware stack whose return value would be stored within the store after execution.
xget(URL)
  .with("bar", ({ size }) => progressBar(size)) // Create a finite-sized progress bar
  .use("bar", (_, store) => store.get("bar").genStream()) // Create a stream-handling object that updates the progress bar from the number of bytes flowing through it
  .once("set", (store) => store.get("bar").print("Downloading..."))
  .pipe(createWriteStreamSomehow());
<a id='xgetgeterrcontext'></a> xget.getErrContext(err)
Extract data from an error if it was either thrown from within a UseMiddlewareFn or a WithMiddlewareFn function.
xget(URL)
  .use('errorThrower', () => {
    throw new Error('Custom error being thrown');
  })
  .once('error', (err) => {
    const { tag, source } = xget.getErrContext(err);
    if (source)
      console.log(`Error thrown from within the [${tag}] method of the [${source}] middleware`);
    // Error thrown from within the [errorThrower] method of the [xget:use] middleware
  })
  .pipe(createWriteStreamSomehow());
<a id='headhandler'></a> HeadHandler: function
- props: <object>
  - chunks: <number> Number of chunks the resource can simultaneously provide.
  - headers: <IncomingHttpHeaders> GET headers from the URL.
  - start: <number> Relayed .start field from the XGETOptions.
  - totalSize: <number> Actual size of the resource without an offset.
  - acceptsRanges: <boolean> Whether or not the URL resource accepts byte ranges.
- Returns: <number | void> An offset to begin streaming from. Analogous to the .start field in the XGETOptions. If void, defaults to .start or 0.
An interceptor for the initial GET data, useful for programmatically defining a range offset.
<a id='loaddata'></a> LoadData: Object
- url: <string> The URL specified.
- size: <number> Finite number returned if the server responds appropriately, else Infinity.
- start: <number> Sticks to specification if the server allows chunking via content-range, else resets to 0.
- chunkable: <boolean> Whether or not the URL feed can be chunked, supporting simultaneous connections.
- totalSize: <number> Actual size of the resource without an offset.
- chunkStack: <ChunkLoadInstance[]> The chunk-stack array.
- headers: <IncomingHttpHeaders> The headers object.
<a id='chunkloadinstance'></a> ChunkLoadInstance: Object
- min: <number> The minimum extent for the chunk segment range.
- max: <number> The maximum extent for the chunk segment range.
- size: <number> The total size of the chunk segment.
- stream: <ResilientStream> A resilient stream that wraps around a request instance.
<a id='withmiddlewarefn'></a> WithMiddlewareFn: Function
- loadData: <LoadData>

This handler is called immediately after the metadata describing the response is loaded from the URL.
That is, pre-streaming data from the GET response such as size (content-length), content-type, filename (content-disposition), whether or not it's chunkable (accept-ranges, content-range) and a couple of other criteria.
This information is passed into the handler, whose return value is filed within the store referenced by the tag.
<a id='usemiddlewarefn'></a> UseMiddlewareFn: Function
- dataSlice: <ChunkLoadInstance>
- store: <xget.store>
- Returns: <stream.Duplex>
CLI Info
- To avoid cluttering the terminal while using pipes, direct other chained binaries' stdout and stderr to /dev/null
# Watching from a stream, hiding vlc's log information
xget https://myserver.com/movie.mp4 | vlc - > /dev/null 2>&1
Development
Building
Feel free to clone and use in adherence to the license. Pull requests are very much welcome.
git clone https://github.com/miraclx/libxget-js.git
cd libxget-js
npm install
# hack on code
License
Apache 2.0 © Miraculous Owonubi (@miraclx) <omiraculous@gmail.com>