S3LevelDown

An abstract-leveldown compliant implementation of LevelDOWN that uses Amazon S3 as a backing store. S3 is effectively a giant key-value store in the cloud, even though it is marketed as a file store. Use this database with the LevelUP API.

To use this optimally, please read "Performance considerations" and "Warning about concurrency" sections below.

You can also use this as an alternative API for reading and writing S3. It is simpler to use than the AWS SDK!
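For instance, a simple put/get round trip through the LevelUP API might look like the sketch below (the bucket name my_bucket is a placeholder; the bucket must already exist and your AWS credentials must be configured):

```javascript
const levelup = require('levelup');
const S3LevelDown = require('s3leveldown');

(async () => {
  // 'my_bucket' is a placeholder; use an existing bucket you can write to
  const db = levelup(new S3LevelDown('my_bucket'));

  // each put/get maps to a single S3 object write/read
  await db.put('greeting', 'hello');
  const value = await db.get('greeting');
  console.log(value.toString()); // values come back as Buffers by default
})();
```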

Installation

Install s3leveldown and peer dependencies levelup and @aws-sdk/client-s3 with yarn or npm.

$ npm install s3leveldown @aws-sdk/client-s3 levelup

Documentation

See the LevelUP API for high level usage.

new S3LevelDown(location [, s3])

Constructor of the s3leveldown backing store. Use with levelup.

Arguments:

- location: name of the S3 bucket, optionally followed by a sub-folder path (e.g. my_bucket/sub_folder)
- s3: (optional) S3 client instance from @aws-sdk/client-s3; if omitted, a client with the default configuration is used

Example

Please refer to the AWS SDK docs to set up your API credentials before using.

Using Promises

const levelup = require('levelup');
const S3LevelDown = require('s3leveldown');

(async () => {
  // create DB
  const db = levelup(new S3LevelDown('mybucket'));

  // put items
  await db.batch()
    .put('name', 'Pikachu')
    .put('dob', 'February 27, 1996')
    .put('occupation', 'Pokemon')
    .write();
  
  // read items
  // read items (createReadStream returns a stream, not a promise, so wrap it)
  await new Promise((resolve, reject) => {
    db.createReadStream()
      .on('data', data => { console.log('data', `${data.key.toString()}=${data.value.toString()}`); })
      .on('error', reject)
      .on('close', () => { console.log('done!'); resolve(); });
  });
})();

Using Callbacks

const levelup = require('levelup');
const S3LevelDown = require('s3leveldown');

const db = levelup(new S3LevelDown('my_bucket'));

db.batch()
  .put('name', 'Pikachu')
  .put('dob', 'February 27, 1996')
  .put('occupation', 'Pokemon')
  .write(function (err) {
    if (err) throw err;
    db.createReadStream()
      .on('data', console.log)
      .on('close', function () { console.log('Pika pi!') })
  });

Example with min.io

You can also use s3leveldown with S3-compatible servers such as MinIO.

const levelup = require('levelup');
const S3LevelDown = require('s3leveldown');
const { S3 } = require('@aws-sdk/client-s3');

const s3 = new S3({
  region: 'us-east-1',
  endpoint: 'http://127.0.0.1:9000',
  forcePathStyle: true,
  credentials: {
    accessKeyId: 'YOUR-ACCESSKEYID',
    secretAccessKey: 'YOUR-SECRETACCESSKEY'
  }
});

const db = levelup(new S3LevelDown('my_bucket', s3));

Example with PouchDB
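PouchDB can sit on top of any LevelDOWN-compatible store via its db option. A minimal sketch, assuming the pouchdb package is installed and my_bucket is an existing bucket you can write to:

```javascript
const PouchDB = require('pouchdb');
const S3LevelDown = require('s3leveldown');

// a regular function (not an arrow) so it works whether PouchDB
// invokes the factory with or without `new`
const db = new PouchDB('my_bucket', {
  db: function (location) { return new S3LevelDown(location); }
});

db.put({ _id: 'pikachu', type: 'Pokemon' })
  .then(() => db.get('pikachu'))
  .then(doc => console.log(doc));
```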

Sub folders

You can create your LevelDB in a sub-folder of your S3 bucket; just pass my_bucket/sub_folder as the location.
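For example (my_bucket and sub_folder are placeholders), keys for this database would then live under the sub_folder/ prefix of the bucket:

```javascript
const levelup = require('levelup');
const S3LevelDown = require('s3leveldown');

// all keys for this database are stored under the sub_folder/ prefix
const db = levelup(new S3LevelDown('my_bucket/sub_folder'));
```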

Performance considerations

There are a few performance caveats due to the limited API that S3 provides. Every read, write, and key listing maps to one or more HTTP requests, so expect far higher latency than with a local LevelDB store.

Warning about concurrency

Individual operations (put, get, del) are atomic as guaranteed by S3, but the implementation of batch is not atomic: two concurrent batch calls can have their operations interleaved. Do not use any plugins that require batch to be atomic, or you will end up with a corrupted database! If you can guarantee that only one process writes to the S3 bucket at a time, this is not an issue. Ideally, avoid race conditions where two processes write to the same key at the same time; in those cases the last write wins.

Iterator snapshots are not supported. When iterating through a list of keys and values, you may observe changes made by concurrent writers, similar to a dirty read.
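For example, when iterating over a key range with createReadStream (gt, lt, and limit are standard LevelUP range options; my_bucket is a placeholder), writes from another process may show up part-way through the iteration:

```javascript
const levelup = require('levelup');
const S3LevelDown = require('s3leveldown');

const db = levelup(new S3LevelDown('my_bucket')); // placeholder bucket name

// standard LevelUP range options; results reflect S3 as it is listed,
// so there is no snapshot isolation against concurrent writers
db.createReadStream({ gt: 'user:', lt: 'user;', limit: 100 })
  .on('data', ({ key, value }) => console.log(key.toString(), value.toString()))
  .on('close', () => console.log('done'));
```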

Tests and debug

S3LevelDown uses debug. To see debug messages, set the environment variable DEBUG=S3LevelDown.

To run the test suite, set an S3 bucket name in the environment variable S3_TEST_BUCKET, and make sure your AWS credentials are configured.

$ S3_TEST_BUCKET=my-test-bucket npm run test

License

MIT