<br />
<p align="center"><img alt="AlgoMart Logo" src="./AlgoMart-Logo.png" width="400" height="140"></p>
<br />

# AlgoMart Marketplace
## 2.0 Work In Progress

Please note that the current version of this project should be considered unstable for the time being as we finalize the upgrade to version 2.0. See the Migration notes for details on upgrading.
This project is developed to be a foundational starter for creating your own NFT storefront on the Algorand blockchain. It is a monorepo that includes:
- A headless CMS (Directus)
- A back-end API (Fastify)
- A job server (BullMQ)
- A front-end sample implementation (NextJS)
- Shared libs
- Terraform templates for setting up infrastructure on Google Cloud Platform
- GitHub Workflows for linting, type-checking, building, dockerizing, and deploying
## General Project Overview
The main purpose of this platform is twofold: to make it easy for developers to spin up a custom storefront that interfaces with the blockchain, and to make that storefront accessible to storefront administrators and end-users who might not be familiar with the technical nuances of blockchain technology.
To accomplish this, storefront administrators should be able to easily create, configure, and mint NFTs. Likewise, end-users should be able to redeem, purchase, or bid on them without concern for what's happening behind the scenes on the blockchain.
### Backend Overview
To meet these goals, the CMS uses templates.
NFT Templates represent an NFT that will be minted (any number of specified times). Each includes key information such as title, description, rarity, and other important configurable metadata that will be consumed by the API.
NFT Templates are grouped within Pack Templates. Packs can contain one or more NFT Templates and also have a number of configurable settings. Most notably, they can be set to be purchasable, to be auctioned, to be claimed via an auto-generated redemption code, or to be given away free. For a full overview of the CMS model, see the CMS README.
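As a rough mental model, a pack and its NFTs can be pictured like this (an illustrative sketch only; the type and field names below are assumptions, not the actual CMS schema or codebase types):

```typescript
// Illustrative shapes only, not the real AlgoMart types.
type PackType = 'purchase' | 'auction' | 'redeem' | 'free'

interface NFTTemplate {
  title: string
  description: string
  rarity?: string
  totalEditions: number // how many instances of this NFT will be minted
}

interface PackTemplate {
  type: PackType // purchasable, auctioned, claimable via redemption code, or free
  nftTemplates: NFTTemplate[] // one or more NFT Templates grouped into the pack
}
```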
Meanwhile, the API continually polls the CMS for new NFT Templates and Pack Templates. So once the templates are configured in the CMS, the API will find them, generate the NFT instances in the API's database, and then group them into Packs based on the Pack template's configuration.
From here on out, the NFT and Pack information can be accessed from the API and displayed to an end-user, who can then purchase, bid on, redeem, or freely claim the Pack (based on the corresponding Pack template's configuration).
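This CMS-to-API sync can be pictured as a repeating background job. The sketch below uses BullMQ (the queue wiring and the `syncCmsTemplates` helper are assumptions for illustration; the real `sync-cms-cache` job in the scribe project is more involved):

```typescript
import { Queue, Worker } from 'bullmq'

const connection = { host: 'localhost', port: 6379 } // Redis (see Requirements)

// Hypothetical helper standing in for the real sync logic: pull new NFT and Pack
// Templates from the CMS, create NFT records in the API database, and group them into Packs.
async function syncCmsTemplates(): Promise<void> {
  /* ... */
}

async function main() {
  // Schedule the job to repeat on an interval.
  const queue = new Queue('sync-cms-cache', { connection })
  await queue.add('sync', {}, { repeat: { every: 60_000 } })

  // Process each scheduled run.
  new Worker('sync-cms-cache', syncCmsTemplates, { connection })
}

main().catch(console.error)
```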
### Frontend Overview
The backend API can be accessed via REST endpoints by a frontend. This frontend can be custom-built, or the included NextJS web project can be used and further customized.
When an end-user registers through the site, a user account record is created via the API and a new Algorand wallet is generated on their behalf. That wallet's mnemonic key is encrypted via a secret PIN code the end-user provides upon sign-up.
An authenticated end-user can then engage in user flows that allow them to acquire Packs (again, containing one or more to-be-minted NFTs). In the case of a monetary transaction, an end-user can enter their credit card information. Upon submission, this information will be validated and processed via Circle's Payments API. Upon a valid confirmation, the API then mints and transfers the assets to the user's wallet.
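Conceptually, the wallet handling works something like the sketch below (illustrative only; the key-derivation and encryption parameters are assumptions, not the project's actual scheme):

```typescript
import algosdk from 'algosdk'
import { createCipheriv, randomBytes, scryptSync } from 'node:crypto'

// Encrypt a mnemonic with a key derived from the user's PIN (parameters are assumptions).
function encryptMnemonic(mnemonic: string, pin: string) {
  const salt = randomBytes(16)
  const key = scryptSync(pin, salt, 32)
  const iv = randomBytes(12)
  const cipher = createCipheriv('aes-256-gcm', key, iv)
  const encrypted = Buffer.concat([cipher.update(mnemonic, 'utf8'), cipher.final()])
  return { encrypted, salt, iv, authTag: cipher.getAuthTag() }
}

// Generate a custodial wallet for a new user and lock its mnemonic behind their PIN.
const account = algosdk.generateAccount()
const mnemonic = algosdk.secretKeyToMnemonic(account.sk)
const vault = encryptMnemonic(mnemonic, '123456') // PIN collected at sign-up
```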
## Pre-release
This software is in a pre-release state. This means while we strive to keep it stable and include database migrations, sometimes we may introduce breaking changes or an accidental bug. Follow our issue tracker for more details on what's coming next.
## Requirements
- Node.js v16.10 or greater (the LTS release, v16.13.1 as of January 2022, works well) and npm v7 or greater (manage versions via nvm)
- PostgreSQL (Postgres.app is recommended on macOS, or run it via `docker-compose up db`)
- Redis for jobs
- algod (Algorand node)
  - Consider using a third-party API for initial setup and experimentation.
  - A sandbox is recommended for continued learning about Algorand and its smart signatures/smart contracts.
- Circle account for taking payments
- Firebase account for authentication
- Pinata account for storing NFTs
- Chainalysis for blockchain address verification
## Optional
- SendGrid for sending email notifications
- Google Cloud Platform account for hosting
- Install the Nx CLI for ease of development: `npm i -g nx`
- Docker for a local dev environment using VSCode Dev Containers or docker compose
## Get Started
You can either build and run each application manually or you can use `docker-compose`.
### Development Environment Setup
- Create `.env` files

  ```bash
  cp ./.env.example ./.env
  cp ./apps/cms/.env.example ./apps/cms/.env
  cp ./apps/scribe/.env.example ./apps/scribe/.env
  cp ./apps/api/.env.example ./apps/api/.env
  cp ./apps/web/.env.example ./apps/web/.env
  ```
- Address `SETUP:` comments in the env files
- Initialize the databases with `npm run drop-tables && npm run initialize`
- Start the CMS: `nx serve cms`
  - If the database is empty, it will automatically seed itself with test data.
  - Checkpoint: you should be able to log in to the CMS and see some sample pack templates (http://localhost:8055/admin/content/pack_templates).
- Start the job server: `nx serve scribe`
  - Checkpoint: you should be able to see the jobs dashboard (http://localhost:3002/bullboard).
  - Run the `sync-cms-cache` job manually, twice, by promoting the delayed job in the dashboard.
  - Checkpoint: you should see rows in the Pack table in the API database (`algomart_api.public.Pack`).
- Start the API: `nx serve api`
  - Checkpoint: you should be able to see the Swagger docs (http://localhost:3001/docs/static/index.html).
- Start the web server: `nx serve web`
  - Checkpoint: you should be able to register an AlgoMart account, sign in, and see some drops available (http://localhost:3000/drops).
- Configure Circle webhooks
  - Add money to your merchant wallet to cover/float pending end-user credit card transactions:
    - Get your merchant wallet's address using your Circle my-sandbox account's "Transfer from a blockchain wallet" functionality.
    - Send testnet USDC to your merchant wallet from the Algorand Testnet Dispenser.
  - Checkpoint: you can add money to your wallet using one of Circle's test card numbers.
- Purchase a pack using credits
  - Use the credits added in the previous step to purchase a pack.
  - Checkpoint: you have collectibles in your collection.
- Configure an Algorand sandbox (optional)

### Local Algorand Setup
This is an alternative to using a third-party Algorand node API. For local development, the Algorand Sandbox is a handy Docker instance that makes interfacing with the blockchain simple from your local machine. More information on the Sandbox vs. third-party API services is available here.
- Download the Algorand Sandbox and start up the Docker instance:

  ```bash
  ./sandbox up
  ```

- By default this will create a private network, as well as fund a few accounts with Algos. You'll need to export the passphrase mnemonic for one of these accounts. To see the list of created accounts:

  ```bash
  ./sandbox goal account list
  ```

- Take the `<ADDRESS>` from the first account and use it here:

  ```bash
  ./sandbox goal account export -a <ADDRESS>
  ```

  Use the outputted mnemonic as the `FUNDING_MNEMONIC` variable within the `.env` file of both the `api` and `scribe` projects.

  Disclaimer: If you use a private network as described above, you will not be able to test features that require a public network, including the Pera Wallet (the mobile non-custodial wallet app for Algorand). For running a public network node, see below.
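As a quick, optional sanity check, you can confirm algod is reachable from Node using the algosdk package (this snippet is illustrative only; the host, port, and all-"a" token are the sandbox defaults, so adjust them if your configuration differs):

```typescript
import algosdk from 'algosdk'

// Default Algorand Sandbox algod connection values (the API token is 64 "a" characters).
const token = 'a'.repeat(64)
const algod = new algosdk.Algodv2(token, 'http://localhost', 4001)

async function main() {
  // If the sandbox is up, this prints the node status (including the last round).
  const status = await algod.status().do()
  console.log('algod status:', status)
}

main().catch(console.error)
```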
### Testnet Algorand Setup
Alternatively, you may choose to run the Algorand Sandbox on a public network such as testnet. In that case you'll need a few additional steps, such as creating and funding an account.
- To run the sandbox on testnet:

  ```bash
  ./sandbox up testnet
  ```

- Then create an account:

  ```bash
  ./sandbox goal account new
  ```

  This will create a new, unfunded account. Testnet accounts can be funded with fake Algos using the Testnet Dispenser. You can then follow the account export steps above to get your mnemonic passphrase.
Disclaimer: The sandbox testnet configuration will not provide an indexer. There are public indexers available (e.g. https://algoindexer.testnet.algoexplorerapi.io/), and the Indexer Configuration will need to be updated in both the `api` and `scribe` .env files.

To learn more about available `goal` CLI commands (e.g. for creating a new account), see the Algorand docs.
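For reference, this is roughly what pointing an algosdk indexer client at the public TestNet indexer mentioned above looks like (an illustrative sketch; how the apps read these values from their `.env` files is not shown here, so check the `.env.example` files for the actual variable names):

```typescript
import algosdk from 'algosdk'

// Public TestNet indexer from the note above; it does not require an API token.
const indexer = new algosdk.Indexer('', 'https://algoindexer.testnet.algoexplorerapi.io', 443)

async function main() {
  // A successful health check confirms the indexer endpoint is reachable.
  const health = await indexer.makeHealthCheck().do()
  console.log('indexer health:', health)
}

main().catch(console.error)
```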
## DB initialization
To initialize the databases:
```bash
npm run drop-tables # drop existing CMS and API databases
npm run initialize  # initialize the CMS and API databases
```
## Run
Run all 4 projects (api, cms, web, & scribe) simultaneously with combined output.
```bash
npm start
```
## Build
To build everything:
```bash
npm run build
```
## Unit Tests
To run all tests:
```bash
npm test
```
To run tests/lint for only a specific library:

```bash
# assuming shared-utils is an nx library
# (i.e. an alias for libs/shared/utils defined in workspace.json)...
nx run shared-utils:test
nx run shared-utils:lint
```
## E2E Tests
To run end-to-end integration tests with Cypress, be sure to follow the steps outlined in the web-e2e README first.
```bash
# To open the Cypress UI and watch the tests run:
npm run test:cypress:open

# To run the tests in the terminal:
npm run test:cypress:run
```
## Linting
To run eslint for all projects:
```bash
npm run lint
```
## Running with docker-compose
As an alternative to running the services manually, they can also be run via Docker. After creating the relevant `.env` files above, add a file called `.babelrc` to the root of the web project (`apps/web/`) and populate it with:

```json
{ "presets": ["next/babel"] }
```

Then run all services (e.g. `docker-compose up`).
## Running with VSCode Dev Containers
This codebase leverages VSCode Dev Containers.
- Install the VSCode Remote Containers plugin.
- Open the project in VSCode. When prompted, click the "Reopen in Container" button.
## Project dependencies
When updating dependencies, there are a few things that must be kept in mind.
### Directus
If doing any updates to the Directus version, the version numbers must match across the application, and the snapshot.yml file must be recreated with the updated version. You can use `apps/cms/scripts/directus-update.sh` to perform these steps; update the version number at the top of the script.
- Update versions
  - Pin `directus`, `@directus/sdk`, and `@directus/extensions-sdk` versions in `package.json`
  - Pin `host` version in `/apps/cms/extensions/displays/pack-price/package.json`
  - Pin `host` version in `/apps/cms/extensions/interfaces/price-conversion/package.json`
  - Pin `host` version in `/apps/cms/extensions/hooks/import-data/package.json`
  - Pin `host` version in `/apps/cms/extensions/hooks/kyc-management/package.json`
  - Pin `host` version in `/apps/cms/extensions/hooks/set-verification-status/package.json`
  - Set the npm install step of `/docker/deploy/cms/Dockerfile` to the version
- Run `npm install` from the root to generate the latest `package-lock.json`
- Run `nx export cms` to generate the latest `snapshot.yml`
- Rebuild the cms extensions, either via `nx build cms` or all of these:
  - Run `nx build-price-display cms` to generate the latest js file
  - Run `nx build-price-interface cms` to generate the latest js file
  - Run `nx build-import-data cms` to generate the latest js file
  - Run `nx build-kyc-management cms` to generate the latest js file
  - Run `nx build-set-verification-status cms` to generate the latest js file
### Libs

`libs/*` contains shared TypeScript interfaces and enums.
For performance and code organization reasons, the Nx docs recommend putting as much functionality as possible into libs, even if the code is only used in a single app. In Nx, a lib is more than just a directory under the `libs/` directory. Each lib must have an entry in the workspace.json file for the lib to build and import correctly.
Linting will fail for any lib code that tries to import code from an app. This means that lib code should never access things like global configuration variables or environment variables (e.g. `Configuration`). Rather, lib code should receive any environment configuration via arguments that are passed in.
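For example, a lib helper should take its configuration as parameters instead of reading `Configuration` or `process.env` directly (an illustrative sketch; the function and option names are made up):

```typescript
// libs/shared/utils (hypothetical example)
export interface AlgodOptions {
  server: string
  port: number
}

// Good: the lib receives its configuration from the caller (an app passes these in).
export function buildAlgodStatusUrl({ server, port }: AlgodOptions): string {
  return `${server}:${port}/v2/status`
}

// Bad (would fail lint in a lib): reaching into app-level config or env directly.
// const server = process.env.ALGOD_SERVER
```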
If you wanted to create a new library at the path `libs/shared/utils`, you'd use the nx generator:

```bash
nx generate @nrwl/node:lib utils --directory shared
```
## Deployment
Please see the detailed step-by-step guide for instructions on how to use the included Terraform templates and GitHub Workflows to create a complete storefront environment on Google Cloud Platform.