## What it is
CLI tool to patch a Hasura metadata JSON file with needed objects or with another Hasura metadata file. You can use it to build complex CI/CD flows for applications that use Hasura on the backend.
## Why it is useful
If you use several environments, you likely have different webhooks in each one. With this tool you can describe the metadata for each webhook separately and merge it all together when you deploy a release. This approach lets several people develop different webhooks independently.
## Requirements
- Python 3.6 or higher
- Packages from `requirements.txt`
## How it works
### Syntax
```shell
python main.py -r remote_schemas -r actions -r custom_types -r event_triggers -s dev_metadata.json -m prod_metadata.json -o out.json
```
Find the full syntax with `python main.py --help`.
### Supported operations
#### Merge
The default mode. Use it to merge the needed Hasura objects from a mixin file into a source metadata file.
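As a rough illustration of the merge semantics (a simplified sketch, not the tool's actual code — `merge_objects` is a hypothetical helper, and the object shapes are abbreviated): objects are matched by name, a mixin object replaces the source object with the same name, and unmatched mixin objects are appended.

```python
# Illustrative sketch of merge semantics, NOT the patcher's real API:
# objects are matched by "name"; a mixin object overwrites the source
# object with the same name and is appended otherwise.

def merge_objects(source, mixin):
    by_name = {obj["name"]: obj for obj in source}
    for obj in mixin:
        by_name[obj["name"]] = obj  # replace on name match, add otherwise
    return list(by_name.values())

source = [{"name": "myAction", "comment": "0.1.0"}]
mixin = [
    {"name": "myAction", "comment": "0.2.0"},   # overwrites the source object
    {"name": "newAction", "comment": "0.1.0"},  # new object, gets appended
]
merged = merge_objects(source, mixin)
```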
#### Replace (`-r`)
Use it to define which objects should be fully replaced in the source metadata file by objects from the mixin.
For example, if the mixin file is another Hasura metadata file and you call

```shell
python main.py -r event_triggers -s dev_metadata.json -m prod_metadata.json -o out.json
```
then:

- All the metadata objects from `prod_metadata.json` will be mixed in to the metadata from `dev_metadata.json`. This means that if an object does not exist in `dev_metadata.json`, it will be created; if it does exist, it will be replaced with the new object from the mixin file `prod_metadata.json`.
- All the event triggers will be removed for all the tables in the resulting metadata from the previous step.
- New event triggers from `prod_metadata.json` will be inserted instead.
- The resulting metadata goes to `out.json`.
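The replace pass for event triggers can be sketched roughly as follows (a simplified illustration with abbreviated table shapes; `replace_event_triggers` is a hypothetical helper, not the tool's API):

```python
def replace_event_triggers(source_tables, mixin_tables):
    """Drop every event trigger from the source tables, then copy in the
    triggers defined in the mixin; tables are matched by (schema, name).
    Illustrative sketch only, not the patcher's real implementation."""
    mixin_triggers = {
        (t["table"]["schema"], t["table"]["name"]): t.get("event_triggers", [])
        for t in mixin_tables
    }
    for table in source_tables:
        key = (table["table"]["schema"], table["table"]["name"])
        table["event_triggers"] = mixin_triggers.get(key, [])
    return source_tables

dev_tables = [
    {"table": {"schema": "core", "name": "some_table"},
     "event_triggers": [{"name": "devOnlyTrigger"}]},
]
prod_tables = [
    {"table": {"schema": "core", "name": "some_table"},
     "event_triggers": [{"name": "myEventTrigger"}]},
]
patched = replace_event_triggers(dev_tables, prod_tables)
```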
### Typical Hasura metadata release flow
1. Export metadata in `json` format from Hasura in the dev environment, which should be deployed to production.
2. Export metadata in `json` format from Hasura in the current production environment.
3. Run the patcher with the source `json` file from dev and the mixin file from production.
4. Run the patcher with the output `json` file from the previous step and mixin files with the new Hasura objects to deploy.
5. The last output file is your new Hasura metadata for the production environment.
Steps 1-3 are needed to migrate new tables and permissions from the dev environment, since they are not supported by this patcher for now.
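Conceptually, steps 3 and 4 chain two patch passes over the same metadata. A minimal sketch with a toy `patch` helper (purely illustrative: it overwrites whole sections by key, which ignores the per-object merge rules described above):

```python
def patch(metadata, mixin):
    """Toy stand-in for one patcher run: mixin sections overwrite
    same-named sections of the source metadata wholesale."""
    result = dict(metadata)
    result.update(mixin)
    return result

# Hypothetical example data, not real Hasura metadata.
dev = {"actions": ["dev handler"], "tables": ["core.some_table"]}
prod = {"actions": ["prod handler"]}
new_objects = {"event_triggers": ["myEventTrigger"]}

after_step_3 = patch(dev, prod)             # dev metadata + production mixin
release = patch(after_step_3, new_objects)  # + mixins with new objects
```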
## Supported Hasura objects
- Event triggers
- Remote schemas
- Actions
- Custom types (object types and input types)
- Sources
- Tables
- Table permissions
## Hasura objects format to patch
### Event trigger

```json
{
  "type": "event_trigger",
  "object": {
    "name": "myEventTrigger",
    "table": {
      "schema": "core",
      "name": "some_table"
    },
    "definition": {
      "enable_manual": false,
      "insert": {
        "columns": "*"
      },
      "update": {
        "columns": []
      }
    },
    "retry_conf": {
      "num_retries": 0,
      "interval_sec": 10,
      "timeout_sec": 60
    },
    "webhook": "https://mywebhook.url",
    "config": "1.2.3",
    "headers": [
      {
        "name": "token",
        "value": "%TOKEN_ENV%"
      }
    ],
    "comment": "0.1.0"
  }
}
```
### Action

```json
{
  "type": "action",
  "object": {
    "name": "myAction",
    "definition": {
      "handler": "https://mywebhook.url",
      "output_type": "myOutputType",
      "headers": [
        {
          "name": "Authorization",
          "value_from_env": "AUTHORIZATION_HEADER"
        }
      ],
      "arguments": [
        {
          "name": "value",
          "type": "String!"
        }
      ],
      "type": "mutation",
      "kind": "synchronous"
    },
    "permissions": [
      {
        "role": "user"
      }
    ],
    "comment": "0.1.0"
  }
}
```
### Custom type

```json
{
  "type": "custom_type",
  "object": {
    "name": "myCustomType",
    "fields": [
      {
        "name": "affected_rows",
        "type": "Int!"
      }
    ]
  }
}
```
### Remote schema

```json
{
  "type": "remote_schema",
  "object": {
    "name": "myRemoteSchema",
    "definition": {
      "url": "https://myschema.url/graphql",
      "timeout_seconds": 60,
      "forward_client_headers": false,
      "headers": [
        {
          "name": "Authorization",
          "value_from_env": "AUTHORIZATION_HEADER"
        }
      ]
    },
    "comment": "0.1.0"
  }
}
```