# go-graphql-subscription-example
Project that demonstrates GraphQL subscriptions (over WebSocket) to consume pre-configured topics from different kinds of stream sources such as Apache Kafka, Redis, NSQ...
## Purpose
This repository implements a simple service allowing clients to consume messages from topics/channels through a GraphQL subscription endpoint.
<p align="center">
  <img src="doc/overview.png" title="overview">
</p>

This particular example demonstrates how to perform basic operations such as:

- serve a GraphiQL page
- expose a Prometheus endpoint
- implement a subscription resolver using WebSocket transport, compliant with the Apollo v0.9.16 protocol (a sketch of the pattern follows this list)
- implement custom GraphQL scalars
- consume the following kinds of stream sources:
  - Apache Kafka - an open-source stream-processing system which aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.
  - Redis Streams - an open-source, in-memory data structure store and message broker with streaming capabilities.
  - NSQ - a realtime distributed messaging platform designed to operate at scale.
- process messages using reactive streams
- filter messages using an expression evaluator
- ...
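The subscription-resolver pattern itself boils down to returning a channel that the GraphQL layer drains and pushes over the WebSocket. Below is a minimal, library-agnostic sketch of that idea; it is not the project's actual resolver code, and all names are purely illustrative:

```go
// Library-agnostic sketch of a subscription resolver: it returns a channel
// fed from the stream source; the GraphQL layer drains it and pushes each
// value to the WebSocket client. Not the project's actual code.
package resolver

import "context"

// Event streams messages for the requested topic until the client disconnects
// (ctx is cancelled) or the source channel is closed.
func Event(ctx context.Context, source <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for {
			select {
			case <-ctx.Done():
				return
			case msg, ok := <-source:
				if !ok {
					return
				}
				select {
				case out <- msg:
				case <-ctx.Done():
					return
				}
			}
		}
	}()
	return out
}
```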
## Pre-requisites
Requires Go 1.14.x or above, which supports Go modules. Read more about them here.
## Build
The project comes with a `magefile.go`, so all the main activities can be performed with mage.
:warning: The source code provided is incomplete - the build needs a code generation phase, especially for embedding the static resources.
To build the project, simply invoke the `build` target:

```sh
mage build
```
Alternatively, the project can be built with Docker:

```sh
mage docker
```

The command will produce the image `ccamel/go-graphql-subscription-example`.
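For readers unfamiliar with mage, targets are just exported Go functions declared in the magefile. The following is a purely illustrative sketch of what such targets can look like; it is not this project's actual `magefile.go` (which notably includes the code-generation phase mentioned above):

```go
//go:build mage
// +build mage

package main

import "github.com/magefile/mage/sh"

// Build compiles the service binary (illustrative only).
func Build() error {
	return sh.Run("go", "build", "-o", "go-graphql-subscription-example", ".")
}

// Docker builds the container image (illustrative only).
func Docker() error {
	return sh.Run("docker", "build", "-t", "ccamel/go-graphql-subscription-example", ".")
}
```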
## How to play with it using Kafka?
### 1. Start Kafka server
First, Kafka must be started. See the official documentation for more.
Kafka uses ZooKeeper, so you need to first start a ZooKeeper server if you don't already have one.

```sh
> bin/zookeeper-server-start.sh config/zookeeper.properties
```

Now start the Kafka server:

```sh
> bin/kafka-server-start.sh config/server.properties
```
### 2. Create topics
For the purpose of the demo, some topics must be created. So, let's create two topics named `topic-a` and `topic-b`, each with a single partition and only one replica:

```sh
> bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic topic-a
> bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic topic-b
```
### 3. Start the GraphQL server
The configuration is pretty straightforward:

```sh
> ./go-graphql-subscription-example --help
Usage:
  go-graphql-subscription-example [flags]

Flags:
  -h, --help             help for go-graphql-subscription-example
      --port uint16      The listening port (default 8000)
      --source string    The URI of the source to connect to
      --topics strings   The list of topics/stream names that subscribers can consume (default [foo])
```
Run the application, which exposes the two previously created topics to subscribers:

```sh
> ./go-graphql-subscription-example --topics topic-a,topic-b
```

Alternatively, if the Docker image has been previously built, the container can be started this way:

```sh
> docker run -ti --rm -p 8000:8000 ccamel/go-graphql-subscription-example --topics topic-a,topic-b
```
### 4. Subscribe
The application exposes a GraphQL endpoint through which clients can receive messages coming from a Kafka topic.

Navigate to the http://localhost:8000/graphiql URL and submit a subscription to the topic `topic-a`:

```graphql
subscription {
  event(on: "topic-a")
}
```
The offset ID to consume from can also be specified. Negative values have a special meaning:

- `-1`: the most recent offset available for a partition (end)
- `-2`: the least recent offset available for a partition (beginning)

```graphql
subscription {
  event(on: "topic-a", at: -1)
}
```
Additionally, a filter expression can be specified: the events consumed are then only the ones matching the given predicate. You can refer to antonmedv/expr for an overview of the syntax used to write predicates.

```graphql
subscription {
  event(
    on: "topic-a",
    at: -1,
    matching: "value > 8"
  )
}
```
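Internally, such a predicate can be evaluated with antonmedv/expr against the decoded JSON object. The following is a minimal illustration of the idea (not necessarily the project's exact wiring):

```go
package main

import (
	"encoding/json"
	"fmt"

	"github.com/antonmedv/expr"
)

func main() {
	// The consumed message, decoded as a generic JSON object.
	var event map[string]interface{}
	_ = json.Unmarshal([]byte(`{ "message": "hello world !", "value": 14 }`), &event)

	// The `matching` argument of the subscription, compiled once.
	program, err := expr.Compile("value > 8")
	if err != nil {
		panic(err)
	}

	// Evaluate the predicate against the event; only matching events would be forwarded.
	matched, err := expr.Run(program, event)
	if err != nil {
		panic(err)
	}
	fmt.Println(matched) // true
}
```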
### 5. Push messages
Run the producer and then type a few messages into the console to send to Kafka. Note that messages must be JSON objects.

```sh
> bin/kafka-console-producer.sh --broker-list localhost:9092 --topic topic-a
{ "message": "hello world !", "value": 14 }
```

The message should be displayed in the browser.
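Besides GraphiQL, the subscription can also be consumed programmatically over the WebSocket transport using the Apollo subscriptions-transport-ws message format. The sketch below uses gorilla/websocket and assumes the WebSocket endpoint is served at `/graphql` (an assumption; adjust to the server's actual route):

```go
package main

import (
	"fmt"
	"log"

	"github.com/gorilla/websocket"
)

// gqlMessage mirrors the Apollo subscriptions-transport-ws envelope.
type gqlMessage struct {
	ID      string      `json:"id,omitempty"`
	Type    string      `json:"type"`
	Payload interface{} `json:"payload,omitempty"`
}

func main() {
	// The /graphql path is an assumption; the GraphiQL page is served at /graphiql.
	dialer := websocket.Dialer{Subprotocols: []string{"graphql-ws"}}
	conn, _, err := dialer.Dial("ws://localhost:8000/graphql", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Initialise the connection, then start the subscription.
	_ = conn.WriteJSON(gqlMessage{Type: "connection_init"})
	_ = conn.WriteJSON(gqlMessage{
		ID:   "1",
		Type: "start",
		Payload: map[string]string{
			"query": `subscription { event(on: "topic-a") }`,
		},
	})

	// Print every `data` message pushed by the server, ignore the rest (e.g. connection_ack).
	for {
		var msg gqlMessage
		if err := conn.ReadJSON(&msg); err != nil {
			log.Fatal(err)
		}
		if msg.Type == "data" {
			fmt.Printf("%v\n", msg.Payload)
		}
	}
}
```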
## How to play with it using Redis?
⚠️ The Redis implementation does not support offsets (i.e. the capability to resume at some point in time).
### 1. Start Redis
First, a Redis server (at least v5.0) must be started. See the official documentation for more.
### 2. Start the GraphQL server
Run the application, which exposes the two topics `topic-a` and `topic-b` to subscribers:

```sh
> ./go-graphql-subscription-example --source redis://6379?name=foo --topics topic-a,topic-b
```

Alternatively, if the Docker image has been previously built, the container can be started this way:

```sh
> docker run -ti --rm -p 8000:8000 ccamel/go-graphql-subscription-example --source redis://6379?name=foo --topics topic-a,topic-b
```
### 3. Subscribe
The application exposes a GraphQL endpoint through which clients can receive messages coming from a Redis stream.

Navigate to the http://localhost:8000/graphiql URL and submit a subscription to the topic `topic-a`:

```graphql
subscription {
  event(on: "topic-a")
}
```
Additionally, a filter expression can be specified: the events consumed are then only the ones matching the given predicate. You can refer to antonmedv/expr for an overview of the syntax used to write predicates.

```graphql
subscription {
  event(
    on: "topic-a",
    matching: "message contains \"hello\""
  )
}
```
### 4. Push messages
Start the `redis-cli` and then use the `XADD` command to send messages to the Redis stream.

```sh
> redis-cli
127.0.0.1:6379> XADD topic-a * message "hello world !" "value" "14"
```

The message should be displayed in the browser.
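The same message can also be pushed from Go. A minimal sketch, assuming the go-redis client (which is not part of this project) and a Redis server on `localhost:6379`:

```go
package main

import (
	"context"
	"log"

	"github.com/redis/go-redis/v9"
)

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// Equivalent of: XADD topic-a * message "hello world !" value "14"
	err := rdb.XAdd(ctx, &redis.XAddArgs{
		Stream: "topic-a",
		Values: map[string]interface{}{"message": "hello world !", "value": "14"},
	}).Err()
	if err != nil {
		log.Fatal(err)
	}
}
```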
## How to play with it using NSQ?
⚠️ The NSQ implementation does not support offsets (i.e. the capability to resume at some point in time).
### 1. Start NSQ
First, NSQ must be started. See the official documentation for more.

```sh
> nsqlookupd
> nsqd --lookupd-tcp-address=127.0.0.1:4160
> nsqadmin --lookupd-http-address=127.0.0.1:4161
```
### 2. Start the GraphQL server
Run the application, which exposes the two topics `topic-a` and `topic-b` to subscribers:

```sh
> ./go-graphql-subscription-example --source nsq: --topics topic-a,topic-b
```

Alternatively, if the Docker image has been previously built, the container can be started this way:

```sh
> docker run -ti --rm -p 8000:8000 ccamel/go-graphql-subscription-example --source nsq: --topics topic-a,topic-b
```
### 3. Subscribe
The application exposes a GraphQL endpoint through which clients can receive messages coming from an NSQ topic.

Navigate to the http://localhost:8000/graphiql URL and submit a subscription to the topic `topic-a`:

```graphql
subscription {
  event(on: "topic-a")
}
```
Additionally, a filter expression can be specified: the events consumed are then only the ones matching the given predicate. You can refer to antonmedv/expr for an overview of the syntax used to write predicates.

```graphql
subscription {
  event(
    on: "topic-a",
    matching: "message contains \"hello\""
  )
}
```
### 4. Push messages
Publish a message to the topic `topic-a` using the command line below:

```sh
> curl -d '{ "message": "hello world !", "value": 14 }' 'http://127.0.0.1:4151/pub?topic=topic-a'
```

The message should be displayed in the browser.
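Messages can also be published from Go. A minimal sketch, assuming the official go-nsq client (not part of this project) and nsqd listening on its default TCP port 4150:

```go
package main

import (
	"log"

	nsq "github.com/nsqio/go-nsq"
)

func main() {
	// Connect to nsqd directly (default TCP port assumed).
	producer, err := nsq.NewProducer("127.0.0.1:4150", nsq.NewConfig())
	if err != nil {
		log.Fatal(err)
	}
	defer producer.Stop()

	// Publish the same JSON message as the curl example above.
	body := []byte(`{ "message": "hello world !", "value": 14 }`)
	if err := producer.Publish("topic-a", body); err != nil {
		log.Fatal(err)
	}
}
```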
## Stack

### Technical

This application mainly uses:

- GraphQL
- Prometheus
- Kafka
- Redis
- NSQ
- Expression language
- Design Patterns
- CLI
- Log

### Project

- Build ↳ mage
- Linter
## License
[MIT] © [Chris Camel]