ReactRelayNetworkLayer (for Relay Classic)

For Relay Modern, please use the react-relay-network-modern package.

The ReactRelayNetworkLayer is a Relay Network Layer with various middlewares which can manipulate requests/responses on the fly (change auth headers, change the request URL, or perform a fallback if a request fails) and batch several Relay requests within a timeout window into one HTTP request.

ReactRelayNetworkLayer can be used in the browser, in react-native, or on a node server for rendering. Under the hood this module uses the global fetch method, so if your client is too old, explicitly import a proper polyfill into your code (e.g. whatwg-fetch, node-fetch or fetch-everywhere).
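For example, at your app's entry point (a minimal sketch; pick the polyfill that matches your environment):

// make sure a global `fetch` exists before the network layer is used
import 'whatwg-fetch';                      // browsers
// or, for browsers, node and react-native at once:
// import 'fetch-everywhere';
// or, on a node server:
// global.fetch = require('node-fetch');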

yarn add react-relay-network-layer
OR
npm install react-relay-network-layer --save

Migrating from v1 to v2

Changes in v2.0.0:

- batching was extracted from urlMiddleware into its own batchMiddleware (with a batchUrl option)
- the disableBatchQuery option of RelayNetworkLayer was removed

All other parts stay unaffected. So if you use request batching, you should change your config as follows:

import Relay from 'react-relay';
import {
  RelayNetworkLayer,
  urlMiddleware,
+  batchMiddleware,
} from 'react-relay-network-layer';

Relay.injectNetworkLayer(new RelayNetworkLayer([
+  batchMiddleware({
+    batchUrl: (req) => '/graphql/batch',
+  }),
  urlMiddleware({
    url: (req) => '/graphql',
-    batchUrl: (req) => '/graphql/batch',
  }),
- ], { disableBatchQuery: false }));
+ ]));

Big thanks to @brad-decker and @jeanregisser for helping to get this release done.

Previous documentation for version 1.x.x can be found here.

Middlewares

Available middlewares: urlMiddleware, batchMiddleware, retryMiddleware, authMiddleware, loggerMiddleware, gqErrorsMiddleware and perfMiddleware (all of them are shown in the client-side example below).

Advanced options (2nd argument after middlewares)

RelayNetworkLayer may accept additional options:

const middlewares = []; // array of middlewares
const options = {}; // optional advanced options
const network = new RelayNetworkLayer(middlewares, options);

Available options:

Example of injecting NetworkLayer with middlewares on the client side.

import Relay from 'react-relay';
import {
  RelayNetworkLayer,
  urlMiddleware,
  batchMiddleware,
  loggerMiddleware,
  gqErrorsMiddleware,
  perfMiddleware,
  retryMiddleware,
  authMiddleware,
} from 'react-relay-network-layer';

Relay.injectNetworkLayer(new RelayNetworkLayer([
  urlMiddleware({
    url: (req) => '/graphql',
  }),
  batchMiddleware({
    batchUrl: (requestMap) => '/graphql/batch',
    batchTimeout: 10,
  }),
  loggerMiddleware(),
  gqErrorsMiddleware(),
  perfMiddleware(),
  retryMiddleware({
    fetchTimeout: 15000,
    retryDelays: (attempt) => Math.pow(2, attempt + 4) * 100, // or a simple array [3200, 6400, 12800, 25600, 51200, 102400, 204800, 409600],
    forceRetry: (cb, delay) => { window.forceRelayRetry = cb; console.log('call `forceRelayRetry()` for an immediate retry! Or wait ' + delay + ' ms.'); },
    statusCodes: [500, 503, 504]
  }),
  authMiddleware({
    token: () => store.get('jwt'),
    tokenRefreshPromise: (req) => {
      console.log('[client.js] resolve token refresh', req);
      return fetch('/jwt/refresh')
        .then(res => res.json())
        .then(json => {
          const token = json.token;
          store.set('jwt', token);
          return token;
        })
        .catch(err => console.log('[client.js] ERROR can not refresh token', err));
    },
  }),

  // example of a custom inline middleware
  next => req => {
    // `req` is an object with settings for the `fetch` function (it is not an express request object).
    // Internally the network layer runs roughly the following code:
    //    let { url, ...opts } = req;
    //    fetch(url, opts)
    // So `req` holds the fetch options plus a `url` prop, which is extracted as shown above.
    // This gives you full control over `fetch` via the `req` object.

    req.method = 'GET'; // change the default POST request method to GET
    req.headers['X-Request-ID'] = uuid.v4(); // add `X-Request-ID` to the request headers (assumes a `uuid` lib is in scope)
    req.credentials = 'same-origin'; // set the fetch credentials mode (send cookies for same-origin requests)
    return next(req);
  }
]));

How middlewares work internally

Middlewares on the bottom layer use the fetch method. So req is compliant with the fetch() options, and res can be obtained via resPromise.then(res => ...), where resPromise is what fetch() returns.

Middlewares have 3 phases: a setup phase (runs once, when the middleware is created), a capturing phase (runs before the request is passed to the next middleware) and a bubbling phase (runs on the response promise on the way back).

Basic skeleton of middleware:

export default function skeletonMiddleware(opts = {}) {
  // [SETUP PHASE]: here you can process `opts` when you create the middleware

  return next => req => {
    // [CAPTURING PHASE]: here you can change the `req` object before it is passed to the following middlewares
    // ...some code which modifies `req`

    const resPromise = next(req); // pass the request to the following middleware and get a response promise from it

    // [BUBBLING PHASE]: here you may change the response of the underlying middlewares via promise syntax
    // ...some code which may add `then()` or `catch()` to the response promise:
    //    resPromise.then(res => { console.log(res); return res; })

    return resPromise; // return the response promise to the upper middleware
  };
}

Middlewares use a LIFO (last in, first out) stack; simply put, they are combined with a compose function. So if you pass middlewares [M1(opts), M2(opts)] to the NetworkLayer, the capturing phase runs M1 then M2, fetch is called, and the bubbling phase comes back through M2 then M1, as the sketch below shows:
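A minimal, illustrative sketch of that ordering (logging only, no real middleware logic):

const M1 = () => next => req => {
  console.log('M1 capture');                // runs 1st
  return next(req).then(res => {
    console.log('M1 bubble');               // runs 4th (last)
    return res;
  });
};

const M2 = () => next => req => {
  console.log('M2 capture');                // runs 2nd
  return next(req).then(res => {
    console.log('M2 bubble');               // runs 3rd
    return res;
  });
};

// new RelayNetworkLayer([M1(), M2()]) logs:
//   M1 capture -> M2 capture -> (fetch) -> M2 bubble -> M1 bubble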

Batching several requests into one

Joseph Savona wrote: For legacy reasons, Relay splits "plural" root queries into individual queries. In general we want to diff each root value separately, since different fields may be missing for different root values.

Also, if you use react-relay-router and have multiple root queries in one route pass, you may notice that the default network layer produces several HTTP requests.

So to avoid multiple HTTP requests, the ReactRelayNetworkLayer is the right way to combine them into a single HTTP request.

Example of how to enable batching

...on server

First, you should prepare the server to process batch requests:

import express from 'express';
import graphqlHTTP from 'express-graphql';
import { graphqlBatchHTTPWrapper } from 'react-relay-network-layer';
import bodyParser from 'body-parser';
import myGraphqlSchema from './graphqlSchema';

const port = 3000;
const server = express();

// setup the standard `graphqlHTTP` express-middleware
const graphqlServer = graphqlHTTP({
  schema: myGraphqlSchema,
  formatError: (error) => ({ // better errors for development. `stack` used in `gqErrors` middleware
    message: error.message,
    stack: process.env.NODE_ENV === 'development' ? error.stack.split('\n') : null,
  }),
});

// declare route for batch query
server.use('/graphql/batch',
  bodyParser.json(),
  graphqlBatchHTTPWrapper(graphqlServer)
);

// declare standard graphql route
server.use('/graphql',
  graphqlServer
);

server.listen(port, () => {
  console.log(`The server is running at http://localhost:${port}/`);
});

There is a more complex example of how you can use a single DataLoader for all (batched) queries within one HTTP request.
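A minimal sketch of that idea, reusing the server setup above (userLoader and fetchUsersByIds are hypothetical names): create the DataLoader instances once per HTTP request in a small express middleware and hand them to every batched query through graphqlHTTP's context option.

import DataLoader from 'dataloader';

// the graphql server reads the per-request loaders from `req`
const graphqlServerWithLoaders = graphqlHTTP(req => ({
  schema: myGraphqlSchema,
  context: { dataLoaders: req.dataLoaders },
}));

server.use('/graphql/batch',
  bodyParser.json(),
  (req, res, next) => {
    // created once per HTTP request, shared by all batched queries inside it
    req.dataLoaders = {
      userLoader: new DataLoader(ids => fetchUsersByIds(ids)), // hypothetical batch fetcher
    };
    next();
  },
  graphqlBatchHTTPWrapper(graphqlServerWithLoaders)
);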

If you are on Koa@2, koa-graphql-batch provides the same functionality as graphqlBatchHTTPWrapper (see its docs for usage example).

...on client

Once the server side is ready to accept batch queries, you may enable batching on the client:

Relay.injectNetworkLayer(new RelayNetworkLayer([
  batchMiddleware({
    batchUrl: '/graphql/batch', // <--- route for batch queries
  }),
]));

How batching works internally

Internally, batching in the NetworkLayer prepares a list of queries [ { id, query, variables }, ... ] and sends it to the server in a single request. The server returns a list of responses [ { id, payload }, ... ], where id is the same value the client sent (identifying which data goes with which query) and payload is the standard GraphQL server response: { data, errors }.
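Illustratively, the wire format looks roughly like this (shapes only, not real data):

// POST to the batchUrl, request body:
[
  { id: '0', query: 'query Q0 { viewer { id } }', variables: {} },
  { id: '1', query: 'query Q1 { viewer { name } }', variables: {} }
]

// response body:
[
  { id: '0', payload: { data: { viewer: { id: '...' } } } },
  { id: '1', payload: { data: { viewer: { name: '...' } } } }
]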

Recommended modules

Contribute

I actively welcome pull requests with code and doc fixes. Also, if you have made a great middleware and want to share it within this module, please feel free to open a PR.

CHANGELOG

License

MIT