Protox

protox is an Elixir library to work with Google's Protocol Buffers, versions 2 and 3. It supports both binary and JSON encoding and decoding (well-known types included, except the Any type for the time being).

The primary objective of protox is reliability: it uses property-based testing and has near 100% code coverage. The quality of its test suite has also been validated with mutation testing, with the invaluable help of Muzak pro. Finally, protox passes all the tests of the conformance checker provided by Google.

It's also easy to use: just point to the *.proto files or give the schema to the Protox macro; there's no need to generate any file. However, should you need to generate files, a mix task is available.

Given the following protobuf definition, protox will generate a Msg struct:

message Msg {
  int32 a = 1;
  map<int32, string> b = 2;
}

You can then interact with Msg like any Elixir structure:

iex> msg = %Msg{a: 42, b: %{1 => "a map entry"}}
iex> {:ok, iodata} = Msg.encode(msg)
iex> {:ok, iodata} = Msg.json_encode(msg)

iex> binary = # read binary from a socket, a file, etc.
iex> {:ok, msg} = Msg.decode(binary)
iex> json = # read json from a socket, file, etc.
iex> {:ok, msg} = Msg.json_decode(json)

You can find a more involved example with most types here.

Prerequisites

Installation

Add :protox to your list of dependencies in mix.exs:

def deps do
  [{:protox, "~> 1.7"}]
end

If you plan to use the JSON encoding, you'll need to add Jason to your dependencies:

def deps do
  [
    {:protox, "~> 1.7"},
    {:jason, "~> 1.2"}
  ]
end

Usage with an inlined textual description

The following example generates two modules, Baz and Foo, from a textual description:

defmodule MyModule do
  use Protox, schema: """
  syntax = "proto3";

  message Baz {
  }

  message Foo {
    int32 a = 1;
    map<int32, Baz> b = 2;
  }
  """
end

ℹī¸ The module in which the Protox macro is called is completely ignored and therefore does not appear in the names of the generated modules.

Usage with files

Here's how to generate the modules from a set of files:

defmodule MyModule do
  use Protox, files: [
    "./defs/foo.proto",
    "./defs/bar.proto",
    "./defs/baz/fiz.proto",
  ]
end
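
The examples in the next sections reference Fiz.Foo and Fiz.Baz. To reproduce them, you can assume a schema equivalent to the inlined example above, but declared with a fiz package; for instance (a hypothetical stand-in for the content of ./defs/baz/fiz.proto):

defmodule MyModule do
  use Protox, schema: """
  syntax = "proto3";

  package fiz;

  message Baz {
  }

  message Foo {
    int32 a = 1;
    map<int32, Baz> b = 2;
  }
  """
end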

Protobuf binary format

Encode

Here's how to create and encode a new message to binary protobuf:

iex> msg = %Fiz.Foo{a: 3, b: %{1 => %Fiz.Baz{}}}
iex> {:ok, iodata} = Protox.encode(msg)

Or, with throwing style:

iex> iodata = Protox.encode!(msg)

It's also possible to call encode/1 and encode!/1 directly on the generated structures:

iex> {:ok, iodata} = Fiz.Foo.encode(msg)
iex> iodata = Fiz.Foo.encode!(msg)

ℹī¸ Note that encode/1 returns an IO data for efficiency reasons. Such IO data can be used directly with files or sockets write operations:

iex> {:ok, iodata} = Protox.encode(%Fiz.Foo{a: 3, b: %{1 => %Fiz.Baz{}}})
[[[], <<18>>, <<4>>, "\b", <<1>>, <<18>>, <<0>>], "\b", <<3>>]
iex> {:ok, file} = File.open("msg.bin", [:write])
{:ok, #PID<0.1023.0>}
iex> IO.binwrite(file, iodata)
:ok

👉 You can use :binary.list_to_bin/1 or IO.iodata_to_binary/1 to get a binary:

iex> %Fiz.Foo{a: 3, b: %{1 => %Fiz.Baz{}}} |> Protox.encode!() |> :binary.list_to_bin()
<<8, 3, 18, 4, 8, 1, 18, 0>>

Decode

Here's how to decode a message from binary protobuf:

iex> {:ok, msg} = Protox.decode(<<8, 3, 18, 4, 8, 1, 18, 0>>, Fiz.Foo)

Or, with throwing style:

iex> msg = Protox.decode!(<<8, 3, 18, 4, 8, 1, 18, 0>>, Fiz.Foo)

It's also possible to call decode/1 and decode!/1 directly on the generated structures:

iex> {:ok, msg} = Fiz.Foo.decode(<<8, 3, 18, 4, 8, 1, 18, 0>>)
iex> msg = Fiz.Foo.decode!(<<8, 3, 18, 4, 8, 1, 18, 0>>)
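
The non-throwing variants return an {:error, reason} tuple on malformed input, which can be pattern matched instead of rescuing an exception. A minimal sketch, using an intentionally truncated payload:

iex> {:error, reason} = Protox.decode(<<255, 255>>, Fiz.Foo)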

Protobuf JSON format

protox implements Google's JSON specification for Protocol Buffers.

Encode

Here's how to encode a message to JSON, exported as IO data:

iex> msg = %Fiz.Foo{a: 42}
iex> {:ok, iodata} = Protox.json_encode(msg)
{:ok, ["{", ["\"a\"", ":", "42"], "}"]}

Or, with throwing style:

iex> msg = %Fiz.Foo{a: 42}
iex> iodata = Protox.json_encode!(msg)
["{", ["\"a\"", ":", "42"], "}"]

It's also possible to call json_encode and json_encode! directly on the generated structures:

iex> {:ok, iodata} = Fiz.Foo.json_encode(msg)
iex> iodata = Fiz.Foo.json_encode!(msg)

Decode

Here's how to decode JSON to a message:

iex> Protox.json_decode("{\"a\":42}", Fiz.Foo)
{:ok, %Fiz.Foo{__uf__: [], a: 42, b: %{}}}

Or, with throwing style:

iex> Protox.json_decode!("{\"a\":42}", Fiz.Foo)
%Fiz.Foo{__uf__: [], a: 42, b: %{}}

It's also possible to call json_decode and json_decode! directly on the generated structures:

iex> Fiz.Foo.json_decode("{\"a\":42}")
iex> Fiz.Foo.json_decode!("{\"a\":42}")

JSON library configuration

By default, protox uses Jason to encode values to JSON (mostly to escape strings). You can also use Poison:

iex> Protox.json_decode!(iodata, Fiz.Foo, json_library: Protox.Poison)
iex> Protox.json_encode!(msg, json_library: Protox.Poison)

ℹī¸ You can use any other library by implementing the Protox.JsonLibrary behaviour.

👉 Don't forget to add the chosen library to the list of dependencies in mix.exs.

Well-known types

Note that protox does not completely support the Any well-known type: it will be encoded and decoded like a regular message, rather than with the custom encoding specified in the JSON specification.

Packages and namespaces

Packages

Protobuf provides a package directive:

package abc.def;
message Baz {}

Modules generated by protox will include this package declaration. Thus, the example above will be translated to Abc.Def.Baz (note the camelization of package abc.def to Abc.Def).
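
For instance, with the declaration above:

iex> msg = %Abc.Def.Baz{}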

Prepend namespaces

In addition, protox makes it possible to prepend a namespace to all generated modules with the namespace option:

defmodule Bar do
  use Protox,
    schema: """
    syntax = "proto3";

    package abc;

    message Msg {
      int32 a = 1;
    }
    """,
    namespace: MyApp
end

In this example, the module MyApp.Abc.Msg is generated:

iex> msg = %MyApp.Abc.Msg{a: 42}

Specify import path

An import path can be specified using the path: or paths: option, which respectively take the directory or the list of directories in which to search for imports:

defmodule Baz do
  use Protox,
    files: [
      "./defs/prefix/foo.proto",
      "./defs/prefix/bar/bar.proto",
    ],
    path: "./defs"
end

If multiple search paths are needed:

defmodule Baz do
  use Protox,
    files: [
      "./defs1/prefix/foo.proto",
      "./defs1/prefix/bar.proto",
      "./defs2/prefix/baz/baz.proto"
    ],
    paths: [
      "./defs1",
      "./defs2"
    ]
end

It corresponds to the -I option of protoc.

Unknown fields

Unknown fields are fields that are present on the wire but do not correspond to an entry in the protobuf definition. Typically, this occurs when the sender has a newer version of the protobuf definition than the receiver. It enables backward compatibility: a receiver with an older version of the protobuf definition can still decode the fields it knows about.

When unknown fields are encountered at decoding time, they are kept in the decoded message. It's possible to access them with the unknown_fields/1 function defined alongside the message:

iex> msg = Msg.decode!(<<8, 42, 42, 4, 121, 97, 121, 101, 136, 241, 4, 83>>)
%Msg{a: 42, b: "", z: -42, __uf__: [{5, 2, <<121, 97, 121, 101>>}]}

iex> Msg.unknown_fields(msg)
[{5, 2, <<121, 97, 121, 101>>}]

You must always use unknown_fields/1, as the name of the field that stores them (e.g. __uf__ in the above example) is generated at compile-time to avoid collisions with the actual fields of the Protobuf message. This function returns a list of {tag, wire_type, bytes} tuples. For more information, please see the Protobuf encoding guide.

When you encode a message that contains unknown fields, they are re-encoded in the serialized output.
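
For instance, re-encoding the message decoded above and decoding the result again yields the same unknown field. A sketch of this round-trip:

iex> reencoded = msg |> Msg.encode!() |> :binary.list_to_bin()
iex> reencoded |> Msg.decode!() |> Msg.unknown_fields()
[{5, 2, <<121, 97, 121, 101>>}]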

Disable support of unknown fields

You can deactivate the support of unknown fields by setting the :keep_unknown_fields option to false:

defmodule Baz do
  use Protox,
    schema: """
    syntax = "proto3";

    message Sub {
      int32 a = 1;
      string b = 2;
    }
    """,
    keep_unknown_fields: false
end

ℹī¸ protox will still correctly parse unknown fields, they just won't be added to the structure and you won't be able to access them. This also means that unkown fields won't be serialized back.

Unsupported features

Implementation choices

Generated code reference

The detailed reference of the generated code is available here.

Files generation

It's possible to generate a file that will contain all code corresponding to the protobuf messages:

MIX_ENV=prod mix protox.generate --output-path=/path/to/message.ex --include-path=./test/samples test/samples/messages.proto test/samples/proto2.proto

The generated file can be used in any project, as long as protox is declared in its dependencies: the generated code needs functions from the protox runtime.

Options

Conformance

The protox library has been thoroughly tested using the conformance checker provided by Google.

Here's how to launch the conformance tests:

Skipped conformance tests

You may notice that there are 21 expected failures: these correspond to conformance tests that protox can't currently pass, which we skip on purpose.

The exact list of skipped tests is available here.

Types mapping

The following table shows how Protobuf types are mapped to Elixir types.

Protobuf      Elixir
int32         integer()
int64         integer()
uint32        integer()
uint64        integer()
sint32        integer()
sint64        integer()
fixed32       integer()
fixed64       integer()
sfixed32      integer()
sfixed64      integer()
float         float() | :infinity | :'-infinity' | :nan
double        float() | :infinity | :'-infinity' | :nan
bool          boolean()
string        String.t()
bytes         binary()
repeated      list(value_type), where value_type is the type of the repeated field
map           map()
oneof         {atom(), value_type}, where atom() is the name of the set field and value_type is its type
enum          atom() | integer()
message       struct()
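
To illustrate the oneof mapping, here is a hypothetical schema (the message and field names are made up for this example):

defmodule OneofExample do
  use Protox, schema: """
  syntax = "proto3";

  message Sample {
    oneof choice {
      int32 num = 1;
      string text = 2;
    }
  }
  """
end

The oneof group then appears as a single choice field holding a {field_name, value} tuple:

iex> %Sample{choice: {:num, 42}}
iex> %Sample{choice: {:text, "hello"}}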

Benchmarks

You can launch benchmarks to see how protox performs:

mix run ./benchmarks/generate_payloads.exs # first time only, generates random payloads
mix run ./benchmarks/run.exs --lib=./benchmarks/protox.exs
mix run ./benchmarks/load.exs

Development

protox uses pre-commit to manage git hooks. Thus, it's strongly recommended to install it, then install the hooks as follows:

pre-commit install && pre-commit install -t pre-push

Credits

Both gpb and exprotobuf were very useful in understanding how to implement Protocol Buffers.