gRPC Elixir is a full-featured Elixir implementation of the gRPC protocol, supporting unary and streaming RPCs, interceptors, HTTP transcoding, and TLS. This version adopts a unified stream-based model for all types of calls.
- Installation
- Protobuf Code Generation
- Server Implementation
- Application Startup
- Client Usage
- HTTP Transcoding
- CORS
- Features
- Benchmark
- Contributing
The package can be installed by adding it to your list of dependencies in mix.exs:
def deps do
[
{:grpc, "~> 0.10"},
{:protobuf, "~> 0.14"}, # optional, for importing well-known Google types
{:grpc_reflection, "~> 0.1"} # optional, enables gRPC reflection
]
end
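Then fetch the dependencies:
mix deps.get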
Use protoc with the protobuf Elixir plugin, or the protobuf_generate Hex package, to generate the necessary files.
- Write your protobuf file:
syntax = "proto3";
package helloworld;
// The request message containing the user's name.
message HelloRequest {
string name = 1;
}
// The response message containing the greeting
message HelloReply {
string message = 1;
}
// The greeting service definition.
service GreetingServer {
rpc SayUnaryHello (HelloRequest) returns (HelloReply) {}
rpc SayServerHello (HelloRequest) returns (stream HelloReply) {}
rpc SayBidStreamHello (stream HelloRequest) returns (stream HelloReply) {}
}
- Compile protos (protoc + Elixir plugin):
protoc --elixir_out=plugins=grpc:./lib -I./priv/protos helloworld.proto
All RPC calls must be implemented using the stream-based API, even for unary requests.
defmodule HelloworldStreams.Server do
use GRPC.Server, service: Helloworld.GreetingServer.Service
alias Helloworld.HelloRequest
alias Helloworld.HelloReply
@spec say_unary_hello(HelloRequest.t(), GRPC.Server.Stream.t()) :: any()
def say_unary_hello(request, _materializer) do
GRPC.Stream.unary(request)
|> GRPC.Stream.map(fn %HelloRequest{name: name} ->
%HelloReply{message: "[Reply] Hello #{name}"}
end)
|> GRPC.Stream.run()
end
@spec say_server_hello(HelloRequest.t(), GRPC.Server.Stream.t()) :: any()
def say_server_hello(request, materializer) do
Stream.repeatedly(fn ->
index = :rand.uniform(10)
%HelloReply{message: "[#{index}] Hello #{request.name}"}
end)
|> Stream.take(10)
|> GRPC.Stream.from()
|> GRPC.Stream.run_with(materializer)
end
@spec say_bid_stream_hello(Enumerable.t(), GRPC.Server.Stream.t()) :: any()
def say_bid_stream_hello(request, materializer) do
output_stream =
Stream.repeatedly(fn ->
index = :rand.uniform(10)
%HelloReply{message: "[#{index}] Server response"}
end)
GRPC.Stream.from(request, join_with: output_stream)
|> GRPC.Stream.map(fn
%HelloRequest{name: name} -> %HelloReply{message: "Welcome #{name}"}
other -> other
end)
|> GRPC.Stream.run_with(materializer)
end
end
💡 The Stream API supports composable stream transformations via ask, map, run, and other functions, enabling clean and declarative stream pipelines. For a complete list of available operators, see the GRPC.Stream documentation.
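For instance, operators can be chained in a single pipeline. The sketch below is illustrative only: the say_even_hello handler is hypothetical, and it assumes GRPC.Stream.filter/2 behaves like Enum.filter/2 (check the operators documentation for the exact list and signatures):
def say_even_hello(request, materializer) do
Stream.repeatedly(fn -> :rand.uniform(10) end)
|> Stream.take(10)
|> GRPC.Stream.from()
# keep only even numbers, then turn each one into a reply
|> GRPC.Stream.filter(fn index -> rem(index, 2) == 0 end)
|> GRPC.Stream.map(fn index -> %HelloReply{message: "[#{index}] Hello #{request.name}"} end)
|> GRPC.Stream.run_with(materializer)
end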
Add the server supervisor to your application's supervision tree:
defmodule Helloworld.Application do
@moduledoc false
use Application
@impl true
def start(_type, _args) do
children = [
GrpcReflection,
{
GRPC.Server.Supervisor, [
endpoint: Helloworld.Endpoint,
port: 50051,
start_server: true
# adapter_opts: [] # adapter-specific options, such as TLS configuration
]
}
]
opts = [strategy: :one_for_one, name: Helloworld.Supervisor]
Supervisor.start_link(children, opts)
end
end
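You can then start the application (and with it the gRPC server) in development with:
mix run --no-halt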
iex> {:ok, channel} = GRPC.Stub.connect("localhost:50051")
iex> request = %Helloworld.HelloRequest{name: "grpc-elixir"}
iex> {:ok, reply} = channel |> Helloworld.GreetingServer.Stub.say_unary_hello(request)
# With interceptors
iex> {:ok, channel} = GRPC.Stub.connect("localhost:50051", interceptors: [GRPC.Client.Interceptors.Logger])
...
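For a server-streaming RPC such as say_server_hello above, the stub returns an Elixir stream of {:ok, reply} tuples that can be consumed with Enum (a minimal sketch, reusing the channel and request from the snippet above):
iex> {:ok, stream} = channel |> Helloworld.GreetingServer.Stub.say_server_hello(request)
iex> Enum.each(stream, fn {:ok, reply} -> IO.puts(reply.message) end)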
Check the examples and interop directories in the project's source code for more examples.
The default adapter used by GRPC.Stub.connect/2 is GRPC.Client.Adapters.Gun. Another option is to use GRPC.Client.Adapters.Mint instead, like so:
GRPC.Stub.connect("localhost:50051",
# Use Mint adapter instead of default Gun
adapter: GRPC.Client.Adapters.Mint
)
The GRPC.Client.Adapters.Mint adapter accepts custom configuration, which you can set in your application's config:
# File: your application's config file.
config :grpc, GRPC.Client.Adapters.Mint, custom_opts
The accepted configuration options are the ones listed for Mint.HTTP.connect/4.
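For example, a sketch that raises the connect timeout of the underlying transport (transport_opts is one of the options accepted by Mint.HTTP.connect/4):
# File: your application's config file.
config :grpc, GRPC.Client.Adapters.Mint, transport_opts: [timeout: 15_000]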
- Add grpc-gateway annotations to your protobuf file definition:
import "google/api/annotations.proto";
import "google/protobuf/timestamp.proto";
package helloworld;
// The greeting service definition.
service Greeter {
// Sends a greeting
rpc SayHello (HelloRequest) returns (HelloReply) {
option (google.api.http) = {
get: "/v1/greeter/{name}"
};
}
rpc SayHelloFrom (HelloRequestFrom) returns (HelloReply) {
option (google.api.http) = {
post: "/v1/greeter"
body: "*"
};
}
}
- Add the protoc plugin dependency and compile your protos using the protobuf_generate Hex package:
In mix.exs:
def deps do
[
{:grpc, "~> 0.10"},
{:protobuf_generate, "~> 0.1.1"}
]
end
And in your terminal:
mix protobuf.generate \
--include-path=priv/proto \
--include-path=deps/googleapis \
--generate-descriptors=true \
--output-path=./lib \
--plugins=ProtobufGenerate.Plugins.GRPCWithOptions \
google/api/annotations.proto google/api/http.proto helloworld.proto
- Enable http_transcode option in your Server module
defmodule Helloworld.Greeter.Server do
use GRPC.Server,
service: Helloworld.Greeter.Service,
http_transcode: true
# callback implementations...
end
See the full application code in the helloworld_transcoding example.
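With transcoding enabled, the annotated routes are also served as HTTP/JSON on the gRPC port, so (assuming the server from the supervisor example above is listening on port 50051) the GET route defined earlier can be exercised with curl:
curl http://localhost:50051/v1/greeter/grpc-elixir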
When accessing gRPC from a browser via HTTP transcoding or gRPC-Web, CORS headers may be required for the browser to allow access to the gRPC endpoint. You can add CORS headers by using GRPC.Server.Interceptors.CORS as an interceptor in your GRPC.Endpoint module, configuring it as described in the module documentation:
Example:
# Define your endpoint
defmodule Helloworld.Endpoint do
use GRPC.Endpoint
intercept GRPC.Server.Interceptors.Logger
intercept GRPC.Server.Interceptors.CORS, allow_origin: "mydomain.io"
run Helloworld.Greeter.Server
end
- Various kinds of RPC: unary, server streaming, client streaming, and bidirectional streaming
- HTTP Transcoding
- TLS Authentication
- Error handling
- Interceptors
- Connection Backoff
- Data compression
- gRPC Reflection
- Simple benchmark using ghz
- Benchmark following the official gRPC benchmarking spec
Your contributions are welcome!
Please open an issue if you have questions, problems, or ideas. Feel free to create pull requests directly for small bug fixes, minor features, and the like, but please open an issue first if you want to add a big feature or change a lot of code.