
gRPC with Python

Sanchit Ahuja · Published in Towards Dev · Feb 9, 2022

Lately, there’s been a lot of buzz around gRPC and Protocol Buffers. This article aims to demystify these terms and walk through small pieces of Python code demonstrating how to use them in your projects.

What are Protocol Buffers?

Protocol Buffers are Google’s language-neutral and platform-independent mechanism for serializing structured data. You specify the structure of your objects once and later generate language-specific code to use them in your applications.
It’s an alternative to JSON/XML payloads, but smaller and faster to parse.
We define protobuf messages in a file with a .proto extension.
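
For illustration, here is a small sketch of what such a .proto file can look like (a hypothetical Person message, written in proto2 syntax since the field modifiers discussed below come from proto2):

// person.proto: a hypothetical example, not the file used later in this article
syntax = "proto2";

message Person {
  required string name = 1;   // tag 1
  optional int32 id = 2;      // tag 2
  repeated string email = 3;  // tag 3
}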

The “ = 1”, “ = 2” markers on each element identify the unique “tag” that field uses in the binary encoding. Tag numbers 1–15 require one less byte to encode than higher numbers, so as an optimization you can decide to use those tags for the commonly used or repeated elements, leaving tags 16 and higher for less-commonly used optional elements.

Each field must be annotated with one of the following modifiers:

  • repeated: the field may be repeated any number of times (including zero).
  • optional: the field may or may not be set. If an optional field value isn't set, a default value is used.
  • required: a value for the field must be provided, otherwise the message will be considered "uninitialized". Serializing an uninitialized message will raise an exception. Parsing an uninitialized message will fail. Other than this, a required field behaves exactly like an optional field.
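
To make serialization concrete, here is a minimal Python sketch, assuming the hypothetical person.proto above has been compiled with protoc --python_out=. person.proto, which generates person_pb2.py:

import person_pb2  # generated by protoc from the hypothetical person.proto

person = person_pb2.Person()
person.name = "Alice"                      # required field
person.id = 42                             # optional field
person.email.append("alice@example.com")   # repeated field

data = person.SerializeToString()          # compact binary encoding

decoded = person_pb2.Person()
decoded.ParseFromString(data)
print(decoded.name, decoded.id, list(decoded.email))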

What is gRPC?

RPC can be considered an alternative to REST, where a client calls a method on a server application directly. The idea behind RPC is to define a service that can be called across machines, with specified parameters and specified return types. gRPC uses Protocol Buffers as the mechanism for serializing and deserializing structured data.

Figure: Flow of data in gRPC (how a gRPC request and response flows)

gRPC lets you define four kinds of service methods. A short description of each follows (see the sketch after this list):

  1. Unary RPC: the client sends a single request and receives a single response from the server in return.
  2. Server streaming RPC: the client sends a single request and the server returns a stream of messages. The client reads until there are no more messages. gRPC ensures the ordering of the messages.
  3. Client streaming RPC: the client sends a stream of requests to the server. The server waits until there are no more requests and returns a single response. gRPC ensures the ordering of the request messages.
  4. Bidirectional streaming RPC: both sides send a stream of messages. The two streams are independent of each other, so the client and server can read and write messages as per their requirements. Again, the order of messages within each stream is preserved.
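
In a .proto service definition, the four kinds differ only in where the stream keyword appears. A small illustrative sketch (the service, method, and message names are made up for this example):

service ExampleService {
  rpc Unary(Request) returns (Response);                     // 1. unary
  rpc ServerStream(Request) returns (stream Response);       // 2. server streaming
  rpc ClientStream(stream Request) returns (Response);       // 3. client streaming
  rpc BidiStream(stream Request) returns (stream Response);  // 4. bidirectional streaming
}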

Why gRPC?

  • Performance
  • Code generation
  • Strict Specification
  • Streaming
  • Deadline/timeouts & cancellation

gRPC is well suited to the following scenarios:

  • Microservices: gRPC is designed for low latency and high throughput communication. gRPC is great for lightweight microservices where efficiency is critical.
  • Point-to-point real-time communication: gRPC has excellent support for bi-directional streaming. gRPC services can push messages in real-time without polling.
  • Polyglot environments: gRPC tooling supports all popular development languages, making gRPC a good choice for multi-language environments.
  • Network constrained environments: gRPC messages are serialized with Protobuf, a lightweight message format. A gRPC message is always smaller than an equivalent JSON message.
  • Inter-process communication (IPC): IPC transports such as Unix domain sockets and named pipes can be used with gRPC to communicate between apps on the same machine.

Time to Code

Installing Protobuf

  1. Download a recent protobuf release from GitHub (substitute the release you want for <version>)

wget https://github.com/protocolbuffers/protobuf/releases/download/v<version>/protobuf-all-<version>.tar.gz

2. Extract the archive

tar -xzf protobuf-all-<version>.tar.gz

3. Install protobuf

cd protobuf-<version>/ && ./configure && make && sudo make install

4. Install grpcio & grpcio-tools

pip install grpcio grpcio-tools

Creating Proto File

5. Declare a user_search.proto file in the protos directory.
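
As a minimal sketch, user_search.proto could look like the following, with one method for each of the four RPC kinds (the message, service, and method names are assumptions, reused in the server and client sketches below):

// protos/user_search.proto: a minimal sketch; names are assumptions
syntax = "proto3";

package user_search;

message UserRequest {
  string username = 1;
}

message UserResponse {
  string username = 1;
  string email = 2;
  bool found = 3;
}

service UserSearch {
  rpc GetUser(UserRequest) returns (UserResponse);             // unary
  rpc ListUsers(UserRequest) returns (stream UserResponse);    // server streaming
  rpc RecordUsers(stream UserRequest) returns (UserResponse);  // client streaming
  rpc Chat(stream UserRequest) returns (stream UserResponse);  // bidirectional streaming
}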

6. Compile the .proto file

python -m grpc_tools.protoc --proto_path=. protos/user_search.proto --python_out=. --grpc_python_out=.

After compilation you will find two generated files, user_search_pb2.py and user_search_pb2_grpc.py.

7. Create a server.py
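
A minimal server.py sketch, assuming the user_search.proto sketch above and that the generated user_search_pb2*.py modules are importable from the working directory (the method bodies are toy implementations):

from concurrent import futures

import grpc

import user_search_pb2
import user_search_pb2_grpc


class UserSearchServicer(user_search_pb2_grpc.UserSearchServicer):
    """Implements the four RPC kinds declared in user_search.proto."""

    def GetUser(self, request, context):
        # Unary: one request in, one response out.
        return user_search_pb2.UserResponse(
            username=request.username,
            email=f"{request.username}@example.com",
            found=True,
        )

    def ListUsers(self, request, context):
        # Server streaming: yield a stream of responses.
        for i in range(3):
            yield user_search_pb2.UserResponse(username=f"{request.username}_{i}", found=True)

    def RecordUsers(self, request_iterator, context):
        # Client streaming: consume the whole request stream, reply once.
        count = sum(1 for _ in request_iterator)
        return user_search_pb2.UserResponse(username=f"received {count} users", found=True)

    def Chat(self, request_iterator, context):
        # Bidirectional streaming: echo a response for every request.
        for request in request_iterator:
            yield user_search_pb2.UserResponse(username=request.username, found=True)


def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    user_search_pb2_grpc.add_UserSearchServicer_to_server(UserSearchServicer(), server)
    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()


if __name__ == "__main__":
    serve()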

8. Create a client.py

In the client file, we demonstrate all four kinds of RPC.
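
A minimal client.py sketch exercising the four kinds against the server sketch above; the timeout on the unary call illustrates gRPC’s deadline support:

import grpc

import user_search_pb2
import user_search_pb2_grpc


def run():
    with grpc.insecure_channel("localhost:50051") as channel:
        stub = user_search_pb2_grpc.UserSearchStub(channel)

        # 1. Unary RPC, with a 5-second deadline.
        response = stub.GetUser(user_search_pb2.UserRequest(username="alice"), timeout=5)
        print("GetUser:", response.email)

        # 2. Server streaming RPC: iterate over the response stream.
        for reply in stub.ListUsers(user_search_pb2.UserRequest(username="alice")):
            print("ListUsers:", reply.username)

        # 3. Client streaming RPC: pass an iterator of requests, get one response.
        requests = (user_search_pb2.UserRequest(username=name) for name in ("a", "b", "c"))
        summary = stub.RecordUsers(requests)
        print("RecordUsers:", summary.username)

        # 4. Bidirectional streaming RPC: send and receive independent streams.
        requests = (user_search_pb2.UserRequest(username=name) for name in ("x", "y"))
        for reply in stub.Chat(requests):
            print("Chat:", reply.username)


if __name__ == "__main__":
    run()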

Other helper functions used in the examples can be found in utils.py.

The above code can be found here.
