Switching My gRPC Project to Use Buf

Updated: Thursday, December 16, 2021

Do you work with Protobufs and gRPC in your projects or at work? Have you ever been frustrated with the tooling, boilerplate setup, or wondered if you were doing things the right way? Same. Good news! There is a better way.

My own project Flipt has used Protobufs and gRPC/gRPC Gateway from the very start; however, I have always been frustrated with just how clunky the entire process was.

In this post I’ll go over some of my frustrations with the Protobuf and gRPC ecosystems, how I switched Flipt over to use Buf, and why you should too. By the end, you’ll have a better understanding of how Buf can help you organize and work with your Protobufs and save you time generating code.

Note: I’m not affiliated with the Buf team. I just like the tooling.

If you’re not familiar with Protobufs or gRPC, I’d first recommend reading up on them and the problems they intend to solve before going further. There’s also an architecture page on how/why I use gRPC and gRPC Gateway in Flipt for more info on my specific use case.

Proto Problems

After working with Protobuf and gRPC over the years, I’ve come to dislike the whole process of generating code from these .proto files whenever a change is made to an existing API or a new API is added.

Some of my gripes include:

  1. The actual commands to invoke protoc and the multitude of options (with varying degrees of documentation) have always seemed archaic to me.

    protoc -I ./proto \
    --go_out ./proto --go_opt paths=source_relative \
    --go-grpc_out ./proto --go-grpc_opt paths=source_relative \
    --grpc-gateway_out ./proto --grpc-gateway_opt paths=source_relative

    Who designed this CLI? Even better, each protoc-*-plugin can have a different set of options, so if you want to generate clients in multiple languages, you’ll usually have to do a lot of trial and error to find the correct commands to execute.

  2. You have to know the rules for adding, updating, and deprecating fields in your .proto files because they aren’t enforced until you generate the code and try to use it. Nothing like getting to your pre-production or even production environment and finding out your new field isn’t backward compatible with the data from the server because someone didn’t use reserved. These kinds of breaking changes are also hard to spot in code review unless you know what you’re looking for.

  3. Finally, depending on any other .proto file from your organization (such as a common library or set of domain types) means that you have to physically copy or clone that file into your own project, or tell protoc where to find it on your machine.

    While it may seem obvious that this would be the case, it’s a huge hassle, especially when working in a large organization with many developers across many repositories, to require everyone to copy code locally and run a verbose incantation each time they want to generate code.
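To make the second gripe concrete, here’s a hypothetical message (not Flipt’s actual schema) showing how the reserved keyword guards a removed field:

```protobuf
syntax = "proto3";

message Flag {
  // Field 3 ("description") was removed. Reserving its number and name
  // prevents a future edit from silently reusing them with a different type,
  // which would break clients still reading old data.
  reserved 3;
  reserved "description";

  string key = 1;
  string name = 2;
  bool enabled = 4;
}
```

Without those reserved lines, nothing stops a teammate from adding a new field numbered 3, and nothing in code review flags it.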

And yes, I’ve tried Bazel. Don’t even get me started…


Buf to the Rescue

I recently stumbled upon Buf, and I’ll admit I was skeptical at first. However, after using it for about a week, I can say that I would recommend it to anyone who works with Protobufs in their projects.

But what is Buf? Per their documentation:

Buf is building tooling to make Protobuf reliable and easy to use for service owners and clients, while keeping it the obvious choice on the technical merits.

Currently, Buf has two main projects that work hand in hand with each other: the buf CLI and the Buf Schema Registry, or BSR.

The buf CLI

The buf CLI is a tool that you install locally on your machine which fixes most of my gripes with the existing proto tools. It includes commands for:

  • Linting your .proto files to ensure they meet your organization’s (or your own) standards as well as industry conventions.
  • Detecting breaking changes against a base branch (i.e. main) via a set of rules that enforce backward compatibility.
  • Code generation that invokes protoc behind the scenes, but uses a sane configurable template (YAML file) that can be checked in and managed as code. This was the big one for me as it means that I no longer have to think about protoc!
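Breaking-change detection in particular is simple to wire up. A minimal sketch of the relevant buf.yaml section (the FILE category is Buf’s strictest built-in rule set):

```yaml
version: v1
breaking:
  use:
    - FILE
```

With that in place, buf breaking --against '.git#branch=main' compares your local .proto files against the ones on main and fails on any incompatible change.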

You still need to have protoc and the necessary plugins installed on your machine, however. For example, if you want to generate Go gRPC clients, you’d first need to ensure you have Go installed, as well as the protoc-gen-go and protoc-gen-go-grpc plugins, via:

    go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
    go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest

In my opinion, it would be great if buf installed these plugins for you when they don’t exist on your machine, depending on which language you’re generating code for, somewhat like Bazel does, just with less fuss.

After getting buf installed and configured properly, I was able to simplify my Makefile for the Protobuf generation step from a mess of protoc arguments to one command: buf generate. Win.
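For comparison, the protoc incantation from earlier collapses into a buf.gen.yaml template roughly like this (a sketch mirroring those same plugins and options, not Flipt’s exact file):

```yaml
version: v1
plugins:
  - name: go
    out: proto
    opt: paths=source_relative
  - name: go-grpc
    out: proto
    opt: paths=source_relative
  - name: grpc-gateway
    out: proto
    opt: paths=source_relative
```

buf generate reads this template, invokes each protoc-gen-* plugin behind the scenes, and writes the output wherever out points.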

I also found that my .proto file failed many of the linting rules that buf lint uses by default. For now, I’ve simply added those failing rules to the ignore list so that I can fix them over time without breaking any existing client code.
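Ignoring rules is a one-liner per rule in buf.yaml. A sketch (these two rule IDs are real Buf lint rules, chosen for illustration rather than the ones Flipt actually excepts):

```yaml
version: v1
lint:
  use:
    - DEFAULT
  except:
    - ENUM_VALUE_PREFIX
    - RPC_REQUEST_RESPONSE_UNIQUE
```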


My final complaint with the traditional Protobuf ecosystem was that depending on any shared .protos required you to physically copy those files to your machine where protoc could find them. In my case, I generate OpenAPI v2 documentation for my REST API straight from my .proto file with the protoc-gen-openapiv2 plugin.

Before Buf, this required me to keep a copy of two files, options/annotations.proto and options/openapiv2.proto, within my project in order to generate the docs successfully. Now, with the help of the Buf Schema Registry, I can delete those files locally and simply add the required dependencies in my buf.yaml:

 - buf.build/googleapis/googleapis
 - buf.build/grpc-ecosystem/grpc-gateway
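In context, those entries live under the deps key of buf.yaml, something like:

```yaml
version: v1
deps:
  - buf.build/googleapis/googleapis
  - buf.build/grpc-ecosystem/grpc-gateway
```

Running buf mod update then resolves the dependencies and pins them in a buf.lock file.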

Now buf can fetch these files directly from the BSR instead of requiring them to exist on my machine. This will be a game-changer for organizations or large projects with multiple shared proto dependencies! You can simply publish a module containing your .proto files to the BSR, and your other projects can depend on them directly if they also use buf. Best of all, these modules are versioned automatically, so different consumers can depend on different versions independently!
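Publishing your own module is similarly small: name it in buf.yaml and run buf push (the module name below is a placeholder):

```yaml
version: v1
name: buf.build/your-org/your-protos
```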

Simplifying Client Generation

Another part of my workflow that I was able to simplify with buf was generating client code in other languages, namely Ruby. As Flipt exposes both a REST API (via gRPC gateway) and a gRPC API, I wanted to be able to generate gRPC clients in at least a couple of languages for consumers to use. I chose Ruby and Go since those are the languages I know the best. These two clients are published to separate repositories on GitHub, flipt-grpc-ruby and flipt-grpc-go respectively.

I had previously captured the client code generation in a script, which again was mainly just a bunch of protoc-*-plugin options, with the added step of installing the grpc-tools RubyGem if it wasn’t already installed. Honestly, though, I disliked having to keep an up-to-date version of Ruby and the grpc-tools gem on my machine just to generate this Ruby client once in a while.

Thankfully, I was able to replace all of the above with a separate buf.public.gen.yaml configuration to generate these clients. Now all I have to do is run buf generate --template=buf.public.gen.yaml to generate these clients, without having to script anything. The best part is I don’t even need to worry about ensuring that I have Ruby or the grpc-tools gem installed locally, thanks to Buf’s experimental Remote Generation!

In my buf.public.gen.yaml file I’m now depending on the buf.build/protocolbuffers/plugins/ruby and buf.build/grpc/plugins/ruby remote plugins, which generate the Ruby code for me remotely!
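For reference, the plugin section of that template looks roughly like this, using the remote key from Buf’s alpha remote-generation syntax (output paths here are illustrative, not Flipt’s actual layout):

```yaml
version: v1
plugins:
  - remote: buf.build/protocolbuffers/plugins/ruby
    out: gen/ruby
  - remote: buf.build/grpc/plugins/ruby
    out: gen/ruby
```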

I wanted to see how this is done, so I fired up my favorite proxy and captured the traffic. Sure enough, the buf client makes a request to https://api.buf.build/ with the contents of my .proto file, and the response contains the generated Ruby code, which buf then saves locally! Wild.

(Image: Remote Code Generation)

Final Thoughts

So, as you can see, I’ve pretty much become a Buf convert. I think it smooths out most of the rough edges of working with Protobufs and gRPC, and I believe it’s going to be the main way developers interact with these ecosystems in the future. Buf has great documentation and an awesome tutorial that helped me get going without much headache.

I’m looking forward to seeing what the Buf team releases next and how it helps shape the adoption of Protobufs and gRPC in the future.

Have you or your team tried moving your projects to use Buf? How did it go? Have you run into any issues in the process? Let me know on Twitter.

Like this post? Do me a favor and share it!