rules-proto-grpc / rules_proto_grpc

Bazel rules for building Protobuf and gRPC code and libraries from proto_library targets

Home Page: https://rules-proto-grpc.com

License: Apache License 2.0

Languages: Python 3.87%, Makefile 2.18%, C++ 2.12%, Go 30.74%, Starlark 61.09%
Topics: bazel, bazel-build, bazel-rules, protobuf, protocol-buffers, grpc, build-tools

rules_proto_grpc's Introduction

Protobuf and gRPC rules for Bazel

Bazel rules for building Protobuf and gRPC code and libraries from proto_library targets

If you or your company find these rules useful, please consider supporting the building and maintenance of these rules ☕

Important

The master branch now contains the Bzlmod-only update of the rules that'll be released in version 5.0.0. If you need to see the WORKSPACE-based rules used in version 4.x.x, please see the legacy branch. For tracking the status of language support with the Bzlmod rules, please see #299

Announcements 📣

2023/12/14 - Version 4.6.0

Version 4.6.0 has been released, which contains a few bug fixes for Bazel 7 support. Note that this is likely to be the last WORKSPACE-supporting release of rules_proto_grpc, as new Bzlmod-supporting rules are introduced in the next major version

2023/09/12 - Version 4.5.0

Version 4.5.0 has been released, which contains a number of version updates, bug fixes and usability improvements over 4.4.0. Additionally, the Rust rules contain a major change of underlying gRPC and Protobuf libraries; the rules now use Tonic and Prost respectively

Usage

Full documentation for the current and previous versions can be found here

Rules

Language Rule Description
Buf buf_proto_breaking_test Checks .proto files for breaking changes (example)
Buf buf_proto_lint_test Lints .proto files (example)
C c_proto_compile Generates C protobuf .h & .c files (example)
C c_proto_library Generates a C protobuf library using cc_library, with dependencies linked (example)
C++ cpp_proto_compile Generates C++ protobuf .h & .cc files (example)
C++ cpp_grpc_compile Generates C++ protobuf and gRPC .h & .cc files (example)
C++ cpp_proto_library Generates a C++ protobuf library using cc_library, with dependencies linked (example)
C++ cpp_grpc_library Generates a C++ protobuf and gRPC library using cc_library, with dependencies linked (example)
Documentation doc_docbook_compile Generates DocBook .xml documentation file (example)
Documentation doc_html_compile Generates .html documentation file (example)
Documentation doc_json_compile Generates .json documentation file (example)
Documentation doc_markdown_compile Generates Markdown .md documentation file (example)
Documentation doc_template_compile Generates documentation file using Go template file (example)
Go go_proto_compile Generates Go protobuf .go files (example)
Go go_grpc_compile Generates Go protobuf and gRPC .go files (example)
Go go_proto_library Generates a Go protobuf library using go_library from rules_go (example)
Go go_grpc_library Generates a Go protobuf and gRPC library using go_library from rules_go (example)
gRPC-Gateway gateway_grpc_compile Generates gRPC-Gateway .go files (example)
gRPC-Gateway gateway_openapiv2_compile Generates gRPC-Gateway OpenAPI v2 .json files (example)
gRPC-Gateway gateway_grpc_library Generates gRPC-Gateway library files (example)
Java java_proto_compile Generates a Java protobuf srcjar file (example)
Java java_grpc_compile Generates a Java protobuf and gRPC srcjar file (example)
Java java_proto_library Generates a Java protobuf library using java_library (example)
Java java_grpc_library Generates a Java protobuf and gRPC library using java_library (example)
Objective-C objc_proto_compile Generates Objective-C protobuf .m & .h files (example)
Objective-C objc_grpc_compile Generates Objective-C protobuf and gRPC .m & .h files (example)
Objective-C objc_proto_library Generates an Objective-C protobuf library using objc_library (example)
Objective-C objc_grpc_library Generates an Objective-C protobuf and gRPC library using objc_library (example)
Python python_proto_compile Generates Python protobuf .py files (example)
Python python_grpc_compile Generates Python protobuf and gRPC .py files (example)
Python python_grpclib_compile Generates Python protobuf and grpclib .py files (supports Python 3 only) (example)
Python python_proto_library Generates a Python protobuf library using py_library from rules_python (example)
Python python_grpc_library Generates a Python protobuf and gRPC library using py_library from rules_python (example)
Python python_grpclib_library Generates a Python protobuf and grpclib library using py_library from rules_python (supports Python 3 only) (example)

License

This project is derived from stackb/rules_proto under the Apache 2.0 license and therefore maintains the terms of that license

rules_proto_grpc's People

Contributors

aaliddell, amitz, ash2k, codersasha, cwoodcock, echochamber, gonzojive, ivucica, laurentlb, linzhp, lopopolo-brex, neilisaac, nmalsang, oliviernotteghem, paladine, pcj, philwo, purkhusid, qzmfranklin, smukherj1, stevewolter, tandel-pratik, thefallentree, thomas-kielbus, tivvit, toddlipcon, vmax, william-smith-skydio, yeswalrus, ypxu


rules_proto_grpc's Issues

Cannot build multiple python proto targets along the same directory path

Hi, I'm a current user of these rules (thanks so much for your hard work on them!) in a repo supporting a larger organization. However, I have run into a fairly significant issue: If I have multiple python_proto_library targets for proto libraries in the same directory, I cannot seem to simultaneously depend on more than one of these targets.

I have attached a minimal example that illustrates this -
https://github.com/laurit17/rules_proto_grpc_failing_example

It seems that the issue is that the output directory of each of these targets is appended to the PYTHONPATH by Bazel, and thus only one of them can be resolved as the package containing the generated modules. The only workaround I can think of is to build all of them in the same target, as sketched below.
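A hypothetical BUILD sketch of that workaround, assuming the deps-style python_proto_library API and proto_library targets named a_proto and b_proto:

load("@rules_proto_grpc//python:defs.bzl", "python_proto_library")

# One combined target, so only a single output root lands on the PYTHONPATH.
python_proto_library(
    name = "all_py_proto",
    deps = [
        ":a_proto",
        ":b_proto",
    ],
)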

As a bit of a Bazel newbie, it is interesting to me to look at the outputs of these rules. For example, in my attached example, if I run bazel build //proto:b_py_proto, the outputs are listed as

bazel-bin/proto/b_py_proto_pb/proto/b_pb2.py
bazel-bin/proto/b_py_proto_pb/proto/a_pb2.py (the underlying b_proto library depends on a_proto).

I assume that having the generated b_py_proto_pb folder avoids race conditions between rules that have the same generated files (for example, building :a_py_proto would also output a file named a_pb2.py). However, it comes at the cost of preventing targets from depending on more than one of these Python proto/gRPC libraries at once.

I have included some native cc_proto_library targets for comparison because, interestingly, they do not follow the pattern of the python_proto_library rules here. For example, building the b_cc_proto target does generate an a.pb.h file, as it must (just as a_cc_proto would), because the underlying proto_library depends on a.proto, but it does not list it as an output. Is this a potential workaround for the issue above?

Let me know your thoughts. This is an important blocker for us, and I am happy to attempt to contribute if necessary!

Bazel 1.0.0 support

Build currently breaks on recently released Bazel 1.0.0, see https://buildkite.com/bazel/rules-proto-grpc-rules-proto-grpc/builds/91. Pending issues:

  • Update gRPC to fix proto provider issue (not determined if this is fixed upstream yet)
  • Update rules_scala to fix create_provider issue. Resolved upstream in bazelbuild/rules_scala#513
  • Update all dependencies
  • Rename rules_proto_grpc_dependencies to rules_proto_grpc_repos
  • Write release notes
  • Add default WORKSPACE rules to each language docs
  • Try updating C# packages, otherwise bump to 1.x.0
  • Fix grpc-gateway
  • Fix rust rules
  • Drop gRPC.js

Other linked issues are on the 1.0.0 milestone.

Changes are being made on the bazel-1.0-support branch.

Address upstream issues

Tracking resolution of upstream issues

Python path conflict when a binary depends on several python_proto_library

I have a py_binary that has dependencies on several python_proto_library targets, something like the following:

/dir1/my_python_binary  (py_binary)
/dir2/proto/proto_a.proto (python_proto_library)
/dir2/proto/proto_b.proto (python_proto_library)
/dir3/proto/proto_c.proto (python_proto_library)

proto_c depends on proto_a
my_python_binary depends on proto_b and proto_c

All of this generates the following directory structure:

bazel-bin/dir1/my_python_binary.runfiles/proto_a_pb/dir2
bazel-bin/dir1/my_python_binary.runfiles/proto_c_pb/dir2
bazel-bin/dir1/my_python_binary.runfiles/proto_c_pb/dir3

Bazel adds bazel-bin/dir1/my_python_binary.runfiles/proto_a_pb and bazel-bin/dir1/my_python_binary.runfiles/proto_c_pb to sys.path,

which means that two different directories match the following import statement:
from dir2 import *

and Python will import everything only from the first matching directory on sys.path, hence part of my protos are not reachable.

This issue might be related to #25.
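A minimal standalone Python sketch of the shadowing behaviour described above, recreating the layout with hypothetical empty modules:

import pathlib
import sys
import tempfile

# Two generated roots, each containing a regular 'dir2' package.
root = pathlib.Path(tempfile.mkdtemp())
for pkg_root, module in [("proto_a_pb", "proto_a_pb2"), ("proto_c_pb", "proto_b_pb2")]:
    pkg = root / pkg_root / "dir2"
    pkg.mkdir(parents=True)
    (pkg / "__init__.py").write_text("")
    (pkg / (module + ".py")).write_text("")

# Bazel effectively does this: both roots end up on sys.path.
sys.path[:0] = [str(root / "proto_a_pb"), str(root / "proto_c_pb")]

import dir2.proto_a_pb2  # OK: 'dir2' binds to the first root's package
import dir2.proto_b_pb2  # ModuleNotFoundError: the second root's 'dir2' is shadowed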

Adding 'Options'?

Is there a way to add 'options' to a plugin without modifying the project's BUILD file, e.g. by passing them as an argument when defining the rule? I was thinking about something like this:

scala_grpc_compile(
    name = "test",
    options = [
            "java_conversions",
            "grpc",
        ],
    deps = [":testl_proto"],
)

We need to use java_conversions here, but I'm not completely OK with adding it to the Scala BUILD file, as, while it's very popular, not all users will want to use it.

Thanks.

Error encountered when trying to use python_grpc_library

I assume that pip_import with requirements = "@rules_proto_grpc//python:requirements.txt" is supposed to grab this requirement. I don't know enough about pip_import to properly debug this.

Error

ERROR: Analysis of target '//path/to/my/target:py_proto' failed; build aborted: no such package '@rules_proto_grpc_py3_deps_pypi__grpcio_1_25_0//': The repository '@rules_proto_grpc_py3_deps_pypi__grpcio_1_25_0' could not be resolved

BUILD.bazel

load("@rules_proto_grpc//python:defs.bzl", "python_grpc_library")

proto_library(
    name = "proto",
    srcs = ["my_grpc_service.proto"],
    visibility = ["//visibility:public"],
    deps = [
        "//proto/dep/target:proto",
        "@googleapis_archive//:api_proto",
    ],
)

python_grpc_library(
    name = "py_proto",
    protos = [":proto"],
    visibility = ["//visibility:public"],
    deps = [":proto"],
)

WORKSPACE

http_archive(
    name = "rules_proto_grpc",
    urls = ["https://github.com/rules-proto-grpc/rules_proto_grpc/archive/1.0.1.tar.gz"],
    sha256 = "497225bb586e8f587e139c55b0f015e93bdddfd81902985ce24623528dbe31ab",
    strip_prefix = "rules_proto_grpc-1.0.1",
)

load("@rules_proto_grpc//:repositories.bzl", "rules_proto_grpc_toolchains", "rules_proto_grpc_repos")
rules_proto_grpc_toolchains()
rules_proto_grpc_repos()

load("@rules_proto_grpc//python:repositories.bzl", rules_proto_grpc_python_repos="python_repos")

rules_proto_grpc_python_repos()

load("@com_github_grpc_grpc//bazel:grpc_deps.bzl", "grpc_deps")

grpc_deps()

load("@rules_python//python:repositories.bzl", "py_repositories")
py_repositories()

load("@rules_python//python:pip.bzl", "pip_repositories")
pip_repositories()

load("@rules_python//python:pip.bzl", "pip_import")
pip_import(
    name = "rules_proto_grpc_py2_deps",
    requirements = "@rules_proto_grpc//python:requirements.txt",
)

load("@rules_proto_grpc_py2_deps//:requirements.bzl", pip2_install="pip_install")
pip2_install()

pip_import(
    name = "rules_proto_grpc_py3_deps",
    requirements = "@rules_proto_grpc//python:requirements.txt",
)

load("@rules_proto_grpc_py3_deps//:requirements.bzl", pip3_install="pip_install")
pip3_install()

Cannot depend on two overlapping cpp_proto_library()'s

If you do this, then Bazel will link in two separate (merged) directories. At best, this will cause ODR violations at link time; at worst, it gets caught at runtime (I found it when compiling with ASAN to debug a segfault caused by it).

Essentially, if I have a cc_binary that depends on a rule at //path:proto_one_cc, which is a cpp_proto_library(), and another rule at //path:proto_two_cc, and they both depend on a cpp_proto_library(name="common"), then I'll get two instances of every linked object from the common proto. Typically, this will include harmless things like the default instances of the protos, but also various static objects, like descriptors, that are compared by pointer (because why would you have two descriptors for the same proto in your binary?).

One way to avoid this is to make the cpp_proto_library() also an aspect, not just the cpp_proto_compile(), and switch to merge_directories=False. That aspect would be able to grab the includes from all its dependencies, transitively, and add those to itself. It would then return CcInfo to be part of a C++ compilation natively.
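A hypothetical BUILD sketch of the diamond described above, using the deps-style API shown elsewhere in these issues:

load("@rules_proto_grpc//cpp:defs.bzl", "cpp_proto_library")

proto_library(name = "common_proto", srcs = ["common.proto"])
proto_library(name = "one_proto", srcs = ["one.proto"], deps = [":common_proto"])
proto_library(name = "two_proto", srcs = ["two.proto"], deps = [":common_proto"])

# Each cpp_proto_library compiles its full transitive proto closure, so
# common.proto is compiled into both libraries below.
cpp_proto_library(name = "proto_one_cc", deps = [":one_proto"])
cpp_proto_library(name = "proto_two_cc", deps = [":two_proto"])

# Linking both pulls in two copies of the common proto's symbols and
# descriptors, producing the ODR violations described above.
cc_binary(name = "app", srcs = ["main.cc"], deps = [":proto_one_cc", ":proto_two_cc"])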

grpc-web commonjs_grpc_compile error generating protobuf JS

Hello rules_proto_grpc friends,
I am trying to generate grpc-web/typescript bindings for a grpc service, but my setup is giving an error on commonjs_grpc_compile. A minimal example is in a git repo https://github.com/humphrej/rules_proto_grpc_ts_example (Bazel 1.2.1, rules_proto_grpc 1.0.2).

My service definition is :
https://github.com/humphrej/rules_proto_grpc_ts_example/blob/master/proto/sysinternals.proto

Based on https://github.com/grpc/grpc-web#typescript-support, and generating the files manually (these work), I want to generate:

  • SysinternalsServiceClientPb.ts : a service stub in typescript (from the typescript protoc generator)
  • sysinternals_pb.d.ts : a typescript type descriptor of the protobuf payloads (from the typescript protoc generator)
  • sysinternals_pb.js : a javascript file containing the protobuf payloads (from the commonjs protoc generator)

A script to generate the files manually is in the repo
https://github.com/humphrej/rules_proto_grpc_ts_example/blob/master/run_protoc.sh

My Bazel build file is :
https://github.com/humphrej/rules_proto_grpc_ts_example/blob/master/proto/BUILD

Rules java_grpc_library and ts_grpc_compile both work, but commonjs_grpc_compile gives the following error:

$ bazel build //...
DEBUG: /private/var/tmp/_bazel_humphrej/30d1a6d490a744b5cd1a1d6be8d4786a/external/rules_proto_grpc/aspect.bzl:375:13: ProtoCompile: bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_in=bazel-out/darwin-fastbuild/bin/external/com_google_protobuf/empty_proto-descriptor-set.proto.bin --plugin=protoc-gen-commonjs_plugin=bazel-out/darwin-opt-exec-2B5CBBC6/bin/external/com_github_grpc_grpc_web/javascript/net/grpc/web/protoc-gen-grpc-web --commonjs_plugin_out=import_style=commonjs,mode=grpcweb:bazel-out/darwin-fastbuild/bin/external/com_google_protobuf/empty_proto/commonjs_grpc_compile_aspect_verb1 google/protobuf/empty.proto
DEBUG: /private/var/tmp/_bazel_humphrej/30d1a6d490a744b5cd1a1d6be8d4786a/external/rules_proto_grpc/aspect.bzl:375:13: ProtoCompile: bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_in=bazel-out/darwin-fastbuild/bin/external/com_google_protobuf/empty_proto-descriptor-set.proto.bin:bazel-out/darwin-fastbuild/bin/proto/sysinternals_proto-descriptor-set.proto.bin --plugin=protoc-gen-commonjs_plugin=bazel-out/darwin-opt-exec-2B5CBBC6/bin/external/com_github_grpc_grpc_web/javascript/net/grpc/web/protoc-gen-grpc-web --commonjs_plugin_out=import_style=commonjs,mode=grpcweb:bazel-out/darwin-fastbuild/bin/proto/sysinternals_proto/commonjs_grpc_compile_aspect_verb1 proto/sysinternals.proto
INFO: Analyzed 5 targets (0 packages loaded, 0 targets configured).
INFO: Found 5 targets...
ERROR: /private/var/tmp/_bazel_humphrej/30d1a6d490a744b5cd1a1d6be8d4786a/external/com_google_protobuf/BUILD:309:2: output 'external/com_google_protobuf/empty_proto/commonjs_grpc_compile_aspect_verb1/google/protobuf/empty_grpc_web_pb.js' was not created
ERROR: /private/var/tmp/_bazel_humphrej/30d1a6d490a744b5cd1a1d6be8d4786a/external/com_google_protobuf/BUILD:309:2: not all outputs were created or valid
INFO: Elapsed time: 0.311s, Critical Path: 0.12s
INFO: 2 processes: 2 darwin-sandbox.
FAILED: Build did NOT complete successfully

Maven central should all use https (not http)

Since Jan 15, Maven Central requires HTTPS [1]. I think that repositories.bzl needs to be updated to reflect that.

[1] see https://support.sonatype.com/hc/en-us/articles/360041287334
[2] see https://github.com/rules-proto-grpc/rules_proto_grpc/blob/master/repositories.bzl#L191

From my build output:
ERROR: /builder/server-core/proto/BUILD:14:1: //server-core/proto:healthcheck_java_proto depends on @javax_annotation_javax_annotation_api//jar:jar in repository @javax_annotation_javax_annotation_api which failed to fetch. no such package '@javax_annotation_javax_annotation_api//jar': java.io.IOException: Error downloading [http://central.maven.org/maven2/javax/annotation/javax.annotation-api/1.2/javax.annotation-api-1.2.jar] to /workspace/.bazel/external/javax_annotation_javax_annotation_api/javax.annotation-api-1.2.jar: GET returned 501 HTTPS Required

cpp_proto_library Bazel 1.1.0 redefined timestamp.proto

I am trying to upgrade to Bazel 1.1.0.

I'm using 0.2.0 of this repo. Building a cpp_proto_library that does not depend on any other protos (not even WKTs) yields the following error:

ERROR: /mnt/data/git/LogiOcean/api/proto/gitdb/BUILD:188:1: error executing shell command: '/bin/bash -c bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_out=bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb/descriptor.source.bin --proto_path=bazel-out/k8-f...' failed (Exit 1) bash failed: error executing command
  (cd /mnt/data/cache/bazel/_bazel_zhongming/5a5efdfcd4f98edbd655a37c61128c16/sandbox/linux-sandbox/844/execroot/logi && \
  exec env - \
  /bin/bash -c 'bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_out=bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb/descriptor.source.bin --proto_path=bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb --include_imports --include_source_info --cpp_out=bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb/api/proto/gitdb/commit.proto bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb/api/proto/version_def.proto bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb/api/proto/common/attr_def.proto bazel-out/k8-fastbuild/bin/api/proto/gitdb/commit.pb.cc_pb/_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto')
Execution platform: //tools/platforms:linux64_clang

Use --sandbox_debug to see verbose messages from the sandbox
api/proto/gitdb/commit.proto:5:1: warning: Import google/protobuf/timestamp.proto but not used.
_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto:131:9: "google.protobuf.Timestamp.seconds" is already defined in file "google/protobuf/timestamp.proto".
_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto:137:9: "google.protobuf.Timestamp.nanos" is already defined in file "google/protobuf/timestamp.proto".
_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto:127:9: "google.protobuf.Timestamp" is already defined in file "google/protobuf/timestamp.proto".
INFO: Elapsed time: 0.306s, Critical Path: 0.10s
INFO: 31 processes: 31 remote cache hit.
FAILED: Build did NOT complete successfully

I glanced over the 1.0 milestone issues of this repo but did not spot any obviously related issues. I would like to try the bazel-1.0-support branch. But while doing that, I'd like to understand whether you've seen this before.

Thanks a lot.

Tool Should be OS Sensitive

Hi again :-)

We tried running our Bazel project on Windows, and I'm getting the following error when trying to compile scala_grpc_* targets:

--grpc_scala_plugin_out: protoc-gen-grpc_scala_plugin: %1 is not a valid Win32 application.

I had a look at the code, and it looks like @com_github_scalapb_scalapb references a zip file (scalapbc-0.9.4.zip). The zip file has a bin folder containing two protoc-gen-scala files: one binary for Windows and one for Linux. The scalapb BUILD file only exports the Linux file.

Also, the tool defined in the Scala BUILD file should be OS-sensitive. I'm assuming the easiest way to fix it is to add another parameter to proto_plugin called tool_windows with a default value. Then proto_plugin_impl can check whether the parameter is set to a non-default value and whether the OS is indeed Windows, and if so, set the ProtoPluginInfo provider's tool to the Windows tool (otherwise just the regular tool). This will have the least impact on other BUILD files, because everything can keep working the way it was, and proto_plugin_impl will just pick up the tool parameter normally.

If you don't want another parameter, you could change tool to be an array of sorts that holds two tools, but this means a broader change across the project.

Let me know if this makes sense to you, or perhaps you have a better idea. If needed, I might be able to help with a pull request.
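As a hedged sketch of one alternative that avoids a new rule attribute, the plugin's BUILD file could select the right binary per OS; the labels below are hypothetical:

alias(
    name = "protoc_gen_scala",
    actual = select({
        # Pick the Windows binary on Windows, the Linux one otherwise,
        # and point proto_plugin's tool at this alias.
        "@bazel_tools//src/conditions:windows": "@com_github_scalapb_scalapb//:protoc-gen-scala-windows",
        "//conditions:default": "@com_github_scalapb_scalapb//:protoc-gen-scala-linux",
    }),
)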

Creating a JAR With No Dependencies?

At the moment, when building a Java Library, all dependencies are being included into a single artifact (JAR). For example:

proto_library(
    name = 'test',
    srcs = ['test.proto'],
    deps = ['//dependencies:java_dependency']
)

java_grpc_library(
    name = "grpc",
    deps = [":test"],
)

The JAR that gets created will contain both the test class and the dependency's classes. This kind of goes against how java_proto_library works. The reason it's important for Java users is dependency management. Let's say, for example, I have two JARs: one would be test (the one defined above), and another called test2. Both depend on '//dependencies:java_dependency', but one JAR was created a week ago and one today, including changes in '//dependencies:java_dependency'. When both JARs are loaded, the JVM will effectively pick one at random, and that's a problem.

EDIT: Well, this is more complex than I originally thought.

        DefaultInfo(
            files = depset(all_outputs),
            data_runfiles = ctx.runfiles(files = all_outputs),
        ),

It looks like the provider returns the whole chain of dependencies. I'm assuming that's on purpose, so you won't have to re-define the same deps on targets that use the rule. So while that makes writing Bazel easier, it's a problem for Java. Maybe it's worth considering adding a 'transitive' flag of a sort that defines whether the rule returns all transitive dependencies or just the head, as sketched below.
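A minimal Starlark sketch of that idea, with hypothetical names (direct_outputs and all_outputs stand in for the lists the rule already computes from the aspect):

def _grpc_library_impl(ctx, direct_outputs, all_outputs):
    # Hypothetical: bundle the transitive closure only when asked to.
    outputs = all_outputs if ctx.attr.transitive else direct_outputs
    return [
        DefaultInfo(
            files = depset(outputs),
            data_runfiles = ctx.runfiles(files = outputs),
        ),
    ]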

Explicitly load rules_proto, rules_python, rules_cc etc

In the upcoming Bazel 1.0 release, a number of presently native rules will need to be explicitly loaded. Once 0.29 is released, testing with the incompatible flags can begin:

py_* -> https://github.com/bazelbuild/rules_python (bazelbuild/bazel#8893)
cc_* -> https://github.com/bazelbuild/rules_cc (bazelbuild/bazel#7643)
proto_* -> https://github.com/bazelbuild/rules_proto (bazelbuild/bazel#8891)
java_* -> https://github.com/bazelbuild/rules_java (bazelbuild/bazel#8741)

Release 1.0.0 of rules_proto_grpc will be scheduled to coincide with Bazel 1.0 and incorporate these changes.

Presently blocked on 0.29 release.
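For reference, a sketch of the explicit load statements that become mandatory (load paths follow each ruleset's standard defs.bzl; check each ruleset's docs for the exact symbols):

# Explicit loads replacing the formerly native rules:
load("@rules_python//python:defs.bzl", "py_binary", "py_library")
load("@rules_cc//cc:defs.bzl", "cc_binary", "cc_library")
load("@rules_proto//proto:defs.bzl", "proto_library")
load("@rules_java//java:defs.bzl", "java_binary", "java_library")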

Settings Scala Version to 2.12.8

Hi :-)

I'm using scala_grpc_compile to build a couple of gRPC services. It looks like scala_grpc_compile basically uses rules_scala. rules_scala allows you to set the Scala version you're compiling against with scala_repositories(), like so:

scala_repositories((
    "2.12.8",
    {
        "scala_compiler": "f34e9119f45abd41e85b9e121ba19dd9288b3b4af7f7047e86dc70236708d170",
        "scala_library": "321fb55685635c931eba4bc0d7668349da3f2c09aee2de93a70566066ff25c28",
        "scala_reflect": "4d6405395c4599ce04cea08ba082339e3e42135de9aae2923c9f5367e957315a",
    },
))

Along with settings the Scala proto repositories:

scala_proto_repositories(scala_version = "2.12.8")

However, while it works when I try it directly with rules_scala, it fails when using scala_grpc_compile:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at scalapb.compiler.ProtobufGenerator$.parseParameters(ProtobufGenerator.scala:1742)
        at scalapb.compiler.ProtobufGenerator$.handleCodeGeneratorRequest(ProtobufGenerator.scala:1767)
        at CompilerPlugin$.main(CompilerPlugin.scala:13)
        at CompilerPlugin.main(CompilerPlugin.scala)
--grpc_scala_plugin_out: protoc-gen-grpc_scala_plugin: Plugin failed with status code 1.

It looks like a version difference with ScalaPB. So I forked the repository and changed the following in repositories.bzl:

    "com_github_scalapb_scalapb": {
        "type": "http",
        "urls": ["https://github.com/scalapb/ScalaPB/releases/download/v0.9.0/scalapbc-0.9.0.zip"],
        # "sha256": "bda0b44b50f0a816342a52c34e6a341b1a792f2a6d26f4f060852f8f10f5d854",
        "strip_prefix": "scalapbc-0.9.0/lib",
        "build_file": "@rules_proto_grpc//third_party:BUILD.bazel.com_github_scalapb_scalapb",
    },

And set the correct jar:

java_import(
    name = "compilerplugin",
    jars = ["com.thesamet.scalapb.compilerplugin-0.9.0.jar"],
    visibility = ["//visibility:public"],
)

(By the way, is there a way to change the version other than forking and editing the file directly?)
Now I'm getting a different error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at scalapb.compiler.ProtobufGenerator$.parseParameters(ProtobufGenerator.scala:1853)
        at scalapb.compiler.ProtobufGenerator$.handleCodeGeneratorRequest(ProtobufGenerator.scala:1882)
        at CompilerPlugin$.main(CompilerPlugin.scala:13)
        at CompilerPlugin.main(CompilerPlugin.scala)
--grpc_scala_plugin_out: protoc-gen-grpc_scala_plugin: Plugin failed with status code 1.
Target //cme/orgmodel/api/v2:orgmodel_scala_better_grpc failed to build
Use --verbose_failures to see the command lines of failed build steps.

Looks like a version mismatch again. Am I missing something here? Maybe it's the scala-library import that is hard-coded to 2.11 (it's taken from scalapb, I'm guessing).

Thanks!

Plugin output_directory fails when no outputs generated

See #31 (comment)

If a plugin that uses output_directory does not produce any outputs (e.g. gRPC plugin with no services), the copy to merge the directories will fail. Most plugins are well behaved, in that they will generate empty output files even when there are no services, but this should not be a hard requirement.

The command should first check that outputs have been generated, or otherwise allow cp to fail.
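A hedged Starlark sketch of such a guard in the aspect's merge action; plugin_dir and merged_dir are hypothetical names for the declared directory artifacts:

ctx.actions.run_shell(
    inputs = [plugin_dir],
    outputs = [merged_dir],
    command = """
mkdir -p '{merged}'
# Only copy when the plugin actually produced files; cp on an empty
# directory would otherwise fail the whole action.
if [ -n "$(ls -A '{src}' 2>/dev/null)" ]; then
    cp -r '{src}/.' '{merged}/'
fi
""".format(src = plugin_dir.path, merged = merged_dir.path),
)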

Support C++ implementation for Python protobuf messages

Presently this can be achieved by setting the correct --define and binding python_headers, but this is somewhat cumbersome. Using the prebuilt wheels may be a reasonable solution, as this is equivalent to what is done for other languages.
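For illustration only, a hedged sketch of the manual approach: bind python_headers in the WORKSPACE and build with protobuf's --define=use_fast_cpp_protos=true. The repository name below follows the gRPC/protobuf conventions of that era and is an assumption that may differ per setup:

# WORKSPACE sketch (hypothetical Python autoconf repository name):
bind(
    name = "python_headers",
    actual = "@local_config_python//:python_headers",
)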

csharp_grpc_compile assumes that all dependent protos have services

If a proto file with a service contains a dependency on another proto file without services, csharp_grpc_compile fails with an error:

ERROR: /home/aml/tmp/rules-proto-grpc-bug/BUILD.bazel:3:1: output 'mypackage_proto/csharp_grpc_compile_aspect_verb0/MypackageGrpc.cs' was not created
ERROR: /home/aml/tmp/rules-proto-grpc-bug/BUILD.bazel:3:1: not all outputs were created or valid

Min repro: https://github.com/amlinux/rules-proto-grpc-bug

Missing Srcjar

x_grpc_library doesn't output a srcjar file into the package artifact folder (I tried in both Scala and Java). It looks like the aspect generates the srcjars, as I can see two srcjars per proto target (one for messages and one for gRPC) in the aspect_verb folder, but a single srcjar that includes both message and gRPC classes from all proto targets isn't being saved.

I know both the native java_proto_library and scala_proto_library output a main srcjar, so it might be a good idea for both rules_proto_grpc libraries to do the same.

Thanks!

Bazel complains that ..._reflection_proto_only does not have mandatory providers: 'proto'

I configured the Bazel project following the cpp_grpc_library section of https://rules-proto-grpc.github.io/rules_proto_grpc/cpp/

My Bazel version is 1.0.0

My WORKSPACE file looks like:

load("@rules_proto_grpc//:repositories.bzl", "rules_proto_grpc_toolchains")
rules_proto_grpc_toolchains()

load("@rules_proto_grpc//cpp:repositories.bzl", rules_proto_grpc_cpp_repos="cpp_repos")
rules_proto_grpc_cpp_repos()

load("@com_github_grpc_grpc//bazel:grpc_deps.bzl", "grpc_deps")
grpc_deps()

My proto BUILD file looks like:

package(default_visibility = ["//visibility:public"])

load("@rules_proto_grpc//cpp:defs.bzl", "cpp_grpc_library")

proto_library(
    name = "protos",
    srcs = [
        "tts.proto",
        "speech.proto",
        "sagittarius_event_server.proto",
    ],
    visibility = ["//visibility:public"],
)

cpp_grpc_library(
    name = "grpc_lib",
    deps = [":protos"],
)

During the Bazel build, I get the following error message:
ERROR: /home/qli/.cache/bazel/_bazel_qli/94d4690820716448475d1a244f54f936/external/com_github_grpc_grpc/src/proto/grpc/reflection/v1alpha/BUILD:21:1: in srcs attribute of _generate_cc rule @com_github_grpc_grpc//src/proto/grpc/reflection/v1alpha:_reflection_proto_grpc_codegen: '@com_github_grpc_grpc//src/proto/grpc/reflection/v1alpha:_reflection_proto_only' does not have mandatory providers: 'proto'. Since this rule was created by the macro 'grpc_proto_library', the error might have been caused by the macro implementation in /home/qli/.cache/bazel/_bazel_qli/94d4690820716448475d1a244f54f936/external/com_github_grpc_grpc/src/proto/grpc/reflection/v1alpha/BUILD:21:1

It seems like it may be an incompatible version? Do you know how to fix this?

Best Regards
Qian

Public headers must be in `hdrs`, not `srcs`.

Currently, generated proto headers for C/C++ are placed in srcs. Since these are supposed to be "public" headers, this is incorrect. See the documentation for an explanation.

Headers in srcs must only be directly included from the files in hdrs and srcs of the library itself.

The current implementation works by accident because Bazel doesn't enforce the distinction between public and private headers.
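A minimal cc_library sketch of the distinction, with hypothetical file names:

cc_library(
    name = "thing_proto_cc",
    srcs = ["thing.pb.cc"],  # implementation only
    hdrs = ["thing.pb.h"],   # public generated header, includable by dependents
    deps = ["@com_google_protobuf//:protobuf"],
)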

Setting up Python 2/3 in Windows Doesn't Work

Hi,

It's not something you can fix, I think, but it's probably worth mentioning in the documentation.

When installing Python 2 or 3 on Windows, the name of the executable is python.exe. The Python 2 rules look for python2, and the Python 3 rules look for python3. Even if you can figure out that it's running on Windows, calling 'python' is not enough, because you don't know which Python is currently installed.

What I did here was rename the python.exe that is Python 2 to python2.exe, and the python.exe that is Python 3 (in a different folder) to python3.exe.

Probably worth mentioning in the documentation as a workaround, as I don't see a better option.
Thanks!

How to specify the version of grpc and protobuf

I am using rules_proto_grpc version 0.2.0 and found that the default version of gRPC is 1.22.
Now I want to update my gRPC and Protobuf to the latest released versions; how can I do this?
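One common pattern, sketched here under the assumption that it applies to the 0.2.0-era rules: repository macros like rules_proto_grpc's typically skip dependencies that are already defined (the usual existing_rules/maybe pattern), so declaring com_github_grpc_grpc yourself before calling them pins the version. The URL and version below are illustrative:

# WORKSPACE sketch: pre-declare gRPC so the rules' own (later) declaration
# is skipped in favour of yours.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "com_github_grpc_grpc",
    urls = ["https://github.com/grpc/grpc/archive/v1.26.0.tar.gz"],  # illustrative version
    strip_prefix = "grpc-1.26.0",
    # sha256 = "...",  # fill in the real checksum for the version you pick
)

load("@rules_proto_grpc//:repositories.bzl", "rules_proto_grpc_repos", "rules_proto_grpc_toolchains")

rules_proto_grpc_toolchains()

rules_proto_grpc_repos()  # keeps the com_github_grpc_grpc defined above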

ts_grpc_compile() doesn't generate service definitions

I am trying to understand how to generate the grpc client class using ts_grpc_compile(). The rule only seems to be generating a .d.ts file containing generated code for proto messages.

Here is my setup (relative to WORKSPACE):

  1. protos/test.proto:
syntax = "proto3";
option java_package = "example.simple";
option java_outer_classname = "SimpleProtos";

message SimpleRequest {
    int64 id = 1 ;
}

message SimpleResponse {
    string data = 1;
}

service Simple {
    rpc Fetch(SimpleRequest) returns (SimpleResponse);
}
  2. protos/BUILD
package(default_visibility = ["//visibility:public"])

load("@rules_proto_grpc//github.com/grpc/grpc-web:defs.bzl", "ts_grpc_compile")

proto_library(
    name = "test_proto",
    srcs = ["test.proto"],
)

ts_grpc_compile(
    name = "test_grpc_web",
    deps = [":test_proto"],
    verbose = 3,
)

Running bazel build //protos/... generates the file ./bazel-bin/protos/test_grpc_web/protos/test_pb.d.ts, which doesn't have generated code for the Simple service client:

import * as jspb from "google-protobuf"

export class SimpleRequest extends jspb.Message {
  getId(): number;
  setId(value: number): void;

  serializeBinary(): Uint8Array;
  toObject(includeInstance?: boolean): SimpleRequest.AsObject;
  static toObject(includeInstance: boolean, msg: SimpleRequest): SimpleRequest.AsObject;
  static serializeBinaryToWriter(message: SimpleRequest, writer: jspb.BinaryWriter): void;
  static deserializeBinary(bytes: Uint8Array): SimpleRequest;
  static deserializeBinaryFromReader(message: SimpleRequest, reader: jspb.BinaryReader): SimpleRequest;
}

export namespace SimpleRequest {
  export type AsObject = {
    id: number,
  }
}

export class SimpleResponse extends jspb.Message {
  getData(): string;
  setData(value: string): void;

  serializeBinary(): Uint8Array;
  toObject(includeInstance?: boolean): SimpleResponse.AsObject;
  static toObject(includeInstance: boolean, msg: SimpleResponse): SimpleResponse.AsObject;
  static serializeBinaryToWriter(message: SimpleResponse, writer: jspb.BinaryWriter): void;
  static deserializeBinary(bytes: Uint8Array): SimpleResponse;
  static deserializeBinaryFromReader(message: SimpleResponse, reader: jspb.BinaryReader): SimpleResponse;
}

export namespace SimpleResponse {
  export type AsObject = {
    data: string,
  }
}

Note that at verbose=3, ts_grpc_compile() outputs the following command it runs:
bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_in=bazel-out/k8-fastbuild/bin/protos/test_proto-descriptor-set.proto.bin --plugin=protoc-gen-ts_plugin=bazel-out/k8-opt-exec-724A0052/bin/external/com_github_grpc_grpc_web/javascript/net/grpc/web/protoc-gen-grpc-web --ts_plugin_out=import_style=typescript,mode=grpcweb:bazel-out/k8-fastbuild/bin/protos/test_proto/ts_grpc_compile_aspect_verb3 protos/test.proto

If I run this command manually, after changing bazel-out/k8-fastbuild/bin/protos/test_proto/ts_grpc_compile_aspect_verb3 to /tmp/bazel_test, then I get the following output (which has a file testServiceClientPb.ts containing the generated client):

$ tree /tmp/bazel_test/
/tmp/bazel_test/
|-- Protos
|   `-- testServiceClientPb.ts
`-- protos
    `-- test_pb.d.ts

Am I missing something, or is ts_grpc_compile() not supposed to output testServiceClientPb.ts?

Fix CI

Submitted for Bazel CI

Only Build gRPC Protobuf with the gRPC Gateway

The protoc-gen-grpc_gateway plugin generates a gRPC gateway from gRPC protobuf files.
Consider this example:

service_a.proto

syntax = "proto3";

package service_a;

message TestMeServiceA {
    string name = 1;
}

service_b.proto

syntax = "proto3";

package service_a;

import "google/api/annotations.proto";
import "service_a.proto";

service TestService {
    rpc Echo(service_a.TestMeServiceA) returns (service_a.TestMeServiceA) {
        option (google.api.http) = {
            post: "/v1/example/echo"
            body: "*"
        };
    }
}

service_a.proto only holds a message. service_b.proto references service_a.proto and also defines a gRPC service that uses the message defined in service_a.proto.

When trying to build the gateway against this, Bazel complains that it can't find a file called service_a.gw.go. Checking the final depset, there are four files:

  • service_a.go
  • service_a.gw.go
  • service_b.go
  • service_b.gw.go

(The reason two files are generated for each proto is that the gateway_grpc_aspect defines both go and grpc as plugins.)

However, the gateway plugin will not create a *.gw.go file when the proto file doesn't have a gRPC service definition (it doesn't print any error; it just doesn't create anything).

So the real files created on the filesystem would be:

  • service_a.go
  • service_b.go
  • service_b.gw.go

That's why Bazel complains about the missing service_a.gw.go.
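A hypothetical Starlark sketch of the fix direction, predicting a .gw.go output only for protos that actually declare a service (has_service is a stand-in for a real check against the proto descriptor):

def _gateway_outputs(proto_files):
    outs = []
    for f in proto_files:
        base = f.basename[:-len(".proto")]
        outs.append(base + ".go")  # the go plugin emits this for every proto
        if has_service(f):  # stand-in predicate
            outs.append(base + ".gw.go")  # gateway output exists only for services
    return outs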

Feature request `ts_proto_library` & `ts_grpc_library`

Hi there,

Thanks so much for carrying the torch and providing these much-needed rules. I'm currently in the process of migrating away from the stackb rules, and struggling to create ts_library targets for both protobuf and gRPC (a colleague who no longer works with me previously wrote custom rules on top of stackb to generate the necessary TypeScript code).

While I have taken a stab at it, Bazel is unfortunately not my forte. I thought I'd open an issue here and see if there was any interest in supporting both a ts_proto_library rule and a ts_grpc_library rule, since there is already a ts_grpc_compile.

Is there community need for TS proto+gRPC bazel rules?
