rules_protobuf: Bazel rules for building protocol buffers and gRPC services (java, c++, ...)

NOTICE

rules_protobuf was initially written when the bazel protobuf ecosystem was fairly immature. Now, 2 years later, this repository is showing its age. Rather than retrofit this set of rules, it's been re-written from the ground up to work correctly with the native.proto_library rule and is available at https://github.com/stackb/rules_proto. Consequently, these rules are effectively no longer being maintained.

Please upgrade to the newer rules when appropriate and file issues if there are feature gaps that need to be filled.

Cheers, @pcj (Oct 25 2018)

rules_protobuf

Bazel skylark rules for building protocol buffers with +/- gRPC support on (osx, linux).


How is this related to the proto_library rules within Bazel itself?

These rules sprung out of a need to have protobuf support when there were only limited exposed and documented proto generation capabilities in the main bazel repository. This is a moving target. The main goals of this project are to:

- Provide protoc, the protocol buffer compiler (v3.5.1).
- Provide the language-specific plugins.
- Provide the necessary libraries and dependencies for gRPC support, when possible.
- Provide an extensible proto_language abstraction (used in conjunction with the proto_compile rule) to generate outputs for current and future custom protoc plugins not explicitly provided here.

Rules

Refer to DEPENDENCIES.md for a more detailed summary of workspace dependencies / versions.

- Support for generation of protoc outputs via the proto_compile() rule.
- Support for generation + compilation of outputs with protobuf dependencies.
- gRPC support.
- Highly experimental (probably not functional yet); a work-in-progress for those interested in contributing further work.

Usage

1. Install Bazel

If you have not already installed bazel on your workstation, follow the bazel instructions.

NOTE: Bazel 0.8.0 or above is required for go support.

2. Add rules_protobuf to your WORKSPACE

Specify the language(s) you'd like to use by loading the language-specific *_proto_repositories rule(s):

git_repository(
    name = "org_pubref_rules_protobuf",
    remote = "https://github.com/pubref/rules_protobuf",
    tag = "v0.8.2",
    #commit = "..." # alternatively, use latest commit on master
)

load("@org_pubref_rules_protobuf//java:rules.bzl", "java_proto_repositories")
java_proto_repositories()

load("@org_pubref_rules_protobuf//cpp:rules.bzl", "cpp_proto_repositories")
cpp_proto_repositories()

load("@org_pubref_rules_protobuf//go:rules.bzl", "go_proto_repositories")
go_proto_repositories()

Several languages have other rules_* dependencies that you'll need to load before the *_proto_repositories() function is invoked (refer to DEPENDENCIES.md for the specific rules_* workspace each one requires); a sketch for go follows this list:

- closure_proto_repositories
- csharp_proto_repositories
- go_proto_repositories
- gogo_proto_repositories
- grpc_gateway_proto_repositories
- node_proto_repositories
- py_proto_repositories (1)

(1) Only needed for python gRPC support.
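For example, go_proto_repositories expects a Go rule set to already be loaded in the WORKSPACE. The snippet below is only a minimal sketch of that ordering: the io_bazel_rules_go repository name, tag, load path, and the go_rules_dependencies / go_register_toolchains macros are assumptions tied to the rules_go version you pin, so consult DEPENDENCIES.md and the rules_go documentation for the exact setup.

# Sketch only: load a Go rule set before calling go_proto_repositories().
# The repository name, tag, load path, and macro names below are assumptions
# for a rules_go release contemporary with Bazel 0.8; adjust to your version.
git_repository(
    name = "io_bazel_rules_go",
    remote = "https://github.com/bazelbuild/rules_go",
    tag = "0.9.0",  # hypothetical pin
)

load("@io_bazel_rules_go//go:def.bzl", "go_rules_dependencies", "go_register_toolchains")

go_rules_dependencies()
go_register_toolchains()

load("@org_pubref_rules_protobuf//go:rules.bzl", "go_proto_repositories")

go_proto_repositories()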

3. Add *_proto_* rules to your BUILD files

To build a java-based gRPC library:

load("@org_pubref_rules_protobuf//java:rules.bzl", "java_proto_library")

java_proto_library(

name = "protolib",

protos = [

"my.proto"

],

with_grpc = True,

verbose = 1, # 0=no output, 1=show protoc command, 2+ more...

)
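The resulting :protolib target can then be consumed like an ordinary java library. A minimal sketch, where Greeter.java and the main_class are hypothetical names used only for illustration:

# Sketch: consume the generated library from an ordinary java_binary.
java_binary(
    name = "greeter",
    srcs = ["Greeter.java"],  # hypothetical source that uses the generated classes
    main_class = "example.Greeter",  # hypothetical
    deps = [
        ":protolib",  # the java_proto_library target defined above
    ],
)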

Examples

To run the examples & tests in this repository, clone it to your workstation.

# Clone this repo
$ git clone https://github.com/pubref/rules_protobuf

# Go to the examples/helloworld directory
$ cd rules_protobuf/examples/helloworld

# Run all tests
$ bazel test examples/...

# Build a server
$ bazel build cpp/server

# Run a server from the command-line
$ $(bazel info bazel-bin)/examples/helloworld/cpp/server

# Run a client
$ bazel run go/client
$ bazel run cpp/client
$ bazel run java/org/pubref/rules_protobuf/examples/helloworld/client:netty

Overriding or excluding WORKSPACE dependencies

To load alternate versions of dependencies, pass in a dict having the same overall structure as a deps.bzl file. Entries having a matching key will override those found in the file. For example, to load a different version of https://github.com/golang/protobuf, provide a different commit ID:

load("@org_pubref_rules_protobuf//go:rules.bzl", "go_proto_repositories")

go_proto_repositories(

overrides = {

"com_github_golang_protobuf": {

# Override golang with a different commit

"commit": "2c1988e8c18d14b142c0b472624f71647cf39adb",

}

},

)

You may already have some of these external dependencies present in your workspace; rules_protobuf will attempt to load them again, causing a collision. To prevent rules_protobuf from loading specific external workspaces, name them in the excludes list:

go_proto_repositories(
    excludes = [
        "com_github_golang_glog",
    ],
)

To completely replace the set of dependencies that will attempt to be loaded, you can pass in a full dict object to the lang_deps attribute:

go_proto_repositories(
    lang_deps = {
        "com_github_golang_glog": {
            ...
        },
    },
)

There are several language --> language dependencies as well. For example, python_proto_repositories and ruby_proto_repositories (and more) internally call the cpp_proto_repositories rule to provide the grpc plugins. To suppress this (and have better control in your workspace), you can use the omit_cpp_repositories=True option, roughly as sketched below.
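A minimal sketch of that arrangement, using the load paths shown elsewhere in this README; the //python package path is an assumption to verify against the rules in this repository:

# Sketch only: load the cpp repositories yourself, then ask the python
# rules not to load them a second time.
load("@org_pubref_rules_protobuf//cpp:rules.bzl", "cpp_proto_repositories")

cpp_proto_repositories()

load("@org_pubref_rules_protobuf//python:rules.bzl", "py_proto_repositories")  # package path is an assumption

py_proto_repositories(
    omit_cpp_repositories = True,  # cpp repositories already loaded above
)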

Proto B --> Proto A dependencies

Use the proto_deps attribute to name proto rule dependencies. Use of proto_deps implies you're using imports, so read on...

Imports

In all cases, these rules will include a --proto_path=. (-I.) argument. This is functionally equivalent to --proto_path=$(bazel info execution_root). Therefore, when the protoc tool is invoked, it will 'see' whatever directory structure exists at the bazel execution root for your workspace. To better learn what this looks like, cd $(bazel info execution_root) and look around. In general, it contains all your source files as they appear in your workspace, with an additional external/WORKSPACE_NAME directory for all dependencies used.

This has implications for import statements in your protobuf source files, if you use them. The two cases to consider are imports within your workspace (referred to here as 'internal' imports), and imports of other protobuf files in an external workspace ('external' imports).

Internal Imports

Internal imports should require no additional parameters if your import statements follow the same directory structure as your workspace. For example, the examples/helloworld/proto/helloworld.proto file imports the examples/proto/common.proto file. Since this matches the workspace directory structure, protoc can find it, and no additional arguments to a cc_proto_library are required for the protoc code generation step.

Obviously, importing a file does not mean that code will be generated for it. Therefore, use of the imports attribute implies that the generated files for the imported message or service already exist somewhere that can be used as a dependency by some other library rule (such as srcs for java_library).

Rather than using imports, it often makes more sense to declare a dependency on another proto_library rule via the proto_deps attribute. This makes the import available to the calling rule and performs the code generation step. For example, the cc_proto_library rule in examples/helloworld/proto:cpp names the //examples/proto:cpp cc_proto_library rule in its proto_deps attribute to accomplish both code generation and compilation of object files for the proto chain, roughly as sketched below.
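A minimal sketch of that shape (the package layout matches the description above, but the BUILD contents are illustrative rather than copied verbatim from the examples directory):

# examples/proto/BUILD (sketch)
load("@org_pubref_rules_protobuf//cpp:rules.bzl", "cc_proto_library")

cc_proto_library(
    name = "cpp",
    protos = ["common.proto"],
)

# examples/helloworld/proto/BUILD (sketch)
load("@org_pubref_rules_protobuf//cpp:rules.bzl", "cc_proto_library")

cc_proto_library(
    name = "cpp",
    # helloworld.proto contains: import "examples/proto/common.proto";
    protos = ["helloworld.proto"],
    proto_deps = [
        "//examples/proto:cpp",  # generates and compiles the imported proto
    ],
)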

External Imports

The same logic applies to external imports. The two questions to ask yourself when setting up your rules are:

[Question 1]: Can protoc "see" the imported file? In order to satisfy this requirement, pass in the full path of the required file(s) relative to the execution root where protoc will be run. For example, the well-known descriptor.proto could be made visible to protoc via:

java_proto_library(
    name = "fooprotos",
    protos = ["foo.proto"],
    imports = [
        "external/com_google_protobuf/src/",
    ],
    inputs = [
        "@com_google_protobuf//:well_known_protos",
    ],
)

This would be imported as import "google/protobuf/descriptor.proto", given that the file @com_google_protobuf/src/google/protobuf/descriptor.proto is in the package google.protobuf.

[Question 2]: Can the cc_proto_library rule "see" the generated protobuf files (in this case, descriptor.pb.{h,cc})? Just because the file was imported does not imply that protoc will generate outputs for it, so somewhere in the cc_library rule dependency chain these files must be present. This could be via another cc_proto_library rule defined elsewhere, or some other filegroup or label list. If the source is another cc_proto_library rule, specify that in the proto_deps attribute of the calling cc_proto_library rule. Otherwise, pass a label that includes the (pregenerated) protobuf files to the deps attribute, just as you would for any typical cc_library rule.
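As a sketch of the deps route, under the assumption (not stated in this README) that the protobuf C++ runtime target @com_google_protobuf//:protobuf already carries the pregenerated descriptor.pb.{h,cc}:

# Sketch only: supply the pregenerated descriptor sources/headers through
# an ordinary deps label, as with any cc_library rule.
load("@org_pubref_rules_protobuf//cpp:rules.bzl", "cc_proto_library")

cc_proto_library(
    name = "fooprotos",
    protos = ["foo.proto"],
    imports = ["external/com_google_protobuf/src/"],
    inputs = ["@com_google_protobuf//:well_known_protos"],
    deps = [
        "@com_google_protobuf//:protobuf",  # assumed to provide descriptor.pb.{h,cc}
    ],
)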

Important note about sandboxing: simply stating the path where protoc should look for imports (via the imports attribute) is not enough to work with the bazel sandbox. Bazel is very particular about needing to know exactly which inputs are required for a rule, and exactly what output files it generates. If an input is not declared, it will not be exposed in the sandbox. Therefore, we have to provide both the import path and a label-generating rule in the inputs attribute that names the files we want available in the sandbox (given here by :well_known_protos).

If you are having problems, put verbose={1,2,3} in your build rule and/or disable sandboxing with --spawn_strategy=standalone.

Contributing

Contributions welcome; please create Issues or GitHub pull requests.

Credits

Much thanks to all contributors and the members of the bazel, protobuf, and gRPC teams.
