
gpb's People

Contributors

bobergj, evanmcc, fbernier, fenollp, g-andrade, galdor, gilbertwong96, jabarszcz, jesse-lauro, kianmeng, klajo, krzysiekj, lpgauth, lrascao, lukebakken, madninja, mmmries, nuxlli, peterzeller, pma, roowe, sasa1977, shuieryin, squidfunk, the-mikedavis, tomas-abrahamsson, tsloughter, tyler-eon, ypaq, yueyoum


gpb's Issues

Maps support

It would be nice if there were an option to use Erlang maps for protobuf map fields, but still records for the actual protobuf messages in encode/decode.

It seems like a natural match, since a protobuf message is a defined set of fields, like a record, whereas a map is dynamic. I can look at adding this feature.

Dialyzer issues for required fields

When the codegen goes over required fields, it generates code like:


'merge_msg_mesos_state_agentoverlayinfo.state'(#'mesos_state_agentoverlayinfo.state'{status = PFstatus,
                                                                                     error = PFerror},
                                               #'mesos_state_agentoverlayinfo.state'{status = NFstatus,
                                                                                     error = NFerror},
                                               _) ->
    #'mesos_state_agentoverlayinfo.state'{status = if NFstatus =:= undefined -> PFstatus;
                                                      true -> NFstatus
                                                   end,
                                          error = if NFerror =:= undefined -> PFerror;
                                                     true -> NFerror
                                                  end}.

The generated typespec for status / NFstatus doesn't include undefined. Given this, the if NFstatus =:= undefined clause should never match. I think the codegen should omit that test in the merge function when the field is required.
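
For illustration, a sketch of what the merge clause could look like with that test omitted for required fields (hand-written, not actual gpb output):

'merge_msg_mesos_state_agentoverlayinfo.state'(#'mesos_state_agentoverlayinfo.state'{},
                                               #'mesos_state_agentoverlayinfo.state'{status = NFstatus,
                                                                                     error = NFerror},
                                               _) ->
    %% Both fields are required, so per the typespec the new values
    %% can never be 'undefined' and can be taken as-is.
    #'mesos_state_agentoverlayinfo.state'{status = NFstatus, error = NFerror}.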

Add support for syntax tag in .proto

Hi,
I'm currently experimenting with protobuf 3.0. It would be nice to add support for syntax = "proto2"; or syntax = "proto3"; to specify a syntax version. This is part of protobuf 3.0-beta2.

Cheers,
LP

Is maps support stable?

The new maps syntax is very convenient, and I want to use gpb with it.

So, is the maps support stable?

Support for rpc option?

A service rpc definition supports options, like this:

service Greeter {
  // {} is supported
  rpc SayHello (HelloRequest) returns (HelloReply) {}
}

But only the form without options is supported now:

service Greeter {
  // only `;` is supported now
  rpc SayHello (HelloRequest) returns (HelloReply);
}

epb_compatibility overrides module_name_suffix

I use epb_compatibility to tell gpb to generate the encode/1 and decode/2 functions, but now it also forces the _pb suffix even if I set {module_name_suffix, ""}. I don't think this is proper, is it?

"message" keyword cannot be used for a message name

I have the following :

message message {
  optional string first = 1;
  optional string last = 2;
}

but when I compile I get :

./bin/protoc-erl -I. test.proto
../test.proto:1: syntax error before: message

(If I change "message message { ... }" to "message Message { ... }", then it works.)

This is a valid protocol buffer definition; could you fix it? This is our top-level protocol message, and it's hard for us to change the name to top_message (for example). Thank you.

oneof fields not populating the rnum field during post processing

I'm working on adding support for importing multiple .proto files and I noticed a strange behavior. I'm not familiar with oneof so maybe this is expected.

If I parse a single .proto file like this:

{ok, Defs} = gpb_parse:parse(Tokens),
{ok, Def2} = gpb_parse:post_process_one_file(Defs, Opts),
{ok, Defs3} = gpb_parse:post_process_all_files(Def2, Opts).

Then I get back gpb_oneof records with rnum populated, but if I parse that same file as if it were part of a list of files like this:

{ok, Defs} = gpb_parse:parse(Tokens),
{ok, Defs2} = gpb_parse:post_process_one_file(Defs, Opts),
{ok, Defs3} = gpb_parse:post_process_all_files([Defs2], Opts).

Then the gpb_oneof records have rnum = :undefined.

Here is a quick reproduction. The sample data is in Elixir format because that's the console I'm in, but I can reproduce in an Erlang shell if that is helpful.

Behavior I'm expecting

  • Input to post_process_all_files/2
[{{:msg, [:., :SubMsg]},
  [{:field, :test, 1, :undefined, :string, :required, []}]},
 {{:msg, [:., :SampleOneofMsg]},
  [{:field, :one, 1, :undefined, :string, :optional, []},
   {:gpb_oneof, :foo, :undefined,
    [{:field, :body, 3, :undefined, :string, :optional, []},
     {:field, :code, 4, :undefined, :uint32, :optional, []}]}]},
 {{:msg, [:., :AdvancedOneofMsg]},
  [{:field, :one, 1, :undefined, {:ref, [:SubMsg]}, :optional, []},
   {:gpb_oneof, :foo, :undefined,
    [{:field, :body, 3, :undefined, {:ref, [:SubMsg]}, :optional, []},
     {:field, :code, 4, :undefined, :uint32, :optional, []}]}]},
 {{:msg, [:., :ReversedOrderOneOfMsg]},
  [{:gpb_oneof, :foo, :undefined,
    [{:field, :body, 1, :undefined, :string, :optional, []},
     {:field, :code, 2, :undefined, :uint32, :optional, []}]},
   {:field, :bar, 3, :undefined, :string, :optional, []}]}],
  • Output
[{{:msg, :SubMsg}, [{:field, :test, 1, 2, :string, :required, []}]},
 {{:msg, :SampleOneofMsg},
  [{:field, :one, 1, 2, :string, :optional, []},
   {:gpb_oneof, :foo, 3,
    [{:field, :body, 3, 3, :string, :optional, []},
     {:field, :code, 4, 3, :uint32, :optional, []}]}]},
 {{:msg, :AdvancedOneofMsg},
  [{:field, :one, 1, 2, {:msg, :SubMsg}, :optional, []},
   {:gpb_oneof, :foo, 3,
    [{:field, :body, 3, 3, {:msg, :SubMsg}, :optional, []},
     {:field, :code, 4, 3, :uint32, :optional, []}]}]},
 {{:msg, :ReversedOrderOneOfMsg},
  [{:gpb_oneof, :foo, 2,
    [{:field, :body, 1, 2, :string, :optional, []},
     {:field, :code, 2, 2, :uint32, :optional, []}]},
   {:field, :bar, 3, 3, :string, :optional, []}]}]

Unexpected Behavior

  • Input to post_process_all_files/2 (same as before, but nested inside another list)
[[{{:msg, [:., :SubMsg]},
  [{:field, :test, 1, :undefined, :string, :required, []}]},
 {{:msg, [:., :SampleOneofMsg]},
  [{:field, :one, 1, :undefined, :string, :optional, []},
   {:gpb_oneof, :foo, :undefined,
    [{:field, :body, 3, :undefined, :string, :optional, []},
     {:field, :code, 4, :undefined, :uint32, :optional, []}]}]},
 {{:msg, [:., :AdvancedOneofMsg]},
  [{:field, :one, 1, :undefined, {:ref, [:SubMsg]}, :optional, []},
   {:gpb_oneof, :foo, :undefined,
    [{:field, :body, 3, :undefined, {:ref, [:SubMsg]}, :optional, []},
     {:field, :code, 4, :undefined, :uint32, :optional, []}]}]},
 {{:msg, [:., :ReversedOrderOneOfMsg]},
  [{:gpb_oneof, :foo, :undefined,
    [{:field, :body, 1, :undefined, :string, :optional, []},
     {:field, :code, 2, :undefined, :uint32, :optional, []}]},
   {:field, :bar, 3, :undefined, :string, :optional, []}]}]]
  • Output (rnum set to undefined)
[[{{:msg, [:., :SubMsg]},
   [{:field, :test, 1, :undefined, :string, :required, []}]},
  {{:msg, [:., :SampleOneofMsg]},
   [{:field, :one, 1, :undefined, :string, :optional, []},
    {:gpb_oneof, :foo, :undefined,
     [{:field, :body, 3, :undefined, :string, :optional, []},
      {:field, :code, 4, :undefined, :uint32, :optional, []}]}]},
  {{:msg, [:., :AdvancedOneofMsg]},
   [{:field, :one, 1, :undefined, {:ref, [:SubMsg]}, :optional, []},
    {:gpb_oneof, :foo, :undefined,
     [{:field, :body, 3, :undefined, {:ref, [:SubMsg]}, :optional, []},
      {:field, :code, 4, :undefined, :uint32, :optional, []}]}]},
  {{:msg, [:., :ReversedOrderOneOfMsg]},
   [{:gpb_oneof, :foo, :undefined,
     [{:field, :body, 1, :undefined, :string, :optional, []},
      {:field, :code, 2, :undefined, :uint32, :optional, []}]},
    {:field, :bar, 3, :undefined, :string, :optional, []}]}]]

Let's Wrap Up This Issue

Is this behavior expected? If I don't nest multiple parsed files inside another list then reference checking breaks for me, but if I do nest it in another list I get this strange rnum behavior.

Generated e_msg_X functions do not correctly match optional terms (in maps mode)

Consider the following message:

message Optional {
    required uint32 x = 1;
    optional uint32 y = 2;
}

When Optional.proto is compiled using the maps option, the following function gets generated as part of Optional.erl:

e_msg_Optional(#{x := F1, y := F2}, Bin) ->
    B1 = e_varint(F1, <<Bin/binary, 8>>),
    if F2 == undefined -> B1;
       true -> e_varint(F2, <<B1/binary, 16>>)
    end.

It appears that the intention of this code is to allow a map that does not contain the optional field y to be encoded correctly (by simply omitting the encoding of y from the output). However, attempting to encode such a map fails:

1> 'Optional':encode_msg(#{x => 1}, 'Optional').
** exception error: {gpb_type_error,{{expected_msg,'Optional'},
                                     [{value,#{x => 1}},{path,'Optional'}]}}
     in function  'Optional':mk_type_error/3 (src/Optional.erl, line 187)
     in call from 'Optional':encode_msg/3 (src/Optional.erl, line 28)

Encoding a map that does include y succeeds (but this defeats the point of having an optional field ;) ):

2> 'Optional':encode_msg(#{x => 1, y => 2}, 'Optional').
<<8,1,16,2>>

I think the issue is that e_msg_Optional is slightly misusing map pattern-matching syntax. The code expects F2 to be bound to undefined if the map does not contain the key y, but that's not quite what happens (instead, Erlang doesn't match the term at all, because the pattern requires y to exist in the map). It might be better to match only the required fields in the head of e_msg_Optional, and then extract the optional fields in the body of the function using something like maps:find.

(There is a similar pattern/problem in the generated v_msg_Optional function as well).
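
For illustration, a sketch of what that could look like (hand-written, assuming the generated e_varint/2 helper; not actual gpb output):

e_msg_Optional(#{x := F1} = Msg, Bin) ->
    B1 = e_varint(F1, <<Bin/binary, 8>>),
    %% Look the optional field up in the body rather than in the head,
    %% so a map without the key y still matches this clause.
    case maps:find(y, Msg) of
        {ok, F2} -> e_varint(F2, <<B1/binary, 16>>);
        error -> B1
    end.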

How to set a value for oneof fields?

Hi!
I have a protobuf:

message Chat {
    required ChatType type = 1;
    oneof  chat_oneof {
        CommonChat common_chat = 2;
        PrivateChat private_chat = 3;
    }
}

How do I set a value for chat_oneof?
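
For reference, gpb represents a oneof field as a {Tag, Value} tuple; assuming the records generated from the definition above, setting it would look something like:

%% Type, CommonChat and PrivateChat are previously built values;
%% pick exactly one alternative, tagged with the oneof member's name.
Chat1 = #'Chat'{type = Type,
                chat_oneof = {common_chat, CommonChat}},
%% or:
Chat2 = #'Chat'{type = Type,
                chat_oneof = {private_chat, PrivateChat}}.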

gpb_opts - additional documentation and examples...???

Hi Tomas,

Is there any additional documentation or examples to better learn your plugin? I will admit I'm new to Erlang (3-4 months). I've read through your source code but I'm still trying to wrap my head around it. Specifically, I want to see how to use your plugin to decode serialized protocol buffers attached as the binary payload of an HTTP POST request, with Content-Type set to application/octet-stream, and to encode a serialized protocol buffer as the payload of an HTTP POST response, likewise with Content-Type set to application/octet-stream.

I'm designing a module using the cowboy_http behavior to handle HTTP requests and responses, and I want to decode (handle) the serialized pb requests and encode (handle) responses as serialized pb.

I hope this makes sense; any guidance will be appreciated.

Thank you!
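
Not documentation, but a minimal sketch of the shape this could take, assuming a cowboy 1.x HTTP handler and a gpb-generated module my_pb with messages 'MyRequest' and 'MyResponse' (all names hypothetical):

-module(pb_handler).
-export([init/3, handle/2, terminate/3]).
-include("my_pb.hrl").  %% gpb-generated records

init(_Transport, Req, _Opts) ->
    {ok, Req, no_state}.

handle(Req, State) ->
    {ok, Body, Req2} = cowboy_req:body(Req),
    %% The POST payload is the serialized protobuf message
    Request = my_pb:decode_msg(Body, 'MyRequest'),
    Response = build_response(Request),
    RespBin = my_pb:encode_msg(Response),
    {ok, Req3} = cowboy_req:reply(200,
                                  [{<<"content-type">>, <<"application/octet-stream">>}],
                                  RespBin, Req2),
    {ok, Req3, State}.

build_response(#'MyRequest'{} = _Request) ->
    #'MyResponse'{}.  %% fill in the actual reply fields here

terminate(_Reason, _Req, _State) ->
    ok.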

Licensing

Hello Tomas,

I may have mentioned this already, but I have been investigating using your excellent gpb library as a replacement for erlang_protobuffs in Riak. Your library is actively developed and supports features that erlang_protobuffs does not.

Riak is licensed under the Apache 2.0 license, which I thought was LGPL compatible. According to the Apache Foundation, projects that are Apache licensed should not use LGPL libraries (source). I then found that we have run into Apache-LGPL issues in another part of Riak.

Would you consider changing the gpb license to something compatible with Apache 2.0? (source).

Thank you!
Luke

Optional fields don't use default values when unset

Message definition from here:

message DtFetchReq {
    required bytes bucket = 1;
    required bytes key    = 2;
    required bytes type  = 3;
    optional uint32 r             =  4;
    optional uint32 pr            =  5;
    optional bool   basic_quorum  =  6;
    optional bool   notfound_ok   =  7;
    optional uint32 timeout       =  8;
    optional bool   sloppy_quorum =  9;
    optional uint32 n_val         = 10;
    optional bool include_context = 11 [default=true];
}

Input data:

Req = #dtfetchreq{bucket = "bucket",
                  key = <<"key">>,
                  type = <<"type">>}.

Expected record with values after decoding:

#dtfetchreq{bucket = <<"bucket">>,
            key = <<"key">>,
            type = <<"type">>,
            r = 0,
            pr = 0,
            basic_quorum = false,
            notfound_ok = false,
            timeout = 0,
            sloppy_quorum = false,
            n_val = 0,
            include_context = true}

Actual decoded data:

#dtfetchreq{bucket = <<"bucket">>,
            key = <<"key">>,
            type = <<"type">>,
            r = undefined,
            pr = undefined,
            basic_quorum = undefined,
            notfound_ok = undefined,
            timeout = undefined,
            sloppy_quorum = undefined,
            n_val = undefined,
            include_context = undefined}

It seems like, according to the spec, an optional field that is not given a default value or a value during encoding should use type-specific defaults (0 for numeric fields, for instance). Instead, undefined is being used.

This may be related to #71

Thoughts? I'm going to start digging into the gpb code now to see what I can find. Thanks!

Generate types and type declarations for record fields

Attempting to use gpb and Dialyzer together, I'm finding that the untyped fields in records (especially when records are nested) can cause issues where Dialyzer assumes values in those fields can be any(), when it's clear that a value should be a nested record, or a number, or .... gpb already knows this (it even emits documentation for all the types).

It would be great if it were possible to do two things:

  1. For each record type defined, also define and export a type for that record.
  2. For any field in a record, also include type information for that field.
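
For illustration, here is roughly what the requested output could look like for a small message (a hand-written sketch, not actual gpb output):

%% For: message Person { required string name = 1; optional uint32 id = 2; }
%% in the generated .hrl:
-record(person, {name :: string(),                      %% required
                 id   :: non_neg_integer() | undefined  %% optional
                }).
%% in the generated .erl:
-type person() :: #person{}.
-export_type([person/0]).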

package not supported?

Look at these two proto files:

a.proto

package Test.a;

import "b.proto";

message Foo {
    enum Type {
        AA = 1;
        BB = 2;
        CC = 3;
    }

    required int32 id = 1;
    required Type type = 2;
    required Test.b.Bar bar = 3;
}

b.proto

package Test.b;

message Bar {
    required int32 id = 1;
    required string name = 2;
}

When I try to compile with the command:

protoc-erl -I . -o . -Wall *.proto

I got this error:

in msg Foo, field bar: undefined reference  Test.b.Bar

Does gpb not support the package syntax?
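
For what it's worth, gpb has a use_packages option (it also shows up in other reports on this page) that makes package names significant; assuming it covers this case, compiling from an Erlang shell would look like:

%% Without use_packages, the package declarations are read but
%% not used when resolving references such as Test.b.Bar.
gpb_compile:file("a.proto", [use_packages, {i, "."}]).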

service rpc metadata methods

I'm trying to figure out a way to introduce a method that provides the data contained in the .proto file regarding a service rpc's input and output arguments, for example:

{input_arg, output_arg} = xpto_info().

where xpto is an rpc defined in a service.

Is this something that would be of interest to this project? I've already implemented a similar thing in Basho's protobuffs (https://github.com/lrascao/erlang_protobuffs/tree/add_protobuf_service_support).
Can you provide me some pointers on the best way to implement this in gpb?

Add option for major version compatibility

I have an issue where I generate .erl files on OTP19 and then use them on OTP17. This causes issues with the dialyzer nowarn attributes that are generated.

A workaround for me could be to always generate on the lowest OTP version I want to support, but it would be nicer if this could be an option instead.

Optional elements do not recognize implicit defaults

When encoding a message, I noticed that I get an error when I have an optional element with no explicit default, even though the protobuf specification defines implicit defaults for certain data types in this case; gpb does not recognize these.

The full write-up can be found here for the proto3 spec and here for the proto2 spec. The short version is:

  • Strings are empty string.
  • Bytes are empty bytes.
  • Bools are false.
  • Numbers are zero.
  • Enums are the first defined value, which must be zero.
  • Messages are not set.
  • Repeated fields are empty arrays.

I actually have a vague idea of how to implement this in the code so I'll try to submit a PR for this in the near future.

Generated code & dialyzer warning

Hi Tomas -

In basho/riak_pb#197, I have migrated to your gpb library (very easy to do!) and get several of these warnings when running make dialyzer:

riak.erl:3396: Expression produces a value of type ['ok'], but this value is unmatched

Here is the code that generates this warning:

if is_list(F4) ->
       [v_msg_rpbcommithook(Elem, [precommit | Path]) || Elem <- F4];
   true ->
       mk_type_error({invalid_list_of, {msg, rpbcommithook}}, F4, Path)
end,

It is generated from the following field in the RpbBucketProps message in src/riak.proto:

repeated RpbCommitHook precommit = 4;

Basically, any repeated field will generate an is_list() call that triggers the dialyzer warning.

This is not a serious warning, and it is easy for me to ignore, but I thought I'd let you know. I am using OTP R16B02 compiled from the basho10 tag in this repository.
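
If it helps, one conventional fix is for the generated code to explicitly match the otherwise-unused value, e.g. (a sketch based on the generated code above):

%% Matching the result to _ tells dialyzer the value is deliberately ignored.
_ = if is_list(F4) ->
           [v_msg_rpbcommithook(Elem, [precommit | Path]) || Elem <- F4];
       true ->
           mk_type_error({invalid_list_of, {msg, rpbcommithook}}, F4, Path)
    end,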

Enums are broken for nifs

I've got a proto like this:

message ProtocolMessageResponse
{
    optional int32 protocol_version = 1 [default = 1];

    required int32 client_number = 2;

    enum Error {
        WRONG_SERVER = 0;
    }
    optional Error error = 30;

    ...
}

I'm generating code with a command like this:
erl -boot start_clean -pa ./deps/gpb/ebin -noshell -noinput +B -I/some-path -o-nif-cc ./some-path/src -o-erl ./some_path/src -o-hrl ./include -nif -strbin -s gpb_compile c /some-path/protobuf_webserv.proto

and compiling like this:
g++ -g -fPIC -Wall -O3 -I./some_path/src -I/some_path/include -I./some_path/src -o ./some_path/src/protobuf_webserv.nif.o -c ./some_path/src/protobuf_webserv.nif.cc

and I'm getting compilation errors:

protobuf_webserv.nif.cc: In function ‘int p_msg_ProtocolMessageResponse(ErlNifEnv*, ERL_NIF_TERM, WebservServerProtocol::ProtocolMessageResponse*)’:
protobuf_webserv.nif.cc:12198: error: ‘ProtocolMessageResponse_Error_WRONG_SERVER’ was not declared in this scope

protobuf_webserv.nif.cc: In function ‘ERL_NIF_TERM u_msg_ProtocolMessageResponse(ErlNifEnv*, const WebservServerProtocol::ProtocolMessageResponse*)’:
protobuf_webserv.nif.cc:22843: error: ‘ProtocolMessageResponse_Error_WRONG_SERVER’ was not declared in this scope

I use gpb 3.17.9

Name resolution issue in gpb_parse.post_process_all_files/2

I have two simple proto files like this:

import.proto

package chat;
import "imported.proto";
message WebsocketServerContainer {
  required string name = 1;
  optional authorization.Failure failure = 2;
}

imported.proto

package authorization;
message Failure {
  optional uint32 status_code = 1;
  optional string reason = 2;
}

I then scan, parse, and gpb_parse:post_process_one_file/1 each of these files individually. Then, when I merge the results into a single list of defs, I get a data structure like:

[
  {package, [chat]},
  {import, "imported.proto"},
  {{msg, ['.', 'WebsocketServerContainer']},[
    {field, name, 1, undefined, string, required, []},
    {field, failure, 2, undefined, {ref, [authorization, '.', 'Failure']}, optional, []}
  ]},
  {package, [authorization]},
  {{msg, ['.', 'Failure']},[
    {field, status_code, 1, undefined, uint32, optional, []},
    {field, reason, 2, undefined, string, optional, []}]
  }
]

And when I pass that to gpb_parse:post_process_all_files/2 I get back an error tuple like:

{error,[
  {ref_to_undefined_msg_or_enum,{{
    ['.','WebsocketServerContainer'],failure},
    [authorization,'.','Failure']}}]}

I've also tried reversing the order of the defs so that authorization.Failure is passed in before it gets used, but that didn't work either. Am I missing a call somewhere?

Encoding result differs from basho/erlang_protobuffs for negative integers

  • rebar.config
{plugins, [rebar3_gpb_plugin]}.

{provider_hooks, [{post, [
                          {compile, {protobuf, compile}},
                          {clean, {protobuf, clean}}
                         ]}
                 ]}.

{gpb_opts, [{module_name_suffix, "_pb"},
            {o_erl, "src/auto_gen/proto"},
            {o_hrl, "src/auto_gen/proto"},
            strings_as_binaries,
            type_specs,
            maps
           ]}.
  • test.proto
message Test {
  required int32 result = 1;
}
  • Outputs of gpb
test_pb:encode_msg(#{result => -1}, 'Test').

Outputs: <<8,255,255,255,255,15>>
  • Outputs of erlang_protobuffs
import("test_pb.hrl").
list_to_binary(test_pb:encode_test(#test{result= -1})).

Outputs: <<8,255,255,255,255,255,255,255,255,255,1>>

It seems gpb compresses negative integers. Is there any option to control this result?
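
For reference, both outputs are plain base-128 varints; the difference is how wide the negative int32 is sign-extended before encoding (the protobuf spec says negative int32/int64 values are encoded as ten-byte, 64-bit varints, which is what erlang_protobuffs produces). A small shell sketch reproducing both payloads:

%% Minimal little-endian base-128 varint encoder (named fun, OTP 17+)
1> V = fun F(N) when N < 128 -> <<N>>;
           F(N) -> <<1:1, (N band 127):7, (F(N bsr 7))/binary>>
       end.
2> V(-1 band 16#FFFFFFFF).          %% -1 masked to 32 bits, as gpb emits
<<255,255,255,255,15>>
3> V(-1 band 16#FFFFFFFFFFFFFFFF).  %% -1 masked to 64 bits, as erlang_protobuffs emits
<<255,255,255,255,255,255,255,255,255,1>>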

generate types for records

Much in the same way types are now being generated for enums, I think it would be interesting to also generate types for records, for basically the same reasons as for enums. If this feature is deemed useful, I'd like to offer a pull request for it.

parsing message extensions inside a message body

I get an error when trying to parse protobuf definitions like:

package atlas.dispatch;

// Base resource message
message Shard {
  extensions 200 to max;

  // Column fields
  optional string guid             = 1;
  optional string name             = 2;

  // Extension fields
  extend Shard {
    optional int64 status_code = 200;
  }
}

The same message definition parses fine if you move the extend outside the message definition like this:

package atlas.dispatch;

// Base resource message
message Shard {
  extensions 200 to max;

  // Column fields
  optional string guid             = 1;
  optional string name             = 2;  
}

// Extension fields
extend Shard {
  optional int64 status_code = 200;
}

Google's protoc compiler handles both of those forms just fine. Is this something that gpb can easily support? I would be happy to help out with a pull request if you are interested in supporting this feature and you can point me in the right general direction.

Enums are broken for NIFs

It seems that enums in NIFs are missing their prefixes. In the example below, local should be Test_BidOrigin_local (which is what protoc generates).

Using the following protobuf definition:

message Test {
    enum BidOrigin {
        local = 1;
        ewr = 2;
        lax = 3;
        netelligent = 4;
    }
    optional BidOrigin bid_origin = 1;
}

When compiling:

c++ -c -O3 -D_THREAD_SAFE -I/usr/local/Cellar/protobuf250/2.5.0/include -g -Wall -fPIC  -I/usr/local/lib/erlang/lib/erl_interface-3.7.19/include -I/usr/local/lib/erlang/erts-6.2.1/include   c_src/test.nif.cc -o c_src/test.nif.o

c_src/test.nif.cc:85:31: error: use of undeclared identifier 'local'
            m->set_bid_origin(local);
                              ^
c_src/test.nif.cc:87:31: error: use of undeclared identifier 'ewr'
            m->set_bid_origin(ewr);
                              ^
c_src/test.nif.cc:89:31: error: use of undeclared identifier 'lax'
            m->set_bid_origin(lax);
                              ^
c_src/test.nif.cc:91:31: error: use of undeclared identifier 'netelligent'
            m->set_bid_origin(netelligent);
                              ^
c_src/test.nif.cc:147:18: error: use of undeclared identifier 'local'
            case local: elem1 = gpb_aa_local; break;
                 ^
c_src/test.nif.cc:148:18: error: use of undeclared identifier 'ewr'
            case ewr: elem1 = gpb_aa_ewr; break;
                 ^
c_src/test.nif.cc:149:18: error: use of undeclared identifier 'lax'
            case lax: elem1 = gpb_aa_lax; break;
                 ^
c_src/test.nif.cc:150:18: error: use of undeclared identifier 'netelligent'
            case netelligent: elem1 = gpb_aa_netelligent; break;
                 ^
8 errors generated.

Google Protobufs 3 incompatibility

Given "msg.proto":

syntax = "proto3";

message Payload {
int32 seq = 1;
bytes payload = 2;
};

message ServerMessage {
int32 seen = 1;
int32 ack = 2;
repeated Payload payloads = 3;
};

gpb encodes

{'ServerMessage',1,2,
[{'Payload',1,<<"derpyderp1">>},
{'Payload',2,<<"derpyderp2">>},
{'Payload',3,<<"derpyderp3">>},
{'Payload',4,<<"derpyderp4">>},
{'Payload',5,<<"derpyderp5">>}]}

to the 81 byte bin

<<8,1,16,2,26,75,14,8,1,18,10,100,101,114,112,121,100,101,114,112,49,14,8,
2,18,10,100,101,114,112,121,100,101,114,112,50,14,8,3,18,10,100,101,114,
112,121,100,101,114,112,51,14,8,4,18,10,100,101,114,112,121,100,101,114,
112,52,14,8,5,18,10,100,101,114,112,121,100,101,114,112,53>>

and the protobuf3 generated C++ encoder generates the 84 byte bin

<<8,1,16,2,26,14,8,1,18,10,100,101,114,112,121,100,101,114,112,49,26,14,8,2,18,
10,100,101,114,112,121,100,101,114,112,49,26,14,8,3,18,10,100,101,114,112,
121,100,101,114,112,49,26,14,8,4,18,10,100,101,114,112,121,100,101,114,112,
49,26,14,8,5,18,10,100,101,114,112,121,100,101,114,112,49>>

gpb is able to decode its own bin, and protobuf3 is able to decode its own bin. However, gpb crashes with an exception ("no function clause matching msg:skip_64_Payload(<<18,10,100,101,114,112,121>>,0,0,0,<<>>,undefined)") when trying to decode protobuf3's bin, and protobuf3 does not recover the repeated Payload field from gpb's bin (the resulting payloads field is []).

Setting "option allow_alias = true" causes message compilation to fail

Trying to compile the following message with gpb fails:

message FailingTest {
    enum myEnum {
        option allow_alias = true;
        ENUM1 = 1;
        ENUM2 = 1;
    }
}

In case it's relevant, I'm compiling this message in a project that uses rebar, with a rebar.config as follows:

{proto_compiler, gpb}.
{gpb_opts, [maps, include_as_lib, use_packages, {verify, always}]}.
{deps,
 [
  {gpb, "", {git, "https://github.com/tomas-abrahamsson/gpb.git", {tag, "3.17.7"}}}
 ]
}.

The error message is:

$ ./rebar compile         
==> gpb (compile)
Compiled src/gpb.erl
==> messages (compile)
...messages/proto/FailingTest.proto:4: syntax error before: option
ERROR: Failed to compile src/messages/proto/FailingTest.proto
Compiling src/messages/proto/FailingTest.proto failed:
ERROR: compile failed while processing: rebar_abort

I'm not sure if allow_alias is a supported option or not. The gpb readme states:

gpb reads but ignores or throws away:

  • options other than 'packed' or 'default'
  • custom options

From this, I would expect gpb to ignore the allow_alias option, rather than failing to compile. As a side note, if I remove the option, the message does compile with a warning:

src/FailingTest.erl:180: Warning: this clause cannot match because a previous clause at line 179 always matches

This makes sense, because the lines in question are in the 'enum_symbol_by_value_FailingTest.myEnum' function, which looks up the enum symbol by value.

I'm compiling messages that I don't have direct control over, so I'd actually prefer if gpb would just ignore the allow_alias option, rather than failing to compile. Is it possible to change this behavior? Thanks! :)

stream requests and response support in rpc

Hi Tomas! Thanks for developing such a nice library!
I'm thinking about implementing a gRPC Erlang server and client on the basis of it. At the moment I'm at a very early 'research' stage.
Could you please add support for the stream option to the parser, to make it possible to compile the base gRPC service example, like this one: https://github.com/grpc/grpc/blob/release-0_14/examples/protos/route_guide.proto
It will also require adding some kind of 'stream' indication to the find_rpc_def/2 return value.

'undefined' returned when an optional field does not exist

According to the protobuf's official document:

When a message is parsed, if it does not contain an optional element, the corresponding field in the parsed object is set to the default value for that field.
If the default value is not specified for an optional element, a type-specific default value is used instead

But I found that when an optional field does not exist and no default value is set, undefined is returned where a type-specific default value ought to be used, e.g. 0 for integers.

Forward compatibility for enums

Currently gpb crashes when decoding an unknown enum value. This breaks forward compatibility for new values in the enum type. Would it be possible to add an option to have it not crash, and instead decode to a value such as {unknown, integer()} or maybe just the integer value?

In my use case I don't care about the actual value, I just want the decode not to break on new values.

Another solution, which would force users to handle the error case, would be to have enum values decode to {ok, atom()} | {error, integer()} when the option is enabled.
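
To illustrate the first proposal, a sketch of what the fallback could look like in a generated enum decoder (function and enum names hypothetical):

%% Today the generated decoder has clauses only for known values,
%% so an unknown value crashes with function_clause.
d_enum_color(0) -> red;
d_enum_color(1) -> green;
%% proposed opt-in fallback clause:
d_enum_color(V) -> {unknown, V}.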

erlang_protobuffs compatibility option

Hi Tomas,

I'd like to propose an option in gpb to enable compatibility with erlang_protobuffs. There are a few subtle differences between the libraries and having an option to make gpb compatible would be a great way for Basho to direct erlang_protobuffs (epb) users to your library.

Here are the areas I've found so far -

  • gpb uses encode_msg / decode_msg and epb uses encode / decode
  • epb allows using 0 and 1 to specify true and false
  • epb will encode a list of integers as a set of unicode code points for bytes fields in a proto message

Do you think this could be a useful option?

Thanks -
Luke

import proto from another package, compile failed.

When we use protos like the following, the compile does not succeed. Can you help me?

pb_common.proto:

package pb_commom;

message pbId32 {
        optional int32 id = 1;                          // 32-bit id information
}

pb_9.proto:

package pb_9;
import "pb_common.proto";

//null
message pbnull {
        optional pb_common.pbId32 id = 1;
}

I get an error like this:

===> proto files found: ["/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_common.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_9.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_50.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_40.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_30.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_21.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_19.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_18.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_17.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_16.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_15.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_14.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_13.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_12.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_11.proto",
                                "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_10.proto"]
===> compiling "/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_9.proto" to "/Users/zhangrong/repository/leshu/server_p05/_build/default/lib/server/src/protocol/pb_9_pb.erl"
===> opts: [{i,"./subgit/p05_proto"},
                   {module_name_suffix,"_pb"},
                   {o_erl,"/Users/zhangrong/repository/leshu/server_p05/_build/default/lib/server/src/protocol"},
                   {o_hrl,"/Users/zhangrong/repository/leshu/server_p05/_build/default/lib/server/include"},
                   {strings_as_binaries,false},
                   type_specs,
                   {i,"/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto"},
                   {i,"/Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto"}]
in msg pbnull, field id: undefined reference  pb_common.pbId32

===> failed to compile /Users/zhangrong/repository/leshu/server_p05/subgit/p05_proto/pb_9.proto: in msg pbnull, field id: undefined reference  pb_common.pbId32

Ignore UTF-8 BOM

The parser seems to fail for files that have a UTF-8 BOM:

** (CaseClauseError) no case clause matching: {:error, {1, :gpb_scan, {:illegal, [65279]}}, 1}
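
A minimal sketch of a workaround (or of the fix itself): strip the BOM before handing the contents to the scanner. Codepoint 65279 in the error above is U+FEFF, the byte order mark, whose UTF-8 encoding is the three bytes EF BB BF.

%% Drop a leading UTF-8 BOM, if present
strip_bom(<<16#FEFF/utf8, Rest/binary>>) -> Rest;
strip_bom(Bin) -> Bin.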

xref warnings for unused functions

While gpb rightly adds a few lines like -compile({nowarn_unused_function,cons/3}). to modules it generates, xref still warns about these unused local functions. I think just adding:

-ignore_xref([id/2, cons/3, lists_reverse/2, 'erlang_++'/3]).

To the top of the generated modules would resolve this.

Including gpb as a dependency in erlang.mk fails relx when generating a release

The usual erlang.mk build process:

wget https://erlang.mk/erlang.mk
make -f erlang.mk bootstrap-rel
make -f erlang.mk new-app in=an_app

Then create a Makefile in the root dir:

LOCAL_DEPS = an_app
TEST_DEPS = sync meck

include erlang.mk

and in the new app (named 'an_app') apps/an_app/Makefile:

PROJECT = an_app
PROJECT_DESCRIPTION = New project
PROJECT_VERSION = 0.0.1

DEPS += gpb

include ../../erlang.mk

Run make, and it fails during the relx release generation step.

I didn't quite get to the root of the problem, sorry about that.

Record Names

Is there a way to specify the record name for a message or to set a formatter besides suffix/prefix/lowercase?

Specifically I'd like to have the message CamelCase become the record #camel_case{} instead of #camelcase{}.

Digging around in the code, this does not seem to be supported; would it be an acceptable contribution?
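
To make the intent concrete, here is a sketch of the name transform I have in mind (hand-written; as far as I can tell it is not currently an option in gpb):

%% camel_to_snake("CamelCase") -> "camel_case"
camel_to_snake(Name) ->
    Snake = lists:flatmap(fun(C) when C >= $A, C =< $Z -> [$_, C + 32];
                             (C) -> [C]
                          end, Name),
    case Snake of
        [$_ | Rest] -> Rest;  %% drop the underscore added before the first capital
        _ -> Snake
    end.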

protoc command line tool

Hi Tomas,
I'm interested in making another contribution to the project: basically, replacing the rebar.config pre-hook that is suggested now:

{pre_hooks,
 [{compile, "mkdir -p include"}, %% ensure the include dir exists
  {compile,
   "erl +B -noinput -pa /path/to/gpb/ebin "
   " -I`pwd`/proto -o-erl src -o-hrl include "
   " -s gpb_compile c `pwd`/proto/*.proto"
  }]}.

with something like this:

{pre_hooks,
 [{compile, "mkdir -p include"}, %% ensure the include dir exists
  {compile, "deps/gpb/bin/protoc-erl proto/*.proto"},
  {compile, "mv proto/*.erl src"},
  {compile, "mv proto/*.hrl include"}
 ]}.

An even better option would be:

{pre_hooks,
 [{compile, "mkdir -p include"}, %% ensure the include dir exists
  {compile, "deps/gpb/bin/protoc-erl --output-erl src --output-include include proto/*.proto"}
 ]}.

This would involve adding a protoc-erl tool to the bin directory of gpb, and, to support the second version (the one with --output-erl and --output-include), adding an external dependency that takes care of parsing the command line arguments (https://github.com/jcomellas/getopt, for example).
What are your thoughts on this?
Thanks!

unit tests involving NIFs fail on macOS

Not sure if this is an issue to be solved in the tests, but it should be reported for future reference.
NIF tests fail with errors similar to this:
Undefined symbols for architecture x86_64:
  "enif_alloc_binary", referenced from:
      e_msg_ntest2(enif_environment_t*, int, unsigned long const*) in gpb_nif_with_strbin.nif.o
  "enif_get_tuple", referenced from:
      e_msg_ntest2(enif_environment_t*, int, unsigned long const*) in gpb_nif_with_strbin.nif.o
  ...

To overcome this, the Makefile test invocation should be:

LDFLAGS="-undefined dynamic_lookup -dynamiclib" make test

Please advise whether this is something that should be fixed in the unit tests or not; I'll close the issue or propose a pull request accordingly.

Redeclaring imported protos

Why are imported protos redeclared in the generated code for the protos that import them?

Meaning, for a proto a.proto with import "b.proto", where b.proto contains message B { ... } and I have msg_name_prefix set to the name of the proto file, I find in a.erl and a.hrl the record #a_b{}, which is a duplicate of #b{}.

Shouldn't it simply use b.hrl and b.erl from a? Or at the very least ignore the msg_name_prefix when importing protos?

gpb should be published to hex.pm

There is currently at least one package (exprotobuf) which depends on gpb. Currently, the user must add a dependency record for gpb explicitly at the top-level, as hex dependencies don't get their own github dependencies fetched automatically.

It would be very convenient if gpb were published. More information on doing so can be found here: https://hex.pm/docs/publish

gpb doesn't build on windows/cygwin

Trying to build gpb on windows (from a cygwin shell) fails:

$ rebar get compile
==> gpb (get-deps)
==> gpb (compile)
'build' is not recognized as an internal or external command,
operable program or batch file.
ERROR: Command [compile] failed!

This seems like a rebar issue; I'll file a separate bug with that project.
