ahamez / protox
A fast, easy to use and 100% conformant Elixir library for Google Protocol Buffers (aka protobuf)

License: MIT

Language: Elixir 100.00%
Topics: protobuf, elixir, protocol-buffers, protobuf-message, protobuf-runtime, protoc, json

protox's Issues

Fails to generate module

I have the following protobuf, which protox seems to compile fine.

message CSVCMsg_SendTable {
	message sendprop_t {
		optional int32 type = 1;
		optional string var_name = 2;
		optional int32 flags = 3;
		optional int32 priority = 4;
		optional string dt_name = 5;
		optional int32 num_elements = 6;
		optional float low_value = 7;
		optional float high_value = 8;
		optional int32 num_bits = 9;
	}

	optional bool is_end = 1;
	optional string net_table_name = 2;
	optional bool needs_decoder = 3;
	repeated .CSVCMsg_SendTable.sendprop_t props = 4;
}

I then want to decode my binary using CSVCMsg_SendTable.decode!(message), but I get the following error.

** (UndefinedFunctionError) function CSVCMsg_SendTable.SendpropT.get_required_fields/0 is undefined (module CSVCMsg_SendTable.SendpropT is not available)
    CSVCMsg_SendTable.SendpropT.get_required_fields()
    (protox) lib/protox/decode.ex:149: Protox.Decode.parse_delimited/2
    (protox) lib/protox/decode.ex:85: Protox.Decode.parse_value/3
    (protox) lib/protox/decode.ex:64: Protox.Decode.parse_key_value/4
    (protox) lib/protox/decode.ex:17: Protox.Decode.decode!/3
    (demex) lib/demex.ex:160: Demex.parse_data_table/1
    (demex) lib/demex.ex:146: Demex.process_data_tables/1
    (demex) lib/demex.ex:89: Demex.process_bin/1

Without deeper knowledge of how the parser works: should protox generate the missing module? Or am I missing something?
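For what it's worth (an observation, not a confirmed diagnosis): the module name in the error is exactly the camelized form of the nested message name, so the decoder clearly expects one generated module per nested message. A quick check in iex:

iex> Macro.camelize("sendprop_t")
"SendpropT"
iex> Module.concat(CSVCMsg_SendTable, SendpropT)
CSVCMsg_SendTable.SendpropT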

Thank you.

[BUG] The code generated is not deterministic

We recently switched from the macro to mix protox.generate to improve compile time. One thing we noticed is that the code generated by the command is not deterministic: even when there is no change in the source proto files, the order of the key/value pairs inside the defs_by_name() function changes. This only happens for proto files with a lot of fields (> 50) and when there are changes in unrelated areas (e.g. checking out a different branch). I can't share my project's proto file here, but if you have difficulty reproducing the issue, let me know.

def(defs_by_name()) do
  %{
    ...
Protobuf 3.4.1 conformance failure

CONFORMANCE TEST BEGIN ====================================

ERROR, test=Required.Proto2.ProtobufInput.RepeatedScalarSelectsLast.UINT64.ProtobufOutput: Output was not equivalent to reference message: deleted: optional_uint64: 0
. request=protobuf_payload: " \271` \377\377\377\377\377\377\377\377\377\001 \000" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto2.TestAllTypesProto2", response=protobuf_payload: ""
ERROR, test=Required.Proto2.ProtobufInput.RepeatedScalarSelectsLast.FIXED64.ProtobufOutput: Output was not equivalent to reference message: deleted: optional_fixed64: 0
. request=protobuf_payload: "A90\000\000\000\000\000\000A\377\377\377\377\377\377\377\377A\000\000\000\000\000\000\000\000" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto2.TestAllTypesProto2", response=protobuf_payload: ""
ERROR, test=Required.Proto2.ProtobufInput.RepeatedScalarSelectsLast.FIXED32.ProtobufOutput: Output was not equivalent to reference message: deleted: optional_fixed32: 0
. request=protobuf_payload: "=90\000\000=\377\377\377\377=\000\000\000\000" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto2.TestAllTypesProto2", response=protobuf_payload: ""
ERROR, test=Recommended.Proto2.ProtobufInput.OneofZeroMessage.ProtobufOutput: Output was not equivalent to reference message: deleted: oneof_nested_message.a: 0
. request=protobuf_payload: "\202\007\002\010\000" requested_output_format: PROTOBUF message_type: "protobuf_test_messages.proto2.TestAllTypesProto2", response=protobuf_payload: "\202\007\000"

These tests failed.  If they can't be fixed right now, you can add them to the failure list so the overall suite can succeed.  Add them to the failure list by running:
  ./update_failure_list.py  --add failing_tests.txt

  Recommended.Proto2.ProtobufInput.OneofZeroMessage.ProtobufOutput
  Required.Proto2.ProtobufInput.RepeatedScalarSelectsLast.FIXED32.ProtobufOutput
  Required.Proto2.ProtobufInput.RepeatedScalarSelectsLast.FIXED64.ProtobufOutput
  Required.Proto2.ProtobufInput.RepeatedScalarSelectsLast.UINT64.ProtobufOutput

CONFORMANCE SUITE FAILED: 382 successes, 425 skipped, 0 expected failures, 4 unexpected failures.
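For context, a minimal sketch of the rule these tests exercise, using the schema option that appears elsewhere in these issues (the message and field names below are made up for the example):

defmodule LastWins do
  use Protox,
    schema: """
      syntax = "proto2";
      message Scalar {
        optional uint64 x = 1;
      }
    """
end

# Field 1 appears twice on the wire: x = 1, then x = 0. The decoder must keep
# the last occurrence, and because proto2 tracks presence, a conformant
# re-encode must still emit the field even though 0 is the default value.
msg = Scalar.decode!(<<8, 1, 8, 0>>)
IO.iodata_to_binary(Scalar.encode!(msg))
# expected from a conformant implementation: <<8, 0>>
# (the protox version in this report instead dropped the field)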

Enable manual import path specification

Providing multiple files to the files directive will automatically use the common_path (provided by the common_directory_path function) as protoc's import directory.

For instance, if you have the following directory layout:

some/prefix/path/foo.proto
some/prefix/path/bar/baz.proto

protox will automatically use some/prefix/path as the import path.

This can be problematic if baz.proto uses some/prefix/path in its import statement:

syntax = "proto3";

import "some/prefix/path/foo.proto";

...

With the current path handling, compilation fails with: some/prefix/path/foo.proto: File not found.

Would it be possible to add a path argument to the use statement which makes it possible to work around this behavior?
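A sketch of the kind of option being requested; the paths key below is illustrative and not an option that exists in the version discussed here:

defmodule MyProtos do
  use Protox,
    files: [
      "some/prefix/path/foo.proto",
      "some/prefix/path/bar/baz.proto"
    ],
    # hypothetical option: pass the protoc include path explicitly instead of
    # relying on the inferred common directory
    paths: ["."]
end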

Generating code for services?

I am using Protox for a gRPC client, which needs service definitions. Right now I'm manually creating structs like:

%{
  request_type: Echo.EchoRequest,
  request_stream?: false,
  response_type: Echo.EchoReply,
  response_stream?: false,
  service: "SayHello",
  service_module: "echo"
}

for the following service definition:

syntax = "proto3";

package echo;

service EchoServer {
  rpc SayHello (EchoRequest) returns (EchoReply) {}
}

It would be great if Protox could handle the service definitions as well. What are your thoughts on this?
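A sketch of what such generated service metadata could look like, reusing the hand-written map above; none of this is an existing protox API, and the module and function names are made up for illustration:

defmodule Echo.EchoServer.Service do
  # Hypothetical generated module: one entry per rpc of the EchoServer
  # service, in the same shape as the hand-written map above.
  def rpcs do
    [
      %{
        service: "SayHello",
        service_module: "echo",
        request_type: Echo.EchoRequest,
        request_stream?: false,
        response_type: Echo.EchoReply,
        response_stream?: false
      }
    ]
  end
end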

key not found when encoding optional fields

Describe the bug
When using optional fields, there seems to be some kind of problem with the field naming:

** (KeyError) key :_optional not found in: %Bar{__uf__: [], optional: "### OPTIONAL ###"}. Did you mean one of:

           * :optional

     code: Bar.json_encode!(msg)

To Reproduce
I wrote this test case to show the issue

defmodule OptionalTest do
  use ExUnit.Case

  use Protox,
    schema: """
      syntax = "proto3";

      message Foo {
        string required = 1;
      }

       message Bar {
        optional string optional = 1;
      }
    """

  test "not optional string" do
    msg = %Foo{required: "### NOT OPTIONAL ###"}

    assert ["{", ["\"required\"", ":", "\"### NOT OPTIONAL ###\""], "}"] ==
             Foo.json_encode!(msg)
  end

  test "optional string" do
    msg = %Bar{optional: "### OPTIONAL ###"}

    assert ["{", ["\"optional\"", ":", "\"### OPTIONAL ###\""], "}"] ==
             Bar.json_encode!(msg)
  end
end

output:

  1) test optional string (OptionalTest)
     test/timestamp_test.exs:24
     ** (KeyError) key :_optional not found in: %Bar{__uf__: [], optional: "### OPTIONAL ###"}. Did you mean one of:

           * :optional

     code: Bar.json_encode!(msg)
     stacktrace:
       :erlang.map_get(:_optional, %Bar{__uf__: [], optional: "### OPTIONAL ###"})
       (protox 1.7.0) lib/protox/json_encode.ex:76: Protox.JsonEncode.encode_msg_field/3
       (protox 1.7.0) lib/protox/json_encode.ex:19: anonymous fn/4 in Protox.JsonEncode.encode_message/2
       (elixir 1.11.4) lib/enum.ex:2193: Enum."-reduce/3-lists^foldl/2-0-"/3
       (protox 1.7.0) lib/protox/json_encode.ex:18: Protox.JsonEncode.encode_message/2
       test/timestamp_test.exs:28: (test)



Finished in 0.3 seconds
2 tests, 1 failure

Randomized with seed 860092

Expected behavior
The struct with the optional field is encoded.

Environment (please complete the following information):

  • Elixir version: Elixir 1.11.4
  • Erlang version: 23.3.4.17
  • protoc version: libprotoc 3.21.7

Additional context
This is a minimal reproduction. I've observed the same issue with binary decoding as well as with json_decode. Linked to #49/#50.

[BUG] Compilation of Protox 1.6 fails

Congrats on the new release! I was trying to update to the latest version but got the following errors. I was able to reproduce this by generating a brand new Phoenix 1.6 project and adding protox as a dependency.

Describe the bug
Compilation of Protox fails with the following output:

==> protox
Compiling 51 files (.ex)

== Compilation error in file lib/google/protobuf.ex ==
** (MatchError) no match of right hand side value: {:error, "[libprotobuf WARNING ../../../../../src/google/protobuf/compiler/parser.cc:651] No syntax specified for the proto file: 3EE60D098B85EF5C3B580784A65632E262F3BC4B.proto. Please use 'syntax = \"proto2\";' or 'syntax = \"proto3\";' to specify a syntax version. (Defaulted to proto2 syntax.)\ngoogle/protobuf/any.proto: File not found.\ngoogle/protobuf/duration.proto: File not found.\ngoogle/protobuf/field_mask.proto: File not found.\ngoogle/protobuf/struct.proto: File not found.\ngoogle/protobuf/timestamp.proto: File not found.\ngoogle/protobuf/wrappers.proto: File not found.\n3EE60D098B85EF5C3B580784A65632E262F3BC4B.proto:1:1: Import \"google/protobuf/any.proto\" was not found or had errors.\n3EE60D098B85EF5C3B580784A65632E262F3BC4B.proto:2:1: Import \"google/protobuf/duration.proto\" was not found or had errors.\n3EE60D098B85EF5C3B580784A65632E262F3BC4B.proto:3:1: Import \"google/protobuf/field_mask.proto\" was not found or had errors.\n3EE60D098B85EF5C3B580784A65632E262F3BC4B.proto:4:1: Import \"google/protobuf/struct.proto\" was not found or had errors.\n3EE60D098B85EF5C3B580784A65632E262F3BC4B.proto:5:1: Import \"google/protobuf/timestamp.proto\" was not found or had errors.\n3EE60D098B85EF5C3B580784A65632E262F3BC4B.proto:6:1: Import \"google/protobuf/wrappers.proto\" was not found or had errors.\n"}
    expanding macro: Protox.__using__/1
    lib/google/protobuf.ex:8: Google.Protobuf (module)
    (elixir 1.12.2) expanding macro: Kernel.use/2
    lib/google/protobuf.ex:8: Google.Protobuf (module)
could not compile dependency :protox, "mix compile" failed. You can recompile this dependency with "mix deps.compile protox", update it with "mix deps.update protox" or clean it with "mix deps.clean protox"

To Reproduce

  • run mix phx.new protox_compile_bug --no-ecto --no-html
  • open up mix.exs
  • add {:protox, "~> 1.6"}, to the dependencies
  • mix deps.get
  • mix deps.compile

Expected behavior
It should compile successfully.

Environment (please complete the following information):
elixir 1.12.2-otp-24
erlang 24.0.5

I can try to take a closer look at what is going on later today if it is not immediately obvious to you what the issue could be. My hunch is that perhaps there are some new files that are not being included in the hex package?

Unexpected list in typespec: [:high, :low]

Hi,

First of all, thank you for putting effort into this library. :-)

I'm trying to compile the counter strike proto files found here
https://github.com/SteamRE/SteamKit/blob/master/Resources/Protobufs/csgo/cstrike15_usermessages.proto

If I run protoc -I./ --descriptor_set_out=./protox_test.test cstrike15_usermessages.proto I get no errors, and my file is populated.

But when I run

defmodule Demex.UserMessages do
  @external_resource "./lib/protobufs/cstrike15_usermessages.proto"

  use Protox, files: [
    "./lib/protobufs/cstrike15_usermessages.proto"
  ], namespace: Ost
end

I can't compile my project, getting the following compile error.

⇒ mix compile
Compiling 1 file (.ex)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: cstrike15_usermessages.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: netmessages.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: cstrike15_gcmessages.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
[libprotobuf WARNING google/protobuf/compiler/parser.cc:546] No syntax specified for the proto file: steammessages.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
cstrike15_usermessages.proto: warning: Import google/protobuf/descriptor.proto but not used.
Compiling lib/user_messages.ex (it's taking more than 15s)

== Compilation error in file lib/user_messages.ex ==
** (CompileError) lib/user_messages.ex:4: unexpected list in typespec: [:high, :low]
    (elixir) lib/kernel/typespec.ex:1171: Kernel.Typespec.compile_error/2
    (elixir) lib/kernel/typespec.ex:1160: Kernel.Typespec.typespec/3
    (elixir) lib/kernel/typespec.ex:1189: Kernel.Typespec.fn_args/5
    (elixir) lib/kernel/typespec.ex:544: Kernel.Typespec.translate_spec/7

I simply don't know where to look; I hope someone can help me.

Could it be that it's trying to parse this?
https://github.com/google/protobuf/blob/master/src/google/protobuf/descriptor.proto#L690

Not seeing mix tasks

Hi @ahamez

First off, great work on this. I am interested in learning more about Protobufs with Elixir and possibly using this in production. I am using Protox 1.2, and after running mix deps.get I don't have the mix task mix protox.generate to generate the modules. Can you please point me at what I am missing or doing wrong?

Erlang/OTP 23 [erts-11.1] [source] [64-bit] [smp:12:12] [ds:12:12:10] [async-threads:1] [hipe]

Elixir 1.11.2 (compiled with Erlang/OTP 23)

Thanks in advance,

Viv

[BUG] README.md contains misleading examples

Hi there!
Trying the library, happen to notice this:

Describe the bug
The README says to call Protox.encode(msg), while the correct call would be Protox.Encode.encode(msg); probably the same applies to decode.

It is written correctly in the Elixir docs, but wrong (outdated?) in the README on GitHub.

Cheers!

Proto3 Field Presence

First off, thanks for the great library! In my initial testing, it offers the best performance out of our options in the BEAM ecosystem, and I appreciate the property and mutation tests!

It appears that they have added field presence to proto3, so now it is possible again (as it was in proto2) to distinguish between a value that was not set at all and a value that was set to the default value.

See:

https://github.com/protocolbuffers/protobuf/blob/master/docs/field_presence.md
https://github.com/protocolbuffers/protobuf/blob/master/docs/implementing_proto3_presence.md
protocolbuffers/protobuf#1606

Would you be interested in either adding this feature to Protox or accepting a PR?
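At the wire level, the distinction that presence tracking restores looks like this (plain protobuf facts, independent of protox):

# With presence tracking, "never set" and "explicitly set to the default"
# serialize differently for, say, `optional int32 value = 1;`:
unset    = <<>>       # field never set: no bytes at all
explicit = <<8, 0>>   # tag for field 1 (varint wire type) followed by value 0
{unset, explicit}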

protox generates nested lists (was: Strange behavior when encoding complex nested struct)

Hey there! I'm using Protox over at axon_onnx. Great library! It was really easy to get up and running.

I'm trying to encode and decode data based on this protocol: https://github.com/onnx/onnx/blob/master/onnx/onnx.proto

Everything works fine for decoding; however, when I try to encode models, it produces a strange nested list. As an example, I tried to read, decode, and then encode a model to determine if the result was the same and it was not. Specifically, I downloaded this model: https://github.com/onnx/models/blob/master/vision/classification/resnet/model/resnet18-v1-7.onnx

And then did:

iex(1)> data = File.read!("resnet18-v1-7.onnx")
<<8, 3, 58, 244, 218, 169, 22, 10, 172, 1, 10, 4, 100, 97, 116, 97, 10, 22, 114,
  101, 115, 110, 101, 116, 118, 49, 53, 95, 99, 111, 110, 118, 48, 95, 119, 101,
  105, 103, 104, 116, 18, 19, 114, 101, 115, 110, 101, 116, 118, 49, ...>>
iex(2)> decoded = Onnx.ModelProto.decode!(data)
%Onnx.ModelProto{...everything is fine here...}
iex(3)> encoded = Onnx.ModelProto.encode!(decoded)
[
  [
    [[], "\b", <<3>>],
    ":",
    [
      [<<241>>, [<<218>>, [<<169>>, <<22>>]]],
      <<10, 170, 1, 10, 4, 100, 97, 116, 97, 10, 22, 114, 101, 115, 110, 101,
        116, 118, 49, 53, 95, 99, 111, 110, 118, 48, 95, 119, 101, 105, 103,
        104, 116, 18, 19, 114, 101, 115, 110, 101, 116, 118, 49, 53, ...>>
    ]
  ],
  [[], "B", [<<2>>, <<16, 8>>]]
]

Not sure what's going wrong, but I am unable to encode even basic models. Please let me know if I can provide any additional info to help fix this :)
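A note that may explain the shape (based on the encode! spec quoted in another issue here, which declares iodata rather than binary as the return type): nested lists are legitimate iodata, and converting them yields a flat binary.

iodata = Onnx.ModelProto.encode!(decoded)
binary = IO.iodata_to_binary(iodata)             # flat binary, if that's what you need
File.write!("resnet18-roundtrip.onnx", iodata)   # File.write!/2 accepts iodata as-is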

[BUG] Generated code is wrapped in a list

Describe the bug
When the output is generated into a single file, the generated code is wrapped in a list.

To Reproduce
Have a nested message.
I ran the generate command like so:
MIX_ENV=prod mix protox.generate --output-path=./lib/protos.ex --namespace=Protos ./path/to/file.proto

Expected behaviour
The defmodule blocks should not be wrapped in a list.

Actual behaviour

[
  defmodule Protos.CalibrationErrorStatus do
    @moduledoc false
    (
      defstruct []

      (
        @spec default() :: :CALIBRATION_ERROR_STATUS_UNINIT
        def default() do
          :CALIBRATION_ERROR_STATUS_UNINIT
        end
      )

...

which is weird: why the list?
Also, the top-level message that references all the other messages is Request, not CalibrationErrorStatus.

Environment (please complete the following information):

  • Elixir version: 1.13-otp-24
  • Erlang version: 24.2
  • OS: Linux Mint
  • proto version: 2

Additional context
When using --multiple-files, and thus having one defmodule per file, the main wrapping list is gone, but the functions are wrapped in a list again:

@spec encode(atom()) :: integer() | atom()
    [
      (
        def encode(:TRIP_TYPE_UNDEFINED) do
          0
        end

        def encode("TRIP_TYPE_UNDEFINED") do
          0
        end
      ),
      (
        d

Maybe it's not a bug and I am doing something wrong.
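Regarding why the wrapping list still compiles (an observation about Elixir itself, not about protox's intent): def is a macro, so definitions placed inside a list literal are still expanded and defined; the list value itself is simply discarded. A minimal sketch:

defmodule ListWrapped do
  [
    def(one, do: 1),
    def(two, do: 2)
  ]
end

ListWrapped.one()  # => 1, the functions exist even though they were listed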

[Feature request] support `extend` when nested inside a message definition

I have the following (simplified) proto2 definition:

syntax = "proto2";

message SessionCommand {
    enum SessionCommandType {
        PING = 1000;
        // etc
    }
    extensions 100 to max;
}

message Command_Ping {
    extend SessionCommand {
        optional Command_Ping ext = 1000;
    }
}

// etc

When decoding a Command_Ping (when I don't know what message type I'm looking at) I get the following struct:

iex> Proto.SessionCommand.decode!(<<194, 62, 0>>)
%Proto.SessionCommand{__uf__: [{1000, 2, ""}]}

I then have to detect and correctly decode the extension with something like this:

defp decode(cmd, encoded) do
  [{id, _, _}] = SessionCommand.unknown_fields(cmd)

  case SessionCommandType.decode(id) do
    :PING ->
      Command_Ping.decode!(encoded)
  end
end

It would be much nicer if the SessionCommand decoder had a way to know which messages extend it, and then automatically decoded to the correct struct (or decoded them into some well-known field on the SessionCommand).
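A small variation on the dispatch above (a sketch; module names mirror the report and are assumed to be aliased): since the extension field carries the embedded Command_Ping as its value, the payload stored in the unknown-field tuple can be decoded directly instead of re-decoding the whole outer binary.

defp decode_extension(cmd) do
  # {field_number, wire_type, payload} as stored in __uf__
  [{id, _wire_type, payload}] = SessionCommand.unknown_fields(cmd)

  case SessionCommandType.decode(id) do
    :PING -> Command_Ping.decode!(payload)
  end
end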

[QUESTION] Enums implementation

Right now, enums are implemented in a way that allows passing an arbitrary value outside the expected set, which makes them a bit less reliable than they could be. What I mean is that, in the generated code, the decode and encode functions have catch-all clauses which accept any value and return it as-is. For example:

...
@spec encode(atom()) :: integer() | atom()
[
  (
    def(encode(:DISABLED)) do
      0
    end

    def(encode("DISABLED")) do
      0
    end
  ),
  (
    def(encode(:ENABLED)) do
      1
    end

    def(encode("ENABLED")) do
      1
    end
  )
]

def(encode(x)) do
  x
end
...

@spec decode(integer()) :: atom() | integer()
[
  def(decode(0)) do
    :DISABLED
  end,
  def(decode(1)) do
    :ENABLED
  end
]

def(decode(x)) do
  x
end
...

Wouldn't it be better if we either got rid of these last clauses or raised exceptions from them?
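For illustration, the consequence of those catch-all clauses, mirrored in a plain module (SomeEnum is made up; the clause structure follows the generated code quoted above):

defmodule SomeEnum do
  def encode(:DISABLED), do: 0
  def encode(:ENABLED), do: 1
  def encode(x), do: x          # catch-all: unknown atoms pass through silently

  def decode(0), do: :DISABLED
  def decode(1), do: :ENABLED
  def decode(x), do: x          # catch-all: out-of-range integers are returned as-is
end

SomeEnum.encode(:TYPO)  # => :TYPO, no error is raised
SomeEnum.decode(42)     # => 42, an integer leaks into decoded data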

[Feature Request] support for decimal 1.9.x

The library currently depends on decimal 2.0, which makes it hard to upgrade to the latest version since there are other dependencies that are still on 1.9.x. Other popular libraries like ecto support both versions. Since the usage of decimal is very limited, it should be easy to support both. I can open a PR if you are OK with adding support.
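The kind of relaxed requirement being requested, as it would appear in mix.exs (a sketch of the proposal, not protox's actual dependency list):

defp deps do
  [
    # accept both major versions, as ecto does for decimal
    {:decimal, "~> 1.9 or ~> 2.0"}
  ]
end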

[BUG] Fix compile warning when use --keep-unknown-fields=false

Describe the bug
When using --keep-unknown-fields=false to generate module(s) from *.proto file(s), compiling the generated Elixir module(s) produces the following warnings:

warning: invalid expression (). If you want to invoke or define a function, make sure there are no spaces between the function name and its arguments. If you wanted to pass an empty block, pass a value instead, such as a nil or an atom
  lib/elixir_foo_test.ex:30

warning: invalid expression (). If you want to invoke or define a function, make sure there are no spaces between the function name and its arguments. If you wanted to pass an empty block, pass a value instead, such as a nil or an atom
  lib/elixir_foo_test.ex:83

To Reproduce
Steps, code or protobuf messages to reproduce the behavior:

1. Here is the foo.proto file:

syntax = "proto2";

package Foo;

message Test {
    optional string name = 1;
}

2. Create a mix application, add the latest protox dependency, and use it to generate a module file from foo.proto in the root of this application:

mix protox.generate --keep-unknown-fields=false --multiple-files --output-path=lib --include-path=. protos/foo.proto

3. Open the generated file lib/elixir_foo_test.ex, shown below:

# credo:disable-for-this-file
defmodule(Foo.Test) do
  @moduledoc(false)
  (
    defstruct(name: nil)
    (
      @spec(encode(struct) :: {:ok, iodata} | {:error, any})
      def(encode(msg)) do
        try do
          {:ok, encode!(msg)}
        rescue
          e ->
            {:error, e}
        end
      end
      @spec(encode!(struct) :: iodata | no_return)
      def(encode!(msg)) do
        [] |> encode_name(msg)
      end
      []
      [defp(encode_name(acc, msg)) do
        field_value = msg.name()
        case(field_value) do
          nil ->
            acc
          _ ->
            [acc, "\n", Protox.Encode.encode_string(field_value)]
        end
      end]
      (
        
      )
    )
    (
      @spec(decode(binary) :: {:ok, struct} | {:error, any})
      def(decode(bytes)) do
        try do
          {:ok, decode!(bytes)}
        rescue
          e ->
            {:error, e}
        end
      end
      (
        @spec(decode!(binary) :: struct | no_return)
        def(decode!(bytes)) do
          parse_key_value(bytes, struct(Foo.Test))
        end
      )
      (
        @spec(parse_key_value(binary, struct) :: struct)
        defp(parse_key_value(<<>>, msg)) do
          msg
        end
        defp(parse_key_value(bytes, msg)) do
          {field, rest} = case(Protox.Decode.parse_key(bytes)) do
            {0, _, _} ->
              raise(%Protox.IllegalTagError{})
            {1, _, bytes} ->
              {len, bytes} = Protox.Varint.decode(bytes)
              <<delimited::binary-size(len), rest::binary>> = bytes
              value = delimited
              field = {:name, value}
              {[field], rest}
            {tag, wire_type, rest} ->
              {_, rest} = Protox.Decode.parse_unknown(tag, wire_type, rest)
              {[], rest}
          end
          msg_updated = struct(msg, field)
          parse_key_value(rest, msg_updated)
        end
      )
      []
    )
    @spec(defs() :: %{required(non_neg_integer) => {atom, Protox.Types.kind(), Protox.Types.type()}})
    def(defs()) do
      %{1 => {:name, {:default, ""}, :string}}
    end
    @spec(defs_by_name() :: %{required(atom) => {non_neg_integer, Protox.Types.kind(), Protox.Types.type()}})
    def(defs_by_name()) do
      %{name: {1, {:default, ""}, :string}}
    end
    (
      
    )
    @spec(required_fields() :: [])
    def(required_fields()) do
      []
    end
    @spec(syntax() :: atom)
    def(syntax()) do
      :proto2
    end
    [@spec(default(atom) :: {:ok, boolean | integer | String.t() | float} | {:error, atom}), [def(default(:name)) do
      {:ok, ""}
    end], def(default(_)) do
      {:error, :no_such_field}
    end]
  )
  def(__generated_code__()) do
    "(\n  defstruct(name: nil)\n  (\n    @spec(encode(struct) :: {:ok, iodata} | {:error, any})\n    def(encode(msg)) do\n      try do\n        {:ok, encode!(msg)}\n      rescue\n        e ->\n          {:error, e}\n      end\n    end\n    @spec(encode!(struct) :: iodata | no_return)\n    def(encode!(msg)) do\n      [] |> encode_name(msg)\n    end\n    []\n    [defp(encode_name(acc, msg)) do\n      field_value = msg.name()\n      case(field_value) do\n        nil ->\n          acc\n        _ ->\n          [acc, \"\\n\", Protox.Encode.encode_string(field_value)]\n      end\n    end]\n    (\n      \n    )\n  )\n  (\n    @spec(decode(binary) :: {:ok, struct} | {:error, any})\n    def(decode(bytes)) do\n      try do\n        {:ok, decode!(bytes)}\n      rescue\n        e ->\n          {:error, e}\n      end\n    end\n    (\n      @spec(decode!(binary) :: struct | no_return)\n      def(decode!(bytes)) do\n        parse_key_value(bytes, struct(Foo.Test))\n      end\n    )\n    (\n      @spec(parse_key_value(binary, struct) :: struct)\n      defp(parse_key_value(<<>>, msg)) do\n        msg\n      end\n      defp(parse_key_value(bytes, msg)) do\n        {field, rest} = case(Protox.Decode.parse_key(bytes)) do\n          {0, _, _} ->\n            raise(%Protox.IllegalTagError{})\n          {1, _, bytes} ->\n            {len, bytes} = Protox.Varint.decode(bytes)\n            <<delimited::binary-size(len), rest::binary>> = bytes\n            value = delimited\n            field = {:name, value}\n            {[field], rest}\n          {tag, wire_type, rest} ->\n            {_, rest} = Protox.Decode.parse_unknown(tag, wire_type, rest)\n            {[], rest}\n        end\n        msg_updated = struct(msg, field)\n        parse_key_value(rest, msg_updated)\n      end\n    )\n    []\n  )\n  @spec(defs() :: %{required(non_neg_integer) => {atom, Protox.Types.kind(), Protox.Types.type()}})\n  def(defs()) do\n    %{1 => {:name, {:default, \"\"}, :string}}\n  end\n  @spec(defs_by_name() :: %{required(atom) => {non_neg_integer, Protox.Types.kind(), Protox.Types.type()}})\n  def(defs_by_name()) do\n    %{name: {1, {:default, \"\"}, :string}}\n  end\n  (\n    \n  )\n  @spec(required_fields() :: [])\n  def(required_fields()) do\n    []\n  end\n  @spec(syntax() :: atom)\n  def(syntax()) do\n    :proto2\n  end\n  [@spec(default(atom) :: {:ok, boolean | integer | String.t() | float} | {:error, atom}), [def(default(:name)) do\n    {:ok, \"\"}\n  end], def(default(_)) do\n    {:error, :no_such_field}\n  end]\n)"
  end
end

We can see that there are two sections (lines 30 and 83) with an empty ( ) in the generated lib/elixir_foo_test.ex file.

Expected behavior
Fix the above mentioned warning.

Environment (please complete the following information):

  • Elixir version: 1.10
  • Erlang version: 22

Additional context
N/A

Supporting FileOptions

Hi, thank you for protox.

Just wanted to know if there is any chance of adding support for FileOptions? I saw a commit that removed FileOptions a long time ago, so I wanted to know the rationale.

Thank you

[Feature request] Define spec for structs

As described in the docs, there is already a mapping from field to type.

And the decode/1 function spec uses a generic struct. As an example from the generated code:

@spec decode(binary) :: {:ok, struct} | {:error, any}

Would it be possible to generate a @type t for the structs?
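A sketch of what the requested typespec could look like for the Bar message used in an earlier issue; none of this is emitted by the protox version discussed here, and the exact field types are assumptions:

defmodule Bar do
  @type t :: %__MODULE__{
          optional: String.t() | nil,
          __uf__: [{non_neg_integer(), non_neg_integer(), binary()}]
        }
  defstruct optional: nil, __uf__: []

  @spec decode(binary()) :: {:ok, t()} | {:error, any()}
  def decode(_bytes), do: {:error, :not_implemented}  # stub so the spec has a function
end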

Inconsistency in generated functions specs?

Protox: v. 1.6.2

When I generate a module for a given protobuf message, the code is decorated with function specs. For example:

defmodule PB.Hello do
  ...

  @spec json_decode(iodata(), keyword()) :: {:ok, struct()} | {:error, any()}
  def(json_decode(input, opts \\ [])) do
    try do
      {:ok, json_decode!(input, opts)}
    rescue
      e in Protox.JsonDecodingError ->
        {:error, e}
    end
  end

  @spec json_decode!(iodata(), keyword()) :: iodata() | no_return()
  def(json_decode!(input, opts \\ [])) do
    {json_library_wrapper, json_library} = Protox.JsonLibrary.get_library(opts, :decode)

    Protox.JsonDecode.decode!(
      input,
      PB.Hello,
      &json_library_wrapper.decode!(json_library, &1)
    )
  end

  ...
end

Unfortunately, there seems to be an inconsistency between the json_decode! and json_decode specs for their happy paths: the former is declared to return iodata whereas the latter returns a struct. That can't be right, given how the result value of json_decode is built:

 {:ok, json_decode!(input, opts)}

Shouldn't they return {:ok, struct()} and struct() respectively?
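The consistent pair of specs the question implies, written out as a sketch (the stub bodies only exist so the specs attach to compilable functions; this is not protox's generated code):

defmodule SpecSketch do
  defstruct []

  @spec json_decode!(iodata(), keyword()) :: struct() | no_return()
  def json_decode!(_input, _opts \\ []), do: %__MODULE__{}

  @spec json_decode(iodata(), keyword()) :: {:ok, struct()} | {:error, any()}
  def json_decode(input, opts \\ []) do
    {:ok, json_decode!(input, opts)}
  rescue
    e -> {:error, e}
  end
end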
