
lua-protobuf's Introduction

Google protobuf support for Lua


English | 中文


This project offers a C module for Lua (5.1, 5.2, 5.3, 5.4 and LuaJIT) for manipulating Google's protobuf protocol, supporting both proto2 and proto3 syntax and semantics. It is split into a low-level part and a high-level part, serving different goals.

To convert between binary protobuf data and Lua tables, use pb.load() to load a compiled protobuf schema (a *.pb file generated by Google's protobuf compiler, protoc), then call pb.encode()/pb.decode().

Or use these modules to manipulate the raw wire format in a lower-level way:

  • pb.slice: a wire-format decoding module.
  • pb.buffer: a buffer implementation used to encode basic types into protobuf's wire format; it can also be used to support streaming decoding of protobuf data.
  • pb.conv: a module that converts integers to and from the protobuf wire format.
  • pb.io: a module that accesses stdin/stdout or other files in binary mode.

If you don't want to depend on Google's protobuf compiler, protoc.lua is a pure Lua module that translates text-based protobuf schemas into the *.pb binary format.

Install

To install, you can just use luarocks:

luarocks install lua-protobuf

If you want to build it from source, clone the repo and use luarocks:

git clone https://github.com/starwing/lua-protobuf
luarocks make rockspecs/lua-protobuf-scm-1.rockspec

If you don't have luarocks, use hererocks to install Lua and luarocks:

pip install hererocks
git clone https://github.com/starwing/lua-protobuf
hererocks -j 2.0 -rlatest .
bin/luarocks make lua-protobuf/rockspecs/lua-protobuf-scm-1.rockspec CFLAGS="-fPIC -Wall -Wextra" LIBFLAGS="-shared"
cp protoc.lua pb.so ..

Or you can build it by hand. It consists of a single pure Lua module, protoc.lua, and a pair of C sources: pb.h and pb.c. Note that to build the pb C module, you need the Lua header files and/or library installed; replace $LUA_HEADERS and $LUA_LIBS below with the real install locations.

To build it on macOS, use your favorite compiler:

gcc -O2 -shared -undefined dynamic_lookup -I "$LUA_HEADERS" pb.c -o pb.so

On Linux, use a nearly identical command:

gcc -O2 -shared -fPIC -I "$LUA_HEADERS" pb.c -o pb.so

On Windows, you can use MinGW or MSVC; create a *.sln project, or build on the command line (note the LUA_BUILD_AS_DLL flag):

cl /O2 /LD /Fepb.dll /I "$LUA_HEADERS" /DLUA_BUILD_AS_DLL pb.c "$LUA_LIBS"

Example

local pb = require "pb"
local protoc = require "protoc"

-- load schema from text (just for demo, use protoc.new() in real world)
assert(protoc:load [[
   message Phone {
      optional string name        = 1;
      optional int64  phonenumber = 2;
   }
   message Person {
      optional string name     = 1;
      optional int32  age      = 2;
      optional string address  = 3;
      repeated Phone  contacts = 4;
   } ]])

-- lua table data
local data = {
   name = "ilse",
   age  = 18,
   contacts = {
      { name = "alice", phonenumber = 12312341234 },
      { name = "bob",   phonenumber = 45645674567 }
   }
}

-- encode lua table data into binary format in lua string and return
local bytes = assert(pb.encode("Person", data))
print(pb.tohex(bytes))

-- and decode the binary data back into lua table
local data2 = assert(pb.decode("Person", bytes))
print(require "serpent".block(data2))

Use case

零境交错

Usage

protoc Module

| Function | Returns | Description |
| --- | --- | --- |
| protoc.new() | Protoc object | create a new compiler instance |
| protoc.reload() | true | reload all google standard messages into the pb module |
| p:parse(string) | table | transform a schema into a DescriptorProto table |
| p:compile(string) | string | transform a schema into binary *.pb format data |
| p:load(string) | true | load a schema into the pb module |
| p.loaded | table | contains all parsed DescriptorProto tables |
| p.unknown_import | see below | handle schema import errors |
| p.unknown_type | see below | handle unknown types in a schema |
| p.include_imports | bool | auto-load imported protos |

To parse a text schema content, create a compiler instance first:

local p = protoc.new()

Then, set some options to the compiler, e.g. the unknown handlers:

-- set some hooks
p.unknown_import = function(self, module_name) ... end
p.unknown_type   = function(self, type_name) ... end
-- ... and options
p.include_imports = true

The unknown_import and unknown_type handlers can be true, a string, or a function. Setting one to true means all nonexistent modules and types are given a default value without triggering an error. A string is treated as a Lua pattern that indicates which unknown module or type names should not raise an error, e.g.

p.unknown_type = "Foo.*"

means all types whose names start with Foo will be treated as existing types and will not trigger errors.

If these handlers are functions, the unknown module or type name is passed to the function. The module handler should return a DescriptorProto table, such as the one produced by p:parsefile(); the type handler should return a type name and a kind, such as message or enum, e.g.

function p:unknown_import(name)
  -- if can not find "foo.proto", load "my_foo.proto" instead
  return p:parsefile("my_"..name)
end

function p:unknown_type(name)
  -- if we cannot find "Type", treat it as ".MyType" of message kind
  return ".My"..name, "message"
end

After setting options, use the load(), compile() or parse() functions to get the result.
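A sketch of the three entry points (the one-message schema here is made up for illustration):

```lua
local pb = require "pb"
local protoc = require "protoc"

local p = protoc.new()
local schema = "message Empty {}"   -- hypothetical schema

local t = p:parse(schema)     -- DescriptorProto table; does not touch the pb state
local bin = p:compile(schema) -- binary *.pb data, suitable for pb.load()
assert(p:load(schema))        -- compile and register into the pb module in one step
assert(pb.type "Empty")       -- the type is now visible to pb
```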

pb Module

The pb module has high-level routines to manipulate protobuf messages.

In the table of functions below, several types have special meanings:

  • type: a string naming a protobuf message type. ".Foo" refers to the type Foo in a proto definition with no package statement declared; "foo.Foo" refers to the type Foo in a proto definition that declares package foo;

  • data: may be a string, a pb.Slice value, or a pb.Buffer value.

  • iterator: a function usable in a Lua for ... in statement, e.g.

    for name in pb.types() do
      print(name)
    end

NOTICE: only pb.load() returns an error on failure, so do check the results it returns. For convenience, the other routines raise an error on failure.

| Function | Returns | Description |
| --- | --- | --- |
| pb.clear() | none | clear all types |
| pb.clear(type) | none | delete a specific type |
| pb.load(data) | boolean, integer | load binary schema data into the pb module |
| pb.encode(type, table) | string | encode a message table into binary form |
| pb.encode(type, table, b) | buffer | encode a message table into binary form, appended to buffer b |
| pb.decode(type, data) | table | decode a binary message into a Lua table |
| pb.decode(type, data, table) | table | decode a binary message into the given Lua table |
| pb.pack(fmt, ...) | string | same as buffer.pack(), but returns a string |
| pb.unpack(data, fmt, ...) | values... | same as slice.unpack(), but accepts data |
| pb.types() | iterator | iterate over all types in the pb module |
| pb.type(type) | see below | return information for the specific type |
| pb.fields(type) | iterator | iterate over all fields in a message |
| pb.field(type, string) | see below | return information for the specific field of a type |
| pb.typefmt(type) | string | transform a field's type name into a pack/unpack format character |
| pb.enum(type, string) | number | get the value of an enum by name |
| pb.enum(type, number) | string | get the name of an enum by value |
| pb.defaults(type[, table/nil]) | table | get the default table of a type |
| pb.hook(type[, function]) | function | get or set hook functions |
| pb.option(string) | string | set options for the decoder/encoder |
| pb.state() | pb.State | retrieve the current pb state |
| pb.state(newstate \| nil) | pb.State | set a new pb state and retrieve the old one |

Schema loading

pb.load() accepts schema binary data and returns a boolean indicating whether loading succeeded, plus the offset reached while reading the schema, which is useful for figuring out the reason for a failure.
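For example, a defensive-loading sketch (the schema file name is hypothetical):

```lua
local pb = require "pb"
local pb_io = require "pb.io"

-- "addressbook.pb" stands in for a schema compiled with protoc
local data = assert(pb_io.read "addressbook.pb")
local ok, pos = pb.load(data)
if not ok then
  error(("failed to load schema near offset %d"):format(pos))
end
```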

Type mapping

| Protobuf Types | Lua Types |
| --- | --- |
| double, float | number |
| int32, uint32, fixed32, sfixed32, sint32 | number, or integer in Lua 5.3+ |
| int64, uint64, fixed64, sfixed64, sint64 | number, or "#"-prefixed string, or integer in Lua 5.3+ |
| bool | boolean |
| string, bytes | string |
| message | table |
| enum | string or number |

Type Information

Use the pb.(type|field)[s]() functions to retrieve type information for loaded messages.

pb.type() returns several pieces of information about the specified type:

  • name : the full qualifier name of type, e.g. ".package.TypeName"
  • basename: the type name without package prefix, e.g. "TypeName"
  • "map" | "enum" | "message": whether the type is a map_entry type, an enum type, or a message type.

pb.types() returns an iterator that behaves as if pb.type() were called on every loaded type.

print(pb.type "MyType")

-- list all types that loaded into pb
for name, basename, type in pb.types() do
  print(name, basename, type)
end

pb.field() returns information about the specified field of one type:

  • name: the name of the field
  • number: the field number in the schema
  • type: the field type
  • default value: the default value of the field, or nil if there is none
  • "packed" | "repeated" | "optional": the label of the field (required is not supported)
  • [oneof_name, oneof_index]: if this is a oneof field, the oneof name and index

And pb.fields() iterates all fields in a message:

print(pb.field("MyType", "the_first_field"))

-- notice that you needn't receive all return values from iterator
for name, number, type in pb.fields "MyType" do
  print(name, number, type)
end

pb.enum() maps between enum names and values:

protoc:load [[
enum Color { Red = 1; Green = 2; Blue = 3 }
]]
print(pb.enum("Color", "Red")) --> 1
print(pb.enum("Color", 2)) --> "Green"

Default Values

Use pb.defaults() to get or set a table containing all default values of a message. This table is used as the metatable of the corresponding decoded message table when the use_default_metatable option is set.

You can also call pb.defaults() with "*map" or "*array" to get the default metatable used for maps and arrays when decoding a message. These settings bypass the use_default_metatable option.

To clear a default metatable, pass nil as the second argument to pb.defaults().

assert(protoc:load [[
   message TestDefault {
      optional int32 defaulted_int = 10 [ default = 777 ];
      optional bool defaulted_bool = 11 [ default = true ];
      optional string defaulted_str = 12 [ default = "foo" ];
      optional float defaulted_num = 13 [ default = 0.125 ];
   } ]])
print(require "serpent".block(pb.defaults "TestDefault"))
-- output:
-- {
--   defaulted_bool = true,
--   defaulted_int = 777,
--   defaulted_num = 0.125,
--   defaulted_str = "foo"
-- } --[[table: 0x7f8c1e52b050]]

Hooks

If pb.option "enable_hooks" is set, hook functions are enabled. You can use pb.hook() and pb.encode_hook() to set or get a decode or encode hook function, respectively: call either with just a type name to get the currently set hook; call it with two arguments to set a hook; and call it with nil as the second argument to remove the hook. In all cases, the original hook is returned.

Once a hook is set and hooks are enabled, the decode hook is called after a message is decoded, and the encode hook is called before a message is encoded. So you can inspect all values in the table passed to the hook function, which is the hook's only argument.

If you need type name in hook functions, use this helper:

local function make_hook(name, func)
  return pb.hook(name, function(t)
    return func(name, t)
  end)
end
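With this helper in place, a small usage sketch (assuming a Person message is already loaded; the _type field name is made up for illustration):

```lua
pb.option "enable_hooks"

-- stamp every decoded Person table with its type name;
-- the hook receives the decoded table as its only argument
make_hook("Person", function(name, t)
  t._type = name
end)

-- after this, pb.decode("Person", bytes) yields a table whose
-- _type field has been set to "Person" by the hook
```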

Options

Set options to change the behavior of other routines. The following options are currently supported:

| Option | Description |
| --- | --- |
| enum_as_name | set the value to the enum name when decoding an enum (default) |
| enum_as_value | set the value to the enum value when decoding an enum |
| int64_as_number | set the value to an integer when it fits into a uint32, otherwise return a number (default) |
| int64_as_string | same as above, but return a string instead |
| int64_as_hexstring | same as above, but return a hexadecimal string instead |
| auto_default_values | act as use_default_values for proto3, and as no_default_values otherwise (default) |
| no_default_values | do not set default values in decoded message tables |
| use_default_values | set default values by copying them from the default table before decoding |
| use_default_metatable | set default values by setting the table from pb.defaults() as the metatable |
| enable_hooks | pb.decode will call hook functions |
| disable_hooks | pb.decode does not call hooks (default) |
| encode_default_values | also encode default values |
| no_encode_default_values | do not encode default values (default) |
| decode_default_array | works with no_default_values; decode null to an empty table for arrays |
| no_decode_default_array | works with no_default_values; decode null to nil for arrays (default) |
| encode_order | guarantees the same message encodes to the same result given the same schema and data (but the order itself is not specified) |
| no_encode_order | no guarantee about encoding order (default) |
| decode_default_message | decode a null message to a default message table |
| no_decode_default_message | decode a null message to null (default) |

Note: the string returned by int64_as_string or int64_as_hexstring is prefixed with a '#' character. Because Lua may convert between strings and numbers, the '#' prefix ensures Lua returns the string as-is.

All routines in all modules accept '#'-prefixed strings/hex strings as arguments, regardless of the option setting.
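For instance, a sketch of the '#'-prefixed form (assuming a message Msg with an int64 field named id has been loaded; both names are made up):

```lua
pb.option "int64_as_string"

-- 64-bit values beyond double precision can be passed
-- and returned as '#'-prefixed decimal strings
local bytes = pb.encode("Msg", { id = "#9007199254740993" })
local t = pb.decode("Msg", bytes)
-- with int64_as_string, t.id comes back as a '#'-prefixed decimal string
```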

Multiple State

The pb module supports multiple states. A state is a database that holds all the type information of registered messages. You can retrieve the current state with pb.state(), or set a new one with pb.state(newstate).

Use pb.state(nil) to discard the current state without setting a new one (subsequent routine calls that use the state will create a new default state automatically). Use pb.state() to retrieve the current state without changing it. e.g.

local old = pb.state(nil)
-- if you use protoc.lua, call protoc.reload() here.
assert(pb.load(...))
-- do something ...
pb.state(old)

Note that if you use the protoc.lua module, it registers some messages into the state, so you should call protoc.reload() after setting a new state.

pb.io Module

The pb.io module reads and writes binary data from/to a file or stdin/stdout. pb.io.read() reads binary data from a file, or from stdin if no file name is given as the first parameter.

pb.io.write() and pb.io.dump() are the same as Lua's io.write() except that they write binary data. The former writes to stdout; the latter writes to a file named by its first parameter.

All these functions return a true value on success, and return nil plus an error message on failure.

| Function | Returns | Description |
| --- | --- | --- |
| io.read() | string | read all binary data from stdin |
| io.read(string) | string | read all binary data from the named file |
| io.write(...) | true | write binary data to stdout |
| io.dump(string, ...) | string | write binary data to the named file |
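A minimal round trip through pb.io (the file name out.bin is made up):

```lua
local pb_io = require "pb.io"

local payload = "\0\1\2\3"                  -- arbitrary binary data
assert(pb_io.dump("out.bin", payload))      -- write the bytes to a file
assert(pb_io.read("out.bin") == payload)    -- read them back verbatim
```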

pb.conv Module

The pb.conv module provides functions to convert numbers to and from their protobuf wire representations.

| Encode Function | Decode Function |
| --- | --- |
| conv.encode_int32() | conv.decode_int32() |
| conv.encode_uint32() | conv.decode_uint32() |
| conv.encode_sint32() | conv.decode_sint32() |
| conv.encode_sint64() | conv.decode_sint64() |
| conv.encode_float() | conv.decode_float() |
| conv.encode_double() | conv.decode_double() |
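As a quick sketch of the sint32 pair, which implements protobuf's zigzag encoding (small negative numbers map to small unsigned values):

```lua
local conv = require "pb.conv"

-- zigzag maps -1 -> 1, 1 -> 2, -2 -> 3, ...
-- so encode followed by decode is the identity
assert(conv.decode_sint32(conv.encode_sint32(-12345)) == -12345)
```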

pb.slice Module

Slice objects parse binary protobuf data in a low-level way. Use slice.new() to create a slice object; the optional offsets i and j give access to a subrange of the original data (called a view).

Since protobuf usually nests sub-messages within a range of a slice, a slice object maintains a stack to support this. Calling s:enter(i, j) saves the current position and enters the next level, with the optional offsets i and j interpreted as in slice.new(); calling s:leave() restores the previous view. s:level() returns the current level, and s:level(n) returns the current position plus the start and end positions of the nth level. Calling s:enter() with no parameters reads a length-delimited value from the slice and enters the view of that value. Use #s to get the count of bytes remaining in the current view.

local s = slice.new("<data here>")
local tag = s:unpack "v"
if tag%8 == 2 then -- tag has a type of string/bytes? maybe it's a sub-message.
  s:enter() -- read following bytes value, and enter the view of bytes value.
  -- do something with bytes value, e.g. reads a lot of fixed32 integers from bytes.
  local t = {}
  while #s > 0 do
    t[#t+1] = s:unpack "d"
  end
  s:leave() -- after done, leave bytes value and ready to read next value.
end

To read values from a slice, use slice.unpack(). It takes a format string that controls how the slice is read, per the table below (the same format characters are used by buffer.pack()). Note that you can use pb.typefmt() to convert between format characters and protobuf type names (as returned by pb.field()).

| Format | Description |
| --- | --- |
| v | varint value |
| d | 4-byte fixed32 value |
| q | 8-byte fixed64 value |
| s | length-delimited value, usually a string, bytes or message in protobuf |
| c | takes an extra number parameter count after the format, and reads count bytes from the slice |
| b | varint value, as a Lua boolean |
| f | 4-byte fixed32 value, as a floating point number |
| F | 8-byte fixed64 value, as a floating point number |
| i | varint value, as a signed int, i.e. int32 |
| j | varint value, as a zigzag-encoded signed int, i.e. sint32 |
| u | varint value, as an unsigned int, i.e. uint32 |
| x | 4-byte fixed32 value, as an unsigned fixed32, i.e. fixed32 |
| y | 4-byte fixed32 value, as a signed fixed32, i.e. sfixed32 |
| I | varint value, as a signed int, i.e. int64 |
| J | varint value, as a zigzag-encoded signed int, i.e. sint64 |
| U | varint value, treated as uint64 |
| X | 8-byte fixed64 value, as an unsigned fixed64, i.e. fixed64 |
| Y | 8-byte fixed64 value, as a signed fixed64, i.e. sfixed64 |

Extra format characters can be used to control the read cursor within a single slice.unpack() call:

| Format | Description |
| --- | --- |
| @ | returns the current cursor position in the slice, relative to the beginning of the current view |
| * | sets the current cursor position to the extra parameter following the format string |
| + | sets the cursor position relatively, i.e. adds the extra parameter to the current position |

e.g. if you want to read a varint value twice, you can write it as:

local v1, v2 = s:unpack("v*v", 1)
-- v: reads a `varint` value
-- *: takes the extra parameter 1 and sets the cursor to that position, i.e. restores the cursor to the head of the view
-- v: reads the same `varint` value again

All routines in pb.slice module:

| Function | Returns | Description |
| --- | --- | --- |
| slice.new(data[,i[,j]]) | Slice object | create a new slice object |
| s:delete() | none | same as s:reset(); frees its content |
| tostring(s) | string | returns the string representation of the object |
| #s | number | returns the count of readable bytes in the current view |
| s:result([i[, j]]) | string | returns the remaining bytes in the current view |
| s:reset([...]) | self | resets the object to other data |
| s:level() | number | returns the count of stored states |
| s:level(number) | p, i, j | returns the position information of the nth stored state |
| s:enter() | self | reads a bytes value and enters its view |
| s:enter(i[, j]) | self | enters a view that starts at i and ends at j, inclusive |
| s:leave([number]) | self, n | leaves number levels (default 1) and returns the current level |
| s:unpack(fmt, ...) | values... | reads values from the current view of the slice |

pb.buffer Module

The pb.buffer module constructs a protobuf wire-format data stream in a low-level way. It's just a byte buffer: use buffer.pack() to append values to the buffer, buffer.result() to get the encoded raw data, or buffer.tohex() to get a human-readable hex representation of the data.

buffer.pack() uses the same format syntax as slice.unpack(), and additionally supports the '()' format, which means the enclosed values are encoded as a single length-delimited value, i.e. in the same format as an encoded message value.

Parentheses can be nested.

e.g.

b:pack("(vvv)", 1, 2, 3) -- produces a bytes value that contains three varint values.

buffer.pack() also supports the '#' format, which prepends a length to the buffer.

e.g.

b:pack("#", 5) -- prepends a varint length (#b - 5 + 1) at offset 5

All routines in pb.buffer module:

| Function | Returns | Description |
| --- | --- | --- |
| buffer.new([...]) | Buffer object | create a new buffer object; extra args are passed to b:reset() |
| b:delete() | none | same as b:reset(); frees its content |
| tostring(b) | string | returns the string representation of the object |
| #b | number | returns the count of encoded bytes in the buffer |
| b:reset() | self | resets to an empty buffer |
| b:reset([...]) | self | resets the buffer and sets its content to the concatenation of its args |
| b:tohex([i[, j]]) | string | returns a hex representation of the data; i and j are an inclusive range within the encoded data, defaulting to the whole range |
| b:result([i[,j]]) | string | returns the raw data; i and j are an inclusive range within the encoded data, defaulting to the whole range |
| b:pack(fmt, ...) | self | encodes the values passed to b:pack(), using fmt to indicate how to encode them |
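Combining the two low-level modules, a round-trip sketch that packs two varints as one length-delimited value and reads them back:

```lua
local buffer = require "pb.buffer"
local slice  = require "pb.slice"

local b = buffer.new()
b:pack("(vv)", 150, 300)        -- a bytes value holding two varints

local s = slice.new(b:result())
s:enter()                       -- read the bytes value and enter its view
local v1, v2 = s:unpack("vv")
s:leave()
assert(v1 == 150 and v2 == 300)
```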

lua-protobuf's People

Contributors

changnet, cjtallman, doublecai, dualhappiness, edam, fmaksim74, forsakenyang, javierguerragiraldez, jayatubi, membphis, mxi-box, scandgy, seclionys, spacewander, starwing, sundream, wzhengsen


lua-protobuf's Issues

ld: can't write output file: pb.so for architecture x86_64

Mac :
yhh:lua-protobuf watl$ gcc -O2 -shared -undefined dynamic_lookup pb.c -o pb.so
ld: can't write output file: pb.so for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
yhh:lua-protobuf watl$ gcc -O2 -shared -undefined dynamic_lookup pb.c -o pb.so -v
Apple LLVM version 9.0.0 (clang-900.0.39.2)
Target: x86_64-apple-darwin17.3.0
Thread model: posix
InstalledDir: /Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin
"/Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang" -cc1 -triple x86_64-apple-macosx10.13.0 -Wdeprecated-objc-isa-usage -Werror=deprecated-objc-isa-usage -emit-obj -disable-free -disable-llvm-verifier -discard-value-names -main-file-name pb.c -mrelocation-model pic -pic-level 2 -mthread-model posix -mdisable-fp-elim -fno-strict-return -masm-verbose -munwind-tables -target-cpu penryn -target-linker-version 305 -v -dwarf-column-info -debugger-tuning=lldb -resource-dir /Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/clang/9.0.0 -O2 -fdebug-compilation-dir /Volumes/Mac/Documents/workspace/lua-protobuf -ferror-limit 19 -fmessage-length 181 -stack-protector 1 -fblocks -fobjc-runtime=macosx-10.13.0 -fencode-extended-block-signature -fmax-type-align=16 -fdiagnostics-show-option -fcolor-diagnostics -vectorize-loops -vectorize-slp -o /var/folders/8z/77k5pp_95pg0blphwr1bmqk00000gn/T/pb-d0a484.o -x c pb.c
clang -cc1 version 9.0.0 (clang-900.0.39.2) default target x86_64-apple-darwin17.3.0
#include "..." search starts here:
#include <...> search starts here:
/usr/local/include
/Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/clang/9.0.0/include
/Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include
/usr/include
/System/Library/Frameworks (framework directory)
/Library/Frameworks (framework directory)
End of search list.
"/Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld" -demangle -lto_library /Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/libLTO.dylib -dynamic -dylib -arch x86_64 -macosx_version_min 10.13.0 -undefined dynamic_lookup -undefined dynamic_lookup -o pb.so /var/folders/8z/77k5pp_95pg0blphwr1bmqk00000gn/T/pb-d0a484.o -lSystem /Volumes/Mac/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/clang/9.0.0/lib/darwin/libclang_rt.osx.a
ld: can't write output file: pb.so for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
yhh:lua-protobuf watl$

Decode error after encoding a nested repeated message whose fields are all unset

proto definition

syntax="proto3";

message MessageA
{
    int32 field = 1;
}

message MessageB
{
    repeated MessageA field = 1;
}

json

{
    "field" : [
        {
            
        }
    ]
}

Lua code

local file = io.open("test.json", "rb")
local json_bytes = file:read("*a")
file:close()
local obj = json.decode(json_bytes)

proto_bytes = pb.encode("MessageB", obj)
print(#proto_bytes)

local newObj, message = pb.decode("MessageB", proto_bytes)
if newObj ~= nil then
    print(newObj)
else
    print(message)
end

Output

1
invalid bytes length: 0 (at offset 2)

proto2 default not parsed correctly

Attempting to load a proto2 syntax .proto file with the following field:

message Foo {
   // some fields here

   optional int32 defaulted_num = 10 [ default = 777 ];
}

does not work. When I subsequently try to serialize to the specified type (not using the defaulted_num field), it fails with a 'type Foo not exists' message.

also: would you mind correcting that to 'type Foo does not exist'?

also: shouldn't protoc:loadfile() fail when it can't load a type? I don't think it does, at least not in a way I can see.

pb_nextentry has a bug

PB_API int pb_nextentry(pb_Table *t, pb_Entry **pentry) {
    size_t i = *pentry ? pbT_offset(*pentry, t->hash) : 0;
    size_t size = t->size*t->entry_size;
    while (i += t->entry_size, i < size) {
        pb_Entry *entry = pbT_index(t->hash, i);
        if (entry->key != 0) {
            *pentry = entry;
            return 1;
        }
    }
    *pentry = NULL;
    return 0;
}

If *pentry is NULL, the data at i = 0 can never be fetched: on the first pass through the while loop, i is immediately incremented by t->entry_size, so an entry tagged with number 8 will not be found.

Multiple contexts

Could you add the ability for multiple pb contexts within a single lua state?
Ideally the API would be to make everything a method that passes the context around.
e.g.

local pb_ctx = require "pb".new()
local pb_compiler = require "protoc".new(pb_ctx)
assert(pb_compiler:load(someproto))
local mytype = pb_ctx:type("Hello")


local pb_ctx2 = require "pb".new()
local pb_compiler2 = require "protoc".new(pb_ctx2)
assert(pb_compiler2:load(someproto)) -- where someproto contains a different/conflicting defintition of the same types.
local mytype2 = pb_ctx:type("Hello")

inconsistent proto3 enum handling (some fields decoded as names, other as values)

Given the following proto3 syntax type:

message Foo {
  enum Color {
    RED = 0; // 0 must be present, is the default, and must be first in list
    GREEN = 1;
    BLUE = 2;
  }
  repeated Color color_enumf = 1;

and the following input:

 {
   color_enumf = { 2, "RED", 34, "z" }
}

After encoding and decoding, the output looks as follows:

{
  color_enumf = { "BLUE", 0, 34 }
}

The inconsistency is that an input of 2 (or "BLUE") appears in the output as "BLUE" (this is good), but an input of 0 (or "RED") appears in the output as 0. So some fields get translated to enum names, others to enum values.

(to be clear, the other two inputs behave as I expect: 34 stays 34, "z" is ignored; both of these are reasonable).

uint32 field coredump

message testmessage
{
optional string account = 1;
optional string password = 2;
optional uint32 deviceid = 3;
optional string test = 4;
}

Changing uint32 to int32 is ok.


protoc:load has a bug

Loading the following text produces an error.

Call stack at the time of the error:
LuaException: protoc.lua:10: attempt to index a function value (local 't')
stack traceback:
protoc.lua:10: in upvalue 'default'
protoc.lua:695: in local 'body_parser'
protoc.lua:713: in local 'top_parser'
protoc.lua:756: in function 'protoc.parse'
protoc.lua:992: in function 'protoc.compile'
protoc.lua:1004: in function 'protoc.load'
protobuf3.lua:53: in main chunk
[C]: in function 'require'
ProtobufTest.lua:2: in main chunk
[C]: in function 'require'

syntax = "proto3";
package protocol;
//import "pb_common.proto";
//import "pb_login.proto";

message get_ping {
}

message get_ping_ret {
}

enum register_result {
enumZero = 0;
registerSuccess = 1; // registration succeeded
userExists = 2; // username already exists
invalidFormat = 3; // invalid username format
invalidServer = 6; // failed to connect to the game server
}

// register an account on the login server
message login_register{
string username = 1; // username
string password = 2; // password
}

// reply to login-server account registration
message login_register_ret {
register_result result = 1; // registration result
int32 uid = 2; // user id, returned on successful registration
string token = 3; // login token, returned on successful registration
string gameServer = 4; // game server address
string msg = 5; // feedback message, returned on login failure
}

message C2S {
int32 seq = 1; // message sequence number

//----------------- common interface ---------------
get_ping get_ping = 3;

//----------------- login module interface ---------------
login_register login_register = 6;

}

message S2C {
int32 seq = 1;
//----------------- common interface ---------------
get_ping_ret get_ping_ret = 3;

//----------------- login server interface ---------------
login_register_ret login_register_ret = 6;

}

// rpc forwarding between services
service RPCService {
rpc request(C2S) returns(S2C);
}

Random file created after usage

I'm trying to figure out why the following code produces a directory and file: lua-protobuf/pb.gcda

local socket = require "socket"
local schema = require "utils.net.schema"

package.cpath = "./src/lib/?.so"
local pb = require "pb"
local protoc = require "lib.protoc"

pb.load(assert(protoc.new():compile(schema)))
local sock = socket.udp()

sock:settimeout(nil)
sock:setpeername("%s", %i)

while true do
	local sendMsg = love.thread.getChannel('net.queue.outgoing'):demand() -- blocks until receive
	if sendMsg then
		sock:send(assert(pb.encode("schema.Packet", {
			TypeOf = sendMsg.typeof,
			Body = assert(pb.encode(sendMsg.typeof, sendMsg.body)),
		})))
	end
end

This also seems to happen with the compilation method shown in the example.

LuaJIT support?

Is LuaJIT support at all possible? I am trying to use this library for the net communication in a Love2D-based game.

default is not supported

assert(protoc:load [[
message Phone {
optional string name = 1;
optional int64 phonenumber = 2[default=0];
}
message Person {
optional string name = 1;
optional int32 age = 2;
optional string address = 3;
repeated Phone contacts = 4;
} ]])

error:
function[lfdfd] parameter start:
value: table
value: table
value: table
value: table
value: table
value: table
string: [script/protoc.lua:998: bad argument #2 to '?' (string expected at field 'default_value', got number)]
----parameter end----

Error when deserializing a binary stream produced by C#

Serializing a proto message with a repeated int32 field to a binary stream from C#, then deserializing it with lua-protobuf, produces this error:

type mismatch at offset 2, varint expected for type int32, got bytes

proto file definition:

syntax="proto3";

message MyMessage
{
    repeated int32 intList = 1;
}

C# code:

using System.IO;

public class Application
{
    public static int Main(string[] args)
    {
        var instance = Google.Protobuf.JsonParser.Default.Parse<MyMessage>(File.ReadAllText("data.json"));
        using (var ms = new MemoryStream())
        using (var cs = new Google.Protobuf.CodedOutputStream(ms))
        {
            instance.WriteTo(cs);
            cs.Flush();
            File.WriteAllBytes("pb.bin", ms.ToArray());
        }

        return 0;
    }
}

data.json

{
    "intList": [
        1,
        2,
        3
    ]    
}

The binary stream produced by C#:

  Offset: 00 01 02 03 04 05 06 07 08 09 0A 0B 0C 0D 0E 0F 	
00000000: 0A 03 01 02 03                                     .....

The Lua code:

local pb = require("pb")
local json = require("rapidjson")

if pb.loadfile("../proto/pb.description") then
    local file = io.open("../csharp/pb.bin", "rb")
    local bytes = file:read("*a")
    file:close()
    local obj, message = pb.decode("MyMessage", bytes)
    if obj ~= nil then
        print(json.encode(obj))
    else
        print(message)
    end
else
    print("pb description load failed")
end

maps with integer keys can be confusing

In the old implementation, a protobuf map was represented in Lua as follows:

local f = {
    { key = "key1", value = "value1" },
    { key = "key2", value = "value2" },
}

After the recent change, the implementation changed, unless I'm mistaken, to directly use table keys:
local f = {
    key1 = "value1",
    key2 = "value2",
}

I was unable to get this new implementation to work with integer keys consistently.
I suspect this is because Lua treats integer keys specially (as array portion of the table, rather than the hash portion of the table). Unless I'm misunderstanding how tables are supposed to be represented...

Basically, encoding and then decoding this:
local obj = {
    map_intstr = {
        [1] = "one",
        [2] = "two",
        [0] = "zero",
        [-1] = "minus one",
    }
}

results in

{
    map_intstr = {
        "one", "two",
        [-1] = "minus one",
        [0] = "zero",
    },
}
Note how the values outside of the normal range of array indices are treated as hash entries (accessed with 'pairs'), and integer values >=1 get treated as array values (accessed with ipairs).

Basically:

  • while using natural Lua syntax for tables seems nice, it has problems
  • the old representation, while less natural/convenient, didn't have the problem, but the new one does.

could you release this code under an Open Source license?

Hi,
I am very interested in using your lua-protobuf code for a project at work.
I also have a number of bugfixes that I'd like to contribute back to lua-protobuf
(e.g., fixes for a few crashes, missing data type support, etc).

Unfortunately, I'm not allowed to to do this unless this project is explicitly licensed under an open source license.

Could you please release it under an Open Source license such as the BSD license, the MIT license, or the Apache license?

(to do it, I think it's a matter of including a LICENSE file with the text of the license, and possibly some mentions of it in the README and the source files).

I really hope you can do this. Thanks in advance!

aliased enums not fully supported, cause a crash

using an enum such as the following:

enum AliasedEnum {
    option allow_alias = true;
    ZERO = 0;
    ONE = 1;
    TWO = 2;
    FIRST = 1;
  }

is problematic. Using non-aliased enum members works fine. Using "FIRST" above works fine as well. Using "ONE" (which FIRST is aliased to) crashes the library. I've not tried multiple aliases, or determined the exact behavior (superficially it seems that the last member using a given value seems to work, and earlier ones don't; this needs to be confirmed).

repeated int32 parsing error

I ran into a strange bug: the data lua-protobuf encodes differs from what Node.js produces, yet Node.js can decode both forms while lua-protobuf cannot.

protoc:load([[
message login_test_res {
    repeated int32 r = 1;
}
]])

print(pb.tohex(pb.encode("login_test_res", {r = {1,2000}})))
-- prints 08 01 08 D0 0F
local backp1 = pb.decode("login_test_res", "\8\1\8\208\15")
-- so the data above decodes fine
-- the data below is what JS produces
local backp = pb.decode("login_test_res", "\10\3\1\208\15")

The JS test is as follows:

proto.load("login.proto", function(err, root) {
    if (err)
        throw err;

    const Login = root.lookupType("login_test_res");
    let message = Login.create(
        {
            "r": [1, 2000]
        }
    )

    let buf = Login.encode(message).finish();
    console.log(`login:$s`, buf.toString("hex"));
    // buf is login:$s 0a0301d00f, i.e. 10,3,1,208,15

    var arr = new Array(8,1,8,208,15);
    var msg = Login.decode(arr)                       // data encoded by Lua decodes fine
    var msg1 = Login.decode(new Array(10,3,1,208,15)) // JS's own data decodes fine
    // but this last data cannot be decoded in Lua

})

I found this is related to [packed=true]: after adding it, lua-protobuf's output is also 10,3,1,208,15.
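A likely explanation, worth verifying against your setup: with proto2 syntax (the default when no syntax line is given), repeated scalar fields are unpacked unless declared otherwise, while proto3 encoders such as protobuf.js pack them by default. Declaring the field packed explicitly should make both sides agree on the wire format (message name taken from the report above):

```proto
message login_test_res {
    // packed: one length-delimited record (0A 03 01 D0 0F)
    // instead of one varint record per element (08 01 08 D0 0F)
    repeated int32 r = 1 [packed = true];
}
```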

enum does not support the alias option

If an enum contains two members with equal numbers, check_enum.check_dup raises an error.
When `option allow_alias = true;` is declared, check_dup should be skipped.

An excerpt of the test .proto was attached as an image (not reproduced here).

'bytes' fields are incorrectly deserialized

Deserializing a 'bytes' field produces a number value instead of a byte buffer.
I've only tried this on a type that was serialized by the same library, so I don't know whether serialization works correctly (superficially it appears to, but I've not examined the wire-type bytes yet).

Proto3

Hi, does this project work with proto3?

extension support problem

I'm having trouble getting nested extensions to serialize correctly.

Given these definitions:

message Person {
   required int32 id = 1;
   required string name = 2;
   optional string email = 3;
   repeated int32 arr = 4;
   extensions 10 to max;
}

extend Person {
   optional int32 test  = 11;
}

I'm able to serialize an instance of Person with the 'test' field with no problems:

local p = {
   id = 3, name = "Alice",
   test = 15
}
pb.encode("Person", p)
-- and then decode, compare the results -- and all is well

HOWEVER: if the extension is instead written as below (replacing the above):

message ExtendedPerson {
  extend Person {
    optional int32 test  = 11;
  } 
}

I'm unable to serialize the message successfully (after deserialization, it comes out as nil):

-- p is same as above
pb.encode("ExtendedPerson", p)

Do I need to set up the table differently for this case? Or is this not supported?

proto3 'Any' type not working

This used to work in the previous implementation (prior to recent changes).
Not sure if the problem is a bug, or if the usage pattern changed, reporting what I encountered.

Given the following proto3 type:

syntax = "proto3";
package foo;
import "google/protobuf/any.proto";

message Bar {
  string   stringf   = 1;
}


message Foo
{
  google.protobuf.Any any = 1;
}

And the following input usage:

local ANY_TYPE = "foo.Bar"
local ANY_TYPE_URL = "type.googleapis.com/foo.Bar"

local prim  = {
   stringf = "this is a string in the any message",
}

local encoded_bar, msg = pb.encode( ANY_TYPE, prim )
local obj = {
   any = {
      type_url = ANY_TYPE_URL,
      value = encoded_bar
   }
}

Encoding and decoding "obj" above produces an empty table.

As I mentioned, the same exact usage pattern worked fine in the previous version of the code
(I'd had to fix the handling of 'bytes' for that, but after that it worked fine).

not able to decode empty table

Decoding an empty message returns nil; it should return an empty table instead.

suggestion:

static int parse_slice(lua_State *L, pb_Slice *slice, pb_Type *t) {
    pb_State *S = default_state(L);
    Context ctx;
    luaL_checkstack(L, 3, "proto nest level too big");
    lua_newtable(L);
    ctx.p.S = S;
    ctx.p.type = t;
    ctx.p.on_field = on_field;
    ctx.p.on_mistype = NULL;
    ctx.p.on_unknown = NULL;
    ctx.L = L;
    if (pb_parse(&ctx.p, slice))
        return 1;
    /* lua_pop(L, 1); */  /* keep the empty table instead of popping it */
    return 1;
}

bug

In function on_field, the branch

case PB_Tuint32: case PB_Tfixed32:

is missing a break; and falls through into the next case.

Using a service

I'm investigating various Lua libraries for use with gRPC.
Using the current git HEAD, I was able to parse a .proto file containing a gRPC service definition.
However I'm not sure how to actually use the service: e.g. pb.type("myservice") doesn't return the service.

For initial testing, I'm using the example service definition from here:

// Copyright 2015 gRPC authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

syntax = "proto3";

option java_multiple_files = true;
option java_package = "io.grpc.examples.helloworld";
option java_outer_classname = "HelloWorldProto";
option objc_class_prefix = "HLW";

package helloworld;

// The greeting service definition.
service Greeter {
  // Sends a greeting
  rpc SayHello (HelloRequest) returns (HelloReply) {}
}

// The request message containing the user's name.
message HelloRequest {
  string name = 1;
}

// The response message containing the greetings
message HelloReply {
  string message = 1;
}

protoc.lua: compilefile fails with "bad argument #2 to '?' (string expected at field 'default_value', got number)"

Tried using the protoc module to load a .proto file; it failed with the error message:

bad argument #2 to '?' (string expected at field 'default_value', got number)

Code snippet:

local protoc = require "protoc"
local pb = require "pb"
local p = protoc.new()
-- set up paths as appropriate
-- p.paths[#p.paths+1] = "foo"
-- p.paths[#p.paths+1] = "bar"
local descriptor = p:compilefile( "relative/path/to/some.proto" )

sub.import_fallback

The self.unknown_import and self.import_fallback fields used in sub.import_fallback are inconsistent with the documentation, which describes unknown_module.

elseif type(self.import_fallback) == 'string' then

if self.unknown_import == true then

Then, you can set some options to compiler, e.g. the search path, the unknown handlers, etc.
p.paths[#p.paths+1] = "whatever/folder/hold/.proto/files"
p.unknown_module = function(self, module_name) ... end
p.unknown_type = function(self, type_name) ... end
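A sketch of what actually works against the current sources, assuming the field names found in protoc.lua today (unknown_import / unknown_type) rather than the documented unknown_module; the error messages are mine:

```lua
local protoc = require "protoc"   -- lua-protobuf's pure-Lua compiler
local p = protoc.new()
p.paths[#p.paths+1] = "whatever/folder/hold/.proto/files"

-- hook called when an imported module cannot be found
p.unknown_import = function(self, module_name)
    error(("import not found: %s"):format(module_name))
end

-- hook called when a type name cannot be resolved
p.unknown_type = function(self, type_name)
    error(("type not found: %s"):format(type_name))
end
```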

pb.dll, pb.so

Hi, is there a precompiled pb.dll available?
Could you write a blog post on how to build it?

Error on import: fields are missing from the message

local pb = require "pb"
local pbio = require "pb.io"
local buffer = require "pb.buffer"
local slice = require "pb.slice"
local conv = require "pb.conv"
local protoc = require "protoc".new()
protoc.paths[#protoc.paths+1]="./protobuf3"

protoc:loadfile("protocol.proto")
print(pb.type("login_test"))
print(pb.type("protocol.C2S"))

for field, number in pb.fields("protocol.C2S") do
    print(field, number)
end

for field, number in pb.fields("protocol.S2C") do
    --print(field, number, "\n")
    print(field, number)
end

1. When loading the attached files, the res_change field cannot be found in S2C and is never printed.
2. Tracing inside protoc shows the field is present, but it is gone after pb.load.

Protobuf3.zip

protoc.lua: some methods not accessible

local protoc = require "protoc"
local p = protoc.new()
local idlfile = "google/protobuf/any.proto"
-- p.paths[#p.paths+1] = ...  -- set to wherever the .proto files live
p.include_imports = true

assert(p:parsefile( idlfile ))
assert(p:reload())
assert(p:compilefile( idlfile ))

In the above sequence, the call to parsefile succeeds. For both of the subsequent calls, I get a message such as:

luajit: test_pb.lua:8: attempt to call method 'reload' (a nil value)

And indeed, printing the contents of the table, I see that those methods are missing:

{
  include_imports = true,
  loaded = {},
  paths = { ".",  },
  typemap = {},
  <metatable> = <1>{
    __index = <table 1>,
    __name = "Parser",
    addpath = <function 1>,
    error = <function 2>,
    loaded = {},
    new = <function 3>,
    parse = <function 4>,
    parsefile = <function 5>,
    paths = { "." },
    resolve = <function 6>,
    typemap = {}
  }
}

proto3 aliased enum not working

Given the following proto3 syntax type:

message Foo {
  enum AliasedEnum {
    option allow_alias = true;
    ZERO = 0;
    ONE = 1;
    TWO = 2;
    FIRST = 1;
  }
  repeated AliasedEnum aliased_enumf = 2;
}

and the following input:

 {
  aliased_enumf = { "ZERO", "FIRST", "TWO", 23, "ONE" }
}

encoding and then decoding produces the following:

{
  aliased_enumf = { 0, "TWO", 23, "ONE" }
}

Basically, the entry 'FIRST' is ignored, but should not be.
(note: aliased enums are ambiguous, so it's ok to make a choice to decode a value as either of the names; but ignoring it is clearly a bug).

Require string

The current require string of "pb" conflicts with other libraries (such as https://github.com/Neopallium/lua-pb/)

As your library is named "lua-protobuf", could you change your require string to "protobuf"?

Also perhaps moving "protoc" to "protobuf.compiler" or similar?

protoc:load issue

for i, v in pairs(filelist) do
    local out = loader(v)
    if out then
        local bb = protoc:load(out.text)
        if bb == true then
            print("ok")
        else
            print("failed")
        end
        --print(out.text)
    end
end

Only the first file loads successfully; subsequent files fail. After swapping the file order, it is still only the first one that succeeds.

protoc:loadfile reports an inaccurate error when loading a nonexistent .proto file

Loading a nonexistent .proto file with protoc:loadfile ends up calling the uninitialized info = self.import_fallback(name), so the error message is misleading.

info = self.import_fallback(name)

Test code:

local pb = require("pb")
local protoc = require("protoc")
protoc:loadfile("relative/path/to/not/exists.proto")

Output:
lua: ./protoc.lua:283: attempt to call field 'import_fallback' (a nil value)
stack traceback:
./protoc.lua:283: in function 'parsefile'
./protoc.lua:1001: in function 'compilefile'
./protoc.lua:1011: in function 'loadfile'
demo2.lua:5: in main chunk
[C]: in ?

I suspect there is a memory leak.

Steps to reproduce:
I temporarily added an Lpb_loadfileTest function (adapted from Lpb_loadfile) to make testing easier.

/* == in pb.c ========================== */
int Lpb_loadfileTest(lua_State *L, const char *filename) {
    pb_State *S = default_state(L);
    /* const char *filename = luaL_checkstring(L, 1); */
    size_t size;
    pb_Buffer b;
    pb_SliceExt s;
    int ret;
    FILE *fp = fopen(filename, "rb");
    if (fp == NULL)
        return luaL_fileresult(L, 0, filename);
    pb_initbuffer(&b);
    do {
        void *d = pb_prepbuffsize(&b, BUFSIZ);
        if (d == NULL) { fclose(fp); return luaL_error(L, "out of memory"); }
        size = fread(d, 1, BUFSIZ, fp);
        pb_addsize(&b, size);
    } while (size == BUFSIZ);
    fclose(fp);
    s = lpb_initext(pb_result(&b));
    ret = pb_load(S, &s.base);
    pb_resetbuffer(&b);
    lua_pushboolean(L, ret == PB_OK);
    lua_pushinteger(L, lpb_offset(&s));
    return 2;
}

/* ==================== in main.c ====================== */
#include <stdio.h>
#include <stdint.h>
#include <sys/stat.h>
#include <sys/time.h>
#include <unistd.h>
#include <stdlib.h>
#include <string.h>
#include "lua.h"
#include "lauxlib.h"
#include "lualib.h"

extern int Lpb_loadfileTest(lua_State *L, const char *fileName);

int main()
{
    int i = 0;
    for (i = 0; i < 1000000; i++)
    {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);
        luaopen_pb(L);

        Lpb_loadfileTest(L, "C:/u2test2.pb");

        lua_close(L);

        usleep(1000);

        if (i % 1000 == 0)
        {
            printf("RunIndex:%d\n", i);
        }
    }
    return 0;
}

After compiling and running this, the process memory keeps growing; commenting out the Lpb_loadfileTest call stops the growth. A larger .pb file makes it even more obvious.

proto2 extensions broken

Given the following type definitions:

syntax = "proto2";
package foo;

message Extendable {
  required string     name     = 1;
  extensions 10 to max;
}


extend Extendable {
  optional uint32 rank = 300;
}

message LocalExtended {
  extend Extendable {
    optional string address = 400;
  }
}

and the following input:

{
   name = "George"
}

Encoding/decoding results in an empty table.

Same behavior happens for following inputs that actually try to use fields in the extension (rank, address).

Incompatible with binary files generated by Python

Library versions involved:

  1. lua-protobuf-0.2.0
  2. protobuf-3.5.1 (proto3 syntax)
  3. python-2.7.14
  4. Unity 2017.3.1f1 (.NET 4.6)
  5. slua-1.5.5 (luajit-2.1.0-beta3), with lua-protobuf-0.2.0 compiled in
  6. protoc from protobuf-3.5.1 used to generate the Python and C# code and the FileDescriptorSet (.pb) file

Problem description:
The .proto file is defined as follows:

syntax = "proto3";
package table;
option csharp_namespace="Table";
message example
{
    int32 int_v = 1;
    int64 long_v = 2;
    float float_v = 3;
    double double_v = 4;
    string string_v = 5;
    bool bool_v = 6;
    repeated int32 array_int = 7;
}

message example_collection
{
    string key_field = 1;
    repeated example rows = 2;
}

The data filled in is as follows (described here in Lua table syntax):

exampleCollection = {
    key_field = "int_v",
    rows = {
        {
            int_v = 123,
            long_v = 8589934592,
            float_v = 3.14,
            double_v = 3.1415926,
            string_v = "hello world",
            bool_v = true,
            array_int = { 1, 2, 3, 4, 5 }
        }
    }
}

After generating Python code with protoc, I fill in the data above in Python, encode it, and write it to the file example.bytes; the C# code generated by protoc can parse example.bytes without problems. I then use protoc to generate a FileDescriptorSet (.pb) file, call pb.load in Lua to load it, and call pb.decode to parse example.bytes, which raises the error:

type mismatch at offset 39, varint expected for type int32, got bytes

If I instead fill in the same data with lua-protobuf and encode it to example.bytes, that file can be parsed successfully by both Lua and C#.

gen_bytes.zip

proto3: imported types not working?

Given the following type definitions:
File imported.proto:

syntax = "proto3";
package foo;

message SimpleImported {
  bool     flag     = 13;
  string   stringf   = 14;
}

File simple.proto:

syntax = "proto3";
package foo;

import "imported.proto";

message Foo
{
  foo.SimpleImported imported_simple_proto3 = 8;
}

and the following input:

 {
   imported_simple_proto3 = {
      stringf = "a string",
      flag = true,
   }
}

Encoding and decoding results in an empty table.

The same issue occurs if 'simple.proto' contains a proto2 type

It's possible that the 'Any' issue I just reported is caused by broken imports as well, not sure.
