
Jsonrs

A safe NIF-based JSON library for Elixir powered by Rust's Serde. API Documentation.

Installation

Add Jsonrs as a dependency in your mix.exs file.

def deps do
  [{:jsonrs, "~> 0.3"}]
end

To avoid requiring an entire Rust toolchain for projects that use Jsonrs, pre-built binaries are provided via Rustler Precompiled, which will download a platform-specific binary from the appropriate GitHub release during compilation. If you would like to avoid using the pre-built binaries, set the environment variable FORCE_JSONRS_BUILD=true when building.

Summary

Jsonrs is much faster than other JSON libraries for Elixir, including other NIFs like jiffy.

It uses much less memory than other JSON libraries when operating in a mode with comparable features.

It has an API typical of Elixir JSON libraries. decode/1, encode/2, and encode_to_iodata/2 are available along with their raising ! counterparts, and structs can implement the Jsonrs.Encoder protocol to specify custom encoding.

Multiple operating modes allow you to turn off features you don't need for extra performance. In the default 2-pass mode, Jsonrs works like Jason and Poison and honors custom encoding of structs. In the opt-in 1-pass (lean) mode, Jsonrs will ignore custom encoding impls and operate more like jiffy.

Encoder Design

Jsonrs is a 2-pass encoder by default.

The first pass involves mapping the input object according to any custom encodings implemented for types in the input. This is done in Elixir.

The second pass performs the actual encoding of the input to a JSON string. This is done in the Rust NIF.

If custom encoding definitions are not required, the first pass can be disabled entirely by passing lean: true in the encode opts. When operating in this mode, structs will always be encoded as if they were maps regardless of custom encoder implementations.
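A rough sketch of the difference between the two modes, using a Time struct (the default-mode output follows the library's built-in Time encoding; the exact lean-mode field map depends on the struct's contents):

```elixir
# Default 2-pass mode: the Time struct is first mapped through its
# encoder implementation, producing an ISO 8601 string.
Jsonrs.encode!(%{"t" => ~T[12:00:00]})
#=> "{\"t\":\"12:00:00\"}"

# Lean 1-pass mode: encoder implementations are ignored, so the Time
# struct is encoded as a plain map of its fields (calendar, hour, ...).
Jsonrs.encode!(%{"t" => ~T[12:00:00]}, lean: true)
```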

Performance

Jsonrs has different performance characteristics in different modes. In the default 2-pass mode, Jsonrs is comparable feature-wise to Jason or Poison and should always be much faster (~5-10x) and consume less memory (~3-30x) than either of them when working with any non-trivial JSON. In lean mode, Jsonrs is comparable feature-wise to jiffy and should always be faster (~3x) and use less memory (~100x+) than it.

For example, here is a benchmark of Jsonrs's speed and memory consumption compared to other popular JSON libraries when encoding the 8MB issue-90.json from the Poison benchmark data.

[Figure: Speed comparison of JSON libraries]

[Figure: Memory comparison of JSON libraries]

It is easy to see that Jsonrs dramatically outperforms Poison and Jason, while Jsonrs-lean similarly outperforms jiffy.

Quick Start

Jsonrs.decode/1 takes a JSON string and turns it into an Elixir term.

Jsonrs.encode/2 takes an Elixir term and a keyword list of options and turns the input term into a JSON string with behavior defined by the options. Valid options are:

  • lean: disables the first encoder pass (default false)
  • pretty: enables pretty printing when true, or when given a non-negative integer specifying the indentation size (default false)
  • compress: enables streaming compression of the encoded output.
    • Currently you can pass :gzip to get default-level gzip compression
    • You can also pass {:gzip, 0..9} to get a specific level of gzip compression
    • Other compression forms may be accepted in the future

Examples

Basic usage

iex> Jsonrs.encode!(%{"x" => [1, 2], "y" => 0.5})
"{\"x\":[1,2],\"y\":0.5}"

iex> Jsonrs.decode!("{\"x\":[1,2],\"y\":0.5}")
%{"x" => [1, 2], "y" => 0.5}

iex> Jsonrs.encode!(%{"x" => false, "y" => []}, pretty: true) |> IO.puts()
{
  "x": false,
  "y": []
}
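Compression (a sketch based on the compress option above; assuming encode!/2 returns the gzip-compressed binary directly, the result round-trips through Erlang's :zlib):

```elixir
compressed = Jsonrs.encode!(%{"x" => [1, 2]}, compress: :gzip)

# Decompress with the Erlang standard library and decode back to a term.
compressed |> :zlib.gunzip() |> Jsonrs.decode!()
#=> %{"x" => [1, 2]}
```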

Defining a custom encoder

Implement Jsonrs.Encoder for your struct, with an encode/1 function that returns any other type encodable by Jsonrs (it does not need to return a string).

For example:

defimpl Jsonrs.Encoder, for: [MapSet, Range, Stream] do
  def encode(struct), do: Enum.to_list(struct)
end
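With the implementation above in place, any of those types encodes through Enum.to_list/1:

```elixir
# The Range is first mapped to [1, 2, 3] by the custom encoder,
# then encoded as a JSON array.
Jsonrs.encode!(1..3)
#=> "[1,2,3]"
```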

Limitations

Decoding is limited to a nesting depth of 128, a limit inherited from serde_json. This is permitted by RFC 7159, section 9:

An implementation may set limits on the maximum depth of nesting.
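A quick way to observe the limit (a sketch; the exact error term is an assumption, only the failure itself follows from the limit above):

```elixir
# 200 levels of array nesting exceeds the 128-level limit,
# so decoding fails rather than succeeding.
deep = String.duplicate("[", 200) <> String.duplicate("]", 200)
{:error, _reason} = Jsonrs.decode(deep)
```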

Credits

Jsonrs is built on the backs of giants. It connects the Rust libraries serde_rustler and serde_json, exposes them to Elixir through Rustler, and wraps the NIF in an interface inspired by Jason. Without these projects, Jsonrs probably wouldn't exist.


jsonrs's Issues

Jason seems to be faster on recent Elixir version (1.14)

System:

Operating System: macOS
CPU Information: Apple M1
Number of Available Cores: 8
Available memory: 16 GB
Elixir 1.14.0
Erlang 25.0.4
small_list = Enum.to_list(1..100)
big_list = Enum.to_list(1..10_000)
small_map = Enum.map(small_list, fn(x)-> {"key#{x}", x} end) |> Enum.into(%{})
big_map = Enum.map(big_list, fn(x)-> {"key#{x}", x} end) |> Enum.into(%{})


Benchee.run(
  %{
    "jsonrs.encode! - small" => fn -> Jsonrs.encode!(small_map) end,
    "Jason.encode! - small" => fn -> Jason.encode!(small_map) end
  }
)

Benchee.run(
  %{
    "jsonrs.encode! - big" => fn -> Jsonrs.encode!(big_map) end,
    "Jason.encode! - big" => fn -> Jason.encode!(big_map) end
  }
)

RESULT:

Operating System: macOS
CPU Information: Apple M1
Number of Available Cores: 8
Available memory: 16 GB
Elixir 1.14.0
Erlang 25.0.4

Benchmark suite executing with the following configuration:
warmup: 2 s
time: 5 s
memory time: 0 ns
reduction time: 0 ns
parallel: 1
inputs: none specified
Estimated total run time: 14 s

Benchmarking Jason.encode! - small ...
Benchmarking jsonrs.encode! - small ...

Name                             ips        average  deviation         median         99th %
Jason.encode! - small        84.74 K       11.80 μs    ±36.67%       11.33 μs       16.83 μs
jsonrs.encode! - small       25.35 K       39.45 μs    ±56.39%       35.08 μs       94.58 μs

Comparison:
Jason.encode! - small        84.74 K
jsonrs.encode! - small       25.35 K - 3.34x slower +27.65 μs
Operating System: macOS
CPU Information: Apple M1
Number of Available Cores: 8
Available memory: 16 GB
Elixir 1.14.0
Erlang 25.0.4

Benchmark suite executing with the following configuration:
warmup: 2 s
time: 5 s
memory time: 0 ns
reduction time: 0 ns
parallel: 1
inputs: none specified
Estimated total run time: 14 s

Benchmarking Jason.encode! - big ...
Benchmarking jsonrs.encode! - big ...

Name                           ips        average  deviation         median         99th %
Jason.encode! - big         642.38        1.56 ms    ±10.56%        1.50 ms        1.93 ms
jsonrs.encode! - big        478.25        2.09 ms     ±6.31%        2.07 ms        2.36 ms

Comparison:
Jason.encode! - big         642.38
jsonrs.encode! - big        478.25 - 1.34x slower +0.53 ms

Statements from the readme file seem not to reflect those recent developments. Maybe adding a disclaimer about differences on different Erlang VMs / Elixir versions would help?

Cheers and thanks for this library!

Support streaming gzip encoding?

I am loving the approach of your library. At my day job I have a use-case where we create some very large data structures and encode them to JSON, then gzip that JSON. It is pretty common for us to see JSON documents that are 140MB+ in size which get compressed to ~9MB after gzipping.

Would you be open to me extending your library to support an option to do streaming gzip encoding of the JSON as it's being generated? This would reduce my system's memory footprint significantly, since I wouldn't need to hold the 140 MB of JSON in memory; I would only need to hold 9 MB of gzipped JSON.

Update rustler to 0.22.x

Hi,

The newer rustler version fixes building this library on OTP 24; see the related rustler issue.

Could you please update the library's rustler version? Thanks.

Type issue on decode!()

Hello, I'm trying to shift from Jason to Jsonrs but I'm hitting some weird type issues. Dialyzer is going crazy overall. One example is the code below:

@spec deserialize(binary) :: {term, Metadata.t() | nil} | :error
 def deserialize(message) do
    case Jsonrs.decode!(message) do
      %{"route" => route, "reference" => reference, "payload" => payload} ->
        {payload, %Metadata{reference: reference, route: route}}
      %{"d" => payload} ->
        {payload, nil}
      _ ->
        :error
    end
 end

Dialyzer gives me this for the first case clause:

The pattern can never match the type.

Pattern:
%{
  <<114, 101, 102, 101, 114, 101, 110, 99, 101>> => _reference,
  <<114, 111, 117, 116, 101>> => _route,
  <<112, 97, 121, 108, 111, 97, 100>> => _payload
}

Type:
binary()

I don't understand how decode!() can be expecting a binary() type as return for the dialyzer run?

I did look at Jsonrs's typing and everything seems correct; the specs say decode!() should return term, so I'm at a bit of a loss...

Importantly, I had no dialyzer issues at all with Jason, and the only thing that changed in that function was the decode function's module name.

Inner structs don't obey defimpl Encoder rules

Description of problem

When a struct contains an inner struct, both with Jsonrs.Encoder definitions, the Jsonrs.encode function will not pick up on the inner struct's definition.

Test recreating problem

This test fails when it should pass; the inner struct's defimpl is not applied:

  defmodule SomeStruct do
    defstruct [:field, :inner_struct]
  end

  defmodule InnerStruct do
    defstruct [:field, :ignored_field]
  end

  defimpl Jsonrs.Encoder, for: SomeStruct do
    def encode(%{field: f, inner_struct: is}) do
      %{f: f, is: is}
    end
  end

  defimpl Jsonrs.Encoder, for: InnerStruct do
    def encode(%{field: f}) do
      %{f: f}
    end
  end

  test "protocols" do
    is = %InnerStruct{field: "a", ignored_field: "b"}

    val = %SomeStruct{
      field: "a",
      inner_struct: is
    }

    assert ~s({"f":"a"}) == Jsonrs.encode!(is)

    assert ~s({"f":"a","is":{"f":"a"}}) ==
             Jsonrs.encode!(val)
  end

Investigate using binary-backed vector

The largest current memory-inefficiency when doing an encode involves an extra copy across the Rust-to-Elixir boundary to convert the final encoded Rust string to an Elixir binary. This can potentially be avoided by allocating erlang binaries as the backing store of the destination vector, instead of a Rust byte array. The final binary can just be passed directly to the BEAM and used without an additional copy. There's probably some computational overhead with allocating erlang binaries instead of byte arrays to back the vector, which will need to be compared against the advantage of halving the allocations.
I think there's a good chance this ends up being a net positive, but it needs testing.

FORCE_JSONRS_BUILD doesn't seem to work

Hello.

$ FORCE_JSONRS_BUILD=true mix deps.compile jsonrs
==> jsonrs
Compiling 2 files (.ex)

== Compilation error in file lib/jsonrs.ex ==
** (RuntimeError) precompiled NIF is not available for this target: "aarch64-unknown-linux-musl".
The available targets are:
 - aarch64-apple-darwin
 - x86_64-apple-darwin
 - x86_64-unknown-linux-gnu
 - x86_64-unknown-linux-musl
 - arm-unknown-linux-gnueabihf
 - aarch64-unknown-linux-gnu
 - x86_64-pc-windows-msvc
 - x86_64-pc-windows-gnu
    lib/jsonrs.ex:8: (module)
    (elixir 1.14.0) lib/kernel/parallel_compiler.ex:346: anonymous fn/5 in Kernel.ParallelCompiler.spawn_workers/7
could not compile dependency :jsonrs, "mix compile" failed. Errors may have been logged above. You can recompile this dependency with "mix deps.compile jsonrs", update it with "mix deps.update jsonrs" or clean it with "mix deps.clean jsonrs"

Also, how do I go about requesting the aarch64-unknown-linux-musl target? I'm running in an Alpine 3 container on a Mac M1 host.

Thank you.

Incorrect encoding / decoding of nested structs

Example:

iex> foo_struct = %Foo{payload: ~T[12:00:00]}
%Foo{payload: ~T[12:00:00]}
iex> foo_map = %{payload: ~T[12:00:00]}
%{payload: ~T[12:00:00]}
iex> Jsonrs.encode!(foo_struct)
"{\"payload\":{\"calendar\":\"Elixir.Calendar.ISO\",\"hour\":12,\"microsecond\":[0,0],\"minute\":0,\"second\":0}}"
iex> Jsonrs.encode!(foo_map)
"{\"payload\":\"12:00:00\"}"

Unusable with Phoenix Sockets

Hi @benhaney,
We are having some issues using this with Phoenix LiveDashboard, as it's making the views crash.

I set up the project like this:

# config.exs
config :phoenix, :json_library, Jsonrs

And then you go to /dashboard after setting it up and you'll see the loading bar just continue loading.

To debug the issue, you'll have to create a wrapper module like the one below (and add Jason as a dependency):

defmodule TestApp.Jsontest do
  def decode!(input, opts \\ []) do
    IO.puts("####### decode")
    IO.puts("INC VAL:")
    IO.inspect(input)

    IO.puts("JASON:")
    IO.inspect(Jason.decode!(input, opts))
    IO.puts("JSONRS:")
    IO.inspect(Jsonrs.decode!(input, opts))
    IO.puts("####### decode")

    Jsonrs.decode!(input, opts)
  end

  def encode_to_iodata!(input, opts \\ []) do
    IO.puts("####### ENCODE")
    IO.puts("INC VAL:")
    IO.inspect(input)

    IO.puts("JASON:")
    IO.inspect(Jason.encode!(input, opts))
    IO.puts("JSONRS:")
    IO.inspect(Jsonrs.encode!(input, opts))
    IO.puts("####### ENCODE")

    Jsonrs.encode!(input, opts)
  end
end

And pointing the json_library above to it.

Then you'll see some differences between Jason and Jsonrs:

####### ENCODE
INC VAL:
[nil, "26", "phoenix", "phx_reply", %{response: %{}, status: :ok}]
JASON:
"[null,\"26\",\"phoenix\",\"phx_reply\",{\"response\":{},\"status\":\"ok\"}]"
JSONRS:
"[null,\"26\",\"phoenix\",\"phx_reply\",{\"response\":{},\"status\":\"Ok\"}]"
####### ENCODE

It seems like Jsonrs is capitalizing atoms (e.g. :ok becomes "Ok"), which is breaking stuff here and there.

can't decode a charlist

I receive a payload as a charlist from :httpc, so I need to convert it into a string before I can Jsonrs.decode! it. Jason can decode it without the string conversion. Is it worth using Jsonrs when I have this kind of conversion to do?
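A workaround sketch until charlists are supported: convert the charlist to a binary first (or ask :httpc for a binary body up front via its body_format option).

```elixir
# :httpc returns the response body as a charlist by default;
# convert it to a binary before decoding.
body = '{"x": 1}'
body |> to_string() |> Jsonrs.decode!()
#=> %{"x" => 1}
```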

Update benchmarks for OTP 24

OTP 24 includes a new JIT that should greatly increase the performance of non-NIF encoders like Jason and Poison. It will be interesting to see how much of the performance advantage of Jsonrs shrinks with this update.
