mozilla-services / lua_sandbox_extensions
Extension packages (sandboxes and modules) for the lua_sandbox project
License: Other
I've just upgraded my Hindsight setup to the latest version with the new ssl module from the new lua_sandbox_extensions repository, and when Hindsight starts, it can no longer start the heka_tcp output sandbox; it outputs the following error message:
1473094428480839671 [error] output.heka_tcp /usr/lib/luasandbox/io_modules/ssl.lua:11: attempt to index global 'table' (a nil value)
1473094428481049377 [error] output_plugins output.heka_tcp lsb_heka_create_output failed
1473094428481063853 [error] output_plugins output.heka_tcp create_output_plugin failed
The offending line in the /usr/lib/luasandbox/io_modules/ssl.lua file is:
local unpack = table.unpack or unpack
I suppose this is due to the updated LuaSec version (the working one was the one from LuaDist: https://github.com/LuaDist/luasec).
Any idea how to fix this issue?
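For reference, a slightly more defensive version of that compat line (a hedged sketch, not the upstream LuaSec fix) avoids indexing `table` when the table library isn't loaded into the module's environment:

```lua
-- Portable unpack: table.unpack on Lua 5.2+, the global unpack on 5.1.
-- Guarded with type() so it does not error when 'table' itself is nil,
-- as can happen in restricted sandbox environments.
local unpack = (type(table) == "table" and table.unpack) or unpack
```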
Hi @trink
Multiple problems, using elasticsearch_bulk_api.lua with discard_on_error = false:
If send_batch() returns -1 or -2, we don't discard, but we also don't set retry = true, so we keep consuming messages (I ended up with a 14G batch file :) ). We should retry or discard.
If we hit a network error, i.e. send_batch() returns -3, we set retry = true. Now, in a future process_message() call, if send_batch() returns 0, have we just skipped/dropped a message? (not sure about this one)
The retry errors are logged at debug level:
https://github.com/mozilla-services/hindsight/blob/7c66c4bd1043366cba1a38f056bc6630b2bfb736/src/hs_output_plugins.c#L429
Is there a way to log from a sandbox at a different level, without killing the sandbox with error()? (having debug/info/notice/warning would be great)
Other issues are:
discard never logs the fact that it discarded a batch, and we can't increment the fail counter by the batch size
on restart we don't reread the batch_count
I'll try to send a PR for the simple issues, but I think we might need a separate process_batch API, because we can't really return a meaningful return code when we perform two totally separate actions.
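To make the cases above concrete, here is a hedged sketch (a hypothetical helper, not the actual elasticsearch_bulk_api.lua code) of how the send_batch() return codes could be mapped so that -1/-2 no longer silently let the batch keep growing:

```lua
-- Hypothetical mapping of send_batch() return codes for
-- discard_on_error = false; -1/-2 are treated as retryable here
-- instead of being ignored, so the batch file cannot grow unbounded.
local retry = false

local function handle_send_result(rv)
    if rv == 0 then
        retry = false  -- batch accepted; start a new batch
    elseif rv == -3 then
        retry = true   -- network error: retry the same batch
    else
        retry = true   -- -1/-2: retry (or discard) instead of
                       -- consuming more messages into the batch
    end
    return retry
end
```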
Your thoughts?
P.S.: Thanks a lot for Hindsight, it really rocks: fast, low memory usage, stable.
See #87
This is in preparation for scaling out the filtering capacity in Hindsight analysis threads.
Hello,
I'm experiencing an important issue when using tcp.lua with lots of client connections.
The plugin crashes and reports this information:
1509130702213999114 [info] input.tcp detaching received: 1 msg: process_message() /usr/share/luasandbox/sandboxes/heka/input/tcp.lua:118: bad argument #1 to 'select' (descriptor too large for set size)
It appears the plugin itself is not directly at fault; the LuaSocket library is.
When Hindsight holds at least FD_SETSIZE open file descriptors, socket.select() returns an error. FD_SETSIZE is defined as 1024 by a macro in glibc (doc).
socket.select() relies on this macro, so when I start a few hundred TCP clients, the total number of open file descriptors quickly goes above 1024 (with my settings, even with no clients connected, Hindsight already holds ~250 fds, so the 1024 value is very limiting, especially for infrastructures that need to connect thousands of clients).
Maybe I could start more (Hindsight) instances to spread the connections, but that's not a good option: I can't be sure they will always be well distributed, and any program could flood an input and make the plugin crash.
This LuaSocket limitation is mentioned in this issue, where @diegonehab suggests using libevent.
Regards.
Since the lua_sandbox only runs in the UTC timezone, a TZ conversion module would be useful for logs that aren't in UTC or for outputting an offset in the log.
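A minimal sketch of what such a module could offer for fixed numeric offsets, using only os.date (assumption: named zones and DST handling would require the zoneinfo database, which this does not cover):

```lua
-- Format a UTC epoch (in seconds) at a fixed UTC offset given in
-- minutes, e.g. -420 for UTC-7. The "!" prefix makes os.date format
-- in UTC, so the shifted epoch renders as the target local time.
local function format_with_offset(epoch_utc, offset_min)
    return os.date("!%Y-%m-%dT%H:%M:%S", epoch_utc + offset_min * 60)
end
```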
This is how I build the kafka RPM package for testing:
[root@~ release]# cmake -DCMAKE_BUILD_TYPE=release -DEXT_bloom_filter=off -DEXT_circular_buffer=off -DEXT_cjson=off -DEXT_compat=off -DEXT_cuckoo_filter=off -DEXT_elasticsearch=off -DEXT_geoip=off -DEXT_heka=off -DEXT_hyperloglog=off -DEXT_lfs=off -DEXT_lpeg=off -DEXT_lsb=off -DEXT_moz_telemetry=off -DEXT_openssl=off -DEXT_parquet=off -DEXT_postgres=off -DEXT_rjson=off -DEXT_sax=off -DEXT_snappy=off -DEXT_socket=off -DEXT_ssl=off -DEXT_struct=off -DEXT_syslog=off -DEXT_systemd=off -DEXT_zlib=off -DEXT_kafka=on -DCPACK_GENERATOR=RPM ..
bloom_filter DISABLED
circular_buffer DISABLED
cjson DISABLED
compat DISABLED
cuckoo_filter DISABLED
elasticsearch DISABLED
geoip DISABLED
heka DISABLED
hyperloglog DISABLED
kafka ENABLED <-------------- only this is enabled
lfs DISABLED
lpeg DISABLED
lsb DISABLED
moz_telemetry DISABLED
openssl DISABLED
parquet DISABLED
postgres DISABLED
rjson DISABLED
sax DISABLED
snappy DISABLED
socket DISABLED
ssl DISABLED
struct DISABLED
syslog DISABLED
systemd DISABLED
zlib DISABLED
-- Configuring done
-- Generating done
-- Build files have been written to: /root/lua_sandbox_extensions/release
[root@~ release]# make
[ 25%] Linking C shared library kafka.so
[ 50%] Built target kafka
[100%] Built target kafka_test_sandbox
[100%] Built target kafka_copy_tests
[root@ansible release]# make packages
CPack: Create package using RPM
CPack: Install projects
CPack: - Run preinstall target for: kafka
CPack: - Install project: kafka
CPack: Create package
CPackRPM: Will use GENERATED spec file: /root/lua_sandbox_extensions/release/_CPack_Packages/Linux/RPM/SPECS/luasandbox-kafka.spec
CPack: - package: /root/lua_sandbox_extensions/release/luasandbox-kafka-1.0.5-Linux.rpm generated.
Then I install the kafka RPM:
[root@~ release]# rpm -ivh luasandbox-kafka-1.0.5-Linux.rpm
error: Failed dependencies:
librdkafka.so.1()(64bit) is needed by luasandbox-kafka-1.0.5-1.x86_64
But I have installed librdkafka with:
git clone https://github.com/edenhill/librdkafka.git
make
make install
and copied librdkafka.so.1 to the lib paths:
cp /usr/local/lib/librdkafka.so.1 /lib64/
cp /usr/local/lib/librdkafka.so.1 /lib/
[root@~ release]# objdump -f /lib64/librdkafka.so.1 | grep ^architecture
architecture: i386:x86-64, flags 0x00000150:
Is there something I missed?
When a message is rejected by librdkafka, the failure counter is incremented but no message is logged.
Note: the callback function rd_kafka_conf_set_dr_cb is deprecated.
The latest version from git contains invalid meta information in the Debian package luasandbox-lpeg-1.0.5-Linux.deb:
Package: luasandbox-lpeg
Version: 1.0.5
Section: devel
Priority: optional
Architecture: amd64
Depends: , luasandbox (>= 1.0.2)
/usr/lib/luasandbox/io_modules/openssl.so: symbol X509_check_host, version libcrypto.so.10 not defined in file libcrypto.so.10 with link time reference
A saved plugin state with a 64 million entry expiring cuckoo filter is core dumping Hindsight on startup.
Some properties must be set at the topic level, cf. the librdkafka documentation.
Currently create_topic doesn't support this.
Please add a statsd output for hindsight.
If one wants to output cbufd format, it seems that a fourth argument set to true needs to be passed to circular_buffer.new().
This is not documented. I was able to figure this out by looking at the unique_items.lua filter.
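For reference, the undocumented call pattern looks like this (a sketch based on reading the unique_items.lua filter; circular_buffer is only available inside a sandbox, and the meaning of the fourth argument is inferred rather than documented):

```lua
require "circular_buffer"

-- rows, columns, seconds_per_row as usual; the undocumented fourth
-- argument enables cbufd (delta) output when set to true.
local cb = circular_buffer.new(1440, 1, 60, true)
```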
Some grammars parse out the variables but don't label the message. The static string should be part of the capture or an appropriate description should be added. https://github.com/mozilla-services/lua_sandbox_extensions/blob/master/syslog/modules/lpeg/linux/sshd.lua
Ideally we may just want to generate the grammars from the source logit calls, e.g.
logit("Bad protocol version identification '%.100s' "
"from %s port %d", client_version_string,
ssh_remote_ipaddr(ssh), ssh_remote_port(ssh));
With this issue I just want to report that building the ssl extension fails with cmake 3.0.2, which is the version shipped with Debian Jessie.
This is the error I get at make time:
root@1b2e35c5cfd3:/workdingdir/lua_sandbox_extensions/release# make
Scanning dependencies of target ep_ssl
[ 11%] Creating directories for 'ep_ssl'
[ 22%] Performing download step (git clone) for 'ep_ssl'
-- Avoiding repeated git clone, stamp file is up to date: '/workdingdir/lua_sandbox_extensions/release/ssl/ep_ssl-prefix/src/ep_ssl-stamp/ep_ssl-gitclone-lastrun.txt'
[ 33%] No patch step for 'ep_ssl'
[ 44%] Performing update step for 'ep_ssl'
[ 55%] Performing configure step for 'ep_ssl'
Not searching for unused variables given on the command line.
CMake Error at /usr/share/cmake-3.0/Modules/FindOpenSSL.cmake:293 (list):
list GET given empty list
Call Stack (most recent call first):
CMakeLists.txt:41 (find_package)
CMake Error at /usr/share/cmake-3.0/Modules/FindOpenSSL.cmake:294 (list):
list GET given empty list
Call Stack (most recent call first):
CMakeLists.txt:41 (find_package)
CMake Error at /usr/share/cmake-3.0/Modules/FindOpenSSL.cmake:296 (list):
list GET given empty list
Call Stack (most recent call first):
CMakeLists.txt:41 (find_package)
CMake Error at /usr/share/cmake-3.0/Modules/FindOpenSSL.cmake:298 (list):
list GET given empty list
Call Stack (most recent call first):
CMakeLists.txt:41 (find_package)
-- Found OpenSSL: /usr/lib/x86_64-linux-gnu/libssl.so;/usr/lib/x86_64-linux-gnu/libcrypto.so (found version ".0.0")
-- Configuring incomplete, errors occurred!
See also "/workdingdir/lua_sandbox_extensions/release/ssl/ep_ssl-prefix/src/ep_ssl-build/CMakeFiles/CMakeOutput.log".
ssl/CMakeFiles/ep_ssl.dir/build.make:103: recipe for target 'ssl/ep_ssl-prefix/src/ep_ssl-stamp/ep_ssl-configure' failed
make[2]: *** [ssl/ep_ssl-prefix/src/ep_ssl-stamp/ep_ssl-configure] Error 1
CMakeFiles/Makefile2:1005: recipe for target 'ssl/CMakeFiles/ep_ssl.dir/all' failed
make[1]: *** [ssl/CMakeFiles/ep_ssl.dir/all] Error 2
Makefile:86: recipe for target 'all' failed
make: *** [all] Error 2
It looks like cmake 3.0.2 is not able to determine the OpenSSL version (found version ".0.0" does not look correct).
I tried with cmake 3.5.2, downloaded from the cmake website, and it works. So the problem is related to cmake 3.0.2 (and possibly other cmake versions between 3.0.2 and 3.5.2).
I am creating this issue mainly for posterity. One thing that could be done, though, is to update the required cmake version in the README.
Appears to be related to: https://bugzilla.redhat.com/show_bug.cgi?id=1057388
==6225== 2,597,440 bytes in 81,170 blocks are possibly lost in loss record 1,242 of 1,243
==6225== at 0x4C2B975: calloc (vg_replace_malloc.c:711)
==6225== by 0xEABDC89: ??? (in /usr/lib64/libnsspem.so)
==6225== by 0xEAAF102: ??? (in /usr/lib64/libnsspem.so)
==6225== by 0xEAB2EB8: ??? (in /usr/lib64/libnsspem.so)
==6225== by 0xEAB7C3A: ??? (in /usr/lib64/libnsspem.so)
==6225== by 0x90B8C01: ??? (in /usr/lib64/libnss3.so)
==6225== by 0x90BAA32: PK11_CreateGenericObject (in /usr/lib64/libnss3.so)
==6225== by 0x77C7488: ??? (in /usr/lib64/libcurl.so.4.3.0)
==6225== by 0x77C7544: ??? (in /usr/lib64/libcurl.so.4.3.0)
==6225== by 0x77FEF2B: ??? (in /usr/lib64/libcurl.so.4.3.0)
==6225== by 0x77F57ED: ??? (in /usr/lib64/libcurl.so.4.3.0)
==6225== by 0x77CDCBC: ??? (in /usr/lib64/libcurl.so.4.3.0)
Currently it returns false, but this can be misinterpreted as 'already exists'.
Systemd (Debian 8 here) has session IDs starting with c plus an integer, instead of a simple integer.
Apr 27 13:45:01 aragorn sshd[16723]: pam_unix(sshd:session): session opened for user root by (uid=0)
Apr 27 13:45:01 aragorn systemd-logind[61554]: New session c43046 of user root.
Apr 27 13:45:01 aragorn sshd[16723]: Starting session: command for root from 192.168.3.53 port 30874
Apr 27 13:45:01 aragorn sshd[16723]: pam_unix(sshd:session): session closed for user root
Apr 27 13:45:01 aragorn systemd-logind[61554]: Removed session c43046.
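A hedged LPeg sketch (a hypothetical pattern, not the module's actual grammar) that accepts both the plain-integer and the c-prefixed session ids shown above:

```lua
local l = require "lpeg"

-- Session id: an optional leading "c" (systemd on Debian 8)
-- followed by one or more digits; captures the whole id.
local session_id = l.C(l.P"c"^-1 * l.R"09"^1) * -1
```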
Create a lua wrapper for https://github.com/aws/aws-sdk-cpp
e.g. to allow approximate lookup of IP address locations
Is there a plan to add SSL Support for elasticsearch output?
Some products can send syslog via tcp, which is a bit more reliable.
==7142== Conditional jump or move depends on uninitialised value(s)
==7142== at 0xADACA67: buffer_meth_receive (in /usr/lib/luasandbox/io_modules/ssl.so)
==7142== by 0x505636F: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x50668FE: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x50567EC: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x5055AEA: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x505695A: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x504EC77: lua_pcall (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0xA994F79: ??? (in /usr/lib/luasandbox/io_modules/socket/core.so)
==7142== by 0x505636F: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x5066989: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x50567EC: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x5055AEA: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142==
==7142== Use of uninitialised value of size 8
==7142== at 0x5060A1C: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x504E3B9: lua_pushlstring (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x504F17E: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x504FEE8: luaL_pushresult (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0xADAC749: buffer_meth_receive (in /usr/lib/luasandbox/io_modules/ssl.so)
==7142== by 0x505636F: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x50668FE: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x50567EC: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x5055AEA: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x505695A: ??? (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0x504EC77: lua_pcall (in /usr/lib64/libluasandbox.so.1.2.8)
==7142== by 0xA994F79: ??? (in /usr/lib/luasandbox/io_modules/socket/core.so)
...
1511394110907382351 [info] hindsight exiting
==7142== Thread 1:
==7142== Invalid free() / delete / delete[] / realloc()
==7142== at 0x4C2ACDD: free (vg_replace_malloc.c:530)
==7142== by 0x5F0EBDB: __libc_freeres (in /usr/lib64/libc-2.17.so)
==7142== by 0x4A24749: _vgnU_freeres (vg_preloaded.c:77)
==7142== by 0x5DE3E2A: __run_exit_handlers (in /usr/lib64/libc-2.17.so)
==7142== by 0x5DE3EB4: exit (in /usr/lib64/libc-2.17.so)
==7142== by 0x5DCCB1B: (below main) (in /usr/lib64/libc-2.17.so)
==7142== Address 0x61653d0 is 0 bytes inside data symbol "noai6ai_cached"
The use case is for fast de-duping of a data stream to address https://bugzilla.mozilla.org/show_bug.cgi?id=1342111
In the grammar for common_log_format, request_method is restricted to GET/POST/HEAD/PUT/DELETE/OPTIONS/TRACE/CONNECT. These are the mandatory supported methods for an HTTP server, but a method can be any token.
Nginx restricts methods to the A-Z range plus _, and Apache allows any case-sensitive token, with methods registered by Script (mod_action).
Given RFC 7230 section 3.2.6, request_method should be (l.alnum + l.S"!#$%&'*+-.^_`|~")^1
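The proposed pattern, written out as a self-contained LPeg sketch (using explicit character ranges instead of the module's l.alnum locale pattern):

```lua
local l = require "lpeg"

-- RFC 7230 section 3.2.6: token = 1*tchar, where tchar is any
-- alphanumeric character or one of ! # $ % & ' * + - . ^ _ ` | ~
local tchar = l.R("az", "AZ", "09") + l.S"!#$%&'*+-.^_`|~"
local request_method = l.C(tchar^1) * -1
```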
Unable to build lua_sandbox_extensions due to an error during this step:
# UNIX
cmake -DCMAKE_BUILD_TYPE=release -DENABLE_ALL_EXT=true -DCPACK_GENERATOR=TGZ ..
The error:
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
LIBRTKAFKA_LIBRARY
linked by target "kafka" in directory ...lua_sandbox_extensions/kafka
Hi @trink,
I know you added this wiki page a while ago:
https://github.com/mozilla-services/lua_sandbox_extensions/wiki/Third-Party-Sandbox-Extensions
I'm just wondering which you prefer:
-DEXTERNAL_EXT_<ext-name>=<ext-path>
I'm now using the original project, not a fork of a fork: champtar@68fdb46
original PR: #50
Regards
Today I noticed that if I temporarily stop my heka_tcp input plugin, the heka_tcp output plugins that were sending data to that stopped plugin terminate quickly, with the following error in the logs:
1484245022748499356 [error] output.heka_tcp terminated: process_message() instruction_limit exceeded
Note that my heka_tcp input/output plugins are configured with SSL support.
This is my test.lua:
local path = "/usr/lib/luasandbox/modules"
local m_package_path = package.path
package.path = string.format("%s;%s/?.lua;", m_package_path, path)
local clf = require "lpeg.common_log_format"
local log_format = '$remote_addr - $remote_user [$time_local] "$request" $status $body_bytes_sent "$http_referer" "$http_user_agent" "$http_x_forwarded_for" "$upstream_addr"'
local grammar = clf.build_apache_grammar(log_format)
local data = '10.100.27.90 - - [22/Dec/2016:14:32:22 +0800] "GET /index HTTP/1.1" 200 10057 "http://www.nxin.com" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36" "-" "10.219.18.10:8080"'
local fields = grammar:match(data)
print(fields)
Console output:
test.lua:11: attempt to index local 'fields' (a nil value)
stack traceback:
test/test_access_grammer.lua:11: in main chunk
[C]: ?
This version adds IPv6 country lookup
Both rjson and snappy require libc6 version >= 2.14 on Debian: this requirement prevents installation on Debian Wheezy (oldstable), which is still in use.
Is there any reason for this restrictive requirement? Could we at least downgrade the requirement to 2.13 so that these plugins can be installed on Debian Wheezy?
See:
https://github.com/mozilla-services/lua_sandbox_extensions/blob/master/rjson/CMakeLists.txt#L21
https://github.com/mozilla-services/lua_sandbox_extensions/blob/master/snappy/CMakeLists.txt.snappy#L18
Hi,
I'm not sure if it's a configuration mistake on my part, but I can't get the syslog extension working with a simple message.
Here is the TCP payload sent to my Hindsight instance:
echo -n '<13>2017-10-16T15:00:47.812399+02:00 foo bar: dummy_message' | nc -q0 localhost 5566
And the configuration files:
run/input/syslog_tcp.cfg:
filename = "tcp.lua"
address = "*"
decoder_module = "decoders.syslog"
send_decode_failures = true
decoders_syslog = { }
run/output/debug.cfg:
filename = "heka_debug.lua"
message_matcher = "TRUE"
But syslog decoding always fails... I can see it in the output:
root@node /etc/hindsight # hindsight main.cfg 7
1508164863963884644 [info] hindsight starting
1508164863974031089 [info] input.syslog_tcp starting
1508164863974290796 [info] analysis_plugins starting thread: 0
1508164863974409135 [debug] input_reader analysis0 opened file: /var/opt/hosting/data/hindsight/input/0.log offset: 5750
1508164863976650617 [info] output.debug starting
1508164863976704785 [debug] input_reader output.debug opened file: /var/opt/hosting/data/hindsight/input/0.log offset: 5750
1508164863976731206 [debug] input_reader output.debug opened file: /var/opt/hosting/data/hindsight/analysis/0.log
:Uuid: B68D6CDE-AB4D-4A6C-A65B-518842E55
:Timestamp: 2017-10-16T14:41:18.334463744Z
:Type: error
:Logger: input.syslog_tcp
:Severity: 7
:Payload: parse failed
:EnvVersion: <nil>
:Pid: <nil>
:Hostname: node008.bench.loghosts.m1.p.fti.net
:Fields:
I tried adjusting the syslog template and the message format type, using the standard logger cmd, and even setting a badly formatted template... in all cases I always get the same result: "parse failed".
It's difficult for me to debug: when I add some "print" statements in /usr/share/luasandbox/modules/lpeg/syslog.lua I can't collect more information, since all grammar-related vars are of type "userdata".
Maybe everybody can reproduce the problem... but it seems I'm doing something wrong... or is it really a bug?
Thanks a lot.
The error says it cannot find parquet-cpp:
CMake Error at parquet/CMakeLists.txt:9 (find_package):
Could not find a package configuration file provided by "parquet-cpp" with
any of the following names:
parquet-cppConfig.cmake
parquet-cpp-config.cmake
Add the installation prefix of "parquet-cpp" to CMAKE_PREFIX_PATH or set
"parquet-cpp_DIR" to a directory containing one of the above files. If
"parquet-cpp" provides a separate development package or SDK, be sure it
has been installed.
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
LIBRTKAFKA_LIBRARY
linked by target "kafka" in directory /opt/lua_sandbox_extensions/kafka
-- Configuring incomplete, errors occurred!
See also "/opt/lua_sandbox_extensions/release/CMakeFiles/CMakeOutput.log".
But I did install parquet-cpp following the steps in the readme.md of https://github.com/apache/parquet-cpp:
[root@ansible release]# ll /usr/local/lib/libparquet.so
-rwxr-xr-x 1 root root 14580809 Dec 15 17:39 /usr/local/lib/libparquet.so
As we increase the number of parsers that operate on the syslog payload, we need a better way to incorporate them, e.g. a sub-decoder module that automatically adds its grammars to the hash table.
The ssl module adds a dependency on libssl1.0.0 (>= 1.0.1), see:
https://github.com/mozilla-services/lua_sandbox_extensions/blob/master/ssl/CMakeLists.txt.ssl#L20
But libssl1.0.0 is not available on Debian Stretch (stable) which only has libssl1.0.2 or libssl1.1.
Do you think it could be made compatible with both Debian Jessie and Stretch?
A Hindsight segfault occurred during a Kafka upgrade/repartition on Sunday. This is a best guess as there are no other details on the failure and it was the only unusual activity at the time.
Note: It may simply be the old librdkafka client library we are using.
It should mirror the Heka protobuf decoder behavior. #79 will become the JSON mutating decoder, allowing various types of transformations.