elasticsearch-lua's Introduction

elasticsearch-lua

A low-level client for Elasticsearch, written in Lua.

In accordance with the other official low-level clients, the client accepts parameters as associative arrays in the form of Lua tables.

Features:

  1. One-to-one mapping with REST API and other language clients.
  2. Proper load balancing across all nodes.
  3. Pluggable connection classes, selection strategies, and connection pools.
  4. Console logging facility.
  5. Almost every parameter is configurable.

Elasticsearch Version Matrix

Elasticsearch Version | elasticsearch-lua Branch
>= 2.0, < 5.0         | 2.x.y

Lua Version Requirements

elasticsearch-lua works with Lua >= 5.1.

Setup

It can be installed using LuaRocks:

  [sudo] luarocks install elasticsearch

Documentation

The complete documentation is here.

Create an elasticsearch client instance:

  local elasticsearch = require "elasticsearch"

  local client = elasticsearch.client{
    hosts = {
      { -- Any of the following host parameters may be omitted;
        -- the defaults will be used.
        protocol = "http",
        host = "localhost",
        port = 9200
      }
    },
    -- Optional parameters
    params = {
      pingTimeout = 2
    }
  }
  -- Will connect to default host/port
  local client = elasticsearch.client()

Full list of params:

  1. pingTimeout : The timeout of a connection for ping and sniff requests. Default is 1.
  2. selector : The selection strategy to be used. Default is RoundRobinSelector.
  3. connectionPool : The connection pool to be used. Default is StaticConnectionPool.
  4. connectionPoolSettings : The connection pool settings.
  5. maxRetryCount : The maximum number of times to retry if a particular connection fails.
  6. logLevel : The level of logging to be done. Default is warning.
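Taken together, a client configured with several of these params might look like the sketch below. The numeric values are illustrative, and passing selector and connectionPool by their string names is an assumption based on the defaults listed above:

```lua
local elasticsearch = require "elasticsearch"

-- Illustrative configuration; the values are not recommendations.
local client = elasticsearch.client{
  hosts = {
    { protocol = "http", host = "localhost", port = 9200 }
  },
  params = {
    pingTimeout = 2,                         -- default is 1
    selector = "RoundRobinSelector",         -- assumed string form; default strategy
    connectionPool = "StaticConnectionPool", -- assumed string form; default pool
    maxRetryCount = 3,
    logLevel = "warning"                     -- default level
  }
}
```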

Standard call

local param1, param2 = client:<func>()

param1: Stores the data returned or nil on error

param2: Stores the HTTP status code on success or the error message on failure

Getting info of elasticsearch server

local data, err = client:info()
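Following the param1/param2 convention above, a defensive call can check both return values (a sketch; requires a running Elasticsearch node):

```lua
local data, err = client:info()
if data ~= nil then
  -- on success, the second value carries the HTTP status code
  print("HTTP status: " .. tostring(err))
else
  -- on failure, data is nil and err carries the error message
  print("request failed: " .. tostring(err))
end
```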

Index a document

Everything is represented as a Lua table.

local data, err = client:index{
  index = "my_index",
  type = "my_type",
  id = "my_doc",
  body = {
    my_key = "my_param"
  }
}

Get a document

data, err = client:get{
  index = "my_index",
  type = "my_type",
  id = "my_doc"
}

Delete a document

data, err = client:delete{
  index = "my_index",
  type = "my_type",
  id = "my_doc"
}

Search a document

You can search for documents using either a query string:

data, err = client:search{
  index = "my_index",
  type = "my_type",
  q = "my_key:my_param"
}

Or a request body:

data, err = client:search{
  index = "my_index",
  type = "my_type",
  body = {
    query = {
      match = {
        my_key = "my_param"
      }
    }
  }
}

Update a document

data, err = client:update{
  index = "my_index",
  type = "my_type",
  id = "my_doc",
  body = {
    doc = {
      my_key = "new_param"
    }
  }
}

Contribution

Feel free to file issues and submit pull requests – contributions are welcome. Please try to follow the code style used in the repository.

License

elasticsearch-lua is licensed under the MIT license.

elasticsearch-lua's People

Contributors: criztianix, dhavalkapil, ignacio, kraftman, neilcook, pmusa

elasticsearch-lua's Issues

Benchmarking

The performance of the client should be compared with other official clients.

Endpoint "ping" not working

A call to ping returns an error because it tries to parse the body of the response to a HEAD request. Maybe we should verify self:getMethod before decoding the body.

Code to reproduce error:
local es = require"elasticsearch"
local c = es:client()
c:ping()

lua: /usr/local/share/lua/5.3/endpoints/Endpoint.lua:99: Expected value but found T_END at character 1
stack traceback:
[C]: in function 'cjson.decode'
/usr/local/share/lua/5.3/endpoints/Endpoint.lua:99: in function 'endpoints.Endpoint.request'
/usr/local/share/lua/5.3/Client.lua:52: in function 'Client.requestEndpoint'
/usr/local/share/lua/5.3/Client.lua:77: in function 'Client.ping'
es.lua:7: in main chunk
[C]: in ?

what happened? Msearch.lua

luarocks install --server=http://luarocks.org/manifests/dhavalkapil elasticsearch
Warning: falling back to curl - install luasec to get native HTTPS support
Installing http://luarocks.org/manifests/dhavalkapil/elasticsearch-scm-0.rockspec...
Using http://luarocks.org/manifests/dhavalkapil/elasticsearch-scm-0.rockspec... switching to 'build' mode
Cloning into 'elasticsearch-lua'...
remote: Counting objects: 348, done.
remote: Compressing objects: 100% (190/190), done.
remote: Total 348 (delta 245), reused 215 (delta 154), pack-reused 0
Receiving objects: 100% (348/348), 115.52 KiB | 142.00 KiB/s, done.
Resolving deltas: 100% (245/245), done.
Checking connectivity... done.
stat: cannot stat ‘src/elasticsearch/endpoints/Msearch.lua’: No such file or directory

Error: Build error: Failed installing src/elasticsearch/endpoints/Msearch.lua in /usr/local/openresty/luajit/lib/luarocks/rocks/elasticsearch/scm-0/lua/elasticsearch/endpoints/Msearch.lua

luarocks install elasticsearch not working

I tried installing elasticsearch using the command luarocks install elasticsearch, but every time I got this error:

luarocks install elasticsearch

Installing http://luarocks.org/repositories/rocks/elasticsearch-1.0.0-1.src.rock...

Missing dependencies for elasticsearch:
lua-cjson

cl /MD /O2 -c -Folua_cjson.obj -IC:/Program Files (x86)/Lua/5.1/include lua_cjson.c -DDISABLE_INVALID_NUMBERS -DUSE_INTERNAL_ISINF
'cl' is not recognized as an internal or external command,
operable program or batch file.

Error: Failed installing dependency: http://luarocks.org/repositories/rocks/lua-cjson-2.1.0.6-1.src.rock - Build error: Failed compiling object lua_cjson.obj
I tried installing the missing dependency using "luarocks install lua-cjson" but it could not install the dependency. @pmusa @DhavalKapil

Documentation

Apart from the documentation generated by ldoc, it is important that proper guides be written.

Stress Testing

The client should handle all kinds of errors and exit gracefully.

Potential module naming conflicts

Your rockspec defines a bunch of top level modules:

    ["elasticsearch"] = "elasticsearch/elasticsearch.lua",
    ["Settings"] = "elasticsearch/Settings.lua",
    ["Logger"] = "elasticsearch/Logger.lua",
    ["parser"] = "elasticsearch/parser.lua",
    ["helpers"] = "elasticsearch/helpers.lua",
    ["Client"] = "elasticsearch/Client.lua",
    ["Cluster"] = "elasticsearch/Cluster.lua",
    ["Nodes"] = "elasticsearch/Nodes.lua",
    ["Indices"] = "elasticsearch/Indices.lua",
    ["Transport"] = "elasticsearch/Transport.lua",

This means that if I happen to have a file called helpers.lua, parser.lua, etc. in my project's directory, then any time your code calls require on one of those modules, my file will shadow yours. (This is the case with the default package.path.)

You should scope all the modules you install with the name of your project, for example:

    ["elasticsearch"] = "elasticsearch/elasticsearch.lua",
    ["elasticsearch.Settings"] = "elasticsearch/Settings.lua",
    ["elasticsearch.Logger"] = "elasticsearch/Logger.lua",
    ["elasticsearch.parser"] = "elasticsearch/parser.lua",
    ["elasticsearch.helpers"] = "elasticsearch/helpers.lua",
    -- etc

And you'll also need to update the require calls in your code.
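With scoped names as in the example above, the internal require calls would change along these lines (a sketch):

```lua
-- Before: a top-level name that any user file can shadow
local Settings = require "Settings"

-- After: scoped under the project name
local Settings = require "elasticsearch.Settings"
```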

install error

luarocks install --server=http://luarocks.org/dev elasticsearch

cp: cannot stat `src/elasticsearch/Cat.lua': No such file or directory
Error: Build error: Failed installing src/elasticsearch/Cat.lua in /usr/lib/luarocks/rocks/elasticsearch/scm-0/lua/elasticsearch/Cat.lua

bulk operations abort for large sizes

Even though you can construct large bulk bodies, the :bulk API method runs out of memory for inputs well within a reasonable bulk operation size. My records are approximately 300 bytes each; bulk inserts work fine with 512-document bodies, start slowing down drastically at 2048, and somewhere around 4096 begin raising "luajit: not enough memory" errors.

I'm not sure why this happens, although I assume it's because the entire bulk table is serialized into the outgoing JSON TCP stream at once, rather than being written in smaller buffers.
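Until the serialization is streamed, one user-side workaround is to split a large bulk body into several smaller :bulk calls. The helper below is a hypothetical sketch; the chunk size and the assumption that each document contributes an action entry plus a source entry are illustrative:

```lua
-- Hypothetical helper: issue several smaller :bulk requests instead of
-- one huge one. Assumes `body` is a flat list where each document is an
-- action entry followed by a source entry.
local function bulkInChunks(client, body, docsPerChunk)
  docsPerChunk = docsPerChunk or 512  -- size reported above to work reliably
  local step = 2 * docsPerChunk       -- two entries per document
  for i = 1, #body, step do
    local chunk = {}
    for j = i, math.min(i + step - 1, #body) do
      chunk[#chunk + 1] = body[j]
    end
    local data, err = client:bulk{ body = chunk }
    if data == nil then
      return nil, err  -- propagate the first failure
    end
  end
  return true
end
```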

Unit Testing

Extensive unit testing for all endpoints is needed. (Tests are already present for the rest of the codebase)

  • HTTP method needs to be tested for different combinations of parameters
  • HTTP url needs to be tested for different combinations of parameters
  • Add a Mock Transport class to make this possible; it will intercept the transport's request function and deep-check every parameter for correctness.
  • Cover every endpoint

Tests should encompass everything so as to prevent issues like #15. Tests should even check the requests that are made. Checking input/output is not sufficient.

'params' are not cleared between different client requests

es = require "elasticsearch"

c = es.client({ params = { logLevel = 'trace'}})

c:search({ scroll = '1m', body = { query = {match_all = {}} } })
c:get({ index = 'stack', type = 'question', id = 'a'})

The last get request is something like GET stack/question/a?scroll=1m
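A likely fix (an assumption about the library internals) is to give each request its own copy of the params table instead of mutating a shared one, so keys like scroll cannot leak into later requests:

```lua
-- Shallow-copy helper: each request works on its own params table.
local function copyParams(params)
  local copy = {}
  for k, v in pairs(params or {}) do
    copy[k] = v
  end
  return copy
end

local shared = { logLevel = "trace" }   -- client-wide defaults
local reqParams = copyParams(shared)
reqParams.scroll = "1m"                 -- set for this request only

print(shared.scroll)     -- nil: the shared defaults stay clean
print(reqParams.scroll)  -- "1m"
```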

Integration Testing

Apart from testing of every component individually, it is equally important that they work together while interacting with each other.

Some big dataset should be chosen for testing. It makes no sense to include the dataset within the repository; it would be better to download it from an external source on the fly.

Tests would be layered. Special command line options would allow for different types of testing.

Reindex should expect host instead of client.

To make it very easy to use, we should create and destroy the client inside the reindex function.

function helpers.reindex(sourceClient, sourceIndex, targetIndex, query, targetClient, scroll)
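The proposal, sketched below, would let callers pass host descriptions, with the function creating (and later discarding) the clients itself. The signature and internals are hypothetical, not the current API:

```lua
-- Proposed shape (hypothetical): hosts in, clients managed internally.
function helpers.reindex(sourceHost, sourceIndex, targetIndex, query, targetHost, scroll)
  local sourceClient = elasticsearch.client{ hosts = { sourceHost } }
  local targetClient = targetHost ~= nil
    and elasticsearch.client{ hosts = { targetHost } }
    or sourceClient
  -- ... existing scroll-and-bulk reindex logic, using the two clients ...
end
```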

allowedParams should be a table and not an array.

Instead of the following, which forces a loop to verify membership:

Scroll.allowedParams = {
  "scroll",
  "scroll_id"
}

we should have the following, in which the lookup is direct:

Scroll.allowedParams = {
  ["scroll"] = true,
  ["scroll_id"] = true
}
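The difference in lookup cost can be seen in plain Lua: the array form needs a linear scan per check, while the table form is a single indexing operation:

```lua
-- Array form: O(n) membership scan
local allowedList = { "scroll", "scroll_id" }
local function isAllowedList(param)
  for _, v in ipairs(allowedList) do
    if v == param then return true end
  end
  return false
end

-- Table (set) form: O(1) direct lookup
local allowedSet = { scroll = true, scroll_id = true }
local function isAllowedSet(param)
  return allowedSet[param] == true
end

print(isAllowedSet("scroll"))   -- true
print(isAllowedList("bogus"))   -- false
```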
