
modelop / hadrian


Implementations of the Portable Format for Analytics (PFA)

License: Apache License 2.0

Scala 45.40% Python 48.30% R 5.36% Makefile 0.09% Batchfile 0.08% JavaScript 0.06% HTML 0.71%

hadrian's Introduction

Hadrian: implementations of the PFA specification

As of version 0.8.4, Hadrian, Titus, and Aurelius are available under the Apache License v2.0.

Version 0.8.4

The Portable Format for Analytics (PFA) is a specification for scoring engines: event-based processors that perform predictive or analytic calculations. It is a common language to help smooth the transition from statistical model development to large-scale and/or online production. For a model expressed as PFA to be run against data, an application is required.
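For orientation, a PFA document is a JSON (or YAML) object that declares input and output schemas and an action; as a minimal sketch, here is the same trivial "add 100" engine that appears in one of the issue reports below, written in YAML:

```yaml
# Minimal PFA scoring engine: reads a double, emits that double plus 100.
input: double
output: double
action:
  - {"+": [input, 100]}
```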

Hadrian (API) is Open Data's complete implementation of PFA for the Java Virtual Machine (JVM). Hadrian is designed as a library to be embedded in applications or used as a scoring engine container. To make Hadrian immediately usable, we provide containers that allow Hadrian to be dropped into an existing workflow. Hadrian can currently be used as a standard-input/standard-output process, a Hadoop map-reduce workflow, an actor-based workflow of interacting scoring engines, or as a servlet in a Java Servlet container, including Google App Engine.

Titus (API) is Open Data's complete implementation of PFA for Python. Hadrian and Titus both execute the same scoring engines, but while Hadrian's focus is speed and portability, Titus's focus is on model development. Included with Titus are standard model producers, a PrettyPFA parser for easier editing, a PFA-Inspector commandline for interactive analysis of a PFA document, and many other tools and scripts.

In addition, Aurelius is an R package for producing PFA from the R programming language and Antinous is a sidecar app for building models in any environment where Hadrian can be deployed. These and other tools are included in the Hadrian repository.

See the Hadrian wiki for more information, including installation instructions and tutorials.

Contact [email protected] to see how Hadrian can fit into your environment.

The Roman emperor naming convention is continued from Augustus, Open Data's producer and consumer of the Predictive Model Markup Language (PMML).

hadrian's People

Contributors

bwengals, collinbennett, jpivarski, mahowald-odg, stevenmmortimer


hadrian's Issues

[Aurelius] Add Converters from Model Object to Complete PFA Doc

Currently, models generated with Aurelius cannot be converted directly from their state as an object into a valid PFA document. Is it possible to add S3 wrapper methods for model object types (e.g. glm, lm, etc.) that convert these models directly into a completely valid PFA document? This would save the time spent constructing these simple models by hand from their object components, as is done in the tutorial:

https://github.com/opendatagroup/hadrian/wiki/Aurelius-glm

[Titus] Pandas support

I would like to suggest adding Pandas support, so that record-by-record data copying would no longer be needed.

Syntax error in PrettyPFA example in the wiki

In the example on the wiki page for PrettyPFA, if I try to copy-paste the code into my python terminal, it fails with titus.errors.PrettyPfaException: Parsing syntax error.

I realised this is because the return value in the else condition is accidentally commented out.
It says

  else
    // otherwise, return null (N/A)        null

when it should be

  else
    // otherwise, return null (N/A)        
    null

As a side note (not sure if this is worth a ticket), is there a list somewhere of the dependencies needed for Titus? I only found out through trial and error that I needed avro and ply, and I couldn't see anything about this in the instructions.

Memory leak while using PFA model in a system

We are seeing a memory leak when invoking the PFA model from a system deployed in an OSGi container. The issue is in the org.apache.avro.specific.SpecificData class, due to the CTOR_CACHE cache: its size keeps growing and there is no way to clear or limit it.
Hadrian's PFASpecificData Scala class extends SpecificData and is used while invoking the model.

PFA chain not behaving as expected

I'm in a situation where I have two YAML strings that define valid PFA models and can be evaluated in sequence, but when trying to chain them together, I get unexpected renaming of records which results in a PFASemanticException.

I've constructed a minimal reproducible example:

import yaml
from titus.genpy import PFAEngine
from titus.producer import chain

doc1 = '''
input:
  type: map
  values: double
output:
  type: int

cells:
  variables:
    type:
      type: array
      items:
        type: record
        name: Variable
        fields:
          - name: vname
            type: string
          - name: mean
            type: double

fcns:
  adjust:
    params:
      - v: Variable
      - vars:
          type: map
          values: double
    ret: double
    do:
      - 1

action:
  - let:
      vars:
        a.map:
          - cell: variables
          - fcn: u.adjust
            fill: {vars: input}

  - 1
'''

doc2 = '''
input:
  type: int
output:
  type: int

action:
  - 2
'''

m1 = yaml.load(doc1)
m2 = yaml.load(doc2)
m1['cells']['variables']['init'] = [{'vname': 'var1', 'mean': 0.5},
                                    {'vname': 'var2', 'mean': -0.25}]
y1 = yaml.dump(m1)
y2 = yaml.dump(m2)

pfa1, = PFAEngine.fromYaml(y1)
v = {'var1': 1, 'var2': 2}
x = pfa1.action(v)
pfa2, = PFAEngine.fromYaml(y2)
pfa2.action(x)

returns 2 as expected but

y3 = chain.json([y1, y2], tryYaml=True, verbose=True)

results in

Thu Nov 10 17:01:04 2016 Converting all inputs to ASTs
Thu Nov 10 17:01:04 2016     step 1
Thu Nov 10 17:01:04 2016     step 2
Thu Nov 10 17:01:04 2016 Changing type names to avoid collisions
Thu Nov 10 17:01:04 2016 Verifying that input/output schemas match along the chain
Thu Nov 10 17:01:04 2016 Adding [name, instance, metadata, actionsStarted, actionsFinished, version] as model parameters
Thu Nov 10 17:01:04 2016 Converting scoring engine algorithm
Thu Nov 10 17:01:04 2016     step 1: Engine_112
Thu Nov 10 17:01:04 2016     step 2: Engine_113
Thu Nov 10 17:01:04 2016 Create types for model parameters
Thu Nov 10 17:01:04 2016     step 1:
Thu Nov 10 17:01:04 2016         cell variables
Thu Nov 10 17:01:04 2016 Resolving all types
Thu Nov 10 17:01:04 2016 Converting the model parameters themselves
Thu Nov 10 17:01:04 2016     step 1:
Thu Nov 10 17:01:04 2016         cell variables
Thu Nov 10 17:01:04 2016 Verifying PFA validity


PFASemanticExceptionTraceback (most recent call last)
 in ()
----> 1 y3 = chain.json([y1, y2], tryYaml=True, verbose=True)

/Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages/titus/producer/chain.pyc in json(pfas, lineNumbers, check, name, randseed, doc, version, metadata, options, tryYaml, verbose)
    131     :return: a PFA document representing the chained workflow
    132     """
--> 133     return ast(pfas, check, name, randseed, doc, version, metadata, options, tryYaml, verbose).toJson(lineNumbers)
    134 
    135 def engine(pfas, name=None, randseed=None, doc=None, version=None, metadata={}, options={}, tryYaml=False, verbose=False, sharedState=None, multiplicity=1, style="pure", debug=False):

/Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages/titus/producer/chain.pyc in ast(pfas, check, name, randseed, doc, version, metadata, options, tryYaml, verbose)
    658     if check:
    659         if verbose: sys.stderr.write(time.asctime() + " Verifying PFA validity\n")
--> 660         PFAEngine.fromAst(out)
    661 
    662     if verbose: sys.stderr.write(time.asctime() + " Done\n")

/Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages/titus/genpy.pyc in fromAst(engineConfig, options, version, sharedState, multiplicity, style, debug)
   1445         pfaVersion = titus.signature.PFAVersion.fromString(version)
   1446 
-> 1447         context, code = engineConfig.walk(GeneratePython.makeTask(style), titus.pfaast.SymbolTable.blank(), functionTable, engineOptions, pfaVersion)
   1448         if debug:
   1449             print code

/Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages/titus/pfaast.pyc in walk(self, task, symbolTable, functionTable, engineOptions, pfaVersion)
    919             ufname = "u." + fname
    920             scope = topWrapper.newScope(True, False)
--> 921             fcnContext, fcnResult = fcnDef.walk(task, scope, withUserFunctions, engineOptions, pfaVersion)
    922             userFcnContexts.append((ufname, fcnContext))
    923 

/Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages/titus/pfaast.pyc in walk(self, task, symbolTable, functionTable, engineOptions, version)
   1513             scope.put(name, avroType)
   1514 
-> 1515         results = [x.walk(task, scope, functionTable, engineOptions, version) for x in self.body]
   1516 
   1517         inferredRetType = results[-1][0].retType

/Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages/titus/pfaast.pyc in walk(self, task, symbolTable, functionTable, engineOptions, version)
   3037 
   3038             scope = symbolTable.newScope(True, True)
-> 3039             exprContext, exprResult = expr.walk(task, scope, functionTable, engineOptions, version)
   3040             calls = calls.union(exprContext.calls)
   3041 

/Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages/titus/pfaast.pyc in walk(self, task, symbolTable, functionTable, engineOptions, version)
   2007 
   2008         else:
-> 2009             raise PFASemanticException("parameters of function \"{0}\" do not accept [{1}]".format(self.name, ",".join(map(ts, argTypes))), self.pos)
   2010 
   2011         return context, task(context, engineOptions)

PFASemanticException: PFA semantic error at YAML lines 3-7 (PFA field "action -> 0 -> let -> vars"): parameters of function "a.map" do not accept [
    array(record(Step1_Engine_112_Variable,
                 vname: string,
                 mean: double))
,
    fcn(arg1: record(Variable,
           vname: string,
           mean: double) -> double)
]

In case it's helpful, I'm on macOS Sierra (10.12.1) and

➜  ~ pip show titus
Name: titus
Version: 0.8.4
Summary: Python implementation of Portable Format for Analytics (PFA): producer, converter, and consumer.
Home-page: UNKNOWN
Author: Open Data Group
Author-email: [email protected]
License: UNKNOWN
Location: /Users/brandon/.pyenv/versions/2.7.11/lib/python2.7/site-packages
Requires: avro, ply

Batch processing with PFA

Unsure of where to post this question, I am asking it here. If there is a better forum, let me know, as I could not join Slack.

I know PFA modeling is tied to individual data records. Is there a way to operate on batch data? For example, one of my use cases is loading a CSV into a Spark data frame and doing a sort or groupBy. Is such an operation possible here?

[Hadrian] JSON datum "Inf" does not match type "double"

If I use an "Inf" value in a cell, it works with Titus:

library(aurelius)
pfaDocument = pfa_document(
  input = avro_double,
  output = avro_boolean,
  cells = list(x = pfa_cell(avro_double, Inf)),
  action = expression(
    input < x
  )
)

library(jsonlite)
engine = pfa_engine(pfaDocument)
x = 1
engine$action(x)

But Hadrian gives an error:

f = "/usr/local/src/gdrive/results/pfa/inf_example.pfa"
write_pfa(pfaDocument, file = f, pretty = TRUE)

library(jsonlite)
tmp1 = tempfile(fileext = ".json")
tmp2 = tempfile(fileext = ".json")
write(minify(toJSON(x, auto_unbox = TRUE)), file = tmp1)
cmd = paste0("cd /usr/local/src/gdrive/; touch ", tmp2, "; ",
             "java -jar scripts/hadrian/hadrian-standalone-0.8.1-jar-with-dependencies.jar -i json -o json ",
             f, " ", tmp1, " > ", tmp2)
system(cmd)

I expected Titus and Hadrian to behave identically. Is this a bug in Titus or in Hadrian?
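One plausible explanation (an assumption on my part, not something confirmed by the maintainers): strict JSON has no literal for infinity, so serializers disagree about how to emit it. Python's standard json module, for instance, writes the non-standard token Infinity, which a strict JSON reader rejects even though a lenient one round-trips it:

```python
import json

# json.dumps emits the non-standard token "Infinity" for float("inf");
# a strict JSON parser (or strict Avro/PFA reader) may reject this document.
doc = json.dumps({"x": float("inf")})
print(doc)  # {"x": Infinity}

# Round-tripping only works because Python's own parser is equally lenient:
print(json.loads(doc)["x"])  # inf
```

This would explain why one implementation accepts the cell value and the other reports that the datum does not match type "double".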

randomForest with all numerical values fails because of buildOneTree's implementation

I had been working on converting a randomForest model from R to PFA without success, until I found that buildOneTree seems to be hardcoded for a string-valued leaf. My trees are all numerical.

If I duplicate the function so that it returns list(double = node[[6]]) instead, it all seems to work.

My final R code looks like this:

buildMyTree <- function(tree, whichNode, valueNeedsTag, dataLevels, fieldTypes = NULL) {
    node <- tree[whichNode,]

    if (node[[1]] > 0) {  # branch node
        f <- gsub("\\.", "_", node[[3]])
        dl <- dataLevels[[f]]

        if (is.null(fieldTypes)  ||  is.null(fieldTypes[[f]]))
            t <- avro.type(node[[4]])
        else {
            t <- fieldTypes[[f]]
            if (is.list(t)  &&  "name" %in% names(t))
                t <- t$name
        }

        if (!is.null(dl)  &&  length(dl) == 2  &&  (node[[4]] == 1  ||  node[[4]] == 2  ||  node[[4]] == 0.5)) {
            if (dl[[node[[4]]]] == 0.5)
                val <- FALSE
            else
                val <- dl[[node[[4]]]]
            if (valueNeedsTag) {
                out <- list()
                out[[t]] <- val
                val <- out
            }
            op <- "=="
        }
        else if (!is.null(dl)  &&  length(dl) > 0) {
            l <- length(dl)
            a <- 2^(0:l)
            b <- 2*a
            val <- list()
            for (i in 1:l) {
                if ((node[[4]] %% b[i]) >= a[i])
                    val[length(val) + 1] <- dl[[i]]
            }
            if (valueNeedsTag) {
                out <- list(array = val)
                val <- out
            }
            op <- "in"
        }
        else {
            val <- node[[4]]
            if (valueNeedsTag) {
                out <- list()
                out[[t]] <- val
                val <- out
            }
            op <- "<="
        }

        list(TreeNode =
             list(field = f,
                  operator = op,
                  value = val,
                  pass = buildMyTree(tree, node[[1]], valueNeedsTag, dataLevels, fieldTypes),
                  fail = buildMyTree(tree, node[[2]], valueNeedsTag, dataLevels, fieldTypes)))
    }
    else
        ## the only modified line is the one below
        list(double = node[[6]])   # leaf node
}

aurelius_forest <- list()
for (i in 1:forestObject$ntree) {
  treeTable <- pfa.randomForest.extractTree(forestObject, i, labelVar = TRUE)
  aurelius_forest[[length(aurelius_forest) + 1]] <-
    buildMyTree(treeTable, 1, valueNeedsTag = FALSE, dataLevels = list(), fieldTypes = NULL)$TreeNode
}

[Aurelius-0.8.4] Could not find function "pfa.config"

The tutorial at https://github.com/opendatagroup/hadrian/wiki/Tutorial-Small-Model-R uses the function pfa.config in the second code listing of the section "Model as code".

When copy-pasting that code into the RStudio console with package aurelius v0.8.4 loaded, the error "could not find function "pfa.config"" is returned.

Issuing the command ?pfa.config returns nothing, while issuing the command methods("pfa") returns the following:
[1] pfa.Arima* pfa.HoltWinters* pfa.cv.glmnet* pfa.ets* pfa.forecast* pfa.gbm* pfa.glm*
[8] pfa.glmnet* pfa.ipredknn* pfa.kcca* pfa.kccasimple* pfa.kmeans* pfa.knn3* pfa.knnreg*
[15] pfa.lda* pfa.lm* pfa.naiveBayes* pfa.randomForest* pfa.rpart*

"not a valid symbol name" when using strings as map keys that are not valid symbol identifiers

Everything I've read in the PFA documentation says that the keys in maps can be ANY strings. However, consider the following expression:

        {
          "new" : {
            "1st_4th" : 14.0,
            "10th" : 7.0,
            "Some_college" : 1.0,
            "HS_grad" : 0.0,
            "Preschool" : 15.0,
            "9th" : 10.0,
            "Assoc_acdm" : 6.0,
            "5th_6th" : 13.0,
            "7th_8th" : 8.0,
            "11th" : 5.0,
            "Masters" : 3.0,
            "12th" : 11.0,
            "Doctorate" : 12.0,
            "Prof_school" : 9.0,
            "Bachelors" : 2.0,
            "Assoc_voc" : 4.0
          },
          "type" : {
            "type" : "map",
            "values" : "double"
          }
        }

Hadrian is choking on this, saying that:

"1st_4th" is not a valid symbol name

Hadrian and Titus do not behave the same

I came across this bug when using la.dot, where one input is an array of arrays of ints and the other is an array of doubles. Hadrian complains that java.lang.Integer cannot be cast to java.lang.Double, but Titus evaluates it successfully. As Hadrian and Titus are implementations of the same PFA specification, I believe they should behave the same.

The files necessary for a minimal reproducible example are in this gist. To run it yourself (assuming you have pip installed), simply

git clone https://gist.github.com/8716b19618fee138f984289853d4a48b.git
cd 8716b19618fee138f984289853d4a48b
sh pfa.sh

which should output

Evaluating with Hadrian
Exception in thread "Hadrian-engine-0" java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Double
    at scala.runtime.BoxesRunTime.unboxToDouble(BoxesRunTime.java:119)
    at com.opendatagroup.hadrian.lib.la.package$Dot$$anonfun$applyVector$3$$anonfun$apply$7.apply(la.scala:514)
    at scala.collection.Iterator$class.exists(Iterator.scala:753)
    at scala.collection.AbstractIterator.exists(Iterator.scala:1157)
    at scala.collection.IterableLike$class.exists(IterableLike.scala:77)
    at scala.collection.AbstractIterable.exists(Iterable.scala:54)
    at com.opendatagroup.hadrian.lib.la.package$Dot$$anonfun$applyVector$3.apply(la.scala:514)
    at com.opendatagroup.hadrian.lib.la.package$Dot$$anonfun$applyVector$3.apply(la.scala:514)
    at scala.collection.Iterator$class.exists(Iterator.scala:753)
    at scala.collection.AbstractIterator.exists(Iterator.scala:1157)
    at scala.collection.IterableLike$class.exists(IterableLike.scala:77)
    at scala.collection.AbstractIterable.exists(Iterable.scala:54)
    at com.opendatagroup.hadrian.lib.la.package$Dot.applyVector(la.scala:514)
    at PFA_Engine_1$2.apply(Unknown Source)
    at PFA_Engine_1.action(Unknown Source)
    at PFA_Engine_1.action(Unknown Source)
    at com.opendatagroup.hadrian.standalone.Main$$anonfun$main$1$EngineRunnable$1.action(standalone.scala:196)
    at com.opendatagroup.hadrian.standalone.Main$$anonfun$main$1$EngineRunnable$1$$anonfun$19.apply(standalone.scala:221)
    at com.opendatagroup.hadrian.standalone.Main$$anonfun$main$1$EngineRunnable$1$$anonfun$19.apply(standalone.scala:221)
    at com.opendatagroup.hadrian.standalone.Main$$anonfun$main$1$EngineRunnable$1.run(standalone.scala:232)
    at java.lang.Thread.run(Thread.java:745)

Evaluating with Titus
[1.0, 0.0]
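A workaround (my sketch, not an official fix for this issue) is to coerce every number in the input to a float before handing the data to Hadrian, so that both implementations see genuine doubles:

```python
def to_doubles(x):
    """Recursively convert ints to floats in nested JSON-style data."""
    if isinstance(x, bool):          # bools are ints in Python; leave them alone
        return x
    if isinstance(x, (int, float)):
        return float(x)
    if isinstance(x, list):
        return [to_doubles(v) for v in x]
    if isinstance(x, dict):
        return {k: to_doubles(v) for k, v in x.items()}
    return x

print(to_doubles([[1, 2], [3, 4]]))  # [[1.0, 2.0], [3.0, 4.0]]
```

This only papers over the difference in behavior; whether the int-to-double promotion should happen inside la.dot itself is the question the issue raises.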

Unexpected regex results on Mac OS X using Titus

When running the tests:

$ cd titus
$ python setup.py test

you get 5 failed tests. This only happens on Mac with Titus, and it has to do with differences in the underlying regex libraries. Hadrian uses Joni, which behaves the same on Mac and Linux; Titus uses libc, which does not. The tests that fail are in titus/test/lib/testRegex.py:

  • FAIL: testGroups (test.lib.testRegex.TestLib1Regex)
  • FAIL: testPosix (test.lib.testRegex.TestLib1Regex)
  • FAIL: testfindGroupsFirst (test.lib.testRegex.TestLib1Regex)
  • FAIL: testindexAll (test.lib.testRegex.TestLib1Regex)
  • FAIL: testUnpackBoolean (test.testGenpy.TestGeneratePython)

[Titus] Neural network "math range error" with Titus, works with Hadrian

I ported the example of a neural network to aurelius:

library(aurelius)
tm = avro_typemap(
  Layer = avro_record(list(
    weights = avro_array(avro_array(avro_double)),
    bias = avro_array(avro_double)
  ))
)

pfaDocument = pfa_document(
  input = avro_array(avro_double),
  output = avro_double,
  cells = list(neuralnet = pfa_cell(avro_array(tm("Layer")), "[]")),
  action = expression(
    activation <- model.neural.simpleLayers(input, neuralnet, function(x = avro_double -> avro_double) m.link.logit(x)),
    m.link.logit(activation[0])
  )
)

neuralnet = list(
  list(
    weights = list(
      list(-6.0, -8.0),
      list(-25.0, -30.0)
    ),
    bias = list(4.0, 50.0)
  ),
  list(
    weights = list(
      list(-12.0, 30.0)
    ),
    bias = list(-25.0)
  )
)

pfaDocument$cells$neuralnet$init = neuralnet
engine = pfa_engine(pfaDocument)

x = list(
  list(0.0, 0.0),
  list(1.0, 0.0),
  list(0.0, 1.0),
  list(1.0, 1.0)
)
sapply(x, engine$action)

f = "/usr/local/src/gdrive/results/pfa/nnet_example.pfa"
write_pfa(pfaDocument, file = f)

With modified input, the model gives "math range error" with Titus:

x = list(100.0, 0.0)
model = read_pfa(file(f))
engine = pfa_engine(model)
engine$action(x)

However, with Hadrian, it works:

library(jsonlite)
tmp1 = tempfile(fileext = ".json")
tmp2 = tempfile(fileext = ".json")
write(minify(toJSON(x, auto_unbox = TRUE)), file = tmp1)
cmd = paste0("cd /usr/local/src/gdrive/; touch ", tmp2, "; ",
             "java -jar scripts/hadrian/hadrian-standalone-0.8.1-jar-with-dependencies.jar -i json -o json ",
             f, " ", tmp1, " > ", tmp2)
system(cmd)
out = fromJSON(readChar(tmp2, file.info(tmp2)$size), simplifyVector = FALSE)
unlink(tmp1)
unlink(tmp2)
print(out)

The PFA model looks like this:

{
  "input": {
    "type": "array",
    "items": "double"
  },
  "output": "double",
  "action": [
    {
      "let": {
        "activation": {
          "model.neural.simpleLayers": [
            "input",
            {
              "cell": "neuralnet"
            },
            {
              "params": [
                {
                  "x": "double"
                }
              ],
              "ret": "double",
              "do": {
                "m.link.logit": [
                  "x"
                ]
              }
            }
          ]
        }
      }
    },
    {
      "m.link.logit": [
        {
          "attr": "activation",
          "path": [
            0
          ]
        }
      ]
    }
  ],
  "cells": {
    "neuralnet": {
      "type": {
        "type": "array",
        "items": {
          "type": "record",
          "fields": [
            {
              "name": "weights",
              "type": {
                "type": "array",
                "items": {
                  "type": "array",
                  "items": "double"
                }
              }
            },
            {
              "name": "bias",
              "type": {
                "type": "array",
                "items": "double"
              }
            }
          ],
          "name": "Record_3"
        }
      },
      "init": [
        {
          "weights": [
            [
              -6,
              -8
            ],
            [
              -25,
              -30
            ]
          ],
          "bias": [
            4,
            50
          ]
        },
        {
          "weights": [
            [
              -12,
              30
            ]
          ],
          "bias": [
            -25
          ]
        }
      ],
      "source": "embedded",
      "shared": false,
      "rollback": false
    }
  }
}

What could be the issue?

[Aurelius] Add More GLM Families

The current implementation of pfa.glm.extractParams() only supports models generated by glm() using the binomial family with logit link function (binomial(logit)). Is it possible to expand to all families of GLMs supported by the glm() function in R?

Hyperlink error in README.md

Selecting the Portable Format for Analytics (PFA) hyperlink (in Chrome) results in an error -

"""
This site can't be reached

scoringengine.org's server DNS address could not be found.
"""

Hadrian type-safe cast fails on ["null", "int"]

I've got the following simple file, consisting only of a type-safe cast to return a boolean for whether or not the input is non-null:

{
  "input": ["null", "int"],
  "output": "boolean",
  "action": [
    {
      "cast": "input",
      "cases": [
        {
          "as": "null",
          "named": "x",
          "do": [false]
        },
        {
          "as": "int",
          "named": "i",
          "do": [true]
        }
      ]
    }
  ]
}

Calling this in Hadrian with null as the input throws the following exception:

Exception in thread "main" scala.MatchError: null
    at com.opendatagroup.hadrian.jvmcompiler.W$.asInt(jvmcompiler.scala:413)
    at com.opendatagroup.hadrian.jvmcompiler.W.asInt(jvmcompiler.scala)
    at PFA_Engine_1$2$1$2.apply(Unknown Source)
    at PFA_Engine_1$2$1.apply(Unknown Source)
    at PFA_Engine_1$2.apply(Unknown Source)
    at PFA_Engine_1.action(Unknown Source)
    at PFA_Engine_1.action(Unknown Source)
    at Main$.main(Main.scala:16)
    at Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

If I switch it up and use the "ifnotnull" special form instead, it works just fine.

When I switch the datatype from ["null", "int"] to ["null", "double"], it still fails. When I switch it to ["null", "string"], the model runs... but produces the wrong output (it says true).

Is there an issue here of pattern matching against null?

(I should add: I have verified that null is being interpreted as null, and not as "null")
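For reference, the working alternative mentioned above can be sketched as follows (my reconstruction of the equivalent engine using the PFA "ifnotnull" special form, not a file from this report):

```json
{
  "input": ["null", "int"],
  "output": "boolean",
  "action": [
    {"ifnotnull": {"x": "input"},
     "then": true,
     "else": false}
  ]
}
```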

Aurelius: avro_from_df should handle case of missing values

Currently, the avro_from_df function just takes the data type of the column and converts it to the corresponding PFA type. It would help if this function handled the case where a column has some missing values.

Minimal reproducible example:

a = data.frame(col1 = c(1, 2), col2 = c(1.234565, NA))
avro_from_df(a)

I'd expect col2 to be of type avro_union(avro_double, avro_null) and not avro_double.
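The desired inference could look like this sketch (Python, with None standing in for R's NA; the function name is hypothetical):

```python
def infer_avro_type(values, base_type="double"):
    """Infer a column's Avro type; wrap it in a union with null
    if any value is missing, mirroring avro_union(avro_double, avro_null)."""
    if any(v is None for v in values):
        return [base_type, "null"]
    return base_type

print(infer_avro_type([1.0, 2.0]))        # double
print(infer_avro_type([1.234565, None]))  # ['double', 'null']
```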

Bitwise operations, hashing

Hey folks,

A large number of machine learning applications require hashing methods. As far as I can tell, there are no hashing algorithms built into PFA, but I'm more than happy to handle that on my end as a user-defined function.

Hashing methods, however, rely heavily on bit-based operations. The PFA spec already includes bitwise logic operations (&, |, ^, ~), but it doesn't contain shift operators.

So, I guess my question is:

  1. Are there any plans to add primitives for common hashing algorithms?
  2. If not, are there any plans to add left and right bit shift operators?

Thanks!
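To illustrate why shifts matter here, below is a sketch of the classic djb2 string hash in Python; the `<< 5` step is exactly the kind of operation that, absent shift operators, a user-defined PFA function would have to emulate with multiplication:

```python
def djb2(s):
    """Classic djb2 string hash; h << 5 is equivalent to h * 32,
    so each step computes h * 33 + ord(ch), truncated to 32 bits."""
    h = 5381
    for ch in s:
        h = ((h << 5) + h + ord(ch)) & 0xFFFFFFFF
    return h

print(djb2("abc"))  # 193485963
```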

[aurelius] assigning whole number to a variable of type double

If I try to assign a constant whole number to a variable of type double, it works with Titus:

library(aurelius)
pfaDocument = pfa_document(
  input = avro_array(avro_double),
  output = avro_array(avro_double),
  action = expression(
    out <- input,
    out[0] <- 5.0,
    out
  )
)

library(jsonlite)
engine = pfa_engine(pfaDocument)
x = 1:2
engine$action(x)

But it gives the error 'Assignment conversion not possible from type "int" to type "java.lang.Double"' in Hadrian:

f = "/usr/local/src/gdrive/results/pfa/int_as_double_example_error1.pfa"
write_pfa(pfaDocument, file = f, pretty = TRUE)
library(jsonlite)
tmp1 = tempfile(fileext = ".json")
tmp2 = tempfile(fileext = ".json")
write(minify(toJSON(x, auto_unbox = TRUE)), file = tmp1)
cmd = paste0("cd /usr/local/src/gdrive/; touch ", tmp2, "; ",
             "java -jar scripts/hadrian/hadrian-standalone-0.8.1-jar-with-dependencies.jar -i json -o json ",
             f, " ", tmp1, " > ", tmp2)
system(cmd)

PFA file:

{
  "input": {
    "type": "array",
    "items": "double"
  },
  "output": {
    "type": "array",
    "items": "double"
  },
  "action": [
    {
      "let": {
        "out": "input"
      }
    },
    {
      "do": [
        {
          "let": {
            "tmp_7": 5
          }
        },
        {
          "set": {
            "out": {
              "attr": "out",
              "path": [
                0
              ],
              "to": "tmp_7"
            }
          }
        },
        "tmp_7"
      ]
    },
    "out"
  ]
}

I tried to use the always_decimal parameter, but then it gives another error, 'path index for an array must resolve to a long or int; item 0 is a "double"', because the path index in out[0] is also converted to a double.

f = "/usr/local/src/gdrive/results/pfa/int_as_double_example_error2.pfa"
write_pfa(pfaDocument, file = f, pretty = TRUE, always_decimal = TRUE)
library(jsonlite)
tmp1 = tempfile(fileext = ".json")
tmp2 = tempfile(fileext = ".json")
write(minify(toJSON(x, auto_unbox = TRUE)), file = tmp1)
cmd = paste0("cd /usr/local/src/gdrive/; touch ", tmp2, "; ",
             "java -jar scripts/hadrian/hadrian-standalone-0.8.1-jar-with-dependencies.jar -i json -o json ",
             f, " ", tmp1, " > ", tmp2)
system(cmd)

PFA file:

{
  "input": {
    "type": "array",
    "items": "double"
  },
  "output": {
    "type": "array",
    "items": "double"
  },
  "action": [
    {
      "let": {
        "out": "input"
      }
    },
    {
      "do": [
        {
          "let": {
            "tmp_7": 5.0
          }
        },
        {
          "set": {
            "out": {
              "attr": "out",
              "path": [
                0.0
              ],
              "to": "tmp_7"
            }
          }
        },
        "tmp_7"
      ]
    },
    "out"
  ]
}

How can I force aurelius to use always_decimal only in the places where it is required?

The only workaround I see at the moment is to manually edit the exported .pfa file (making sure that 0 is printed as 0 and 5 as 5.0).
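That manual edit could be automated with a post-processing sketch like the one below (my own code, with a hypothetical helper name): it forces whole-number literals to floats while leaving "path" indices as ints. Note this naive version touches every integer outside a path list, which is only safe for documents whose numeric literals are all meant to be doubles.

```python
import json

def force_double_literals(node, in_path=False):
    """Recursively convert whole-number literals to floats,
    except inside "path" lists, where indices must stay ints."""
    if isinstance(node, bool):
        return node
    if isinstance(node, int):
        return node if in_path else float(node)
    if isinstance(node, list):
        return [force_double_literals(v, in_path) for v in node]
    if isinstance(node, dict):
        return {k: force_double_literals(v, in_path or k == "path")
                for k, v in node.items()}
    return node

fragment = json.loads('{"let": {"tmp_7": 5}, "path": [0]}')
print(force_double_literals(fragment))  # {'let': {'tmp_7': 5.0}, 'path': [0]}
```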

aurelius::pfa for rpart (method = "anova"): argument is of length zero

Trying to call pfa.rpart on a regression tree (instead of a classification tree, as in the example) gives the following:

model <- rpart::rpart(Sepal.Length ~ ., data=iris)
model_as_pfa <- pfa(model)

Console:

Error in if (!is.na(node$SplitVar)) { : argument is of length zero

Please advise.

[Hadrian] Error in "Hadrian Basic Use"

Following the tutorial using the latest Hadrian release 0.8.1 (thus, jar file hadrian-standalone-0.8.1-jar-with-dependencies.jar), issuing the following command at the scala prompt:

val engine = PFAEngine.fromJson("""
{"input": "double",
 "output": "double",
 "action": {"+": ["input", 100]}}
""").head

results in the error:

java.lang.NoSuchMethodError: scala.Predef$.byteArrayOps([B)Lscala/collection/mutable/ArrayOps;
at com.opendatagroup.hadrian.reader.jsonToAst$.ingestJsonAsBytes(reader.scala:394)
at com.opendatagroup.hadrian.reader.jsonToAst$.readJsonToBytes(reader.scala:434)
at com.opendatagroup.hadrian.reader.jsonToAst$.readJsonToString(reader.scala:439)
at com.opendatagroup.hadrian.reader.jsonToAst$.readAvroPlaceholder(reader.scala:445)
at com.opendatagroup.hadrian.reader.jsonToAst$.readEngineConfig(reader.scala:259)
at com.opendatagroup.hadrian.reader.jsonToAst$.parserToAst(reader.scala:150)
at com.opendatagroup.hadrian.reader.jsonToAst$.apply(reader.scala:144)
at com.opendatagroup.hadrian.jvmcompiler.PFAEngine$.fromJson(jvmcompiler.scala:1796)
... 33 elided

Titus for Python 3.4+ is now available

Hi everyone,

With the sunset of Python 2 at the end of this year, I am glad to announce that Titus is now fully supported on Python 3.4-3.8.
Please head to this repo - https://github.com/animator/titus2
The migrated code is passing all unit tests and conformance tests.
Please don't forget to star the project in case you find it useful.

Cheers,
Ankit

[TO BE CLOSED] [Hadrian] Multiple errors in "Tutorial 4: executing a model in Hadrian-Standalone"

Following the tutorial with the latest Hadrian release 0.8.1 (jar file hadrian-standalone-0.8.1-jar-with-dependencies.jar) results in multiple errors:

  • issuing the command "java -jar hadrian-standalone-0.8.1-jar-with-dependencies.jar -i json -o json ..." fails with "Exception in thread "main" com.opendatagroup.hadrian.errors.PFASyntaxException: PFA syntax error at JSON line:col 1:2: PFA engine must be a JSON object, not start of array ("[" character)"
  • trying the Scala version, "val engine = PFAEngine.fromJson(new java.io.File("myModel.pfa")).head" in the Scala REPL, fails with the same error.
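If the saved document is a one-element JSON array rather than a bare object, a small pre-processing step can unwrap it before handing it to the engine. This is a hypothetical helper sketched in Python (load_engine_json is not part of Titus or Hadrian):

```python
import json

def load_engine_json(text):
    """Return a single engine configuration as a dict.

    PFA requires the top level of an engine document to be a JSON object;
    some export paths wrap it in a one-element array. This hypothetical
    helper unwraps that case before the document is passed on.
    """
    doc = json.loads(text)
    if isinstance(doc, list):
        if len(doc) != 1:
            raise ValueError("expected exactly one engine, got %d" % len(doc))
        doc = doc[0]
    if not isinstance(doc, dict):
        raise ValueError("PFA engine must be a JSON object")
    return doc

doc = load_engine_json('[{"input": "double", "output": "double"}]')
```

The same unwrapping could be done on the file itself before running the standalone jar.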

PFA function requires reserved keyword "to" in record name

Several PFA library functions (e.g., interp.linear) require records with a field named "to", but trying to define a record with a field named "to" leads to an error.

Minimal reproducible example:

In [1]: from titus import prettypfa
      :
      : doc = """
      : types: MyRecord = record(x: int, to: double);
      : input: int
      : output: int
      : action: 1
      : """
      :
      : prettypfa.jsonNode(doc)
      :
      :
---------------------------------------------------------------------------
PrettyPfaException                        Traceback (most recent call last)
<ipython-input-1-0a736b5ef5a3> in <module>()
      8 """
      9
---> 10 prettypfa.jsonNode(doc)
     11

/Users/brandon/miniconda2/lib/python2.7/site-packages/titus/prettypfa.pyc in jsonNode(text, lineNumbers, check, version, subs, **subs2)
   2214     :return: PFA in Pythonized JSON
   2215     """
-> 2216     return ast(text, check, version, subs, **subs2).jsonNode(lineNumbers, set())
   2217
   2218 def json(text, lineNumbers=True, check=True, version=None, subs={}, **subs2):

/Users/brandon/miniconda2/lib/python2.7/site-packages/titus/prettypfa.pyc in ast(text, check, version, subs, **subs2)
   2187             parser.initialize(lex, yacc)
   2188
-> 2189     out = parser.parse(text, subs2)
   2190
   2191     anysubs = lambda x: x

/Users/brandon/miniconda2/lib/python2.7/site-packages/titus/prettypfa.pyc in parse(self, text, subs)
   2122         self.text = text
   2123         self.subs = subs
-> 2124         out = self.yacc.parse(text, lexer=self.lexer)
   2125         if self.wholeDocument:
   2126             return out

/Users/brandon/miniconda2/lib/python2.7/site-packages/ply/yacc.pyc in parse(self, input, lexer, debug, tracking, tokenfunc)
    329             return self.parseopt(input, lexer, debug, tracking, tokenfunc)
    330         else:
--> 331             return self.parseopt_notrack(input, lexer, debug, tracking, tokenfunc)
    332
    333

/Users/brandon/miniconda2/lib/python2.7/site-packages/ply/yacc.pyc in parseopt_notrack(self, input, lexer, debug, tracking, tokenfunc)
   1197                             errtoken.lexer = lexer
   1198                         self.state = state
-> 1199                         tok = call_errorfunc(self.errorfunc, errtoken, self)
   1200                         if self.errorok:
   1201                             # User must have done some kind of panic

/Users/brandon/miniconda2/lib/python2.7/site-packages/ply/yacc.pyc in call_errorfunc(errorfunc, token, parser)
    191     _token = parser.token
    192     _restart = parser.restart
--> 193     r = errorfunc(token)
    194     try:
    195         del _errok, _token, _restart

/Users/brandon/miniconda2/lib/python2.7/site-packages/titus/prettypfa.pyc in p_error(p)
   2103                 else:
   2104                     offendingLine = "\n".join(insertArrow(lines[(lineno - 1):(lineno + 2)], 1))
-> 2105                     raise PrettyPfaException("Parsing syntax error on line {0}:\n{1}".format(p.lineno, offendingLine))
   2106
   2107         self.yacc = yacc.yacc(debug=False, write_tables=False)

PrettyPfaException: Parsing syntax error on line 2:
types: MyRecord = record(x: int, to: double);
input: int     <----
output: int

pfachain incompatible with recursively defined types

This occurs because of the prepended Step{}_Engine_{}_ strings that pfachain puts in.

Minimal reproducible example: download this gist and run

# taken from the tutorial https://github.com/opendatagroup/hadrian/wiki/Basic-random-forest
python ex.py  

# modified random forest example using pfachain
python ex_chain.py

The second script fails with

➜  python ex_chain.py
Traceback (most recent call last):
  File "ex_chain.py", line 158, in <module>
    engine, = PFAEngine.fromJson(pfaDocument)
  File "/Users/brandon/miniconda2/envs/arena/lib/python2.7/site-packages/titus/genpy.py", line 1565, in fromJson
    return PFAEngine.fromAst(titus.reader.jsonToAst(src), options, version, sharedState, multiplicity, style, debug)
  File "/Users/brandon/miniconda2/envs/arena/lib/python2.7/site-packages/titus/genpy.py", line 1509, in fromAst
    value = titus.datatype.jsonDecoder(cellConfig.avroType, cellConfig.initJsonNode)
  File "/Users/brandon/miniconda2/envs/arena/lib/python2.7/site-packages/titus/datatype.py", line 925, in jsonDecoder
    return [jsonDecoder(avroType.items, x) for x in value]
  File "/Users/brandon/miniconda2/envs/arena/lib/python2.7/site-packages/titus/datatype.py", line 934, in jsonDecoder
    out[field.name] = jsonDecoder(field.avroType, value[field.name])
  File "/Users/brandon/miniconda2/envs/arena/lib/python2.7/site-packages/titus/datatype.py", line 953, in jsonDecoder
    raise titus.errors.AvroException("{0} does not match schema {1}".format(json.dumps(value), ts(avroType)))
titus.errors.AvroException: {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"TreeNode": {"operator": "<", "field": "petal_width", "fail": {"TreeNode": {"operator": "<", "field": "petal_width", "fail": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"string": "versicolor"}, "value": 0.0, "pass": {"string": "virginica"}}}, "value": 1.45, "pass": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"string": "versicolor"}, "value": 0.0, "pass": {"string": "virginica"}}}}}, "value": 1.35, "pass": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"TreeNode": {"operator": "<", "field": "petal_width", "fail": {"string": "versicolor"}, "value": 0.15000000000000002, "pass": {"string": "setosa"}}}, "value": 2.45, "pass": {"TreeNode": {"operator": "<", "field": "petal_width", "fail": {"string": "versicolor"}, "value": 1.05, "pass": {"string": "versicolor"}}}}}}}, "value": 5.05, "pass": {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"string": "setosa"}, "value": 3.45, "pass": {"string": "setosa"}}}, "value": 3.3, "pass": {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"string": "versicolor"}, "value": 3.4, "pass": {"string": "setosa"}}}}}, "value": 1.55, "pass": {"TreeNode": {"operator": "<", "field": "petal_width", "fail": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"string": "setosa"}, "value": 3.25, "pass": {"string": "setosa"}}}, "value": 0.15000000000000002, "pass": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"string": "setosa"}, "value": 0.0, "pass": {"string": "virginica"}}}}}}}}}, "value": 1.45, "pass": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"TreeNode": 
{"operator": "<", "field": "sepal_length", "fail": {"TreeNode": {"operator": "<", "field": "petal_width", "fail": {"string": "setosa"}, "value": 0.15000000000000002, "pass": {"string": "setosa"}}}, "value": 4.45, "pass": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"string": "setosa"}, "value": 0.0, "pass": {"string": "virginica"}}}}}, "value": 1.1, "pass": {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"string": "setosa"}, "value": 0.0, "pass": {"string": "virginica"}}}, "value": 0.0, "pass": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"string": "virginica"}, "value": 0.0, "pass": {"string": "virginica"}}}}}}}, "value": 4.35, "pass": {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"string": "setosa"}, "value": 0.0, "pass": {"string": "virginica"}}}, "value": 0.0, "pass": {"TreeNode": {"operator": "<", "field": "petal_length", "fail": {"string": "virginica"}, "value": 0.0, "pass": {"string": "virginica"}}}}}, "value": 0.0, "pass": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"TreeNode": {"operator": "<", "field": "sepal_width", "fail": {"string": "virginica"}, "value": 0.0, "pass": {"string": "virginica"}}}, "value": 0.0, "pass": {"TreeNode": {"operator": "<", "field": "sepal_length", "fail": {"string": "virginica"}, "value": 0.0, "pass": {"string": "virginica"}}}}}}}}}}} does not match schema
    union(string,
          record(Step2_Engine_2_TreeNode,
                 field: enum([sepal_length, sepal_width, petal_length, petal_width], Step2_Engine_2_Enum_1),
                 operator: string,
                 value: double,
                 pass: union(string,
                             Step2_Engine_2_TreeNode),
                 fail: union(string,
                             Step2_Engine_2_TreeNode)))

Related issue: #30
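One possible workaround, until pfachain handles this, is to rename the union tags in the cell-init JSON so they match the prefixed type names. The helper below is a hypothetical sketch (retag_union is not part of Titus), and it assumes the old name appears only as a union tag, never as a field name:

```python
def retag_union(value, old_name, new_name):
    """Recursively rename union tag keys (e.g. "TreeNode") in tagged-union
    JSON so they match a renamed record type (e.g. "Step2_Engine_2_TreeNode").

    Sketch of a possible workaround; assumes old_name only occurs as a
    union tag in the data, not as a field name.
    """
    if isinstance(value, dict):
        return {(new_name if k == old_name else k): retag_union(v, old_name, new_name)
                for k, v in value.items()}
    if isinstance(value, list):
        return [retag_union(v, old_name, new_name) for v in value]
    return value

# minimal stand-in for the cell-init data in the error above
init = {"TreeNode": {"pass": {"string": "setosa"},
                     "fail": {"TreeNode": {"pass": {"string": "virginica"},
                                           "fail": {"string": "versicolor"}}}}}
fixed = retag_union(init, "TreeNode", "Step2_Engine_2_TreeNode")
```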

Hadrian fails when numbers are used as Map keys.

For instance, the YAML

output: 
  type: map
  values: double
method: map
action:
  - {"new": {"0": input}, "type":{"values":"double","type": "map"}}

fails with the message "0" is not a valid symbol name. The same YAML with "a" as the map key works.

In Titus, both
PFAEngine.fromJson('''{"input":"double","output":{"values":"double","type":"map"},"method":"map","action":{"new":{"0":"input"}, "type":{"values":"double","type":"map"}}}''')
and
PFAEngine.fromJson('''{"input":"double","output":{"values":"double","type":"map"},"method":"map","action":{"new":{"a":"input"}, "type":{"values":"double","type":"map"}}}''')
work.

[Hadrian] Invoking user-defined functions externally needs to track start times

Currently, timeout tracking in generated engines relies on a startTime field being set at the beginning of the invocation. Exposed user-defined functions do not set this field, so the last value set by a begin/action/end invocation remains in effect. This effectively means that a user has (timeout) ms to invoke UDFs before they become a source of timeout errors, and the clock is reset only when action is invoked again.

It appears, if I am understanding the code correctly, that it should be possible to generate a wrapping function that sets the startTime field when invoked externally. Since all user-defined functions and the begin/action/end triumvirate are synchronized on the runlock, this ought to be enough to make it work.

I am attempting to make this modification now, I'll open a PR if I get a solution working.
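The proposed wrapper can be illustrated with a conceptual Python sketch (Hadrian itself is Scala; Engine, call_udf, and the field names here are illustrative, not Hadrian's actual API):

```python
import threading
import time

class Engine:
    """Conceptual sketch of the proposed fix: every externally invoked
    user-defined function resets start_time under the run lock, just as
    begin/action/end do, so timeout tracking measures the current call."""

    def __init__(self, timeout_seconds):
        self.timeout = timeout_seconds
        self.runlock = threading.RLock()   # all entry points synchronize here
        self.start_time = None

    def _check_timeout(self):
        if self.start_time is not None and time.time() - self.start_time > self.timeout:
            raise RuntimeError("timeout exceeded")

    def call_udf(self, fn, *args):
        with self.runlock:
            self.start_time = time.time()  # the wrapper the issue proposes
            result = fn(*args)
            self._check_timeout()
            return result

engine = Engine(timeout_seconds=10.0)
value = engine.call_udf(lambda x: x + 1, 41)
```

Because the run lock already serializes UDF calls against begin/action/end, resetting the field inside the lock should be race-free.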

Incorrect definition for m.ln1p

The spec for the function m.ln1p defines it as ln(1 + x^2). This is surprising, because the usual definition is ln(1 + x), which aligns with the classic Taylor series (the Mercator series) and with java.lang.Math.log1p.

Looking at the code for hadrian, I see that the implementation is actually java.lang.Math.log1p.

I can verify this by running the PFA script

input: double
output: {type: array, items: double}
action:
  - new: [{m.ln1p: [input]}, {m.ln: [{+: [1, input]}]}, {m.ln: [{+: [1, {"**": [input, 2]}]}]}]
    type: {type: array, items: double}

and seeing that the output for the first value matches the second, not the third.

So I think we should change the spec to say ln(1+ x).
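The claim is easy to check against a reference log1p, here Python's math.log1p, which computes the same function as java.lang.Math.log1p:

```python
import math

x = 0.5
ln1p = math.log1p(x)                # the log1p implementation
ln_1_plus_x = math.log(1 + x)       # ln(1 + x): the usual definition
ln_1_plus_x2 = math.log(1 + x**2)   # ln(1 + x^2): what the spec currently says

# log1p matches ln(1 + x), not ln(1 + x^2)
matches_linear = math.isclose(ln1p, ln_1_plus_x)
matches_quadratic = math.isclose(ln1p, ln_1_plus_x2)
```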

Incorrect Scoping Behavior in "set" Special Form

My understanding from the documentation is that all assignments in a "set" special form should be done in a copy of the scope -- that is, if one assignment updates a counter, and another USES the counter, then the second one should be seeing the value of the counter from before the update.

However, in Hadrian, that doesn't seem to be the case. For instance, take the following (silly) example:

{
  "input": "int",
  "output": "double",
  "action": [
    {"let": {"i": 0, "total": 0.0}},
    {
      "while": {"<": ["i", "input"]},
      "do": [
        {
          "set": {
            "total": {
              "do": [
                {"log": ["i"]},
                {"+": ["total", {"m.pi": []}]}
              ]
            },
            "i": {"+": ["i", 1]}
          }
        }
      ]
    },
    "total"
  ]
}

If I run this in the web UI at dmg.org, I get the following:

[LOG] 1
3.141592653589793
[LOG] 1
[LOG] 2
6.283185307179586
[LOG] 1
[LOG] 2
[LOG] 3
9.42477796076938

We should, of course, be seeing logs of 0; then 0, 1; then 0, 1, 2.

Interestingly, if I switch the order of the two updates in the JSON document, the order of the updates switches as well: in that case I see 0; 0, 1; 0, 1, 2.

Obviously, this is a contrived example; but, I'm having the same problem in a much more important case (looping through a couple of arrays, acting on their indices and updating them simultaneously) in the most recent build I've pulled from github.
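For comparison, the documented semantics can be sketched in a few lines of Python: every right-hand side is evaluated against a snapshot of the scope taken before any assignment, so update order cannot matter (simultaneous_set is an illustrative helper, not Titus or Hadrian code):

```python
def simultaneous_set(scope, updates):
    """Expected PFA "set" semantics: every right-hand side is evaluated
    against the scope as it was before ANY of the assignments, so the
    order of updates cannot matter. `updates` maps names to functions
    of the (pre-update) scope."""
    snapshot = dict(scope)  # copy taken before any assignment
    new_values = {name: rhs(snapshot) for name, rhs in updates.items()}
    scope.update(new_values)  # commit all assignments at once
    return scope

scope = {"i": 0, "total": 0.0}
simultaneous_set(scope, {
    "total": lambda s: s["total"] + 3.14,  # sees i and total from before the update
    "i": lambda s: s["i"] + 1,
})
```

Under these semantics, reordering the entries in `updates` leaves the result unchanged, which is exactly what the example above fails to exhibit.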

Titus: python setup.py test failure

The following failed on MacOSX El Capitan:

$ cd hadrian/titus
$ python setup.py test

...

======================================================================
FAIL: testUnpackBoolean (test.testGenpy.TestGeneratePython)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/me/Git/hadrian/titus/test/testGenpy.py", line 2108, in testUnpackBoolean
    self.assertEqual(engine.action("".join(map(unsigned, [8]))), True)
AssertionError: False != True

----------------------------------------------------------------------
Ran 805 tests in 40.097s

FAILED (failures=1, errors=17)
$

[Titus] Option to make chain deterministic

Here's a little example:

from titus.producer import chain
import json

j1 = """
{"input": "int",
 "output": {"type": "record",
            "name": "Output",
            "fields": [{"name": "one", "type": "int"},
                       {"name": "two", "type": "double"},
                       {"name": "three", "type": "string"}]},
 "action":
   {"type": "Output",
    "new": {"one": "input", "two": "input", "three": {"s.int": "input"}}}}
"""

j2 = """
{"input": {"type": "record",
           "name": "Output",
           "fields": [{"name": "one", "type": "int"},
                      {"name": "two", "type": "double"},
                      {"name": "three", "type": "string"}]},
 "output": "string",
 "method": "emit",
 "action": [
   {"emit": "input.three"},
   {"emit": "input.three"},
   {"emit": "input.three"}]}
"""

p1 = json.loads(j1)
p2 = json.loads(j2)

x = chain.json([p1, p2], randseed=1)
y = chain.json([p1, p2], randseed=1)

At the end of this, x is not equal to y, which is quite annoying for testing purposes. Would it be possible to make this function pure, or at least to add an option?
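One way the library could achieve this (a sketch only; make_name_generator is hypothetical, not the actual chain internals) is to draw generated engine-name suffixes from an engine-local random.Random(randseed) instead of the shared global RNG, so that the same seed always yields the same names:

```python
import random
import string

def make_name_generator(randseed=None):
    """Sketch of deterministic name generation: draw suffixes from a
    local random.Random(randseed) rather than the shared module-level RNG,
    so two chains built with the same seed produce identical names."""
    rng = random.Random(randseed)  # isolated, reproducible stream
    def generate(prefix):
        suffix = "".join(rng.choice(string.ascii_uppercase) for _ in range(8))
        return "%s_%s" % (prefix, suffix)
    return generate

# two independent builds with the same seed
gen_a = make_name_generator(randseed=1)
gen_b = make_name_generator(randseed=1)
names_a = [gen_a("Step%d_Engine" % i) for i in (1, 2)]
names_b = [gen_b("Step%d_Engine" % i) for i in (1, 2)]
```

With an isolated generator per chain invocation, repeated calls with the same randseed become byte-for-byte reproducible.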
