pyheartex

Python interface for running an ML backend server and using it for active learning, prelabeling, and prediction within the Heartex platform.

Installation

The Heartex SDK is packaged with Poetry. You can learn more about Poetry in the official documentation.

Install Heartex SDK:

git clone https://github.com/heartexlabs/pyheartex.git
cd pyheartex/
poetry install

Note: You can install all optional dependencies (scikit-learn and fastai) using the command poetry install -E all

Quick start

This quick start guide shows how to use popular machine learning frameworks (scikit-learn and FastAI) within the Heartex platform.

First, make sure you have a Redis server running (otherwise you can only use prediction, not active learning). For testing purposes, you can start a Redis instance with Docker using the following command:

docker run -d --rm -p 6379:6379 redis:latest
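
If you want to confirm the instance is reachable before moving on, here is a quick connectivity check from Python. This is an optional sketch, not part of the SDK; it assumes the redis Python package is installed and Redis listens on localhost:6379.

import redis

# Optional sanity check: ping the Redis instance started above.
client = redis.Redis(host="localhost", port=6379)
try:
    client.ping()
    print("Redis is up")
except redis.exceptions.ConnectionError as exc:
    print(f"Redis is not reachable: {exc}")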

Next, start RQ workers in the background:

rq worker default

Scikit-learn

scikit-learn is an optional dependency of pyheartex. You can install it using the following command:

poetry install -E "sklearn"`

Let's serve a scikit-learn model for text classification.

You can simply launch:

python examples/quickstart.py

The script looks like this:

from htx.adapters.sklearn import serve

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline


if __name__ == "__main__":

    # Creating sklearn-compatible model
    my_model = make_pipeline(TfidfVectorizer(), LogisticRegression())

    # Start serving this model
    serve(my_model)

It starts serving at http://localhost:16118, listening for Heartex events. To connect your model, go to Heartex -> Settings -> Machine learning and choose "Add custom model".

Or you can use Heartex API to activate your model:

curl -X POST -H 'Content-Type: application/json' \
-H 'Authorization: Token <PUT-YOUR-TOKEN-HERE>' \
-d '[{"url": "$HOST:$PORT", "name": "my_model", "title": "My model", "description": "My new model deployed on Heartex"}]' \
http://go.heartex.net/api/projects/{project-id}/backends/

where $HOST:$PORT is your server's URL, which must be accessible from outside.
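
Equivalently, you can make the same API call from Python with the requests library. This is just a sketch of the curl command above, with the same placeholder token, project ID, and server URL:

import requests

TOKEN = "<PUT-YOUR-TOKEN-HERE>"
PROJECT_ID = "<project-id>"
MODEL_URL = "$HOST:$PORT"  # your server URL, reachable from the outside

# Register the model backend with the Heartex project, mirroring the curl call.
response = requests.post(
    f"http://go.heartex.net/api/projects/{PROJECT_ID}/backends/",
    headers={"Authorization": f"Token {TOKEN}"},
    json=[{
        "url": MODEL_URL,
        "name": "my_model",
        "title": "My model",
        "description": "My new model deployed on Heartex",
    }],
)
response.raise_for_status()
print(response.json())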

FastAI

fastai is an optional dependency of pyheartex. You need to install it using the following command:

poetry install -E "fastai"

You can integrate FastAI models similarly to scikit-learn. Check this example to learn how to plug in an updatable image classifier.

Using Docker

Here is an example of how to start serving an image classifier:

cd examples/docker
./scripts/build.sh
./scripts/deploy.sh

To plug in your own model, all you need to do is change the loading, inference, and training scripts in this file.

Advanced usage

When you want to go beyond the sklearn-compatible API, you can build your own model by handling the input/output interface conversion manually. Subclass the Heartex base model as follows:

from htx.base_model import BaseModel

# This class exposes the methods needed to handle the model at runtime (loading it into memory, running predictions)
class MyModel(BaseModel):

    def get_input(self, task):
        """Extract input from serialized task"""
        pass

    def get_output(self, task):
        """Extract output from serialized task"""
        pass

    def load(self, train_output):
        """Loads model into memory. `train_output` dict is actually the output the `train` method (see below)"""
        pass

    def predict(self, tasks):
        """Get list of tasks, already processed by `get_input` method, and returns completions in Heartex format"""
        pass

# This method handles model retraining
def train(input_tasks, output_model_dir, **kwargs):
    """
    :param input_tasks: list of tasks already processed by `get_input`
    :param output_model_dir: output directory where you can optionally store model resources
    :param kwargs: any additional kwargs taken from `train_kwargs`
    :return: `train_output` dict for subsequent model loading
    """
    pass
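
For illustration, here is a minimal sketch of a text classifier built on this interface. The task field names ("input" and "output"), the completion format returned by predict, the (input, output) pair shape passed to train, and the use of joblib for persistence are all assumptions made for this example; adapt them to the actual serialized task schema of your Heartex project.

import os

import joblib

from htx.base_model import BaseModel
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline


class MyTextClassifier(BaseModel):

    def get_input(self, task):
        # Assumption: the serialized task stores the raw text under "input"
        return task["input"]

    def get_output(self, task):
        # Assumption: the ground-truth label is stored under "output"
        return task["output"]

    def load(self, train_output):
        # `train_output` is whatever the `train` function below returned
        self._model = joblib.load(train_output["model_path"])

    def predict(self, tasks):
        # Per the docstring above, `tasks` have already been run through
        # `get_input`, so they are treated here as raw texts
        probs = self._model.predict_proba(tasks)
        labels = self._model.classes_[probs.argmax(axis=1)]
        # Placeholder completion format: consult the Heartex docs for the
        # exact schema your project expects
        return [
            {"result": str(label), "score": float(prob.max())}
            for label, prob in zip(labels, probs)
        ]


def train(input_tasks, output_model_dir, **kwargs):
    # Assumption: each training task arrives as an (input, output) pair;
    # check the adapter code for the exact shape in your version
    texts, labels = zip(*input_tasks)

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(list(texts), list(labels))

    model_path = os.path.join(output_model_dir, "model.joblib")
    joblib.dump(model, model_path)

    # This dict is handed back to `MyTextClassifier.load` on the next reload
    return {"model_path": model_path}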
