
mlflow-redisai's Introduction


Caution

RedisAI is no longer actively maintained or supported.

We are grateful to the RedisAI community for their interest and support.

RedisAI

RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is to be a "workhorse" for model serving, providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality, and it simplifies the deployment and serving of graphs by leveraging Redis' production-proven infrastructure.

To read RedisAI docs, visit redisai.io. To see RedisAI in action, visit the demos page.

Quickstart

RedisAI is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies.

The following sections describe how to get started with RedisAI.

Docker

The quickest way to try RedisAI is by launching its official Docker container images.

On a CPU-only machine

docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic

On a GPU machine

For GPU support you will need a machine that has the Nvidia driver installed (CUDA 11.3 and cuDNN 8.1), along with nvidia-container-toolkit and Docker 19.03+. For detailed information, check out the nvidia-docker documentation.

docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic

Building

You can compile and build the module from its source code. The Developer page has more information about the design and implementation of the RedisAI module and how to contribute.

Prerequisites

  • Packages: git, python3, make, wget, g++/clang, and unzip
  • CMake 3.0 or higher
  • CUDA 11.3 and cuDNN 8.1 or higher, if GPU support is required
  • Redis v6.0.0 or greater

Get the Source Code

You can obtain the module's source code by cloning the project's repository using git like so:

git clone --recursive https://github.com/RedisAI/RedisAI

Switch to the project's directory with:

cd RedisAI

Building the Dependencies

Use the following script to download and build the libraries of the various RedisAI backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only:

bash get_deps.sh

Alternatively, you can run the following to fetch the backends with GPU support.

bash get_deps.sh gpu

Building the Module

Once the dependencies have been built, you can build the RedisAI module with:

make -C opt clean ALL=1
make -C opt

Alternatively, run the following to build RedisAI with GPU support:

make -C opt clean ALL=1
make -C opt GPU=1

Backend Dependency

RedisAI currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between RedisAI and the supported backends. This is extremely important, since the serialization mechanism of one version might not match another's. To make sure your model will work with a given RedisAI version, check the backend documentation for incompatible features between the version of your backend and the version RedisAI is built with.

RedisAI   PyTorch   TensorFlow   TFLite   ONNXRuntime
1.0.3     1.5.0     1.15.0       2.0.0    1.2.0
1.2.7     1.11.0    2.8.0        2.0.0    1.11.1
master    1.11.0    2.8.0        2.0.0    1.11.1

Note: Keras and TensorFlow 2.x are supported through graph freezing. See this script to see how to export a frozen graph from Keras and TensorFlow 2.x.

Loading the Module

To load the module upon starting the Redis server, simply use the --loadmodule command line switch, the loadmodule configuration directive, or the Redis MODULE LOAD command with the path to the module's library.

For example, to load the module from the project's path with a server command line switch use the following:

redis-server --loadmodule ./install-cpu/redisai.so
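The same module can equivalently be loaded from a configuration file; a minimal sketch, reusing the illustrative CPU build path above:

```
# redis.conf
loadmodule ./install-cpu/redisai.so
```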

Give it a try

Once loaded, you can interact with RedisAI using redis-cli. Basic information and examples for using the module are described here.
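As a quick smoke test from Python, here is a minimal sketch, assuming the redisai-py client (`pip install redisai`) and a server listening on localhost:6379; the tensor name is illustrative.

```python
# Minimal RedisAI round trip, assuming the redisai-py client and a local
# server. The tensor name "mytensor" is illustrative.
import numpy as np

def make_tensor():
    # A 2x2 float32 tensor; FLOAT is the matching RedisAI tensor type.
    return np.array([[1.0, 2.0], [3.0, 4.0]], dtype=np.float32)

def roundtrip(host="localhost", port=6379):
    import redisai as rai  # deferred so make_tensor() works without the client
    con = rai.Client(host=host, port=port)
    con.tensorset("mytensor", make_tensor())  # AI.TENSORSET under the hood
    return con.tensorget("mytensor")          # AI.TENSORGET

if __name__ == "__main__":
    print(roundtrip())
```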

Client libraries

Some languages already have client libraries that provide support for RedisAI's commands. The following table lists the known ones:

Project             Language                License        Author          URL
JRedisAI            Java                    BSD-3          RedisLabs       Github
redisai-py          Python                  BSD-3          RedisLabs       Github
redisai-go          Go                      BSD-3          RedisLabs       Github
redisai-js          Typescript/Javascript   BSD-3          RedisLabs       Github
redis-modules-sdk   TypeScript              BSD-3-Clause   Dani Tseitlin   Github
redis-modules-java  Java                    Apache-2.0     dengliming      Github
smartredis          C++                     BSD-2-Clause   Cray Labs       Github
smartredis          C                       BSD-2-Clause   Cray Labs       Github
smartredis          Fortran                 BSD-2-Clause   Cray Labs       Github
smartredis          Python                  BSD-2-Clause   Cray Labs       Github

The full documentation for RedisAI's API can be found at the Commands page.

Documentation

Read the docs at redisai.io.

Contact Us

If you have questions, want to provide feedback or perhaps report an issue or contribute some code, here's where we're listening to you:

License

RedisAI is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).

mlflow-redisai's People

Contributors

boat-builder, chayim, gkorland


mlflow-redisai's Issues

[FR] Compatibility with MLflow 2.0

Proposal Summary

In MLflow 2.0 (scheduled for release on Nov. 14), we will be making small modifications to the MLflow Model Server's RESTful scoring protocol (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/models.html#deploy-mlflow-models) and the MLflow Deployment Client predict() API (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/python_api/mlflow.deployments.html#mlflow.deployments.BaseDeploymentClient.predict).

For compatibility with MLflow 2.0, the mlflow-redisai plugin will need to be updated to conform to the new scoring protocol and Deployment Client interface. The MLflow maintainers are happy to assist with this process, and we apologize for the short notice.

Motivation

  • What is the use case for this feature? Providing a richer, more extensible scoring protocol and broadening the deployment client prediction interface beyond dataframe inputs.
  • Why is this use case valuable to support for MLflow RedisAI Deployment plugin users in general? It is necessary for compatibility with MLflow 2.0.
  • Why is it currently difficult to achieve this use case? Without these changes, the mlflow-redisai plugin will break in MLflow 2.0.

Support PyTorch deployment

MLflow currently doesn't support saving TorchScript models, which is required for implementing PyTorch deployment on RedisAI through the plugin.
Development of PyTorch support in MLflow core can be tracked at mlflow/mlflow#2263
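TorchScript export itself is straightforward in PyTorch; a hedged sketch (the module and output path are illustrative), assuming PyTorch is installed:

```python
# Export a TorchScript model of the kind RedisAI's TORCH backend can load.
# The module and the output path "model.pt" are illustrative.
import torch

class TwoLayer(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

scripted = torch.jit.script(TwoLayer())  # compile the module to TorchScript
scripted.save("model.pt")                # serialized file for the TORCH backend
```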

Enable Redis connection parameters for mlflow cli deployment

Currently, the plugin supports

  • Hostname
  • Port
  • Username
  • Password
  • DB

through the connection URI or through environment variables. However, more options are possible in the base client, and we might need to support them for broader use cases.
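For illustration, here is a hypothetical helper (not the plugin's actual parser) showing how those five parameters can be pulled out of a redis:// connection URI with the standard library:

```python
# Hypothetical sketch: extract hostname/port/username/password/db from a
# redis://user:pass@host:port/db URI. Not the plugin's real parser.
from urllib.parse import urlparse

def parse_redis_uri(uri):
    p = urlparse(uri)
    return {
        "hostname": p.hostname or "localhost",
        "port": p.port or 6379,
        "username": p.username,
        "password": p.password,
        # the path component carries the database index, defaulting to 0
        "db": int((p.path or "/0").lstrip("/") or 0),
    }
```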

Problems connecting redisai container using mlflow-redisai library

Hi,

I want to deploy a model with mlflow and jupyter to redisai. The specifics of my project are that I need mlflow, jupyter notebook, and redis in different docker containers.

This is the docker-compose file

version: '3'
services:
  notebook:
    image: jupyter/base-notebook
    ports:
      - "8888:8888"
    depends_on: 
      - mlflow
      - redisai
    environment: 
      MLFLOW_TRACKING_URI: 'http://mlflow:5000'
      REDISAI_TRACKING_URI: 'https://redisai:6379'
    volumes:
      - /home/jpardo/Documents/MLops/data/mlruns:/home/jovyan/mlruns

  mlflow:
    image: burakince/mlflow
    expose: 
      - "5000"
    ports:
      - "5000:5000"
    volumes:
      - /home/jpardo/Documents/MLops/data/mlruns:/mlflow/mlruns

  redisai:
    image: redislabs/redisai
    expose: 
      - "6379"
    ports:
      - "6379:6379"

I have installed the mlflow_redisai and mlflow libraries inside the notebook container.

mlflow==2.10.2
mlflow-redisai==0.1.0

When inside the notebook container I run

from mlflow.deployments import get_deploy_client 
REDISAI_TRACKING_URI = os.getenv("REDISAI_TRACKING_URI")
redisai = get_deploy_client(REDISAI_TRACKING_URI)

the error from the redisai container is

Possible SECURITY ATTACK detected. It looks like somebody is sending POST or Host: commands to Redis. This is likely due to an attacker attempting to use Cross Protocol Scripting to compromise your Redis instance. Connection aborted.

So the problem is: why can't I connect to the redisai container with the mlflow-redisai library? In contrast, I have no problems connecting the notebook container to the mlflow container.

Thanks for any help,
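One detail worth checking in the compose file above: REDISAI_TRACKING_URI uses an https:// scheme, but Redis speaks its own RESP protocol rather than HTTP, and an HTTP request arriving on port 6379 is exactly the kind of traffic that triggers the "Cross Protocol Scripting" protection quoted in the error. A small, hypothetical scheme sanity check:

```python
# Hypothetical helper: Redis clients expect a redis:// (or rediss:// for TLS)
# scheme, not http(s)://; sending HTTP verbs like POST to a Redis port is what
# trips the "Cross Protocol Scripting" protection.
from urllib.parse import urlparse

def is_redis_scheme(uri):
    return urlparse(uri).scheme in ("redis", "rediss")

is_redis_scheme("https://redisai:6379")  # wrong scheme for a Redis client
is_redis_scheme("redis://redisai:6379")  # what a Redis client expects
```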

Is this library still working or deprecated?

For me, running mlflow deployments create -t redisai -m model-uri --name redis-key gives the error below:

Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/bin/mlflow", line 8, in <module>
    sys.exit(cli())
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/mlflow/deployments/cli.py", line 145, in create_deployment
    deployment = client.create_deployment(name, model_uri, flavor, config=config_dict)
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/mlflow_redisai/__init__.py", line 122, in create_deployment
    flavor = get_preferred_deployment_flavor(model_config)
  File "/home/ubuntu/anaconda3/lib/python3.7/site-packages/mlflow_redisai/utils.py", line 78, in get_preferred_deployment_flavor
    error_code=RESOURCE_DOES_NOT_EXIST)
mlflow.exceptions.MlflowException: The specified model does not contain any of the supported flavors for deployment. The model contains the following flavors: dict_keys(['python_function']). Supported flavors: ['torchscript', 'tensorflow']
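The exception is the plugin's flavor check rejecting a model that was saved only with the generic python_function flavor. A hedged sketch of what that check does (names are illustrative, not the plugin's exact code):

```python
# Hypothetical sketch of the flavor selection the traceback fails in: the
# plugin deploys only models whose MLmodel file lists a supported flavor.
SUPPORTED_FLAVORS = ("torchscript", "tensorflow")

def preferred_flavor(model_flavors):
    for flavor in SUPPORTED_FLAVORS:
        if flavor in model_flavors:
            return flavor
    raise ValueError(
        "model has flavors %s; supported: %s"
        % (sorted(model_flavors), list(SUPPORTED_FLAVORS))
    )
```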
