
nielsen-oss / fasttext-serving

Stars: 23 · Forks: 11 · Watchers: 11 · Size: 4.1 MB

Serve your fastText models for text classification and word vectors

License: Apache License 2.0

Languages: Python 98.61%, Dockerfile 1.39%
Topics: fasttext, grpc, microservice, serving

fasttext-serving's People

Contributors

defr8001 · dependabot[bot] · senecaso


fasttext-serving's Issues

Reload config file problem

Hi, I have been using TensorFlow Serving for a few months, and now I need to serve fastText models.
One feature I really like in TF Serving is that you can reload models by updating the config file. I could not find a way to do this with fasttext-serving.

Here is my code that tries to reload the config file so a second model is served.

import os
import grpc

from fts.protos import service_pb2, service_pb2_grpc

if __name__ == "__main__":

    # Generate GRPC stub
    channel = grpc.insecure_channel("localhost:50051")
    stub = service_pb2_grpc.FastTextStub(channel)

    # Ask the server to re-read the config file and reload models
    request = service_pb2.ReloadModelsRequest()
    response = stub.ReloadConfigModels(request)
    print(response)

    # List the models currently loaded by the server
    request = service_pb2.LoadModelsRequest()
    response = stub.GetLoadedModels(request)
    print(response)
    loaded_models = [model.name for model in response.models]
    print(loaded_models)

I am testing with "yelp_review_polarity" and "dbpedia". Here is my config.yaml. I started the server with just "yelp_review_polarity", then added "dbpedia" to config.yaml and sent a request using the client code above.

# List of models to serve
models_path: sample/models
models:
  - base_path: yelp_review_polarity
    name: yelp_review_polarity
  - base_path: dbpedia
    name: dbpedia

What am I doing wrong?
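One thing worth ruling out: the reload call may return before the background watcher has finished loading the newly added model, so checking GetLoadedModels immediately can miss it. A generic stdlib polling helper can be used to wait for the model to appear (a sketch, not part of fasttext-serving; the timeout values are assumptions):

```python
import time

def wait_until(predicate, timeout_s=30.0, poll_s=1.0):
    """Poll `predicate` until it returns True or `timeout_s` elapses."""
    deadline = time.monotonic() + timeout_s
    while True:
        if predicate():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(poll_s)

# Hypothetical usage with the stub from the client above:
#   wait_until(lambda: "dbpedia" in
#              [m.name for m in stub.GetLoadedModels(service_pb2.LoadModelsRequest()).models])
```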

Error while executing docker run.

I was trying to follow the [Quickstart](https://github.com/nielsen-oss/fasttext-serving#quick-start) and got an error while executing the "docker run" command. The traceback is as follows.

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/runpy.py", line 193, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.8/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/usr/src/app/fts/__main__.py", line 84, in <module>
    serve()
  File "/usr/src/app/fts/__main__.py", line 58, in serve
    servicer = FastTextServicer()
  File "/usr/src/app/fts/server/server.py", line 23, in __init__
    self._fasttext_service = FastTextService()
  File "/usr/src/app/fts/service/fasttext_service.py", line 56, in __init__
    self._observer.start()
  File "/usr/local/lib/python3.8/site-packages/watchdog/observers/api.py", line 253, in start
    emitter.start()
  File "/usr/local/lib/python3.8/site-packages/watchdog/utils/__init__.py", line 110, in start
    self.on_thread_start()
  File "/usr/local/lib/python3.8/site-packages/watchdog/observers/inotify.py", line 121, in on_thread_start
    self._inotify = InotifyBuffer(path, self.watch.is_recursive)
  File "/usr/local/lib/python3.8/site-packages/watchdog/observers/inotify_buffer.py", line 35, in __init__
    self._inotify = Inotify(path, recursive)
  File "/usr/local/lib/python3.8/site-packages/watchdog/observers/inotify_c.py", line 200, in __init__
    self._add_dir_watch(path, recursive, event_mask)
  File "/usr/local/lib/python3.8/site-packages/watchdog/observers/inotify_c.py", line 387, in _add_dir_watch
    raise OSError(errno.ENOTDIR, os.strerror(errno.ENOTDIR), path)
NotADirectoryError: [Errno 20] Not a directory: b'sample/models'

I am using:
  • Docker version 20.10.18, build b40c2f6
  • Ubuntu 18.04
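The traceback shows watchdog failing to set an inotify watch on sample/models because, inside the container, that path is not a directory; typically this means the models directory was never mounted into the container (e.g. with docker's -v bind-mount option), so the path is missing or resolves to a file. A small stdlib check reproduces the same errno before the server gets that far (a sketch; the path is the one from the traceback, and check_models_dir is a hypothetical helper, not part of fasttext-serving):

```python
import errno
import os

def check_models_dir(path):
    """Raise the same NotADirectoryError watchdog's inotify observer raises."""
    if not os.path.isdir(path):
        raise NotADirectoryError(errno.ENOTDIR, os.strerror(errno.ENOTDIR), path)

# Example: verify the path exists and is a directory before starting the server
# check_models_dir("sample/models")
```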

Trouble configuring client

Hello.

First off, thanks for your work!

I've been trying to use this to vectorize text for an ML clustering algorithm. I set up the client side by running "python setup.py install" in the src folder of the ML app project, with the fts/protos folder inside the project folder, and it does build the proto package into the project. Still, running the app raises the following error:

TypeError: Couldn't build proto file into descriptor pool!
Invalid proto descriptor for file "service.proto":
service.proto: A file with this name is already in the pool.

Any idea what I'm doing wrong? Clearly I don't understand enough about gRPC, for starters... ;)

Thanks again

Pedro
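This error usually means the service.proto descriptor is being registered in the default descriptor pool twice, which happens when two copies of the generated service_pb2.py are importable under different module names (for example, both as service_pb2 and as fts.protos.service_pb2 after the setup.py install). A stdlib-only sketch of the underlying mechanism; the file contents and module names here are illustrative stand-ins, not the real generated code:

```python
import importlib.util
import os
import sys
import tempfile

# Stand-in for a generated service_pb2.py: its module-level side effect
# (here, bumping a counter; in protobuf, registering the descriptor)
# runs once per *module object* created.
SRC = (
    "import builtins\n"
    "builtins.PROTO_REGISTRATIONS = getattr(builtins, 'PROTO_REGISTRATIONS', 0) + 1\n"
)

def load_as(module_name, path):
    """Import the file at `path` under `module_name`, executing its top level."""
    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = module
    spec.loader.exec_module(module)
    return module

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(SRC)
    path = f.name

a = load_as("service_pb2", path)
b = load_as("fts.protos.service_pb2", path)
print(a is b)  # False: two distinct module objects from the same file

import builtins
print(builtins.PROTO_REGISTRATIONS)  # 2: the "registration" side effect ran twice
os.unlink(path)
```

The fix is to make sure only one copy of the generated code is importable, and to import it by one canonical name everywhere.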

No module named fts while executing sample/client.py

While running the final step of the Quickstart, i.e. "python sample/client.py", I get the following error.

Traceback (most recent call last):
  File "sample/client.py", line 19, in <module>
    import fts.protos.service_pb2_grpc
ModuleNotFoundError: No module named 'fts'

I tried to find a solution in protobuf/issues/1491, but had no luck.
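sample/client.py imports the fts package, which is only resolvable when the repository root is on Python's module search path. Running the client from the repository root, or pointing PYTHONPATH at it, should clear the error. A sketch, assuming the default checkout directory name:

```shell
# Assumed checkout directory name; adjust to your clone location.
cd fasttext-serving
# Run from the repo root so the `fts` package resolves; PYTHONPATH=. is a
# fallback in case the current directory is not already on sys.path.
PYTHONPATH=. python sample/client.py
```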

slow memory leak

There is a slow memory leak somewhere in the codebase. The Python process starts small (~1 GB) and, over the course of several hours of processing many batch prediction requests, its resident memory creeps up to 4 GB, 5 GB, and so on, until the process is finally killed by the OOM killer. I am using code from git commit 8aae32727a19, which is fairly recent. If there is more information I can provide, let me know; I know very little about Python, but I can gather information if required.
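One low-overhead way to gather data on the leak from inside the process is to log the peak resident set size between batches using only the standard library. A sketch (Unix-only; note that ru_maxrss is reported in kilobytes on Linux but bytes on macOS, and it records the peak, so it only ever grows):

```python
import resource

def peak_rss():
    """Peak resident set size of this process (KB on Linux, bytes on macOS)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss()
# ... handle a batch of prediction requests here ...
after = peak_rss()
print(f"peak RSS grew by {after - before} (platform-dependent units)")
```

For finer-grained attribution of which allocations grow over time, Python's built-in tracemalloc module can snapshot and diff allocations between batches.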

Serve word vector model

Hi.
Can your server only be used with a supervised (classification) model attached? I don't have a classification model; all I have are word vectors (.bin and .txt). How can I serve them?
