Comments (5)
SageMaker platform features like Serverless Inference are not directly built into the toolkit. For Multi-Model it should be the same, except that we make sure that if an inference.py
is provided, it is used.
from sagemaker-huggingface-inference-toolkit.
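As an illustration of that override mechanism, a user-provided inference.py inside the model archive can define the handler hooks the toolkit looks for (model_fn, predict_fn, etc.). The sketch below is a minimal, hypothetical example, assuming a transformers text-classification model; it is not taken from the toolkit's own code:

```python
# inference.py -- hypothetical custom handler sketch for the
# sagemaker-huggingface-inference-toolkit override mechanism.

def model_fn(model_dir):
    """Called once at container startup; model_dir holds the unpacked model archive."""
    # Lazy import so this sketch can be read without transformers installed.
    from transformers import pipeline
    return pipeline("text-classification", model=model_dir)

def predict_fn(data, model):
    """Called per request with the deserialized payload and the loaded model."""
    inputs = data.pop("inputs", data)
    return model(inputs)
```

If this file is packaged under code/ in the model.tar.gz, the toolkit uses these hooks instead of its default handlers.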
Thanks for the answer; let me clarify my question: I have a custom container that uses the SageMaker inference toolkit, and it works well for provisioned deployment. But it fails when I try to deploy it for serverless inference because of some errors in the default SageMaker MMS web server (this, for instance).
So I was curious to hear whether you made any specific changes to the Hugging Face toolkit to have it work OOTB with serverless inference.
from sagemaker-huggingface-inference-toolkit.
According to the documentation, it is currently not possible to use custom registries/containers for Serverless Inference: https://docs.aws.amazon.com/sagemaker/latest/dg/serverless-endpoints.html
from sagemaker-huggingface-inference-toolkit.
They do mention that private registries are not supported, but they also specifically say that custom containers are supported (in the Container Support
section). And the serverless endpoint does start, so I think it's really just a code issue with the sagemaker-inference default web server.
from sagemaker-huggingface-inference-toolkit.
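For context, deploying a model to a serverless endpoint through the SageMaker Python SDK looks roughly like the sketch below. The framework versions and sizing values are illustrative assumptions, not recommendations, and the function and argument names (deploy_serverless, model_data_s3_uri, role_arn) are hypothetical:

```python
def deploy_serverless(model_data_s3_uri, role_arn):
    """Hypothetical sketch: deploy a Hugging Face model as a serverless endpoint."""
    # Imports kept inside the function so the sketch can be read
    # without the sagemaker SDK installed.
    from sagemaker.huggingface import HuggingFaceModel
    from sagemaker.serverless import ServerlessInferenceConfig

    model = HuggingFaceModel(
        model_data=model_data_s3_uri,   # s3:// URI of the model.tar.gz
        role=role_arn,
        transformers_version="4.26",    # illustrative versions
        pytorch_version="1.13",
        py_version="py39",
    )
    serverless_config = ServerlessInferenceConfig(
        memory_size_in_mb=4096,         # illustrative sizing
        max_concurrency=5,
    )
    # deploy() with a serverless config creates a serverless endpoint
    # instead of a provisioned one.
    return model.deploy(serverless_inference_config=serverless_config)
```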
Oh I actually found an answer there.
Thanks for your answers!
from sagemaker-huggingface-inference-toolkit.
Related Issues (20)
- Using custom inference script and models from Hub HOT 1
- get_pipeline function passes Path object rather than PretrainedTokenizer
- No support for multi-GPU HOT 2
- 🏷️ invalid
- Sagemaker endpoint inferencing error with HF model loading from s3bucket with new transformer update HOT 5
- Support multiple return sequences
- HF_TASK Enviournment Variable error HOT 1
- Endpoint creation completes before custom model_fn finishes loading resources
- ARCHITECTURES_2_TASK is limiting the tasks able to be deployed with HF DLC HOT 11
- Make DEFAULT_HF_HUB_MODEL_EXPORT_DIRECTORY configurable
- Sagemaker inference not loading model weight from s3
- Sagemaker endpoint doesn't use GPU (instance ml.g4dn.xlarge) HOT 1
- device kernel image is invalid
- Sagemaker endpoint inference Fails when following a tutorial
- Sagemaker HuggingfaceModel fails on phi3 model deployment HOT 2
- SageMaker deployment errors HOT 2
- Error on Sagemaker deployment for v1.0.1 HOT 1
- How can I delpoy a model with AWS S3 and without downloading model from hunggingface via TGI image on Sagemaker? HOT 2
- How to enable Batch inference on AWS deployed Serverless model from Hub? HOT 1
- Where is the logic for detecting custom inference.py? HOT 6