Comments (3)
Hi Rogerio, there are two ways to tell the inference toolkit to use a custom inference script:
- You can keep the inference script in your local environment and point to it, as you did when you created the `HuggingFaceModel()` instance.
- Or you can include the inference code in the tar.gz file, in which case there is no need to specify where the inference script is located.
In our call I only mentioned the first option, but the links I sent you only mention the second option, so this is totally my fault, sorry. But when you use both options at once you get the error message you are seeing.
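For the second option, the toolkit's convention is that the custom script sits under a `code/` directory inside `model.tar.gz`, next to the model files at the archive root. A minimal sketch of building such an archive (the file names and contents here are placeholders for illustration, not values from this thread):

```python
import os
import tarfile
import tempfile

# Illustrative layout for option 2 (script bundled into model.tar.gz):
#   model.tar.gz
#   ├── config.json, model weights, ...   (model files at the root)
#   └── code/
#       └── inference.py                  (custom inference script)
workdir = tempfile.mkdtemp()
os.makedirs(os.path.join(workdir, "code"))
with open(os.path.join(workdir, "config.json"), "w") as f:
    f.write("{}")  # stands in for the real model files
with open(os.path.join(workdir, "code", "inference.py"), "w") as f:
    f.write("def model_fn(model_dir):\n    ...\n")

archive = os.path.join(workdir, "model.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    # Add each entry at the archive root, not under a top-level folder.
    for name in ("config.json", "code"):
        tar.add(os.path.join(workdir, name), arcname=name)

with tarfile.open(archive) as tar:
    names = sorted(tar.getnames())
print(names)  # ['code', 'code/inference.py', 'config.json']
```

With this layout you would upload the archive to S3 and omit `entry_point`/`source_dir` when creating the `HuggingFaceModel`.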
I personally have used the first approach in this notebook, which I tested and which works: https://github.com/marshmellow77/text-summarisation-project/blob/main/4a_model_testing_deployed.ipynb
Hope this helps!
Cheers, Heiko
from sagemaker-huggingface-inference-toolkit.
Hey @rogeriobromfman,
The error `No such file or directory: './code'` is showing up because you have defined `source_dir="./code"` in your `HuggingFaceModel`.
When you define `source_dir` & `entry_point` in your `HuggingFaceModel`, the python-sagemaker-sdk looks for those files on the machine where you execute `HuggingFaceModel.deploy`, not inside the `model.tar.gz`. That way there is no requirement to bundle your `inference.py` into the `model.tar.gz`.
But if you have bundled the `inference.py` in your `model.tar.gz`, you can simply remove `source_dir` and `entry_point` from the `HuggingFaceModel` and it should work.
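The lookup behavior described above can be sketched with a toy helper. This is an illustration of the logic only, not the SDK's actual code; `resolve_inference_code` and its default paths are invented for the example:

```python
import os

def resolve_inference_code(source_dir=None, model_dir="/opt/ml/model"):
    """Toy sketch of how the two options interact (not real SDK code):
    if source_dir is set, the script is looked up on the LOCAL machine
    at deploy time; otherwise the toolkit expects code/inference.py
    inside the extracted model.tar.gz on the endpoint."""
    if source_dir is not None:
        if not os.path.isdir(source_dir):
            # This mirrors the error from the thread: './code' is
            # missing on the machine running deploy().
            raise FileNotFoundError(
                f"No such file or directory: '{source_dir}'"
            )
        return os.path.join(source_dir, "inference.py")
    # Option 2: script was bundled into model.tar.gz under code/
    return os.path.join(model_dir, "code", "inference.py")
```

So passing `source_dir="./code"` from a machine that has no local `./code` directory fails immediately, while omitting it defers to the bundled script.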
It worked! Having the inference code in the Notebook instance instead of the tar.gz
file is what I was missing. Thank you so much Heiko and Philipp!!
Related Issues (20)
- Using custom inference script and models from Hub HOT 1
- get_pipeline function passes Path object rather than PretrainedTokenizer
- No support for multi-GPU HOT 2
- Sagemaker endpoint inferencing error with HF model loading from s3bucket with new transformer update HOT 5
- Support multiple return sequences
- Make `DEFAULT_HF_HUB_MODEL_EXPORT_DIRECTORY` configurable through environment variable HOT 1
- InternalServerException while deploying HuggingFace model on SageMaker HOT 6
- Data format for inference HOT 1
- Support passing model_kwargs to pipeline HOT 1
- InternalServerException at runtime HOT 3
- trust_remote_code=True in new Hugging Face LLM Inference Container for Amazon SageMaker HOT 2
- How to access CustomAttributes in async inference request input_fn HOT 1
- [DOCS] List of available HF_TASK and default inference scripts HOT 4
- Dead Link for Available HF_Tasks HOT 1
- SageMaker deployment errors HOT 2
- Error on Sagemaker deployment for v1.0.1 HOT 1
- How can I deploy a model with AWS S3 and without downloading the model from huggingface via the TGI image on Sagemaker? HOT 2
- How to enable Batch inference on AWS deployed Serverless model from Hub? HOT 1
- Where is the logic for detecting custom inference.py? HOT 6