You can read about this project in detail in the accompanying Medium article.
git clone git@github.com:tahreemrasul/langserve_tutorial.git
cd ./langserve_tutorial
Install the LangChain CLI:
pip install -U langchain-cli
Set Up a Conda Environment (Recommended)
- If you don't have Conda, install it first.
- Create a new Conda environment:
conda create -n summarization_bot python=3.8
- Activate the environment:
conda activate summarization_bot
Install Dependencies
- Install the required packages using the requirements.txt file:
pip install -r requirements.txt
Set Up Your OpenAI API Key
- Create a .env file in the root directory of the project.
- Add your OpenAI API key to the .env file:
OPENAI_API_KEY='Your-OpenAI-API-Key-Here'
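At startup the app needs this key in its environment; projects like this typically use python-dotenv's load_dotenv() for that. As a minimal, hand-rolled sketch of what that loading amounts to (illustration only, not the library's actual implementation):

```python
import os


def load_env_file(path: str = ".env") -> dict:
    """Parse simple KEY='value' lines from a .env file, stripping quotes."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and anything that isn't KEY=value.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip("'\"")
    return values


# After writing the .env file above, this makes the key visible to the app:
# os.environ.update(load_env_file())
```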
Run the application locally by navigating into the main directory and running:
langchain serve
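Once the server is up, you can call it over HTTP. A minimal client sketch, assuming the app serves a chain at a hypothetical /summarize path on the default port 8000 (adjust both to match your routes):

```python
import json
import urllib.request


def build_invoke_request(base_url: str, path: str, input_payload: dict) -> urllib.request.Request:
    """Build a POST request for a LangServe-style /invoke endpoint."""
    body = json.dumps({"input": input_payload}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}{path}/invoke",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_invoke_request(
    "http://localhost:8000", "/summarize", {"text": "Some long article..."}
)
# With the server running, urllib.request.urlopen(req) sends the request.
```

LangServe also ships a RemoteRunnable client if you prefer to stay inside the LangChain API.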
This project folder includes a Dockerfile that allows you to easily build and host your LangServe app.
To build the image, simply run:
docker build . -t my-langserve-app
If you tag your image with something other than my-langserve-app, note it for use in the next step.
To run the image, you'll need to include any environment variables necessary for your application.
In the example below, we inject the OPENAI_API_KEY environment variable with the value set in the local environment ($OPENAI_API_KEY). We also expose port 8080 with the -p 8080:8080 option.
docker run -e OPENAI_API_KEY=$OPENAI_API_KEY -p 8080:8080 my-langserve-app
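If you script deployments, the same docker run flags can be assembled programmatically. A sketch (the image name and ports mirror the command above):

```python
import os


def build_docker_run_args(image: str, host_port: int = 8080, container_port: int = 8080) -> list:
    """Assemble the docker run argument list: inject OPENAI_API_KEY from
    the local environment and publish the app's port."""
    api_key = os.environ.get("OPENAI_API_KEY", "")
    return [
        "docker", "run",
        "-e", f"OPENAI_API_KEY={api_key}",      # pass the key into the container
        "-p", f"{host_port}:{container_port}",  # map host port to container port
        image,
    ]


args = build_docker_run_args("my-langserve-app")
# subprocess.run(args) would launch the container.
```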
Cloud Run in GCP is a managed compute platform that lets you run containers directly on top of Google's scalable infrastructure. You can run on GCP using either the source code or through the Dockerfile. For running with source code, use:
gcloud run deploy SERVICE --source . --port 8080 --project PROJECT_ID \
  --allow-unauthenticated --region REGION --set-env-vars=OPENAI_API_KEY=$OPENAI_API_KEY