vrobot / litellm

This project is forked from berriai/litellm

Home Page: https://litellm.readthedocs.io/en/latest/

License: MIT License


🚅 litellm


A lightweight package to simplify calling the OpenAI, Azure, Cohere, Anthropic, and Hugging Face API endpoints. It manages:

  • translating inputs to the provider's completion and embedding endpoints
  • guaranteeing consistent output: text responses are always available at ['choices'][0]['message']['content']
  • exception mapping: common exceptions across providers are mapped to the OpenAI exception types
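The output guarantee above means client code can read the reply from the same path no matter which provider served it. A minimal sketch of that access pattern, using a stand-in dict in place of a real API response (no network call or API key involved):

```python
# Stand-in for a completion() result, shaped like the OpenAI-style
# response that litellm normalizes every provider's output into.
mock_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! I'm doing well."}}
    ]
}

# Regardless of which provider produced the response, the text lives here:
text = mock_response['choices'][0]['message']['content']
print(text)
```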

usage

Demo - https://litellm.ai/
Read the docs - https://litellm.readthedocs.io/en/latest/

quick start

pip install litellm
from litellm import completion

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion("command-nightly", messages)

# azure openai call
response = completion("chatgpt-test", messages, azure=True)

# hugging face call
response = completion(model="stabilityai/stablecode-completion-alpha-3b-4k", messages=messages, hugging_face=True)

# openrouter call
response = completion("google/palm-2-codechat-bison", messages)

Code Sample: Getting Started Notebook

Stable version

pip install litellm==0.1.345

Streaming Queries

liteLLM supports streaming the model response back; pass stream=True to get a streaming iterator in the response. Streaming is supported for OpenAI, Azure, and Anthropic models.

response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
    print(chunk['choices'][0]['delta'])

# claude 2
result = completion('claude-2', messages, stream=True)
for chunk in result:
    print(chunk['choices'][0]['delta'])
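Each streamed chunk carries only a fragment of the reply in its delta, so the full text is rebuilt by concatenating the fragments. A hedged sketch of that accumulation, using hand-built stand-in chunks rather than a live streaming call (the chunk shape mirrors the delta format printed above):

```python
# Stand-in chunks mimicking the streaming delta format (no API call).
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
    {"choices": [{"delta": {}}]},  # a final chunk may carry no content
]

# Concatenate the delta fragments to rebuild the full reply;
# .get() with a default handles chunks without a 'content' key.
full_text = "".join(
    chunk['choices'][0]['delta'].get('content', '') for chunk in chunks
)
print(full_text)  # Hello, world
```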

support / talk with founders

why did we build this

  • Need for simplicity: our code started to get extremely complicated managing and translating calls between Azure, OpenAI, and Cohere.

litellm's People

Contributors

ishaan-jaff, krrishdholakia, zakhar-kogan
