
lazyGPT

Use LLMs to generate chat responses lazily and asynchronously, and manage multiple conversation threads with ease.

Installation

pip install git+https://github.com/simonsanvil/lazygpt.git                 

Usage

Make lazy evaluations of chat responses:

from lazygpt import GPT

gpt = GPT(model="gpt-4o", chat=True, async_=True, temperature=0.9)

with gpt.lazy():
    # The response to all of these won't be evaluated until a call to gpt.evaluate() is made
    gpt("Hello, I will give you a series of questions and you must answer them with honesty and sincerity. Understood?", role="user")
    gpt("What is the capital of Spain?", role="user")

print(gpt.threads[0]) # by default the first thread is the main conversation thread
# User:
# > Hello, I will give you a series of questions and you must answer them with honesty and sincerity. Understood?
# Assistant:
# > [LAZY EVALUATION - Not yet evaluated]
# User:
# > What is the capital of Spain?
# Assistant:
# > [LAZY EVALUATION - Not yet evaluated]

await gpt.evaluate_async()  # evaluates all pending lazy responses

print(gpt.threads[0])
# User:
# > Hello, I will give you a series of questions and you must answer them with honesty and sincerity. Understood?
# Assistant:
# > Understood. Please proceed with any questions you may have.
# User:
# > What is the capital of Spain?
# Assistant:
# > The capital of Spain is Madrid.
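Conceptually, each lazy call records a placeholder that only receives its value once evaluation runs. Here is a minimal, hypothetical sketch of that deferred-evaluation pattern in plain Python (`LazyResponse`, `ask`, and `evaluate` are illustrative names, not lazygpt's internals):

```python
class LazyResponse:
    """Placeholder that is filled in once evaluation runs."""

    def __init__(self, prompt):
        self.prompt = prompt
        self.value = None

    def __repr__(self):
        return self.value if self.value is not None else "[LAZY EVALUATION - Not yet evaluated]"


_pending = []


def ask(prompt):
    # Record the prompt and return a placeholder immediately.
    resp = LazyResponse(prompt)
    _pending.append(resp)
    return resp


def evaluate():
    # Resolve every pending placeholder (a real library would call the API here).
    for resp in _pending:
        resp.value = f"(model reply to: {resp.prompt})"
    _pending.clear()


reply = ask("What is the capital of Spain?")
print(reply)   # [LAZY EVALUATION - Not yet evaluated]
evaluate()
print(reply)   # (model reply to: What is the capital of Spain?)
```

The payoff of this pattern is that prompts can be batched: nothing touches the network until `evaluate()` runs, at which point all pending requests can be dispatched together.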

Evaluations run sequentially within each conversation thread and in parallel across different threads. Here's another example illustrating that:

countries = ["France", "Italy", "Germany"]
for i, country in enumerate(countries):
    with gpt.create_thread(thread_id=f"thread_{i+1}", copy_from=0):
        # this will create a new conversation in a different thread
        # forked from the main thread (thread_id=0)
        gpt(f"What is the capital of {country}?", role="user")
        gpt("Thank you.", role="user", model=None)
        # setting model=None will not trigger a response from the model
    
await gpt.evaluate_async()
# Since the messages to evaluate are all in different threads, 
# they will all be sent to the API simultaneously

for thread in gpt.threads:
    if thread.thread_id == 0: # don't print the main thread
        continue
    print(f"Thread {thread.thread_id}:")
    print(thread)
# Thread thread_1:
# User:
# > Hello, I will give you a series of questions and you must answer them with honesty and sincerity. Understood?
# ...
# User:
# > What is the capital of France?
# Assistant:
# > The capital of France is Paris.
# User:
# > Thank you.

# Thread thread_2:
# User:
# > Hello, I will give you a series of questions and you must answer them with honesty and sincerity. Understood?
# ...
# User:
# > What is the capital of Italy?
# Assistant:
# > The capital of Italy is Rome.
# User:
# > Thank you.

# Thread thread_3:
# User:
# > Hello, I will give you a series of questions and you must answer them with honesty and sincerity. Understood?
# ...
# User:
# > What is the capital of Germany?
# Assistant:
# > The capital of Germany is Berlin.
# User:
# > Thank you.
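The sequential-within-a-thread, parallel-across-threads scheduling described above can be sketched with plain asyncio. Everything here (`fake_completion`, `evaluate_thread`, `evaluate_all`) is an illustrative stand-in, not lazygpt's API:

```python
import asyncio


async def fake_completion(prompt: str) -> str:
    # Stand-in for a chat-completion API call.
    await asyncio.sleep(0.01)  # simulate network latency
    return f"answer to: {prompt}"


async def evaluate_thread(prompts: list[str]) -> list[str]:
    # Within one thread, each reply may depend on earlier messages,
    # so pending prompts are resolved one after another.
    replies = []
    for prompt in prompts:
        replies.append(await fake_completion(prompt))
    return replies


async def evaluate_all(threads: dict[str, list[str]]) -> dict[str, list[str]]:
    # Independent threads share no state, so they are evaluated
    # concurrently with asyncio.gather.
    results = await asyncio.gather(*(evaluate_thread(p) for p in threads.values()))
    return dict(zip(threads.keys(), results))


threads = {
    "thread_1": ["What is the capital of France?"],
    "thread_2": ["What is the capital of Italy?"],
}
results = asyncio.run(evaluate_all(threads))
```

With N threads of M pending messages each, total latency is roughly that of the longest single thread (M calls) rather than all N*M calls in sequence.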
